Not for the faint of art. |
Complex Numbers A complex number is expressed in the standard form a + bi, where a and b are real numbers and i is defined by i^2 = -1 (that is, i is a square root of -1). For example, 3 + 2i is a complex number. The bi term is often referred to as an imaginary number (though this may be misleading, as it is no more "imaginary" than the symbolic abstractions we know as the "real" numbers). Thus, every complex number has a real part, a, and an imaginary part, b. Complex numbers are often represented on a graph known as the "complex plane," where the horizontal axis represents the infinity of real numbers, and the vertical axis represents the infinity of imaginary numbers. Thus, each complex number has a unique representation on the complex plane: some closer to real; others, more imaginary. If a = b, the number is equal parts real and imaginary. Very simple transformations applied to numbers in the complex plane can lead to fractal structures of enormous intricacy and astonishing beauty. |
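That last bit about fractals is easy to demonstrate. Here's a minimal sketch (in Python, which has complex numbers built in) of the best-known example, the Mandelbrot set: repeatedly apply the very simple transformation z → z² + c and see whether the point escapes to infinity. The grid bounds and iteration cap here are arbitrary choices for illustration, not anything canonical.

```python
# Minimal Mandelbrot sketch: iterate z -> z*z + c and see if the point
# escapes. Points that stay bounded belong to the fractal.
def escapes(c, max_iter=50):
    """True if c escapes to infinity under z -> z*z + c (i.e., lies outside the set)."""
    z = 0 + 0j
    for _ in range(max_iter):
        z = z * z + c        # the "very simple transformation"
        if abs(z) > 2:       # once |z| > 2, divergence is guaranteed
            return True
    return False

# Crude ASCII rendering of part of the complex plane:
# '#' marks points that stay bounded.
for im in range(10, -11, -2):
    print(''.join('#' if not escapes(complex(re / 20, im / 20)) else '.'
                  for re in range(-40, 11, 2)))
```

Zoom in anywhere along the edge of that blob of '#' characters and similarly intricate structure keeps appearing at every scale, which is where the astonishing-beauty part comes in.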
This will probably be the last one of these for a while, as I'm going back to doing "30-Day Blogging Challenge ON HIATUS" [13+] next month. Which is tomorrow. Yikes. How Cats Became Divine Symbols in Ancient Egypt Felines served a useful purpose in ancient Egyptian households and were eventually associated with deities. There's an old joke going around, something like: Dogs: "These humans give us food and water and take care of us. They must be gods!" Cats: "These humans give us food and water and take care of us. We must be gods!" Alternatively, there's the quote I've heard attributed to Winston Churchill, which goes something like, "Dogs look up at us, cats look down on us, pigs treat us as equals." “Though it is hard to say the Egyptians thought one thing or another, since so much change happened across their 3,000+ years of history, the ancient Egyptians, in general, did not worship animals,” says Julia Troche, an Egyptologist, assistant professor of history at Missouri State University, and author of Death, Power, and Apotheosis in Ancient Egypt: The Old and Middle Kingdoms. “Rather, [they] saw animals as representations of divine aspects of their gods.” The distinction can be subtle, I suppose. If an archaeologist 4000 years from now dug up a typical home, they'd probably conclude that we worship the television, which is silly because the reality is that we worship the celebrities who appear on it. Whether or not they were worshiped as deities, cats were an integral part of ancient Egyptian life. And, based on mummified cats discovered in tombs alongside humans, they played an important role in the afterlife, as well. Just yesterday, I saw that archaeologists uncovered a grave in Cyprus (which is kind of but not really near Egypt) from something like 10,000 years ago where some dude was buried with a bunch of his favorite toys... and his cat. The idea being that they sent the cat with him to whatever afterlife they imagined. 
Poor kitty was only eight months old, though. The cat didn't deserve that. For most of the civilization’s history, ancient Egyptians saw cats as mutually beneficial companions, according to Troche. “Cats might come inside when it was hot, and they in turn would chase away dangerous animals, such as snakes—many of which were venomous—and scorpions,” she explains. Most articles I've seen about cats emphasize their usefulness. As if that has to be a factor. How useful is the Mona Lisa? Cats are living art. Art that pukes on the couch sometimes, but still art. The presence of cats in tombs wasn’t limited to paintings—sometimes cats were mummified and placed inside their human companion’s tomb, according to the Carnegie Museum of Natural History. One reason this was done is that when cats were mummified, they could then be used as funerary goods. Again, the implication is that they killed the poor kitty just because its human died. “Ancient Egyptians held cats in such high regard because of the practices and preferences of their gods, but also because their kings, the pharaohs, kept giant cats,” Skidmore says. “Members of the Egyptian royal class dressed their cats in gold and let them eat from their plates.” And cats have never forgotten this. In addition to appreciating their ability to keep rodents, snakes and other pests out of their homes, the ancient Egyptians understood that cats of all sizes are smart, quick and powerful. Ah, no wonder the cats allowed that civilization to prosper for as long as it did. Through their ubiquitous presence in the art, fashion and home ornamentation of ancient Egypt, cats served as an everyday reminder of the power of the gods. Sure, if gods had the tendency to get underfoot and pee in your laundry basket. Come to think of it, though, cats are more useful than gods anyway. At least you can laugh at their antics. Try that with a god and you get smited. |
Wouldn't you like some new monsters for Halloween? Monsters help us face our fears, but familiarity with them takes the edge off... which is usually a good thing, because it means that the fear is no longer controlling us. You might have heard of some of these shards of nightmare, or you might not. I don't have much else to say about it, because I'm still recovering from some dental work I had done the other day. Speaking of fears. (Not one of mine, actually. Eye surgery is. And that's now scheduled for December.) Monsters are like porn. There eventually comes a time when you get bored with the normal stuff like vampires, zombies, or werewolves and want to spice things up a bit. And whether you're looking for new ways to scream or cream, your first stop should always be foreign countries where you can easily find stuff that was deemed too extreme for the US market... And since it's a Cracked article, it's a numbered list in descending order. 5. Ushi-Oni And Nure-Onna: Japan’s Damned Double-Act Of Doom The description of Nure-Onna (or Wetty Betty, if you will) differs from region to region. Sometimes she's human from the waist up with a monstrous snake tail where her legs should be, like H.R. Giger's take on The Little Mermaid, and sometimes she's just a head on a snake body. However, Nure-Onna can use magic to make herself appear as a human woman holding a baby. And that's how she gets you. Thankfully, that wouldn't work on me. I just assume anyone holding a baby and looking for help is some sort of scammer, and I back off and call the cops. Let them get sucked into the Nure-Onna's dark designs instead. Once she spots her victim, she magics herself into a mother in distress, calls for help, and then asks you to hold her baby while she pops to the store for some diapers or whatever. And then you're left raising the kid until they're 18! AAAAHHHHH! The horror! 
And yes, that disturbs me on a deep level, more than the whole "trapping you while the snake woman eats you and drinks your blood" thing. Like Nure-Onna, Ushi-Oni likes to live near the water and loves having people for dinner together with Dame Judy Drench, but what's weird is that no myth explains WHY. Like, Nure-Onna does all the work luring in people and all that, yet she allows the Cow Spider to join her. This leads us to only one logical conclusion: the two are banging. Now try to imagine what that looks like (and then try aiming for the trashcan). I mean, really, for the full effect you'll need to click on the link, which has drawings. Go on. You know you want to. 4. Slavic Female Demons Sound Like An Excuse For Horrible Domestic Abuse Really? Because to me that sounds like the name of a hard metal Go-Gos cover band. Women could become a boginka, dziwozona, or mamuna by dying in childbirth, committing suicide, or killing their child. After the transformation, they liked to target young women, kidnapping them and beating them with sticks, dunking their heads in mud, and twisting their arms before ultimately returning them home … Which really sounds like a weirdly convoluted, ancient-Slavic version of "I walked into a door." For some reason, I've never heard stories about these. I wonder why. A lot of the myths surrounding Slavic female demons were actually meant to explain the difficult/unfortunate parts of life in a time when medicine boiled down to: "Sounds like you have goblins in your blood. Here's some vodka." Amusing as that is, vodka is a relatively recent invention in human history. 3. The Pishtaco Are The Embodiments Of The Andes’ Colonial Trauma I'll give you a few minutes to laugh at the name "pishtaco," which I suggest you do because things are going to get super gruesome super fast after that. So enjoy yourself while you still have time. They're not kidding here. 
"Pishtaco" comes from the South American Quechua-language "pishtay," meaning "to cut up/slaughter," and refers to a particularly effed up kind of vampire: one that steals your fat. Sounds like something a lot of us could use. The reverence of fat actually goes back to the dawn of human civilization, but the people of the Andes took it in a particularly disturbing direction with the stories of the terrifying pishtaco. I'm betting it goes back longer than that. The "pishtaco" boogeymen were created after the people of the Andes ran into Spanish conquistadors in the 16th century and saw them using the fat from slain enemies to treat their wounds. Just when you thought colonialism couldn't get any worse. 2. Tupilaq Is Greenland’s Blowjob Frankenstein Or a rapper. A rapper with an album called "Blowjob Frankenstein." Seriously, though, you're going to have to read this one for yourself. 1. Tokoloshe: Like A South African Graboid … Only Smaller … And Actually The Penis Of A Goblin Hm. Story idea: Combine 1 and 2. You're welcome. But again... I have a content rating to consider, so you're just going to have to read it yourself. Or not. I totally understand "not." In any case, it's probably too late to dress up as any of these for Halloween. Or, perhaps, there is no such thing as "not too late." |
Final entry for November's "Journalistic Intentions" [18+] One bit of trivia I remember from my misspent youth: "The pumpkin is the only edible gourd." Like much of the trivia foisted upon an impressionable child, it's false... though the definitions of "pumpkin," "edible," and "gourd" are as squishy as a jack-o'-lantern on November 1. Botanically, Cucurbita pepo (which if you follow the link would be the binomial for, among other vegetables, delicata squash) is interesting, similar in some ways to citrus fruits or the cruciferous vegetables cabbage/kale/broccoli/cauliflower/kohlrabi/etc. Or dogs, for that matter, though "botany" and "dog" don't really work together. The point being that for some species, innate genetic traits make diversification relatively easy. Based solely on the Wikipedia article linked above, delicata squash is a cultivar of the same species that also yields such diverse foods as zucchini and spaghetti squash... as well as some varieties of pumpkins. The reason the definition of "pumpkin" is squishy is that, just to cloud the matter, some pumpkins are of an entirely different lineage. The reason the definition of "edible" is squishy is that for a long time, I didn't consider zucchini to be edible. Lately I've grown rather more fond of it, when properly prepared (it took a long time to get over my mother's cooking habits). But it obviously tastes quite different from pumpkin, and apparently also from delicata, which I haven't actually tried. And the definition of "gourd" is squishy because... well, that gets into a lot of details that are irrelevant to the discussion at hand; basically, gourds were known in Eurasia/Africa from antiquity - they're even mentioned in the Bible - whereas seeds taken in prehistoric times to the American continents took a different path, one which brought us to pumpkins and squash like zucchini. Which, to muddy the waters even further, is known as courgette in the UK and France. 
That's right -- despite the obvious Italian origins of the name, zucchini is (or to be pedantic, "are") American. By which I mean the continents in general, not specifically from the US. This origin, they share with things like tomatoes, potatoes, and maize, all of which Europeans happily adopted in an early bout of cultural appropriation. Anyway, despite its relationship with pumpkins, I don't think anyone's going to carve delicata squash for Halloween. But it sounds to me like a good addition to any harvest feast. |
I really, really hate self-checkout at grocery stores. The Banana Trick and Other Acts of Self-Checkout Thievery “Anyone who pays for more than half of their stuff in self checkout is a total moron.” But you know what I hate more than that? Well, I'm going to tell you anyway: Thieves. Beneath the bland veneer of supermarket automation lurks an ugly truth: There’s a lot of shoplifting going on in the self-scanning checkout lane. Now, I understand that sometimes, theft is acceptable. Hell, sometimes it's even necessary because of systemic issues. But people who steal just because they can? Honestly, if it didn't disincentivize the thieves to leave their victims alive, I'd support the death penalty for it. Not just the death penalty; torture first. That's how much I hate thieves. Self-checkout theft has become so widespread that a whole lingo has sprung up to describe its tactics. Ringing up a T-bone ($13.99/lb) with a code for a cheap ($0.49/lb) variety of produce is “the banana trick.” If a can of Illy espresso leaves the conveyor belt without being scanned, that’s called “the pass around.” “The switcheroo” is more labor-intensive: Peel the sticker off something inexpensive and place it over the bar code of something pricey. Just make sure both items are about the same weight, to avoid triggering that pesky “unexpected item” alert in the bagging area. I want to find everyone who does this and introduce them to Mister Pipe Wrench. You're not stealing from the grocery store, you know. They're going to find ways to make a profit, regardless. Not much of one, but if they can't make a profit, they go out of business. When they go out of business, you get food deserts. And if they do find a way to make a profit in spite of these despicable assholes, how does it happen? That's right - by a) raising prices for the rest of us, b) keeping wages and benefits down for workers or c) both. Either way, you're stealing from honest people and deserve to be flayed alive. 
No, I'm not kidding. The Leicester researchers concluded that the ease of theft is likely inspiring people who might not otherwise steal to do so. Rather than walk into a store intending to take something, a shopper might, at the end of a trip, decide that a discount is in order. Stores could do more to prevent this sort of thing, but that too costs money. Eventually, you reach a point of diminishing returns, where the money you spend on security is more than the losses involved. In their zeal to cut labor costs, the study said, supermarkets could be seen as having created “a crime-generating environment” that promotes profit “above social responsibility.” And yes, that's a problem too. The worst thieves aren't even these checkout scammers; they're corporate assholes. Whether out of social responsibility or frustration with shrinkage, some retailers, including Albertsons, Big Y Supermarket, Pavilions, and Vons, have scaled back or eliminated self-scanning, at least in some stores. Which brings me to why I hate self-checkout in the first place. Well, there are lots of reasons, not the least of which is that it never actually saves time since some wage slave has to check my ID anyway because I'm inevitably buying beer and/or wine. But another reason is that it takes jobs away from people. It's not immigrants people should be raging at for "taking our jobs;" it's management's insistence upon automation. “There is NO MORAL ISSUE with stealing from a store that forces you to use self checkout, period. THEY ARE CHARGING YOU TO WORK AT THEIR STORE.” Bull. Shit. “Shopping can be quite boring because it’s such a routine, and this is a way to make the routine more interesting. These can be risk-taking, stimulation-seeking people.” And when they get caught, 20 to 30 years in Federal prison should give them all the thrills they'd ever seek. No pity for such people. No tolerance whatsoever. Lock 'em up. 
I don't even care if it ends up costing society more in the long run; it's the principle of the matter. ...the problem being that because of the way society is set up, some people are forced into thievery, rather than just doing it for the thrill of the five-finger discount, and that is a systemic problem that needs to be addressed. But the court system isn't really set up to be able to tell the difference. And I strenuously object to being called names for being honest. That alone is worth a few yards of duct tape and a pair of pliers. |
I mean, aren't we always quoting Shakespeare? William Shakespeare devised new words and countless plot tropes that still appear in everyday life. And yet, when I devise a new word, people tell me I don't speak English good. But an incredible number of lines from his plays have become so ingrained into modern vernacular that we no longer recognize them as lines from plays at all. One wonders if he really did coin these phrases or adapted them from the vernacular. It's like... if someone's passing a meme around, and I put it in a book, and somehow the internet dies but the book survives, in five hundred years will they say I invented the phrase from the meme? ...probably not; they'll just assume I lifted it from somewhere. But for some reason Shakespeare gets the benefit of the doubt there. I'm not going to quote all of them, just my thoughts on a few. 2. "GREEN-EYED MONSTER" // OTHELLO, ACT III, SCENE III Before Shakespeare, the color green was most commonly associated with illness. Coincidentally, just this evening, I was perusing the contents of my liquor cabinet and went, "Hm. I wonder what would happen if I mixed this absinthe with that Midori?" (Both are green, if you don't know.) And now at least I know what to call it when I concoct that unholy abomination, probably right after I publish this. 3. "PURE AS THE DRIVEN SNOW" // HAMLET, ACT III, SCENE I AND THE WINTER'S TALE, ACT IV, SCENE IV For the record, this simile works best right after the snow falls, and not a few hours later when tires and footprints turn it into brown slush. But then it's not "driven" snow, is it? IS IT? Before cars, "driving" meant driving a team of horses (or other animals like oxen or whatever), maybe in front of a carriage or plow. Driven snow is thus snow that has been driven on. At least that's the interpretation I use; I just assume "pure as the driven snow" is sarcasm, like "clear as mud" or "smooth as sandpaper." 
But then, I'm of the considered opinion that Romeo and Juliet is best viewed as satire. 7. "GOOD RIDDANCE" // TROILUS AND CRESSIDA, ACT II, SCENE I Bye, Felicia. 13. "LOVE IS BLIND" // THE MERCHANT OF VENICE, ACT II, SCENE VI Chaucer actually wrote the phrase ("For loue is blynd alday and may nat see") in The Merchant’s Tale in 1405, but it didn't become popular and wasn't seen in print again until Shakespeare wrote it down. More likely, love is deaf. I mean, who hasn't fallen into a thirst trap and ended up missing red flags because they're just so damn hot? Too much looking, not enough listening. Also, remember what I said about maybe Shakespeare didn't coin all of these? Well. 21. "THE GAME IS AFOOT" // HENRY V, ACT III, SCENE I Nope! It wasn't Sir Arthur Conan Doyle who coined this phrase—Sherlock Holmes' most famous catchphrase comes from Henry V... At some point, somewhere, I wrote "the foot is a game!" in a zombie story. Can't for the life of me find it now, or remember what it was about. It's not on WDC. So anyway, short but interesting article for Shakespeare, idiom, and cliché fans. |
It's not just Cracked that tackles today's hottest topics. Sometimes it's also Popular Science. The best way to reheat pizza (and some things you should never do) We ruined some pizza so you won’t have to. And you're damn right the pun up there was intentional. While ruining pizza is normally sacrilege, I can somewhat forgive it for SCIENCE! because, hopefully, ruining a few now is better than ruining more in the future. It's kind of like a trolley problem, only for pizza. Pizza cart problem. Order pizza, and there’s a good chance it’s gone within hours. At least the first sentence addresses my incredulity: I mean, what even is "leftover" pizza? Something about that round wheel of dough, melted cheese, warm tomato sauce, and seemingly countless topping possibilities is simply irresistible. Seemingly? Let's assume there are one hundred possible toppings. I believe the actual number must be higher than that, because more than a hundred items of food exist. But just make it 100. Each topping is either on the pizza or it isn't, so the total number of possible topping combinations is 2 to the 100th power -- a number 31 digits long. And if you insist on counting the order you stack them in, it's 100 factorial, which is written 100! because holy shit! that's an even bigger number -- you get it by multiplying every integer from one to 100. Try it. It starts out slow -- 1x2 is 2, times 3 is 6, etc. -- but long before you get to 100 you will break your calculator. Hence, "countless." Which is not the same thing as infinite, but if you'll never get to try every possible pizza topping combination in your lifetime, what's the practical difference? Still, it’s hard to resist the temptation of a leftover slice as you rummage through the fridge for food the next day. Big assumption there. But okay, yes, I admit that sometimes I have leftover pizza. And before you say anything, eating it cold is fucking disgusting. I'm not a cockroach. Hell, I'm pretty sure I've seen cockroaches sitting around a campfire reheating pizza; even they know it's an abomination. Maybe you like cold pizza—there’s no shame in that... Yes. Yes there is. 
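As an aside, no calculator actually needs to be broken: Python's integers have arbitrary precision, so the topping arithmetic is trivial to check. (The figure of 100 toppings is, as stated, a made-up assumption.) Counting each topping as either on the pizza or off gives 2^100 combinations; counting stacking orders of all 100 gives 100 factorial, which is vastly larger still.

```python
import math

# 100 possible toppings is the made-up assumption from the text above.
TOPPINGS = 100

# Each topping is either on the pizza or not: 2**100 combinations.
combinations = 2 ** TOPPINGS
print(combinations)            # 1267650600228229401496703205376 (31 digits)

# If stacking order mattered too, you'd get 100 factorial.
orderings = math.factorial(TOPPINGS)
print(len(str(orderings)))     # 158 -- i.e., 100! has 158 digits
```

Either number comfortably exceeds the count of pizzas anyone will eat in a lifetime, so "countless" holds up in practice.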
There is ALL the shame. Ugh. Foul. ...but if you’re looking to learn how to reheat pizza in a way that restores some of that fresh-pizza magic, you’ll need to know what you’re doing. It always helps to know what you're doing. The very moment pizza is born out of the oven, it’s too hot to eat and might not even be done cooking. But right around 140 degrees Fahrenheit—the temperature experts recommend you dig in so you don’t burn your mouth—it begins its inevitable march toward complete decay, just like everything else on Earth. Existential nihilism from Popular Science? Okay. Incidentally, if you're hungry enough, you know you're not going to wait; you're going to shove that gooey mess right into your piehole right away. It will burn the roof of your mouth and particularly inflame the spot right behind your upper incisors. We even call that spot the "pizza bump" for that reason. You see, cheese only likes to be melted once, because when it does, it loses its integrity. So, it's like a politician, then. You should never leave pizza out on the counter or in the oven overnight (because of bacteria), but putting it in the fridge doesn’t do it any favors. Low temperatures congeal everything the dough has absorbed and accelerate the staling process, or retrogradation. In short: The starch in the crust recrystallizes, and all that fresh-pizza chewiness goes out the window. I will say that the absolute ideal pizza isn't fresh out of the oven. No, it's removed, sliced, and then left under a heat lamp behind glass somewhere in New York City for some indeterminate amount of time. Then, when you order it, it's returned to the oven for a couple of minutes. This twice-baking technique makes the absolute best pizza in the world. I have spoken. PopSci is based in New York City, so we tested each method with the thin-crust style of pizza the Big Apple is famous for. Good, at least they're starting with the Platonic ideal of pizza. 
The official reheating method of the /r/pizza subreddit, this calls for placing your cold pizza on an oiled, preheated non-stick pan and cooking it for two minutes over medium-low heat (or until the bottom of the slice is crispy). Then, pour two drops of water (less than a teaspoon) into the pan as far from the pizza as you can get. Cover the pan with a lid and turn the heat to low. Cook it for another minute. Way, way, way too much work for reheating pizza. Wash a pan, oil it, preheat it, pizza it, water it, cover it, uncover it, retrieve pizza, wash pan again. Hot tray in a hot oven Put a baking tray in your oven and heat it to 500 degrees Fahrenheit. If you don’t want to clean the tray later, you can line it with foil. Once the oven has reached the proper temperature, use an oven mitt to take the hot tray out and put your slices on it. Do you know how long it takes for an oven to preheat to 500F? This method is absolutely incompatible with my core philosophy of Instant Gratification. Air fryer Specialized equipment. Straight-up rebaking Seriously, this is what I usually do. The microwave + oven combo Bread, including pizza crust, should never, ever, ever go in a microwave. Not even for a moment. Not even if you don't turn the microwave on. Just pretend the microwave is a bread repellent. Yeah, I know it's faster, but in this case Instant Gratification is trumped by Taste. (The one exception to this rule is things like frozen breakfast sandwiches if you follow the instructions to the letter, because they were designed to be nuked, but even there, if there's a baking option, I use that instead.) Low and slow in the oven "25 to 30 minutes?" I can get a whole new pizza delivered by then. The other methods listed are even worse, mostly also involving microwaves. I mean, yeah, the microwave is one of the greatest inventions of all time, but it's important to know its limitations. One-Sentence Movie Review: Dune: Watching Dune is like viewing a painting at a museum... 
for three freakin' hours; they were too busy making it look visually stunning to make me care about any of the characters, and too busy setting up a sequel to actually resolve anything. Rating: 3.5/5 |
Entry #7 for the November edition of "Journalistic Intentions" [18+] I'm moderately adventuresome when it comes to food, in case you haven't noticed. While I won't try the truly weird stuff, whenever something new comes along that's not, like, mud or bugs or something else with a high grossness factor, I'll usually give it a shot. It took me a while to try Ethiopian food. If you've already clicked on the link, you'll see that fonio comes from about as far away from Ethiopia as it's possible to get while still being on the same continent, but I'm getting to a point here, so bear with me. So. Ethiopian food. When I was young, Ethiopia had a pretty well-known famine going on. There were all kinds of concerned PSAs on television with malnourished kids who had flies buzzing around them and whatnot. Very sad. And it turned out that, if I remember correctly, any food aid delivered to the country was intercepted by corrupt officials and rarely got to the kids with the bloated abdomens. So when I first heard of an Ethiopian restaurant, I was like, "What, you go in and sit there with empty plates and watch rich people eat?" But it turned out, when I finally went, they have an intriguing cuisine with a lot of tradition behind it -- as I would hope, since Ethiopia is probably one of the cradles of humanity -- but it also turned out that I didn't like it at all. Does this mean I don't like any food from Africa? No, probably not. Africa's a big continent with lots of different cultures with foods I certainly haven't heard of, just like I never heard of fonio before seeing it on the prompt list. And apparently fonio is supposed to be even "better" than quinoa, whatever that means. Also probably easier to pronounce. What is it with these grains and their nonstandard orthography for names, anyway? I couldn't find a source for how to pronounce fonio. I'm nearly certain, though, that it's not phony-o, which is what the plain reading of that word would be. 
Oh, no, we can't have that; it's gotta be a big secret, so people in the know can look down on us ignorant slobs, like with quinoa. "Yeah, I think I'll try the quin-o-a." Sneer. "You mean the keen-wa?" "Well, how in the fuck was I supposed to know how to pronounce it? Is your menu audio? No, it's got words written on it, and words are supposed to be spelled something like they're pronounced, unless you're French." "Ahem, anything else, sir?" "Yeah, lemme have the gee-oh-duck." Sneer. No, I'm betting the pronunciation is like "phone-yo." But I'm not taking actual bets. For all I know it's pronounced like "asterisk." Apparently, it's gluten-free, which to me is a major downside. I don't have celiac, so I like my damn gluten, thanks. But that's not a dealbreaker, unless I'm someplace where they're gushing about being gluten-free, in which case I'd leave and find a baguette somewhere made with actual wheat flour. So yeah, I'd like to give it a try, but only if I can avoid the smug trendy foodies who would gloat. Best of all, though, you can even make beer out of it, and I'm absolutely down for trying that. At least once. I've had beer made with non-barley grains, and some of it doesn't completely suck. Now, let's try making whiskey out of it, too. That's healthy, right? |
I'm a bit hungover right now, so even this concept is giving me a headache. Why hard work alone isn't enough to get ahead We're constantly taught the recipe for getting ahead is to put our heads down and outwork everyone else. But that's not quite right. Some adages seem custom-made to keep people in their perceived social underclass. Things like "Money can't buy happiness," "The early bird catches the worm," and "If a job is worth doing, it's worth doing well." But to me the worst offender of all, the one trap they lay to give people the illusion that they might actually escape the clutches of poverty, is "Hard work pays off." 1) It does not, usually, and 2) What even is hard work? Late this summer, UK author Kate Lister had a realisation that resonated. On Twitter, she wrote: “How old were you when you realised your original plan of being really nice, working really hard, & taking on much more than you should in the hope you would be automatically rewarded for this without asking, was totally shit?” Even earlier than I realized (US spelling) that Twatter is a cesspool. Because Twatter hadn't even been invented yet. Ah, the good old days. Despite adages and advice that tell people from a young age hard work will get you everywhere, it really won’t, says Jeff Shannon, an executive coach and author of Hard Work is Not Enough: The Surprising Truth about Being Believable at Work. "They" tell you that hard work pays off so "they" can profit off your hard work. Consider this: if hard work were all it took to become financially successful, migrant laborers would be multi-millionaires. Also consider, for example, the work done by Amazon warehouse workers. I'm sure you've all heard the stories: on your feet, hustling around, following algorithms. Indisputably hard work, though I understand some find it fulfilling. (Get it? Fulfillment center? Badumtish.) Is it going to get you promoted to a cushy manager job? No. 
And on the other side, as far as I'm concerned, any job that has you sitting on your ass all day cannot be described as "hard work," no matter how diligently you focus on your task(s). It may be rewarding work, it may be your calling, it may drain you emotionally, it may even be difficult, but having done both physically- and mentally-demanding jobs, I can say without doubt that it is not. Hard. Work. But it’s not enough to take you all the way to the top. “At a certain point you look around and realise, wow, everyone works hard at this level. Expertise and hard work just become the expectation, and will not help you up the ladder.” And part of that is that it's not a ladder, it's a pyramid. Not necessarily as in "pyramid scheme," but if you have nine colleagues at your lowly mailroom-equivalent level, and one of you is up for a promotion, then nine of you will be stuck in the mailroom. This continues on up the corporate pyramid. And that's not even taking into account that companies don't promote from within like they used to. To really get ahead, you need to be doing more than just your job. Realisations like Lister’s often come on the heels of watching colleagues with similar (or fewer) abilities soar, while your career stagnates. More often than not, those who rise are the ones willing to politick their way to the top, while you were too busy just working hard to notice you should be working the room. And while this article started out promisingly, here it veers into "you need to 'work hard' AND get noticed AND also kiss ass." This flies in the face of societal training that begins as early as primary school, when students are taught that the quiet, hard workers are those most likely to prosper. They're told that because the corporate world needs people willing and eager to put in the time and effort, and management would prefer if such people didn't raise a fuss. Hence the "quiet" part. 
Think of the mule with a carrot tied in front of it like the bait on an anglerfish. In fact, as Shannon notes, hard work alone typically goes unnoticed after a certain point, because everyone around you is working at or about the same level. If you don’t draw attention to yourself in other ways, it’s easy to fade into the background. And then, as everyone else begins to take notice of this advice, you're once again lost in the crowd, and then you need to find even more ways to (positively) draw attention to yourself, and the escalation continues. Unsustainably. In the end, you're back where you started: career advancement is as much luck as anything. You're lucky if your co-workers can't be as self-promoting as you are. You're lucky if management happens to prefer your style over that of your colleagues. You're lucky if you have just the right personality, or know how to fake it. And let's not forget that prejudice, even unconscious prejudice, plays a role. But I'm not wading into that swamp while nursing a hangover. So, what, am I advocating a lack of "hard work," whatever the fuck that means if you're in, say, engineering design or IT? Well, no, not unless it's going to burn you out. I'm just saying: don't believe the hype. Think of all the people above you in the hierarchy who are partial or complete slack-offs. Did they get where they are by "hard work"? I don't think so. They also didn't get it by being slackers, but it apparently didn't hurt. Me, I was able to escape that race most of my life, and I am, admittedly, a slacker. But that's as much luck as winning the race is. |
The headline may be a bit misleading, but as a movie fan, I find this stuff fascinating. Also because it's Cracked, it's funny. Like I said, the title's misleading -- science fiction predates movies by nearly a hundred years. I suppose "helped create science fiction movies" would be more accurate. The movie camera was the internet of its day. As in, it was touted as this great technical achievement that would revolutionize science and usher in a new era of progress and enlightenment. But, instead, people almost immediately started using it to make dirty movies and goof around. Someone has probably enshrined this into a law already, but it seems to me that the first thing anyone does with any new technology is figure out how to make porn with, or of, it. Hence the old observation that the holodeck will be humankind's last invention. Born in the mid-19th century, Méliès was an illusionist who wanted to use the camera to film plays and his magic acts. Turns out magic acts don't translate well to recorded media, for the simple reason that it's too easy for cameras to do tricks, and audiences will think it's all film effects. This is why magic acts are usually filmed in front of studio audiences. Then one day, a weird thing happened. When filming scenes of city life, Méliès’ camera jammed. It took him a few seconds to fix, and when he later developed the film, he noticed that he accidentally invented the jump cut/stop trick. While the camera was malfunctioning, the scenery around the amateur filmmaker continued to change, none of which was being captured on film until Méliès started rolling again. And because the angle and position of the camera didn’t change, the resulting film seemed to show objects disappearing, a man turning into a woman, or a carriage turning into a hearse, etc. While it's inevitable that someone would have thought of this sort of thing eventually, accidents can absolutely drive invention. 
Méliès immediately realized he had stumbled onto something big, and he continued to experiment with other camera tricks until he was ready to put them all into a short film: A Trip to the Moon. That's the one with the space capsule in the man in the moon's eye. Yeah, you know the one. It's iconic, and for a reason. Go to the original article; the video is embedded there, so I won't reproduce it again here. Released in 1902, it was heavily influenced by the writings of Jules Verne and depicted the thing from the title. It also employed “every trick he had learned or invented” to create what some consider the first science-fiction movie ever. It is my unshakable opinion that the first science fiction book was Frankenstein. Whether this movie was the first SF movie or not, I'm not as sure of. But like I said, it's iconic. And great job, Méliès, since A Trip to the Moon apparently convinced the world that movies could be full of fantastical scenes, flights of imagination, and, eventually, pants-shitting horror. Seriously, watching some of those old-timey movies feels like snorting bath salts off a heavily decomposed clown corpse dressed in your mother’s wedding gown. Just leaving this here so I'm sure you read that imagery. The point is, Méliès ushered in an era of fantasy and sci-fi movies, which officially ended in 2015 when Mila Kunis said out loud that she wanted to bone the half-human half-dog Channing Tatum in Jupiter Ascending. Come on now, Jupiter Ascending wasn't that bad. I mean, it was bad, but it certainly didn't end SF movies. Hell, Dune just came out. I'm not seeing it until Sunday. Which reminds me, I keep forgetting to do a review of the movie I saw earlier this week (which was definitely not science fiction for once), so now's as good a time as any. 
One-Sentence Movie Review: The Last Duel I keep going back to Ridley Scott movies in the vain hope that at some point, one of them will come close to the greatness that is Blade Runner: The Director's Cut; this hope is inevitably dashed, but The Last Duel, unlike some of his other historical fiction movies, doesn't completely suck -- it's longer than it needs to be; Matt Damon has already been in France once before this year; and Kylo Ren has already gratuitously taken his shirt off approximately 143 times, but the acting is superb and the movie is surprisingly topical for modern audiences despite being set in medieval France (which the director doesn't let you forget even though everyone speaks American English, because you keep seeing shots of Notre Dame de Paris under construction). Rating: 3.5/5 |
I find that many articles in The New Yorker are long and meandering, taking forever to get to a point, if they ever do. The Frustration with Productivity Culture Why we’re so tired of optimizing our work lives, and what we should do about it. This, however, is the opposite of long, clocking in at a whole two paragraphs. But I'm still not really sure what the author is trying to say, and I don't think the promise in the headline is ever actually realized. It feels unfinished. I even poked around on the internet to see if there's a longer version somewhere -- maybe it got trapped in limbo by my ad- and script-blockers? But no. If anyone does find one, let me know. On the surface, the article's about what it says: productivity. Not really my bag, though I have some opinions I'll note below. But that's not why I'm sharing it here: it's not just about "productivity," but about the connotations that any word accretes over time, and how it can mean different things to different people. Early in the pandemic, I received an e-mail from a reader who embraced my writing about the importance of deep work and the need to minimize distractions, but was thrown by my use of the term “productivity” to describe these efforts: “The productivity language is an impediment for me.” The plain meaning of "productivity" is just a quantitative measure of how much shit you're getting done. This can be for a job, or, in the case of writers, maybe how many words you can crank out in a period of time. I mean, to me, that's all it is, because I haven't been goaded into constantly increasing my productivity in my professional career. But apparently it's a loaded buzzword now. The comments were filled with a growing distaste for the many implications and exhortations that had become associated with productivity culture. “The productivity terminology encodes not only getting things done, but doing them at all costs,” one reader wrote. That's one take on it, I suppose. 
Others advocated for alternative terms, such as “alive time,” or “productive creativity”—anything to cleave the relationship between “productivity” the signifier and all that it had come to signify. And so it occurred to me that this is one way language changes: an old term gets negative connotations, so they come up with a new term to replace it. Eventually that new term turns negative, so they come up with something else entirely. I think a good example of this is the word "idiot." Originally, "idiot," along with other words like "cretin" and "moron," described, in an attempt at value-neutral scientific classification, those with developmental disabilities to a particular degree. People being people, we started using the words as insults, so they came up with a new category to subsume the meanings of the former words: "retarded." It just means "slow," as in "slow learner." Naturally, that one got misused, so they started using terms like "special." Now you can't call someone "special" without them thinking you mean "retarded." I don't know what the current polite term is. Doesn't matter, my point remains: words that get negative connotations get replaced. But another thing occurred to me here. Well, actually, it's been bugging me for some time, but this reminded me of it. It has to do with productivity itself. Employers sometimes demand that productivity continue to increase, probably so profits can increase. But isn't there a point of diminishing returns? Like athletes running faster and faster... but there's some asymptotic limit beyond which the human body just can't run. Demanding further increases in productivity is a little bit like expecting profit growth because the population increases, but at some point, the population just won't be able to grow anymore because there's no way to sustain it. I don't know what the limit actually is (for either), but there has to be one. 
That's about all I have to say; I think my commentary is longer than the original article as it is. But I'd welcome other interpretations. |
"Journalistic Intentions" [18+], #6 of 8... If the unthinkable happened, and I had to choose just one culture's cuisine to eat for the rest of my life, it would probably be Japanese. As much as I love Italian, Mexican, French, Thai, and even American, along with many others, I could probably limit myself to Japanese if I had to. And one of the reasons for that is sushi. I don't remember exactly when I first tried sushi, but I do remember where: a hole in the wall in Greenwich Village, whose name I've forgotten but it doesn't matter because restaurants in the Village have a half-life of approximately five days, so it's most likely gone now. I think I was in my 20s, and sushi was practically unknown where I came from (though at this point, there are a few good sushi restaurants around here, and you can even get something approximating it at the grocery stores). Sushi is, of course, not the only thing I like in Japanese cuisine, but I have certainly come to expect it at any place that serves anything remotely Japanese in style. There's an artistry to it, naturally. I've tried making my own, sans the raw fish which I don't trust around here, and it's hard to get it right. (It's a common misconception among us gaijin that sushi contains raw fish -- while it can, the defining aspect that makes something sushi is the presence of sticky, vinegared rice.) I only have one hard-and-fast rule about my sushi: I never eat octopus, cuttlefish, or squid. They're smarter than I am, and I make it a point to never eat anything smarter than me (bacon is the exception because it's just so damn tasty). Other than that, when it comes to sushi, I'm willing to try almost anything. I always said that if I'm ever in the cliché situation of a doctor telling me I have x months to live, I'm going to fly to Japan and eat sushi there. And also drink some really premium sake, as well as the scotch-like 25 year Yamazaki. 
Hell, I might have to do all that even if I don't have a deadline, but I'll at least wait until the pandemic dies down. Anyway, point is, if I only have x months to live I'd find a chef willing to prepare fugu sushi for me. Fugu, as you might already know, comes from the blowfish, whose liver is deadly to humans and so it has to be prepared just right. Not a chance I'd normally take, but what the hell, right? Having never been to Japan, though, the best sushi I ever had was in an obscure town on the Oregon coast whose name escapes me and I can't be arsed to look it up. The second best was in Las Vegas, and in the Before Time, I'd go to that restaurant every time I visited that city. Would I seek out the restaurant featured in the above link? You bet your ass I would. Apparently you can only get there by arranging it through the concierge of a luxury hotel. I'd absolutely do that, even if the place has by that time been passed down to Jiro Ono's son. And now I'm really craving sushi. The third-best sushi I ever had was right here in my town, at a place called Ten. I think it's called that because it costs ten times as much as most other restaurants. But it's totally worth it, and I think I'm going to have to visit again soon. |
You might need to use a private window in your browser for this one, but not because of offensive content. I mean, maybe you'll find the content offensive; I don't know. The Opposite of Toxic Positivity “Tragic optimism” is the search for meaning during the inevitable tragedies of human existence, and is better for us than avoiding darkness and trying to “stay positive.” I haven't mentioned much in here about toxic positivity, which is basically what used to be called Pollyanna syndrome: the idea that we should always maintain a positive mindset. To be clear, I call it bullshit, and apparently I'm not the only one. In fact, I like to do just the opposite. Every silver lining has a cloud. Refusing to look at life’s darkness and avoiding uncomfortable experiences can be detrimental to mental health. This “toxic positivity” is ultimately a denial of reality. Telling someone to “stay positive” in the middle of a global crisis is missing out on an opportunity for growth, not to mention likely to backfire and only make them feel worse. Especially be on the lookout for any sentence that contains the phrase "at least." As in, "I know you just lost your arm in an accident, but at least you still have the other one." As the gratitude researcher Robert Emmons of UC Davis writes, “To deny that life has its share of disappointments, frustrations, losses, hurts, setbacks, and sadness would be unrealistic and untenable. Life is suffering. No amount of positive thinking exercises will change this truth.” 1. "Gratitude researcher?" Fuck right off with that bullshit. 2. "Life is suffering?" Sod back off to Tibet with that crap. The antidote to toxic positivity is “tragic optimism,” a phrase coined by the existential-humanistic psychologist and Holocaust survivor Viktor Frankl. And right now I'm picturing an emo kid with a black t-shirt sporting an image of Frankl. (I had no idea what he looked like. Turns out: exactly as you'd expect.) 
Tragic optimism involves the search for meaning amid the inevitable tragedies of human existence, something far more practical and realistic during these trying times. Meaning? HA! Hang on while I put on my black t-shirt with the picture of Nietzsche. (I do know what he looked like. Epic mustache and all.) The gratitude researcher Lilian Jans-Beken and existential positive psychologist Paul Wong created an “Existential Gratitude Scale” to measure the tendency people have to feel grateful for all of human existence, not just the positive aspects. I'm starting to need booze for this crap. I seriously can't copy any more of this. The article, in my considered opinion, fails to make any meaningful distinction between "toxic positivity" and "tragic optimism." So I'm going to propose a third approach; there are probably many more. What I propose is, as you might expect, drinking... no, comedy. There's a quote I used to use as the introduction to every Comedy newsletter I wrote. It's by Robert A. Heinlein, from Stranger in a Strange Land, and one version of it goes like this: "I've found out why people laugh. They laugh because it hurts so much... because it's the only thing that'll make it stop hurting" ... "But that's not all people laugh at." "Isn't it? Perhaps I don't grok all its fullness yet. But find me something that really makes you laugh... a joke, or anything else- but something that gave you a real belly laugh, not a smile. Then we'll see if there isn't a wrongness in it somewhere and whether you would laugh if the wrongness wasn't there." I first read that book when I was a kid, which is probably one of the reasons why I'm so warped (along with the tunnel scene from Willy Wonka). The quote struck me as true at the time, and my life experience since then has only reinforced its veracity. Think of a joke. Any joke; doesn't matter, as long as it's one that you absolutely laughed at (which means probably not jokes that rely on puns; those are mostly only painful to the victim of the telling of them). 
If you can't think of any, here's a joke that was once ruled to be The Funniest Joke in the World: Two hunters are out in the woods when one of them collapses. He doesn't seem to be breathing and his eyes are glazed. The other guy whips out his phone and calls the emergency services. He gasps, "My friend is dead! What can I do?" The operator says, "Calm down. I can help. First, let's make sure he's dead." There is a silence; then a gunshot is heard. Back on the phone, the guy says, "OK, now what?" Leaving aside for a moment the obvious problems with the delivery (not least of which is the use of passive voice), that joke absolutely relies on tragedy and misunderstanding. I mean, a guy is dead! Not just any guy, but the main character's friend! This would suck if it happened to you, wouldn't it? And the tragedy is compounded by the MC's misunderstanding of the instructions given, something that would surely keep him up at night for the rest of his life. But a whole lot of people laughed at the joke, as is evidenced by its status as World's Funniest Joke. Of course, at this point that joke, and variations of it, have been circulating long enough that it's probably not as funny as it was when it was fresh. Also any time a joke is analyzed like I just did, it becomes not funny. And finally, senses of humor differ; you personally may not find it amusing. So here's another one with less death involved. It's from the same project to find the funniest joke: Sherlock Holmes and Dr Watson were going camping. They pitched their tent under the stars and went to sleep. Sometime in the middle of the night Holmes woke Watson up and said: “Watson, look up at the stars, and tell me what you see.” Watson replied: “I see millions and millions of stars.” Holmes said: “And what do you deduce from that?” Watson replied: “Well, if there are millions of stars, and if even a few of those have planets, it’s quite likely there are some planets like earth out there. 
And if there are a few planets like earth out there, there might also be life.” And Holmes said: “Watson, you idiot, it means that somebody stole our tent.” I'm not going to ruin that one by analysis; hopefully the essential tragedy at the core of the joke is obvious. Yes, comedy also requires the unexpected, but there's also, almost always, an element of what in another context would be tragedy or suffering. I know I've said this sort of thing before, even quoting the same jokes, probably in a Comedy newsletter, but it's been a while and it's relevant to my thesis today. My point being that we all know that life can sometimes suck, and one of the main purposes of humor is to acknowledge this and prepare for it. While it's hard, perhaps even impossible, to find humor when you, personally, have just received bad news, comedy can still work to take the edge off if it's not related to the subject at hand. As Mel Brooks once noted, "Tragedy is when I cut my finger. Comedy is when you fall into an open sewer and die." And bad shit will happen to all of us. As I've said before, my eyesight is deteriorating, and I'm going to have to have cataract surgery. I have a phobia about anything touching my eyeballs (though my eyelids get a pass there). How do I deal with this? I say things like, "Damn, I shouldn't have watched so much porn." There is (quite literally) no "bright side" to losing one's vision. But humor takes some of the sting out of it. Now if you'll excuse me, I have to go shave my palms. |
I can admit when I just don't understand something. This is one of those times. How to be a man Old ideas of manliness make us miserable. Being labelled ‘toxic’ doesn’t help. A reimagined masculinity is the way forward As is often the case for these things, the author is basically trying to sell his books. I've noted before that this is not necessarily a bad thing, especially since a lot of us are writers. The techniques used for selling books are writing exercises in themselves. That said, I still don't fully understand this. I don't know whether it's because I'm completely stupid when it comes to such topics as emotions, gender identity, and how to relate to other people, or if it's a really clever marketing thing: "If you want to know more, buy my book!" And that’s what is needed to be a man today: the freedom to customise one’s gender identity and not be forced into what’s on the rack. I think that's how I live my life? Mostly? I don't know. One essential article we all need in our wardrobe is emotional resiliency. And I probably should have stopped reading there, because I don't have a good idea about what that actually is. We might still buy into the beliefs that we’re supposed to avoid asking for help and that we should not talk about our fears, sadness or emotional isolation. Maybe because talking about such things is a great way to become even more isolated? There is very little more unattractive than depression and loneliness, so admitting to those things is a surefire route to becoming more depressed and lonely because people scatter like roaches in the light. So I'm not going to say much more about the article. It's fairly long and I'd understand if you didn't want to read it. I did, and then I had to go watch videos on quantum physics and mathematics because I understand them better. But I'm putting it here in case a) someone has some insight and b) something sinks in with me and I can go back and revisit it at some later date. Mini-Contest Results! 
I enjoyed all the comments yesterday. Hard to pick just one word, right? Considering the different pronunciations of, for example, through, tough, trough, and thorough, it's a wonder anyone can learn English at all. Some of you mentioned "knight"; I've heard that the word used to be pronounced more like how the French Persons in Holy Grail said it: k'nigk't, or something similar to that. The K and the GH ended up becoming silent and the vowel lengthened. Don't ask me how or why. People are weird. But yeah, place names are probably the worst. I'm reminded of the town in Indiana called Versailles. Anyone with even a passing familiarity with even the slightest bit of French (or some knowledge of WW1) knows to pronounce it like "ver-sigh." But no, not in Indiana, where you have to call it "ver-sails" or they run you out of town on a ryel. So today's MB goes to Pumpkin Spice Sox for the place name nonsense. We'll do this again soon. |
Ever wonder why spelling is so damn hard? Typos, tricks and misprints Why is English spelling so weird and unpredictable? Don’t blame the mix of languages; look to quirks of timing and technology Yeah, I don't know about that, a bunch of stolen words from other languages has to take at least part of the blame. Part of the problem is that English spelling looks deceptively similar to other languages that use the same alphabet but in a much more consistent way. You can spend an afternoon familiarising yourself with the pronunciation rules of Italian, Spanish, German, Swedish, Hungarian, Lithuanian, Polish and many others, and credibly read out a text in that language, even if you don’t understand it. Your pronunciation might be terrible, and the pace, stress and rhythm would be completely off, and no one would mistake you for a native speaker – but you could do it. Even French, notorious for the spelling challenges it presents learners, is consistent enough to meet the bar. There are lots of silent letters, but they’re in predictable places. French has plenty of rules, and exceptions to those rules, but they can all be listed on a reasonable number of pages. Sure, but consider the French translation of "birds": oiseaux. Not one single solitary goddamned letter in that word is pronounced the way it ought to be pronounced. The answer to the weirdness of English has to do with the timing of technology. The rise of printing caught English at a moment when the norms linking spoken and written language were up for grabs, and so could be hijacked by diverse forces and imperatives that didn’t coordinate with each other, or cohere, or even have any distinct goals at all. If the printing press had arrived earlier in the life of English, or later, after some of the upheaval had settled, things might have ended up differently. Okay. This author is a linguist so I'm inclined to give more weight to what they're saying than, for example, what I'm thinking. 
I don't have much more to say about the article, but it's a fascinating brief history of written English. I will say this, though: At some point, the spelling/pronunciation link becomes a shibboleth. I think people use it to identify in-groups. For example, in my area, there's a road with the name Rio Road. We use it to spot tourists. "Yeah, I hear (business) is on Ree-oh Road." Oh, they must be from out of town; the proper pronunciation is Rye-oh. Or there's a nearby town named Staunton. You hear someone pronounce it "stawn-ton" and you know they ain't from around here and need to be watched carefully and maybe lynched. Probably the worst offender in the orthography world, though, is the geoduck. You see a word like that, and you think: oh, it must be gee-oh-duck. And it's probably a bird, right? An... earth bird? Well, obviously you're an ignorant rube and unworthy of respect because you didn't know it's pronounced "gooeyduck" and it's actually an enormous mollusk. How in the inconsistent hell do you get "gooey" from "geo"? I mean, seriously, goddamn, STOP IT. So my contention is a language's inconsistencies are mostly there to fool outsiders to the language. It's like a secret code between club members: there to distinguish the cool people from the out-of-touch. You can also identify nerds with it: people who read more than they ought to tend to pronounce words the way they sound in their heads. For example, I know someone who, upon seeing a picturesque scene, always called it "picture-skew." Because that's a perfectly legitimate reading of the pronunciation of all those letters. Now, I'm not one of the long line of idiots who try (and certainly fail) to "reform" English spelling to something more consistent. 
Such efforts were always doomed to failure, because, as I noted, it's a private club and you can't get in until you familiarize yourself with, at least, 90% of the silly rules, and even then, we'll still mark you as an outsider because you'll certainly have an accent. But dammit, I'm tempted. Merit Badge Mini-Contest! What's your favorite example of a word that ought not to be spelled like it is? Comment here before midnight tonight WDC time (Monday night) and you might get a Merit Badge on Tuesday. |
Entry #5 of 8 for "Journalistic Intentions" [18+]... The weather forecast for tonight predicts incredibly cold temperatures for the first time this fall. Of course, by "incredibly cold," I mean anything less than about 55F; for some reason, some people like those temperatures. See yesterday's entry about being crazy. I hate fall. There I said it. I hate it for the temperature (yes, I hate winter more), and for the incessant screaming about the impending holiday season, but also because of the goddamned leaves. I swept my new deck today because it had accumulated leaves. I went inside for ten minutes and came back to find a new blanket of crunchy, dry leaves on the deck. Having done all the physical exertion I was willing to do for one day, I gave up. But even I can admit that there are some good things about fall. Oktoberfest (which is long over but the beers remain, which is the important thing); some of the pumpkin beers, which are only offered in the fall; and my beer chili. You sense a theme here? I certainly do. My beer chili isn't really mine; I got it from a "quick and easy" recipe book many years ago, but over the centuries, I've adapted it for my own tastes, as one does with chili. For instance, the recipe calls for an optional can of chili peppers. Chili peppers aren't spicy enough, especially the canned crap, but I don't consider hot peppers optional in chili, so I use chopped serranos or, if they're not available, jalapenos. I also throw in a few drops of ghost pepper sauce. And I make it with a dark beer for reasons best left to myself. And yes, my chili includes beans in addition to ground beef. I don't think any Texans reading this know exactly where I live, so I'm probably safe. All of which is to say that, as I've noted before, when I'm first following a recipe, I follow that fucker to the letter. Which is not as easy as it sounds. 
A recipe necessarily incorporates some hidden assumptions; it's not a complete set of instructions for cooking something. There's some basic knowledge assumed, in addition to some basics that most of us take for granted: a heat source, usually; access to running water; the existence in one's house of the proper pots, pans, spoons, dishes, etc., and how to decide which one is appropriate. That said, a recipe that starts with "First, buy or rent a house with a stove" gets real boring real fast, so they apply certain compression algorithms. You don't usually think of them as such, but they exist, just as they exist in all languages. But I digress. The most important thing for me with eating is taste. A close second, though, is ease of preparation; hence the "quick and easy" bit above. (Nutrition is way down the list.) I've seen chili recipes that start with something like "Buy [pepper that only grows in one obscure Mexican state and has to be harvested during a full moon that falls on the equinox]. Dry the pepper for 40 days and 40 nights, then hand-grind it into powder..." I'm sure the resulting chili would be out of this world, but mine takes about an hour from start to finish, including cleanup. Sometimes, as with the chili, I revisit old favorites, but sometimes I like to try new stuff -- again, as long as it's quick. And the recipe in the above link seems to fit the bill: a dish I'd never heard of (there aren't any Indonesian restaurants around here), but with mostly familiar ingredients, and it doesn't seem like it takes all day to prepare. It seems to be basically vegetables in a peanut sauce, and I've had the Thai and Vietnamese versions of that sort of thing -- at restaurants, anyway. As usual, I have Opinions on some aspects of it. This is a dish for the veg-hating child within all of us. Yeah, no, turns out I only hated vegetables when my mom cooked them. 
Sure, I won't eat eggplant (because I don't consider it food at all), but most vegetables suffer from only one drawback: by the time I get them home, they've gone wilty and/or moldy. I should note, however, that I have an intense dislike for peanuts (which aren't nuts and are only vegetables in the most technical sense, the way wheat is a vegetable). But for some reason I'm okay with peanut butter or Asian peanut sauces. Go figure. Gado Gado! We love saying the name, we love how colourful it is, we most definitely love eating it, and we REALLY love how virtuous it makes us feel, scoffing down so many vegetables for dinner! Ever want to smack an article so hard the writer rubs her face? Gado Gado is all about the peanut sauce which is a slight variation of Thai Peanut Sauce. When made from scratch, it’s a bit of a pain, calling for pureeing roasted peanuts (and it’s tough to make it completely smooth), a handful of aromatics like lemongrass, galangal, garlic, South East Asian “umami” from shrimp paste, plus sauces. Now, see, if I had to do all that crap, I'd end up ordering a pizza. Fortunately, the recipe bypasses such nonsense. Here's where I point out, if you haven't read the link already, that the target audience seems to be Australians. Which is fine. I mean, people have to eat no matter where they're from. But I doubt that I'd find everything on the list here in the US. Thai red curry paste – my favourite brand is Maesri. Best, most authentic flavour by far – and happens to be the cheapest at ~$1.50 for a little can. Available at large grocery stores in Australia (Coles, Woolworths, Harris Farms) and of course, Asian stores. Today I learned that Woolworths is a grocery store in Oz. Here in the US, it was a department store, emphasis on the "was." Natural peanut butter – Natural peanut butter is 100% peanuts and has a stronger peanut flavour than commercial peanut butter, which has sugar and other additives. 
Look, I'm about as far from being a granola-cruncher as you can get. Like I said above, my main criterion for eating is taste. And, to put it bluntly, commercial peanut butter in the US tastes like slightly sweet plastic. Jippy or Skif or whatever. Not only do the additives negate any potential health benefits of peanut butter, but it just doesn't taste right unless you're like 8 and eating it on a sandwich your mom hastily slapped together in the morning before you left for school. The downside of "natural" peanut butter is that it generally lacks emulsifiers, so you gotta mix the oil back in when it "naturally" separates out. This is work. It's worth it. Kecap Manis – dark, thick, sweet Indonesian soy sauce. Thicker and sweeter than normal soy sauce, with a consistency like syrup. Here in Australia, kecap manis is available in major supermarkets and Asian stores. Easy sub: honey and dark soy sauce. I've never even seen this, but then, I haven't been looking. Of course Japanese soy sauce is one of my staples, though. The one ingredient you spy in the above that you mightn’t be familiar with is tempeh. Tempeh is an Indonesian fermented soy bean product that vegetarians are mad for! [precious emoji here] Smack. Okay, I don't know much about food availability in Oz (mostly I know you gotta avoid whatever's trying to turn you into food there), but tempeh is definitely a thing here. Confession time: I actually like it. Not as much as I like meat, of course, but I actually find it quite tasty when it's prepared in an appropriate dish. The original article goes on about preparation technique, and includes a video that I didn't bother with because I cannot cook something from a damn video. Fortunately, there's also the standard recipe format below the video. In conclusion, while this recipe is more complicated than my chili (and a lot more complicated than UberEats), it seems interesting enough that I might actually try it sometime. 
And then I can finally say I had Indonesian food. Sort of. I know at least some of you reading this have Indonesian ancestry. If you want, I'll share my latkes recipe sometime so we can engage in mutual cultural appropriation. |
I don't know how I found this article from 2013, but it's unlikely that the age of it makes a difference. What Neuroscience Says About The Link Between Creativity And Madness New research sheds more light on the strong ties between an original mind and a troubled one. Creativity is a thing that eludes me. Madness? Well. I wouldn't know, would I? The idea that very creative people are also a little crazy has been around since humanity’s earliest days. Maybe the article was written before we were supposed to stop using "crazy," which is retarded. In ancient Greece, Plato noted the eccentricities of poets and playwrights, and Aristotle saw that some creative types were also depressives. In modern times, that connection has persisted, from Robert Schumann hearing voices guide his music to Sylvia Plath sticking her head in an oven to Van Gogh cutting off his ear to Michael Jackson … being Michael Jackson. And there you have it, folks: Michael Jackson in the same paragraph as Plato. Today the link between creativity and mental illness is firmly embedded in the public conscience. Unlike some supposed cultural wisdoms, however, there’s a good bit of scientific evidence behind this one. First you'd have to define "crazy." I mean, sure, there are modern psychological definitions of all kinds of mental disorders, but it seems to me that by the common definition of "crazy," in order to be creative, one has to be crazy. Because the word describes someone whose thoughts and actions are outside the "normal" (whatever that is). So all creative people are crazy, but that doesn't mean (sorry, crazy people) that all crazy people are creative. The article, though, mentions particular recognized mental disorders, and I won't copy all of the examples here. 
The new work enhances a theory by Shelley Carson, a Harvard psychologist and author of the book Your Creative Brain, which says that creativity and mental illness share a process called “cognitive disinhibition.” The term is a mouthful, but essentially cognitive disinhibition describes a failure to keep useless data, images, or ideas out of conscious awareness. There is no such thing as useless data. There is only data to be stored away for later use. “[Y]ou have more information in conscious awareness that could be combined and recombined in novel and original ways to come up with creative ideas,” Carson tells Co.Design. Which kind of touches on how I view creativity: it is, at least in part, the ability to make connections that others might miss. I think of it as living a metaphor. That’s why not all creative people are a bit crazy and why not every mentally ill person is especially creative. “It’s not a one-on-one correspondence,” says Carson. In fact, she says, most creative people don’t exhibit severe mental problems at all; rather, the notable examples stick in our minds. Sure, if you use the DSM to diagnose "crazy." My point above still stands in opposition to this assertion: that creative people are crazy by definition. Which brings us back to our list of eccentric artists through the ages. Perhaps genes contributing to mental problems have persisted across humanity in part because they also contribute to superior creativity. That is not how evolution works. Genes persist if the people possessing them live long enough to reproduce, and then reproduce. Having "crazy genes" (the existence of which I doubt in the first place) doesn't necessarily keep someone from finding a mate and making little crazy people. I mean, look around if you need evidence for that. “Even though we know mental illness in and of itself is not conducive to survival of the individual, there may be aspects of mental illness that promote survival in the overall species,” Carson says. 
Sure, profound mental illness may preclude passing on one's genes, but it doesn't have to be profound to exist. Also, at least some mental illnesses are environmentally driven. Anyway, my point in sharing this isn't to rag on their view of evolution; it's that a few craz- er, I mean, creative people read this blog, and you should feel better about being just a little bit whacked. |
Yeah, I know the entry title is clickbait. Hey, it's a legitimate writing skill. Of course, the actual article is from Cracked, and it's only about individual words, which, of course, only have whatever meaning we decide they have at any particular time. Kind of like everything else, actually. The Bible is how we got such words as "spake," "mammon," and of course "Nehemiah." Spake -- wasn't he the cousin Spock never talked about? Also surprised they don't mention Nimrod in this article. You've probably called, or heard someone call, someone a "nimrod" to mean something like "idiot" or "fool." Nimrod was described as "a mighty hunter" in the Bible; Bugs Bunny called Elmer Fudd "Nimrod" ironically and boom, a new usage is born. Praise Bugs! It also contributed some words that we actually use. Well, actually... Apart from a boatload of names, the Bible didn't directly contribute a lot of words; the English translation of it did. Of course it's translated into hundreds of languages now, but the original was written primarily in ancient Hebrew, Greek, and Aramaic. But I think the article is mostly talking about the KJV, which uses language similar to Shakespeare because it was published at about the same time. 4. There's Nothing "Immaculate" About Virginity It seems a lot of people don't know what "immaculate conception" means. Because the Immaculate Conception was not the birth of Jesus to a virgin (an event you can instead conveniently just refer to as "the virgin birth"). It was the conception of Mary herself. Mary's mother was no virgin -- Mary's parents had sex, which is the normal way of making kids -- but the conception of Mary is called "immaculate" because she herself had no original sin. To further complicate matters, "immaculate" is a Latin-root word, and you'll note that Latin isn't in the list of languages in which the Bible was originally written. It was, of course, translated into Latin very early on. 
Bonus knowledge: "macula" in Latin meant "spot," in case, you know, you were looking for a name for your dog. Unborn babies aren't generally known for sinning much, especially if you revere unborn children as much as the Catholic Church does. And yet, an immaculately conceived baby is actually one whose parents had sex just fine but whose own soul is free of sin. Free of original sin, the flaw every other baby starts out with, ever since those ancestors sinned back in Eden. Look, I like Cracked, obviously, but they're not theologians; they're comedians. If the sentences I just quoted seem to contradict themselves, they do; welcome to the world of theology. But rest assured, immaculate conceptions have nothing to do with abstinence, and if you yourself were conceived thanks to good old copulation, there's nothing dirty about that. Religions may have some strange ideas about sex, but no religion says it's dirty for a husband and wife to have sex to make a child (least of all Catholicism, which is super into marital sex). Any religion that does claim that will have some trouble creating new followers. There have absolutely been religions that say this, and in accordance with the prophecy, they had trouble creating new followers. In any case, as we've seen numerous times (see also "decimate"), the meanings of words and phrases change over time, so unless you're a Bible literalist, at some point you're going to have to go with the flow and accept that Anakin Skywalker had an immaculate conception. 3. There's Nothing Especially Good About Samaritans (Though Also Nothing Bad) This one has bugged me for a while, because unlike mere words, "Samaritan" denotes a particular group of people, specifically the group living circa 30 C.E. The most famous parable in the Bible is surely the one about the Good Samaritan, the guy who helped a battered robbery victim after various other travelers turned a blind eye. 
And so today, anytime someone does a good deed, you might call them a "good Samaritan." Or, just call them a Samaritan -- "good Samaritan" feels redundant. There's even a charity simply called "the Samaritans." This is like calling your group "the Edomites" or "the Third Dynasty Egyptians." Samaritans were an ethnic group. Still are an ethnic group, actually, which is why it's kinda weird that we use the word to refer to anything other than the real people today. Okay, so their descendants are in fact still around. Yes, I learned something. In a Biblical context, Samaritans were mainly relevant for being an ethnic group Jews didn't much like. I think a modern parallel would be how a lot of people in Europe feel about the Roma or... well, you know, the Jews. The reasons for this prejudice were long and complicated, so all you need to know is that, as with every interethnic political feud, there was never any good reason to hate individual members of the maligned group. I'm just leaving this quote here because it's vitally important to remember. It would be like if, today, someone teaches a lesson by telling a story about three men locked out of their homes. The first man, a priest, climbs up a drainpipe to an unlocked window, slips, and breaks his neck. The second man, a rabbi, drives his pickup truck straight through the closed garage door, causing much property damage in his quest to gain entry. But the third man is Florida Man, and despite what you'd think, he phones a locksmith and has his door opened professionally. This is the story of the Prudent Florida Man, a lesson that subverts our expectations. That's... almost... a good joke, which is in fact what one would expect from a comedy writer. As years go by, now imagine people start calling various prudent people "prudent Florida Man," and then simply "Florida Man," because that's what they now think “Florida Man” means. 
That means the parable does a hell of a good job at breaking the original Florida Man stereotype, breaking it so hard it's eventually forgotten. But it also forgets half the original message of Jesus' parable, which is "just in general, don't be racist." And again, just, you know, leaving this here. 2. Every Prodigal Son You Can Think Of Probably Isn't Prodigal Over the years, people often quipped "the prodigal son returns" when someone returned, so "prodigal son" itself became a phrase that means "one who returns." It's funny how often that sort of thing happens with language. Like how we said "good Samaritan" so much, we started thinking "Samaritan" means "someone good." Or, say, what happened with the phrase "damsel in distress." We used the phrase "damsel in distress" so much, we started thinking "damsel" itself means "woman in distress." Some people even now use the word damseling when they want to say turning a woman into a woman in distress. And yet damsel really just means "young woman." As an aside, so does the word in Isaiah 7:14, which is usually translated as "virgin." That particular verse wasn't a prophecy; the writer of Isaiah was basically saying "when pigs fly." But anyway, back to prodigal. What does prodigal really mean, you ask? It means "spends wastefully." Meanings change, yes, but I do think it's important to know how they change. In fact, in the news media, "prodigal son" is often misused for someone who leaves home and makes a lot of money, which might be the opposite of being prodigal. Not the first time a word has come to mean its opposite, and likely not the last. See also: cleave. 1. "Vengeance Is Mine" Means Don't Seek Vengeance Here's another phrase that appears a lot in pop culture. "Vengeance is Mine" is the title of a dozen different movies, as well as TV episodes, books, and songs. Always, the title comes from someone getting revenge. Take the classic Roald Dahl story "Vengeance Is Mine, Inc." 
Two broke brothers (prodigal ones, perhaps) decide on a moneymaking scheme: taking revenge on clients' enemies, such as by stripping victims to their underpants and then dumping them on a public street. We see them carry out one job, breaking a newspaper columnist's nose, and then they ride off into the sunset. I mean, okay, I admit to being misled by most of these, but I always knew about the "...sayeth the Lord" completion of that particular Bible quote. Which makes sense. Most religions believe sinners get punished after death. Well, maybe not Judaism, so there could be room for vengeance if you take an Old Testament view, but other religions preach some form of hell, or karma, or reincarnation. That means religious people should see no need for exacting vengeance on those who are still alive, since that'll be taken care of supernaturally in the afterlife. And yet, it's often religious people who most believe that it's our duty to punish the wicked. No, that is not what most Jews believe about vengeance, and thinking that they do contributes to centuries of Judaeophobia (a word I coined because everything has to have a -phobia ending these days, so let's be inclusive here), so stop that. There are other reasons for punishment (deterrence, rehabilitation, incapacitation), but it's retribution (vengeance) that so many moral crusaders demand when they call for harsh sentences. Well, the next time you hear a Bible thumper say we need to fry a criminal because it's what they deserve, ask, "Have you so little faith in our Lord in heaven, that you doubt He will punish evildoers as He sees fit?" On second thought, don't say that. That's how Jesus talked, and we all know what they did to that guy. So there you have it, and yes, sometimes it pays to get your theological instruction from a comedy writer. On a personal note, I think I've fully recovered from my procedure yesterday morning, and everything went well. 
So, bad news, you'll probably have to put up with me for a while longer. Probably. |
Very short article today, which is good because I don't really have much energy to comment on it. Mathematicians have solved traffic jams, and they’re begging cities to listen Mathematicians are unimpressed by engineers’ solutions. I know the headline talks about engineering and mathematics, but it's about something most of us have to deal with on a personal level. Most traffic jams are unnecessary, and this deeply irks mathematicians who specialize in traffic flow. Not as much as it irks drivers. Krylatov would like to solve urban traffic jams forever, so much so that he has coauthored a book of new math approaches to traffic and ways to implement them. I've said before that I don't mind people using articles to promote their book. This might be an exception. Few people who read this article are in a position to give a damn what the book says. 1. All drivers need to be on the same navigation system. Yeah, that's going to happen. Maybe when people get over their fear of autonomous vehicles, which will be shortly after pigs fly and right before hell freezes over. I had a rant prepared in my mind about that, but now it's going to have to wait. 2. Parking bans. Many urban roads are too narrow and cannot be physically widened. What's the damn point of driving anywhere and suffering through traffic jams if there won't be a place to park? 3. Green lanes. For cities that want to increase electric car use, special lanes should be created for electric cars, providing an incentive for their use. Or, you know, you could put the parking back in. 4. Digital twins. Traffic demands and available infrastructure can only be balanced with digital modeling that creates an entire “twin” of existing roadways. I know I haven't messed with traffic engineering for many years, but this bit made no sense to me. And just in case you were wondering, “The mathematical approach in this case is superior to the engineering and economic one.” Look. Yes, it's been a while for me. 
But one thing hasn't changed since I went to engineering school: Engineers do not ignore mathematics. It's, like, an integral part of what they do. If a mathematician comes up with better traffic modeling ideas, I guarantee you it's not the engineers who aren't listening. It's the politicians. |
It's a good thing I'm tackling this now. Today and going into tomorrow, I'll need to be on a liquid diet. And not my usual liquid diet, either. So I expect to be mind-destroyingly hungry, which is never how I want to be when I'm reading or writing about food. Halfway through "Journalistic Intentions" [18+] with American Butchery A while back, I encountered an article that talked about a food vendor that had been excavated in Pompeii. I don't remember what the original article was, but here's NPR's take on it. The fascinating thing about this, for me anyway, wasn't just how well-preserved the building was; that's pretty normal for Pompeii from what I understand. Nor was it how they were able to identify nearly 2,000-year-old remnants of the food served there. No, my takeaway (pun intended) from this find is how, even that long ago, people went to what are essentially fast-food restaurants. While I don't think they had drive-thru windows for chariots, the idea of walking into a place and ordering prepared food from a picture menu is not, as some would insist, an especially new, modern, or American invention. The defining feature of civilization is specialization. And civil engineering, of course, but that's really part of specialization (they call prostitution the "oldest profession," but someone had to build a road to the whorehouse, and that someone was a civil engineer). Specialization enables a person to focus on one or a few jobs, and do them well, rather than trying to do everything themselves and fucking up most of it. With specialization came the need to trade for services. At first, this was probably some kind of barter system, but when coinage was invented, things became much easier. Say you're a blacksmith, and you want, I don't know, wood for a tool handle. Sure, you can trade an axe head or two for some of the chopped wood, but what if the woodcutter already has all the axes he needs? Then you'd have to figure out what the woodcutter actually wants. 
A new pig, maybe. So then you have to make a more complex trade: horseshoes to a farmer for a pig, and then you gotta haul the pig out to the woodcutter to get the lumber you need. Coinage simplifies the process: you simply pay the woodcutter with shiny metal portraits of the Emperor or whatever, and then the woodcutter can pay for his own damn pig, leaving you more time to make swords or plowshares or whatever. When it comes to specialization, though, few human needs are more basic than having food to eat. Hence, the butcher. Now, I know a guy who bought a farm. Not the farm, but a farm; he was very much alive last time I checked, which was about a week ago. The idea was he and his wife could have their own livestock, and learn how to slaughter them for eatin'. Lots of people do that without necessarily owning a farm; I've also known quite a few deer hunters who were well-versed in the arcane practice of field-dressing venison. But these two bit off more than they could chew, as they needed to have actual 21st century jobs in addition to the whole farm thing, and as I could have told them if they'd asked, trying to run a farm is a full-time job in itself (which is why I don't do it). So they took to delivering their livestock to a local butcher, who turned pigs into chops and delicious bacon. Well, it would have been delicious if they hadn't oversalted the meat, but that's another story. I think they made it an ethical choice: if we're going to eat meat, we should go whole hog (look, just assume every pun in this entry is intended; it's easier that way) and raise the animals ourselves, and take them through every step of the process. We're invested, that way. Of course, not eating meat wasn't an option, because meat is tasty. Then they lost the farm, moved to a suburb, then to the middle of a city, and now they're back in the countryside again, but they get their meat from a grocery store as Nature intended. What's my point? 
Well, I'm not sure that I have one, except, to get all pretentious about it, plus ça change, plus c'est la même chose. Everything sounds more pretentious in French, except maybe the word for seal (the animal with the flippers), which is phoque, and that's pronounced exactly like a certain four-letter anglo-saxonism. Also, though, I think I'm trying to say that my friend and his wife may have had a point: it's good to know where we're ultimately getting our food. As I've mentioned too many times, I spent my childhood on a farm, so I had a pretty damn good idea already, and yet I have no problem buying ground beef at a supermarket. This guy grew up in a suburb or city or something, so didn't have that first-hand knowledge. Some people learn the awful truth and swear off meat entirely. I don't do that myself, obviously, but I understand the ethics behind it. The steaks can get pretty high. (Okay, yeah, I forced that one.) Hell, if science ever manages to realize the potential of vat-grown meats, I'd totally switch to that. But either way, I think it's worse to mentally divorce yourself from the process of turning a steer into delicious steaks and hamburgers. Consequently, I took the time to watch the entire video that served as the prompt for today's entry, and I'm linking it here. |
It's almost like my random number generator has become sentient. The Body’s Most Embarrassing Organ Is an Evolutionary Marvel And yet we have very little idea where anuses come from. I'm still kind of drunk from my journey to the drafthouse cinema today (I hope I remember to do a movie review at the bottom), so it's Confession Time. Confession #1: This article has been hanging around in my queue for a while, but it popped up this week. Why is this significant? Because, being old, I have a colonoscopy scheduled this week, so it's relevant. Go look at the article. Look at it. The top picture is a cat butt, something I'm way more familiar with than I'd prefer. You might even call it... a cat-ass-trophy. You know, if you've been drinking, that is. To peer into the soul of a sea cucumber, don’t look to its face; it doesn’t have one. Gently turn that blobby body around, and gaze deep into its marvelous, multifunctional anus. No, thanks. Confession #2: I don't like butts. I just don't get the sexualization of them I see everywhere. "Wow, nice ass." "So?" Butts are vehicles for shit, and that's all they are. Okay, they're also moderately useful for sitting on -- what would I do if I couldn't sit on my ass all day, every day? -- but that's ancillary to their only real purpose, which is waste elimination. I'm not trying to kink-shame anybody here. If that's what you're into, hey, you do you. But for me, I don't care -- male, female, trans, whatever, I don't want to see your ass. I have what I consider to be a perfectly reasonable aversion to shit, no matter whose ass it comes out of. One of the main reasons I don't have a dog is I'd have to walk the little bastard and carry plastic bags with me. Cats? I can scoop that shit up without touching it. Yes, even through a plastic bag I'm still touching it. Nope. This also applies to life forms that are as different from human as they can be; e.g., the sea cucumber. 
The sea cucumber's posterior is so much more than an exit hole for digestive waste. It is also a makeshift mouth that gobbles up bits of algae; a faux lung, latticed with tubes that exchange gas with the surrounding water; and a weapon that, in the presence of danger, can launch a sticky, stringy web of internal organs to entangle predators. It can even, on occasion, be a home for shimmering pearlfish, which wriggle inside the bum when it billows open to breathe. I could have gone my entire life without knowing this, but since I do, I have to inflict it on you readers, too. It would not be inaccurate to describe a sea cucumber as an extraordinary anus that just so happens to have a body around it. Oh, so, kind of like a politician or a talk radio host. Bodily taboos have turned anuses across the tree of life into cultural underdogs, and scientific ones too: Not many researchers vocally count themselves among the world’s anus enthusiasts, which, according to the proud few, creates a bit of a blind spot—one that keeps us from understanding a fundamental aspect of our own biology. And look, I'm not ignorant of biology. Hell, I take my username from a carrion-eater. I'm aware of the functionality and even appreciate it. I just don't need to go anywhere near it. The appearance of the anus was momentous in animal evolution, turning a one-hole digestive sac into an open-ended tunnel. I'm also peripherally aware of the mathematical discipline of topology. In topology, as I understand it, they only care about things like how many perforations a shape has. You have a sphere, for example, which is topologically equivalent to a cube, a dodecahedron, or even a cylinder -- because it doesn't have any holes. Then you have shapes with one hole: a toroid, shaped like a bagel (or I could say donut, but my people invented bagels so I'm going with that), for example, which is topologically equivalent to... a teacup (or I could say coffee cup, but I dislike coffee too). 
The point being that both a bagel and a teacup only have one perforation. And when you think about it, humans (and most other animals) are topologically equivalent to bagels. So I try not to think about it. But anuses are also shrouded in scientific intrigue, and a fair bit of squabbling. Researchers still hotly debate how and when exactly the anus first arose, and the number of times the orifice was acquired or lost across different species. To tap into our origins, we’ll need to take a squarer look at our ends. Confession #3: The only reason I put up with this article at all was in hope of butt puns, and I wasn't disappointed. One of the oldest hypotheses holds that the anus and the mouth originated from the same solo opening, which elongated, then caved in at the center and split itself in two. It's rather important here to understand that this is evolution-speak shorthand for changes that occurred in animal species over a long, long evolutionary time, and not something that literally happened to a single organism over its lifespan. Cloacae are fixtures among birds, reptiles, and amphibians, and although they tend to get a bad rap, their internal architecture is actually quite sophisticated, Patricia Brennan, a cloaca expert at Mount Holyoke College, in Massachusetts, told me. "Glad you could join me for dinner. What do you do for a living?" "I'm a cloaca expert." "Check, please!" Whatever the reason behind it, the partitioning that did away with the cloaca made human anuses, as Manafzadeh said, “completely boring.” As far as exit holes go, ours are standard-issue, capable of little more than extruding waste from the gut, with no frills to speak of. See? I told you. The only redeeming quality of humans’ humdrum posterior hole is the feature we evolved to cushion it: our infamous buttocks, the most voluminous one documented to date, thanks to our bizarre tendency to strut around on our two primate legs. 
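Since the topology aside above is doing actual math, here's the tidy formula behind the hole-counting: for a closed surface built out of a polygon mesh, the Euler characteristic V - E + F determines the genus (the number of holes). A minimal sketch in Python; the mesh counts below are standard textbook values, not anything from the article:

```python
# Euler characteristic chi = V - E + F of a closed surface mesh;
# the genus g = (2 - chi) / 2 counts the "perforations" topology cares about.

def genus(vertices: int, edges: int, faces: int) -> int:
    """Genus of a closed orientable surface, from its mesh counts."""
    chi = vertices - edges + faces
    return (2 - chi) // 2

# A cube (topologically a sphere): 8 vertices, 12 edges, 6 faces.
print(genus(8, 12, 6))   # 0 holes: sphere, cube, dodecahedron alike

# A 3x3 quad grid glued into a torus: 9 vertices, 18 edges, 9 faces.
print(genus(9, 18, 9))   # 1 hole: bagel, teacup
```

The sphere family comes out genus 0 and the bagel/teacup family genus 1, which is precisely the sense in which a bagel "equals" a teacup; any mesh of the same surface gives the same answer.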
In the Before Time, when I was still going to the germ-infested hellhole known as a "fitness center" or "gym," I'd see videos on screens. It's about the only time I ever saw ads, because I avoid them with the same single-minded focus that I avoid assholes, and for generally the same reasons. Unfortunately, one time, I happened to see an ad for... I don't know what it was for, because the volume was off (you can select your own channel for audio; I chose to ignore all of them and instead watch videos about physics, math, and sometimes writing on my mobile). This ad featured butts. Not just any butts. Female butts. And for some reason, they featured expanded, unnatural-looking, enormous butts. I don't mean fat, either; I'm not body-shaming here any more than I'm kink-shaming. But it gave me the impression that it's somehow fashionable to make your buttocks look bigger, which, all my life, I'd been told was the polar opposite of what's attractive. "Does this dress make my ass look big?" "Why, no, honey." "Oh, good." But now? Apparently now, big butts are back in style, and I'm in hell. It's been a couple of weeks, but there was finally a movie I wanted to see. Because of my upcoming medical procedure, though, I was unable to munch on popcorn, as is my habit while watching a movie in a theater. They didn't tell me I couldn't have beer, though. Or a shaken-not-stirred martini. Okay, two martinis. One-Sentence Movie Review: No Time to Die While previous Daniel Craig Bond movies didn't do much for me because of their impenetrable plots, this one is notable not just for the acting, stunts, and more straightforward story, but because it features some of the best camera work I've ever seen -- almost enough to distract from the actual content. Rating: 4.5/5 |