“Woman-to-Woman-to…Huberman:” What journalism looks like from the inside 

This is written by a woman (and friend of mine!) who wishes to remain anonymous

When the article about Andrew Huberman was published in March, I wasn’t surprised, because “Sarah” had contacted me months before, seeking answers from women who, she says, didn’t know about her even though they were having sex with her then-partner, Andrew.

On a Sunday night in February, I received a text from an unknown number—the texter introduced herself as “Sarah,” the woman Andrew had shared his life with for the last five years. She shared deeply personal, deleterious, and unsubstantiated details about Andrew cheating on her with four to ten women, spreading rumors about her, and verbally abusing her. She assumed I was one of those hapless women, and she apologized for being the one to tell me. She was cloaking gossip in virtue, though she reassured me that she held no ill will against me and saw me as another victim. All of this came before she ended her opening message by asking me, “woman-to-woman,” to bring her comfort and closure by admitting that I was sleeping with Andrew.

I felt a mix of shame, suspicion, confusion, hurt, degradation, empathy, and curiosity. Who was on the other end of this message? Could I trust her, or him, or them? Was this a trap? What sort of nightmare love triangle had Andrew dragged me into? “Triangle” is probably not even the right geometric shape. I asked her how she found my contact information. It was hard not to feel solidarity with her: she was kind and spilling her guts about her heartbreak—yet her approach was unapologetically intrusive and felt manipulative.

Sarah said she found my name in Andrew’s journal one day and instinctively took a photo of the page and later googled me to find my number. When I asked why she’d assume I had an “affair” with Andrew after reading my name in his journal, she replied: “Because of him talking about a long-term relationship…with somebody beautiful. I looked at your picture and you seemed beautiful and private.” 

I admit some susceptibility to flattery, and yet it was as if Sarah thought I owed her answers regarding my relationship with Andrew. After I felt confident that this was the woman that Andrew had been seeing for the last few years, I told her that I’d not seen Andrew since before the pandemic. She rapid-fire texted: 

  • “So he cheated on me with you in the early part of our relationship?”
    • No, I’ve been in relationships. 
  • “Oh, he reached out, but you didn’t accept.”
    • No.
  • “Were you in a relationship with him? Or was it just more casual?”

I told Sarah I’d not been romantically involved with Andrew since before their relationship started in ~2018. 

She declared how relieved she felt and we discussed in limited detail our histories with Andrew. Sarah said nothing about going to the press and I said I wasn’t interested in any sort of PR takedown of him. It’s possible she wasn’t planning to at that moment, but I felt she had an agenda beyond closure. I thought she wanted revenge.

I told her I’d known Andrew for nearly 20 years and was aware that he had some struggles in his relationships—and don’t we all! Sarah said she was also aware of his past. Despite what she said earlier in her texts, Andrew had nothing but very positive things to say about her whenever I spoke to him. He told me all about their struggles with fertility and how much he loved their shared life with her children from a former marriage. While Andrew and I had dated off and on for many years, he did not reach out to me for anything romantic when they were together, indicating to me that he must have been quite committed to and in love with her. I expressed compassion and empathy for her and for any woman he’s not been truthful with, but I also expressed sympathy for Andrew because I know that, despite himself, he wants a life partner. The whole thing seemed sad to me.

After Sarah realized I saw Andrew as more than what she and these other women experienced (he is more than that), she acknowledged Andrew was generous and kind with her in many ways throughout their relationship. She repeated to me that part of her healing process is knowing the full truth. I am sure she meant this, though I don’t know where she picked up this notion or how she knows it’s true. I think she should read Esther Perel’s books. The sense that she was seeking more than “healing” persisted. The truth came out when the article hit. 

I can’t decide what stood out to me more when I first read it: that Sarah cherry-picked whose contact info she provided to Kerry Howley, conveniently excluding me, or that a story which doesn’t amount to much more than a gossip column about an accomplished neuroscientist-turned-podcaster’s propensity for wandering made the front cover of New York Magazine. There’s no abuse of power, no exploitation, no inspirational story of female empowerment—there’s simply an opportunistic journalist writing an unflattering portrayal of Andrew Huberman as a narcissistic, philandering liar. Is someone’s admittedly salacious private life news? 

Howley might’ve squandered an opportunity to empower women who may have felt powerless in their relationships, or perhaps to open a dialogue about the complexity of human relationships gone awry. Something about how these women found themselves involved and, in some cases, in love with a man who seemed unreliable and even deceptive in his personal life while earning a public reputation as thoughtful, insightful, and charming. Instead of complexity, she chose simplicity. Howley didn’t explore the characters or backgrounds of the women in this story. Who are they? What were they seeking? Had she done more due diligence of her own, Howley would’ve at least alluded to the background of one of them whose company was investigated for consumer fraud and sued by former employees for wage theft—clear instances of deception and abuse of power. The latter of the two was settled out of court, and, as they say, guilty people don’t settle (looking at you, Michael Jackson).

Instead, Howley wrote about a series of anonymous women who say they thought they were in a monogamous relationship with a man, only to find out it was not monogamous at all. She highlighted how he repeated the same lines over and over again to these women. A lot of the language sounded familiar to me—oh wait, that’s because I’ve known Andrew for years. I’m pretty certain my vernacular doesn’t reinvent itself every time I’m in a new relationship, and I’m pretty sure that’s true of most people. The article also includes a number of barely corroborated, seemingly petty things Andrew lied about to demonstrate his supposed lack of moral compass. One that stood out to me was that he lied about living in Piedmont, a wealthy enclave in the East Bay. Andrew’s home, while technically not part of the Piedmont zip code, was a literal stone’s throw away. The article felt like a jilted lover’s fantasy come true: an exposé detailing every dark and mortifying secret about your cheating ex.

Perhaps there just wasn’t a great story to tell and that’s why it merely reads as gossip. Were Howley and the New York Magazine editor also duped into sleeping with Andrew Huberman under the guise of monogamy and a great future together? Did they do it anyway, for the story?

Look, I get it. I have a pretty deep well of empathy for a woman scorned; I tell friends that I’ll provide transportation across international borders should they seek revenge and need to make a quick getaway. What I really want my friends to know when I make that joke is, if anyone ever betrays their trust, I’ll empathize with their feelings of anger and hurt and won’t judge them for acting out while they process it. 

I’ve been inspired by women who seek revenge on their exes, particularly when they empower themselves as women in the process. The difference between empowerment and disempowerment is important. One such example is the article that Justine Musk penned herself about her ex-husband, Elon Musk. Justine didn’t write this anonymously or use it as an opportunity to unearth gossip from all corners of Musk’s life (even though I think he deserved it then and deserves it even more now), weaving together a hit-piece without any substantive commentary on the complexities of life and relationships. Justine bravely laid bare her participation in the slow relinquishing of her own identity and career in support of her talented but painfully insecure partner, who turned around and dumped her anyway. The story inspires because it’s multifaceted, introspective, and offers insight into how someone might find themself in that exact same situation. And perhaps a roadmap to escape it.

The Sarah I communicated with in February sounded capable of writing something more cogent and inspiring. Something revealing, and introspective, while also untangling the complexities of getting involved with someone we can’t fully trust. I think Howley failed her by turning this article into the hack job that it is. I don’t know whether Sarah or these other women found closure or peace of mind by participating. I can’t help but feel like this article could serve as a lesson for Andrew and for them, but one that the author failed to articulate anywhere among its 10,000 words. What stories aren’t being told as this one is? What would someone with a broader, more humane vision of the world than Howley’s have done with the material? If we’re going to talk about lying, why don’t we talk about Sarah’s motives, and what she said when she approached women on Howley’s behalf? Why aren’t we looking into the relationship between Sarah and Howley? 

Much of the legacy media has turned into a hit-piece machine. It’s sad, but also common, and yet I still think many people don’t realize how the media sausage gets made. Once a journalist has a point of view, they often act like a prosecutor. We saw what the New York Times did to Astral Codex Ten writer Scott Alexander. Now we have this attack against Huberman. I don’t condone his dating habits, but I also don’t think this amounts to a public story. Ryan Holiday published Trust Me, I’m Lying: Confessions of a Media Manipulator back in 2012. A dozen years later, it remains distressingly relevant. I want someone to investigate Howley and Sarah, and tell us how the article came together. That’s the story that’s most important to the public interest, because so many of the media’s sleazy operations are cloaked in secrecy. I can reveal just a little bit of that story: “journalism” can pretend to be a private story when it’s actually prep for a public social attack.

Many of us have unfortunate periods in our romantic histories, or pathologies we battle in our relationships today. But if you become famous, you become a target for the Howleys and New York Magazines of the world.

I don’t think there is a there there with this story. I think Sarah and Andrew did have real love and a real relationship, and she knows Andrew. She knows about his childhood, about his struggles to get where he is, about his deep desire for a loving family, regardless of how much Howley attempted to undermine and trivialize it. I’ve had men betray me and I’ve fantasized about their personal or professional demise. But over time I’ve come to see them more fully. They are more than the hurt they caused me, and they were more to me than the hurt they caused me. Someone once told me that the only thing more emotionally damaging than feeling abandoned or betrayed by someone you trust is abandoning your own sense of truth and morality. I believe that. But, if you’re my friend or a woman in need and your man has cheated on you, you know where to find me if you need a getaway car.

Will things get better? Suicide and the possibility of waiting to find out

If you find this piece worthwhile, consider the Go Fund Me that’s funding ongoing care.

Suicide is a one-way valve: once done, it can’t be undone. I’d known the May 25 surgery that took my tongue would be hard and have a long recovery period, but I didn’t understand what “hard” and “long” truly meant, and during that post-operative June and July, when the level of physical misery was not, for me, compatible with life—not long term—I told Bess about “the question.” But if I delayed, the choice could always be made later. Knowing the option for exit remained allowed me to keep living, or whatever that simulacrum of living was, to see how things played out, despite how bleak life was. Many burdens can be borne for a short time, provided that there’s legitimate hope for a brighter future. Maybe there was. Maybe there wasn’t. I wouldn’t know if I was dead.

Back then I’d look at the man in the mirror, bloated, hideous, covered with stitches, and think, though I knew the answer: who or what is that? I’d expected to lose half my tongue to cancer, but when I awoke from surgery, I discovered the whole thing gone, along with some important nerves in my neck. For more than a month, I wasn’t able to breathe comfortably. Mucus production dominated my life, apparently due in part to the loss of those nerves. The days I spent in the hospital after the surgery were among the darkest in my life, and all the darker because of a thought: What if it doesn’t get better than this? The question wasn’t rhetorical. I saw the answer whenever I looked into the void.

Though I knew the answer, I didn’t like it. Worse than what I saw in the mirror was what I felt: an inability to be comfortable, in any position, anywhere. Breathing hurt, and I felt like I was drowning all the time. It wasn’t possible to clear sufficient mucus from my airway or nasal passages to breathe. Waterboarding is a form of torture, and, while I hope never to experience it directly, the descriptions I’ve read of it resonate with what I felt after the surgery. I was dependent on machines to keep me relatively alive. One day I hope man and machine can merge in a beautiful symbiosis, but my partial merger with the machine world was not like that—yes, they kept me alive, but I was fighting them, and they were fighting me, rather than us working together towards some greater mechanical whole.

If anything kept me alive, it was Bess. Every moment hurt, but I saw how fiercely she clung to the idea that things might get better. She was so diligent about caring for my wounds, cleaning the surgical sites, and monitoring my progress; it had to be because she expected progress. She might’ve been subconsciously motivated because she’s a doctor and can’t ignore a medical task, or, alternatively, she was deluded by love and false hope. But her own optimism helped me understand there was a chance things would get better, however much everything, moment by moment, hurt. Which was good, because things hurt. A lot. I breathed through a trach tube in my throat that was constantly clogging and suffocating me. Pushing bags of liquid food through my PEG tube using a pump was a relentless struggle. I barely had the energy to walk across the room. The level of absolute, continuous exhaustion is hard to convey to anyone who’s not been through something analogous. With normal exhaustion, sleep is curative. I couldn’t even sleep well because I couldn’t breathe well.

The pain wasn’t solely physical; it was also the pain of trying to understand where I fit into the world and how to live; not just existentially but quite literally how to manage simple day-to-day tasks that were now impossible. When I got out of the hospital, I immediately faced a barrage of fucked-up bureaucracy: the hospital and medical suppliers kept calling me and wouldn’t talk to Bess without my verbal consent, which I couldn’t give, because I couldn’t speak. Insurance wanted to fight. We weren’t sent home with the right food pump. It took two weeks to get said pump. Most adults figure out how to exist in the sometimes-insufferably bureaucratic society we inhabit; I couldn’t do so, because I couldn’t speak, or think, or move. David Brooks just wrote an essay, “Death by a Thousand Papercuts,” that captures a little of what I felt:

[Administrators’] power is similar to what Annie Lowrey of The Atlantic has called the “time tax.” If you’ve ever fought a health care, corporate or university bureaucracy, you quickly realize you don’t have the time for it, so you give up. I don’t know about you, but my health insurer sometimes denies my family coverage for things that seem like obvious necessities, but I let it go unless it’s a major expense. I calculate that my time is more valuable.

My time wasn’t valuable,[1] yet I still lacked the means to pay the time tax. I was already suffering so severely in the physical realm that I didn’t have the wherewithal to fight for the pump and the food and medications. Even now, I’m facing potential mystery bills generated by United Healthcare; the person at the Mayo Clinic who is supposed to interface with the specialty pharmacy says the specialty pharmacy won’t talk to her[2], and, while the specialty pharmacy hasn’t generated any bills directly to me yet, I sense that they’re coming. Maybe it sounds absurd to be talking of bureaucracy in an essay about suicide, but probably it makes sense to anyone whose entire life has ever been at the mercy of one[3]. Bureaucracy can be a form of exhaustion and misery. It eats at your resolve. It’s its own kind of slow death.

During the summer, I couldn’t see a way forward towards a better life, and I knew that if I couldn’t get to a better, more tolerable life, I wouldn’t want to live further. Bess worried horribly about me, though I did promise her that I wouldn’t leave without telling her first. She worked frantically to keep me here, and to make life as good as it could be, given the privations of the surgery and cancer. She did as well as anyone could. But the suffering persisted. I don’t know precisely where the line was between “tolerable” and “intolerable” except that I was on the wrong side immediately after the surgery. Probably each person has to decide for him or herself where the line is. I don’t generally favor suicide—I prefer hope to despair, life to death, success to failure—but I don’t consider it taboo or unthinkable, either. Life and human consciousness are in general good, and, as far as we can tell, rare in the universe. They should be fostered, though not at the expense of all other values, and not at any cost.

In the months after the surgery, I felt like I had no slack—no physical slack, no energetic slack, no intellectual slack. I hardly had the ability to do anything or to think anything. Commonplace tasks felt like climbing the Himalayas. And I was besieged by tasks: doctor appointments, wound care, antibiotics, food, managing the healthcare team and system. I didn’t have energy or attention for anything. Life’s pleasures, whether normal or small, weren’t available: sleep, rest, food, coffee, sex, showers. I was technically alive but felt like I shouldn’t be.

There’s a weird tendency for people to view others persisting despite suffering as if they’re watching the vapid inspiration videos infesting social media like so many varmints. They fantasize that suffering serves a purpose. It teaches us…something, beyond itself, I guess. Wisdom, or something. I think that’s true of some kinds of suffering, like completing a project at the limits of one’s abilities, or other activities that generate mental fortitude and knowledge. Other kinds of suffering, like medical suffering, seem more pointless. I’ve learned that medical suffering sucks, but I knew that going in. I don’t think I’m a better or wiser or more enriched person for having been through what I’ve been through; I’ve just been miserable. That kind of adversity isn’t worth the price.

I could construct a bogus story in which I’ve learned from the suffering of the last year, but I don’t think it’d be true. It’d just be a form of cope. Bess confirms that, for every person she sees who beatifically (and irrationally) convinces themselves that their suffering has a purpose, there are five more who are miserable and mean about the hand they’re dealt. She confirms I’m not miserable or mean,[4] but I am a realist. If I’ve learned anything, it’s what I already knew: technology is good; cancer is bad; using technology to defeat cancer and other forms of human immiseration is good. We should accelerate technological progress in the pursuit of improving human flourishing. In another world, a world with less FDA intransigence and blockage, I’d have gotten Transgene’s TG4050 cancer vaccine after my first surgery, and it would’ve prevented the recurrence that took my tongue. Fortunately, the FDA has been diligently protecting me from being harmed, and it has thus ensured that cancer will kill me. Thank you, FDA.

If suffering has done anything, it’s made me more willing to speak out for the importance of technological acceleration, and for the need to give people the option to take more risks and block fewer technologies. We can’t build AI to improve the human condition soon enough. Forty thousand people a year die in car crashes; if AI plus LIDAR leads to self-driving cars, great. Mobileye and Luminar are leaders in self-driving cars, but the other efforts to build out AI and, eventually, the machine god, shouldn’t be discounted.

I don’t know when I consciously realized that I might be doing well enough to ask myself more questions about how I might live as opposed to when I might choose to die—probably sometime in August or September. Improvements have been slow—so slow. I learned to swallow slurries again. For a long time, every swallow was a struggle. I choked so severely on water in late July or early August that I thought I might die. Bess witnessed it, and pounded on my back to attempt to help me, and said she found that episode terrifying, because the Heimlich maneuver isn’t efficacious against drowning.

As I became somewhat better able to breathe, and the number of medical appointments began to decline, I also planned for another set of privations in the form of chemotherapy. What happened on May 25 is called “salvage surgery.”[5] I guess the surgery salvaged my life, at the expense of my tongue, which had been replaced with a flap of muscle from my thigh. But the flap felt like an inert, alien thing that constantly alerted my brainstem to a foreign threat inside my own mouth. It was immobile and insensate and yet I felt it, constantly. Was I what had been salvaged? It sure didn’t feel like it.

Failure to eliminate head and neck cancer in the first go-round is extremely bad, though my surgeon, Dr. Hinni, got clean margins in May. The question became: should I do any chemotherapy in an attempt to eliminate any remaining cancer cells? No one gave us a clear answer, because one doesn’t exist: bizarrely, no one had comprehensively studied the question. Almost all the oncologists Bess and I consulted said they didn’t know the answer, and most said that the choice was really 50/50. It seemed we had to “decide what we wanted,” which seemed like a great way to run a Montessori preschool, but a less great way to decide on life-altering cancer care. Oncologists are strangely loath to provide real, data-driven recommendations. There’s a lot of misplaced hope and enthusiasm for debilitating therapies, while, at the same time, thinking outside the box is dismissed as futile, a judgment that seems unearned.

I looked at the odds of surviving a second recurrence—essentially zero—and decided to go for chemo. My first chemo infusion was scheduled for July 24, and on July 21 I got CT scans to see whether I could begin performing jaw exercises that might improve my mobility; those scans showed the recurrence and metastases. That horrible surgery had bought a mere two months. Chemo went from “maybe curative” to “palliative, and an attempt to buy time.” I was barely healed enough from surgery when the chemo began, and so the physical improvements were set back by chemo.

Yet even though the chemo was miserable, I’d gotten better enough to have pulled back from the brink. I was getting a little better at swallowing. I was able to breathe without constant, continual pain. The PEG tube that protruded from my stomach was a constant bother, but one that was manageable enough. Progress was just slow. Unbelievably slow. Every day, I pressed forward as best I could. I used the exercise bands. I walked a little farther. I tried to push in as much nutrition as possible. I adjusted medications to help me sleep. Most importantly, I spent time with Bess. The purpose of life is other people. For me, that’s presently instantiated by being with Bess, by being with friends and family, and by writing. The writing is an attempt to help others, especially the people who are facing their own cancers. Oncologists apparently aren’t, as a group, going to do enough to help people who need clinical trials, so I’m stepping into that gap.

There’s a common distinction between surviving and thriving. Many people who survive traumatic or horrifying events never thrive after. Esther Perel has spoken about the difference between Holocaust survivors who managed to thrive after, as her parents seemed to, versus those who didn’t, as two of my grandparents seemed not to.[6] I’ve been trying to thrive, as best as I can discern how, with the aid of Bess, and despite the challenges of the incurable disease that’s killing me, held at bay right now only by the clinical-trial drug petosemtamab.

For now, not exiting was the right decision, thanks to the aid I received and am receiving from many others around me. I’m trying to lead a generative, positive life with what time I have left, and writing is a key part of that effort. Few people understand how bad the FDA is, or the degree to which the FDA is retarding progress in oncology in particular, and consequently letting cancer patients die. Perhaps there are too few faces to associate with the statistics about cancer deaths, and so I’m attempting to associate a single person with the bureaucratic edifice that is the FDA, killing through its nominal mission to “protect.”

One day, maybe soon, maybe not, it will be time to enter the one-way portal. The preferred, antiseptic modern term is “death with dignity.” But the people around me and with me keep me alive, and show that we really do live for one another. The physical challenges are still great, but not as severe as they were last summer. I’m able to get up and engage in meaningful activity most days. I don’t want to be a burden—a burden on family, friends, or society, and by my own judgment I think myself not too great a burden for others. That line will likely be crossed in the next year, but it’s not been crossed yet. And the clinical trial I’m participating in—and the one after it, and, if that one is successful enough, the one after it—is generating the data necessary to make effective cancer drugs available to other people. My role is small—I’m not inventing the drug, I’m not manufacturing it, I’m not setting up the trials themselves—but it is a role, and it is one someone has to fulfill. Fulfilling it generates some meaning in my life, and meaning is an essential component of thriving. Maybe there will be other roles for me, before the end.

 I’ll probably never be as effective as I was before the cancer, but I’ve been working, every day, at being more effective and less of a burden to the degree that I can achieve either. There’s plenty of physical pain in my life—as I write this, I have cuts on the pads of my fingers that won’t heal, I’m bleeding or barely not bleeding from my toenails, and my lower lip cycles between cracking and bleeding from those cracks. But the pain is bearable enough. I can breathe well enough. I’m able enough to write. So much has been taken, though enough remains for me to remain. I still believe what I wrote in “I know what happens to me after I die, but what about those left behind?”:

At some point, the suffering may be too much, and then I hope to exit by my own hand, gracefully, not having been wholly unmanned by disease. “Unmanned:” it’s an old-fashioned word, and one that appears in the appendices of The Lord of the Rings, when it is time for Aragorn to depart the world. His wife Arwen pleads with Aragorn “to stay yet for a while” because she “was not yet weary of her days.” Aragorn asks her if she would have him “wait until I wither and fall from my high seat unmanned and witless.” I didn’t imagine that I might face the same question so soon, and yet it’s here, before me, and I hope to depart before the pain robs me of my mind and leaves me witless and suffering. Aragorn says that “I speak no comfort to you, for there is no comfort for such pain within the circles of the world.” And that I fear is true of Bess, too, that there will be no true comfort for her pain. Her parents will help her, our friends will help her, she will not be alone—and yet the pain at the moment of my own departure will remain.

Aragorn and by extension Tolkien understood death with dignity. For a lot of the summer, I felt unmanned and witless. Now I’m sufficiently manned and witted to be writing this, to be cooking, and to consider a future I probably won’t get, but I might. I don’t want to be caught off guard by success, like a teen boy whose efforts to get laid work when he thought they never would. The incremental improvements have added up, and suicide is an all-or-nothing proposition. The decision not to die last summer was the right one. The show goes on. Life goes on. For now, I am a part of it.

If you’ve gotten this far, consider the Go Fund Me that’s funding ongoing care.


[1] In the monetary sense: my marginal product of labor then was $0/hour. Bess wants me to point out that my time is inherently valuable to me as a human being.

[2] The specialty pharmacy also told her that they won’t talk to us. “They don’t speak directly to customers.” When the bill comes, I guess I’ll just send it back with an explanation that if they won’t interface with me, I won’t interface with them. Something tells me that will change their policy.

[3] Or who has read Kafka.

[4] Some of the nurses on the post-surgery recovery floor told Bess I was nice. I was trying to do unto others as I’d have them do unto me, and I hope I succeeded.

[5] The term for the surgery performed when the first surgery to remove head and neck cancer, and the associated adjuvant treatment like radiation, fail.

[6] They died before I was born, so I have no ability to judge for myself.

On not being a radical medicine skeptic, and the dangers of doctor-by-Internet

In Part 1 I wrote about the struggles that come with complex healthcare problems, like the cancer that’s killing me, the efforts to treat it, and the numerous ancillary problems those treatments have caused. I lacked meaningful guidance on important topics like clinical trials or how to significantly decrease the incapacitating side effects of chemotherapy. I had to seek out other interventions that would significantly improve my quality of life, like a low-profile MIC-KEY PEG tube. Instead of being guided by experts, I often had to crowd-source recommendations and double-check (and drive) treatment plans, or else so much would have fallen through the cracks. I’d likely be dead. My experiences should help guide others in similar situations, so they can better advocate for themselves. But I’m not a radical skeptic and, though I’d like to see improvements in healthcare and other institutions, I also don’t see fantastic alternatives at present levels of technology. If you find this piece worthwhile, consider the Go Fund Me that’s funding ongoing care.

What I’m suggesting isn’t the same as getting your medical degree from Dr. Google

Patients love to tell doctors what to do, and it drives doctors crazy. Online, and sometimes in the legacy media, you might’ve seen quotes from doctors complaining about know-it-all patients who attempt to incorrectly drive treatment. Demanding inadvisable treatment isn’t just bad for the doctor’s sanity; it’s bad for the patient’s health outcomes. Bess, to cite one example who happens to be sitting next to me as I write this, is barraged by ER patients demanding antibiotics for their viral illness or steroids for their chronically sore backs—even though these treatments won’t address the problem and may cause real harm—all because the patient “knows their body,” evidence-based medicine be damned. Many, if not most, people aren’t great at gathering and evaluating evidence, or reading, and even doctors don’t appear to be great at statistical literacy.  

I’m sympathetic to doctors’ views regarding patient knowledge or lack thereof, especially when doctors are trying to protect patients from unnecessary medications with real and serious side effects, and yet, at the same time, I continue to be (stupidly, foolishly) surprised at all the things not being done by the doctors who’re supposed to be driving my care. The first time something negative happens can reasonably be a surprise; the eighth time should not. They’re the experts and I’m the amateur, so why am I outperforming them in important ways? If Bess and I don’t drive, there’s no one behind the wheel, and that’s bad. Beyond my individual case, there’s also a larger question: What happens to trust in doctors as a whole when so many individual doctors aren’t providing the guidance or care they should?

Martin Gurri wrote a now-famous and excellent book called The Revolt of the Public and the Crisis of Authority in the New Millennium. It’s about, among other things, the loss of confidence in institutions of all sorts, including doctors and medical institutions. If you’re trying to understand the present better, The Revolt of the Public is a great, essential read. Patients need to listen to their doctors, yes, but for healthcare to benefit patients, doctors also need to listen to their patients. I’m not supposed to be an expert in every aspect of healthcare, and yet, as described in Part I, Bess and I have done and caught a bunch of things that the people who’re supposed to catch and do those things haven’t. In Poor Charlie’s Almanack, Charlie Munger wrote that “If, in your thinking, you rely entirely on others—often through purchase of professional advice—whenever outside a small territory of your own, you will suffer much calamity. And it is not just difficulties in complex coordination that will do you in.”* While it’s true that relying entirely on others isn’t a great idea, we all have to rely on others to some extent, and I’ve had to rely heavily on what doctors, nurses, physician assistants, and others tell me. It’s hard to know what I don’t know.

Doctors go to medical school for four years and do residency for a minimum of three. So why have I, a writer, had to double check so much? Why have so many of the plans that have kept me alive revolved around suggestions that Bess and I have made to oncologists and other experts—plans and treatments that wouldn’t have otherwise been considered? Bess and I did almost all the work and all the learning about clinical trials to keep me alive. It’s sub-optimal for me to do the double-checking because I don’t know everything the doctors know, or what I don’t know. Bess is an ER doctor and so doesn’t know oncology well. Still, Bess would agree that it only takes one minute for a doctor to ask him or herself: “If I were in my patient’s position, is there anything I can do to simply and easily make their situation better?”

I’m not anti-doctor. This isn’t a screed about how doctors are dumb (they’re not, in the main). Although I’m not writing a screed, I am describing what I’ve faced and experienced in trying to not die, including many of the unflattering parts. After I die, I know Bess will be consumed by crushing existential loneliness, and I want to delay that day as long as possible. Delaying that day as long as possible means that Bess and I are constantly fighting to get the care that doctors haven’t been providing. Bess has been able to keep a close eye on most emergent medical matters, and she’s activated the doctor-network to beg for help from peers in Facebook medical groups. She’s banged down the digital doors of so many oncologists, trying to crowd-source a sense of whether the path we’re on makes sense (we appreciate the help, I want to emphasize: many of you have literally been lifesavers).

We’ve gotten some real medical oncology help, to be sure: a head and neck oncologist at Mayo Rochester named Dr. Kat Price has been hugely helpful in clinical trials, chemotherapy regimen questions, and other matters. Dr. Assuntina Sacco at UCSD understands the clinical trial landscape and is more knowledgeable than we are about what’s out there. Both have, I think, asked themselves what they would want in my situation. But they’ve been the exception, not the rule, which seems crazy to Bess and to me—I guess we live in a crazy upside-down world. By writing about what I’ve seen and experienced, I’m trying to help others, and to warn them of the many challenges Bess and I have faced and, based on experience, are likely to continue to face.


Global warming is here and it’s everyone’s fault

Maybe you’ve seen: “The 15 hottest days, in the world’s hottest month.”

It’s not like we weren’t warned: NASA scientist James Hansen testified to Congress in 1988 about what was coming. We ignored it. By now, it’s everyone’s fault.

It’s the fault of:

* People who have spent decades voting against nuclear power.

* People who support NEPA. People who have never heard of NEPA.

* NIMBYs who work and vote to keep the vast majority of domiciles car-dependent.

* NIMBYs who make sure we can’t build more housing in dense, green cities like NYC (where I used to live, but moved, due to affordability issues).

* People who vote against bike lanes.

* People who could have picked the smaller vehicles and didn’t.

* People who could have picked up the bikes and didn’t.

* People who could have installed solar and didn’t.

* People who vote against mass transit (“It will never be practical”).

* Me. I only have so much effort to push into resisting the efforts of hundreds of millions if not billions of other people who are enacting the system. I try to resist but it’s hard for one person.

* People who realize that they’d like to live differently but are pushed into that single exurban direction by the legal and regulatory structure of American and, often, Canadian life.

Even the people who’d like to live greener—without a car, without relentless parking lots blighting the landscape, without having to live in single-unit housing—mostly can’t, in the United States. Or if we can, we’re merely moving the next marginal candidate who’d like to live densely into the exurbs of Phoenix, Dallas, Houston, Miami, and so on. Those are the places where it’s legal to build housing, so that’s where most people are going. I’ve moved from New York to Phoenix because I can afford the latter and could only barely afford the former. Most of Phoenix is impossible without a car, and dangerous on a bike. It’s tragic, and I’d love to see change, but the system is forcing me in a particular direction and it’s incredibly expensive to try resisting it.

It’s the fault of no one, and everyone. There are some green shoots of change happening, albeit slowly, but we needed to get serious about nuclear power and the removal of non-safety zoning restrictions decades ago. We didn’t, and now the price is showing up. We need to get serious today, but we’re not.

Because fault is diffused, most of us, me included, feel there’s nothing substantial we can do—so we do nothing. Years pass. The problems worsen, though we can justify to ourselves that the problems are just headlines. Insurance becomes hard to get. The deniers set up their own alternative universes, where information only confirms and never disconfirms their worldviews. The bullshit asymmetry principle plays out: “The amount of energy needed to refute bullshit is an order of magnitude bigger than that needed to produce it.”

“What if scientists have over-predicted the consequences of global warming?” people ask. The flipside is never considered: “What if they’re underpredicting the consequences?”

The system goes on. Maybe solar, wind, and geothermal get cheap fast enough to partially save us. Maybe direct air capture (DAC) of carbon dioxide proceeds fast enough.

But maybe it doesn’t. And then the crisis will be all of our faults. And no one’s.

A life-changing encounter with a man named Dan

This essay is by my brother, Sam.

In 2009, I had a life-changing encounter with a man named Dan; he was the top salesman at our company and left an indelible mark on my career. Dan was an impressive figure, standing at six feet four with a heroic build, fierce red hair, and striking green eyes. He possessed an air of confidence, always dressed impeccably, never seen with a loosened tie, even during late nights working on proposals. His crisp, white shirt occasionally had its sleeves rolled up, but he always exuded professionalism and ownership. People naturally gravitated towards him, stepping aside to listen to his words. Dan treated everyone with a warm smile and friendliness, be it the company president or the person serving us lunch at Subway. His positive attitude was unparalleled. Whenever asked how he was doing, his unwavering response was, “I have never been better”—and he genuinely meant it.

Then, one day, Dan received devastating news: one of his children had passed away. He took some time off from work, but, upon his return, he walked into the building with his laptop in hand, his tie tightly knotted, and a radiant smile on his face. As we were close colleagues, I felt concerned and decided to visit his office that morning, closing the door behind me.

“How are you really doing?” I asked sincerely. “Is there anything I can do for you? I mean it, anything, just ask.”

With a grin, Dan replied, “You know, I’ve never been better,” tossing his empty Starbucks cup into a trash can across the room. I stood there in silence, processing his words.

“How?” I finally managed to ask. “How can you maintain such a positive outlook? How can you genuinely claim that you’ve never been better?”

Dan leaned in and spoke softly, capturing my full attention. “Listen carefully,” he began. “You don’t truly know anything about me or my life. You only think you do. Here’s something you must remember, and I won’t mention it again. Your attitude sucks. Frankly, I’m surprised they tolerate it here. Your attitude defines everything. It shapes your life. You think things are bad? Let me tell you, buddy, they could be a lot worse. A lot worse. You’re standing there, upset because a meeting didn’t go your way, dressed in your shirt and cheap tie. Well, go out and start digging sewers and tell me how much that meeting mattered today. And maybe, after digging sewers, you’ll get laid off and find yourself living in one, eating from a dumpster. You don’t know anything. So, listen up. When someone asks how you’re doing, there’s only one answer: ‘I’ve never been better.’ And you live your life as if it’s true because here’s the stone-cold truth — no matter how bad you think things are right now, they can always be worse. So, wake up and change your attitude. Right fucking now.”

With that, he leaned back in his chair, his smile returning as if nothing had happened. I stood there in stunned silence, my shirt drenched in sweat.

“I need more coffee,” Dan happily announced. “Care to join me? It’s on me. Sales always buys the damn coffee!”

We went to Starbucks in his new Mercedes, and while everything seemed unchanged for him, everything had changed for me. I realized I couldn’t fulfill Dan’s request within that job, so I mustered the courage to quit, eventually finding a position at another company. It was a terrifying move, as I had spent my entire professional career at the previous company.

As I was walking into the new office, the receptionist greeted me with a smile and asked how I was doing.

“I’ve never been better,” I replied, sporting a wide grin.

“Well, that’s a fantastic attitude,” she beamed. “You’ll fit right in here if you can maintain that!”

And so it went. I became the most cheerful and upbeat person in the company. Though I became the subject of jokes, I also became a beacon of hope for those feeling downtrodden. Unbeknownst to me, I’d joined a company on the verge of collapse, but, as things worsened, my attitude gained more attention. I rapidly climbed the ranks, despite lacking expertise in the company’s technology. Layoffs hit, one after another, but I survived each round despite being the most junior member. Perplexed, I asked my boss how this was possible.

“Well,” he explained, “During meetings to discuss layoffs, your name consistently comes up. You’re inexperienced and new to the company, making you the logical choice. However, each time, everyone decides you should stay. Your attitude is so positive that everyone wants you here. The president even said he’d prefer one average employee with a great attitude over five brilliant but gloomy experts. Attitude sells. So, you don’t have to worry. You’ll still be here long after I’m gone, until they turn off the lights, if you want to be.”

And so it unfolded. As things deteriorated, my promotions accelerated. Within 18 months, I became the senior member of the sales team. I became the face of the company’s improbable turnaround. And when things reached their breaking point (the turnaround effort was not enough), a friend offered me a job, and that very day, I walked out.

From my experience with Dan and the job after Dan, I developed a list of three priorities necessary for success in the workplace. Having spent considerable time in the business world, let me share these priorities:

  • Firstly, your boss. Your number one priority is to make your boss look good. This is not a joke.
  • Secondly, your company. Your top priority is to increase revenue. Following closely is improving profitability. These two priorities should guide your thoughts and actions.
  • Finally, yourself. Your primary priority is to maintain an unwaveringly positive attitude, self-confidence, and the appearance of success.

The third item is crucial for your career and life. No amount of education or expertise surpasses its significance in most circumstances. An employee with average skills and a positive attitude holds greater value than five brilliant but unpleasant individuals. As pilots say, “your attitude determines your altitude.” Maintaining a positive attitude at all costs ensures your success, as surely as day follows night. Failure is not an option.

Since then, I’ve strived to adhere to these priorities. Where I succeeded, they brought me great achievements. Where I faltered, they resulted in failure and misery. Attitude stands as the foremost determinant of success in life. You must consistently exhibit a positive attitude, no matter the circumstances. Because it’s true—no matter how dire things may seem, they can always be worse. Your attitude will dictate how you navigate through it all.

If I could impart one thing to anyone, regardless of their stage in life, it would be to always display a positive attitude. It holds immeasurable power in the universe.

The death of literary culture

At The Complete Review, Michael Orthofer writes of John Updike that

Dead authors do tend to fade fast these days — sometimes to be resurrected after a decent interval has passed, sometimes not –, which would seem to me to explain a lot. As to ‘the American literary mainstream’, I have far too little familiarity with it; indeed, I’d be hard pressed to guess what/who qualifies as that.

Orthofer is responding to a critical essay that says: “Much of American literature is now written in the spurious confessional style of an Alcoholics Anonymous meeting. Readers value authenticity over coherence; they don’t value conventional beauty at all.” I’m never really sure what “authenticity” and its cousin “relatability” mean, and I have an unfortunate suspicion that both reference some lack of imagination in the speaker; still, regarding the former, I find The Authenticity Hoax: How We Get Lost Finding Ourselves persuasive.

But I think Orthofer and the article are subtly pointing towards another idea: literary culture itself is mostly dead. I lived through its final throes—perhaps like someone who, living through the 1950s, saw the end of religious Christianity as a dominant culture, since it was essentially gone by the 1970s—though many claimed its legacy for years after the real thing had passed. What killed literary culture? The Internet is the most obvious, salient answer, and in particular the dominance of social media, which is in effect its own genre—and, frequently, its own genre of fiction. Almost everyone will admit that their own social media profiles attempt to showcase a version of their best or ideal selves, and, thinking of just about everyone I know well, or even slightly well, the gap between who they really are and what they are really doing, and what appears on their social media, is so wide as to qualify as fiction. Determining the “real” self is probably impossible, but determining the fake selves is easier, and the fake is everywhere. Read much social media as fiction and performance and it will make more sense.

Everyone knows this, but admitting it is rarer. Think of all the social media photos of a person ostensibly alone—admiring the beach, reading, sunbathing, whatever—but the photographer is somewhere. A simple example, maybe, but also one without the political baggage of many other possible examples.

Much of what passes for social media discourse makes little or no sense, until one considers that most assertions are assertions of identity, not of factual or true statements, and many social media users are constructing a quasi-fictional universe not unlike the ones novels used to create. “QAnon” might be one easy modern example, albeit one that will probably go stale soon, if it’s not already stale; others will take its place. Many of these fictions are the work of group authors. Numerous assertions around gender and identity might be a left-wing-valenced version of the phenomenon, for readers who want balance, however spurious balance might be. Today, we’ve in some ways moved back to a world like that of the early novel and the early novelists, when “fact” and “fiction” were much more disputed, interwoven territories, and many novels claimed to be “true stories” on their cover pages. The average person has poor epistemic hygiene for most topics not directly tied to income and employment, but the average person has a very keen sense of tribe, belonging, and identity—so views that may be epistemically dubious nonetheless succeed if they promote belonging (consider also The Elephant in the Brain by Robin Hanson and Kevin Simler for a more thorough elaboration on these ideas). Before social media, did most people really belong, or did they silently suffer through the feeling of not belonging? Or was something else at play? I don’t know.

In literary-culture terms, the academic and journalistic establishment that once formed the skeletal structure upholding literary culture has collapsed, while journalists and academics have become modern clerics, devoted more to spreading ideology than to exploring the human condition, or art, or aesthetics. Academia has become more devoted to telling people what to think than to helping people learn how to think, and students are responding to that shift. Experiments like the Sokal Affair and its successors show as much. The cult of “peer review” and “research” fits poorly in the humanities, but it’s been grafted on anyway, and the graft is poor.

Strangely, many of the essays lamenting the fall of the humanities ignore the changes in the content of the humanities, in both schools and universities. The number of English majors in the U.S. has dropped by about 50% from 2000 to 2021:

[Chart: decline of English majors]

History and most other humanities majors obviously show similar declines. Meanwhile, the number of jobs in journalism has approximately halved since the year 2000; academic jobs in the humanities cratered in 2009, from an already low starting point, and have never recovered; even jobs teaching high school humanities subjects have a much more ideological, rather than humanistic, cast than they did ten years ago. What’s taken the place of reading, if anything? Instagram, Snapchat, TikTok, and, above all, Twitter.

Twitter, in particular, seems to promote negative feedback and fear loops, in ways that media and other institutions haven’t yet figured out how to resist. The jobs that supported the thinkers, critics, starting-out novelists, and others aren’t there. Whatever might have replaced them, like Twitter, isn’t equivalent. The Internet doesn’t just push most “content” (songs, books, and so forth) towards zero—it also changes what people do, including the people who used to make up what I’m calling literary culture or book culture. The cost of housing also makes teaching a non-viable job for a primary earner in many big cities and suburbs.

What power and vibrancy remains in book culture has shifted towards nonfiction—either narrative nonfiction, like Michael Lewis, or data-driven nonfiction, with too many examples to cite. It still sells (sales aren’t a perfect representation of artistic merit or cultural vibrancy, but they’re not nothing, either). Dead authors go fast today not solely or primarily because of their work, but because the literary culture is going away fast, if it’s not already gone. When John Updike was in his prime, millions of people read him (or they at least bought Couples and could spit out some light book chat about it on command). The number of writers working today who the educated public, broadly conceived of, might know about is small: maybe Elena Ferrante, Michel Houellebecq, Sally Rooney, and perhaps a few others (none of those three are American, I note). I can’t even think of a figure like Elmore Leonard: someone writing linguistically interesting, highly plotted material. Bulk genre writers are still out there, but none I’m aware of have any literary ambition.

See some evidence for the decline of literary culture in the decline of book advances; the Authors Guild, for example, claims that “writing-related earnings by American authors [… fell] to historic lows to a median of $6,080 in 2017, down 42 percent from 2009.” The kinds of freelancing that used to exist have largely disappeared too, or become economically untenable. In If You Absolutely Must, Freddie deBoer warns would-be writers that “Book advances have collapsed.” Money isn’t everything, but the collapse of the already-shaky foundations of book writing is notable, and quantifiable. Publishers appear to survive and profit primarily off very long copyright terms; their “backlist” keeps the lights on. Publishers seem, like journalists and academics, to have become modern-day clerics, at least for the time being, as I noted above.

Consider a more vibrant universe for literary culture, as mentioned in passing here:

From 1960 to 1973, book sales climbed 70 percent, but between 1973 and 1979 they added less than another six percent, and declined in 1980. Meanwhile, global media conglomerates had consolidated the industry. What had been small publishers typically owned by the founders or their heirs were now subsidiaries of CBS, Gulf + Western (later Paramount), MCA, RCA, or Time, Inc. The new owners demanded growth, implementing novel management techniques. Editors had once been the uncontested suzerains of title acquisition. In the 1970s they watched their power wane.

A world in which book sales (and advances) are growing is very different from one of decline. It’s reasonable to respond that writing has rarely been a path to fame or fortune, but it’s also reasonable to note that, even against the literary world of 10 or 20 years ago, the current one is less remunerative and less culturally central. Writers find the path to making any substantial money from their writing harder, and more treacherous. Normal people lament that they can’t get around to finishing a book; they rarely lament that they can’t get around to scrolling Instagram (that’s a descriptive observation of change).

At Scholar’s Stage, Tanner Greer traces the decline of the big book and the big author:

the last poet whose opinion anybody cared about was probably Allen Ginsberg. The last novelist to make waves outside of literary circles was probably Tom Wolfe—and he made his name through nonfiction writing (something similar could be said for several of the other prominent essayists turned novelists of his generation, like James Baldwin and Joan Didion). Harold Bloom was the last literary critic known outside of his own field; Allan Bloom, the last with the power to cause national controversy. Lin-Manuel Miranda is the lone playwright to achieve celebrity in several decades.

I’d be a bit broader than Greer: someone like Gillian Flynn writing Gone Girl seemed to have some cultural impact, but even books like Gone Girl seem to have stopped appearing. The cultural discussion rarely if ever revolves around books any more. Publishing and the larger culture have stopped producing Stephen Kings. Publishers, oddly to my mind, no longer even seem to want to try producing popular books, preferring instead to pursue insular ideological projects. The most vital energy in writing has been routed to Substack.

I caught the tail end of a humane and human-focused literary culture that’s largely been succeeded by a politics- and morality-focused culture that I hesitate to call literary, even though it’s taken over what remains of those literary-type institutions. This change has also coincided with a lessening of interest in those institutions: very few people want to be clerics and scolds—many fewer than wonder about the human condition, though the ones who do want to be clerics and scolds form the intolerant minority in many institutions. Shifting from the one to the other seems like a net loss to me, but also a net loss that I’m personally unable to arrest or alter. If I had to pick a date range for this death, it’d probably be 2009 – 2015: the Great Recession eliminated many of the institutional jobs and professions that once existed, along with any plausible path into them for all but the luckiest, and by 2015 social media and scold culture had taken over. Culture is hard to define but easy to feel as you exist within and around it. By 2010, Facebook had become truly mainstream, and everyone’s uncle and grandma weren’t just on the Internet for email and search engines, but for other people and their opinions.

Maybe mainstream literary culture has been replaced by some number of smaller micro-cultures, but those microcultures don’t add up to what used to be a macroculture.

In this essay, I write:

I’ve been annoying friends and acquaintances by asking, “How many books did you read in the last year?” Usually this is greeted with some suspicion or surprise. Why am I being ambushed? Then there are qualifications: “I’ve been really busy,” “It’s hard to find time to read,” “I used to read a lot.” I say I’m not judging them—this is true, I will emphasize—and am looking for an integer answer. Most often it’s something like one or two, followed by declamations of highbrow plans to Read More In the Future. A good and noble sentiment, like starting that diet. Then I ask, “How many of the people you know read more than a book or two a year?” Usually there’s some thinking and a rattling off of one or two names, followed by silence, as the person thinks through the people they know. “So, out of the few hundred people you might know well enough to know, Jack and Mary are the two people you know who read somewhat regularly?” They nod. “And that is why the publishing industry works poorly,” I say. In the before-times, anyone interested in a world greater than what’s available around them and on network TV had to read, most often books, which isn’t true any more and, barring some kind of catastrophe, probably won’t be true again.

Reading back over this I realize it has the tone and quality of a complaint, but it’s meant as a description, and complaining about cultural changes is about as effective as shaking one’s fist at the sky: I’m trying to look at what’s happening, not whine about it. Publishers go woke, see the sales of fiction fall, and respond by doubling down, but I’m not in the publishing business and don’t follow the intra-business signaling that goes on there. One could argue the changes noted here are for the better. Whining about aggregate behavior and choices has rarely, if ever, changed them. I don’t think literary culture will ever return, any more than Latin, epic poetry, classical music, opera, or any number of other once-vital cultural products and systems will.

In some ways, we’re moving backwards, towards a cultural universe with less clearly demarcated lines between “fact” and “fiction” (I remember being surprised, when I started teaching, by undergrads who didn’t know a novel or short stories are fiction, or who called nonfiction works “novels”). Every day, each of us is helping whatever comes next become. The intertwined forces of technology and culture move primarily in a single direction. The desire for story will remain, but the manifestations of that desire aren’t static. Articles like “Leisure reading in the U.S. is at an all-time low” appear routinely. It’s hard to have literary culture among a population that doesn’t read.

See also:

* What happened with Deconstruction? And why is there so much bad writing in academia?

* Postmodernisms: What does that mean?

Where are the woke on Disney and China?

I have sat through numerous talks and seen numerous social media messages about the evils of imperialism, and in particular western imperialism—so where’s the mass outrage over China today, and the efforts by Disney and Hollywood to court China? China is a literal, real-world imperialist power, today; China has crushed Hong Kong’s independence, imprisoned perhaps a million of its own people based on their race and religion, and invaded and occupied Tibet—and Taiwan may be next. But I never read “imperialist” or “racist” critiques from the usual suspects. Why not?

Search for “imperialism” on Twitter, for example, and you’ll find numerous people denouncing what they take to be “imperialism” or various kinds of imperialisms, but few dealing with China. This bit about Bob Iger’s complicity with Chinese government repression got me thinking about why some targets draw much “woke” ire while others don’t. My working hypothesis is that China seems far away from the United States and too different to understand—even though companies and individuals are regularly attacked for their associations with other Americans, they rarely seem to be for their associations with China. The NBA, to take another example, fervently favors police reform in the United States, but is largely silent on China (to be sure, I don’t agree with all the posturing at the link, but pay attention to the underlying point). Perhaps the situation between the woke and China is analogous to the way that comparisons to your wife’s sister’s husband’s income can create a lot of jealousy while comparisons to the truly wealthy don’t.

In addition, could it be that Disney’s specialty in simple, childlike, Manichaean stories of good versus evil appeals to the same people, or kinds of person, most likely to be attracted to the quasi-religious “woke” mindset? To my knowledge, I’ve not seen these questions asked, and Disney products, like Star Wars movies and TV shows, seem to remain broadly popular, including on the far left. It’s also worth emphasizing that some have spoken about Disney’s actions; the Twitter thread about Iger links to “Why Disney’s new ‘Mulan’ is a scandal.” But the issue seems to elicit relatively little ire and prominence compared to many others. Few sustained movements or organizations are devoted to these issues.

What views make someone a pariah, and why? What associations make someone a pariah, and why? What views and associations elicit intense anger, and why? I don’t have full answers to any of these questions but think them worth asking. No one seems to be calling for boycotts of Disney, even though Disney is toadying to an actual imperialist state.

Dissent, insiders, and outsiders: Institutions in the age of Twitter

How does an organization deal with differing viewpoints among its constituents, and how do constituents dissent?

Someone in Google’s AI division was recently fired, or the person’s resignation accepted, depending on one’s perspective, for reasons related to a violation of process and organizational norms, or something else, again depending on one’s perspective. The specifics of that incident can be disputed, but the more interesting level of abstraction might ask how organizations process conflict and what underlying conflict model participants have. I recently re-read Noah Smith’s essay “Leaders Who Act Like Outsiders Invite Trouble;” he’s dealing with the leadup to World War II but also says: “This extraordinary trend of rank-and-file members challenging the leaders of their organizations goes beyond simple populism. There may be no word for this trend in the English language. But there is one in Japanese: gekokujo.” And later, “The real danger of gekokujo, however, comes from the establishment’s response to the threat. Eventually, party bosses, executives and other powerful figures may get tired of being pushed around.”

If you’ve been reading the news, you’ll have seen gekokujo, as institutions are being pushed by the Twitter mob, and by the Twitter mob mentality, even when the mobbing person is formally within the institution. I think we’re learning, or going to have to re-learn, things like “Why did companies traditionally encourage people to leave politics and religion at the door?” and “What’s the acceptable level of discourse within the institution, before you’re not a part of it any more?”

Colleges and universities in particular seem to be susceptible to these problems, and some are inculcating environments and cultures that may not be good for working in large groups. One recent example of these challenges occurred at Haverford College, but here too the news has many other examples, and the Haverford story seems particularly dreadful.

The basic idea that organizations have to decide who’s inside and who’s outside is old: Albert Hirschman’s Exit, Voice, and Loyalty: Responses to Decline in Firms, Organizations, and States is one great discussion. Organizations also used to unfairly exclude large swaths of the population based on demographic factors, and that’s (obviously) bad. Today, though, many organizations have in effect, if not intent, decided that it’s okay for some of their members to attack the good faith of other members of the organization, and to attack the coherence of the organization itself. There are probably limits to how much this can be done while still retaining a functional organization, let alone a maximally functional one.

The other big change involves the ability to coordinate relatively large numbers of people: digital tools have made this easier, in a relatively short time—thus the “Twitter mob” terminology that came to mind a few paragraphs ago; I kept the term, because it seems like a reasonable placeholder for that class of behavior. Digital tools make it easier for a small percentage of people to add up to a large absolute number of people. For example, if 100,000 people are interested in or somehow connected to an organization, and one percent of them want to fundamentally disrupt the organization, change its direction, or arrange an attack, that’s 1,000 people—which feels like a lot. It’s far above the Dunbar number and too many for one or two public-facing people to deal with. In addition, in some ways journalists and academics have become modern-day clerics, and they’re often eager to highlight and disseminate news of disputes of this sort.
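A minimal sketch of that arithmetic, using the same hypothetical figures as the paragraph above (not data about any real organization):

```python
# Minimal sketch: a tiny share of a large connected population is still a crowd.
# The numbers below are the hypothetical ones from the paragraph above.

DUNBAR_NUMBER = 150  # rough cognitive limit on stable social relationships

def coordinated_minority(total_connected: int, fraction: float) -> int:
    """Absolute number of people in a coordinated minority of the given size."""
    return round(total_connected * fraction)

total = 100_000   # people interested in or somehow connected to the organization
fraction = 0.01   # the one percent who want to disrupt, redirect, or attack it

mob = coordinated_minority(total, fraction)
print(f"{mob} people, vs. a Dunbar number of roughly {DUNBAR_NUMBER}")
# Prints: 1000 people, vs. a Dunbar number of roughly 150
```

The point isn’t the multiplication itself; it’s that a one-percent minority of a large enough base lands far beyond what any individual or small public-facing team can answer personally.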

Over time, I expect organizations are going to need to develop new cultural norms if they’re going to maintain their integrity in the face of coordinated groups that represent relatively small percentages of people but large absolute numbers of people. The larger the organization, the more susceptible it may be to these kinds of attacks. I’d expect more organizations to, for example, explicitly say that attacking other members of the organization in bad faith will result in expulsion, as seems to have happened in the Google example.

Evergreen College, which hosted an early example of this kind of attack (on a biology professor named Bret Weinstein), has seen its enrollment drop by about a third.

Martin Gurri’s book The Revolt of The Public and the Crisis of Authority in the New Millennium examines the contours of the new information world, and the relative slowness of institutions to adapt to it. Even companies like Google, Twitter, and Facebook, which have enabled sentiment amplification, were founded before their own user bases became so massive.

Within organizations, an excess of conformity is a problem—innovation doesn’t occur from simply following orders—but so is an excess of chaos. Modern intellectual organizations, like tech companies or universities, probably need more “chaos” (in the sense of information transfer) than, say, old-school manufacturing companies, which primarily needed compliance. “Old-school” is a key phrase, because from what I understand, modern manufacturing companies are all tech companies too, and they need the people closest to the process to be able to speak up if something is amiss or needs to be changed. Modern information companies need workers to speak up and suggest new ideas, new ways of doing things, and so on. That’s arguably part of the job of every person in the organization.

Discussion at work of controversial identity issues can probably function if all parties assume good faith from the other parties (Google is said to have had a freewheeling culture in this regard from around the time of its founding up till relatively recently). Such discussions probably won’t function without fundamental good faith. Good faith is hard to describe, but most of us know it when we see it; defining every element of it would probably be impossible, while cultivating it as a general principle is desirable. Trying to maintain such an environment is tough: I know that intimately because I’ve tried to maintain it in classrooms, and those experiences led me to write “The race to the bottom of victimhood and ‘social justice’ culture.” It’s hard to teach, or run an information organization, without a culture that lets people think out loud, in good faith, with relatively little fear of arbitrary reprisal. Universities, in particular, are supposed to be oriented around new ideas and discussing ideas. Organizations also need some amount of hierarchy: without it, decisions can’t or don’t get made, and the organizational processes themselves don’t function. Excessive attacks lead to the “gekokujo” problem Smith describes. Over time, organizations are likely going to have to develop antibodies to the novel dynamics of the digital world.

A lot of potential learning opportunities aren’t happening, because we’re instead dividing people into inquisitors and heretics, when very few should be the former, and very few are truly the latter. One aspect of “Professionalism” might be “assuming good faith on the part of other parties, until proven otherwise.”

On the other hand, maybe these cultural skirmishes don’t matter much, like brawlers in a tavern across the street from the research lab. Google’s AlphaFold has made a huge leap in protein folding efforts (Google reorganized itself, so technically both Google and AlphaFold are part of the “Alphabet” parent company). Waymo, another Google endeavor, may be leading the way towards driverless cars, and it claims to be expanding its driverless car service. Compared to big technical achievements, media fights are minor. Fifty years from now, driverless cars will be taken for granted, along with customizable biology, and people will struggle to understand what was at stake culturally, in much the way most people don’t get what the Know-Nothing party, or the Hundred Years War, were really about, even though we take electricity and the printing press for granted.

EDIT: Coinbase has publicly taken a “leave politics and religion at the door” stand. They’re an innovator, or maybe a back-to-the-future company, in these terms.

 

Personal epistemology, free speech, and tech companies

The NYT describes “The Problem of Free Speech in an Age of Disinformation,” and in response Hacker News commenter throwaway13337 says, in part, “It’s not unchecked free speech. Instead, it’s unchecked curation by media and social media companies with the goal of engagement.” There’s some truth to the idea that social media companies have evolved to seek engagement, rather than truth, but I think the social media companies are reflecting a deeper human tendency. I wrote back to throwaway13337: “Try teaching non-elite undergrads, and particularly assignments that require some sense of epistemology, and you’ll discover that the vast majority of people have pretty poor personal epistemic hygiene—it’s not much required in most people, most of the time, in most jobs.”

From what I can tell, we evolved to form tribes, not to be “right:” Jonathan Haidt’s The Righteous Mind: Why Good People Are Divided by Politics and Religion deals with this topic well and at length, and I’ve not seen any substantial rebuttals of it. We don’t naturally take to tracking the question, “How do I know what I know?” Instead, we naturally seem to want to find “facts” or ideas that support our preexisting views. In the HN comment thread, someone asked for specific examples of poor undergrad epistemic hygiene, and while I’d prefer not to get super specific for reasons of privacy, I’ve had many conversations that take the following form: “How do you know article x is accurate?” “Google told me.” “How does Google work?” “I don’t know.” “What does it take to make a claim on the Internet?” “Um. A phone, I guess?” A lot of people—maybe most—will uncritically take as fact whatever happens to be served up by Google (it’s always Google and never Duck Duck Go or Bing), and most undergrads whose work I’ve read will, again uncritically, accept clickbait sites and similar as accurate. Part of the reason is that undergrads’ lives are minimally affected by being wrong or incomplete about some claim made in a short assignment imposed by some annoying toff of a professor standing between them and their degree.

The gap between elite information discourse and everyday information discourse, even among college students, who may be more sophisticated than their peer equivalents, is vast—so vast that I don’t think most journalists (who mostly talk to other journalists and experts) and other people who work with information, data, and ideas truly understand it. We’re all living in bubbles. I don’t think I did, either, before I saw the epistemic hygiene most undergrads practice, or don’t practice. This is not a “kids these days” rant, either: many of them have never really been taught to ask themselves, “How do I know what I know?” Many have never really learned anything about the scientific method. It’s not happening much in most non-elite schools, so where are they going to get epistemic hygiene from?

The United States alone has 320 million people in it. Table DP02 in the Census at data.census.gov estimates that 20.3% of the population age 25 and older has a college bachelor’s degree, and 12.8% have a graduate or professional degree. Before someone objects, let me admit that a college degree is far from a perfect proxy for epistemic hygiene or general knowledge, and some high school dropouts perform much better at cognition, metacognition, statistical reasoning, and so forth, than do some people with graduate degrees. With that said, a college degree is probably a decent approximation for baseline abstract reasoning skills and epistemic hygiene. Most people, though, don’t connect with or think in terms of aggregated data or abstract reasoning—one study, for example, finds that “Personal experiences bridge moral and political divides better than facts.” We’re tribe builders, not fact finders.

Almost anyone who wants a megaphone in the form of one of the many social media platforms available now has one. The number of people motivated by questions like “What is really true, and how do I discern what is really true? How do I enable myself to get countervailing data and information into my view, or worldview, or worldviews?” is not zero, again obviously, but it’s not a huge part of the population. And many very “smart” people in an IQ sense use their intelligence to build better rationalizations, rather than to seek truth (and I may be among the rationalizers: I’m not trying to exclude myself from that category).

Until relatively recently, almost everyone with a media megaphone had some kind of training or interest in epistemology, even if they didn’t call it “epistemology.” Editors would ask, “How do you know that?” or “Who told you that?” or that sort of thing. Professors have systems that are supposed to encourage greater-than-average epistemic hygiene (these systems were not and are not perfect, and nothing I have written so far implies that they were or are).

Most people don’t care about the question, “How do you know what you know?” and are fairly surprised if it’s asked, implicitly or explicitly. Some people are intrigued by it, but most aren’t, and view questions about sources and knowledge as a hindrance. This is less likely to be true of people who aspire to be researchers or work in other knowledge-related professions, but that describes only a small percentage of undergraduates, particularly at non-elite schools. And the “elite schools” thing drives a lot of the media discourse around education. One of the things I like about Professor X’s book In the Basement of the Ivory Tower is how it functions as a corrective to that discourse.

For most people, floating a factually incorrect conspiracy theory online isn’t going to negatively affect their lives. If someone is a nurse and gives a patient the wrong medication, that person is not going to be a nurse for long. If the nurse states or repeats a factually incorrect political or social idea online, particularly but not exclusively under a pseudonym, that nurse’s life likely won’t be affected. There’s no truth feedback loop. The same is true for someone working in, say, construction, or engineering, or many other fields. The person is free to state things that are factually incorrect, or incomplete, or misleading, and doing so isn’t going to have many negative consequences. Maybe it will have some positive consequences: one way to show that you’re really on team x is to state or repeat falsehoods that show you’re on team x, rather than on team “What is really true?”

I don’t want to get into daily political discourse, since that tends to raise defenses and elicit anger, but the last eight months have demonstrated many people’s problems with epistemology, and in a way that can have immediate, negative personal consequences—but not for everyone.

Pew Research data indicate that a quarter of US adults didn’t read a book in 2018; this is consistent with other data indicating that about half of US adults read zero or one books per year. Again, yes, there are surely many individuals who read other materials and have excellent epistemic hygiene, but this is a reasonable mass proxy, given the demands that reading makes on us.

Many people driving the (relatively) elite discourse don’t realize how many people are not only not like them, but wildly not like them, along numerous metrics. It may also be that we don’t know how to deal with gossip at scale. Interpersonal gossip is all about personal stories, while many problems at scale are best understood through data—but the number of people deeply interested in data and data’s veracity is small. And elite discourse has some of its own possible epistemic falsehoods, or at least uncertainties, embedded within it: some of the populist rhetoric against elites is rooted in truth.

A surprisingly large number of freshmen don’t know the difference between fiction and nonfiction, or that novels are fiction. I was surprised when I first encountered confusion around these points; I’m not any longer. I don’t think the majority of freshmen confuse fiction and nonfiction, or genres of nonfiction, but enough do for the confusion to be a noticeable pattern (modern distinctions between fiction and nonfiction only really arose, I think, during the Enlightenment and the rise of the novel in the 18th Century, although off the top of my head I don’t have a good citation for this historical point, apart perhaps from Ian Watt’s work on the novel). Maybe online systems like Twitter or Facebook allow average users to revert to an earlier mode of discourse in which the border between fiction and nonfiction is more porous, and the online systems have strong fictional components that some users don’t care to segregate.

We are all caught in our bubble, and the universe of people is almost unimaginably larger than the number of people in our bubble. If you got this far, you’re probably in a nerd bubble: usually, anything involving the word “epistemology” sends people to sleep or, alternatively, scurrying for something like “You won’t believe what this celebrity wore/said/did” instead. Almost no one wants to consider epistemology; to do so as a hobby is rare. One person’s disinformation is another person’s teambuilding. If you think the preceding sentence is in favor of disinformation, by the way, it’s not.

Have journalists and academics become modern-day clerics?

This guy was wrongly and somewhat insanely accused of sexual impropriety by two neo-puritans; stories about individual injustice can be interesting, but this one seems like an embodiment of a larger trend, and, although the story is long and some of the author’s assumptions are dubious, I think there’s a different, conceivably better, takeaway than the one implied: don’t go into academia (at least the humanities) or journalism. Both fields are fiercely, insanely combative for very small amounts of money; because the money is so bad, many people get or stay in them for non-monetary ideological reasons, almost the way priests, pastors, or other religious figures used to choose low incomes and high purpose (or “purpose” if we’re feeling cynical). Not only that, but clerics often know the answer to the question before the question has even been asked, and they don’t need free inquiry because the answers are already available—attributes that are very bad, yet seem to be increasingly common, in journalism and academia.

Obviously journalism and academia have never been great fields for getting rich, but the business model for both has fallen apart in the last 20 years. The people willing to tolerate the low pay and awful conditions must have other motives (a few are independently wealthy) to go into them. I’m not arguing that other motives have never existed, but today you’d have to be absurdly committed to those other motives. That there are new secular religions is not an observation original to me, but once I heard that idea a lot of other strange-seeming things about modern culture clicked into place. Low pay, low status, and low prestige occupations must do something for the people who go into them.

Once an individual enters the highly mimetic and extremely ideological space, he becomes a good target for destruction—and makes a good scapegoat for anyone who is not getting the money or recognition they think they deserve. Or for anyone who is simply angry or feels ill-used. The people who are robust or anti-fragile stay out of this space.

Meanwhile, less ideological and much wealthier professions may not have been, or be, immune from the cultural psychosis in a few media and academic fields, but they’re much less susceptible to mimetic contagions and ripping-downs. The people in them have greater incomes and resources. They have a greater sense of doing something in the world that is not primarily intellectual, and thus probably not primarily mimetic and ideological.

There’s a personal dimension to these observations, because I was attracted to both journalism and academia, but the former has shed at least half its jobs over the last two decades and the latter became untenable post-2008. I’ve had enough interaction with both fields to get the cultural tenor of them, and smart people largely choose more lucrative and less crazy industries. Like many people attracted to journalism, I read books like All the President’s Men in high school and wanted to model myself on Woodward and Bernstein. But almost no reporters today are like Woodward and Bernstein. They’re more likely to be writing Buzzfeed clickbait, and nothing generates more clicks than outrage. Smart people interested in journalism can do a minimal amount of research and realize that the field is oversubscribed and should be avoided.

When I hear students say they’re majoring in journalism, I look at them cockeyed, regardless of gender; there’s fierce competition coupled with few rewards. The journalism industry has evolved to take advantage of youthful idealism, much like fashion, publishing, film, and a few other industries. Perhaps that is why these industries inspire so many insider satires: the gap between idealistic expectation and cynical reality is very wide.

Even if thousands of people read this and follow its advice, thousands more will keep attempting to claw their way into journalism or academia. It is an unwise move. We have people like David Graeber buying into the innuendo and career-attack culture. Smart people look at this and do something else, something where a random smear is less likely to cost an entire career.

We’re in the midst of a new-puritan revival and yet large parts of the media ecosystem are ignoring this idea, often because they’re part of it.

It is grimly funny to have read the first story linked next to a piece that quotes Solzhenitsyn: “To do evil a human being must first of all believe that what he’s doing is good, or else that it’s a well-considered act in conformity with natural law. . . . it is in the nature of a human being to seek a justification for his actions.” Ideology is back, and destruction is easier than construction. Our cultural immune system doesn’t seem to have figured this out yet. Short-form social media like Facebook and Twitter arguably encourage black-and-white thinking, because there’s not enough space to develop nuance. There is enough space, however, to say that the bad guy is right over there, and we should go attack that bad guy for whatever thought crimes or wrongthink they may have committed.

Ideally, academics and journalists come to a given situation or set of facts without knowing the answer in advance. In an ideal world, they try to figure out what’s true and why. “Ideal” is repeated twice because, historically, departures from the ideal are common, but ideological neutrality and an investigatory posture are preferable to knowing the answer in advance and judging people based on demographic characteristics and prearranged prejudices, yet those latter traits seem to have seeped into academic and journalistic cultures.

Combine this with present-day youth culture that equates feelings with facts and felt harm with real harm, and you get a pretty toxic stew—”toxic” being a favorite word of the new clerics. See further, America’s New Sex Bureaucracy. If you feel it’s wrong, it must be wrong, and probably illegal; if you feel it’s right, it must be right, and therefore desirable. This kind of thinking has generated some backlash, but not enough to save some of the demographic undesirables who wander into the kill zone of journalism or academia. Meanwhile, loneliness seems to be more acute than ever, and we’re stuck wondering why.