Surcease of Sorrow

I’m typing case notes in the hospice office when one of the home care nurses walks up and sings, “The son’s gonna come in to-mor-row.”

I chuckle-groan. Sons from out of town haven’t been around to experience the patient’s decline, so they can’t understand the decisions made by caregivers who have been. Sons from out of town, whether they are conscious of it or not, believe they can swoop in for a few days and fix whatever’s wrong. Sons from out of town, poor guys, are a pain to educate.

***

The next day, I ring the doorbell at the patient’s house. The son opens the door, walks out and closes it behind him. He crosses his arms over his chest and frowns down at me. “I’m taking Mom off hospice.”

Here we go. I keep my face and tone of voice neutral. “What does your mother say about that?”

He scoffs. “You’ve got her so doped up she doesn’t know what she’s saying. I’m doing what’s best for her.”

I try a sympathetic smile. “I know it’s hard to see your mom like this, but as long as she’s mentally competent, she has the legal right to make her own decisions.”

“It’s for her own good.”

The temperature is in the triple digits. I wish we could do this inside. “I understand you have her best interests at heart, but the law says two doctors have to agree she’s not competent before her health care power of attorney can take over her decision-making. And that’s your sister. Besides, would you want to put your mom through all that?”

He takes a step forward. I get ready to dodge. I probably won’t need to, but it’s best to prepare for the worst. He says, “You don’t know what you’re talking about.”

Why is it that ignorance and arrogance so often occur together?

Facts aren’t working, so I default to empathy. “If I’m hearing you right, you’re worried about the drugs we’re giving her.”

He spreads his arms wide and thrusts his face toward me. “Yes! She’s getting addicted!”

And there it is.

***

In my years of hospice work, dozens of patients have told me, “I’m not afraid of dying. I’m afraid of suffering.”

Death can be beautiful, but it is never pretty. And it’s always painful. As the natural dying process goes on, there’s more and more pain, and patients need higher and higher doses of morphine to ease it. But hospice patients don’t get addicted. Even if they did (which I repeat, they don’t), what would it matter? The dead don’t need drugs to be at peace.

But we as a society are so afraid of addiction that families cannot understand that it isn’t an issue for the dying. That fear stems from an inability to accept the coming death. Humans can only take reality in small doses, especially when we are about to lose someone we can’t bear to be without. Even people who believe in an afterlife sometimes doubt when they’re face-to-face with the possibility of final separation.

Because morphine and the end of people’s lives coincide, families often ask to have the patient taken off it during the final days of active dying. They say, “The morphine is killing him.”

When that happens, it’s the hospice staff’s unpleasant task to confront them with hard truths. “The (cancer, congestive heart failure, stroke, etc.) is killing him,” we say. “The choice you have to make is whether he dies in comfort or in pain. We can take him off the morphine, but do you really want to increase his suffering during his last days?” Families most often continue the morphine, but some have to see the consequences of removing it before they understand how necessary it is.

***

As a former substance abuse counselor and recovering alcoholic with almost fourteen years of sobriety—which could vanish in an instant, in any instant—I can say with authority that I understand addiction. In my opinion, addiction happens more because of emotional pain than because someone develops a physical tolerance for a medication. Until the past few years, the medical establishment did a good job of ensuring that treatment of bodily pain didn’t result in addiction. But emotional pain is often ignored because its disabling power is not well understood.

If you scratch the surface of any person you pass on the street, you will find sorrow, regret, guilt, shame, self-doubt, loneliness, and a pantheon of other woundedness. Not to mention scars from traumatic experiences. To be human is to carry tragedy in our hearts.

While effective treatments exist to deal with emotional pain—talk therapy, support groups, and psychiatric medications, among others—there is a stigma attached to people who admit they need that kind of help. Having emotional problems or an imbalance of brain chemicals is still often seen as a defect of character, a moral failing, weakness. And no one wants to be called crazy.

Drugs and alcohol can distract us from our emotional pain, for a moment. So can food, sex, and work. But that distraction, that “high,” is hard to maintain. The more people use their drug or activity of choice, the more they need to use it to get the same effect, until they have to use very large amounts just to be as miserable as they started out. And their emotional pain is right there waiting for them. This is why so many people become addicted to multiple substances or activities. For example, my parents had to drink two pots of coffee and smoke a pack of cigarettes just to make it out the door in the morning.

According to the Centers for Disease Control and Prevention (CDC), addiction to nicotine kills 480,000 people a year in this country.1 The CDC also reports that pregnant women who smoke give birth to more premature babies, more infants with low birth weight or weak lungs, and more babies prone to sudden infant death syndrome (SIDS).2 But, in my opinion, society doesn’t regard smokers as addicts because tobacco products are legal.

What would we learn if someone compared the number of deaths in this country caused by legal substances to the number of deaths caused by illegal substances?

Caffeine keeps the economy running. It makes it possible for us to put one foot in front of the other as we do things we don’t enjoy, in places we don’t want to be, with companions we had no say in choosing, five days a week.

According to a report from the Kuakini Medical Center in Honolulu, 90% of Americans consume caffeine regularly.3 The report calls caffeine this country’s most popular drug. Even though people build up a tolerance for caffeine and experience withdrawal from it, experts differ on whether it can be classified as addictive. Evidence that it is: the combination of caffeine and sugar has built a Starbucks on almost every street corner in the United States.

Because part of my stomach is paralyzed, I had to give up both caffeine and sugar. For me, it was harder to stop using caffeine than alcohol. Sugar was harder than both of those. Most of us know that sugar isn’t good for us, but we don’t think of ourselves as addicted.

Everyone’s addicted to something.

So it’s no wonder that when opioids became easy to get, people lined up for them. Without effective oversight, widespread addiction was inevitable. As I write this in early 2019, the United States is suffering an epidemic of deaths from opioid overdose.

What is it about the human race that compels us to monetize, and/or weaponize, every good thing?

***

As a species, humans are addicted to black-and-white thinking. We want things to be simple. We want things to stay the same. Good/evil, male/female, strength/weakness, patriotism/treason. We don’t, don’t, don’t want to consider multiple possibilities for these or anything else.

To experience this, just try to get people to vary the way they do their jobs. I’m willing to bet money that most of them won’t be able to see over the sides of the rut they’ve dug for themselves.

Ruts are comfortable. Ruts don’t require thinking. Ruts give us the illusion that the world is under control. Ruts make us feel safe.

Thinking about things in depth leads to uncertainty, which is the opposite of feeling safe.

***

Modern hospice care traces its roots to Dr. Cicely Saunders, who opened the first hospice in England in 1967.4 Her strategy—using multiple disciplines to treat the physical, psychological, and spiritual problems dying people encounter—proved so successful that other medical specialties adopted it. As better treatments for incurable diseases reduced the number of deaths and increased the number of patients living with chronic pain, acceptance of “palliative care” grew. Medical professionals began to focus on the dignity, integrity, and quality of life of people with ongoing, long-term pain. The modern Palliative Care Movement was born.

The World Health Organization (WHO) began researching palliative care in the 1980s. In 1990, the WHO published a report recognizing palliative care as a necessary part of patient treatment, and it became more possible for people worldwide to get adequate pain control without having to prove they were going to die soon.5

Which brings me to the final addiction I want to talk about: the addiction to the idea of cure. Cure is often only a fantasy. The more wedded people are to this fantasy, the more the patients involved suffer.

According to the National Hospice and Palliative Care Organization, in 2017, 27.8% of Medicare-funded hospice patients were on service for seven days or less.6 That includes the people who had the misfortune to die in the ambulance on the way from the hospital to the inpatient unit. The reason that percentage is so large is that so many patients and family members are addicted to the hope for a cure. They can’t let go of it. And they fear “giving up,” as though their mental attitude can control survival.

Not that mental attitude means nothing. I have often seen patients hang on until that last person comes and says goodbye, then draw their final breath as soon as that goodbye is over. In our hospice about half the people who died on our inpatient unit waited until they were alone to die.

***

The out-of-town son, Jackson, extends his stay as long as he can. He stays awake to watch over his mother at night, so his sister can get some sleep. I give him resources on overnight caregivers for after he leaves.

I ring the doorbell. Jackson answers and steps back to let me in. His head hangs heavy into his slumped shoulders. His skin is pale except for the purplish darkness around his eyes.

“You look exhausted.”

“Ya think? I’m making coffee. Want some?”

I shake my head. “I’ll look in on your mom.”

“We’ll be in the kitchen.”

The smell of active dying—a combination of concentrated urine and the acetone-like odor of dying cells—is an invisible fog in the unlit room. The odor and I are not friends, but we are longtime companions.

The patient’s hospital bed is set up in the living room so she can still feel like a part of the family’s life, and so she can see the Catalina Mountains out the picture window. Today, the curtains on the windows are closed.

I bend over the bed and touch her hand. “Hello, Cathy, it’s Peggy. Do you feel like talking?” Without opening her eyes, she moves her head one slow trip left, right, center. I squeeze and release her emaciated hand. “I’ll go talk to your kids, see if I can help them out in any way.” The corners of Cathy’s mouth turn up a little, and drop again.

The kitchen has a fog, as well—cigarette smoke. I sit at the table with Jackson and his sister, Treacy. No one speaks for a minute.

“Your mom’s very weak now,” I say.

They look at me with eyes that don’t want to know.

“It might not be long,” I add.

Treacy goes to the counter and pours more coffee. She stays there with her back turned.

Jackson clears his throat. “Will dying hurt?”

I know he has already asked the nurse the same question. And the nurse’s aide. And the chaplain. I give the same answer we always do. “Helping people be comfortable all the way to the end is the thing hospice does best.”

He rubs his hands across his face. “Yeah. Thank God.”

Treacy’s sob is so quiet it couldn’t be heard anywhere that was not a place of vigil.

After a respectful pause, I say, “What’s happening with hiring caregivers?”

***

Three days later, Cathy dies a pain-free death with her children by her side.

After the nurse pronounces the death, she and I clean Cathy’s body and re-dress her, so the children can say a last goodbye before the funeral home van comes. After that, the R.N. collects the several opiates Cathy took during her time in hospice and takes them to the hospital for disposal. As she heads out the door, Jackson gives her an awkward one-armed hug. He says, “Thanks for not letting her suffer.”

***

The overdose epidemic is not okay. We need to do something about it. But. I’m afraid humanity’s desire for simple solutions to complex problems will rocket us back to the days when non-hospice patients were denied pain control. I worry that our fears will overwhelm our capacity for empathy. I fear the return of preventable physical suffering, and I weep for the helpless patients who will be forced to endure it.

 

Reference List:

  1. Fast facts and fact sheets: smoking and tobacco use. Centers for Disease Control and Prevention. https://www.cdc.gov/tobacco/data_statistics/fact_sheets/fast_facts/index.htm. Updated February 6, 2019. Accessed September 11, 2019.
  2. Substance use during pregnancy. Centers for Disease Control and Prevention. https://www.cdc.gov/reproductivehealth/maternalinfanthealth/substance-abuse/substance-abuse-during-pregnancy.htm. Updated July 24, 2019. Accessed September 11, 2019.
  3. Caffeine: America’s most popular drug. Kuakini Health System. https://www.kuakini.org/wps/portal/public/Health-Wellness/Health-Info-Tips/Miscellaneous/Caffeine–America-s-Most-Popular-Drug. Updated 2019. Accessed September 11, 2019.
  4. Clark, D. Cicely Saunders and her early associates: a kaleidoscope of effects. In: To Comfort Always: A History of Palliative Medicine since the Nineteenth Century. Oxford, England: Oxford University Press; 2016:1-33. doi:10.1093/med/9780199674282.001.0001. Accessed September 12, 2019.
  5. Cancer pain relief and palliative care: report of a WHO expert committee (meeting held in Geneva from 3 to 10 July 1989). World Health Organization. http://apps.who.int/iris/handle/10665/39524. Published January 1, 1990. Accessed September 14, 2019.
  6. National Hospice and Palliative Care Organization (NHPCO) facts and figures: 2018 edition. NHPCO. https://www.nhpco.org/wp-content/uploads/2019/07/2018_NHPCO_Facts_Figures.pdf. Updated July 2, 2019. Accessed September 21, 2019.

Dressing the Mutton

I grow old . . . I grow old . . .
I shall wear the bottoms of my trousers rolled. —T.S. Eliot

Alice Lowe

I throw on purple tights and an orange racer-back tank top with the logo of Milestone Running—the San Diego store where I buy my athletic gear—emblazoned on the front. I’m out for an early five-mile run. After I return home and shower, I dress in jeans and a t-shirt for a day at my desk, add a hoodie when I walk out later to the library and grocery store. This is me, age seventy-four, on a typical Tuesday.

My grandmother died when she was younger than I am now. In my memory, she’s an old woman, blue-haired, stocky and shelf-bosomed. In a family photo from my brother’s 1956 high school graduation, she wears a dark shapeless dress under a long shapeless coat, a little pancake hat perched on her tight tinted coils. She dressed her age, like most sixty-something women of her day. In tights and a tank top or jeans and a t-shirt, she’d have caused a stir. She’d have been accused of making a pathetic attempt to pass herself off as younger, of being “mutton dressed as lamb.”

Will I wake up one Tuesday morning—next month or next year or ten years from now—and want to wear a powder-blue polyester tracksuit on my morning run? Will I deem it more appropriate, decide it’s time to dress my age? What does that mean, and is it still a valid consideration? One question conjures another; they pile up in my mind like back issues of The New Yorker. I’ve explored and written about women’s aging from both societal and personal perspectives, seeking answers to questions like these. The matter of dress presents itself as a thread worth following.

The expression “mutton dressed as lamb” originated in Britain, where both are commonly eaten. Lamb is under a year old, tender and desirable. Mutton is the meat from older, tougher, gamier sheep—an acquired taste, I’m told—often relegated to stews and soups. Hosts were known to try to fool their guests by having mutton roasts prepared and presented as lamb.

Lamb chops often are adorned with little paper frills—accessories like bows in the hair, scarves or beads around the neck—but lamb roasts are served without ornamental embellishments, so whatever made them lamb-like is a mystery. Recipes for mutton recommend marinating and tenderizing it, cooking it long and slow. It may turn out as a respectable roast, but no one’s going to mistake it for lamb. Mutton is mutton, and ostensibly you love it for itself.

In reference to women, “mutton dressed as lamb” was attributed to the Prince of Wales, later George IV, in the early nineteenth century. He used the expression to praise older women, however, not to denigrate them. He liked the frills or perhaps was titillated by the subterfuge: “Girls are not to my taste,” he said. “I don’t like lamb; but mutton dressed like lamb.” In recent times, it’s become a form of ridicule addressed toward older women who defy what they believe to be antiquated standards.

When we’re young we’re birds, chicks, fillies. Then, without warning, we’re no longer “spring chickens.” We become crows, hens, nags. One day we’re lamb; the next, mutton.

When I entered the workforce in the early sixties, women, regardless of their age, did not wear pants—slacks, trousers, or any legged garments—in business offices. Chanel and Schiaparelli designed women’s pantsuits in the thirties; Marlene Dietrich and Katharine Hepburn wore them on screen and in public; but they didn’t catch on with the greater population of women and were outlawed in the secretarial pool. My mother was hired as a bank teller in her fifties, becoming the oldest nonmanagement employee at the branch. She was caught in a limbo of needing to appear both young enough to be thought viable in her position and tastefully mature but not superannuated. We dressed much the same in skirts and blouses, dyed-to-match sweater sets, shirtwaist and A-line dresses, feminine-cut jackets. 

In my mother’s time, as in my grandmother’s, older women—whatever number is assigned to the category—were still expected to look and dress their age. They must be tidily outfitted and coiffed, with nothing too short or showy or sexy, nothing that would draw attention, nothing that would seem unseemly. Over recent decades, we’ve seen the relaxation of social mores and a loosening of formality in dress. The social revolution of the late sixties opened the door to an anything-goes assault—but on what? On hypocrisy and pretentiousness, some believed; on decorum and decency, according to others. How far one went was subject to cultural, socioeconomic, geographic, and generational divides.

Growing up in a San Diego beach town, I was aware that what was accepted in casual Southern California may not have washed on the East Coast, where my family originated, or even in sophisticated San Francisco, where my grandmother lived. I don’t recall particular shibboleths about dress, but my mother recited and adopted this refrain: when a woman reaches forty, her hair shouldn’t reach her shoulders.

Now it would be easy to say that standards of dress based on age are no longer a consideration. That we’re freed of constraints. Except that’s not entirely true. I can wear my skinny jeans, or I can don polyester with an elastic waist, but my choices are still subject to scrutiny and consequences. At stake is my identity, the statement I make about myself to the world. It’s a Catch-22. Women past a certain age are mocked if they dress like their daughters, dismissed if they dress like their mothers. They can straddle the middle, hug a safe space along the continuum, but they still teeter on the brink of uncertainty. Is there a right and a wrong, and is it as capricious as it appears? Is seventy the new fifty, or is it still irredeemably geriatric?

When I retired after thirty years in human services management, I donated my casual/professional nonprofit “dress-for-success” attire—suits, blazers, tailored slacks and blouses—to a program that helps homeless women get back into the workforce. I winnow my wardrobe periodically and discard garments I no longer wear, but I’ve never rejected anything because it was no longer age appropriate. I was always a conventional dresser and aimed for good taste and a mature-but-youthful appearance, so with the exception of a few impulse buys later regretted (like the open-weave crocheted mini number I wore to a friend’s wedding), my style of dress hasn’t changed.  

I’m bemused by this whole idea of appropriateness. Whether it pertains to dress or speech or behavior, it carries a weight of judgment: appropriate (fitting, suitable, seemly, apt, proper, correct) according to whom, and to what standard? With no fixed criteria, it becomes a precarious balancing act. Even in that seemingly safe middle ground, scrutiny dangles over you. Magazines and the internet, as wide-ranging as The New York Times, Cosmopolitan, and Forbes, proffer advice: how to know if you’re dressing too old, too young, or just right—like Goldilocks sampling the three bears’ porridge—in six signs, nine mistakes, ten tips. At some indeterminate point, showing skin (midriff, cleavage, thigh), wearing t-shirts with slogans, or borrowing your daughter’s clothes may trigger an alarm, a flashing neon sign—“Act your age!”—or a convulsive shock.

Arbiters of fashion pay lip service to a liberating consensus—you should wear whatever makes you feel confident and comfortable. But there’s a caveat: don’t take this freedom too far. The women surveyed in Women in Clothes (2014), by Sheila Heti, Heidi Julavits, and Leanne Shapton, took a fatalistic view; they appeared ready to give up the fight before it began. The book’s older contingent consisted of women in their forties, whose days, to hear them tell it, were dwindling down to a precious few: “After forty it’s time to give up your game.” “I’m almost forty, but I still want to wear tight pants.” “…straddle the line between slutty and frumpy.” They sound as if they’d stepped into a bottomless middle-aged crevasse. My grandmother at forty might have been one of the respondents.

In vivid contrast, Ari Seth Cohen declares in “Advanced Style” (books, website, and documentary), “The ladies I photograph challenge stereotypical views on age and aging. They are youthful in mind and spirit and express themselves through personal style and individual creativity.” Cohen’s subjects are a striking bunch—some glamorous and elegant, others ranging from quirky to outrageous, all over sixty, all eye-stopping. For them, “appropriate” could be a verb: to wrest, seize, commandeer. They’ve laid claim to the art and fun of dressing up, of making a statement, of performing. Their age is a license to dress as lamb or lion or lemon meringue pie, and they go to great lengths to create their public personas.       

As a sociology major in the eighties, I was surprised and delighted to learn that one of my instructors had studied dress as it defines our identities, that such a topic was deemed worthy of sociological research. Another professor published a paper on thrift-store shopping. She found that women’s behavior and purchases in second-hand stores were more adventurous and assertive, less constrained by societal “shoulds,” than in department stores. These projects employed qualitative methodology—interviews, observation, direct experience—rather than quantitative. They drew from Erving Goffman’s seminal 1956 work, The Presentation of Self in Everyday Life, which posits that social interaction is a kind of theatrical performance, that we’re acting out roles, whether in front of an audience or backstage. As one of Ari Cohen’s models remarked, “We must dress every day for the theatre of our lives.”

In the perennial student costume of jeans and tee, I don my sociologist/detective hat and magnifying glass, a la Sherlock Holmes, and take up my post in a carrel at the San Diego State University library, nested among the same stacks I inhabited as an undergraduate. I scan the shelves and peruse professional journals for “aging and dress” and find its continued significance in provocative titles that include:

    • “‘Growing old gracefully’ as opposed to ‘MDAL’: the social construction of recognizing older women”
    • “‘No one expects me anywhere’: invisible women, ageing & the fashion industry”
    • “‘Bat wings, bunions, & turkey wattles’: body transgressions & older women’s strategic clothing choice”

Fashion and Age: Dress, the Body and Later Life is grounded in fashion theory and cultural gerontology. Working from the premise that age is a social division like gender, race, class, sexuality, or disability, author Julia Twigg found that while aging has become more fluid as a result of the Baby Boomer effect, it is still associated with a toned-down, self-effacing presentation. A moral code of dress continues to exist, and it’s fraught with ambiguity. Don’t dress too young, expose too much, let yourself go. Do resist age, be up to date and well dressed. The women surveyed wanted neither to stand out nor to disappear—the message they sought to convey was “look at me, I’m not invisible,” echoing another of Ari Cohen’s subjects: “I would rather be considered different and somewhat mysterious than ignored.” I may never make the pages of “Advanced Style,” but I recoil at the idea of “toned-down and self-effacing,” of invisibility.

My investigations reveal a scantily clad elephant in the room. Sex has been glaringly absent and/or discreetly swept under the bed. The literature is silent about sexuality in older women, lingering sexual taboos, or the conflict between the two as they relate to the way we dress. Changes wrought by the sixties became a demarcation point between my mother’s generation and my own; now Baby Boomers are entering their seventies with visions of eternal youth and undiminished libido. Today most people would agree that older women are sexual beings and as such can, if they choose, announce it in their appearance and dress. Celebrities are trotted out as evidence—Helen Mirren in a bikini, Jane Fonda in a miniskirt, Sophia Loren in anything—but it wasn’t apparent in Vogue or Sociology Today.

In her 2008 memoir, Somewhere Towards the End, Diana Athill reflects, at ninety, on differences between present and past for old women. In her grandmother’s day, women wore what amounted to a uniform; they “went a bit drab and shapeless, making it clear that this person no longer attempted to be attractive.” Old women still can’t get away with dressing like teenagers, she says, “but I have a freedom of choice undreamt of by my grandmothers.” She recalls past loves and writes fondly of the sexual relationship that “accompanied me over the frontier between late middle-age and being old.” At its end, past seventy, “I might not look, or even feel, all that old, but I had ceased to be a sexual being.”

No one wants to be old, fat, ugly, undesirable. Age is in conflict with society’s focus on youth, beauty, and sex. The fashion industry, keen to capitalize on the lucrative gray market, promotes “new ways of being older” yet still bows to the ambiguous standard of age appropriateness. Complicit are bodily changes that accompany age and defy fashion. The response is to cover up and hide sags, spots, veins, and wrinkles in longer skirts and higher necks, darker colors (as if in mourning), or bland neutrals. Or to adapt, by means of elastic waistbands and stretchy fabrics, toddler-like shapes and asexual styles. Think Annie Hall’s “Grammy” in lavender and lace, my grammy in pastel prints. In “I Feel Bad About My Neck,” Nora Ephron observes that she and her women friends wear turtlenecks, scarves, and mandarin collars to hide their telltale crepey necks. She calls it compensatory dressing. For me it’s jeans like NYDJ (Not Your Daughter’s Jeans), with high waists and hidden tummy tuckers, fitted to enhance mature bodies without sacrificing style.

In my working years, I shopped equally at Nordstrom and second-hand stores. I combined smart professional wear with shabby chic to stretch my dollars and achieve my take on an Annie Hall-in-middle-management look. I still rifle through the racks at thrifts for jackets, workout gear, and t-shirts (with or without logos). My body is contoured differently from when I was young—there’s been some shift over time—but I’m still slender and toned. People say I look younger than my age, including my daughter, who will be the first to tell me if and when I’m out of line.

Thanks to cultural and social changes over the past fifty years, we have the freedom Diana Athill celebrated. We no longer have to dress in high collars and low hems or accept that being old means being dismissed, undervalued. We’re still sexual beings. But capricious and often concealed standards and expectations, rights and wrongs, remain. If I’ve exaggerated the dichotomy between mutton and lamb, it’s to emphasize that a quandary still exists. Older women walk a tightrope of judgment, because, like upstart toddlers or defiant teens, we’re still seen as needing to be controlled. Seventy is still seventy, but a new seventy, and yes, this is what it looks like—it can wear purple tights and orange tanks. I don’t think a powder-blue polyester tracksuit is in my future, but I have that choice too.
