I grew up in rural Herefordshire, deeply entrenched in its farming community. So this article strikes a very poignant chord for me, as it goes to the heart of one of the hardest conflicts I face as a vegan with my background. How can I be comfortable with and respect my friends and family who make a living doing something I intrinsically believe is cruel and wrong? Many of my farming friends are sadly turning to this form of factory chicken farming in order to try to stay financially afloat. I have huge sympathy for how hard farmers are finding it to make a living – especially the potato and dairy farmers, many of whom are going under all over the UK or having to diversify away from what they have done for generations. But does that excuse them turning to such a depraved method of farming? Who am I to think badly of someone trying to keep their family above water? At what point do their immediate needs take priority over my ethical ideals?
As a passionate vegan, everything about this form of factory farming appals me – both ethically and environmentally. But while farmers feel they have no other option, they will continue down this route of desperate mass farming, which spells bad news for us, the animals and the environment. The responsibility ultimately lies with the consumers. When will we wake up to the effects our everyday choices have on the world at large? When will we stop demanding cheaper and cheaper meat and dairy products in greater and greater quantities, at the expense of our own personal health, the animals’ rights and the health of the environment?
The below article is from George Monbiot’s website and was published yesterday in the Guardian:
Fowl Deeds
The astonishing, multiple crises caused by chicken farming.
(By George Monbiot, published in the Guardian 20th May 2015)
It’s the insouciance that baffles me. To participate in the killing of an animal: this is a significant decision. It spreads like a fungal mycelium into the heartwood of our lives. Yet many people eat meat sometimes two or three times a day, casually and hurriedly, often without even marking the fact.
I don’t mean to blame. Billions are spent, through advertising and marketing, to distract and mollify, to trivialise the weighty decisions we make, to ensure we don’t connect. Even as we search for meaning and purpose, we want to be told that our actions are inconsequential. We seek reassurance that we are significant, but that what we do is not.
It’s not blind spots we suffer from. We have vision spots, tiny illuminated patches of perception, around which everything else is blanked out. How often have I seen environmentalists gather to bemoan the state of the world, then repair to a restaurant in which they gorge on beef or salmon? The Guardian and Observer urge us to go green, then publish recipes for fish whose capture rips apart the life of the sea.
The television chefs who bravely sought to break this spell might have been talking to the furniture. Giant chicken factories are springing up throughout the west of England, the Welsh Marches and the lowlands of the east. I say factories for this is what they are: you would picture something quite different if I said farm; they are hellish places. You might retch if you entered one, yet you eat what they produce without thinking.
Two huge broiler units are now being planned to sit close to where the River Dore rises, at the head of the Golden Valley in Herefordshire, one of the most gorgeous landscapes in Britain. Each shed at Bage Court Farm – a warehouse 90 metres long – is likely to house about 40,000 birds, which will be cleared out, killed and replaced every 40 days or so. It remains to be seen how high the standards of welfare, employment and environment will be.
The UK now has some 2,000 of these factories, to meet a demand for chicken that has doubled in 40 years*. Because everything is automated, they employ few people, and those in hideous jobs: picking up and binning the birds that drop dead every day, catching chickens for slaughter in a flurry of shit and feathers, then scraping out the warehouses before the next batch arrives.
The dust such operations raise is an exquisite compound of aerialised faeces, chicken dander, mites, bacteria, fungal spores, mycotoxins, endotoxins, veterinary medicines, pesticides, ammonia and hydrogen sulphide. It is listed as a substance hazardous to health, and helps explain why 15% of poultry workers suffer from chronic bronchitis. Yet, uniquely in Europe, the British government classifies unfiltered roof vents on poultry sheds as the “best available technology”. If this were any other industry, it would be obliged to build a factory chimney to disperse the dust and the stink. But farming, as ever, is protected by deference and vested interest, excused from the regulations, planning conditions and taxes other businesses must observe. Already, Herefordshire County Council has approved chicken factories close to schools, without surveying the likely extent of the dust plumes either before or after the business opens. Bage Court Farm is just upwind of the village of Dorstone.
Inside chicken factories are scenes of cruelty practised on such a scale that they almost lose their ability to shock. Bred to grow at phenomenal speeds, many birds collapse under their own weight, and lie in the ammoniacal litter, acquiring burns on their feet and legs and lesions on their breasts. After slaughter they are graded. Those classified as grade A can be sold whole. The others must have parts of the body removed, as they are disfigured by bruising, burning and necrosis. The remaining sections are cut up and sold as portions. Hungry yet?
Plagues spread fast through such factories, so broiler businesses often dose their birds with antibiotics. These require prescriptions but – amazingly – the government keeps no record of how many are issued. The profligate use of antibiotics on farms endangers human health, as it makes bacterial resistance more likely.
But Herefordshire, like other county councils in the region, scarcely seems to care. How many broiler units has it approved? Who knows? Searches by local people suggest 42 in the past 12 months. But in December the council claimed it had authorised 21 developments since 2000§. This week it told me it had granted permission for 31 since 2010. It admits that it “has not produced any specific strategy for managing broiler unit development”¤. Nor has it assessed the cumulative impact of these factories. At Bage Court Farm, as elsewhere, it has decided that no environmental impact assessment is neededɷ.
So how should chicken be produced? The obvious answer is free range, but this exchanges one set of problems for another. Chicken dung is rich in soluble reactive phosphate. Large outdoor flocks lay down a scorching carpet of droppings, from which phosphate can leach or flash into the nearest stream. Rivers like the Ithon, in Powys, are said to run white with chicken faeces after rainstorms. The River Wye, a special area of conservation, is blighted by algal blooms: manure stimulates the growth of green murks and green slimes that kill fish and insects when they rot. Nor does free range solve the feed problem: the birds are usually fed on soya, for which rainforests and cerrado on the other side of the world are wrecked.
There is no sensible way of producing the amount of chicken we eat. Reducing the impact means eating less meat – much less. I know that most people are not prepared to stop altogether, but is it too much to ask that we should eat meat as our grandparents did, as something rare and special, rather than as something we happen to be stuffing into our faces while reading our emails? To recognise that an animal has been sacrificed to serve our appetites, to observe the fact of its death, is this not the least we owe it?
Knowing what we do and what we induce others to do is a prerequisite for a life that is honest and meaningful. We owe something to ourselves as well: to overcome our disavowal, and connect.
* Total purchases for household consumption (uncooked, pre-cooked and take-aways combined) rose from 126 grammes per person per week in 1974 to 259 grammes in 2013 (see the database marked UK – household purchases).
§ BBC Hereford and Worcester, 15th December 2014
¤ Response to FoI request IAT 7856, 13th August 2014
ɷ Herefordshire County Council, 22nd December 2014. Screening Determination of Bage Court Farm development, P143343/F
Skool of Vegan is a new initiative aimed at getting people to look at their eating habits and attitudes towards animals in a more critical way. Their mission statement is: ‘Because making the connection is child’s play’. It certainly makes for some uncomfortable reading, and I admire their original approach. Whether you like the drawings or not, it’s hard to deny the underlying truth, and I think they do a good job of highlighting the hypocrisy and inconsistencies in what we teach our kids. I think it’s probably a little too heavy-handed for most people’s taste, though, and so I doubt they will reach people in the way they’d like to. Perhaps a less aggressive tone might have spoken to more people…? What do you think? Here are a few…
Have just watched this feature length documentary on veganism and would highly recommend it to everyone, vegan or not.
It examines our relationship with animals, the history of veganism and the ethical, environmental and health reasons that move people to go vegan.
Food scandals, climate change, lifestyle diseases and ethical concerns move more and more people to reconsider eating animals and animal products. From butcher to vegan chef, from factory farmer to farm sanctuary owner – Live and Let Live tells the stories of six individuals who decided to stop consuming animal products for different reasons and shows the impact the decision has had on their lives.
Philosophers such as Peter Singer, Tom Regan and Gary Francione join scientists T. Colin Campbell and Jonathan Balcombe and many others to shed light on the ethical, health and environmental perspectives of veganism.
Through these stories, Live and Let Live showcases the evolution of veganism from its origins in London in 1944 to one of the fastest-growing lifestyles worldwide, with more and more people realising that what’s on their plates matters to animals, the environment and, ultimately, themselves.
Here’s a great article by the wonderfully eloquent and engaging George Monbiot which was published in The Guardian on the 16th Dec 2014.
If you must eat meat, save it for Christmas
From chickens pumped with antibiotics to the environmental devastation caused by production, we need to realise we are not fed with happy farm animals.
What can you say about a society whose food production must be hidden from public view? In which the factory farms and slaughterhouses supplying much of our diet must be guarded like arsenals to prevent us from seeing what happens there? We conspire in this concealment: we don’t want to know. We deceive ourselves so effectively that much of the time we barely notice that we are eating animals, even during once-rare feasts, such as Christmas, which are now scarcely distinguished from the rest of the year.
It begins with the stories we tell. Many of the books written for very young children are about farms, but these jolly places in which animals wander freely, as if they belong to the farmer’s family, bear no relationship to the realities of production. The petting farms to which we take our children are reifications of these fantasies. This is just one instance of the sanitisation of childhood, in which none of the three little pigs gets eaten and Jack makes peace with the giant, but in this case it has consequences.
Labelling reinforces the deception. As Philip Lymbery points out in his book Farmageddon, while the production method must be marked on egg boxes in the EU, there are no such conditions on meat and milk. Meaningless labels such as “natural” and “farm fresh”, and worthless symbols such as the little red tractor, distract us from the realities of broiler units and intensive piggeries. Perhaps the most blatant diversion is “corn-fed”. Most chickens and turkeys eat corn, and it’s a bad thing, not a good one.
The growth rate of broiler chickens has quadrupled in 50 years: they are now killed at seven weeks. By then they are often crippled by their own weight. Animals selected for obesity cause obesity. Bred to bulge, scarcely able to move, overfed, factory-farmed chickens now contain almost three times as much fat as chickens did in 1970, and just two thirds of the protein. Stalled pigs and feedlot cattle have undergone a similar transformation. Meat production? No, this is fat production.
Sustaining unhealthy animals in crowded sheds requires lashings of antibiotics. These drugs also promote growth, a use that remains legal in the United States and widespread in the European Union, under the guise of disease control. In 1953, Lymbery notes, some MPs warned in the House of Commons that this could cause the emergence of disease-resistant pathogens. They were drowned out by laughter. But they were right.
What comes out is as bad as what goes in. The manure from factory farms is spread ostensibly as fertiliser, but often in greater volumes than crops can absorb: arable land is used as a dump. It sluices into rivers and the sea, creating dead zones sometimes hundreds of miles wide. Lymbery reports that beaches in Brittany, where there are 14 million pigs, have been smothered by so much seaweed, whose growth is promoted by manure, that they have had to be closed as a lethal hazard: one worker scraping it off the shore apparently died of hydrogen sulphide poisoning, caused by the weed’s decay.
Four years ago, I softened my position on meat-eating after reading Simon Fairlie’s book Meat: A Benign Extravagance. Fairlie pointed out that around half the current global meat supply causes no loss to human nutrition. In fact it delivers a net gain, as it comes from animals eating grass and crop residues that people can’t consume.
Since then, two things have persuaded me that I was wrong to have changed my mind. The first is that my article was used by factory farmers as a vindication of their monstrous practices. The subtle distinctions Fairlie and I were trying to make turn out to be vulnerable to misrepresentation.
The second is that while researching my book Feral, I came to see that our perception of free-range meat has also been sanitised. The hills of Britain have been sheepwrecked – stripped of their vegetation, emptied of wildlife, shorn of their capacity to hold water and carbon – all in the cause of minuscule productivity. It is hard to think of any other industry, except scallop dredging, with a higher ratio of destruction to production. As wasteful and destructive as feeding grain to livestock is, ranching could be even worse. Meat is bad news, in almost all circumstances.
So why don’t we stop? Because we don’t know the facts, and because we find it difficult even if we do. A survey by the US Humane Research Council discovered that only 2% of Americans are vegetarians or vegans, and more than half give up within a year. Eventually, 84% lapse. One of the main reasons, the survey found, is that people want to fit in. We might know it’s wrong, but we block our ears and carry on.
I believe that one day artificial meat will become commercially viable, and that it will change social norms. When it becomes possible to eat meat without keeping and slaughtering livestock, live production will soon be perceived as unacceptable. But this is a long way off. Until then, perhaps the best strategy is to encourage people to eat as our ancestors did. Rather than mindlessly consuming meat at every meal, we should think of it as an extraordinary gift: a privilege, not a right. We could reserve meat for a few special occasions, such as Christmas, and otherwise eat it no more than once a month.
All children should be taken by their schools to visit a factory pig or chicken farm, and to an abattoir, where they should be able to witness every stage of slaughter and butchery. Does this suggestion outrage you? If so, ask yourself what you are objecting to: informed choice, or what it reveals? If we cannot bear to see what we eat, it is not the seeing that’s wrong, it’s the eating.
Barley Rose MacLaren zoomed into the world on Tuesday at 3pm, weighing a hefty 9lb, deliciously pink and chubby. Hoorah!
This was the first pregnancy I have experienced as a vegan and I have to say the stats are pretty compelling…
I only put on a stone and a half in the whole pregnancy (2 and a quarter with Arcadia and 2 with Indigo), and Barley was not only the biggest (Arcadia 8lb 2, Indigo 8lb 10) but by far the chubbiest and pinkest baby of all three. I’m also 4-5 years older than I was with the last 2 pregnancies and yet had much more energy throughout.
2 very proud big sisters!
Being pregnant is great as it means you have your bloods, urine, heart rate, blood pressure etc tested regularly. Mine all tested great throughout. So to all those out there who still believe you need meat, dairy and eggs to get enough calcium, iron, potassium etc into your diet – it is absolutely NOT true. So long as you eat sensibly and take a vitamin B12 supplement then a well balanced vegan diet will give you all the nutrition you need! Add to that the massive increase in energy, the sharpened clarity of thought, the ‘feel good factor’ of knowing that you are eating much more compassionately and I just can’t recommend it enough!
Right – I’m off to feed Barley.
Next post should undoubtedly be about the dairy industry as there is nothing like breastfeeding your baby to remind you how appallingly cruel the dairy industry is…
Here’s the article recommended in the previous post… well worth reading!
This article was written by Maryn McKenna and produced in collaboration with the Food & Environment Reporting Network, an independent, non-profit news organization producing investigative reporting on food, agriculture and environmental health.
After 85 years, antibiotics are growing impotent. So what will medicine, agriculture and everyday life look like if we lose these drugs entirely?
A few years ago, I started looking online to fill in chapters of my family history that no one had ever spoken of. I registered on Ancestry.com, plugged in the little I knew, and soon was found by a cousin whom I had not known existed, the granddaughter of my grandfather’s older sister. We started exchanging documents: a copy of a birth certificate, a photo from an old wedding album. After a few months, she sent me something disturbing.
It was a black-and-white scan of an article clipped from the long-gone Argus of Rockaway Beach, New York. In the scan, the type was faded and there were ragged gaps where the soft newsprint had worn through. The clipping must have been folded and carried around a long time before it was pasted back together and put away. The article was about my great-uncle, the younger brother of my cousin’s grandmother and my grandfather.
In a family that never talked much about the past, he had been discussed even less than the rest. I knew he had been a fireman in New York City and died young, and that his death scarred his family with a grief they never recovered from. I knew that my father, a small child when his uncle died, was thought to resemble him. I also knew that when my father made his Catholic confirmation a few years afterward, he chose as his spiritual guardian the saint that his uncle had been named for: St. Joseph, the patron of a good death.
I had always heard Joe had been injured at work: not burned, but bruised and cut when a heavy brass hose nozzle fell on him. The article revealed what happened next. Through one of the scrapes, an infection set in. After a few days, he developed an ache in one shoulder; two days later, a fever. His wife and the neighborhood doctor struggled for two weeks to take care of him, then flagged down a taxi and drove him fifteen miles to the hospital in my grandparents’ town. He was there one more week, shaking with chills and muttering through hallucinations, and then sinking into a coma as his organs failed. Desperate to save his life, the men from his firehouse lined up to give blood. Nothing worked. He was thirty when he died, in March 1938.
The date is important. Five years after my great-uncle’s death, penicillin changed medicine forever. Infections that had been death sentences—from battlefield wounds, industrial accidents, childbirth—suddenly could be cured in a few days. So when I first read the story of his death, it lit up for me what life must have been like before antibiotics started saving us.
Lately, though, I read it differently. In Joe’s story, I see what life might become if we did not have antibiotics any more.
Predictions that we might sacrifice the antibiotic miracle have been around almost as long as the drugs themselves. Penicillin was discovered in 1928, and battlefield casualties got the first non-experimental doses in 1943, quickly saving soldiers who had been close to death. But just two years later, the drug’s discoverer Sir Alexander Fleming warned that its benefit might not last. Accepting the 1945 Nobel Prize in Medicine, he said:
“It is not difficult to make microbes resistant to penicillin in the laboratory by exposing them to concentrations not sufficient to kill them… There is the danger that the ignorant man may easily underdose himself and by exposing his microbes to non-lethal quantities of the drug make them resistant.”
As a biologist, Fleming knew that evolution was inevitable: sooner or later, bacteria would develop defenses against the compounds the nascent pharmaceutical industry was aiming at them. But what worried him was the possibility that misuse would speed the process up. Every inappropriate prescription and insufficient dose given in medicine would kill weak bacteria but let the strong survive. (As would the micro-dose “growth promoters” given in agriculture, which were invented a few years after Fleming spoke.) Bacteria can produce another generation in as little as twenty minutes; with tens of thousands of generations a year working out survival strategies, the organisms would soon overwhelm the potent new drugs.
Fleming’s prediction was correct. Penicillin-resistant staph emerged in 1940, while the drug was still being given to only a few patients. Tetracycline was introduced in 1950, and tetracycline-resistant Shigella emerged in 1959; erythromycin came on the market in 1953, and erythromycin-resistant strep appeared in 1968. As antibiotics became more affordable and their use increased, bacteria developed defenses more quickly. Methicillin arrived in 1960 and methicillin resistance in 1962; levofloxacin in 1996 and the first resistant cases the same year; linezolid in 2000 and resistance to it in 2001; daptomycin in 2003 and the first signs of resistance in 2004.
With antibiotics losing usefulness so quickly — and thus not making back the estimated $1 billion per drug it costs to create them — the pharmaceutical industry lost enthusiasm for making more. In 2004, there were only five new antibiotics in development, compared to more than 500 chronic-disease drugs for which resistance is not an issue — and which, unlike antibiotics, are taken for years, not days. Since then, resistant bugs have grown more numerous and by sharing DNA with each other, have become even tougher to treat with the few drugs that remain. In 2009, and again this year, researchers in Europe and the United States sounded the alarm over an ominous form of resistance known as CRE, for which only one antibiotic still works.
Health authorities have struggled to convince the public that this is a crisis. In September, Dr. Thomas Frieden, the director of the U.S. Centers for Disease Control and Prevention, issued a blunt warning: “If we’re not careful, we will soon be in a post-antibiotic era. For some patients and some microbes, we are already there.” The chief medical officer of the United Kingdom, Dame Sally Davies — who calls antibiotic resistance as serious a threat as terrorism — recently published a book in which she imagines what might come next. She sketches a world where infection is so dangerous that anyone with even minor symptoms would be locked in confinement until they recover or die. It is a dark vision, meant to disturb. But it may actually underplay what the loss of antibiotics would mean.
In 2009, three New York physicians cared for a sixty-seven-year-old man who had major surgery and then picked up a hospital infection that was “pan-resistant” — that is, responsive to no antibiotics at all. He died fourteen days later. When his doctors related his case in a medical journal months afterward, they still sounded stunned. “It is a rarity for a physician in the developed world to have a patient die of an overwhelming infection for which there are no therapeutic options,” they said, calling the man’s death “the first instance in our clinical experience in which we had no effective treatment to offer.”
They are not the only doctors to endure that lack of options. Dr. Brad Spellberg of UCLA’s David Geffen School of Medicine became so enraged by the ineffectiveness of antibiotics that he wrote a book about it.
“Sitting with a family, trying to explain that you have nothing left to treat their dying relative — that leaves an indelible mark on you,” he says. “This is not cancer; it’s infectious disease, treatable for decades.”
As grim as they are, in-hospital deaths from resistant infections are easy to rationalize: perhaps these people were just old, already ill, different somehow from the rest of us. But deaths like this are changing medicine. To protect their own facilities, hospitals already flag incoming patients who might carry untreatable bacteria. Most of those patients come from nursing homes and “long-term acute care” (an intensive-care alternative where someone who needs a ventilator for weeks or months might stay). So many patients in those institutions carry highly resistant bacteria that hospital workers isolate them when they arrive, and fret about the danger they pose to others. As infections become yet more dangerous, the healthcare industry will be even less willing to take such risks.
Those calculations of risk extend far beyond admitting possibly contaminated patients from a nursing home. Without the protection offered by antibiotics, entire categories of medical practice would be rethought.
Many treatments require suppressing the immune system, to help destroy cancer or to keep a transplanted organ viable. That suppression makes people unusually vulnerable to infection. Antibiotics reduce the threat; without them, chemotherapy or radiation treatment would be as dangerous as the cancers they seek to cure. Dr. Michael Bell, who leads an infection-prevention division at the CDC, told me: “We deal with that risk now by loading people up with broad-spectrum antibiotics, sometimes for weeks at a stretch. But if you can’t do that, the decision to treat somebody takes on a different ethical tone. Similarly with transplantation. And severe burns are hugely susceptible to infection. Burn units would have a very, very difficult task keeping people alive.”
Doctors routinely perform procedures that carry an extraordinary infection risk unless antibiotics are used. Chief among them: any treatment that requires the construction of portals into the bloodstream and gives bacteria a direct route to the heart or brain. That rules out intensive-care medicine, with its ventilators, catheters, and ports—but also something as prosaic as kidney dialysis, which mechanically filters the blood.
Next to go: surgery, especially on sites that harbor large populations of bacteria such as the intestines and the urinary tract. Those bacteria are benign in their regular homes in the body, but introduce them into the blood, as surgery can, and infections are practically guaranteed. And then implantable devices, because bacteria can form sticky films of infection on the devices’ surfaces that can be broken down only by antibiotics.
Dr. Donald Fry, a member of the American College of Surgeons who finished medical school in 1972, says: “In my professional life, it has been breathtaking to watch what can be done with synthetic prosthetic materials: joints, vessels, heart valves. But in these operations, infection is a catastrophe.” British health economists with similar concerns recently calculated the costs of antibiotic resistance. To examine how it would affect surgery, they picked hip replacements, a common procedure in once-athletic Baby Boomers. They estimated that without antibiotics, one out of every six recipients of new hip joints would die.
Antibiotics are administered prophylactically before operations as major as open-heart surgery and as routine as Caesarean sections and prostate biopsies. Without the drugs, the risks posed by those operations, and the likelihood that physicians would perform them, will change.
“In our current malpractice environment, is a doctor going to want to do a bone marrow transplant, knowing there’s a very high rate of infection that you won’t be able to treat?” asks Dr. Louis Rice, chair of the department of medicine at Brown University’s medical school. “Plus, right now healthcare is a reasonably free-market, fee-for-service system; people are interested in doing procedures because they make money. But five or ten years from now, we’ll probably be in an environment where we get a flat sum of money to take care of patients. And we may decide that some of these procedures aren’t worth the risk.”
Medical procedures may involve a high risk of infections, but our everyday lives are pretty risky too. One of the first people to receive penicillin experimentally was a British policeman, Albert Alexander. He was so riddled with infection that his scalp oozed pus and one eye had to be removed. The source of his illness: scratching his face on a rosebush. (There was so little penicillin available that, though Alexander rallied at first, the drug ran out, and he died.)
Before antibiotics, five women died out of every 1,000 who gave birth. One out of nine people who got a skin infection died, even from something as simple as a scrape or an insect bite. Three out of ten people who contracted pneumonia died from it. Ear infections caused deafness; sore throats were followed by heart failure. In a post-antibiotic era, would you mess around with power tools? Let your kid climb a tree? Have another child?
“Right now, if you want to be a sharp-looking hipster and get a tattoo, you’re not putting your life on the line,” says the CDC’s Bell. “Botox injections, liposuction, those become possibly life-threatening. Even driving to work: We rely on antibiotics to make a major accident something we can get through, as opposed to a death sentence.”
Bell’s prediction is a hypothesis for now—but infections that resist even powerful antibiotics have already entered everyday life. Dozens of college and pro athletes, most recently Lawrence Tynes of the Tampa Bay Buccaneers, have lost playing time or entire seasons to infections with drug-resistant staph, MRSA. Girls who sought permanent-makeup tattoos have lost their eyebrows after getting infections. Last year, three members of a Maryland family — an elderly woman and two adult children — died of resistant pneumonia that took hold after simple cases of flu.
At UCLA, Spellberg treated a woman with what appeared to be an everyday urinary-tract infection — except that it was not quelled by the first round of antibiotics, or the second. By the time he saw her, she was in septic shock, and the infection had destroyed the bones in her spine. A last-ditch course of the only remaining antibiotic saved her life, but she lost the use of her legs. “This is what we’re in danger of,” he says. “People who are living normal lives who develop almost untreatable infections.”
In 2009, Tom Dukes — a fifty-four-year-old inline skater and body-builder — developed diverticulosis, a common problem in which pouches develop in the wall of the intestine. He was coping with it, watching his diet and monitoring himself for symptoms, when searing cramps doubled him over and sent him to urgent care. One of the thin-walled pouches had torn open and dumped gut bacteria into his abdomen — but for reasons no one could explain, what should have been normal E. coli were instead highly drug-resistant. Doctors excised eight inches of his colon in emergency surgery. Over several months, Dukes recovered with the aid of last-resort antibiotics, delivered intravenously. For years afterward, he was exhausted and in pain. “I was living my life, a really healthy life,” he says. “It never dawned on me that this could happen.”
Dukes believes, though he has no evidence, that the bacteria in his gut became drug-resistant because he ate meat from animals raised with routine antibiotic use. That would not be difficult: most meat in the United States is grown that way. To varying degrees depending on their size and age, cattle, pigs, and chickens — and, in other countries, fish and shrimp — receive regular doses to speed their growth, increase their weight, and protect them from disease. Out of all the antibiotics sold in the United States each year, 80 percent by weight are used in agriculture, primarily to fatten animals and protect them from the conditions in which they are raised.
A growing body of scientific research links antibiotic use in animals to the emergence of antibiotic-resistant bacteria: in the animals’ own guts, in the manure that farmers use on crops or store on their land, and in human illnesses as well. Resistant bacteria move from animals to humans in groundwater and dust, on flies, and via the meat those animals get turned into.
An annual survey of retail meat conducted by the Food and Drug Administration—part of a larger project involving the CDC and the U.S. Department of Agriculture that examines animals, meat, and human illness—finds resistant organisms every year. In its 2011 report, published last February, the FDA found (among many other results) that 65 percent of chicken breasts and 44 percent of ground beef carried bacteria resistant to tetracycline, and 11 percent of pork chops carried bacteria resistant to five classes of drugs. Meat transports those bacteria into your kitchen, if you do not handle it very carefully, and into your body if it is not thoroughly cooked—and resistant infections result.
Researchers and activists have tried for decades to get the FDA to rein in farm overuse of antibiotics, mostly without success. The agency attempted in the 1970s to control agricultural use by revoking authorization for penicillin and tetracycline to be used as “growth promoters,” but that effort never moved forward. Agriculture and the veterinary pharmaceutical industry pushed back, alleging that agricultural antibiotics have no demonstrable effect on human health.
Few, though, have asked what multi-drug–resistant bacteria might mean for farm animals. Yet a post-antibiotic era imperils agriculture as much as it does medicine. In addition to growth promoters, livestock raising uses antibiotics to treat individual animals, as well as in routine dosing called “prevention and control” that protects whole herds. If antibiotics became useless, then animals would suffer: individual illnesses could not be treated, and if the crowded conditions in which most meat animals are raised were not changed, more diseases would spread.
But if the loss of antibiotics changes how livestock are raised, then farmers might be the ones to suffer. Other methods for protecting animals from disease—enlarging barns, cutting down on crowding, and delaying weaning so that immune systems have more time to develop—would be expensive to implement, and agriculture’s profit margins are already thin. In 2002, economists for the National Pork Producers Council estimated that removing antibiotics from hog raising would force farmers to spend $4.50 more per pig, a cost that would be passed on to consumers.
H. Morgan Scott, a veterinary epidemiologist at Kansas State University, unpacked for me how antibiotics are used to control a major cattle illness, bovine respiratory disease. “If a rancher decides to wean their calves right off the cow in the fall and ship them, that’s a risky process for the calf, and one of the things that permits that to continue is antibiotics,” he said, adding: “If those antibiotics weren’t available, either people would pay a much lower price for those same calves, or the rancher might retain them through the winter” while paying extra to feed them. That is, without antibiotics, those farmers would face either lower revenues or higher costs.
Livestock raising isn’t the only aspect of food production that relies on antibiotics, or that would be threatened if the drugs no longer worked. The drugs are routinely used in fish and shrimp farming, particularly in Asia, to protect against bacteria that spread in the pools where seafood is raised—and as a result, the aquaculture industry is struggling with antibiotic-resistant fish diseases and searching for alternatives. In the United States, antibiotics are used to control fruit diseases, but those protections are breaking down too. Last year, streptomycin-resistant fire blight, which in 2000 nearly destroyed Michigan’s apple and pear industry, appeared for the first time in orchards in upstate New York, which is (after Michigan) one of the most important apple-growing states. “Our growers have never seen this, and they aren’t prepared for it,” says Herb Aldwinckle, a professor of plant pathology at Cornell University. “Our understanding is that there is one useful antibiotic left.”
Is a post-antibiotic era inevitable? Possibly not — but only if things change.
In countries such as Denmark, Norway, and the Netherlands, government regulation of medical and agricultural antibiotic use has helped curb bacteria’s rapid evolution toward untreatability. But the U.S. has never been willing to institute such controls, and the free-market alternative of asking physicians and consumers to use antibiotics conservatively has been tried for decades without much success. So has the long effort to reduce farm antibiotic use; the FDA will soon issue new rules for agriculture, but they will be contained in a voluntary “guidance to industry,” not a regulation with the force of law.
What might hold off the apocalypse, for a while, is more antibiotics—but first pharmaceutical companies will have to be lured back into a marketplace they already deemed unrewarding. The need for new compounds could force the federal government to create drug-development incentives: patent extensions, for instance, or changes in the requirements for clinical trials. But whenever drug research revives, achieving a new compound takes at least 10 years from concept to drugstore shelf. There will be no new drug to solve the problem soon—and given the relentlessness of bacterial evolution, none that can solve the problem forever. In the meantime, the medical industry is reviving the old-fashioned solution of rigorous hospital cleaning, and also trying new ideas: building automatic scrutiny of prescriptions into computerized medical records, and developing rapid tests to ensure the drugs aren’t prescribed when they are not needed. The threat of the end of antibiotics might even impel a reconsideration of phages, the individually brewed cocktails of viruses that were a mainstay of Soviet Union medical care during the Cold War. So far, the FDA has allowed them into the U.S. market only as food-safety preparations, not as treatments for infections.
But for any of that to happen, the prospect of a post-antibiotic era has to be taken seriously, and those staring down the trend say that still seems unlikely. “Nobody relates to themselves lying in an ICU bed on a ventilator,” says Rice of Brown University. “And after it happens, they generally want to forget it.”
When I think of preventing this possible future, I re-read my great-uncle’s obit, weighing its old-fashioned language freighted with a small town’s grief.
The world is made up of “average” people, and that is probably why editorials are not written about any one of them. Yet among these average people, who are not “great” in political, social, religious, economic or other specialized fields, there are sometimes those who stand out above the rest: stand out for qualities that are intangible, that we can’t put our finger on.
Such a man was Joe McKenna, who died in the prime of life Friday. Joe was not one of the “greats.” Yet few men, probably, have been mourned by more of their neighbors — mourned sincerely, and sorrowfully — than this red-haired young man.
I run my cursor over the image of the tattered newsprint, the frayed creases betraying the years that someone carried the clipping with them. I picture my cousin’s grandmother flattening the fragile scrap as gently as if she were stroking her brother’s hot forehead, and reading the praise she must have known by heart, and folding it closed again. I remember the few stories I heard from my father, of how Joe’s death shattered his family, embittering my grandfather and turning their mother angry and cold.
I imagine what he might have thought — thirty years old, newly married, adored by his siblings, thrilled for the excitement of his job — if he had known that a few years later, his life could have been saved in hours. I think he would have marveled at antibiotics, and longed for them, and found our disrespect of them an enormous waste. As I do.
I came across this great article by L.V. Anderson on http://www.slate.com (an online daily magazine) and thought I’d share it with you.
What if Everyone in the World Became a Vegetarian?
Calculating the chaos and the changed climate.
The meat industry is one of the top contributors to climate change, directly and indirectly producing about 14.5 percent of the world’s anthropogenic greenhouse gas emissions, and global meat consumption is on the rise. People generally like eating meat—when poor people start making more money, they almost invariably start buying more meat. As the population grows and eats more animal products, the consequences for climate change, pollution, and land use could be catastrophic.
Attempts to reduce meat consumption usually focus on baby steps — Meatless Monday and “vegan before 6,” passable fake chicken, and in vitro burgers. If the world is going to eat less meat, it’s going to have to be coaxed and cajoled into doing it, according to conventional wisdom.
But what if the convincing were the easy part? Suppose everyone in the world voluntarily stopped eating meat, en masse. I know it’s not actually going to happen. But the best-case scenario from a climate perspective would be if all 7 billion of us woke up one day and realized that PETA was right all along. If this collective change of spirit came to pass, like Peter Singer’s dearest fantasy come true, what would the ramifications be?
At least one research team has run the numbers on what global veganism would mean for the planet. In 2009 researchers from the Netherlands Environmental Assessment Agency published their projections of the greenhouse gas consequences if humanity came to eat less meat, no meat, or no animal products at all. The researchers predicted that universal veganism would reduce agriculture-related carbon emissions by 17 percent, methane emissions by 24 percent, and nitrous oxide emissions by 21 percent by 2050. Universal vegetarianism would result in similarly impressive reductions in greenhouse gas emissions. What’s more, the Dutch researchers found that worldwide vegetarianism or veganism would achieve these gains at a much lower cost than a purely energy-focused intervention involving carbon taxes and renewable energy technology. The upshot: Universal eschewal of meat wouldn’t single-handedly stave off global warming, but it would go a long way toward mitigating climate change.
The Dutch researchers didn’t take into account what else might happen if everyone gave up meat. “In this scenario study we have ignored possible socio-economic implications such as the effect of health changes on GDP and population numbers,” wrote Elke Stehfest and her colleagues. “We have not analyzed the agro-economic consequences of the dietary changes and its implications; such consequences might not only involve transition costs, but also impacts on land prices. The costs that are associated with this transition might obviously offset some of the gains discussed here.”
Indeed. If the world actually did collectively go vegetarian or vegan over the course of a decade or two, it’s reasonable to think the economy would tank. According to “Livestock’s Long Shadow,” the influential 2006 U.N. report about meat’s devastating environmental effects, livestock production accounts for 1.4 percent of the world’s total GDP. The production and sale of animal products account for 1.3 billion people’s jobs, and 987 million of those people are poor. If demand for meat were to disappear overnight, those people’s livelihoods would disappear, and they would have to find new ways of making money. Now, some of them—like the industrial farmers who grow the corn that currently goes to feed animals on factory farms—would be in a position to adapt by shifting to in-demand plant-based food production. Others, namely the “huge number of people involved in livestock for lack of an alternative, particularly in Africa and Asia,” would probably be out of luck. (Things would be better for the global poor involved in the livestock trade if everyone continued to consume other animal products, such as eggs, milk, and wool, than if everyone decided to go vegan.) As the economy adjusted to the sudden lack of demand for meat products, we would expect to see widespread suffering and social unrest.
A second major ramification of global vegetarianism would be expanses of new land available. Currently, grazing land for ruminants—cows and their kin—accounts for a staggering 26 percent of the world’s ice-free land surface. The Dutch scientists predict that 2.7 billion hectares (about 10.4 million square miles) of that grazing land would be freed up by global vegetarianism, along with 100 million hectares (about 386,000 square miles) of land that’s currently used to grow crops for livestock. Not all of this land would be suitable for humans, but surely it stands to reason that this sudden influx of new territory would make land much cheaper on the whole.
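For readers who want to check the land figures quoted above, here is a quick sketch. The hectare numbers come from the article itself; the conversion factor (1 hectare ≈ 0.00386102 square miles) is the standard one.

```python
# Sanity check of the land-area conversions quoted in the article.
SQ_MILES_PER_HECTARE = 0.00386102  # standard conversion factor

grazing_ha = 2.7e9     # grazing land freed up by global vegetarianism, hectares
feed_crop_ha = 100e6   # land currently growing crops for livestock, hectares

grazing_sq_mi = grazing_ha * SQ_MILES_PER_HECTARE
feed_crop_sq_mi = feed_crop_ha * SQ_MILES_PER_HECTARE

print(round(grazing_sq_mi / 1e6, 1))  # millions of square miles of grazing land
print(round(feed_crop_sq_mi))         # square miles of feed-crop land
```

Running this reproduces the article’s figures: roughly 10.4 million square miles of grazing land and about 386,000 square miles of feed-crop land.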
A third major ramification of global vegetarianism would be that the risk of antibiotic-resistant infections would plummet. Currently, the routine use of antibiotics in animal farming to promote weight gain and prevent illness in unsanitary conditions is a major contributor to antibiotic resistance. Last year the Centers for Disease Control and Prevention announced that at least 2 million Americans fall ill from antibiotic-resistant pathogens every year and declared that “much of antibiotic use in animals is unnecessary and inappropriate and makes everyone less safe.” The overprescription of antibiotics for humans plays a big role in antibiotic resistance, but eradicating the factory farms from which many antibiotic-resistant bacteria emerge would make it more likely that we could continue to count on antibiotics to cure serious illnesses. (For a sense of what a “post-antibiotics future” would look like, read Maryn McKenna’s amazing article on the topic for Medium and her story about a possible solution for chicken farming in Slate.)
So what would be the result, in an all-vegetarian world, of the combination of widespread unemployment and economic disruption, millions of square miles of available land, and a lowered risk of antibiotic-resistant gonorrhea? I can only conclude that people would band together to form communes in order to escape capitalism’s ruthlessness, squat on the former pasture land, and adopt a lifestyle of free love.
I kid. Mostly. It’s easy to get carried away when you’re speculating about unlikely scenarios—and sudden intercontinental vegetarianism is very much an unlikely scenario.
But if the result of a worldwide shift to a plant-based diet sounds like a right-winger’s worst nightmare, it’s worth pointing out that continuing to eat as much meat as we currently do promises to result in a left-winger’s worst nightmare: In a world of untrammeled global warming, where disastrous weather events are routine, global conflicts will increase, only the wealthy will thrive, and the poor will suffer.
Let’s try a middle path. We’re not all going to become vegetarians, but most of us can stop giving our money to factory farms—the biggest and worst offenders, from a pollution and public health perspective. We can eat less meat than we currently do, especially meat from methane-releasing ruminants (cattle, sheep, goats, etc.). Just because a sudden global conversion to vegetarianism would have jarring effects doesn’t mean we can’t gradually reduce our consumption of meat, giving the market time to adjust. We not only can; we must. After all, with the world’s population slated to grow to 9 billion by 2050, we’ll need to take some of the 25 percent of the world’s land area back from the cows.