Next time you eat a piece of meat, take a moment to think about the fact that it had a mother.
If it’s pork you’re eating – think about that piglet being removed from its mother within just a few days of being born, and slaughtered within 3–6 months.
If it’s lamb you’re eating – know that it was removed from its mother within a few months of being born and killed within 3–10 months.
If it’s chicken you’re eating – know that it was never even allowed to meet its mother and was killed within six weeks of being born.
If it’s beef you’re eating – know that the animal was slaughtered within just one to two years.
If it’s dairy you’re eating – know that the calf its mother had to bear, in order for you to steal and consume her milk, was taken away within the first two days of its life and either shot or slaughtered at 16–20 weeks for veal.
And if this thought alone doesn’t make you reconsider eating meat, then please take a long hard look at these photos and ask yourself how you can possibly justify stealing any animal’s young away from her for the brutal and shameful act of slaughter, merely because you like the way they taste.
Whilst doing my research into the British meat industry, this is something that really hit me and made me question how justifiable eating meat could possibly be. Like most people, I always thought to myself that farm animals had long, happy, healthy lives before being sent to a humane slaughter. I think this is something that we all tell ourselves, although deep down we know that all these animals are killed at a horribly tender age and that there’s no such thing as a humane slaughter. It’s murder, whether we think it’s justified or not.
But a good life? Can such a short lifespan really ever be considered ‘a good life’? Even if they were living on a 5* luxury farm complete with piggy pedicures and farmyard facials, I still don’t think being slaughtered so young can even qualify as ‘a good life’.
Look at the poster below. Take a good look. It’s pretty shocking when you think about it. I know that when I first looked into this I realised that it wasn’t something I’d ever really considered. Even ‘lamb’ – the clue’s in the name, right? But even though I knew it was lamb, I never really pictured a lamb when eating lamb. It’s amazing what associations we let our minds make and what we manage to ignore or suppress.
If a person died at the equivalent age (with a life expectancy here in the UK of nearly 81 years), that would mean being killed somewhere between 1 hour and 9 years old. Hardly what we’d ever term ‘a good innings’, is it? However happy a life a child might have, if that child dies before its tenth birthday it’s considered an absolute tragedy: to die so young, before his or her life had really even begun; what a waste; how incredibly sad.
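The pro-rata sums behind that comparison are easy to check. A minimal Python sketch – the natural-lifespan figures below are rough assumptions of mine, not figures from this post:

```python
# Human-equivalent age at slaughter, scaled pro rata against a UK life
# expectancy of ~81 years. The natural lifespans are rough, assumed figures.
HUMAN_LIFE_EXPECTANCY = 81  # years

# animal: (typical slaughter age in years, assumed natural lifespan in years)
animals = {
    "chicken": (6 / 52, 8),
    "pig": (0.5, 15),
    "lamb": (0.5, 12),
    "beef cattle": (1.5, 20),
}

for name, (slaughter_age, natural_lifespan) in animals.items():
    human_equiv = slaughter_age / natural_lifespan * HUMAN_LIFE_EXPECTANCY
    print(f"{name}: slaughtered at the human equivalent of "
          f"~{human_equiv:.1f} years old")
```

Under these assumptions the figures fall between roughly one and nine years of human-equivalent age, which is where the comparison in the text comes from.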
So how on earth do we justify taking the lives of these animals at such a tender age? We can’t really, can we, if we really stop to think about it. But somehow we all turn a blind eye, and billions of animals are slaughtered every year way, way before their time. That’s the real tragedy.
The reality of how our sausages get from piggy to pan is something that none of us is actually comfortable with. When faced with the reality of it, we are completely repulsed. So why do we happily buy and eat sausages? Because we can do so without ever having to face up to the reality of the hideously cruel world we are financing and supporting. How many of you have been to a pig farm like this one in Scotland?
Or this one in Vermont?
How many of you have ever been inside a slaughterhouse and watched a pig being ‘processed’?
I was brought up surrounded by animals and farmers, and my dad was a sheep farmer. But the reality of the slaughterhouse process, especially the industrial-scale operations we are seeing more and more of as the world’s appetite for meat grows, sickens me to my stomach – and I’m sure it would you too, if you were brave enough to do your research and take a closer look at how your sausages arrive on your supermarket shelves or butchers’ hooks.
After 85 years, antibiotics are growing impotent. So what will medicine, agriculture and everyday life look like if we lose these drugs entirely?
A few years ago, I started looking online to fill in chapters of my family history that no one had ever spoken of. I registered on Ancestry.com, plugged in the little I knew, and soon was found by a cousin whom I had not known existed, the granddaughter of my grandfather’s older sister. We started exchanging documents: a copy of a birth certificate, a photo from an old wedding album. After a few months, she sent me something disturbing.
It was a black-and-white scan of an article clipped from the long-gone Argus of Rockaway Beach, New York. In the scan, the type was faded and there were ragged gaps where the soft newsprint had worn through. The clipping must have been folded and carried around a long time before it was pasted back together and put away. The article was about my great-uncle, the younger brother of my cousin’s grandmother and my grandfather.
In a family that never talked much about the past, he had been discussed even less than the rest. I knew he had been a fireman in New York City and died young, and that his death scarred his family with a grief they never recovered from. I knew that my father, a small child when his uncle died, was thought to resemble him. I also knew that when my father made his Catholic confirmation a few years afterward, he chose as his spiritual guardian the saint that his uncle had been named for: St. Joseph, the patron of a good death.
I had always heard Joe had been injured at work: not burned, but bruised and cut when a heavy brass hose nozzle fell on him. The article revealed what happened next. Through one of the scrapes, an infection set in. After a few days, he developed an ache in one shoulder; two days later, a fever. His wife and the neighborhood doctor struggled for two weeks to take care of him, then flagged down a taxi and drove him fifteen miles to the hospital in my grandparents’ town. He was there one more week, shaking with chills and muttering through hallucinations, and then sinking into a coma as his organs failed. Desperate to save his life, the men from his firehouse lined up to give blood. Nothing worked. He was thirty when he died, in March 1938.
The date is important. Five years after my great-uncle’s death, penicillin changed medicine forever. Infections that had been death sentences—from battlefield wounds, industrial accidents, childbirth—suddenly could be cured in a few days. So when I first read the story of his death, it lit up for me what life must have been like before antibiotics started saving us.
Lately, though, I read it differently. In Joe’s story, I see what life might become if we did not have antibiotics any more.
Predictions that we might sacrifice the antibiotic miracle have been around almost as long as the drugs themselves. Penicillin was first discovered in 1928 and battlefield casualties got the first non-experimental doses in 1943, quickly saving soldiers who had been close to death. But just two years later, the drug’s discoverer Sir Alexander Fleming warned that its benefit might not last. Accepting the 1945 Nobel Prize in Medicine, he said:
“It is not difficult to make microbes resistant to penicillin in the laboratory by exposing them to concentrations not sufficient to kill them… There is the danger that the ignorant man may easily underdose himself and by exposing his microbes to non-lethal quantities of the drug make them resistant.”
As a biologist, Fleming knew that evolution was inevitable: sooner or later, bacteria would develop defenses against the compounds the nascent pharmaceutical industry was aiming at them. But what worried him was the possibility that misuse would speed the process up. Every inappropriate prescription and insufficient dose given in medicine would kill weak bacteria but let the strong survive. (As would the micro-dose “growth promoters” given in agriculture, which were invented a few years after Fleming spoke.) Bacteria can produce another generation in as little as twenty minutes; with tens of thousands of generations a year working out survival strategies, the organisms would soon overwhelm the potent new drugs.
Fleming’s prediction was correct. Penicillin-resistant staph emerged in 1940, while the drug was still being given to only a few patients. Tetracycline was introduced in 1950, and tetracycline-resistant Shigella emerged in 1959; erythromycin came on the market in 1953, and erythromycin-resistant strep appeared in 1968. As antibiotics became more affordable and their use increased, bacteria developed defenses more quickly. Methicillin arrived in 1960 and methicillin resistance in 1962; levofloxacin in 1996 and the first resistant cases the same year; linezolid in 2000 and resistance to it in 2001; daptomycin in 2003 and the first signs of resistance in 2004.
With antibiotics losing usefulness so quickly — and thus not making back the estimated $1 billion per drug it costs to create them — the pharmaceutical industry lost enthusiasm for making more. In 2004, there were only five new antibiotics in development, compared to more than 500 chronic-disease drugs for which resistance is not an issue — and which, unlike antibiotics, are taken for years, not days. Since then, resistant bugs have grown more numerous and by sharing DNA with each other, have become even tougher to treat with the few drugs that remain. In 2009, and again this year, researchers in Europe and the United States sounded the alarm over an ominous form of resistance known as CRE, for which only one antibiotic still works.
Health authorities have struggled to convince the public that this is a crisis. In September, Dr. Thomas Frieden, the director of the U.S. Centers for Disease Control and Prevention, issued a blunt warning: “If we’re not careful, we will soon be in a post-antibiotic era. For some patients and some microbes, we are already there.” The chief medical officer of the United Kingdom, Dame Sally Davies — who calls antibiotic resistance as serious a threat as terrorism — recently published a book in which she imagines what might come next. She sketches a world where infection is so dangerous that anyone with even minor symptoms would be locked in confinement until they recover or die. It is a dark vision, meant to disturb. But it may actually underplay what the loss of antibiotics would mean.
In 2009, three New York physicians cared for a sixty-seven-year-old man who had major surgery and then picked up a hospital infection that was “pan-resistant” — that is, responsive to no antibiotics at all. He died fourteen days later. When his doctors related his case in a medical journal months afterward, they still sounded stunned. “It is a rarity for a physician in the developed world to have a patient die of an overwhelming infection for which there are no therapeutic options,” they said, calling the man’s death “the first instance in our clinical experience in which we had no effective treatment to offer.”
They are not the only doctors to endure that lack of options. Dr. Brad Spellberg of UCLA’s David Geffen School of Medicine became so enraged by the ineffectiveness of antibiotics that he wrote a book about it.
“Sitting with a family, trying to explain that you have nothing left to treat their dying relative — that leaves an indelible mark on you,” he says. “This is not cancer; it’s infectious disease, treatable for decades.”
As grim as they are, in-hospital deaths from resistant infections are easy to rationalize: perhaps these people were just old, already ill, different somehow from the rest of us. But deaths like this are changing medicine. To protect their own facilities, hospitals already flag incoming patients who might carry untreatable bacteria. Most of those patients come from nursing homes and “long-term acute care” (an intensive-care alternative where someone who needs a ventilator for weeks or months might stay). So many patients in those institutions carry highly resistant bacteria that hospital workers isolate them when they arrive, and fret about the danger they pose to others. As infections become yet more dangerous, the healthcare industry will be even less willing to take such risks.
Those calculations of risk extend far beyond admitting possibly contaminated patients from a nursing home. Without the protection offered by antibiotics, entire categories of medical practice would be rethought.
Many treatments require suppressing the immune system, to help destroy cancer or to keep a transplanted organ viable. That suppression makes people unusually vulnerable to infection. Antibiotics reduce the threat; without them, chemotherapy or radiation treatment would be as dangerous as the cancers they seek to cure. Dr. Michael Bell, who leads an infection-prevention division at the CDC, told me: “We deal with that risk now by loading people up with broad-spectrum antibiotics, sometimes for weeks at a stretch. But if you can’t do that, the decision to treat somebody takes on a different ethical tone. Similarly with transplantation. And severe burns are hugely susceptible to infection. Burn units would have a very, very difficult task keeping people alive.”
Doctors routinely perform procedures that carry an extraordinary infection risk unless antibiotics are used. Chief among them: any treatment that requires the construction of portals into the bloodstream and gives bacteria a direct route to the heart or brain. That rules out intensive-care medicine, with its ventilators, catheters, and ports—but also something as prosaic as kidney dialysis, which mechanically filters the blood.
Next to go: surgery, especially on sites that harbor large populations of bacteria such as the intestines and the urinary tract. Those bacteria are benign in their regular homes in the body, but introduce them into the blood, as surgery can, and infections are practically guaranteed. And then implantable devices, because bacteria can form sticky films of infection on the devices’ surfaces that can be broken down only by antibiotics.
Dr. Donald Fry, a member of the American College of Surgeons who finished medical school in 1972, says: “In my professional life, it has been breathtaking to watch what can be done with synthetic prosthetic materials: joints, vessels, heart valves. But in these operations, infection is a catastrophe.” British health economists with similar concerns recently calculated the costs of antibiotic resistance. To examine how it would affect surgery, they picked hip replacements, a common procedure in once-athletic Baby Boomers. They estimated that without antibiotics, one out of every six recipients of new hip joints would die.
Antibiotics are administered prophylactically before operations as major as open-heart surgery and as routine as Caesarean sections and prostate biopsies. Without the drugs, the risks posed by those operations, and the likelihood that physicians would perform them, will change.
“In our current malpractice environment, is a doctor going to want to do a bone marrow transplant, knowing there’s a very high rate of infection that you won’t be able to treat?” asks Dr. Louis Rice, chair of the department of medicine at Brown University’s medical school. “Plus, right now healthcare is a reasonably free-market, fee-for-service system; people are interested in doing procedures because they make money. But five or ten years from now, we’ll probably be in an environment where we get a flat sum of money to take care of patients. And we may decide that some of these procedures aren’t worth the risk.”
Medical procedures may involve a high risk of infections, but our everyday lives are pretty risky too. One of the first people to receive penicillin experimentally was a British policeman, Albert Alexander. He was so riddled with infection that his scalp oozed pus and one eye had to be removed. The source of his illness: scratching his face on a rosebush. (There was so little penicillin available that, though Alexander rallied at first, the drug ran out, and he died.)
Before antibiotics, five women died out of every 1,000 who gave birth. One out of nine people who got a skin infection died, even from something as simple as a scrape or an insect bite. Three out of ten people who contracted pneumonia died from it. Ear infections caused deafness; sore throats were followed by heart failure. In a post-antibiotic era, would you mess around with power tools? Let your kid climb a tree? Have another child?
“Right now, if you want to be a sharp-looking hipster and get a tattoo, you’re not putting your life on the line,” says the CDC’s Bell. “Botox injections, liposuction, those become possibly life-threatening. Even driving to work: We rely on antibiotics to make a major accident something we can get through, as opposed to a death sentence.”
Bell’s prediction is a hypothesis for now—but infections that resist even powerful antibiotics have already entered everyday life. Dozens of college and pro athletes, most recently Lawrence Tynes of the Tampa Bay Buccaneers, have lost playing time or entire seasons to infections with drug-resistant staph, MRSA. Girls who sought permanent-makeup tattoos have lost their eyebrows after getting infections. Last year, three members of a Maryland family — an elderly woman and two adult children — died of resistant pneumonia that took hold after simple cases of flu.
At UCLA, Spellberg treated a woman with what appeared to be an everyday urinary-tract infection — except that it was not quelled by the first round of antibiotics, or the second. By the time he saw her, she was in septic shock, and the infection had destroyed the bones in her spine. A last-ditch course of the only remaining antibiotic saved her life, but she lost the use of her legs. “This is what we’re in danger of,” he says. “People who are living normal lives who develop almost untreatable infections.”
In 2009, Tom Dukes — a fifty-four-year-old inline skater and body-builder — developed diverticulosis, a common problem in which pouches develop in the wall of the intestine. He was coping with it, watching his diet and monitoring himself for symptoms, when searing cramps doubled him over and sent him to urgent care. One of the thin-walled pouches had torn open and dumped gut bacteria into his abdomen — but for reasons no one could explain, what should have been normal E. coli were instead highly drug-resistant. Doctors excised eight inches of his colon in emergency surgery. Over several months, Dukes recovered with the aid of last-resort antibiotics, delivered intravenously. For years afterward, he was exhausted and in pain. “I was living my life, a really healthy life,” he says. “It never dawned on me that this could happen.”
Dukes believes, though he has no evidence, that the bacteria in his gut became drug-resistant because he ate meat from animals raised with routine antibiotic use. That would not be difficult: most meat in the United States is grown that way. To varying degrees depending on their size and age, cattle, pigs, and chickens — and, in other countries, fish and shrimp — receive regular doses to speed their growth, increase their weight, and protect them from disease. Out of all the antibiotics sold in the United States each year, 80 percent by weight are used in agriculture, primarily to fatten animals and protect them from the conditions in which they are raised.
A growing body of scientific research links antibiotic use in animals to the emergence of antibiotic-resistant bacteria: in the animals’ own guts, in the manure that farmers use on crops or store on their land, and in human illnesses as well. Resistant bacteria move from animals to humans in groundwater and dust, on flies, and via the meat those animals get turned into.
An annual survey of retail meat conducted by the Food and Drug Administration—part of a larger project involving the CDC and the U.S. Department of Agriculture that examines animals, meat, and human illness—finds resistant organisms every year. In its 2011 report, published last February, the FDA found (among many other results) that 65 percent of chicken breasts and 44 percent of ground beef carried bacteria resistant to tetracycline, and 11 percent of pork chops carried bacteria resistant to five classes of drugs. Meat transports those bacteria into your kitchen, if you do not handle it very carefully, and into your body if it is not thoroughly cooked—and resistant infections result.
Researchers and activists have tried for decades to get the FDA to rein in farm overuse of antibiotics, mostly without success. The agency attempted in the 1970s to control agricultural use by revoking authorization for penicillin and tetracycline to be used as “growth promoters,” but that effort never moved forward. Agriculture and the veterinary pharmaceutical industry pushed back, alleging that agricultural antibiotics have no demonstrable effect on human health.
Few, though, have asked what multi-drug–resistant bacteria might mean for farm animals. Yet a post-antibiotic era imperils agriculture as much as it does medicine. In addition to growth promoters, livestock raising uses antibiotics to treat individual animals, as well as in routine dosing called “prevention and control” that protects whole herds. If antibiotics became useless, then animals would suffer: individual illnesses could not be treated, and if the crowded conditions in which most meat animals are raised were not changed, more diseases would spread.
But if the loss of antibiotics changes how livestock are raised, then farmers might be the ones to suffer. Other methods for protecting animals from disease—enlarging barns, cutting down on crowding, and delaying weaning so that immune systems have more time to develop—would be expensive to implement, and agriculture’s profit margins are already thin. In 2002, economists for the National Pork Producers Council estimated that removing antibiotics from hog raising would force farmers to spend $4.50 more per pig, a cost that would be passed on to consumers.
H. Morgan Scott, a veterinary epidemiologist at Kansas State University, unpacked for me how antibiotics are used to control a major cattle illness, bovine respiratory disease. “If a rancher decides to wean their calves right off the cow in the fall and ship them, that’s a risky process for the calf, and one of the things that permits that to continue is antibiotics,” he said, adding: “If those antibiotics weren’t available, either people would pay a much lower price for those same calves, or the rancher might retain them through the winter” while paying extra to feed them. That is, without antibiotics, those farmers would face either lower revenues or higher costs.
Livestock raising isn’t the only aspect of food production that relies on antibiotics, or that would be threatened if the drugs no longer worked. The drugs are routinely used in fish and shrimp farming, particularly in Asia, to protect against bacteria that spread in the pools where seafood is raised—and as a result, the aquaculture industry is struggling with antibiotic-resistant fish diseases and searching for alternatives. In the United States, antibiotics are used to control fruit diseases, but those protections are breaking down too. Last year, streptomycin-resistant fire blight, which in 2000 nearly destroyed Michigan’s apple and pear industry, appeared for the first time in orchards in upstate New York, which is (after Michigan) one of the most important apple-growing states. “Our growers have never seen this, and they aren’t prepared for it,” says Herb Aldwinckle, a professor of plant pathology at Cornell University. “Our understanding is that there is one useful antibiotic left.”
Is a post-antibiotic era inevitable? Possibly not — but not without change.
In countries such as Denmark, Norway, and the Netherlands, government regulation of medical and agricultural antibiotic use has helped curb bacteria’s rapid evolution toward untreatability. But the U.S. has never been willing to institute such controls, and the free-market alternative of asking physicians and consumers to use antibiotics conservatively has been tried for decades without much success. As has the long effort to reduce farm antibiotic use; the FDA will soon issue new rules for agriculture, but they will be contained in a voluntary “guidance to industry,” not a regulation with the force of law.
What might hold off the apocalypse, for a while, is more antibiotics—but first pharmaceutical companies will have to be lured back into a marketplace they already deemed unrewarding. The need for new compounds could force the federal government to create drug-development incentives: patent extensions, for instance, or changes in the requirements for clinical trials. But whenever drug research revives, achieving a new compound takes at least 10 years from concept to drugstore shelf. There will be no new drug to solve the problem soon—and given the relentlessness of bacterial evolution, none that can solve the problem forever. In the meantime, the medical industry is reviving the old-fashioned solution of rigorous hospital cleaning, and also trying new ideas: building automatic scrutiny of prescriptions into computerized medical records, and developing rapid tests to ensure the drugs aren’t prescribed when they are not needed. The threat of the end of antibiotics might even impel a reconsideration of phages, the individually brewed cocktails of viruses that were a mainstay of Soviet Union medical care during the Cold War. So far, the FDA has allowed them into the U.S. market only as food-safety preparations, not as treatments for infections.
But for any of that to happen, the prospect of a post-antibiotic era has to be taken seriously, and those staring down the trend say that still seems unlikely. “Nobody relates to themselves lying in an ICU bed on a ventilator,” says Rice of Brown University. “And after it happens, they generally want to forget it.”
When I think of preventing this possible future, I re-read my great-uncle’s obit, weighing its old-fashioned language freighted with a small town’s grief.
The world is made up of “average” people, and that is probably why editorials are not written about any one of them. Yet among these average people, who are not “great” in political, social, religious, economic or other specialized fields, there are sometimes those who stand out above the rest: stand out for qualities that are intangible, that we can’t put our finger on.
Such a man was Joe McKenna, who died in the prime of life Friday. Joe was not one of the “greats.” Yet few men, probably, have been mourned by more of their neighbors — mourned sincerely, and sorrowfully — than this red-haired young man.
I run my cursor over the image of the tattered newsprint, the frayed creases betraying the years that someone carried the clipping with them. I picture my cousin’s grandmother flattening the fragile scrap as gently as if she were stroking her brother’s hot forehead, and reading the praise she must have known by heart, and folding it closed again. I remember the few stories I heard from my father, of how Joe’s death shattered his family, embittering my grandfather and turning their mother angry and cold.
I imagine what he might have thought — thirty years old, newly married, adored by his siblings, thrilled for the excitement of his job — if he had known that a few years later, his life could have been saved in hours. I think he would have marveled at antibiotics, and longed for them, and found our disrespect of them an enormous waste. As I do.
An interesting anti-veganism argument put to me by a friend over Easter:
“If you like the sight of lambs playing in the fields and cattle grazing in the meadows then you should really eat meat. You can’t have it both ways”.
Hmmm… Yes, I do like the sight of lambs playing in the fields at this time of year – but the knowledge that those lovely lambykins will be slaughtered at around 20 weeks old, so that we can enjoy their succulent juicy flesh with a dollop of mint sauce and redcurrant jelly, just doesn’t seem right to me. Twenty weeks. What kind of life is that? That’s the equivalent (pro rata against average life expectancy) of killing a human at 18 months old. Not what you’d call a great innings, is it?
It’s bizarre, isn’t it, that we are so far removed from the brutal reality of this industry that we sit there and tell ourselves that because we love seeing these animals roaming the fields, that somehow justifies the means. It’s actually quite an amusing argument. Except, of course, that it’s not. It’s incredibly naïve, hugely hypocritical and an entirely misguided sentiment.
A ‘vegan’ is someone who chooses to avoid using or consuming animal products. So no meat, dairy or eggs. Vegans try and avoid buying any animal fur, real leather, wool, angora, alpaca, silk and down. They will try and avoid any cosmetics, beauty and cleaning products that have any animal derivatives in them or which have been tested on animals.
So the big question – Why?
My biggest fear about ‘coming out’ as a vegan was how my mother was going to take it. I was desperately worried that she would take it as a personal insult and a rejection of her values and the way in which she raised us, which it absolutely is not. So to soften the blow, I decided to write her a letter.
Here’s what I wrote…
I am writing to you because I am too scared to tell you what I am about to tell you in person! I don’t think you will agree or understand why I’m doing it but I do want and need you to respect and support it if you can.
I’m not gay, I’m not pregnant, I’m not joining the labour party (but will probably vote for them…), I’m not joining a cult, I’m not starting a revolution, I’m not getting a divorce, I’m not converting to Islam, I’m not getting my nipples pierced or my knuckles tattooed… but……………….. I am…………… going to try and adopt a vegan lifestyle.
This came about first of all through talking to a vegan friend of mine who spoke very passionately and articulately about it and made me want to go and find out more for myself. Secondly, the horse burger furore recently made me realise how ridiculously arbitrary it is that we happily eat pork, lamb, beef and salmon but are horrified by the thought of eating horse, dog, rhino or goldfish! And then Lent is coming up and I wanted to eat more healthily. So all of these things led me to do lots of reading around farming practices around the world, climate change, meat production and consumption, dietary needs etc and I was horrified by what I learned.
Below I have tried to cover most of the reasons why I’m doing it without blabbering on too much. But the biggest, overriding point I think is that we don’t need to include any animal products in our diet whatsoever. We can get a perfectly healthy, nutritionally balanced diet from plants alone. So even if you’re not totally convinced by the arguments below, you don’t need to even risk being wrong so why do we?
I really hope that you don’t take this as a personal attack on everything that you brought me up to believe in. This is not a rejection of your values. This is not remotely personal. This is not an attack on farming and farmers! This is an ideology which seems to make sense to me, and black and white numbers which don’t.
So please don’t be disappointed in me or embarrassed of me but try and be proud of me for having the guts to try to do the responsible, compassionate and decent thing (even if you don’t think that it is).
I love you and I feel sick with fear at how hard this will be, both physically and socially, but I am also quite sure that it’s the right thing to do.
Farming livestock is incredibly wasteful of natural resources:
– Raising animals for food (including land used for grazing and land used to grow feed crops) uses 30 per cent of the Earth’s land mass.
– More than 260 million acres of U.S. forests have been cleared to create cropland to grow grain to feed farmed animals, and the equivalent of seven football fields of land is bulldozed worldwide every minute to create more room for farmed animals.
– Raising animals for food is grossly inefficient, because while animals eat large quantities of grain, soybeans, oats, and corn, they only produce comparatively small amounts of meat, dairy products, or eggs in return. This is why more than 70 percent of the grain and cereals grown in the US are fed to farmed animals.
– 7kg of grain will feed 10 people for 1 day. Or it can be used to produce 650 calories of meat.
– It takes more than 2,400 gallons of water to produce 1 pound of meat, while growing 1 pound of wheat only requires 25 gallons – so you save more water by not eating a pound of beef than not showering for 6 months!
– Between watering the crops that farmed animals eat, providing drinking water for billions of animals each year, and cleaning away the filth in factory farms, transport trucks, and slaughterhouses, the farmed animal industry accounts for half of all the water used in the US each year.
It’s a massive contributor to global warming and climate change:
– Raising animals for food is the second most significant contributor to global warming. (Carbon dioxide, methane, and nitrous oxide together cause the vast majority of global warming. Raising animals for food is one of the largest sources of carbon dioxide and the single largest source of both methane and nitrous oxide emissions.)
– The meat, fish and dairy industries directly contribute to all the major environmental catastrophes facing our planet. The number of farmed animals in the world has quadrupled in the last 50 years, putting an incredible strain on the environment. Food production no longer nurtures land; instead both animals and soil are pushed to their limits and beyond in an effort to satisfy the voracious appetite of the Western world.
– The current buzz word is ‘sustainable’ and yet modern agriculture is anything but sustainable. Rainforests are still being chopped down at an alarming rate either for grazing or to grow crops to feed to animals. Oceans are being destroyed by overfishing, which is devastating entire marine ecosystems, while coastal fish farms are causing extensive pollution and wildlife decline.
– The most powerful step that we can take as individuals to avert global warming is to stop eating meat, eggs, and dairy products.
I am doing this for animal welfare reasons. Factory farming methods and standards around the world are sadly not what they are in Herefordshire and most of the UK! I imagine if most of us spent a day inside an abattoir we would be vegetarians before we could get out. And sadly I don’t think it matters whether you buy locally farmed, organic or free range – all meat consumption is increasing demand for meat, and I don’t want to be a part of it any longer. If animal welfare were my only concern, then I could certainly ensure that I only buy responsibly farmed meat and dairy produce, but unfortunately this is just an aside to the far greater and more urgent environmental reasons listed above, and so is not a solution.
And there are other things I hadn’t ever realised which I suppose are incredibly obvious when you think about it – I just never really had:
– Most dairy cows are forced to have a calf every year (which in itself seems rather cruel considering their calves are taken away within a day of being born so that we can have the milk). 100,000 male dairy calves (in the UK alone – so don’t even think about US stats!) are killed shortly after birth each year as there’s not enough demand for veal.
– 30 to 40 million male chicks (UK alone) are minced alive or gassed every year (this is completely legal and approved by both the Humane Slaughter Association and the RSPCA). I’ve seen the videos and it’s unbelievable!
I also hadn’t previously understood the effects of livestock farming on global poverty:
– There is more than enough food in the world to feed the entire human population, yet more than a billion people are starving. Obviously there are various other factors at play here, including political corruption, farming subsidies, grain stores etc, but our overwhelming demand for meat is also largely responsible. We funnel huge amounts of grain, soybeans, and corn through all the animals we use for food. If we stopped intensively breeding farmed animals and grew crops to feed humans instead, we could easily feed everyone on the planet with healthy and affordable vegetarian foods.
– If this trend continues, the developing world will never be able to produce enough food to feed itself, and hunger will continue to plague hundreds of millions of people around the globe. Author George Monbiot, writing in the U.K.’s The Guardian, explains that there’s only one solution: “It now seems plain that a vegan diet is the only ethical response to what is arguably the world’s most urgent social justice issue”.
– This trend will contribute to continuing malnourishment in the developing world, global warming, widespread pollution, deforestation, land degradation, water scarcity and species extinction because more animals mean more crops are needed to feed them: the planet cannot feed both increasing human and farmed animal populations.
– So if we are trying to reduce our car use, limit the amount of water we waste, become more ‘energy-efficient’ and generally lessen our environmental impact, we must also examine the most important factor of our personal ecological footprint: what we eat.
Loads of love,
So that was nearly a year ago, and those were my main reasons for making the change. I can now add several other points to that list, including:
I’ve come to see that our attitudes towards different animals are completely arbitrary and nonsensical, and are merely a product of our upbringing and what we become used to – habit! We are used to seeing dogs, cats and horses as pets and wouldn’t dream of eating them, and yet we look at cattle, sheep, pigs and chickens as food because we have been brought up to view them that way. When my girls (aged 4 and 3) are around animals they don’t make this distinction, because it’s not a human instinct – it’s something that we learn. They don’t look at a pig and see food any more than they do when they look at a puppy – and quite rightly they would be horrified if I said “right poppet, pass me that knife would you, mama wants some bacon!”.

Of course, over time we become used to this process and we accept that animals need to die in order for us to thrive, because we are told that we need milk and cheese for calcium and strong bones (not true) and that we need meat for protein (not true). The only reasons we eat meat are that it tastes good, everyone does it and we’ve always done it. That’s it! And those are not justifications for doing something that we instinctively know is wrong! We have just become so desensitised and switched off to the fact that millions of animals are being slaughtered behind closed doors so that we can have pepperoni on our pizza and steak frites. Yet there are very few people I know who are entirely comfortable with the idea of killing an animal – everyone would like it to be as painless and humane as possible, and some are happy to do it themselves to ensure that it is, but it’s still not something anyone enjoys doing, and if you did enjoy it you would be referred to a psychiatrist.
So once you step outside of what you have grown up to understand and look at it with fresh eyes, it is startlingly clear that the only reason we are able to be part of something so cruel and unnecessary is that we have been taught it from a young age by those we respect and admire. This is how the horrific events in history came about, and I just don’t see how we can say in one breath that gassing people alive is evil beyond words and in the next say that it’s ok to gas millions of baby male chicks alive just so that we can eat chicken and eggs. I’m sorry to draw the comparison, but I think it’s worth making. When we are brainwashed into thinking that doing something utterly unthinkable is necessary and acceptable, we are capable of behaving in ways we wouldn’t imagine possible otherwise. Slaughtering animals for no good reason is no different as far as I can see. And just because you’re not the one doing it does not make you any less accountable – if you’re consuming the products then you are merely paying someone else to do it for you.
About 3 weeks in I noticed my energy levels improving. I hadn’t had low energy levels before, but suddenly I had buckets of energy and didn’t have those peaks and troughs throughout the day (which I’d always attributed to coffee, carbs etc). My bowel movements changed dramatically – without wanting to paint too full a picture, I became much more regular (same number as number of meals) and they were what Dr Gillian McKeith would describe as “marvellous in every way!”. I’ve got clearer skin, I sleep better, I have a higher libido, I think more clearly, I feel more positive and I feel much happier in general. I also lost a stone quite quickly (within about 3 months) and haven’t lost a pound since, so my weight stabilised very quickly. And I’ve never eaten more food or more carbs, so avoiding all those potatoes and rice to stave off extra pounds seems to have been twaddle in my case. I’m 5ft 10in and used to weigh between 10.5 and 11 stone (a BMI of around 22); now I’m between 9.5 and 10 stone (a BMI of around 19). How much of this is psychosomatic and how much is real I have no idea – but the bowel movements and the weighing scales don’t lie!
Our weekly food shopping is also much cheaper, as meat and cheese are jolly expensive (especially if you’re trying to buy organic, grass-fed, free range etc). Replacing those items with more pulses, grains, fruit and veg is much better for you and much cheaper.
Because being vegan is something you’re aware of regularly throughout the day – every time you have a drink, a snack, a meal etc – I’ve found that the constant reminder of your principles and values, and the constant opportunity to exercise personal choice, has made me much more mindful in other aspects of my life. I feel much more aware of the effects that our choices have and have found myself being much more proactive than I ever used to be – buying local and organic as much as possible, camping holidays instead of a flight abroad, switching to a mooncup, installing a composting bin, taking the bus instead of driving places more often, being more inventive with leftovers than I used to be, having fewer baths and quicker showers, supporting an independent coffee stall instead of Starbucks, buying second hand as much as possible, supporting ethically minded companies more etc. The list goes on, and of course I’m not saying I always make these decisions – but certainly a lot more than I used to, and I think of the impact of my decisions every time I set out to buy something.
The social side of veganism I have found by far the hardest challenge. I’ll talk about it in more detail in another post, but there have been lots of tricky situations – many of which I’ve handled terribly! I had not foreseen what a hugely contentious issue it would be for so many people, and I certainly hadn’t accounted for how many people would take my being vegan as some sort of personal attack on their lifestyles and choices (which it certainly is not!). More of that later… but on the plus side, I have had so many engaging, fascinating, heated, passionate, enlightening conversations over the past year that even if I were to give it all up tomorrow, it would have been a really worthwhile experiment in that regard alone.
I don’t think I would have got through the first year if it hadn’t been for Ed doing this too. I would have crumbled at the first dinner invitation or disapproving gaze from an aged aunt…! Thank goodness we both felt exactly the same way about it. We’ve debated it endlessly and continue to do so, and it’s been a really fun and engaging project for us to share. We’ve changed our style of cooking entirely, and our cupboards are completely unrecognisable from what they were a year ago. We’ve spent weekends experimenting with recipes and evenings scouring London for the best vegan food. We now spend less money on food (both at home and out) but more time planning and experimenting, and more time in the kitchen together chopping, preparing, cooking and chatting. So I’m really grateful for having the perfect teammate, and also very aware of how much harder and less enjoyable it would have been on my own. So thank you – I love you so much and am so bloody relieved we seem to be leaning in the same direction! x