‘Someday’

Every year, 8 billion chickens are slaughtered.

“Someday” is a rather sentimental but poignant animated video created by PETA along with Morrissey to try to draw attention to the abhorrent conditions these poor birds live in and the biblical numbers that are slaughtered every year.

She’s Alive

This is a non-commercial attempt to highlight the fact that world leaders, irresponsible corporations and mindless ‘consumers’ are combining to destroy life on earth. It is dedicated to all who died fighting for the planet and those whose lives are on the line today. The cut was put together by Vivek Chauhan, a young filmmaker, together with naturalists working with the Sanctuary Asia network (www.sanctuaryasia.com).

Elderflower Champagne

I promise to make this every year. Perhaps this is the year I finally do!

The Vegan Forager

Nothing heralds the arrival of summer quite like elderflowers. I’m lucky enough to have quite a large elder at the end of my garden and I know that when those big splashes of cream appear amongst the green leaves, long warm days, music festivals and picnics in the park are not far off. But it also means it’s time to make one of my favourite drinks – elderflower champagne.

Elders are fairly small trees, rarely growing more than a few metres tall, and have oval-shaped leaflets a few inches long. The bark of the trunk usually has vertical furrows and the branches can often become brittle and snap easily. The flowers are tiny and cream-coloured with five petals, but grow in big bunches. Elders grow just about everywhere, from hedgerows to woodland to gardens, so you should have no trouble finding them.

To make your own elderflower champagne, you will…


Piglets: It’s Where Food Comes From


A person I met recently through a professional network posted an album this week of his kids visiting a nearby pig farm. This is one of those “pasture-raised” farms where the pigs can lounge about happily and interact with visitors. The photos showed his daughter snuggling with some piglets and a family of pigs relaxing in the early summer sun. I wrote a blog post not too long ago that parodied this very notion; you can read it here. Come to find out, that blog post wasn’t satire at all.

The caption that accompanied the photos read something like: “Took the kids to The Piggery to ‘show them where food comes from.’”

Where food comes from? Pigs aren’t food. They are animals. Sentient beings. Wait, am I missing something here? So I visited the Piggery’s website and found this explanation of the pig paradise located on a 70-acre farm…


So this is an amusing prank video, carried out in a supermarket in Brazil, which is worth watching just for people’s faces and reactions:

Click here to watch video

The reality of how our sausages get from piggy to pan is something that none of us are actually comfortable with. When faced with the reality of it, we are completely repulsed by it. So why do we happily buy and eat sausages? Because we can do so without ever having to face up to the reality of the hideously cruel world we are financing and supporting. How many of you have been to a pig farm like this one in Scotland?

Or this one in Vermont?

How many of you have ever been inside a slaughterhouse and watched a pig being ‘processed’?

Or butchered?

I was brought up surrounded by animals and farmers, and my dad was a sheep farmer. But the reality of the slaughterhouse process, especially the industrial-scale ones we are seeing more and more of as the world’s appetite for meat grows, sickens me to my stomach, and I’m sure it would sicken you too if you were brave enough to do your research and take a closer look at how your sausages arrive on your supermarket shelves or butcher’s hooks.

Study: Global Veganism Would Reduce Carbon Emissions More Than Energy Intervention

Yet another study proving what a devastating effect the meat industry is having on the climate.

Producing nearly 15% of the Earth’s greenhouse gas emissions, the meat industry is one of the top contributors to climate change. Slowly, very slowly, movements like Meatless Mondays and Vegan Before 6 have demonstrated the value, and deliciousness, of adopting a vegan diet, but a carnivorous diet is still seen as evidence of prosperity.

In 2009, researchers at the Netherlands Environmental Assessment Agency calculated that global veganism would reduce agriculture-related carbon emissions by nearly 17%, methane emissions by 24%, and nitrous oxide emissions by 21% by 2050.

The researchers discovered that worldwide veganism, or even just worldwide vegetarianism, would achieve gains at a much lower cost than an energy intervention, like carbon taxes, for instance.

The study demonstrated the tremendous value of a vegan or vegetarian diet in staving off climate change, but there are so many other benefits as well. Antibiotic resistance, driven by meat from animals pumped full of antibiotics, would plummet. Pollution rates would drop significantly as factory farms, the biggest polluters in the meat industry, became a thing of the past. General human health and well-being would rise from a plant-based diet free from cholesterol and pharmaceuticals.

By 2050, the global population is predicted to reach a staggering 9 BILLION people. What are we going to do with all the cows currently taking up 25% of the Earth’s land area?

Breakfast smoothie!


Woke up to a wet and grey London morning and felt like something creamy and comforting but still healthy and energising. This smoothie is doing the trick nicely.

Ingredients:

1 x banana
Big handful of baby spinach leaves
Juice of 1 lime
Half a cup of soaked almonds
1 cup almond milk
1 tbsp chia seeds

Directions: blend till smooth and creamy!


What do you eat all week?!

So I’ve just done a weekly food shop and thought I’d photograph it for you, as people are always asking ‘what on earth do you eat all week?’ So here it is…


So this was two trips – one to an independent greengrocer’s in Southfields for all the fruit and veg…


…and one to Wholefoods for everything else…


The fruit and veg cost £31 and include some quite expensive imported goodies such as pineapple, avocados, limes etc., and the Wholefoods shop came to £52 and includes some quite specialist, expensive things like a big bag of cocoa nibs (£14) to keep me in chocolate and banana soy milkshakes for the rest of my pregnancy, and arrowroot for tonight’s frittata fiesta… posh crackers, posh chocolate, a sushi rolling mat, posh dressing, very posh crackers, elderflower cordial etc., so this shop would normally have been more like £30. We then usually do an online shop at GoodnessDirect.com for all our toiletries and house-cleaning kit, roughly every 3 months, and that comes to about £50. So that’s a monthly spend on everything of between £250 and £300, which for a greedy family of four I’d say is pretty good.
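
For anyone wondering how those weekly numbers roll up into a £250–£300 month, here’s a quick back-of-the-envelope tally. It’s only a sketch: the £30 “typical” Wholefoods week and the 52/12-week average month are my own assumptions, read off the figures above.

```python
# Rough monthly tally of the shop described above: a sketch, not a receipt.
# Assumed: a "typical" Wholefoods week of about £30 (this week's £52 included
# one-off poshness), and an average month of 52/12 weeks.

fruit_and_veg = 31         # weekly greengrocer's shop, in £
wholefoods_typical = 30    # Wholefoods shop minus the one-off treats, in £
weeks_per_month = 52 / 12  # about 4.33

groceries = (fruit_and_veg + wholefoods_typical) * weeks_per_month
toiletries = 50 / 3        # £50 GoodnessDirect order roughly every 3 months

print(f"Groceries:  £{groceries:.0f} a month")               # about £264
print(f"Toiletries: £{toiletries:.0f} a month")              # about £17
print(f"Total:      £{groceries + toiletries:.0f} a month")  # about £281
```

That comes out at roughly £281 a month, comfortably inside the £250–£300 range quoted above.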

Before turning vegan, we shopped in Sainsbury’s and I could never keep the weekly shop under £100. Meat and cheese are expensive! And we hardly eat any processed food any more. We were always filling the trolley up with whatever was on offer in an attempt to spend less, and the result was we ate far more, far less healthily, always shopped in supermarkets and spent more money.

Now we shop in far more ethical shops, have massively reduced our carbon footprint as a family, buy far better quality, way healthier food and spend less overall. And the whole shopping experience is a far nicer one too. I don’t miss battling through Sainsbury’s on a Saturday afternoon with screaming children hanging out of the trolley whilst I stuff breadsticks into them in a bid to keep them occupied whilst I grab anything with a 2-for-1 sticker on it…

Now I’m on first-name terms with my veg man and the kids help him fill up the bags whilst he teaches them the difference between yellow courgettes and Spanish courgettes, and Wholefoods is basically like food porn for anyone who enjoys eating!


Double Chocolate Lovers Pie

Tomorrow afternoon’s kiddie activity done. Thank you! X

The Vegan Discovery

These chilled fudge-like treats will satisfy any sweet tooth with their rich, decadent flavor. Topped with fresh fruit, they are the perfect dessert! So let’s get to it; this is how it’s made:

Avocado is filled with healthy fats, which means the dessert is thick, creamy and filling while giving your body a boost of the nutrients and fats it needs to thrive. And yet, there is no hint of avocado flavor! Instead, you get delicate notes of peanut butter and pecan mixed in with the chocolate. Deliciousness awaits!
Use mini tart tins with pop-out bottoms to make these sweet treats. There is a wide variety of tart tins, so you have your choice of size, depth, shape and even whether you want smooth or fluted edges on your tortes. Just make sure they have pop-out bottoms so you can easily remove your chilled torte from the tin to serve…


Photo of the Day: “Nice To Meat You”

The Paw Report

Studies have demonstrated that to win people over to a cause, such as alleviating world hunger, it is far better to appeal to the heart rather than the head: people tend to respond more charitably to “identifiable victims” than they do to statistics. “The death of one man is a tragedy, the death of millions is a statistic.” (It does get more complicated than that, however; please see this article.) Knowing how people are generally going to respond has been beneficial to animal advocacy. When we advocate for veganism, for example, and we’re emphasizing the large numbers of animals — all the billions of nameless victims — suffering in our food system, we may be deterring others from joining the cause. When discussing animal welfare issues or promoting veganism, focusing on individual, identifiable victims rather than statistical data seems to generate more success. (Again, this is complicated — I…


Imagining the Post-Antibiotics Future by Maryn McKenna

Here’s the article recommended in the previous post…  well worth reading!

This article was written by Maryn McKenna and produced in collaboration with the Food & Environment Reporting Network, an independent, non-profit news organization producing investigative reporting on food, agriculture and environmental health.

After 85 years, antibiotics are growing impotent. So what will medicine, agriculture and everyday life look like if we lose these drugs entirely? 

A few years ago, I started looking online to fill in chapters of my family history that no one had ever spoken of. I registered on Ancestry.com, plugged in the little I knew, and soon was found by a cousin whom I had not known existed, the granddaughter of my grandfather’s older sister. We started exchanging documents: a copy of a birth certificate, a photo from an old wedding album. After a few months, she sent me something disturbing.

It was a black-and-white scan of an article clipped from the long-gone Argus of Rockaway Beach, New York. In the scan, the type was faded and there were ragged gaps where the soft newsprint had worn through. The clipping must have been folded and carried around a long time before it was pasted back together and put away. The article was about my great-uncle, the younger brother of my cousin’s grandmother and my grandfather.

In a family that never talked much about the past, he had been discussed even less than the rest. I knew he had been a fireman in New York City and died young, and that his death scarred his family with a grief they never recovered from. I knew that my father, a small child when his uncle died, was thought to resemble him. I also knew that when my father made his Catholic confirmation a few years afterward, he chose as his spiritual guardian the saint that his uncle had been named for: St. Joseph, the patron of a good death.

I had always heard Joe had been injured at work: not burned, but bruised and cut when a heavy brass hose nozzle fell on him. The article revealed what happened next. Through one of the scrapes, an infection set in. After a few days, he developed an ache in one shoulder; two days later, a fever. His wife and the neighborhood doctor struggled for two weeks to take care of him, then flagged down a taxi and drove him fifteen miles to the hospital in my grandparents’ town. He was there one more week, shaking with chills and muttering through hallucinations, and then sinking into a coma as his organs failed. Desperate to save his life, the men from his firehouse lined up to give blood. Nothing worked. He was thirty when he died, in March 1938.

The date is important. Five years after my great-uncle’s death, penicillin changed medicine forever. Infections that had been death sentences—from battlefield wounds, industrial accidents, childbirth—suddenly could be cured in a few days. So when I first read the story of his death, it lit up for me what life must have been like before antibiotics started saving us.

Lately, though, I read it differently. In Joe’s story, I see what life might become if we did not have antibiotics any more.


Predictions that we might sacrifice the antibiotic miracle have been around almost as long as the drugs themselves. Penicillin was first discovered in 1928 and battlefield casualties got the first non-experimental doses in 1943, quickly saving soldiers who had been close to death. But just two years later, the drug’s discoverer Sir Alexander Fleming warned that its benefit might not last. Accepting the 1945 Nobel Prize in Medicine, he said:

“It is not difficult to make microbes resistant to penicillin in the laboratory by exposing them to concentrations not sufficient to kill them… There is the danger that the ignorant man may easily underdose himself and by exposing his microbes to non-lethal quantities of the drug make them resistant.”

As a biologist, Fleming knew that evolution was inevitable: sooner or later, bacteria would develop defenses against the compounds the nascent pharmaceutical industry was aiming at them. But what worried him was the possibility that misuse would speed the process up. Every inappropriate prescription and insufficient dose given in medicine would kill weak bacteria but let the strong survive. (As would the micro-dose “growth promoters” given in agriculture, which were invented a few years after Fleming spoke.) Bacteria can produce another generation in as little as twenty minutes; with tens of thousands of generations a year working out survival strategies, the organisms would soon overwhelm the potent new drugs.

Fleming’s prediction was correct. Penicillin-resistant staph emerged in 1940, while the drug was still being given to only a few patients. Tetracycline was introduced in 1950, and tetracycline-resistant Shigella emerged in 1959; erythromycin came on the market in 1953, and erythromycin-resistant strep appeared in 1968. As antibiotics became more affordable and their use increased, bacteria developed defenses more quickly. Methicillin arrived in 1960 and methicillin resistance in 1962; levofloxacin in 1996 and the first resistant cases the same year; linezolid in 2000 and resistance to it in 2001; daptomycin in 2003 and the first signs of resistance in 2004.

With antibiotics losing usefulness so quickly — and thus not making back the estimated $1 billion per drug it costs to create them — the pharmaceutical industry lost enthusiasm for making more. In 2004, there were only five new antibiotics in development, compared to more than 500 chronic-disease drugs for which resistance is not an issue — and which, unlike antibiotics, are taken for years, not days. Since then, resistant bugs have grown more numerous and by sharing DNA with each other, have become even tougher to treat with the few drugs that remain. In 2009, and again this year, researchers in Europe and the United States sounded the alarm over an ominous form of resistance known as CRE, for which only one antibiotic still works.

Health authorities have struggled to convince the public that this is a crisis. In September, Dr. Thomas Frieden, the director of the U.S. Centers for Disease Control and Prevention, issued a blunt warning: “If we’re not careful, we will soon be in a post-antibiotic era. For some patients and some microbes, we are already there.” The chief medical officer of the United Kingdom, Dame Sally Davies — who calls antibiotic resistance as serious a threat as terrorism — recently published a book in which she imagines what might come next. She sketches a world where infection is so dangerous that anyone with even minor symptoms would be locked in confinement until they recover or die. It is a dark vision, meant to disturb. But it may actually underplay what the loss of antibiotics would mean.


In 2009, three New York physicians cared for a sixty-seven-year-old man who had major surgery and then picked up a hospital infection that was “pan-resistant” — that is, responsive to no antibiotics at all. He died fourteen days later. When his doctors related his case in a medical journal months afterward, they still sounded stunned. “It is a rarity for a physician in the developed world to have a patient die of an overwhelming infection for which there are no therapeutic options,” they said, calling the man’s death “the first instance in our clinical experience in which we had no effective treatment to offer.”

They are not the only doctors to endure that lack of options. Dr. Brad Spellberg of UCLA’s David Geffen School of Medicine became so enraged by the ineffectiveness of antibiotics that he wrote a book about it.

“Sitting with a family, trying to explain that you have nothing left to treat their dying relative — that leaves an indelible mark on you,” he says. “This is not cancer; it’s infectious disease, treatable for decades.”

As grim as they are, in-hospital deaths from resistant infections are easy to rationalize: perhaps these people were just old, already ill, different somehow from the rest of us. But deaths like this are changing medicine. To protect their own facilities, hospitals already flag incoming patients who might carry untreatable bacteria. Most of those patients come from nursing homes and “long-term acute care” (an intensive-care alternative where someone who needs a ventilator for weeks or months might stay). So many patients in those institutions carry highly resistant bacteria that hospital workers isolate them when they arrive, and fret about the danger they pose to others. As infections become yet more dangerous, the healthcare industry will be even less willing to take such risks.

Those calculations of risk extend far beyond admitting possibly contaminated patients from a nursing home. Without the protection offered by antibiotics, entire categories of medical practice would be rethought.

Many treatments require suppressing the immune system, to help destroy cancer or to keep a transplanted organ viable. That suppression makes people unusually vulnerable to infection. Antibiotics reduce the threat; without them, chemotherapy or radiation treatment would be as dangerous as the cancers they seek to cure. Dr. Michael Bell, who leads an infection-prevention division at the CDC, told me: “We deal with that risk now by loading people up with broad-spectrum antibiotics, sometimes for weeks at a stretch. But if you can’t do that, the decision to treat somebody takes on a different ethical tone. Similarly with transplantation. And severe burns are hugely susceptible to infection. Burn units would have a very, very difficult task keeping people alive.”


Doctors routinely perform procedures that carry an extraordinary infection risk unless antibiotics are used. Chief among them: any treatment that requires the construction of portals into the bloodstream and gives bacteria a direct route to the heart or brain. That rules out intensive-care medicine, with its ventilators, catheters, and ports—but also something as prosaic as kidney dialysis, which mechanically filters the blood.

Next to go: surgery, especially on sites that harbor large populations of bacteria such as the intestines and the urinary tract. Those bacteria are benign in their regular homes in the body, but introduce them into the blood, as surgery can, and infections are practically guaranteed. And then implantable devices, because bacteria can form sticky films of infection on the devices’ surfaces that can be broken down only by antibiotics.

Dr. Donald Fry, a member of the American College of Surgeons who finished medical school in 1972, says: “In my professional life, it has been breathtaking to watch what can be done with synthetic prosthetic materials: joints, vessels, heart valves. But in these operations, infection is a catastrophe.” British health economists with similar concerns recently calculated the costs of antibiotic resistance. To examine how it would affect surgery, they picked hip replacements, a common procedure in once-athletic Baby Boomers. They estimated that without antibiotics, one out of every six recipients of new hip joints would die.

Antibiotics are administered prophylactically before operations as major as open-heart surgery and as routine as Caesarean sections and prostate biopsies. Without the drugs, the risks posed by those operations, and the likelihood that physicians would perform them, will change.


“In our current malpractice environment, is a doctor going to want to do a bone marrow transplant, knowing there’s a very high rate of infection that you won’t be able to treat?” asks Dr. Louis Rice, chair of the department of medicine at Brown University’s medical school. “Plus, right now healthcare is a reasonably free-market, fee-for-service system; people are interested in doing procedures because they make money. But five or ten years from now, we’ll probably be in an environment where we get a flat sum of money to take care of patients. And we may decide that some of these procedures aren’t worth the risk.”

Medical procedures may involve a high risk of infections, but our everyday lives are pretty risky too. One of the first people to receive penicillin experimentally was a British policeman, Albert Alexander. He was so riddled with infection that his scalp oozed pus and one eye had to be removed. The source of his illness: scratching his face on a rosebush. (There was so little penicillin available that, though Alexander rallied at first, the drug ran out, and he died.)

Before antibiotics, five women died out of every 1,000 who gave birth. One out of nine people who got a skin infection died, even from something as simple as a scrape or an insect bite. Three out of ten people who contracted pneumonia died from it. Ear infections caused deafness; sore throats were followed by heart failure. In a post-antibiotic era, would you mess around with power tools? Let your kid climb a tree? Have another child?

“Right now, if you want to be a sharp-looking hipster and get a tattoo, you’re not putting your life on the line,” says the CDC’s Bell. “Botox injections, liposuction, those become possibly life-threatening. Even driving to work: We rely on antibiotics to make a major accident something we can get through, as opposed to a death sentence.”

Bell’s prediction is a hypothesis for now—but infections that resist even powerful antibiotics have already entered everyday life. Dozens of college and pro athletes, most recently Lawrence Tynes of the Tampa Bay Buccaneers, have lost playing time or entire seasons to infections with drug-resistant staph, MRSA. Girls who sought permanent-makeup tattoos have lost their eyebrows after getting infections. Last year, three members of a Maryland family — an elderly woman and two adult children — died of resistant pneumonia that took hold after simple cases of flu.

At UCLA, Spellberg treated a woman with what appeared to be an everyday urinary-tract infection — except that it was not quelled by the first round of antibiotics, or the second. By the time he saw her, she was in septic shock, and the infection had destroyed the bones in her spine. A last-ditch course of the only remaining antibiotic saved her life, but she lost the use of her legs. “This is what we’re in danger of,” he says. “People who are living normal lives who develop almost untreatable infections.”

In 2009, Tom Dukes — a fifty-four-year-old inline skater and body-builder — developed diverticulosis, a common problem in which pouches develop in the wall of the intestine. He was coping with it, watching his diet and monitoring himself for symptoms, when searing cramps doubled him over and sent him to urgent care. One of the thin-walled pouches had torn open and dumped gut bacteria into his abdomen — but for reasons no one could explain, what should have been normal E. coli were instead highly drug-resistant. Doctors excised eight inches of his colon in emergency surgery. Over several months, Dukes recovered with the aid of last-resort antibiotics, delivered intravenously. For years afterward, he was exhausted and in pain. “I was living my life, a really healthy life,” he says. “It never dawned on me that this could happen.”


Dukes believes, though he has no evidence, that the bacteria in his gut became drug-resistant because he ate meat from animals raised with routine antibiotic use. That would not be difficult: most meat in the United States is grown that way. To varying degrees depending on their size and age, cattle, pigs, and chickens — and, in other countries, fish and shrimp — receive regular doses to speed their growth, increase their weight, and protect them from disease. Out of all the antibiotics sold in the United States each year, 80 percent by weight are used in agriculture, primarily to fatten animals and protect them from the conditions in which they are raised.

A growing body of scientific research links antibiotic use in animals to the emergence of antibiotic-resistant bacteria: in the animals’ own guts, in the manure that farmers use on crops or store on their land, and in human illnesses as well. Resistant bacteria move from animals to humans in groundwater and dust, on flies, and via the meat those animals get turned into.


An annual survey of retail meat conducted by the Food and Drug Administration—part of a larger project involving the CDC and the U.S. Department of Agriculture that examines animals, meat, and human illness—finds resistant organisms every year. In its 2011 report, published last February, the FDA found (among many other results) that 65 percent of chicken breasts and 44 percent of ground beef carried bacteria resistant to tetracycline, and 11 percent of pork chops carried bacteria resistant to five classes of drugs. Meat transports those bacteria into your kitchen, if you do not handle it very carefully, and into your body if it is not thoroughly cooked—and resistant infections result.

Researchers and activists have tried for decades to get the FDA to rein in farm overuse of antibiotics, mostly without success. The agency attempted in the 1970s to control agricultural use by revoking authorization for penicillin and tetracycline to be used as “growth promoters,” but that effort never moved forward. Agriculture and the veterinary pharmaceutical industry pushed back, alleging that agricultural antibiotics have no demonstrable effect on human health.

Few, though, have asked what multi-drug–resistant bacteria might mean for farm animals. Yet a post-antibiotic era imperils agriculture as much as it does medicine. In addition to growth promoters, livestock raising uses antibiotics to treat individual animals, as well as in routine dosing called “prevention and control” that protects whole herds. If antibiotics became useless, then animals would suffer: individual illnesses could not be treated, and if the crowded conditions in which most meat animals are raised were not changed, more diseases would spread.


But if the loss of antibiotics changes how livestock are raised, then farmers might be the ones to suffer. Other methods for protecting animals from disease—enlarging barns, cutting down on crowding, and delaying weaning so that immune systems have more time to develop—would be expensive to implement, and agriculture’s profit margins are already thin. In 2002, economists for the National Pork Producers Council estimated that removing antibiotics from hog raising would force farmers to spend $4.50 more per pig, a cost that would be passed on to consumers.

H. Morgan Scott, a veterinary epidemiologist at Kansas State University, unpacked for me how antibiotics are used to control a major cattle illness, bovine respiratory disease. “If a rancher decides to wean their calves right off the cow in the fall and ship them, that’s a risky process for the calf, and one of the things that permits that to continue is antibiotics,” he said, adding: “If those antibiotics weren’t available, either people would pay a much lower price for those same calves, or the rancher might retain them through the winter” while paying extra to feed them. That is, without antibiotics, those farmers would face either lower revenues or higher costs.

Livestock raising isn’t the only aspect of food production that relies on antibiotics, or that would be threatened if the drugs no longer worked. The drugs are routinely used in fish and shrimp farming, particularly in Asia, to protect against bacteria that spread in the pools where seafood is raised—and as a result, the aquaculture industry is struggling with antibiotic-resistant fish diseases and searching for alternatives. In the United States, antibiotics are used to control fruit diseases, but those protections are breaking down too. Last year, streptomycin-resistant fire blight, which in 2000 nearly destroyed Michigan’s apple and pear industry, appeared for the first time in orchards in upstate New York, which is (after Michigan) one of the most important apple-growing states. “Our growers have never seen this, and they aren’t prepared for it,” says Herb Aldwinckle, a professor of plant pathology at Cornell University. “Our understanding is that there is one useful antibiotic left.”


Is a post-antibiotic era inevitable? Possibly not — but not without change.

In countries such as Denmark, Norway, and the Netherlands, government regulation of medical and agricultural antibiotic use has helped curb bacteria’s rapid evolution toward untreatability. But the U.S. has never been willing to institute such controls, and the free-market alternative of asking physicians and consumers to use antibiotics conservatively has been tried for decades without much success. As has the long effort to reduce farm antibiotic use; the FDA will soon issue new rules for agriculture, but they will be contained in a voluntary “guidance to industry,” not a regulation with the force of law.

What might hold off the apocalypse, for a while, is more antibiotics—but first pharmaceutical companies will have to be lured back into a marketplace they already deemed unrewarding. The need for new compounds could force the federal government to create drug-development incentives: patent extensions, for instance, or changes in the requirements for clinical trials. But whenever drug research revives, achieving a new compound takes at least 10 years from concept to drugstore shelf. There will be no new drug to solve the problem soon—and given the relentlessness of bacterial evolution, none that can solve the problem forever. In the meantime, the medical industry is reviving the old-fashioned solution of rigorous hospital cleaning, and also trying new ideas: building automatic scrutiny of prescriptions into computerized medical records, and developing rapid tests to ensure the drugs aren’t prescribed when they are not needed. The threat of the end of antibiotics might even impel a reconsideration of phages, the individually brewed cocktails of viruses that were a mainstay of Soviet Union medical care during the Cold War. So far, the FDA has allowed them into the U.S. market only as food-safety preparations, not as treatments for infections.

But for any of that to happen, the prospect of a post-antibiotic era has to be taken seriously, and those staring down the trend say that still seems unlikely. “Nobody relates to themselves lying in an ICU bed on a ventilator,” says Rice of Brown University. “And after it happens, they generally want to forget it.”


When I think of preventing this possible future, I re-read my great-uncle’s obit, weighing its old-fashioned language freighted with a small town’s grief.

The world is made up of “average” people, and that is probably why editorials are not written about any one of them. Yet among these average people, who are not “great” in political, social, religious, economic or other specialized fields, there are sometimes those who stand out above the rest: stand out for qualities that are intangible, that we can’t put our finger on.

Such a man was Joe McKenna, who died in the prime of life Friday. Joe was not one of the “greats.” Yet few men, probably, have been mourned by more of their neighbors — mourned sincerely, and sorrowfully — than this red-haired young man.

I run my cursor over the image of the tattered newsprint, the frayed creases betraying the years that someone carried the clipping with them. I picture my cousin’s grandmother flattening the fragile scrap as gently as if she were stroking her brother’s hot forehead, and reading the praise she must have known by heart, and folding it closed again. I remember the few stories I heard from my father, of how Joe’s death shattered his family, embittering my grandfather and turning their mother angry and cold.

I imagine what he might have thought — thirty years old, newly married, adored by his siblings, thrilled for the excitement of his job — if he had known that a few years later, his life could have been saved in hours. I think he would have marveled at antibiotics, and longed for them, and found our disrespect of them an enormous waste. As I do.


What if Everyone in the World Became a Vegetarian?

I came across this great article by L.V. Anderson on http://www.slate.com (an online daily magazine) and thought I’d share it with you. 

What if Everyone in the World Became a Vegetarian?

Calculating the chaos and the changed climate.

Vegan burgers with sweet potato and chickpeas.

The meat industry is one of the top contributors to climate change, directly and indirectly producing about 14.5 percent of the world’s anthropogenic greenhouse gas emissions, and global meat consumption is on the rise. People generally like eating meat—when poor people start making more money, they almost invariably start buying more meat. As the population grows and eats more animal products, the consequences for climate change, pollution, and land use could be catastrophic.

Attempts to reduce meat consumption usually focus on baby steps — Meatless Monday and “vegan before 6,” passable fake chicken, and in vitro burgers. If the world is going to eat less meat, it’s going to have to be coaxed and cajoled into doing it, according to conventional wisdom.

But what if the convincing were the easy part? Suppose everyone in the world voluntarily stopped eating meat, en masse. I know it’s not actually going to happen. But the best-case scenario from a climate perspective would be if all 7 billion of us woke up one day and realized that PETA was right all along. If this collective change of spirit came to pass, like Peter Singer’s dearest fantasy come true, what would the ramifications be?


At least one research team has run the numbers on what global veganism would mean for the planet. In 2009 researchers from the Netherlands Environmental Assessment Agency published their projections of the greenhouse gas consequences if humanity came to eat less meat, no meat, or no animal products at all. The researchers predicted that universal veganism would reduce agriculture-related carbon emissions by 17 percent, methane emissions by 24 percent, and nitrous oxide emissions by 21 percent by 2050. Universal vegetarianism would result in similarly impressive reductions in greenhouse gas emissions. What’s more, the Dutch researchers found that worldwide vegetarianism or veganism would achieve these gains at a much lower cost than a purely energy-focused intervention involving carbon taxes and renewable energy technology. The upshot: Universal eschewal of meat wouldn’t single-handedly stave off global warming, but it would go a long way toward mitigating climate change.

The Dutch researchers didn’t take into account what else might happen if everyone gave up meat. “In this scenario study we have ignored possible socio-economic implications such as the effect of health changes on GDP and population numbers,” wrote Elke Stehfest and her colleagues. “We have not analyzed the agro-economic consequences of the dietary changes and its implications; such consequences might not only involve transition costs, but also impacts on land prices. The costs that are associated with this transition might obviously offset some of the gains discussed here.”

Indeed. If the world actually did collectively go vegetarian or vegan over the course of a decade or two, it’s reasonable to think the economy would tank. According to “Livestock’s Long Shadow,” the influential 2006 U.N. report about meat’s devastating environmental effects, livestock production accounts for 1.4 percent of the world’s total GDP. The production and sale of animal products account for 1.3 billion people’s jobs, and 987 million of those people are poor. If demand for meat were to disappear overnight, those people’s livelihoods would disappear, and they would have to find new ways of making money. Now, some of them—like the industrial farmers who grow the corn that currently goes to feed animals on factory farms—would be in a position to adapt by shifting to in-demand plant-based food production. Others, namely the “huge number of people involved in livestock for lack of an alternative, particularly in Africa and Asia,” would probably be out of luck. (Things would be better for the global poor involved in the livestock trade if everyone continued to consume other animal products, such as eggs, milk, and wool, than if everyone decided to go vegan.) As the economy adjusted to the sudden lack of demand for meat products, we would expect to see widespread suffering and social unrest.

A second major ramification of global vegetarianism would be expanses of new land available. Currently, grazing land for ruminants—cows and their kin—accounts for a staggering 26 percent of the world’s ice-free land surface. The Dutch scientists predict that 2.7 billion hectares (about 10.4 million square miles) of that grazing land would be freed up by global vegetarianism, along with 100 million hectares (about 386,000 square miles) of land that’s currently used to grow crops for livestock. Not all of this land would be suitable for humans, but surely it stands to reason that this sudden influx of new territory would make land much cheaper on the whole.

A third major ramification of global vegetarianism would be that the risk of antibiotic-resistant infections would plummet. Currently, the routine use of antibiotics in animal farming to promote weight gain and prevent illness in unsanitary conditions is a major contributor to antibiotic resistance. Last year the Centers for Disease Control and Prevention announced that at least 2 million Americans fall ill from antibiotic-resistant pathogens every year and declared that “much of antibiotic use in animals is unnecessary and inappropriate and makes everyone less safe.” The overprescription of antibiotics for humans plays a big role in antibiotic resistance, but eradicating the factory farms from which many antibiotic-resistant bacteria emerge would make it more likely that we could continue to count on antibiotics to cure serious illnesses. (For a sense of what a “post-antibiotics future” would look like, read Maryn McKenna’s amazing article on the topic for Medium and her story about a possible solution for chicken farming in Slate.)

So what would be the result, in an all-vegetarian world, of the combination of widespread unemployment and economic disruption, millions of square miles of available land, and a lowered risk of antibiotic-resistant gonorrhea? I can only conclude that people would band together to form communes in order to escape capitalism’s ruthlessness, squat on the former pasture land, and adopt a lifestyle of free love.

I kid. Mostly. It’s easy to get carried away when you’re speculating about unlikely scenarios—and sudden intercontinental vegetarianism is very much an unlikely scenario.

But if the result of a worldwide shift to a plant-based diet sounds like a right-winger’s worst nightmare, it’s worth pointing out that continuing to eat as much meat as we currently do promises to result in a left-winger’s worst nightmare: In a world of untrammeled global warming, where disastrous weather events are routine, global conflicts will increase, only the wealthy will thrive, and the poor will suffer.

Let’s try a middle path. We’re not all going to become vegetarians, but most of us can stop giving our money to factory farms—the biggest and worst offenders, from a pollution and public health perspective. We can eat less meat than we currently do, especially meat from methane-releasing ruminants (cattle, sheep, goats, etc.). Just because a sudden global conversion to vegetarianism would have jarring effects doesn’t mean we can’t gradually reduce our consumption of meat, giving the market time to adjust. We not only can; we must. After all, with the world’s population slated to grow to 9 billion by 2050, we’ll be needing to take some of the 25 percent of the world’s land area back from the cows.

A much more sensible approach: “Can science stop sharks attacking humans?”

Here’s a much more sensible article outlining various ways that humans could try to modify their behaviour to work in harmony with sharks. Just as an inoculation for badgers in the UK would be a far better method than random culling, the same applies to the shark population. Mindless culling is never the answer!

Bull shark

Sharks have patrolled the oceans for at least 400 million years and evolved into a huge range of remarkable species.

There are deep sea lantern sharks that glow in the dark, wobbegong sharks that grow shaggy beards, and majestic, plankton-sifting whale sharks – the biggest fish in the sea.

Nevertheless, when many people think of these animals, one thing comes to mind: shark attacks.

As a beachgoer, diver or surfer your chances of encountering a shark, let alone being killed by one, are in fact incredibly slim; lightning strikes, bee stings and car accidents all pose far more of a threat than sharks.

In reality, people kill millions more sharks than sharks kill people.

A quarter of all shark species, and their relatives the rays, are threatened with extinction, according to a recent report from the International Union for the Conservation of Nature (IUCN).

The main threat to sharks is overfishing and in greatest peril are the largest species.

Shark-repellent wetsuits: the striped suit tells sharks a diver is not safe to eat, while the blue design acts as camouflage

But a controversial cull of sharks was recently ordered in Western Australia following a spate of attacks.

Scientists are now looking at other approaches to deal with the shark attack issue.

Prof Shaun Collin is leading a University of Western Australia (UWA) team of neurobiologists who are learning to think like sharks.

“We’re trying to tread this very fine line of protecting both humans and sharks at the same time,” Prof Collin told the BBC World Service programme Discovery.

By studying shark brains and shark senses, the team is developing and testing various non-lethal repellents. The aim is to manipulate the sharks’ finely-tuned senses in ways that discourage them from approaching and attacking people.

One of these is a “shark-proof” wetsuit designed to make people look like poisonous, black and white banded sea snakes, something that many sharks tend to avoid.

The stripy wetsuit was first thought up years ago by marine biologist Walter Starck. Now a detailed understanding of shark vision is helping the UWA team to bring this idea up to date.

Nathan Hart, assistant professor at UWA, explained to me that sharks don’t see as well as humans.

“We’ve made sure that the size of the bands can be detected by a shark from a certain distance,” he says.

Tests of the new wetsuit design are currently underway. This involves wrapping the fabric around a barrel filled with dead fish and watching how sharks respond to it in the wild.

It is still early days, but so far, Nathan told me, the results have been encouraging.

“Based on what we know about the sensory systems of sharks, they should reduce your risk to some extent,” he says.

“Just like a seatbelt in a car, it doesn’t reduce your risk to zero; it’s a matter of reducing your risk by a certain amount and by as much as possible,” he adds.

Shark dive boat in Fiji: divers have trained the sharks how to behave in return for food

As well as trying to protect individual swimmers, another tactic is to make certain areas out of bounds to sharks.

“We can try and define areas on the beaches where people are confident they can go and swim,” says Dr Hart.

Bubble curtains could be deployed to keep sharks away from popular beaches.

The idea is to lay perforated hosepipes across the seabed and pump air through them, creating a plume of bubbles that sharks may decide not to swim through.

Sharks can see and hear the bubbles and also feel them with their lateral line, a system of sense organs many fish have.

“It’s a system of what’s known as ‘distant touch’; it detects vibrations and very low-frequency sound in the water,” Nathan explained.

Early tests showed that tiger sharks eventually pluck up the courage to cross a barrier of bubbles, suggesting they have the ability to learn.

Eugenie Clark, a veteran marine biologist at the Mote Marine Laboratory in Florida, pioneered studies of shark learning back in the 1950s.

Nicknamed “The Shark Lady”, Dr Clark trained captive sharks to press targets with their snouts and ring bells for a food reward. She showed for the first time that sharks can learn and remember things.

Eugenie told me about the time she took a trained baby nurse shark as a gift for the Crown Prince of Japan who shared her fascination with fish.

“The airline gave me an extra seat for the shark. Most people didn’t know, he was such a tiny thing he was less than two feet long. But he never made a mistake,” says Eugenie.

Recently, I witnessed for myself the capacity sharks have to learn and in particular that they can learn not to attack people.

I went diving off the Pacific island of Fiji and saw my first bull sharks, notorious as one of the most aggressive shark species.

Locals from Beqa Adventure Divers have trained a population of around 100 bull sharks to approach a diver, one-by-one, and gently take a chunk of fish offered to them by hand.

The sharks have learned how to behave if they want food.

“They know us very well,” Fijian divemaster Papa told me before I jumped in the water. “That’s the good thing, they know what’s going on.”

Preparing for the dive, I wasn’t exactly sure how I would react to seeing these giant predators. But as soon as I got down beneath the waves my nerves evaporated and I saw just how graceful and calm bull sharks can be.

Male tiger shark killed as part of the Western Australia cull: the government responded to a recent spate of attacks with a cull

There was no safety cage or any sort of repellent and I never felt in any kind of danger.

As well as helping to shift the sharks’ bad reputation as insatiable killers, the Fijian divers are showing that a live shark in the water is worth far more than a dead one.

In Fiji and elsewhere around the world, sharks are under immense pressure from the demand in Asia for shark fin soup.

Back in Western Australia, the shark cull continues amid beachside protests.

The problem has been an abnormal spike in shark attacks, with seven fatalities over the last three years compared with 20 in the last century.

The response of the Western Australia government has been to lay baited hooks offshore from popular beaches. Any great white, tiger and bull sharks that are caught and are larger than 3m long are shot and dumped at sea.

One opponent of the cull is shark attack survivor Rodney Fox. Fifty years ago he suffered a horrific attack from a great white in South Australia but since then has become a dedicated shark advocate.

“We just have to learn how to live with the sharks and not just kill them from fear,” he told me.

He thinks killing sharks deliberately is an unscientific and irrational strategy to try to reduce the attack rate.

But Western Australia’s government says the cull is in place to protect swimmers and surfers. Premier Colin Barnett has said: “The West Australian government is absolutely confident that the policy in place is the right policy and we intend to continue it.”

An open letter from more than 100 scientists has urged Mr Barnett to reconsider the cull, highlighting its environmental impact and the low chance of catching the individual sharks responsible for the attacks.

“Every scientist that I’ve heard of and talked to all agree that it’s not the thing to do,” says Mr Fox.

Shark cull – enough to make you weep!

The shark cull going on in Western Australia this week is enough to make you weep. It’s reminiscent of the badger cull we are having here at the moment – an astonishingly incompetent and pointless exercise which serves only to be seen to be trying to solve a problem which is man-made in the first place and won’t make a damned bit of difference anyhow. The total disregard for the badgers and sharks is arrogant in the extreme and makes my blood boil! What is wrong with people???

Just read this article and you’ll see what a staggeringly inhumane and pointless waste of life this is…

In this photo released by Sea Shepherd, a male tiger shark hangs tied up on a fishing boat off Moses Rock on the Western Australian coast, on Saturday, 22 February 2014

More than 170 sharks have been caught on lines under a controversial cull policy in Western Australia.

Drum lines were set up along seven Western Australian beaches as part of a trial between January and April. Fifty of the biggest sharks were destroyed.

Authorities said the cull was necessary after six people were killed in shark attacks.

No great white sharks, to which most of the attacks were attributed, were caught.

The Western Australian state government said the cull was successfully restoring confidence among beachgoers.

It is seeking to continue the programme for three more years.

“I think the strategy’s gone very well, bearing in mind that it’s a very broad strategy, and that’s basically to protect those people that swim in those popular areas,” Western Australia Fisheries Minister Ken Baston said.

“While of course we will never know if any of the sharks caught would have harmed a person, this government will always place greatest value on human life.”

Protesters argue that a shark cull is not the answer and would only damage the sea’s delicate ecosystem.

“The policy is very unpopular, it has hardly caught any of the sharks it was destined to catch,” said Labor fisheries spokesman Dave Kelly.

“What people want is scientific research to show why the government thinks this policy makes our beaches safer.”