Away From the Bench

The world outside of the lab

Shape-ups Ship Out

News broke yesterday that Skechers settled Federal Trade Commission (FTC) charges for $40 million. The FTC claims the company “deceived consumers by making unfounded claims that Shape-ups would help people lose weight, and strengthen and tone their buttocks, legs and abdominal muscles”.

Now, if you’ve ever seen Shape-ups, you might think they would look good with a nice pair of slacks (I’m using ‘slacks’ as an outdated term from the ’80s). I personally would never buy shoes that look like that even if the company claimed they would tone 100 different muscles, and I wear athletic-looking shoes (I even have a different pair of Skechers!) almost every day. I apologize to any of my friends or family who bought these shoes (Amy), but they just didn’t do it for me.

Aesthetics aside, Skechers made false claims about its Shape-ups, Resistance Runner, Toners, and Tone-ups shoes. The names alone are enough to get a person excited (good marketing!), but telling consumers the shoes will perform miracles such as weight loss (without any lifestyle change) and falsifying clinical study data is just stupid. Great marketing always puts a shine on products, but you still “can’t polish a turd” (no matter what the Mythbusters say).

There will never be a miracle pill (or shoe) that will make you lose weight, get in shape, and tone your muscles without any effort from you. The human body is an amazing organism, and if you treat yours right, it will reward you. There are so many complicated systems at work inside your body that scientists working on a ‘miracle weight loss pill’ are discovering there may not be such a thing. I think the best story about the search for an ‘obesity cure’ begins back in the 1950s. The Jackson Laboratory, a company that breeds animal strains for scientific research, discovered a strain of mice that was constantly feeding, lethargic, and obese. These mice could not get enough food and would eat until they couldn’t move. (The blob on the right is one of those mice.)

Leibel, RL (2008) International Journal of Obesity, 32:S98–S108.

When molecular biology and DNA genotyping finally caught up in the mid-1990s, Dr. Jeffrey Friedman, Dr. Rudy Leibel, and their colleagues discovered that this mouse strain, named ob/ob (for its obesity), had a mutation in the gene for a hormone called leptin. They gave these obese mice leptin, and amazingly, the mice stopped constantly eating and lost weight. The miracle drug was found! Even the name leptin was derived from the Greek word leptós, or thin. Clinical leptin trials began…and ended…because people were not losing weight. The problem was, most obese people are not obese because they have low levels of leptin. Turns out, this hormone is made by your fat cells, or adipocytes. The adipocytes secrete leptin into the bloodstream, where it travels to your brain, finds the leptin receptors, and basically says, “hey, you, stop eating.” So what do obese people have? Lots and lots of leptin circulating in their blood from all of their adipocytes secreting the hormone. Many obese people have leptin resistance, which means the brain cannot use the leptin signals it is receiving to tell them to stop eating. So giving these people more leptin makes no difference. This research did help some people who have mutations similar to the ob/ob mice or who actually have a deficiency in leptin production. However, for the majority of obese people, it could not help.

The human body is great at adapting to its environment and working to survive.  If you help it along a little, you might be surprised at what it can do.  Don’t look for a magic pill or a magic shoe.  They just don’t exist.  So instead of buying these shoes and waiting for your calves to be toned while walking a block to your car, put on real sneakers and walk a little faster and a little farther, and then you may see some results.


Beetle Juice: the next protein shake?

Michiel den Hartogh is in the kitchen assembling a “crispy cricket” concoction — complete with curried mayonnaise, crocodile pie and fried crickets — with the special care due any delicacy.

This isn’t a chef preparing for the next episode of Fear Factor. The quote is from an article by Teri Schultz about the Dutch restaurant Spacktakel and its chef experimenting with insects and worms in his recipes. More of these kinds of restaurants may not be far off, with the European Union spending 3 million euros on insects-as-food research and the Food and Agriculture Organization of the United Nations assessing the potential of edible insects.

People have been eating bugs gathered from the wild for centuries. There are thousands of insect species throughout the world known to be safe for human consumption. I think insects have turned up as a delicacy in some restaurants because they’re something different and could be the ‘next big thing.’ There has been a trend toward Kobe beef and local farm-fresh foods in many expensive restaurants over the past several years, and as those become more and more common, chefs are looking for the next thing to tantalize customers. Insects could be it. If eating insects makes you shudder, how about the age-old luxury foods such as caviar and pâté? Caviar is sturgeon eggs treated with salt for flavor, and pâté is ground meat and fat blended into a spreadable paste. Not that appetizing either, huh?

The biggest obstacles to eating bugs, for most people, are taste and consistency. The taste can always be altered by adding other foods, stewing the insects in a soup or sauce, or even frying them. Preparing insects in these ways can also change the consistency, although some insects have exoskeletons that might have to be removed to avoid crunchiness. In Japan, inago (grasshoppers) fried or boiled in sweetened soy sauce have been eaten for centuries. Most likely, insects will be introduced for mass public consumption in the Western world as powdered insect protein added to other foods. (Insert joke about Jessica Seinfeld hiding spinach in her kids’ brownies here.) Unless they read the label in the grocery store, people might not even realize they are eating protein from insects. Bear Grylls wishes he could be that lucky.

Photo copyright simplemathbakery.com

Insects have long been recognized as a good food source. I found what appears to be a web 1.0 site from the late 1990s/early 2000s (complete with visitor counter!) at www.food-insects.com, created by Dr. Gene DeFoliart, now Professor Emeritus at UW-Madison. Studies from his lab (and others) have measured high levels of digestible protein in insects (46–96% protein depending on the species, with digestibility of 77–96%), including many essential amino acids (remember, those are the ones you must get from food sources). Insects convert plant protein to body protein mass much more efficiently than cows do (crickets are five times more efficient), and the protein content is comparable to that of vertebrate animals if the exoskeleton is removed. The exoskeleton, which contains chitin, could then be put to other agricultural (fertilizer) or industrial (thickener, binder) uses. Many different insect species (caterpillars, termites, grasshoppers) have high levels of minerals such as iron, copper, and zinc, as well as A and B vitamins.
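To put those numbers together: the protein you can actually use from an insect is roughly its crude protein content times its digestibility. A quick back-of-the-envelope sketch in Python, using the ranges quoted above (the function name is just for illustration):

```python
def usable_protein_pct(crude_protein_pct, digestibility_pct):
    """Rough usable protein: crude protein content x digestibility."""
    return round(crude_protein_pct * digestibility_pct / 100, 1)

# Low end of the quoted ranges: 46% protein, 77% digestible
print(usable_protein_pct(46, 77))  # -> 35.4
# High end: 96% protein, 96% digestible
print(usable_protein_pct(96, 96))  # -> 92.2
```

So the effective, digestible protein ranges from roughly 35% to 92% depending on the species.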

Insects as edible protein for humans could help the food industry end its battles with pink slime, feces contamination, hormone use, and animal housing conditions. It would also make the FDA’s new ‘suggestion’ to reduce antibiotic use in farm animals moot. Even though I am a meat-eater, I am concerned about the sustainability of large-mammal farming and how it affects the environment (methane gas, clear-cutting forests, etc.). I don’t think beef, poultry, or pork will completely disappear, but their availability may decrease drastically in the future.

For now, I think the best use for insects is animal (non-human) feed. There is a nice TEDx talk by Jason Drew in which he describes the forces behind his company’s (AgriProtein) plan to use fly larvae as animal feed, instead of feeding chickens and fish old chicken and fish parts. This return to a more natural, mass-produced food for animals is a great idea. Although free-range animals are best, the majority of our meat does not come from these animals. Drew calculates that one kilogram of fly eggs can create 318 kilograms of protein in 72 hours by feeding off of blood waste from slaughterhouses. The resulting larvae can be dried and powdered to add protein to any type of animal feed. If this type of feed can be used for large-scale farm operations, it could drastically improve feed quality and reduce the pathogen transmission that comes with the animal ‘cannibalism’ we have now.
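Taking Drew’s figure at face value, the rate is what makes it remarkable. A quick sketch (the variable names are mine, and this assumes the 318 kilograms accumulate evenly over the 72 hours):

```python
eggs_kg = 1.0       # starting mass of fly eggs
protein_kg = 318.0  # larval protein produced in 72 hours (Drew's figure)
hours = 72

yield_ratio = protein_kg / eggs_kg       # mass multiplication factor
per_day_kg = protein_kg / (hours / 24)   # average kg of protein per day
print(yield_ratio, per_day_kg)  # -> 318.0 106.0
```

That is over 100 kilograms of feed protein per day from a single kilogram of eggs, which is why the economics look so attractive.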

Currently, insects are an expensive delicacy for the Western world because there are not many companies producing insects for human consumption (although there are 20,000 insect ‘farmers’ in Thailand), but if mass production of ‘human-grade’ insects begins, they will become much cheaper than other animal meat.  In addition, I doubt there will be any movement to up-charge for healthier ‘organic’ or ‘free-range’ insects, which is a problem we currently have with other meat.  I’m interested to see if edible insects catch on in Europe and the U.S.  Perhaps in the future your spider roll at the sushi restaurant will actually be made of spider instead of soft-shell crab…but I’m also betting my mother will still not eat it.

How do you like them apples?

Creative Commons copyright msr via Flickr

An apple a day keeps the doctor away. Remember the good ol’ days when maxims like this were all you needed to live a healthy and long life? Now, the public is bombarded with confusing messages about their health every day. The latest example is 60 Minutes’ report on “Toxic Sugar”. I had several people ask me about this because they wanted to know my opinion, so here it is: just because Dr. Sanjay Gupta says something does not mean it is true…and the same goes for any one scientific study. Dr. Robert Lustig has been in the limelight recently; his idea to tax sugar was the basis for my Boston Sugar Party post. Let’s look at the “facts” Lustig and others presented in the report:

“Sugar is toxic.” Sugar itself is not toxic. You actually need sugar to live. Your brain needs 6 grams of glucose (a type of sugar) per hour to work properly. In addition, when your muscles and other tissues cannot take up glucose to create energy, as in the case of type 1 diabetics, you will starve to death. Think of it this way: you need water to live, but if you drink massive amounts of water, you will die within hours. If you drink massive amounts of sugary drinks, your pancreas will pump out more insulin to remove the sugar from the bloodstream; after a while you might become obese, then insulin resistant, some people might become diabetic, and only then will major organ damage occur. Considering the time each takes to kill you after overconsumption, water is more toxic than sugar.
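That 6 grams per hour adds up to a surprising share of your diet. A quick sketch, using the standard 4 kilocalories per gram of carbohydrate (the daily totals are approximations):

```python
grams_per_hour = 6  # glucose the brain needs per hour (figure above)
kcal_per_gram = 4   # standard energy value of carbohydrate

grams_per_day = grams_per_hour * 24
kcal_per_day = grams_per_day * kcal_per_gram
print(grams_per_day, kcal_per_day)  # -> 144 576
```

That is roughly 144 grams of glucose, or over a quarter of a 2,000-calorie diet, just to keep your brain running.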

“There is no foodstuff on the planet that has fructose that is poisonous.” Lustig was trying to make the point that humans WANT to eat fructose because of evolution and that we instinctively know it’s safe to eat. I’d like to see his analysis of the fructose content of holly berries, sweet peas, and the other poisonous foodstuffs listed by a poison control center in Philadelphia.

“We were born this way.” He is obviously a Lady Gaga fan, promoting her song “Born This Way”…OK, that’s untrue; I’m just trying to add some levity. He simply states that we are born to love fructose. Yes, we may enjoy eating fructose (it is the sweetest of all the natural sugars), but I still don’t buy the claim that evolution primed us to crave it because sweet foods are never poisonous.

“Fructose causes heart disease and stroke.” This is new research coming out of Dr. Kimber Stanhope’s lab, and any new hypothesis must be thoroughly tested before it becomes consensus. The amount of fructose consumed in the study is astronomically high, at 25% of daily calories. That is not unusual for scientific studies trying to determine dietary effects, but not many people actually drink this many calories every day without any type of exercise. She states there are increases in LDL cholesterol and “other risk factors” for cardiovascular disease, with no mention of whether these increases are statistically significant or whether sugar is ACTUALLY “just as bad for their hearts as their fatty cheeseburgers.”

“[In the 1970’s] a government commission mandated that we lower fat consumption to try to reduce heart disease.” This is true, and governmental attempts to lower fat in foods did not make us healthier. Some foods were processed until the fat was removed, but all the sugar, salt, and calories were still there, usually with some chemicals added in to replace the fat. However, there was no real regulation of fatty foods, and many ‘low-fat’ items on the shelf were always considered ‘diet foods’ that people did not eat consistently. People still gravitate toward high-fat foods, and sugar alone did not push American adult obesity to 36%. You can order a Double Quarter Pounder with Cheese from McDonald’s, which has 740 calories, more than half of them coming from its 42 grams of fat. But it only has 9 grams of sugar, so by sugar-is-the-villain logic it won’t make you fat! That, of course, is just not true. (To be fair to other fast food chains, an Extra Crispy Chicken Breast at KFC will set you back 510 calories, 290 of them from 33 grams of fat…but only 1 gram of sugar!) You need a well-rounded diet of protein, carbohydrate, and fat for your body to function properly. Telling people that sugar is toxic creates the next “scare”, just like the one Lustig says happened with fat 40 years ago, and obesity rates will keep increasing.
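You can check the fat math on those menu items yourself: fat has 9 calories per gram. A quick sketch (the function name is mine; nutrition labels round their ‘calories from fat’ figure, so the label may differ slightly from the computed value):

```python
def pct_calories_from_fat(total_calories, fat_grams):
    """Share of total calories from fat, at 9 kcal per gram of fat."""
    return round(fat_grams * 9 / total_calories * 100, 1)

# Double Quarter Pounder with Cheese: 740 kcal, 42 g fat
print(pct_calories_from_fat(740, 42))  # -> 51.1
# KFC Extra Crispy Chicken Breast: 510 kcal, 33 g fat
print(pct_calories_from_fat(510, 33))  # -> 58.2
```

Both items get more than half their calories from fat while containing almost no sugar, which is exactly why blaming sugar alone misses the bigger picture.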

“If you limit sugar, it decreases the chances of developing cancer.” Dr. Lewis Cantley’s example, that cancer cells have insulin receptors and will therefore take up excess glucose from the bloodstream and grow into tumors, was probably the most shocking to me. Insulin receptors bind insulin (which is secreted from your pancreas into the bloodstream after every meal), and if cells have certain glucose transporters, insulin ‘tells’ those transporters to move from inside the cell to the cell surface. This results in increased glucose entry into the cell. Many cells in your body have insulin receptors, including muscle, liver, fat, and brain cells. Muscle (skeletal and heart) and fat cells have the glucose transporters that are ‘activated’ by insulin. There is no data that I am aware of claiming cancer cells can take up more glucose than any of the other cells in your body. Also, insulin signaling (what happens inside the cell after insulin binds the insulin receptor), once activated, is eventually downregulated by a negative feedback system. Insulin cannot just sit on an insulin receptor and hang out, causing cancer cells to keep sucking up the sugar, or blue dots, as the schematic on the television show would like you to believe.

“Every cell in our body needs glucose to survive, but the problem is, these cancer cells also use it to grow.”  Once again ALL CELLS that have the insulin receptor use glucose to grow.  Period.

“Sugar is much more addictive than we thought early on.” These scientists seem very careful with their wording. Dr. Eric Stice showed that dopamine was released in Gupta’s brain when he drank some soda. Activation of taste receptors on your tongue causes dopamine to be released in the brain, no matter what the food. If a person likes spinach, the same reward areas will light up in MRI brain scans, especially if the person is hungry at the time. Dopamine regulates feeding behavior, and not just for sugar. Maybe sugar can be addictive, but so can many other things (including suntanning, as I mentioned in my Tanners Anonymous post). We can’t possibly regulate every choice people make.

As a scientist, I believe the scare tactics Lustig is using break the ‘unwritten rule’ of putting the science first in research. I am all for new ideas, but a scientist cannot go on national television to state untested claims as fact and still contribute positively to the field of science. You’re probably thinking, “he must be a first-hand expert to make these claims on television, right?” Scanning his last 27 papers published in peer-reviewed journals since 2009, I counted only five that were original work coming directly from his lab (where he was last author and they were not review articles). These five papers had nothing to do with ‘toxic sugar’ and its effects on the human body. The most recent concerns growth hormone deficiency in children after radiation treatment, and the other four are about the efficacy of lifestyle intervention (behavioral) programs in obese children. In the other 22 papers, he either contributed to another lab’s work, or they were review or opinion articles on sugar, fast food, and obesity. Something about this seems fishy to me, and it isn’t Friday. You can’t be an objective scientist and (what seems to me) a fame-seeker at the same time. He wrote an article about ‘toxic sugar’ in the journal Nature a couple of months ago, and that’s when the mainstream media started to interview him and publish articles on his opinions. There are five other articles in subsequent issues of Nature that refute his claims. Here are some quotes from these articles:

Our meta-analyses of controlled feeding trials indicate a net metabolic benefit, with no harmful effects, from fructose at a level of intake obtainable from fruit (Nature. 2012 Feb 23;482(7386):470).

To describe sugar as “toxic” is extreme, as is its ludicrous comparison with alcohol…Nutritionist Jennie Brand-Miller of the University of Sydney is not alone in her disgust that you published this opinion piece (The Australian, 4 February 2012). The Dietitians Association of Australia believes that it is simplistic and unhelpful to blame sugar alone for the obesity crisis. Alan Barclay of the Australian Diabetes Council notes in the same article in The Australian that sugar consumption in Australia has dropped by 23% since 1980. But he adds that during that time, the number of overweight or obese people has doubled, while diabetes has tripled (Nature. 2012 Feb 23;482(7386):470).

Overconsumption of foods that have a high glycaemic index (that trigger a rapid and sharp increase in blood glucose), such as wheat, potatoes and certain types of rice, also contributes to obesity and diabetes. Emphasis on sugar alone is therefore too narrow a basis for devising policies to curb these problems (Nature. 2012 Feb 23;482(7386):471).

Rather than demonizing sugar, the authors would have better served public health with recommendations to manage a balanced diet with exercise (Nature. 2012 Feb 23;482(7386):471).

The Food and Agriculture Organization of the United Nations, the US Food and Nutrition Board, and the European Food Standards Authority have all considered the issues now revisited by Lustig et al. and find no reliable evidence that typical sugar consumption contributes to any disease apart from dental caries. Without evidence that reducing sugar consumption would improve public health, Lustig and colleagues’ policy proposals are irrelevant. Scientific controversies should be settled by consideration of all the available evidence, not of a seemingly biased selection (Nature. 2012 Mar 8;483(7388):158).

I appreciate the attention the ‘Western diet’ (the diet high in refined grains, sugar, fat, and red meat) is getting, as there are major problems with it that lead to obesity and diabetes, but presenting hypotheses as fact to scare the public is not the way to do it. This is how fad diets are created, and they may cause people more harm than good. Educating the public with real information and making it easier for them to make good dietary choices is the best way to battle the bulge. I say it’s probably best if you don’t drink sugary drinks every day, but having one every once in a while will not kill you. And enjoy that apple.

You Can’t Do That…to Beef

I wasn’t planning on commenting on the pink slime issue because it’s all over the news, but my husband asked me last night, “What is this ‘pink slime’ they are talking about?” So I guess that means not everyone knows what’s going on with this stuff. The word ‘slime’ always reminds me of You Can’t Do That on Television, which I thoroughly enjoyed watching as a kid. Who wouldn’t want to get slimed every time they said, “I don’t know”? Double Dare also had some great slime activities. And Ghostbusters – they actually had PINK SLIME in the second movie! I guess the 1980s was the decade of slime (pun intended?).

Ok, let’s get back on track. ‘Pink slime’ in reference to foodstuffs came into the public consciousness last year when Chef Jamie Oliver made it his mission to get Americans to eat healthier on Jamie Oliver’s Food Revolution. Yet it was back in 2002 when USDA microbiologist Gerald Zirnstein toured a beef production facility and saw the meat filler being produced, which looked like pink slime to him. So he called it ‘pink slime’, even though the USDA refers to it as ‘Lean Finely Textured Beef’. Pink slime is used as a lower-cost filler for processed meat products, usually ground beef. The official USDA term sounds more appetizing, although its production may not be so palatable. The outermost part of the cow is mostly fat, but when it is heated and spun really fast in a centrifuge, any protein left in the tissue separates from the fat. This protein is mostly connective tissue, which is composed of different amino acids (the building blocks of protein) than those found in muscle tissue (the meat you usually eat). Essential amino acids are those you must get from your diet because your body cannot produce them. Connective tissue contains very little of these essential amino acids, which decreases the quality and nutritive value of the meat.

Not only is pink slime a lesser protein product, but it is also trimmed from near the cow’s hide, so the likelihood of contamination, especially with E. coli and Salmonella bacteria, is significantly increased. Every problem has a solution, of course, and the beef industry’s solution is to treat this protein product with ammonia gas (which forms ammonium hydroxide in the moist meat) to attempt to kill the bacteria that can kill you. The problem is, this doesn’t always work well – these pathogens have been found in 51 batches of meat slated for the federal school lunch program since 2005. Even though the USDA considers this gassing process safe, it still turns my stomach. Furthermore, the USDA does not require foods that contain pink slime to be labeled, and it’s possible that last package of ground beef you bought at the grocery store contained this filler, as the USDA still considers it 100% beef. I’m almost certain that frozen beef burrito you bought at 7-11 had some pink slime in it.

Luckily, there is a movement toward reducing pink slime in the food that we eat.  McDonald’s, Burger King, and Taco Bell have already discontinued using pink slime-containing products.  The public outcry over pink slime in school lunches recently persuaded the USDA to offer school districts the option to purchase pink slime-free products.  This story continues to develop, as a major pink slime producer, Beef Products Inc. just suspended operations at 3 of its 4 plants on Monday.

So if we continue to use pink slime to keep our food costs down, would you rather be eating E. coli or ammonium hydroxide? If you answered, “I don’t know,” you would have a bucket of slime on your head (if it were 1986). I think any movement by the food industry toward more natural, whole foods is a good idea. Let’s look on the bright side of all this: at least we aren’t making sausage out of children.

I Prefer My Candy Pre-Digested

You may not put much thought into the science behind the food you eat.  Does it matter how the Hungry Man freezes my mac and cheese so it’s creamy when I take it out of the microwave?  Not really, to most consumers.  I think the movement towards more local and natural foods is the best way to go, but I find the science behind how certain foods are modernized to fit today’s society intriguing.

Almost everyone enjoys some chocolate every now and then.  In my opinion, there is no competition for caramel-filled chocolates; however, the chocolates with the most interesting filling may be cherry cordials.  How do they get the runny syrup around a cherry and into the chocolate shell?  Although ‘cordials’ used to be medicinal elixirs thought to improve circulation or aid digestion, Americans thought it was best to remove the alcohol and cover that in chocolate, as they seem to do for most things.  There is a technique that can be used to mold the chocolate shell, fill it with syrup, and then cap it with more chocolate, but that isn’t the easiest process.

The secret behind this candy is that the syrupy center starts as a solid.  The filling is basically boiled sugar water, which can be poured around the maraschino cherry in a mold and cooled into a ball or cooled first to form fondant and then wrapped around the cherry like a ball.  How does this solid ball transform into the chin-dripping liquid?  Science and time.

Copyright dpstevenson2

Invertase, an enzyme that does the same job as the sucrase made in your small intestine, is added to the fondant. Don’t worry, the invertase that candy companies use is purified from yeast, not from spit. (That just reminds me of the time Dogfish Head had its employees chew corn to partially digest it into fermentable sugars to brew authentic chicha beer.) The invertase breaks up sucrose (table sugar) into glucose and fructose, which causes the hardened fondant to become a syrup. So after manufacturing, the candies need a few weeks to ‘mature’ in their chocolate shell while the invertase digests the sugar and creates the gooey center. Who knew chocolate-covered cherries needed age on them like a fine wine?
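For the chemically inclined, the reaction invertase catalyzes is a simple hydrolysis of sucrose into its two component sugars:

```latex
\underbrace{\mathrm{C_{12}H_{22}O_{11}}}_{\text{sucrose}} + \mathrm{H_2O}
\;\longrightarrow\;
\underbrace{\mathrm{C_6H_{12}O_6}}_{\text{glucose}} + \underbrace{\mathrm{C_6H_{12}O_6}}_{\text{fructose}}
```

The resulting glucose–fructose mixture (known as invert sugar) is more soluble and holds moisture better than sucrose, which is why the firm fondant slowly turns into syrup.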

Happy St. Patrick’s Day!

To continue the discussion of addiction…I hope you all drink responsibly today.

This song was posted last year, but the songwriter made some scientific corrections that were pointed out to him.

Enjoy!

Tanners Anonymous?

When I think of addiction, I immediately think of alcohol and drugs. Yet addiction can take many forms and act through different psychological and brain chemistry mechanisms. I recently came upon this article from several months ago describing tanning addiction. (Insert mandatory Jersey Shore/Snooki reference here.) I was amazed that UV light itself can actually turn on the reward system in the brains of excessive tanners (those who tan more than three times a week), measured as increased cerebral blood flow to specific areas of the brain. When the tanners got the UV light, they were satisfied (at least for a couple of days), but when the UV light was blocked without their knowledge, their desire to tan again remained.

I was trying to come up with an evolutionary reason for addiction and how people can get addicted to what seem to be very weird things.  I don’t know much about neurobiology, but can the brain be trained to reward itself after repeated behaviors?  Perhaps this is a survival mechanism when conditions are harsh, such as times of famine or torture?  I have no idea, but it’s fascinating to me.  There are some good examples of weird addictions on TLC’s My Strange Addiction.  The TV show follows different people who are compelled to eat soap, dryer sheets, and even drywall.  However, there are also psychological issues that may be involved in most of these cases.

What seems to be the most pressing addiction issue for the Surgeon General is tobacco use by children and teenagers. First, I want to compliment Dr. Benjamin on her sweet uniform. I know a few hipsters who would love to pair that with some skinny jeans. Anyway, she calls smoking and tobacco use a ‘pediatric epidemic’. Wow, that is a strong word, even with a quarter of U.S. high school seniors smoking. I am all for preventing people from getting addicted to tobacco and trying to help them quit. Not only does this save individuals money from not buying tobacco products, but it saves the government and healthcare industry billions of dollars in caring for those who have destroyed their bodies through its use.

The government seems to think fear and shock is the way to change people’s behavior. It attempted to require new warnings printed on all cigarette packs that include large images of stomas and dead bodies, but this was recently ruled unconstitutional. In an unscientific poll of late-20s/early-30s smokers I asked, they claimed these images wouldn’t affect their cigarette use at all. And, as we learned from my post on hand washing by doctors, people get desensitized to imagery. Starting Monday, you will probably see anti-smoking advertisements on TV and in print, as the government is spending $54 million on this campaign. It is possible these graphic ads may deter some younger children from picking up a cigarette…at least for a little while. I still remember the face of the former baseball player with a swollen face and no lower jaw from mouth cancer on a poster that was hanging in my elementary school. I applaud the federal government for taking up the slack from the states, which won a $246 billion settlement from the tobacco industry in 1998 and have used that money for things other than tobacco education. I just hope these ads will make a difference.

Addiction is complicated and involves so many mental and physical processes working together.  How can we change behavior when reward systems in the brain are so strong and addicts don’t fully understand the consequences until much later?  I know there are many people working on figuring that out, and I hope more answers are discovered soon.  In the meantime, take a look at this picture and answer me this: is this the skin of a smoker, the skin of a tanner, or a wallet?

Copyright BittBox

The Plant Whisperer

Tell that to a plant, how dangerous carbon dioxide is.

These words were spoken by Rick Santorum, who is vying for the Republican nomination for President of the United States.  This is not a political blog, so I’m not going to say anything about political views here, but I do have something to say about scientific ignorance.

Before I go further, I have to address the lack of English grammar.  He was probably trying to sound like he was speaking off-the-cuff, but it didn’t work for me, especially during a speech.  ‘Nuff said.  (and that’s my off-the-cuff comment lacking grammar)

I would like to go back to grade school science now and talk about ecosystems. Ecosystems are networks of organisms and their environment, linked through nutrient cycles and energy flow. Energy enters the system through photosynthesis (plants use the sun’s light energy to convert carbon dioxide and water into carbohydrates, oxygen, and other compounds). We, as humans, eat the plants (or eat other animals that ate the plants), breathe the oxygen, and release carbon dioxide into the air and organic compounds into the ground, and the cycle continues. Santorum was attempting to tell people that carbon dioxide is not dangerous because plants use carbon dioxide, but he failed to mention/realize/understand that humans and plants have opposite roles in the ecosystem. And what is good for plants is not necessarily good for humans. For example, sitting outside in the sun all day, every day would most likely result in sunburn and skin cancer – not oxygen production.
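The grade-school version of photosynthesis, in equation form:

```latex
6\,\mathrm{CO_2} + 6\,\mathrm{H_2O} \xrightarrow{\text{light energy}} \mathrm{C_6H_{12}O_6} + 6\,\mathrm{O_2}
```

Human cellular respiration runs this reaction in the opposite direction, consuming glucose and oxygen and releasing carbon dioxide and water, which is exactly why humans and plants occupy opposite roles in the ecosystem.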

I have another pearl of wisdom for you, Santorum:  carbon dioxide CAN be dangerous to plants.  Take volcanic Mammoth Mountain in central California, for instance.  In the early 1990s, trees began dying from high concentrations of carbon dioxide gas seeping into the soil from the adjacent volcano.  The ground fractured during small earthquakes and, most likely, a reservoir of gas trapped beneath the earth’s surface is leaking.  In addition, magma releases gases as it moves toward the surface of the earth.  The problem with too much carbon dioxide in the soil?  It interferes with the plant’s uptake of oxygen and nutrients through its roots.  According to a 2000 report from the U.S. Geological Survey, more than 100 acres of trees have died near Mammoth Mountain from too much carbon dioxide.

If I ever meet Santorum, maybe I’ll inform him of the differences between humans and plants, because he seems to need a refresher.  And next time I talk to a plant, I will let it know how dangerous carbon dioxide can be.

Animal Necessities

Under pressure from animal rights groups, several ferry and airline companies in Britain have stopped transporting animals for research.  No British airline will now carry research animals into the UK, although some foreign airlines still do.  Groups trying to prevent laboratories from using animals seriously compromise scientific and medical research.

Using animals for medical research is a touchy subject.  I don’t usually like to take issue with people voicing their opinion about animal research because people can get very upset about this topic.  However, as a user of animals for scientific research, I believe they are indispensable for medical progress.  Banning the use of animals in scientific research is equivalent to halting all progress in studying human diseases, drug discovery, and drug safety.

There is no replacement for primary tissue (tissue from an animal) or whole animals in the study of diseases.  Cell lines, which are specific cell types immortalized in a petri dish, cannot replace primary tissue.  Cells that are continually cultured in petri dishes are basically cancer cells.  Most cell lines come from cell-type-specific tumors, which are cut up and then coddled with media and growth factors to allow them to stick to the bottom of the dish and continue growing and dividing.  Cancer cells differ greatly from non-cancerous cells in their internal signaling pathways and metabolism, and therefore cannot reliably be used to understand how normal cells work or to gauge the efficacy of drugs for other diseases.

Furthermore, without a whole animal, there is no way to test the metabolism of new drugs.  There are many organ systems in the mammalian body.  Drugs have specific targets, yet other organs may be damaged, or the body may metabolize a drug into a non-functional molecule.  No new drug can be tested immediately in humans; it’s too risky.  Drugs need to be rigorously tested in animal models to determine whether they are likely to harm a person or have off-target effects.  There are several phases of drug discovery and development, and without animals, the drug development process completely stops.

Copyright University of Sydney, Kassiou Lab

To those who are concerned with animal welfare: all scientific research is tightly controlled by regulatory bodies.  In the US, the Institutional Animal Care and Use Committee (IACUC) oversees all aspects of animal research to promote responsible use of animals.  Any institution supported by the Public Health Service (PHS) must ensure the appropriate care and use of all animals involved in research, follow the “U.S. Government Principles for the Utilization and Care of Vertebrate Animals Used in Testing, Research, and Training,” and implement the Health Research Extension Act of 1985.  The Office of Laboratory Animal Welfare (OLAW) also ensures that all of the PHS and IACUC policies and regulations are enforced.  In the UK, the Animals (Scientific Procedures) Act 1986 licenses and regulates institutions, projects, and staff in similar ways.  Animal research labs in both countries have to follow protocols that prevent abuse by using the minimum number of animals with the least amount of pain, suffering, or distress.

To animal rights activists:  imagine a situation where your child has a deadly disease that could be managed and survived with a new drug.  Would you rather push for good checks and balances within the animal research system to ensure the animals are treated humanely, and use the drug that allows your child to live – or refuse to do everything possible to save your child?  The possibilities significantly dwindle without animal research.

Like Mom Always Said…

A few days ago, the CDC announced that 14,000 people die from Clostridium difficile (C. difficile) each year.  This is a bacterium that lives in your colon, causing inflammation, diarrhea, and nausea.  You might be thinking, so what?  The problem is that many of these deaths could be prevented by proper hand-washing, because guess how you transmit bacteria from your colon to other people?  The number of deaths is shocking to me considering that half of those cases were acquired in hospitals because of improper hand-washing by the staff (and this is just one of many hospital-acquired infections that can do harm, like staph infections).

Despite what you may have thought as a child, your mother did not make up the idea that washing your hands is good for you.  The first person to recognize that hand-washing can prevent disease was the Hungarian Dr. Ignaz Semmelweis around 1847.  He worked at Vienna General Hospital, which had two maternity clinics.  One was staffed by doctors and medical students and the other by midwives.  He observed that ~13% of the mothers at the doctors’ obstetrics clinic died from ‘childbed fever’, while only 2% of the mothers died while in the care of the midwives.  Semmelweis realized that the doctors and medical students were dissecting cadavers for autopsy or study, and ‘invisible cadaver particles’ were being transferred to the new mothers.  Thus began the era of the germ theory of disease, even though the microorganisms that caused these diseases were unknown at the time.  He instituted a regimen of washing hands and instruments in a chlorinated lime solution, and ‘childbed fever’ was nearly eradicated.

A recent Freakonomics podcast described hand-washing practices at Cedars-Sinai Medical Center and tried to relate compliance to fiscal responsibility.  I won’t get into how you can save more money here, but the story about hand-washing at a major U.S. hospital can be a lesson for all.  The hospital wanted to determine whether it was doing all it could to prevent the spread of infection.  It asked the doctors to self-report hand-washing, and 73% claimed they were washing their hands as they should.  However, during the same period, the nurses were ‘keeping an eye’ on the doctors and reported that only 9% of the doctors were in fact washing their hands between patients.  NINE PERCENT.  The most educated, supposedly most responsible people at the hospital had the worst hygienic behavior, while 65% of all hospital workers, including the custodial staff, were washing their hands properly.

The best part of this story is how they got these doctors to change their behavior.  Education through signs, emails, and rewards didn’t work, but showing them exactly what was on their hands did.  Each person pressed a hand into a petri dish of agar, which was cultured for a couple of days, and then they actually SAW the bacterial colonies growing in clumps in the shape of their hand.  The hospital took a picture of the worst one and made it the screensaver on every computer in the hospital.  Now people were washing their hands – 100% of the time.  Yet it’s human nature to become desensitized to things like this over time, and people fell back into their old habits.  The best way to get doctors to wash their hands?  Announcing the names of those who failed the hand-washing tests (using the agar plates) at departmental meetings and shaming them into doing a better job.

Human behavior is hard to change, no matter who you are.  If we want to prevent the spread of diseases everywhere, not just in the hospital, the take-home message is: wash your own hands early and often – and don’t be afraid to ask your doctor to do the same.
