Away From the Bench

The world outside of the lab

Archive for the month “March, 2012”

You Can’t Do That…to Beef

I wasn’t planning on commenting on the pink slime issue because it’s all over the news, but my husband asked me last night, “What is this ‘pink slime’ they are talking about?”  So I guess that means not everyone knows what’s going on with this stuff.  The word ‘slime’ always reminds me of You Can’t Do That on Television, which I thoroughly enjoyed watching as a kid.  Who wouldn’t want to get slimed every time they said, “I don’t know”?  Double Dare also had some great slime activities.  And Ghostbusters – they actually had PINK SLIME in the second movie!  I guess the 1980s were the decade of slime (pun intended?).

Ok, let’s get back on track.  ‘Pink slime’ in reference to foodstuffs came into the public consciousness last year when Chef Jamie Oliver made it his mission to get Americans to eat healthier on Jamie Oliver’s Food Revolution.  Yet, it was back in 2002 when USDA microbiologist Gerald Zirnstein toured a beef production facility and saw the meat filler being produced, which looked like pink slime to him.  So he called it ‘pink slime’, even though the USDA refers to it as ‘Lean Finely Textured Beef’.  Pink slime is used as a lower-cost filler for processed meat products, usually ground beef.  The official USDA term sounds more appetizing, although its production may not be so palatable.  The outermost part of the cow is mostly fat, but when it is heated and spun really fast in a centrifuge, any protein left in the tissue separates from the fat.  This protein is mostly connective tissue, which is composed of different amino acids (the building blocks of protein) than those found in muscle tissue (the meat you usually eat).  Essential amino acids are those you must get from your diet because your body cannot produce them.  Connective tissue contains very little of these essential amino acids, which decreases the quality and nutritive value of the meat.

Not only is pink slime a lesser protein product, but because it is trimmed from near the cow’s hide, the likelihood of contamination, especially from E. coli and Salmonella bacteria, is significantly increased.  Every problem has a solution, of course, and the beef industry’s solution is to treat this protein product with ammonia gas (forming ammonium hydroxide) to attempt to kill the bacteria that can kill you.  The problem is, this doesn’t always work well – these pathogens have been found in 51 batches of meat slated for the federal school lunch program since 2005.  Even though the USDA considers this gassing process safe, it still turns my stomach.  Furthermore, the USDA does not require foods that contain pink slime to be labeled, and it’s possible that last package of ground beef you bought at the grocery store contained this filler, as the USDA still considers it 100% beef.  I’m almost certain that frozen beef burrito you bought at 7-11 had some pink slime in it.

Luckily, there is a movement toward reducing pink slime in the food that we eat.  McDonald’s, Burger King, and Taco Bell have already discontinued using pink slime-containing products.  The public outcry over pink slime in school lunches recently persuaded the USDA to offer school districts the option to purchase pink slime-free products.  This story continues to develop, as a major pink slime producer, Beef Products Inc., just suspended operations at three of its four plants on Monday.

So if we continue to use pink slime to keep our food costs down, would you rather be eating E. coli or ammonium hydroxide?  If you answered, “I don’t know,” you would have a bucket of slime on your head (if it were 1986).  I think any movement by the food industry towards more natural, whole foods is a good idea.  Let’s look on the bright side of all this:  at least we aren’t making sausage out of children.


I Prefer My Candy Pre-Digested

You may not put much thought into the science behind the food you eat.  Does it matter how the Hungry Man freezes my mac and cheese so it’s creamy when I take it out of the microwave?  Not really, to most consumers.  I think the movement towards more local and natural foods is the best way to go, but I find the science behind how certain foods are modernized to fit today’s society intriguing.

Almost everyone enjoys some chocolate every now and then.  In my opinion, there is no competition for caramel-filled chocolates; however, the chocolates with the most interesting filling may be cherry cordials.  How do they get the runny syrup around a cherry and into the chocolate shell?  Although ‘cordials’ used to be medicinal elixirs thought to improve circulation or aid digestion, Americans thought it was best to remove the alcohol and cover that in chocolate, as they seem to do for most things.  There is a technique that can be used to mold the chocolate shell, fill it with syrup, and then cap it with more chocolate, but that isn’t the easiest process.

The secret behind this candy is that the syrupy center starts as a solid.  The filling is basically boiled sugar water, which can be poured around the maraschino cherry in a mold and cooled into a ball or cooled first to form fondant and then wrapped around the cherry like a ball.  How does this solid ball transform into the chin-dripping liquid?  Science and time.

(Image credit: dpstevenson2)

Invertase, a digestive enzyme produced in your saliva and small intestine, is added to the fondant.  Don’t worry, the invertase that candy companies use is purified from yeast, not from spit.  (That just reminds me of the time Dogfish Head had its employees chew corn to partially digest it into fermentable sugars to brew authentic chicha beer.)  The invertase enzyme breaks sucrose (table sugar) into glucose and fructose, which causes the hardened fondant to become a syrup.  So after manufacturing, the candies need a few weeks to ‘mature’ in their chocolate shell and let the invertase eat the sugar to create the gooey center.  Who knew chocolate-covered cherries needed age on them like a fine wine?
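If you like, you can put rough numbers on that maturing time.  Here is a minimal sketch treating the inversion as a simple first-order reaction; the rate constant is entirely made up for illustration, since real maturation depends on the enzyme dose, temperature, and moisture:

```python
import math

# Toy first-order model of invertase "maturing" a cherry cordial.
# k is an assumed rate constant (per day), chosen only so that the
# model matches the "few weeks" timescale described above.
k = 0.15

def sucrose_remaining(days, k=k):
    """Fraction of the original sucrose still un-inverted after `days`."""
    return math.exp(-k * days)

# After three weeks, almost all the sucrose has been split into
# glucose and fructose, and the fondant center has turned to syrup.
print(round(sucrose_remaining(21), 3))  # → 0.043
```

With this assumed rate, less than 5% of the sucrose remains after three weeks – roughly when the center has gone gooey.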

Happy St. Patrick’s Day!

To continue the discussion of addiction…I hope you all drink responsibly today.

This song was posted last year, but the songwriter made some scientific corrections that were pointed out to him.

Enjoy!

Tanners Anonymous?

When I think of addiction, I immediately think of alcohol and drugs.  Yet, addiction can take many forms and act through different psychological and brain-chemistry mechanisms.  I recently came upon this article from several months ago that described tanning addiction.  (Insert mandatory Jersey Shore/Snooki reference here).  I was amazed that UV light itself can actually cause the reward system in the brains of excessive tanners (those tanning more than 3 times a week) to activate, as measured by increased cerebral blood flow to specific areas of the brain.  When the tanners got the UV light, they were satisfied (at least for a couple of days), but when the UV light was blocked without their knowledge, their desire to tan again remained.

I was trying to come up with an evolutionary reason for addiction and how people can get addicted to what seem to be very weird things.  I don’t know much about neurobiology, but can the brain be trained to reward itself after repeated behaviors?  Perhaps this is a survival mechanism when conditions are harsh, such as times of famine or torture?  I have no idea, but it’s fascinating to me.  There are some good examples of weird addictions on TLC’s My Strange Addiction.  The TV show follows different people who are compelled to eat soap, dryer sheets, and even drywall.  However, there are also psychological issues that may be involved in most of these cases.

What seems to be the most pressing addiction issue for the Surgeon General is tobacco use by children and teenagers.  First, I want to compliment Dr. Benjamin on her sweet uniform.  I know a few hipsters who would love to pair that with some skinny jeans.  Anyway, she calls smoking and tobacco use a ‘pediatric epidemic’.  Wow, that is a strong term, even with a quarter of U.S. high school seniors smoking.  I am all for preventing people from getting addicted to tobacco and trying to help them quit.  Not only does quitting save the individual money from not buying tobacco products, but it saves the government and healthcare industry billions of dollars in caring for those who have destroyed their bodies through its use.

The government seems to think fear and shock are the way to change people’s behavior.  They attempted to require new warnings printed on all cigarette packs that include large images of stomas and dead bodies, but this was recently ruled unconstitutional.  In my unscientific poll of late-20s/early-30s smokers, they claimed these images wouldn’t affect their cigarette use at all.  And, as we learned from my post on hand washing by doctors, people get desensitized to imagery.  Starting Monday, you will probably see anti-smoking advertisements on TV and in print, as the government is spending $54 million on this campaign.  It is possible these graphic ads may deter some younger children from picking up a cigarette…at least for a little while.  I still remember the former baseball player with a swollen face and no lower jaw from mouth cancer on a poster that was hanging in my elementary school.  I applaud the federal government for taking up the slack from the states, which won a $246 billion settlement from the tobacco industry in 1998 and have used this money for things other than tobacco education.  I just hope these ads will make a difference.

Addiction is complicated and involves so many mental and physical processes working together.  How can we change behavior when reward systems in the brain are so strong and addicts don’t fully understand the consequences until much later?  I know there are many people working on figuring that out, and I hope more answers are discovered soon.  In the meantime, take a look at this picture and answer me this: is this the skin of a smoker, the skin of a tanner, or a wallet?

Copyright BittBox

The Plant Whisperer

“Tell that to a plant, how dangerous carbon dioxide is.”

These words were spoken by Rick Santorum, who is vying for the Republican nomination for President of the United States.  This is not a political blog, so I’m not going to say anything about political views here, but I do have something to say about scientific ignorance.

Before I go further, I have to address the lack of English grammar.  He was probably trying to sound like he was speaking off-the-cuff, but it didn’t work for me, especially during a speech.  ‘Nuff said.  (and that’s my off-the-cuff comment lacking grammar)

I would like to go back to grade school science now and talk about ecosystems.  Ecosystems are networks of organisms and their environment, linked through nutrient cycles and energy flow.  Energy enters the system through photosynthesis (plants convert carbon dioxide into carbohydrates, oxygen, and other compounds using the sun’s light energy).  We, as humans, eat the plants (or eat other animals that ate the plants), breathe the oxygen, and release carbon dioxide into the air and organic compounds into the ground, and the cycle continues.  Santorum was attempting to tell people that carbon dioxide is not dangerous because plants use carbon dioxide, but he failed to mention/realize/understand that humans and plants have opposite roles in the ecosystem.  And what is good for plants is not necessarily good for humans.  For example, sitting outside in the sun all day, every day would most likely result in sunburn and skin cancer – not oxygen production.
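That parenthetical about photosynthesis boils down to one familiar (and simplified) balanced reaction; our respiration effectively runs the same chemistry in reverse:

```latex
6\,\mathrm{CO_2} + 6\,\mathrm{H_2O} + \text{light energy}
\;\longrightarrow\; \mathrm{C_6H_{12}O_6} + 6\,\mathrm{O_2}
```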

I have another pearl of wisdom for you, Santorum:  carbon dioxide CAN be dangerous to plants.  Take volcanic Mammoth Mountain in central California, for instance.  In the early 1990s, trees began dying from high concentrations of carbon dioxide gas seeping into the soil from the adjacent volcano.  The ground fractured during small earthquakes, and most likely a reservoir of gas trapped beneath the earth’s surface is leaking.  In addition, magma releases gases as it moves toward the surface of the earth.  The problem with too much carbon dioxide in the soil?  It interferes with the plant’s uptake of oxygen and nutrients through its roots.  According to a 2000 report from the U.S. Geological Survey, more than 100 acres of trees have died near Mammoth Mountain from too much carbon dioxide.

If I ever meet Santorum, maybe I’ll inform him of the differences between humans and plants, because he seems to need a refresher.  And next time I talk to a plant, I will let it know how dangerous carbon dioxide can be.

Animal Necessities

Several ferry and airline companies in Britain have stopped transporting animals for research under pressure from animal rights groups.  No British airline will carry research animals to the UK, although foreign airlines still do.  Groups that try to prevent laboratories from using animals seriously compromise scientific and medical research.

Using animals for medical research is a touchy subject.  I don’t usually like to take issue with people voicing their opinion about animal research because people can get very upset about this topic.  However, as a user of animals for scientific research, I believe they are indispensable for medical progress.  Banning the use of animals in scientific research is equivalent to halting all progress in the study of human diseases, drug discovery, and drug safety.

There is no replacement for primary tissue (tissue from an animal) or whole animals for the study of diseases.  Cell lines, which are specific cell types immortalized in a petri dish, cannot replace primary tissue.  Cells that are continually cultured in petri dishes are basically cancer cells.  Most cell lines come from cell-type specific tumors, which are cut up and then coddled with media and growth factors to allow them to stick to the bottom of the dish and continue growing and dividing.  Cancer cells differ greatly from non-cancerous cells in their internal signaling pathways and metabolism, and therefore cannot reliably be used to understand how normal cells work or to test the efficacy of drugs for other diseases.

Furthermore, without a whole animal, there is no way to test the metabolism of new drugs.  There are many organ systems in the mammalian body.  Drugs have specific targets, yet other organs may be damaged, or the body may metabolize a drug into a non-functional molecule.  No new drug can immediately be tested in humans; it’s too risky.  Drugs need to be rigorously tested in animal models to determine if they are likely to harm a person or have off-target effects.  There are several phases of drug discovery and development, and without animals, the drug development process completely stops.

(Image credit: University of Sydney, Kassiou Lab)

To those concerned with animal welfare:  all scientific research is tightly controlled by regulatory bodies.  In the US, the Institutional Animal Care and Use Committee (IACUC) oversees all aspects of animal research to promote responsible use of animals.  Any institution supported by the Public Health Service (PHS) must ensure the appropriate care and use of all animals involved in research, follow the “U.S. Government Principles for the Utilization and Care of Vertebrate Animals Used in Testing, Research, and Training,” and implement the Health Research Extension Act of 1985.  The Office of Laboratory Animal Welfare (OLAW) also ensures that all of the PHS and IACUC policies and regulations are enforced.  In the UK, the Animals (Scientific Procedures) Act 1986 licenses and regulates institutions, projects, and staff in similar ways.  Animal research labs in both countries have to follow protocols that prevent abuse by using the minimum number of animals with the least amount of pain, suffering, or distress.

To animal rights activists:  think about a situation where your child has a deadly disease that could be managed, and survived, with a new drug.  Would you rather push for good checks and balances within the animal research system to ensure that animals are treated humanely and use the drug that allows your child to live, or forgo doing everything possible to save your child?  The possibilities significantly dwindle without animal research.

Like Mom Always Said…

A few days ago, the CDC announced that 14,000 people die from Clostridium difficile (C. difficile) infections each year.  This is a bacterium that lives in your colon, causing inflammation, diarrhea, and nausea.  You might be thinking, so what?  The problem is that many of these deaths could be prevented by proper hand-washing, because guess how you transmit bacteria from your colon to other people?  The number of deaths is shocking to me when I think that half of those cases were acquired in hospitals because of improper hand-washing by the staff (and this is just one of many hospital-acquired infections that can do harm, like staph infections).

Despite what you may have thought as a child, your mother did not make up the idea that washing your hands is good for you.  The first person to recognize that hand-washing can prevent disease was the Hungarian Dr. Ignaz Semmelweis, around 1847.  He worked at Vienna General Hospital, which had two maternity clinics.  One was staffed by doctors and medical students and the other by midwives.  He observed that ~13% of the mothers at the obstetrics clinic died from ‘childbed fever’, while only 2% of the mothers died while in the care of the midwives.  Semmelweis realized that doctors and medical students were dissecting cadavers for autopsy or study, and ‘invisible cadaver particles’ were being transferred to the new mothers.  His insight presaged the germ theory of disease, even though the microorganisms that caused these diseases were unknown at the time.  He instituted the regimen of washing hands and instruments in a chlorinated lime solution, and ‘childbed fever’ was nearly eradicated.

A recent Freakonomics podcast described hand-washing practices at Cedars-Sinai Medical Center and tried to relate compliance to fiscal responsibility.  I won’t get into how you can save more money here, but the story about hand washing at a major U.S. hospital can be a lesson for all.  The hospital wanted to determine if they were doing all they could to prevent the spread of infection.  They asked the doctors to self-report hand-washing, and 73% claimed they were washing their hands as they should.  However, during the same period, the nurses were ‘keeping an eye’ on the doctors and reported that only 9% of the doctors were in fact washing their hands between patients.  NINE PERCENT.  The most educated, supposedly most responsible people at the hospital had the worst hygienic behavior, compared to the 65% of all hospital workers, including the custodial staff, who were washing their hands properly.

The best part of this story is how they got these doctors to change their behavior.  Education through signs, emails, and rewards didn’t work, but showing them exactly what was on their hands did.  Each person pressed their hand into a petri dish of agar, which was cultured for a couple of days, and then they actually SAW the bacterial colonies growing in clumps in the shape of their hand.  The hospital took a picture of the worst one and made it the screensaver on every computer in the hospital.  Now people were washing their hands – 100% of the time.  Yet, it’s human nature to become desensitized to things like this over time, and people fell back into their old habits.  The best way to get doctors to wash their hands?  Announcing the names of those who failed the hand-washing tests (using the agar plates) at departmental meetings and shaming them into doing a better job.

Human behavior is hard to change, no matter who you might be.  If we want to prevent spread of diseases everywhere, not just in the hospital, the take-home message is: wash your own hands early and often – and don’t be afraid to ask your doctor to do the same.

Is That a Pancreas in Your Pocket?

When most people hear the term ‘vital organ,’ the pancreas doesn’t usually pop into their head.  However, the pancreas is vital for survival, keeping your blood sugar in check.  That’s why artificial pancreas trials are so exciting.  A majority of Type 1 diabetics are diagnosed as children or teenagers and must live the rest of their lives with insulin injections or insulin pumps, which they control themselves.  Having a machine automatically adjust insulin and glucagon injections based on continuous blood glucose monitoring would make diabetic life much easier.  This is especially important in young children and during times when awareness of low blood sugar is impaired by low blood sugar itself (or during sleep), as the brain needs 6 grams of glucose per hour to function properly.  When blood sugar levels drop too low (hypoglycemia), a person can go into a coma or die because their brain and organs are not getting enough glucose as fuel to survive.

When you eat a meal, carbohydrates and other sugars get broken down or converted into glucose, which circulates through the bloodstream.  The pancreas contains clusters of endocrine cells (cells that secrete hormones into the bloodstream) called islets of Langerhans.  The majority of cells in the islet are beta cells, which secrete insulin, and alpha cells, which secrete glucagon.  The beta cells sense the glucose concentration in the blood and secrete insulin in response, which travels through the bloodstream to other organs.  The insulin signals those tissues (such as muscle, liver, and fat) to take up the glucose from the blood into the cell to be used for energy or stored for later.  High blood sugar (hyperglycemia) results from a lack of beta cells (in the case of Type 1 diabetics), and the tissues ‘starve’ because they are not getting the signal to take up and process glucose from the bloodstream.  The body then tries to remove the glucose from the blood through excess urine production.  This is why poor glycemic control over time can damage the kidneys and transplants may become necessary, although hyperglycemia causes many other complications, such as microvascular diseases and nerve damage.

Normally, when a person becomes hypoglycemic, glucagon is secreted from pancreatic islet alpha cells, traveling through the bloodstream to the liver.  Glucagon signals to the liver to break down glucose stores (stored in long, branching chains called glycogen) and release glucose into the bloodstream.  If both insulin and glucagon are provided in a regulated manner, blood glucose levels can be adjusted accordingly.

(Image credit: discoverysedge.mayo.edu)

The current insulin pumps provide a basal infusion of insulin to help keep blood glucose levels steady, but they are under the control of the user and must be manually adjusted throughout the day, especially during eating (more insulin) or exercise (less insulin).  Most diabetics prick their finger several times a day to test their blood glucose levels, as continuous blood glucose sensors are not widely used and still need to be calibrated at least twice a day with finger pricks.  People using insulin must be aware of rapid declines in their blood glucose levels; if their blood sugar drops too low, they must eat or drink high-sugar foods or ingest glucose tablets to restore normal glycemia.  It is imperative that artificial pancreas blood glucose sensors are accurate and able to sense downward trends exceedingly well.  If a person starts exercising, muscle tissue becomes more sensitive to insulin and can even take up glucose independently of insulin because the need for energy (glucose) is greater.  If the sensor cannot perceive the drop in glycemia quickly and accurately, confusion, coma, and death may follow.  Combining better automatic blood glucose sensing with insulin and glucagon infusion in a bionic pancreas is a major step towards creating a better life for diabetics.  Getting the technology reliable enough is the challenge.
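To make that sense-and-dose loop concrete, here is a minimal sketch of the kind of decision logic involved.  All of the thresholds are invented for illustration – real devices use clinically validated control algorithms, not this simple scheme – but it shows why the controller must watch the trend as well as the latest reading:

```python
# Toy sketch of artificial-pancreas decision logic.  All numbers are
# assumed for illustration (mg/dL); not a real medical algorithm.
TARGET = 100.0        # assumed glucose setpoint
LOW, HIGH = 70.0, 180.0

def dose(readings):
    """Decide a hormone action from recent glucose readings, oldest first.

    Looks at both the latest level and the trend over the window,
    since a fast downward slope matters as much as the absolute value.
    """
    latest = readings[-1]
    trend = readings[-1] - readings[0]  # mg/dL change over the window
    if latest < LOW or (latest < TARGET and trend < -20):
        return "glucagon"   # already low, or falling fast: raise glucose
    if latest > HIGH:
        return "insulin"    # hyperglycemic: lower glucose
    return "none"           # in range and stable: basal infusion only

print(dose([110, 95, 82]))    # → glucagon (dropping quickly toward low)
print(dose([150, 170, 190]))  # → insulin (hyperglycemic)
```

The first case is the important one: a reading of 82 mg/dL isn’t alarming on its own, but a 28-point drop over the window is, so the controller acts before hypoglycemia sets in.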

Nerd Alert

If you thought science and discovery were only for nerds and PhDs, think again.  Computers and technology have allowed some researchers to collect so much data that they cannot possibly sift through it by themselves.  Often an algorithm cannot replace human recognition and analysis, and the data must be examined piece by piece.  This has led some groups to reach out to the public for help.

I first became aware of crowdsourcing, in a different form, several years ago when I downloaded Screensaver Lifesaver.  This screensaver used my computer to scan potential cancer-fighting compound structures while I was not using the computer myself.  The University of Oxford basically built a supercomputer through the networking of 3.5 million personal computers, even ones as crappy as my 16GB, 300MHz Dell desktop.  I was amazed when Amazon introduced Mechanical Turk in 2005, a website where anyone can post a project for others to do, or you can choose to do a project, usually for compensation.  Currently, there are projects that pay up to $17.50 for your time, but many are small tasks that can be done for pennies.  I was tempted to use the Turk to help analyze mitochondrial shapes within cells, but never did actually outsource my graduate work (much to the chagrin of my husband, who also never understood why robots weren’t feeding my cell lines for me on the weekends).
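The idea behind those virtual supercomputers is simple: carve a huge search space into independent chunks, score each chunk on whatever spare compute is available, and merge the results.  Here is a minimal sketch – the scoring function is a made-up stand-in for real docking or signal-analysis code, and a thread pool stands in for the network of volunteer machines:

```python
from concurrent.futures import ThreadPoolExecutor

# Each integer pretends to be a candidate compound (or radio signal).
# The "score" is an arbitrary deterministic hash, standing in for an
# expensive analysis that volunteers' machines would run in parallel.
def score_candidate(n):
    return n, (n * 2654435761) % 1000

def best_candidates(search_space, top=3, workers=4):
    """Score every candidate across the worker pool, keep the top hits."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        scored = list(pool.map(score_candidate, search_space))
    return sorted(scored, key=lambda pair: pair[1], reverse=True)[:top]

print(best_candidates(range(10)))  # → [(9, 849), (5, 805), (1, 761)]
```

Because each candidate is scored independently, the work scales out to as many machines as volunteer their spare cycles – which is exactly what made the 3.5-million-PC network possible.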

(Image credit: NASA/ESA)

Most recently, SETI (Search for Extraterrestrial Intelligence) announced that it will begin crowdsourcing the search for life outside of Earth.  Although SETI has been using the virtual supercomputer screensaver concept since 1999 to hunt for radio frequency signals coming from stars likely to have alien life, it is now reaching out to humans.  They need people to go through this data because there are so many man-made interference signals that their algorithms cannot distinguish the differences as well as a person can.  Anyone can sign up and help the search for E.T. at www.setilive.org.

The crowdsourcing model has proven to be effective, as new planets have been identified by armchair astronomy enthusiasts searching the public images produced by the NASA Kepler Mission.  The Citizen Science Alliance, which works in collaboration with other organizations, now offers many different projects through Zooniverse, where you can search for new planets, stars, and supernovas.  There are even opportunities to branch outside of astronomy, such as categorizing whale calls to try to decipher the language of our wet mammal brothers.

The possibilities of crowdsourcing seem endless.  I am excited that the power of many can accelerate science and discovery much more than previously possible.  Public participation also increases awareness of scientific research and scientific literacy.  If you are ever sitting at home, bored of watching TV, maybe you should pop open the computer and look at some cool space pictures.  You never know, you could be the one to identify a never-before-seen astronomical anomaly.  Projects like these might just bring out the nerd in all of us.

“Stupid Design”
