Sunday, March 6, 2011

Patients Use Facebook, Twitter to Get Health Information

More and more patients are turning to social networks such as Twitter and Facebook for health information, according to a survey by the National Research Corporation.
In the survey of nearly 23,000 people in the United States, 41% said they use social media as a source of health care information. For nearly all of them (94%), Facebook was their site of choice, with YouTube coming in a distant second at 32%. Eighteen percent of the respondents said they turned to MySpace or Twitter for health information.
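To put those shares in absolute terms, here is a minimal back-of-the-envelope sketch; the head counts are estimates derived from the rounded percentages above, not figures published by the survey:

```python
# Back-of-the-envelope head counts from the survey's rounded percentages.
# The 23,000 respondent figure and the percentages come from the article;
# the resulting counts are estimates, not numbers the survey published.
total_respondents = 23_000

social_media_users = 0.41 * total_respondents   # 41% use social media for health info
facebook_users     = 0.94 * social_media_users  # 94% of those chose Facebook
youtube_users      = 0.32 * social_media_users  # 32% used YouTube
myspace_twitter    = 0.18 * social_media_users  # 18% used MySpace or Twitter

print(f"Social media users: ~{social_media_users:,.0f}")  # ~9,430
print(f"Facebook:           ~{facebook_users:,.0f}")       # ~8,864
print(f"YouTube:            ~{youtube_users:,.0f}")        # ~3,018
print(f"MySpace/Twitter:    ~{myspace_twitter:,.0f}")      # ~1,697
```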
One in four respondents said what they learned on these sites was “very likely” or “likely” to impact their future health decisions.
The respondents who used social media were 41 years old on average. People in households earning more than $75,000 were more likely to use social media for health care purposes than those in lower-earning households.
Another survey out this week found that one in four internet users living with a chronic ailment has gone online to find others with a similar health condition.
“The internet enables people to help people in a way we couldn’t help each other before,” says Susannah Fox, author of the report by the Pew Internet and American Life Project. “The internet gives people access to not only information, but also to each other.”
Many of the patients who sought out other patients online have rare diseases.
“They say no one really knows what they’re going through except someone in the same situation,” Fox says. “Those personal tips can make all the difference in the world.”
Most of the patients in the survey also consulted with their doctors.
“The oft-repeated fear that patients are using the internet to self-diagnose and self-medicate without reference to medical professionals does not emerge,” according to Fox.

Taste: It's in Your Nose and Memories

It looks like a small "everything bagel" and lox. But bite into it and, to your amusement, it's ice cream.
If you close your eyes and let the aromas sink in, it's all the same flavors you remember of your favorite seed-covered bagel with cream cheese and smoked salmon.
"It's important to have a sense of humor," said Wylie Dufresne, chef and owner of wd~50 in New York. The former "Iron Chef America" contender serves up this deceptive concoction at his Manhattan restaurant.
"We often play with perception," he said. "We take familiar foods and serve them in unfamiliar ways. ... We try to stay true to the taste memory."
Humor and memory might not be the first words that come to mind when planning tonight's dinner, but this kind of creative thinking has earned Dufresne wide acclaim in the food realm.
And it's true that taste isn't just about the way food hits your tongue; there's a whole science behind how we perceive flavor and develop preferences for certain foods.
Dufresne is just one cutting-edge chef who consults with scientists to create innovative dishes that defy expectations and play with the senses. Scientists and chefs have a lot to learn from each other, said Gary Beauchamp, a biologist who leads the Monell Chemical Senses Center in Philadelphia.
Here are some of the lessons from science that may help you think about eating in a new way:
1. You like what your mother ate
If you're pregnant, you may be transferring preferences for certain flavors to your baby right now. The food you eat gets into the amniotic fluid and flavors it. The fetus can detect those flavors and remember them after birth, Beauchamp said. This also happens with breast milk when a mother nurses an infant. As children get old enough to eat solid food, they show a preference for flavors they first experienced in the womb.
It works for flavors like carrot juice, but not for something like salt, since the amount of salt a mother eats doesn't affect the saltiness of amniotic fluid or breast milk. It's actually the smell component of flavors that gets transferred. Mothers can enhance a child's liking for healthy foods such as vegetables by eating them while pregnant and nursing.
In general, experience informs taste preferences, so if you know you've liked salty foods in the past, you're going to want them again. If everyone collectively moves toward a low-salt diet, people will begin to crave salt less, Beauchamp said.
And these preferences can begin from infancy. Babies fed starchy table foods high in salt showed elevated preferences for it, he said.
2. The "tongue map" isn't exactly right
You may have seen a textbook diagram of a tongue with a "sweet" spot on the front, "salty" areas on the sides and a "bitter" zone in the back. It's true that these areas are a bit more sensitive to those flavors, but in reality, there's no clear-cut map of which parts of your tongue taste what, Beauchamp said.
In fact, there are taste receptors in the back of the throat. This was demonstrated in the case of a woman whose tongue had to be removed; she could still sense flavors, he said.
3. The nose knows taste
A lot of what you perceive of flavor is actually aroma, scientists say. At the recent American Association for the Advancement of Science conference, Jane Leland of Kraft Foods' research and development group demonstrated this by having the audience pinch their noses while eating a yellow jelly bean. The candy was nearly tasteless.
What's going on? Basically, when your nose isn't closed up, the aroma of the food in your mouth travels through the back of the throat to the nose to give you the full flavor experience. When you block off the front part of your nose, it's like closing off the end of a hose so the water no longer flows, Beauchamp said.
4. Nostalgia while eating relates to smell
Many people have powerful memories of particular flavors from childhood, and re-experiencing them in adulthood can instantly bring back moments from years past. Marcel Proust famously begins "Remembrance of Things Past" with a description of biting into a small cake called a madeleine and being overwhelmed with sensations of prior times.
That's because the olfactory sense is perhaps the most primitive one, and the anatomical connections between smell and emotions are more direct than for other senses, Beauchamp said. However, there hasn't been concrete research on precisely how this phenomenon works, and it's hard to study because each person's experience is unique.
5. We evolved to "like things that are bad for us"
Humans are driven partly innately and partly culturally to consume salt, Beauchamp said.
People desire fats to store energy, but over the long run, too much can cause cell damage and shorten life, White House pastry chef William Yosses told the science conference last month.
"We are hard-wired to like things that are bad for us," he said.
Our ancestors probably didn't live to age 50, so they sought out salts and fats without regard for the consequences of a bad diet -- hypertension, heart disease and stroke -- in middle to old age, Beauchamp said.
And there's been plenty of warning about salt recently. In January, the federal government unveiled new guidelines recommending that African-Americans, everyone over age 51, and people with a history of hypertension, diabetes or kidney problems restrict their daily salt intake to half a teaspoon. For everyone else, it's about one teaspoon.
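For comparison with nutrition labels, which list sodium in milligrams, here is a minimal conversion sketch. The per-teaspoon weight of salt (about 5.7 g) and the sodium fraction of table salt (about 39%) are standard approximations rather than figures from the article; the official guidelines express the same limits as roughly 1,500 mg and 2,300 mg of sodium per day.

```python
# Rough conversion from teaspoons of table salt to milligrams of sodium.
# Assumptions (not from the article): a teaspoon of salt weighs about 5.7 g,
# and sodium makes up about 39.3% of table salt (NaCl) by weight.
GRAMS_SALT_PER_TSP = 5.7
SODIUM_FRACTION = 0.393

def sodium_mg(teaspoons_of_salt: float) -> float:
    """Approximate milligrams of sodium in the given amount of table salt."""
    return teaspoons_of_salt * GRAMS_SALT_PER_TSP * SODIUM_FRACTION * 1000

print(f"Half a teaspoon: ~{sodium_mg(0.5):,.0f} mg sodium")  # roughly 1,100 mg
print(f"One teaspoon:    ~{sodium_mg(1.0):,.0f} mg sodium")  # roughly 2,200 mg
```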
Salt enhances sweetness by inhibiting bitterness, Beauchamp said. One of the reasons you find salt everywhere is that it reduces the bitterness of vegetables, making them tastier.
6. Yes, you can reduce fat and salt without losing flavor
There are two strategies for addressing the salt problem: creating salt substitutes and coaxing people to shift their preferences, he said.
And what about reducing bad fats? Chefs can develop delicious recipes that aren't laden with fat and may even have a cleaner flavor, Yosses said.
Chocolate mousse, for instance, usually contains a combination of egg yolks and cream that creates a smooth, rich feeling on the tongue -- and could also cause molecular damage to arteries, he said.
Yosses' solution: Replace the cream and yolks with a different protein. Water and gelatin, together with chocolate, make a mousse that's still creamy, but it's healthier and you can taste more of the chocolate.
And David Lebovitz, a former San Francisco Bay Area pastry chef who now writes cookbooks in Paris, advocates minimalism. Even a few grains of salt can make a difference -- put them on top of, not in, your culinary creations, he said. They provide a contrast in your mouth to the sweetness of chocolate or caramel, for instance.
"I don't use substitutes," he said. "You can substitute carob for chocolate, but it's not chocolate. I don't think you should really trick people."

Thursday, March 3, 2011

We're a Nation of Germophobes


A recent CNN article examining hotel room hygiene revealed some uncomfortable truths. From bacteria and dead skin cells infesting the mattress to improper cleaning practices, it was enough to make anybody think twice about getting too comfortable in a hotel. However, such concerns are often overblown and, in some cases, unhealthy.
Part of the problem is that scientists and the media often fail to distinguish "germs" from "bacteria" properly.
Bacteria are a lot like snakes: Some are good, most are harmless, and only a few are dangerous. However, this distinction is often not articulated, and as a result, public fear over germs is greatly exaggerated.
One may be inclined to think that of all possible fears, "germophobia" -- the technical term being mysophobia -- is certainly the least harmful. It might even be beneficial. Better safe than sorry, right? Unfortunately, this is probably not true.
For instance, overexposure to antibacterial soaps containing triclosan was associated with allergies and hay fever in children under 18. Other evidence indicates that triclosan builds up in the environment and theoretically poses the risk of generating resistance among bacteria. Additionally, the Food and Drug Administration concluded in 2005 that antimicrobial soaps are no more effective at preventing illness than regular soap and water.
It also appears that ultra-clean environments may interfere with proper development of the immune system. This controversial, yet compelling, idea is known as the "hygiene hypothesis." Put simply, the hypothesis states that lack of exposure to bacteria and other infectious agents during childhood may partially be responsible for autoimmune disorders. A growing body of evidence indicates the hypothesis might be correct.
The idea is rooted in the epidemiological observation that the rate of autoimmune diseases has been increasing in the industrialized world, but not in the developing world. Presumably, this is because of the wide availability of hygienic products in rich countries, which are not available in poor countries. Hence, people in the developing world are exposed to more infectious agents than are people in the industrialized world.
The hypothesis, which is now well-documented, is further bolstered by a recent study that showed that children who grow up on farms, and are exposed to a greater diversity of infectious agents, are less likely to develop asthma. Also, women are more likely to suffer from allergies and asthma than men, and a recent study speculates this could be blamed on little girls avoiding dirt during playtime.
New therapies to treat autoimmune disorders are being developed that incorporate this new knowledge. Parasitic worms, which modulate the immune response, are used in "helminth therapy." Clinical trials have demonstrated that patients suffering from inflammatory bowel disease improved after ingesting pig whipworm eggs. Additionally, mice suffering from a disease similar to multiple sclerosis improved after being infected with flatworms.
No microbiologist or medical doctor would recommend dispensing with basic sanitary or hygienic practices. Everyone should continue washing their hands with (preferably) non-antimicrobial soap and using alcohol-based hand sanitizers. But paranoia about microbes, which are in every bite and breath we take, is counterproductive.
It is becoming clear that our national obsession with cleanliness is doing more harm than good. Even though cleanliness may be next to godliness, remember not to take it to an extreme. Indeed, being too clean may be bad for your health.

Swimming, Biking and Running His Way Off Meds


Three months ago I saw my personal physician and was started on three medications, two for high blood pressure and one for high LDL levels, the bad lipids. It was that day that I applied for the CNN Fit Nation Triathlon Challenge and decided that I needed to make a change in my life towards healthier living.
My family history is strong for hypertension on my dad’s side, so it was no surprise that I would have problems as well. For years I had borderline hypertension and was even medicated for a couple of years. When I lost a few pounds the BP improved and I was able to come off medications. Over the years the weight came back and the high blood pressure returned. When I saw my primary care provider in November my blood pressure was 140/98. That is the highest it has ever been, and I was promptly started on my medications.
One of my goals of the challenge was to get fit and get off my medications. On medications my BP improved to 120/70, but I wanted to be at that level without medications. I started exercising an hour 4-5 times a week. I could see and feel the weight coming off and hoped that the blood pressure would start to improve. After this routine for a couple months I had a couple episodes of lightheadedness. I am not the best at drinking water while working out so I thought that it was because I wasn’t getting enough fluids in. I started drinking more water with exercise. Then one morning I bent down to tie my son’s basketball shoes, stood up and almost fainted. Could my blood pressure have improved that much with only a couple months of routine exercise? I went in to the clinic the following Monday and my BP was 90/60! Goodbye medication #1! I always knew that my blood pressure was weight dependent, but didn’t think that it would respond so quickly and dramatically.
I am happy to report that the lightheadedness has resolved and my BP was 116/68 at last check. However, I am still on one blood pressure medication. My goals haven’t changed; I still want to get off that second medication and will continue to work hard and eat healthier and hope that by August 7, the day of the Nautica New York City Triathlon, I will be off all of my medications.

On the Brain: Does a Full Bladder Free Your Mind?


This week there's a somewhat bizarre study about whether judgment improves after drinking copious amounts of water, as well as research in Alzheimer's disease and early childhood mental disorders.
A little self-control
Don't make a hard decision with an empty bladder, suggests new research from the Netherlands. In a study published in the journal Psychological Science, psychologists at the University of Twente demonstrate that bladder control is related to the same part of the brain associated with feelings of desire and reward, the Telegraph reports. And people who drank five cups of water in the study made better decisions than those who took small sips. That's perhaps because feelings of inhibition are all connected in the brain, so self-control about one thing can "spill over" (haha) into something else, Discover writes. But before you change your bathroom habits, consider that this shows only a correlation between drinking water and good choices; we don't know that a full bladder causes any mental advantages.
Inside anesthesia
There's still a lot of mystery surrounding the way anesthesia works, and Dr. Emery Neal Brown is seeking to make things clearer. Brown, at Massachusetts General Hospital and Harvard Medical School, has done brain imaging research on people under anesthesia to see what's happening in your head when you're put under. He's found that there is a level of regular brain activity happening even when a person is "out." The New York Times has an interview with him.
Family history matters
If your mother had Alzheimer's disease, you have a heightened risk of getting it too - higher than if your father had it, says a new study from the University of Kansas School of Medicine in the journal Neurology. None of the participants in the study had dementia, but those who had a mother with Alzheimer's disease showed brain abnormalities in areas relating to memory, among others. The Los Angeles Times reports.
Mental illness from infancy?
If you think that there's a minimum age for mental illness, you're wrong, according to researchers writing in the journal American Psychologist. Trauma can influence young children's mental development, but so can the challenges of everyday life, leading to feelings of helplessness, depression or anxiety. Yet few psychologists specialize in early childhood mental health, and the researchers urge more support programs in this area.