What's Hot


February 2004


February 27, 2004

Hormone replacement therapy risks could have been revealed sooner

Two researchers writing in the February 28, 2004 issue of the British Medical Journal have reported that the increased health risks associated with hormone replacement therapy (HRT) in women were evident much earlier than 2002, when the Women's Health Initiative made its well-publicized announcement that its trial was being halted early because of increased risks of coronary heart disease, stroke and breast cancer observed among HRT users compared to nonusers. Consequently, women on hormone replacement were exposed to an unnecessary increased risk of these deadly conditions that could have been avoided sooner.

Hormone replacement therapy was initially heralded as protective against cardiovascular disease on the basis of findings from observational studies, which identify trends in various populations, rather than randomized trials, which administer a therapy to one group of subjects and compare outcomes with subjects receiving a placebo. In 1997, authors Klim McPherson and Elina Hemminki published a meta-analysis of 23 small randomized studies of hormone replacement therapy, many of which had been conducted by pharmaceutical companies in order to obtain drug licenses. The trials included a total of approximately 2,000 women who received hormone replacement and 1,300 controls. The authors discovered that HRT was not as protective as the observational data had suggested, and that a higher proportion of women who received the hormones had cardiovascular events than those taking placebos. Their findings were met with ridicule, yet six subsequent studies provided additional supporting evidence.

The authors recommend that pharmaceutical companies should be required to make the results of their trials public, including adverse events. In this way the public would be able to learn about the relative risks of new drugs more quickly and avoid needless exposure to their possible dangers.

—D Dye


February 25, 2004

Italian combo helps protect against cancer in animal model

A combination of tomato and garlic has been shown to have broader anticancer effects than either plant food alone when tested in hamsters.

Researchers at Annamalai University in India gave an extract of garlic, tomato paste, both, or neither substance to hamsters that subsequently had the carcinogen dimethylbenz[a]anthracene (DMBA) administered to their buccal pouches; control animals did not receive the carcinogen. DMBA induces a form of cancer in hamsters similar to human oral squamous cell carcinoma.

Buccal pouch, liver and red blood cell lipid peroxidation and antioxidant enzymes were measured after fourteen weeks. At this time, all of the animals that did not receive DMBA were free of tumors, while none of those that received the carcinogen without either plant food were tumor free. Among the animals treated with DMBA, the tomato extract lowered tumor incidence by 25 percent, the garlic extract lowered it by 27.5 percent, and both groups had smaller tumors than hamsters that did not receive the protective vegetable compounds. The combination of garlic and tomato reduced tumor incidence to 25 percent of that of hamsters treated with DMBA alone and greatly reduced tumor size.

In animals who developed tumors, garlic combined with tomato was significantly associated with reduced lipid peroxidation and an elevation of glutathione-dependent antioxidant enzymes in the liver and red blood cells compared to levels found in DMBA-treated hamsters who did not receive the compounds.

To the authors' knowledge, this study is the first to show the combined benefit of tomato and garlic in this animal model of cancer. The results of this investigation show that an improved anticancer benefit can be provided by combining plant foods with different mechanisms of action.

The study was published in the February 2004 issue of the journal Nutrition Research.

—D Dye


February 23, 2004

Serum antioxidant levels correlate with asthma reduction in youth

The February 1, 2004 issue of the American Journal of Respiratory and Critical Care Medicine published the results of research conducted at Cornell University which found a negative association between blood serum levels of some antioxidant nutrients and asthma prevalence. Studies relying upon dietary intake levels have produced inconsistent findings. The current study examined serum levels of the antioxidant nutrients beta-carotene, vitamin C, vitamin E and selenium in 6,153 individuals aged 4 to 16. Participants were recruited from children in the households of subjects taking part in the third National Health and Nutrition Examination Survey (NHANES III). Parents were questioned concerning the presence of asthma in their children, and blood samples were analyzed for levels of the antioxidants as well as for cotinine, which measures exposure to smoke.

Four hundred fifteen children were reported to have asthma. Not surprisingly, exposure to smoke was found to increase asthma risk. Although there was no relationship determined between serum vitamin E levels and asthma, all of the other antioxidant nutrients in the analysis were found to have serum levels that were inversely associated with asthma. Selenium alone had a strong association with forced expiratory volume in one second, which declines with asthma severity. Additionally, increased levels of beta-carotene, vitamin C and selenium were associated with a reduction in asthma prevalence in subjects exposed to smoke.

Rachel N Rubin and coauthors explained that, “Antioxidant status may affect asthma risk by influencing the development of the asthmatic immune phenotype, the asthmatic response to antigen provocation, or the inflammatory response during and after the asthma attack.” They conclude that the findings of this study suggest a role for antioxidants in the prevention of asthma or in slowing its progression. (Rubin RN et al, “Relationship of serum antioxidants to asthma prevalence in youth,” American Journal of Respiratory and Critical Care Medicine, 2004, vol 169, pp 393-398.)

—D Dye


February 20, 2004

New evidence that DHEA may help grow new brain cells

The online edition of the Proceedings of the National Academy of Sciences (www.pnas.org) published a report on February 18, 2004 that provides evidence that dehydroepiandrosterone (DHEA) may help the brain produce new cells. DHEA is a steroid hormone produced by the adrenal glands that declines with aging, and is taken as an over-the-counter supplement by many individuals. Most studies utilizing DHEA have been conducted in rodents, and its mechanism of action in the human nervous system is unknown.

Researchers from the University of Wisconsin at Madison grew human fetal neural cells in culture, which formed aggregates called neurospheres. When DHEA, epidermal growth factor and leukemia inhibitory factor were administered to the cells, a 29 percent increase in new brain cells was observed compared to cells that received the same factors minus DHEA. This suggests that DHEA is involved in human neural stem cell maintenance and replication. Precursors of DHEA, such as pregnenolone or its metabolites, had no effect on cell proliferation.

Senior author Clive Svendsen, professor of anatomy and neurology at the University of Wisconsin, Madison, announced, “This is the first real evidence of DHEA's effects on human neural cells.” He explained, “What we saw was that DHEA significantly increased the division of the cells. It also increased the number of neurons produced by the stem cells, prompting increased neurogenesis of cells in culture.”

One interesting possibility noted by Svendsen is that DHEA could provide benefits to the adult human brain. It has long been known that DHEA levels decline during aging. Because adults have neural stem cells that continue to create new neurons in some brain areas, it is possible that DHEA could be involved in the formation of new brain cells.

—D Dye


February 18, 2004

Serum homocysteine levels predict heart attack mortality in women

A report published in the February 10, 2004 issue of the American Heart Association journal Circulation has uncovered an association between elevated levels of homocysteine and both heart attack risk and death from heart attack. Homocysteine has been established as a heart disease risk factor in men, but few studies have examined its role in women.

The study involved participants enrolled in the Population Study of Women in Gothenburg (Sweden), which began in 1968 to 1969. The current study followed 1,368 participants, all free of myocardial infarction at the study's onset, for twenty-four years. Frozen blood samples were analyzed for serum homocysteine levels in 2001.

During the follow-up period, eighty-eight acute myocardial infarctions occurred, with forty-two of them resulting in fatalities. The Scandinavian research team found that homocysteine was an independent risk factor for heart attack and death from heart attack. Women in the highest one-fifth of homocysteine levels (greater than 14.2 micromoles per liter) had nearly twice the risk of experiencing an acute myocardial infarction and over five times the risk of dying from it than the remaining participants, after adjustment for various factors. Homocysteine levels were positively correlated with age and inversely associated with serum vitamin B12.

The authors write that, “. . . (serum total homocysteine)-lowering treatment with combined vitamins B12, B6, and folate has shown promising potential of reducing secondary coronary events in a few recently published studies, which may have wider clinical application in the future. However, whether this is applicable to all vascular events remains to be shown. Interestingly, vitamin B12 deficiency, folate deficiency, and hyperhomocysteinemia have more recently been linked to risk for development of dementia, which may reflect its vascular etiology.” (Zylberstein DE et al, “Serum homocysteine in relation to mortality and morbidity from coronary heart disease,” Circulation, Feb 10, 2004, pp 601-606.)

—D Dye


February 16, 2004

Mother was right about cod liver oil

At a press conference held at the Royal College of Surgeons in London on February 12, 2004, researchers from Cardiff University in Wales announced the results of a study that found, for the first time in humans, that cod liver oil really is effective in slowing the progression of osteoarthritis. The team, led by Professor Bruce Caterson and Professor John Harwood of Cardiff University and Professor Colin Dent of the University of Wales College of Medicine, provided two extra-strength cod liver oil capsules per day to arthritis patients for ten to twelve weeks before knee replacement surgery. Eighty-six percent of these participants experienced a partial or complete reduction in the enzymes that cause cartilage damage, compared to 26 percent of a placebo group. Enzymes that cause joint pain were also reduced in the cod liver oil group.

Professor Caterson commented, "This breakthrough is hugely significant because it demonstrates the efficacy of a dietary intake of cod liver oil in patients with osteoarthritis taken prior to their joint replacement surgery. The data suggests that cod liver oil has a dual mode of action, potentially slowing down the cartilage degeneration inherent in osteoarthritis and also reducing factors that cause pain and inflammation. What these findings suggest is that by taking cod liver oil, people are more likely to delay the onset of osteoarthritis and less likely to require multiple joint replacements later in life.”

Professor Dent added, “Patients resort to joint replacement surgery when the symptoms and pain of their arthritis become unbearable. Cod liver oil can counteract these symptoms, and if you can switch off the cartilage destruction and pain then surgery may not be necessary. We're very excited by this latest trial.”

—D Dye


February 13, 2004

C-reactive protein levels linked with macular degeneration

The February 11, 2004 issue of the Journal of the American Medical Association (http://jama.ama-assn.org) published a report that establishes a link between elevated levels of the inflammatory marker C-reactive protein (CRP) and age-related macular degeneration (AMD), a disease of the eye that affects older individuals and is the leading cause of vision loss in this age group. The investigation involved participants in the Age-Related Eye Disease Study (AREDS), a multicenter study designed to assess the incidence, prognosis and risk factors in the development of age-related macular degeneration and cataract.

The current study involved 930 participants from two study sites. Blood samples were analyzed for levels of C-reactive protein. The participants were divided into four groups according to the severity of macular degeneration present, or its absence.

C-reactive protein levels were significantly higher in the group diagnosed with advanced macular degeneration than in those in whom the disease was absent. Adjusted analysis found CRP levels to be significantly associated with the presence of both intermediate and advanced stages of AMD. Those whose CRP levels were in the highest one-fourth had a 65 percent increased risk of macular degeneration compared to those in the lowest one-fourth.

This study is the first to the authors' knowledge to establish an association between CRP levels and age-related macular degeneration in a large population. The finding may implicate inflammation in the development of age-related macular degeneration, adding to the number of conditions for which inflammation has recently emerged as a causative factor. The authors suggest that, “Anti-inflammatory agents might have a role in preventing AMD, and inflammatory biomarkers such as CRP may provide a method of identifying individuals for whom these agents and other therapies would be more or less effective.”

—D Dye


February 11, 2004

Higher iron stores a risk factor for diabetes in women

A report published in the February 11, 2004 issue of the Journal of the American Medical Association (http://jama.ama-assn.org) described the findings of Harvard researchers that increased body iron stores are associated with an increased risk of type 2 diabetes in women who have no other risk factors for the disease. It is known that individuals with the genetic disease hemochromatosis (which involves excessive iron storage in the body) have a higher incidence of diabetes, but it was unknown whether moderately high iron stores in healthy people could also lead to the disease.

The researchers, led by Rui Jiang, MD, DrPH, of the Harvard School of Public Health in Boston, analyzed blood samples provided from 1989 to 1990 by 32,826 diabetes-free participants in the Nurses' Health Study, a prospective investigation into the causes of major diseases begun in 1976. The samples were analyzed for concentrations of plasma ferritin, a blood iron-protein complex, and the ratio of transferrin receptors (which are iron transporters) to ferritin. During the ten-year follow-up period, 698 women developed diabetes. These participants were matched by age, race, fasting status and body mass index to 716 control subjects.

The authors wrote, “At baseline, the mean ferritin concentration was significantly higher (109 vs. 71.5 nanograms/milliliter) and the mean ratio of transferrin receptors to ferritin was significantly lower (102 vs. 141) in the cases than in the healthy controls . . . This finding may have important implications for the prevention of type 2 diabetes because elevated ferritin concentration and a lower ratio of transferrin receptors to ferritin in healthy populations may help to identify a high-risk population for type 2 diabetes who may benefit from further evaluation and interventions (lifestyle or therapeutic).”

—D Dye


February 9, 2004

Red wine polyphenols but not alcohol inhibit platelet aggregation

A study published in the October 2003 European Journal of Internal Medicine found that red wine and its polyphenols inhibited platelet aggregation in vitro, but alcohol did not. Alcohol consumption in moderation has been found to be associated with a lower incidence of cardiovascular disease, and inhibition of platelet aggregation, involved in the formation of blood clots, is believed to be one of alcohol's protective mechanisms of action.

The researchers, from Meander Medical Center in the Netherlands, applied varying concentrations of red wine, a red wine polyphenol extract, and alcohol to blood platelets two minutes before aggregation was induced by the addition of adenosine-5-phosphate.

The polyphenol extract significantly and dose-dependently inhibited platelet aggregation at concentrations of 45 milligrams per liter and higher. Concentrations of 180 mg per liter and higher of the red wine polyphenol extract completely inhibited platelet aggregation, even when adenosine-5-phosphate was added in a high concentration. Red wine inhibited platelet aggregation only at very high concentrations, while no concentrations of alcohol were found to have an inhibitory effect.

Although alcohol has demonstrated an inhibitory effect on cyclo-oxygenase and the formation of thromboxane A (both involved in the clotting process), the current study suggests that it is the polyphenols from red wine, and not the alcohol, that are responsible for its inhibition of platelet aggregation, and that polyphenols affect platelets through a direct interaction as opposed to long-term effects. The concentration of polyphenols that provided significant aggregation inhibition in this study is unlikely to be obtained through moderate wine drinking. The protective benefit of wine against heart disease is probably related to metabolic changes rather than a direct blockade of platelet aggregation.

—D Dye


February 6, 2004

Cool helmets protect brain

The American Heart Association's 29th International Stroke Conference was the site of the unveiling, on February 5, 2004, of the findings of two studies utilizing a helmet that protects post-stroke patients by cooling the brain. Ischemic stroke occurs when a blood clot lodges in the blood vessels in or leading to the brain. Prior research has shown that hypothermia protects the brain from ischemic injury following a stroke, but cooling the entire body can elicit adverse effects.

The first study, conducted in Japan, involved seventeen patients who had a cooling helmet attached to the head and neck three to twelve hours after the onset of ischemic stroke, and who wore it for three to seven days without anesthesia. Surface brain temperature was lowered 4 degrees Fahrenheit and deep brain temperature was lowered 1.4 degrees. After ten months of follow-up, 35 percent of the patients had good functional outcome and only one had died.

The second study, conducted on six ischemic stroke patients by Huan Wang, MD, and colleagues at the University of Illinois in Peoria, utilized a helmet that used a liquid cooling technology developed at NASA. Fiberoptic probes inserted into the brain were used to monitor the patients' brain temperatures. The brains cooled an average of 6 degrees during the first hour after the helmets were applied, while body temperatures did not change significantly until six to eight hours later. The procedure was well tolerated by the participants, with the exception of one eighty-five-year-old woman who developed an abnormal heart rate, which was promptly treated.

Dr Wang predicted, “We believe that if you keep the brain tissue cool, you will have a longer tissue survival time. Then, when we open the artery, we could salvage much more brain tissue and hopefully avoid adverse neurological effects.”

—D Dye


February 4, 2004

Yogurt bacteria do not need to be live

The February 2004 issue of Gastroenterology published the findings of researchers at the University of California, San Diego and the Shaare Zedek Medical Center in Jerusalem, Israel, that probiotics, the “friendly” bacteria used to make yogurt and kefir, do not have to be living to provide benefits. It had previously been believed that the viability of these microorganisms was essential to their ability to help treat several conditions, such as inflammatory bowel disease and allergies.

Senior author and professor of medicine at UCSD, Eyal Raz, MD, explained, “Our goal was to address whether the metabolic activity of probiotics was mandatory for their protective effect.” The researchers irradiated probiotics to reduce their metabolic activity to a minimum, and administered the inactive bacteria to mice in whom colitis, which is similar to human inflammatory bowel disease, had been induced. Another group of colitis-induced mice received nonirradiated probiotics. It was discovered that both nonviable and viable probiotics effectively treated the colitis.

Prior research had used heat to inactivate probiotics, but this destroyed the bacteria's cellular structure as well as their benefits. The DNA found in each cell stimulates the innate immune system, providing an anti-inflammatory effect. In another experiment, Raz's team found that the immune molecule toll-like receptor 9 needs to be activated for the probiotics to provide their benefits against colitis in this mouse model.

—D Dye


February 2, 2004

Teen girls need vitamin D

The Annual Meeting of the American Society for Bone and Mineral Research was the site of a presentation by Susan Sullivan of the Department of Food Science and Human Nutrition at the University of Maine, which revealed that teenaged girls may not be getting adequate vitamin D. Because vitamin D is critical for bone growth, deficiency can predispose girls to developing osteoporosis later in life.

Dr Sullivan, along with Dr Cliff Rosen of the Maine Center for Osteoporosis Research and Education at St Joseph Hospital in Bangor, followed 23 girls, ages 10 to 13, for three years, monitoring their bone mineral density, diet, and blood levels of vitamin D. They found that half of the girls had low levels of vitamin D in March, when levels of the vitamin are usually at their lowest, and 17 percent were deficient in September, when levels are usually at their highest.

The bone mineral density tests confirmed that as the girls went through puberty, calcium was rapidly being added to the bones. Vitamin D is an essential part of this process. Dr Sullivan explained, “Puberty is a very critical time when up to half of a person's adult bone mass is being deposited. If you think about life span, peak bone mass occurs at about the age of 30. This is such an important time when girls are growing their bones . . . We've known for a long time that vitamin D has a role in getting calcium into bones. Researchers are now finding evidence that vitamin D could play other roles in health such as cancer prevention and controlling blood pressure. There are vitamin D receptors in lots of tissues in the body that aren't related to bone.”

—D Dye
