Recently discovered protein shows promise in treating Alzheimer’s disease

New research in humans shows that the FKBP52 protein may prevent the Tau protein from turning pathogenic, or disease-causing.

The FKBP52 protein was discovered by Baulieu 20 years ago and is known for its ability to block Tau protein accumulation, which is commonly seen in Alzheimer’s disease (AD) patients. Microtubules are the railways within nerve cells upon which cellular cargo is transported. In patients with AD, tangles formed by misfolded Tau proteins may compromise the stability of the microtubules within the nerve cells, leaving them damaged. Currently, the mechanism of Tau toxicity is unclear and there are no drug treatments targeting Tau.

Professor Etienne Baulieu and colleagues at Inserm (the French National Institute of Health and Medical Research) have recently published results that demonstrate, for the first time, that the FKBP52 protein may prevent the hyperphosphorylation, or excessive chemical modification, of Tau protein, a characteristic of Alzheimer’s disease.

Specifically, the results of this study demonstrate a direct correlation between high levels of hyperphosphorylated Tau protein and reduced levels of FKBP52 in brain cells from patients who had died with AD, compared to normal brain cells. This indicates that when FKBP52 is reduced in the nerve cells of AD patients, disease-causing Tau is free to accumulate and contribute to the degeneration of brain cells.

IOS Press BV (2012, March 20). New hope for treating Alzheimer’s disease: A role for the FKBP52 protein. ScienceDaily. Retrieved April 5, 2012.
Julien Giustiniani, Marlène Sineus, Elodie Sardin, Omar Dounane, Maï Panchal, Véronique Sazdovitch, Charles Duyckaerts, Béatrice Chambraud, Etienne-Emile Baulieu. Decrease of the Immunophilin FKBP52 Accumulation in Human Brains of Alzheimer’s Disease and FTDP-17. Journal of Alzheimer’s Disease, Volume 29, issue 2 (March 2012) [link]


Watch Out Ladies, Your Tatas Could Be At Risk!

Whoever said you are too young to have breast cancer was wrong, unfortunately. The contraceptive depot medroxyprogesterone acetate, or DMPA, is commonly used by women around the world. In the past, however, there have not been many studies on whether DMPA actually increases the risk of breast cancer in young women. Christopher Li, a breast cancer epidemiologist at the Fred Hutchinson Cancer Research Center, conducted a study to see what he could find and whether it agreed with previous results. Li says the “study adds to the body of knowledge from international studies conducted in a diverse group of countries — Kenya, New Zealand, Thailand, Mexico and Costa Rica — which have shown that one of the risks associated with DMPA use may be an increased risk of breast cancer.”

To evaluate the relationship between DMPA and breast cancer risk, logistic regression was used. Li and his team found that recent DMPA use for one year or longer was associated with a 2.2-fold increase in breast cancer risk. On a brighter note, the elevated risk appears to dissipate within months after contraceptive use stops. In addition, the risk of breast cancer did not appear to increase in women who used DMPA for less than a year. That’s a relief!
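The reported 2.2-fold figure is an odds ratio, the quantity that logistic regression estimates in a case-control sample. A minimal sketch of the underlying arithmetic, using made-up counts rather than the study’s actual data:

```python
# Hypothetical case-control counts (NOT the study's real numbers):
# exposure = recent DMPA use for a year or longer
exposed_cases = 66        # breast cancer cases with the exposure
exposed_controls = 30     # healthy controls with the exposure
unexposed_cases = 962
unexposed_controls = 889

# Odds of exposure among cases vs. odds of exposure among controls
odds_cases = exposed_cases / unexposed_cases
odds_controls = exposed_controls / unexposed_controls
odds_ratio = odds_cases / odds_controls

print(f"Odds ratio: {odds_ratio:.1f}")
```

With a single binary exposure, logistic regression reproduces this unadjusted odds ratio; the study’s actual model would additionally adjust for covariates such as age.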

Even though young women are not usually at risk for breast cancer, the “findings emphasize the importance of identifying the potential risks associated with specific forms of contraceptives given the number of available alternatives,” the authors wrote. In this study, 1,028 Seattle-area women ages 20 to 44 who had been diagnosed with breast cancer participated, as did 919 age-matched controls without a history of breast cancer. Roughly 10 percent of participants reported using this contraceptive. “In the United States many women have numerous options for contraception, and so it is important to balance their risks and benefits when making contraceptive choices,” Li said. To all the ladies out there, make sure to be smart about what contraceptive you take; it could save your “lady humps”.



Feeling big? Grab a box of chocolates and a bottle of wine

New study suggests that wine consumption may inhibit the growth of fat cells

Following on the coattails of the news that chocolate can increase fat loss, here is fresh research from Purdue University on the effects of wine, grapes and other fruits on fat cells.

Kee-Hong Kim, an assistant professor of food science at Purdue, along with graduate student Jung Yeon Kwon, is investigating the effect that piceatannol, a compound found in red wine, grapes, and other fruits, has on the growth and development of fat cells.

Piceatannol is what resveratrol is converted into after consumption in humans.  Resveratrol has already been linked to fighting cancer, heart disease and neurodegenerative diseases.  Researchers hope that piceatannol, which is similar in structure to resveratrol, will be an effective weapon against obesity.

It takes immature fat cells roughly 10 days to pass through several stages of development before reaching maturity. In the presence of piceatannol, the gene expression of these cells is altered, and the maturation process is delayed or completely inhibited. This means that body fat gain might be slowed or stopped by targeting young fat cells. Piceatannol works by essentially blocking the pathways necessary for a young cell to mature into a fat cell.

Although the research is still in its preliminary stages, it may eventually be possible to introduce concentrations into the bloodstream high enough to stop body fat gain entirely. This finding does not mean that drinking a bottle of wine per day will prevent the growth of fat cells. Further testing needs to be conducted, as do studies of how to deliver effective concentrations of piceatannol into the body. In the meantime, eat a diet that includes red grape seeds and skins, red wine, blueberries, passion fruit, and other fruits and see if piceatannol has any visible effect. Beach season is around the corner.

Further reading can be found here.

Journal Reference: J. Y. Kwon, S. G. Seo, Y.-S. Heo, S. Yue, J.-X. Cheng, K. W. Lee, K.-H. Kim. Piceatannol, a Natural Polyphenolic Stilbene, Inhibits Adipogenesis via Modulation of Mitotic Clonal Expansion and Insulin Receptor-dependent Insulin Signaling in Early Phase of Differentiation. Journal of Biological Chemistry, 2012; 287 (14): 11566. DOI

Eat chocolate, lose weight?

Study shows eating chocolate more frequently can increase fat loss

People who eat chocolate frequently appear to have less body fat than those who eat it less often. A study, published on March 26, 2012 in the Archives of Internal Medicine, examined approximately 1000 adults with similar lifestyles and the relationship between their chocolate intake and weight. Adults who consumed chocolate more often maintained a lower body mass index. The scientists suspect this is because the antioxidants and other compounds in chocolate may deliver a metabolic boost that can offset its high caloric content.

Recent studies show chocolate has more benefits than previously thought, despite its high caloric and sugar content. The treat has high levels of antioxidants that may help lower blood pressure and cholesterol levels and improve insulin sensitivity. Scientists from the University of California, San Diego, funded by the National Institutes of Health, wanted to test the effects of chocolate on weight gain, and therefore examined the cross-sectional relationship of chocolate consumption with Body Mass Index (BMI), an indicator of body fatness.

A total of 1018 men and women aged 20-85 years, without known cardiovascular disease or diabetes, agreed to participate in the study, and data was collected on how much they exercised, the amount and type of calories they ate, and their height and weight. This data was then related to their chocolate intake. On average, participants exercised three times a week and ate chocolate about twice a week. The researchers did not record the type of chocolate consumed.

The results showed that the people who ate chocolate more frequently tended to have lower BMIs, even though they did not consume fewer calories and exercised the same amount as those who ate the least chocolate. There was a difference of roughly 5 to 7 pounds between the two groups.

Dr. Beatrice A. Golomb, lead author of the study, voiced a few concerns. First, dietary studies can be unreliable because of confounding factors. However, she countered, the researchers adjusted the numbers to account for several different variables (e.g., age, gender, depression, vegetable consumption, fat and calorie intake) and the results remained the same. Dr. Golomb also cautioned that those who consumed larger amounts of chocolate actually had higher BMIs; it was those who ate chocolate more frequently who showed the benefit.
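The kind of adjustment Dr. Golomb describes can be sketched with an ordinary least-squares regression. Everything below (the variables, the coefficients, the sample) is synthetic and invented for illustration; it only shows how an association with BMI can be estimated while holding other variables constant:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# Synthetic data, purely illustrative (not the study's):
age = rng.uniform(20, 85, n)
exercise = rng.poisson(3, n).astype(float)    # sessions per week
choc_freq = rng.poisson(2, n).astype(float)   # chocolate servings per week

# Construct BMI with a small built-in negative link to chocolate frequency
bmi = 27 + 0.03 * age - 0.4 * exercise - 0.3 * choc_freq + rng.normal(0, 2, n)

# Regress BMI on chocolate frequency while adjusting for age and exercise
X = np.column_stack([np.ones(n), choc_freq, age, exercise])
coef, *_ = np.linalg.lstsq(X, bmi, rcond=None)
print(f"Adjusted association of chocolate frequency with BMI: {coef[1]:.2f}")
```

The fitted coefficient recovers (approximately) the built-in -0.3, illustrating why the association can survive adjustment if it is genuine rather than an artifact of, say, exercise habits.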

“It’s not the case that eating the largest amount of chocolate is beneficial; it’s that eating it more often was favorable,” Dr. Golomb said. “If you eat 10 pounds of chocolate a day, that’s not going to be a favorable thing.” But at least we don’t have to forgo it anymore, right?

For more information:

Original Study: “Association Between More Frequent Chocolate Consumption and Lower Body Mass Index.” Found Here:

Press Release: “The Chocolate Diet?” Found Here:

Dinosaurs trying to escape wildfires

There was not much a dinosaur could do when facing a wildfire. Some could escape by running to find shelter, while flying species could rise above the flames. However, some species of dinosaurs were not so lucky when it came to protecting themselves from these deadly disasters. Researchers from Royal Holloway, University of London and the Field Museum of Natural History in Chicago have made new findings that reveal the frequency and destructive nature of fires during the Cretaceous period, which lasted from 145 to 65 million years ago, following the Jurassic period.

In the journal Cretaceous Research, the researchers explain that they discovered charcoal in many fossil deposits, revealing how prevalent fire was during this time. After assembling a “global database” of charcoal deposits, the scientists, led by Professor Andrew C. Scott from the Department of Earth Sciences at Royal Holloway, determined that this charcoal is the residue left behind when ancient plants burned.

Unlike today, the Cretaceous period was a “greenhouse world where global temperatures were higher,” and lightning strikes were likely to spark many of these wildfires. Another characteristic of this time period that added to the occurrence of fires was its higher atmospheric oxygen level. This is the reason plants, even those holding a great amount of moisture, were able to burn so easily; that is not the case today, when the atmospheric oxygen level is not as high as it was during that time.

Obviously, these fires greatly damaged the environment and did not make for a peaceful living space for the dinosaurs. Many problems arose because of them. Professor Scott describes some of these problems: “Until now, few have taken into account the impact that fires would have had on the environment, not only destroying the vegetation but also exacerbating run-off and erosion and promoting subsequent flooding following storms.” The research also found charcoal to be prevalent in deposits containing dinosaur fossils. This work on the frequency of wildfires will help scientists better understand the living conditions the dinosaurs endured, and may even allow them to relate those ancient fires to the world in which we live today.

Sarah A.E. Brown, Andrew C. Scott, Ian J. Glasspool, Margaret E. Collinson. Cretaceous wildfires and their impact on the Earth system. Cretaceous Research, 2012; DOI: 10.1016/j.cretres.2012.02.008

Royal Holloway, University of London (2012, March 29). When dinosaurs roamed a fiery landscape. ScienceDaily. Retrieved April 2, 2012, from /releases/2012/03/120329124714.htm


To bloom or not to bloom: A flowering plant’s greatest dilemma

On March 20th, spring officially began. Blizzards were halted in their tracks, icy arctic winds abated, and people emerged from their extended torpors to greet the first blossoms of the season. Sadly, they were too late. By the time their calendars informed them that spring had started, it was already fully upon them. This past winter was relatively mild, to say the least. It brought no blizzards, only a light dusting. The arctic winds were little more than chilly gusts. The sun spent a good deal of time shining and temperatures remained on the positive side of freezing. This sort of warm weather can, and did, trigger flowering in local plants, even out of season. Many of the common species in this area bloomed far earlier than expected … which could be devastating if winter decides to stumble back for a few days and mess with the budding party.

The question is, why? There is a lot of risk in blooming too soon; it’s one of the main reasons that some plants require a specific period of cold exposure, followed by a span of warmth, before they fruit. How do plants know when to flower?

Plant growth is strongly temperature dependent. Even small changes in ambient temperature can lead to drastic changes in a plant’s development. Currently, scientists understand that flowering is activated by a special molecule and relies on various external cues for timing. There are some plants that utilize temperature as a cue, and others that rely on the length of the day to regulate their flowering cycles. Accelerated flowering is an abnormal phenomenon that results from a specific protein cascade responding to thermal shifts. Until now, scientists did not understand how increased temperatures directly altered this series of interactions.

A study, “Transcription factor PIF4 controls the thermosensory activation of flowering,” was published in the journal Nature on the 21st of March … the day after the first day of spring. Scientists from the John Innes Centre on the Norwich Research Park discovered a plausible mechanism to explain how temperature directly alters a plant’s flowering response. They identified a previously unappreciated switch, a gene called PIF4, that accelerates flowering time in response to temperature.

Warm air activates the PIF4 gene, which then activates the flowering pathway. At lower temperatures, the gene is incapable of functioning. PIF4 handles many of a plant’s other responses to warmth, such as growth, but this is the first experiment to show that the gene activates flowering in response to temperature. If temperatures remain low, plants will eventually flower; they just utilize alternative pathways. As the temperature rises, PIF4 is better able to bind to proteins and trigger flowering.

These researchers note that “current climate change has already altered global plant phenology and distribution, and projected increases in temperature pose a significant challenge to agriculture”. They hope that their study will lead to the creation of temperature-resilient crops. Many agricultural commodities respond negatively to warmer temperatures, which often results in reduced yields. The flowering time of a plant is a key facet of crop management and plays a huge role in the life cycles of pollinators. Since many educated individuals would agree that there might be some sort of climatic shift at some point in the near or distant future, it may be wise to breed crops that are more resilient to shifting temperatures.

For further reading on the study:

S. Vinod Kumar, Doris Lucyshyn, Katja E. Jaeger, Enriqueta Alós, Elizabeth Alvey, Nicholas P. Harberd, Philip A. Wigge. Transcription factor PIF4 controls the thermosensory activation of flowering. Nature, 2012

The Sailor’s Life for Me… Except for the Food

Analysis of bone confirms 18th century naval diet, deployment

English archaeologists have confirmed long-held beliefs about sailors’ diets in the Royal Navy in the late 18th and early 19th centuries. Chemical analysis of hundreds of bones revealed considerable consistency with what contemporary naval documents prescribed for meals at sea, as well as with the diet of British sailors from as far back as 1545. The findings were published online in the American Journal of Physical Anthropology earlier this month.

The research team, from Oxford, Cranfield University, and the École Normale Supérieure in Paris, performed an isotope analysis on over 80 individuals from two English naval cemeteries. The cemeteries contained the remains of seriously ill and mortally wounded servicemen who fought in the Napoleonic Wars and the War of 1812. While historians have long written about the naval diet and its shortage of veggies, the lead researcher, Prof. Mark Pollard, said that the findings “demonstrate the benefit of using forensic methods to complement documentary records.”

The Royal Navy’s Victualling Board rationed each sailor 7 lbs of bread, 7 gal of beer, 4 lbs of beef, 2 lbs of pork, 1 qt of peas, 3 pts of oatmeal, 12 oz of cheese and 6 oz of butter every week. The more time one of the study’s skeletons spent in His Majesty’s service, the more closely the nitrogen and carbon content of his bones would reflect this carefully measured diet, something Pollard calls the “naval average.”

Pollard and his team took small samples of bone, from ribs, teeth, and thighbones, and analyzed their stable isotope composition. Stable isotopes are forms of carbon and nitrogen that do not decay over time (unlike the radioactive C14 used in dating). Different ratios reflect the sources of people’s nutrients because wheat and corn (the most common livestock feeds and ingredients in beer, bread, etc.) fix carbon through different photosynthetic pathways. Scientists measure the levels of these carbon isotopes in bone and compare the ratios to determine which plant played a bigger role in the diet.
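The wheat-vs-corn comparison is often made quantitative with a simple two-end-member mixing model. A sketch using approximate textbook end-member values; actual values vary by species and tissue, and the study’s own calculations may well differ:

```python
# Approximate carbon isotope end-members (delta-13C, per mil vs. VPDB);
# these are rough literature values, not figures from the study.
DELTA_C3 = -26.5   # wheat-like C3 plants
DELTA_C4 = -12.5   # maize-like C4 plants

def c4_fraction(delta_sample: float) -> float:
    """Estimate the fraction of dietary carbon from C4 plants
    with a linear two-end-member mixing model."""
    frac = (delta_sample - DELTA_C3) / (DELTA_C4 - DELTA_C3)
    return min(max(frac, 0.0), 1.0)  # clamp to the physical range [0, 1]

# Under these assumptions, bone collagen at -19.5 per mil sits exactly
# halfway between the end-members:
print(f"{c4_fraction(-19.5):.2f}")
```

This is why a “higher C4 signal” in a skeleton reads directly as more corn in the diet: the measured ratio slides between the two plant end-members.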

Stable isotope analysis is incredibly useful outside of historic situations. It has been used to track the spread of agriculture in the Americas and in China, and it readily distinguishes between populations that prefer one type of plant to another.

The study used these distinctions to also explore the similarities between British sailors, Native American soldiers, 17th century plantation workers, and 18th century slaves from Barbados. The British sailors’ isotope ratios formed a distinct group from the Americans, but those from the Plymouth cemetery, where the sailors had toured off the coast of North America, indicated a higher C4 signal (C4 being the carbon fixation pathway used by corn). Either the sailors traded for local food to liven up their meals, as was customary, or at least one of the sailors was actually from North America. Perhaps this unlucky fellow, designated Sk 844, was impressed into the Royal Navy in the early 1800s, a practice that was also customary and which eventually contributed to the War of 1812.

Check out the Press Release from AAAS here (under March 23), and try this Blog for another take on this study.

Text Message From: Neighboring Bacteria

Just as humans keep finding new ways to communicate, such as text messaging and social media, bacteria, and potentially human cells, communicate with each other to settle on a plan of action. Researchers at Rice University and Tel Aviv University have been investigating the pathways by which these cells communicate information about cell stress, colony density, and the likely plans of neighboring cells. Research in this area could lead to many medical applications.

José Onuchic, Ph.D., explained to ScienceDaily that “Using this form of cell-to-cell communication, colonies of billions or trillions of bacteria can literally reach a consensus on actions that impact people.”

Onuchic gave an example of a group of harmless bacteria gathering on the skin. These bacteria may one day send chemical signals to each other and decide that there are enough of them to join together and cause an infection. This network of infection-causing bacteria is known as a biofilm. Biofilms are responsible for making many chronic infections difficult to treat; urinary tract infections and lung infections in cystic fibrosis patients are two examples where biofilms cause treatment difficulties.

Bacillus subtilis has been the focus of the research into this communication network. B. subtilis is a bacterium found in the soil that responds to stressful environmental conditions by either turning itself into a spore or entering a competent state, a protective state that shields it from outside conditions. These two states allow the bacteria to survive harsh or stressful situations. In the spore state, the bacteria discard half of their DNA into the environment and create a thick, armor-like shell that allows them to survive for years, yet they can still return to normal bacteria later. The competent state is better suited to short-term stress; the risk is that if conditions don’t improve quickly, the bacterium could die before being able to change into a spore.

Onuchic explained that most bacteria will become spores in poor environmental conditions, but about 1-2% “see” that the other bacteria are becoming spores and choose instead to take up some of their discarded DNA and enter competence.

Onuchic believes that the bacteria make their decisions according to “game theory”, a mathematical framework for analyzing conflict and cooperation. As he explained to ScienceDaily, “…the bacteria have to weigh the pros and cons of their decisions. The bacteria make a decision based not only on what it knows about its own stress and environment, but it also has to think about what the other bacteria might do.”
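The pros-and-cons weighing described above can be sketched as a toy payoff comparison. The payoff numbers below are invented for illustration; they are not measured values from the research, and the real models are far richer:

```python
# Toy model of the sporulation-vs-competence decision.
# Payoffs are made up: sporulation is safe but costly; competence
# pays off big if conditions improve quickly, and fails if they don't.

def expected_payoff(strategy: str, p_improve: float) -> float:
    """Expected payoff of a strategy given the probability that
    stressful conditions improve quickly."""
    if strategy == "spore":
        return 0.4  # guaranteed survival, minus the cost of sporulating
    if strategy == "competence":
        return 1.0 * p_improve  # thrives if conditions improve, else 0
    raise ValueError(strategy)

for p in (0.2, 0.5, 0.8):
    best = max(("spore", "competence"), key=lambda s: expected_payoff(s, p))
    print(f"P(conditions improve)={p}: best strategy = {best}")
```

Even this crude version shows the article’s point: the rational choice flips depending on how likely conditions are to improve, which is information a lone bacterium can partly infer from its neighbors’ signals.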

So how can research on soil bacteria and their communication and decision-making processes benefit the human population in any way? One answer: cancer treatment.

As the research progresses beyond bacteria, human cells are becoming the main focus. Onuchic is specifically looking at communication that could result in uncontrolled division and growth, which is a hallmark of cancer. The current thought is that, similarly to the bacteria discussed earlier, human cells may be chit-chatting amongst themselves and sending chemical signals to one another that cause a cancer to grow. Not only could this be the starting point of cancer, it could also explain the spread of cancer to other parts of the body, known as metastasis.

Onuchic described the medical treatment benefits of identifying this process to ScienceDaily.

“It would open the door to developing better drugs that have fewer side effects. For example, once we get a handle on this process, we might block the specific chemical messages that signal a tumor to grow, developing a medicine that wouldn’t affect other body processes, reducing or eliminating side effects.”

The researchers are hopeful but, as in most growing areas of science, cannot make any promises based on their research. Identifying the possibility of this communication pathway and understanding the complexities of bacterial communication are key, however, to starting an in-depth look at human cells.



American Chemical Society (ACS). “Bacteria use chat to play the ‘prisoner’s dilemma’ game in deciding their fate.” ScienceDaily, 27 Mar. 2012. Web. 29 Mar. 2012.

Turbulence Bad for Wind Energy?

Isn’t more wind always a good thing when it comes to wind energy? Not necessarily. Lawrence Livermore National Laboratory scientist Sonia Wharton and Julie Lundquist of the University of Colorado at Boulder and the National Renewable Energy Laboratory discovered that atmospheric instability can actually reduce the overall power generated by a wind turbine.

Wind farmers could obtain greater amounts of wind power by studying atmospheric stability. Wind turbines experiencing constant wind speeds, rather than variable wind speeds, produce up to 15% more power. Wharton noted that “the dependence of power on stability is clear, regardless of whether time periods are segregated by three-dimensional turbulence, turbulence intensity or wind shear.”

Wind shear is the change in wind speed and direction over a short distance in the atmosphere. It is often overlooked because many wind farmers tend to focus primarily on turbulence.
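Wind shear is commonly summarized with a power-law profile, v(z) = v_ref · (z / z_ref)^α, whose exponent α tends to grow as the atmosphere becomes more stable. A small sketch with illustrative α values (not figures from Wharton and Lundquist’s study) shows how much the same ground-level wind can differ at hub height:

```python
# Power-law wind shear profile, often used to extrapolate a wind speed
# measured near the ground up to turbine hub height:
#     v(z) = v_ref * (z / z_ref) ** alpha
# The shear exponent alpha depends on atmospheric stability; the values
# below are typical illustrative choices, not the study's measurements.

def wind_at_height(v_ref: float, z_ref: float, z: float, alpha: float) -> float:
    """Extrapolate wind speed from height z_ref to height z."""
    return v_ref * (z / z_ref) ** alpha

v10 = 6.0    # measured wind speed at 10 m (m/s)
hub = 80.0   # hub height (m)

for label, alpha in [("unstable daytime", 0.10),
                     ("neutral", 0.14),
                     ("stable nighttime", 0.25)]:
    v_hub = wind_at_height(v10, 10.0, hub, alpha)
    print(f"{label:17s} alpha={alpha:.2f} -> {v_hub:.1f} m/s at hub height")
```

Because turbine power scales roughly with the cube of wind speed, these seemingly modest hub-height differences translate into large differences in generated power, which is why stability-aware forecasts matter.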

Wharton and Lundquist studied upwind turbines on the West Coast for a year; focusing on upwind turbines reduced the chance that wakes from other turbines would contaminate the measurements. They discovered that wind speed and power production varied between day and night, as well as across the four seasons. During the spring and summer, conditions alternated between stable and unstable: daytime winds were almost always unstable, while nighttime winds were more stable. If wind forecasts included atmospheric stability, wind farmers would have a better estimate of their total power generation.


Hemophilia B Research Will Soon Lead to Extinction of Band-Aids

British researchers have studied a gene therapy that treats hemophilia B by creating long-lasting protection against bleeding.

Hemophilia B, one of the two main forms of hemophilia, is a hereditary disease that leaves patients unable to clot blood properly. These patients usually need to receive preventative injections of a clotting protein called factor IX two to three times a week.

The gene therapy that these researchers have created uses a mild virus to induce liver cells to produce the factor IX normally supplied by injections. The virus used as a transporter is one to which the participants had not been exposed before, so their immune systems would not have developed antibodies against it.

Earlier testing of this therapy in animals had shown that it can last ten years or longer, a promising result for patients affected with hemophilia B.

After seeing these results, the research team tested the therapy on six participants: two were given high doses, two moderate doses, and two low doses. After receiving the therapy, the six participants were monitored for nine to twenty months, and positive results were seen. The results showed that the treatment altered the bleeding phenotype. The individuals treated with high doses maintained sufficient factor IX levels and no longer had to receive injections, and the individuals who received lower doses were able to cut down the frequency of their factor IX injections.

Once the treatment is administered, it cannot be repeated, because the immune system develops antibodies against the viral vector. However, there are many other viruses, not yet used, that could serve in this gene therapy.


Reference the original article here.