Are β-blockers Doing You More Harm Than Good?


Chronic Obstructive Pulmonary Disease (COPD) encompasses a number of diseases in which airflow to the lungs is blocked. Cardiovascular diseases frequently occur in patients with COPD. To manage associated conditions like high blood pressure and abnormal heart rhythm, medications called β-blockers are used. However, many physicians shy away from prescribing them due to uncertainty about the potential effects of β-blockers on lung function, especially during episodes of acute COPD exacerbations (sudden worsening of symptoms). Contrary to these concerns, studies show that β-blockers can help alleviate COPD exacerbations and reduce mortality rates.

In the TONADO studies, researchers at Boehringer Ingelheim Pharma GmbH & Co. KG in Ingelheim, Germany examined lung function in COPD patients receiving bronchodilator treatments (substances that dilate the air passages to the lungs, increasing airflow) for a year. FEV1, or forced expiratory volume in 1 second, a measure of the volume of air that can be forced out in one second after taking a deep breath, was used to categorize the severity of COPD. After the trial, the researchers analyzed data from patients receiving β-blockers to assess the potential effects of these medications on bronchodilator treatment. They assessed these effects throughout the length of the study by analyzing lung function, dyspnea (difficulty breathing), and the frequency of exacerbations.
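
To make the FEV1 categorization concrete, here is a minimal sketch of how an FEV1 reading (as a percentage of the predicted value) maps onto severity grades, assuming the widely used GOLD cut-offs; the article does not state which exact thresholds TONADO applied:

```python
def copd_severity(fev1_pct_predicted: float) -> str:
    """Map FEV1 (% of predicted) to a COPD severity grade.

    Thresholds follow the commonly used GOLD classification;
    the exact cut-offs used in TONADO are an assumption here.
    """
    if fev1_pct_predicted >= 80:
        return "GOLD 1 (mild)"
    if fev1_pct_predicted >= 50:
        return "GOLD 2 (moderate)"
    if fev1_pct_predicted >= 30:
        return "GOLD 3 (severe)"
    return "GOLD 4 (very severe)"

print(copd_severity(65))  # a moderate reading
```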

In the study of 5,163 patients, individuals who were on β-blockers tended to present with milder COPD symptoms and higher average FEV1 levels.

At study entry, exacerbations were also less frequent in the β-blocker group. One caveat was the higher occurrence of cardiovascular conditions, such as stroke, heart attack, and cardiac arrhythmia, in this group. The improvements in dyspnea and lung function seen during bronchodilator treatment were the same for both groups. Overall, analysis of the data did not show substantial negative effects of β-blocker treatment.

The reliability of the data is high because it was collected over a span of 12 months. In addition, the study drew on a broad, global population of subjects thanks to the international nature of the trial. The presence of additional diseases besides COPD was also taken into consideration during data collection and analysis. Putting all this together, the researchers concluded that their study provides valid information regarding the effects, or lack thereof, of β-blockers in COPD patients.

In conclusion, the results support β-blocker use. The researchers agreed that the benefits of β-blockers outweigh their potential risks, especially for patients with heart disease, heart failure, and hypertension. An analysis of 15 similar studies further supported the findings.

Reference: Maltais, F., et al. 2018. β-Blockers in COPD: A Cohort Study From the TONADO Research Program. Chest.


Image: Flickr


Staying Balanced: Sour Taste Buds Linked with the Vestibular System

In January 2018, a study published by the American Association for the Advancement of Science presented a surprising link between the sense of taste and the sense of balance. While trying to determine which genes allow certain taste buds to detect sourness, scientists found the same gene at work in the inner ear.

Lemon slice. Credit: GDJ; Creative Commons Clipart.

When you think “sour,” you might think about puckering at the juice of a slice of lemon, but scientists think about pH levels. Sourness is actually a measure of acidity: a substance is acidic if it contains lots of H+ ions (hydrogen atoms with a positive electrical charge), which also marks a low pH. There are different kinds of taste buds: some recognize sweetness, some recognize saltiness, etc. The taste buds that recognize the sour *tang* of Sour Patch Kids contain ion channels that allow H+ ions to flow into the taste bud cell and send a signal to the brain that says, “Wow! This is sour!”
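
The relationship between H+ concentration and pH is a simple logarithm, pH = −log₁₀[H+]. As a rough illustration (the lemon juice concentration below is an approximation, not a figure from the study):

```python
import math

def ph(h_ion_molarity: float) -> float:
    """pH is the negative base-10 logarithm of the H+ concentration (mol/L)."""
    return -math.log10(h_ion_molarity)

# More H+ ions -> lower pH -> more sour.
print(ph(0.01))       # ~pH 2, roughly lemon juice
print(ph(0.0000001))  # ~pH 7, neutral water
```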

To determine which gene or genes are responsible for expressing the proteins that build the H+ ion channels in sour taste bud cells, researchers at the University of Southern California used a mouse model. They compared the transcriptome of mice with sour taste buds to the transcriptome of mice without them. The transcriptome is the collection of all the RNA in a particular cell, and it indicates which proteins a cell is generating. When 41 candidate proteins were identified in the sour taste bud cells but not in the other taste bud cells, the scientists knew one of them must play a role in the mechanism for detecting sour tastes.

The researchers expressed the candidate genes in human embryonic kidney cells (HEK-293) or the egg cells of a frog model (Xenopus oocytes). Then, the kidney cells and egg cells were flooded with an acidic solution and observed for H+ ion currents. The researchers noticed that the gene Otopetrin1, abbreviated as Otop1, was the only gene to produce an ion channel that permitted H+ ions to pass through.

The gene Otop1 is part of the otopetrin gene family, which happens to be known for its role in the development and function of the vestibular system. The connection became clear when mice with Otop1 mutations exhibited issues with spatial orientation and balance: they could not properly right themselves or swim. Furthermore, the mice with Otop1 mutations had weaker H+ ion currents in their taste bud cells, which suggests that the mice could not fully taste sourness. The scientists at USC hypothesize that Otop1 regulates an optimal pH level in the inner ear during development.

“We never in a million years expected that the molecule that we were looking for in taste cells would also be found in the vestibular system,” senior researcher Emily Liman said. “This highlights the power of basic or fundamental research.”

The Otop1 gene also produces H+ ion channels in the heart, uterus, adrenal gland, mammary gland, and in fat tissue, although the role of H+ ion channels in these regions is not understood. Further research may uncover more intriguing and unanticipated connections within our genetic makeup.

Taste bud cells
Taste bud cells, magnified and artificially colored. The red portions denote cells that detect sour tastes, while the green portions mark cells that detect umami, sweet, or bitter tastes. Credit: Yu-Hsiang Tu and Emily Liman.


Tu, Y.H., Cooper, A.J.,Teng, B., Chang, B.R., Artiga, D.J., Turner, H.N., Mulhall, E.M., Ye, W., Smith, A.D., & Liman, E.R. 2018. An evolutionarily conserved gene family encodes proton-selective ion channels. Science [published online] DOI: 10.1126/science.aao3264.

Gersema, E. 2018. Surprising discovery links sour taste to the inner ear’s ability to sense balance. USC Press Room. Retrieved Feb. 18 from

How to Solve the Water Crisis

Leaky Faucet
Source: Flickr

Water is essential to maintaining all life on earth, yet two billion people worldwide don’t have access to clean or safe water. However, the availability of fresh water may change, as seen in an article published on February 9, 2018 in Science Advances. This article, titled “Ultrafast selective transport of alkali metal ions in metal organic frameworks with subnanometer pores,” details findings by researchers at Monash University and the University of Texas at Austin that offer a breakthrough approach to the water crisis. They discovered that metal-organic frameworks (MOFs), a class of materials with the largest internal surface area of any known substance, can be used to capture and remove salt and metal ions from water.

Metal-organic frameworks are sponge-like crystals that can capture, store, and release chemical compounds. MOFs have a narrow distribution of pore sizes, making them useful in various separation technologies as well as for storing gases like hydrogen and carbon dioxide. MOFs have been used in gas purification and separation, as catalysts (substances that increase the rate of a chemical reaction), and as sensors.

The researchers discovered that MOFs can mimic the filtering function, or ‘ion selectivity,’ of organic cell membranes. They are able to remove salt from seawater and separate metal ions in a highly efficient and cost-effective manner. The researchers estimate that MOFs can cut the energy consumption of desalination in water treatment processes by a factor of 2 to 3. This means there is a faster, more cost-effective way to treat water and make it readily available to those who need it most.

Not only this, but MOFs are able to extract metals that are harmful to humans and otherwise difficult to remove from drinking water. For example, since lithium-ion batteries have become the most popular battery for mobile electronic devices like phones and tablets, they are in such high demand that unconventional methods may have to be developed to continue lithium production, such as extraction from water with metal-organic frameworks.

There are both economic and physical reasons a region can be affected by water scarcity, but the result is the same: humans without the basic necessities of life. Scarcity can be caused by a lack of investment in the technology and infrastructure needed to collect water from various sources, economic competition over water quantity and quality, or simply the irreversible depletion of drinkable groundwater. A growing world population, expanding irrigated agriculture, improving living standards, and changing consumption patterns will only make it harder to obtain clean and safe drinking water for all, so these findings from Monash University and the University of Texas at Austin bring big news to the table in terms of providing the essentials of life to humans worldwide.

Huacheng Zhang, Jue Hou, Yaoxin Hu, Peiyao Wang, Ranwen Ou, Lei Jiang, Jefferson Zhe Liu, Benny D. Freeman, Anita J. Hill and Huanting Wang. Ultrafast selective transport of alkali metal ions in metal organic frameworks with subnanometer pores. Science Advances, 2018; DOI: 10.1126/sciadv.aaq0066

Data Sharing: How It Helped These Tropical Bears

Although you may never have heard of them before now, Helarctos malayanus, better known as Sun bears (yes, believe it or not, moon bears exist too), are the smallest species of bear in the world. These bears can be found in tropical forest habitats in Southeast Asia, but are sadly on the IUCN vulnerable species list.

Yet another animal attempting to survive in our growing anthropogenic world, the Sun bear has seen a population loss of over 30% in the last few decades. This decrease is almost entirely due to deforestation, and researchers at the University of South Carolina have good reason to believe these numbers aren’t slowing down anytime soon. Using data captured from 1,463 non-baited camera traps spanning 31 field sites, all within Sun bear territory, the team found a direct correlation between tree cover and Sun bear presence. The bears were only seen in areas with over 20% cover (within 6 km² of the camera) and were 146% more likely to be found in areas with 80%+ tree cover than in areas with only 20%+.
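
As a quick arithmetic note, "146% more likely" means the detection rate at high tree cover is about 2.46 times the rate at low cover; a small illustrative sketch (the function and variable names are mine, not from the paper):

```python
def rate_multiplier(percent_more_likely: float) -> float:
    """Convert an 'X% more likely' statement into a multiplicative factor."""
    return 1 + percent_more_likely / 100

# Sun bears were 146% more likely to appear at 80%+ tree cover,
# i.e., roughly 2.46x the detection rate at 20% cover.
high_cover_factor = rate_multiplier(146)
print(high_cover_factor)
```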

Sleeping Sun Bear
A sun bear snoozes on a tree

A very interesting aspect of the study is that the camera traps, which work by turning on and recording every time they sense movement in front of them, weren’t specifically set up to find Sun bears. They were set up for a number of other studies on other species. However, the camera data was borrowed and used to collect population information on Sun bears, which, with respect to the other studies, was a byproduct. This reuse of data produced strong evidence of diminishing Sun bear numbers, and it has the potential to do the same time and time again with completely different species.

The future implications of this are endless. If studies using camera traps to gather information on one specific species were to share their data with researchers around the world studying completely different species found in the same area, then potentially huge knowledge gaps on all types of animals could be filled, and this doesn’t even include the economic efficiency that would come of it.

The overall takeaway is that if so much can be learned about our small bear friends in Southeast Asia using recycled camera footage (and knowledge from experts), who knows what else could be learned about other animals in that footage or other camera trap studies.


Mousseau, T. 2017. Projecting range-wide sun bear population trends using tree cover and camera-trap bycatch data. PLoS ONE 12(9): 56-68.







Our Earth? Headed to Wall-E Trash Planet? More Likely Than You Think

Space Debris
Diagram of where debris could be around the Earth.

The recent space launch of a Tesla Roadster aboard a SpaceX Falcon Heavy rocket (video below) has once again raised the level of conversation surrounding the issue of orbital debris. Orbital debris is “any human-made object in orbit about the Earth that no longer serves any useful purpose.” In December of 2017, J.-C. Liou, PhD, Chief Scientist for Orbital Debris, gave a presentation about the current state of orbital debris and its policies. In the 1990s, NASA was the first organization in the world to create a space debris policy with specific guidelines, the NASA Procedural Requirements for Limiting Orbital Debris, and it spearheaded an effort to expand this policy throughout the entire United States Government. The United States is not the only country worried about space debris.

The Inter-Agency Space Debris Coordination Committee (IADC) is a collection of spacefaring countries that has developed a set of international space debris guidelines. Space debris has also been on the agenda at the United Nations since 1994. Since there are so many space debris committees, there must be a lot of disagreement, right? Correct! The international community has created at least four separate standards, all containing different guidelines and criteria governing space debris in Low Earth Orbit (anything below 2,000 km). Many of these policies are not quantitative and contain phrases such as “minimize the probability of occurrence.” Of these organizations, NASA has been the global pioneer on orbital debris: the first to acknowledge the problem, and the first to set up measurable guidelines to manage it.

Even with all of these regulations, launches occurred from January 2008 to September 2017 that did not comply with the guidelines set forth by NASA. These guidelines consist of three simple rules: the post-mission orbital lifetime must be less than 25 years, the probability of an accidental mission explosion must be less than 0.001, and the reentry human casualty risk must be less than 1 in 10,000. Examples of missions that did not follow these rules are NOAA-19, with an orbital lifetime of 500 years, and MMS Atlas 5, with a human casualty risk of 1 in 600. NASA is trying to improve compliance for future projects by working on a set of new standards that include reducing orbital debris during normal operations, minimizing the debris created by accidental explosions, and launching missions with disposable space structures.
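
A rough sketch of how a mission could be checked against the three guidelines described above (the function and parameter names are my own illustration, not NASA tooling):

```python
def meets_nasa_debris_guidelines(orbital_lifetime_years: float,
                                 explosion_probability: float,
                                 reentry_casualty_risk: float) -> bool:
    """Return True only if all three guideline thresholds are satisfied."""
    return (orbital_lifetime_years < 25
            and explosion_probability < 0.001
            and reentry_casualty_risk < 1 / 10_000)

# NOAA-19: a 500-year orbital lifetime fails the 25-year rule.
print(meets_nasa_debris_guidelines(500, 0.0001, 1 / 20_000))  # False
# MMS Atlas 5: a casualty risk of 1 in 600 fails the 1-in-10,000 rule.
print(meets_nasa_debris_guidelines(20, 0.0001, 1 / 600))      # False
```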

OPINION: The Falcon Heavy launch was definitely a sight to see, but the Tesla now in space is unnecessary. Yes, I think it’s comical, and I understand the promotional value that Elon Musk received for shooting a Tesla into space. But at what cost? Currently, we don’t know where it is going. All we know is that it will pass Mars’s orbit in about 6 months and eventually make it back to somewhere around Earth. Most orbital debris serves a useful purpose at some point during its mission, but Musk’s Tesla was never anything more than a rocket payload. Sustainability is on everyone’s minds right now, but what about space sustainability? How long until we need to start worrying about not being able to see the sky because of orbital trash?


Liou, J.-C. 2017. Orbital Debris Briefing. NASA Technical Reports Server: JSC-E-DAA-TN50234.

How long have we been in a drought?

Researchers from the South African Weather Service found that the Western Cape Province of South Africa experienced mild drought conditions from 1985 to 2016. However, the ramifications of the drought for agricultural and economic activity were not recognized until 2014. According to Botai and her team, the province then experienced its worst water shortage in 113 years, from 2014 to 2017.

In reaction to the water shortage, the government imposed water restrictions and rationing on consumers while attempting to find a supplemental water source. Recently, more restrictions have been implemented due to the extreme shortage. Currently, consumers are limited to thirteen gallons of water a day.

Image of a port in South Africa.
South Africa faces an ongoing drought.

The Western Cape Province is home to the port of Cape Town. The weather along the coast typically resembles a Mediterranean climate, with hot, dry summers and cold, wet winters. This climate is ideal for the crops grown there, such as apples, pears, apricots, peaches, nectarines, plums, and grapes. Agricultural production is crucial to the Western Cape economy and to the food security of the country.

Due to insufficient rainfall, the province has been declared a disaster region. The west coast of the province and the Central Karoo are categorized as an agricultural drought disaster area, meaning there is insufficient soil and subsoil water to promote crop growth. Eden and Central Karoo are the municipalities whose crops have been most impacted by the drought, a result of warm temperatures and evapotranspiration; typically, the region receives intense rainfall during the summers. In addition, Overberg, Cape Winelands, and West Coast experienced mild drought conditions, although these lands were traditionally characterized by winter rainfall with sunny, dry summers.

The researchers suggest the drought was set off by decreasing rainfall. Consequently, the water reservoirs are below 30% capacity.

Image of Western Cape Province map showing weather stations and relative precipitation.
Figure 1, from Botai’s article, shows the study area and the distribution of selected weather stations as red circles. The circle sizes are relative to the Mean Annual Precipitation recorded at each station.

Although this trend may appear isolated, severe droughts and water shortages have become prevalent in U.S. states like California, Texas, and Oklahoma, and around the world in countries like Ethiopia, Somalia, Afghanistan, Iran, and China. This has prompted meetings among officials from government, the private sector, and other sectors to address the crisis. In these circumstances, it may be useful to use precipitation forecasts to prepare for droughts similar to this one.


Source: Botai, C.M., Botai, J.O., de Wit, J.P., Ncongwane, K.P., Adeola, A.M. 2017. Drought Characteristics over the Western Cape Province, South Africa. Water 9:876.

Hot or Not: How Pollinators Choose Which Flowers to Pollinate

Hoverfly by flickr user Neil Mullins.

Pollinators are essential for hundreds of thousands of plant species and over a thousand crop species. Without them, our agricultural system would suffer immensely. That is why it is so important to understand these creatures and the variables which attract them.

A study published in 2017 takes a deep look into what influences site choice by pollinators, specifically the Hoverfly. Researchers observed Hoverfly behavior in hemiboreal, alpine, and tropical environments in India and Sweden. For two years they recorded which flowers Hoverflies were attracted to and which ones they did not care for. Hoverflies are generalist pollinators, meaning they will take pollen from a range of plants, not just one specific species. The researchers wanted to know more about the numerous plants these pollinators provide their services to, and they tried to determine whether the plants most attractive to pollinators share any common traits, such as color, scent, or pattern, that would give them this advantage.

During the first year, 2015, researchers collected data from real flowers. In the second year, 2016, they created lures (artificial flowers) to attract the Hoverflies. Flowers the pollinator visited and seemed attracted to were called “hot,” and flowers the pollinator found less attractive but not repellent were called “cold.”

The study found some surprising results that conflict with existing ideas about pollinator preference. Hoverflies, and potentially other pollinators, can compensate for changes in their environment and home in on a few specific traits that they favor. Often, floral color made no difference in preference. This shows that pollinators are versatile yet sensitive to changes within their environments.

Understanding the various characteristics across environments that attract Hoverflies is useful in many ways. For example, plants which contain likable characteristics could be planted in agricultural fields to increase wild pollination. In addition, the results can be used to maintain plants which will increase pollination and population growth of Hoverflies. Managing our pollinators has become a critical task as we recognize their importance and their threatened status.

Source: Nordström, K., Dahlbom, J., Pragadheesh, V. S., Ghosh, S.,Olsson, A., Dyakova, O., Suresh, S.K.,Olsson, S.B. 2017. In situ modeling of multimodal floral cues attracting wild pollinators across environments. Proceedings of the National Academy of Sciences 14:13218-13223.

Photo Source: flickr


The Beginning of an End to the Autism-Vaccine Debate?

Autism Awareness

Autism spectrum disorder (ASD) is a developmental disorder of the nervous system. The causes of ASD are as yet unknown, but it has been linked to both genetic and environmental factors. Researchers at Keele University in the UK have identified aluminum as a potential cause of autism, based on a 2017 study of brain tissue from people diagnosed with ASD.

Aluminum is used in vaccines to enhance the body’s immune response to antigens (harmful or toxic substances). The vaccine-autism debate is highly controversial, but animal models have linked the use of aluminum in vaccines to ASD. The results of this study on human tissue lend further support to those findings.

Aluminum content was measured in 0.3 g tissue samples from different regions of the brains of 5 individuals using atomic absorption spectrometry. This method uses differences in the light-absorption capabilities of different atoms to determine the chemical composition of samples. The values ranged from 1.20 to 4.77 μg/g. Past studies have suggested values ≥2.00 μg/g are pathologically concerning and those ≥3.00 μg/g pathologically significant. At least one tissue sample in each individual exceeded the established pathologically significant value.

Some of the values recorded were the highest ever measured (17.10, 18.57 and 22.11μg/g).
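
The thresholds above make each measurement easy to classify; a minimal sketch of the comparison (the function and labels are mine, matching the cut-offs cited in the text):

```python
CONCERNING = 2.00   # μg/g, "pathologically concerning"
SIGNIFICANT = 3.00  # μg/g, "pathologically significant"

def classify_aluminum(al_ug_per_g: float) -> str:
    """Label an aluminum measurement against the published cut-offs."""
    if al_ug_per_g >= SIGNIFICANT:
        return "pathologically significant"
    if al_ug_per_g >= CONCERNING:
        return "pathologically concerning"
    return "below thresholds"

# The record-high values from the study are all far above 3.00 μg/g:
for value in (17.10, 18.57, 22.11):
    print(value, classify_aluminum(value))
```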

In addition to the concentration, the locations of the aluminum deposits were examined using fluorescence microscopy. A dye that selectively stains aluminum in cells and human tissue, making it appear orange or bright yellow, was used to locate aluminum in the microscope images. Deposits were found both inside and outside brain cells. However, the most striking observation was the presence of metal deposits in the microglia. Microglia are the main immune defense cells of the central nervous system, and the scientists concluded that the deposits seen in them were a direct indication that aluminum had somehow crossed the blood-brain barrier.

fluorescence micrograph
Figure 1 shows cells in the hippocampus of a 50-year-old male donor used in the study by Mold et al. The white arrow indicates aluminum deposits observed via orange fluorescence emission. The hippocampus is the part of the brain considered to be the center of emotion and memory.

Aluminum is toxic to living cells. Although the microglia could remain functional for a certain time period, the metal will eventually show its adverse effects by disrupting this functionality. This directly correlates defective microglia with ASD. In addition to microglia, the study showed aluminum depositions in other tissues from different parts of the brain.

The donors also varied widely in age, from 15 to 50 years old. Initially, the high concentration seen in tissue from a 15-year-old donor greatly puzzled the researchers. However, the evidence of aluminum deposition in the microglia and other intracellular locations ties back to implicate vaccines as a potential cause of ASD and helps explain how such high amounts of aluminum could have accumulated in the brain tissue of a 15-year-old boy.

This is the first ever measurement of aluminum concentration in brain tissue from individuals with ASD. Despite the striking results, the research was limited by the small number of subjects and the minimal amount of tissue that could be obtained for the study. These factors render the research inadequate by itself to establish ASD as a direct outcome of aluminum deposition from vaccines in brain tissue. However, it is a major stepping stone toward identifying a potential cause of autism spectrum disorder. Now, more research is needed to either support or question these results.


Mold, M., Umar, D., King, A., and Exley, C. 2018. Aluminium in brain tissue in autism. Journal of Trace Elements in Medicine and Biology 46: 76-82.





Weighing the Risks: Gastric Surgery May Lengthen Life

A recent study found that obese patients who received gastric surgery, rather than nonsurgical weight-loss treatment, saw a significant decrease in mortality over a 4.5-year period. The researchers examined three specific surgeries, each of which effectively reduces the size of the stomach so that the patient feels full with less food. The study consisted of over 8,000 obese Israeli citizens and was conducted by the state health service.

The study lasted from 2005 to 2014, with each patient followed for a minimum of one year after entering the study, an average of 4.5 years, and a maximum of 11 years. Specifically, the findings reported a 1.3% mortality rate among obese patients who received gastric surgery and a 2.3% mortality rate among those who opted for nonsurgical treatment. These findings are significant because, even with all the associated risks of surgery, survival was still higher with it than without.
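
Those two percentages translate directly into standard risk measures; a quick sketch of the arithmetic (the variable names are mine, not the study's):

```python
surgery_mortality = 0.013      # 1.3% over the follow-up period
nonsurgical_mortality = 0.023  # 2.3% over the follow-up period

# Absolute risk reduction: one fewer death per 100 patients.
arr = nonsurgical_mortality - surgery_mortality
# Relative risk: surgical patients died at roughly 57% the nonsurgical rate.
relative_risk = surgery_mortality / nonsurgical_mortality

print(round(arr, 3), round(relative_risk, 2))  # 0.01 0.57
```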

There is a tendency to dismiss any treatment for obesity other than diet and exercise, as if their presence or absence were the only treatment or cause of obesity. But based on these findings, patients with a history of struggling under this traditional approach may be better off having the surgery, which forces a smaller diet, than facing the health risks of remaining obese under a less drastic treatment plan.

As a 10-year study following the ongoing care of 8,385 patients, the parameters are more than sufficient to inspire confidence in its results. And even though the difference between 1.3% and 2.3% may sound small, it is a significant difference in the proportion of mortalities and an indicator of future health and longevity. So perhaps patients who struggle with obesity should consider gastric surgery as a new strategy.


Reges O, Greenland P, Dicker D, Leibowitz M, Hoshen M, Gofer I, Rasmussen-Torvik LJ, Balicer RD. Association of Bariatric Surgery Using Laparoscopic Banding, Roux-en-Y Gastric Bypass, or Laparoscopic Sleeve Gastrectomy vs Usual Care Obesity Management With All-Cause Mortality. JAMA. 2018;319(3):279–290. doi:10.1001/jama.2017.20513

Insight into Pericytes

Blood Brain Barrier and Astrocytes type 1
Blood Brain Barrier. Credit: Ben Brahim Mohammed, Wikimedia Commons

Imagine the vascular system in the brain as a strainer used in cooking. After cooking pasta in a pot of water, you pour the pasta over the strainer, so that it catches the noodles, and the water filters out into the sink. Typically, you want a strainer with small holes, so vegetable pieces or meat pieces cooked with your pasta don’t slip out with the water into the sink.

Similarly, specialized cells called pericytes act as the strainer of blood flow in the brain. These cells contribute to forming the blood-brain barrier, which permits nutrients and oxygen to filter through to feed brain cells but prevents toxins from entering the brain. The pericytes play an active role in managing this exchange. Pericytes also regulate blood flow in the small capillary blood vessels. In other words, they determine the width of the blood vessels and decide how much blood can flow freely.

A recent study published in Nature Medicine on February 5th linked pericyte damage with Alzheimer’s disease and other forms of dementia. Previously, Alzheimer’s disease and other neurodegenerative diseases were associated with accumulations of tau proteins, toxic proteins that build up over time and inhibit brain function. Researchers at the University of Southern California now think pericytes are to blame as an earlier marker of dementia, causing issues before tau proteins even show up.

Researchers used a mouse model to simulate pericyte deficiency in humans and noticed that damaged pericytes let some materials leak out of the blood and into the brain that were not supposed to be there, just like a strainer with holes so big that macaroni noodles start plopping into the sink. The leaking material was fibrinogen, a protein that creates blood clots at injury sites. During healing, fibrinogen is vital, but in the brain, fibrinogen deposits erode the insulating layer of neurons, called myelin, and disrupt electrical communication from one neuron to another. You might think of fibrinogen as the chunks that get through your strainer and then clog the drain pipe.

Nerve tracts gradually eroding as the result of damaged pericytes.
Myelin (shown in green and red) gradually erodes away as the result of damaged pericytes.  Credit: Montagne et al.

The alarming discovery was that in the absence of healthy pericytes, fibrinogen leaked into the brain, and the cells that produce myelin, called oligodendrocytes, started to die. By the end of the experiment, 50% of the oligodendrocytes were dying or defective. One hypothesis proposed was that besides directly destroying the oligodendrocytes, fibrinogen also blocks oxygen and nutrients from reaching them, further accelerating cell death.

The scientists are hopeful that their research will initiate new treatments for dementia by focusing on the root of the problem: the damaged pericytes producing leaks in the blood-brain barrier. The senior researcher said, “Perhaps focusing on strengthening the blood-brain barrier integrity may be an answer because you can’t eliminate fibrinogen from blood in humans. This protein is necessary in the blood. It just happens to be toxic to the brain.” With future research, the pericytes may become the primary target for dementia treatment and prevention.


Montagne, A., Nikolakopoulou, A., Zhao, Z., Sagare, A.P., Si, G., Lazic, D., Barnes, S.R., Daianu, M., Ramanathan, A., Go, A., Lawson, E.J., Wang, Y., Mack, W.J., Thompson, P.M., Schneider, J.A., Varkey, J., Langen, R., Mullins, E., Jacobs, R.E., & Zlokovic, B.V. 2018. Pericyte degeneration causes white matter dysfunction in the mouse central nervous system. Nature Medicine [ePub ahead of print].

Vuong, Zang. 2018. Half of all dementias start with damaged ‘gatekeeper cells.’ USC Press Room. Retrieved Feb. 12 from


Extra Extra: Energy Savings for Dummies

Hand with ways to save by recycling and alternative energy.

Are there cost benefits to conservation? The answer may seem obvious, but researchers at the University of California, Davis wanted to quantify it. Spang and his co-authors investigated how much the state of California saved on electricity and greenhouse gas emissions after the Governor of California mandated a 25% statewide reduction in water consumption in 2015. On April 1, 2015, Governor Brown issued Executive Order B-29-15 in response to the four-year drought that impacted 48% of the state’s surface water resources. The drought impaired 542,000 acres of land, $2.74 billion of the state’s revenue, and approximately 21,000 jobs, according to the researchers. This was the first mandate regulating urban water consumption in the state’s history.

Water and energy are interdependent. Water is needed to produce energy for fuels and electricity generation. In California, energy is needed to transport water resources across the state, as well as to treat water and wastewater. This is why the authors decided to calculate the energy savings: by conserving water, the state also conserves energy and reduces the greenhouse gases emitted into the air.

For their study, the authors observed energy use from June 2015 to May 2016. During this time, California saved 524,000 million gallons (MG) of water, a 24.5% decrease from the 2013 baseline. From this, California saved $230 per MG on water conservation. In energy, the state saved 1,830 gigawatt-hours (GWh) statewide. Because of these energy savings, roughly 521,000 metric tons (MT) of CO2e in greenhouse gas emissions were avoided, equivalent to taking 111,000 average cars off the road annually.
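
The car-equivalence figure implies an assumed per-vehicle emission rate; checking the arithmetic (a back-of-the-envelope sketch, not a figure stated in the paper):

```python
co2e_avoided_mt = 521_000   # metric tons of CO2e avoided
cars_equivalent = 111_000   # average cars taken off the road for a year

# Implied emissions per average car per year, in metric tons CO2e.
mt_per_car = co2e_avoided_mt / cars_equivalent
print(round(mt_per_car, 2))  # ~4.69
```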

The hydrologic region that saved the most was the South Coast region (237,200 MG), which contains populous cities like Los Angeles and San Diego. The North Lahontan region (1,400 MG) had the lowest savings; it contains sparsely populated towns like Susanville and Truckee, California.

After comparing the general cost benefits, the researchers compared the savings from the statewide mandate with those from the energy efficiency programs of investor-owned utilities. They found the mandate delivered 11% more energy savings than the investor-owned electricity utilities’ efficiency programs.

These findings could be useful in advertising conservation efforts. If people knew about the savings and environmental benefits from this mandate, it could encourage them to reduce consumption.

Journal Citation:

Spang, E.S., Holguin, A.J., Loge, F.J. 2018. The estimated impacts of California’s urban water conservation mandate on electricity consumption and greenhouse gas emissions. Environmental Research Letters 13:1.