Johnny Bravo’s square jaw vs Popeye’s round jaw: A scientific explanation

From left to right, popular cartoon characters Ferb, Phineas, Popeye, and Johnny Bravo.

Do you ever scratch your head and roll your eyes at the ridiculous shapes of cartoon characters? While it can’t be confirmed that Phineas and Ferb have legitimate jaw bones, there may be a scientific explanation for the contrast between Popeye’s rounded jaw and Johnny Bravo’s square jaw.

A recent study, published online in Scientific Reports on April 16, 2018, reported a significant association between mandibular shape and jaw muscle cross-sectional size. In other words, researchers found that people with thicker jaw muscles tend to have broader, bigger, and more rectangular jaw bones.

Previous studies have shown that craniofacial skeletal form, or the structure of the bones of the face and jaw, is influenced by mechanical loading. Just as your leg bones get stronger from running and your arm bones get stronger from lifting weights, jaw bones get stronger from chewing. The specific shape of the mandibular bone is also determined by the forces applied to it throughout development. So, you might get your dad’s square jaw through genetics, but you also have a square jaw because of the foods you eat regularly.

Jaw muscles
The temporalis muscle and masseter muscle of the skull. Credit: Sella-Tunis et al. Labels added.

Scientists at Carmel Research Center in Israel measured jaw shapes and jaw muscles in 382 adult patients using CT scans. These scans allow for visualization of both bone and muscle, and the researchers specifically looked at (1) the temporalis muscle, a large, fan-shaped muscle that reaches from the side of the skull down to the jaw, and (2) the masseter muscle, which stretches from the cheekbone to the lower jaw.

Jaw bone
Comparison of jaw bone size and shape. Credit: Sella-Tunis et al. Labels added.

Independent of gender and accounting for the relative size of individuals, researchers found that larger jaw muscles were associated with a wider ramus, a bigger coronoid projection, a more rectangular base, and a more rounded basal arch. Conversely, smaller muscles were associated with a narrower ramus, a smaller coronoid projection, a narrower and more angled base, and a more triangular basal arch (see picture above).

This research also has applications in anthropology. Researchers suspect that hunter-gatherer populations had harder diets, composed of nuts and meat, which would have generated larger muscles and produced a stronger jaw line, while agricultural groups that ate more vegetation had skulls resembling the smaller-muscled jaw form. By this logic, it is plausible that Popeye’s spinach diet led to the growth of his softer, rounder jaw, and I would guess that Johnny Bravo is a fan of tougher foods.

Source:

Sella-Tunis, T., Pokhojaev, A., Sarig, R., O’Higgins, P., & May, H. (2018). Human mandibular shape is associated with masticatory muscle force. Scientific Reports, 8. doi: 10.1038/s41598-018-24293-3.


New skin sensors collect health data, study reports

Body temperature map
Body temperature map. All temperatures were collected by skin sensors, and transmitted to a computer to generate an image. Credit: Han et al.

The era of health tracking is upon us. The ability to track details about our moment-to-moment health has grown exponentially in recent years, from Fitbits logging our sleep patterns and heart rates, to ‘Smart Pills,’ which transmit a signal from a microchip embedded in a swallowable tablet to an iPhone in order to track compliance with a pill regimen. Recently, a research group working at Carle Hospital at the University of Illinois took the concept of health data to the next level by tracking body temperature and pressure in bedridden patients and sleep study patients. Instead of tracking data from one point, like the wrist, they collected data from the entire body, a significant step forward in health tracking technology.

The newly developed skin sensors are thin, soft, wireless, and made of silicon. They have been shown to adhere to the skin without causing irritation, and they are waterproof, so they can collect data even in the shower. In testing, the skin sensors were placed at various points over the whole body, creating a temperature map and pressure map of the wearer.

Skin sensors
Sensors placed on the back, and the back of the arms and legs to collect temperature and pressure data. Credit: Han et al.
Pressure map
Pressure map. Credit: Han et al.

Because they are wireless and battery-free, the sensors can stay on a patient for long periods of time, and can easily travel with a patient between wards in a hospital, or from an intensive care facility to a nursing home. They can stay on during medical treatments and during physical therapy. They collect and transmit data to a computer every 3 seconds, so a continuous digital picture of temperature and pressure can be reconstructed online.

I personally don’t love the idea of swallowing a microchip along with my medication, but this new study utilizing skin sensors has incredible implications for medicine. The study, published online in Science Translational Medicine in April 2018, explained that since body temperature naturally fluctuates between day and night, the sensors can be used to measure circadian rhythms. In this case, researchers tested the sensors in a sleep clinic.

The researchers also used the sensors to measure prolonged pressure on the body in bedridden patients. Pressure ulcers, also known as bed sores, are a concern for patients in long-term recovery who spend the majority of their time lying down. Bony areas of the body, like shoulder blades and the buttocks, can develop these bed sores, or irritations, from staying in one position too long. The skin sensors, strategically placed in high-risk areas for bed sores, can detect when the pressure reaches a harmful level.
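To make the alerting idea concrete, here is a minimal sketch of the monitoring logic, not the authors’ actual system: readings arrive from each sensor every 3 seconds (per the study), and an alert fires when pressure stays above a threshold for too long. The 32 mmHg threshold (a commonly cited capillary pressure) and the two-hour window are illustrative assumptions, not figures from the paper.

```python
# Hypothetical sketch of pressure-ulcer risk alerting from periodic skin-sensor
# readings. The threshold and duration are illustrative assumptions, not
# parameters from the Han et al. study.

READ_INTERVAL_S = 3          # sensors report every 3 seconds (per the study)
PRESSURE_LIMIT_MMHG = 32.0   # assumed harmful-pressure threshold
MAX_SUSTAINED_S = 2 * 3600   # assumed maximum tolerated duration (2 hours)

def check_sustained_pressure(readings_mmhg):
    """Return True if pressure stayed above the limit long enough to flag risk.

    `readings_mmhg` is the chronological list of readings from one sensor,
    spaced READ_INTERVAL_S apart.
    """
    sustained_s = 0
    for pressure in readings_mmhg:
        sustained_s = sustained_s + READ_INTERVAL_S if pressure > PRESSURE_LIMIT_MMHG else 0
        if sustained_s >= MAX_SUSTAINED_S:
            return True
    return False

# Example: a sensor on the shoulder blade reporting constant high pressure
if check_sustained_pressure([40.0] * 2400):  # 2400 readings x 3 s = 2 hours
    print("Reposition patient: sustained pressure detected")
```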

Source:

Han, S., Kim, J., Won, S.M., Ma, Y., Kang, D., Xie, Z., … Rogers, J.A. (2018). Battery-free, wireless sensors for full-body pressure and temperature mapping. Science Translational Medicine, 10(435). [Published online 04 Apr 2018]. DOI: 10.1126/scitranslmed.aan4950.

What does ADHD look like in the brain of a preschooler?

According to the National Survey of Children’s Health from 2016, 9.4% of children in the United States have been diagnosed with ADHD: almost 1 in 10 children. ADHD, or attention-deficit/hyperactivity disorder, is a neurological disorder characterized by impulsive behavior, atypical attention patterns, restlessness, and disorganization. You might hear a coworker casually claim, “I can’t focus on these emails… my ADHD is acting up!” However, ADHD is not something that comes and goes. It is a different way of thinking, one that stems from atypical patterns of brain development.

Although ADHD can manifest as early as age 4, most studies have analyzed only older, school-age children. A new study published online in March 2018 in the Journal of the International Neuropsychological Society recruited a group of 4- to 5-year-olds, including 52 children exhibiting ADHD symptoms and 38 children without ADHD symptoms as a comparison group. With consent from both the parents and the children, of course, the researchers performed MRI scans to get a look at the children’s brain structure.

Brain scan
A brain scan showing gray matter at the outer borders (cell bodies) and white matter in the inner regions of the brain (cell axons); Credit: Wikimedia Commons.

Overall, the researchers found that some regions of the brain had a smaller volume in children with ADHD, compared to the group of children with no ADHD symptoms. In particular, gray matter volumes were decreased. Gray matter, named for its natural brownish-gray color, is tissue comprised of the cell bodies of neurons in the brain and spinal cord. A neuron cell has a central body, and a long axon “branch” which sends messages to other neurons. The neuron cell bodies tend to congregate together in the brain and arrange themselves as “gray matter.” The axons also form groupings and are visualized as white matter.

In the brains of children with ADHD, the researchers noticed that gray matter volume was reduced most significantly in subregions of the right frontal lobe and the left temporal lobe of the brain, and greater reductions in volume corresponded with greater severity of ADHD symptoms. These brain areas with smaller gray matter volume are involved in inhibitory control (for example, stopping yourself from blurting out an answer instead of raising a hand in class), working memory (for example, keeping a worksheet question in mind while writing the answer), planning (for example, deciding to clean up the desk, then turn in homework, then put books in the backpack, in that order), and response control (for example, correctly following a teacher’s directions).

Brain schematic
A schematic of the brain showcasing the frontal lobe and temporal lobe, each of which plays a role in ADHD symptoms; Credit: w:User:Washington irving, Wikimedia Commons.

Previously, gray matter volume differences had been assessed only in older children; this study demonstrates that differences in brain structure are discernible in children as young as four. The gray matter volume of another brain area, the anterior cingulate cortex, which plays a role in attention, decision-making, and impulsivity, has also been evaluated in other studies. Older children with ADHD show a reduction in the volume of the anterior cingulate cortex, but there was no difference between groups in the 4-5 year olds, suggesting that this neural difference develops over the course of several years.

Scientists are gaining a better understanding of the developmental trajectories of ADHD through this kind of research. The hope is that these studies will one day shed light on what triggers the differences in gray matter volume. Albert Einstein, Walt Disney, and John F. Kennedy are all believed to have shown ADHD symptoms. With this knowledge, we can gain a greater appreciation of what makes us who we are.

Sources:

Children and Adults with Attention-Deficit/Hyperactivity Disorder (CHADD). (2018). General Prevalence. CHADD: The National Resource on ADHD. Retrieved Apr 3, 2018 from http://www.chadd.org/Understanding-ADHD/About-ADHD/Data-and-Statistics/General-Prevalence.aspx.

Grohol, J.M. (2018). Famous people with ADHD. PsychCentral. Retrieved Apr 3, 2018 from https://psychcentral.com/lib/famous-people-with-adhd/.

Jacobson, L.A., Crocetti, D., Dirlikov, B., Slifer, K., Denckla, M.B., Mostofsky, S.H., & Mahone, E.M. (2018). Anomalous brain development is evident in preschoolers with attention-deficit/hyperactivity disorder. Journal of the International Neuropsychological Society, First View [Published online]. https://doi.org/10.1017/S1355617718000103.

Mayo Clinic Staff. (2018). Adult attention-deficit/hyperactivity disorder (ADHD). Mayo Clinic. Mayo Foundation for Medical Education and Research. Retrieved Apr 3, 2018 from https://www.mayoclinic.org/diseases-conditions/adult-adhd/symptoms-causes/syc-20350878.


4 Herbal Supplements for Anxiety and Depression

Passionflower image
Passionflower; Credit: Martin Thomas, Creative Commons
Saffron flowers
Saffron flowers; Credit: Ioulrc, Creative Commons
Chaste berry
Flowering chaste-tree; Credit: Tatters, Creative Commons
Lavender; Credit: Amanda Slater, Creative Commons


According to a recent study, one-third of cancer patients suffer from anxiety, depression, or adjustment disorder in the months following their diagnosis. As a result, many of them add prescription anxiolytic (anti-anxiety) and antidepressant drugs to their cocktail of chemotherapy, radiation therapy, anti-coagulants, and antibiotic drugs.

The problem is that some of these anxiolytic and antidepressant drugs interact with cancer treatments and are less effective in cancer patients. They can also trigger a host of negative side effects, including seizures, headaches, and addiction, that compound the side effects of regular cancer treatments.

A group of scientists at the Memorial Sloan Kettering Cancer Center in New York City decided to take a closer look at alternative herbal remedies for treating anxiety and depression in cancer patients. Research on herbal supplements and plant extracts is scarce, so the scientists examined a collection of studies completed between 1996 and 2016. By gathering and organizing the data, they noticed that not only are several alternative remedies helpful in ameliorating anxiety and depression, some also counteract adverse effects of chemotherapy and even combat cancer themselves. While not a perfect substitute, the following herbs show promising potential:

1. Extracts of saffron, a spice derived from a Middle Eastern flower, may be able to treat mild to moderate anxiety about as well as fluoxetine (Prozac) and imipramine (Tofranil). It has also been successful in easing anxiety and depression caused by PMS in women.

2. Lavender pills, made from oil of the lavender plant, appear to treat anxiety comparably to the drugs paroxetine (Paxil) and lorazepam (Ativan), but with fewer side effects. Lavender lotions and diffuser oils are often advertised for their calming and relaxing properties, and this holds true for lavender tea and extract drops, which may increase the efficacy of the antidepressants citalopram (Celexa) and imipramine (Tofranil).

3. Passionflower, although no better than prescription drugs, seems to perform similarly to oxazepam (Serax) and sertraline (Zoloft), with fewer side effects. This supplement also comes from a flower, one that Native Americans have historically used to prevent insomnia.

4. Chasteberry, typically used for PMS symptoms, was compared to fluoxetine (Prozac), and while it didn’t seem to address psychological symptoms of depression, including persistent sadness, hopelessness, and loss of interest, it did alleviate physical symptoms, such as sleep trouble, digestive problems, muscle aches, and headaches.

Overall, researchers found that the herbs are not as potent as their prescription counterparts, but are safer. Clinical trials are needed to further analyze the potential of these herbal supplements and determine their benefits, especially in an oncology context. Because these supplements can be purchased over the counter, physicians don’t always know which supplements their patients are taking. It’s important to discuss any alternative treatment plan with a doctor before use.

Source:

Simon Yeung, K., Hernandez, M., Mao, J.J., Haviland, I., & Gubili, J. (2018). Herbal medicine for depression and anxiety: A systematic review with assessment of potential psycho-oncologic relevance. Phytother Res. [Epub ahead of print]. doi: 10.1002/ptr.6033.

The rising cost of poor diet: ADHD symptoms in children

A handful of Halloween candy later, and you are skipping around the room: your hands are fidgety, your speech is jittery, and it’s hard to contain your burst of energy. You are probably quite familiar with the sugar rush that affects you this one day a year… two days a year? Three days? Almost every day?

Time and again, nutritional studies and long-term research have shown that poor dietary patterns in children influence neural development and behavior, especially hyperactivity and attention capacities, contributing to diagnoses like ADHD.

A study from 2009 followed a group of children over several years and found that significant increases in “junk food” consumption at ages 4-5 were linked to an elevated risk of hyperactivity at age 7. Another study, from 2004, demonstrated a sustained association between artificial food coloring and hyperactivity. While the cause of ADHD is still undetermined, dietary patterns appear to play a role. Not only are high-fat, high-sugar diets implicated in issues in the developing nervous system, but these diets also tend to be low in valuable vitamins and micronutrients.

A more recent study, published online in March 2018 in the European Journal of Clinical Nutrition, showed a positive correlation between “processed” and “snack” food patterns and ADHD symptoms in children ages 3-6. On the other hand, there was a negative correlation between a “vegetarian” food pattern and ADHD symptoms. This study used data collected from over 14,000 preschoolers in Ma’anshan City, China, and was the first large-scale study in mainland China to investigate connections between diet and ADHD in children. The prevalence of ADHD symptoms in the group studied was 8.6%.

Researchers asked caregivers and parents to answer questionnaires about their children’s recent food consumption and food choices, and gave the children the Conners Abbreviated Symptom Questionnaire to assess ADHD symptoms. From those answers, the researchers identified five dietary patterns representing the common responses: (1) “processed,” for fast food, fried foods, preserved fruit, and other high-fat food items, (2) “snack,” for high-sugar foods like sweets, biscuits, cakes, and chocolates, (3) “protein,” for red meat, poultry, eggs, fish, fruit, and rice, (4) “beverage,” for flavored milk, soda, and yogurt, and (5) “vegetarian,” for grains, beans, and fruit or vegetable juice.
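Studies of this kind typically derive dietary patterns with factor analysis of food-frequency questionnaires. The sketch below uses principal component analysis as a simple stand-in to show the idea; the food items and consumption data are invented, and this is not the authors’ actual pipeline.

```python
# Hypothetical sketch: deriving "dietary patterns" from food-frequency data.
# PCA stands in for the factor analysis such studies typically use; the food
# items and the simulated data below are invented for illustration.
import numpy as np
from sklearn.decomposition import PCA

foods = ["fried_food", "sweets", "red_meat", "soda", "beans"]
rng = np.random.default_rng(0)
# Rows = children, columns = weekly consumption frequency of each food
freq = rng.poisson(lam=[4, 5, 3, 2, 3], size=(500, len(foods))).astype(float)
freq -= freq.mean(axis=0)  # center each food column before PCA

pca = PCA(n_components=2)
scores = pca.fit_transform(freq)  # each child's loading on each pattern

# The foods weighing most heavily on a component define that "pattern"
for i, component in enumerate(pca.components_):
    top = [foods[j] for j in np.argsort(-np.abs(component))[:2]]
    print(f"Pattern {i + 1}: dominated by {top}")
```

A child’s score on, say, a component dominated by fried food and sweets would then be tested for association with the Conners questionnaire results.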

Children with a “processed” or “snack” dietary pattern had a significantly higher likelihood of demonstrating ADHD symptoms, hyperactivity, and attention problems. There was no correlation between ADHD symptoms and the “protein” or “beverage” categories, but the “vegetarian” dietary pattern seemed to protect against ADHD symptoms.

This study cannot show causation, that is, that a poor diet causes ADHD. However, the evidence that diet influences development and behavior in children is hard to ignore. High-fat, high-sugar foods tend to be cheaper, more accessible, and more conveniently packaged in bags and wrappers for eating on the go, and they don’t need to be refrigerated. They taste good. But the cost may be a rising prevalence of ADHD and similar disorders. The relationship between food and hyperactivity is further supported by studies showing that elimination diets and fish oil supplements can reduce ADHD symptoms. Fatty and sugary foods should be eaten in moderation; Halloween should not happen every day.

Source:

Yan, S., Cao, H., Gu, C., Ni, L., Tao, H., Shao, T., Xu, Y., & Tao, F. (2018). Dietary patterns are associated with attention-deficit/hyperactivity disorder (ADHD) symptoms among preschoolers in mainland China. European Journal of Clinical Nutrition [Published online March 13, 2018]. doi:10.1038/s41430-018-0131-0.

Will therapy benefit OCD patients? Computers have the answer.

fMRI machine
Patient being prepared for an fMRI. Credit: Ptrump16, Creative Commons.

Obsessive-compulsive disorder, or OCD, is characterized by unwanted, repetitive thoughts and impulsive, ritualistic actions. For example, a common fear among those with OCD is a fear of germs, which results in repetitive hand-washing. While historically OCD has been difficult to treat effectively, in recent years, modifications to cognitive-behavioral therapy have had more success. Cognitive-behavioral therapy consists of a series of sessions between a therapist and patient to identify negative thought patterns and symptoms, and to address them through discussion, exposure to stress-inducing stimuli, and practice using alternative coping mechanisms to ameliorate anxiety.

While cognitive-behavioral therapy can be effective, it is time-consuming and does not work for everyone. Using functional magnetic resonance imaging (fMRI), scientists at UCLA trained a computer analysis system to study the brains of individuals with OCD and determine which individuals were most likely to benefit from cognitive-behavioral therapy. Their study demonstrated that, given a seven-minute fMRI scan of an OCD patient, the computer program could predict the success of cognitive-behavioral therapy for that particular patient with 67-70% accuracy.

For their study, researchers recruited 42 adults with OCD. All of the participants underwent fMRI scans at the beginning of the study. Then, half of the participants attended cognitive-behavioral therapy sessions lasting about 90 minutes per session, five days a week for four weeks. At the end, their brains were scanned again to detect any differences in brain structure or function. The other half of the participants were put on a four-week waitlist. At the end of four weeks, having received no therapy, their brains were scanned to see if there were any differences simply due to time. These participants received cognitive-behavioral therapy after the four-week waiting period.

On the fMRI scans, the researchers were especially interested in the regions of the brain, and their cellular networks, that regulate attention, vision, motor skills, memory, self-evaluation, and the abstract sense of “mind-wandering,” or daydreaming, each of which plays a role in the development of OCD. They used mathematical models and machine learning to map differences between the participants’ brains and match those differences with the behavioral outcomes of cognitive-behavioral therapy. They found that the computer could suggest which patients would benefit from therapy, regardless of individual symptoms or symptom severity.
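For readers curious what “matching brain differences with outcomes” looks like computationally, here is a minimal sketch under stated assumptions: each participant is reduced to a vector of connectivity features, and a cross-validated classifier predicts treatment response. A generic logistic regression stands in for the study’s actual multivariate pipeline, and the data are simulated.

```python
# Hypothetical sketch of predicting CBT response from resting-state fMRI
# connectivity features. A generic cross-validated classifier stands in for
# the study's actual multivariate method; the data here are simulated noise.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
n_participants, n_features = 42, 200  # 42 participants, as in the study
X = rng.normal(size=(n_participants, n_features))  # connectivity features
y = rng.integers(0, 2, size=n_participants)        # 1 = responded to CBT

clf = LogisticRegression(max_iter=1000)
accuracy = cross_val_score(clf, X, y, cv=5).mean()
# With real features the study reported 67-70%; random data hovers near 50%
print(f"Cross-validated accuracy: {accuracy:.0%}")
```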

fMRI brain scan
One of the brain networks studied was the default mode network, or DMN, which plays a role in “mind-wandering,” daydreaming, and abstract thought involved in thinking about the self. Regions of the DMN are highlighted in red in this fMRI scan. Credit: Leigh Hopper, UCLA Newsroom.

Widespread use of this predictive method would give therapists more information when deciding the best route of treatment for their patients. In the study, the researchers advocate for this fMRI computer model as a way to allocate time and resources, directing cognitive-behavioral therapy toward the patients most likely to benefit and other patients toward alternatives such as medication, inpatient programs, intensive day programs, or group therapy. It is a move towards personalized medicine.

However, more research needs to be done to advance this technique. Computers alone are not yet adequate to diagnose psychological disorders or comprehend subjective human experience. Furthermore, fMRI scans are extremely expensive, and the money going towards them could instead be put towards treatment. There is also a risk that patients whom the computer does not deem fit for cognitive-behavioral therapy will miss out on a treatment opportunity that could actually help. While studies like this one advance scientific understanding of disorders like OCD, clinicians should proceed with caution when incorporating new computer-based evaluations that could be wrong and could depersonalize the treatment experience.

Sources:

Reggente, N., Moody, T.D., Morfini, F., Sheen, C., Rissman, J., O’Neill, J., & Feusner, J.D. (2018) Multivariate resting-state functional connectivity predicts response to cognitive behavioral therapy in obsessive-compulsive disorder. PNAS [published online ahead of print]. https://doi.org/10.1073/pnas.1716686115.

Hopper, Leigh. 2018. Brain scan and artificial intelligence could help predict whether OCD will improve with treatment. UCLA Newsroom. Retrieved Feb. 5 from http://newsroom.ucla.edu/releases/brain-scan-AI-help-predict-ocd-improve-treatment.

Molecular markers identified for autism, schizophrenia, and depression

Some psychological disorders, such as schizophrenia, tend to be highly heritable, meaning that much of the risk for the disorder is passed down generationally within a family. Schizophrenia, for instance, is estimated to be 60-87% heritable, meaning that 60-87% of the variation in schizophrenia risk across a population is attributable to genetic differences rather than to environment. Similarly, major depressive disorder is 30-40% heritable. Therefore, in order to treat these disorders, it’s necessary to look at the genes involved. A February 2018 study published in Science found that there is significant overlap in gene expression between autism spectrum disorder, schizophrenia, and bipolar disorder, as well as an overlap between schizophrenia, bipolar disorder, and major depression. The strongest relationship was between schizophrenia and autism spectrum disorder.
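As a quick aside on the statistics (standard quantitative genetics, not a result of this study): broad-sense heritability is defined as H² = V_G / V_P, where V_G is the variance in a trait attributable to genetic differences and V_P = V_G + V_E is the total variance, genetic plus environmental. A heritability of 0.80 therefore means that 80% of the trait’s variation across a population traces to genes; it is not the probability that any one relative of an affected person will develop the disorder.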

Consider gene expression as a construction company. A construction company has a stockpile of materials: concrete, glass, cement, wood, nails, etc. The company has a crew of workers, and the crew is capable of building a variety of houses and apartment buildings. The construction company is analogous to the use of DNA by cells in the brain. The DNA is like the stockpile of materials. The materials are required to build anything, but the possible combinations of materials are endless. The RNA transcription mechanism in the cells is like the crew. The crew chooses which materials to use, and determines how much of each item is necessary for the project. In cells, this system is called “gene expression.” Every cell in the brain has the same DNA, or the same starting materials, but each cell has a different construction crew that decides to use the materials slightly differently; some build houses, some build apartment buildings, some build garages.

Instead of examining the DNA, or building materials, in the over 700 cadaver brain samples used in the study, the researchers looked at the gene products, or what the construction crews built. It is unknown whether the gene products found in the brains caused the disorder symptoms, or developed gradually throughout life as a consequence of the disorders. But the study provides useful information about which proteins and structural factors manifest in disordered brains, and this information can be used to trace back to an origin point. Daniel Geschwind, director of the UCLA Center for Autism Research and Treatment and an author of the study, said, “These findings provide a molecular, pathological signature of these disorders, which is a large step forward.”

The scientists found biological markers that tend to distinguish a brain with autism, for example, from the average brain. In the case of autism spectrum disorder, the study reported increased activation of the CD11 gene, while another gene, called CD2, was especially active in the brains of people with depression. Additionally, the study mapped gene expression commonalities between brains with the same disorder, essentially establishing a molecular blueprint that could be recognized for diagnosis and targeted for more effective treatment at the molecular level.

Sources:

Gandal, M.J., Haney, J.R., Neelroop, N.P., Leppa, V., Ramaswami, G., Hartl, C., Schork, A.J., Appadurai, V., Buil, A., Werge, T.M., Liu, C., White, K.P., CommonMind Consortium, PsychENCODE Consortium, iPSYCH-BOARD Working Group, Horvath, S., & Geschwind, D.H. 2018. Shared molecular neuropathology across major psychiatric disorders parallels polygenic overlap. Science 359: 693–697.

Hopper, Leigh. 2018. Autism, schizophrenia, bipolar disorder share molecular traits, study finds. UCLA Newsroom. Retrieved Feb. 26 from http://newsroom.ucla.edu/releases/autism-schizophrenia-bipolar-disorder-share-molecular-traits-study-finds.

Beidel, D.C., Bulik, C.M., Stanley, M.A. 2017. Abnormal Psychology: A Scientist-Practitioner Approach (4th Edition). Pearson Education [print].

Staying Balanced: Sour Taste Buds Linked with the Vestibular System

In January 2018, a study published in Science, the journal of the American Association for the Advancement of Science, presented a surprising link between the sense of taste and the sense of balance. While trying to determine which genes allow certain taste buds to detect sourness, scientists found the same gene at work in the inner ear.

Lemon slices
Lemon slice. Credit: GDJ; Creative Commons Clipart.

When you think “sour,” you might think about puckering at the juice of a slice of lemon, but scientists think about pH levels. Sourness is actually a measure of acidity: a substance is acidic if it contains lots of H+ ions (hydrogen ions, which carry a positive electrical charge), which is also the mark of a low pH. There are different kinds of taste buds: some recognize sweetness, some recognize saltiness, etc. The taste buds that recognize the sour *tang* of Sour Patch Kids contain ion channels that allow H+ ions to flow into the taste bud cell and send a signal to the brain that says, “Wow! This is sour!”
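For a sense of scale (basic chemistry, not a finding of this study): pH is the negative logarithm of the H+ concentration, pH = -log10[H+]. Lemon juice, at a pH of roughly 2, therefore contains about 10^-2 moles of H+ ions per liter, around 100,000 times the concentration in neutral water at pH 7.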

To ascertain which gene or genes are responsible for expressing the proteins necessary for building the H+ ion channels in sour taste bud cells, researchers at the University of Southern California used a mouse model. They compared the transcriptome of mice with sour taste buds with the transcriptome of mice without them. The transcriptome is a collection of all the RNA in a particular cell, and is an indicator of proteins that are being generated and built by a cell. When 41 potential proteins were identified in the sour taste bud cells, but not found in the other taste bud cells, the scientists knew one of them must play a role in the mechanism for detecting sour tastes.
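Conceptually, this screen is a set comparison: keep the genes detected in sour taste cells but absent from the others. The toy sketch below shows that filtering logic only; the gene names, expression values, and detection threshold are invented, and a real analysis would involve read counts and statistical testing.

```python
# Toy sketch of the candidate screen: keep genes detected in sour taste cells
# but not in other taste cells. Gene names, expression values, and the
# detection threshold are invented for illustration.
sour_cells = {"Otop1": 120.0, "GeneA": 35.0, "GeneB": 8.0, "Housekeeping1": 500.0}
other_cells = {"GeneB": 9.0, "Housekeeping1": 480.0, "GeneC": 60.0}

DETECTION_THRESHOLD = 10.0  # assumed minimum expression to call a gene "present"

def detected(expression):
    return {gene for gene, level in expression.items() if level >= DETECTION_THRESHOLD}

candidates = detected(sour_cells) - detected(other_cells)
print(sorted(candidates))  # genes specific to sour cells: ['GeneA', 'Otop1']
```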

The researchers then introduced the candidate genes into human embryonic kidney cells (HEK-293) or frog egg cells (Xenopus oocytes). The kidney cells and egg cells were flooded with an acidic solution and observed for H+ ion currents. The researchers found that the gene Otopetrin1, abbreviated Otop1, was the only one to produce an ion channel that permitted H+ ions to pass through.

The gene Otop1 is part of the otopetrin gene family, which is known for its role in the development and function of the vestibular system, the inner-ear system that governs balance. The connection became clear when mice with Otop1 mutations exhibited problems with spatial orientation and balance: they could not properly right themselves or swim. Furthermore, mice with Otop1 mutations had weaker H+ ion currents in their taste bud cells, suggesting that the mice were not able to fully taste sourness. The scientists at USC hypothesize that Otop1 regulates an optimal pH level in the inner ear during development.

“We never in a million years expected that the molecule that we were looking for in taste cells would also be found in the vestibular system,” senior researcher Emily Liman said. “This highlights the power of basic or fundamental research.”

The Otop1 gene also produces H+ ion channels in the heart, uterus, adrenal gland, mammary gland, and in fat tissue, although the role of H+ ion channels in these regions is not understood. Further research may uncover more intriguing and unanticipated connections within our genetic makeup.

Taste bud cells
Taste bud cells, magnified and artificially colored. The red portions denote cells that detect sour tastes, while the green portions mark cells that detect umami, sweet, or bitter tastes. Credit: Yu-Hsiang Tu and Emily Liman.

Sources:

Tu, Y.H., Cooper, A.J., Teng, B., Chang, B.R., Artiga, D.J., Turner, H.N., Mulhall, E.M., Ye, W., Smith, A.D., & Liman, E.R. 2018. An evolutionarily conserved gene family encodes proton-selective ion channels. Science [published online]. DOI: 10.1126/science.aao3264.

Gersema, E. 2018. Surprising discovery links sour taste to the inner ear’s ability to sense balance. USC Press Room. Retrieved Feb. 18 from http://pressroom.usc.edu/surprising-discovery-links-sour-taste-to-the-inner-ears-ability-to-sense-balance/.

Insight into Pericytes

Blood Brain Barrier and Astrocytes type 1
Blood Brain Barrier. Credit: Ben Brahim Mohammed, Wikimedia Commons

Imagine the vascular system in the brain as a strainer used in cooking. After cooking pasta in a pot of water, you pour the pasta over the strainer, so that it catches the noodles, and the water filters out into the sink. Typically, you want a strainer with small holes, so vegetable pieces or meat pieces cooked with your pasta don’t slip out with the water into the sink.

Similarly, specialized cells called pericytes act as the strainer of blood flow in the brain. These cells contribute to forming the blood-brain barrier, which permits nutrients and oxygen to filter through to feed brain cells but prevents toxins from entering the brain. The pericytes play an active role in managing this exchange. Pericytes also regulate blood flow in the small capillary blood vessels. In other words, they determine the width of the blood vessels and decide how much blood can flow freely.

A recent study published in Nature Medicine on February 5th linked pericyte damage with Alzheimer’s disease and other forms of dementia. Previously, Alzheimer’s disease and other neurodegenerative diseases were associated with accumulations of tau, a toxic protein that builds up over time and inhibits brain function. Researchers at the University of Southern California now think damaged pericytes are an earlier marker for dementia, causing problems before tau proteins even show up.

Researchers used a mouse model to simulate pericyte deficiency in humans, and noticed that damaged pericytes let material leak out of the blood and into the brain where it did not belong, just like a strainer with holes so big that macaroni noodles start plopping into the sink. The leaking material was fibrinogen, a protein that creates blood clots at injury sites. During the healing process, fibrinogen is vital, but in the brain, fibrinogen deposits erode the insulating sheath of neurons, called myelin, and disrupt electrical communication from one neuron to another. You might think of fibrinogen as the chunks that get through your strainer and then clog the drain pipe.

Nerve tracts gradually eroding as the result of damaged pericytes.
Myelin (shown in green and red) gradually erodes away as the result of damaged pericytes. Credit: Montagne et al.

The alarming discovery was that in the absence of healthy pericytes, fibrinogen leaked into the brain, and the cells that produce myelin, called oligodendrocytes, started to die. By the end of the experiment, 50% of the oligodendrocytes were dying or defective. One proposed hypothesis is that besides directly destroying the oligodendrocytes, fibrinogen also blocks oxygen and nutrients from reaching them, further accelerating cell death.

The scientists are hopeful that their research will lead to new treatments for dementia by focusing on the root of the problem: the damaged pericytes producing leaks in the blood-brain barrier. The senior researcher, Berislav Zlokovic, said, “Perhaps focusing on strengthening the blood-brain barrier integrity may be an answer because you can’t eliminate fibrinogen from blood in humans. This protein is necessary in the blood. It just happens to be toxic to the brain.” With future research, pericytes may become the primary target for dementia treatment and prevention.

Sources:

Montagne, A., Nikolakopoulou, A., Zhao, Z., Sagare, A.P., Si, G., Lazic, D., Barnes, S.R., Daianu, M., Ramanathan, A., Go, A., Lawson, E.J., Wang, Y., Mack, W.J., Thompson, P.M., Schneider, J.A., Varkey, J., Langen, R., Mullins, E., Jacobs, R.E., & Zlokovic, B.V. 2018. Pericyte degeneration causes white matter dysfunction in the mouse central nervous system. Nature Medicine [ePub ahead of print].

Vuong, Zang. 2018. Half of all dementias start with damaged ‘gatekeeper cells.’ USC Press Room. Retrieved Feb. 12 from http://pressroom.usc.edu/half-of-all-dementias-start-with-damaged-gatekeeper-cells/.


Alpha waves, attention, anxiety, oh my!

Neurons firing in the brain.
Neurons firing in the brain (artificial color added). Credit: Picower Institute for Learning and Memory, M.I.T.

In a recent study, published in January 2018, scientists pinpointed a unique characteristic of people who experience trait anxiety: differences in alpha brain wave activity. Usually, anxiety is correlated with an absence of alpha waves; in anxious brains, beta waves overpower alpha waves, and over time this builds into feelings of constant stress. Researchers in the Departments of Psychology and Psychological Science at Ball State University found that too many alpha waves can create an equally disruptive imbalance.

The brain is composed of billions of neurons, which communicate with each other through electrical signaling. When multiple neurons fire simultaneously, they produce electrical oscillations, or “waves.” The frequency of these waves depends on the current level of consciousness: brain waves tend to be lower frequency during deep sleep, but high frequency during problem-solving, decision-making, and other tasks requiring complex thinking and concentration.

Alpha waves, which were evaluated in this study, are known to occur when the mind is in a state of relaxation. At any given moment, the brain might produce more than one type of brain wave, but alpha waves are most widespread during meditation, while daydreaming, and even during prolonged aerobic activity, like a “runner’s high.” However, as soon as we are alerted to a task, faster beta waves take over.

This may not be the case for highly anxious individuals. Researchers used EEG to measure the alpha brain waves of two groups: individuals with high trait anxiety, analogous to having an anxiety disorder, and individuals with low trait anxiety, meaning they showed very few anxiety symptoms. Researchers first measured alpha waves during a resting, relaxed state, and then while the participants completed a response-inhibition test called the Eriksen flanker task.
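EEG studies quantify “alpha activity” as signal power in the roughly 8-12 Hz band. The sketch below shows a generic version of that computation on a synthetic trace; it is not the authors’ exact pipeline, and the sampling rate and band edges are conventional choices rather than details from the paper.

```python
# Generic sketch: estimating alpha-band (8-12 Hz) power from an EEG trace.
# The signal here is synthetic; real studies use multi-channel recordings
# with artifact rejection before a step like this.
import numpy as np
from scipy.signal import welch

fs = 256                      # assumed sampling rate in Hz
t = np.arange(0, 10, 1 / fs)  # 10 seconds of signal
# Synthetic EEG: a 10 Hz alpha rhythm buried in noise
eeg = 2.0 * np.sin(2 * np.pi * 10 * t) + np.random.default_rng(1).normal(size=t.size)

freqs, psd = welch(eeg, fs=fs, nperseg=fs * 2)  # power spectral density
alpha_band = (freqs >= 8) & (freqs <= 12)
alpha_power = psd[alpha_band].mean()
relative = alpha_power / psd.mean()
print(f"Mean alpha-band power: {alpha_power:.2f} ({relative:.1f}x broadband mean)")
```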

Researchers found that the highly anxious individuals demonstrated more alpha wave activity in the resting state than the less anxious individuals. But during the Eriksen flanker task, the two groups demonstrated similar levels of alpha wave activity. In other words, at baseline, the highly anxious individuals were essentially more relaxed than typical, so their brains had to make a bigger jump to reach an alert, focused state.

While this may seem counter-intuitive, the implication of this experiment is that the prevailing alpha waves in the brain of a highly anxious individual suppress the processing of external stimuli and information. The individual might then have trouble focusing on specific tasks and thoughts. Taken together with previous studies, anxiety has now been linked both to a lack of alpha waves and to extra alpha waves in the resting state, suggesting that abnormal alpha brain wave activity can alter attention and processing in more than one way. More research is needed to clearly understand this phenomenon, but researchers hope this method of measuring alpha waves will one day become a tool for measuring degrees of anxiety.

brain waves
Types of brain waves, as they appear on an EEG. Credit: Slaven Cvijetic.

Sources:

Ward, R.T., Smith, S.L., Kraus, B.T., Allen, A.V., Moses, M.A., Simon-Dack, S.L. 2018. Alpha band frequency differences between low-trait and high-trait anxious individuals. NeuroReport 29: 79-83.

Bergland, C. 2015. Alpha brain waves boost creativity and reduce depression. Psychology Today. Retrieved Feb 5, 2018 from https://www.psychologytoday.com/blog/the-athletes-way/201504/alpha-brain-waves-boost-creativity-and-reduce-depression


Researchers Stimulate the Amygdala to Stimulate Memory

Think back to your first kiss, your soccer championship game, or hearing about the death of a loved one. Do you remember what you were wearing? Do you remember who was there with you and specifically where you were? You might even remember the exact words from what people said around you. These crystal-clear memories are called flashbulb memories, and are processed by the amygdala, a region of the brain associated with regulating emotions and emotional memory.

A new study now reveals that directly stimulating the amygdala can improve memory without an accompanying emotional experience. Participants in a study at Emory University Hospital received brief, low-amplitude electrical stimulation to the amygdala and demonstrated improved declarative memory the next day, without any subjective emotional feelings or involuntary emotional responses, such as increased heart rate or faster breathing.

The study, published online in December 2017 in the journal PNAS, took place in conjunction with the Emory University School of Medicine. Fourteen epilepsy patients with electrodes already implanted in their brains were recruited to participate. Participants were shown numerous neutral images (e.g., a picture of a basketball or a key), and then given either a short stimulation of the amygdala or no stimulation. Immediately afterwards, and again the following day, participants were shown more neutral images in a recognition-memory test. The patients who received stimulation exhibited greater memory retention of the images after one day, compared to the control group.

Although the participants in the study were simultaneously receiving treatment for epilepsy, they showed substantial memory enhancement from the amygdala stimulation. One patient, who suffered from brain damage and memory impairment and rarely recognized researchers and physicians, displayed the most memory improvement. Other patients who experienced seizures between the initial stimulation and the test the following day still showed improved memory, evidence that the memory enhancement held up in spite of other neurologically debilitating events. None of the patients reported being able to feel the stimulation.

brain
Illustration of the amygdala (blue), hippocampus (orange), and perirhinal cortex (pink). Credit: Cory Inman, Emory University.

Researchers speculate that the amygdala plays a role in delegating non-emotional declarative memory to other structures, namely the hippocampus and the perirhinal cortex. Direct stimulation of the hippocampus and perirhinal cortex to improve memory has produced inconsistent results in prior studies, and the amygdala might be the missing link. According to co-author Joseph Manns, “the long-term goal of this research program is to understand how modulation of the hippocampus by the amygdala can at times lead to memory enhancement and at times lead to memory dysfunction, such as that observed in post-traumatic stress disorder (PTSD).”

Regarding the targeted amygdala stimulation, co-author Cory Inman explained, “One day, this could be incorporated into a device aimed at helping patients with severe memory impairments, like those with traumatic brain injuries or mild cognitive impairment associated with various neurodegenerative diseases.” Small deep-brain stimulation implants are already being used to treat Parkinson’s disease. This study may pave the way for similar clinical treatments for patients with memory disorders, so that their non-emotional memories, like what they ate for last night’s dinner or what they read in a good book, can be remembered the next day.

Amygdala and hippocampus highlighted in brain
Limbic system embedded in the brain. The amygdala is shown in red and the hippocampus in purple. Credit: Paul Wissmann, Santa Monica College.

Sources:

Inman, C.S., Manns, J.R., Bijanki, K.R., Bass, D.I., Hamann, S., Drane, D.L., Fasano, R.E., Kovach, C.K., Gross, R.E., and Willie, J.T. 2018. Direct electrical stimulation of the amygdala enhances declarative memory in humans. PNAS 115: 98-103.

Emory Health Sciences. 2017. Direct amygdala stimulation can enhance human memory for a day: Preliminary study of time-specific electrical stimulation. ScienceDaily. Retrieved January 31, 2018 from www.sciencedaily.com/releases/2017/12/171218151808.htm.