Tumors, No Longer a Guessing Game

What makes one tumor different from another? The word ‘cancer’ describes hundreds of maladies, from a misshapen mole, to leukemic blood cells, to massive tumors in the heart, lungs, or anywhere else. But what makes a heart tumor different from a lung tumor? And why are there four different types of lung cancer? The answer lies in the DNA of each cancer cell.

Currently, doctors diagnose different types of cancer based primarily on where they are located, what they look like in scans, and which molecules they contain when biopsy samples are analyzed. A new study introduces a different method: examining the methylation of cancer DNA, a chemical modification that switches specific genes in a cell off or on and can characterize different disorders. Because each tumor type carries a distinctive pattern of DNA methylation, a diagnosis can be assigned simply by removing a small piece of the tumor and analyzing it in the lab.

This study looked specifically at tumors of the central nervous system (CNS). The researchers ran the methylation-based diagnostic test on samples from 1,104 patients, each previously diagnosed with one of 64 distinct CNS tumor types. In 76% of cases, the test matched the existing diagnosis. In 12% of cases, the original diagnosis was revised as a result of the test. In the remaining cases, the test could not match the sample to a known methylation class.

These results are extremely encouraging, and point toward the end of subjective cancer diagnoses, in which doctors must act as detectives, gathering clues that point to the most likely culprit. The new test reads a fingerprint: every cancer type has a distinct one, and once they can be told apart, doctors need only consult a database stocked with thousands of cancer prints.
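To get a feel for what "matching a fingerprint against a database" means, here is a deliberately tiny sketch in Python. The class names, the four methylation sites, and every number in it are invented for illustration; the actual study used a far more sophisticated classifier trained on thousands of methylation sites.

```python
# Illustrative sketch: matching a tumor's methylation "fingerprint" against a
# reference database. Each profile is a list of methylation fractions (0-1)
# at a handful of hypothetical CpG sites; the real test uses thousands.

def similarity(a, b):
    """Average agreement between two profiles (1.0 = identical)."""
    return sum(1 - abs(x - y) for x, y in zip(a, b)) / len(a)

def classify(profile, reference, threshold=0.8):
    """Return the best-matching tumor class, or None if nothing is close enough."""
    best_class, best_score = None, 0.0
    for tumor_class, ref_profile in reference.items():
        score = similarity(profile, ref_profile)
        if score > best_score:
            best_class, best_score = tumor_class, score
    return best_class if best_score >= threshold else None

# Toy reference database of class "fingerprints" (entirely made-up values).
reference = {
    "glioblastoma-like": [0.9, 0.1, 0.8, 0.2],
    "medulloblastoma-like": [0.2, 0.9, 0.1, 0.7],
}

print(classify([0.85, 0.15, 0.75, 0.3], reference))  # close to the first class
print(classify([0.5, 0.5, 0.5, 0.5], reference))     # matches no known class
```

The `None` case mirrors the study's remaining instances: a profile that resembles no known methylation class is flagged rather than forced into a diagnosis.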

And this database is already live and free the world over: the data for every cancer case tested in this manner will be shared. Doctors will be able to readily compare the DNA of the cancers they encounter with other cases, which will likely lead to the discovery of new, unique cancers and dramatically improve the accuracy of diagnoses overall.


Source:

Capper D, Jones DTW, Sill M, Hovestadt V, Schrimpf D, et al. (2018) DNA methylation-based classification of central nervous system tumours. Nature. DOI: 10.1038/nature26000

Immunotherapy a Potential Treatment for Breast Cancer


Immunotherapy is the treatment of a disease by modifying the immune system. In the past, breast cancer was believed not to trigger an immune response, the reaction of immune cells against a foreign substance that is not recognized as part of the body. In early 2018, researchers at the University of Washington in Seattle published a study assessing the benefits and potential future of immunotherapy as a treatment for breast cancer.

In the last decade, after an immune response was detected in breast cancer patients, numerous studies have considered immunotherapy a possible treatment for this type of cancer. One study found high levels of T-cells (a type of white blood cell) directed against tumor-associated proteins in women up to six months before they were diagnosed with breast cancer. Immune checkpoint inhibitor therapy uses drugs to block certain proteins that prevent immune cells from killing cancer cells. These agents allow T-cells to recognize cancer cells and, by dividing and multiplying, limit tumor growth.

Research has shown that as cancer progresses, the production of antigens (substances that trigger an immune response) diminishes, weakening the immune response. This implies that to design an effective immunotherapy, researchers must develop strategies that reverse this effect and boost the immune response that promotes tumor destruction. One of the first breast cancer-associated antigens identified was the MUC-1 (Human Mucin-1) protein. T-cells targeting this protein are present in low numbers in patients with the disease, so boosting those numbers is a potential therapeutic strategy.

It should be noted that despite recent discoveries, breast cancer is still a poor inducer of immune response. Tumor-infiltrating lymphocytes (TILs), another type of white blood cell, penetrate tumor tissue and destroy tumor cells through direct physical contact. However, their occurrence and concentration vary with the type of breast cancer.

Studies are currently under way to identify the patients most likely to benefit from immune checkpoint inhibitor therapies. This is aided by the fact that, despite fluctuating amounts, TILs are always present to some extent. One strategy is therefore to increase the number of TILs beforehand, and immunization has been found to be a possible way of accomplishing this: research has shown that vaccine-induced T-cells can travel to the tumor and increase TIL levels. Several vaccines are currently under development that target multiple breast cancer antigens at once to make treatment more effective.

In addition to immunization, standard therapies like radiation and chemotherapy can also increase the amount of TILs. As we better understand the immune-stimulating effects of traditional treatments, we can use them more effectively. The immune environment of a tumor could thus guide combinations of standard and novel therapeutic strategies into more effective treatments. Research on such methodologies is still in progress, and there is much to learn before knowledge of the immune response in breast cancer can improve treatment approaches.

Reference: Disis, M.L., Stanton, S.E. 2018. Immunotherapy in Breast Cancer: An Introduction. The Breast 37:196-199.


Soil Cannot Mitigate Climate Change

Crop Field: Using crops to transfer carbon dioxide into the soil has been found to be an unrealistic option.

It was once considered a groundbreaking idea that climate change could be mitigated by burying carbon in the ground. However, in late February of 2018, scientists at Rothamsted Research published findings in the journal Global Change Biology showing that soil data stretching back to the mid-19th century demonstrate that carbon emissions cannot be stored in the ground at the rates proposed. The researchers reached this conclusion by analyzing the rate of change of carbon levels in soil.

The original idea of using crops to collect carbon from the atmosphere and bury it in the soil was proposed in 2015 at an international conference. The aim of the proposal was to increase carbon sequestration (the removal of carbon dioxide from the atmosphere and its storage in solid or liquid form) by “4 parts per 1000” each year. The researchers at Rothamsted argued that this rate is unrealistic over such large areas of the planet, because levels of soil carbon are not unrestricted: as the levels increase, they move toward equilibrium and eventually stop growing.
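For a sense of scale, here is a quick back-of-envelope compound-growth calculation of what "4 parts per 1000 per year" would mean for a single field. The starting stock of 60 tonnes of carbon per hectare is a made-up figure for illustration, not a number from the study, and the calculation ignores the equilibrium cap the Rothamsted team emphasized:

```python
# Illustrative arithmetic for the "4 per 1000" target: a soil carbon stock
# growing 0.4% per year, compounded. Starting stock is hypothetical, and real
# soils would level off toward equilibrium rather than grow indefinitely.

stock = 60.0        # tonnes of carbon per hectare (invented starting value)
rate = 4 / 1000     # the "4 per 1000" annual growth target

for year in range(20):
    stock *= 1 + rate

print(round(stock, 1))  # after 20 years: about 65.0 t C/ha, an ~8.3% total gain
```

Even this idealized version of the target yields only a modest gain over two decades, which helps explain why the researchers found that reaching it in practice required impractically extreme measures.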

Data from 16 experiments on three different soil types were examined, giving 110 treatment comparisons. The researchers observed the “4 per 1000” rate of growth in soil carbon levels in some cases, but only when such extreme measures were taken that they would be impractical in a real-life setting.

Not only did these experiments demonstrate the impracticality of the “4 per 1000” initiative, they also showed that high rates of soil carbon increase can be achieved by removing land from agriculture. However, such a drastic decrease in agriculture over vast expanses of land would be incredibly damaging to global food security. To mitigate this problem, the researchers suggested returning crop residue to the soil as an effective way to increase soil carbon sequestration; this has been observed as a practical method in smallholder agriculture settings in some countries.

The researchers also suggested that long-term crop rotation with the occasional introduction of pasture could lead to significant soil carbon increases. While the environmental benefits are clear, this method is economically impractical for most farmers. An effective change in agricultural methods would therefore require new policies or guidelines.

Overall, the “4 per 1000” initiative is unrealistic as a major contribution to climate change mitigation. The scientists at Rothamsted Research suggest that practices that increase soil carbon levels are better promoted on the grounds of sustainable food security and wider ecosystem services.

Paul Poulton, Johnny Johnston, Andy Macdonald, Rodger White, David Powlson. Major limitations to achieving “4 per 1000” increases in soil organic carbon stock in temperate regions: Evidence from long-term experiments at Rothamsted Research, United Kingdom. Global Change Biology, 2018; DOI: 10.1111/gcb.14066

The Race to Mars: The Ins and Outs of How it Will Happen via the Deep Space Gateway


For many, the fantasy of going to Mars may soon become a reality. In 2017, Michelle Rucker and John Connolly of the Mars Study Capability Team at NASA gave a presentation on the specifics of just how humans will get to Mars. A key element of this mission will be the Deep Space Gateway (more information on the DSG can be found in my blog post from last week!). As a quick refresher, the Deep Space Gateway is a space station orbiting near the Moon that will allow people to inhabit cislunar space (the region between the Earth and the Moon) and will aid in transporting astronauts to Mars.

The team began their presentation by discussing how the Mars mission fits into the vision for coordinated human and robotic exploration of our solar system. This vision is entitled the Global Exploration Roadmap and one of its main goals is to explore and have astronauts live on Mars and in space. The pair of researchers highlighted the Deep Space Gateway as a way to “provide a convenient assembly, checkout, and refurbishment location to enable Mars missions”. After explaining the role of the Deep Space Gateway (DSG), the team went into the specifics of the phases of the project and the parts of the Mars mission.

After the setup and construction of the DSG and its components (Phase 1), Phase 2 begins with a 180-day Deep Space Transport (DST) checkout and a one-year shakedown cruise. During this cruise, the Deep Space Gateway remains in orbit and is supplied with astronauts and cargo by the Orion capsule. At the same time, the DST flies a path encircling the Moon to simulate the deep space trip to Mars. This piece is critical, as the DST will be the vessel that actually takes astronauts to Mars.

Phase 3 consists of our first mission to Mars via the DST! Phase 3 is only a “fly-by”: the DST enters Mars orbit without any interaction between humans and the Martian surface. This will prove our technical ability to travel all the way to Mars from Earth. The most interesting things occur in Phase 4, when the first humans land on Mars. The first three missions of Phase 4 will revisit the same landing site in order to establish a field station on the Martian surface. The Deep Space Gateway will also play an important role here, providing an easy access point for making any necessary repairs to the Mars mission hardware.

OPINION: This project has been going for a LONG time, but unfortunately there is still a long way to go, and we won’t see anyone standing on Mars until the 2030s. It’s incredible to think that all of this progress has been made, yet we are still so far away. There’s a lot of work that still needs to be done, but I’ll be happy and very interested to see where NASA goes with this.

LINK: https://ntrs.nasa.gov/archive/nasa/casi.ntrs.nasa.gov/20180000122.pdf

Rucker, M., Connolly, J. 2017. Deep Space Gateway – Enabling Missions to Mars. NASA Technical Reports Server: JSC-E-DAA-TN49931.

Technology Exposed

Image by Zoopah

How much energy do our technologies consume? Researchers from McMaster University answer this question in their study of trends in the global emissions and lifespans of Information and Communication Technology (ICT) devices and services. They based their study on smart phones, tablets, displays, notebooks, desktops, and data centers. According to their results, ICT infrastructure, such as data centers and communication networks, is the largest contributor to ICT energy consumption and CO2 emissions.

Data centers emit 1,314 to 3,743 kg CO2-e (carbon dioxide equivalent) per year while in use, accounting for 33% of the global greenhouse gas emission (GHGE) footprint of ICT devices in 2010. The average lifespan of a data center is ten years, and the servers inside last three to five years. Because data centers support the internet and telecommunication systems, they are in constant use, resulting in high energy consumption. In comparison, communication networks, which encompass telecom operator networks, office networks, and customer premises access, contributed 28% of the global footprint in 2010. Based on data from 2007-2012, the energy consumption of data centers is projected to increase by a further 12% by 2020.

After data centers and communication networks, smart phones are the next-largest contributor to the ICT greenhouse gas footprint and will account for 11% of it in 2020, up from 4% in 2010. Smart phones, specifically Apple iPhones in the study, have an average lifespan of 1.8 to 2 years. In absolute terms, the researchers’ model predicted a 730% increase in the GHGE footprint of smart phones from 2010 to 2020; in 2020, smart phones will release 125 MT CO2-e into the environment. The increase in emissions is driven by the short lifespan of these devices: because phones wear out quickly, more of them must be produced. Planned obsolescence is intentional in technological design, and it contributes to a profitable business model for phone manufacturers and the telecom industry.

In contrast to data centers, communication networks, and smart phones, the footprints of displays, notebooks, and desktops will decrease by 2020 due to the transition to heavy phone usage. Below, Figure 1 displays the change in GHGE by ICT category.

Figure 1: Data from Belkhir and Elmeligi on the relative contribution of each Information and Communication Technology category to global energy consumption in 2010 and 2020

Why do these numbers matter? Under the Paris Agreement, 196 nations agreed to limit global warming to below 2°C. If the production of ICT devices and services continues as is, we will fall short of this commitment. In 2007, ICT accounted for 1-1.6% of global greenhouse gas emissions; this share could exceed 14% worldwide by 2040 if we continue our current practices. More importantly, these emissions would undermine the global initiative to keep temperatures in check.

Image from WIRED

So, what now? The researchers suggest that we instill sustainable practices in the production and operation of data centers and communication networks through the use of renewable energy. It will also be important to raise awareness of the global energy consumed by technology. This research provides insight into the environmental impacts of our technology; to meet our global commitments, it will be crucial to adopt new methods.

Source: Lotfi Belkhir, Ahmed Elmeligi. Assessing ICT global emissions footprint: Trends to 2040 & recommendations. Journal of Cleaner Production, 2018; 177: 448 DOI: 10.1016/j.jclepro.2017.12.239

Coral Reefs Getting Slammed, This Time Plastics Are to Blame

Coral bleaching isn’t the only major issue affecting our coral reefs: plastic waste is once again damaging marine life, this time on the reefs themselves. Scientists from around the world, including teams at Cornell University and James Cook University in Queensland, Australia, found that corals entangled in plastic are far more likely to be infected with pathogens and diseases than those that aren’t. The scientists surveyed 159 coral reefs and visually examined over 120,000 individual corals throughout the Asia-Pacific region. A third of the corals surveyed were wrapped in at least one piece of plastic larger than 50 mm, amounting to 2.0-10.9 plastic items per 100 m². Corals in contact with plastic waste saw their likelihood of contracting a disease increase by more than a factor of 20, from a normal rate of 4.4 ± 0.2% to 89.1 ± 3.2%. Given the widespread distribution of plastic debris on coral reefs and continued pollution rates, the authors estimate that if everything continues as it is, over 15.7 billion plastic items will be wrapped up in coral reefs by 2025, meaning huge increases in mortality rates, to which most of the plastic-associated diseases eventually lead.
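That "factor of 20" isn't an extra claim; it falls straight out of the two percentages the study reports, as this one-line check shows:

```python
# Quick check of the reported jump in disease likelihood for plastic-entangled
# corals: from a baseline of 4.4% to 89.1% (figures from the study).

baseline = 4.4       # % of corals with disease when no plastic is present
with_plastic = 89.1  # % of corals with disease when entangled in plastic

relative_risk = with_plastic / baseline
print(round(relative_risk))  # just over a 20-fold increase
```
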

Plastic bag entangled in coral

With over 275 million people worldwide relying on coral reefs as a food source, this plastic waste is a major problem. The hard part is that plastics aren’t uniformly distributed. More waste is found near poorer regions of the world, which tend to recycle less and pollute more. These developing countries also tend to rely more heavily on the ocean, particularly coral reefs, as their main food source, so this phenomenon will affect them to a greater degree. For example, the study found that Indonesia had the highest amount of plastic debris in its surrounding waters and, correspondingly, the highest percentage of coral reefs in contact with plastics of anywhere studied.

The overall impact? Not good. It has been stressed over and over again, and will continue to be: waste management is a must. Decreasing the amount of debris that enters the ocean is imperative for marine life as well as for ourselves. It is vital that we reduce the amount of plastic on our coral reefs and, with it, the diseases it is associated with.

Source

Lamb, J.B., et al. 2018. Plastic waste associated with disease on coral reefs. Science 359(6374): 460-462.

Asthmatics Breathe a Sigh… of Exasperation


Glucocorticoids are a class of steroids that asthma patients the world over use by way of an inhaler. They are one of several drug classes that can reliably save a person’s life during an asthmatic exacerbation. But as many know all too well, severe asthmatic episodes can persist despite the drug. Conventional wisdom among parents and doctors has been to increase the dosage of glucocorticoids in patients with more serious symptoms.

But a new study of asthmatic children across the United States, funded by the National Heart, Lung, and Blood Institute, finds no significant difference in the severity of asthma symptoms or the frequency of episodes between standard and quintupled doses of glucocorticoids. Further, the study suggests that high doses may stunt growth in children, raising the possibility of more serious problems with childhood use. The article was published in The New England Journal of Medicine.

The year-long study enrolled over 250 asthma sufferers aged 5 to 11 and divided them into two groups: continued low-dose use and quintupled use. The trial was double-blind: neither the patients, their parents, nor the doctors knew which treatment each child was receiving.

The results suggest no significant difference in the occurrence or intensity of asthmatic exacerbations regardless of how much glucocorticoid was administered. The study also tracked the children’s height and weight during the period and found that patients taking the quintuple dose were slightly shorter, after controlling for various differences. This may be an indicator of developmental problems caused by continuous high use of the drug, and should concern anyone taking or prescribing it.

This study highlights the need for more effective asthma medications and a better understanding of how existing ones work. Glucocorticoids are self-administered with an inhaler, often on an as-needed basis. Add to this the fact that children are the most heavily afflicted by asthma, and it becomes extremely easy for a patient to take too much of their prescription. Granted, a quintuple dose is a far cry from the average, but when a double or triple dose does not lessen the symptoms, one can tend to take too much. In an age of ultra-targeted, molecular approaches to medicine, more precise asthma treatments should be available, given glucocorticoids’ low ceiling of effectiveness, let alone their possible ill effects.


Jackson DJ, Bacharier LB, Mauger DT, Boehmer S, Beigelman A, et al. (2018) Quintupling Inhaled Glucocorticoids to Prevent Childhood Asthma Exacerbations. New England Journal of Medicine. DOI: 10.1056/NEJMoa1710988

How to Feed 7.6 billion People

Can our current farming systems keep up with a growing population while also protecting the land we eat from? It’s a tough question, but a study published earlier this year suggests it is possible. The research focuses on farms in the Northern Plains of the United States, comparing those under a conventional corn production system with those under a regenerative agriculture system. The regenerative farms in the study never tilled their fields, did not use insecticides, grazed their livestock on the cropland, and grew a mix of cover crop species. The conventional farms practiced tillage, used insecticides, and left the soil bare after harvest.

Researchers collected soil cores from each farm to determine the amount of organic matter within. Along with this, pest abundance, yield, and profit were assessed; yield in this case was measured as gross revenue. The study found that regenerative systems had 29% lower grain production but 78% higher profits than conventional agriculture. In addition, there were ten times as many pests on fields treated with insecticides as on those that were not. All of this is because regenerative agriculture lets nature do its job. Spraying insecticides on a field is not only harmful to the environment but often ineffective: insects can adapt to new chemicals and persist even more strongly when their natural predators are eliminated. Biodiversity within cropland can reduce pest numbers and persistence. Regenerative agriculture raises organic matter in the soil, which in turn allows for increased soil infiltration, more diverse soil life, less fertilization, and lower input costs. Systems that integrate livestock and cropland can also see higher profits, as the animals feed on the cover crops, reducing fodder inputs and allowing more of the corn harvest to feed humans. Conventional farming sees smaller profits because of its high seed, fertilizer, and insecticide investments.

Regenerative agriculture has become a sustainable alternative to traditional farming because it provides ecosystem services while producing higher profits than the more input-intensive conventional system. Like many recent studies, this one favors the unconventional method, showing increased profitability and farm health under regenerative agriculture. The abundance of new agricultural research shows that we can feed the world if we simply change how we grow our food. There needs to be a shift in farming values that prioritizes the land, resources, and the quality of food over high yield numbers.

Source: LaCanne, C.E., and Lundgren, J.G. 2018. Regenerative agriculture: merging farming and natural resource conservation profitably. PeerJ 6:e4428.

Photo source:  Flickr

What Would You Call a Reliable Tuberculosis Test?


Latent tuberculosis infection (LTI) occurs when an individual’s immune system responds to the bacterium that causes tuberculosis, but the person has no clinical signs and is not infectious to others. If untreated, LTI can progress into active tuberculosis. The tuberculin skin test (TST) and a blood test (the interferon gamma release assay, IGRA) are the methods for detecting LTI. However, the results of these tests can differ widely, especially in patients with compromised immunity who have received the BCG vaccine against TB.

The TST is conducted by injecting a small amount of TB protein under the top skin layer of the inner forearm. Individuals with LTI develop a firm red bump ≥ 5 mm in diameter when inspected after 48-72 hours. Researchers at Ankara University School of Medicine in Turkey conducted a study from 2013 to 2017 to check whether higher cut-off values for the TST (the diameter of the red bump) would increase its specificity and its agreement with the blood test. The study covered three groups: all participants, solid organ transplantation (SOT) candidates, and patients scheduled for anti-TNFα treatment (used in immunosuppressive conditions like rheumatoid arthritis). All subjects were BCG vaccinated; patients with a history of active TB were excluded.

To test whether a change in the cut-off value would give better results, both the TST and the IGRA were conducted for all three groups, and the TST was evaluated at three different cut-off values. The diagnostic agreement was very poor at 5 mm and 15 mm but increased slightly at 10 mm in the anti-TNFα group. Overall, the results showed that although false-positive results decreased at higher cut-off values, false-negative results increased.
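The trade-off in that last sentence can be sketched numerically. The induration diameters below are entirely invented for illustration (they are not data from the study), but they show the mechanism: raising the cut-off trades false positives for false negatives.

```python
# Sketch of the cut-off trade-off, with made-up induration diameters (mm).
# "Positive" means the bump diameter meets or exceeds the cut-off.

truly_infected = [6, 9, 12, 14, 18]    # hypothetical patients with latent TB
not_infected   = [0, 4, 6, 7, 11, 13]  # hypothetical BCG-vaccinated, uninfected

def error_rates(cutoff):
    """Return (false-negative rate, false-positive rate) at a given cut-off."""
    false_neg = sum(d < cutoff for d in truly_infected) / len(truly_infected)
    false_pos = sum(d >= cutoff for d in not_infected) / len(not_infected)
    return false_neg, false_pos

for cutoff in (5, 10, 15):
    fn, fp = error_rates(cutoff)
    print(f"cut-off {cutoff} mm: false negatives {fn:.0%}, false positives {fp:.0%}")
```

With these toy numbers, the 5 mm cut-off misses no infections but flags most of the uninfected (BCG-boosted) group, while 15 mm eliminates false positives at the cost of missing most true infections, the same pattern the researchers observed.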

TST is known to give false-positive results in BCG vaccinated individuals. 

Studies have shown that the type and timing of vaccination affect the TST response. In Turkey, BCG vaccines are given once during infancy and once at school age. Vaccination in infancy is believed to stop affecting the TST after 10-15 years; repeated vaccinations after infancy, however, are known to have a longer-lasting effect. Most of the disagreements between the two tests were cases that were positive on the skin test (TST) and negative on the blood test (IGRA), and the researchers suspected that the vaccinations given after infancy might be the cause of this discrepancy. They concluded that higher cut-off values for the TST were not very effective in decreasing the variation between the two tests.

Analyses of the direct costs and probable consequences of the tests have shown that a greater percentage of TST-screened patients receive preventive treatment, owing to the high rate of false-positive results. This means resources wasted on unnecessary treatment, along with the potential risk that widespread treatment fosters resistance to TB drugs. The rates are lower for IGRA alone and for two-step screening strategies that include both the TST and IGRA. The American Thoracic Society (ATS) has stated that there is insufficient data to recommend any of these three approaches outright, but its guidelines for immunocompromised patients specifically recommend IGRAs.

There is a need for more conclusive results to assess the reliability of the tests for LTI, especially in light of the inconvenience that TSTs cause for BCG vaccinated individuals.

Reference: Erol, S., Ciftci, F. A., Ciledag, A., Kaya, A., Kumbasar, O. O. 2018. Do higher cut-off values for tuberculin skin test increase the specificity and diagnostic agreement with interferon gamma release assays in immunocompromised Bacillus Calmette-Guerin vaccinated patients? Advances in Medical Sciences 63(2): 237-241


Will therapy benefit OCD patients? Computers have the answer.

Patient being prepared for an fMRI. Credit: Ptrump16, Creative Commons.

Obsessive-compulsive disorder, or OCD, is characterized by unwanted, repetitive thoughts and impulsive, ritualistic actions. For example, a common fear among those with OCD is a fear of germs, which results in repetitive hand-washing. While OCD has historically been difficult to treat effectively, in recent years modifications to cognitive-behavioral therapy have had more success. Cognitive-behavioral therapy comprises a series of sessions in which a therapist and patient identify negative thought patterns and symptoms and address them through discussion, exposure to stress-inducing stimuli, and practice using alternative coping mechanisms to ease anxiety.

While cognitive-behavioral therapy can be effective, it is time-consuming and does not work for everyone. Using functional magnetic resonance imaging (fMRI), scientists at UCLA trained a computer analysis system to study the brains of individuals with OCD and determine which individuals were most likely to benefit from cognitive-behavioral therapy. Their study demonstrated that from a seven-minute fMRI scan of an OCD patient, the computer program could predict the success of cognitive-behavioral therapy for that particular patient with 67-70% accuracy.

For their study, researchers recruited 42 adults with OCD. All of the participants underwent fMRIs at the beginning of the study. Then, half of the participants attended cognitive-behavioral therapy sessions lasting about 90 minutes per session, five days a week for four weeks. At the end, their brains were analyzed with an fMRI again to detect any differences in structure or brain function. The other half of the participants were put on a four-week waitlist. At the end of four weeks, having received no therapy, their brains were scanned to see if there were any differences simply due to time. These participants received cognitive-behavioral therapy treatment after the four-week waiting period.

On the fMRI scans, the researchers were especially interested in the regions of the brain, and their cellular networks, that regulate attention, vision, motor skills, memory, self-evaluation, and the abstract sense of “mind-wandering,” or daydreaming, each of which plays a role in the development of OCD. They used mathematical models and machine learning to map differences between the participants’ brains and match those results with the behavioral outcomes of cognitive-behavioral therapy. They found that the computer could suggest which patients would benefit from therapy, regardless of individual symptoms or their severity.
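In spirit, the prediction step is a pattern classifier: learn what the brains of past responders and non-responders looked like, then place a new patient's scan relative to those groups. Here is a deliberately simplified nearest-centroid sketch of that idea. The feature values and the tiny "training set" are invented; the study itself used far richer multivariate methods on resting-state connectivity data.

```python
# Simplified sketch of predicting therapy response from brain "connectivity
# features." All numbers are invented; the real study used multivariate
# machine learning on thousands of connectivity measurements.

# Each past patient: (connectivity feature vector, responded_to_therapy)
training = [
    ([0.8, 0.2, 0.6], True),
    ([0.7, 0.3, 0.7], True),
    ([0.2, 0.8, 0.3], False),
    ([0.3, 0.9, 0.2], False),
]

def centroid(vectors):
    """Average feature vector of a group of patients."""
    return [sum(col) / len(col) for col in zip(*vectors)]

responders = centroid([x for x, y in training if y])
non_responders = centroid([x for x, y in training if not y])

def predict(features):
    """Predict response by distance to the nearer group centroid."""
    def dist(a, b):
        return sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5
    return dist(features, responders) < dist(features, non_responders)

print(predict([0.75, 0.25, 0.65]))  # True: resembles the responder group
```

A classifier like this only "suggests" a label; as the article notes below, real clinical use would require far more validation than distance to an average.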

One of the brain networks studied was the default mode network, or DMN, which plays a role in “mind-wandering,” daydreaming, and abstract thought involved in thinking about the self. Regions of the DMN are highlighted in red in this fMRI scan. Credit: Leigh Hopper, UCLA Newsroom.

Widespread use of this predictive method would give therapists more information when deciding the best route of treatment for their patients. In the study, the researchers advocate for this fMRI computer model as a way to allocate time and resources, directing cognitive-behavioral therapy towards the patients most likely to succeed with it and steering others towards treatments such as medications, inpatient programs, intensive day programs, or group therapy. It is a move towards personalized medicine.

However, more research needs to be done to further advance this technique. Computers alone are not yet adequate to diagnose psychological disorders or comprehend subjective human experience. Furthermore, fMRIs are extremely expensive, and the money going towards fMRI scans could instead be put towards treatment. There is also a risk that those who the computer does not deem fit for cognitive-behavioral therapy miss out on a treatment opportunity that could actually help. While studies like this one advance scientific understanding of disorders like OCD, clinicians should proceed with caution when incorporating new computer-based evaluations that could be wrong and depersonalize the treatment experience.

Sources:

Reggente, N., Moody, T.D., Morfini, F., Sheen, C., Rissman, J., O’Neill, J., & Feusner, J.D. (2018) Multivariate resting-state functional connectivity predicts response to cognitive behavioral therapy in obsessive-compulsive disorder. PNAS [published online ahead of print]. https://doi.org/10.1073/pnas.1716686115.

Hopper, Leigh. 2018. Brain scan and artificial intelligence could help predict whether OCD will improve with treatment. UCLA Newsroom. Retrieved Feb. 5 from http://newsroom.ucla.edu/releases/brain-scan-AI-help-predict-ocd-improve-treatment.

How Drastic Deforestation Is Causing the Earth’s Surface to Heat up

Source: Flickr

Forest ecosystems are a large carbon sink because of their ability to absorb carbon dioxide from the atmosphere. They play a huge role in the mitigation of climate change, but the impacts of deforestation have caused the Earth’s surface to heat up. Researchers at the European Commission Joint Research Centre published an article in February 2018 in the journal Nature Communications detailing how recent changes in the vegetation covering the Earth are warming its surface. They examined how cutting down vast expanses of evergreen forest for agricultural expansion creates energy imbalances that raise local surface temperatures and contribute to global warming overall. These actions alter both the radiative and non-radiative properties of the surface.

Using satellite data, the researchers analyzed changes in vegetation cover from 2000 to 2015 all over the world and linked them to changes in the surface energy balance. The statistical relationship between maps of vegetation cover and variables detailing surface properties acquired by satellite imaging was then analyzed.

The researchers also examined changes between different types of vegetation, including evergreen broadleaf forests, deciduous broadleaf forests, evergreen needleleaf forests, savannas, shrublands, grasslands, croplands, and wetlands. While deforestation results in overall higher levels of radiation leaving Earth’s surface, the balance between the shortwave light the sun emits and the longwave energy the surface radiates back changes depending on forest type. From their observations, the researchers concluded that removal of tropical evergreen forest for agricultural expansion is most responsible for local increases in surface temperature.

Altering the vegetation cover drastically changes the surface’s properties, affecting both the heat dissipated by water evaporation and the radiation reflected back into space. Overall, the researchers determined that land use change has made the planet warmer. Clearly, forest ecosystems play an important role in combating the effects of air pollution, soil erosion, and climate change overall.

Gregory Duveiller, Josh Hooker, Alessandro Cescatti. The mark of vegetation change on Earth’s surface energy balance. Nature Communications, 2018; 9(1). DOI: 10.1038/s41467-017-02810-8