Winter is Coming and Staying

Snowfall in the Greater Boston area has been so severe this year that some public schools have extended classes until July. Source: Flickr

Extreme winters have become the new norm in the northeastern United States, and researchers have recently found a reason why. In March 2018, scientists at Rutgers University published a study in Nature Communications reporting a correlation between the frequency of extreme winter weather in the northeastern United States and changes in Arctic temperatures.

Abnormally warm temperatures in the Arctic cause the jet stream – a band of strong westerly air currents that encircles the globe several miles above the earth’s surface – to occasionally dip farther south, allowing cold air to reach all the way down to the eastern United States. The timing of this research is apt: it follows increasingly extreme winters, record warm Arctic temperatures and low sea ice, record-breaking disruptions of the polar vortex (a large area of low pressure and cold air surrounding each of the Earth’s poles), and record-breaking, disruptive snowfall in the United States and Europe.

Researchers found that severe winter weather is two to four times more likely to occur in the eastern United States when the Arctic is abnormally warm than when it is colder than normal. The study also showed that colder winters in the northern latitudes of Europe and Asia are significantly related to warming of the Arctic. Conversely, severe winter weather in the western United States is more likely when the Arctic is colder than normal.

Researchers found that when warming of the Arctic occurs on the Earth’s surface, there is only a weak connection to severe winter weather in the northeastern region of the United States. However, when warming is extended to the stratosphere, it disrupts the polar vortex and severe weather is more likely.

To reach these conclusions, researchers used three metrics of Arctic variability to diagnose the relationship between severe winter weather in the Northeast and Arctic temperatures: the polar cap geopotential height anomaly index (PCH), the polar cap air temperature anomaly index (PCT), and the Accumulated Winter Season Severity Index (AWSSI).

The PCH and PCT indices measure anomalies in geopotential height (a vertical coordinate referenced to Earth’s mean sea level) and air temperature, respectively, between the 65th parallel north (a circle of latitude 65 degrees north of the Earth’s equator) and the North Pole. The AWSSI quantifies severe weather owing to snowfall and temperature at individual locations across the United States. Researchers analyzed changes in AWSSI in relation to changes in PCT and PCH to explore the relationship between Arctic variability and severe winter weather. They found that larger anomalies in polar cap temperature and geopotential height are correlated with higher values of the AWSSI, meaning more cold spells and heavy snowfalls.
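
As an illustration of the kind of analysis involved (a minimal sketch, not the authors’ code, and the numbers are made up rather than taken from the study), one could correlate a polar-cap temperature anomaly series with a winter severity index in Python:

    # Illustrative only: correlate a hypothetical polar cap temperature anomaly
    # index (PCT-like) with a hypothetical winter severity index (AWSSI-like).
    # All values below are invented for demonstration, not data from the study.
    import numpy as np

    pct_anomaly = np.array([-1.2, -0.5, 0.1, 0.8, 1.5, 2.1, 0.3, -0.9])        # warm Arctic = positive
    winter_severity = np.array([12.0, 15.0, 18.0, 25.0, 31.0, 38.0, 20.0, 14.0])  # higher = harsher winter

    # Pearson correlation between Arctic warmth and winter severity
    r = np.corrcoef(pct_anomaly, winter_severity)[0, 1]
    print(f"correlation between PCT anomaly and winter severity: r = {r:.2f}")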

Inevitably, there will be an increase in certain types of weather extremes due to anthropogenic global warming. Researchers at Rutgers University have presented a quantitative analysis of the link between Arctic variability and severe winter weather, suggesting that the pattern of colder, harsher winters in the Northeast attributed to Arctic warming is no coincidence.

Judah Cohen, Karl Pfeiffer, Jennifer A. Francis. Warm Arctic episodes linked with increased frequency of extreme winter weather in the United States. Nature Communications, 2018; 9(1). DOI: 10.1038/s41467-018-02992-9

What do wastewater and earthquakes have in common?

Researchers conclude that the second largest earthquake in Kansas history, which struck during the 2013-2016 earthquake sequence, resulted from the deep injection of wastewater from oil and gas production. These earthquakes occurred in southern Kansas, most frequently in Harper County and neighboring counties. Historically, the magnitudes of earthquakes in southern Kansas were below 2.0; the activity peaked in 2015 with 51 earthquakes above magnitude 3.0.

For their study, the researchers focused on Sumner and Harper Counties in southern Kansas. Oil and gas operations began in Sumner County in 1915 and in Harper County in 1950, and have contributed 1 million barrels of oil and 1 billion cubic feet of natural gas in Kansas. However, oil production decreased between 1960 and 2010, while gas production remained constant until 2015, especially in Harper County, according to the authors. Renewed interest came from the Mississippian Limestone Play, which promised companies a plentiful source of natural gas through hydraulic fracturing and horizontal drilling in Oklahoma and Kansas.

Wastewater is typically disposed of in the Arbuckle Group, an aquifer that underlies most of the state of Kansas. The Arbuckle Group sits atop the Precambrian basement, and fractures and faults run through these formations. If water withdrawals and wastewater injections disturb the Precambrian basement, frequent earthquakes can result. Within the study area, there was a correlation between the highest-volume injection wells and the wells with the most documented seismic activity nearby, and earthquakes appeared to lag wastewater injection by 2 to 6 months.

To confirm that the link was with deep wastewater injection from oil and gas operations, Rubinstein and his co-authors also investigated whether hydraulic fracturing could have caused the seismic activity. They found this unlikely at this location, because hydraulic fracturing and the earthquakes did not correlate spatially or temporally in their results.

Rubinstein and his co-authors also found that the seismic activity in Kansas parallels the activity in Oklahoma, which has a similar problem with frequent earthquakes. Recently, the rate of earthquakes has decreased in both Oklahoma and Kansas. The authors suggest this is due to economic and regulatory forces that prompted a decline in injection; in 2016, the Kansas Corporation Commission ordered a reduction of injection in southern Kansas. In future research, they will assess how other factors contributed to the decline in earthquakes.

Source: Rubinstein, J.L., Ellsworth, W.L., and Dougherty, S.L. 2018. The 2013-2016 induced earthquakes in Harper and Sumner Counties, southern Kansas. Bulletin of the Seismological Society of America 20:20.

The Comeback of the Chesapeake Bay

Once known for its beauty and abundance of seafood, the Chesapeake is now known for its poor health and struggling ecosystems. Excessive pollution from the areas surrounding the Bay has caused the collapse of fisheries and the creation of dead zones. The neglected health of its waters has come with a hefty price tag, costing the economy and those who depend on the Bay as a way of life. Recognizing the urgent need to save the Chesapeake, government and scientific agencies have come together to take on this huge task.

A new study explains how 30 years of environmental policy governing the Bay has led to the successful recovery of its aquatic ecosystems. Researchers observed an increase in submerged aquatic vegetation (SAV) due to declining nutrient pollution, an example of how recovery can be achieved through management of nutrients and other human stressors. Since 1984, the amount of nitrogen in the water has decreased by 23% and, in return, there has been a 316% increase in SAV. To understand how nutrient pollution affects SAV, the researchers conducted two analyses: the first examined how local watershed nutrient loads affect 120 subestuaries, and the second linked environmental conditions to SAV populations. Both analyses demonstrate that increased nutrient pollution from nonpoint and point sources reduces the amount of SAV. Excess nitrogen causes increased algal cover or the accumulation of sulfides, while excess phosphorus causes phytoplankton blooms and, therefore, decreased sunlight penetration.

Using aerial surveys, biogeochemical monitoring data, historical information, and watershed models, the researchers concluded that the Chesapeake is indeed improving, and that the improvement is due to the conservation and restoration efforts put in place. The recovery of SAV is especially important because these grasses provide habitat for crabs and fish and are a clear indicator of healthy water quality. Slowly but surely, the Chesapeake is on its way to a full recovery. Results from the Chesapeake Bay Foundation’s “State of the Bay” assessment show that 2016 was a record year, with the highest score of Bay health in 18 years. Both the CBF and the researchers agree that, though this is promising news, there is still much more to be done and efforts should continue.

If you live in one of the six states within the Bay’s watershed, consider how your actions can benefit or harm the amazing ecosystem that is the Chesapeake Bay. Visit http://www.cbf.org/ for more information.

 

Article Sources: Lefcheck, J.S., et al. 2018. Long-term nutrient reductions lead to the unprecedented recovery of a temperate coastal region. PNAS. http://www.cbf.org/about-the-bay/state-of-the-bay-report/2016/

Soil Cannot Mitigate Climate Change

Crop Field: Using crops to transfer carbon dioxide into the soil has been found to be an unrealistic option.

It was once considered a groundbreaking idea that climate change mitigation was plausible by burying carbon in the ground. However, in late February 2018, scientists at Rothamsted Research published findings in the journal Global Change Biology showing that soil data stretching back to the mid-19th century demonstrate that carbon emissions cannot be stored in the ground at the proposed rates. The researchers reached this conclusion by analyzing the rate of change of carbon levels in soil.

The idea of using crops to collect carbon from the atmosphere and bury it in the soil was proposed in 2015 at an international conference. The aim of the proposal was to increase carbon sequestration (the removal of carbon dioxide from the atmosphere and its storage in solid or liquid form) by “4 parts per 1000” per year. The researchers at Rothamsted argue that this rate is unrealistic across such large areas of the planet, noting that soil carbon levels cannot grow without limit: as the levels increase, they move towards equilibrium and eventually stop growing.
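
To put the “4 per 1000” target in perspective, here is a back-of-envelope calculation; the starting soil carbon stock is an illustrative assumption, not a figure from the study:

    # Back-of-envelope: what a "4 per 1000" (0.4% per year) increase implies.
    # The 60 t C/ha starting stock is an assumed, illustrative value.
    stock = 60.0          # tonnes of carbon per hectare (assumption)
    rate = 4 / 1000       # 0.4% per year

    print(f"first-year increase: {stock * rate:.2f} t C/ha")

    # Compounded over 20 years, ignoring the equilibrium effect the
    # Rothamsted researchers describe (real soils level off).
    for year in range(20):
        stock *= (1 + rate)
    print(f"stock after 20 years at a constant 0.4%/yr: {stock:.1f} t C/ha")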

Data from 16 experiments on three different soil types were examined, giving 110 treatment comparisons. The researchers observed the “4 per 1000” rate of growth in soil carbon levels in some cases, but only when such extreme measures were taken that they would be impractical in a real-life setting.

Not only did these experiments demonstrate the impracticality of the “4 per 1000” initiative, they also showed that high rates of soil carbon increase can be achieved by removing land from agriculture. However, such a drastic reduction in agriculture over vast expanses of land would be incredibly damaging to global food security. To mitigate this problem, the researchers suggest returning crop residue to the soil as an effective way to increase soil carbon sequestration; this has been observed as a practical method in smallholder agriculture settings in some countries.

The researchers also suggested that long-term crop rotation with occasional introduction of pasture could lead to significant soil carbon increases. While the environmental benefits are clear, this method is economically impractical for most farmers. In order for an effective change in agricultural methods to be plausible, there would have to be implementation of new policy or guidelines.

Overall, the “4 per 1000” initiative is unrealistic as a major contribution to climate change mitigation. The scientists at Rothamsted Research suggest that the more compelling reasons for promoting practices that increase soil carbon are sustainable food security and wider ecosystem services.

Paul Poulton, Johnny Johnston, Andy Macdonald, Rodger White, David Powlson. Major limitations to achieving “4 per 1000” increases in soil organic carbon stock in temperate regions: Evidence from long-term experiments at Rothamsted Research, United Kingdom. Global Change Biology, 2018; DOI: 10.1111/gcb.14066

Technology Exposed

Image by Zoopah

How much energy do our technologies consume? Researchers from McMaster University answer this question in their study of trends in the global emissions and lifespans of Information and Communication Technology (ICT) devices and services. They based their study on smartphones, tablets, displays, notebooks, desktops, and data centers. Based on their results, ICT infrastructures such as data centers and communication networks are the largest contributors to energy consumption and CO2 emissions.

Data centers emit 1314 to 3743 kg CO2-e/year (carbon dioxide equivalent) while in use, equivalent to 33% of the global greenhouse gas emission (GHGE) footprint of ICT devices in 2010. The average lifespan of a data center is ten years, and the servers inside last three to five years. Since data centers support the internet and telecommunication systems, they are in constant use, resulting in higher energy consumption. In comparison, communication networks, which encompass telecom operator networks, office networks, and customer premises access, contributed 28% of the global footprint in 2010. Based on energy consumption data from 2007 to 2012, data center consumption is projected to grow by roughly 12% by 2020.

After data centers and communication networks, smartphones are the next largest contributor to the ICT greenhouse gas footprint, projected to account for 11% of it in 2020, compared to 4% in 2010. Smartphones, specifically Apple iPhones in the study, have an average lifespan of 1.8 to 2 years. The researchers’ model of absolute GHGE footprints predicts a 730% increase in smartphone emissions from 2010 to 2020; in 2020, smartphones will release 125 Mt CO2-e into the environment. The increase in emissions is driven by the short lifespan of these devices, which means more phones must be produced. Planned obsolescence is intentional in technological design, and it contributes to a profitable business model for phone manufacturers and the telecom industry.
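
Working backwards from the figures quoted above gives a rough sense of the starting point; this is a derived check, not a number reported directly in the text:

    # Rough check derived from the figures quoted above; treat it as an
    # illustration, not a value reported directly by Belkhir and Elmeligi.
    footprint_2020 = 125.0       # Mt CO2-e in 2020 (from the text)
    increase = 7.30              # a 730% increase means 2020 = 2010 * (1 + 7.30)

    footprint_2010 = footprint_2020 / (1 + increase)
    print(f"implied 2010 smartphone footprint: ~{footprint_2010:.0f} Mt CO2-e")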

In contrast to data centers, communication networks, and smartphones, the footprints of displays, notebooks, and desktops will decrease by 2020 due to the transition to heavier phone usage. Figure 1 below displays the change in GHGE by ICT category.

Figure 1: Data from Belkhir and Elmeligi on the relative contribution of each Information and Communication Technology category in 2010 and 2020

Why do these numbers matter? Under the Paris Agreement, 196 nations agreed to limit global warming to below 2°C. If the production of ICT devices and services continues as is, we will fall short of this commitment. In 2007, ICT accounted for 1-1.6% of global greenhouse gas emissions; this share could exceed 14% worldwide by 2040 if we continue our current practices. More importantly, this would undermine the global initiative to limit the rise in global temperature.

Image from WIRED

So, what now? The researchers suggest instilling sustainable practices in the production and operation of data centers and communication networks through the use of renewable energy. It will also be important to raise awareness of global energy consumption from technology. This research provides insight into the environmental impacts of our technology; to meet our global commitments, it will be crucial to adopt new methods.

Source: Lotfi Belkhir, Ahmed Elmeligi. Assessing ICT global emissions footprint: Trends to 2040 & recommendations. Journal of Cleaner Production, 2018; 177: 448 DOI: 10.1016/j.jclepro.2017.12.239

How to Feed 7.6 billion People

Can our current farming systems keep up with a growing population while also protecting the land that feeds us? It’s a tough question, but a study published earlier this year suggests it is possible. The research focuses on farms in the Northern Plains of the United States, comparing those under a conventional corn production system with those under a regenerative agriculture system. The regenerative farms in the study never tilled their fields, did not use insecticides, grazed their livestock on the cropland, and grew a mix of cover crop species. The conventional farms practiced tillage, used insecticides, and left the soil bare after harvest.

Researchers collected soil cores from each farm to determine the amount of organic matter within, and also assessed pest abundance, yield, and profit; yield in this case was measured as gross revenue. The study found that regenerative agriculture systems had 29% lower grain production but 78% higher profits, nearly two times that of conventional agriculture. In addition, there were ten times as many pests on fields treated with insecticides as on those that were not. All of this is because regenerative agriculture allows nature to do its job. Spraying insecticides on a field is not only harmful to the environment but also ineffective: insects adapt to new chemicals and persist even more strongly when their natural predators are eliminated. Biodiversity within cropland can reduce pest numbers and their persistence. Regenerative agriculture raises organic matter in the soil, which in turn allows for increased water infiltration, more diverse soil life, less fertilization, and lower input costs. Systems that integrate livestock and cropland can also see higher profits, since the animals feed on the cover crops, reducing fodder inputs and allowing more of the corn harvest to feed humans. Conventional farming sees smaller profits because of its high seed, fertilizer, and insecticide investments.
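
To see how 29% lower grain production can still leave profits higher, here is a toy calculation; the dollar figures are hypothetical (the study reports percentages, not these amounts), with the regenerative cost chosen so the example reproduces the reported 78% gap:

    # Toy example with made-up dollar figures: lower yield but much lower
    # input costs can still mean higher profit per hectare.
    conv_revenue, conv_costs = 1000.0, 800.0     # hypothetical conventional farm ($/ha)
    regen_revenue = conv_revenue * (1 - 0.29)    # 29% lower grain production/revenue
    regen_costs = 354.0                          # hypothetical: no insecticide, less seed and fertilizer

    conv_profit = conv_revenue - conv_costs      # 200 $/ha
    regen_profit = regen_revenue - regen_costs   # 356 $/ha

    print(f"conventional profit: ${conv_profit:.0f}/ha")
    print(f"regenerative profit: ${regen_profit:.0f}/ha "
          f"({(regen_profit / conv_profit - 1) * 100:.0f}% higher)")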

Regenerative agriculture has become a sustainable alternative to traditional farming because it provides ecosystem services while producing higher profits than the more input-intensive conventional system. Like many recent studies, this one favors the unconventional farming method, showing increased profitability and farm health for those using regenerative agriculture. The abundance of new research in agriculture shows that we can feed the world if we simply change how we grow our food. There needs to be a shift in farming values that prioritizes the land, resources, and the quality of food over high yield numbers.

Source: LaCanne, C.E., and Lundgren, J.G. 2018. Regenerative agriculture: merging farming and natural resource conservation profitably. PeerJ 6: e4428.

Photo source:  Flickr

How Drastic Deforestation Is Causing the Earth’s Surface to Heat up

Source: Flickr

Forest ecosystems are a large carbon sink because of their ability to absorb carbon dioxide from the atmosphere. They play a huge role in the mitigation of climate change, but the impacts of deforestation have caused the Earth’s surface to heat up. Researchers at the European Commission Joint Research Centre published an article in February 2018 in the journal Nature Communications detailing how recent changes to the vegetation that covers the earth are causing it to heat up. They examined how cutting down vast expanses of evergreen forest for agricultural expansion creates energy imbalances that contribute to the rise in local surface temperatures and to global warming overall. These actions alter both radiative and non-radiative properties of the surface.

Using satellite data, the researchers analyzed changes in vegetation cover from 2000 to 2015 all over the world and linked them to changes in the surface energy balance. The statistical relationship between maps of vegetation cover and variables detailing surface properties acquired by satellite imaging was then analyzed.

The researchers also examined changes between different types of vegetation cover, including evergreen broadleaf forests, deciduous broadleaf forests, evergreen needleleaf forests, savannas, shrublands, grasslands, croplands, and wetlands. While deforestation results in overall higher levels of radiation leaving Earth’s surface, the balance between the shortwave light the sun emits and the longwave energy the surface radiates back changes depending on forest type. From their observations, the researchers concluded that removing tropical evergreen forest for agricultural expansion is most responsible for local increases in surface temperature.
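
As a simplified illustration of why clearing dark forest can warm the local surface even though a brighter surface reflects more sunlight, here is a toy energy-balance sketch; all values are typical, assumed numbers for illustration, not figures from the study:

    # Toy surface energy balance with illustrative, assumed values.
    # The point: clearing tropical forest raises albedo (less sunlight absorbed)
    # but cuts evaporative cooling even more, so the local surface ends up warmer.
    incoming_sw = 200.0                                  # W/m^2, assumed mean incoming shortwave

    forest = {"albedo": 0.12, "evap_cooling": 100.0}     # assumed typical values
    cropland = {"albedo": 0.20, "evap_cooling": 40.0}

    def leftover_heat(surface):
        absorbed = incoming_sw * (1 - surface["albedo"])
        # Energy not removed by evaporation must leave as sensible heat or
        # longwave radiation, which requires a warmer surface.
        return absorbed - surface["evap_cooling"]

    print(f"forest:   {leftover_heat(forest):.0f} W/m^2 to shed as heat and longwave radiation")
    print(f"cropland: {leftover_heat(cropland):.0f} W/m^2 to shed as heat and longwave radiation")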

Altering the vegetation cover drastically changes surface properties, affecting both the amount of heat dissipated by water evaporation and the amount of radiation reflected back into space. Overall, the researchers determined that land use change has made the planet warmer. Clearly, these forest ecosystems play an important role in combating air pollution, soil erosion, and climate change.

Gregory Duveiller, Josh Hooker, Alessandro Cescatti. The mark of vegetation change on Earth’s surface energy balance. Nature Communications, 2018; 9(1). DOI: 10.1038/s41467-017-02810-8

The Perks of Hydro-Powered Dams


Is it possible to have a dam that contributes to the socioeconomic and energy needs of a community without degrading the surrounding environment?

Researchers from Arizona State University set out to answer this question. They wanted to see how dams would impact food security among communities along the Mekong River, many of which rely on the river for food and employment. Nutritionally, rural fishing and agricultural communities also receive animal protein and vitamin A from the river. Their results indicate that a designed flow mimicking a long inter-flood interval followed by short, strong flood pulses produced higher fish yields than natural flow restoration. These results extend previous studies linking flood magnitude, flood duration, and a low-flow period followed by short, strong flood pulses to higher fishery yields.

Sabo and his co-authors based their study on the rivers of the Lower Mekong Basin. The Mekong River is the twelfth longest in the world, at an estimated 4,350 kilometers. It is also the eighth largest river by discharge and hosts one of the largest inland fisheries in the world. The river flows through China, Myanmar (Burma), Laos, Thailand, Cambodia, and Vietnam.

Due to its large presence across national boundaries, the river has attracted hydropower development. According to the researchers, hydropower is a common source of energy for poor, predominantly rural populations. Despite its benefits as a renewable energy source, the process can have negative impacts on the environment. For instance, hydrologic alteration from dams could lead to invasion by non-native aquatic species, which would impact the food web structure.

To analyze the relationship between food security and dams, the researchers evaluated discharges on the Tonle Sap River, which connects the Mekong River to the Tonle Sap Lake and is home to the Dai fishery. The Dai fishery sits on a nursery habitat that houses approximately 300 fish species. In addition to its biodiversity, the fishery is the most valuable and productive (i.e., in number of fish caught) in the Lower Mekong Basin, according to the researchers.

Based on their results, designed flows produced a 76% increase in annual yield, compared to a 47% increase under natural flow restoration, along the Tonle Sap and Mekong Rivers. Designed flows are modeled on flood pulse extent (based on flood magnitude and duration) and net annual anomaly (the sum of all positive, or wet, and negative, or dry, anomalies in annual discharge). Natural flow is based on the conditions of the water before the installation of dams. Currently, there are proposed dams in China, Laos, and Cambodia that would allow water managers to control the flow of the river. This study suggests that designed flows prompt higher yields, which could encourage more dam projects aimed at ensuring food security. Below is a map of the proposed dams that could support the designed-flow approach the researchers recommend.

Proposed Dams in Lower Mekong Basin
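
As a rough illustration of the “net annual anomaly” metric described above (a sketch of the general idea, not the authors’ exact formulation, using made-up discharge values), one could sum the daily departures of flow from a long-term baseline over a year:

    # Sketch of a net annual anomaly (NAA)-style metric: the sum of daily
    # discharge departures from a long-term baseline. All values are invented;
    # this shows the general idea, not the authors' exact formulation.
    import numpy as np

    rng = np.random.default_rng(0)
    days = np.arange(365)
    baseline = 5000 + 3000 * np.sin(2 * np.pi * (days - 120) / 365)   # long-term mean flow (m^3/s)
    observed = baseline + rng.normal(0, 400, size=365) + 500           # a wetter-than-average year

    anomalies = observed - baseline
    naa = anomalies.sum()          # positive = net wet year, negative = net dry year
    print(f"net annual anomaly: {naa:,.0f} (m^3/s)-days")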

Source:

Sabo, J.L., Ruhi, A., Holtgrieve, G.W., Elliott, V., Arias, M.E., Bun Ngor, P., Rasanen, T.A., and Nam, S. 2017. Designing river flows to improve food security futures in the Lower Mekong Basin. Science 358: 1053.

Ditch the Plastic

A study published in July 2017 reveals the short life cycle of plastics and our excessive production rates on a global scale. Researchers found that 6.3 billion metric tons of plastic waste had been created as of 2015. Of that, only 9% has been recycled and 79% has entered either landfills or the environment. By 2050, the amount of plastic waste entering landfills or the environment is likely to double. This is alarming because plastic is not biodegradable; instead, it breaks down over hundreds of years into very small pieces that can contaminate oceans and the natural environment. In fact, somewhere between 4 and 12 million metric tons of plastic entered the ocean in 2010 alone.
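
Applying the percentages above to the 6.3 billion metric ton total gives a rough sense of scale; this is a simple back-of-envelope split, assuming the percentages apply to the full total:

    # Back-of-envelope split of the 6.3 billion metric tons of plastic waste,
    # using the percentages quoted above.
    total_waste = 6.3e9            # metric tons of plastic waste as of 2015
    recycled = 0.09 * total_waste
    landfill_or_environment = 0.79 * total_waste

    print(f"recycled: ~{recycled / 1e9:.2f} billion t")
    print(f"landfill or environment: ~{landfill_or_environment / 1e9:.2f} billion t")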

The excessive amount of plastic produced globally is used mainly to package goods. The study found that 42% of all non-fiber plastics have been used for packaging, primarily composed of PE, PP, and PET (also known as plastics #2, #5, and #1). Though most plastics can be recycled, they often are not, and even when they are, recycling just delays the time before the plastic ends up in the landfill as waste. As if plastic waste contaminating our lands and oceans weren’t enough, the fossil fuels used to create plastics pollute our air and contribute to climate change. If we continue to generate as much plastic as we currently do, plastic will account for 20% of all oil production by 2050 (plasticpollutioncoalition.org).

Every year Americans throw away 35 billion plastic bottles (utahrecycles.org), use 380 billion plastic bags (Anderson), and recycle only a small percentage of both. One of the worst daily-use plastic items is the straw: 500 million are used each day, and they often end up accumulating in oceans (plasticpollutioncoalition.org). This is why society needs to move away from single-use items and harmful plastics. The throwaway economy needs to transform into a circular one, where goods are reused and repurposed rather than used and disposed of. To live a sustainable life and reduce the amount of waste we generate, we can take several meaningful actions: using reusable bags when shopping to avoid plastic bag use, investing in a reusable water bottle to save hundreds of plastic bottles from contaminating the environment, refusing plastic straws, and purchasing fewer plastic-packaged goods or plastic goods in general. Our everyday actions can make a big difference. Choosing to avoid plastics, and encouraging others to do the same, can help bring the mass production of plastics to a halt and save billions of tons of plastic waste from contaminating our environment.

 

Sources: Geyer, R., Jambeck, J.R., and Law, K.L. 2017. Production, use, and fate of all plastics ever made. Science Advances 3: 7.

Marcia Anderson. 2016. Confronting Plastic Pollution One Bag at a Time. 

https://utahrecycles.org/get-the-facts/the-facts-plastic/

Plastic Pollution Coalition. 2017. Fueling Plastics: New Research Details Fossil Fuel Role in Plastics Proliferation

Photo Source: Flickr user Emilian Robert Vicol

For Better or Worse, Rock Glaciers Will Eventually Melt Away

As global climate change progresses, our glaciers continue to recede, though their rates of volume loss differ from one to another. You may never have heard of them, but rock glaciers (RGs) are the more resilient, mountainous equivalent of typical glaciers. Researchers in the BEIS/Defra Met Office Hadley Centre Climate Programme conducted research on these glaciers and created the first-ever rock glacier database (RGDB), in an effort to increase knowledge about them and awareness of the hydrologic impacts they may soon have. They were able to pinpoint over 73,000 of them around the world, many located in the highest and most arid regions, including the Andes and the Himalayas.

As stated earlier, rock glaciers are more resilient than low-lying glaciers with regard to global warming. RGs are found in high-elevation areas, mainly mountaintops, all around the world. They have an active layer that freezes and thaws seasonally, and they are characterized as active or inactive depending on whether or not there is ice beneath them. This active layer also helps regulate the glacier’s temperature and makes it more resilient to temperature changes. However, rock glaciers are not immune. Global warming is predicted to hit higher-elevation areas harder than lower-lying ones. At first this will increase the flow of rivers and streams within the watershed, but it won’t last for long: as temperatures rise, so will melting, and the long-term consequence will be the loss of these glaciers.

Mountain glacier and snow in New Zealand

Thankfully, the water supply that will eventually come from the melting of these ‘natural water towers’ isn’t just going to disappear; it can be utilized. The meltwater will be a significant water source for arid and semi-arid regions facing potential future water scarcity.

Apart from the database, the researchers also estimated the water content that rock glaciers hold. The global estimate came to around 83.72 gigatons in total, give or take about 16 Gt. That’s a lot of water, especially if used efficiently.
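
For a sense of scale, a back-of-envelope conversion (not a figure from the paper) turns that mass into a volume, using the fact that a gigaton of water occupies roughly one cubic kilometer:

    # Back-of-envelope conversion of the global estimate quoted above.
    # 1 Gt of water has a volume of ~1 km^3 (density ~1000 kg/m^3).
    water_gt = 83.72                      # gigatons (from the text)
    volume_km3 = water_gt                 # ~1 km^3 per Gt of water
    volume_liters = volume_km3 * 1e12     # 1 km^3 = 1e12 liters

    print(f"~{volume_km3:.0f} km^3, or ~{volume_liters:.1e} liters of water")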

Finally, whether or not individual rock glaciers melt, people in places that depend on them won’t see huge droughts anytime in the near future, thanks to the glaciers’ resilience to climate change. The only question left is: what happens when they do finally melt away?

Source

Jones, D.B., Harrison, S., Anderson, K., and Betts, R.A. 2018. Mountain rock glaciers contain globally significant water stores. Scientific Reports 8: 2834.

How to Reduce the Cost of Renewables?

How do we reduce the cost of renewable energy? Research from the German Institute for Economic Research, commonly known as DIW, evaluated how policy support systems and financial investment risks affect the transition to renewable energy sources such as wind power plants and photovoltaic systems in Germany. In 2014, Germany made participation in its renewable support scheme mandatory for new installations, whereas it had been optional when introduced in 2012; this ruling pushed energy companies to invest in renewables. The authors wanted to know the costs associated with this transition. They found that companies use hedging to cover their financial risk; hedging offsets the potential losses or gains incurred by a company’s investment. The authors also found that support programs like green certificates and market premiums decrease a company’s ability to hedge its investments. As a result, consumers absorb the cost of that risk instead of it being hedged away.

Similar to Germany, other European countries have adopted renewables as a main electricity source. However, investment costs vary between regions in Europe: projects in southern and eastern Europe have higher financing costs than projects in western and northern Europe. This is due to the different support systems that back the projects: feed-in tariffs, sliding market premiums, green certificates, and fixed market premiums.

A feed-in tariff system pays developers a fixed tariff for their electricity. In this system, the developers bear the risk if the project fails, but they face no market-price risk; they receive only the fixed tariff.

The sliding market premium offers premiums to plant operators. Compared to the feed-in tariff system, these payments vary in amount. This system acts as an incentive to forecast profits and risks before the project begins, and the forecasting can aid the design of wind power plants and photovoltaic systems. Italy, Finland, and the Netherlands have adopted this method.

Green certificates and fixed market premiums allow developers to sell electricity at market value. So far, the United Kingdom, Sweden, Poland, Belgium, and Romania have used this system, and they have some of the highest financing costs.
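
To make the difference between these support schemes concrete, here is a toy comparison using made-up prices; these are illustrative numbers, not figures from the DIW report:

    # Toy comparison of support schemes using made-up prices (EUR/MWh).
    # Illustrative only; not figures from the DIW report.
    market_prices = [25.0, 35.0, 45.0, 55.0]   # hypothetical wholesale prices over time

    feed_in_tariff = 80.0                      # fixed payment per MWh, independent of the market
    fixed_premium = 40.0                       # paid on top of the market price
    certificate_price = 35.0                   # green certificate sold alongside the electricity
                                               # (in practice certificate prices also fluctuate, adding risk)

    for p in market_prices:
        fit_revenue = feed_in_tariff                  # no market-price risk
        premium_revenue = p + fixed_premium           # varies with the market
        certificate_revenue = p + certificate_price   # varies with the market
        print(f"market {p:>5.1f} | FiT {fit_revenue:5.1f} | "
              f"premium {premium_revenue:5.1f} | certificate {certificate_revenue:5.1f}")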

Although the green certificate program may seem like a better idea, it impairs the ability to hedge, which negatively impacts consumers through higher electricity costs. This report provides insight into the ways we can make renewable energy sources like wind power plants affordable for the masses. Recently, the cost of renewable technologies has decreased; if there were also more affordable financing options, the energy transition might come more easily.

 

Source: May, N., Jurgens, I., and Neuhoff, K. 2017. Renewable energy policy: risk hedging is taking center stage. DIW Economic Bulletin 29:40.