31/01/2018

Marine Trackers Show How Warming Waters Affect Australian Sea Life And Beyond

ABC News - Rhiannon Shine

Acoustic signal receivers were installed in coastal waters around Australia for the massive study. (Supplied: IMOS/Fabrice Jaine)
New data mapping the movements of Australian marine life over the past decade will provide insight into how climate change might affect sea animal behaviour, researchers say.
The study by researchers at the Integrated Marine Observing System (IMOS) and Macquarie University has tracked the whereabouts of 117 marine species, ranging from sharks and saltwater crocodiles to sea turtles and jellyfish, using sound-detecting underwater receivers.
The receivers pick up and record signals from acoustic tags that have been placed on fish and marine mammals.
Lead author Xavier Hoenner said the researchers collected and quality-controlled 49.6 million acoustic detections from tagged animals.
"The established IMOS Animal Tracking Facility network, consisting of nearly 2,000 receiving stations located around the country, allowed us to track 3,777 Australian sea animals, including some of Australia's most iconic species," Dr Hoenner said.
IMOS Animal Tracking Facility leader Rob Harcourt said the data would help researchers to predict how animal behaviour might change in the future in response to warming waters.
"For example, in the case of bull sharks – a species we tracked that is known to be potentially dangerous – research has shown that they move within warmer waters, meaning it is important that we understand how they modify their movements in response to changes in ocean conditions and processes," Professor Harcourt said.
"We do have quite strong evidence that there are things like pulses of warm water that are coming down with the East Australian Current which is strengthening and has been strengthening over some time.
"With those water masses we find that fish are essentially staying in the same environment but that environment is actually moving.
"Over the course of the next few years we will be able to build some really complex models to allow us to predict what is going to happen as we understand more about the oceanography."


Professor Harcourt said the tracking system showed some species travelled surprising distances.
"For instance, in Sydney Harbour the New South Wales government has been tagging bull sharks because of anxiety about people being bitten," he said.
"Our colleagues up in Townsville, which is about 2,400 kilometres away, also detected bull sharks and then discovered these were the same sharks that had been tagged in Sydney Harbour.
"We are now looking at their movements all the way up and down the coast over a number of years."
Some animals previously thought to be quite sedentary have also been proven to travel long distances.
"Sevengill sharks that are found in Tasmania have been detected over in South Australia," Professor Harcourt said.
"Sevengill sharks were thought to be quite restricted to really cold waters down here, and yet we know now that they go right up to South Australia, again, a couple of thousand kilometres away.
"That is potentially because of the way movements of water have changed."
Dr Hoenner said the tracking data was validated by a state-of-the-art quality control algorithm developed in Hobart, which he expected to be used by other researchers around the world.
The algorithm identifies background noise signals and anomalous movements to strengthen the quality and re-usability of the data.
"There is a global need [for the quality control algorithm]," Dr Hoenner said.
"Everyone is spending a lot of time looking at the data and this could make the whole process a lot easier for everyone."
Professor Harcourt said the data, published in the Nature journal Scientific Data, would help future investigations by other marine research groups.
"The data is available through the online Australian Ocean Data Network Portal, making it a very valuable resource for comparing the behaviour of marine animals today and in the future," he said.

Links

Climate Change Projects Headline Innovation And Science Australia's 2030 Plan

AFR - David Marin-Guzman

Australia's chief scientist Dr Alan Finkel says the vocational education and training system needs to be more responsive to changes in technology. Wayne Taylor
Restoring the Great Barrier Reef and de-carbonising the gas system have been pitched as "national mission" projects to inspire the next decade of innovation.
Innovation and Science Australia's 2030 strategy plan, to be released on Tuesday, proposes world-leading initiatives to raise the country's aspirations about what it can achieve, akin to US President John F Kennedy's "moonshot" challenge.
According to the independent body's plan, the reef mission would restore and protect the reef from climate change, boosting scientific research while creating new products, start-ups and niche industries such as biomaterials and 3D printing.
The Turnbull government's current Reef 2050 plan provided a "strong base" for the mission, the ISA said, but was primarily focused on direct threats to the reef such as the coral-eating crown-of-thorns starfish.
"It does not have an explicit climate adaptation strategy and is therefore insufficient to safeguard the reef beyond 2030."
Minister for Jobs and Innovation Michaelia Cash claimed the government had already moved to action the ISA's recommendation by "committing last week to fund groundbreaking research to preserve the Great Barrier Reef".
However, the government's $60 million reef package focuses on targeting the crown-of-thorns starfish and land-based run-offs, with just $6 million spent on research and development for adaptation.

'Hydrogen City'
The ISA's other national mission candidate was converting an entire city to clean hydrogen gas by 2030.
Zero-emission energy sources such as solar, wind or hydro would be used to produce the hydrogen by splitting water into hydrogen and oxygen.
"This has never been done at the scale contemplated in this mission," the report said.
The technology improvements resulting from the large-scale deployment of hydrogen technologies would then create export opportunities and make Australia a leader in the field.
Public and private sectors would fund both projects at an estimated $500,000 over 10 years.
But the ISA's most "ideal" national mission would be to integrate DNA studies and precision medicine into the healthcare system.
The medicine mission would allow for early diagnosis and prevention of diseases, making Australia "the healthiest nation on earth".
The ISA said Australia was already "well connected" to international efforts in the area and the mission would build on the government's medical research future fund and $500 million already committed to the biomedical translation fund.

Skills shortage requires education revamp
At the heart of the ISA's innovation agenda was a rebooting of the education and training system.
Despite recent fears that automation will destroy jobs, the ISA forecast that a "shortage of workers is a more likely problem than a shortage of jobs".
It forecast a looming retirement boom from an ageing population would create a 6 per cent skills shortage by 2030.
At the same time, technology meant 92 per cent of future jobs would need digital skills and 45 per cent would require people who can configure digital systems.
The ISA recommended "refining" immigration restrictions to attract specialists and entrepreneurs, and increasing training for teachers in science, technology, engineering and mathematics.
ISA deputy chairman and chief scientist Alan Finkel told The Australian Financial Review it was essential the future workforce had strong knowledge in disciplines as well as "21st century" skills such as creativity and problem-solving.
However, he said teachers' training and knowledge of their own disciplines needed to improve.
"There's no point raising the bar of students' aspirations if you don't also coach them to clear the bar."
The ISA recommended that teachers spend a minimum number of hours every year in professional development for their specific discipline to ensure their knowledge is up to date.
The report also called for a review of the vocational education and training system to ensure it is more responsive to new technologies and to link VET funding to employment outcomes.

Links

Climate Scientists Explore Hidden Ocean Beneath Antarctica’s Largest Ice Shelf

The Conversation - Craig Stevens | Christina Hulbe

The team used hot-water drilling gear to melt a hole through Antarctica’s Ross Ice Shelf to explore the ocean below. Christina Hulbe, CC BY-ND
Antarctica’s Ross Ice Shelf is the world’s largest floating slab of ice: it’s about the size of Spain, and nearly a kilometre thick.
The ocean beneath, roughly the volume of the North Sea, is one of the most important but least understood parts of the climate system.
We are part of the multi-disciplinary Aotearoa New Zealand Ross Ice Shelf programme team, and have melted a hole through hundreds of metres of ice to explore this ocean and the ice shelf’s vulnerability to climate change. Our measurements show that this hidden ocean is warming and freshening - but in ways we weren’t expecting.


Instruments travelling 360m down a bore hole, from the snow-covered surface of the Ross Ice Shelf through to the ocean below the ice. After splash-down at about 60m, they move through the bubble-rich upper ice and down into the dark bubble-free lower reaches of the ice – passing embedded sediment that left the coastline centuries ago.

A hidden conveyor belt
All major ice shelves are found around the coast of Antarctica. These massive pieces of ice hold back the land-locked ice sheets that, if freed to melt into the ocean, would raise sea levels and change the face of our world.
An ice shelf is a massive lid of ice that forms when glaciers flow off the land and merge as they float out over the coastal ocean. Shelves lose ice by either breaking off icebergs or by melting from below. We can see big icebergs from satellites - it is the melting that is hidden.
Because the water flowing underneath the Ross Ice Shelf is cold (minus 1.9C), it is called a “cold cavity”. If it warms, the future of the shelf and the ice upstream could change dramatically. Yet this hidden ocean is excluded from all present models of future climate.
This satellite map shows the camp site on the Ross Ice Shelf, Antarctica. Ross Ice Shelf Programme, CC BY-ND
There has only been one set of measurements of this ocean, made by an international team in the late 1970s. The team made repeated attempts, using several types of drills, over the course of five years. With this experience and newer, cleaner, technology, we were able to complete our work in a single season.
Our basic understanding is that seawater circulates through the cavity by flowing in at the sea bed as relatively warm, salty water. It eventually finds its way to the shore - except of course this is a shoreline under as much as 800 metres of ice. There it starts melting the shelf from beneath and flows across the shelf underside back towards the open ocean.

Peering through a hole in the ice
The New Zealand team – including hot water drillers, glaciologists, biologists, seismologists, oceanographers – worked from November through to January, supported by tracked vehicles and, whenever the notorious local weather permitted, Twin Otter aircraft.
As with all polar oceanography, getting to the ocean is often the most difficult part. In this case, we faced the complex task of melting a bore hole, only 25 centimetres in diameter, through hundreds of metres of ice.
A team of ice drillers from Victoria University of Wellington used hot water and a drilling system developed at Victoria to melt a hole through hundreds of metres of ice. Craig Stevens, CC BY-ND
But once the instruments are lowered more than 300m down the bore hole, it becomes the easiest oceanography in the world. You don’t get seasick and there is little bio-fouling to corrupt measurements. There is, however, plenty of ice that can freeze up your instruments or freeze the hole shut.

A moving world
Our camp in the middle of the ice shelf served as a base for this science, but everything was moving. The ocean is slowly circulating, perhaps renewing every few years. The ice is moving too, at around 1.6 metres each day where we were camped. The whole plate of ice is shifting under its own weight, stretching inexorably toward the ocean fringe of the shelf where it breaks off as sometimes massive icebergs. The floating plate is also bobbing up and down with the daily tides.
The team at work, preparing a mooring. Christina Hulbe, CC BY-ND
Things also move vertically through the shelf. As the layer stretches toward the front, it thins. But the shelf can also thicken as new snow piles up on top, or as ocean water freezes onto the bottom. Or it might thin where wind scours away surface snow or relatively warm ocean water melts it from below.
When you add it all up, every particle in the shelf is moving. Indeed, our camp was not so far (about 160km) from where Robert Falcon Scott and his two team members were entombed more than a century ago during their return from the South Pole. Their bodies are now making their way down through the ice and out to the coast.

What the future might hold
If the ocean beneath the ice warms, what does this mean for the Ross Ice Shelf, the massive ice sheet that it holds back, and future sea level? We took detailed temperature and salinity data to understand how the ocean circulates within the cavity. We can use this data to test and improve computer simulations and to assess if the underside of the ice is melting or actually refreezing and growing.
Our new data indicate an ocean warming compared to the measurements taken during the 1970s, especially deeper down. As well as this, the ocean has become less salty. Both are in keeping with what we know about the open oceans around Antarctica.
We also discovered that the underside of the ice was rather more complex than we thought. It was covered in ice crystals – something we see in sea ice near ice shelves. But there was not a massive layer of crystals as seen in the smaller, but very thick, Amery Ice Shelf.
Instead the underside of the ice held clear signatures of sediment, likely incorporated into the ice as the glaciers forming the shelf separated from the coast centuries earlier. The ice crystals must be temporary.
None of this is included in present models of the climate system: not the effect of the warm, saline water draining into the cavity, nor the very cold surface waters flowing out, nor the ice crystals affecting heat transfer to the ice, nor the ocean mixing at the ice fronts.
It is not clear if these hidden waters play a significant role in how the world’s oceans work, but it is certain that they affect the ice shelf above. The longevity of ice shelves and their buttressing of Antarctica’s massive ice sheets is of paramount concern.

Links

30/01/2018

Natural Gas Killed Coal – Now Renewables And Batteries Are Taking Over

The Guardian

To avoid dangerous climate change, we can’t rely on natural gas replacing coal
Over the past decade, coal has been increasingly replaced by cheaper, cleaner energy sources. US coal power production has dropped by 44% (866 terawatt-hours [TWh]). It’s been replaced by natural gas (up 45%, or 400 TWh), renewables (up 260%, or 200 TWh), and increased efficiency (the US uses 9%, or 371 TWh, less electricity than a decade ago).

Evolution of the American power grid mix since 1960. Illustration: Carbon Brief
In other words, of the 866 TWh of lost coal power production, 46% was picked up by natural gas, 43% by increased efficiency, and 23% by renewables.
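Those shares follow directly from the TWh figures in the previous paragraph, dividing each replacement by the 866 TWh drop in coal generation. A quick check of the arithmetic (the shares sum to slightly more than 100%, which the figures imply is because other generation sources also fell over the decade):

```python
# Share of the 866 TWh drop in US coal generation picked up by each alternative,
# using the TWh figures cited above.
coal_drop_twh = 866
replaced_twh = {"natural gas": 400, "increased efficiency": 371, "renewables": 200}

for source, twh in replaced_twh.items():
    print(f"{source}: {twh / coal_drop_twh:.0%}")
# natural gas: 46%, increased efficiency: 43%, renewables: 23%
```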

Natural gas is an unstable ‘bridge fuel’
While the shift away from coal is a positive development in slowing global warming by cutting carbon pollution, as Joe Romm has detailed for Climate Progress, research indicates that shifting to natural gas squanders most of those gains. For example, a 2014 study published in Environmental Research Letters found that when natural gas production is abundant, it crowds out both coal and renewables, resulting in little if any climate benefit. Part of the problem is significant methane leakage from natural gas drilling.
...abundant gas consistently results in both less coal and renewable energy use […] the quantity of methane leaked may ultimately determine whether the overall effect is to slightly reduce or actually increase cumulative emissions […] only climate policies bring about a significant reduction in future emissions from US electricity generation … We conclude that increased natural gas use for electricity will not substantially reduce US GHG emissions, and by delaying deployment of renewable energy technologies, may actually exacerbate the climate change problem in the long term.
Similarly, another 2014 study found that based on the latest estimates of methane leakage rates from natural gas drilling, replacing coal with natural gas provides little in the way of climate benefits. Though it’s been touted as a ‘bridge fuel’ to span the gap between coal and renewables, this research suggests natural gas isn’t significantly better than coal in terms of global warming effects, and thus may not be suitable for that purpose. The ‘bridge’ doesn’t appear to achieve its goal of steadily cutting our greenhouse gas emissions.

Renewables and batteries are starting to beat natural gas
California has been a national leader in clean energy. The state generates very little of its electricity from coal, but natural gas does supply more than a third of the state’s power. A quarter is generated by renewable sources like wind, solar, and geothermal plants, and another 10% comes from hydroelectric dams, on average. In 2017, renewables’ share increased by about 10%, displacing natural gas in the process.
In fact, California has an excess of natural gas power generation capabilities. Some natural gas plants are still essential for ensuring local grid reliability, but in many cases, clean energy resources like a combination of solar and storage can meet reliability needs.
In one recent example, the California Public Utilities Commission (CPUC) ordered Pacific Gas & Electric (PG&E) to procure energy storage (batteries) or "preferred resources" (renewables or increased efficiency and conservation) to meet a local reliability need in northern California. The order stemmed from an issue with a "peaker" natural gas plant (so-called because such plants switch on to meet high, peak electricity demand) operated in northern California. The operator (Calpine) was concerned that the plant was no longer economical because it was too infrequently used, largely due to an abundance of renewable power. The contract it could receive for providing generation capacity to ensure grid reliability would not be high enough to cover the costs of maintaining the plant.
Instead of bidding their plant into the program overseen by the CPUC to ensure local reliability, Calpine went directly to the California Independent System Operator (CAISO) and requested a “reliability must-run resource” contract, which is a much higher payment than they would have received through the CPUC program. CPUC decided instead to require PG&E to fill the local reliability need with cleaner alternatives. The costs of renewable energy and battery storage have fallen so fast that the clean alternatives might now be cheaper than gas.
In another example, a proposed natural gas peaker plant in Oxnard, California was rejected when it was shown that the CAISO was using outdated battery storage costs from 2014. Given how quickly those prices have fallen, they could now potentially be competitive with natural gas peaker costs.
The redundancy and potential replacement of natural gas with cleaner alternatives extends far beyond these examples. Most electrical service providers in California are now required to develop integrated resource plans. These are electric grid planning documents that outline how the utilities will meet a number of California’s goals, including a 40% reduction in carbon pollution below 1990 levels and 50% electricity production from renewable sources by 2030. Meeting these goals will require replacing non-critical natural gas plants with renewable power.
And California is already installing battery storage systems at record pace. Tesla, AES Energy Storage, and Greensmith Energy Partners have all installed large battery storage facilities in California within the past year. Within 4 years, batteries are projected to be as cheap as natural gas “peakers,” and consistently cheaper within 10 years.

We need a fast transition
It’s important to bear in mind that power plants built today can continue to operate for decades to come. The decisions we make for today’s grid are long-lasting. That’s why there are similar pushes from groups in Michigan, Oregon, Connecticut, North Carolina, and South Carolina for utilities to scrap plans for new natural gas plants and instead consider cleaner and potentially cheaper renewable alternatives. Renewables also don’t face the uncertainty associated with fluctuating natural gas prices.
Of course, were there a national price on carbon pollution, renewables and battery storage would win in the marketplace even sooner. As it stands, natural gas prices don’t reflect the costs that we incur from the climate change caused by their greenhouse gas emissions.
Fortunately, rapidly falling costs are already making renewables and battery storage cost-competitive with natural gas, and cheaper than coal. If we’re going to succeed in avoiding the most dangerous climate change consequences, that transition away from all fossil fuels and towards clean energy can’t happen soon enough.
Links

Plunging Costs Make Solar, Wind And Battery Storage Cheaper Than Coal

RenewEconomy

The plunging cost of storage, along with that of wind and solar power, appears to have crossed a new threshold, with a tender conducted by a major US energy utility suggesting that “firm and dispatchable” renewables are now cheaper than existing coal plants.
The stunning revelation came from Xcel Energy in Colorado and was quietly released over the Christmas/New Year break, although some outlets like Vox and Carbon Tracker were quick to pick up on the significance.
Last year, Xcel Energy put out a “request for proposals” (RFP) for how it could replace two coal-fired generators that it is considering shutting down – part of a plan that will take its share of renewables to more than 50 per cent.
The results were described by Vox’s David Roberts as “mind-blowing”. And he’s not wrong.


The median bid price for projects proposing a mix of wind plus battery storage was just $US21/MWh ($A25.80/MWh), while the median price for solar plus battery storage projects was just $US36/MWh ($A44.30/MWh).
(The graph above comes from the Xcel documents. The areas were blacked out by the utility for reasons of commercial confidentiality.)
And these prices do not represent just a few one-off, left-field offers. All told, there were more than 100 bids combining wind, solar, or both with battery storage, amounting to 20 gigawatts of such capacity.
The “median” means that half of the bids in each category were cheaper than the prices cited above.
According to Carbon Tracker, these are the lowest renewables plus battery storage bids in the US to date, and most likely anywhere in the world.
“The median bid for wind plus storage appears to be lower than the operating cost of all coal plants currently in Colorado, while the median solar plus storage bid could be lower than 74 per cent of operating coal capacity,” it noted in a report earlier this month.
(See graph below. This shows that the operating cost of the cheapest coal plants in Colorado is just below $US40/MWh, rising to more than $40/MWh and then soaring beyond $100/MWh for the most expensive units)

The significance of the tender result is the small additional cost of storage – between $US3 and $US7/MWh. This is less than half the $US15/MWh priced into the previous lowest bid – $US45/MWh for solar and storage in a bid accepted by Tucson Energy earlier last year.
The cost of wind without storage was $18/MWh, while the cost of solar without storage was $29/MWh – both prices benefit from federal tax incentives, and would likely be around $US25/MWh and $US40/MWh without them.
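The implied storage premium is simply the gap between the median bids with and without storage; a quick check using the figures above (all in US$/MWh, using the tax-credit-inclusive prices):

```python
# Implied cost of adding battery storage, from the Xcel median bid prices cited above (US$/MWh).
median_bid = {"wind": 18, "wind + storage": 21, "solar": 29, "solar + storage": 36}

wind_storage_premium = median_bid["wind + storage"] - median_bid["wind"]     # 3
solar_storage_premium = median_bid["solar + storage"] - median_bid["solar"]  # 7
print(wind_storage_premium, solar_storage_premium)  # 3 7
```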
The significance for Australia is enormous. The battery storage sector has only just commenced, but the potential is clearly huge.
The success of the Tesla big battery in South Australia since its launch in early December has created great interest, and caused many to think how the operations of the electricity grid may be completely rethought and redesigned.
The Tesla big battery will be joined by numerous other battery storage installations in a relatively short time – smaller battery arrays in Alice Springs and near Cooktown in Queensland are due to come online soon, as will another battery at the Wattle Point wind farm in South Australia.
This will be followed by three new battery storage arrays in Victoria, another in the Northern Territory, at least two more in South Australia (Lincoln Gap and Whyalla) and numerous other potential projects in Queensland and NSW.
It was interesting that Franck Woitiez, the head of Neoen Australia, which operates the Tesla big battery adjacent to its Hornsdale wind farm, last week spoke of the huge pipeline of solar projects in NSW – more than 2,000MW – that could readily adopt battery storage.
Woitiez noted how quickly the 150MW Coleambally solar project in western NSW will be delivered – less than two years after the project was first conceived – and said large scale solar and storage could be delivered in half the time, and at a much lower cost, than the massive Snowy 2.0 pumped hydro scheme.
The US tender bears that out. Wind, solar and storage costs in the US tend to be cheaper than in Australia, partly due to the lower cost of finance, the lower cost of labour, and the depth of the industry there.
The Xcel tender results are just part of story that illustrates the plunging cost of wind, solar, and battery storage. Bids of below $US20/MWh for solar projects have now been delivered in both Saudi Arabia and Mexico, and storage is matching predictions that its cost profile will be similar to solar.
The Xcel tender also elicited bids for stand-alone battery storage, with a median price of $US11/MWh.
As Vox’s Roberts notes, a company called ViZn Energy Systems, which uses flow batteries rather than lithium-ion, is promising $US27/MWh solar+storage by 2023.
That is lower than many predictions for solar alone. When the Tucson bid results were announced, it was considered to be a death knell for the market for new gas plants.
As Danny Kennedy, formerly of Sungevity and now head of the California Clean Energy Fund, has noted, both GE and Siemens have taken an axe to their once enormous gas generation units because of a massive slump in orders, as renewables and storage beat out gas plants in tenders.
Now that the cost of wind or solar plus storage is beating out existing coal, that takes the market transition to a whole new dimension.

Links

29/01/2018

Biomining The Elements Of The Future

The Conversation

Joey Kyber/Pixels, CC BY-SA
Biomining is the kind of technique promised by science fiction: a vast tank filled with microorganisms that leach metal from ore, old mobile phones and hard drives.
It sounds futuristic, but it’s currently used to produce about 5% of the world’s gold and 20% of the world’s copper. It’s also used to a lesser extent to extract nickel, zinc, cobalt and rare earth elements. But perhaps its most exciting potential is extracting rare earth elements, which are crucial in everything from mobile phones to renewable energy technology.
The Mary Kathleen mine, an exhausted uranium mine in northwest Queensland, contains an estimated A$4 billion in rare earth elements. Biomining offers a cost-effective and environmentally friendly option for getting it out.
Biomining is so versatile that it can be used on other planetary bodies. Bioleaching studies on the International Space Station have shown microorganisms from extreme environments on Earth can leach a large variety of important minerals and metals from rocks when exposed to the cold, heat, radiation and vacuum of space.
Some scientists even believe we cannot colonise other planets without the help of biomining technologies.

How does it work?
Microorganisms in tanks leach the minerals from any source material. Courtesy of Pacific Northwest National Laboratory.
Biomining takes place within large, closed, stirred-tank reactors (bioreactors). These devices generally contain water, microorganisms (bacteria, archaea, or fungi), ore material, and a source of energy for the microbes.
The source of energy required depends on the specific microbe necessary for the job. For example, gold and copper are biologically “leached” from sulfidic ores using microorganisms that can derive energy from inorganic sources, via the oxidation of sulfur and iron.
However, rare earth elements are bioleached from non-sulfidic ores using microorganisms that require an organic carbon source, because these ores do not contain a usable energy source. In this case, sugars are added to allow the microbes to grow.
All living organisms need metals to carry out basic enzyme reactions. Humans get their metals from the trace concentrations in their food. Microbes, however, obtain metals by dissolving them from the minerals in their environment. They do this by producing organic acids and metal-binding compounds. Scientists exploit these traits by mixing microbes in solution with ores and collecting the metal as it floats to the top.
The temperature, sugar levels, stirring rate, acidity, and carbon dioxide and oxygen levels all need to be monitored and fine-tuned to provide optimal working conditions.
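As a rough illustration of what that monitoring involves, the sketch below models a set of bioreactor readings and a simple in-range check; the parameter names and operating windows are placeholders for illustration, not recommended conditions for any particular microbe or ore:

```python
from dataclasses import dataclass

@dataclass
class BioreactorReading:
    temperature_c: float
    ph: float
    stir_rate_rpm: float
    dissolved_o2_pct: float
    co2_pct: float
    sugar_g_per_l: float

# Placeholder operating windows, for illustration only; real setpoints depend on
# the microorganism and the ore being leached.
TARGET_RANGES = {
    "temperature_c": (25.0, 40.0),
    "ph": (1.5, 6.0),
    "stir_rate_rpm": (100.0, 400.0),
    "dissolved_o2_pct": (20.0, 80.0),
    "co2_pct": (0.5, 5.0),
    "sugar_g_per_l": (5.0, 30.0),
}

def out_of_range(reading: BioreactorReading) -> list:
    """Return the names of any monitored parameters outside their target window."""
    return [
        name
        for name, (low, high) in TARGET_RANGES.items()
        if not low <= getattr(reading, name) <= high
    ]
```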

The benefits of biomining
Traditional mining methods require harsh chemicals, lots of energy and produce many pollutants. In contrast, biomining uses little energy and produces few microbial by-products such as organic acids and gases.
Because it’s cheap and simple, biomining can effectively exploit low grade sources of metals (such as mine tailings) that would otherwise be uneconomical using traditional methods.
Countries such as Finland, Chile and Uganda are increasingly turning to biomining. Chile has exhausted much of its copper-rich ores and now utilises biomining, while Uganda has been extracting cobalt from copper mine tailings for over a decade.

Why do we need rare earth elements?
The rare earth elements include the group of 15 lanthanides near the bottom of the periodic table, plus scandium and yttrium. They are widely used in just about all electronics and are increasingly sought after by the electric vehicle and renewable energy industries.
The unique atomic properties of these elements make them useful as magnets and phosphors. They’re used as strong lightweight magnets in electric vehicles, wind turbines, hard disc drives and medical equipment, and as phosphors in energy-efficient lighting and in the LEDs of mobile phones, televisions and laptops.
Despite their name, rare earth elements are not rare and some are in fact more abundant than copper, nickel and lead in the Earth’s crust. However, unlike these primary metals which form ores (a naturally occurring mineral or rock from which a useful substance can be easily extracted), rare earth elements are widely dispersed. Thus to be economically feasible they are generally mined as secondary products alongside primary metals such as iron and copper.
Over 90% of the world’s rare earth elements come from China where production monopolies, trade restrictions and illegal mining have caused prices to fluctuate dramatically over the years.
Most renewable energy technologies depend on rare earth metals. Pixabay
Reports from the US Department of Energy, European Union, and the US intelligence commission have labelled several rare earth elements as critical materials, based on their importance to clean energy, high supply risk, and lack of substitutes.
These reports encourage research and development into alternative mining methods such as biomining as a potential mitigation strategy.
Heeding these calls, laboratories at Curtin and Berkeley universities have used microorganisms to dissolve common rare-earth-element-bearing minerals. These pilot-scale studies have shown promising results, with extraction rates growing closer to those of conventional mining methods.
Because most electronics have a notoriously short lifespan and poor recyclability, laboratories are experimenting with “urban” biomining. For example, bioleaching studies have seen success in extracting rare earth elements from the phosphor powder lining fluorescent globes, and the use of microorganisms to recycle rare earth elements from electronic wastes such as hard drive magnets.
The rare earth elements are critical for the future of our technology. Biomining offers a way to obtain these valuable resources in a way that is both environmentally sustainable and economically feasible.

Links

CEFC Finances Solar Farm At Coleambally In NSW

Climate Leadership Report

The Clean Energy Finance Corporation has committed $30 million in debt finance to the 150MW (AC) Coleambally Solar Farm being developed by Neoen Australia. The solar farm will be the largest in NSW.
The Coleambally Solar Farm is five kilometres north east of Coleambally, and 70 kilometres south of Griffith.
It will consist of about 565,000 solar panels on 550 hectares and is expected to generate enough electricity to power more than 50,000 homes, while abating about 300,000 tonnes of carbon emissions annually.
The project has contracted 70 per cent of its output to EnergyAustralia.
The Coleambally site was chosen after a feasibility assessment confirmed there was an abundant solar resource at the location, which also has an existing electricity substation with grid connection capacity.
Up to 300 workers are likely to be employed during the construction phase, which is expected to take around nine months.
Neoen's Parkes Solar Farm
During the past 12 months, the CEFC has worked with developer Neoen Australia to accelerate large-scale solar capacity in regional NSW, providing debt finance for four projects that will deliver an additional 260MW (AC) of renewable energy capacity.
The CEFC has provided $150 million in debt finance to Neoen solar farm developments in Dubbo, Griffith and Parkes.
The Griffith and Parkes solar farm projects are now fully built and are undergoing commissioning, exporting increasing amounts of renewable electricity into the national electricity grid as commissioning progresses.
Full-scale commercial operation is expected before the end of February.

Links

Analysis: The Climate Papers Most Featured In The Media In 2017

Carbon Brief - Robert McSweeney

The Altmetric score provides an indicator of the attention the paper received, combining data from social media, news outlets, blogs and elsewhere (not all shown).  Carbon Brief
Every day, dozens of scientific journals publish new climate change research that is shared across the world via the internet.
These journal papers make headlines in news articles and on blog pages, they pop up in Twitter timelines and on Facebook. But which ones make the biggest impression? Which have been shared and reported most widely?
Carbon Brief has compiled its annual list of the 25 most talked-about climate change-related papers of the previous year. The infographic above shows which ones made it into the Top 10 in 2017.
Our analysis is based on the data collected by Altmetric, which tracks and scores journal papers by the number of times they’re mentioned in online news articles and on social media platforms. (You can read more about how the Altmetric scoring system works in an earlier article.)
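As a rough sketch of how a weighted attention score of this kind can be computed, the snippet below combines mention counts from different platforms; the weights are placeholders chosen for illustration, not Altmetric's actual coefficients:

```python
# Illustrative weighted attention score in the spirit of Altmetric's approach.
# The weights are placeholders, not Altmetric's actual coefficients.
SOURCE_WEIGHTS = {"news": 8.0, "blogs": 5.0, "twitter": 1.0, "facebook": 0.25}

def attention_score(mentions):
    """Combine per-platform mention counts into a single weighted score."""
    return sum(SOURCE_WEIGHTS.get(source, 0.0) * count for source, count in mentions.items())

# Example: a paper covered in 200 news stories, 10 blog posts and 1,500 tweets.
print(attention_score({"news": 200, "blogs": 10, "twitter": 1500}))  # 3150.0
```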

First place
The most widely reported and shared article related to climate change last year was actually a “Policy Forum” commentary in the journal Science. Published in mid-January, “The irreversible momentum of clean energy” was covered by 232 news articles and tweeted more than 9,000 times. Its overall Altmetric score of 7,872 means it is the highest ranked of any article published last year.
This is no surprise, perhaps, considering the author was Barack Obama, who, at the time, was still the US president. But as the article is a commentary, it does not make it into Carbon Brief’s leaderboard of research papers.
Instead, first place goes to, “Global warming and recurrent mass bleaching of corals”, a Nature paper published in March, with a score of 3,166.
The study, led by Prof Terry Hughes of the ARC Centre of Excellence for Coral Reef Studies in Australia, assessed the impact of coral bleaching events in 1998, 2002 and 2016 on the Great Barrier Reef. As Carbon Brief reported, the study concluded that “immediate global action to curb future warming” is essential if coral reefs are to survive.
“Global warming and recurrent mass bleaching of corals”. Credit: Rosamund Pearce, Carbon Brief
This paper was the 30th most talked about of all journal articles published last year. It was picked up by 395 news stories in 245 outlets – including the Guardian, Washington Post, CNN, MailOnline and the New York Times (both as a news article and in an editorial). It was also referenced in 1,806 tweets (more than any other paper in our Top 25), 47 blog posts and 27 public Facebook pages.

The Top 5
Coming second is, “Biological annihilation via the ongoing sixth mass extinction signaled by vertebrate population losses and declines”, published in the Proceedings of the National Academy of Sciences of the United States of America (or “PNAS” for short) with an Altmetric score of 2,845.
The study, led by Dr Gerardo Ceballos of the National Autonomous University of Mexico, found that the Earth’s “sixth mass extinction” is well underway and has “proceeded further than most assume”.
Analysing nearly half of the Earth’s known vertebrate species, the researchers concluded that “habitat loss, overexploitation, invasive organisms, pollution, toxification, and more recently climate disruption” have led to “catastrophic declines in both the numbers and sizes of populations of both common and rare vertebrate species”.
The paper was tweeted 1,583 times and covered by 269 news stories, including in the Atlantic, Sun, Guardian, USA Today, CNN and the Washington Post. It was also posted on 96 Facebook pages, giving the paper the highest score for Facebook of any in the Top 25.

Taking third place with a score of 2,614 is the Nature Climate Change paper, “Global risk of deadly heat”, by lead author Dr Camilo Mora from the University of Hawai’i.
As Carbon Brief reported back in June, the study suggested that up to three quarters of the world’s population could be at risk from deadly heat extremes by the end of the century if global greenhouse gas emissions are not curbed.
The research garnered headlines in 244 news stories from 191 outlets, including Le Monde, the Independent, Der Spiegel and the Huffington Post – and an editorial in Nature. It was tweeted 1,220 times and posted 49 times on Facebook.
The study also appears to have been quoted frequently in later news articles on heatwaves, such as these pieces in the MailOnline, Business Insider and Vice.
Completing the Top 5 are, “Estimating economic damage from climate change in the United States”, in Science, by lead author Dr Solomon Hsiang of the University of California at Berkeley and researchers at the Climate Impact Lab and, “Widespread Biological Response to Rapid Warming on the Antarctic Peninsula”, in Current Biology, led by Dr Matt Amesbury of the University of Exeter.
The latter study generated the same number of news stories as the first-placed paper (395), but was tweeted just 147 times – the third-lowest total of the Top 25. Interestingly, the Altmetric scores of both papers are more than 2,000, which would have put them in second place in Carbon Brief’s 2016 list and first in the 2015 one.

Elsewhere in the Top 10
Just missing out on the Top 5 is, “Assessing recent warming using instrumentally homogeneous sea surface temperature records”, published in Science Advances, in sixth place.
The paper’s lead author is Carbon Brief’s US analyst Zeke Hausfather. The study, published in early January before Hausfather joined Carbon Brief, uses the latest sea surface temperature (SST) data to see which of the major global temperature datasets best captures the rate of warming in recent decades.
As Carbon Brief reported at the time, the study found that the National Oceanic and Atmospheric Administration’s (NOAA) most recent dataset matched Hausfather’s record closely, and that the other datasets underestimated recent warming.
While the study generated a substantial amount of news coverage when it was published, it received a subsequent bounce when NOAA’s SST record became the centre of an alleged “whistleblower” article in the Mail on Sunday, which accused NOAA of manipulating climate data to show more warming in recent years.
As Hausfather explained in a guest post for Carbon Brief, NOAA’s data had been independently verified by his Science Advances study and the Mail on Sunday’s piece “in no way changes our understanding of modern warming or our best estimates of recent rates of warming”.
Multiple responses to the Mail on Sunday article brought another flurry of news articles, including in the Washington Post, New York Times and, ironically, in an Associated Press article that was reposted by the MailOnline.
(The Independent Press Standards Organisation subsequently ruled that the Mail on Sunday article was “significantly misleading” and required the newspaper to publish a correction.)
The Top 10 also includes, “Assessing ExxonMobil’s climate change communications (1977–2014)”, published in Environmental Research Letters by Dr Geoffrey Supran and Prof Naomi Oreskes of Harvard University.
The study, coming in seventh place, found that ExxonMobil contributed to advancing climate science through its scientific publications, while simultaneously promoting doubt in paid, editorial-style advertisements in the New York Times. The conclusion that ExxonMobil “misled the general public” on climate change was reported in many major news outlets.
Completing the Top 10 is, “Less than 2C warming by 2100 unlikely”, in Nature Climate Change by lead author Prof Adrian E Raftery from the University of Washington.
The study used statistical forecasts to show there is a 5% chance of keeping global warming to less than 2C above pre-industrial levels this century – and just a 1% chance of staying below 1.5C. This stark conclusion was reported in 185 news articles last year.

Honourable mentions
As our list of the most talked about climate papers in 2017 comprises 25 articles, here are a few honourable mentions of those that fall outside the Top 10.
In 11th place is, “The Lancet Countdown on health and climate change: from 25 years of inaction to a global transformation for public health”, published – unsurprisingly – in the Lancet.
The paper is from a Lancet project involving 24 academic institutions and intergovernmental organisations from across the world. The project will release a report tracking progress on climate change and global health every year, of which this is the first.
As Carbon Brief reported from the study’s press conference, the authors said the effect of climate change on human health is now so severe that it should be considered “the major threat of the 21st century”.
Landing in 12th is, “Increased light, moderate, and severe clear-air turbulence in response to climate change”, in Advances in Atmospheric Sciences.
The study’s author, Prof Paul Williams from the University of Reading, wrote Carbon Brief a guest article on his research. Williams described how severe air turbulence “is set to become twice or even three times as common by the latter half of the century”.

Interestingly, two other papers on climate change and air travel also appear in the Top 25 – “The impacts of rising temperatures on aircraft takeoff performance”, in Climatic Change in 18th place and, “Global Response of Clear-Air Turbulence to Climate Change”, in Geophysical Research Letters in 25th. The latter is a follow up to Williams’s study in 12th place.
In 13th place is, “Emission budgets and pathways consistent with limiting warming to 1.5C”, published in Nature Geoscience, which estimated the remaining carbon budget for holding global temperature rise to no more than 1.5C above pre-industrial levels.
In a Carbon Brief guest post, lead author Dr Richard Millar from the University of Oxford explained the study’s findings that “we have a little more breathing space than previously thought to achieve the 1.5C limit”.
The paper caused quite a stir, with parts of the media claiming that climate models – the basis for carbon budget estimates – have overstated the observed warming of the planet. Carbon Brief factchecked these claims. The authors also published their own response in a follow-up guest article.
Twenty-third place goes to, “Influence of high-latitude atmospheric circulation changes on summertime Arctic sea ice”, in Nature Climate Change. As Carbon Brief reported when the paper was published in March, the study found that rising greenhouse gas emissions are responsible for at least half, possibly up to two-thirds, of the decline in summer sea ice in the Arctic since the late 1970s – with the remaining contribution a result of natural fluctuations.
And just sneaking in to the Top 25 is, “Coupling of pollination services and coffee suitability under climate change”, in PNAS in 24th place.
As Carbon Brief reported in September, the study warns that the Latin American coffee industry faces losses in suitable farmland and declines in important bee species – which play a key role in pollinating coffee plants – as a result of future climate change.
If you want a closer look at the final scores, we’ve compiled all the data for the Top 25 climate papers of 2017 in this spreadsheet.

Journals
Finally, a look at which journals the Top 25 papers were published in shows that Nature comes out on top with four, followed by Nature Climate Change and Science Advances with three each.

Chart by Carbon Brief using Highcharts

As the chart above shows, there is quite a spread of different journals – 15 in total, compared to 11 last year. And while the 2016 list had six papers each in the Top 25 for the journals Science and Nature Climate Change, the placings are shared more evenly across the journals in 2017.

Links

Banks Slash Coal Loans By 50 Per Cent As Investor Pressure Mounts

Fairfax

Australia's big banks slashed loans to fossil fuel companies by almost a fifth in 2017, including a 50 per cent drop in their coal mining exposure, new analysis shows, as investors and regulators ramp up pressure over climate change risks.
ANZ Bank, National Australia Bank, Westpac and Commonwealth Bank's combined loans to coal miners slumped by about $1.5 billion, or more than 50 per cent, according to analysis of bank disclosures from environmental finance group Market Forces.
Banks are cutting their exposure to fossil fuels, especially coal. 
The analysis also showed declines in lending to oil and gas extraction and coal-fired power stations. On an underlying basis, the figures suggest a decline of 18.5 per cent in the big four's fossil fuel exposure.
While banks have acknowledged the broad trend, the extent of the fall highlights the change that is occurring as companies face growing scrutiny on climate risks from big investors including superannuation funds.
"Given that this is the second year running that the banks' reported exposures to the fossil fuel sector have fallen by around 15 per cent, it represents a huge drop overall. But in the context of the bank's commitments on climate change, it is no less than what you would expect to see," said Market Forces executive director Julien Vincent.
The figures from Market Forces, an affiliate of Friends of the Earth Australia, show that as well as cutting fossil fuel financing, the lenders had boosted exposure to renewable energy by 20 per cent, or about $1.8 billion.
The analysis found Westpac's fossil fuel exposure fell by a third to $5.5 billion, while Commonwealth Bank's total exposure was $9.5 billion, which Market Forces estimates as a fall of about a fifth. CBA does not publish like-for-like data from 2016, which prevented an exact comparison, but its chair Catherine Livingstone told shareholders last year coal exposure was falling and would continue to do so.
It found ANZ's exposure fell 15 per cent, though an ANZ spokesman pointed to different data that said total exposure to fossil fuels fell 12.2 per cent, to $12.9 billion. ANZ chairman David Gonski told shareholders last month that its exposure to coal had slumped by more than 50 per cent in the last two years, and it had not funded any new coal-fired power stations in 2017.
The analysis shows National Australia Bank's exposure also fell in each sector except for oil and gas extraction, where it increased by $3.1 billion, to $7.4 billion. It is understood this occurred because of a short-term financial product, rather than a loan, and the exposure was flat on an "underlying" basis.
The industry-wide decline in fossil fuel financing comes as investors and regulators step up pressure on banks to consider the risk of holding assets that could become "stranded" as the world moves towards renewable energy.
Underlining this pressure, the world's largest investment manager, Blackrock, last week signalled that companies it invests in would be pushed to consider climate change risks.
Vice-chairman of the investment giant, Philipp Hildebrand, told a session in Davos that its asset owner clients were increasingly demanding products that took into account climate risks, and that trend was likely to take off.
“We’re going to have our licence to operate withdrawn if you’re a company that doesn’t pay attention to these things. This, I think, is a new development,” Mr Hildebrand said on a panel session.
Tim Buckley, energy finance analyst at the Institute for Energy Economics and Financial Analysis, said Australian banks had been relative laggards on climate risks, but they were catching up to lenders overseas.
Pointing to Blackrock's comments and a recent move by Lloyd's of London to stop investing in coal, Mr Buckley said there was clear global momentum for further cuts in banks' fossil fuel exposure.
"It does create a bit of a snowball, when you've got the major banks, you've got the insurers, and now you've got the biggest investor in the world moving," Mr Buckley said.

Links