Wednesday, 20 February 2019

Investigating the Southern Ocean: Part 1…by Carol Arrowsmith

Carol organising her equipment at BAS prior to departure
In a few days I will be embarking on my leg of the major NERC project called ORCHESTRA (Ocean Regulation of Climate by Heat and Carbon Sequestration and Transports) to collect seawater samples for isotope analysis. My leg is called ANDREX II (Antarctic Deep Water Rates of Export), and it is the second time this part of the ocean has been sampled. I will be boarding the RRS James Clark Ross in Punta Arenas and, following a stop-off in the Falklands, will start sampling from the tip of the Antarctic Peninsula along the 60°S parallel and across the Southern Ocean to 30°E, before returning to the Falklands in mid-April.


Why are we collecting seawater samples from the World’s oceans?

Since the industrial revolution, the global ocean has absorbed around 30% of anthropogenic (human-produced) CO2 emissions. In addition, 93% of the total extra heat in the Earth system since the onset of global warming has been absorbed by the global ocean. Improving climate prediction requires us to learn more about how the global ocean works, and how it interacts with the atmosphere to control the split of heat and carbon between them, especially given the extra heat and carbon we are currently producing.


The Southern Ocean is key

A key region in this context is the Southern Ocean, the vast sea that encircles Antarctica. The Southern Ocean occupies around 20% of the total ocean area, but absorbs about three-quarters of the heat that is taken into the ocean, and approximately half of the CO2. This is because of its unique pattern of ocean circulation: it is the main region where deep waters rise to the surface, allowing new water masses to form and sink back into the ocean interior. This exposure of “old” waters to the atmosphere, and the production of new waters at the surface, is fundamental to the exchanges of heat and carbon with the atmosphere.


The track of the ANDREX II cruise
Despite knowing the key role that the Southern Ocean plays in global climate, there are many important unknowns. These include exactly how heat and carbon are taken up by the oceans and how fast this occurs (especially important given the human-driven changes of the Anthropocene), and how much heat and carbon is currently stored in the oceans. These questions are being addressed using various chemical and physical measurements of the ocean, including the stable isotope composition of the seawater (which we at the BGS are responsible for). Oxygen isotopes will tell us the proportion of freshwater mixed into the seawater at particular locations (which will help us understand melting of the Antarctic ice mass, and therefore heat), and carbon isotopes will tell us where the carbon comes from and how the ocean cycles it.
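The oxygen isotope approach rests on two-endmember mixing: seawater and glacial meltwater have very different δ18O values, so a sample's measured composition fixes its freshwater fraction. A minimal sketch of the calculation, with illustrative endmember values (assumptions for this example, not cruise data):

```python
def freshwater_fraction(d18o_sample, d18o_seawater, d18o_freshwater):
    """Two-endmember mixing: the fraction of freshwater in a sample,
    from its oxygen isotope composition (per mil, VSMOW scale)."""
    return (d18o_sample - d18o_seawater) / (d18o_freshwater - d18o_seawater)

# Illustrative endmembers: open Southern Ocean seawater near -0.3 per mil,
# Antarctic glacial meltwater strongly depleted, around -40 per mil.
f = freshwater_fraction(d18o_sample=-0.7, d18o_seawater=-0.3, d18o_freshwater=-40.0)
print(f"freshwater fraction: {f:.3f}")  # ~0.010, i.e. about 1% meltwater
```

Because the meltwater endmember is so depleted, even a small shift in a sample's δ18O maps to a measurable freshwater contribution.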

The ANDREX leg in particular seeks to assess the role of the Weddell gyre in driving the southern closure of the meridional overturning circulation, in ventilating the deep global ocean, and in sequestering carbon and nutrients in the global ocean abyss.


Progress

ORCHESTRA is in the second year of a five year collection programme around the World’s oceans. I will be collecting samples from the RRS James Clark Ross. I will be tweeting @CarolArrowsmith and @ORCHESTRAPROJ and Facebooking (Orchestra project) along the way, as well as updating the BGS Geoblogy. Carol Arrowsmith is a chief technician in the stable isotope facility at the BGS. 




ORCHESTRA (Ocean Regulation of Climate by Heat and Carbon Sequestration and Transports) is a programme funded by NERC and includes partners at the British Antarctic Survey (lead), the National Oceanography Centre, Plymouth Marine Laboratory, and many more including BGS.

Monday, 18 February 2019

How to heat a city…and decarbonise it using heat pumps!...by David Boon and Gareth Farr

With only 11 years to go until the first UK emissions target deadline, the race is now on for the UK to reduce its greenhouse gas emissions by 57% relative to 1990 levels by 2030. BGS geoscientists David Boon and Gareth Farr ask 'How on earth will we do it?'.

Cartoon illustrating the concept of using shallow urban aquifers and
heat pumps in district heat networks.
Credits: City of Cardiff Council/BGS/WDS Green Energy Ltd
In our first blog titled 'How to Heat a City', we announced our intention to install a pilot open loop Groundwater Source Heat Pump (GWHP), as part of an InnovateUK funded feasibility study to better understand how UK shallow aquifers can supply low carbon heating in urban areas. The recently established Cardiff Urban Geo Observatory hosts the GWHP pilot, which is conveniently surrounded by a world class groundwater monitoring network with over 100 temperature sensors in 60 boreholes, providing high resolution baseline data. Groundwater in urban areas can be slightly warmer than in rural areas, due to the 'subsurface Urban Heat Island effect', and this human footprint actually makes heat pumps run more efficiently.

The pilot GWHP scheme is a collaboration between WDS Green Energy, City of Cardiff Council and the British Geological Survey. The physical infrastructure comprises two shallow boreholes (~20 m deep) that abstract and simultaneously re-inject shallow groundwater from an ice-age sand and gravel aquifer which underlies much of the city.

The heat pump system works by passing groundwater through a heat exchanger, where about 2 kelvin (2 °C) of its thermal energy is transferred to a heat pump that uses a gas phase-change cycle to raise the water temperature to a useable 46 °C. The heat pump keeps the inside of the school at a comfortable 22 °C using a renewable energy resource that has low carbon emissions and is cheaper to run than the old gas boilers.
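The heat available from the groundwater follows directly from that 2 K temperature drop: power = mass flow × specific heat × ΔT. A rough sketch, where the 2 L/s abstraction rate is an assumed figure for illustration (the blog does not state the scheme's flow rate):

```python
def thermal_power_kw(flow_l_per_s, delta_t_k, cp=4186.0, density=1000.0):
    """Heat extracted from flowing groundwater (kW): Q = m_dot * cp * dT,
    with cp in J/(kg K) and density in kg/m^3 (values for fresh water)."""
    m_dot = flow_l_per_s * density / 1000.0  # kg/s (1 L of water ~ 1 kg)
    return m_dot * cp * delta_t_k / 1000.0   # W -> kW

# Assumed 2 L/s abstraction; the blog's ~2 K drop across the heat exchanger.
print(f"{thermal_power_kw(2.0, 2.0):.1f} kW")  # ~16.7 kW drawn from the aquifer
```

Even a modest pumping rate and a small temperature drop yield a useful heating load, which is why shallow aquifers suit this technology.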

As part of the InnovateUK project we fully instrumented the heat pump system with sensors above and below ground to allow us to follow its long-term environmental impact and energy performance. The project proved this technology could be scaled-up across Cardiff and other UK cities with similar shallow aquifers, where the geology allows.

The GWHP pilot plant room and associated monitoring and data telemetry. (Credit: BGS-UKRI)
Shallow aquifers can be a super-efficient way to run a heat pump, as the borehole pumps do not require much energy to lift the water compared with deeper schemes. The energy efficiency of our pilot GWHP system is around 450% (a coefficient of performance of about 4.5), over four times more efficient than a condensing gas boiler, and it is actually saving the council money on its energy bills! And what about that all-important CO2 target? Well, after three years the pilot GWHP had reduced the cost of heating and cut the school's overall CO2 emissions by 35%.
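The carbon saving can be sketched per kilowatt-hour of delivered heat: divide the fuel's carbon intensity by the conversion efficiency. The emission factors below are illustrative assumptions (roughly 2018-era UK values), not project data, and this per-kWh view is not the same as the school's overall 35% figure, which includes other loads:

```python
def heat_emissions_kgco2_per_kwh(carrier_intensity, efficiency):
    """kg CO2 per kWh of delivered heat = fuel carbon intensity / efficiency.
    For a heat pump, 'efficiency' is the coefficient of performance."""
    return carrier_intensity / efficiency

# Assumed factors: UK grid electricity ~0.28 kg CO2/kWh,
# natural gas ~0.18 kg CO2/kWh of fuel burned.
hp = heat_emissions_kgco2_per_kwh(0.28, 4.5)       # GWHP at COP 4.5 (450%)
boiler = heat_emissions_kgco2_per_kwh(0.18, 0.90)  # condensing gas boiler
print(f"heat pump: {hp:.3f} kg/kWh, boiler: {boiler:.3f} kg/kWh")
```

A useful design insight falls out of the formula: as the electricity grid decarbonises, the heat pump's advantage grows automatically, while a gas boiler's intensity is fixed by its fuel.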

The project has also had impact on energy policy in Welsh Government and was featured as a case study in a recent National Assembly for Wales Low Carbon Heat Research Briefing. The Cardiff Urban Geo-Observatory, which includes the heat pump monitoring pilot, has also been selected along with the Glasgow UKGEOS site as a pilot area for a new 3-year EU GeoERA project called MUSE (Managing Urban Shallow Geothermal Energy), and as a European Plate Observing System (EPOS) site.

CO2 emissions reductions resulting from switching from a gas boiler to a
shallow groundwater source heat pump.
Data courtesy of Cardiff City Council.
We will continue to monitor the environmental impact of the scheme on the aquifer source, and the BGS Geothermal Team is keen to support other UK cities in their journey to explore their range of geothermal resources. The project has also attracted lots of interest from industry, keeping the project's Principal Investigator, David Boon, busy giving presentations to stakeholders, including at the IEA Heat Pump Technologies Annex 52 meeting in London in September 2018 and in seminars for Construction Excellence Wales and APSE Energy. The research findings will feature in a new CIBSE Code of Practice (CP3) and in peer-reviewed papers.

Although open-loop ground source heat exchangers are not suitable in all geological environments, we have been working to understand the wider 3D geological and hydrogeological settings at a city scale, with the release of a 3D superficial geology model of Cardiff. This evidence will allow better 'above-ground' and 'below-ground' planning and regulation, and will (we hope) stimulate market growth in renewable energy systems and supply chains. More demonstration projects like this are needed to improve the image and public perception of renewables. Local authorities can invest in renewable energy such as heat pumps using interest-free Government-backed finance schemes like SALIX finance and the Renewable Heat Incentive (RHI). Case studies such as ours can give society and business the confidence to invest in shallow geothermal technologies to accelerate the energy transition.

Acknowledgements
David Boon and Gareth Farr have jointly managed and overseen the creation of the Cardiff Urban Geo Observatory (2014 – 2018).  The project has evolved naturally out of City Region Geoscience project (under former Chief Geologist Wales, Dave Schofield), the 2015 InnovateUK feasibility project (led by David Boon), and by listening to local stakeholders in Wales. Massive thanks go out to the project team: Ashley Patton, David Schofield, Alan Holden, Rhian Kendall, Laura James, Steve Thorpe, Corinna Abesser, Johanna Scheidegger, Jon Busby, Susanne Self, BGS Dando Drilling Facility, and others. Key partners are Cardiff Harbour Authority, City of Cardiff Council, WDS Green Energy, David Tucker (Nu Vision Energy (Wales)), and Innovate UK/ BEIS. @BGSWales

Wednesday, 13 February 2019

BGS and Heriot-Watt Partnership in Action: Geochemistry and Carbon Burial at the BSRG AGM 2018...by Joe Emmings

Joe visiting Hutton’s Unconformity at Siccar Point during the
BSRG AGM pre-conference fieldtrip
In late December, Joe Emmings (BGS) and Tom Wagner (Heriot-Watt University) convened Geochemistry and Carbon Burial Sessions at the British Sedimentological Research Group (BSRG) AGM. Here Joe tells us about the conference and ongoing research in this area…

Integration of geochemistry and sedimentology is vital to understanding ancient sedimentary deposits as hydrocarbon, metal or aggregate resources, and as records of climate change and carbon burial. For this reason we convened sessions focussed on this topic at the BSRG AGM recently hosted by the Lyell Centre in Edinburgh. BSRG this year involved around 300 delegates from across the UK and overseas presenting and discussing ongoing research in sedimentology and related fields.

About a third of the conference focussed on understanding modern sedimentary processes in a variety of settings, including continental, nearshore and shallow through to deepwater settings. This is based on the principle of uniformitarianism: that 'the present is the key to the past'. If we better understand modern sedimentary processes, this knowledge can then be applied to ancient deposits for a variety of purposes. For example, experimental sedimentology using flume tanks, or live monitoring of sediment density flows, can help us better understand reservoir variability, a critical parameter for oil and gas extraction or carbon capture and storage (CCS).

Geochemical research on sedimentary rocks is typically used to understand hydrocarbon reservoir and source potential. Yet this research is not limited to oil and gas, and is increasingly applicable to a wide range of areas. For example, geochemistry used to understand the mechanisms and timing of sandstone diagenesis is important for CCS. Hydrocarbon source rocks, often termed 'black shales', are enriched in redox-sensitive metals which can become mobilised and concentrated, as part of mineral systems, to produce important metal deposits. Black shales are the record of ancient basins that acted as sinks for large volumes of organic carbon and metals fixed under anoxic conditions. Topics at BSRG included the genesis of stratiform manganese deposits in Cyprus, and models for anoxia in a variety of ancient settings. Black shale research also helps us understand the impacts of global warming on modern marine systems. Modern anoxic settings, such as the Black Sea, are rare and spatially limited compared with some periods in the ancient record. Yet it is highly likely that global warming is causing the expansion of modern marine hypoxic 'dead zone' phenomena, so ancient anoxic 'events' are potential analogues for today's 'dead zones': the principle of uniformitarianism, but in reverse. Through this research we can better understand the timings, spatial extents and impacts of these anoxic events.

BSRG also hosted for the first time a special session bridging the gap between sedimentology and society. The session included presentations on microplastics in the natural environment, sedimentary geohazards and energy storage in sedimentary reservoirs. Many of these applied research areas are likely to become increasingly important if we want to realise our global decarbonisation targets as set out in the IPCC Special Report on Global Warming of 1.5°C.

Taken as a whole, the 2018 BSRG AGM shows the importance of sedimentology to a wide range of current and future applications, including energy, metal resources, construction, understanding and mitigating climate change, and ultimately society.

Joe Emmings is a Post-Doctoral Research Associate in Geochemistry at the British Geological Survey’s Stable Isotope Facility and Centre for Environmental Geochemistry. Please contact Joe if you are interested in his research field at josmin65@bgs.ac.uk

Tuesday, 5 February 2019

Seeing is knowing: From physics, philosophy, and Shakespeare to a new set of geological visualisation models for the UK...by Katie Whitbread

Central England Geological 3D Model
Observations are the root of scientific knowledge. Because we can observe distant galaxies in the vast expanse of space, we can learn about the laws that control our universe and uncover its nearly fourteen-billion-year history. Perhaps the relationship between seeing and knowing is too obvious to be particularly striking. But turning the premise on its head leads to some interesting questions: What do we know about things we cannot see? What can we know about things that are beyond our capacity to observe? It is easy to shrug our shoulders and assume that things we can't see don't matter. But what if it is important that what is out of sight is not also out of mind?

Take the ground beneath our feet – of course we know it is there, for the most part we can take its firmness and solidity for granted. So long as it doesn’t give way we can go about our business. Job done. But what is the ground beneath our feet made of? What actually happens down there? The answers really do matter…

What the ground is made of has far reaching impacts, stretching well beyond giving us a place to stand (and build). The Earth supplies much of our energy, all our resources of metal, aggregates, and stone, a lot of our salt and water, and (apart from the space junk we leave floating in orbit) it must also store all our material waste. Society is not only built on the earth, it is built from it. So we all have a stake in our planet.

South Wales Geological 3D Model
For many decades, conversations about rocks, sediments and landscapes were held largely in the relatively 'niche' domains of science and industry. Now, our search for energy, and decisions about how we manage our environment and resources, are increasingly matters of public debate. Geology matters to us all.

As the British Geological Survey, our role is to ‘shine a light’ into the subsurface – using a range of tools for investigating what is hidden below ground. This helps us deliver targeted insights to support industry, regulators and planners, but it doesn’t stop there. We also deliver open access resources designed to provide everyone and anyone with the opportunity to see into the subsurface, to learn more about what is down there, and how we know.

With the digital revolution in geoscience, 3D geological models are now offering enhanced visualisation of the subsurface – literally allowing us to see into the ground. This mysterious hidden domain that was once largely conceptualised within the minds of geologists, and depicted in codified form on geological maps, is at last being opened up and revealed to everyone.

Marking a new stage in our delivery of open access resources for the UK, BGS have now launched a new set of 14 Regional Geological Visualisation Models, developed as 3D PDF 'documents' and available from the BGS website. The models, initially covering England, Wales and Northern Ireland (with models for Scotland planned), reveal the 3D geology of the UK by linking the UK3D national fence diagram of cross-sections with 1:625 000 bedrock geological maps. A range of interactive tools has been designed to allow users to navigate and explore, revealing the hidden structure of the world beneath us.


So this is an invitation to take a look, to learn about your world, to see the places you know from a new angle. Shakespeare wrote that "all the world's a stage" – to really know this 'set' in which we go about our lives, to understand the myriad relationships that are part of our story, we first need to be able to see the Earth. So get stuck in… and when you do take a look at the models, tell us how you find them, and help us see our way to making the world beneath our feet more visible.

Monday, 4 February 2019

Can we use carbon isotopes to tell us about past levels of CO2 in the atmosphere?...by Barry Lomax and Melanie Leng

This carbon cycle diagram shows the storage and annual exchange of carbon
between the atmosphere, hydrosphere and geosphere in gigatons - or billions of
tons - of carbon (GtC). Carbon isotopes of plant materials from the geological
record have been used in the past to predict past CO2. Our research suggests
more work needs to be done to understand this proxy.
This is a public domain image from Wikipedia.
Dr Barry Lomax and Prof Melanie Leng are isotope geochemists who work on understanding how the isotopic composition of environmental materials can tell us about past environments. Here they blog about their new paper, available via open access in the premier geochemistry journal Geochimica et Cosmochimica Acta and co-authored by Dr Janice Lake and Dr Phillip Jardine, on the use of carbon isotopes in plant materials to predict atmospheric CO2. The paper sets out to test this relationship to determine if it could be used as a tool for estimating changes in atmospheric CO2 concentrations through geological time.
  
Understanding both the long-term carbon cycle and the rapid perturbations in atmospheric CO2 observed through the geological record has become increasingly important as we enter a period of human-induced carbon cycle variations, especially the rapidly increasing atmospheric CO2. A major limiting step in understanding the climate system's sensitivity to changes in atmospheric CO2 over geological time has been the variability in modelled estimates of 'palaeo-CO2' concentration, which differ considerably between modelling approaches. To validate estimates of past CO2 we need CO2 proxies; those currently most commonly used include the numbers of stomata in fossil leaves and the carbon isotope chemistry of fossil organisms. The carbon isotope composition of fossil plant material is thought to provide a direct measure of past atmospheric CO2 concentrations through geological time, but there is extensive scientific controversy and debate in the literature. We set out to validate this approach, which, if successful, would allow us to assess changes in the amount of CO2 in the atmosphere from the dawn of vascular plants in the Lower Palaeozoic, and would help us predict the responses of modern plants to changing CO2 concentrations.

From a physiological standpoint, changes in the carbon isotope composition of plant tissue are linked to changes in the water use efficiency of the plants, which is ultimately controlled by the opening and closure of the stomatal pores that regulate gas exchange. For carbon isotopes to be used as an accurate and precise method to reconstruct CO2, the major requirement is to demonstrate that changes in CO2 are the main driver of changes in the carbon isotopes. This relationship needs to be independent of other environmental conditions that can affect the way plants use water, such as temperature and, specifically, water availability.
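This stomatal link is usually described with the classic Farquhar et al. (1982) discrimination model for C3 plants. The sketch below uses the standard textbook coefficients, not values from the study, and illustrates the confounding effect being tested: drought closes stomata, lowering ci/ca and shrinking discrimination regardless of atmospheric CO2.

```python
def delta13c_discrimination(ci_over_ca, a=4.4, b=27.0):
    """Simplified Farquhar model of carbon isotope discrimination in
    C3 plants (per mil): Delta = a + (b - a) * ci/ca, where a is
    fractionation during diffusion through stomata and b is
    fractionation by Rubisco carboxylation."""
    return a + (b - a) * ci_over_ca

well_watered = delta13c_discrimination(0.7)  # open stomata, high ci/ca
droughted = delta13c_discrimination(0.4)     # closed stomata, low ci/ca
print(f"well watered: {well_watered:.2f} per mil, droughted: {droughted:.2f} per mil")
```

The ~7 per mil spread between the two watering regimes here is driven entirely by ci/ca, with CO2 held fixed, which is exactly why water availability can masquerade as a CO2 signal in the fossil record.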

Arabidopsis thaliana (common name Thale cress),
a small flowering weed used as a model plant
in many science fields. Copyright Wikipedia.
In our experiment we used Arabidopsis thaliana (common name Thale cress), a small flowering plant native to Eurasia and Africa, where it is considered a weed as it readily colonises roadsides and disturbed land and has a short life cycle. Thale cress has a relatively small genome, and it was the first plant to have its genome sequenced. It is used to understand the molecular biology of many plant traits, including flower development and light sensing. Thale cress is an ideal experimental plant, as its short life span allows many experiments to be carried out within a short period of time. In our experiments, the Thale cress was exposed to different watering regimes (low, medium and saturated) and grown over a wide range of CO2 concentrations (from 380 to 3000 ppm; the current amount of CO2 in the atmosphere is over 400 ppm and climbing rapidly), spanning conditions that plants would have been subjected to through plant evolution.

We compared the concentration of CO2 in the growing environment to the values predicted by the carbon isotope composition of Thale cress. The data show that there is wide variation in the carbon isotope composition of Thale cress as a function of water availability, and that the amount of CO2 in the atmosphere was actually less important. In particular, there was a strong under-prediction of CO2 in experiments designed to simulate the very high atmospheric CO2 levels (≥1500 ppm) of the Mesozoic and Cenozoic eras (roughly 250 to 2 million years ago). Our experiment casts doubt on the use of carbon isotopes in plant material as a proxy to reconstruct palaeoatmospheric CO2, and suggests other aspects of the growth environment probably have a greater influence on the carbon isotope composition of plant matter. For now, other proxies probably provide more reliable estimates of palaeo-CO2; these include changes in stomatal frequency, the boron isotope composition of foraminifera, and the carbon isotope composition of marine carbon, including sedimentary alkenones, dinoflagellate cysts and coccoliths…

Dr Barry Lomax is a lecturer in Environmental Science at the University of Nottingham and his research is focused on quantifying how the Earth's climate has changed over geologic time, how these changes have influenced the Earth's terrestrial biosphere and how in turn the Earth's terrestrial biosphere has influenced climate. Dr Janice Lake is an Independent Research Fellow at the University of Sheffield focussing on plant physiological responses to atmospheric CO2. Dr Phillip Jardine is a researcher at the University of Münster with interests in the fossil pollen and the development of palaeoclimate proxies. Prof Melanie Leng is the Director of the Centre for Environmental Geochemistry at the British Geological Survey and the University of Nottingham and leads a lab group in stable isotope geochemistry. Twitter @MelJLeng

Friday, 1 February 2019

Simulating the human gut – the story of BS ISO 17924:2018...by Joanna Wragg, Mark Cave and Paul Nathanail

Twenty years in the making, the Unified BARGE Method (UBM) of measuring bioaccessibility has become a British and International Standard method used to improve estimates of the risks from ingesting contaminated soil.


As a geological survey, we are interested in the details of soil chemistry and how that affects people’s health.  It’s not just about measuring the total concentration of elements, but more about estimating risks from contaminant absorption into the bloodstream.

Children regularly eat soil, usually accidentally but in some cases deliberately. This soil can contain important nutrients, but also contamination. Ingestion can happen by licking fingers after playing in the garden, rubbing faces with soily hands and making ‘mud pies’.

Human exposure to soil contaminants may damage health. It is therefore important that we know how much soil we ingest, the amount of contamination within it, and its mineralogy, so that we can better assess the risks of activities such as leisure use of parks and gardens, or support regeneration of post-industrial brownfields in our cities.

Any soil contaminant that dissolves in the gastro-intestinal juices is termed ‘bioaccessible’. If dissolved contaminant is transferred through the gastro-intestinal wall and into the blood, it is ‘bioavailable’ and has the potential to reach organs where it can cause harm. Any remaining bioaccessible contaminant and undigested soil is excreted.

The Northampton Sand Formation has naturally high concentrations of arsenic. However, our research at several proposed housing sites showed that only a small fraction would dissolve in the stomach juices and potentially pass into the bloodstream. This helped risk assessors from Land Quality Management Ltd demonstrate to the local authority that remediation was not needed before safe development of the land for much-needed homes.

The longer-term legacy of our industrial past is now becoming evident. Some contaminants can still be detected at potentially toxic levels decades after they were first released. We have shown that bioaccessibility studies on soils and made ground at post-industrial brownfields can improve decisions on future land uses.


We can measure the bioaccessible fraction of soil contaminants by simulating the chemical and physical conditions found in the human gastro-intestinal system, e.g. the composition of fluids, body temperature and the amount of time solids stay in the gut. 
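At its simplest, a bioaccessibility result is a mass balance: the contaminant dissolved in the simulated gut fluid divided by the total contaminant in the soil aliquot. A sketch with hypothetical numbers (not UBM protocol values):

```python
def bioaccessible_fraction_pct(extract_conc_mg_l, extract_vol_l,
                               soil_mass_g, total_conc_mg_kg):
    """Bioaccessible fraction (%) of a soil contaminant: mass dissolved
    in the simulated gastro-intestinal fluid as a share of the total
    mass present in the soil aliquot."""
    dissolved_mg = extract_conc_mg_l * extract_vol_l
    total_mg = total_conc_mg_kg * soil_mass_g / 1000.0  # mg/kg * g -> mg
    return 100.0 * dissolved_mg / total_mg

# Hypothetical test: 0.6 g of soil at 50 mg/kg arsenic, with
# 0.02 mg/L arsenic measured in 0.06 L of simulated gut fluid.
baf = bioaccessible_fraction_pct(0.02, 0.06, 0.6, 50.0)
print(f"{baf:.0f}% bioaccessible")  # 4% -- most of the arsenic stays mineral-bound
```

This is why a soil with a high total concentration can still pose a low risk: what matters is the numerator, not the total.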


For the last 20 years, staff at BGS and our industrial and international research partners, have been developing a robust laboratory based method for the correct estimation of bioaccessibility.  Such estimates provide defensible information for use in risk assessment and policymaking.


The BGS is part of the BioAccessibility Research Group of Europe (BARGE). BARGE started as a small network of European research teams, comparing methods for a range of soils. We have now expanded to 20 researchers, with groups from Canada and America, and we are looking forward to more joining us. Perhaps we should change our name from 'Europe' to 'Everywhere'!


Back in 2005, BARGE decided to use a single method with agreed gastrointestinal parameters for measuring bioaccessibility; the Unified BARGE Method was born. We worked together on a pooled set of samples from various countries with different concentrations of potentially toxic elements: arsenic, lead and cadmium. The results of a round robin trial were compared and the group explored the whats and whys of any differences in the data.
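A common way to summarise a round robin trial is the between-lab relative standard deviation of the participating labs' results. A minimal sketch with hypothetical lab means (not BARGE data):

```python
from statistics import mean, stdev

def between_lab_rsd_pct(lab_means):
    """Reproducibility of a round robin trial: the relative standard
    deviation (%) of the participating labs' mean results on the
    same shared sample."""
    return 100.0 * stdev(lab_means) / mean(lab_means)

# Hypothetical bioaccessible-lead results (mg/kg) from five labs
# analysing one pooled soil sample.
labs = [118.0, 125.0, 121.0, 130.0, 116.0]
print(f"between-lab RSD: {between_lab_rsd_pct(labs):.1f}%")
```

A low RSD across labs is what turns a research method into a candidate for standardisation, since regulators need results to be reproducible wherever the test is run.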


Then there was the question of validation: 'How well does our in vitro test compare with in vivo analogues?' Colleagues in France at the University of Lorraine and the French National Institute for Industrial Environment and Risks (INERIS) validated the UBM measurements in vivo using a pig analogue, resulting in a paper in the high-impact journal Environmental Science and Technology (Denys et al., 2012).


The UBM has now been used and cited internationally by a range of science disciplines, including environmental, soil, toxicology, public health, pharmacology and medical sectors and helped inform many risk assessments.


In 2007, the International Organization for Standardization (ISO) published guidance on the application and selection of physiologically based extraction methods to estimate the human bioaccessibility/bioavailability of metals in soil. In 2018, BARGE's hard work paid off when a new standard was introduced: BS ISO 17924:2018 specified the UBM as the method to use.

From the start of the journey through to the publication of the ISO standard, the BARGE group and researchers around the world have published over 30 peer reviewed papers and articles for non-specialists and given oral and poster presentations at too many conferences for us to remember.


The UBM has been used by researchers and commercial laboratories alike for assessing the human health risk from contaminants in soil, herbal medicines, changing land use, dust, air particulates, mine waste and food. It has been coupled with other laboratory based methods to understand why a contaminant is soluble in the gastro-intestinal environment, not just how much is available to do harm. Bioaccessibility information can be used by environment & health professionals, amongst others, to make better informed decisions on the need for remediation (and its success), land redevelopment and planning applications, and to potentially reduce the cost associated with returning brownfield sites to beneficial use.

In 2009, NERC funded a study on the financial impacts of bioaccessibility testing. It showed the savings achieved by using it at potentially contaminated sites. The results are given in the table. Similar savings will have been made at an increasing number of contaminated sites since the study reported.

More recently, UBM measurements have been used to predict the bioaccessibility of contaminants at a regional scale. The map combines UBM data with geochemical survey data on soils to predict the bioaccessibility of arsenic in the soils of south-west England.
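Regional prediction of this kind is, in outline, a calibration exercise: regress measured bioaccessible concentrations against total concentrations from the survey (real workflows also use other soil variables), then apply the fit where only totals exist. A toy single-variable sketch with hypothetical values:

```python
def fit_linear(xs, ys):
    """Ordinary least squares fit y = a + b*x; returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

# Hypothetical calibration pairs: total As (mg/kg) from the geochemical
# survey vs UBM bioaccessible As (mg/kg) measured on the same soils.
total = [20.0, 40.0, 60.0, 80.0]
bioaccessible = [3.0, 7.0, 9.0, 13.0]
a, b = fit_linear(total, bioaccessible)
# Predict bioaccessible As for an unmeasured soil with 50 mg/kg total As.
print(f"predicted bioaccessible As: {a + b * 50.0:.1f} mg/kg")
```

The pay-off is coverage: a few hundred UBM measurements can be leveraged across thousands of existing survey samples to map risk regionally.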
 
Please visit the BARGE website for a flavour of what has been done by the group, and who is involved. Be prepared for some photos of when we all looked a lot younger. 

The development, validation and now publication of an ISO standard is an important scientific milestone. But sometimes, when you are in the thick of it, you don’t realise how many people and how long it takes to go from inception to completion. Don’t forget to have a look at BS ISO 17924:2018 via our library.

Wednesday, 23 January 2019

Our use of fire retardants: out of the frying pan into the environmental fire...by Christopher Vane

Forty five Thames surface sediments taken to evaluate environmental BFR
Why do we use them: Most of us don't know it, but every day we come into contact with fire retardant chemicals via clothing, furniture, polyurethane foam, power cables, circuit boards, plastics in electronic devices such as televisions and tablets, and building materials. The very screen you are viewing probably contains a range of brominated flame retardants (BFRs) which have been added to capture free radicals and inhibit flame formation. In other words, these are really useful compounds that make it difficult for your device to burn; in effect they buy us time in the event of a fire and save lives. More than 75 different BFRs have been manufactured since the 1970s, and their use has risen in parallel with our seemingly insatiable demand for complex yet miniaturised devices that are fire-safe.
 
Changing of the Guard: In the early 2000s, scientists found that some common BFR formulations were harmful to animal and human health. For example, studies with rats and mice showed negative effects on the pancreas (diabetes), nervous system, immune system and reproductive system. As a result, these chemical formulations were prohibited within the European Union in 2004. A few years later, BGS investigated the presence of these compounds in sediments from the Glasgow reaches of the River Clyde, Scotland. More recently, additional restrictions on the manufacture and use of another BFR have followed: it was listed in 2017 under Annex A of the Stockholm Convention, which severely restricts its use due to environmental concerns. Consequently, a variety of so-called novel brominated fire retardants (NBFRs) have been developed and added to products in order to maintain fire safety, but information on the environmental presence of these compounds was lacking.
 
BFR Hunters: A collaborative team of scientists from the University of Birmingham’s School of Geography, Earth and Environmental Sciences, Thermo Fisher Scientific and BGS’s Organic Geochemistry laboratory used the very latest mass spectrometry to identify both old and new brominated flame retardants in surface sediments of the River Thames (see here). Birmingham PhD student Aristide Ganci found that the replacement BFR compounds were present at concentrations similar to their legacy equivalents, and that their occurrence corresponds with proximity to London’s main sewage works and the docks at Tilbury. Close inspection of the geochemistry results has revealed the first ever UK occurrence of selected compounds in the environment. The implication is that the newer BFR formulations are rapidly entering our urban rivers and, because the chemicals do not easily decompose, may serve as a useful chronological tracer.

For further information please contact Chris Vane     
 

Wednesday, 16 January 2019

Inorganic Geochemistry in Kenya Part II…by Olivier Humphrey

Job, David, Olivier and Doreen outside the Biotechnology Labs,
University of Eldoret
I was recently involved in fieldwork aimed at assessing micronutrient status and monitoring exposure rates to potentially harmful elements in western Kenya. Michael Watts, Andy Marriott and I visited the University of Eldoret, Kenya, for a 16-day field trip collecting environmental and human biomonitoring samples from over 90 households in western Kenya. For more information see the earlier blog: Geochemistry and Health in the Kenyan Rift Valley. This was my second trip to Kenya for fieldwork, the first being in January earlier this year.
 
During the sample collection phase of the trip, we were based in Kericho before moving to Kisumu, on the edge of Lake Victoria in Winam Gulf; Andy would remain there at the end of our sampling trip for a project of his own investigating aquaculture in Lake Victoria, with more to follow in a future blog! Our field teams consisted of three vehicles led by Michael, Andy and me, alongside our collaborators: Professor Odipo Osano (University of Eldoret), Dr Diana Menya (Moi University), technical lab staff David Samoei and Doreen Meso (University of Eldoret), drivers and field assistants. In addition to the field teams, we also worked with public health officers (PHOs) from each county, who assisted with household entry, sample collection and translation. During the fieldwork, we trained our partners in how to collect environmental samples and record all the necessary information at each site, with a particular focus on accurate record keeping. After five days in the field, we had collected more than 800 samples and it was time to get back to the laboratory in Eldoret to process them!

Once in the labs at the University of Eldoret, I provided training in sample checking and processing. In addition to David and Doreen, who had helped in the field, we were joined by MSc student Job Isaboke and six undergraduate students studying environmental science. Having David and Doreen with us in both the field and the laboratory gave them a greater understanding of the importance of quality assurance, which they can in turn instil in their students. In terms of sample preparation, all the fresh fruits and root vegetables had to be peeled, chopped, frozen and vacuum packed, which took four people all day to get through! The leafy vegetable material was dried and packed for transport, while the grains, beans, pulses and nuts were ground, a task we could not have completed without the help of all the students. Finally, the soils were riffle split, allowing us to bring back a small amount of soil for analysis at BGS and to leave the bulk in the archive for future student projects at the University of Eldoret. One of the most important tasks I had to complete was ensuring that all the data we collected was handled properly. Showing our partners how to handle data collected in the field is essential to their development and their ability to collect samples independently in the future.

From L-R: Preparing grain samples for transport to BGS, Keyworth; David preparing soils for transport to BGS, Keyworth
and their own archive

The partnership with the University of Eldoret has been running for over three years; in addition to multiple sampling trips in Kenya, David visited the UK for two months of lab training under a Commonwealth Professional Fellowship (see previous blog: A model for Quality Assurance, Lab Management and Good Laboratory Practices for Africa). Based on the success of this trip, we look forward to welcoming both Odipo and Doreen to BGS for training in 2019.
Overall, the sampling trip was a success, and we now feel confident that we are developing our partners’ skills to the point where they can venture into the field and collect samples independently in areas that have been missed, strengthening the work. The fieldwork was very enjoyable and productive, and I can’t wait to work in Kenya again. Keep an eye out for future blogs!


Acknowledgements to the wider team:


University of Eldoret: Jackson Masai, Charles Owano, David Samoei, Prof. Odipo Osano, Doreen Meso, Melvine Anyango, Job Isaboke and all of the undergraduate students for help in the field and lab
Moi University: Dr Diana Menya, Esilaba Anabwani - ESCCAPE, Eldoret (Esophageal Squamous Cell Carcinoma Africa Prevention Effort), Amimo Anabwani- ESCCAPE, Eldoret
British Geological Survey: Dr Michael Watts, Olivier Humphrey – CEG PhD student, Dr Andy Marriott
Public Health Officers: Many thanks to the PHOs from all the counties

Monday, 14 January 2019

Setting ‘long tail’ geological data free...by Mike Stephenson, Qiuming Cheng, Junxuan Fan, Chengshan Wang and Roland Oberhänsli

‘Long tail’ data is the difficult-to-get-at data that sits in libraries, institutes and on the computers of individual scientists. Informatics specialists like to contrast it with the smaller number of large, more accessible datasets. The name ‘long tail’ derives from graphs of the size of datasets against their number: there are relatively few large datasets and a lot of smaller ones. Geological science has more long tail data than sciences like physics or meteorology, probably because historically it has been less associated with big science infrastructure and sensors.


Recovering data islands


The fact that ‘long tail’ data is difficult to get at probably holds back progress in geological science. The low discoverability of long tail geoscience data, and its heterogeneity, make it difficult to bring together to gain the benefits of machine learning and artificial intelligence. Informaticians describe ‘data islands’ in geological libraries, in the records of geological surveys and on desktop computers.
Two new projects aim to improve the discoverability of these data islands and the ease of compilation (interoperability) of geoscience data. One looks at the fundamentals of making the historical paper data in these islands available to cyberspace; the other seeks to bind already-digital data together to answer some of the biggest geoscience questions that remain.


Collaborative projects


The first project is a unique collaboration between a geological survey and the academics and computer scientists of the GeoBioDiversity database (GBDB). Archipelagos of data islands exist within the geological surveys of the world. An example is British Geological Survey (BGS) biostratigraphical data associated with about 3 million fossils and thousands of localities and stratigraphic sections, gathered over 150 years from all over the country to exacting and consistent standards. The data has great potential for science, but much of it is contained within paper documents or simple document scans and so is inaccessible to big data tools. It needs lifting from the page into cyberspace. The BGS recently began working with GBDB (the official database of the International Commission on Stratigraphy (ICS)), which is unique in being the only large database to hold sequences of fossils tied to sections, rather than just spot collections. To date, GBDB and BGS scientists have placed live, manipulable data from more than 6000 UK stratigraphic sections on a public access website. The project is also using machine-learning methods to extract biostratigraphical information directly from text.

Deep Time Digital Earth


Another, even more ambitious, project seeks to make already-digital geoscience data work together better. The Deep Time Digital Earth (DDE) project has just been approved as the first of the International Union of Geological Sciences (IUGS) ‘Big Science’ programs, and its aim is to harmonize deep time geological data. Again the focus is on data islands, but this time the islands are already digital yet are not linked or cannot easily be used together. Through DDE, data will be made available in easily used ‘hubs’ providing insights into the distribution and value of earth resources and materials, as well as earth hazards. Data brought together in new ways may provide novel glimpses into the Earth’s geological past and its future.

An example of how DDE will work concerns the evolutionary history of the biosphere. Previous analyses of long-term paleobiodiversity change were mostly at a resolution of ~10 million years, which is too coarse to reveal fine details of past biodiversity changes. Linked databases in DDE could provide high-resolution (10-100 kyr) diversity patterns. In the realm of minerals, DDE could, for example, provide integration of database systems for mapping clusters of porphyry copper deposits (PCDs) by linking georeferenced plate motion and geometric properties of subducted slab data. Linked databases in DDE, including African groundwater and aquifer data, recharge data, meteorological data, sediment flux, subcrop geology, basin subsidence, sequence stratigraphy, compaction, geomechanics and tectonics data, could lead to more accurate models for African groundwater storage, underpinning sustainable development in poor countries vulnerable to climate change.

Earth sciences supporting broad-based scientific studies


The DDE is closely consistent with the vision of the IUGS, which is to promote development of the Earth sciences through the support of broad-based scientific studies relevant to the entire Earth system. It brings together an almost unique range of partners, including the ICS, the International Association of Palaeontology (IAP), the International Association of Sedimentologists (IAS), the Society for Sedimentary Geology (SEPM) and the International Association for Mathematical Geosciences (IAMG). Major geological surveys and institutes, including the China Geological Survey (CGS), the BGS and the All Russian Geological Institute (VSEGEI), are also involved.

These institutions are coming together at a time when informatics and computing are evolving fast, and when a wider range of geoscience data is becoming available than ever before. In this way DDE may help to solve some of the biggest geoscience questions that remain.

DDE is being linked with UNESCO, the International Geosphere-Biosphere Programme (IGBP), the Global Sedimentary Geology Program (GSGP), the International Geoscience and Geopark Program (IGGP), the Commission of the Geologic Map of the World (CGMW), the Global Geochemical Baseline (GGB), the International Lithosphere Program (ILP), and OneGeology. DDE will also operate the full FAIR data concept (Findable, Accessible, Interoperable, and Re-usable) and link to desktop systems for geoscientists all over the world as well as to students and teachers in classrooms and on the internet.

Building bridges between data islands


Geology could be said to have lagged behind the other physical sciences in capitalizing on its big data, but DDE will enable bridges to be built between data islands and data to be interrogated with modern tools, tackling some of the most important and pressing questions of our time. The project has an ambitious time frame, and aims to report its first progress at the 36th International Geological Congress in New Delhi in March 2020.

This article has been written by Prof Mike Stephenson, Director of Science and Technology at the BGS, Qiuming Cheng, President of the IUGS, Chengshan Wang, Chinese Academy of Sciences and Professor at the China University of Geosciences, Junxuan Fan, Director of the GeoBioDiversity database, and Roland Oberhänsli, Past-President of the IUGS.

Friday, 11 January 2019

Updating the World Magnetic Model: From the centre of the Earth, straight to your pocket


What links the centre of the Earth, billions of smartphones, and BGS scientists? The answer: the recently updated World Magnetic Model (WMM).

Will Brown of the BGS Geomagnetism Team explains.

The WMM describes the primary component of the geomagnetic field and is normally produced every five years. It also predicts the Earth’s field for the following five years. Sometimes, however, the Earth’s core behaves in an unexpected manner, and so we have recently updated the current WMM2015 with the release of WMM2015v2.

At BGS we monitor and map the Earth’s magnetic field using a global network of surface observatories, including nine of our own, and satellites in low-Earth orbit such as ESA’s Swarm mission. We use these measurements to build models of the magnetic field that allow us to interpolate between our measurements and estimate the strength and direction of the field at any location.

A model can be thought of as a map of the “topography” of Earth’s magnetic field, but what many people don’t realise is that Earth’s magnetic field isn’t a single fixed feature: it is a combination of many effects, and it changes through time.

Map of magnetic variation through time from 1900 to 2015 – the magnetic poles are where the strong red and blue contours converge, and the north magnetic pole has moved very quickly in recent years.

The magnetic poles drift, the field strengthens and weakens, and the immense magnetic field of the Sun, carried by the solar wind, constantly batters it from the outside. The effects of all these changes vary depending on when and where you are on, under, or above the Earth’s surface. Our models dissect the measured field into its different parts, or sources, and provide a map of each that varies in time, even predicting the future of some parts.

So where does your phone come in? The WMM is the standard magnetic model used for navigation by organisations such as NATO, the UK Ministry of Defence and the US Department of Defense, and also by smartphone operating systems such as Android and iOS. When you open your smartphone’s map app, you may see an arrow pointing the way you’re facing, and there’s something quite clever going on underneath. Your phone contains a magnetometer that measures the Earth’s magnetic field. To make sense of this information, a reference model like the WMM is needed to correct the magnetic north measured by your phone to true north. You go through the same procedure if you use a map and compass when out hiking: set your compass for the map’s magnetic variation adjustment (we provide these for Ordnance Survey maps too!), then convert your compass reading of magnetic north to a direction relative to the map’s grid north.
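The correction itself is simple arithmetic: compute a heading from the magnetometer’s horizontal field components, then add the local magnetic declination supplied by a model like the WMM. Here is a minimal sketch, assuming a device held flat and an illustrative axis convention (the function name is ours; real phone implementations must also handle sensor calibration and tilt compensation):

```python
import math

def true_heading(mag_x, mag_y, declination_deg):
    """Convert horizontal magnetometer readings to a true-north heading.

    mag_x, mag_y: horizontal field components from the device's magnetometer
    (device held flat). declination_deg: magnetic declination at the user's
    location, e.g. looked up from the WMM (positive when magnetic north lies
    east of true north).
    """
    # Heading relative to magnetic north, in degrees clockwise.
    magnetic_heading = math.degrees(math.atan2(mag_y, mag_x)) % 360.0
    # Adding the declination converts a magnetic heading to a true heading.
    return (magnetic_heading + declination_deg) % 360.0

# A magnetic heading of 45 degrees with a declination of -1.5 degrees
# (magnetic north west of true north) gives a true heading of 43.5 degrees.
print(true_heading(1.0, 1.0, -1.5))
```

The same addition is what a hiker does when adjusting a compass bearing by the magnetic variation printed on the map.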


Magnetic variation in degrees – you really need to set your compass correctly when hiking!
 
The rate-of-change of magnetic variation in degrees per year – the changes are quickest near the north pole.

The WMM is a joint effort from BGS and the US NOAA NCEI, on behalf of the UK’s Defence Geographic Centre and the US National Geospatial-Intelligence Agency. The WMM is a model of the primary component of the geomagnetic field: that of Earth’s core. The core field, which gives us our familiar magnetic poles and allows us to use a compass, is generated by dynamo action in the swirling iron-rich fluid of the outer core, roughly 3,500 km below our feet. The ever-changing flow of the outer core leads to an ever-changing magnetic field. This is a complex process whose physics we don’t yet fully understand, and so we have to update our model regularly.

Since late 2014 the core field has varied in an unpredicted, and currently unpredictable, manner. This led to the WMM becoming less accurate, particularly at high northern latitudes, much faster than normal, and so we released an update ahead of the next regularly scheduled WMM release in late 2019. We can map the field changes that have occurred since 2015, and show that they seem to be related to two phenomena, an abrupt unpredictable change called a “geomagnetic jerk” in 2014/2015, and an acceleration of flow in the core in the northern hemisphere. This update to the WMM will be used until the next release in December 2019, when we’ll make our best estimate of the likely change in the core field until 2025.
The change in the vertical component of the magnetic field at the core-mantle boundary between 2015 and 2018 – the three intense patches in the northern hemisphere are related to changes like the Livermore et al. (2017) “core jet” model.