
How ‘weather bombs’ could help reveal Earth’s innermost secrets


How the seafloor quivers under an intense storm called a “weather bomb” could help reveal Earth’s innermost secrets.

Using a network of seismic sensors, researchers in Japan detected a rare type of deep-Earth tremor originating from a rapidly strengthening cyclone over the North Atlantic Ocean.

Tracking how these newfound shakes ripple through the globe will help geoscientists map the materials that make up the planet’s depths, the researchers report August 26 in Science.

“We’re potentially getting a suite of new seismic source locations that can be used to investigate the interior of the Earth,” says Peter Bromirski, a geophysical oceanographer at the Scripps Institution of Oceanography in La Jolla, Calif., who wrote a commentary on the new research in the same issue of Science. “Further investigations will refine our understanding of how useful these particular waves will be.”

Tremors traveling through the ground speed up, slow down or change direction depending on the type of material they pass through. Carefully measuring these movements from earthquake waves has allowed scientists to gather clues about the structure and composition of Earth’s deepest layers.

Some regions — the middle of tectonic plates under the ocean, for instance — don’t see many earthquakes, though. Luckily, weather bombs can generate their own seismicity. Whipping winds can stir up towering ocean swells. When two opposing ocean swells collide, the meet-up can send a pressure pulse down to the ocean floor. The pulse thumps the seafloor, producing seismic waves that penetrate deep into the planet.

Scientists had previously detected only one type of these storm-generated seismic waves, called P waves. P waves cause a material to compress and stretch like an accordion in the same direction that the wave travels. The other variety, called S waves, has proved more elusive. S waves formed by storms are typically weaker than P waves and cause material to ripple perpendicular to the wave’s path. The effect is similar to when one end of a garden hose is jerked up and down, producing waves that travel along the hose’s length.

Seismologists Kiwamu Nishida of the University of Tokyo and Ryota Takagi of Tohoku University in Sendai, Japan, hunted for the elusive S waves using a network of 202 seismic stations in Japan. Typically, the waves are lost within Earth’s natural seismic background noise. By combining and analyzing the data collected by the extra-sensitive seismometers, however, the researchers were able to tease out the S wave signals.
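
The noise-suppression idea behind that approach, stacking records from many stations so a coherent signal adds up while incoherent noise averages out, can be illustrated with a toy model. All numbers here are illustrative, and the actual study used far more sophisticated array processing than simple averaging.

```python
# Toy model of signal stacking: a weak, coherent "S wave" buried in
# per-station noise emerges when many traces are averaged.
import math
import random

random.seed(0)

N_STATIONS = 202          # stations in the Japanese network
N_SAMPLES = 1000
SIGNAL_AMPLITUDE = 0.1    # weak coherent signal (arbitrary units)
NOISE_AMPLITUDE = 1.0     # background noise at each station

# The same weak signal is assumed to be recorded at every station.
signal = [SIGNAL_AMPLITUDE * math.sin(2 * math.pi * k / 50)
          for k in range(N_SAMPLES)]

def rms(xs):
    return math.sqrt(sum(x * x for x in xs) / len(xs))

# Stack (average) the traces: the signal adds coherently,
# the independent noise averages toward zero.
stack = [0.0] * N_SAMPLES
for _ in range(N_STATIONS):
    for k in range(N_SAMPLES):
        stack[k] += (signal[k] + random.gauss(0, NOISE_AMPLITUDE)) / N_STATIONS

residual_noise = [stack[k] - signal[k] for k in range(N_SAMPLES)]
print(f"single-trace noise RMS: {NOISE_AMPLITUDE:.2f}")
print(f"stacked noise RMS:      {rms(residual_noise):.3f}")  # roughly 1/sqrt(202)
```

Averaging N independent noisy traces reduces the noise by roughly a factor of the square root of N, which is why a 202-station network can reveal signals far below any single instrument's noise floor.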

The waves originated from a North Atlantic cyclone, the researchers found. That storm actually produced two types of S waves. SV waves shift material vertically relative to Earth’s surface and can form from P waves. SH waves shift material horizontally and their origins are more of a mystery. Those SH waves may form from complex interactions between the ocean and seafloor, Nishida says.

Combining measurements of P, SV and SH waves will “ultimately provide better maps of Earth’s mantle and maybe even the core,” says Keith Koper, a seismologist at the University of Utah in Salt Lake City. Koper and colleagues report similar observations of S waves generated in the Pacific Ocean and detected by a Chinese seismic network in the Sept. 1 Earth and Planetary Science Letters. “It’s nice to see someone else get similar results — it makes me feel more confident about what we observed,” Koper says.



California's new climate legislation is being hailed as a breakthrough


California's state legislature has given final approval to a package of two bills that climate campaigners say contain the "most aggressive" climate goals in North America.

Governor Jerry Brown has called them a milestone for the state's environmental policies, and said he plans to sign both measures into law.

California has often led the rest of the United States in environmental legislation, and these bills are no exception.

Advocates of policy responses to climate change are hailing them as a breakthrough, noting they are extremely ambitious about reducing emissions that help cause climate change — and that they focus efforts locally, in an attempt to benefit people in the state.

"It's great to hear about saving polar bears and hugging trees, and making sure we address global warming from a world perspective [...] But how about people?" Eduardo Garcia, a Democratic state assembly member who sponsored one of the bills, told the LA Times.

Senate Bill 32 and Assembly Bill 197 also change the way emissions policies are regulated in California, giving more oversight to the legislature and less power to a state board that's been accused of having ties to polluters.

Senate Bill 32 will significantly expand ambitions to cut greenhouse gas emissions in the state, the second-largest emitter in the country after Texas.

While previous legislation from 2006, which California is on track to comply with, set a target to cut emissions to 1990 levels by 2020, the new bill demands a further reduction of 40 percent by 2030. If the state manages to meet this new goal, it would mean slashing California's total emissions by that year to half of what they were in 2006.
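
The arithmetic behind those stacked targets is straightforward. California's official 1990 baseline is about 431 million metric tons of CO2-equivalent; the 2006 figure below is an assumption used only to illustrate the half-of-2006 comparison.

```python
# SB32 target arithmetic. The 1990 baseline (~431 MMT CO2e) is the
# state's official figure; EMISSIONS_2006 is an illustrative assumption.
BASELINE_1990 = 431.0    # million metric tons CO2e
EMISSIONS_2006 = 500.0   # assumed, for illustration only

target_2020 = BASELINE_1990               # 2006 law: back to 1990 levels by 2020
target_2030 = BASELINE_1990 * (1 - 0.40)  # SB32: 40% below 1990 levels by 2030

print(f"2030 target: {target_2030:.0f} MMT CO2e")
print(f"share of assumed 2006 emissions: {target_2030 / EMISSIONS_2006:.0%}")
```

On these assumptions the 2030 target works out to roughly 260 MMT CO2e, on the order of half the state's mid-2000s emissions.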

Assembly Bill 197 addresses concerns among lawmakers and environmental groups that resources used to combat climate change so far haven't brought noticeable benefits to many local communities. By targeting the state regulator for greenhouse gas emissions, the Air Resources Board, bill sponsor Garcia said he hopes to secure improved air quality for residents.

In addition to establishing a new permanent legislative committee to oversee the Air Resources Board and imposing six-year term limits on its members, AB197 will direct the board to focus emission reduction efforts on refineries and local manufacturing.


The Air Resources Board has recently been under scrutiny for its unwillingness to curb the air pollution that causes smog and associated public health concerns, and for its ties to industry. In March this year the Sierra Club and three other environmental organizations sued the South Coast Air Quality Management District, claiming it had allowed the refinery industry to draft a proposal for smog-prevention measures that it later adopted.

Because AB197 is so specific about the types of pollution the Air Resources Board will have to focus on, business groups have pointed out it might also undermine the cap-and-trade program, which currently allows businesses to buy and sell emissions rights on a market. Environmental groups have criticized the cap-and-trade program for allowing companies to purchase carbon offsets outside the state while continuing to emit air pollution locally.

An interactive map of environmental justice issues across the country, EJSCREEN, released by the Obama administration in 2015, shows that many low-income, minority communities in Southern California in particular live in close proximity to pollution sources, among them refineries and manufacturers. (Garcia represents Coachella, one of those communities.)

This data backs up claims that the new climate legislation is designed to improve public health in these communities.


California farmers are using drones to fight the persistent drought


LOS BANOS, Calif. (AP) — A drone whirred to life in a cloud of dust, then shot hundreds of feet skyward for a bird's-eye view of a vast tomato field in California's Central Valley, the nation's most productive farming region.

Equipped with a state-of-the-art thermal camera, the drone crisscrossed the field, scanning it for cool, soggy patches where a gopher may have chewed through the buried drip irrigation line and caused a leak.

In the drought-prone West, where every drop of water counts, California farmers are in a constant search for ways to efficiently use the increasingly scarce resource. Cannon Michael is putting drone technology to work on his fields at Bowles Farming Co. near Los Banos, 120 miles southeast of San Francisco.


About 2,100 companies and individuals have federal permission to fly drones for farming, according to the drone industry's Association for Unmanned Vehicle Systems International. Federal regulators planned to relax the rules Monday on commercial drones, a move that could spur even greater use of such aircraft on farms.

Michael is descended from Henry Miller, a renowned cattle rancher, farmer and Western landowner who helped transform semi-arid central California into fertile farmland 150 years ago by building irrigation canals, some still flowing today.

Six generations later, Michael farms a 17-square-mile portion of that same land, growing melons, carrots, onions, cotton and almonds, while carrying on in the same pioneering spirit as Miller.

"I've always been a big fan of technology," said Michael, 44, mindful of how climate change is making water more precious. "I think it's really the only way we're going to stay in business."

On his 2,400-acre tomato crop alone, Michael estimates that this year his leak-detecting drones could save enough water to sustain more than 550 families of four for a year.
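
A back-of-the-envelope version of that savings claim can be built by working backward from the households figure. Every input below is an illustrative assumption, not a number from Bowles Farming.

```python
# Rough scale of "water for 550 families of four for a year".
# All inputs are assumptions for illustration.
GALLONS_PER_PERSON_PER_DAY = 100   # common rough US household estimate
FAMILY_SIZE = 4
FAMILIES = 550
GALLONS_PER_ACRE_FOOT = 325_851

family_gal_per_year = GALLONS_PER_PERSON_PER_DAY * FAMILY_SIZE * 365
total_gal = family_gal_per_year * FAMILIES
acre_feet = total_gal / GALLONS_PER_ACRE_FOOT

print(f"≈ {total_gal / 1e6:.0f} million gallons (≈ {acre_feet:.0f} acre-feet)")
```

On these assumptions the claim corresponds to roughly 80 million gallons a year, or about 250 acre-feet, a meaningful amount on a 2,400-acre irrigated crop.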

California endured the driest four-year period on record before a relatively wet and snowy winter this year overflowed some reservoirs in the northern part of the state. Southern California, however, remains dry, and the statewide drought has not ended.

Beyond California, drones are becoming fixtures on farms in places such as Canada, Australia, South Africa and Latin America as they become more affordable and easier to use, said Ian Smith of DroneDeploy, a San Francisco-based industry leader in drone software development.

A farmer can order a commercial-grade drone online for $2,000 and receive it in the mail days later, he said. Its video camera is then paired up with a smartphone or computer tablet that is used to control the drone.

"Hook it up to a smartphone. Boom. Take off and you're in business," Smith said.

Many farmers, however, have yet to grasp the full potential beyond capturing video images of crops or using infrared cameras to spot color variations in the plants that can signal a problem.

Few have used technology and invested in it to the degree Michael has. This year he began using the thermal camera, which can cost up to $10,000 and can show moisture variations in soil. He also created a new management position at his company dedicated to overseeing drones.

Recently, Danny Royer, the new vice president of technology at Bowles, stood at the tailgate of his pickup studying live images transmitted to the screen of his tablet as a drone buzzed 300 feet overhead.

Rows of mature tomato plants appeared on the screen in glowing burnt orange, indicating warmer, drier areas, while dark patches of purple showed the cool moist soil hidden below the plants.

After taking the images back to his office to analyze them, he decided there were no leaks to repair, but the soil needed to be enriched in places to help the field grow evenly.

On Monday, the Federal Aviation Administration was scheduled to ease the rules so that operators of commercial drones that weigh less than 55 pounds will no longer need to go through the long, expensive process of earning an airplane pilot's license.

Instead, they will have to take a written test — but not an actual flying test at the controls of a plane — and will be issued a drone license for $150.

The rule change and emerging technology could make drones more attractive tools for farmers, said Brandon Stark, director of the University of California's Center of Excellence for Unmanned Aircraft Systems Safety, based at the Merced campus.

However, he said that until federal regulators clarify parts of the new rules, commercial drones must continue to fly below 400 feet, limiting their use on very large fields.

Stark is seeking what he calls the Holy Grail of drone use in agriculture — enabling them to directly diagnose what ails a tree, whether it's deficiencies in water or nutrients, or a pest — without having to send a person into the field.

"We're just getting started," Stark said. "The research is really still in its infancy."


It might be time to bring nuclear power back to America


Looks like it's time to get radioactive, America.

The US has committed to cutting 80% of the greenhouse gases it currently pumps into the air by 2050.

It's a major project, and an expensive one, too.

But Columbia University environmental economist Geoffrey Heal said nuclear power could cut those costs significantly.

Here's why:

Shifting to an 80% cleaner grid will likely involve building up solar and wind from about 6% to 66% of total electric generation on the US grid. In addition to the costs of the windmills and photovoltaic panels themselves, the country will need to build giant batteries and interstate power lines to help deal with days when one area might not get enough sun or wind. All told, Heal estimates the project will run into the trillions of dollars over 34 years.

Though far from cheap, nuclear power could help cut those costs. New plants could reduce the burden of wind and solar from 66% to 50%, Heal found, saving money in the process.

Nuclear power plants get a bad rap, mostly due to safety issues with older reactors. But a properly functioning plant only emits one thing into the atmosphere: safe, clean water vapor.


"That was a calculation I found surprising, because nuclear is something that's regarded as very expensive," Heal said, "and it's effectively priced itself out of the market these days."

Where nuclear power is now

Right now, nuclear power accounts for 20% of all electricity on the US grid.

But US nuclear infrastructure is aging. The last reactor to come online was Tennessee's Watts Bar facility back in 1996, which broke ground way back in 1973. Only one other plant, Watts Bar 2, is now under construction. At the same time, 20 old plants could shut down in the next decade. And of course, no one has offered a good solution for disposing of barrels of nuclear waste.

Nuclear power has also grown less popular in the US. In March 2016, for the first time, a majority of Americans (54%) told Gallup they opposed nuclear power.

No matter what, turning the vast inertia of American power production toward an 80% reduction will take a massive effort. Heal estimates it'll cost somewhere between $42 billion and $176 billion per year, every year between now and 2050. The bulk of that spending will go toward building solar farms and wind plants, along with giant batteries to store surplus power for days that are neither windy nor sunny.
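
Multiplying that annual range out over the 34 years to 2050 shows why the total runs into the trillions:

```python
# Cumulative cost implied by Heal's annual estimate (figures from the text).
LOW_PER_YEAR = 42e9      # dollars per year
HIGH_PER_YEAR = 176e9
YEARS = 34               # from now (2016) to 2050

low_total = LOW_PER_YEAR * YEARS
high_total = HIGH_PER_YEAR * YEARS
print(f"cumulative cost: ${low_total / 1e12:.1f} to ${high_total / 1e12:.1f} trillion")
```

That range, roughly $1.4 trillion to $6 trillion over three decades, is what puts the project between the war in Afghanistan and World War II in total cost.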


The first offshore tidal turbines in the world have been set up in Scotland


Tidal-powered turbines installed off the coast of Shetland, in Bluemull Sound, have been connected to the electricity grid and could herald a "new era" in tidal energy.

The first offshore tidal turbines in the world to deliver electricity to the grid have been set up by the company Nova Innovation (and are owned by the North Yell Development Council).

The two turbines are the first to form part of the Shetland Tidal Array. The 100-kilowatt turbines were built in part by a local company called Shetland Composites.

Tidal energy is a form of hydropower that converts the energy of the tides into electricity or other useful forms of power. Tidal power is the only technology that draws on energy inherent in the orbital characteristics of the Earth–Moon system, and to a lesser extent in the Earth–Sun system, given the effect of these celestial bodies on the sea and the creation of tides.

The devices installed at the Bluemull Sound site have the capacity to power 300 homes on the Scottish islands. They are connected to the local grid via a 1-kilometre subsea cable. Bluemull Sound is the strait between Unst and Yell in Shetland's North Isles.
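
As a rough sanity check on the 300-homes figure, the array's annual output can be estimated from its rated capacity. The capacity factor and per-home consumption below are assumptions for illustration, and the homes figure is quite sensitive to both.

```python
# Estimated annual energy from the two installed turbines.
# Capacity factor and household consumption are illustrative assumptions.
TURBINES = 2
RATED_KW = 100            # per turbine, from the article
CAPACITY_FACTOR = 0.35    # assumed for a strong tidal site
HOURS_PER_YEAR = 24 * 365

annual_kwh = TURBINES * RATED_KW * CAPACITY_FACTOR * HOURS_PER_YEAR

def homes_covered(kwh_per_home_per_year):
    return annual_kwh / kwh_per_home_per_year

print(f"annual output ≈ {annual_kwh / 1000:.0f} MWh")
print(f"homes at 2,000 kWh/yr: {homes_covered(2000):.0f}; "
      f"at 4,000 kWh/yr: {homes_covered(4000):.0f}")
```

On these assumptions the two turbines produce roughly 600 MWh a year; whether that "powers 300 homes" depends heavily on the per-home consumption assumed.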

An important aspect of the devices is their predictability: Nova Innovation told the BBC that the turbines will generate at full power across all tidal conditions.

With the grid established, Simon Forrest, managing director of Nova Innovation, stated: "We are absolutely delighted to be the first company in the world to deploy a fully operational tidal array."


Scientists have officially declared that we are living in a new geological era


It’s literally epoch-defining news.

A group of experts tasked with considering the question of whether we have officially entered the Anthropocene – the geological age characterized by humans' influence on the planet – has delivered its answer: yes.

The British-led Working Group on the Anthropocene (WGA) told a geology conference in Cape Town that, in its considered opinion, the Anthropocene epoch began in 1950 – the start of the era of nuclear bomb tests, disposable plastics and the human population boom.

The Anthropocene has fast become an academic buzzword and has achieved a degree of public visibility in recent years. But the more the term is used, the more confusion reigns, at least for those not versed in the niceties of the underpinning science.

Roughly translated, the Anthropocene means the “age of humans”. Geologists examine layers of rock called “strata”, which tell a story of changes to the functioning of Earth’s surface and near-surface processes, be these oceanic, biological, terrestrial, riverine, atmospheric, tectonic or chemical.

When geologists identify boundaries between layers that appear to be global, those boundaries become candidates for formal recognition by the International Commission on Stratigraphy (ICS). The commission produces the International Chronostratigraphic Chart, which delimits verified changes during the planet’s 4.5 billion-year evolution.


The chart features a hierarchy of terms like “system” and “stage”; generally, the suffix “-cene” refers to a geologically brief stretch of time and sits at the bottom of the hierarchy. We have spent the past 11,500 years or so living in the so-called Holocene epoch, the interglacial period during which Homo sapiens has flourished.

If the Holocene has now truly given way to the Anthropocene, it’s because a single species – us – has significantly altered the character of the entire hydrosphere, cryosphere, biosphere, lithosphere and atmosphere.

The end of an era?

Making this call is not straightforward, because the Anthropocene proposition is being investigated in different areas of science, using different methods and criteria for assessing the evidence. Despite its geological ring, the term Anthropocene was coined not by a geologist, but by the Nobel Prize-winning atmospheric chemist Paul Crutzen in 2000.

He and his colleagues in the International Geosphere-Biosphere Program have amassed considerable evidence about changes to everything from nutrient cycles to ocean acidity to levels of biodiversity across the planet.

Comparing these changes to those occurring during the Holocene, they concluded that we humans have made an indelible mark on our one and only home. We have altered the Earth system qualitatively, in ways that call into question our very survival over the coming few centuries.

Crutzen’s group talks of the post-1950 period as the “Great Acceleration”, when a range of factors – from human population numbers, to disposable plastics, to nitrogen fertilizer – began to increase exponentially. But their benchmark for identifying this as a significant change has nothing to do with geological stratigraphy. Instead, they ask whether the present period is qualitatively different to the situation during the Holocene.


Rocking out

Meanwhile, a small group of geologists has been investigating the stratigraphic evidence for the Anthropocene. A few years ago a subcommission of the ICS set up the Anthropocene working group, which has now suggested that human activity has left an indelible mark on the stratigraphic record.

The major problem with this approach is that any signal is not yet captured in rock. Humans have not been around long enough for any planet-wide impacts to be evident in Earth’s geology itself. This means that any evidence for a Holocene-Anthropocene boundary would necessarily be found in less permanent media like ice sheets, soil layers or ocean sediments.

The ICS has always considered evidence for boundaries that pertain to the past, usually the deep past. The WGA is thus working against convention by looking for present-day stratigraphic markers that might demonstrate humans’ planetary impact. Only in thousands of years' time might future geologists (if there are any) confirm that these markers are geologically significant.

In the meantime, the group must be content to identify specific calendar years when significant human impacts have been evident. For example, one is 1945, when the Trinity atomic device was detonated in New Mexico. This and subsequent bomb tests have left global markers of radioactivity that ought still to be evident in 10,000 years.

Alternatively, geographers Simon Lewis and Mark Maslin have suggested that 1610 might be a better candidate for a crucial human-induced step change. That was the year when atmospheric carbon dioxide dipped markedly, suggesting a human fingerprint linked to the New World colonists' impact on indigenous American agriculture, although this idea is contested.

Decision time

The fact that the WGA has picked a more recent date, 1950, suggests that it agrees with the idea of defining the Great Acceleration of the latter half of the 20th century as the moment we stepped into the Anthropocene.

It’s not a decision that is taken lightly. The ICS is extremely scrupulous about amending the International Chronostratigraphic Chart. The WGA’s suggestion will face a rigorous evaluation before it can be scientifically accepted by the commission. It may be many years before it is formally ratified.

Elsewhere, the term is fast becoming a widely used description of how people now relate to our planet, rather like the Iron Age or the Renaissance. These words describe real changes in history and enjoy widespread use in academia and beyond, without the need for rigorously defined “boundary markers” to delimit them from prior periods.

Does any of this really matter? Should we care that the jury is still out in geology, while other scientists feel confident that humans are altering the entire Earth system?

Writing on The Conversation, geologist James Scourse suggests not. He feels that the geological debate is “manufactured” and that humans' impact on Earth is sufficiently well recognized that we have no need of a new term to describe it.

Clearly, many scientists beg to differ. A key reason, arguably, is the failure of virtually every society on the planet to acknowledge the sheer magnitude of the human impact on Earth. Only last year did we finally negotiate a truly global treaty to confront climate change.

In this light, the Anthropocene allows scientists to assemble a set of large-scale human impacts under one graphic conceptual banner. Its scientific status therefore matters a great deal if people worldwide are at long last to wake up to the environmental effects of their collective actions.

Gaining traction

But the scientific credibility of the Anthropocene proposition is likely to be called into question the more that scientists use the term informally or otherwise. Here the recent history of climate science in the public domain is instructive.

Even more than the concept of global warming, the Anthropocene is provocative because it implies that our current way of life, especially in wealthy parts of the world, is utterly unsustainable. Large companies who make profits from environmental despoliation – oil multinationals, chemical companies, car makers and countless others – have much to lose if the concept becomes linked with political agendas devoted to things like degrowth and decarbonisation. When one considers the organized attacks on climate science in the United States and elsewhere, it seems likely that Anthropocene science will be challenged on ostensibly scientific grounds by non-scientists who dislike its implications.

Sadly, such attacks are likely to succeed. In geology, the WGA’s unconventional proclamation potentially leaves any ICS definition open to challenge. If accepted, it also means that all indicators of the Holocene would now have to be referred to as things of the past, despite evidence that the transition to a human-shaped world is not quite complete in some places.

Some climate contrarians still refuse to accept that researchers can truly distinguish a human signature in the climate. Similarly, scientists who address themselves to the Anthropocene will doubtless face questions about how much these changes to the planet are really beyond the range of natural variability.

If “Anthropocene sceptics” gain the same momentum as climate deniers have enjoyed, they will sow seeds of confusion into what ought to be a mature public debate about how humans can transform their relationship with the Earth. But we can resist this confusion by recognizing that we don’t need the ICS’s imprimatur to appreciate that we are indeed waving goodbye to Earth as we have known it throughout human civilization.

We can also recognize that Earth system science is not as precise as nuclear physics or geometry. This lack of precision does not mean that the Anthropocene is pure scientific speculation. It means that science knows enough to sound the alarm, without knowing all the details about the unfolding emergency.

The Anthropocene deserves to become part of our lexicon – a way we understand who we are, what we’re doing and what our responsibilities are as a species – so long as we remember that not all humans are equal contributors to our planetary maladies, with many being victims.

Noel Castree, Professor of Geography, University of Wollongong. This article was originally published on The Conversation. Read the original article.


A coffee shortage is looming — here's how soon it could be extinct


Coffee is more than just the crucial beverage that makes it easier to face the workday. It provides comfort and culture, and it is an essential source of the caffeine that Harvard neuroscientist Charles Czeisler says makes modern life possible.

But the global coffee supply is currently at risk, with shortages already starting to affect the world.

A full half of the world's area that's deemed suitable for growing coffee will be lost by 2050 if climate change remains unchecked, according to a new report from The Climate Institute of Australia.

By 2080, the report estimates that wild coffee (which helps us find genetic varietals that might be more resistant to climate stress) could go extinct.

Coffee shortages that make good coffee harder to get, and that hurt the livelihoods of 25 million coffee farmers around the globe, are already having an effect. And it's not just environmental research groups that are concerned about future access to coffee: advisors for corporate giants like Starbucks and Lavazza agree.

"We have a cloud hovering over our head. It’s dramatically serious," Mario Cerutti, Green Coffee and Corporate Relations Partner at Lavazza, said at a hospitality conference in Italy in 2015.

"Climate change can have a significant adverse effect in the short term," he said. "It's no longer about the future; it's the present."

What's happening to coffee?

People drink more than 2.25 billion cups of coffee each and every day. The coffee industry is a major one, producing the second most valuable export for developing countries. But the better and more commonly grown type of coffee, Coffea Arabica, can only thrive in very specific conditions. For now, that means tropical highlands around the globe, from Central America and Brazil to Indonesia, Vietnam, and East Africa, its place of origin.


But a warming world and extreme weather, including both heavy rains and drought, are making it harder to grow coffee in these regions, according to the report. Temperature and heavy rain have helped a fungus called Coffee Leaf Rust spread through Central America and into South America, destroying crops. Pests like the Coffee Berry Borer are spreading for the same reasons. Drought in Brazil cut coffee production by around 30% in 2014 in Minas Gerais, a major coffee region.

Even half a degree of temperature change can make a region that used to be a coffee gold mine unsuitable. Moving production to higher altitudes is not always feasible and can be especially difficult for the small farmers who make up 80-90% of coffee growers.

By 2050, half of currently suitable land will no longer be suitable unless the world can limit warming to the 1.5 to 2 degrees Celsius set as a goal in the 2015 Paris climate agreement. And really, even 1.5 degrees is pushing it for most farmers.

It's not a completely hopeless scenario — cutting emissions and limiting warming to 1.5 degrees would make a big difference, both for individual coffee lovers and for the 120 million people who make a living from the coffee supply chain. Buying coffee from groups that provide fair incomes to farmers can help those communities adapt.

But this is a serious situation and one worth paying attention to now, before problems get worse down the line.

As Starbucks sustainability director Jim Hanna told The Guardian in 2011 — five years ago — it's urgent.

"If we sit by and wait until the impacts of climate change are so severe that is impacting our supply chain then that puts us at a greater risk," he said. "From a business perspective we really need to address this now, and to look five, 10, and 20 years down the road."

SEE ALSO: I went to the source of the world's best coffee — and saw firsthand why the industry is in trouble

DON'T MISS: These are the worst stings in the world, according to a guy who's experienced them all

Join the conversation about this story »

NOW WATCH: We tried the mushroom coffee that claims to increase productivity without giving you the jitters

There's a big, stupid reason the US will probably break its climate change promise


Smog over NYC

The United States will cut emissions 80% by 2050. We promise. We really really mean it this time.

Sure, we've known for decades that the gases we pump into the atmosphere are heating the planet. And yeah, it's too late now to prevent a major shock to our climate. But it's not just us spewing greenhouse gases into the atmosphere. And we're going to change. Really.

As part of the 2015 Paris Agreement to combat climate change, the Obama administration committed the country to the goal of an 80% reduction by 2050. Hillary Clinton, the person who seems far and away most likely to become president in 2017, has made it part of her campaign platform.

Here's the thing though: There's a good chance America can't pull it off.

That's not because it's technologically or economically impossible. It's because our political system has become too fractured and disorganized to address climate change in the measured, multi-decadal way necessary to get the job done. On its face, that's a pretty stupid reason to go back on a major international commitment. But experts across the political spectrum agree that's likely what will happen.

A study by the Columbia University economist Geoffrey Heal, which I explore more deeply here, found the total cost of the project would likely fall somewhere in between the war in Afghanistan and World War II. And we have 34 years to do it, so the annual cost would be just a small fraction of gross domestic product.
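Heal's "small fraction of GDP" point can be sanity-checked with rough numbers. The inputs below are illustrative placeholders only; the article quotes no dollar figures, so none of these values are Heal's.

```python
# Illustrative arithmetic only: the article quotes no dollar figures,
# so the inputs below are hypothetical placeholders, not Heal's numbers.
total_cost_trillion = 2.0   # assumed total, somewhere on a war-scale budget
years = 34                  # 2016 through 2050
gdp_trillion = 18.0         # rough 2016 US GDP

# Annual cost as a share of one year's GDP.
annual_share = total_cost_trillion / years / gdp_trillion
print(f"annual cost: {annual_share:.2%} of GDP")
```

Even doubling the assumed total keeps the annual burden well under one percent of GDP, which is the sense in which the project is economically, if not politically, feasible.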

But Heal's study relies on the assumption that we live in a functioning society that can competently organize itself to address complex challenges over long time scales.

"We'd need a sort of a carefully thought out policy that was laid down," Heal told Business Insider, "establishing some clear expectation of continuity in the policy field over quite a long period of time in order to mobilize the kind of money that would be needed here."


The reality is, our politics is more polarized than ever, which makes that kind of mobilization and long-term planning difficult on any front. And climate science has become a political football.

In its party platform, the Republican party writes: "Climate change is far from this nation’s most pressing national security issue. This is the triumph of extremism over common sense, and Congress must stop it." The Democrats' party platform, which addresses climate change at length, comes down on the opposite side: "Climate change is an urgent threat and a defining challenge of our time."

The consequence of that discord is that our existing climate policy already doesn't work that well.

"Some incentives [for developing green energy], like the production tax credit for renewable energy, have been on and off and on and off. It is very disorienting to people who think about investing in this area," Heal said.

Myron Ebell, an energy and environmental policy analyst — and avowed climate-science skeptic — with the libertarian Competitive Enterprise Institute think tank, told Business Insider that a comprehensive program to drastically reduce US emissions would meet major political opposition.

"I think Obama has gone about as far as you can go in terms of twisting the current regulatory structure to try to do things," Ebell said. "At some point Congress would have to vote for this kind of program, and I think it's a long way in the future if at all. I have my doubts that it will ever happen, but right now you can say it's several Congresses away."

Ebell said that political "friction" — from opponents ranging from national lawmakers to local landowners objecting to power lines and windmills — will most likely add costs at every stage of a major national-energy overhaul.

Heal agrees that politics and planning could be a major hurdle.

"I don't think anyone has thought through in any detail what it would take to mobilize the amount of money that we're talking about here," he said.

But while he said the plan is highly unlikely to succeed, there's some solace to take in the country moving in this direction.

"I'd say I'm moderately optimistic in the sense that [the costs of clean technology] will certainly come down over time," he said. "Whether we'll get an 80% reduction by 2050 I'm not sure. But I think we can certainly get a 50% reduction by 2050 without massive amounts of expenditure or stress."

"And if costs come down and if technology is developed in the right way, we could see 60% percent or a 70%. I think 80% is tough but it's doable. And I think we're likely to get quite a lot of the way there."

SEE ALSO: An economist figured out how much Hillary Clinton's plan to save the world from runaway climate change would actually cost

DON'T MISS: This is what it looks like when society collapses


NOW WATCH: Research reveals why men cheat, but it's not what you think


I went to the source of the world's best coffee — and saw firsthand why the industry is in trouble


felix and life monteverde packages

Mmm, coffee.

Not only is the tangy brew one of the most widely consumed beverages in the world, its active ingredient — caffeine — is the most popular psychoactive drug on the planet.

But coffee is in trouble. The crop is highly vulnerable to climate change; according to a new report, it could be extinct by 2080.

I recently visited a coffee farm in Costa Rica, one of the world's most desirable coffee-harvesting countries, to see why the delicious crop is on the brink of disappearing:

UP NEXT: What caffeine does to your body and brain

SEE ALSO: A coffee shortage is looming — and scientists have figured out how soon it could be extinct

Our drive to a coffee farm called Cafe Monteverde took us up a mountain on a dirt road for about an hour and a half. On our way, we got some breathtaking views of the area's rugged, hilly terrain and gorgeous forest cover.



The region of Monteverde, where a lot of Costa Rica's coffee is grown, is a misty, cloud-enshrouded area about three hours from San Jose, the capital. The humid, shady climate is ideal for growing coffee plants, but the drive to reach it can be a challenge if you're not familiar with the roads.



My partner (right) and I were introduced to the farm by Felix Salazar (left), a nature photographer born and raised in Monteverde who also works on the farm and gives tours in his free time. Felix walked us through the rolling green fields where the coffee for Cafe Monteverde is grown.



See the rest of the story at Business Insider

A tropical storm might be headed for the East Coast


td9 wind forecast noaa

Forecasters expect a tropical cyclone that's brewing in the Gulf of Mexico to soon gain enough strength to be named as a tropical storm.

For now, weather experts are calling it "tropical depression nine," or TD9, which means its strongest one-minute sustained winds are no more than 38 mph.
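The labels forecasters use track the National Hurricane Center's wind-speed thresholds: a tropical depression tops out at 38 mph of sustained wind, a tropical storm runs 39-73 mph, and a hurricane starts at 74 mph. A minimal sketch of that classification (the function name is ours, not NOAA's):

```python
def classify_tropical_cyclone(sustained_wind_mph: float) -> str:
    """Classify a tropical cyclone by its maximum 1-minute sustained winds,
    using the National Hurricane Center's wind-speed thresholds."""
    if sustained_wind_mph <= 38:
        return "tropical depression"
    if sustained_wind_mph <= 73:
        return "tropical storm"
    return "hurricane"

print(classify_tropical_cyclone(35))  # TD9's range: "tropical depression"
print(classify_tropical_cyclone(50))  # "tropical storm"
```

By this scheme, TD9 only needs its sustained winds to reach 39 mph to earn a name and tropical-storm status.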

TD9 first appeared on Sunday, Aug. 28, near Havana, Cuba, and has been moving west into the Gulf's warm waters since.

Most computer models predict TD9 will gain steam before curving northeast and cutting across central to northern Florida.

Dr. Rick Knabb, the director of NOAA's National Hurricane Center, gave it up to a 50% chance of becoming a tropical storm before it makes landfall just north of Tampa:

But a number of computer models also predict the storm will continue on north — then stall over the eastern seaboard of the US:

Three of the models even predict the storm will blow out slightly into the Atlantic, then hook back westward to impact the New England area sometime over Labor Day weekend:

Still, as many meteorologists are reminding their followers, it's early days for TD9.

Until a "hurricane hunter" mission returns from its flight to gather fresh data and gauge the depression's strength, all bets are off.

Stay tuned to Business Insider this week as we keep tabs on TD9 and other powerful weather systems.

SEE ALSO: Meet a 'hurricane hunter' who flies straight into terrifying storms

DON'T MISS: Two potential hurricanes are gaining strength over the Atlantic


NOW WATCH: These futuristic beach homes were inspired by the devastating power of hurricanes

California is expanding the program that allows oil and gas companies to inject waste into aquifers


Kern River Oil Field

As the western United States struggles with chronic water shortages and a changing climate, scientists are warning that if vast underground stores of fresh water that California and other states rely on are not carefully conserved, they too may soon run dry.

Heeding this warning, California passed new laws in late 2014 that for the first time require the state to account for its groundwater resources and measure how much water is being used.

Yet California's natural resources agency, with the oversight and consent of the federal government, also runs a shadow program that allows many of its aquifers to be pumped full of toxic waste.

Now the state — which relied on aquifers for at least 60 percent of its total water supply over the past three years — is taking steps to expand that program, possibly sacrificing portions of dozens more groundwater reserves. In some cases, regulators are considering whether to legalize pollution already taking place at a number of sites, based on arguments that the water that will be lost was too dirty to drink or too difficult to access at an affordable price. Officials also may allow the borders of some pollution areas to be extended, jeopardizing new, previously unspoiled parts of the state's water supply.

The proposed expansion would affect some of the parts of California hardest hit by drought, from the state's agriculturally rich central valley to wine country and oil-drilling fields along the Salinas River. Some have questioned the wisdom of such moves in light of the state's long-term thirst for more water supplies.

"Once [the state] exempts the water, it's basically polluted forever. It's a terrible idea," said Maya Golden-Krasner, staff attorney for the Center for Biological Diversity, which is suing California to force it to complete an environmental impact assessment of the proposed aquifer changes. California, she said, is still offering breaks to its oil industry. "We're at a precipice point where the state is going to have to prioritize water over an industry that isn't going to last."

California is one of at least 23 states where so-called aquifer exemptions — exceptions to federal environmental law that allow mining or oil and gas companies to dump waste directly into drinking water reserves — have been issued.

Exemptions are granted by a U.S. Environmental Protection Agency division that has had difficulties in recordkeeping and has been criticized for its controversial management of groundwater reserves. A 2012 ProPublica investigation disclosed that the federal government had given energy and mining companies permission to pollute U.S. aquifers in more than 1,000 locations, as part of an underground disposal program that allows toxic substances to be disposed of in nearly 700,000 waste wells across the country.

In many cases, the exact locations of the exemptions and the precise boundaries of areas where aquifer pollution was allowed had been left poorly defined, raising concerns that waste might reach adjacent drinking water. Several states, including California, have since admitted they've allowed that to happen.

As droughts have worsened and aquifers have become more cherished, the implications of aquifer exemptions have become more serious, even as regulators have continued to issue these legal loopholes.

California drought lake water

The federal Safe Drinking Water Act distinguishes between underground aquifers that are too salty or dirty to ever be used and those that are pure enough to drink from, defining the latter as an "underground source of drinking water." Protection of drinking water is required under the law, and any polluting of it through waste disposal, oil and gas production, or mining is a crime. Companies, however, can file petitions to change how an aquifer is classified, arguing that it either has already been polluted or is too deep underground to likely be used. Even if water is relatively clean, if the EPA approves a change in definition, an aquifer is no longer considered a "source of drinking water," and is no longer protected.

Applications to exempt an aquifer are supposed to undergo extensive scientific scrutiny, and today they usually do. But when the Safe Drinking Water Act was initially implemented, the federal government traded away much of that scrutiny as a compromise to win state and industry support for the new regulations. The EPA granted blanket exemptions for large swaths of territory underlying California and Texas oil fields, for example, and did the same in other states with large energy and mining industries. Documents from California, dating to 1981, estimate that at least 100 aquifers in the state's central valley were granted exemptions.

It's not always clear where the aquifers polluted under these early exemptions are located. For decades, both state officials and the federal government have struggled just to identify the precise places where the permits they issued applied, and where pollutants were being injected into groundwater. A spreadsheet listing thousands of exempted aquifer locations nationwide, provided to ProPublica in 2012 by the EPA in response to a Freedom of Information request, listed incomplete location coordinates for a majority of the exemptions, describing them merely by the county or township in which they are located. When pressed for more information, an EPA official admitted that was all the information the agency had.

California's exemption records are only slightly more precise, and no less problematic.

Most of them appear to be best described in the appendices of a tattered 1981 document, yellowed with age. (State officials suggested to ProPublica this week that other records exist but could not produce them.) Overlying sections of a simple map of the state's vast central valley, hand-drawn boundaries are sketched over areas equivalent to thousands of acres and shaded in. There are only vague descriptions like depth and name of the geologic formation, but nothing as precise as latitude and longitude coordinates, for the borders of the shaded areas. "Unfortunately, what we do not have is an easy-to-use, enumerated list," Don Drysdale, a spokesman for the California Department of Conservation, wrote to ProPublica in an email this week. The state has never endeavored to measure the total volume of water it has allowed to be spoiled.

The waste being injected into exempted aquifers is often described as merely "salt water." Indeed, only "non-hazardous" substances are supposed to be pumped into aquifers, even with exemptions. But under concessions won by the oil industry and inserted into federal law, oilfield production waste — including chemicals known to cause cancer and fracking materials — is not legally considered "hazardous," a term with a specific definition in federal environmental law. According to the California Department of Conservation, which regulates the state's oil and gas industry, "drilling mud filtrate, naturally occurring radioactive materials (NORM), slurrified crude-oil, saturated soils, and tank bottoms" are all allowed to be injected into aquifers as "non-hazardous" material.

Despite the substantial wiggle room granted by law, California has come under fire for not managing its roughly 52,000 waste wells properly. In 2011, the EPA sharply criticized the state for keeping poor records, mismanaging its environmental reviews, and failing to follow federal law. It suggested that the state's autonomy over its groundwater regulations could be revoked, and that the EPA would impose federal oversight.

To fend off that change, California launched its own review and, in 2014, began to uncover extraordinary lapses: Thanks to poor recordkeeping and confusion over which aquifers had been written off, the state found more than 2,000 wells were injecting toxins not into exempt areas, but directly into the state's drinking water aquifers. In 140 cases wastewater was being put into the highest quality aquifers, raising concerns in the state capitol about the threat to public health. California shut down some 56 waste wells last year until it could sort out the mess, and it passed improved regulations that will give the state's water agency a role in the approval process. Still, it has allowed injection to continue until the end of this year in 11 drinking water aquifers that it has to reevaluate because neither the feds nor state officials are sure whether they exempted them in the 1980s. The state is also allowing injection to continue until next February in other drinking water quality aquifers pending the approval of new aquifer exemptions that would extend that indefinitely.

Those 11 aquifers have been the focus of much of the state's renewed attention, but California still hasn't confirmed the borders of the hundreds of legacy exemptions in other aquifers that date back to the 1980s. Without taking this step, the state's top water official said, there's no way to know how much clean water California still has.

"That's part of the whole point," Felicia Marcus, chair of the California State Water Resources Control Board told ProPublica, "not injecting into aquifers that people are depending on now, but also to go back and make sure we were not too loose on it in the past. Certainly the discovery of all these mistakes puts us on red alert."

Now California — with Marcus' blessing — may fix the problem by expanding the boundaries of exempted areas rather than identifying and restricting them.

The Department of Conservation is poised to consider as many as 70 new aquifer exemptions, redrawing some to include areas where companies have been injecting waste illegally into drinking water. In the state's central valley, where a substantial portion of the nation's fruits and nuts are grown using groundwater, three applications for aquifer exemptions around the Fruitvale, Round Mountain and Tejon oil fields — all in or near Bakersfield — are already undergoing state reviews that would precede approval by the EPA.

And in February the state submitted final plans to the EPA to exempt a new portion of the Arroyo Grande Aquifer in Paso Robles, allowing oil companies to inject waste or fluids to help in pumping out more oil. In that case, Marcus and the state's Water Resources Control Board — the agency in charge of the quality of the state's water supply — say they agreed to allow the exemption because the aquifer was already of poor quality and would not be used in the future. Marcus said she was convinced the contaminants injected there could not migrate underground in ways that would affect other, cleaner water sources nearby — that they would be sealed in by the geologic structure of the region.

Still, the areas California is writing off are surrounded by underground water reserves that get used every day. An exemption might cover the water soaked up in one particular layer of rock, at a certain depth, even while wells extract water from aquifers above or below it. And, according to Golden-Krasner, the state's assessment that pollution will remain confined is often dependent on an oil company maintaining a specific pressure underground, making the future of the clean water vulnerable to human error.

In our 2012 investigation, ProPublica found numerous cases in which waste defied the containment that regulators and their computer models had promised, and contamination spread. In many instances, injection wells themselves punched holes in the earth's seal and leaked. In others, faults and fissures in the earth moved in ways that allowed trapped fluids to migrate. Several of the problems documented had occurred in California.

The area around Bakersfield affected by the majority of the new aquifer pollution applications is also home to one of the state's largest underground water storage facilities, the Kern Water Bank, relied on by California farmers. It lies directly above at least one of the exempted aquifers and is pierced by dozens of oil wells. The state's water board supports the exemptions, but their close proximity to drinking water could be reason to worry, acknowledges Jonathan Bishop, the chief deputy director of the Water Resources Control Board.

"Are we concerned that wells going through aquifers that have beneficial use be maintained and have high integrity? Yeah," Bishop said. "They do go through drinking water aquifers in many locations, not just in Bakersfield."

Opponents of the exemption program are infuriated by the fact that applications are evaluated on an isolated basis, without any consideration of the state's larger water supply issues. The original criteria for aquifer exemptions set out in federal statute never contemplated that in California and plenty of other states, multiple exemptions could be granted in close proximity, or that polluted areas could be sandwiched between clean water reserves. Neither state nor federal codes call for any broader analysis of the cumulative risk.

"Their whole review is from the perspective of can we check the boxes on federal criteria and the state law," said John Noel, who covers oil and gas issues for the environmental group Clean Water Action. "Nobody is asking the question, if we exempt these five aquifers what is the long term supply impact? How much water are we writing off?"

ProPublica is a Pulitzer Prize-winning investigative newsroom. Sign up for their newsletter.

SEE ALSO: The majority of California counties have had drinking water violations


NOW WATCH: A Harvard psychologist's advice on how to argue when you know you're right

The Pope is urging Christians to save the polluted planet from 'debris, desolation, and filth'


Pope Francis arrives to lead the weekly audience at the Vatican August 31, 2016. REUTERS/Stefano Rellandini

ROME (Reuters) - Pope Francis called on Thursday for concerted action against environmental degradation and climate change, renewing his fierce attack on consumerism and financial greed, which, he said, were threatening the planet.

A year after publishing the first ever papal document dedicated to the environment, the pope returned to the subject calling on Christians to make the defense of nature a core part of their faith.

"God gave us a bountiful garden, but we have turned it into a polluted wasteland of debris, desolation and filth," Francis said in a document released to coincide with the World Day of Prayer for the Care of Creation.

Born in Argentina, Francis is the first pope from a developing nation and has placed environmental causes at the heart of his papacy, denouncing what he sees as a "throwaway" consumer culture and rampant, market-driven economies.

"Economics and politics, society and culture cannot be dominated by thinking only of the short-term and immediate financial or electoral gains," Francis said, suggesting more ambitious action might be needed to curb climate change.

World leaders agreed at a United Nations summit in Paris last December to commit to new policies to limit greenhouse-gas emissions in an effort to stabilize rising temperatures.

Francis welcomed the accord, but said world temperatures looked set to reach new records this year, and urged voters everywhere to make sure their governments did not backtrack.

"It is up to citizens to insist that this happen, and indeed to advocate for even more ambitious goals," he said.

He urged the world's one billion Roman Catholics to embrace ecology, saying defense of the environment should be added to the so-called acts of mercy, which provide believers with guiding principles and duties that they are meant to follow.

These centuries-old practices include taking care of the hungry and sick, burying the dead and teaching the ignorant.

"May the works of mercy also include care for our common home," Francis said, adding that simple, daily gestures which broke with "the logic of violence, exploitation and selfishness" would make a difference.

Even recycling rubbish, switching off lights and using a car-pool or public transport would help, he said. "We must not think that these efforts are too small to improve our world."

(Reporting by Crispian Balmer; Editing by Jon Boyle)

SEE ALSO: Pope tells teens: Happiness 'is not an app'


There’s another worrying source of pollution in the oceans


ocean water

Ask most people about pollution, and they will think of rubbish, plastic, oil, smog, and chemicals. After some thought, most folks might also suggest noise pollution.

We’re all familiar with noise around us, and we know it can become a problem – especially if you live near an airport, train station, highway, construction site, or DIY-enthusiast neighbor.

But most people don’t think that noise is a problem under water.

If you’ve read Jules Verne’s Twenty Thousand Leagues Under the Sea you might imagine that, maelstroms excepted, life is pretty quiet in the ocean. Far from it.

When we put a hydrophone (essentially a waterproof microphone) into the water, no matter where in the world's oceans, it's never quiet. We hear wind blowing overhead and rain dropping onto the ocean surface – even from hundreds of metres deep. In Australian waters we can also detect the far-off rumbles of earthquakes and the creaking of Antarctic ice thousands of kilometres away.

Wet and noisy

Water is much denser than air, but it is also far less compressible. Because the speed of sound depends on a medium's stiffness relative to its density, sound (which relies on molecules vibrating and pushing against one another) propagates much further and faster under water than in air: roughly 1,500 metres per second, compared with about 340 in air.
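The underlying relation is that the speed of sound in a fluid is c = √(K/ρ), where K is the bulk modulus (stiffness) and ρ the density. A quick sketch with standard textbook values (these numbers are our assumptions, not figures from the article):

```python
import math

def sound_speed(bulk_modulus_pa: float, density_kg_m3: float) -> float:
    """Speed of sound in a fluid: c = sqrt(K / rho)."""
    return math.sqrt(bulk_modulus_pa / density_kg_m3)

# Typical textbook values (assumed, not from the article).
c_air = sound_speed(1.42e5, 1.225)   # air at sea level: ~340 m/s
c_sea = sound_speed(2.34e9, 1025.0)  # seawater: ~1,500 m/s

print(f"air: {c_air:.0f} m/s, seawater: {c_sea:.0f} m/s")
```

Seawater is about 800 times denser than air, which by itself would slow sound down; it is the vastly larger bulk modulus that wins out and makes underwater sound roughly four times faster.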

This also applies to human-produced sound. Under water we can hear boats and ships and even airplanes. Large vessels in deep water can be detected tens of kilometres away. We can be far offshore doing fieldwork, the only people around, with nothing in sight but water in any direction. Yet when we switch the engines off and put a hydrophone into the water, we hear ship noise. Sometimes, whole minutes later, the vessel we heard might appear on the horizon.

Seafarers have known about another source of sound for thousands of years: marine life. Many animals produce sound, from the tiniest shrimp to the biggest whales. Many fish even communicate acoustically under water – during the mating season, the boys start calling. Whales do it, too.

fish underwater

Light doesn’t reach far under water. Near the surface, in clear water, you might be able to peer a few metres, but in the inky depths you can’t see at all. So many marine animals have evolved to “see with sound”, using acoustics for navigation, for detecting predators and prey, and for communicating with other members of their species.

The thing is that man-made sound can interfere with these behaviors.

The effects of noise on marine animals are similar to those on us. If you’ve ever been left with ringing ears after a rock concert, you’ll know that loud noise can temporarily affect your hearing or even damage it permanently.

Noise interferes with communication, often masking it. Can you talk above the background noise in a busy pub? Long-term exposure to noise can cause stress and health issues — in humans and animals alike.

Excessive noise can change marine creatures' habits, too. Like a person who decides to move house rather than live next door to a new airport, animals might choose to desert their habitat if things get too noisy. The question is whether they can find an equally acceptable habitat elsewhere.


There is a lot more research still to be done in this field. Can we predict what noises and vibrations might be released into the marine environment by new machinery or ships? How does sound propagate through different ocean environments? What are the long-term effects on marine animal populations?

One positive is that even though noise pollution travels very fast and very far through the ocean, the moment you switch off the source, the noise is gone. This is very much unlike plastic or chemical pollution, and gives us hope that noise pollution can be successfully managed.

We all need energy, some of which comes from oil and gas; most of our consumer goods are shipped across the seas on container vessels; and many of us enjoy eating seafood caught by noisy fishing boats, some of which even use dynamite to catch fish. We want to protect our borders, making naval operations a necessity. Then there's the ever-growing industry of marine tourism, much of it aboard ever-bigger cruise ships which need large ports in which to berth.

There are a lot of stakeholders in the marine environment, and all speak a different language, all make different claims, and all make noise. Knowing precisely how much noise they make, and how it affects marine life, will help to ensure our oceans and their resources last well into the future.

September 3-11 is SeaWeek 2016, the Australian Association for Environmental Education Marine Educators’ national public awareness campaign.

Christine Erbe, Director, Centre for Marine Science & Technology, Curtin University. This article was originally published on The Conversation. Read the original article.

SEE ALSO: How ‘weather bombs’ could help reveal Earth’s innermost secrets

DON'T MISS: Scientists have officially declared that we are living in a new geological era


NOW WATCH: An exercise scientists explains the key to getting stronger that many overlook

The future could be bleak for our national parks

$
0
0


Trees are dying across Yosemite and Yellowstone national parks. Glaciers are melting in Glacier Bay National Park and Preserve in Alaska.

Corals are bleaching in Virgin Islands National Park. Published field research conducted in U.S. national parks has detected these changes and shown that human climate change – carbon pollution from our power plants, cars and other human activities – is the cause.

As principal climate change scientist of the U.S. National Park Service, I conduct research on how climate change has already altered the national parks and could further change them in the future.

I also analyze how ecosystems in the national parks can naturally reduce climate change by storing carbon. I then help national park staff to use the scientific results to adjust management actions for potential future conditions.

Research in U.S. national parks contributes in important ways to global scientific understanding of climate change. National parks are unique places where it is easier to tell if human climate change is the main cause of changes that we observe in the field, because many parks have been protected from urbanization, timber harvesting, grazing and other nonclimate factors. The results of this research highlight how urgently we need to reduce carbon pollution to protect the future of the national parks.

Melting glaciers, dying trees

Human-caused climate change has altered landscapes, water, plants and animals in our national parks. Research in the parks has used two scientific procedures to show that this is occurring: detection and attribution. Detection is the finding of statistically significant changes over time. Attribution is the analysis of the different causes of the changes.

Around the world and in U.S. national parks, snow and ice are melting. Glaciers in numerous national parks have contributed to the global database of 168,000 glaciers that the Intergovernmental Panel on Climate Change (IPCC) has used to show that human climate change is melting glaciers. Field measurements and repeat photography show that Muir Glacier in Glacier Bay National Park and Preserve in Alaska lost 640 meters to melting from 1948 to 2000.


In Glacier National Park in Montana, Agassiz Glacier receded 1.5 kilometers from 1926 to 1979. Snow measurements and tree cores from Glacier National Park, North Cascades National Park, and other national parks contributed to an analysis showing that snowpack across the western U.S. has dropped to its lowest level in eight centuries.

Climate change is raising sea levels and heating ocean waters.

Golden Gate National Recreation Area in California hosts the tidal gauge with the longest time series on the U.S. West Coast.

That gauge has contributed to the global database that the IPCC used to show that human climate change has raised sea level 17 to 21 centimeters in the 20th century.

Measurements of sea surface temperatures by ocean buoys in Buck Island Reef National Monument, Channel Islands National Park, and Virgin Islands Coral Reef National Monument have contributed to a global database that IPCC has used to show that human climate change is heating surface waters at a rate of 1.1 ± 0.2 degrees Celsius per century.

On land, climate change is shifting the ranges where plants grow. A global analysis that colleagues and I published in 2010 found that, around the world, climate change has shifted biomes – major types of vegetation, such as forests and tundra – upslope or toward the poles or the Equator. This type of research requires long-term monitoring of permanent plots or reconstruction of past vegetation species distributions using historical information or analyses of tree rings or other markers of the past. In the African Sahel, I uncovered a biome shift by hiking 1,900 kilometers, counting thousands of trees in the field, reconstructing past tree species distributions through verified interviews with village elders and counting thousands more trees on historical aerial photos.

Research has documented biome shifts in U.S. national parks. In Yosemite National Park, subalpine forest shifted upslope into subalpine meadows in the 20th century. In Noatak National Preserve, Alaska, boreal conifer forest shifted northward into tundra in the 19th and 20th centuries.

Wildlife is also shifting. In Yosemite National Park, scientists compared the species of small mammals they captured in 2006 to the species originally captured along an elevation transect from 1914 to 1920 and showed that climate change shifted the ranges of the American pika and other species 500 meters upslope. Across the United States, the Audubon Society organizes its annual Christmas Bird Count in numerous national parks and other sites. Analyses of bird species results from 1975 to 2004 and possible local causes of changing distributions found that climate change shifted the winter ranges of a set of 254 bird species northward. Examples include northward shifts of the evening grosbeak (Coccothraustes vespertinus) in Shenandoah National Park and the canyon wren (Catherpes mexicanus) in Santa Monica Mountains National Recreation Area.


Climate change is driving wildfires in and around many national parks in western states. Fire is natural and we need it to periodically renew forests, but too much wildfire can damage ecosystems and burn into towns and cities. Field data from 1916 to 2003 on wildfire in national parks and across the western U.S. show that, even during periods when land managers actively suppressed wildfires, fluctuations in the area that burned each year correlated with changes in temperature and aridity due to climate change. Reconstruction of fires of the past 2,000 years in Sequoia and Yosemite national parks confirms that temperature and drought are the dominant factors explaining fire occurrence.

Climate change is killing trees due to increased drought, changes in wildfire patterns and increased bark beetle infestations. Tracking of trees in Kings Canyon, Lassen Volcanic, Mount Rainier, Rocky Mountain, Sequoia and Yosemite National Parks has contributed to a database that revealed how climate change has doubled tree mortality since 1955 across the western United States.

High ocean temperatures due to climate change have bleached and killed coral. In 2005, hot sea surface temperatures killed up to 80 percent of coral reef area at sites in Biscayne National Park, Buck Island Reef National Monument, Salt River Bay National Historical Park and Ecological Preserve, Virgin Islands National Park and Virgin Islands Coral Reef National Monument.

Managing national parks in a changing climate

When the U.S. Congress established the National Park Service a century ago, it directed the agency to conserve the natural and cultural resources of the parks in ways to leave them “unimpaired for the enjoyment of future generations.” By altering the globally unique landscapes, waters, plants and animals of the national parks, climate change challenges the National Park Service to manage the parks for potential future conditions rather than as little pictures of a past to which we can no longer return.

For example, Yosemite National Park resource managers plan to use climate change data to target prescribed burns and wildland fires in areas that will be different from the areas selected using estimates of fire distributions from the 1850s. At Golden Gate National Recreation Area, resource managers have examined stewardship plans resource-by-resource to develop actions that account for climate change. At Everglades National Park, managers are using sea level rise data to help plan management of coastal areas.


Continued climate change is not inevitable. It is in our power to reduce carbon pollution from cars, power plants and deforestation and prevent the most drastic consequences of climate change. In the face of climate change, we can help protect our most treasured places – the national parks.

Patrick Gonzalez, Principal Climate Change Scientist, National Park Service. This article was originally published on The Conversation. Read the original article.


Hurricane Hermine intensifies Zika fears as it pounds Florida


Robert Long and his son J.D., 4, watch workers removing downed trees during cleanup operations in the aftermath of Hurricane Hermine in Tallahassee, Florida, U.S. September 2, 2016.

Hurricane Hermine wreaked havoc across Florida on Friday, knocking out power to nearly 300,000 homes and businesses, flooding low-lying areas and raising concerns about the spread of the Zika virus from pools of standing water left behind.

The first hurricane to make landfall in Florida in 11 years, Hermine came ashore early on Friday near the Gulf shore town of St. Marks, 20 miles (30 km) south of the capital of Tallahassee, packing winds of 80 mph (130 kph) and churning up a devastating storm surge in coastal areas.

Torrential downpours and high surf left parts of some communities under water early Friday, with mandatory evacuations ordered in parts of five northwestern Florida counties.

One storm-related death was reported by authorities in the northern Florida town of Ocala, where a fallen tree killed a homeless man sleeping in his tent.

Hermine, later downgraded to a tropical storm, was expected to snarl Labor Day holiday travel as it churned northeast after battering Florida's $89 billion tourism industry.

Downed trees and power lines block the road after Hurricane Hermine blows through Tallahassee, Florida September 2, 2016.

As of 5 p.m. EDT (2100 GMT), the fourth named storm of the 2016 Atlantic hurricane season was passing near Charleston, South Carolina, with strong winds and heavy rains, the National Hurricane Center (NHC) said. The governors of Georgia, North Carolina, Maryland and Virginia declared emergencies for all or parts of their states, and a state of emergency remained in effect for most of Florida.

Though maximum sustained winds had weakened to 50 mph (80 kph), the tempest headed to the Atlantic seaboard along a path where tens of millions of Americans live, prompting storm watches and warnings stretching as far north as Rhode Island, NHC said.

Likely to regain strength over Atlantic 

The storm was projected to creep north along the Carolina coast Friday night, then gather strength after moving offshore into the Atlantic on Saturday morning, possibly reaching near-hurricane intensity by late Sunday, according to the center.


In addition to powerful winds extending up to 185 miles (295 km) from its center, Hermine was expected to unleash a dangerous storm surge in the Hampton Roads area of tidewater Virginia, where flooding could reach 3 to 5 feet deep, the NHC said.

The storm also could douse several southeastern and mid-Atlantic states with up to 15 inches (38 cm) of rain through Sunday.

New Jersey, still mindful of devastation from superstorm Sandy in 2012, was on high alert as emergency officials advised residents to prepare for flooding, high winds and a surge of seawater.

New York Governor Andrew Cuomo on Friday activated his state's emergency operations center and ordered officials to stockpile resources, including sandbags and generators.

Utility crews cut tree limbs off power lines as an ambulance drives by in the rain and wind from Hurricane Hermine in Tallahassee, Florida, U.S. September 2, 2016.

New York City Mayor Bill de Blasio said residents should avoid beach waters for fear of life-threatening riptides. "I say that to people who go to the beach, I say that to surfers: Don't even think about it," de Blasio told reporters.

In Florida, concerns over the standing water in which mosquitoes breed intensified as the state battles an outbreak of the Zika virus.

"It is incredibly important that everyone does their part to combat the Zika virus by dumping standing water, no matter how small," Florida Governor Rick Scott told a news conference, also warning people to watch out for downed power lines and avoid driving through standing water.

The remains of a snapped telephone pole and its transformer block a road in the rain and wind from Hurricane Hermine in Tallahassee, Florida, U.S. September 2, 2016.

Overnight, Pasco County crews rescued more than a dozen people after their homes were flooded.

Richard Jewett, 68, was rescued from his home in New Port Richey, just north of Tampa, as emergency teams carried out a mandatory evacuation.

"The canal started creeping up toward the house, and even though it wasn't high tide it looked like it was coming inside," Jewett said.

In the island community of Cedar Key, waters rose more than 9.5 feet (2.9 meters), among the highest surges ever seen, the National Weather Service said.



15 of the deadliest, most destructive hurricanes in US history



Tropical Storm Hermine hit Florida on Friday, dumping rain on the state along with Georgia and the Carolinas.

Hurricane season, which lasts from June to the end of November for the Atlantic Ocean, has seen plenty of catastrophic storms throughout history.

Here's a look at some of the deadliest, most horrific storms that have hit the US over the past century.


Hurricane Hugo, 1989: 21 deaths

Hurricane Hugo made landfall as a Category 4 storm in South Carolina. It caused 21 deaths in the US and resulted in $7.1 billion of damage. At the time, it was the costliest storm in US history.



Tropical Storm Allison, 2001: 41 deaths

While not an official hurricane, Allison clocks in as the costliest and deadliest tropical storm in US history, causing 41 deaths and costing more than $5 billion in damage. The storm started over the Gulf of Mexico near Texas, then traveled east, causing floods like the one pictured here in Houston, Texas.



Hurricane Irene, 2011: 45 deaths

Hurricane Irene made landfall in the US in North Carolina as a Category 1 storm. The storm eventually made its way up to New York City, bringing flooding — like the kind pictured here from Irene's catastrophic visit to Puerto Rico — and causing $7.3 billion in damage overall. 




Alzheimer's-linked nanoparticles, found in pollution, are showing up in people's brains


A pedestrian walks past a beam of sunlight cast through two buildings amid heavy smog in Shenyang, Liaoning province December 26, 2014.  REUTERS/Stringer

Toxic nanoparticles from air pollution have been discovered in human brains in abundant quantities, a newly published study reveals.

The detection of the particles, in brain tissue from 37 people, raises concerns because recent research has suggested links between these magnetite particles and Alzheimer's disease, while air pollution has been shown to significantly increase the risk of the disease. However, the new work is still a long way from proving that the air pollution particles cause or exacerbate Alzheimer's.

"This is a discovery finding, and now what should start is a whole new examination of this as a potentially very important environmental risk factor for Alzheimer's disease," said Prof Barbara Maher, at Lancaster University, who led the new research. "Now there is a reason to go on and do the epidemiology and the toxicity testing, because these particles are so prolific and people are exposed to them."

Air pollution is a global health crisis that kills more people than malaria and HIV/Aids combined, and it has long been linked to lung and heart disease and strokes. But research is uncovering new impacts on health, including degenerative brain diseases such as Alzheimer's, mental illness and reduced intelligence.

The new work, published in the Proceedings of the National Academy of Sciences, examined brain tissue from 37 people in Manchester, in the UK, and Mexico, aged between three and 92.

It found abundant particles of magnetite, an iron oxide. "You are talking about millions of magnetite particles per gram of freeze-dried brain tissue – it is extraordinary," said Maher.

"Magnetite in the brain is not something you want to have because it is particularly toxic there," she said, explaining that the substance can create reactive oxygen species called free radicals. "Oxidative cell damage is one of the hallmark features of Alzheimer's disease, and this is why the presence of magnetite is so potentially significant, because it is so bioreactive."

Patients with Alzheimer's and dementia sit inside the Alzheimer foundation in Mexico City April 19, 2012. REUTERS/Edgard Garrido

Abnormal accumulation of brain metals is a key feature of Alzheimer's disease and a recent study showed that magnetite was directly associated with the damage seen in Alzheimer's brains. Magnetite particles are known to form biologically in human brains, but these are small and crystal-shaped, unlike the larger, spherical particles that dominated the samples in the new study.

"Many of the magnetite particles we have found in the brain are very distinctive," said Maher. "They are very rounded nanospheres, because they were formed as molten droplets of material from combustion sources, such as car exhausts, industrial processes and power stations – anywhere you are burning fuel."

"They are abundant," she said. "For every one of [the crystal-shaped particles] we saw about 100 of the pollution particles. The thing about magnetite is it is everywhere." An analysis of roadside air in Lancaster found 200 million magnetite particles per cubic metre.

Furthermore, said Maher: "We also observed other metal-bearing particles in the brain, such as platinum, cobalt and nickel. Things like platinum are very unlikely to come from a source within the brain. It is a bit of an indicator of a [vehicle] catalytic converter source."

Other scientists told the Guardian the new work provided strong evidence that most of the magnetite in the brain samples comes from air pollution, but that the link to Alzheimer's disease remained speculative.

"This is a very intriguing finding and it raises a lot of important questions," said Prof Jon Dobson, at the University of Florida, who was not part of the research team. But he said further investigation was needed: "One thing that puzzles me is that the [particle] concentrations are somewhat higher than those previously reported for the human brain. Further studies [are needed] to determine whether this is due to regional variations within the brain, the fact that these samples are from subjects who lived in industrial areas, or whether it is possibly due to [lab] contamination." The researchers said they had gone to great lengths to avoid contamination.

Air pollution was linked to a significant increase in the risk of Alzheimer's disease by a major study published in 2015, while other research showed brain damage related to Alzheimer's disease in children and young adults exposed to air pollution. Air pollution has also been linked to dementia in older men and women.


"We have not demonstrated a causal link between these particles and Alzheimer's disease, but when you consider that magnetite has been found in higher concentrations in Alzheimer's brains, and you know that magnetite is pernicious in its effect on the brain, then having a direct [air pollution] source of magnetite right up your olfactory bulb and into your frontal cortex is not a great idea," said Maher.

Prof David Allsop, an Alzheimer's disease expert at Lancaster University and part of the research team, said: "There is no blood-brain barrier with nasal delivery. Once nanoparticles directly enter olfactory areas of the brain through the nose, they can spread to other areas of the brain, including the hippocampus and cerebral cortex – regions affected in Alzheimer's disease." He said it was worth noting that an impaired sense of smell is an early indicator of Alzheimer's disease.

"Knowledge is power," Maher said. "So if there's at least a possibility that exposure to traffic pollution is having even worse health impacts than were previously known, then take the steps you can to reduce your dose as far as you can."

"What this is pointing towards perhaps is there needs to be a major shift in policy and an attempt to reduce the particulate matter burden on human health," Maher said. "The more you realise the impact this is having, the more urgent and important it is to reduce the concentrations in the atmosphere."

Dr Clare Walton, research communications manager at the Alzheimer's Society, said: "This study offers convincing evidence that magnetite from air pollution can get into the brain, but it doesn't tell us what effect this has on brain health or conditions such as Alzheimer's disease. Further work in this area is important, but until we have more information people should not be unduly worried. There are more practical ways to lower your chances of developing dementia, such as regular exercise, eating a healthy diet and avoiding smoking."


Brazil's new government may sacrifice the Amazon to get the economy going


Brazilian artist Mundano works on murals depicting indigenous people in protest against the construction of hydroelectric power plants in the Amazon rainforest, in Rio de Janeiro, Brazil, August 7, 2016.

The impeachment of former president Dilma Rousseff, coup or not, represents a fundamental realigning of modern Brazil.

For some in the country, the crisis is an opportunity.

These politicians and businessmen are now exploiting the upheaval to roll back environmental laws and get their hands on the vast natural resources found in protected regions of the Amazon.

The new government led by Michel Temer faces a budget deficit of 10%, an unemployment rate of 10.9% and strong calls for austerity.

It looks set to terminate a number of successful social policies, and proposes to weaken worker rights by redefining slavery to exclude “degrading conditions” and “exhausting shifts”.

Nonetheless, Temer will want to maintain Brazil’s international brand of a nation committed to the environment. After all, climate change was put centre stage at the opening ceremony of the 2016 Rio Olympics and a clear message was beamed into billions of homes across the planet: Brazil is green.

Yet these environmental credentials are questionable. Under president Dilma Rousseff and her predecessor, Lula, deforestation returned, large-scale mining and agriculture was expanded, and more dams were built.

Temer has appointed a number of environmentalist politicians to prominent positions such as the Green Party’s José Sarney Filho, now environment minister, and José Serra, the foreign minister. But economic rejuvenation at all costs will inevitably overshadow policies aimed at conservation.

Earlier this year, Temer published a document titled “A bridge to the future”, which outlined his plans for the future of Brazil and its economy. The environment, the Amazon and climate change were not mentioned.

A tractor works on a wheat plantation on land that used to be virgin Amazon rainforest near the city of Santarem, Brazil, April 20, 2013.

In particular, campaigners fear the new, pro-business government will fast-track dams, mines and other damaging schemes by weakening environmental impact assessments. A proposed bill, if passed, would allow for infrastructure projects to continue regardless of potential impacts on the environment and indigenous lands. This opens the door for accelerated environmental damage in the name of economic recovery and growth.

Though activists cheered the recent cancellation of a $10 billion hydroelectric dam on environmental grounds, it seems such celebrations may prove to be premature.

A key figure behind this bill is senator Blairo Maggi, Brazil’s soybean king and a former recipient of Greenpeace’s Golden Chainsaw, awarded to the “person who most contributed to Amazon destruction”. Temer has recently appointed him Minister of Agriculture.

Maggi is a prominent member of the Agricultural Parliamentary Front (or ruralistas) that have long argued for land reform so that protected forests can be chopped down for crops, cattle and mining, with the products sold abroad. As of 2014, 28.4% of protected areas in the Amazon were of interest to mining companies. These lands – protected by concerns for both the environment and indigenous communities – will likely witness further encroachment under Temer’s government.

In recent months, this increasingly strong lobby has submitted a list of demands to President Temer, including land reform and increased subsidies for agriculture. Over lunch with the ruralistas, Temer seemingly committed to exploring these demands.

Lyndon Pishagua Chinchuya, representative of the indigenous peoples of the Peruvian Amazon, attends a meeting during the World Climate Change Conference 2015 (COP21) at Le Bourget, near Paris, France, December 8, 2015.

In one of her last acts as president, the Guardian reports, Rousseff supported indigenous land claims and acknowledged a number of quilombolos (lands occupied by the descendants of runaway slaves). Under Temer, policies like these are now under review.

The ruralistas also want to transfer responsibility for land demarcation from the executive to the legislature, where they dominate. The bill proposing this change was first drawn up in 2000 and is now back on the agenda after years in the doldrums. If passed, it would likely sound a death knell for future territory protection.

These “land reform” schemes largely focus on the Amazon rainforest, where deforestation will likely continue thanks to lucrative opportunities in agriculture and mining. Tighter government budgets will also mean less money for those charged with keeping illegal loggers and miners out of protected areas. In a nation where 50 environmental defenders were murdered in 2015 – the most in the world – resistance will likely result in violence.

A silver lining can be found in Brazil taking steps towards ratifying the 2015 Paris Agreement on climate change. Yet, with the Brazilian economy in its worst slump for decades, a bitter medicine remains likely.

Senator Roberto Requião, who voted against impeachment proceedings, urged the new government to “Get yourselves into the trenches … conflict will be inevitable.” The danger, as it so often is in times of recession, is that the environment will be the new Brazil’s battlefield, and its forgotten first victim.

Ed Atkins, PhD Candidate; Environment, Energy & Resilience, University of Bristol

This article was originally published on The Conversation. Read the original article.


Ecuador drills the first barrel of oil in a pristine corner of the Amazon rainforest


Ecuador has started drilling the first oil from Yasuní National Park, a pristine corner of the Amazon rainforest.


Vice President Jorge Glas toured the site with reporters on September 7 as state oil company Petroamazonas drilled the first barrel of oil from just outside the park.


Yasuní is a nearly 3,800-square-mile protected nature preserve on the western edge of the Amazon.



Scientists estimate 150 amphibian, 120 reptile, and 4,000 vascular plant species live in the area, which the Ecuadorian government began protecting in 1979.

Source: United Nations Development Programme



Yasuní is also a UNESCO site, since in addition to its unparalleled biodiversity, indigenous tribes also call the area home.

Source: PLOS One




Big data and algorithms are slashing the cost of fixing Flint's water crisis

$
0
0

In this photo taken March 21, 2016, the Flint Water Plant water tower is seen in Flint, Mich. President Barack Obama next week will make his first trip to Flint, Mich. since the impoverished city was found to have lead-tainted drinking water, the White House said Wednesday, April 27, 2016.   (AP Photo/Carlos Osorio)

The water crisis in Flint, Michigan, highlights a number of serious problems: a public health outbreak, inadequate urban infrastructure, environmental injustice and political failures.

But when it comes to recovery, the central challenge, and one that has received relatively little attention, is our lack of useful information and understanding.

Who is most at risk? Where are the harmful sources of lead? Where should resources be allocated? Using modern big-data tools, we can answer these questions and help inform the response to this crisis.

With the support of our student team at the University of Michigan, we have aggregated a trove of available data around Flint’s water issues, including water test results, records of the service lines that deliver water to homes, information on parcels of land and water usage. Leveraging new algorithmic and statistical tools, we are able to produce a significantly more complete picture of the risks and challenges in Flint.

These methods strongly resemble those used by Facebook, Amazon and other large tech companies that collect vast amounts of data from users.

But whereas Facebook's algorithms crunch through uploaded photographs to detect faces and Amazon's models predict which products you'll like, we are using these analytics tools to detect homes with a high risk of lead contamination and to predict the locations of lead pipes buried underground or hidden in the homes of residents.

What have we learned? Here are a few takeaways from our research.

Lead contamination varies widely across homes and is highly scattered around Flint, but it is surprisingly predictable

The headlines on Flint could easily lead one to believe all homes in the city have dangerously high levels of lead. But in fact, using data from the state's sentinel program, we found that during a period in February only between 8 and 15 percent of homes had lead above the federal action level of 15 parts per billion (ppb).

Indeed, things have been improving from January through August 2016, according to the test data from the sentinel program. Based on about 750 homes monitored repeatedly, fewer homes have tested above the action level over time. Almost half of all samples have virtually no detectable level (below 1 part per billion).

Percent of samples in the DEQ’s sentinel program that tested below the federal action level.

These low numbers provide little comfort when we don’t know which homes are at risk. Only around 30 percent of homes in Flint have had their water tested, according to government data, and these water tests do not guarantee safety; they only identify danger. Also, it is clear from the data that homes that are slower to sample their water tend to be those at much greater risk.

So can we find these homes? The answer is yes, to a modest degree of accuracy. We have built statistical models that profile a home based on several attributes (year of construction, location, value, size, etc.), and provide an estimate of the risk level.
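A minimal sketch of what such a profile-based risk score might look like in code. The coefficients and attribute choices below are invented for illustration, not the team's fitted model; they only encode the directions reported here (older, lower-value, never-sampled homes skew riskier):

```python
import math

def lead_risk_score(year_built, home_value, previously_sampled):
    """Toy logistic-style score for a home's lead-contamination risk.

    Coefficients are illustrative placeholders, not values fitted to
    the Flint data.
    """
    z = (
        -0.04 * (year_built - 1950)               # older construction -> higher risk
        - 0.00002 * (home_value - 50000)          # lower value -> higher risk
        - 0.5 * (1 if previously_sampled else 0)  # unsampled homes skew riskier
    )
    return 1.0 / (1.0 + math.exp(-z))  # squash to a (0, 1) risk score

# An older, low-value, never-sampled home scores higher than a newer one.
old_home = lead_risk_score(1925, 30000, previously_sampled=False)
new_home = lead_risk_score(1990, 120000, previously_sampled=True)
```

A real model would be trained on the thousands of water-test results rather than hand-set; the point is only that a handful of parcel attributes can be turned into a ranked list of homes to test first.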

Based on our statistical models, we can display locations which we estimate to be at high risk of lead contamination.

The quality of these models is driven by the huge swaths of data from water samples submitted by residents and tested by government officials in response to the crisis. This provides us with a database of measurements that includes over 20,000 water samples covering roughly 10,000 homes in Flint from November 2015 to the present. We have made our risk assessments available to government officials, and they are being incorporated into a mobile application, funded by Google and built by students at UM Flint, that allows Flint residents to learn of their home's risk level.

Younger properties have lower lead levels, on average and based on the 90th percentile (blue line). Eight percent of tests were above the federal action level of 15 ppb (dotted red), and some were still well above 150 ppb and even 1,000 ppb. The highest 0.5 percent of samples are not shown.

These statistical models not only provide predictions; they also give a better understanding of the problems. This has much broader implications, as these factors predicting lead may generalize beyond Flint.

The data suggest that lead contamination is associated with a number of factors; older homes tend to be at greater risk, for instance, as are those of lower home value. Lower-value homes also tend to be those with the lowest rates of water sampling. Additionally, while the highest readings are geographically scattered, the homes predicted to be at high risk tend to cluster in specific neighborhoods.

Flint’s lead pipe records are spotty and noisy, but statistical methods can significantly fill the gap

Media reports and political efforts have continued to focus on the so-called “water service lines” that connect each house to the distribution system in the street. The assumption is that homes with lead service lines are most at risk for lead exposure and poisoning. As a result, much of the attention has been on locating and replacing these lines.

The Michigan legislature has allocated over US$25 million toward replacing the harmful lines, beginning with a pilot phase of roughly 250 homes. This effort is headed by a team under National Guard Brig. Gen. Michael McDaniel.

The problem, however, is not only with lines made of lead: lead particulate can also accumulate on the walls of corroded galvanized steel pipes. Pipes made of copper or plastic, on the other hand, are generally considered to be safe.

But there are immediate challenges with the line replacement program. And the most obvious is: Where are these dangerous pipes?

The city, unfortunately, did not maintain consistent records on service line installations and materials. But city officials eventually found, after some searching, a set of maps with handwritten annotations (last updated in 1984), and these records were digitized by a UM Flint research team led by Professor Marty Kaufman. These maps appeared to identify the material of the service lines for most home parcels in Flint.

Using these paper records, researchers were able to get a rough idea of what type of material – lead, copper or plastic – was used to bring water service to each home.

How complete and accurate are these records? Unfortunately, not very. For over 30 percent of homes, the records either lack a label or disagree with a home inspection of a portion of the service line.

We can again fill in the gaps with the help of algorithms and data. By looking for patterns in the existing records, statistical tools can provide a reasonable “educated guess” as to the type of material in a home’s service line. We have been working directly with Gen. McDaniel’s line replacement team, providing statistical estimates of where lead pipes are most likely to be found, and this has guided their targeting of replacement resources.
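An “educated guess” of this kind is a standard classification problem. The sketch below uses a small decision tree on invented records; the features, labels and model choice are assumptions for illustration, not the team's actual pipeline.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Hypothetical records: [year built, neighborhood code] for homes whose
# service line material is known from the 1984 maps or an inspection.
X_known = np.array([
    [1928, 0], [1935, 0], [1942, 1], [1948, 1],
    [1955, 2], [1963, 2], [1971, 3], [1985, 3],
])
# 1 = lead or galvanized (risky), 0 = copper/plastic (safe)
y_known = np.array([1, 1, 1, 0, 0, 0, 0, 0])

tree = DecisionTreeClassifier(max_depth=2).fit(X_known, y_known)

# "Educated guess" for homes with missing or contradictory records.
X_missing = np.array([[1931, 0], [1978, 3]])
guesses = tree.predict(X_missing)
print(guesses)
```

On data like this the tree learns what the article reports at city scale: construction era is a strong predictor of service line material.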

Our recommendations adapt to incoming data, using techniques applied in online advertising experiments and clinical trials, to identify risky homes quickly and efficiently.

Professors Schwartz (left) and Abernethy (right) at a service line replacement site in Flint, Michigan.

Our machine learning techniques, which utilize all of the available city data, parcel records and a database of over 3,000 inspection reports, are able to estimate line materials with better than 80 percent accuracy. We find, for instance, that houses built in the 1920s to 1940s are many times more likely than those built after 1960 to have lead in their service line. Our guesses aren’t perfect by any means, but estimates of this level can save millions of dollars on recovery efforts.

Home service lines may not be the largest contributor of lead

Despite the huge media attention focused on the service lines, one of the major takeaways from our analyses is that these service lines may not be the major driver of the lead in Flint’s drinking water. Yes, it is the case that those homes with copper service lines have lower lead levels, on average, than those with lead in their service line. But when you look closely at the water testing data, the differences are much smaller than you might think.

While it is difficult to determine with certainty due to the spotty records, what we have found is that large spikes of lead occur in homes with and without lead service lines. This suggests a large fraction of the dangerously high lead readings are probably not being driven by the service line material but instead by other factors. Civil engineers who study these problems report that lead can leach from several sources, including the home’s interior plumbing, faucet fixtures and aging pipe solder.

We can look at homes that, based on records and home inspections, appear to have copper-only service lines versus those containing some lead. We plot the distribution of the lead readings for water samples from these two home categories.
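A numerical version of that comparison looks like the following. The readings are fabricated for illustration only; the point they encode is the article's finding that both groups contain extreme spikes.

```python
import numpy as np

# Hypothetical lead readings (ppb) for the two home categories.
copper_homes = np.array([0.4, 1.0, 2.2, 3.1, 5.5, 14.0, 95.0])
lead_homes = np.array([0.8, 1.9, 3.0, 4.4, 7.2, 16.0, 110.0])

# The averages differ somewhat, but both groups contain large spikes --
# consistent with lead sources beyond the service line (interior
# plumbing, faucet fixtures, aging solder).
print("copper mean:", copper_homes.mean())
print("lead mean:", lead_homes.mean())
print("copper max:", copper_homes.max(), "| lead max:", lead_homes.max())
```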

What we can conclude is that citizens as well as policymakers may need to widen their focus beyond the service line materials and consider alternative efforts to address other sources of lead. Service line replacement is certainly a necessary part of the solution, but it will not be sufficient.

Toward solving the broader problem, data and statistical tools can help greatly reduce risks at much lower cost, and a data-oriented understanding of the problems in Flint can guide efforts to address lead concerns in other regions as well.

For more information about getting water filters and testing your water, visit michigan.gov/flintwater/

Jacob Abernethy, Assistant Professor, University of Michigan and Eric Schwartz, Assistant Professor of Marketing, University of Michigan

This article was originally published on The Conversation. Read the original article.
