Category Archives: uranium and nuclear
Forecasts for uranium price all point up
July 13, 2017
Uranium was the glaring exception amid a broad-based rally in metals and minerals in 2016. The price of U3O8 fell 41% in 2016 with the industry tracker UxC’s broker average price hitting 12-year lows below $18 per pound in November.
After top supplier Kazakhstan announced in the second week of January that it’s cutting output by 5.2 million pounds, equal to 3% of global production, the price rallied, hitting $26.75 a pound by mid-February.
But Japanese utility TEPCO’s declaration of force majeure on a key uranium delivery contract from Cameco Corp. (CCO-T), the world’s top listed uranium producer, dampened enthusiasm.
And news in April that the U.S. Department of Energy is cutting the amount of uranium it releases into the market (by as much as 1.1 million pounds per year) did little to buoy sentiment, not to mention negative news surrounding nuclear power, including setbacks for the first new reactor to be built in the UK in a generation and risks to the US industry.
Last week the Russian state nuclear corporation Rosatom suspended its Mkuju River uranium project in Tanzania for at least three years due to the depressed uranium market.
Spot uranium rose to $20.75 this week but remains technically in a bear market, trading down more than 20% from its February peak. Despite the current negativity, analysts surveyed by FocusEconomics in July predict a steady increase from today’s levels, with the price rising 40% by the end of next year and topping $40 a pound in 2020.
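As a quick sanity check, the survey figures above can be turned into dollar terms. The inputs below are the article's own numbers (the $20.75 spot price and the 40% forecast rise); nothing here is an independent forecast:

```python
# Back-of-the-envelope check of the FocusEconomics survey figures quoted
# above; the inputs are the article's numbers, not a forecast of our own.
spot = 20.75                     # spot uranium this week, $/lb

implied_2018 = spot * 1.40       # a 40% rise by the end of next year
print(f"Implied end-2018 price: ${implied_2018:.2f}/lb")

# Reaching $40/lb by 2020 would require nearly doubling today's price:
gain_to_2020 = (40.0 / spot - 1) * 100
print(f"Gain needed for $40/lb: {gain_to_2020:.0f}%")
```

A 40% rise from $20.75 implies roughly $29/lb by end-2018, and the $40 target for 2020 implies a further climb of about half again from there.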
Why Commodity Traders Are Fleeing the Business
The number of trading houses has dwindled, and the institutional, pure-play commodity hedge funds that remain are few.
By Shelley Goldberg
Published July 12, 2017, 3:00 AM CST; updated 11:32 AM CST
Copper, the “beast” of commodities.
Profiting from commodity trading often requires a combination of market knowledge, luck, and most importantly, strong risk management. But the number of commodity trading houses has dwindled over the years, and the institutional, pure-play commodity hedge funds that remain — and actually make money — can be counted on two hands. Here is a list of some of the larger commodity blow-ups:
- The largest and most successful commodity trading house of its day caved, triggered by copper trading losses
- The New York branch of a large German conglomerate lost $1.5 billion in heating oil and gasoline derivatives
- Trader Yasuo Hamanaka was blamed for a $2.6 billion loss in a copper scandal
- One firm dissolved after misreporting natural gas trades, precipitating the fall from grace of Arthur Andersen, a ‘Big 5’ accounting firm
- An energy hedge fund folded after losing over $6 billion on natural gas futures
- One of the best-performing hedge funds of 2011 closed its doors in 2012, shrinking from $2 billion to $1.2 billion on crude oil bets
- Brevan Howard Asset Management, one of the largest hedge funds globally, closed its $630 million commodity fund after having run well over $1 billion of a $42 billion fund
- The sister and energy trading arm of Philipp Brothers, ranked in 1980 as the 15th-largest U.S. company, dissolved
- Private-equity firm Carlyle Group LP split with the founders of its Vermillion Asset Management commodity hedge fund, which shrank from $2 billion to less than $50 million
Amid the mayhem, banks held tightly to their commodity desks in the belief that there was money to be made in this dynamic sector. The trend continued until the implementation of the Volcker rule, part of the Dodd-Frank Act, which went into effect in April 2014 and disallowed short-term proprietary trading of securities, derivatives, commodity futures and options for banks’ own accounts. As a result, banks pared down their commodity desks, but maintained the business.
Last week, however, Bloomberg reported that Goldman Sachs was “reviewing the direction of the business” after a multi-year slump and yet another quarter of weak commodity prices.
In the 1990s boom years, commodity bid-ask spreads were so wide you could drive a freight truck through them. Volatility came and went, but when it came it was with a vengeance, and traders made and lost fortunes. Commodity portfolios could be up or down about 20 percent within months, if not weeks. Although advanced trading technologies and greater access to information have played a role in the narrowing of spreads, there are other reasons specific to the commodities market driving the decision to exit. Here are the main culprits:
- Low volatility: Gold bounces between $1,200 and $1,300 an ounce, WTI crude straddles $45 to $50 per barrel, and corn is wedged between $3.25 and $4 a bushel. Volatility is what traders live and breathe by, and the good old days of 60 percent and 80 percent swings are now hard to come by. Greater efficiency in commodity production and consumption, better logistics, substitutes and advancements in recycling have reduced the concern about global shortages. Previously, commodity curves could swing from a steep contango (normal, upward-sloping curve) to a steep backwardation (inverted curve) overnight, and with seasonality added to the mix, curves resembled spaghetti.
- Correlation: Commodities have long been considered a good portfolio diversifier given their non-correlated returns with traditional asset classes. Yet today there’s greater evidence of positive correlations between equities and crude oil and Treasuries and gold.
- Crowded trades: These are positions that attract a large number of investors, typically in the same direction. Large commodity funds are known to hold huge positions, even if these only represent a small percent of their overall portfolio. And a decision to reverse the trade in unison can wipe out businesses. In efforts to eke out market inefficiencies, more sophisticated traders will structure complex derivatives with multiple legs (futures, options, swaps) requiring high-level expertise.
- Leverage: Margin requirements for commodities are much lower than for equities, meaning the potential for losses (and profits) is much greater in commodities.
- Liquidity: Some commodities lack liquidity, particularly when traded further out along the curve, to the extent there may be little to no volume in certain contracts. Futures exchanges will bootstrap contract values when the markets close, resulting in valuations that may not reflect physical markets and grossly swing the valuations on marked-to-market portfolios. Additionally, investment managers are restricted from exceeding a percentage of a contract’s open interest, meaning large funds are unable to trade the more niche commodities such as tin or cotton.
- Regulation: The Commodity Futures Trading Commission and the Securities and Exchange Commission have struggled and competed for years over how to better regulate the commodities markets. The financial side is far more straightforward, but the physical side poses many insurmountable challenges. As such, the acts of “squeezing” markets through hoarding and other mechanisms still exist. While the word “manipulation” is verboten in the industry, it has reared its head over time. Even with heightened regulation, there’s still room for large players to maneuver prices — for example, Russians in platinum and palladium, cocoa via a London trader coined “Chocfinger,” and a handful of Houston traders with “inside” information on natural gas.
- Cartels: Price control is a fact not only in crude oil, where prices are influenced by the Organization of Petroleum Exporting Countries, but also in markets such as diamonds and potash, where other, more loosely defined cartels persist.
- It’s downright difficult: Why was copper termed “the beast” of commodities, a name later applied to natural gas? Because it’s seriously challenging to make money trading commodities. For one, their idiosyncratic characteristics can make price forecasting practically impossible. Weather events such as hurricanes and droughts, and their ramifications, are difficult to predict. Unanticipated government policy, such as currency devaluation and the implementation of tariffs and quotas, can cause huge commodity price swings. And labor movements, particularly strikes, can turn an industry on its head. Finally, unlike equity prices, which tend to trend up gradually like a hot air balloon but face steep declines (typically from negative news), commodities have the reverse effect — prices typically descend gradually, but surge when there’s a sudden supply shortage.
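The leverage point above is easy to make concrete. The numbers below are hypothetical (actual margin requirements vary by contract and exchange), but they show how a small margin deposit magnifies both gains and losses:

```python
# Hypothetical illustration of futures leverage; the margin level here
# is assumed for the example, not an actual exchange requirement.
contract_value = 100_000   # notional value of one futures contract, $
margin = 5_000             # assumed 5% initial margin posted, $

price_move = -0.10                    # a 10% adverse move in the underlying
pnl = contract_value * price_move     # dollar loss on the position
loss_vs_margin = pnl / margin         # loss as a multiple of margin posted

print(f"P&L: ${pnl:,.0f} ({loss_vs_margin:.1f}x the margin posted)")
```

With only 5% down, a 10% move against the position wipes out twice the posted margin; the same arithmetic in equities, at 50% margin, would cost only a fifth of the deposit.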
What are the impacts? The number of participants in the sector will likely drop further, but largely from the fundamental side, as there’s still a good number of systematic commodity traders who aren’t concerned with supply and demand but only with the market’s technical aspects. This will keep volatility low and reduce liquidity in some of the smaller markets. But this is a structural trend that feasibly could reverse over time. The drop in the number of market makers will result in inefficient markets, more volatility and thus, more opportunity. And the reversal could come about faster should President Donald Trump succeed in jettisoning Dodd-Frank regulations.
Fear of radiation is more dangerous than radiation itself
By David Ropeik
He is an instructor in the environmental programme of the Harvard Extension School, and an author, consultant and public speaker who focuses on risk perception, communication and management. His latest book is How Risky Is it, Really? Why Our Fears Don’t Always Match the Facts (2010). He lives near Boston, Massachusetts.
The fear of ionising (nuclear) radiation is deeply ingrained in the public psyche. For reasons partly historical and partly psychological, we simply assume that any exposure to ionising radiation is dangerous. The dose doesn’t matter. The nature of the radioactive material doesn’t matter. The route of exposure – dermal, inhalation, ingestion – doesn’t matter. Radiation = Danger = Fear. Period.
The truth, however, is that the health risk posed by ionising radiation is nowhere near as great as commonly assumed. Instead, our excessive fear of radiation – our radiophobia – does more harm to public health than ionising radiation itself. And we know all this from some of the most frightening events in modern world history: the atomic bombings of Japan, and the nuclear accidents at Chernobyl and Fukushima.
Much of what we understand about the actual biological danger of ionising radiation is based on the joint Japan-US research programme called the Life Span Study (LSS) of survivors of Hiroshima and Nagasaki, now underway for 70 years. Within 10 kilometres of the explosions, there were 86,600 survivors – known in Japan as the hibakusha – and they have been followed and compared with 20,000 non-exposed Japanese. Only 563 of these atomic-bomb survivors have died prematurely of cancer caused by radiation, an increased mortality of less than 1 per cent.
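The "less than 1 per cent" figure follows directly from the Life Span Study numbers quoted above:

```python
# Verifying the excess-mortality figure quoted from the Life Span Study.
survivors = 86_600           # hibakusha within 10 km of the explosions
excess_cancer_deaths = 563   # premature cancer deaths attributed to radiation

excess_rate = excess_cancer_deaths / survivors * 100
print(f"Excess mortality: {excess_rate:.2f}%")  # well under 1 per cent
```

The same arithmetic applies to the Chernobyl estimate cited later: 4,000 projected deaths among 600,000 exposed people is likewise about two-thirds of one per cent.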
While thousands of the hibakusha received extremely high doses, many were exposed to moderate or lower doses, though still far higher than those received by victims of the Chernobyl or Fukushima nuclear accidents. At these moderate or lower doses, the LSS found that ionising radiation does not raise rates of any disease associated with radiation above normal rates in unexposed populations. In other words, we can’t be sure that these lower doses cause any harm at all, but if they do, they don’t cause much.
And regardless of dose, the LSS has found no evidence that nuclear radiation causes multi-generational genetic damage. None has been detected in the children of the hibakusha.
Based on these findings, the International Atomic Energy Agency estimates that the lifetime cancer death toll from the Chernobyl nuclear accident might be as high as 4,000, two-thirds of 1 per cent of the 600,000 Chernobyl victims who received doses high enough to be of concern. For Fukushima, which released much less radioactive material than Chernobyl, the United Nations Scientific Committee on the Effects of Atomic Radiation (UNSCEAR) predicts that ‘No discernible increased incidence of radiation-related health effects are expected among exposed members of the public or their descendants.’
Both nuclear accidents have demonstrated that fear of radiation causes more harm to health than radiation itself. Worried about radiation, but ignoring (or perhaps just unaware of) what the LSS has learned, 154,000 people in the area around the Fukushima Daiichi nuclear plants were hastily evacuated. The Japan Times reported that the evacuation was so rushed that it killed 1,656 people, 90 per cent of whom were 65 or older. The earthquake and tsunami killed only 1,607 in that area.
The World Health Organization found that the Fukushima evacuation increased mortality among elderly people who were put in temporary housing. The dislocated population, with families and social connections torn apart and living in unfamiliar places and temporary housing, suffered more obesity, heart disease, diabetes, alcoholism, depression, anxiety, and post-traumatic stress disorder, compared with the general population of Japan. Hyperactivity and other problems have risen among children, as has obesity among kids in the Fukushima prefecture, since they aren’t allowed to exercise outdoors.
Though Chernobyl released far more radioactive material than Fukushima, fear caused much more health damage still. In 2006, UNSCEAR reported: ‘The mental health impact of Chernobyl is the largest public health problem caused by the accident to date … Rates of depression doubled. Post-traumatic stress disorder was widespread, anxiety and alcoholism and suicidal thinking increased dramatically. People in the affected areas report negative assessments of their health and wellbeing, coupled with … belief in a shorter life expectancy. Life expectancy of the evacuees dropped from 65 to 58 years. Anxiety over the health effects of radiation shows no signs of diminishing and may even be spreading.’
The natural environment around the Chernobyl and Fukushima Daiichi accidents adds evidence that ionising radiation is less biologically harmful than commonly believed. With people gone, those ecosystems are thriving compared with how things were before the accidents. Radiation ecologists (a field of study that blossomed in the wake of Chernobyl) report that radiation had practically no impact on the flora and fauna at all.
The risk from radiophobia goes far beyond the impacts in the immediate area around nuclear accidents. Despite the fact that radiation released from Fukushima produced no increase in radiation-associated diseases, fear of radiation led Japan and Germany to close their nuclear power plants. In both nations, the use of natural gas and coal increased, raising levels of particulate pollution and greenhouse gas emissions.
Neither country will meet its 2020 greenhouse gas emissions-reduction targets. Across Europe, fear of radiation has led Germany, France, Spain, Italy, Austria, Sweden and Switzerland to adopt policies that subsidise solar, wind and hydropower over nuclear as a means of reducing greenhouse gas emissions, despite the fact that most energy and climate-change experts say that intermittent renewable energy sources are insufficient to solve the problem. In the United States, 29 state governments subsidise wind and solar power, but only three offer incentives for nuclear, which produces far more clean power, far more reliably.
Fear of radiation has deep roots. It goes back to the use of atomic weapons, and our Cold War worry that they might be used again. Modern environmentalism was founded on fear of radioactive fallout from atmospheric testing of such weapons. A whole generation was raised on movies and literature and other art depicting nuclear radiation as the ultimate bogeyman of modern technology. Psychologically, research has found that we worry excessively about risks that we can’t detect with our own senses, risks associated with catastrophic harm or cancer, risks that are human-made rather than natural, and risks that evoke fearful memories, such as those evoked by the very mention of Chernobyl or Three Mile Island. Our fear of radiation is deep, but we should really be afraid of fear instead.
Infographic: Inside the Canada-US energy trade
By JWN staff
July 9, 2017, 8:53 a.m.
Further troubles lie ahead as Ottawa’s attempt at modernizing major resource project approval processes reveals a divided Canada
By Darrell Stonehouse
June 29, 2017, 1:21 p.m.
Image: Kinder Morgan Canada
Call it an exercise in herding cats.
Only one year into the federal government’s efforts to reshape Canada’s environmental and regulatory processes surrounding resource development, the review has already revealed a country deeply divided on how to assess environmental concerns with new projects and how to regulate industry to mitigate any issues.
The federal government launched its multi-department review last June after instituting a temporary system in January for projects already under environmental assessment. The goal is to replace the environmental assessment legislation put in place by Stephen Harper’s Conservatives in 2012, while modernizing the National Energy Board (NEB), Fisheries Act, and Navigation Protection Act.
The rationale for the review is to “restore Canadians’ trust in environmental assessments,” said Catherine McKenna, the federal minister of environment and climate change.
“The review of Canada’s environmental and regulatory practices will ensure that decisions are based on science, facts and evidence,” added Kirsty Duncan, the federal minister of science.
Over the last year, the government has been gathering submissions and holding public hearings to get input from Canadians across the country. In early April, the expert panel reviewing the environmental assessment process released its recommendations. A similar report concerning the modernization of the NEB was released in mid-May.
The preliminary results from the environmental review show the challenges of trying to balance environmental stewardship with industrial growth.
“Views about federal environmental assessment across the various interests ranged from support to all-out opposition,” the environmental panel said in its report to the government.
The view from industry
Industry was looking for a number of things from the review, including assurances that any new regulations wouldn’t further harm the country’s competitiveness.
“Canada is competing globally for capital investment in our oil and gas resources, and it is imperative for the Canadian economy that Canada remain competitive with other jurisdictions,” Jim Campbell, Cenovus Energy’s vice-president of government and community affairs, told the task force on behalf of his company.
Campbell pointed to a recent study and survey showing the Canadian industry is falling behind competitors when it comes to attracting capital. “Primary reasons cited for Canada’s decline include regulatory duplication and inconsistencies and complexity of environmental regulations,” he noted.
In its submission to the task force, Suncor Energy, like most others from industry who offered input, said the federal review process should dovetail with, rather than overlap, provincial and local review processes. The process should, “accent, not duplicate, provincial reviews,” said Suncor. “One project, one assessment. Duplicate reviews do not add additional protections and can add years to project applications.”
The federal assessment “should be a process to assess residual environmental risks in areas of federal jurisdiction,” Suncor added.
Cenovus, with most of its primary assets in Alberta, agreed primary responsibility for environmental assessments should remain with the provinces.
“Local regulators have the experience and technical expertise to best evaluate projects, work with local communities and perform follow-up monitoring and compliance,” noted Campbell.
Campbell also said federal and provincial environmental assessment processes should be streamlined by allowing for substitution and equivalency agreements based on the principles of the best-placed regulator to do the work and a single-window approach.
When it comes to addressing First Nations’ concerns, Suncor said the federal government, rather than industry, must take a leadership role, pointing out that the review “must ensure the Crown is upholding its duty to consult.”
“Proponents have the responsibility to support the Crown through direct engagement and partnership with affected communities, incorporating traditional knowledge through applications and developing projects in a sustainable manner,” Suncor added.
The oilsands giant said the people and communities closest to projects should be at the front of the line when it comes to consultations in environmental assessments.
“Reviews must allow those most directly affected by the outcome of a particular project to have the greatest opportunity to participate and have a voice in the process,” it noted. “Input from affected stakeholders can get diluted when the process is used for purposes other than gathering information on a specific project.”
Suncor and other resource companies and associations also said they don’t believe the review process should be hijacked by groups wanting to debate larger public concerns outside the boundaries of the project. Governments should first set public policy direction on these broader issues like climate change, and then the review process should ensure public policy standards are met.
“The review process is not the appropriate venue for debating broader public policy,” the company said.
Another key element for industry and provinces with resource-based economies in the review process was ensuring the designated projects section of the Canadian Environmental Assessment Act, 2012 remained in place. Projects including minerals mining (such as potash), linear developments (transmission lines and highways) that do not cross provincial boundaries, extraction of non-potable groundwater, in situ oilsands developments and natural gas facilities were removed from the list of projects requiring federal assessments in the 2012 legislation.
“Removing these projects from federal [environmental assessment] review saved time and cost by greatly reducing unnecessary duplication of [assessments] and other regulatory processes, reducing red tape for proponents while maintaining robust provincial environmental safeguards,” said the government of Saskatchewan in its submission. “The province advocates for the exclusion of such projects from federal review, recognizing mature and effective provincial environmental regulatory review processes.”
Green groups, First Nations look for greater participation in process
While industry looked to streamline the environmental assessment process and provide certainty to investors, environmentalists and First Nations looked for greater input into the process and for the federal government to expand the list of designated projects that require federal approval. Many also requested a climate test be included in the process.
West Coast Environmental Law said it was looking for a “next-generation assessment law” that accounted for the economic, ecological and social aspects of sustainability, that respected First Nations authority and governance, that provided for full public participation, and that connected the assessment, decision-making and action of different levels of government.
They also wanted the law to “address the causes and effects of climate change, include strategic and regional assessment as fundamental components, and to require appropriate assessment of the thousands of smaller projects currently not being studied.”
“This isn’t the time to make small adjustments to a deeply flawed process—we need a new law that ensures the health of Canadians and the environment, and this is our chance to get it right,” said Stephen Hazell, the director of conservation and general counsel at Nature Canada.
Recommendations favour expansion of federal role in assessments
The initial report from the expert panel is promising many of the big changes environmentalists and others who submitted opinions wanted. The first is a major expansion in the assessment process beyond the environmental impacts of a project.
“We outline that, in our view, assessment processes must move beyond the bio-physical environment to encompass all impacts likely to result from a project, both positive and negative. Therefore, what is now ‘environmental assessment’ should become ‘impact assessment,’” the panel said. “Changing the name of the federal process to impact assessment underscores the shift in thinking necessary to enable practitioners and Canadians to understand the substantive changes being proposed in our report.”
This new assessment process would cover what the panel calls the “five pillars of sustainability: environmental, social, economic, health and cultural impacts.”
While industry said it would like to see public input limited to those most affected by a project, the panel sided with environmental groups wanting broader input. The panel also said that more meaningful public participation in the assessment process is a must.
“An overarching criterion of public participation opportunities in impact assessment processes is that these opportunities must be meaningful,” the report added. “A meaningful participation process needs to have the inherent potential to influence decisions made throughout the assessment, provide inclusive and accessible opportunities for early and ongoing engagement from the public and indigenous groups, and provide the capacity required for active participation in the engagement.”
The panel said current rules regarding public participation are lacking and have been perceived as having been designed to “limit public participation in the assessment process.”
The panel believes the NEB’s adoption of the “standing test” has greatly hindered trust in its assessments.
“The degree to which this test has limited participation is evident through NEB participation data. The outcome of this is not an efficient assessment process or timely incorporation of public input into a decision-making process,” the panel said. “In the case of the Trans Mountain Expansion project review, a ministerial panel was convened after the NEB assessment process was completed, at least in part to hear from those who felt shut out of the initial process. In short, limiting public participation reduces the trust and confidence in assessment processes without bringing any obvious process efficiency.”
“The panel recommends that…legislation require that [an impact assessment] provide early and ongoing participation opportunities that are open to all,” the report said. “Results of public participation should have the potential to impact decisions.”
The expert panel also questioned the need for time limits on the review process, suggesting that instead, the time frame of the review process be project-specific. The current process, put in place in 2012, requires environmental assessments of projects that occur on federal lands, such as pipelines, to be completed within one or two years, depending on the project’s size and complexity.
“This has not met the objective of delivering cost- and time-certainty to proponents,” the report said. “Our recommended approach seeks to build public confidence in the assessment process. We believe that public trust can lead to more efficient and timely reviews. It may also support getting resources to market.”
The expert panel also recommended a number of ways to increase First Nations participation in the assessment process, including implementing the principles of the United Nations Declaration on the Rights of Indigenous Peoples (UNDRIP), “especially with respect to the manner in which environmental assessment processes can be used to address potential impacts to potential or established aboriginal or treaty rights.”
The panel recognized that there are broader discussions that need to occur between the federal government and indigenous peoples with respect to nation-to-nation relationships, overlapping and unresolved claims to aboriginal rights and titles, reconciliation, treaty implementation and the broader implementation of UNDRIP. According to the panel, many of these discussions will be necessary prerequisites for the full and effective implementation of the recommendations contained in the report.
Among its recommendations regarding indigenous people, the panel suggested that indigenous peoples be included in “decision-making at all stages of the assessment process, in accordance with their own laws and customs.”
It also suggests First Nations be funded adequately to allow meaningful participation in the process and be given the time to review information.
The panel report defines criteria for the types of projects that should be federally reviewed and narrows which projects are included in the designated projects list.
“Many participants favoured the continued use of a project list approach to trigger federal assessments because it is predictable and clear and places the focus on major resource projects,” wrote the panel.
“Requiring an assessment for projects with minor impacts was described as too burdensome and time-consuming for proponents and lacking proportionality. Participants also said, however, that the current project list is too focused on certain industries, such as mining, and should be revisited to ensure that the list more accurately reflects projects with the highest potential for adverse effects, with some participants indicating that in situ oilsands projects and hydraulic fracturing activities should be included.”
The committee recommended that only projects affecting federal interests should be included on the list. This differs from the current approach, which includes projects that may not affect matters of federal interest. And it said there should be an appropriate threshold for effects on federal interests so that a trivial impact does not trigger an assessment.
“A new project list should be created that would include only projects that are likely to adversely impact matters of federal interest in a way that is consequential for present and future generations,” said the committee.
On the issue of government jurisdiction, there was widespread support for the idea of “one project, one assessment.”
However, a key goal of the assessment process is to leverage the knowledge of all government levels.
“In Canada, many jurisdictions have the expertise, knowledge, best practices and capacity to contribute to impact assessments,” said the panel. “For example, the federal and provincial governments may focus on closely related issues, such as impacts to water quality versus impacts to a fishery. Yet indigenous groups also have relevant knowledge on these topics related to the practice of their aboriginal and treaty rights, their traditional and ongoing land use, and their laws, customs and institutions. Similarly, municipalities are the custodians of land use and the full range of local impacts that affect residents and their communities.”
The committee said it believes the best way to connect all these areas of expertise is through a co-operative approach.
“To date, the best examples of co-operation among jurisdictions have been joint-review panels backed up by general co-operation agreements between Canada and many provinces,” said the committee. “As such, expanding the co-operation model to include all relevant jurisdictions is the preferred method to carry out jurisdictional co-ordination.”
Climate change a sticky issue
The expert panel said the issue of climate change has proved difficult to address under existing environmental assessment regulations.
“Current processes and interim principles take into account some aspects of climate change, but there is an urgent national need for clarity and consistency on how to consider climate change in project and regional assessments,” it said.
The panel said criteria, modelling and methodology must be established to assess a project’s contribution to climate change, consider how climate change may impact the future environmental setting of a project, and consider a project’s or region’s long-term sustainability and resiliency in a changing environmental setting.
Industry is concerned the issue of climate change has sidelined project assessments and turned them into debates over government policy. The panel addressed this by recommending that the federal government lead a strategic impact assessment, or a similar co-operative and collaborative mechanism, on the Pan-Canadian Framework on Clean Growth and Climate Change. That assessment would provide direction on how to implement the framework and related initiatives in future federal project and regional assessments.
Liberals vow greater Indigenous input, tougher environmental hurdles for resource projects
OTTAWA — The Globe and Mail
Published Thursday, Jun. 29, 2017 4:36PM EDT
Last updated Thursday, Jun. 29, 2017 4:48PM EDT
The Liberal government is proposing new rules that would require resource companies to consult with Ottawa and indigenous communities on major projects well before the firms finalize their plans and apply for regulatory approval.
The companies would also be expected to provide greater opportunity for partnership with aboriginal Canadians, and seek indigenous peoples’ consent over whether developments that impact their traditional territory should proceed, though they would not have a veto.
The Liberals released a discussion paper Thursday that proposes sweeping changes to the federal regulatory review process for major projects, undoing many of the controversial changes put in place by the Conservatives just five years ago. The government plans to introduce legislation by the end of the year.
Its proposals call for additional environmental protection and increased public and indigenous involvement in decision-making during the review period, while insisting the new regime will “ensure good projects go ahead and resources get to market.”
The former Conservative government overhauled the assessment process in 2012, with the aim of accelerating decision-making, limiting the environmental considerations that would be taken into account, and preventing project opponents from bogging down hearings.
Echoing concerns of environmentalists and First Nations, the Liberals argued the Conservatives tilted the process in favor of pipelines and other resource projects. In its discussion paper, the government says the current process is not transparent and lacks scientific rigor; fails to protect waterways and fisheries, and does not allow for sufficient participation by the public and indigenous Canadians.
Despite those shortcomings, the Liberals decided when they took power in late 2015 to review pipeline projects already in the queue under the existing process, saying it would be unfair to make the companies wait for the reforms. Instead, the government adopted “interim measures” that included increased consultations and an assessment of the greenhouse-gas-emission impacts of pipeline expansion.
The oil industry’s proposed pipeline expansions have drawn the most opposition, and have prompted an often-heated national debate over the risks and benefits of resource development, pitting Alberta and its oil industry against vocal opponents in British Columbia and Quebec.
However, the proposed reforms – which are to some degree driven by that debate – would not impact the major oil pipelines. Ottawa approved two projects last November – including the hugely controversial expansion of the Trans Mountain line to Vancouver – and is currently reviewing TransCanada Corp.’s Energy East project.
Industry officials reacted cautiously to the proposals on Thursday, saying they needed to review the discussion paper to understand the full impact. The government proposes to keep the National Energy Board (NEB) as a regulator of ongoing operations, and maintain its office in Calgary, but the board would have a diminished role in the environmental assessment process.
The government proposes to create a single review agency that would replace the NEB in assessing pipeline and major energy projects, but the agency would work with the NEB to benefit from its technical expertise.
It aims to eliminate federal-provincial overlap by promoting a “one project, one assessment” approach, and would have legislated timelines for reviews, though ministers could approve exceptions.
The Canadian Energy Pipeline Association (CEPA) – which represents companies like TransCanada Corp – had proposed a two-step approval process, with an initial finding on whether a project was in the national interest and a more technical environmental assessment.
The Liberals’ discussion paper rejects that approach, but CEPA president Chris Bloomer said there are elements in the plan that could ensure the political debates and decisions are made outside the environmental assessment process. “There’s a lot in the discussion paper that requires further definition,” Mr. Bloomer said. “We think we can work with it but this is not the end of the process.”
Mining projects are reviewed under the Canadian Environmental Assessment Act, which will also be updated. Mining Association of Canada president Pierre Gratton said he was pleased to see Ottawa rejected recommendations made this spring by a government-appointed advisory panel that industry feared would have greatly bogged down the review process.
However, one environmental lawyer said the government’s proposals “fall far short” of establishing a process that would ensure sustainable development for the benefit of communities. The measures “would not give Canada a leading-edge, world-class environmental review process,” Anna Johnston, staff lawyer at West Coast Environmental Law, said. “It would still let short-term economic benefits that go to a few trump environmental harm.”
How news organizations unintentionally misinform the public on guns, nuclear power, climate change, etc.
This story can easily be switched to topics of nuclear power, greenhouse gases, climate change, etc.
How news organizations, including this one, unintentionally misinformed the public on guns
Mike Wilson, Editor
The Dallas Morning News
Steve Doud, a subscriber from Plano, emailed me to say he’d read something in the June 21 Dallas Morning News that couldn’t possibly be true.
An eight-paragraph Washington Post article on page 10A reported on a national study about kids and guns. The last sentence said 4.2 percent of American kids have witnessed a shooting in the past year.
“Really?” Doud wrote. “Does it really sound believable that one kid out of every 24 has witnessed a shooting in the last year? I think not, unless it was on TV, in a movie, or in a video game. In that case it would probably be more like 100 percent.”
His instincts were right. The statistic was not.
Here is the unfortunate story of how a couple of teams of researchers and a whole bunch of news organizations, including this one, unintentionally but thoroughly misinformed the public.
It all started in 2015, when University of New Hampshire sociology professor David Finkelhor and two colleagues published a study called “Prevalence of Childhood Exposure to Violence, Crime, and Abuse.” They gathered data by conducting phone interviews with parents and kids around the country.
The Finkelhor study included a table showing the percentage of kids “witnessing or having indirect exposure” to different kinds of violence in the past year. The figure under “exposure to shooting” was 4 percent.
Those words — exposure to shooting — are going to become a problem in just a minute.
Earlier this month, researchers from the CDC and the University of Texas published a nationwide study of gun violence in the journal Pediatrics. They reported that, on average, 7,100 children under 18 were shot each year from 2012 to 2014, and that about 1,300 a year died. No one has questioned those stats.
The CDC-UT researchers also quoted the “exposure to shooting” statistic from the Finkelhor study, changing the wording — and, for some reason, the stat — just slightly:
“Recent evidence from the National Survey of Children’s Exposure to Violence indicates that 4.2 percent of children aged 0 to 17 in the United States have witnessed a shooting in the past year.”
The Washington Post wrote a piece about the CDC-UT study. Why not? Fascinating stuff! The story included the line about all those kids witnessing shootings.
The Dallas Morning News picked up a version of the Washington Post story.
And Steve Doud sent me an email.
When I got it, I asked editorial writer Michael Lindenberger to do some research. He contacted Finkelhor, who explained the origin of the 4 percent “exposure to shooting” stat.
According to Finkelhor, the actual question the researchers asked was, “At any time in (your child’s/your) life, (was your child/were you) in any place in real life where (he/she/you) could see or hear people being shot, bombs going off, or street riots?”
So the question was about much more than just shootings. But you never would have known from looking at the table.
Finkelhor said he understood why “exposure to shooting” might have misled the CDC-UT researchers even though his team provided the underlying question in the appendices. Linda Dahlberg, a CDC violence prevention researcher and co-author of the study featured in The Post and this newspaper, said her team didn’t notice anything indicating the statistic covered other things.
Then again, the Finkelhor study didn’t say anything about kids “witnessing” shootings; that wording was added by the CDC-UT team. Dahlberg said she’ll ask Pediatrics about running a correction.
All of this matters because scientific studies — and the way journalists report on them — can affect public opinion and ultimately public policy. The idea that roughly one in 24 kids witnessed a shooting in the past year was reported around the world, and some of the world probably believed it.
No matter where you stand on guns or any other issue, we ought to be making decisions based on good information.
Finkelhor’s team caused confusion by mislabeling a complicated stat. The CDC-UT researchers should have found the information suspect. The Washington Post should have asked more questions about that line from the CDC-UT study.
And we should have been as skeptical of the Washington Post report as Steve Doud was.
It’s the carbon, stupid: Visualizing Canada’s carbon flows
June 26, 2017, 1:16 p.m.
Energy systems—the production and use of fuels and electricity—are under intense pressure to change for the good of the environment and the economy.
However, the flow of energy through these systems is not the problem.
Rather it is the flow of carbon, especially when those flows bring carbon dioxide (CO2) and methane (CH4) to the atmosphere where they become greenhouse gases (GHGs).
Six months ago, Canadian Energy Systems Analysis Research (CESAR) released Sankey diagrams for Canada showing how energy flows, from the sources we extract from nature to the demands of society for fuels and electricity.
Today, CESAR released a new set of Sankey visualizations—the first of their kind—showing how carbon flows through the same fuel and electricity systems.
These new Sankey diagrams come with a brand new and improved web portal where the user can toggle between the energy and the carbon Sankeys for a given province or year to explore 1056 different Sankey diagrams.
The portal has a number of other new and useful features that users can learn about by checking out the brand new User’s Guide to Energy-Carbon Sankeys. Literally hours of fun for energy systems nerds!
The title of this post (originally published on the CESAR website) gives a nod to the 1992 US presidential election, when Bill Clinton strategist James Carville framed one of the campaign’s three pillars as “the economy, stupid.”
CESAR tries to highlight a few of the things that one can learn by focusing on carbon. For the record, in saying “stupid,” CESAR is pointing at itself: its focus, even the CESAR name, has been about energy, but the core problem is carbon.
Comparing the Energy and Carbon Sankeys
Figure 1 provides a side-by-side comparison of energy (Panel A) and carbon (Panel B) flows associated with the production and use of fuels and electricity in Canada in 2013. There are many similarities and some important differences between the two visualizations.
In CESAR Sankey diagrams, the nodes on the left-hand side represent the energy (Figure 1A) or carbon (Figure 1B) content of all the energy resources produced in (i.e. Primary energy or carbon) or imported into Canada.
Figure 1: The flows of energy (A) and carbon (B) through the fuel and electricity systems of Canada in 2013.
The nodes in the middle portion of the Sankey represent the companies in the energy sectors. They convert the energy (and carbon, if present) into fuels and electricity that are then either exported or passed to a number of demand sectors.
These include transportation demands (personal and freight), buildings (residential, commercial and institutional), energy-using industries (cement, agriculture, chemicals, etc.), and non-energy uses (plastics and asphalt). Some recovered energy and carbon has little or no commercial value (e.g. petcoke), so it is stockpiled (mostly in northern Alberta) and allocated to a node called “stored energy” in these Sankey diagrams.
Since neither energy nor carbon can be created or destroyed (putting aside some nuclear reactions), the sum of all flows at the right side of each diagram equals the sum of all flows on the left side.
In the energy Sankey (Figure 1A), the energy that was consumed (or lost by venting) through domestic processes is depicted as orange and grey flow lines, attributed either to delivering the valuable end-use service (i.e. useful energy) or consumed as a “conversion loss” in the process of providing the fuel/electricity or end-use service. While all this energy still exists, it is highly dispersed (i.e. low exergy) and therefore of little economic value.
Exergy is the energy that is available to be used. A fuel that can create, on combustion, very high temperatures relative to the surroundings has high exergy since the differential temperature can do some valuable work.
In the C Sankey (Figure 1B), the grey flows are predominantly in the form of carbon dioxide (CO2), the end product of fossil fuel combustion. However, some fossil carbon may be in the form of methane (CH4), a potent GHG that can leak to the atmosphere.
The dark grey flows on the right side of Figure 1B represent the flows of CO2 from the combustion of bio-based fuels.
Although similarities exist between the two Sankeys shown in Figure 1, there are substantial differences that reflect variations among feedstocks in their carbon-to-energy content.
As shown in Figure 2, the C content of energy feedstocks varies widely, from zero for nuclear, hydro, wind and solar to about 24 or 25 kgC/GJ for coal or biomass.
Figure 2: The carbon content of energy feedstocks in Canada’s fuel and electricity systems. The black lines represent the range of values for each feedstock type, while the bar shows the typical value.
Oil and most refined petroleum products tend to have 19-20 kgC/GJ, while natural gas is about 14 kgC/GJ. Consequently, some of the energy flows do not appear on the carbon Sankeys (nuclear, hydro, wind) while others (especially coal and biomass) become proportionately larger relative to the flow for natural gas.
It is important to note that this C only accounts for the C that is contained within the energy feedstock itself; it does not provide information on how that C may contribute to GHG emissions. Nor does it reflect the life cycle emissions associated with recovering, refining and transporting each energy resource.
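For readers who want to relate these carbon contents to combustion emissions, a minimal sketch (not part of the CESAR tooling, using the approximate figures quoted above) converts kgC/GJ into kg of CO2 released per GJ via the CO2-to-carbon molar mass ratio of 44/12:

```python
# Convert approximate carbon contents (kg C per GJ of fuel energy)
# into combustion CO2 per GJ, using the CO2:C molar-mass ratio 44/12.
CO2_PER_C = 44.0 / 12.0  # kg CO2 produced per kg C fully combusted

carbon_content = {  # kg C / GJ, approximate values from the text
    "coal": 25.0,
    "biomass": 24.0,
    "oil": 19.5,
    "natural gas": 14.0,
    "nuclear/hydro/wind": 0.0,
}

for fuel, kg_c in carbon_content.items():
    print(f"{fuel:>18}: {kg_c * CO2_PER_C:5.1f} kg CO2/GJ")
```

On these numbers, fully combusting coal releases roughly 92 kg CO2/GJ versus about 51 kg CO2/GJ for natural gas, which is why fuel switching shows up so strongly in the carbon Sankeys.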
Insights from CESAR’s Carbon Sankeys
So what can we learn from a closer study of the Carbon Sankey for Canada (Figure 1B), insights that could not be gained from the Energy Sankey (Figure 1A)?
- Visualizing CO2equivalents (CO2e).
To extract the maximum amount of energy from fossil fuels, they need to be combusted in the presence of oxygen to produce CO2. This reality is reflected in the light grey flows to ‘fossilCO2’ in the C Sankey (Figure 1B). Each tC emitted as CO2 contributes one tC as CO2 equivalent (CO2e) to GHG emissions.
Figure 3: Details of the C Sankey showing the origin of CO2 equivalents.
However, fugitive emissions of fossil C (especially gaseous methane) do occur in Canada’s fuel and electricity systems, and those flows are tracked within the CanESS model. When displayed in the Sankey diagram (Figure 1B, or Figure 3 for a higher-resolution perspective), the flows are small compared to “fossilCO2.”
However, methane is a potent GHG, with a global warming potential of 9.1 tC as CO2e per tC as CH4 (see footnote 1). Therefore, the CH4 C flows are multiplied by 9.1 to calculate their contribution to GHG emissions measured as tC as CO2e.
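The 9.1 factor follows directly from the molar masses of C, CH4 and CO2 together with the GWP of 25 t CO2e per t CH4 cited in footnote 1. A short sketch of that arithmetic:

```python
# Derive the 9.1 factor: convert a GWP of 25 t CO2e per t CH4
# into t C-as-CO2e per t C-as-CH4, on a carbon basis.
M_C, M_CH4, M_CO2 = 12.0, 16.0, 44.0  # molar masses, g/mol

GWP_CH4 = 25.0  # t CO2e per t CH4 (footnote 1)

# 1 t of C carried as CH4 corresponds to 16/12 t of CH4 ...
t_ch4 = M_CH4 / M_C
# ... which is worth GWP_CH4 * t_ch4 tonnes of CO2e ...
t_co2e = GWP_CH4 * t_ch4
# ... and each t of CO2e carries 12/44 t of C.
factor = t_co2e * (M_C / M_CO2)

print(round(factor, 1))  # 9.1
```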
It is important to note that the C Sankey in Figure 1B only shows the GHG emissions coming from fossil C feedstocks. Process-based CO2 emissions such as that associated with cement or steel making are not shown in these Sankey diagrams, nor are the CH4 or N2O emissions associated with Canada’s agricultural systems or landfills.
- Biogenic carbon is treated differently.
In Figure 1B (or Figure 3), the dark grey flows on the right side of the Sankey diagram show the magnitude of biomass flows to the atmosphere as a result of their use as fuels. However, in agreement with international convention, these emissions do not contribute to Canada’s GHG emissions (CO2e in Figure 1B or 3).
The convention assumes that the plants from which this bio-carbon is obtained had recently (i.e. within the last year for crop plants, or the last century for trees) removed it from the atmosphere. Under international agreement, countries are expected to report C stock changes in their managed biological systems to ensure they are being sustainably managed.
Canada does report these numbers (see footnote 2), but the reported carbon stock changes are not counted in national totals for GHG emissions, nor are they shown in Figure 1B, since those flows only include the biocarbon associated with fuel and electricity production and use.
- Following the carbon from source to demand.
Starting from the left side of the carbon Sankey for 2013 (Figure 1B) we can see that a total of 392 MtC entered the Canadian energy system that year. Canadian oil, gas and coal producers took 300 million tonnes of carbon out of the ground — 182 MtC in the form of crude oil, 78 MtC in the form of natural gas and 38 MtC in the form of coal.
In addition, 21 MtC in wood and agricultural biomass were used for energy, and an additional 72 MtC were imported in the form of petroleum, natural gas and coal.
The nodes on the right side of the diagram show where all that carbon went. Half of the total carbon – fully 194 MtC – was exported as oil (130 MtC), gas (41 MtC) or coal (23 MtC). This carbon would almost all have ended up in the atmosphere when the exported fuels were burned, but the responsibility for those emissions rests with the importing country (primarily the US). Another 178 MtC (including 21 MtC of “biogenic” carbon from biomass combustion) was emitted into the atmosphere in Canada, from power plant stacks, residential and commercial building chimneys, personal and commercial vehicle tailpipes, and industrial energy consumption, including the fossil fuel industry itself.
All totalled, 95% of the carbon taken out of the ground ends up in the atmosphere. The remainder is stored or sequestered, mostly in plastics and other non-consumable petrochemicals (16 MtC), with smaller quantities in oil sands tailings and petroleum coke stockpiles (7 MtC).
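As a quick sanity check on the bookkeeping above (all values in MtC, rounded as in the text, so the balance closes only approximately), one can tally the inputs and trace how much of the carbon ends up airborne:

```python
# Rough mass balance of the 2013 carbon flows quoted above (MtC).
# Figures are rounded in the source, so totals close only approximately.
inputs = {"domestic crude oil": 182, "domestic natural gas": 78,
          "domestic coal": 38, "biomass": 21, "imports": 72}
outputs = {"exports": 130 + 41 + 23,        # oil + gas + coal = 194
           "emitted in Canada": 178,        # includes 21 biogenic
           "stored/sequestered": 16 + 7}    # petrochemicals + stockpiles

total_in = sum(inputs.values())             # ~392 MtC per the text
to_atmosphere = outputs["exports"] + outputs["emitted in Canada"]

print(total_in)
print(round(100 * to_atmosphere / total_in))  # ~95% ends up airborne
```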
- Tracking fossil fuel CO2 emissions.
By focusing on the light grey flows in any of CESAR’s carbon Sankeys, users have a rapid and easy way to explore the origin of fossil carbon emissions. Hint: on the web portal, hover your cursor over the flow of interest and a pop-up will appear giving you the number of MtC flowing through that part of the fuel and electricity system.
For example, in the 2013 carbon Sankey for Canada, the fuel and electricity industries can be seen to emit 59 MtC as CO2, or 38% of the 156 Mt of fossil C that Canada emitted to the atmosphere as CO2 in 2013.
The energy industry’s emissions were split evenly between power plants (29 Mt) and the oil and gas extraction and processing industry (30 Mt).
The remaining 62% of fossil CO2 emissions in 2013 (97 MtC) were associated with the demand sectors, and more than half of that (34%, or 53 MtC) came from transportation, both personal and freight. Fossil carbon emissions from buildings accounted for 13% (20 MtC), and the remaining 16% (24 MtC) were from the energy-using industries of Canada.
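The sector figures above can be cross-checked against the 156 MtC total with a few lines (a reader's sketch, not CESAR output; the quoted percentages are rounded):

```python
# Check the sector shares of Canada's 156 MtC of fossil CO2 in 2013.
total = 156  # MtC emitted to air as fossil CO2
sectors = {"power plants": 29, "oil & gas industry": 30,
           "transportation": 53, "buildings": 20,
           "energy-using industries": 24}

assert sum(sectors.values()) == total  # the pieces account for the total

for name, mtc in sectors.items():
    print(f"{name:>23}: {mtc} MtC ({100 * mtc / total:.0f}%)")
```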
Overall, these visualizations reveal the rivers of carbon from their headwaters in the primary production of fossil fuels and biomass through to their final emission into the atmosphere from the smokestacks, chimneys and tailpipes of our industrial plants, buildings and vehicles.
In this post, CESAR provided a few directions for navigating these rivers and encourages readers to use the new portal to compare energy and carbon Sankeys, explore how they have changed with time, and how they compare among provinces (especially using the per capita option).
- The 9.1 value assumes 25 tCO2e per tCH4; see User’s Guide for details.
- According to Environment and Climate Change Canada’s 2017 National Inventory Report, if we don’t count the C stock losses that occur on managed lands as a result of major forest fires, Canada’s managed biological systems annually accumulate about 9.3 MtC/yr.
That is equivalent to about 34 Mt CO2 /yr of net removal from the atmosphere.
Sask. mining rescue crews showcase emergency response skills
By Rebekah Lesko
June 4, 2017
Mining rescue crews from across the province showcased their skills at the Saskatchewan Mine Rescue Skills Competition.
Mining crews from across Saskatchewan came to Saskatoon to show off their emergency response skills.
Teams from potash, coal, uranium and gold mines showcased their rescue skills in simulated scenarios.
If anyone knows how important emergency response training is, it’s Rod Greve.
Greve worked at the Lanigan potash mine for over 40 years and said they are there to help their fellow miners should the need ever arise.
“They want someone to be trained. It’s a highly dedicated group of people from all the mines that get together here,” said Greve, who is a judge at the 49th annual Saskatchewan Mine Rescue Skills Competition.
“This training improves our community, our teams, our co-workers, everyone benefits from it.”
From fire to first aid, the competition tests miners’ skills for future emergencies, skills that are even more important in remote regions.
“We need to have our own emergency response teams available because resources like medical aid, ambulances and fire trucks aren’t available — we don’t have the communities right next to us,” Camille Pouteaux, a Cameco Key Lake team member, said.
“Having the ability to offer rescue services at the sites is very important.”
New project aims to extract rare earth elements from uranium tailings
ALEX MACPHERSON, SASKATOON STARPHOENIX
Published on: June 5, 2017 | Last Updated: June 5, 2017 6:00 AM CST
Saskatchewan Research Council mineral division head Bryan Schreiner says a new pilot project to remove rare earth elements from uranium tailings could have significant benefits for the province. KAYLE NEIS / SASKATOON
New technology under development in Saskatoon could make it profitable for Saskatchewan-based mining companies to extract “significant” quantities of rare earth elements from uranium tailings solution that would otherwise go to waste.
The parallel processes being piloted by Saskatchewan Research Council (SRC), which started work on the project three years ago, involve concentrating the tailings solution and then using “cells” containing mixers to separate out each of the rare earth elements.
“It’s good for our uranium companies and it’s good for the province,” said Bryan Schreiner, who heads SRC’s minerals division. “And in terms of value for Canada and the rest of the world, rare earths are in demand.”
Rare earth elements are used to improve alloys and manufacture consumer electronics and other products. While the 17 elements are relatively abundant, they are difficult to produce because they almost never appear in significant concentrations.
SRC’s technology, the product of about three years’ work, could not only ease China’s stranglehold on the global market for rare earths, but make extracting the elements much cheaper than setting up a dedicated facility, Schreiner said.
“The value of the elements is quite high. And the other value proposition here is you’ve already crushed and ground and dissolved the material (to get uranium) so you don’t have to do that for the rare earths.”
Schreiner said funding for the project comes from the Crown corporation’s innovation fund. According to its latest annual report, SRC turned revenues of just under $70 million into $484 million in “direct economic benefits” for the province.
It remains unclear, however, whether companies invested in the uranium sector will adopt the technology. Saskatchewan’s uranium industry has been badly hurt by plummeting prices, the result of collapsing demand in the wake of the 2011 Fukushima Daiichi nuclear disaster, and firms may be reluctant to commit limited capital to a new process.
Cameco Corp. spokesman Gord Struthers said in an email that while the project is “very preliminary,” the Saskatoon-based uranium mining company has discussed the possibilities with SRC and is considering whether it can “take it further.”
“It’s an interesting idea that could add additional value to our milling operations,” Struthers wrote.
Schreiner said while challenges remain — SRC is comfortable with the separation process but needs to refine its technique for concentrating the tailings solution — there is little doubt Saskatchewan firms would find a market for rare earth elements.
However, “It has to be tried and tested because the companies aren’t really interested in something unless it’s pretty secure and pretty reliable.”