Husky Energy investing in carbon capture pilot plant at Sask. heavy oil
ALEX MACPHERSON, SASKATOON STARPHOENIX, 07.13.2017
Steam operations have started at the Edam East project near Lloydminster, the first of three new heavy-oil thermal projects in the province.
Husky Energy Inc. is increasing its investment in carbon capture and storage technology, which it hopes will make its expanding heavy oil operations in Saskatchewan more environmentally friendly.
The Calgary-based company has been operating a tiny CCS plant developed by Inventys Inc., a clean energy company headquartered in Burnaby, B.C., at its Pikes Peak South operation northwest of Maidstone for six months. Earlier this month, it invested millions of dollars in the B.C. company with the aim of developing a much larger plant at the site.
“We are moving ahead with a 30 tonnes per day pilot project. … We believe this technology has the potential to reduce the cost of carbon capture, compared to existing technologies, and could turn Lloyd thermal production into a lower carbon source of energy,” Husky spokeswoman Kim Guttormson said in an email.
Carbon dioxide captured by the new project will be used alongside carbon dioxide recovered from other facilities for “enhanced oil recovery” operations in the region, Guttormson said. The process makes other types of oil wells more efficient, she added.
The new plant at Pikes Peak South is expected to be commissioned in the fourth quarter of 2018. Inventys CEO Claude Letourneau said it will have the footprint of two flatbed trailers, cost about $20 million and use the company’s second-generation CCS technology, which improves efficiency by absorbing the carbon dioxide into a solvent rather than a solid.
The increased efficiency, Letourneau continued, is expected to lead to significant cost savings. The capital cost of existing CCS technology is between $60 and $90 per tonne, but Inventys is aiming to cut that to about $30 per tonne — which the oil industry requires before it can start adopting CCS on a wide scale.
Last December, Husky’s board of directors approved three new $350 million steam-assisted heavy oil plants in Saskatchewan. The company, which has boosted its reliance on the facilities to 40 per cent from about eight per cent of total production, has many more projects “in the wings,” according to its former CEO.
That represents a major opportunity not just for Inventys — which wants to build 10 plants capable of capturing between 200 and 600 tonnes per day for Husky — but for an entire industry that is “looking for a solution,” Letourneau said. Husky’s investment, he continued, is a “clear sign” that energy companies are getting serious about addressing carbon capture.
Guttormson would not say how much the company has or is planning to invest in CCS technology, but Letourneau said its commitment is around 80 per cent of the $10 million it raised to support the pilot project in Saskatchewan.
Brandt wants SaskPower to catalyze renewable energy industry by favouring local firms
ALEX MACPHERSON, SASKATOON STARPHOENIX
Published on: July 12, 2017 | Last Updated: July 12, 2017 4:42 PM CST
The head of Saskatchewan’s largest privately-held company wants the province’s electrical utility to favour local firms, including his, as it works to boost Saskatchewan’s reliance on alternative energy sources to 50 per cent from the current 25 per cent during the next 13 years.
Brandt Group of Companies president Shaun Semple said a “local preference” in SaskPower’s procurement process could support not just his firm’s plan to build a wind turbine factory in Saskatoon’s former Mitsubishi Hitachi Power Systems Canada factory, but an entire industry in the province.
“Government procurement is not the solution, but it can be the seed, right? It can be the catalyst that starts an industry growing,” said Semple, just over three months after Brandt bought the sprawling 58th Street East factory for an undisclosed price and unveiled plans to fill the vacant facility with up to 500 of its employees.
The Regina-based company has already spent about $4 million on assessment, cleaning and refurbishment, and hired about 50 people to work at the massive facility. Semple said that total — Brandt currently has “just over” 2,000 employees — is expected to climb to about 100 by the end of the year, and could hit 300 by the end of 2018.
In addition to the turbine factory, the plant is expected to house elements of the company’s agricultural and custom manufacturing divisions, as well as research and development facilities, he said. Brandt does not disclose its finances, but Semple has said previously the purchase is part of a plan to boost its $1.7 billion revenue to $5 billion by 2025.
“We’re at the beginning stages of our industry on wind and alternate energy (sources), and if we don’t give the preference in the scorecards and use the procurement of SaskPower as a catalyst to develop it, it won’t happen and this plant will never see its full capability,” Semple said.
The Crown corporation’s procurement policy states its purchases must “obtain best value” for its money, ensure everyone is treated fairly, meet its operational requirements, comply with the province’s trade obligations, maintain “the highest ethical business standards” and support the development of Saskatchewan’s economy, including Aboriginal businesses.
SaskPower representatives were not available for interviews, but a spokesman for the Crown corporation said in a statement that it is “working closely with Priority Saskatchewan to ensure our procurement processes find the best value for our company.”
Priority Saskatchewan is a branch of SaskBuilds aimed at ensuring government procurement is fair and open.
“SaskPower procures goods and services in a fair and transparent public tendering process … We would look forward to any bid from (the Brandt) organization,” Jonathan Tremblay said in the statement.
While a homegrown wind turbine industry could have “huge” economic benefits for Brandt and other local companies, it’s vital that the government balance the need to support Saskatchewan businesses against the benefits of open competition, said North Saskatchewan Business Association executive director Keith Moen.
“We still are taxpayers and we want to see our government act prudently and judiciously in awarding their contracts, and whenever you can have that connection of it being a Saskatchewan company that gets the contract it’s a win-win. But it isn’t always a win-win because … we don’t want them to spend money frivolously.”
Forecasts for uranium price all point up
July 13, 2017
Uranium was the glaring exception amid a broad-based rally in metals and minerals in 2016. The price of U3O8 fell 41% in 2016 with the industry tracker UxC’s broker average price hitting 12-year lows below $18 per pound in November.
After top supplier Kazakhstan announced in the second week of January that it was cutting output by 5.2 million pounds, equal to 3% of global production, the price rallied, hitting $26.75 a pound by mid-February.
But Japanese utility TEPCO’s declaration of force majeure on a key uranium delivery contract from Cameco Corp. (CCO-T), the world’s top listed uranium producer, dampened enthusiasm.
And news in April that the U.S. Department of Energy is cutting the amount of uranium it disperses into the market (as much as 1.1 million pounds per year less) did little to buoy sentiment, not to mention negative news surrounding nuclear power, including troubles with the first new reactor to be built in the UK in a generation and risks to the US industry.
Last week, Russian state nuclear corporation Rosatom suspended its Mkuju River uranium project in Tanzania for at least three years due to the depressed uranium market.
Spot uranium rose to $20.75 this week but remains technically in a bear market, trading down more than 20% from its February peak. Despite the current negativity, analysts surveyed by FocusEconomics in July predict a steady increase in the price from today’s levels, rising by 40% by the end of next year and topping $40 a pound in 2020.
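As a rough check on what those forecasts imply, a 40% rise from this week’s spot price works out to about $29 a pound by the end of 2018, and the 2020 target is nearly double spot. The price figures below come from the article; the script is just the arithmetic:

```python
# Rough arithmetic check on the FocusEconomics forecast cited above.
spot = 20.75             # this week's spot price, USD per pound
rise_by_end_2018 = 0.40  # forecast: +40% by the end of next year

implied_2018 = spot * (1 + rise_by_end_2018)
print(f"Implied end-2018 price: ${implied_2018:.2f}/lb")  # about $29/lb

# The 2020 forecast of over $40/lb implies nearly a doubling from spot.
implied_2020_multiple = 40 / spot
print(f"$40/lb is {implied_2020_multiple:.1f}x this week's spot")
```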
WRITTEN BY JAMES EDWARD KAMIS, GUEST POST ON JANUARY 19, 2017
Antarctica’s Larsen Ice Shelf Break-Up driven by Geological Heat Flow Not Climate Change
Figure 1) North tip of Antarctic Continent including Larsen Ice Shelf Outline (black line), very active West Antarctica Rift / Fault System (red lines), and currently erupting or semi-active volcanoes (red dots).
Progressive bottom melting and break-up of West Antarctica’s seafloor-hugging Larsen Ice Shelf is fueled by heat and heated fluid flow from numerous very active geological features, and not climate change.
This ice shelf break-up process has been the focus of an absolute worldwide media frenzy contending man-made atmospheric global warming is at work in the northwest peninsula of Antarctica.
As evidence, media articles typically include tightly edited close-up photos of cracks forming on the surface of the Larsen Ice Shelf (Figure 2) accompanied by text laced with global warming alarmist catch phrases.
This “advertising / marketing” approach does in fact produce beautiful looking and expertly written articles. However, they lack substance; specifically, they show a distinct absence of actual scientific data and observations supporting the purported strong connection to manmade atmospheric global warming.
Working level scientists familiar with, or actually performing research on, the Larsen Ice Shelf utilize an entirely different approach when speaking about or writing about what is fueling this glacial ice break-up.
They ascribe the break-up to poorly understood undefined natural forces (see quote below). Unfortunately, comments by these scientists are often buried deep in media articles and never seem to match the alarmist tone of the article’s headline.
“Scientists have been monitoring the rift on the ice shelf for decades. Researchers told NBC News that the calving event was “part of the natural evolution of the ice shelf,” but added there could be a link to changing climate, though they had no direct evidence of it.” (see here)
Figure 2) An oblique view of the crack in the Antarctic’s Larsen C ice shelf on November 10, 2016. (NBC News article; credit John Sonntag / NASA via EPA)
This article discusses what more properly explains the Larsen Ice Shelf break-up: a theory that is supported by actual scientific data and observations, thereby strongly indicating that the above-mentioned undefined natural forces are in fact geological.
Let’s begin by reviewing the map atop this article (Figure 1). This map is a Google Earth image of the local area surrounding, and immediately adjacent to, the Larsen Ice Shelf, here amended with proven active geological features.
If ever a picture told a thousand words this is it. The Larsen Ice Shelf lies in and among: twenty-six semi-active (non-erupting but heat-flowing) land volcanoes, four actively erupting land volcanoes, two proven semi-active seafloor volcanoes (seamounts), and a proven actively heat flowing major fault system named the West Antarctic Rift.
Not shown on this map are known seafloor hydro-thermal vents (hot seafloor geysers), likely heat emitting fractures, and prominent cone-shaped seafloor mountains that are most likely seamounts (ocean volcanoes).
This geological information paints a very clear and compelling picture that the Larsen Ice Shelf is positioned in an extremely active geological setting. In fact a strong case can be made that the Larsen Ice Shelf owes its very existence to a down-faulted low valley that has acted as a glacial ice container (see research on the Bentley Subglacial Trench of the West Antarctic Rift / Fault).
Next let’s review in more detail a few of the key very local areas on the Figure 1 map which will help clarify the power and recent activity of these areas.
First up, the Seal Nunataks area, which is labeled on the Figure 1 map as “16 Semi-Active Volcanoes”. In general, these volcanoes lie within and push up through the northern portion of the Larsen Ice Shelf (Figure 3).
More specifically, the Larsen Ice Shelf is formally divided into three sub-areas: northern “A” segment, central “B” segment, and southern “C” segment. The 16 Seal Nunataks’ volcanoes are strongly aligned in a west to east fashion and are designated as the boundary between the Larsen “A” and “B” segments.
This 50-mile-long and 10-mile-wide chain of visible land volcanoes has likely been continuously volcanically active for at least the last 123 years based on limited amounts of data from this remote and largely unmonitored area.
Each time humans have visited this area they have recorded obvious signs of heat and heated fluid flow in the form of: fresh lava flows on volcanoes, volcanic ash on new snow, and volcanic debris in relatively new glacial ice. Remember, these observations only document volcanic activity on exposed land surfaces, and not the associated volcanic activity occurring on the seafloor of this huge volcanic platform.
More modern research published in 2014 by Newcastle University is here interpreted to indicate that the Larsen “B” portion of the greater Larsen Ice Shelf pulsed a massive amount of heat in 2002. Research elevation instruments showed that a huge portion of the Larsen “B” area quickly rose up, likely in response to swelling of underlying deep earth lava pockets (mantle magma chambers).
This process heated the overlying uplifted ground. This heated ground then acted to bottom melt the overlying glaciers (quote below). This is an awesome display of the power geologically induced heat flow can have on huge expanses of glacial ice.
“Scientists led by Newcastle University in the UK studied the impact of the collapse of the giant Larsen B ice shelf in 2002, using Global Positioning System (GPS) stations to gauge how the Earth’s mantle responded to the relatively sudden loss of billions of tonnes of ice as glaciers accelerated. As expected, the bedrock rose without the weight but at a pace – as much as 5 centimetres a year in places – that was about five times the rate that could be attributed to the loss of ice mass alone”, said Matt King, now at the University of Tasmania (UTAS), who oversaw the work.
“It’s like the earth in 2002 was prodded by a stick, a very big stick, and we’ve been able to watch how it responded,” Professor King said. “We see the earth as being tremendously dynamic and always changing, responding to the forces.” Such dynamism – involving rocks hundreds of kilometres below the surface moving “like honey” – could have implications for volcanoes in the region, Professor King said. (see here)
Figure 3) Map of the Seal Nunataks 16 Semi-active volcanoes relative to the three Larsen Ice Shelf segments, “A”, “B”, and “C” (see here). Also, a historical aerial photo of several Seal Nunatak volcanic cones pushing up through the Larsen Ice Shelf.
It is clear that the vast Seal Nunataks’ volcanic plateau at the very least pulsed significant amounts of heat and likely heated fluid flow in the following years: 1893, 1968, 1982, 1985, 1988, 2002, 2010.
The next key local area on the Figure 1 map is the portion labeled “6 Semi-Active Volcanoes”, of which two are seamounts (seafloor volcanoes) and four are land volcanoes. All of these geological features are known to be currently emitting heat and heated fluid flow; however, the rate and volume of this flow is not well understood. The most noteworthy feature is Deception Island, which is a huge six-mile-wide collapsed land volcano (caldera).
This volcanic feature extends a great distance outward and downward into the surrounding ocean, was earthquake-active in 1994, 1995 and 1996, and has moderately erupted in 1820, 1906, 1910, 1967, 1969 and 1992.
Early explorers used the harbor created by this collapsed volcano, however, on occasion they had to abandon their moorings when the seawater in the harbor boiled (see quote below). More modern research stations in the 1960s had to temporarily abandon the island due to moderate eruptions.
“The fifth volcano, off the northern tip of the Antarctic Peninsula, is a crater that has been ruptured by the sea to form a circular harbor known as Deception Island. Beginning in the 1820s, it was used as shelter by sealing fleets from New England and later by whalers.
On occasion, water in that harbor has boiled, peeling off bottom paint from the hulls of ships that did not escape in time. An eruption a decade ago damaged research stations established there by both Britain and Chile.
This volcano and the two newly discovered ones on the opposite side of the peninsula, the longest on earth, are thought to be formed by lava released from a southeastward-moving section of the Pacific floor that is burrowing under the peninsula in the same process thought to have formed the Andean mountain system farther north, in South America.” (see here)
The last two key local areas on the Figure 1 map are labeled “1 Erupting and 5 Semi-Active Volcanoes” and “3 Erupting Volcanoes”. These two areas represent major currently erupting land volcanoes that are spewing huge amounts of ash into the atmosphere and, most importantly, massive amounts of heat and heated fluid flow into the surrounding ocean (see here and here).
These ongoing eruptions all lie along, and are generated by, deep earth faulting associated with the northern extension of the West Antarctic Rift / Fault System (red lines on Figure 1 map). The reader is directed to previous Climate Change Dispatch articles detailing heat flow and volcanic activity along this 5,000-mile-long West Antarctic fault system (see here and here).
Reviewing how mega-geological forces drive Earth’s internal heat engine also has direct bearing on what is fueling the Larsen Ice Shelf Break-up as follows:
- Earth has undergone an extremely active period of volcanic and earthquake activity during the last three years, especially along major deep-ocean fault systems such as those associated with the Pacific Rim of Fire and the Icelandic Mid-Atlantic Ocean Rift. It makes perfect sense that the West Antarctic Rift / Fault System, which underlies the Larsen Ice Shelf, has also become more active during this time frame.
- The 2015-2016 El Niño Ocean “Warm Blob” has now been proven to be caused / generated by “natural forces”, and not manmade or other purely atmospheric forces. These natural forces are almost certainly geological as per numerous previous Climate Change Dispatch articles (see here and here). If geological forces have the power to warm the entire Pacific Ocean, they can certainly act to warm the ocean beneath the Larsen Ice Shelf.
- Climate scientists favoring manmade Global Warming continue to force fit all anomalous warming events into an atmospheric framework because this is the only abundant data source they have available. However, there is very little global atmospheric data that supports rapid local Larsen Ice Shelf melting or local rapid ocean warming. Most global atmospheric data indicates that the Antarctic atmosphere is cooling or not changing temperature.
- The surface of our planet is 70% water and 90% of all active volcanoes are present on the floor of Earth’s oceans. Quite amazingly, only 3-5% of the ocean floors have been explored by human eyes, and virtually none of this area is monitored. This is especially true in the nearly unexplored / completely unmonitored deep regions of the oceans in and around the Larsen Ice Shelf.
- It just makes sense that major rift / fault zones that form the boundaries of Earth’s outer crustal plates, and that have the power to move entire continents 1-2 inches per year, certainly have the power to warm oceans as per the Plate Climatology Theory, including the Weddell Sea, which surrounds the Larsen Ice Shelf.
In summary, huge amounts of research and other readily available information clearly indicate that the Larsen Ice Shelf lies within a geologically active region. Media reports that do not mention this aspect relative to the potential cause of bottom melting and subsequent break-up of the Larsen “A”, “B”, and “C” glaciers are best characterized as “Fake News” and not “97% Proven / the Debate is Over” news.
Thankfully there are smaller media venues such as Climate Change Dispatch that provide scientists with a platform to present viable alternative explanations to complicated climate and climate-related events, specifically in this case…. Antarctica’s Larsen Ice Shelf Break-Up is Fueled by Geological Heat Flow and Not Climate Change.
James Edward Kamis is a Geologist and AAPG member of 42 years with a B.S. and M.S. in geology who has always been fascinated by the connection between Geology and Climate. More than 12 years of research / observation have convinced him that the Earth’s Heat Flow Engine, which drives the outer crustal plates, is also an important driver of the Earth’s climate. The Plate Climatology Theory (plateclimatology.com) was recently presented / published at the annual 2016 American Meteorological Society Conference in New Orleans, LA. (see here)
http://news.nationalgeographic.com/2016/11/foehn-winds-melt-ice-shelves-antarctic-peninsula-larsen-c/ Warm Winds not Climate Change but from Geologically Warmed Ocean.
http://www.seeker.com/three-volcanoes-erupting-nasa-satellite-2031377713.html Three Volcanoes North of Antarctica Erupt at Once
http://earthobservatory.nasa.gov/IOTD/view.php?id=87995 Bristol Island Eruption May 1, 2016
http://www.ldeo.columbia.edu/research/blogs/operation-icebridge-scientists-map-thinning-ice-sheets-antarctica Lamont Doherty Involvement in Operation Ice Bridge Antarctica
http://www.academia.edu/5715929/Volcanic_tremors_at_Deception_Island_South_Shetland_Islands_Antarctica Deception Island South Shetland Islands Antarctica
http://www.smh.com.au/environment/fire-and-ice-melting-antarctic-poses-risk-of-volcanic-activity-study-shows-20140520-zrj06.html Mantle Under Larsen Shelf Rises and Activates Volcanoes.
http://earthsky.org/earth/new-glimpse-of-geology-under-antarcticas-ice Bentley Subglacial Trench in West Antarctica
https://www.nasa.gov/pdf/121653main_ScambosetalGRLPeninsulaAccel.pdf Map Larsen Ice Shelf
http://www.sciencedirect.com/science/article/pii/089598119090022S Deception Island and Bransfield Strait
Cenovus expects up to $2.5 billion from sale of Weyburn, Palliser assets: Report
John Tilak, Reuters
July 12, 2017
Image: Cenovus Energy / The Canadian Press
Oil and gas producer Cenovus Energy Inc (CVE.TO) has hired investment banks to sell the Weyburn and Palliser oil assets, which it hopes will fetch as much as $2.5 billion, according to people familiar with the situation.
The moves are part of a push to sell assets to pay down debt Cenovus took on to help fund the $16.8-billion purchase of oil and gas businesses from ConocoPhillips (COP.N).
Cenovus is working with Toronto-Dominion Bank to sell Weyburn, and with Credit Suisse and Scotiabank to sell the Palliser assets, two people said this week, declining to be named as the appointments are not public.
The company last month identified Weyburn and Palliser as additional assets it would look to sell and raised its total divestiture target to $4 billion to $5 billion, from $3.6 billion earlier.
Cenovus expects to generate as much as $1.5 billion for Weyburn and about $1 billion for Palliser, the people said. The deals could be announced in the fourth quarter, the company has said.
If successful, these divestitures could help Cenovus reach up to half of its asset sales targets.
Cenovus and Credit Suisse declined to comment, while TD and Scotiabank did not immediately respond to Reuters’ request for comments.
The Weyburn assets in southern Saskatchewan are attractive because of their light oil output and low decline, characteristics that might lead to a robust bidding process, the people said.
Located in southeast Alberta, the Palliser block is a conventional tight-oil asset. Cenovus said last year it had identified 700 drilling locations in the block with what it said is high return potential.
The company is also selling its Pelican Lake and Suffield conventional oil and gas assets.
Cenovus is under pressure because of the debt it incurred with the transformational acquisition of oil and gas assets. Its share price is down about 47 per cent since the deal was announced in March, while the company said last month that Chief Executive Brian Ferguson, who had championed the transaction, would retire.
Some investors have questioned whether the company can meet its asset sales targets, given that other oil firms are seeking to sell Canadian assets.
StatsCan study provides a reality check about fossil fuels and climate change
By Gordon Jaremko
July 12, 2017, 2:50 p.m.
Image: Joey Podlubny/JWN
Before highways, airlines and the Internet compressed vast landscapes into snapshots flashing by sealed windows, soft seats, climate-controlled interiors and video screens, earth scientists like Sidney Ells recorded the resource in unaided natural perspective as a remote fraction of northern Canada.
To do the first survey of the bitumen mine district, Ells endured a marathon by boat and on foot. Half a century later, he recorded the human side of the science in a memoir published by the federal Department of Mines and Technical Surveys, titled Reflections on the Development of the Athabasca Oil Sands.
Just to reach the expedition’s jumping-off point was a 120-kilometre trek north from Edmonton to Athabasca Landing. The easiest leg of the adventure—paddling a nine-metre scow and 6.6-metre freight canoe for 384 kilometres downstream on the Athabasca River to Fort McMurray—took nine days.
The return voyage south was a 23-day struggle. His team wrestled rope along rough riverbanks to tug the scow upstream, laden with a nine-tonne cargo of samples and equipment. The ordeal, called “tracking,” was a Canadian version of the self-abusive sled “man-hauling” suffered on Robert Scott’s fatal British forced march to the South Pole in 1911-12.
“Scow tracking south of McMurray was anything but child’s play,” Ells recalled.
“Harnessed to the heavy tracking line, men fought their way grimly along rough boulder-strewn beaches or through a tangle of overhanging brush, often ankle-deep in mud or waist-deep in water. The ceaseless torture of myriads of flies from daylight until dark and the heavy work, which only the strongest could long endure, made tracking one of the most brutal forms of labor.”
The survey’s northern base had not been soft either, Ells recalled. “In 1913 McMurray consisted of a dozen primitive log cabins, a bug-infested hovel proudly referred to as the ‘hotel,’ and during the summer months, many Indian tepees and tents. Everywhere starving train [work] dogs roamed at will, and the greatest care for the protection of food and other supplies was essential.”
In the virgin state seen by Ells, the bitumen belt was not paradise. Cars, planes, trains and electronic communications long ago ended pioneer hardship. But a realistic perspective on oilsands industrialization is available thanks to a 340-page Statistics Canada survey, The changing landscape of Canadian metropolitan areas.
Since Alberta bitumen production began in 1967, the provincial and federal energy departments report that mining and upgrading complexes disrupted 895 square kilometres of forests and muskeg swamps. But southern and central Canada did not stand still to preserve nature.
Since 1970 urban sprawl has chewed up 8,895 square kilometres of Canadian terrain, 10 times more of the countryside than oilsands excavation north of Fort McMurray. The total space taken by census metropolitan areas—inner cities, suburbs and edge communities—grew by 157 per cent to 14,546 square kilometres from 5,651.
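The figures in the last two paragraphs are internally consistent, as a quick calculation shows. All numbers below are taken directly from the article; the script only checks the arithmetic:

```python
# Sanity-check the urban sprawl figures quoted from the StatsCan survey.
metro_1970 = 5_651   # sq km, census metropolitan areas as of 1970
metro_2011 = 14_546  # sq km, census metropolitan areas now
oilsands = 895       # sq km disrupted by oilsands mining and upgrading since 1967

growth = metro_2011 - metro_1970
print(growth)                               # 8895 sq km of new urban area
print(f"{growth / metro_1970:.0%} growth")  # about 157%
print(f"{growth / oilsands:.1f}x the oilsands footprint")  # about 10x
```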
Alberta economic growth propelled by the petroleum industry added 752 square kilometres to Edmonton and 427 to Calgary. But Toronto developers left oilsands miners in the dust by adding 1,189 square kilometres to its built-up metro area. Montreal grew by 816 square kilometres, Vancouver by 503, and Ottawa by 417.
Population growth is only one driver of urban sprawl. Personal satisfaction contributes. In places where the ideal of owning a home that feels like a castle stays affordable, Canadians think big, the national landscape survey reports.
“More than half of homes built since 2001 were over 1,500 square feet compared to a quarter of homes built before 1978. As well, 13 per cent of homes built since 2001 were over 2,500 square feet compared to five per cent of homes built before 1978,” reads the report.
Bitumen pits and plants occupy borrowed territory, Alberta Energy and Natural Resources Canada point out. The loans last for decades. But production only happens if the sponsors commit to reclaiming the disturbed areas eventually to a naturally productive condition.
Scars of urbanization run deep and seldom heal, Statistics Canada observes. Unlike Ells and his industrial heirs who had to settle for buried treasure in harsh places, city builders concentrate on the most congenial and productive land, water and climates.
The cityscape survey, part of a Statistics Canada series titled Human Activity and the Environment, spotlights consequences of urban evolution over the past half-century.
“The transformation from more natural covers to built-up landscapes, characterized by a high percentage of impervious surfaces including roadways, parking lots and roof tops, increases storm water runoff, creates urban heat islands and reduces the number and diversity of animals and native plants,” the urban landscape review says.
“While Canada’s built-up area represented only 0.1 per cent of the country’s total area in 2011, urban expansion results in the loss of prime agricultural land because numerous communities across the country were originally established on fertile agricultural land,” Statistics Canada reports.
“The expansion and intensification of built-up area also results in the loss of green space and natural land covers. These changes are normally permanent—once agricultural or natural land is used for urban purposes, it is unlikely to return to a natural state.”
The survey includes a lesson: blame for climate change cannot be laid on fossil fuel extraction and refining alone. The industry grew in response to popular demand for its products.
“Workers living in newer homes generally have to travel farther to get to work,” Statistics Canada observes. “These trends have an impact on the environment—motor vehicle use by households is responsible for more than half of household greenhouse gas emissions, accounting for one-tenth of Canada’s total greenhouse gas emissions.”
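Taken together, the two fractions Statistics Canada quotes imply that households overall account for roughly a fifth of national emissions. This is a back-of-envelope combination of the article’s two figures, treating “more than half” as roughly one half:

```python
# Combine the two quoted fractions: motor vehicles are > half of household
# GHG emissions, and one-tenth of Canada's total emissions.
vehicle_share_of_household = 0.5   # "more than half" (approximated as half)
vehicle_share_of_national = 0.10   # "one-tenth of Canada's total"

# If vehicles are half of household emissions and a tenth of national ones,
# households overall account for about a fifth of the national total.
household_share_of_national = vehicle_share_of_national / vehicle_share_of_household
print(f"Households account for about {household_share_of_national:.0%} of national emissions")
```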
Why Commodity Traders Are Fleeing the Business
The number of trading houses has dwindled, and the institutional, pure-play commodity hedge funds that remain are few.
By Shelley Goldberg
July 12, 2017, 3:00 AM CST (updated 11:32 AM CST)
Copper, the “beast” of commodities.
Photographer: John Guillemin/Bloomberg
Profiting from commodity trading often requires a combination of market knowledge, luck, and most importantly, strong risk management. But the number of commodity trading houses has dwindled over the years, and the institutional, pure-play commodity hedge funds that remain — and actually make money — can be counted on two hands. Here is a list of some of the larger commodity blow-ups:
- The largest and most successful commodity trading house in its day caved, triggered by copper trading
- The New York branch of this large German conglomerate lost $1.5 billion in heating oil and gasoline derivatives
- Yasuo Hamanaka blamed for $2.6 billion loss in copper scandal
- Dissolved after misreporting natural gas trades, resulting in the fall from grace of Arthur Andersen, a “Big 5” accounting firm
- Energy hedge fund folded after losing over $6 billion on natural gas futures
- One of the best-performing hedge funds in 2011, closed its doors in 2012, shrinking from $2 billion to $1.2 billion on crude oil bets
- Brevan Howard Asset Management: one of the largest hedge funds globally; closed its $630 million commodity fund after having run well over $1 billion of a $42 billion fund
- The sister and energy trading arm of Phillip Brothers, ranked (1980) the 15th largest U.S. company, dissolved
- Vermillion Asset Management: private-equity firm Carlyle Group LP split with the founders of its Vermillion commodity hedge fund, which shrank from $2 billion to less than $50 million
Amid the mayhem, banks held tightly to their commodity desks in the belief that there was money to be made in this dynamic sector. The trend continued until the implementation of the Volcker rule, part of the Dodd-Frank Act, which went into effect in April 2014 and disallowed short-term proprietary trading of securities, derivatives, commodity futures and options for banks’ own accounts. As a result, banks pared down their commodity desks, but maintained the business.
Last week, however, Bloomberg reported that Goldman Sachs was “reviewing the direction of the business” after a multi-year slump and yet another quarter of weak commodity prices.
In the 1990s boom years, commodity bid-ask spreads were so wide you could drive a freight truck through them. Volatility came and went, but when it came it was with a vengeance, and traders made and lost fortunes. Commodity portfolios could be up or down about 20 percent within months, if not weeks. Although advanced trading technologies and greater access to information have played a role in the narrowing of spreads, there are other reasons specific to the commodities market driving the decision to exit. Here are the main culprits:
- Low volatility: Gold bounces between $1,200 and $1,300 an ounce, WTI crude straddles $45 to $50 per barrel, and corn is wedged between $3.25 and $4 a bushel. Volatility is what traders live and breathe by, and the good old days of 60 percent and 80 percent volatility are now hard to come by. Greater efficiency in commodity production and consumption, better logistics, substitutes and advancements in recycling have reduced the concern about global shortages. Previously, commodity curves could swing from a steep contango (normal curve) to a steep backwardation (inverted curve) overnight, and with seasonality added to the mix, curves resembled spaghetti.
- Correlation: Commodities have long been considered a good portfolio diversifier given their non-correlated returns with traditional asset classes. Yet today there’s greater evidence of positive correlations between equities and crude oil and Treasuries and gold.
- Crowded trades: These are positions that attract a large number of investors, typically in the same direction. Large commodity funds are known to hold huge positions, even if these only represent a small percent of their overall portfolio. And a decision to reverse the trade in unison can wipe out businesses. In efforts to eke out market inefficiencies, more sophisticated traders will structure complex derivatives with multiple legs (futures, options, swaps) requiring high-level expertise.
- Leverage: Margin requirements for commodities are much lower than for equities, meaning the potential for losses (and profits) is much greater in commodities.
- Liquidity: Some commodities lack liquidity, particularly when traded further out along the curve, to the extent there may be little to no volume in certain contracts. Futures exchanges will bootstrap contract values when the markets close, resulting in valuations that may not reflect physical markets and grossly swing the valuations on marked-to-market portfolios. Additionally, investment managers are restricted from exceeding a percentage of a contract’s open interest, meaning large funds are unable to trade the more niche commodities such as tin or cotton.
- Regulation: The Commodity Futures Trading Commission and the Securities and Exchange Commission have struggled and competed for years over how to better regulate the commodities markets. The financial side is far more straightforward, but the physical side poses many insurmountable challenges. As such, the acts of “squeezing” markets through hoarding and other mechanisms still exist. While the word “manipulation” is verboten in the industry, it has reared its head over time. Even with heightened regulation, there’s still room for large players to maneuver prices — for example, Russians in platinum and palladium, cocoa via a London trader coined “Chocfinger,” and a handful of Houston traders with “inside” information on natural gas.
- Cartels: Price control is a fact not only in crude oil, where prices are influenced by the Organization of Petroleum Exporting Countries, but also in markets such as diamonds and potash, where looser, less formally defined cartels persist.
- It’s downright difficult: Why was copper termed “the beast” of commodities, a name later applied to natural gas? Because it’s seriously challenging to make money trading commodities. For one, their idiosyncratic characteristics can make price forecasting practically impossible. Weather events such as hurricanes and droughts, and their ramifications, are difficult to predict. Unanticipated government policy, such as currency devaluation and the implementation of tariffs and quotas, can cause huge commodity price swings. And labor movements, particularly strikes, can turn an industry on its head. Finally, unlike equity prices, which tend to trend up gradually like a hot air balloon but face steep declines (typically from negative news), commodities have the reverse effect — prices typically descend gradually, but surge when there’s a sudden supply shortage.
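The leverage point above is worth making concrete. A minimal sketch, using illustrative margin rates that are assumptions rather than figures from the article (roughly 50 percent initial margin for stocks versus a few percent for futures), shows why the same price move hits a commodity account so much harder:

```python
# Illustrative only: the margin rates below are assumptions, not from the article.

def leverage_from_margin(margin_rate):
    """Effective leverage: notional exposure controlled per dollar of margin posted."""
    return 1.0 / margin_rate

def pnl_on_margin(price_move_pct, margin_rate):
    """Return on posted margin for a given fractional move in the underlying."""
    return price_move_pct * leverage_from_margin(margin_rate)

equity_margin = 0.50   # assumed 50% initial margin for a stock purchase
futures_margin = 0.05  # assumed 5% margin for a commodity futures position

# The same 2% adverse move in the underlying:
print(pnl_on_margin(-0.02, equity_margin))   # about -4% on margin
print(pnl_on_margin(-0.02, futures_margin))  # about -40% on margin
```

With only 5 percent down, a 2 percent move in the commodity wipes out 40 percent of the margin posted, which is the sense in which lower margin requirements magnify both losses and profits.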
What are the impacts? The number of participants in the sector will likely drop further, but largely from the fundamental side, as there’s still a good number of systematic commodity traders who aren’t concerned with supply and demand but only with the market’s technical aspects. This will keep volatility low and reduce liquidity in some of the smaller markets. But this is a structural trend that feasibly could reverse over time. The drop in the number of market makers will result in inefficient markets, more volatility and thus, more opportunity. And the reversal could come about faster should President Donald Trump succeed in jettisoning Dodd-Frank regulations.
(Corrects attribution of Goldman’s review of commodity operations in third paragraph.)
Bloomberg Prophets: Professionals offering actionable insights on markets, the economy and monetary policy. Contributors may have a stake in the areas they write about.
Saskatchewan has a rare helium deposit. See “Weil Group eyes growing helium market with $10-million plant in Mankota, Saskatchewan.” It has been known for a while: see “Helium Prospects in Southwest Saskatchewan” by H.B. Sawatzky, R.G. Agarwal and W. Wilson (1960).
BHP Billiton released an item today, “Our plan to grow value.” See http://www.bhp.com/investor-centre/our-plan-to-grow-value?utm_source=Subscribers&utm_medium=Organic&utm_campaign=PlanToGrowValue&utm_content=Overview
Item 4 within it includes:
Our Jansen Potash Project in Canada is one of the best undeveloped potash resources in the world. We have made significant progress on the development of the shafts as part of our phased approach to increase optionality and reduce risk. We could seek Board approval for an initial 4 million tonnes per annum stage as early as June 2018, with possible first production from FY2023.
The link under item 4 goes to a PPT presentation.
Fear of radiation is more dangerous than radiation itself
By David Ropeik
He is an instructor in the environmental programme of the Harvard Extension School, and an author, consultant and public speaker who focuses on risk perception, communication and management. His latest book is How Risky Is it, Really? Why Our Fears Don’t Always Match the Facts (2010). He lives near Boston, Massachusetts.
The fear of ionising (nuclear) radiation is deeply ingrained in the public psyche. For reasons partly historical and partly psychological, we simply assume that any exposure to ionising radiation is dangerous. The dose doesn’t matter. The nature of the radioactive material doesn’t matter. The route of exposure – dermal, inhalation, ingestion – doesn’t matter. Radiation = Danger = Fear. Period.
The truth, however, is that the health risk posed by ionising radiation is nowhere near as great as commonly assumed. Instead, our excessive fear of radiation – our radiophobia – does more harm to public health than ionising radiation itself. And we know all this from some of the most frightening events in modern world history: the atomic bombings of Japan, and the nuclear accidents at Chernobyl and Fukushima.
Much of what we understand about the actual biological danger of ionising radiation is based on the joint Japan-US research programme called the Life Span Study (LSS) of survivors of Hiroshima and Nagasaki, now underway for 70 years. Within 10 kilometres of the explosions, there were 86,600 survivors – known in Japan as the hibakusha – and they have been followed and compared with 20,000 non-exposed Japanese. Only 563 of these atomic-bomb survivors have died prematurely of cancer caused by radiation, an increased mortality of less than 1 per cent.
While thousands of the hibakusha received extremely high doses, many were exposed to moderate or lower doses, though still far higher than those received by victims of the Chernobyl or Fukushima nuclear accidents. At these moderate or lower doses, the LSS found that ionising radiation does not raise rates of any disease associated with radiation above normal rates in unexposed populations. In other words, we can’t be sure that these lower doses cause any harm at all, but if they do, they don’t cause much.
And regardless of dose, the LSS has found no evidence that nuclear radiation causes multi-generational genetic damage. None has been detected in the children of the hibakusha.
Based on these findings, the International Atomic Energy Agency estimates that the lifetime cancer death toll from the Chernobyl nuclear accident might be as high as 4,000, two-thirds of 1 per cent of the 600,000 Chernobyl victims who received doses high enough to be of concern. For Fukushima, which released much less radioactive material than Chernobyl, the United Nations Scientific Committee on the Effects of Atomic Radiation (UNSCEAR) predicts that ‘No discernible increased incidence of radiation-related health effects are expected among exposed members of the public or their descendants.’
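The percentages in the two paragraphs above follow directly from the figures cited, and a quick arithmetic check (using only the numbers given in the text) confirms them:

```python
# Figures from the text: 563 radiation-attributed cancer deaths among 86,600
# followed Hiroshima/Nagasaki survivors, and an upper-bound estimate of 4,000
# lifetime cancer deaths among the 600,000 most-exposed Chernobyl victims.
lss_excess = 563 / 86_600          # fraction of survivors, well under 1 per cent
chernobyl_bound = 4_000 / 600_000  # roughly two-thirds of 1 per cent

print(f"{lss_excess:.2%}")       # 0.65%
print(f"{chernobyl_bound:.2%}")  # 0.67%
```

Both fractions land below one per cent, matching the “less than 1 per cent” and “two-thirds of 1 per cent” characterizations in the text.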
Both nuclear accidents have demonstrated that fear of radiation causes more harm to health than radiation itself. Worried about radiation, but ignoring (or perhaps just unaware of) what the LSS has learned, 154,000 people in the area around the Fukushima Daiichi nuclear plants were hastily evacuated. The Japan Times reported that the evacuation was so rushed that it killed 1,656 people, 90 per cent of whom were 65 or older. The earthquake and tsunami killed only 1,607 in that area.
The World Health Organization found that the Fukushima evacuation increased mortality among elderly people who were put in temporary housing. The dislocated population, with families and social connections torn apart and living in unfamiliar places and temporary housing, suffered more obesity, heart disease, diabetes, alcoholism, depression, anxiety, and post-traumatic stress disorder, compared with the general population of Japan. Hyperactivity and other problems have risen among children, as has obesity among kids in the Fukushima prefecture, since they aren’t allowed to exercise outdoors.
Though Chernobyl released far more radioactive material than Fukushima, fear caused much more health damage still. In 2006, UNSCEAR reported: ‘The mental health impact of Chernobyl is the largest public health problem caused by the accident to date … Rates of depression doubled. Post-traumatic stress disorder was widespread, anxiety and alcoholism and suicidal thinking increased dramatically. People in the affected areas report negative assessments of their health and wellbeing, coupled with … belief in a shorter life expectancy. Life expectancy of the evacuees dropped from 65 to 58 years. Anxiety over the health effects of radiation shows no signs of diminishing and may even be spreading.’
The natural environment around the Chernobyl and Fukushima Daiichi accidents adds evidence that ionising radiation is less biologically harmful than commonly believed. With people gone, those ecosystems are thriving compared with how things were before the accidents. Radiation ecologists (a field of study that blossomed in the wake of Chernobyl) report that radiation had practically no impact on the flora and fauna at all.
The risk from radiophobia goes far beyond the impacts in the immediate area around nuclear accidents. Despite the fact that radiation released from Fukushima produced no increase in radiation-associated diseases, fear of radiation led Japan and Germany to close their nuclear power plants. In both nations, the use of natural gas and coal increased, raising levels of particulate pollution and greenhouse gas emissions.
Neither country will meet its 2020 greenhouse gas emissions-reduction targets. Across Europe, fear of radiation has led Germany, France, Spain, Italy, Austria, Sweden and Switzerland to adopt policies that subsidise solar, wind and hydropower over nuclear as a means of reducing greenhouse gas emissions, despite the fact that most energy and climate-change experts say that intermittent renewable energy sources are insufficient to solve the problem. In the United States, 29 state governments subsidise wind and solar power, but only three offer incentives for nuclear, which produces far more clean power, far more reliably.
Fear of radiation has deep roots. It goes back to the use of atomic weapons, and our Cold War worry that they might be used again. Modern environmentalism was founded on fear of radioactive fallout from atmospheric testing of such weapons. A whole generation was raised on movies and literature and other art depicting nuclear radiation as the ultimate bogeyman of modern technology. Psychologically, research has found that we worry excessively about risks that we can’t detect with our own senses, risks associated with catastrophic harm or cancer, risks that are human-made rather than natural, and risks that evoke fearful memories, such as those evoked by the very mention of Chernobyl or Three Mile Island. Our fear of radiation is deep, but we should really be afraid of fear instead.