Tuesday, December 10, 2013





Highly radioactive snow with counts of up to 65 CPM was measured with a Geiger counter in January of 2013. This snow fell in an area 25 km away from Fukushima Daiichi. Minamisōma is a city located in Fukushima Prefecture, Japan.


As of May 1, 2011, the city had an estimated population of 68,745 and a population density of 60 persons per km². The total area is 398.50 km². The Japanese government says this area is 'safe' to live in. As of 2012, residents are being invited to 'return' to this 'safe' area. Minamisōma is about 25 kilometres (16 miles) north of the Fukushima I Nuclear Power Plant, the site of the nuclear accident that followed the 2011 Tōhoku earthquake and tsunami.

Geiger counter (at the Wiki page, not in the video above) showing radiation at Minamisoma: 0.532 μSv/h. This equates to an annual radiation dose of 4.66 millisieverts, compared to the government's criterion for the 'safe' return of residents of 20 millisieverts per year. 
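As a quick sanity check on that conversion, here is a minimal Python sketch; the readings and the 20 mSv/yr figure come from the text above, the assumption of constant exposure over the year's 8,760 hours and the helper name are ours:

```python
# Convert a dose-rate reading in microsieverts per hour into an annual dose in millisieverts.
HOURS_PER_YEAR = 24 * 365  # 8,760 hours

def annual_dose_msv(rate_usv_per_hour: float) -> float:
    """Annual dose in mSv, assuming constant exposure at this rate all year."""
    return rate_usv_per_hour * HOURS_PER_YEAR / 1000.0  # microsieverts -> millisieverts

print(annual_dose_msv(0.532))         # ~4.66 mSv/yr for the Minamisoma reading above
print(annual_dose_msv(0.532) / 20.0)  # ~0.23, i.e. about a quarter of the 20 mSv/yr criterion
```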

Before the Fukushima Daiichi accident, background radiation exposure was a maximum of 1 millisievert per year, which was also the safe maximum limit for citizens. 

In the video above, we are detecting 'fresh' radiation in the snow in Japan.  Could there be a pattern here? 

On Dec. 6th, 2013, at around 8 PM, we did a swipe test of rain droplets on a mailbox in Northern California. Using a pancake-style Geiger counter from Aware Electronics (Model RM-80) with a plastic layer on top of it (which blocks all alpha radiation), this wet paper towel gave a high radiation reading of 291 uR/Hr, with an average of 265. The pancake Geiger detector is hooked into a computer, which provides the graphs below. 

The normal background radiation registered between 9-37 uR/Hr, as shown in the first six graphed points below. 



The next test was putting a thick newspaper between the pancake detector and the wet paper towel. This reduced the radiation readings by a considerable amount, to 53 uR/Hr, approximately twice background radiation. 



The third test was putting a thick paperback book between the wet paper towel and the detector. All of the radiation was blocked by the book and the detector went back to normal background radiation levels, in the 20s-30s.

Then we took the same wet paper towel sample and put it back on the plastic bag on top of the detector, and the reading went right back up to 260-290 uR/Hr.

So what can we learn from this radiation test of the rain in California, done in 2013? (We recommend all you radiation testers out there do it this same way, to isolate the TYPE of radiation that you are detecting without the expensive equipment that someone like Antiproton has.)
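For anyone repeating this at home, here is a rough Python sketch of the shielding logic described above. The thresholds and labels are our own illustrative assumptions, not calibrated physics; the readings fed in are the ones from the December 6 test.

```python
# Rough classifier for the swipe-test procedure described above: compare readings taken
# through the plastic layer (alpha already blocked), through a newspaper, and through a
# thick book against background, and guess which radiation types dominate the sample.

def classify_radiation(background, bare, newspaper, book):
    """All readings in uR/Hr; returns a list of likely radiation types (illustrative only)."""
    findings = []
    if bare > 2 * background and newspaper < 0.5 * bare:
        findings.append("beta (large drop once a newspaper is interposed)")
    if book > 2 * background:
        findings.append("gamma (still elevated behind a thick book)")
    if bare > 2 * background and not findings:
        findings.append("elevated, but type unclear")
    return findings or ["background only"]

# Readings from the December 6, 2013 test described above:
print(classify_radiation(background=30, bare=291, newspaper=53, book=30))
```

With the numbers above, the sketch flags the sample as predominantly beta, which is consistent with the readings falling back to background once the book was interposed.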

Friday, December 6, 2013

FIVE DEFICITS : THE BANE OF INDIAN ECONOMY AND GROWTH STORY | INDIAN DREAMS Vs REALITY

The potential of the Indian growth story and the chance of India becoming a global economic superpower has been a hot topic among economists and analysts around the globe. The forty-year opportunity window is at India's doorstep, just waiting to be grabbed to fulfil the dreams and aspirations of 1 billion-plus Indians. This great opportunity window exists on the back of a huge demographic advantage of cheap skilled manpower in both the technical and managerial cadres, and of world economies looking for a new economic order in the face of diminishing old-order energy resources and the global warming associated with them.
The demographic advantage of today will become a major disadvantage after a few decades, when the bulk of the population becomes old. After all, no one can deny that after about 35-40 years of productive life, the same individual becomes a burden on national resources for the next 25-30 years. The number of aged individuals, and the non-productive years of those individuals, will keep increasing as a nation marches down the road to prosperity. The problems of the developed and industrialized economies mainly emanate from this crude reality of life. Whereas those countries fully utilized the opportunity window that came their way and grew old after becoming rich, it looks quite likely that India may grow old before becoming rich and prosperous, unless some wisdom prevails on the Indian polity and it starts addressing, sincerely and at the earliest, the five deficits that are the bane of the Indian economy and growth story. If the polity keeps harping on its vote-bank politics of subsidies, doles and freebies for another decade or so, India will have missed this golden opportunity, all because the nation could not produce leaders who could keep their baser instincts of lust for power and money away while serving and governing the nation.

Sunday, December 1, 2013

Graph of the Day: If all the ice melted


Homo sapiens sapiens, the species with the ironic name, is not known for long-term thinking. So if the prospect of Sandy-level storm surges happening every year (!) in a half century or so isn’t enough to get us to stop using the atmosphere as an open sewer for carbon pollution, then the prospect that we are going to melt all of the Earth’s landlocked ice and raise sea levels more than 200 feet over the next couple of millennia or so ain’t gonna do the trick.
Still, National Geographic has been one of the few major magazines to consistently warn the public about the risks posed by unrestricted carbon pollution. And who better to be alarmed about how we are going to destroy the nation’s geography than National Geographic? Unsurprisingly, the deniers and confusionists, including Bjorn Lomborg himself, have suggested that somehow Nat Geo’s concern is misplaced. Sadly, it isn’t.
The best science suggests that on our current CO2 emissions path, by 2100 we could well pass the tipping point that would make 200+ feet of sea level rise all but unstoppable — though it would certainly take a long time after 2100 for the full melt-out to actually occur.
That said, the text on Nat Geo’s graphic is a little confusing and has the unfortunate effect of suggesting that we would need 22°F of global warming to melt all the ice on the planet, when that’s not what the paleoclimate record suggests.
The confusionists are preternaturally confused by all this. A leading denier website actually cites current data on sea ice (!) to refute Nat Geo, even though it is only melting landlocked ice that raises sea levels.

Saturday, October 26, 2013

The Hidden Cost Of Solar And Wind Power In One Image From The UK

October 25, 2013, 8:57 AM
Wind turbines at Thanet Offshore Wind Farm off the Kent coast in southern England, September 23, 2010. (REUTERS)
In light of new plans to build the Hinkley Point C power station in southwest England, the U.K. energy ministry released an infographic that compares the amount of real estate the nuclear plant will occupy to the land demands of current wind and solar power facilities.
The new $26 billion Hinkley project is expected to provide about 5 percent of England’s energy generating capacity, while occupying only a fraction of the land area required by those other energy sources.

Saturday, October 19, 2013

‘Hidden fuel’ worth hundreds of billions: IEA’s new energy focus


In the debate about the energy sources of the future, fossil fuels and renewables easily steal the limelight. Energy efficiency, however, is far less flashy; languishing in the corners of policy makers’ minds as the ‘low-hanging fruit’, or the ‘hidden fuel’, of the low-carbon future.
But a new report by the International Energy Agency argues we should bring energy efficiency out of the shadows and start seeing it for what it is: the world’s “first fuel,” with a global market already worth hundreds of billions, and with huge energy and emissions saving potential.
The inaugural Energy Efficiency Market Report – released on Wednesday to join the ranks of the IEA market reports for oil, gas, coal and renewable energy – illustrates that the scale of global investment in energy efficiency and its contribution to energy demand are as significant as those of other developed supply-side resources.
“Energy efficiency has been called a ‘hidden fuel’, yet it is hiding in plain sight,” IEA Executive Director Maria van der Hoeven said as she presented the report at the World Energy Congress in Korea. “Indeed, the degree of global investment in energy efficiency and the resulting energy savings are so massive that they beg the following question: Is energy efficiency not just a hidden fuel but rather the world’s first fuel?”

Friday, October 18, 2013

A New Win-Win? CO2-eating Microalgae as a Biofuel Feedstock

While researchers remain skeptical about turning any form of microalgae into profitable biomass, an Australian company thinks it can do just that.


Successful microalgae-to-biodiesel conversion has been the goal of some renewable energy researchers for more than two decades. But after years of research on how best to grow these carbon dioxide (CO2)-loving plants in open ponds, a commercially viable solution has remained elusive.

SMITH: ‘LET’S TALK ABOUT WIND … AND BIOMASS AND SOLAR AND HYDRO’

So said Rep. Tony Klein at a recent joint legislative energy committee meeting. After hearing the recommendations of the Governor’s Energy Generation Siting Policy Commission, Rep. Klein asked if we are asking the right question. Is the issue the siting process, or is it the specific technologies?
The answer is both. The siting process (the Public Service Board Section 248 process) and the specific renewable energy technologies being deployed are both in need of review and re-evaluation.
The commission’s discussions were limited by their charge to look only at new electric generation, including fossil fuels such as a natural gas plant. In their numerous all-day deliberations, commissioners danced around talking about specific technologies, and focused on the process through which electricity generation is sited.
At the joint energy committee’s recent all-day meeting, legislators were thankfully open to hearing about the issues of “now” – already-sited generation projects – plus transmission lines, pipelines, energy efficiency and non-electricity energy developments that have been dominating some people’s lives as society seeks to develop yet more energy infrastructure (aka energy sprawl).
Towns and members of the public dealing with wind, biomass and solar renewable energy projects, and gas and electric transmission lines share common ground where the PSB process is concerned. The process requires an inordinate amount of money and time to participate. Citizens, through an accident of their location, must become involved in three rounds of pre-filed testimony, three rounds of discovery, opportunities for depositions, and last minute deals in the form of MOUs between ANR and developers, all on an immovable schedule with deadlines week after week after week. This makes for a daunting undertaking, even for attorneys paid to participate.
The imbalance at the PSB process is nothing new. The late activist Joe Bivins wrote before he died in 1997, “In this setting, no one represents the community’s interests in social or environmental justice. When the doors to the courthouse are opened only to the few, justice will not be found within at all.”
Of all the technologies, big wind turbines create the most issues. Biomass is next, followed by solar. Fortunately these issues are quantifiable and finite. Once identified, conversations can follow.
Jane Palmer of Monkton is one of those citizens. She spoke during the public comment period of the legislators’ meeting, and brought with her some, but not all of the paperwork she and her organic farmer husband have accumulated since they became intervenors in the Vermont Gas Pipeline case in April. In six months she has accumulated four large boxes full of paperwork, which she brought into the room on a hand truck.
The commission never discussed the details of the PSB’s process, or whether it is the right regulatory body to review the siting of renewable energy projects, which have local and regional land use impacts. There is a discussion to be had about the benefits of the current PSB process versus the Act 250 process, which is better equipped to address the kinds of issues raised by solar, biomass and wind installations.

Saturday, October 12, 2013

Federal official: Half of U.S. nuclear power comes from disarmed Russian warheads

Wednesday, October 9, 2013 20:20 EDT
Russian soldiers next to a model of a Russian Topol missile during a training session at the Serpukhov military missile forces research institute, some 100km outside Moscow. [AFP]
 
Uranium fuel from 20,000 disarmed Russian warheads is generating about half of US nuclear power in a spinoff from a landmark disarmament accord, a top US official said Wednesday.
But the deal under which 500 tonnes of Russian weapons-grade uranium has been used to light and heat American homes will end next month because Russia believes its former Cold War rival has been getting energy on the cheap.
Rose Gottemoeller, U.S. under secretary of state for arms control, told a UN committee the 1993 accord was a disarmament success.
Arms control experts call it the “megatons-to-megawatts” deal and hail the accord as a little known but important example of the United States and Russia pressing disarmament.
Gottemoeller called the Highly Enriched Uranium Purchase Accord a “significant non-proliferation accomplishment”.
Signed after the collapse of the Soviet Union, the deal was concluded as the two countries sought ways to get rid of warheads under their 1991 Strategic Arms Reduction Treaty.
The weapons level uranium is downgraded in Russia and the low enriched product “is delivered to the United States, fabricated into nuclear fuel and used by nearly all US nuclear power plants to generate half of the nuclear energy in the United States,” Gottemoeller said.
“Approximately 20,000 nuclear warheads have been eliminated under this unique government-industry partnership,” Gottemoeller told the UN committee.
Over the past 15 years, the Russian uranium fuel has accounted for about 10% of all electricity produced in the United States, she added.
The 500 tonnes of uranium traded is the equivalent of about 10 billion barrels of oil, according to experts. A smaller amount of uranium from disarmed American weapons is also used to generate power.
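As a rough back-of-envelope check on the experts' comparison, here is a short Python sketch; the conversion factors below are our own assumptions, not figures from the article:

```python
# Order-of-magnitude check: 500 tonnes of weapons-grade uranium vs. ~10 billion barrels of oil.
# Assumed conversion factors (not from the article):
FISSION_HEAT_J_PER_KG = 8.0e13   # roughly 80 TJ of heat per kg of U-235 fully fissioned
OIL_HEAT_J_PER_BARREL = 6.1e9    # roughly 6.1 GJ per barrel of oil equivalent

uranium_joules = 500_000 * FISSION_HEAT_J_PER_KG   # 500 tonnes = 500,000 kg
oil_joules = 10e9 * OIL_HEAT_J_PER_BARREL          # 10 billion barrels

print(f"uranium ~{uranium_joules:.1e} J, oil ~{oil_joules:.1e} J, ratio {uranium_joules/oil_joules:.2f}")
# Both land in the 1e19 J range, so "about 10 billion barrels" is the right order of magnitude.
```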
US officials will go to St Petersburg in November to mark the loading of the final Russian containers which should arrive in the United States in December, Gottemoeller said.
“We look forward to celebrating this historic achievement,” she said.
The United States tried to extend the accord, but Russia refused saying the price was too low, diplomats said.

Saturday, September 21, 2013

The Naked Truth About Nuclear Accident Insurance
Going without insurance is described as "going naked" in insurance industry lingo. Going without insurance for the worst hazards in the nuclear power industry is business as usual.

One need not look back very far to see the problem. In March 2011, the Fukushima-Daiichi nuclear power plant disaster, triggered by an earthquake followed by a tsunami that overwhelmed all of Japan's safeguards, melted down three reactors, displaced 160,000 people and caused an estimated $250 billion in damages and other still-unfolding economic consequences.

Naked America

Today, in the United States, we have 104 operating nuclear plants producing electricity. The owners, operators, and government regulators who oversee them say an event like Fukushima will not happen here. And even if it did, they insist, there is enough liability insurance in place to cover the damages. The actual amount of that insurance coverage: just $12.6 billion.

You don't need an advanced degree in calculus or risk analysis to see that something doesn't add up, and to start feeling a bit...naked. But when it comes to nuclear insurance, naked is the fashion designed for the American public. 

Wednesday, July 24, 2013

Solar PV: The true value of distributed energy


Earlier this month, the decision of APS – Arizona’s largest electric utility – to propose two major changes to its net metering program, one of which would do away with the program entirely, has become the latest lightning rod fueling the growing tension between utilities and solar PV. In California, there is significant debate about whether to raise net metering caps. In Texas, CPS Energy – the largest municipally owned utility in the country – has proposed a net metering alternative that some fear will substantially reduce the value of solar.
What is driving these conflicts? One major factor is that distributed energy resources, including distributed solar photovoltaics (DPV), have different physical, operational, and economic characteristics than conventional power plants. Such differences create potentially significant misalignments when they are added into a system designed for decades around the characteristics of conventional power plants.
Lack of understanding a barrier
At the root of all this is the lack of a clear understanding of the actual costs of integrating DPV onto the grid, and likewise, of the actual values that solar can provide to the grid. Without that foundation of understanding, it’s impossible to fairly evaluate policies such as net metering or its alternatives, and debates become based on opinion rather than fact.
An early step in creating that solid foundation is gleaning collective insight from the plethora of individual studies that have sought to identify and quantify the values DPV provides and, to a lesser extent, the costs it imposes on the system. Over the past several months, a team from the Electricity Innovation Lab (eLab) has done just that – reviewing more than 15 studies and synthesizing the results and implications in a new report, A Review of Solar PV Benefit & Cost Studies, released today. Here’s what we found:
  1. No study comprehensively evaluated the benefits and costs of DPV, although many acknowledge additional sources of benefit or cost and many agree on the broad categories of benefit and cost. There is broad recognition that some benefits and costs may be difficult or impossible to quantify, and some accrue to different stakeholders.

Utility Solar Is Dead; Long Live Distributed Generation

The shift from the centralized utility model is forcing utilities—for the first time in their existence—to figure out how to compete.

HARESH PATEL: JUNE 17, 2013
For years we’ve likened the energy sector to the computing world, holding up Moore’s law as a guiding example proving that renewables will achieve grid parity.
Today, as panel costs have dropped 90 percent and adoption is at an all-time high, the analogy between the two seems even more fitting. Just like the massive mainframe disruption spawned by personal computing, distributed generation has already begun to challenge the centralized solar model favored by utilities, with no end in sight.
At an industry level, the evidence of a new distributed era is all around us. Fuel cells like Bloom Energy’s are enabling the C&I transformation to self-made energy. Combined natural gas power plants are on the rise, and microgrids are popping up in states across the nation.
The change may feel sudden, but for most of us, it’s been a long time coming. 2009 marked the beginning of utility-scale’s heyday. Investors interested in deploying capital looked at smaller 1-megawatt to 3-megawatt projects and realized that utility-scale solar had the same diligence cost. Investors promptly abandoned the C&I segment in favor of big projects. Though it was a good decision at the time, the situation has changed. The number of utility projects has dwindled, and the shift from the centralized utility model has taken root and is forcing utilities -- for the first time in their existence -- to figure out how to compete.
There’s no doubt now that utilities will ultimately have to change their business models. In a recent discussion with a well-known utility, top executives admitted not only that the utility solar segment had plateaued, but that “utility is dead.” It sounded dramatic, but the sentiment has been the topic of discussion for the last twelve months, both in the media and in more hushed tones in closed meetings. So, how will utilities adapt?

Solar methanol and the third industrial revolution


The Hydrogen Economy has often been touted as the next big energy source. However, due to the prohibitive cost of its infrastructure, hydrogen has gone out of favour. Fairly recently, Professor George Olah, Nobel Laureate, has proposed using methanol (think methylated spirits). Methanol is a liquid at room temperature; it can be used in the pre-existing gasoline infrastructure and, unlike LNG, it can be transported by ordinary oil tanker.
However, the tantalising promise of methanol is that it can be used as a fuel in a fuel cell. Fuel cells can operate at an efficiency of 80 per cent, as against gas-fired steam turbines of 50 per cent, and coal-fired steam turbines of 40 per cent. These are best practice numbers; many Chinese coal-fired steam turbines are much less efficient than 40 per cent.
Because the fuel cell stack can be located close to the consumer, the 20 per cent “cogenerated” waste heat can be ducted to the consumer (for space heating) along with the electricity, resulting in 100 per cent fuel efficiency.
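In other words, per unit of fuel energy (a trivial worked example in Python, using only the efficiency figures quoted above; the variable names are ours):

```python
# Useful energy delivered per 1 kWh of fuel, using the efficiency figures quoted above.
fuel_kwh = 1.0

fuel_cell_electricity  = 0.80 * fuel_kwh  # fuel cell, 80% electrical efficiency
cogenerated_heat       = 0.20 * fuel_kwh  # remaining 20% ducted to the consumer as space heat
gas_steam_electricity  = 0.50 * fuel_kwh  # gas-fired steam turbine, best practice
coal_steam_electricity = 0.40 * fuel_kwh  # coal-fired steam turbine, best practice

print(fuel_cell_electricity + cogenerated_heat)       # 1.0 -> the article's "100% fuel efficiency"
print(gas_steam_electricity, coal_steam_electricity)  # 0.5 and 0.4 kWh of electricity, heat discarded
```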
The ideal fuel cell for methanol will consume the methanol directly and not need a (steam reformation) unit at its front end to preconvert the methanol into hydrogen and carbon dioxide.
Up to now, two fuel cell types seem to have emerged from the scrum, the solid polymer and the solid oxide. Both of these have crippling disadvantages.
Firstly, the solid polymer cell. The proton-exchange “nafion” polymer membrane fuel cell uses platinum. End of story! Also, platinum catalysts are prima donnas. They hate carbon dioxide. Even the tiniest amount of carbon dioxide that escapes the hydrogen scrubber will convert to carbon monoxide and “poison” the platinum. Methanol will “cross over” the electrolyte and combine with the oxygen directly, without forming an external circuit. This greatly reduces the output current.
Secondly, the Solid Oxide Fuel Cell (SOFC). The SOFC operates at 1000°C, continuously. It can use methanol, but it is so costly!
However … a couple of days ago, a South Korean research team at the Ulsan National Institute of Science and Technology disclosed a cheap, iodine-coated, graphene-based direct methanol fuel cell. They have reported that this fuel cell generates 33% more current than platinum, is unaffected by carbon monoxide, and does not display “methanol cross-over”. If it can be commercialised, this fuel cell could be the holy grail that will usher in Olah’s “Methanol Economy.”
Traditionally, methanol has been produced by the steam reformation of natural gas. However, there is another pathway. If carbon dioxide and hydrogen are processed in the presence of a copper/zinc oxide/alumina catalyst, the result is satisfactory levels of methanol and water.
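For reference, the overall reaction over that copper/zinc oxide/alumina catalyst is the standard methanol-synthesis equation (textbook chemistry rather than something spelled out in the article):

$$\mathrm{CO_2} + 3\,\mathrm{H_2} \;\longrightarrow\; \mathrm{CH_3OH} + \mathrm{H_2O}$$

The water by-product on the right is why the process yields methanol together with water.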

Straight from the sun: The renewables revolution has landed


Just after the latest round of climate change talks (in Bonn this time) had sort-of stalled, I took a walk to New York’s North Cove Marina. Manhattan narrows to a point at its southern end and juts out into the broad expanse of New York Harbour. The Marina is on the lower West Side, far enough down for the famous landmark of Ellis Island to be clearly visible. Just beyond is the Statue of Liberty; it was a mid-June Monday and the statue was bathed in the bluish haze of a warm humid late afternoon at the end of spring. A few lazy sailboats drifted in front of it. Further away, the high, bright-orange superstructure of a Staten Island ferry passed in front of the Verrazano Narrows bridge, itself a tiny latticework on the horizon, spanning the channel between Staten Island and Brooklyn, and guarding New York’s gateway to the sea. Closer to the shore, the Circle Line sightseeing boat passed by, as did the odd ferry across the Hudson, carrying commuters home to the New Jersey shore a mile or so away across the river.
The Planet Solar in New York’s North Cove Marina (pic: M. Robbins)
The Marina itself is tucked into the steel-and-glass canyons of modern Manhattan; over it looms the new Freedom Tower that has sprung from the ruins of September 11 2001. That afternoon the MS Tûranor Planet Solar had backed into her berth in the Marina after a long trip across the Atlantic to Florida and thence up the coast. The name Tûranor is taken from J.R.R. Tolkien; it is Elvish for Power of the Sun. They are not joking. Planet Solar is powered by an enormous solar array of about 5,600 square feet (519 sq m). Walking into the marina from the south, the 89-ton boat was instantly recognisable; she is actually a catamaran, with a totally flat superstructure bar a small blister for the bridge – the rest of her topside is solar cells.
The brainchild of Swiss eco-entrepreneur Raphael Domjan, in 2010-2012 she became the first solar boat to circumnavigate the globe. On this occasion, she had not come so far – across the Atlantic from La Ciotat on France’s Mediterranean coast. The trip had been accomplished solely on solar power; although she carries a back-up engine to recharge the batteries, she hadn’t needed it.
Planet Solar had not come to New York just to prove a point. On board was a team from the University of Geneva, led by Martin Beniston, Professor of Climate Change at the University and also director of its newly-established Institute of Environmental Sciences. On the night the Planet Solar arrived in Manhattan, the Swiss Consulate arranged a cheerful informal reception on board, and I found Professor Beniston unwinding with some excellent Swiss wines and cheeses.
Although Swiss, Professor Beniston was born in the UK and did his first degree at the University of East Anglia, where I did my own PhD  on climate change. The project, he explained, was to carry out research in the Gulf Stream into the mechanics of CO2 fluxes between the ocean and the atmosphere, and especially into the role of phytoplankton. “Because it’s a pollution-free boat, it will be ideal for the collection and analysis of samples,” he told me. “They won’t be contaminated.”
Later I climbed up to the bridge to greet the captain of the Planet Solar,  Gérard d’Aboville.  The vessel had had a hard time docking that afternoon in the confined space of the marina, and he could have been in a foul mood, but he wasn’t, or if he was, he hid it well. But then, not much bothers a man who has rowed singlehandedly across both the Atlantic and the Pacific, sat in the European Parliament and done much else besides. (He also once competed in the Paris-Dakar with his four brothers, each one riding a Kawasaki 250; so maybe the whole family is slightly mad.) Then I stood with my companion in the hatch and admired the solar array, which glowed carmine and orange as the sun sank slowly towards the New Jersey shore, lighting the pink and grey clouds and setting the Hudson on fire.

The future of solar – centralised or local generation?


After driving several hours along Interstate 15 through the desolate and ancient land formations of the Mojave Desert, and after rising over a large summit, you are suddenly presented with a glimpse of what many say is the future of electricity generation. The 392MW Ivanpah solar tower power station is the biggest concentrated solar thermal project in the world. It is also the most visually arresting. It features three huge towers, each 150m tall, surrounded by huge fields of mirrors that will focus the sun’s energy on a receiver located at the top of each tower. Water is boiled to create steam that then drives the turbines.
It’s solar generation at a massive scale, made more impressive by its surroundings. Even though it spreads over so many hectares, its size pales against the grandeur of the stunning Mojave landscape.
Ivanpah is not the only large-scale solar power station being built in this part of the world. To the north, across the state border in Nevada, a 110MW solar tower with storage facility is being built by SolarReserve.
To the west, in the heart of California’s “high desert”, First Solar is nearing completion of a 250MW AVSR solar PV project near Lancaster, while down the road SunPower has begun construction of a 579MW solar PV plant of their own.
A little further north, the tables are turned as SunPower puts the finishing touches to its 250MW CVSR project, while First Solar is about to trump it with the 550MW Topaz solar PV project, which is half way through construction.
But even as these massive projects are nearing completion, the question is being asked: Does the future of solar really lie in more of these large scale projects? Even the owners of these huge projects are not so sure.
NRG, the largest owner of generation assets in the US, and part owner of the Ivanpah project, says it is uncertain about the future of such large scale projects, because they are hugely capital-intensive.

King Island achieves 100% renewables – wind, solar, storage


Hydro Tasmania is hailing a major breakthrough with its King Island Renewable Energy Integration Project, saying it has achieved extended periods of 100 per cent renewable energy for the island’s grid – the first time that a grid of this scale has been serviced by wind, solar and storage devices. The King Island project combines some 2.45MW of wind and a lot less solar power, with storage devices and an automated control system. The $46 million project has been funded by the federal Government and has been described as a potential insight into how Australia’s main grids can wean themselves off fossil fuels.
Project leader Simon Gamble says the major achievement so far has been the ability to switch off all fossil fuels completely for extended periods while variable renewable sources such as wind and solar are used. This is the first time this has been achieved for a load of this size (some renewable grids, such as that of the Pacific island of Tokelau, are just 100kW in size), and the first time with predominantly wind power.
“This has removed a key barrier,” Gamble told RenewEconomy. Such systems had usually succeeded in turning down the amount of fossil fuel power needed, but not switching them off altogether. The full range of storage and control management systems have yet to be deployed.
“Achieving 100 per cent renewable energy penetration in large off-grid systems has remained elusive until now, and is very difficult to achieve given the need to maintain reliability and security of power supply under highly variable wind and solar conditions,” he said in an earlier statement.
The overall project aims to cut diesel consumption on King Island by more than 65 per cent over a one-year period, but it will allow diesel generators to be switched off when not required. So far it has achieved zero diesel operation for periods of up to 1.5 hours overnight when customer demand is lowest, and in daylight hours under high wind conditions.
Last year, Gamble told RenewEconomy that the project would provide an insight into how the Australian grid might look in a few decades – a combination of renewables backed up by dispatchable power and with storage solutions. “The NEM (National Electricity Market) is a much larger system, but it will have similar technical issues,” Gamble said then. “If we are integrating more wind, and solar, we need to learn how to do it.”
The project is using Hydro Tasmania’s own advanced automated control systems and dynamic resistor technology, coupled with a standard flywheel uninterruptible power supply system, commonly used in hospitals and telephone exchanges. Later this year, customer load control will be introduced, as well as a 3MW battery array from Ecoult, which is developing an enhanced lead-acid battery known as the “UltraBattery”.

Thursday, July 18, 2013

THE ICEMAN COMETH


Hyperloop To Start Testing This Year. Theoretical Speed Of 4,000 MPH. (VIDEO) (via Clean Technica)

This article was first published on Gas2, by Chris DeMorro. America has been embroiled in a debate over the wisdom of spending billions of dollars on a national high-speed rail network. A small Colorado company called ET3 has other ideas though, calling…

Saturday, June 8, 2013

The ‘Social Cost Of Carbon’ Is Almost Double What The Government Previously Thought: The U.S. government updated its estimate of how much carbon pollution harms the economy. They found that their previous estimates were too low by 50 to 100 percent, depending on the year and the estimate. An interagency working group coordinated by the White House released something called the “Technical Update of the [...]

Friday, May 31, 2013

ENERGY SECURITY NEWS, VIEWS & OPINIONS: Nuclear is not the answer
Tom Horton's op-ed in praise of Norman Meadows and nuclear power presented inaccuracies ("Time for ...

Tuesday, May 28, 2013

Study claims 100 percent renewable energy possible by 2030

Jan 19, 2011 by Lin Edwards 

(PhysOrg.com) -- New research has shown that it is possible and affordable for the world to achieve 100 percent renewable energy by 2030, if there is the political will to strive for this goal.

Achieving 100 percent renewable energy would mean building about four million 5 MW wind turbines, 1.7 billion 3 kW roof-mounted solar photovoltaic systems, and around 90,000 300 MW solar power plants.
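Totalling the nameplate capacity implied by those figures (a minimal Python sketch; the variable names are ours, and delivered power would be well below nameplate once capacity factors are applied):

```python
# Total nameplate capacity implied by the build-out figures quoted above, in megawatts.
wind_mw         = 4e6 * 5        # four million 5 MW wind turbines
rooftop_pv_mw   = 1.7e9 * 3e-3   # 1.7 billion 3 kW rooftop PV systems (3 kW = 0.003 MW)
solar_plants_mw = 90_000 * 300   # ~90,000 300 MW solar power plants

total_mw = wind_mw + rooftop_pv_mw + solar_plants_mw
print(f"wind {wind_mw/1e6:.0f} TW, rooftop PV {rooftop_pv_mw/1e6:.1f} TW, "
      f"solar plants {solar_plants_mw/1e6:.0f} TW, total ~{total_mw/1e6:.0f} TW nameplate")
```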
Mark Delucchi, one of the authors of the report, said the researchers had aimed to show that enough renewable energy is available and could be harnessed to meet demand indefinitely by 2030.
Delucchi and colleague Mark Jacobson left all fossil fuel sources of energy out of their calculations and concentrated only on wind, solar, waves and geothermal sources. Fossil fuels currently provide over 80 percent of the world’s energy supply. They also left out biomass, currently the most widely used renewable energy source, because of concerns about pollution and land-use issues. Their calculations also left out nuclear power generation, which currently supplies around six percent of the world’s electricity.
To make their vision possible, a great deal of building would need to occur. The wind turbines needed, for example, are two to three times the capacity of most of today’s wind turbines, but 5 MW offshore turbines were built in Germany in 2006, and China built its first in 2010. The solar power plants needed would be a mix of photovoltaic panel plants and concentrated solar plants that concentrate solar energy to boil water to drive generators. At present only a few dozen such utility-scale solar plants exist. Energy would also be obtained from photovoltaic panels mounted on most homes and buildings.
Jacobson said the major challenge would be in the interconnection of variable supplies such as wind and solar to enable the different renewable sources to work together to match supply with demands. The more consistent renewable sources of wave and tidal power and geothermal systems would supply less of the energy but their consistency would make the whole system more reliable.


Read more at: http://phys.org/news/2011-01-percent-renewable-energy.html#jCp