How Clean is Your Cloud and Telecom?

The Greenpeace report How Clean is Your Cloud?, which I saw mentioned in 3T magazine news, is actually quite interesting reading. This year’s report provides a look at the energy choices of some of the largest and fastest growing IT companies. The report analyzes 14 IT companies and the electricity supply chains behind more than 80 of their data centers.


The report also contains lots of interesting background information on both IT and telecom energy consumption. I recommend checking it out. Here are some points picked from the How Clean is Your Cloud? report:

Facebook, Amazon, Apple, Microsoft, Google, and Yahoo – these global brands and a host of other IT companies are rapidly and fundamentally transforming the way in which we work, communicate, watch movies or TV, listen to music, and share pictures through “the cloud.”

The growth and scale of investment in the cloud is truly mind-blowing, with estimates of a 50-fold increase in the amount of digital information by 2020 and nearly half a trillion US dollars in investment in the coming year, all to create and feed our desire for ubiquitous access to infinite information from our computers, phones and other mobile devices, instantly.

The engine that drives the cloud is the data center. Data centers are the factories of the 21st century information age, containing thousands of computers that store and manage our rapidly growing collection of data for consumption at a moment’s notice. Given the energy-intensive nature of maintaining the cloud, access to significant amounts of electricity is a key factor in decisions about where to build these data centers. Industry leaders estimate nearly $450bn is being spent annually on new data center space.

Since electricity plays a critical role in the cost structure of companies that use the cloud, there have been dramatic strides made in improving the energy efficiency design of the facilities and the thousands of computers that go inside. However, despite significant improvements in efficiency, the exponential growth in cloud computing far outstrips these energy savings.

How much energy is required to power the ever-expanding online world? What percentage of global greenhouse gas (GHG) emissions is attributable to the IT sector? Answers to these questions are very difficult to obtain with any degree of precision, partially due to the sector’s explosive growth, a wide range of devices and energy sources, and rapidly changing technology and business models. The estimates of the IT sector’s carbon footprint performed to date have varied widely in their methodology and scope. One of the most recognized estimates of the IT sector’s footprint was conducted as part of the 2008 SMART 2020 study, which established that the sector is responsible for 2% of global GHG emissions.

The combined electricity demand of the internet/cloud (data centers and telecommunications network) globally in 2007 was approximately 623bn kWh (if the cloud were a country, it would have the fifth largest electricity demand in the world). Based on current projections, the demand for electricity will more than triple to 1,973bn kWh (an amount greater than combined total demand of France, Germany, Canada and Brazil).
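As a rough sanity check on these numbers, assuming the 1,973bn kWh figure is the report’s 2020 projection (an assumption on my part; the excerpt above does not state the target year), the implied annual growth rate can be backed out:

```python
# Sanity check on the report's electricity-demand projection.
# ASSUMPTION: the 1,973bn kWh figure refers to a 2020 horizon;
# the text above does not state the target year explicitly.
base_twh = 623.0        # global cloud electricity demand, 2007 (bn kWh = TWh)
projected_twh = 1973.0  # projected demand (bn kWh = TWh)
years = 2020 - 2007

growth_factor = projected_twh / base_twh
cagr = growth_factor ** (1 / years) - 1

print(f"growth factor: {growth_factor:.2f}x")                   # ~3.17x: "more than triple"
print(f"implied annual growth over {years} years: {cagr:.1%}")  # ~9.3%
```

In other words, “more than triple” corresponds to sustained growth of a bit over 9% a year for 13 years.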

The report indicates that, due to the economic downturn and continued energy efficiency and performance improvements, global energy demand from data centers increased by 56% from 2005 to 2010. Estimates of data center electricity demand come in at 31GW globally, with an increase of 19% in 2012 alone. A 19% annual increase, at a time when global electricity consumption is otherwise essentially flat due to the global recession, is still a staggering rate of growth.

Given the scale of predicted growth, the source of electricity must be factored into a meaningful definition of “green IT”. Energy efficiency alone will, at best, slow the growth of the sector’s footprint. The replacement of dirty sources of electricity with clean renewable sources is still the crucial missing link in the sector’s sustainability efforts according to the report.


The global telecoms sector is also growing rapidly. Rapid growth in the use of smartphones and broadband mobile connections means mobile data traffic in 2011 was eight times the size of the entire internet in 2000. It is estimated that global mobile data traffic grew 133% in 2011, with 597 petabytes of data sent by mobile devices every month. By the end of 2011, there were an estimated 6 billion mobile telephone subscriptions, a penetration rate of 86.7% of the global population. By the end of 2012, the number of mobile-connected devices is expected to exceed the global population. Electronic devices and the rapidly growing cloud that supports our demand for greater online access are clearly a significant force in driving global energy demand.

What about telecoms in the developing and newly industrialized countries? The report has some details from India (by the way, it is expected that India will pass China to become the world’s largest mobile market in terms of subscriptions in 2012). Much of the growth in the Indian telecom sector is from India’s rural and semi-urban areas. By 2012, India is likely to have 200 million rural telecom connections at a penetration rate of 25%. Out of the existing 400,000 mobile towers, over 70% are located in rural and semi-urban areas where either grid-connected electricity is not available or the electricity supply is irregular. As a result, off-grid towers and, increasingly, even grid-connected towers in these areas rely on diesel generators to power their network operations. The consumption of diesel by the telecoms sector currently stands at a staggering 3bn liters annually, second only to the railways in India.

What is the situation in other developing and newly industrialized countries? I don’t actually know.

NOTE: Please note that many figures given in the report are just estimates based on quite little actual data, so they might be somewhat off from the actual figures. Given the source of the report, I would guess that if the figures are off, they are most probably off in the direction that makes the environmental effect look bigger than it actually is.


  1. Tomi Engdahl says:

    Use Google’s New Mapping Tool To See How Much Solar Panels Would Benefit Your Home

    With Project Sunroof, Google hopes to make it easier for more households to make the switch to renewables.

    Solar power is cheaper than buying from the grid in cities like Boston, San Francisco, and San Jose—even without any government credits. Now, if you live in one of those cities, a new tool from Google will calculate exactly how much you can save by zooming into your roof via Google Maps.

    Project Sunroof crunches weather data and then creates a 3-D model of the roof and nearby trees to figure out how much sunlight or shade will fall on the panels and how much money someone can save.

    Google engineers were inspired to create the tool after noticing how many people were searching for information on getting solar panels (and seemingly not finding what they needed). Similar tools have been around for a while—a startup called RoofRay was using Google Maps to calculate solar potential as long ago as 2008, and an MIT spinoff company Mapdwell won Fast Company’s Innovation By Design award last year. But Google thinks it can do better using its own expertise in maps and machine learning, combined with the high-resolution imagery available on Google Earth.

    About Project Sunroof

  2. Tomi Engdahl says:

    But it might. Hey, we don’t know, we’re the Met Office

    There hasn’t actually been any global warming for the last fifteen years or so – this much is well known. But is this just a temporary hiccup set to end soon? A new report from the UK’s weather bureau says it just might not be.

    The Met Office boffins believe that, yes, a long-expected El Nino is at last starting up in the Pacific. This will probably mean warming. The Pacific Decadal Oscillation (PDO), another hefty Pacific mechanism, also looks set to bring some heat.

    But, no doubt upsettingly for some, there’s a third and very powerful factor to consider: the mighty Atlantic Multidecadal Oscillation (AMO).

    The AMO has actually been helping to warm the world up since the mid-1990s – though not strongly enough to push overall temperatures upward – but now it looks set to swing into a negative phase and cool the planet off, probably for a long time, as AMO phases typically last several decades.

    The Met Office doesn’t care for phrases such as “hiatus” or even “pause” to describe the absence of global warming for the last fifteen years or so: it describes the flat temperatures as a “slowdown”.

    Meanwhile, it appears quite possible that 2015 will be a record warm year globally – though not in Europe or America. However, as NASA climate chief Gavin Schmidt pointed out in 2013, “one more year of numbers isn’t in itself significant”.

  3. Tomi Engdahl says:

    BLOOD FEAST: Climate mosquitoes will wipe out North Polar REINDEER
    Look out Santa! Rudolph’s red nose will be swollen with PARASITE BITES

    Everyone knows that climate change is a terrible menace: but now the threat has really sunk in, with the revelation that global warming is set to threaten the lovable reindeer/caribou of the Arctic – with obvious consequences for a certain jolly chap in a red suit.

    This is because – according to a computer forecast model developed by scientists – rising temperatures in the Arctic are set to make life and reproduction a lot easier for mosquitoes. The tiny airborne bloodsuckers will swarm across the circumpolar regions and make a sanguinary meal of the antler-bristling Rangifer tarandus (aka reindeer, or in some circles “caribou”) which abound up there.

  4. Tomi Engdahl says:

    Burn ALL the COAL, OIL – NO danger of SEA LEVEL rise this century from Antarctic ice melt
    Hardcore warmist’s amazing admission

    One of the world’s most firmly global-warmist scientists says that even if humanity deliberately sets out to burn all the fossil fuels it can find, as fast as it can, there will be no troublesome sea level rise due to melting Antarctic ice this century.

    Dr Ken Caldeira’s credentials as a global warmist are impeccable. He is not a true green hardliner – he has signed a plea to his fellow greens to get over their objections to nuclear power, for instance, and he doesn’t totally rule out geoengineering as a possible global-warming solution. But that’s as far as he’ll go: in Dr Caldeira’s view, it is plain and simple unethical to release greenhouse gases into the air. There’s no middle ground on that as far as he’s concerned – he’s not OK with gas power as an alternative to coal, for instance.

    But he’s a scientist, and like all proper scientists he’s willing to admit inconvenient truths. In this case, the truth in question is his own prediction that no matter what humans do in the way of carbon emissions, sea levels are not going to rise by more than 8cm this century due to melting Antarctic ice. For context, the seas have been rising faster than that for thousands of years.

    But carbon emissions certainly are going up pretty fast, so it’s not crazy of the scientists to make such assumptions. That’s not the problem with the study.

    The problem is that, yes, they might be right and all the Antarctic ice might well melt if all the coal and oil and shale and everything is burned as fast as we can burn it. But – according to their modelling – this increased melting will take at least a thousand years, probably more. And, crucially, it will not really get rolling at all until well after the year 2100.

  5. Tomi Engdahl says:

    AMD Trims Carbon Footprint

    Advanced Micro Devices has put a lot of energy into energy efficiency. At a company event, held here, AMD touted a reduced carbon footprint for its x86 Carrizo processor.

    Carrizo users can expect a 46% smaller carbon footprint than that of previous-generation Kaveri, AMD wrote in a company-led study. “Ultimately, an end user that replaces a Kaveri-based notebook computer with a Carrizo-based notebook will save approximately 49 kWh and 34 kg of [greenhouse gas] emissions over the three year service life of the computer.”
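    A quick way to sanity-check the quoted savings is to back out the grid emissions factor AMD must have assumed. The two figures are from the study as quoted above; the division is my own illustration:

```python
# Back-of-envelope check of AMD's quoted per-notebook savings:
# the grid emissions factor implied by the 49 kWh / 34 kg pairing.
energy_saved_kwh = 49.0  # energy saved over the three-year service life
ghg_saved_kg = 34.0      # kg of GHG emissions avoided over the same period

emissions_factor = ghg_saved_kg / energy_saved_kwh  # kg CO2e per kWh
print(f"implied emissions factor: {emissions_factor:.2f} kg/kWh")  # ~0.69
```

    That works out to roughly 0.69 kg of emissions per kWh, a plausible figure for a largely fossil-fueled grid mix.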

    The study, conducted by AMD’s Power and Performance Lab, defines “overall life cycle greenhouse gas implications” as the carbon footprint associated with wafer fabrication, assembly, test, packaging and consumer use. The study found that 82% of emissions come from the consumer use phase, while 18% comes from manufacturing, transportation, and the like.

    “In 2013, approximately three billion personal computers used more than 1% of total energy consumed, and 30 million computer servers worldwide used another 1.5% of all electricity, at an annual cost of $14 billion to $18 billion,” AMD Corporate Fellow Sam Naffziger wrote in a white paper. “By 2030, the number of connected devices is estimated to grow to 100 billion. Since virtually all of these products consume electricity, their use contributes to GHG emissions.”

    AMD did not provide information on how charging laptops specifically impacts the environment, but advocated for power management through improved processor architecture. Its x86 processor Carrizo will ship this year, following much talk about the chip’s innovative design and integration with Windows 10.

    “We have recognized that [energy efficiency] is a key area and where we have put a lot of scarce engineering, and where we intend to differentiate,” Naffziger told EE Times.

  6. Tomi Engdahl says:

    The ‘echo chamber’ effect misleading people on climate change
    Dubious bloggers like DeSmogBlog refuse to accept consensus

    Trick-cyclists in America have come out with research which could explain why the debate on climate change continues to rumble on, even though there is a solid consensus on the facts of the matter.

    Essentially, according to the researchers, people tend to live in “echo chambers” as far as climate matters go, seeking out information and advisers who agree with what they already believe. Thus, they may persist in deluded views regardless of what others think.

    “Individuals who get their information from the same sources with the same perspective may be under the impression that theirs is the dominant perspective, regardless of what the science says,” explains Professor Dana Fisher, the corresponding author who led the research.

  7. Tomi Engdahl says:

    MASSIVE GLOBAL COOLING process discovered as Paris climate deal looms
    ‘Could explain recent disagreements’

    As world leaders get ready to head to Paris for the latest pact on cutting CO2 emissions, it has emerged that there isn’t as much urgency about the matter as had been thought.

    A team of top-level atmospheric chemistry boffins from France and Germany say they have identified a new process by which vast amounts of volatile organic compounds (VOCs) are emitted into the atmosphere from the sea – a process which was unknown until now, meaning that existing climate models do not take account of it.

    The effect of VOCs in the air is to cool the climate down, and thus climate models used today predict more warming than can actually be expected. Indeed, global temperatures have actually been stable for more than fifteen years, a circumstance which was not predicted by climate models and which climate science is still struggling to assimilate.

    Global models at the moment assume total emissions of isoprene from all sources – trees, plants, plankton, the lot – of around 1.9 megatons per year. But, according to the new research, the newly discovered “abiotic” process releases as much as 3.5 megatons on its own – which “could explain the recent disagreements” between models and reality.

    VOCs such as isoprene are known to be a powerful factor in the climate, as they cause the formation of aerosol particles. Some kinds of aerosol, for instance black soot, warm the world up: but the ones resulting from VOCs actually cool it down substantially by acting as nuclei for the formation of clouds. It has previously been suggested that production of VOCs by pine forests could be a negative feedback so powerful that it “limits climate change from reaching such levels that it could become really a problem in the world.”

  8. Tomi Engdahl says:

    Storing solar energy as hydrogen: Photovoltaic systems for plants

    In Germany, an innovative storage power plant stores the energy produced by photovoltaic systems as hydrogen for seasonal storage, in addition to batteries for daily storage. By employing the PLC-based system, smaller companies can reduce their carbon footprint to zero.

    Photovoltaic solar power can be used effectively in plants by converting the harnessed energy to hydrogen for long-term, seasonal storage, beyond the daily storage provided by batteries. A programmable logic controller (PLC) monitors chemical processes in an electrolyzer and fuel cell used in the power storage system.

    A stereotype paints the people from Germany’s far north as moving at a more leisurely pace.

  9. Tomi Engdahl says:

    Top boffin Freeman Dyson on climate change, interstellar travel, fusion, and more
    When physics gurus speak, they speak to El Reg

  10. Tomi Engdahl says:

    Some like it hot … very hot: How to use heat to your advantage in your data center
    We try to melt preconceptions about staying cool

    Heat has traditionally been the sysadmin’s enemy. We may have turned technology to our advantage and chipped away at heat’s wasteful nature over the years, but our old foe has remained.

    From turning data centers into walk-in fridges, and hot/cold aisle separation to cold aisle containment and positive pressure, we’ve tried everything, and yet the standard operating temperature has jumped from 18°C to around 23°C.

    As compute requirements get bigger, then heat production increases – what lies in the future?

    Surprisingly, the future could be one where we ride rising heat levels and turn them back to the greater good.

    Having kept server inlet temperatures at Baltic levels, the hardware vendors have come out in force to say their kit was robust enough to stand higher inlet temperatures. We almost didn’t believe it was true, but servers (and other hardware) will run comfortably at 22-23°C – a full 4-5°C higher than conventional wisdom has always taught us.

    In May 2015, for example, nowhere in the UK recorded a temperature higher than 23.4°C. That’s a peak temperature less than half a degree above the temperature that we’re chilling our cold aisles down to, and for reference, the mean temperature for the month was a measly 9.6°C.

    That means all that power spent cranking the computer room air conditioning round the clock was a monumental waste, as you were making the aisle temperature barely lower than the air outside. Enter free-air cooling.

    This is not as simple as just throwing the data hall doors open to the elements. Let’s face it, you’d be inviting inconsistencies, moisture, and all sorts of particulate nasties into your lovely clean environment. But if you’ve got heating, ventilation, and air-conditioning that supports it, you can open the external vents, divert the fans, and flood your cooling system with good old-fashioned British temperatures.
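    The decision the article describes boils down to a simple comparison between outside air and the cold-aisle setpoint. A minimal sketch, using the 23°C setpoint quoted above; the 2°C mixing margin is a hypothetical allowance of mine, not a figure from the article:

```python
# Minimal sketch of a free-air cooling decision.
# The 23 C cold-aisle setpoint is from the article; the 2 C mixing
# margin is a hypothetical allowance for fan and mixing losses.
COLD_AISLE_SETPOINT_C = 23.0
MIXING_MARGIN_C = 2.0

def can_free_cool(outside_temp_c: float) -> bool:
    """True when outside air is cool enough to flood the cooling
    system without running the chiller compressors."""
    return outside_temp_c <= COLD_AISLE_SETPOINT_C - MIXING_MARGIN_C

# May 2015 UK figures quoted above: monthly mean 9.6 C, peak 23.4 C
print(can_free_cool(9.6))    # True: free cooling suffices
print(can_free_cool(23.4))   # False: fall back to mechanical cooling
```

    On those UK numbers, free-air cooling would cover all but the very hottest hours of the year.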

    That’s not a typo: hot-water-cooled

    Yes, hot-water-cooled. And that’s not just warm, it’s most definitely hot – as in, 40°C or 104°F – which is nearly 20°C above your common-or-garden rack mount hardware’s preferred air inlet temperature. You can run that water through an external piping radiator system and cool it in almost any ambient air temperature, without even having to turn on the compressors for the chiller systems to artificially cool the fluid lines.

    Originally launched in 2012 at the Leibniz Supercomputing Centre in Germany, the IBM “SuperMUC” supercomputer is a hot-water-cooled beast reckoned to save around 40 per cent on comparable supercomputer operating costs, with no noticeable decrease in performance. There’s no requirement to run it at a lower clock speed or compromise on the computational output; it’s designed to run at 40°C without having to back off the throttle.

    Raising the inlet temperatures in servers by a couple of degrees was a game-changer. Raising the water-cooling temperature to 40°C? That’s remarkable.

    Using spinning disks and humming boxes to heat offices

    Where a free-air cooling system draws air in from the external environment, diverts the ventilation flow, and mixes the lower-temperature external air to lower the internal temperature, the digital furnace does the exact opposite; the waste heat that is inevitably produced by the buzzing power supplies and spinning disks is siphoned out of the data hall, and put to better use.

    The obvious application of this is central heating. That wasted heat is pumped into a ducting system that, when combined with a building and environmental management system as clever as your HVAC, is diverted under the floors and over the ceilings of attached offices to maintain a constant, pleasant temperature in the occupied areas.

    It sounds fanciful and altruistic, but data centers and information processing centers could provide heating and hot water for their neighbors. Eco-friendly housing developments are winning awards for their energy efficiency by using geothermal energy and networked ventilation systems to provide heating and hot water year-round. Why not data centers?

  11. Tomi Engdahl says:

    Morocco’s Solar Power Mega-Project

    Morocco, located along the northwestern African coast, is in prime position to take advantage of solar technology, and they’ve committed to one of the biggest such projects in the world. The city of Ouarzazate will host “a complex of four linked solar mega-plants that, alongside hydro and wind, will help provide nearly half of Morocco’s electricity from renewables by 2020.” It will be the largest concentrated solar power plant in the world.

    The first phase of the project, called Noor 1, comprises 500,000 solar mirrors that track the sun throughout the day, with a maximum capacity of 160MW. When the full project finishes, it will be able to generate up to 580MW.
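    For a rough sense of scale, the stated capacities can be converted into annual energy. The 30% capacity factor below is an assumption of mine, typical for concentrated solar power with some thermal storage; the article gives no such figure:

```python
# Rough annual-energy estimate from the stated plant capacities.
# ASSUMPTION: ~30% capacity factor (typical for CSP with some
# thermal storage); the article does not give one.
HOURS_PER_YEAR = 8760
CAPACITY_FACTOR = 0.30

def annual_gwh(capacity_mw: float) -> float:
    """Approximate annual output in GWh for a given nameplate capacity."""
    return capacity_mw * HOURS_PER_YEAR * CAPACITY_FACTOR / 1000.0

print(f"Noor 1 (160 MW): ~{annual_gwh(160):.0f} GWh/year")        # ~420
print(f"Full project (580 MW): ~{annual_gwh(580):.0f} GWh/year")  # ~1524
```

    Roughly 1.5 TWh a year from the full complex, under that assumed capacity factor.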

    Morocco poised to become a solar superpower with launch of desert mega-project

    World’s largest concentrated solar power plant, powered by the Saharan sun, set to help renewables provide almost half the country’s energy by 2020

    The Moroccan city of Ouarzazate is used to big productions. On the edge of the Sahara desert and the centre of the north African country’s “Ouallywood” film industry, it has played host to big-budget location shots in Lawrence of Arabia, The Mummy, The Living Daylights and even Game of Thrones.

    Now the trading city, nicknamed the “door of the desert”, is the centre for another blockbuster – a complex of four linked solar mega-plants that, alongside hydro and wind, will help provide nearly half of Morocco’s electricity from renewables by 2020 with, it is hoped, some spare to export to Europe. The project is a key plank in Morocco’s ambitions to use its untapped deserts to become a global solar superpower.

  12. Tomi Engdahl says:

    On February 10, 2014, the U.S. Department of Energy (DOE) published a revision to its EISA 2007 external power supply (EPS) efficiency standard, increasing the minimum efficiency requirements as well as expanding the range of products applicable under the new standard. Set to go into effect February 10, 2016, this law will have implications for any OEM that designs products with an external AC-DC power supply for the US market.


  13. Tomi Engdahl says:

    Greenhouses power themselves with photovoltaic glass

    “When life gives you lemons, make lemonade” is one of my favorite sayings, in no small part because it applies not only to life in a general sense but also to engineering in particular. Any technology option (as you all already know well) consists of a set of tradeoffs; no single candidate is optimum for every possible implementation. As engineers, one of your key roles (in partnership with your marketing counterparts) is to intimately understand the technologies at your disposal, both their respective strengths and shortcomings, and identify applications and customers that enable you to accentuate the strengths while playing down the shortcomings.

    One of Rick’s more recent gigs is Vice President of Business Development at Brite Solar, a company whose name is reflective of its photovoltaic aspirations, albeit with a somewhat unique twist.

    Solar cells, in conjunction with batteries, super-capacitors, or other charge storage technologies that enable the electrons to keep flowing after dark, are one of the more promising approaches available to enable energy harvesting and in the process, wean humanity off natural gas, coal, oil, and other traditional greenhouse gas-generating energy sources. Deserts are in many respects ideal locations for large solar energy “farms;” they get plenty of sun exposure, are otherwise unused, and offer large expanses of fairly flat terrain. Unfortunately, they also tend to be remotely located relative to the dense cities and industrial sites that consume the generated energy, thereby leading to expensive and lossy transmission topologies between sources and destinations.

    Alternatively, of course, you can locate a solar cell array in close proximity to its consumption partner; on a residence roof, for example, or in its back yard. Impact-resistant solar cell arrays can even be used as an alternative to traditional asphalt in constructing roads and parking lots. And then there are windows … wait, solar cells in windows? Wouldn’t the necessary transparency be … err … thrown out the window in the process? Not necessarily. Recently-promoted R&D projects, as Orlando explained to me, leverage glass-embedded transparent light guides or other schemes to route inbound-photon energy to photovoltaic material located at the windows’ edges, for example.

    Then there’s Brite Solar’s approach, which leverages inkjet-deposited titanium dioxide nanotechnology. The resultant PanePower material is only around 75% transparent (and then only in certain visible wavelengths), is also reddish in perceived tint, and is only about 5% power-efficient (versus around 20% with traditional opaque solar cells). But these characteristics aren’t necessarily showstoppers; you just need to find the right application for the approach. Brite Solar’s first candidate is the greenhouse. As Orlando noted in one of his emails, “75% transparency in the correct wavelengths is what is needed for photosynthesis in plants in the greenhouse.”

    “The reddish tinge … is better for the plants,” he also suggested, along with offering that “we also block 98% of the UV light, which has significant benefits to the plants as well as reducing the insect population.”

    Along with fueling “grow bulbs” for use after dark, the electricity generated by Brite Solar-enhanced glass can also power a 24 hour/day hydroponics system.
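    The efficiency trade-off quoted above is easy to quantify. Under a standard-test-condition insolation of 1000 W/m² (an assumed reference value, not from the article), the two quoted efficiencies give:

```python
# Illustrative output comparison for the efficiencies quoted above.
# ASSUMPTION: 1000 W/m^2 insolation (the usual standard test condition).
INSOLATION_W_PER_M2 = 1000.0

def output_w_per_m2(efficiency: float) -> float:
    return INSOLATION_W_PER_M2 * efficiency

conventional = output_w_per_m2(0.20)  # ~20%: traditional opaque solar cell
pane_power = output_w_per_m2(0.05)    # ~5%: Brite Solar's PanePower glass

print(f"conventional: {conventional:.0f} W/m^2")  # 200
print(f"PanePower:    {pane_power:.0f} W/m^2")    # 50
```

    A quarter of the output per square metre, but harvested from glazing that would otherwise generate nothing at all.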

  14. Tomi Engdahl says:

    for external power supplies

    Wall plug-in supplies comply with Level VI

    Furnished with either North American or European input blades, CUI’s SWI series of wall plug-in AC/DC power supplies meets the stringent average efficiency and no-load power requirements mandated by the U.S. Department of Energy set to go into effect on February 10, 2016. Level VI standards aim to significantly lower the amount of power consumed when the end product is not in use or is no longer connected to the system.

    Each device meets the Level VI standard’s no-load power consumption requirement of <0.1 W. Overvoltage, overcurrent, and short-circuit protection are standard.
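    The no-load figure translates into a tiny annual energy cost. A quick sketch of what the <0.1 W ceiling means for one adapter left plugged in but unused for a whole year:

```python
# Annual energy drawn by one adapter at the Level VI no-load limit,
# left plugged in but unused for a whole year.
NO_LOAD_W = 0.1
HOURS_PER_YEAR = 8760

annual_kwh = NO_LOAD_W * HOURS_PER_YEAR / 1000.0
print(f"worst-case no-load draw: {annual_kwh:.2f} kWh/year")  # ~0.88
```

    Under a kilowatt-hour per adapter per year; the point of the standard is that this number is multiplied by billions of installed adapters.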

  15. Tomi Engdahl says:

    Will IoT Make Taxing the Internet Inevitable?

    Following Ben Franklin’s logic, if IoT is as inevitable as death, taxes are sure to follow. Have we been living in a fool’s paradise with regard to the Internet?

    Have we been living in a fool’s paradise with regard to the Internet? Do global power considerations mean that the Internet is about to become a limited resource and much more expensive, even taxed?

    It looks likely that global chip sales will decline in 2015 compared with 2014, implying that average selling price erosion is exceeding unit supply increases. A decline in the value of the global chip market in the absence of an economic crash, a natural catastrophe, or at least an oversupply bubble, has been very rare.

    One of the problems would seem to be that the smartphone, as killer product and market driver, has more or less run its course. Many people are hoping the next killer application will be the Internet of Things; nanoelectronics everywhere.

    Think twice about that, because a nanoelectronics research compendium makes the point that — without several orders of magnitude reduction in power consumption in electronics — the projected roll outs of mobile data and the IoT could cause a global energy crisis as soon as 2020. That’s just over four years away.

    It’s not just the billions of end-point “leaf nodes” that will be consuming power — probably battery power, unless nanoelectronics can make the nodes autonomous — but the exabytes of data being generated and sent to and from data centers. According to Hoefflinger’s book, that data processing is rising at a compound annual growth rate of 61 percent at present, and is simply not sustainable.
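    As a cross-check, a 61 percent compound annual growth rate is consistent with the “data volumes double every 18 months” figure the article cites further down:

```python
import math

# Doubling time implied by a 61% compound annual growth rate.
cagr = 0.61
doubling_years = math.log(2) / math.log(1 + cagr)
print(f"doubling time: {doubling_years:.2f} years "
      f"(~{doubling_years * 12:.0f} months)")  # ~1.46 years, ~17 months
```

    A doubling every 17-18 months is almost exactly the historical pace of Moore’s law, but applied to energy-hungry data traffic rather than transistor counts.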

    But if it is not sustainable, what is going to give way?

    Unfortunately, one of the conclusions in the book is that the traditional evolutionary progress of electronics rarely allows radical solutions to come to market — even though radical solutions are now required to reduce power consumption.

    So if the electronics industry cannot prevent the global electronics power budget from increasing rapidly then EITHER many more power stations must be built – but of what type – OR the public’s insatiable demand for the Internet must be stifled, probably by higher charges or taxation.

    No One Likes Taxes
    Building fossil fuel, nuclear, and renewable power stations is in every case fraught with problems, and Hoefflinger’s conclusion is that the world is unlikely, in the short term, to bring online the quantity of power stations required to meet the forecast Internet demand. So it seems that the future of the Internet is power-constrained.

    But taxation and the Internet is traditionally a thorny question.

    It would be hard, if not impractical, to create an equitable global taxation system for the Internet. But if we do nothing significant to reduce power consumption per node and across the network and allow data volumes to double every 18 months, by 2020 the Internet’s energy demand will rival the world’s total generation capacity, according to Hoefflinger’s book. At that point communities and governments will be forced to introduce rationing and to choose between power for essential infrastructure such as hospitals and transport, power for the lights, and power for the Internet.

    Brownouts of the Internet will probably precede blackouts. They will likely manifest themselves as a gradual degradation of quality of service as ISPs are forced to make their own resource allocation decisions.

    At the same time businesses that are based on the Internet, such as Amazon, Facebook, Google, and just about everyone else will be saying this cannot be allowed to happen. There will be thoughts about first- and second-class access to the Internet and about top-down management of peak demand, just as there is with tariffs on electricity.

    A Fool’s Paradise
    Have we been living in a fool’s paradise that is about to come to an end with a sickening crunch?

    Of course, any such crunch will be mediated by money in some way because as resources become scarce they go up in price. One possible outcome would be significantly higher and progressive charges from Internet Service Providers used to meet government levies to pay for investment in power generation, which is effectively a tax.

  16. Tomi Engdahl says:

    Bill Gates, Jeff Bezos and other tech titans form the Breakthrough Energy Coalition to invest in zero-carbon energy technologies

    Some of the world’s wealthiest and most powerful individuals are forming a new organization known as the Breakthrough Energy Coalition to invest in technologies that could help solve problems associated with climate change and other clean tech initiatives.

    The coalition includes Microsoft co-founder Bill Gates, Amazon founder Jeff Bezos, Salesforce CEO Marc Benioff, Facebook co-founder Mark Zuckerberg, Alibaba CEO Jack Ma and many others. (Full list below.)

    The organization is not announcing a specific monetary figure that it plans to invest, but the collective wealth of the inaugural members is in the hundreds of billions of dollars. They plan to invest in a range of technologies that will help “transition the world to a near zero emissions energy future.”

    “Our primary goal with the Coalition is as much to accelerate progress on clean energy as it is to make a profit.”

    Word of the effort leaked via ClimateWire late last week, with the formal announcement coming tomorrow at the U.N. Climate Change Conference in Paris.

  17. Tomi Engdahl says:

    New Type of ‘Flow Battery’ Can Store 10 Times the Energy of the Next Best Device

    Industrial-scale batteries, known as flow batteries, could one day usher in widespread use of renewable energy—but only if the devices can store large amounts of energy cheaply and feed it to the grid when the sun isn’t shining and the winds are calm. That’s something conventional flow batteries can’t do. Now, researchers report that they’ve created a novel type of flow battery that uses lithium ion technology—the sort used to power laptops—to store about 10 times as much energy as the most common flow batteries on the market. With a few improvements, the new batteries could make a major impact on the way we store and deliver energy.

    Flow batteries aren’t much different from the rechargeables we’re all used to, aside from their massive size. In conventional rechargeables, electrical charges are stored in an electrode called an anode. When discharged, electrons are pulled off the anode, fed through an external circuit where they do work, and returned to a second electrode called a cathode.

    But in flow batteries, the charges are stored in liquid electrolytes that sit in external tanks. The charge-carrying electrolytes are then pumped through an electrode assembly, known as a stack, containing two electrodes separated by an ion-conducting membrane. This setup allows large volumes of the electrolytes to be stored in the tanks. Because those tanks have no size limit, the storage capacity of a flow battery can be scaled up as needed. That makes them ideal for storing large amounts of power for the grid.

    Today, the most advanced flow batteries are known as vanadium redox batteries (VRBs), which store charges in electrolytes that contain vanadium ions dissolved in a water-based solution.

    Lithium ion batteries have a far higher energy density than VRBs. But it’s been difficult to incorporate their technology into flow batteries.

    To address this problem, researchers led by Qing Wang, a materials scientist at the National University of Singapore, came up with a bit of a hybrid solution.
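    The tank-scaling point above can be sketched in a few lines. The energy densities below are my own round-number assumptions (not figures from Wang’s group), chosen only to illustrate the roughly 10x claim:

```python
# Flow-battery capacity scales with electrolyte tank volume, so a
# higher-energy-density chemistry multiplies capacity in the same tanks.

def tank_energy_kwh(tank_liters: float, wh_per_liter: float) -> float:
    """Stored energy for a given electrolyte volume and energy density."""
    return tank_liters * wh_per_liter / 1000.0

# Hypothetical 10,000 L tank: a vanadium-like ~25 Wh/L electrolyte
# vs. an assumed 10x lithium-based density of ~250 Wh/L.
print(tank_energy_kwh(10_000, 25))   # 250.0 kWh
print(tank_energy_kwh(10_000, 250))  # 2500.0 kWh
```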

  18. Tomi Engdahl says:

    Technology Tackling Climate Change
    Getting industry innovation behind the world’s needs;

    This week EE Times will feature updates on some of the carbon-reducing technologies being developed or already in use that could help companies, countries and citizens reduce their carbon footprint. These are some examples to show the infinite variety and to give credit to engineers and companies working on them.

    There is little doubt that the climate is heating up (see NASA GISS chart below) and that it is due to the production of greenhouse gases, which have been steadily increasing since the maturation of the industrial revolution circa 1880, according to the Intergovernmental Panel on Climate Change (IPCC 2015) meeting this week in Paris at the Conference of the Parties (COP21, Nov. 30–Dec. 11).

    Last year the IPCC declared that scientists were 95 percent certain that global warming is being caused (mostly) by increasing concentrations of man-made greenhouse gases—carbon dioxide, methane and nitrous oxide—most of which is being produced by electrical power plants and internal combustion engines.

    Carbon-free sustainable electrical power generation has been accomplished with power-generating river dams since the invention of the electrical generator, but in many places the dams are being disassembled because of the negative impact they have had on fish runs. No matter. We now have even cleaner methods of electrical power generation.

    Solar cells
    The most promising zero-carbon electrical power generators are solar cells, which already come in all sorts of formulations, sizes and capacities.

    According to the U.S. Department of Energy, every hour enough energy from the sun reaches Earth to meet the world’s energy usage for an entire year. Of course, it’s impossible to cover the lighted half of the 198 million square miles of the Earth’s surface in solar cells. Even collecting all 365 days of the year with widely distributed solar cell arrays, illuminated half the day (12/7) at 20 percent efficiency, would take over 225 thousand square miles to satisfy the entire world’s need for energy—a seemingly unachievable goal.
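    The kind of estimate behind a figure like that is easy to reproduce in outline. The inputs below are my own round numbers (average daylight insolation, world primary energy demand), so the result is the right order of magnitude rather than the article’s exact figure:

```python
# Collector area needed to supply a given annual energy demand from
# solar cells. All inputs are illustrative assumptions.

SQ_METERS_PER_SQ_MILE = 2.59e6

def solar_area_sq_miles(annual_demand_twh: float,
                        insolation_w_m2: float = 700.0,  # avg daylight, assumed
                        efficiency: float = 0.20,
                        sun_hours_per_day: float = 12.0) -> float:
    """Collector area in square miles needed to meet annual_demand_twh."""
    yield_wh_per_m2 = insolation_w_m2 * efficiency * sun_hours_per_day * 365
    demand_wh = annual_demand_twh * 1e12
    return demand_wh / yield_wh_per_m2 / SQ_METERS_PER_SQ_MILE

# ~160,000 TWh is a round number for world annual primary energy use:
print(round(solar_area_sq_miles(160_000)))
```

    With these assumptions the answer comes out near 100,000 square miles; more conservative insolation and efficiency inputs push it toward the article’s figure.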

    However, that is not stopping the world’s scientists from trying. One of the latest attempts comes from multi-band solar cells.

    Berkeley Lab’s trick is creating a defect-free atomically thin film of molybdenum disulfide (MoS2) to create ultra-high-efficiency solar cells (and bright yet transparent displays for that matter).

    “Solar cells are able to provide the highest possible voltage when the photoluminescence quantum yield (a parameter that is extremely sensitive to defects) is perfect.”

    Gasoline forever
    With the Saudis pumping enough oil to keep prices below $2 a gallon, there is little incentive to pay a premium price for an electric car. Oak Ridge National Laboratory (ORNL) is counting on this, plus the fact that internal combustion still has the highest performance-to-weight ratio, by continuing to create ever lighter-weight powertrain materials. ORNL’s stated target is Obama’s 55 miles-per-gallon mandate by 2025, and if they meet their goal–with near-zero emissions–the slow adoption of electric vehicles (EVs) will be of little consequence.

    ORNL claims that by “using higher-temperature cast aluminum alloys we can contribute to cutting down greenhouse gas emissions using two beneficial characteristics: lighter weight and increased temperature capacity. The higher temperature capacity of cylinder head materials enables combustion strategies that result in higher-efficiency engines that burn less fuel and generate fewer emissions,” ORNL scientist James Allen Haynes told EE Times.

  19. Tomi Engdahl says:

    A Big Win for Cheap, Clean Energy

    I’m in Paris today with several world leaders for a big announcement on energy and climate change. It is deeply moving to be in this city just two weeks after the horrific attacks here, and I am inspired by the way the French people have persevered in such a difficult time.

    Two related initiatives are being announced at today’s event. One is Mission Innovation, a commitment by more than ten countries to invest more in research on clean energy. The other is the Breakthrough Energy Coalition, a global group of private investors who will support companies that are taking innovative clean-energy ideas out of the lab and into the marketplace. Our primary goal with the Coalition is as much to accelerate progress on clean energy as it is to make a profit.

    The world is going to be using 50 percent more energy by mid-century than it does today. That should be good news, especially for the world’s poorest, because right now more than 1 billion people live without access to basic energy services. Affordable and reliable energy makes it easier for them to grow more food, run schools and hospitals and businesses, have refrigerators at home, and take advantage of all the things that make up modern life. Low- and middle-income countries need energy to develop their economies and help more people escape poverty.

    But the world’s growing demand for energy is also a big problem, because most of that energy comes from hydrocarbons, which emit greenhouse gases and drive climate change. So we need to move to sources of energy that are affordable and reliable, and don’t produce any carbon.

  20. Tomi Engdahl says:

    Tech Tackles Climate Change: Wind & Will Power
    “The future of our planet is in your hands,” U.N. Secretary-General;

    This week EE Times will feature updates on some of the carbon-reducing technologies being developed or already in use that could help companies, countries and citizens reduce their carbon footprint. These are some examples to show the infinite variety and to give credit to engineers and companies working on them.

    Yesterday in Paris at the United Nations’ Conference of the Parties (COP21), where the U.N. Intergovernmental Panel on Climate Change (IPCC 2015) pronounces the progress of greenhouse warming, 151 heads of state and government expressed their opinions on the urgency of the problem.

    “We’ll work to mobilize support to help the most vulnerable countries expand clean energy and adapt to the effects of climate change we can no longer avoid,” President Barack Obama wrote on his Facebook page.

    “The future of the people of the world, the future of our planet, is in your hands,” U.N. Secretary-General Ban Ki-moon said in a speech to negotiators in Paris. “We cannot afford indecision, half measures or merely gradual approaches. Our goal must be transformational.”

    Windmills have been used for ages to provide pure mechanical power: to grind grain into flour, to pump water out of wells, and, as sails, to propel ships. All these uses, however, succumbed to gas- and coal-fired electrical power generation during the rise of the industrial revolution. Today we are paying for this conversion with a hotter world, rising oceans and unpredictable weather patterns.

    The return to wind power–but this time generating electricity, which can be more easily distributed–has all the green aspects of windmills, making it a close rival to solar cells. Wind turbines are plentiful, renewable, clean, emit zero greenhouse gases, and can coexist with existing uses of land (such as on farms, or even on corporate campuses).

    Today the turbines themselves are typically variable-speed, running at 34.5 kV with their own power converters feeding a local substation. The substation uses transformers to step the voltage up to 230 kV for connection to the grid. The substation must also be able to ride through low-voltage conditions, usually by use of capacitor banks to make up for the lost power. Grid operators typically supply wind farmers with their specific requirements, including power factor, frequency constancy and required dynamic behavior during brownouts and outages.
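    The reason for stepping 34.5 kV up to 230 kV is the usual one: for the same power, higher voltage means lower current and lower line losses. A small sketch (the 2 MW turbine rating is my own illustrative choice; only the two voltages come from the text):

```python
import math

# Three-phase line current for a given power and line-to-line voltage.

def line_current_amps(power_mw: float, kv: float,
                      power_factor: float = 1.0) -> float:
    """Line current in amps: I = P / (sqrt(3) * V_LL * pf)."""
    return power_mw * 1e6 / (math.sqrt(3) * kv * 1e3 * power_factor)

# A hypothetical 2 MW turbine on the 34.5 kV collector vs. the 230 kV grid tie:
print(round(line_current_amps(2, 34.5), 1))  # ~33.5 A on the collector side
print(round(line_current_amps(2, 230), 1))   # ~5.0 A on the transmission side
```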

    Most of the greening of power generation comes from government mandates to the power companies, which themselves consider greening merely a cost center. For instance, the state of California has mandated that 30 percent of its grid energy come from renewable resources by 2020.

    Since Schneider Electric is an electricity distribution, automation and management company with over $26 billion in revenue ($2 billion profit), it has a vested interest in green energy and uses green-branded names, such as ecoDesign Way and Green Premium, for its products.

    The biggest personal contribution that any individual can make is to use public transportation, or even better, to start riding a bike to work.

    Bosch supplies the battery, motor, drive gears and handlebar controller to bike manufacturers who want to add an electric model to their line.

  21. Tomi Engdahl says:

    Tech Tackles Climate Change: From Batteries to Mini-Grids
    Go Green Behind or In Front of the Meter;

    While they hammer out a worldwide agreement at the United Nations’ Conference of the Parties (COP21), where the Intergovernmental Panel on Climate Change (IPCC 2015) will pronounce the progress of greenhouse warming, we are examining today the heart of what will make sustainable renewable energy work–the battery. Today we have wind farms and solar-cell farms and the ability to sell excess energy to the grid, but storing grid-sized excess energy is still the most outstanding problem facing both renewable energy sources and the grid itself.

    Large-scale batteries are still mostly dependent on massive banks of the same type of batteries that power your cell phone–lithium ion (Li-Ion)–but other solutions specifically designed for grid-sized problems are here, albeit unproven. But regardless of the battery technology used, what really counts is the behind-the-meter systems and the mini- and micro-grids using them.

    The reason that electric vehicles (EVs), electric augmented airplanes, behind-the-meter solutions and even mini-grids still mostly depend on lithium ion batteries is that Li-Ion batteries are proven. Yes, they have had problems–like catching on fire–and yes they have to be amassed in banks by the hundred, and yes they are not the ideal solution for large-scale problems, but they are here.

    Panasonic and Samsung are two of the most prolific suppliers of inexpensive lithium ion batteries, but these need to be repackaged for grid-sized solutions in order to achieve the kind of volumes necessary to get their price down. Tesla Motors Inc. (Palo Alto, Calif.), famous for its EVs, is also marketing a lithium-ion PowerWall that it suggests mating to roof-mounted solar cells in a home so that it charges during the day and supplies power at night for the house and, of course, recharges the EV in the garage. With enough PowerWalls and solar cells, Tesla suggests that a household could go net-zero–that is, store enough energy to last all day and all night (plus, of course, recharge two EVs in the garage).
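    A rough sizing sketch of that net-zero household idea. The pack capacity, usable fraction and daily loads below are my own illustrative assumptions, not Tesla specifications:

```python
import math

# Whole battery packs needed to carry a household through a day
# without drawing from the grid.

def packs_needed(daily_load_kwh: float, pack_kwh: float,
                 usable_fraction: float = 0.9) -> int:
    """Packs required to cover one day's load from storage."""
    return math.ceil(daily_load_kwh / (pack_kwh * usable_fraction))

# e.g. a 30 kWh/day house plus two EVs at an assumed 10 kWh/day each,
# using assumed 6.4 kWh packs:
print(packs_needed(30 + 2 * 10, 6.4))  # 9 packs
```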

    Beyond Lithium
    Tesla’s patent actually mentions metal-air batteries, because they can also be built with a dozen different metals, and in fact most of the next-generation batteries use some variation of the traditional electrode-electrolyte-electrode architecture started with lead-acid batteries (in all gasoline-powered cars) and repeated with different materials over the years ad nauseam.

    Take GE’s Durathon battery, a molten-salt electrolyte battery–officially a sodium-metal halide battery–with a 20-year lifetime and the ability to be built at grid-sized dimensions. Unfortunately molten salt apparently didn’t make the cut, since GE discontinued the project just this year, but fear not–other salt-based batteries are already being made.

    Sumitomo, for instance, built a salt-based battery that is molten at 142 degrees Fahrenheit, much lower than GE’s, and is nonflammable and is supposedly fire- and explosion-proof.

    But for my money, I’m betting on the Aqueous Hybrid Ion (AHI) battery from Aquion Energy (Pittsburgh, Penn.) which instead of molten salt uses salt water.

    Batteries don’t matter
    Many other grid-level battery choices are available now, with more being developed, but the bottom line is that the battery is just not the most important part of building a grid-level storage system. After all, what do you expect a grid battery to do? The utilities want them to bolster their existing infrastructure so they can continue their mega-monopoly on supplying power and sending those monthly bills to everyone, business and non-profit alike. The only reason they are incorporating renewable energy sources into their grid at all is because government mandates are forcing them to. Even so, utilities will profit by storing renewable energy–from solar arrays and wind farms–which are variable sources at best and need batteries to buffer them from the grid, allowing it to draw from them as necessary, such as during peak times, and store their energy when it’s not needed.

    For the rest of us, the advantage of storage batteries is to cut those monthly grid bills, go green, lower our carbon footprints, stay up and running even during grid outages, and generally feel good as a friend to the planet instead of a pillager.

    Green Charge Networks (Santa Clara, Calif.) may not be unique, but their business plan is almost irresistible. They will come onto your premises, measure your daily energy usage, review the last year of your energy usage patterns, and run it through a software algorithm that calculates the optimal-sized battery backup system for your venue. And get this. They’ll install that system for free.

    “We believe that it’s not the battery technology that really makes the system, but your software algorithms,”

    Indeed, Green Charge Networks uses lithium-ion battery banks today, because they are the most reliable, but will switch to any other battery that proves itself better just by slightly tweaking their algorithms.

    “For the average business consuming 100 kilowatts, our system typically saves about $4000 per month,”

    And that’s without adding renewable energy sources, which Green Charge Networks also supports and encourages users to add. All this takes place on the customer premises “behind the meter” to keep that meter running slower than it ever did before.

    “It’s analogous to data storage caching–we relieve the grid infrastructure from supplying peaks, using software algorithms that make it more economical to cache energy locally,” Shao told us.

    But how does the company make any money? By sharing the savings and paying off the equipment financing over a 10-year period.
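    The peak-shaving economics above come down to demand charges. A toy model (the demand-charge rate and shaved peak are my own illustrative numbers, not Green Charge Networks figures):

```python
# Commercial customers pay a monthly demand charge on their peak kW;
# a battery that caps the peak cuts that charge directly.

def monthly_savings(peak_kw: float, shaved_kw: float,
                    demand_charge_per_kw: float) -> float:
    """Demand-charge savings from shaving the monthly peak."""
    return min(shaved_kw, peak_kw) * demand_charge_per_kw

# e.g. shaving 40 kW off a 100 kW peak at an assumed $100/kW-month:
print(monthly_savings(100, 40, 100.0))  # 4000.0
```

    Which is the same order of magnitude as the $4000-per-month figure quoted above for a 100 kW business.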

    Perhaps the biggest challenge to the utility monopolies is coming from the mini-grid business, which like the behind-the-meter business, I predict will be booming over the next few years, because it does not require new inventions, but just smart business plans using existing equipment.

    By installing 50-kilowatt-hour to 10-megawatt-hour banks of advanced lead-acid and lithium-ion batteries, PDE Total Energy Solutions can minimize customers’ energy bills in a manner similar to the way Green Charge Networks does. And by installing renewables like solar panels on every roof, the savings multiply. As a bonus, the entire mini-grid can run the whole show in the event of a main grid outage.

    Getting its start with military installations that need uninterruptible energy supplies–like 10-megawatt-hour GE generators–for critical operations that can be moved about anywhere in the world, PDE Total Energy Solutions is now moving to renewable energy sources in both the military and private sectors.

  22. Tomi Engdahl says:

    Tech & Climate: Stanford’s Grand, Green Plan;

    This week EE Times features carbon-reducing technologies being developed or already in use that could help companies, countries and citizens reduce their carbon footprint. These are some examples to show the infinite variety and to give credit to engineers and companies working on them.

    Earlier in our series, we noted how solar cells are already outpacing all other zero-carbon-emission energy generating technologies, and that the government is funding high-temperature, high-pressure, turbo-charged aluminum engines that approach zero emissions while still using gasoline. We also discussed wind power and noted that it is actually running neck and neck with solar power as the most popular zero-emission electrical generation technology.

    Today we look at how Stanford University (Stanford, Calif.) is leading the way in converting on-site waste into usable energy and filling in the gaps with smartly distributed green energy generation, aiming for a zero-carbon-footprint facility for all the world to see. The Stanford campus has converted from a 1980s water-based ecology into a 21st-century green-electricity-based ecology and is already nearly 65 percent of the way to Stanford’s goal of a zero-carbon footprint.

    The first step was to analyze the campus’s energy use and identify how best to put wasted energy to productive use. Like many campuses worldwide–academic and corporate alike–the Stanford campus relied on distributing cold water for cooling and steam for heating campus-wide, year-round.

    The steam return (actually hot water) was then heated back up into steam with gas heaters, and the cold-water return used exterior cooling towers to release the unwanted heat into the atmosphere.

    When Joseph Stagner, executive director of Sustainability and Energy Management, measured the exact amount of heating and cooling required, he found a 75 percent overlap. In other words, by using heat pumps he could use the excess heat from the cold return to heat the water from the hot return, and vice versa.

    Heat pumps–which Stagner calls “chillers on steroids”–need electrical input to perform the temperature transfer, but that could be obtained from solar cells, thus creating an energy-saving system that consumes only green, zero-carbon electrical generation.
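    The 75 percent overlap translates directly into recoverable heat: whatever fraction of the smaller of the two loads coincides in time can be moved with heat pumps rather than rejected and re-made with gas. A toy sketch with illustrative loads (my numbers, not Stanford’s):

```python
# Heat that can be shifted from the cold return to the hot return
# instead of being rejected in cooling towers and regenerated with gas.

def recoverable_heat_mwh(heating_mwh: float, cooling_mwh: float,
                         overlap: float = 0.75) -> float:
    """Heat recoverable given the fraction of the loads that coincide."""
    return overlap * min(heating_mwh, cooling_mwh)

# e.g. 100 MWh of heating demand against 120 MWh of cooling demand:
print(recoverable_heat_mwh(100, 120))  # 75.0 MWh
```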

    “My advice to others is that the path to heating and cooling and powering buildings is to electrify all the things you used to use gas to perform, and then generate the electricity with nonpolluting renewable sources,”

    The new system became operational on Jan. 4 of this year [2015], when the 50-megawatt natural gas co-generation plant was turned off.

    The new system eliminates imported natural gas by switching to a 6-megawatt on-site solar cell array plus a 70-megawatt off-campus site in the Mojave.

  23. Tomi Engdahl says:

    Study Claims Lettuce Is “Three Times Worse Than Bacon” For GHG Emissions

    Sticking to a vegetarian diet may not be the best for the environment — in fact, it might be harmful to it. According to new research from Carnegie Mellon University, following the USDA recommendations to consume more fruits, vegetables, dairy and seafood is more harmful to the environment because those foods have relatively high resource uses and greenhouse gas emissions per calorie. “There’s a complex relationship between diet and the environment

    As you might suspect, some find the study dubious at best.

    Vegetarian and “Healthy” Diets Could Be More Harmful to the Environment

    Carnegie Mellon Study Finds Eating Lettuce Is More Than Three Times Worse in Greenhouse Gas Emissions Than Eating Bacon

    Contrary to recent headlines — and a talk by actor Arnold Schwarzenegger at the United Nations Paris Climate Change Conference — eating a vegetarian diet could contribute to climate change.

    In fact, according to new research from Carnegie Mellon University, following the USDA recommendations to consume more fruits, vegetables, dairy and seafood is more harmful to the environment because those foods have relatively high resource uses and greenhouse gas (GHG) emissions per calorie. Published in Environment Systems and Decisions, the study measured the changes in energy use, blue water footprint and GHG emissions associated with U.S. food consumption patterns.

    “Eating lettuce is over three times worse in greenhouse gas emissions than eating bacon,”

  24. Tomi Engdahl says:

    North Carolina Town Defeats Big Solar’s Plan To Suck Up the Sun

    The citizens of Woodland, N.C. have spoken loud and clear: They don’t want none of them highfalutin solar panels in their good town. They scare off the kids. “All the young people are going to move out,” warned Bobby Mann, a local resident concerned about the future of his burg. Worse, Mann said, the solar panels would suck up all the energy from the Sun.

    North Carolina citizenry defeat pernicious Big Solar plan to suck up the Sun
    Town council votes to deny zoning permit that would allow solar farm development.

  25. Tomi Engdahl says:

    North Carolina Town That Defeated Solar Plan Talks Back

    City officials in Woodland, North Carolina, have taken issue with being ridiculed by the internet and want to set the record straight. According to the article: “Usually what happens in Woodland stays in Woodland, a town 115 miles east of Raleigh with one Dollar General store and one restaurant. But news of the Northampton County hamlet’s moratorium on solar farms blew up on social media over the weekend after a local paper quoted a resident complaining to the Town Council that solar farms would take away sunshine from nearby vegetation.”

    “town officials say the Internet got it wrong.”

    Rural NC town mocked on social media after passing solar moratorium

    As outlandish as those claims seem, town officials say the Internet got it wrong.

    It would be foolish to conclude that all the town’s residents have an aversion to solar energy, said Ron Lane, who has been on the Woodland Town Council for two years. In the past year, Lane noted, the town approved zoning changes to accommodate a trio of major solar farms, one of which is nearly completed.

    Woodland simply got too cramped for a fourth solar installation, he said.

    For Woodland’s elected officials, the viral response to the solar blackout became a crash course on the power of social media.

  26. Tomi Engdahl says:

    Cloud service giant Salesforce has signed a 12-year contract to purchase wind energy for its data centers’ needs. Investing in a West Virginia wind farm is part of the company’s aim to make its operations carbon neutral by 2050.

    Salesforce is not the only IT giant investing in renewables; Google, for example, has made similar agreements. In all, US companies signed more than three gigawatts’ worth of wind and solar power contracts in 2015, up from only 1.2 GW the previous year.


  27. Tomi Engdahl says:

    Facebook to set up second data center in Europe
    The company’s sixth data center in the world will use gear from the Open Compute Project

    Facebook is setting up a data center in Clonee, Ireland, which will be its sixth in the world and its second outside the U.S.

    The new data center will be equipped with servers and storage from the Open Compute Project, a Facebook initiative that shares designs as open source with other data center operators to standardize and drive down the costs of equipment.

    “We will outfit this data center with the latest OCP server and storage hardware, including Yosemite for compute,”

    The Clonee center will run entirely on renewable energy by taking advantage of the abundant supply of wind in the location, Parikh wrote. “This will help us reach our goal of powering 50 percent of our infrastructure with clean and renewable energy by the end of 2018,” he added.

  28. Tomi Engdahl says:

    US Could Lower Carbon Emissions 78% With New National Transmission Network

    mdsolar writes with this story from Smithsonian magazine about how building a national transmission network could lead to a gigantic reduction in carbon emissions. From the story: “The United States could lower carbon emissions from electricity generation by as much as 78 percent without having to develop any new technologies or use costly batteries, a new study suggests. There’s a catch, though. The country would have to build a new national transmission network so that states could share energy. ”

    The U.S. Could Switch to Mostly Renewable Energy, No Batteries Needed
    Better electricity sharing across states would dampen the effects of variable weather on wind and solar power

    The United States could lower carbon emissions from electricity generation by as much as 78 percent without having to develop any new technologies or use costly batteries, a new study suggests. There’s a catch, though. The country would have to build a new national transmission network so that states could share energy.

    “Our idea was if we had a national ‘interstate highway for electrons’ we could move the power around as it was needed, and we could put the wind and solar plants in the very best places,” says study co-author Alexander MacDonald, who recently retired as director of NOAA’s Earth System Research Laboratory in Boulder, Colorado.

    One of the big issues with wind and solar power is that their availability is dependent upon the weather. Solar is only available on sunny days, not during storms or at night. Wind turbines don’t work when the wind doesn’t blow enough—or when it blows too much. Because of this, some studies have argued that these technologies are only viable if large-capacity batteries are available to store energy from these sources to use when they aren’t working. That would raise the cost of electricity well beyond today’s prices.

    But “there’s always wind and solar power available somewhere,” MacDonald notes. So he and his colleagues set out to design a low-carbon electricity-generation system that better incorporated—and even took advantage of—the nation’s weather. Their study appears today in Nature Climate Change.

    Their computer model showed that by switching to mostly wind and solar power sources—with a little help from natural gas, hydroelectric and nuclear power when the weather doesn’t cooperate—the United States could reduce carbon emissions by 33 to 78 percent from 1990 levels, depending on the exact cost of renewable energy and natural gas.

    Adding coal into the mix did not make electricity any cheaper, but it did result in a 37 percent increase in carbon emissions.

    The key to this future would be the development of a system for transferring electricity across the country, so that a windy day in North Dakota could power a cloudy, calm day in New York. This would not only require new agreements between states—Texas, for instance, has its own separate power grid—but also an upgrade to the transmission lines that move electrons from one place to another.

    In most areas, energy moves over high-voltage alternating current lines, but there are limitations in how far these lines can transmit energy. Switching to high-voltage direct current would let energy producers transmit more electricity a longer distance.

    Building a new network for transmitting electricity would be a big job. But the computer model showed that it can be cost effective, because in the long run it would allow cheap power to be available

    “By building these transmission facilities, we reduce the costs to remove the carbon rather than increasing it,”

    “We can use existing transmission pathways,” Jacobson says, and just improve the lines that run across them. “You don’t need as many new pathways as you think.”

  29. Tomi Engdahl says:

    Using warm water for data center cooling

    There are many ways to cool a data center. Engineers should explore the various cooling options and apply the solution that’s appropriate for the application.

    Moving toward today’s technology

    One can glean from this information that it wasn’t until the late 20th/early 21st century that computing technology really took off. New processor, memory, storage, and interconnection technologies resulted in more powerful computers that use less energy on a per-instruction basis. But one thing remained constant: All of this computationally intensive technology, enclosed in ever-smaller packages, produced heat—a lot of heat.

    As the computer designers and engineers honed their craft and continued to develop unbelievably powerful computers, the thermal engineering teams responsible for keeping the processors, memory modules, graphics cards, and other internal computer components at an optimal temperature had to develop innovative and reliable cooling solutions to keep pace with this immense computing power. For example, modern-day computational science may require close to 3,000 cores, roughly the equivalent of 375 servers, in one rack. This equates to an electrical demand (and corresponding cooling load) of 90 kW per rack. This will yield a data center with an electrical density of considerably more than 1,000 W/sq ft, depending on the data center layout and the amount of other equipment in the room. With numbers like this, it is clear: conventional air cooling will not work in this type of environment.
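    The density claim is easy to sanity-check. A sketch using the article's per-rack numbers (the 60 sq ft of floor area per rack, covering the rack plus its share of aisle space, is an assumed figure for illustration only):

```python
# Rough density check. The article gives 90 kW and ~375 servers per rack;
# the floor area per rack is an assumption, not from the article.
rack_kw = 90
servers_per_rack = 375
sq_ft_per_rack = 60  # assumed: rack footprint plus its share of aisles

watts_per_server = rack_kw * 1000 / servers_per_rack
density_w_per_sqft = rack_kw * 1000 / sq_ft_per_rack
print(f"{watts_per_server:.0f} W per server")   # 240 W per server
print(f"{density_w_per_sqft:.0f} W/sq ft")      # 1500 W/sq ft, well above 1,000
```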

    Current state of data center cooling

    Data center cooling system development, employing the most current and common industry methodologies, ranges from split-system, refrigerant-based components to more complex (and sometimes exotic) arrangements, such as liquid immersion, where modified servers are submerged in a mineral-oil-like solution; all heat transfer to the ambient air is eliminated because the circulating oil solution becomes the conduit for heat rejection. Other complex systems, such as pumped or thermo-syphon carbon-dioxide cooling, also offer very high efficiencies in terms of the volume of heat-rejection medium needed: 1 kg of carbon dioxide absorbs the same amount of heat as 7 kg of water. This can potentially reduce piping and equipment sizing, and also reduce energy costs.
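    The 7:1 ratio quoted for carbon dioxide versus water translates directly into coolant mass flow. A sketch pairing it with a 90 kW rack load and assuming the water absorbs heat over a 10 K temperature rise (both the load pairing and the delta-T are illustrative assumptions, not from the article):

```python
# Coolant mass flow needed to reject 90 kW: water vs. CO2 (7:1 per the text).
heat_kw = 90.0
cp_water_kj_per_kg_k = 4.186  # specific heat of water
delta_t_k = 10.0              # assumed temperature rise across the loop

water_kg_per_s = heat_kw / (cp_water_kj_per_kg_k * delta_t_k)
co2_kg_per_s = water_kg_per_s / 7  # CO2 absorbs ~7x the heat per kg
print(f"water: {water_kg_per_s:.2f} kg/s, CO2: {co2_kg_per_s:.2f} kg/s")
```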

    Water-based cooling in data centers falls somewhere between the basic (although tried-and-true) air-cooled direct expansion (DX) systems and complex methods with high degrees of sophistication. And because water-based data center cooling systems have been in use in some form or another for more than 60 yr, there is a lot of analytical and historical data on how these systems perform and where their strengths and weaknesses lie. The most common water-based approaches today can be aggregated anecdotally into three primary classifications: near-coupled, close-coupled, and direct-cooled.

    Most of these complications stem from proximity and physical containment; if the hot air escapes into the room before the cold air can mix with it and reduce the temperature, the hot air now becomes a fugitive and the cold air becomes an inefficiency in the system. In all air-cooled data centers, a highly effective method for reducing these difficulties is to use a partition system as part of an overall containment system that physically separates the hot air from the cold air, allowing for a fairly precise cooling solution.

  30. Tomi Engdahl says:

    Swiss Project Looking To Harness Kite Power

    Switzerland has brought us many things: the cuckoo clock, cheese with holes in it, and… kite power? That’s the idea of a Swiss project that is trying to tap the energy of a regular wind that blows between Lake Geneva and the Alps. The group hopes to build large kites that fly at about 150 meters above the ground, with a generator and other components on the ground. The way this wind energy is converted into electricity is interesting: the kite is pulled up by the wind, spiraling higher and pulling the cable that drives the generator. Once it reaches a maximum height, the kite is trimmed so it sinks down to a lower altitude, then trimmed again to catch the wind and climb.

  31. Tomi Engdahl says:

    Router configurations suck (power out of mobile devices, that is)
    RFC asks IPv6 admins to quiet routers so mobile devices don’t have to wake up quite so often

    Unknown and unseen to most users, your smartphone is “talking” in its sleep, and that can sap your battery.

    The problem? Router advertisements, one of the fundamental operating mechanisms of the Internet, can generate enough communication to have a noticeable impact on battery life.

    Router advertisements are multicasts that remind the devices they serve what IP address the router’s interface is using (in the old IPv4 world, for example). However, when the smartphone receives that advertisement, it has to process it, even if the screen stays dark.

    Over at the IETF, Cisco engineer Andrew Yourtchenko and Google researcher Lorenzo Colitti are suggesting ways that sysadmins can lighten the load on users, at least in the IPv6 world.

    In particular, the authors say the habits of sysadmins in wired networks, where router advertisements might fly around every few seconds, don’t translate well to the world of mobile devices.

    In RFC 7772, the pair lay down the current best practice for configuring systems so that on devices like phones and tablets, router advertisements don’t suck more than 2 per cent of a device’s power during sleep mode.

    They note that “current-generation devices might consume on the order of 5 mA when the main processor is asleep. Upon receiving a packet, they might consume on the order of 200 mA for 250 ms, as the packet causes the main processor to wake up, process the RA, attend to other pending tasks, and then go back to sleep. Thus, on such devices, the cost of receiving one RA will be approximately 0.014 mAh”.

    That’s too high, the RFC contends: to keep to their suggested two per cent power budget, the document says the average power budget for router advertisements has to be kept to 0.1 mA, which equates to the device receiving seven advertisements per hour.
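    The RFC's arithmetic can be reproduced directly from the figures quoted above (a sketch; the values are the ones in the text):

```python
# Reproduce the RFC 7772 power-budget arithmetic quoted above.
SLEEP_CURRENT_MA = 5.0            # draw while the main processor sleeps
WAKE_CURRENT_MA = 200.0           # draw while processing one RA
WAKE_TIME_H = 250 / 1000 / 3600   # 250 ms, expressed in hours

# Energy cost of receiving a single router advertisement, in mAh
cost_per_ra_mah = WAKE_CURRENT_MA * WAKE_TIME_H
print(f"cost per RA: {cost_per_ra_mah:.4f} mAh")  # ~0.0139 mAh, the RFC's 0.014

# A 2% budget on the 5 mA sleep current is 0.1 mA of average extra draw,
# which corresponds to roughly seven advertisements per hour.
budget_ma = 0.02 * SLEEP_CURRENT_MA
ras_per_hour = budget_ma / cost_per_ra_mah
print(f"budget {budget_ma:.1f} mA -> {ras_per_hour:.1f} RAs/hour")  # ~7.2
```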

  32. Tomi Engdahl says:

    VTT Technical Research Centre of Finland and Aalto University, together with domestic companies, have started a Tekes project that is developing scalable and energy-efficient data center, optical switching, and transmission solutions. Optical technologies are needed to support the explosive growth of data volumes caused, for example, by the spread of cloud services and mobile streaming.

    Data volumes have long grown exponentially, and data center capacity has doubled every eighteen months. In 2014 alone, data centers in EU countries consumed about 120 terawatt-hours of energy. That is equivalent to the annual maximum production of roughly fourteen one-gigawatt nuclear reactors, or double the current Finnish electricity production.
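    As a scale check, a one-gigawatt reactor running at maximum output all year delivers about 8.8 TWh, so the arithmetic works out roughly as follows:

```python
# Scale check on the 120 TWh/year figure for EU data centers.
eu_dc_twh = 120.0
reactor_gw = 1.0
hours_per_year = 8760
reactor_twh_per_year = reactor_gw * hours_per_year / 1000  # ~8.76 TWh
reactors_equivalent = eu_dc_twh / reactor_twh_per_year
print(f"about {reactors_equivalent:.1f} one-gigawatt reactors")  # ~13.7
```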

    In the current technology, the variations of optical and electronic signals as well as the required electronics cooling are the biggest sources of energy loss.


  33. Tomi Engdahl says:

    The Quest for Server Power Efficiency
    APEC still focuses on data center power use

    Glamour items like energy harvesting and wireless power transfer are likely to make “guest appearances” at next week’s APEC Conference. GaN transistor deployments will be carefully monitored. But on-going efforts to promote data-center energy transfer efficiency retain their “bread-and-butter” utility.

    Energy transfer efficiency in data centers — and techniques for improving it — should be at (or near) the top of your list.

    Although factory automation and commercial lighting — a rebuilding of our industrial infrastructure — are replacing computing and consumer electronics as the drivers for power product development, voltage regulator products with higher energy transfer efficiency still generate revenues for manufacturers of power management ICs, modules, and power discretes.

    When you look at the power distribution chain for a medium-sized data center (about 935 kilowatts in 2014), you can spot pockets of waste and inefficiency. Running 7 days per week, 365 days per year, with (say) a 75% load such a data center will generate a $600,000 annual electricity bill (assuming a cost of 10 cents per kilowatt-hour). A mere one-percent improvement in power train efficiency provides a $6,000 annual savings.
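    The arithmetic behind those figures is easy to reproduce (the result lands slightly above the article's rounded $600,000):

```python
# Annual electricity bill for a medium-sized data center (figures from the text).
capacity_kw = 935         # facility power draw at full load
load_factor = 0.75        # average utilization assumed by the article
hours_per_year = 24 * 365
price_per_kwh = 0.10      # dollars

annual_kwh = capacity_kw * load_factor * hours_per_year
annual_bill = annual_kwh * price_per_kwh
one_percent_saving = annual_bill * 0.01
print(f"annual bill: ${annual_bill:,.0f}")                    # $614,295
print(f"1% efficiency gain: ${one_percent_saving:,.0f}/yr")   # $6,143
```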

    Reducing data center power consumption
    It turns out, a huge portion of the electricity bill — up to 60% in legacy equipment — is the cost of running cooling equipment.

    The other cooling recommendation, driven by Facebook and the Open Compute Project, may have more demonstrable support. The Open Compute Project, whose members include Google, Quanta, and HP, has been promoting standards for hyperscale computing card form factors. These would enable server cards to be mechanically interchangeable. The bigger contribution to power management efficiency is in the streamlining of the airflow from the back of the card to its front. While this open-frame architecture creates a greater reliance on small fans, it reduces the on-time for the room-sized fans and chillers. (It also improves the PUE rating by giving a larger weight to the IT-equipment denominator.)

    After a reduction in cooling costs, the reductions made by other techniques appear small. But they remain significant. There is consensus among data center systems providers — companies like Schneider Electric and General Electric (these days concentrating on IoT-enabled smart buildings) — that significant energy savings can be achieved by streamlining the power transmission train. This is done chiefly by reducing the number of voltage conversions in the power transmission chain.

    Legacy power transmission chains would run the 480V AC coming into a building through a single large uninterruptible power supply (UPS) system, based on a lead-acid battery backup. The UPS would convert the AC to a lower voltage DC, to charge the battery, and then convert the DC back to a higher voltage to distribute to the racks. More contemporary thinking suggests keeping the UPS out of the circuit, putting it in a straight-wire “bypass” mode, and switching the UPS back into the circuit only when a power line fault is detected.

    The high-voltage AC is delivered to the power supplies mounted in the racks. In many cases, the three-phase AC input is put through a transformer to generate single-phase AC, but the rack-mounted power supplies recommended by the Open Compute Project are intended to take a 277V AC input and output 12 volts DC, 480 watts per card.

  34. Tomi Engdahl says:

    Running Out Of Energy?

    The anticipated and growing energy requirements for future computing needs will hit a wall in the next 24 years if the current trajectory is correct. At that point, the world will not produce enough energy for all of the devices that are expected to be drawing power.

    A report issued by the Semiconductor Industry Association and Semiconductor Research Corp. bases its conclusions on system-level energy per bit operation, which is a combination of many components such as logic circuits, memory arrays, interfaces and I/Os. Each of those contributes to the total energy budget.

    For the benchmark energy per bit, as shown in the chart below, computing will not be sustainable by 2040. That is when the energy required for computing is estimated to exceed the world’s estimated energy production. As such, significant improvement in the energy efficiency of computing is needed.

    “It’s not realistic to expect that the world’s energy production is going to be devoted 100% to computing so the question is, how do we do more with less and where are the opportunities for squeezing more out of the system?”

    “Anytime someone looks at a growth rate in a relatively young segment and extrapolates it decades into the future, you run into a problem that nothing can continue to grow at exponential rates forever,”

    Case in point: Once upon a time, gasoline cost 5 cents a gallon and cars got 5 or 10 miles to the gallon. Extrapolating from the whole world driving as much as Americans do in cars that get 5 miles to the gallon, one could have predicted we would run out of oil by 2020. But as time goes on, technology comes along that changes usage patterns, and cars are no longer built to get 5 miles to the gallon.

    He noted that the core of the argument here is really a formula about the relationship between computing performance and energy, and it is primarily tracking the evolution of CPUs and capturing the correct observation that in order to go really really fast it takes more than linear increases in power and complexity.

    “To push the envelope you have to do extraordinary things and the core assumption of the whole report is that you will continue on that same curve as you ramp up computing capability further still.”

    And a key area of focus today is the interface between design and manufacturing where there is a constant need to keep focusing on how to contribute to getting more done with less power, and ultimately with less total energy consumed.

    That also requires adapting to the new hardware architectures that bring more parallelism and more compute efficiency, and then working with very large distributed systems. Given the immense challenges of lowering the energy requirements of computing in the future, it is obvious the task will be accomplished with all hands on deck. And given the impressive accomplishments of the semiconductor industry in the past 50 years, there’s no doubt even more technological advancements will emerge to hold back the threat of hitting the energy wall.

  35. Tomi Engdahl says:

    Energy savings in electrical distribution systems

    Consider these energy efficiency options for retrofitting existing and designing new buildings’ electrical distribution systems.

    The codes and standards associated with energy efficiency establish the minimum energy efficiency requirements needed for the design of new buildings and renovations to existing buildings. The codes, however, are geared toward the efficiency of mechanical and lighting systems. Not a lot of information is provided within these codes to establish energy efficiency measures for the design of the power distribution systems

    The standard also establishes that the voltage drop shall not exceed 2% for feeders and 3% for branch circuits (Chapter 8.4.1). Although ASHRAE 90.1 is not more stringent than the NFPA 70: National Electrical Code (NEC) voltage-drop recommendations outlined in the fine print notes (FPNs) included in Article 210.19, ASHRAE 90.1 does establish voltage drop as a requirement for meeting the standard. The NEC FPN recommends a maximum voltage drop of 5%, with feeders not to exceed 3%; because this is an FPN, it is a recommendation rather than a code requirement.
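    A minimal sketch of checking a design against those limits (the helper is illustrative; it is not code from either standard):

```python
def voltage_drop_ok(drop_volts: float, nominal_volts: float, limit_pct: float) -> bool:
    """True if the percent voltage drop stays within the given limit."""
    return 100.0 * drop_volts / nominal_volts <= limit_pct

# A 480 V feeder with an 8 V drop is about 1.7%: within ASHRAE 90.1's 2% limit.
print(voltage_drop_ok(8.0, 480.0, 2.0))    # True
# A 120 V branch circuit with a 4.5 V drop is 3.75%: over the 3% limit.
print(voltage_drop_ok(4.5, 120.0, 3.0))    # False
```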

    Copper versus aluminum

    Copper and aluminum are the most commonly used materials for conductors, busing in distribution equipment, and windings in transformers. There is a common misconception that because copper is more conductive than aluminum, copper distribution equipment will be more energy-efficient than aluminum. That is not the case. There are other factors to take into account, including conductor size, equipment size, cost, and the weight of the equipment and conductors.

    Depending upon the alloy of aluminum used for the conductors or bus, the conductivity of aluminum is approximately 56% to 61% that of copper. Although the difference in conductivity is significant, it will not significantly affect the overall efficiency of the distribution equipment (panelboards, switchboards, and transformers) because, regardless of the material used, the equipment is still required to meet NEMA and UL standards for temperature rise, which governs the efficiency of the equipment.

    Similarly, although the conductors will be larger, the efficiency of the cables will not be affected.

    The cost of the materials is dependent upon the market. However, it is typical to see the following cost savings when using aluminum: 30% to 50% for dry-type transformers, 20% for substation dry-type transformers, 25% for liquid-filled pad-mounted transformers, $1,000 per vertical section for a 1,000-amp switchboard, and $1,500 per vertical section for 3,000/4,000-amp switchboards.

    On average, the equivalent aluminum conductor will reduce the length a cable can be run by approximately 40% to still meet the ASHRAE-recommended 3% voltage drop.
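    That ~40% figure follows from the conductivity ratio: at the same conductor size and current, resistance per unit length scales inversely with conductivity, so the run that stays within a fixed voltage-drop limit shrinks in proportion. A sketch using the 61% upper bound quoted earlier:

```python
# Maximum run length of an equally sized aluminum conductor, relative to copper.
al_relative_conductivity = 0.61           # 56%-61% of copper, per the text
max_run_ratio = al_relative_conductivity  # same size, current, and drop limit
reduction_pct = (1 - max_run_ratio) * 100
print(f"run length reduced by about {reduction_pct:.0f}%")  # ~39%, matching the ~40% quoted
```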

    Although there isn’t a significant energy efficiency advantage using copper versus aluminum, the material selected for the project application should always be evaluated during design.

    Although the efficiency difference between copper and aluminum transformers is not significant above 15 kVA, there is enough of a difference to dictate that copper should be used when energy efficiency is the main goal of the project. However, when cost is accounted for, the initial cost savings of aluminum often outweighs the loss in energy efficiency.

    Balancing electrical loads

    One of the no-cost measures to establish energy efficiency in the distribution system design is to balance the single-phase loads on 3-phase distribution systems. If the loads are not properly distributed among the 3-phase buses, the result will be unequal current and unbalanced voltage at the load (unbalanced distortion). Although not a code requirement, the designer should always take balancing the loads into consideration during their design. As a good engineering practice, the unbalanced load should be designed to not exceed 2% unbalance. The unbalanced distortion will cause power loss, voltage-drop issues, and overheating of induction motors and transformers.

    Additionally, unbalanced single-phase loads lead to harmonics within the electrical system.

    Heat loss

    Figure 3: This new motor starter was installed within an existing motor control center for exhaust fans, which were included as part of an HVAC energy efficiency upgrade project in an academic building that housed classrooms, offices, and various art studios.

    Heat is the byproduct of inefficiency in electrical systems. Typically, the more efficient the transformer, the less heat that is dissipated from the transformer, therefore, transformers with lower temperature rise tend to be more efficient. General-purpose dry-type transformers used for distribution within buildings come in three standard temperature rises: 80°C (176°F), 115°C (239°F), and 150°C (302°F).

    The cost of the power lost to heat can be significant over the lifetime of the transformer. This cost does not take into account the cost of the additional air conditioning and ventilation required to make up for the heat lost into the space, which can also be significant.

    Over the lifetime of the transformer, the cost of power lost to heat will justify the additional cost associated with purchasing a transformer with the lower temperature rise when choosing between the 150°C (302°F) and 115°C (239°F) temperature-rise transformers.

    Although manufacturers list efficiencies at 25%, 50%, 75%, and 100% of full load, the NEMA TP-1 and 2016 DOE transformer efficiency standards are based upon a DOE study that determined transformers are typically loaded at 32%.

    CSL-3 and 2016 DOE transformers provide significant annual cost savings

    Once again, using the example of the 75-kVA transformer, loaded at 50%, there is an annual cost savings of $256.23 and $337.37, which equates to $6,405.75 and $8,434.25, respectively, over the 25- to 30-year lifetime of the transformers.
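    The quoted totals correspond to the 25-year end of the stated lifetime range:

```python
# Lifetime savings for the 75 kVA transformer example, loaded at 50%.
annual_savings = (256.23, 337.37)  # dollars per year, from the text
years = 25
lifetime_savings = [round(s * years, 2) for s in annual_savings]
print(lifetime_savings)  # [6405.75, 8434.25] -- the figures quoted above
```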

  36. Tomi Engdahl says:

    APEC 2016 – Plenty of hardware, but where was the software?

    Power management, of course, is the theme of this event and component-level product advancements certainly need to be given their proper recognition. Indeed, the ability of converters to step directly from 48V to 1V or less, for example, is clearly attractive to drive efficiency gains in distributed power systems

    Technology-wise, GaN remains a mainstay of the show, except that where previously it’s always been on the cusp of becoming mainstream, now it finally seems to have arrived with GaN devices providing the basis for real products

    However, digital power products by themselves do not deliver what can truly be considered as Software Defined Power ®, which is what CUI sees as the essential next step for digital power in moving to the system level and adding intelligence outside of the power supply. Other industries are layering software on top of hardware to advance system-level solutions for greater efficiency, lower capital and running costs, and other benefits. Considering that 10% of the world’s electricity is now consumed by data centers, efforts to manage power on a system-wide basis are well overdue. Strategies like “peak load shaving” combined with backup capabilities are needed to even out load fluctuations and eliminate single points of failure.

  37. Tomi Engdahl says:

    Programmers Aren’t Writing Green Code Where It’s Most Needed

    Confession? I don’t write green code. I mean, it might be green code just by coincidence, but I’ve never really thought too much about the relative energy consumption demanded by this design pattern or algorithm versus some other. Sadly, this is true even when I’m working with actual hardware and low-level software, such as that written in plain C for embedded devices (in my case, for an Arduino board or other microcontroller platform). What’s more, I don’t think the green code idea has ever come up in my years of computer science classes.

    I’m hardly the exception, according to a paper presented this week at the 38th International Conference on Software Engineering. In interviews and surveys conducted with 464 software engineers from a range of disciplines—including mobile, data center, embedded, and traditional software development—researchers found that where green coding most matters, its practice is rare.

    Green software development is as it sounds. In their own words, the researchers behind the new paper, a team drawn from IBM, Google, Microsoft, and the University of Delaware, were looking specifically for answers relating to how software engineers think about battery life/energy usage when they write requirements, design, construct, test, and maintain their software.

    “Based on our interviews, we initially theorized that practitioners with experience in mobile (‘battery life is very important, especially in mobile devices’), data center (‘any watt that we can save is either a watt we don’t have to pay for, or it’s a watt that we can send to another server’), and embedded (‘maximum power usage is limited so energy has a big influence on not only hardware but also software’) would more often have requirements or goals about energy usage than traditional practitioners (‘we always have access to power, so energy isn’t the highest priority’).”

    This turned out to be accurate for only mobile developers, who used green practices more than any other group, with 53 percent reporting that they “almost always” or “often” wrote applications with energy usage requirements.

    An empirical study of practitioners’ perspectives on green software engineering

    The energy consumption of software is an increasing concern as the use of mobile applications, embedded systems, and data center-based services expands. While research in green software engineering is correspondingly increasing, little is known about the current practices and perspectives of software engineers in the field. This paper describes the first empirical study of how practitioners think about energy when they write requirements, design, construct, test, and maintain their software.

  38. Tomi Engdahl says:

    Report: US data centers are not energy hogs

    In 2014, data centers in the US consumed 70 billion kWh, a figure only four percent higher than the usage in 2010, according to the report, “United States Data Center Energy Usage,” sponsored by the Department of Energy and put together by researchers at the Lawrence Berkeley National Laboratory led by Arman Shehabi.

    This is a significant slowdown compared with the growth from 2000 to 2010, and it has come about, despite a huge growth in demand for data, because of current efficiency trends in data centers.

  39. Tomi Engdahl says:

    Report unpacks data center cooling market opportunities til 2021

    The analysis contends that “the exponential growth of data (structured and unstructured) and the increasing global demand for cloud computing is driving the demand for data centers, thereby fueling the growth of the data center cooling systems.”

    “However, the major challenge faced by the industry is the huge set-up cost and electricity consumption associated with these cooling systems,” adds the report’s summary. “To overcome the same, the service providers are coming up with greener cooling solutions to make the systems more energy-efficient.”

  40. Tomi Engdahl says:

    Home> Community > Blogs > Eye on Efficiency
    Battery chargers to become more efficient by 2018

    Early this month, the U.S. Department of Energy (DOE) published a final rulemaking describing the country’s first Energy Conservation Standards for Battery Chargers (BCs). It’s been a long time coming – the Department first proposed BC efficiency requirements in early 2012.

    The charger regulation is based on a single metric, Unit Energy Consumption (UEC). It limits the annual energy consumption for 7 different classes of BCs. Expressed as a function of battery energy (Ebatt), a charger’s UEC reflects the “non-useful” energy consumed in all modes of operation (i.e. the amount of energy consumed but not transferred to the battery as a result of charging).
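    As described, the UEC metric reduces to a simple subtraction; a sketch with illustrative numbers (the function name and values are hypothetical, not from the DOE test procedure):

```python
def unit_energy_consumption(energy_drawn_kwh: float, energy_to_battery_kwh: float) -> float:
    """Annual 'non-useful' energy: total consumed in all modes minus what reaches the battery."""
    return energy_drawn_kwh - energy_to_battery_kwh

# A charger drawing 10 kWh/yr that delivers 6 kWh to its battery wastes 4 kWh/yr.
print(unit_energy_consumption(10.0, 6.0))  # 4.0
```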

    The DOE decided to exclude “back-up” battery chargers and uninterruptible power supplies (UPSs) from this ruling because of specific product testing issues. The Department is currently working on a separate rulemaking for UPSs.


  41. Tomi Engdahl says:

    By 2040, computers will need more electricity than the world can generate
    So says the semiconductor industry’s last ever communal roadmap

    Without much fanfare, the Semiconductor Industry Association earlier this month published a somewhat-bleak assessment of the future of Moore’s Law – and at the same time, called “last drinks” on its decades-old International Technology Roadmap for Semiconductors (ITRS).

    The industry’s been putting together the roadmap every two years since the 1990s, when there were 19 leading-edge chip vendors. Today, there are just four – Intel, TSMC, Samsung and Global Foundries – and there’s too much road to map, so the ITRS issued in July will be the last.

    The group suggests that the industry is approaching a point where economics, rather than physics, becomes the Moore’s Law roadblock. The further below 10 nanometres transistors go, the harder it is to make them economically.

    That will put a post-2020 premium on stacking transistors in three dimensions without gathering too much heat for them to survive.

  42. Tomi Engdahl says:

    Mistbox Makes Your AC Sweat Its Way to Better Efficiency

    A new smart-home device takes inspiration from the human body’s ability to cool itself through perspiration and evaporation, helping consumers get more efficiency out of their air conditioning (AC) units and save money on energy costs.

    Called Mistbox, the device—attached to the outside part of a house’s AC system—works together with related tubing to create mist around an AC unit to keep the air around it cool.

    This type of pre-cooling has been used commercially for decades, but residential AC users haven’t had similar capability until now, according to the company, which was co-founded by CEO Josh Teekell and COO Andrew Parks.

    Mistbox is solar powered, which means it doesn’t require batteries that need changing. The unit can harvest enough solar power if the AC unit to which it’s attached receives direct or indirect sunlight for at least some portion of the day, the company said.

    Users control Mistbox via an accompanying smartphone app that allows them to set a temperature at which they would like the unit to start working.

    To optimize how the device functions, Mistbox uses algorithms that automatically choose the best time to mist the area around the AC unit.

    Benefits of using the unit include a 20 percent to 30 percent savings on the monthly cost of cooling homes during the hot months of the year, the company said.

  43. Tomi Engdahl says:

    Report: US data centers are not energy hogs

    “The energy used in US data centers grew by only four percent in the last five years, and could shrink in the next five, according to a US government sponsored report which contradicts fears of an explosion in energy consumption by the cloud.”

    In 2014, data centers in the US consumed 70 billion kWh, a figure only four percent higher than the usage in 2010, according to the report, “United States Data Center Energy Usage,” sponsored by the Department of Energy and put together by researchers at the Lawrence Berkeley National Laboratory led by Arman Shehabi.


  45. Tomi Engdahl says:

    Google is using its highly intelligent computer brain to slash its enormous electricity bill

    Google has finally revealed a commercial use for DeepMind – a British artificial intelligence company it acquired for over $600 million in 2014.

    DeepMind made headlines for beating the world’s best human player at the notoriously complex board game Go, and it has recently started working with hospitals in the UK on a number of healthcare projects, but until now the startup had yet to make any money for Google.

    Google announced on Wednesday that it has been using a DeepMind-built AI system to control certain parts of its power-hungry data centers over the last few months as it looks to make its vast server farms more environmentally friendly.

    Last year, a Greenpeace report predicted that the electricity consumption of data centers is set to account for 12% of global electricity consumption by 2017 and companies like Google, Amazon, Facebook and Apple have some of the biggest data centers in the world.

    The 40% energy saving on cooling helped one of Google’s data centers to achieve a 15% reduction in power usage effectiveness (PUE). PUE is defined as the ratio of the total building energy usage (IT plus overhead such as pumps, chillers and cooling towers) to the IT energy usage (Google’s servers). The lower the PUE, the better.
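    As a rough illustration of how a cooling-only saving translates into a PUE change, here is a minimal sketch with made-up facility numbers (not Google’s actual figures), assuming all non-IT overhead is cooling:

```python
def pue(it_kwh: float, overhead_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy divided by IT energy.

    A PUE of 1.0 would mean every watt goes to the servers themselves.
    """
    return (it_kwh + overhead_kwh) / it_kwh

# Hypothetical facility: 100 kWh of IT load, 60 kWh of cooling overhead.
before = pue(100.0, 60.0)        # 1.6
after = pue(100.0, 60.0 * 0.6)   # cut cooling energy by 40% -> 1.36
reduction = (before - after) / before

print(f"PUE {before:.2f} -> {after:.2f}, a {reduction:.0%} reduction")
```

    With these assumed numbers, a 40% cooling saving happens to yield exactly a 15% PUE reduction; a facility with less cooling overhead to begin with would see a smaller PUE change from the same percentage saving.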

  46. Tomi Engdahl says:

    HVAC codes and standards: cooling and energy efficiency

    Codes and standards dictate the design of HVAC systems; however, there are ways to improve the design of nonresidential buildings to achieve maximum energy efficiency.

  47. Tomi Engdahl says:

    Are Energy Standards for Computers on Horizon?

    California has put the wheels in motion, and the NRDC (Natural Resources Defense Council) says electricity use by computers can be cut in half using off-the-shelf technology, with no impact on performance and at negligible cost.

    The computer industry may soon be facing another round of regulatory measures. This time, they may come in the form of state-imposed energy efficiency rules.

    The California Energy Commission appears to be moving ahead with the nation’s first energy efficiency standards for computers and monitors. Some reports indicate that the standards, which would apply to the power-use settings for desktops, laptops and computer monitors sold in the state, may be adopted by the end of this year; given California’s market size and influence, adoption of these standards could spark industrywide changes, the news report noted.

    The standards, which would vary by computer type and possibly be phased in during 2017 and/or 2018, would save consumers hundreds of millions of dollars every year, according to the CEC’s March 2015 press release. For desktop computers alone, it is estimated that a $2 increase in manufacturing costs will return $69 to consumers in energy savings over the five-year life of a desktop, the organization claims.
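    Back-of-the-envelope, the CEC’s claimed payback works out as follows; the retail electricity rate below is my own illustrative assumption, not a figure from the release:

```python
added_cost = 2.00          # claimed manufacturing cost increase per desktop ($)
lifetime_savings = 69.00   # claimed energy savings over a 5-year life ($)
years = 5
rate = 0.17                # assumed California retail rate, $/kWh (illustrative)

per_year = lifetime_savings / years            # $13.80 per year
kwh_saved = per_year / rate                    # roughly 81 kWh per year
payback_ratio = lifetime_savings / added_cost  # 34.5x return on the added cost

print(f"${per_year:.2f}/yr, ~{kwh_saved:.0f} kWh/yr, {payback_ratio:.1f}x return")
```

    In other words, the claim amounts to shaving on the order of 80 kWh per desktop per year, mostly by tightening default power-management settings.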

  49. Tomi Engdahl says:

    New UPS efficiency standards proposed for U.S.

    The U.S. Department of Energy (DOE) recently published a Notice of Proposed Rulemaking (NOPR) describing minimum efficiency levels for the product group. The proposal is an offshoot of this year’s battery charger efficiency standard published by the Department. The UPS standard is expected to save consumers as much as $4.4 billion in energy costs for products purchased between 2019 and 2048.

    If adopted, the efficiency requirements would apply to the following product classes:

    Voltage and frequency dependent: UPSs that produce an AC output where the output voltage and frequency are dependent on the input voltage and frequency
    Voltage independent: UPSs that produce an AC output that’s independent of under- or over-voltage variations in the input voltage
    Voltage and frequency independent: UPSs that produce an AC output voltage and frequency that’s independent of input voltage and frequency variations, protecting the load from variations without depleting the internal battery

    The standard will affect UPS products manufactured in, or imported into, the United States, starting two years after publication in the Federal Register.
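    The three product classes above correspond to the standard IEC 62040-3 output-dependency designations. A minimal sketch of that mapping (the `describe` helper is my own, purely for illustration):

```python
# IEC 62040-3 output-dependency designations, matching the DOE product classes.
UPS_CLASSES = {
    "VFD": "Voltage and Frequency Dependent - output tracks the input "
           "(typical of standby/offline UPSs)",
    "VI":  "Voltage Independent - output voltage is regulated but frequency "
           "tracks the input (typical of line-interactive UPSs)",
    "VFI": "Voltage and Frequency Independent - output is fully decoupled "
           "from the input (typical of double-conversion/online UPSs)",
}

def describe(designation: str) -> str:
    """Hypothetical lookup helper: return the description for a class code."""
    return UPS_CLASSES[designation.upper()]

print(describe("vfi"))
```

    The VFI (double-conversion) class offers the strongest load protection but historically the lowest efficiency, which is why it is a natural focus for a minimum-efficiency rule.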

