Monday, November 28, 2011

Predicts 2012: Data Center Growth and the Impact of Cloud Computing on Energy Efficiency


Overview

Data center deployments are moving at different speeds in emerging and mature markets, with a focus on operational efficiency to drive down energy and real estate costs. Off-premises cloud computing not only provides more-flexible infrastructure and fewer physical network layers, but also enables organizations to conserve energy and thereby improve their environmental performance. Despite the benefits of cloud computing, the lack of IT maturity in emerging markets will slow its adoption by enterprises.

Key Findings

  • Pressure for operational efficiency and a smaller data center footprint in urban areas is often triggered by high real-estate costs, as in Brazil.
  • The stability and availability of energy and the lack of broadband connections remain big challenges for data center deployments in emerging countries.
  • As data centers increase their server and rack workloads, their power and cooling requirements are expected to strain the stable supply of energy from the utility grid.
  • Cloud providers are challenged to optimize their energy consumption planning by mapping peak utilization and capacity safety margins, and by applying intelligent patterns to level usage across their available infrastructure resources.
  • While China's government is pushing cloud computing through initiatives to enable economic growth, enterprises are slow to adopt it because of security concerns and immature business models.
  • Markets with low IT maturity and affinity are slow to adopt cloud computing and instead maintain their physical data center infrastructure.
  • Stakeholders, governments and shareholders share an interest in avoiding the brand damage of a "dirty" data center running on coal-fired power.

Recommendations

  • Data center sourcing should include energy sourcing and pricing agreements. Especially as data center capacity increases, data center executives should plan for capacity and performance constraints to avoid potential outages.
  • Managers of cloud or high-performance computing (HPC) data centers should consider low-energy servers when mapping power and cooling directly to the data center footprint.
  • Reporting green data center performance, especially in a cloud delivery environment, must include an assessment framework based on technology, workloads and applications, as well as energy sources, as key parameters.
  • A successful cloud computing strategy in data centers not only must include technical requirements but also must be tied into the overall business objectives of an organization.
  • As governments, such as those in Brazil and China, push for IT penetration, use the momentum of government initiatives to build out data center infrastructure.
  • Develop security solutions or hosted security services in order to overcome the security concerns of enterprise customers assessing the cloud computing opportunity.

Table of Contents

  • Analysis
    • What You Need to Know
    • Strategic Planning Assumptions
    • A Look Back

Analysis

What You Need to Know

Data center energy management, together with the cost and availability of energy, remains a central concern for stakeholders in data center management. At the facilities and IT levels, deploying low-energy servers and cloud computing can not only deliver energy cost savings but also significantly reduce floor space, lowering real estate costs, especially for data centers in urban environments. In emerging countries in particular, operational efficiency has not only cost components but also depends on the availability and stability of the electricity grid in fast-growing economies. Plans to expand data center capacity must include early adoption of infrastructure as a service and other cloud computing models in order to lift the ceiling that a grid's total available power places on the data center. Otherwise, electricity prices rising with demand could painfully inhibit international offshoring to markets such as India, China and Brazil. As a result, vendors need to position their ability to offer energy management holistically, enabling their customers to visualize and efficiently reduce energy usage.

Strategic Planning Assumptions

Strategic Planning Assumption: By year-end 2012, sales of extreme-low-energy servers, offered by several server providers, will grow to 1.5% of server market revenue.
Analysis By: Errol Rasit
Key Findings:
Extreme-low-energy servers are defined as servers built on processor types that were not originally designed for server systems and are typically found in small devices, such as tablets and smartphones, or in embedded applications. Current examples of such processors are Intel's Atom, the ARM architecture and Tilera's TILE-Gx. The aim of using extreme-low-energy servers is to optimize data center space and reduce power and cooling costs. Optimization is achieved by rightsizing the processor to the requirement of the application thread or unit of work (a back-of-the-envelope sketch follows the workload examples below).
Extreme-low-energy servers are typically optimized for a limited number of workload types, unlike mainstream x86 servers that are appropriate for a broad range of workload types. Examples of suitable applications or application functions are as follows: Hadoop MapReduce, database management system searches, shared-memory servers like memcached, static Web servers performing many fetch functions, video servers running unmodified fetch functions, big data/simple logic searches, and HPC workloads in which input/output or memory is the point of constraint (e.g., specialized implementations in which thousands of nodes run one application, such as IBM's Blue Gene).
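To make the rightsizing argument concrete, here is a minimal sketch. All figures (node wattage, throughput) are invented assumptions for illustration, not vendor benchmarks; the point is only that, for a highly parallelizable unit of work, energy per request matters more than peak per-node performance.

# Hypothetical comparison of energy per unit of work for a mainstream x86
# server vs. an extreme-low-energy node on a throughput-bound workload
# (e.g., static Web fetches). All numbers are illustrative assumptions.

def joules_per_request(watts: float, requests_per_sec: float) -> float:
    """Energy per request: power (J/s) divided by throughput (req/s)."""
    return watts / requests_per_sec

mainstream_x86 = joules_per_request(watts=350.0, requests_per_sec=20_000)
low_energy_node = joules_per_request(watts=30.0, requests_per_sec=2_500)

print(f"mainstream x86:  {mainstream_x86:.4f} J/request")
print(f"low-energy node: {low_energy_node:.4f} J/request")
# With these assumed numbers, the low-energy node uses roughly 31% less
# energy per request, provided the workload spreads across many more nodes.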
Market Implications:
Despite the number of workloads that are suitable for low-energy servers, today the technology segment is embryonic and has relatively little provider support but does show promise as a differentiated segment.
Extreme-low-energy servers are predominantly targeted at enterprise customers and others that typically buy servers in large volumes. Given this positioning, we expect growth in low-energy server sales to come largely at the expense of mainstream x86 servers rather than of non-x86 or PC-class servers.
Customers focused on the best power and cooling efficiency and organizations looking to reduce the data center footprint will stand to benefit from adoption of these servers. Energy and space-saving improvements are largest when transitioning from mainstream alternatives. We expect the relative difference between mainstream and extreme-low-energy servers will remain largely stable, with small incremental improvements as extreme-low-energy server technology evolves further.
Customer segments whose workloads are biased toward applicable low-energy server workloads (for example, cloud data center providers or HPC customers) are a natural target for low-energy servers. Because the workloads these servers can address are typically broad, such as Web serving or database search, potential adoption isn't limited to any particular customer vertical; rather, it is limited by the IT organization's willingness to invest in alternative solutions.
To gain a broad footprint, an ecosystem of energy efficiency must develop across the data center stack, including operating systems and software applications engineered to leverage low-energy-server approaches effectively.
Recommendations:
  • Benchmark to verify fit, and engineer the production environment in enough detail to understand the resulting expansion of server images, network and storage connections, and their effects on operational processes.
  • Move work from traditional to extreme-low-energy servers when the net benefit, factoring in all the consequences and project costs, is sufficient to justify some added risk and the switching costs involved.
  • Verify the key attributes of extreme-low-energy servers — relatively light CPU demands and excellent scaling — and benchmark them on real machines before committing to any purchases.
  • Apply all traditional and mature approaches to increasing energy efficiency to solve short-term tactical constraints (such as imminent exhaustion of spare energy or space in an existing data center) before undertaking a move to extreme-low-energy solutions as a quick fix.
Related Research:
"Hype Cycle for Server Technologies, 2011"
"SWOT: SeaMicro, Servers, Worldwide"
"Market Insight: The Top Five x86 Server Workloads for the Optimal Data Center Strategy"
"Introducing Extreme Low-Energy Servers"
Strategic Planning Assumption: By 2013, 15% of enterprises investing in off-premises cloud computing will rate green measures among their top three priorities.
Analysis By: Errol Rasit
Key Findings:
In 2010, a global survey of organizations with more than 1,000 employees revealed that 33% of organizations planning cloud investment cited "green" as a driver, and 27% cited social responsibility as a driver. These drivers, however, ranked sixth and seventh on the list of priorities. The top three drivers for cloud investment were "improve business agility," "provide capital expenditure (capex) savings" and "part of our data center transformation strategy."
We believe that a number of factors will drive customers to increase the importance of green as a driver to invest in cloud computing:
  • Gartner inquiries reflect that most organizations overprovision their infrastructure resources, such as servers or storage, by provisioning for peak utilization; in some cases, safety margins are added on top. Collectively, a public cloud provider has the potential to be extremely resource-efficient, because a high level of standardization may allow better sharing and optimization of resources across a larger infrastructure base. While this results in operating expenditure savings, it also offers more sustainable usage of infrastructure (a simple simulation after this list illustrates the effect).
  • Work by organizations such as Greenpeace has increased industry and customer interest in cloud providers' power sources. Greenpeace published a report titled "How Dirty Is Your Data?" that judged several providers' data centers on the amount of coal used to power them, the transparency of this information to the public, infrastructure location and mitigation strategy. In addition, financial benchmarks, such as the Dow Jones Sustainability Index, are rating green IT efficiency and the utilization of the cloud for more sustainable performance, prompting CFOs to ask questions about sustainable business operations.
  • Carbon tax schemes, aimed at penalizing the use of fossil fuels as a power source, are being implemented by many governments around the world, largely in response to the treaty set out by the United Nations Framework Convention on Climate Change (UNFCCC or FCCC). The treaty is commonly referred to by its most famous legally binding agreement to reduce greenhouse gases, the Kyoto Protocol. Due to legally binding initiatives like the Kyoto Protocol, enterprises should expect that governments will increasingly seek to penalize consumers of fossil-fuel-based power.
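As a rough illustration of the pooling effect described in the first bullet, the following sketch simulates hypothetical hourly loads for many organizations and compares the capacity each would buy individually (its own peak plus a safety margin) with what a pooled provider would need (the peak of the combined load plus the same margin). All numbers are invented for illustration.

import random

# Minimal simulation (assumed numbers, not Gartner data): pooling many
# workloads lets a provider provision for the peak of the combined load,
# which is far below the sum of the individual peaks.
random.seed(42)
HOURS = 24 * 7           # one week of hourly samples
ORGS = 50                # number of organizations sharing the pool
SAFETY_MARGIN = 0.25     # 25% headroom on top of the observed peak

# Each organization's hourly load, in arbitrary capacity units.
loads = [[random.uniform(10, 100) for _ in range(HOURS)] for _ in range(ORGS)]

# On-premises: every organization buys capacity for its own peak plus margin.
on_prem = sum(max(series) * (1 + SAFETY_MARGIN) for series in loads)

# Pooled provider: capacity for the peak of the combined load plus margin.
combined = [sum(series[h] for series in loads) for h in range(HOURS)]
pooled = max(combined) * (1 + SAFETY_MARGIN)

print(f"sum of individual peaks: {on_prem:,.0f}")
print(f"pooled provider peak:    {pooled:,.0f}")
print(f"capacity saved by pooling: {1 - pooled / on_prem:.0%}")

Because individual peaks rarely coincide, the pooled peak is much lower than the sum of the peaks; the same reasoning underlies the operating expenditure and sustainability benefits cited above.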
Market Implications:
Greening of any service is an end-to-end process, spanning technology selection, implementation, waste management and the power production source. A number of alliances are focused on providing standards or references for green computing, all of which are still evolving; examples are the EU (with its Code of Conduct), Energy Star and the Green Grid.
Many of the current reference frameworks focus on measuring the technology that delivers the service. Although some providers offer measurements of green cloud services, green visibility varies widely from provider to provider; some share no information at all, because they consider the design and management of their data centers a competitive differentiator and therefore a closely guarded secret. Responsibility for measuring a provider's green credentials will likely stay with the customer in the near term, until providers improve the visibility of those credentials.
As it stands, green frameworks and standards are not all-encompassing, so there is no single standard to adhere to. There is significant scope for providers to improve their participation in standards adoption and investment. Gartner believes that the likelihood of an all-encompassing or recognized singular green cloud standard in the near future is low.
Recommendations:
  • Monitor the development of green standards, such as Energy Star or the Code of Conduct of the EU; in particular, assess the implication of laws or taxes that may come into effect in locations where your IT resides.
  • Before signing a commercial agreement, test your cloud computing provider's ability to share data that commonly used green metrics, frameworks and ratings require.
  • Apply the same stringency and detail of measurement to your off-premises IT infrastructure as to your internal IT infrastructure, in order to measure the end-to-end greenness of IT services independent of responsibility for, or ownership of, the infrastructure.
Related Research:
"Greening the Cloud: Location Is Critical for the Sustainable Future of Outsourced Data Storage and Services"
"Data Center Decisions: Build, Retrofit or Colocate; Why Not a Hybrid Approach?"
Strategic Planning Assumption: By 2016, data centers in India will reach the limits of the power that utilities can supply, which will gravely impact business operations.
Analysis By: Naresh Singh
Key Findings:
Data center capacities in India are expanding at a 20%-to-30% annual rate and are expected to reach a raised-floor supply of 5.5 million square feet by 2016. While service provider space will grow at a higher rate, captive data centers owned and managed by users will also see healthy growth in the forecast period. For more details, see "Emerging Market Analysis: Future Outlook of Indian Data Center Market."
With the growing adoption of high-density multicore servers and more-powerful network and storage devices, data center energy use has become a big challenge for IT organizations among Indian users. Users are realizing the need to design and upgrade their data center power and cooling facilities to meet current as well as future requirements. As they increase server workloads and rack equipment loads, users are planning for an ever-higher energy footprint for their data centers. Planning for modular data centers, they typically segment the floor into zones of high-density, normal and low-density racks to meet the composite current and future requirements optimally. These designs also factor in cooling requirements, which differ across zones with varying IT loads.
Assuming that users adopt a mix of data center zones with typical rack loads of 4 kilowatts (kW), 10 kW and 20 kW, the total energy required, including the load needed for cooling, across all the data centers in India will reach 4,397 megawatts (MW) by the end of 2016. This equates to over 2% of the country's projected demand of 218,209 MW in the same period, according to the 17th Electric Power Survey of India, published by the Central Electricity Authority in 2007. That is an extraordinarily tall order for an emerging country already struggling to meet its current public- and private-sector obligations. Power blackouts and unavailability remain a persistent problem even in Mumbai, India's commercial capital and the country's largest concentration of data centers.
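The arithmetic behind such an estimate can be reconstructed roughly as follows. The zone mix, rack count and cooling overhead below are assumptions chosen only to land near the figures quoted above; they are not Gartner's published model.

# Back-of-the-envelope reconstruction of the India data center power
# estimate. The mix shares, rack count and cooling overhead are assumed.
RACK_KW = {"low": 4, "normal": 10, "high": 20}      # typical rack loads (kW)
MIX = {"low": 0.50, "normal": 0.35, "high": 0.15}   # assumed share of racks
TOTAL_RACKS = 290_000                               # assumed installed base, 2016
COOLING_OVERHEAD = 0.8                              # assumed W of cooling per W of IT

avg_rack_kw = sum(RACK_KW[z] * MIX[z] for z in MIX)  # 8.5 kW per average rack
it_load_mw = avg_rack_kw * TOTAL_RACKS / 1000
total_mw = it_load_mw * (1 + COOLING_OVERHEAD)
national_demand_mw = 218_209   # 17th Electric Power Survey projection for 2016

print(f"IT load: {it_load_mw:,.0f} MW; with cooling: {total_mw:,.0f} MW")
print(f"share of projected national demand: {total_mw / national_demand_mw:.1%}")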
Market Implications:
  • Both users and service providers run a significant risk that their data center energy sourcing strategies become unsustainable. They need to address this risk by securing their future requirements through long-term commitments from local utilities. They also need to work closely with the utilities, sharing their forecast plans so that future requirements can be met adequately and on time.
  • Data centers in India will continue to see the need for relatively higher power generation backups than their global counterparts, because they are less likely to rely on their utility providers for an uninterrupted supply of electricity. The consequent greater use of captive generators will also mean higher capex and cost of operations.
  • Energy-efficient IT equipment, technologies and data center designs will see growing demand in India as the challenges escalate. Energy monitoring and management tools will also see higher adoption in the coming years.
  • India is unlikely to shed the negative perception of being an infrastructure-challenged location for hosting data centers that serve regional and global requirements. This not only could seriously impact India's ambition of emerging as a preferred regional location for data center hosting, but could also impede the country's overall IT and business outsourcing opportunities.
Recommendations:
  • Data center technology providers: Encourage and educate users to adopt energy-efficient solutions and designs, even if they mean higher capex at the outset, especially when they give the customer a sustainable data center strategy and help achieve a lower total cost of ownership in the long run.
  • Data center hosting service providers: Monitor the energy requirement mix of your existing and potential customers, and conduct scenario planning for low, medium and high energy demand. Base your data center planning on the most likely scenario, while keeping a Plan B in case either of the other scenarios becomes more realistic.
  • Users and hosting service providers: Make facility planning a board-level priority, with the stakes adequately communicated to key business leaders. Initiate or support an energy-efficient culture among internal and external users of your data center.
  • Users and hosting service providers: While planning your data centers, seriously evaluate locations that are not necessarily business hubs but have adequate current and future supplies of power, like major electricity grid sources, in addition to other necessary factors, like telecommunications facilities, water supply and disaster implications.
  • Users: Create adequate power source backups and redundancies, like multiple grid providers. Maintain enough captive power generation capacities (along with redundancy designs) and adequate fuel to keep the data center running in the event of a long period of power blackouts, which are not uncommon in India.
Related Research:
"Emerging Market Analysis: Future Outlook of Indian Data Center Market"
"How to Build a World-Class Data Center in India"
Strategic Planning Assumption: By 2012, Brazil will surpass Canada and become the No. 7 country in the server market in terms of revenue.
Analysis By: Kiyomi Yamada
Key Findings:
Emerging markets have been increasing their presence in the server market as many organizations in these regions build new IT infrastructure. By 2012, Brazil will become the new No. 7 country by surpassing Canada in terms of server revenue. The outlook for data center spending in Brazil is robust because its economy is expected to continue its growth spurt for the next five years. Brazil's data center business has also been supported by the government's commitment to pushing IT modernization throughout the country, as well as by preparations for worldwide events such as the World Cup (2014) and the Olympics (2016).
Market Implications:
The Brazilian server market has been steadily growing, and more providers are focusing on it. We believe that the market is still far from saturated and has the potential to grow further. In comparison with other technologies, such as PCs, the No. 7 ranking is not impressive for Brazil; it is often the case that consumer technologies (e.g., PCs or mobile phones) take off first and enterprise technology adoption follows. Brazil was the No. 4 country in the PC market in 2011 (in end-user spending).
The Brazilian server market outlook is bright, but the country needs to work more on the following points:
  • IT infrastructure modernization projects have been done mainly in the metropolitan areas, although they are spreading to smaller cities.
  • Few small and midsize businesses (SMBs) embrace data center functions.
  • A shortage of trained IT personnel has been a big issue.
  • Cloud services are still in their infancy because of unstable broadband connections and limited applications.
  • Although many organizations have strong interest in cloud services, broadband coverage is still patchy and expensive due to lack of infrastructure. Better broadband coverage and service will create additional data center demand, as the country could then provide offshore services for other countries in addition to domestic service.
Recommendations:
  • Understand the characteristics of the Brazilian data center market. Interest in energy efficiency is relatively low compared with other countries, because Brazil has abundant oil supplies and alternative energy resources (it is a top global producer of ethanol and hydroelectric power). Instead, demand for small-footprint data centers is high, as the country's real estate prices are rising.
  • Keep monitoring cloud service adoption in Brazil. Although interest in cloud services is very high, currently many organizations prefer to have their own data centers. In addition to immature infrastructure environments, conservative attitudes toward new technologies hinder further cloud adoption. This could, however, change dramatically once these services start being accepted.
  • Work closely with government. The Brazilian government is very aggressive in promoting IT development.
  • Try to expand market reach to smaller organizations and smaller cities via channels.
Related Research:
"Market Trends: Brazil's Emerging Middle-Class Consumer Subsegment Shines With IT Opportunities"
"Emerging Market Analysis: Brazil, a Growing IT Frontier"
Strategic Planning Assumption: By 2015, China's cloud computing will make up more than 25% of the Chinese data center market.
Analysis By: Jennifer Wu
Key Findings:
In 2010, the number of servers used in China's public cloud computing was estimated to be 10% of all servers sold. Gartner estimates that by 2015 this number will reach 20%. Given that servers are the main components of a data center, this projected growth indicates a healthy future of cloud in China (see "Market Trends: Opportunities for Server Providers in China's Public Cloud").
In mid-2010, Gartner conducted a survey of large enterprises in China to analyze the growth of private clouds in the country. Fifty-eight percent of Chinese respondents indicated they had already invested in cloud computing or planned to do so in 2011 (see "User Survey Analysis: China's Data Centers Accelerating Adoption of Storage Technologies and Cloud Computing").
In addition, in 2010, China's government vowed to support the five "cloud city" projects, serving as a signal of the government's incentives for enterprises to push the cloud throughout the country. Data centers and cloud infrastructure are seen as the foundation for future industrial growth and services (see "China Plans to Advance Its Economy by Exploiting Cloud Computing").
Market Implications:
Cloud computing has drawn the attention of the Chinese government and is seen as one of the ways in which the country can leapfrog over technologically more advanced economies. Given the government's long involvement and role as a catalyst to industrial innovation, its adoption of cloud computing as part of its Five-Year Plan indicates the importance that this will have in the market. The government is also directly investing in the development of a cloud-based environment both at the national and provincial levels. Enterprises generally follow the government's lead, hoping to emulate the success of mobile technologies that helped China skip the further spread of landlines to move directly into mobile telecommunications some 20 years ago.
Notably, in 2011 China surpassed Japan to become the second-largest data center market in the world. That year, data centers in China accounted for about 8.3% of the world market. Gartner predicts that growth will continue into 2015, rising to more than 11% of the global data center market. Cloud computing, both private and public, will be a significant factor in this growth. At the moment, many companies are being held back by concerns about security and stability. Gartner, however, believes that by 2013 these will become less of a problem, thanks to improving security technology and government endorsement of the public cloud, and that more than 30% of large companies will deploy private cloud computing solutions.
Successful cloud computing calls for more than just the technical infrastructure, as it also needs the support of management to be tied closely to overall business objectives. For now, it seems that Chinese enterprises still are weak in integrating the cloud into clear overall business plans and extracting the best usage for competitive ends.
Recommendations:
  • Cloud technology providers have to build cloud teams capable of providing services beyond the physical components of clouds, integrating the cloud into the overall objectives of the enterprise and providing clear paths to successful business solutions.
  • Cloud technology providers should use pilot and service trials for potential clients to facilitate commitment and broad implementation.
  • Technology providers should be prepared to adjust product development priorities and market strategy to address China's unique characteristics. In particular, providers should explore market opportunities in security as a service and platform as a service in China. Cloud computing service suppliers should leverage the government's support to invest in China's cloud computing market.

Thursday, October 20, 2011

A New Approach To Measuring Performance in Solar

Written by Marc van Gerven
Marc van Gerven is managing director of Q-Cells North America.


Marc van Gerven:
We all know an energy revolution is coming, whether we embrace it or not. One day soon (give or take a few years), we’ll live in a world that generates energy in radically different ways than it does today. For me, that’s the exciting part of working in the solar industry. And I suspect, as most media coverage I read suggests, this is also the exciting part for most casual observers of the clean technology sector.
Yet, while we all enjoy reading articles about new gizmos, electric cars, space-age wind turbines, and of course, the latest advances in photovoltaics, these advancements represent only the surface of a much larger, systemic change in the way we will soon conduct business. New technologies, especially disruptive ones, bring with them new business models, new forms of measurement, new modes of communication and ultimately entirely new practices.
To drive the adoption of new things, we must be cognizant of the fact that change is difficult and often scary, even for the “experts.” Bridging the gap between what is well-understood today and what will eventually be commonplace in the future requires a great deal of confidence building. Ultimately, it comes down to predictability.
For the solar energy industry to truly mature, it needs a performance model that can offer investors confidence and predictability in their return on investment. In order to achieve this confidence, a new model must do more than merely offer performance expectations; it must assure system performance in the same way a traditional coal plant does today.
Until relatively recently, the solar energy industry had been too young, and the technology too new, to accurately predict return on investment. This led to the acceptance of simple models to evaluate the cost of solar projects. While these models are straightforward and present a more complete picture of a solar plant’s operational capability than even earlier metrics, they still only measure the sum of the parts rather than the whole.
Sadly, these and other simplified metrics are not focused on lifetime costs versus lifetime returns, and so they paint an inaccurate and short-term portrait of plant production. The end result is misaligned expectations, which could doom big, utility-scale projects to financial failure.
Fortunately, a new model for evaluating the complete system is emerging: the power plant performance ratio, which measures a system’s efficiency in converting solar radiation into electricity. Because the performance ratio looks at the whole, rather than the sum of the parts, this indicator can take into consideration individual variables that are ignored by current metrics. By measuring the performance ratio, power plant owners and operators can immediately and fully predict the performance of the plant over its lifetime, accurately assessing the value of the plant. For the first time, the business of solar energy is catching up to the promise of solar technology.
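For readers who want the formula: the performance ratio is conventionally defined (for example, in IEC 61724) as the plant’s final yield, the energy produced per kilowatt of nameplate capacity, divided by its reference yield, the solar irradiation received relative to standard test conditions. A minimal sketch, with made-up plant numbers:

# Minimal sketch of the power plant performance ratio (PR). The plant
# figures below are invented for illustration, not real project data.

G_STC = 1.0  # kW/m^2, irradiance at standard test conditions

def performance_ratio(energy_out_kwh: float,
                      rated_power_kw: float,
                      irradiation_kwh_per_m2: float) -> float:
    final_yield = energy_out_kwh / rated_power_kw      # kWh per kW installed
    reference_yield = irradiation_kwh_per_m2 / G_STC   # equivalent "sun hours"
    return final_yield / reference_yield

# Example: a 10 MW plant produced 16.2 GWh in a year with 2,000 kWh/m^2
# of plane-of-array irradiation.
pr = performance_ratio(energy_out_kwh=16_200_000,
                       rated_power_kw=10_000,
                       irradiation_kwh_per_m2=2_000)
print(f"performance ratio: {pr:.2f}")  # -> 0.81, i.e., 81%

A ratio around 0.8 is typical of a well-run plant; deviations from the expected ratio flag losses from soiling, downtime, wiring or inverter inefficiency that simple cost metrics never surface.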

But there is more: Beyond the academics of measuring solar production in more meaningful ways, companies must ultimately execute on their promise of predictability.
This is where one’s business model enters the picture. If you measure systems holistically, then the solutions you offer should be holistic as well. Only players with integrated systems experience can wrap or bundle their offerings based on real-world experience of product performance. With a project-level perspective, an experienced developer possesses thorough knowledge of field-tested plant performance down to the level of the solar module and even the solar cell itself. This breadth and depth of knowledge delivers the data needed to offer true performance guarantees, all tying back to predictability and ROI.
In sum, thinking about the development and execution of solar plant production differently – with an emphasis on predictability and system-wide guarantees – is what will move the needle toward mainstream investment in solar energy.



Monday, September 12, 2011

Zero-carbon house

http://gu.com/p/xb58q 

China to cap energy use in national low-carbon plan

A cap on energy consumption is expected to be at the heart of a Chinese low-carbon plan to be issued this year, experts believe, amid reports that officials have now agreed its level.
China is the world's biggest emitter of greenhouse gases, making up a quarter of the global total. Experts say setting an energy limit would add certainty to the country's attempts to rein in emissions and should make it easier for emissions trading schemes to get off the ground.
The cap has been anticipated for some time but is now thought likely to emerge in the low-carbon plan understood to have been broadly approved by a panel set up by the state council, China's cabinet, and chaired by the premier, Wen Jiabao. It should be formally passed later this year.
Reuters reported that officials have settled on a total energy cap of 4.1bn tonnes of coal equivalent (TCE) by 2015 – a level more than 25% higher than last year.
Analysts warn that the plan has yet to be nailed down and that a cap could still be delayed by disagreements, to re-emerge in a later policy document.
The government in March unveiled its five-year plan for 2011-2015. Setting out the economic course for the nation, it aims for a more sustainable pace of growth and includes a new carbon intensity target, cutting emissions relative to GDP by 17%, as well as a goal of improving energy intensity by 16%. Officials are now fleshing it out.
A cap "is significant, because it makes it much clearer to provinces what they have to do regardless of what GDP growth rate is", said Deborah Seligsohn, a climate policy expert working for the World Resources Institute in Beijing.
"The cap they have been talking about is essentially based on the growth rate they expect overall; it doesn't mean cutting further, but it does add certainty."
She said it would also make pilot emissions trading systems in six provinces and cities more effective. Although people have been looking at ways to base systems on carbon intensity, a cap would make matters much more straightforward.
A level of 4.1bn TCE would be higher than many had expected. Zhang Guobao, formerly China's top energy official, told the state news agency Xinhua in April that there would be a cap of 4bn TCE.
"There were some very aggressive suggestions from scholars. Some have suggested [4.1bn TCE] would be rather conservative, but from where I stand I think it is very positive," said Changhua Wu, the Climate Group's China representative.
"When you set a cap you obviously are going to set an attitude towards shifting the structure ... Part of the big lesson we learned in the last five years is that you could grow wind and solar and nuclear energy aggressively but keep consuming more."
Another key issue will be whether the plan spells out targets for individual provinces on energy and carbon intensity.
China's commitment to meeting environmental targets was underlined when provinces abruptly shut down plants last year to try to meet the goals of the last five-year plan.
With the central government keeping a closer eye on the progress of provinces, the National Development and Reform Commission – the country's top economic planning body – recently published the names of those struggling to meet targets.
"The Chinese government has made it pretty clear they expect these targets to be met. This time the provinces know that now, so they will be working from the beginning," said Seligsohn.

Sunday, September 11, 2011

Arctic sea ice is melting at its fastest pace in almost 40 years

The Northwest Passage was, again, free of ice this summer and the polar region could be unfrozen in just 30 years.
 Aerial view of the Petermann glacier, Greenland’s north-west coast – a 100 square-mile block of ice broke off it in August last year; by July this year it had melted.  
Photograph: Nick Cobbing/AFP/Getty Images
Source: http://www.guardian.co.uk/

Arctic sea ice has melted to a level not recorded since satellite observations started in 1972 – and almost certainly not experienced for at least 8,000 years, say polar scientists.
Daily satellite sea-ice maps released by Bremen university physicists show that with a week's more melt expected this year, the floating ice in the Arctic covered an area of 4.24 million square kilometres on 8 September. The previous one-day minimum was 4.27m sq km on 17 September 2007.

The US National Snow and Ice Data Centre (NSIDC) in Boulder, Colorado, which also tracks the extent of sea ice, has not posted data for a week but is expected to announce similar results in the next few days.
The German researchers said the record melt was undoubtedly because of human-induced global warming. "The sea-ice retreat can no more be explained with the natural variability from one year to the next, caused by weather influence," said Georg Heygster, head of the Institute of Environmental Physics at Bremen.
"It seems to be clear that this is a further consequence of the man-made global warming with global consequences. Climate models show that the reduction is related to the man-made global warming, which, due to the albedo effect, is particularly pronounced in the Arctic," he said. The albedo effect is related to a surface's reflecting power – whiter sea ice reflects more of the sun's heat back into space than darker seawater, which absorbs the sun's heat and gets warmer.


Floating Arctic sea ice naturally melts and re-freezes annually, but the speed of change in a generation has shocked scientists – it is now twice as great as it was in 1972, according to the NSIDC, with a decline of about 10% per decade.
Arctic temperatures have risen more than twice as fast as the global average over the past half century.
Separate, less reliable, research suggests that Arctic ice is in a downward spiral, declining in area but also thinning. Using records of air, wind and sea temperature, scientists from the Polar Science Centre of the University of Washington, Seattle, announced last week that the Arctic sea-ice volume reached its lowest ever level in 2010 and was on course to set more records this year.
The new data suggests that the volume of sea ice last month appeared to be about 2,135 cubic miles – just half the average volume and 62% lower than the maximum volume of ice that covered the Arctic in 1979. The research will be published in a forthcoming issue of the Journal of Geophysical Research.
"Ice volume is now plunging faster than it did at the same time last year when the record was set," said Axel Schweiger.
If current trends continue, a largely ice-free Arctic in the summer months is likely within 30 years – that is, up to 40 years earlier than was anticipated in the last Intergovernmental Panel on Climate Change (IPCC) assessment report.
The last time the Arctic was uncontestably free of summertime ice was 125,000 years ago, at the height of the last major interglacial period, known as the Eemian.
"This stunning loss of Arctic sea ice is yet another wake-up call that climate change is here now and is having devastating effects around the world," Shaye Wolf, climate science director at the Centre for Biological Diversity in San Francisco told journalists.
Arctic ice plays a critical role in regulating Earth's climate by reflecting sunlight and keeping the polar region cool. Retreating summer sea ice is widely described by scientists as both a measure and a driver of global warming, with negative impacts on a local and planetary scale.
This year, both the North-west and North-east passages were mostly ice free, as they have been twice since 2008.
Last month, the 74,000-tonne STI Heritage tanker passed through the North-east Passage with the assistance of ice breakers in just eight days on its way from Houston, Texas, to Thailand.
The north-east sea route, which links the Atlantic to the Pacific, is likely to become a favourite of commercial ship operators, saving thousands of miles and avoiding Suez Canal tolls.
Further evidence of dramatic change in the Arctic came last week from Alan Hubbard, a Welsh glaciologist at Aberystwyth University, who has been studying the Petermann glacier in northern Greenland for several years.
The glacier, which covers about 6% of the icecap, is 186 miles (300km) long and up to 3,280ft (1km) high. In August last year, a 100 square-mile (260 sq km) block of ice calved from the glacier. Photographs show that by July this year it had melted and disappeared.
"I was gobsmacked. It [was] like looking into the Grand Canyon full of ice and coming back two years later to find it full of water," said Hubbard.
Last year (2010) tied with 2005 as the warmest year on record.

Tuesday, September 6, 2011

Carbon offsets near record low !!!

Carbon offsets neared all-time lows Friday, confirming their status as the world's worst performing commodity, as slumping demand meets rising supply of the U.N. instrument traded under the Kyoto Protocol.

 

A worsening global economic outlook has dented prices for emissions permits, which depend on a robust economy belching greenhouse gases into the air, and has also impacted oil, grains, coal and natural gas.

Carbon offsets have fared uniquely badly because a U.N. climate panel continues to print new offsets, regardless of a widening glut in emissions permits in the main demand market, the European Union's carbon market.
Countries and companies in the developed world can buy offsets as a way to meet emissions caps agreed under Kyoto, paying for cuts in developing country projects instead, but the financial crisis has left a global oversupply.
"If the European economy goes through a double dip (recession) it could be a lethal threat for the carbon market," said Marius-Cristian Frunza, analyst at Schwarzthal Kapital.

The U.N. scheme for generating certified emissions reductions (CERs), called the clean development mechanism (CDM), faces additional problems besides the economy.
Failure by countries to agree a new round of carbon caps after 2012, amid drifting U.N. climate talks, has further curbed prospective demand.
The financial crisis has blown off course talks to agree a global climate deal, which now seems years off. The CER market had a traded value of $18.3 billion last year, down from $26.3 billion in its peak year 2008.

Adding to CER woes, the EU has banned from 2013 imports of the most common type of offset, from refrigerant plants in China, prompting investors to dump these.
Benchmark CERs fell as low as 7.4 euros Friday, down more than 7 percent on the day, fractionally above an all-time low of 7.15 euros.
Prices are now at around cost price in developing countries, squeezing margins for project developers such as London-listed Camco, whose shares were down more than 10 percent at midday, and by nearly 40 percent over the past month.

Rival developer Trading Emissions PLC last week pulled a proposed sale of its assets because of falling carbon prices. Its average CER costs are 7.5 euros per tonne.
European carbon prices also continued to fall on Friday, dropping as much as 5 percent to as low as 10.65 euros.

Wednesday, August 31, 2011

Himalaya glaciers shrinking on global warming, some may disappear

Three Himalaya glaciers have been shrinking over the last 40 years due to global warming, and two of them, located in humid regions at lower altitudes in central and east Nepal, may eventually disappear, researchers in Japan said.
Using global positioning system and simulation models, they found that the shrinkage of two of the glaciers -- Yala in central and AX010 in eastern Nepal -- had accelerated in the past 10 years compared with the 1970s and 1980s.
Yala's mass shrank by 0.80 meters (2.6 feet) and AX010's by 0.81 meters per year in the 2000s, up from 0.68 and 0.72 meters per year, respectively, between 1970 and 1990, said Koji Fujita of the Graduate School of Environmental Studies at Nagoya University in Japan.
"For Yala and AX, these regions showed significant warming ... that's why the rate of shrinking was accelerated," Fujita told Reuters by telephone.
"Yala and AX will disappear but we are not sure when. To know when, we have to calculate using another simulation (model) and take into account the glacial flow," Fujita said, but added that his team did not have the data to do so at the moment.
Their findings were published in the journal Proceedings of the National Academy of Sciences on Tuesday.
The Himalayas are an enormous mountain range comprising about 15,000 glaciers and some of the world's highest peaks, including the 8,848-meter Mount Everest and K2.
Apart from climate change and humidity, elevation also appears to play a critical role in the lifespan of glaciers, which are large persistent bodies of ice.
The Rikha Samba glacier in the drier region of west Nepal has also been getting smaller since the 1970s, but its rate of shrinking slowed to 0.48 meters per year in the past 10 years compared to 0.57 meters per year in the 1970s and 1980s.
This was because the 5,700-meter-high glacier sits at a higher altitude, meaning that mass lost to melting could be at least partly compensated by accumulated snowfall, Fujita said.
"In the case of Yala and AX, they are situated on lower elevation (altitudes), therefore shrinkage was accelerated. Glaciers that have no chance to get snow mass will eventually disappear," Fujita said.
The Yala glacier is located about 5,400 meters above sea level, while AX010 lies at 5,200 meters.

Thursday, March 31, 2011

India is taking the first step in curbing pollution !!! hurrah!!!

Govt plans Rs 1 lakh/day fine for missing energy saving target

On Wednesday 30 March 2011, 2:28 PM
New Delhi, Mar 30 (PTI) The Power Ministry plans to impose huge penalties of over Rs 1 lakh per day on industries that fail to achieve energy efficiency targets under the three-year Perform, Achieve and Trade (PAT) programme starting April 1.
PAT, which aims to increase industrial energy efficiency, is expected to bring down energy consumption by 5 per cent, amounting to an avoided capacity of over 5,600 MW over the three-year period.
There would be strict penalties, as well as incentives, for industries participating in PAT, starting April 1. The penalties would be more than Rs 1 lakh per day, apart from other charges based on tonnes of oil equivalent consumed, a senior Power Ministry official told PTI.
"Those entities that fail to achieve the targets will have to pay huge penalties. Other (entities) that perform better will be awarded Energy Savings Certificates (ESCerts), which can be traded," the official said.
Entities that are short of targets can also buy these certificates to make up for the shortfall.
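A minimal sketch of that settle-up logic (quantities are invented for illustration; the actual BEE rules define the units, baselines and penalty amounts precisely):

# Illustrative sketch of the PAT settle-up described above. Targets and
# consumption figures are assumed numbers, in tonnes of oil equivalent (toe).

def settle(target_toe: float, actual_toe: float):
    """Return the outcome: earn tradable ESCerts if under target,
    or cover the shortfall (buy ESCerts or pay the penalty) if over."""
    diff = target_toe - actual_toe
    if diff >= 0:
        return ("earn ESCerts", diff)                 # outperformance, tradable
    return ("buy ESCerts or pay penalty", -diff)      # shortfall to cover

# A plant targeted at 95,000 toe that achieved 92,000 toe earns 3,000 certs:
print(settle(target_toe=95_000, actual_toe=92_000))
# A plant that consumed 98,000 toe against the same target must cover 3,000:
print(settle(target_toe=95_000, actual_toe=98_000))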
Eight industries, which account for over 50 per cent of energy consumption, would be a part of PAT. These are: cement, thermal power plants, pulp & paper, textile, fertiliser, iron & steel, aluminium and chlor-alkali industries.
PAT is expected to result in energy savings corresponding to about 9.78 million metric tonnes of oil equivalent. This equates to more than 5,600 MW of avoided capacity (which would otherwise need to be added).
The programme will end on March 31, 2014.
The basic aim of PAT is to bring down energy consumption and the programme has been finalised after many rounds of meetings and consultation with the stakeholders.
An initiative of the National Mission for Enhanced Energy Efficiency (NMEEE), the programme will be implemented by the Bureau of Energy Efficiency (BEE).
As per BEE, ESCerts would be traded on special trading platforms to be created on the two power exchanges. Data on traded prices, traded volumes and trends would also be maintained on the bourses.
