DataCenterDynamics, Mumbai, India, May 25, 2012
Presented by:
eere.energy.gov
Agenda
Introduction
Performance metrics and benchmarking
IT equipment and software efficiency
Use IT to save IT (monitoring and dashboards)
Data center environmental conditions
Airflow management
Cooling systems
Electrical systems
Resources
Workshop summary
Challenging Conventional Wisdom: Game Changers

Conventional approach:
Data centers need to be cool and controlled to tight humidity ranges
Data centers need raised floors for cold air distribution
Data centers require highly redundant building infrastructure

Needed: a holistic approach and an IT and Facilities partnership
Introduction
Provide background on data center efficiency Raise awareness of efficiency opportunities Develop common understanding between IT and Facility staff Review of data center efficiency resources Group interaction for common issues and solutions
[Chart: world data center electricity use compared with the national consumption of Italy, South Africa, Mexico, Iran, Sweden, and Turkey, and with California and Western Europe]
From 2000 to 2006, computing performance increased 25x, but energy efficiency increased only 8x.
The amount of power consumed per $1,000 of servers purchased has increased 4x.
The cost of electricity and supporting infrastructure now surpasses the capital cost of the IT equipment itself. Perverse incentives persist: IT and facilities costs are budgeted separately.
We also research energy efficiency opportunities and work on various deployment programs.
[Chart: NERSC computer systems power in megawatts, excluding cooling power (OSF: 4 MW max)]
Energy Efficiency = Useful Computation / Total Source Energy

Typical data center energy end use: of 100 units of source energy entering the facility, only about 33 units are delivered to the IT equipment; the remainder is consumed by power conversions and distribution and by cooling equipment.
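The 100-in, 33-out arithmetic can be sketched as below; how the 67 lost units split between distribution and cooling is an illustrative assumption, not data from the slide:

```python
# Sketch of the "100 units source energy -> 33 units delivered" flow.
# The individual loss fractions here are illustrative assumptions.
source_units = 100.0
losses = {
    "power conversions & distribution": 12.0,  # assumed split
    "cooling equipment": 55.0,                 # assumed split
}
delivered = source_units - sum(losses.values())
efficiency = delivered / source_units
print(f"Delivered to IT: {delivered:.0f} of {source_units:.0f} units")
print(f"Overall delivery efficiency: {efficiency:.0%}")
```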
Power Conversion & Distribution
High voltage distribution
High efficiency UPS
Efficient redundancy strategies
Use of DC power
On-site Generation
Including fuel cells and renewable sources
CHP applications (waste heat for cooling)
20-40% savings are typical
Aggressive strategies can yield 50%+ savings
Extend the life and capacity of existing infrastructure
Performance Metrics and Benchmarking
Courtesy of Michael Patterson, Intel Corporation
Benchmarking for Energy Performance Improvement: Energy benchmarking can allow comparison to peers and help identify best practices LBNL conducted studies of over 30 data centers:
Wide variation in performance Identified best practices
Your Mileage Will Vary

Energy performance varies: the relative percentage of energy that actually does computing varies considerably.
[Example end-use breakdown: Data Center Server Load 51%, HVAC Chiller and Pumps 24%, Other 13%, HVAC Air Movement 7%, Electrical Room Cooling 4%, Cooling Tower Plant 4%, Lighting 2%]
High-Level Metric: Power Usage Effectiveness (PUE) = Total Power / IT Power
Average PUE = 1.83
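A minimal sketch of the PUE calculation; the 915 kW / 500 kW example is hypothetical, chosen to land on the survey average:

```python
def pue(total_facility_kw: float, it_kw: float) -> float:
    """Power Usage Effectiveness = total facility power / IT power."""
    if it_kw <= 0:
        raise ValueError("IT power must be positive")
    return total_facility_kw / it_kw

# Hypothetical facility: 915 kW total draw for 500 kW of IT load
print(round(pue(915.0, 500.0), 2))  # 1.83, matching the survey average above
```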
[Charts: PUE benchmarks for 26 US data centers and 5 Indian data centers, PUE ranging roughly from 1.0 to 3.5]
More Needed!
The future: computational metrics (e.g., peak FLOPS per watt; transactions per watt)
HVAC
Fan watts/cfm
Pump watts/gpm
Chiller plant (or chiller, or overall HVAC) kW/ton

Air Management
Rack Cooling Index (fraction of IT intakes within the recommended temperature range)
Return Temperature Index: (RAT - SAT) / IT equipment delta-T

Airflow Efficiency
Cooling System Efficiency
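The two air-management metrics can be sketched as simple functions; the sensor readings below are hypothetical:

```python
def rack_cooling_index(inlet_temps_c, lo=18.0, hi=27.0):
    """Fraction of IT inlet temperatures within the recommended range."""
    ok = sum(1 for t in inlet_temps_c if lo <= t <= hi)
    return ok / len(inlet_temps_c)

def return_temperature_index(rat_c, sat_c, it_delta_t_c):
    """RTI = (return air temp - supply air temp) / IT equipment delta-T.
    1.0 means airflow matches IT demand; <1 oversupply; >1 recirculation."""
    return (rat_c - sat_c) / it_delta_t_c

temps = [19.5, 22.0, 24.5, 28.0, 21.0]             # hypothetical inlet readings
print(rack_cooling_index(temps))                    # 0.8: one intake above 27 C
print(return_temperature_index(29.0, 17.0, 12.0))   # 1.0: balanced airflow
```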
IT Equipment and Software Efficiency
The value of one watt saved at the IT equipment
[Charts: single-core Moore's Law and power reduction over time for Intel processors, from the i486 and i486DX2 through the Pentium, Pentium Pro, Pentium II, Pentium III, and Pentium 4, spanning roughly 1970-2010]
Every year Moore's Law is followed, smaller, more energy-efficient transistors result. Miniaturization has provided roughly a million-fold reduction in energy per transistor over 30+ years. Benefits: smaller, faster transistors mean faster AND more energy-efficient chips.
Source: Intel Corporate Technology Group
Old servers consume 60% of energy but deliver only 4% of performance capability.
Data collected recently at a Fortune 100 company; courtesy of John Kuzma and William Carter, Intel
IT Energy Use Patterns: Servers

Idle servers consume as much as 50-60% of their full-load power, as shown in SPECpower benchmarks.
The Uptime Institute reported that 15-30% of servers are on but not being used. Decommissioning goals include:
Regularly inventory and monitor
Consolidate/retire poorly utilized hardware
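A back-of-envelope sketch of what comatose servers cost, using assumed fleet numbers together with the 50-60% idle-draw figure above:

```python
# Rough estimate of energy wasted by comatose servers. Fleet size and
# per-server draw are assumptions; 0.55 is the midpoint of the 50-60%
# idle-draw range, 0.20 sits in the Uptime Institute's 15-30% range.
servers_total = 400          # hypothetical fleet size
comatose_fraction = 0.20
full_load_w = 350.0          # assumed per-server full-load draw
idle_fraction = 0.55

wasted_kw = servers_total * comatose_fraction * full_load_w * idle_fraction / 1000
annual_kwh = wasted_kw * 8760
print(f"{wasted_kw:.1f} kW wasted, {annual_kwh:,.0f} kWh/yr")
```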
Virtualization consolidates R&D and production workloads onto shared hardware, 10:1 in many cases.
Enables rapid deployment, reducing the number of idle, staged servers.
Disaster recovery: upholds high levels of business continuity with one standby for many production servers.
Dynamic load balancing: shifting virtual machines consolidates load, e.g., one host at 90% CPU usage instead of several at 30%.
80 PLUS Certification Levels

Minimum efficiency at rated load:

                    115 V internal non-redundant   230 V internal redundant
Level               20%     50%     100%           20%     50%     100%
80 PLUS             80%     80%     80%            n/a     n/a     n/a
80 PLUS Bronze      82%     85%     82%            81%     85%     81%
80 PLUS Silver      85%     88%     85%            85%     89%     85%
80 PLUS Gold        87%     90%     87%            88%     92%     88%
80 PLUS Platinum    n/a     n/a     n/a            90%     94%     91%
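One way to use the table: compare power-supply losses across certification levels. The helper below assumes 115 V non-redundant operation at 50% load, with efficiencies transcribed from the table:

```python
# 80 PLUS minimum efficiencies at 50% load, 115 V internal non-redundant,
# transcribed from the table above.
EFF_AT_50PCT = {"80 PLUS": 0.80, "Bronze": 0.85, "Silver": 0.88, "Gold": 0.90}

def psu_loss_w(dc_load_w: float, level: str) -> float:
    """Heat dissipated in the power supply at the given certification level."""
    eff = EFF_AT_50PCT[level]
    return dc_load_w / eff - dc_load_w

# A 250 W DC load (50% of a hypothetical 500 W supply): Gold vs. base 80 PLUS
saving = psu_loss_w(250, "80 PLUS") - psu_loss_w(250, "Gold")
print(f"Gold saves {saving:.1f} W of loss per supply at this load")
```

Remember that every watt of loss avoided here also avoids roughly a watt of cooling work, per the "value of one watt saved at the IT equipment" point above.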
Power Supplies
Storage Devices
Take superfluous data offline
Use thin provisioning technology

Consolidation
Use virtualization
Consider cloud services
Using IT to Manage IT
Innovative Application of IT in Data Centers for Energy Efficiency
IT systems and network administrators have visualization tools, useful for debugging, benchmarking, capacity planning, and forensics. Data center facility managers have had comparatively poor visualization tools.
Presents real-time feedback and historic tracking Optimize based on empirical data, not intuition.
Image: SynapSense
[Floor plan: multiple CRAC units serving the data center floor]
Rack-Top Temperatures
Enhanced knowledge of data center redundancy. Turned off unnecessary CRAC units to save energy.
Rack-Top Temperatures
[Chart: CRAH03 supply air temperatures, 50-75°F, across time intervals A-F]
Wireless sensors and control software: before/after monitoring

Cost-Benefit Analysis:
DASH cost-benefit (sensors and software): cost $56,824; savings $30,564; payback 1.9 years
Total project cost-benefit: cost $134,057; savings $42,772; payback 3.1 years
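The paybacks above are simple cost-over-savings arithmetic, which implies the savings figures are annual:

```python
def simple_payback(cost: float, annual_savings: float) -> float:
    """Simple payback period in years (ignores discounting)."""
    return cost / annual_savings

# Figures from the DASH cost-benefit slide above
print(round(simple_payback(56824, 30564), 1))   # sensors and software
print(round(simple_payback(134057, 42772), 1))  # total project
```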
An Emerging Technology Control data center air conditioning using the built-in IT server-equipment temperature sensors
Intel Demonstration: typically, data center cooling devices use return air temperature as the primary control variable. However, ASHRAE and IT manufacturers agree that IT equipment inlet air temperature is the key operational parameter, which makes optimum control difficult.
Dashboards
Dashboards can display information from multiple systems for monitoring and maintaining data center performance.
Why Dashboards?
Provide IT and HVAC system performance at a glance Identify operational problems Baseline energy use and benchmark performance View effects of changes Share information and inform integrated decisions
Evaluate monitoring systems to enhance operations and controls Install dashboards to manage and sustain energy efficiency.
Environmental Conditions
Environmental Conditions
Environmental Conditions
ASHRAE's Thermal Guidelines:
Provide a common understanding between IT and facility staff
Endorsed by IT manufacturers
Enable large energy savings, especially when using economizers
Recommend a temperature range of 18°C to 27°C (80.6°F), with allowable temperatures much higher

New (2011) ASHRAE Guidelines:
Six classes of equipment identified, with wider allowable upper limits ranging from 32°C to 45°C (113°F) depending on class
Provide more justification for operating above the recommended limits (in the allowable range)
Provide wider humidity ranges
eere.energy.gov
The recommended range is a statement of reliability. For extended periods of time, the IT manufacturers recommend that data centers maintain their environment within these boundaries. The allowable range is a statement of functionality. These are the boundaries where IT manufacturers test their equipment to verify that the equipment will function.
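A small sketch of checking an inlet reading against these two ranges; the 15-32°C allowable band used here assumes ASHRAE class A1 equipment:

```python
def classify_inlet_temp(t_c: float) -> str:
    """Classify an IT inlet temperature against ASHRAE ranges.
    Recommended: 18-27 C. Allowable: 15-32 C (assuming class A1)."""
    if 18.0 <= t_c <= 27.0:
        return "recommended"
    if 15.0 <= t_c <= 32.0:
        return "allowable"
    return "out of range"

print(classify_inlet_temp(24.0))  # recommended
print(classify_inlet_temp(30.0))  # allowable: functional, per above
```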
[Psychrometric chart: ASHRAE recommended and allowable envelopes plotted against dry-bulb temperature]
ASHRAE's key conclusion when considering the potential for increased failures at higher (allowable) temperatures: "For a majority of US and European cities, the air-side and water-side economizer projections show failure rates that are very comparable to a traditional data center run at a steady state temperature of 20°C."
Building supplied cooling liquid maximum temperature by liquid cooling class:
L1: 17°C (63°F) - chiller
L2: 32°C (89°F) - chiller not needed
L3: 43°C (110°F) - dry cooler
Airflow Management
Air Management
Typically, more air is circulated than required. Air mixing and short circuiting lead to:
Low supply temperature
Low delta-T
Use hot and cold aisles Improve isolation of hot and cold aisles
Reduce fan energy Improve air-conditioning efficiency Increase cooling capacity
Hot aisle / cold aisle configuration decreases mixing of intake & exhaust air, promoting efficiency.
Leakage wastes cooling capacity and increases inlet temperature to servers.
[Thermal maps (SynapSense): air recirculation from rack outlet to inlet, measured at the middle of the rack]
Empty Floor & Ceiling Cavities
Under-floor pressure map with wireless sensors (SynapSense)
[SynapSense map: rack-top temperatures]
[Elevation at a cold aisle, looking at racks: "too cold" at the bottom, "just right" in the middle, "too hot" at the top, as cold air escapes through the ends of aisles]
There are numerous references in ASHRAE. See, for example, V. Sorell et al., "Comparison of Overhead and Underfloor Air Delivery Systems in a Data Center Environment Using CFD Modeling," ASHRAE Symposium Paper DE-05-11-5, 2005.
Before
After
Return air duct on top of CRAC unit connects to the return air plenum.
Physical barriers enhance separation of hot and cold airflow. Barrier placement must comply with fire codes. Curtains, doors, or lids have been used successfully.
[Photos: containment using doors, lids, and open aisle ends]
LBNL's cold aisle containment study achieved fan energy savings of ~75%.
[Chart: rack temperatures (°F) at low, medium, and high positions, baseline vs. Alternate 1 setup, June 13-16, 2006]
Energy intensive IT equipment needs good isolation of cold inlet and hot discharge. Supply airflow can be reduced if no mixing occurs. Overall temperature can be raised if air is delivered without mixing. Cooling systems and economizers use less energy with warmer return air temperatures. Cooling capacity increases with warmer air temperatures.
Cooling systems
Air Management and Cooling Efficiency Linking good air management and an optimized cooling system:
Improved efficiencies Increased cooling capacity More hours for air-side and water-side free cooling Lower humidification/dehumidification energy Reduced fan energy
CRAC units: fan, direct expansion (DX) coil, and refrigerant compressor
CRAH units: fan and chilled water coil; typically in larger facilities with a chiller plant
Both are often equipped with humidifiers and reheat for dehumidification
Often independently controlled; tight ranges and poor calibration lead to "fighting"
[Diagrams: evaporatively-cooled DX, water-cooled DX, and water-cooled chiller configurations, showing refrigerant and water loops]
Optimize the Chiller Plant
Have a plant (vs. distributed cooling)
Use warm water cooling (multi-loop)
Size cooling towers for free cooling
Integrate controls and monitor efficiency of all primary components
Consider thermal storage
Utilize variable speed drives on fans, pumps, towers, and chillers
[Chart: chiller efficiency (kW/ton, 0.1-0.8) vs. load (tons). Data provided by York International Corporation.]
Moving (Back) to Liquid Cooling

As heat densities rise, liquid solutions become more attractive: water's volumetric heat capacity is orders of magnitude higher than air's.
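The volumetric-heat-capacity comparison in numbers, using standard room-temperature properties:

```python
# Volumetric heat capacity comparison behind the liquid-cooling argument.
# Fluid properties at roughly room temperature and sea-level pressure.
water_cp = 4186.0    # specific heat, J/(kg*K)
water_rho = 998.0    # density, kg/m^3
air_cp = 1006.0      # J/(kg*K)
air_rho = 1.2        # kg/m^3

water_vol_cap = water_cp * water_rho   # J/(m^3*K)
air_vol_cap = air_cp * air_rho
ratio = water_vol_cap / air_vol_cap
print(f"Water carries ~{ratio:,.0f}x more heat per unit volume than air")
```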
In-Row Liquid Cooling
Cooling Coil
Rear-Door Liquid Cooling
[Photos: rear door open and closed; inside rack RDHx, open 90°]
On-Board Cooling
Outside-Air Economizers

Advantages:
Lower energy use
Added reliability (backup for the cooling system)

Potential issues:
Space
Dust: not a concern with MERV 13 filters
Gaseous contaminants: not widespread, and they impact normally cooled data centers as well
http://cooling.thegreengrid.org/namerica/WEB_APP/calc_index.html
[Psychrometric chart: wet-bulb lines (30-70°F) over dry-bulb temperatures from 10 to 105°F, showing the economizer operating modes]
Air-Side Economizer (93% of hours)
Direct evaporative cooling for humidification/precooling
Low pressure-drop design (1.5 total static at peak)

Hours of operation by mode:
Mode 1: 100% economizer - 2207 hrs
Mode 2: OA + RA - 5957 hrs
Mode 3: Humidification - 45 hrs
Mode 4: Humidification + CH cooling - 38 hrs
Mode 5: CH only - 513 hrs
Total: 8760 hrs
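The mode table sums to a full year; a quick check of the hours and of the ~93% economizer figure:

```python
# Annual operating-hour breakdown from the economizer mode table above.
mode_hours = {
    "100% economizer": 2207,
    "OA + RA": 5957,
    "humidification": 45,
    "humidification + CH cooling": 38,
    "CH only": 513,
}
total = sum(mode_hours.values())
econ = mode_hours["100% economizer"] + mode_hours["OA + RA"]
print(total)                   # 8760: every hour of the year accounted for
print(f"{econ / total:.0%}")   # ~93% of hours in an economizer mode
```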
Water Cooling:
Tower side economizer Four pipe system Waste heat reuse Headers, valves and caps for modularity and flexibility
Water-Side Economizers

Advantages:
Cost effective in cool and dry climates
Often an easier retrofit
Added reliability (backup in the event of chiller failure)
No contamination questions
Integrated Water-Side Economizer
[Schematic: chilled water supply 44°F (7°C), return 60°F (16°C)]
Make sure that the heat exchanger is in series with the chillers on the chilled water side, not in parallel.
Potential for Tower Cooling
On-Board Cooling / Rear-Door Cooling / Water-Side Economizer
Disconnect humidity control, or control humidity only on makeup air or a single CRAC/CRAH unit. Or entirely disconnect it (many have!).
Cost of Unnecessary Humidification

Unit     Visalia probe (Temp °F / RH % / Tdp °F)   CRAC panel (Temp °F / RH % / Tdp °F)   Mode
AC 005   84.0 / 27.5 / 47.0                        76 / 32.0 / 44.1                       Cooling
AC 006   81.8 / 28.5 / 46.1                        55 / 51.0 / 37.2                       Cooling & Dehumidification
AC 007   72.8 / 38.5 / 46.1                        70 / 47.0 / 48.9                       Cooling
AC 008   80.0 / 31.5 / 47.2                        74 / 43.0 / 50.2                       Cooling & Humidification
AC 010   77.5 / 32.8 / 46.1                        68 / 45.0 / 45.9                       Cooling
AC 011   78.9 / 31.4 / 46.1                        70 / 43.0 / 46.6                       Cooling & Humidification
Min      72.8 / 27.5 / 46.1                        55.0 / 32.0 / 37.2
Max      84.0 / 38.5 / 47.2                        76.0 / 51.0 / 50.2
Avg      79.2 / 31.7 / 46.4                        68.8 / 43.5 / 45.5
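The "fighting" shows up when units hold different dew points. A sketch using the Magnus approximation, fed two of the panel temperature/RH readings from the table above:

```python
import math

def dew_point_c(t_c: float, rh_pct: float) -> float:
    """Dew point via the Magnus approximation (valid roughly -45 to 60 C)."""
    a, b = 17.62, 243.12
    gamma = math.log(rh_pct / 100.0) + a * t_c / (b + t_c)
    return b * gamma / (a - gamma)

def f_to_c(f: float) -> float:
    return (f - 32.0) * 5.0 / 9.0

# Two CRAC panel readings from the table above: AC 006 (55 F, 51% RH) is
# dehumidifying while AC 008 (74 F, 43% RH) humidifies toward a higher
# dew point -- neighboring units working against each other.
print(dew_point_c(f_to_c(55), 51))
print(dew_point_c(f_to_c(74), 43))
```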
Cooling Takeaways
Use efficient equipment and a central plant (e.g. chiller/CRAHs) vs. CRAC units Use centralized controls on CRAC/CRAH units
Prevent simultaneous humidifying and dehumidifying Optimize sequencing and staging
Move to liquid cooling (room, row, rack, chip) Consider VSDs on fans, pumps, chillers, and towers Use air- or water-side economizers where possible. Expand humidity range and improve humidity control (or disconnect).
Electrical Systems
Electrical Distribution
Every power conversion (AC-DC, DC-AC, AC-AC) loses some energy and creates heat
Efficiency decreases when systems are lightly loaded
Distributing higher voltage is more efficient and can save capital cost (conductor size is smaller)
Power supply, uninterruptible power supply (UPS), transformer, and PDU efficiencies vary; select carefully
Lowering distribution losses also lowers cooling loads
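Conversion losses multiply through the chain. The stage efficiencies below are illustrative assumptions, not measured values:

```python
# Cascaded conversion losses in a hypothetical distribution chain.
# Each stage efficiency is an illustrative assumption.
stages = {
    "transformer": 0.98,
    "UPS": 0.92,
    "PDU": 0.98,
    "server power supply": 0.88,
}
overall = 1.0
for name, eff in stages.items():
    overall *= eff  # losses compound multiplicatively
print(f"Overall distribution efficiency: {overall:.1%}")
# Every watt lost in these stages must also be removed by the cooling system.
```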
Efficiency

Efficiencies vary with system design, equipment, and load; redundancies impact efficiency.
[Chart: UPS efficiency (70-100%) vs. percent of rated active power load (0-100%)]
Redundant Operation
Typical operation
Redundancy
Understand what redundancy costs and what it gets you: is it worth it? Does everything need the same level?
Different strategies have different energy penalties (e.g., 2N vs. N+1)
It's possible to more fully load UPS systems and still achieve the desired redundancy
Redundancy in electrical distribution pushes you down the efficiency curve
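A sketch of how the redundancy scheme sets per-unit loading, and thus where each UPS sits on its efficiency curve; module sizes and loads are hypothetical:

```python
import math

def load_per_ups(it_load_kw: float, capacity_kw: float, scheme: str) -> float:
    """Fraction of rated capacity each UPS carries in normal operation."""
    n = math.ceil(it_load_kw / capacity_kw)   # units needed with no redundancy
    if scheme == "2N":
        units = 2 * n       # two complete systems
    elif scheme == "N+1":
        units = n + 1       # minimum units plus one spare
    else:
        raise ValueError(scheme)
    return it_load_kw / (units * capacity_kw)

# 900 kW of IT load on hypothetical 500 kW UPS modules:
print(f"2N:  {load_per_ups(900, 500, '2N'):.0%} per UPS")   # lightly loaded
print(f"N+1: {load_per_ups(900, 500, 'N+1'):.0%} per UPS")  # better loaded
```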
380 V DC Power Distribution
Eliminates several conversions between the utility and the server
Also use for lighting and variable speed drives
Use with on-site generation, including renewable energy sources
Backup via batteries or flywheel
Lighting offers a small relative benefit but is easy to accomplish, and it also saves HVAC energy. Use energy-efficient lighting, and locate lights over the aisles.
Motors and Drives Since most cooling system equipment operates continuously, premium efficiency motors should be specified everywhere Variable speed drives should be used
Chillers Pumps Air handler fans Cooling tower fans
Resources
Resources
Advanced Manufacturing Office: tool suite and metrics for baselining, training, qualified specialists, case studies, recognition of high energy savers, R&D technology development
GSA: workshops, Quick Start Efficiency Guide, technical assistance
EPA: metrics, server performance rating and ENERGY STAR label, data center benchmarking
Federal Energy Management Program Workshops Federal case studies Federal policy guidance Information exchange & outreach Access to financing opportunities Technical assistance
Industry Tools Metrics Training Best practice information Best-in-Class guidelines IT work productivity standard
Resources from DOE Federal Energy Management Program (FEMP)

Best Practices Guide
Benchmarking Guide
Data Center Programming Guide
Technology Case Study Bulletins
Procurement Specifications
Report Templates
Process Manuals
Quick Start Guide
In-Depth Assessment Tools: Savings

Air Management: hot/cold separation, environmental conditions, RCI and RTI
Electrical Systems: UPS, PDU, transformers, lighting, standby generation
IT Equipment: servers, storage and networking, software
Cooling: air handlers/conditioners; chillers, pumps, and fans; free cooling
Training & Certificate Disciplines / Levels / Tracks

Level 1 Generalist: prequalification, training, and exam on all disciplines + assessment process + DC Pro Profiling Tool
Level 2 Specialist: prequalification, training, and exam on select disciplines + assessment process + DC Pro System Assessment Tools

Disciplines: IT Equipment, Air Management, Cooling Systems, and Electrical Systems
Tools: HVAC (available), IT (2012)
Two tracks: certificate track (training + exam); training track (training only)
Energy Star
Resources
http://www1.eere.energy.gov/femp/program/data_center.html
http://hightech.lbl.gov/datacenters.html
http://www.energystar.gov/index.cfm?c=prod_development.server_efficiency
http://www1.eere.energy.gov/industry/datacenters/
Workshop Summary
Best Practices
Results at LBNL
More to come!
More to Come
Next Steps for LBNL's Legacy Data Center
Integrate CRAC controls with the wireless monitoring system
Retrofit CRACs with VSDs (a small VAV turndown yields big energy savings)
Improve containment (overcome curtain problems)
Increase liquid cooling (HP in-rack and APC in-row)
Increase free cooling (including tower upgrade)
Data Center Best Practices Summary
1. Measure and benchmark energy use
2. Identify IT opportunities
3. Use IT to control IT
4. Manage airflow
5. Optimize environmental conditions
6. Evaluate cooling options
7. Improve electrical efficiency
8. Implement energy efficiency measures
Data Center Best Practices
Most importantly: get IT and Facilities people talking and working together as a team!
Contact Information:
Dale Sartor, P.E.
Lawrence Berkeley National Laboratory
Applications Team
MS 90-3111
University of California
Berkeley, CA 94720
DASartor@LBL.gov
(510) 486-5988
http://Ateam.LBL.gov