
Best Practices for Data Center Energy Efficiency Workshop

DataCenterDynamics, Mumbai, India, May 25, 2012

Presented by:

Dale Sartor, P.E., Applications Team Leader, Building Technologies, Lawrence Berkeley National Laboratory


(Version: 4/29/12)

1 | Lawrence Berkeley National Laboratory

eere.energy.gov

Download Presentation

This Presentation is Available for download at:


http://datacenterworkshop.lbl.gov/

2 | Lawrence Berkeley National Laboratory

eere.energy.gov

Agenda

Introduction
Performance metrics and benchmarking
IT equipment and software efficiency
Use IT to save IT (monitoring and dashboards)
Data center environmental conditions
Airflow management
Cooling systems
Electrical systems
Resources
Workshop summary

3 | Lawrence Berkeley National Laboratory

eere.energy.gov

Challenging Conventional Wisdom: Game Changers

Conventional approach:
Data centers need to be cool and controlled to tight humidity ranges
Data centers need raised floors for cold air distribution
Data centers require highly redundant building infrastructure

Needed: a holistic approach and an IT and Facilities partnership

4 | Lawrence Berkeley National Laboratory

eere.energy.gov

Introduction

5 | Lawrence Berkeley National Laboratory

eere.energy.gov

Workshop Learning Objectives

Provide background on data center efficiency
Raise awareness of efficiency opportunities
Develop common understanding between IT and Facility staff
Review data center efficiency resources
Group interaction on common issues and solutions

6 | Lawrence Berkeley National Laboratory

eere.energy.gov

High Tech Buildings are Energy Hogs

7 | Lawrence Berkeley National Laboratory

eere.energy.gov

Data Center Energy

Data centers are energy-intensive facilities:

10 to 100 times more energy intensive than an office
Server racks now designed for more than 25 kW
Surging demand for data storage
1.5% of US electricity consumption
Projected to double in the next 5 years
Power and cooling constraints in existing facilities

8 | Lawrence Berkeley National Laboratory

eere.energy.gov

World Data Center Electricity Use - 2000 and 2005

Source: Koomey 2008


9 | Lawrence Berkeley National Laboratory eere.energy.gov

How Much is 152B kWh?

Chart: final electricity consumption (billion kWh, scale 0-300) for Italy, South Africa, Mexico, World Data Centers, Iran, Sweden, and Turkey.


Source for country data in 2005: International Energy Agency, World Energy Balances (2007 edition)
10 | Lawrence Berkeley National Laboratory eere.energy.gov

Aggressive Programs Make a Difference


Energy efficiency programs have helped keep per capita electricity consumption in California flat over the past 30 years
Chart: per-capita electricity consumption (kWh), 1960-2000, for the US, California, and Western Europe.

11 | Lawrence Berkeley National Laboratory

eere.energy.gov

Projected Data Center Energy Use

Current Estimated Energy Use

EPA Report to Congress 2008 12 | Lawrence Berkeley National Laboratory eere.energy.gov

The Rising Cost of Ownership

From 2000 to 2006, computing performance increased 25x but energy efficiency increased only 8x
The amount of power consumed per $1,000 of servers purchased has increased 4x

The cost of electricity and supporting infrastructure now surpasses the capital cost of the IT equipment
Perverse incentives: IT and facilities costs are accounted for separately

Source: The Uptime Institute, 2007


13 | Lawrence Berkeley National Laboratory eere.energy.gov

Lawrence Berkeley National Laboratory
LBNL operates large systems along with legacy systems

We also research energy efficiency opportunities and work on various deployment programs
14 | Lawrence Berkeley National Laboratory eere.energy.gov

LBNL Feels the Pain!

15 | Lawrence Berkeley National Laboratory

eere.energy.gov

LBNL Super Computer Systems Power

Chart: NERSC computer systems power (does not include cooling power; OSF: 4 MW max), in megawatts, projected from 2001 to 2017, broken out by system (N8, N7, N6, N5b, N5a, NGF, Bassi, Jacquard, N3E, N3, PDSF, HPSS, Misc).

16 | Lawrence Berkeley National Laboratory

Data Center Energy Efficiency = 15% (or less)

Energy Efficiency = Useful computation / Total Source Energy

Diagram: typical data center energy end use, showing 100 units of source energy, power generation (35 units), power conversions & distribution, cooling equipment, and the server load / computing operations (33 units delivered).

17 | Lawrence Berkeley National Laboratory

eere.energy.gov

Energy Efficiency Opportunities


Server Load / Computing Operations:
Server innovation
Virtualization
High efficiency power supplies
Load management

Cooling Equipment:
Better air management
Move to liquid cooling
Optimized chilled-water plants
Use of free cooling
Heat recovery

Power Conversion & Distribution:
High voltage distribution
High efficiency UPS
Efficient redundancy strategies
Use of DC power

On-site Generation:
On-site generation, including fuel cells and renewable sources
CHP applications (waste heat for cooling)
eere.energy.gov

18 | Lawrence Berkeley National Laboratory

Potential Benefits of Data Center Energy Efficiency

20-40% savings are typical
Aggressive strategies can yield 50%+ savings
Extend the life and capacity of existing infrastructure

19 | Lawrence Berkeley National Laboratory

eere.energy.gov

20 | Lawrence Berkeley National Laboratory

eere.energy.gov

Performance Metrics and Benchmarking

21 | Lawrence Berkeley National Laboratory

eere.energy.gov

Electricity Use in Data Centers

Courtesy of Michael Patterson, Intel Corporation
22 | Lawrence Berkeley National Laboratory eere.energy.gov

Benchmarking for Energy Performance Improvement:

Energy benchmarking can allow comparison to peers and help identify best practices. LBNL conducted studies of over 30 data centers:
Wide variation in performance
Identified best practices

"Can't manage what isn't measured"


23 | Lawrence Berkeley National Laboratory eere.energy.gov

Your Mileage Will Vary

Energy performance varies: the relative percentage of the energy that goes to computing varies considerably.

Example breakdown A: Computer Loads 67%, HVAC Chiller and Pumps 24%, HVAC Air Movement 7%, Lighting 2%.
Example breakdown B: Data Center Server Load 51%, Data Center CRAC Units 25%, Other 13%, Electrical Room Cooling 4%, Cooling Tower Plant 4%, Lighting 2%, Office Space Conditioning 1%.

24 | Lawrence Berkeley National Laboratory

eere.energy.gov

Benchmarks Obtained by LBNL

High-level metric: Power Utilization Effectiveness (PUE) = Total Power / IT Power

Average PUE = 1.83

25 | Lawrence Berkeley National Laboratory

eere.energy.gov

PUE for India Data Centers

Charts: PUE (data center total power / IT power) benchmarks, scale 0.00 to 3.50. The US chart shows 26 data centers; the India chart shows 5 data centers.

26 | Lawrence Berkeley National Laboratory

eere.energy.gov

Indian Data Center Benchmarking Sources Thanks to:

Intel Hewlett Packard APC Maruti Texas Instruments

More Needed!

27 | Lawrence Berkeley National Laboratory

eere.energy.gov

Energy Metrics and Benchmarking


Key Metrics:
PUE and partial PUEs (e.g. HVAC, electrical distribution)
Utilization
Energy reuse factor (ERF)

The future: computational metrics (e.g. peak FLOPS per watt, transactions per watt)
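To make the metric arithmetic concrete, here is a minimal sketch (in Python) of computing PUE and a partial PUE from metered power; the kW figures are illustrative assumptions chosen so the result matches the 1.83 average above, not measurements from the benchmarked sites.

```python
# Minimal sketch: PUE and a partial PUE from metered power readings (all values are assumptions).

def pue(total_facility_kw: float, it_kw: float) -> float:
    """Power usage effectiveness: total facility power / IT power."""
    return total_facility_kw / it_kw

def partial_pue(it_kw: float, subsystem_kw: float) -> float:
    """Partial PUE for one subsystem (e.g. HVAC): (IT + subsystem) / IT."""
    return (it_kw + subsystem_kw) / it_kw

it = 1000.0          # kW of IT load (assumed)
hvac = 600.0         # kW of cooling (assumed)
electrical = 230.0   # kW of UPS/PDU/transformer/lighting losses (assumed)

print(f"PUE         = {pue(it + hvac + electrical, it):.2f}")   # 1.83, the benchmark average above
print(f"pPUE (HVAC) = {partial_pue(it, hvac):.2f}")             # 1.60
```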

28 | Lawrence Berkeley National Laboratory

eere.energy.gov

Other Data Center Metrics:


Watts per square foot, watts per rack
Power distribution: UPS efficiency, IT power supply efficiency
Uptime: IT Hardware Power Overhead Multiplier (ITac/ITdc)

HVAC:
Fan watts/cfm
Pump watts/gpm
Chiller plant (or chiller, or overall HVAC) kW/ton

Air management:
Rack cooling index (fraction of IT intake temperatures within the recommended range)
Return temperature index ((RAT - SAT) / IT equipment ΔT; see the sketch after this list)

Lighting watts/square foot
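A minimal sketch of the two air-management metrics as the slide defines them (rack cooling index as the fraction of IT inlets within the recommended range, and RTI from return, supply, and IT temperature rise); the sensor readings and the 18-27 C band are assumptions for illustration.

```python
# Sketch of the slide's simplified air-management metrics (inputs are assumed example data).

def rack_cooling_index(inlet_temps_c, lo_c=18.0, hi_c=27.0):
    """Fraction of IT equipment inlet temperatures inside the recommended range."""
    within = sum(lo_c <= t <= hi_c for t in inlet_temps_c)
    return within / len(inlet_temps_c)

def return_temperature_index(return_air_c, supply_air_c, it_delta_t_c):
    """RTI = (RAT - SAT) / IT equipment temperature rise; near 1.0 means little bypass or recirculation."""
    return (return_air_c - supply_air_c) / it_delta_t_c

inlets = [21.0, 24.5, 26.0, 28.5, 19.0]          # assumed rack inlet readings, deg C
print(f"RCI-style fraction in range: {rack_cooling_index(inlets):.0%}")
print(f"RTI: {return_temperature_index(return_air_c=35.0, supply_air_c=20.0, it_delta_t_c=15.0):.2f}")
```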


29 | Lawrence Berkeley National Laboratory eere.energy.gov

Metrics & Benchmarking


Power Usage Effectiveness

Airflow Efficiency

Cooling System Efficiency

30 | Lawrence Berkeley National Laboratory

eere.energy.gov

31 | Lawrence Berkeley National Laboratory

eere.energy.gov

IT Equipment and Software Efficiency

32 | Lawrence Berkeley National Laboratory

eere.energy.gov

IT Server Performance Saving a Watt

The value of one watt saved at the IT equipment

33 | Lawrence Berkeley National Laboratory

eere.energy.gov

Moore's Law

Charts: single-core Moore's Law trends: power reduction over time and core integer performance over time (386 and i486 through Pentium, Pentium Pro, Pentium II, Pentium III, Pentium 4, Pentium D, and Core 2 Duo; roughly 1986-2008).

Every year Moore's Law is followed, smaller, more energy-efficient transistors result. Miniaturization has provided roughly a 1-million-times reduction in energy per transistor over 30+ years. Benefit: smaller, faster transistors mean faster AND more energy-efficient chips. Source: Intel Corp.
34 | Lawrence Berkeley National Laboratory
Source: Intel Corporate Technology Group

eere.energy.gov

IT Equipment Age and Performance


Charts: age distribution of servers, energy consumption of servers, and performance capability of servers, grouped by vintage (2007 & earlier; 2008-2009; 2010-current).

Old servers consume 60% of the energy but deliver only 4% of the performance capability.
Data collected recently at a Fortune 100 company; courtesy of John Kuzma and William Carter, Intel
35 | Lawrence Berkeley National Laboratory eere.energy.gov

Perform IT System Energy Assessments

IT Energy Use Patterns: Servers

Idle servers consume as much as 50-60% of their power at full load, as shown in SPECpower benchmarks.

Chart: server power at no load vs. at full load.


36 | Lawrence Berkeley National Laboratory eere.energy.gov

Decommission Unused Servers


PHYSICALLY RETIRE AN INEFFICIENT OR UNUSED SYSTEM

The Uptime Institute reported 15-30% of servers are on but not being used
Decommissioning goals include:
Regularly inventory and monitor
Consolidate/retire poorly utilized hardware

37 | Lawrence Berkeley National Laboratory

eere.energy.gov

Virtualize and Consolidate Servers and Storage


Run many virtual machines on a single physical machine
Developed in the 1960s to achieve better efficiency
Consolidate underutilized physical machines, increasing utilization
Energy is saved by shutting down underutilized machines (a rough savings sketch follows)
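A minimal sketch of the consolidation arithmetic behind these bullets, assuming the idle-power figure from the SPECpower slide (idle draw around 60% of full power) plus illustrative fleet size, consolidation ratio, and electricity price; none of these numbers come from a specific measured site.

```python
# Rough virtualization/consolidation savings estimate (all inputs are illustrative assumptions).

def annual_kwh(power_w: float, hours: float = 8760.0) -> float:
    return power_w * hours / 1000.0

physical_servers = 100          # assumed fleet size before consolidation
full_power_w = 300.0            # assumed server power at full load
idle_fraction = 0.6             # idle draw ~60% of full power (SPECpower slide)
avg_utilization = 0.10          # assumed: lightly used servers sit near idle
consolidation_ratio = 10        # "10:1 in many cases" per the next slide
price_per_kwh = 0.10            # assumed electricity price, $/kWh

# Before: many mostly idle servers; approximate power by interpolating idle -> full load.
per_server_w = full_power_w * (idle_fraction + (1 - idle_fraction) * avg_utilization)
before_kwh = physical_servers * annual_kwh(per_server_w)

# After: fewer hosts running at higher utilization (assumed ~60%).
hosts = physical_servers // consolidation_ratio
host_w = full_power_w * (idle_fraction + (1 - idle_fraction) * 0.6)
after_kwh = hosts * annual_kwh(host_w)

print(f"IT energy before: {before_kwh:,.0f} kWh/yr, after: {after_kwh:,.0f} kWh/yr")
print(f"Savings: ${(before_kwh - after_kwh) * price_per_kwh:,.0f}/yr at the IT plug (before cooling and UPS effects)")
```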
38 | Lawrence Berkeley National Laboratory eere.energy.gov

Virtualize and Consolidate Servers and Storage

Diagram: four virtualization use cases (applications and operating systems running on a virtual machine monitor over shared hardware).

Server consolidation: 10:1 in many cases
Workload provisioning: enables rapid deployment, reducing the number of idle, staged servers
Disaster recovery: one standby for many production servers, upholding high levels of business continuity
Dynamic load balancing: balancing utilization (e.g. 90% vs. 30% CPU usage) with head room


39 | Lawrence Berkeley National Laboratory eere.energy.gov

Cloud Computing

Virtualized cloud computing can provide:

Dynamically scalable resources over the internet
Can be internal or external
Can balance different application peak loads
Typically achieves high utilization rates

40 | Lawrence Berkeley National Laboratory

eere.energy.gov

Storage Systems and Energy


Power is roughly linear with the number of storage modules
Storage redundancy significantly increases energy
Consider lower-energy hierarchical storage
Storage deduplication
Eliminate unnecessary copies

41 | Lawrence Berkeley National Laboratory

eere.energy.gov

Use Efficient Power Supplies


Chart: LBNL/EPRI measured power supply efficiency vs. load, with the typical operating range marked.

42 | Lawrence Berkeley National Laboratory

eere.energy.gov

Use Efficient Power Supplies

Power supply units:
Most efficient in the midrange of their performance curves
Right-size for the load
Power supply redundancy puts operation lower on the curve
Use ENERGY STAR or Climate Savers power supplies

Source: The Green Grid 43 | Lawrence Berkeley National Laboratory eere.energy.gov

Use Efficient Power Supplies

80 PLUS Certification Levels (efficiency at rated load)

                        115V Internal Non-Redundant      230V Internal Redundant
Certification            20%     50%    100%              20%     50%    100%
80 PLUS                  80%     80%     80%              n/a     n/a     n/a
80 PLUS Bronze           82%     85%     82%              81%     85%     81%
80 PLUS Silver           85%     88%     85%              85%     89%     85%
80 PLUS Gold             87%     90%     87%              88%     92%     88%
80 PLUS Platinum         n/a     n/a     n/a              90%     94%     91%

44 | Lawrence Berkeley National Laboratory

eere.energy.gov

Use Efficient Power Supplies


Power supply savings add up
Chart: annual savings per rack (up to ~$8,000) for a standard vs. high-efficiency power supply, as a function of watts per server (200-500 W); the savings stack across the power supply, UPS, and mechanical (cooling) systems.
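A minimal sketch of how those per-rack savings could be estimated, cascading a power-supply efficiency improvement through assumed UPS and cooling overheads; the efficiencies, rack density, and electricity price are illustrative assumptions.

```python
# Rough per-rack savings from a higher-efficiency power supply (all inputs are assumptions).

servers_per_rack = 40
watts_per_server = 400.0        # DC power delivered to server components (assumed)
hours_per_year = 8760.0
price_per_kwh = 0.10            # $/kWh, assumed

psu_eff_std, psu_eff_high = 0.75, 0.90    # assumed standard vs. high-efficiency supplies
ups_eff = 0.90                            # assumed UPS efficiency
cooling_overhead = 0.5                    # assumed kW of cooling per kW of heat removed

def rack_input_kw(psu_eff: float) -> float:
    """AC power drawn at the UPS input for one rack."""
    return servers_per_rack * watts_per_server / psu_eff / ups_eff / 1000.0

delta_kw = rack_input_kw(psu_eff_std) - rack_input_kw(psu_eff_high)
delta_kw_with_cooling = delta_kw * (1 + cooling_overhead)
print(f"Annual savings per rack: ${delta_kw_with_cooling * hours_per_year * price_per_kwh:,.0f}")
```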


45 | Lawrence Berkeley National Laboratory eere.energy.gov

IT System Efficiency Summary

Servers:
Enable power management capabilities!
Use ENERGY STAR servers

Power supplies:
Reconsider redundancy
Use 80 PLUS or Climate Savers products

Storage devices:
Take superfluous data offline
Use thin provisioning technology

Consolidation:
Use virtualization
Consider cloud services

46 | Lawrence Berkeley National Laboratory

eere.energy.gov

References and Resources

www.ssiforums.org www.80plus.org www.climatesaverscomputing.org http://tc99.ashraetcs.org/

47 | Lawrence Berkeley National Laboratory

eere.energy.gov

48 | Lawrence Berkeley National Laboratory

eere.energy.gov

Using IT to Manage IT
Innovative Application of IT in Data Centers for Energy Efficiency

49 | Lawrence Berkeley National Laboratory

eere.energy.gov

Use IT to Manage IT Energy


Using IT to save energy in IT: most operators lack visibility into their data center environment. An operator can't manage what they don't measure.

Goals:
Provide the same level of monitoring and visualization of the physical space that exists for monitoring the IT environment.
Measure and track performance metrics.
Spot problems before they result in high energy cost or downtime.

50 | Lawrence Berkeley National Laboratory

eere.energy.gov

The Importance of Visualization

IT Systems & network administrators have tools for visualization. Useful for debugging, benchmarking, capacity planning, forensics. Data center facility managers have had comparatively poor visualization tools.

51 | Lawrence Berkeley National Laboratory

eere.energy.gov

LBNL Wireless Sensor Installation


LBNL installed an 800+ point sensor network. It measures:
Temperature
Humidity
Pressure (under floor)
Electrical power

Presents real-time feedback and historic tracking
Optimize based on empirical data, not intuition.

Image: SynapSense

52 | Lawrence Berkeley National Laboratory

eere.energy.gov

Real-time Temperature Visualization by Level

53 | Lawrence Berkeley National Laboratory

eere.energy.gov

Displayed Under-floor Pressure Map

Image: under-floor pressure map showing the CRAC unit locations.

54 | Lawrence Berkeley National Laboratory

eere.energy.gov

Provided Real-time Feedback During Floor-tile Tuning


Under-Floor Pressure

Removed guesswork by monitoring and using visualization tool.

Rack-Top Temperatures

55 | Lawrence Berkeley National Laboratory

eere.energy.gov

Determined Relative CRAC Cooling Energy Impact


Under-Floor Pressure

Enhanced knowledge of data center redundancy. Turned off unnecessary CRAC units to save energy.

Rack-Top Temperatures

56 | Lawrence Berkeley National Laboratory

eere.energy.gov

Feedback Continues to Help: Note impact of IT cart!


Real-time feedback identified cold aisle air flow obstruction!

57 | Lawrence Berkeley National Laboratory

eere.energy.gov

Real-time PUE Display

58 | Lawrence Berkeley National Laboratory

eere.energy.gov

PUE Calculation Diagram

59 | Lawrence Berkeley National Laboratory

eere.energy.gov

Franchise Tax Board (FTB) Case Study


Description:
10,000 sq ft
12 CRAH cooling units
135 kW load

Challenges:
Overprovisioned
History of fighting among cooling units
Manual shutoff not successful

Solution:
Intelligent supervisory control software with inlet-air sensing
60 | Lawrence Berkeley National Laboratory eere.energy.gov

FTB Wireless Sensor Network


The WSN included 50 wireless temperature sensors (Dust Networks radios) and intelligent control software

61 | Lawrence Berkeley National Laboratory

eere.energy.gov

WSN Smart Software: learns about curtains


Images: CRAH 3 influence at start vs. CRAH 3 influence after curtains were installed.
62 | Lawrence Berkeley National Laboratory eere.energy.gov

WSN Provided Effect on Cold-aisle Temperatures:


Chart: cold-aisle temperatures (deg F) over time intervals A-F, held between the lower and upper limits of the ASHRAE recommended range as successive measures were applied (rack blanking panels, VFDs, hot aisle isolation, floor tile changes, control software).

63 | Lawrence Berkeley National Laboratory eere.energy.gov

WSN Software = Dramatic Energy Reduction


Chart: main breaker power (kW, 0-60 scale) vs. date, late January through mid-February, showing a marked drop after the DASH control software started (before vs. after).

64 | Lawrence Berkeley National Laboratory eere.energy.gov

Cost-Benefit Analysis:

DASH cost-benefit (sensors and software):
Cost: $56,824
Savings: $30,564/yr
Payback: 1.9 years

Total project cost-benefit:
Cost: $134,057
Savings: $42,772/yr
Payback: 3.1 years
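The payback figures follow directly from simple payback arithmetic (installed cost divided by annual savings); a two-line check using the slide's numbers:

```python
# Simple payback = installed cost / annual savings (figures from the slide above).
for label, cost, annual_savings in [("DASH only", 56824, 30564), ("Total project", 134057, 42772)]:
    print(f"{label}: {cost / annual_savings:.1f} years")   # ~1.9 and ~3.1 years
```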

65 | Lawrence Berkeley National Laboratory

eere.energy.gov

An Emerging Technology Control data center air conditioning using the built-in IT server-equipment temperature sensors

66 | Lawrence Berkeley National Laboratory

eere.energy.gov

Intel Demonstration

Typically, data center cooling devices use return air temperature as the primary control variable
ASHRAE and IT manufacturers agree that IT equipment inlet air temperature is the key operational parameter
Optimum control is difficult without inlet temperature data

Server inlet air temperature is available from ICT network


Intelligent Platform Management Interface (IPMI) or Simple network management protocol (SNMP)
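A minimal sketch of the supervisory idea the demonstration describes: poll server inlet temperatures (for example over IPMI or SNMP) and nudge the cooling supply setpoint so the hottest inlet stays just below a target. `read_inlet_temps()` and `set_supply_setpoint()` are hypothetical stand-ins for site-specific integrations, and the limits shown are assumptions, not Intel's control sequence.

```python
# Sketch of inlet-temperature-based supervisory control (interfaces and limits are assumptions).
import time

TARGET_MAX_INLET_C = 27.0     # upper end of the ASHRAE recommended range
DEADBAND_C = 1.0
SETPOINT_MIN_C, SETPOINT_MAX_C = 15.0, 24.0

def read_inlet_temps() -> list[float]:
    """Hypothetical: poll server inlet sensors via IPMI/SNMP; returns degrees C."""
    raise NotImplementedError

def set_supply_setpoint(setpoint_c: float) -> None:
    """Hypothetical: write the CRAH/CRAC supply-air setpoint via the facility control system."""
    raise NotImplementedError

def supervise(setpoint_c: float = 18.0, step_c: float = 0.5, period_s: int = 300) -> None:
    while True:
        hottest = max(read_inlet_temps())
        if hottest > TARGET_MAX_INLET_C:                      # too warm: lower supply temperature
            setpoint_c = max(SETPOINT_MIN_C, setpoint_c - step_c)
        elif hottest < TARGET_MAX_INLET_C - DEADBAND_C:       # comfortably cool: raise setpoint to save energy
            setpoint_c = min(SETPOINT_MAX_C, setpoint_c + step_c)
        set_supply_setpoint(setpoint_c)
        time.sleep(period_s)
```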

67 | Lawrence Berkeley National Laboratory

eere.energy.gov

Intel Data Center HVAC:

68 | Lawrence Berkeley National Laboratory

eere.energy.gov

Intel Demonstration

The demonstration showed:

Servers can provide temperature data to the facilities control system
Given server inlet temperature, facility controls improved temperature control and efficiency
Effective communications and control were accomplished without significant interruption or reconfiguration of systems

69 | Lawrence Berkeley National Laboratory

eere.energy.gov

Dashboards
Dashboards can display multiple systems information for monitoring and maintaining data center performance

70 | Lawrence Berkeley National Laboratory

eere.energy.gov

Why Dashboards?
Provide IT and HVAC system performance at a glance
Identify operational problems
Baseline energy use and benchmark performance
View effects of changes
Share information and inform integrated decisions

71 | Lawrence Berkeley National Laboratory

eere.energy.gov

Another Dashboard Example

72 | Lawrence Berkeley National Laboratory

eere.energy.gov

Use IT to Manage IT: Summary

Evaluate monitoring systems to enhance operations and controls
Install dashboards to manage and sustain energy efficiency.

73 | Lawrence Berkeley National Laboratory

eere.energy.gov

74 | Lawrence Berkeley National Laboratory

eere.energy.gov

Environmental Conditions

75 | Lawrence Berkeley National Laboratory

eere.energy.gov

Environmental Conditions

What are the main HVAC Energy Drivers?


IT load
Climate
Room temperature and humidity
Most data centers are overcooled and have humidity control issues
Human comfort should not be a driver

76 | Lawrence Berkeley National Laboratory

eere.energy.gov

Equipment Environmental Specification


Air Inlet to IT Equipment is the important specification to meet

Outlet temperature is not important to IT Equipment

77 | Lawrence Berkeley National Laboratory

eere.energy.gov

Environmental Conditions
ASHRAE's Thermal Guidelines:
Provide a common understanding between IT and facility staff
Endorsed by IT manufacturers
Enable large energy savings, especially when using economizers
Recommend a temperature range of 18°C to 27°C (64.4°F to 80.6°F), with the allowable range much higher (see the sketch below)

New (2011) ASHRAE Guidelines:
Six classes of equipment identified, with wider allowable upper limits ranging from 32°C to 45°C (113°F)
Provide more justification for operating above the recommended limits (in the allowable range)
Provide wider humidity ranges
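A minimal sketch of checking measured inlet temperatures against the recommended band and a class-dependent allowable band; the class limits shown for A1-A4 are the commonly cited 2011 values, and the sample readings are assumptions.

```python
# Sketch: classify rack inlet temperatures against ASHRAE 2011 bands (sample readings are assumed).

RECOMMENDED = (18.0, 27.0)                 # deg C, all classes
ALLOWABLE = {                              # deg C, commonly cited 2011 class limits
    "A1": (15.0, 32.0),
    "A2": (10.0, 35.0),
    "A3": (5.0, 40.0),
    "A4": (5.0, 45.0),
}

def classify(inlet_c: float, equip_class: str = "A1") -> str:
    lo_r, hi_r = RECOMMENDED
    lo_a, hi_a = ALLOWABLE[equip_class]
    if lo_r <= inlet_c <= hi_r:
        return "recommended"
    if lo_a <= inlet_c <= hi_a:
        return "allowable"
    return "out of range"

for temp in (21.0, 29.5, 43.0):            # assumed sample inlet readings, deg C
    print(f"{temp:.1f} C -> {classify(temp, 'A2')}")
```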

78 | Lawrence Berkeley National Laboratory

eere.energy.gov

Purpose of the Ranges

The recommended range is a statement of reliability. For extended periods of time, the IT manufacturers recommend that data centers maintain their environment within these boundaries. The allowable range is a statement of functionality. These are the boundaries where IT manufacturers test their equipment to verify that the equipment will function.

79 | Lawrence Berkeley National Laboratory

eere.energy.gov

2011 ASHRAE Thermal Guidelines

"2011 Thermal Guidelines for Data Processing Environments - Expanded Data Center Classes and Usage Guidance." Whitepaper prepared by ASHRAE Technical Committee TC 9.9


80 | Lawrence Berkeley National Laboratory eere.energy.gov

2011 ASHRAE Allowable Ranges

Chart: allowable ranges by equipment class vs. dry-bulb temperature.
81 | Lawrence Berkeley National Laboratory eere.energy.gov

2011 ASHRAE Thermal Guidelines

ASHRAE's key conclusion when considering the potential for increased failures at higher (allowable) temperatures: "For a majority of US and European cities, the air-side and water-side economizer projections show failure rates that are very comparable to a traditional data center run at a steady state temperature of 20°C."

82 | Lawrence Berkeley National Laboratory

eere.energy.gov

ASHRAE Liquid Cooling Guidelines


ASHRAE and a DOE High Performance Computing (HPC) user group have been developing a white paper for liquid cooling
Three temperature standards are defined, based on three mechanical system configurations:
Chilled water provided by a chiller (with or without a tower-side economizer)
Cooling water provided by a cooling tower, with possible chiller backup
Cooling water provided by a dry cooler, with possible backup using evaporation

83 | Lawrence Berkeley National Laboratory

eere.energy.gov

Summary Recommended Limits

Liquid Cooling Class   Main Cooling Equipment      Supplemental Cooling Equipment   Building Supplied Cooling Liquid Maximum Temperature
L1                     Cooling Tower and Chiller   Not Needed                       17°C (63°F)
L2                     Cooling Tower               Chiller                          32°C (89°F)
L3                     Dry Cooler                  Spray Dry Cooler, or Chiller     43°C (110°F)

84 | Lawrence Berkeley National Laboratory

eere.energy.gov

Example Server Specification

85 | Lawrence Berkeley National Laboratory

eere.energy.gov

Environmental Conditions: Summary


Most computer room air conditioners (CRACs) are controlled based on the return air temperature; this needs to change
A cold data center = an efficiency opportunity
Perceptions based on old technology lead to cold data centers with tight humidity ranges; this needs to change
Many IT manufacturers design for harsher conditions than the ASHRAE guidelines
Design data centers for IT equipment performance, not people comfort
Address air management issues first

86 | Lawrence Berkeley National Laboratory

eere.energy.gov

87 | Lawrence Berkeley National Laboratory

eere.energy.gov

Airflow Management

88 | Lawrence Berkeley National Laboratory

eere.energy.gov

Air Management: The Early Days at LBNL


It was cold, but hot spots were everywhere
High-flow tiles reduced air pressure
Fans were used to redirect air


89 | Lawrence Berkeley National Laboratory eere.energy.gov

Air Management

Typically, more air is circulated than required
Air mixing and short circuiting leads to:
Low supply temperature
Low delta-T

Use hot and cold aisles
Improve isolation of hot and cold aisles to:
Reduce fan energy
Improve air-conditioning efficiency
Increase cooling capacity

A hot aisle / cold aisle configuration decreases mixing of intake & exhaust air, promoting efficiency.
eere.energy.gov

90 | Lawrence Berkeley National Laboratory

Benefits of Hot- and Cold-aisles


Improves equipment intake air conditions by separating cold from hot airflow.
Preparation:
Arrange racks with alternating hot and cold aisles.
Supply cold air to the front of facing servers.
Hot exhaust air exits into rear aisles.
Graphics courtesy of DLB Associates

91 | Lawrence Berkeley National Laboratory

eere.energy.gov

Reduce Bypass and Recirculation


Diagram: bypass air / short circuiting and leakage waste cooling capacity; recirculation increases the inlet temperature to servers.

92 | Lawrence Berkeley National Laboratory
eere.energy.gov

Maintain Raised-Floor Seals


Maintain sealing of all potential leaks in the raised floor plenum.

Unsealed cable penetration

Sealed cable penetration

93 | Lawrence Berkeley National Laboratory

eere.energy.gov

Manage Blanking Panels

Any opening will degrade the separation of hot and cold air; maintain server blanking and side panels.

One 12" blanking panel was added and the inlet temperature dropped ~20 degrees.

Charts (SynapSense): inlet and outlet temperatures at the top and middle of the rack, showing air recirculation before the panel was added.

94 | Lawrence Berkeley National Laboratory

eere.energy.gov

Reduce Airflow Restrictions & Congestion

Consider the impact that congestion has on the airflow patterns.

Images: congested floor & ceiling cavities vs. empty floor & ceiling cavities.

95 | Lawrence Berkeley National Laboratory
eere.energy.gov

Resolve Airflow Balancing


Balancing is required to optimize airflow.
Rebalancing is needed with new IT or HVAC equipment.
Locate perforated floor tiles only in cold aisles.

Image: under-floor pressure map with wireless sensors.
96 | Lawrence Berkeley National Laboratory eere.energy.gov

Results: Tune Floor Tiles


There were too many permeable floor tiles. After tuning:
Under-floor pressure went up
Rack-top temperatures went down
Data center capacity increased

Measurement and visualization assisted the tuning process.

Charts (SynapSense): under-floor pressures and rack-top temperatures during tile tuning.

97 | Lawrence Berkeley National Laboratory

eere.energy.gov

Optimally Locate CRAC/CRAHs


Locate CRAC/CRAH units at the ends of hot aisles.

Diagram: air handling unit placement at the ends of the hot aisles.

98 | Lawrence Berkeley National Laboratory

eere.energy.gov

Typical Temperature Profile with Under-floor Supply


Hot air comes around the top and sides of servers: too hot near the top of the racks, just right in the middle, too cold near the floor.
Cold air escapes through the ends of aisles.

Diagram: elevation at a cold aisle looking at the racks.
There are numerous references in ASHRAE. See for example V. Sorell et al; Comparison of Overhead and Underfloor Air Delivery Systems in a Data Center Environment Using CFD Modeling; ASHRAE Symposium Paper DE-05-11-5; 2005
99 | Lawrence Berkeley National Laboratory eere.energy.gov

Next step: Air Distribution Return-Air Plenum

100 | Lawrence Berkeley National Laboratory

eere.energy.gov

Return Air Plenum


Overhead plenum converted to hot-air return
Return registers placed over hot aisle
CRAC intakes extended to overhead

Before

After

101 | Lawrence Berkeley National Laboratory

eere.energy.gov

Return-Air Plenum Connections

Return air duct on top of CRAC unit connects to the return air plenum.
102 | Lawrence Berkeley National Laboratory eere.energy.gov

Isolate Hot Return

Duct on top of each rack connects to the return air plenum.


103 | Lawrence Berkeley National Laboratory eere.energy.gov

Other Isolation Options

Physical barriers enhance the separation of hot and cold airflow. Barrier placement must comply with fire codes. Curtains, doors, or lids have been used successfully.

Doors

Lid

Open
104 | Lawrence Berkeley National Laboratory

Semi-enclosed cold aisle

Enclosed cold aisle


eere.energy.gov

Adding Air Curtains for Hot/Cold Isolation

105 | Lawrence Berkeley National Laboratory

eere.energy.gov

Isolate Cold and Hot Aisles

With isolation, return air can be 95-105F instead of 60-70F, and supply air 70-80F instead of 45-55F.


106 | Lawrence Berkeley National Laboratory eere.energy.gov

Cold Aisle Airflow Containment Example

The LBNL cold aisle containment study achieved fan energy savings of ~75%

107 | Lawrence Berkeley National Laboratory

eere.energy.gov

Fan Energy Savings


Isolation can significantly reduce air mixing and hence flow.
Fan speed can be reduced, and fan power is proportional to the cube of the flow (see the sketch below).
Fan energy savings of 70-80% is possible with variable air volume (VAV) fans in CRAH/CRAC units (or central AHUs).

Chart: fan power without enclosure vs. with enclosure.
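A one-line application of the fan affinity law cited above (power scales with the cube of flow); the flow-reduction values are illustrative assumptions.

```python
# Fan affinity law: power scales with the cube of flow (flow-reduction values are assumptions).
for flow_ratio in (0.7, 0.6):
    power_ratio = flow_ratio ** 3
    print(f"Flow at {flow_ratio:.0%} of original -> fan power {power_ratio:.0%} -> savings {1 - power_ratio:.0%}")
# At ~60-70% flow, savings land in the 66-78% range, consistent with the 70-80% figure above.
```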

108 | Lawrence Berkeley National Laboratory

eere.energy.gov

LBNL Air Management Demonstration


Better airflow management permits warmer supply temperatures!
Chart: cold aisle NW (PGE12813) temperatures (deg F; low, mid, and high sensors) from 6/13/2006 to 6/16/2006, across the baseline, setup, alternate 1, and alternate 2 configurations, with the ASHRAE recommended range and the ranges during the demonstration marked.

109 | Lawrence Berkeley National Laboratory

eere.energy.gov

Isolating Hot and Cold Aisles

Energy-intensive IT equipment needs good isolation of cold inlet and hot discharge.
Supply airflow can be reduced if no mixing occurs.
Overall temperature can be raised if air is delivered without mixing.
Cooling systems and economizers use less energy with warmer return air temperatures.
Cooling capacity increases with warmer air temperatures.

110 | Lawrence Berkeley National Laboratory

eere.energy.gov

Efficient Alternatives to Underfloor Air Distribution


Localized air cooling systems with hot and cold isolation can supplement or replace under-floor systems (a raised floor is not required!)
Examples include:
Row-based cooling units
Rack-mounted heat exchangers
Both options pre-engineer hot and cold isolation

111 | Lawrence Berkeley National Laboratory

eere.energy.gov

Example Local Row-Based Cooling Units

112 | Lawrence Berkeley National Laboratory

eere.energy.gov

Air Distribution Rack-Mounted Heat Exchangers

113 | Lawrence Berkeley National Laboratory

eere.energy.gov

Review: Airflow Management Basics


Air management techniques:
Seal air leaks in the floor (e.g. cable penetrations)
Prevent recirculation with blanking panels in racks
Manage floor tiles (e.g. no perforated tiles in the hot aisle)
Improve isolation of hot and cold air (e.g. return air plenum, curtains, or complete isolation)

Impact of good isolation:
Supply airflow reduced
Fan savings up to 75%+
Overall temperature can be raised
Cooling system efficiency improves
Greater opportunity for economizer (free cooling)
Cooling capacity increases


114 | Lawrence Berkeley National Laboratory eere.energy.gov

115 | Lawrence Berkeley National Laboratory

eere.energy.gov

Cooling systems

116 | Lawrence Berkeley National Laboratory

eere.energy.gov

Air Management and Cooling Efficiency Linking good air management and an optimized cooling system:
Improved efficiencies
Increased cooling capacity
More hours for air-side and water-side free cooling
Lower humidification/dehumidification energy
Reduced fan energy

117 | Lawrence Berkeley National Laboratory

eere.energy.gov

HVAC Systems Overview

118 | Lawrence Berkeley National Laboratory

eere.energy.gov

Computer Room Air Conditioners (CRACs) and Air Handlers (CRAHs)

CRAC units: fan, direct expansion (DX) coil, and refrigerant compressor
CRAH units: fan and chilled water coil; typically in larger facilities with a chiller plant
Both are often equipped with humidifiers and reheat for dehumidification
Often independently controlled; tight ranges and poor calibration lead to "fighting"

119 | Lawrence Berkeley National Laboratory

eere.energy.gov

DX (or AC) units reject heat outside


Diagram: DX heat rejection options: dry-cooler DX, air-cooled DX, evaporatively-cooled DX, and water-cooled DX (refrigerant or water loops carry the heat outside).

120 | Lawrence Berkeley National Laboratory

eere.energy.gov

Computer Room Air Handling (CRAH) units using Chilled-Water


Diagram: CRAH units served by chilled water from an air-cooled chiller, an evaporatively-cooled chiller, or a water-cooled chiller with a cooling tower.
121 | Lawrence Berkeley National Laboratory eere.energy.gov

Optimize the Chiller Plant

Have a plant (vs. distributed cooling)
Use warm water cooling (multi-loop)
Size cooling towers for free cooling
Integrate controls and monitor efficiency of all primary components
Thermal storage
Utilize variable speed drives on:
Fans
Pumps
Towers
Chillers

Pie chart (typical chiller plant energy): Chiller 68%, CW Pump 13%, Primary CHW Pump 10%, Cooling Tower 5%, Secondary CHW Pump 4%.

122 | Lawrence Berkeley National Laboratory

eere.energy.gov

Select Efficient Chillers


Compressor kW/ton at part load (percent of full load):

Chiller                               25%     50%     75%     100%
400 Ton Air-Cooled                    0.69    0.77    0.96    1.25
1200 Ton Water-Cooled w/o VFD         0.51    0.41    0.45    0.55
1200 Ton Water-Cooled with a VFD      0.34    0.30    0.43    0.57
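A minimal sketch of how the table above could be used to compare annual chiller energy for a data center with a roughly constant cooling load; the 300-ton load and the nearest-point lookup are illustrative assumptions, not a selection method from the slide.

```python
# Annual chiller energy at an assumed constant 300-ton cooling load, using the kW/ton table above.

CHILLERS = {  # name: (rated tons, {part-load fraction: kW/ton})
    "400 Ton Air-Cooled":             (400,  {0.25: 0.69, 0.50: 0.77, 0.75: 0.96, 1.00: 1.25}),
    "1200 Ton Water-Cooled w/o VFD":  (1200, {0.25: 0.51, 0.50: 0.41, 0.75: 0.45, 1.00: 0.55}),
    "1200 Ton Water-Cooled with VFD": (1200, {0.25: 0.34, 0.50: 0.30, 0.75: 0.43, 1.00: 0.57}),
}
LOAD_TONS = 300.0          # assumed constant data center cooling load
HOURS = 8760

def kw_per_ton_at(curve: dict, fraction: float) -> float:
    """Pick the table point closest to the operating part-load fraction (no interpolation)."""
    return curve[min(curve, key=lambda f: abs(f - fraction))]

for name, (rated_tons, curve) in CHILLERS.items():
    fraction = LOAD_TONS / rated_tons
    kw = kw_per_ton_at(curve, fraction) * LOAD_TONS
    print(f"{name}: {kw * HOURS:,.0f} kWh/yr (operating at {fraction:.0%} load)")
```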

123 | Lawrence Berkeley National Laboratory

eere.energy.gov

Increase Temperature of Chiller Plant


Chart: chiller efficiency (kW/ton) vs. load (200-1,000 tons) for a 1,000-ton chiller operating at a 42 F CHWS temperature and 70 F CWS temperature, compared with the same chiller at a 60 F CHWS temperature; the warmer chilled-water supply is more efficient across the load range.
Data provided by York International Corporation. 124 | Lawrence Berkeley National Laboratory eere.energy.gov

Moving (Back) to Liquid Cooling

As heat densities rise, liquid solutions become more attractive. Volumetric heat capacity comparison: the heat carried by 1.5 m3 of water would require about 5,380 m3 of air (roughly a 190,000 cubic foot blimp).
125 | Lawrence Berkeley National Laboratory eere.energy.gov

Why Liquid Cooling?


Heat removal efficiency increases as the liquid gets closer to the heat source
Liquids can provide cooling with higher temperature coolant:
Improved cooling efficiency
Increased economizer hours
Greater potential use of waste heat
Reduced transport energy

126 | Lawrence Berkeley National Laboratory

eere.energy.gov

In-Row Liquid Cooling

Graphics courtesy of Rittal

127 | Lawrence Berkeley National Laboratory

eere.energy.gov

In Rack Liquid Cooling


Racks with integral coils and full containment

Cooling Coil

128 | Lawrence Berkeley National Laboratory

eere.energy.gov

Rear Door Liquid Cooling
Rear Door (open) Rear Doors (closed)

Liquid Cooling Connections

Inside-rack RDHx, opened 90°
129 | Lawrence Berkeley National Laboratory eere.energy.gov

On-Board Cooling

130 | Lawrence Berkeley National Laboratory

eere.energy.gov

Chill-off 2 Evaluation of Liquid Cooling Solutions

131 | Lawrence Berkeley National Laboratory

eere.energy.gov

Use Free Cooling:

Cooling without Compressors:

Outside air economizers
Water-side economizers
Let's get rid of chillers in data centers

132 | Lawrence Berkeley National Laboratory

eere.energy.gov

Outside-Air Economizers
Advantages:
Lower energy use
Added reliability (backup for the cooling system)

Potential issues:
Space
Dust (not a concern with MERV 13 filters)
Gaseous contaminants (not widespread; impacts normally cooled data centers as well)
Smoke (shut down or bypass if smoke is outside the data center)

http://cooling.thegreengrid.org/namerica/WEB_APP/calc_index.html


eere.energy.gov

133 | Lawrence Berkeley National Laboratory

UC's Computational Research and Theory (CRT) Facility

134 | Lawrence Berkeley National Laboratory

eere.energy.gov

Free Cooling Outside Air Based


1. Blue = recommended supply
2. Green can become blue by mixing return and outdoor air
3. Most of the conditions below and to the right of blue can be satisfied with evaporative cooling
4. Hot and humid hours will enter the allowable range or require compressor air conditioning

Chart: annual psychrometric chart of Oakland, CA (dry-bulb temperature 10-105 F; wet-bulb lines in 10-degree steps from 30 to 70 F; relative humidity lines stepped by 10%).


135 | Lawrence Berkeley National Laboratory eere.energy.gov

System Design Approach:

Air-side economizer (93% of hours)
Direct evaporative cooling for humidification/precooling
Low pressure-drop design (1.5" total static at peak)

Hours of Operation:
Mode 1: 100% economizer - 2,207 hrs
Mode 2: OA + RA - 5,957 hrs
Mode 3: Humidification - 45 hrs
Mode 4: Humid + CH cooling - 38 hrs
Mode 5: CH only - 513 hrs
Total: 8,760 hrs

136 | Lawrence Berkeley National Laboratory

eere.energy.gov

Water Cooling:

Tower-side economizer
Four-pipe system
Waste heat reuse
Headers, valves, and caps for modularity and flexibility

Predicted CRT Performance: Annual PUE = 1.1

137 | Lawrence Berkeley National Laboratory

eere.energy.gov

Water-Side Economizers

Advantages:
Cost effective in cool and dry climates
Often an easier retrofit
Added reliability (backup in the event of chiller failure)
No contamination questions

138 | Lawrence Berkeley National Laboratory

eere.energy.gov

Integrated Water-Side Economizer

Diagram: tower water (41 F wet bulb, roughly 44-49 F water) pre-cools the chilled-water return through a heat exchanger ahead of the chillers; chilled water is supplied at about 44-46 F with return below 60 F.

You can use either a control valve or a pump.
Make sure that the heat exchanger is in series with the chillers (not in parallel) on the chilled-water side.

139 | Lawrence Berkeley National Laboratory

eere.energy.gov

Potential for Tower Cooling

Chart: tower (free) cooling potential for on-board cooling, rear-door cooling, and a water-side economizer.

140 | Lawrence Berkeley National Laboratory

eere.energy.gov

LBNL Example: Rear Door Cooling


Used instead of adding CRAC units. Rear-door water cooling with tower-only cooling (or the central chiller plant in series).
Both options significantly more efficient than existing direct expansion (DX) CRAC units.

141 | Lawrence Berkeley National Laboratory

eere.energy.gov

Improve Humidity Control:

Eliminate inadvertent dehumidification:
Computer load is sensible only
Use ASHRAE allowable humidity ranges
Many manufacturers allow an even wider humidity range

Defeat equipment fighting:
Coordinate controls
Disconnect humidity control, or control it only on makeup air or on one CRAC/CRAH unit
Entirely disconnect (many have!)

142 | Lawrence Berkeley National Laboratory

eere.energy.gov

Cost of Unnecessary Humidification

          Visalia Probe                 CRAC Unit Panel
          Temp   RH     Tdp             Temp   RH     Tdp    Mode
AC 005    84.0   27.5   47.0            76     32.0   44.1   Cooling
AC 006    81.8   28.5   46.1            55     51.0   37.2   Cooling & Dehumidification
AC 007    72.8   38.5   46.1            70     47.0   48.9   Cooling
AC 008    80.0   31.5   47.2            74     43.0   50.2   Cooling & Humidification
AC 010    77.5   32.8   46.1            68     45.0   45.9   Cooling
AC 011    78.9   31.4   46.1            70     43.0   46.6   Cooling & Humidification
Min       72.8   27.5   46.1            55.0   32.0   37.2
Max       84.0   38.5   47.2            76.0   51.0   50.2
Avg       79.2   31.7   46.4            68.8   43.5   45.5

Humidity down ~2%
CRAC power down 28%

143 | Lawrence Berkeley National Laboratory

eere.energy.gov

Cooling Takeaways

Use efficient equipment and a central plant (e.g. chiller/CRAHs) vs. CRAC units
Use centralized controls on CRAC/CRAH units:
Prevent simultaneous humidifying and dehumidifying
Optimize sequencing and staging

Move to liquid cooling (room, row, rack, chip)
Consider VSDs on fans, pumps, chillers, and towers
Use air- or water-side economizers where possible
Expand the humidity range and improve humidity control (or disconnect it)
144 | Lawrence Berkeley National Laboratory eere.energy.gov

145 | Lawrence Berkeley National Laboratory

eere.energy.gov

Electrical Systems

146 | Lawrence Berkeley National Laboratory

eere.energy.gov

Chart: electrical system end use (orange bars)

Courtesy of Michael Patterson, Intel Corporation

147 | Lawrence Berkeley National Laboratory

eere.energy.gov

Electrical Distribution
Every power conversion (AC-DC, DC-AC, AC-AC) loses some energy and creates heat (see the sketch below)
Efficiency decreases when systems are lightly loaded
Distributing higher voltage is more efficient and can save capital cost (conductor size is smaller)
Power supply, uninterruptible power supply (UPS), transformer, and PDU efficiencies vary; select carefully
Lowering distribution losses also lowers cooling loads
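A minimal sketch of how conversion losses compound along the chain from the UPS to the server's voltage regulators, and how much of that loss becomes additional cooling load; every efficiency value and the load figure are illustrative assumptions.

```python
# Cascaded electrical conversion losses from the chip back to the UPS input (all values are assumptions).

it_dc_kw = 100.0                 # useful DC power delivered to server components (assumed)
stage_efficiency = {             # assumed efficiencies at the operating load point
    "voltage regulators": 0.85,
    "server power supply": 0.80,
    "PDU / transformer":  0.98,
    "UPS":                0.90,
}

upstream_kw = it_dc_kw
for stage, eff in stage_efficiency.items():      # walk upstream toward the utility feed
    upstream_kw /= eff
    print(f"power entering {stage:>20s}: {upstream_kw:6.1f} kW")

losses_kw = upstream_kw - it_dc_kw
cooling_kw = losses_kw * 0.5                     # assumed 0.5 kW of cooling per kW of heat removed
print(f"conversion losses: {losses_kw:.1f} kW, plus ~{cooling_kw:.1f} kW of cooling to remove them")
```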

148 | Lawrence Berkeley National Laboratory

eere.energy.gov

Electrical Systems Points of Energy Inefficiency

149 | Lawrence Berkeley National Laboratory

eere.energy.gov

UPS, Transformer, & PDU Efficiency


Factory Measurements of UPS Efficiency
(tested using linear loads)

Efficiencies vary with system design, equipment, and load
Redundancies impact efficiency

Chart: factory-measured efficiency (70-100%) vs. percent of rated active power load (0-100%) for flywheel, double-conversion, and delta-conversion UPS systems.

150 | Lawrence Berkeley National Laboratory

eere.energy.gov

Measured UPS efficiency

Redundant Operation

151 | Lawrence Berkeley National Laboratory

eere.energy.gov

LBNL/EPRI Measured Power Supply Efficiency

Typical operation

152 | Lawrence Berkeley National Laboratory

eere.energy.gov

Redundancy
Understand what redundancy costs and what it gets you; is it worth it?
Does everything need the same level?
Different strategies have different energy penalties (e.g. 2N vs. N+1)
It's possible to more fully load UPS systems and still achieve the desired redundancy
Redundancy in electrical distribution puts you down the efficiency curve

153 | Lawrence Berkeley National Laboratory

eere.energy.gov

From Utility Power to the Chip Multiple Electrical Power Conversions


Diagram: utility AC power passes through the uninterruptible power supply (UPS: rectifier, battery/charger, inverter, bypass) and the power distribution unit (PDU) to the server's AC/DC multi-output power supply (12 V, 5 V, 3.3 V), then through DC/DC voltage regulator modules (1.1-1.85 V, 1.5/2.5 V, 3.3 V, 12 V) feeding the processor, memory controller, SDRAM, graphics controller, drives, and I/O.

154 | Lawrence Berkeley National Laboratory

eere.energy.gov

Emerging Technology: DC Distribution

380 V DC power distribution:
Eliminates several conversions
Also use for lighting and variable speed drives
Use with on-site generation, including renewable energy sources

Backup: batteries or flywheel

155 | Lawrence Berkeley National Laboratory

eere.energy.gov

Standby generation loss


Losses:
Heaters
Battery chargers
Transfer switches
Fuel management systems

Opportunities to reduce or eliminate losses:
Heaters (many operating hours) use more electricity than the generator produces (few operating hours)
Check with the generator manufacturer on how to reduce the energy consumption of block heaters (e.g. temperature settings and control)

156 | Lawrence Berkeley National Laboratory

eere.energy.gov

Standby generator heater

157 | Lawrence Berkeley National Laboratory

eere.energy.gov

Data center lighting: lights are on and nobody's home

Switch off lights in unused/unoccupied areas or rooms (UPS, battery, switchgear, etc.)
Lighting controls such as occupancy sensors are well proven
Small relative benefit, but easy to accomplish; also saves HVAC energy
Use energy efficient lighting
Lights should be located over the aisles

158 | Lawrence Berkeley National Laboratory

eere.energy.gov

Motors and Drives

Since most cooling system equipment operates continuously, premium efficiency motors should be specified everywhere
Variable speed drives should be used on:
Chillers
Pumps
Air handler fans
Cooling tower fans

159 | Lawrence Berkeley National Laboratory

eere.energy.gov

Key Electrical Takeaways


Choose highly efficient components and configurations
Reduce power conversions (AC-DC, DC-AC, AC-AC, DC-DC)
Consider the minimum redundancy required, as efficiency decreases when systems are lightly loaded
Use higher voltage

160 | Lawrence Berkeley National Laboratory

eere.energy.gov

161 | Lawrence Berkeley National Laboratory

eere.energy.gov

Resources

162 | Lawrence Berkeley National Laboratory

eere.energy.gov

Resources
Advanced Manufacturing Office:
Tool suite & metrics for baselining
Training
Qualified specialists
Case studies
Recognition of high energy savers
R&D - technology development

GSA:
Workshops
Quick Start Efficiency Guide
Technical assistance

EPA:
Metrics
Server performance rating & ENERGY STAR label
Data center benchmarking

Federal Energy Management Program:
Workshops
Federal case studies
Federal policy guidance
Information exchange & outreach
Access to financing opportunities
Technical assistance

Industry:
Tools
Metrics
Training
Best practice information
Best-in-Class guidelines
IT work productivity standard

163 | Lawrence Berkeley National Laboratory

eere.energy.gov

Resources from the DOE Federal Energy Management Program (FEMP)

Best Practices Guide
Benchmarking Guide
Data Center Programming Guide
Technology Case Study Bulletins
Procurement Specifications
Report Templates
Process Manuals
Quick Start Guide

164 | Lawrence Berkeley National Laboratory

eere.energy.gov

DOE Advanced Manufacturing Office (AMO - was ITP)


DOE's AMO data center program provides tools and resources to help owners and operators:

DC Pro Software Tool Suite: tools to define baseline energy use and identify energy-saving opportunities
Information products: manuals, case studies, and other resources
End-user awareness training: workshops in conjunction with ASHRAE
Data Center Energy Practitioner (DCEP) certificate program: qualification of professionals to evaluate energy efficiency opportunities
Research, development, and demonstration of advanced technologies

165 | Lawrence Berkeley National Laboratory

eere.energy.gov

DOE DC Pro Tool Suite


High-Level On-Line Profiling and Tracking Tool:
Overall efficiency (Power Usage Effectiveness [PUE])
End-use breakout
Potential areas for energy efficiency improvement
Overall energy use reduction potential

In-Depth Assessment Tools (savings):
Air Management: hot/cold separation, environmental conditions, RCI and RTI
Electrical Systems: UPS, PDU, transformers, lighting, standby generation
IT Equipment: servers, storage & networking, software
Cooling: air handlers/conditioners; chillers, pumps, fans; free cooling

166 | Lawrence Berkeley National Laboratory

eere.energy.gov

Data Center Energy Practitioner (DCEP) Program


A certificate process for energy practitioners qualified to evaluate energy consumption and efficiency opportunities in data centers.
Key objectives:
Raise the standards of assessors
Provide greater repeatability and credibility of recommendations
Target groups include:
Data center personnel (in-house experts)
Consulting professionals (for-fee consultants)

167 | Lawrence Berkeley National Laboratory

eere.energy.gov

Data Center Energy Practitioner (DCEP) Program

Training & Certificate Disciplines / Levels / Tracks:

Level 1 Generalist: prequalification, training, and exam on all disciplines + assessment process + DC Pro profiling tool
Level 2 Specialist: prequalification, training, and exam on select disciplines + assessment process + DC Pro system assessment tools
Disciplines: IT equipment, air management, cooling systems, and electrical systems (HVAC available; IT in 2012)
Two tracks: certificate track (training + exam) and training track (training only)

168 | Lawrence Berkeley National Laboratory

eere.energy.gov

Energy Star

A voluntary public-private partnership program covering:
Buildings
Products

169 | Lawrence Berkeley National Laboratory

eere.energy.gov

Energy Star Data Center Activities


ENERGY STAR Datacenter Rating Tool:
Builds on the existing ENERGY STAR platform with a similar methodology (1-100 scale)
Usable both for stand-alone data centers and for data centers housed within another building
Assesses performance at the building level to explain how a building performs, not why it performs a certain way
ENERGY STAR label for data centers with a rating of 75+
Rating based on data center infrastructure efficiency
The ideal metric would be a measure of useful work per unit of energy; industry is still discussing how to define useful work

ENERGY STAR specification for servers
Evaluating enterprise data storage, UPS, and networking equipment for ENERGY STAR product specifications
170 | Lawrence Berkeley National Laboratory eere.energy.gov

Resources

http://www1.eere.energy.gov/femp/program/data_center.html

http://hightech.lbl.gov/datacenters.html

http://www.energystar.gov/index.cfm?c=prod_development.server_efficiency

http://www1.eere.energy.gov/industry/datacenters/

171 | Lawrence Berkeley National Laboratory

eere.energy.gov

172 | Lawrence Berkeley National Laboratory

eere.energy.gov

Workshop Summary
Best Practices

173 | Lawrence Berkeley National Laboratory

eere.energy.gov

Results at LBNL

LBNL's Legacy Data Center:

Increased IT load:
>50% (~180 kW) increase with virtually no increase in infrastructure energy use

Raised room temperature 8+ degrees

AC unit turned off:
One 15-ton unit is now used as backup

Decreased PUE from 1.65 to 1.45:
30% reduction in infrastructure energy

More to come!
174 | Lawrence Berkeley National Laboratory eere.energy.gov

More to Come

Next Steps for LBNL's Legacy Data Center:

Integrate CRAC controls with the wireless monitoring system
Retrofit CRACs with VSDs:
Small VAV turndown yields big energy savings
Improve containment (overcome curtain problems)
Increase liquid cooling (HP in-rack, and APC in-row)
Increase free cooling (incl. tower upgrade)
175 | Lawrence Berkeley National Laboratory eere.energy.gov

Data Center Best Practices Summary

1. Measure and Benchmark Energy Use
2. Identify IT Opportunities
3. Use IT to Control IT
4. Manage Airflow
5. Optimize Environmental Conditions
6. Evaluate Cooling Options
7. Improve Electrical Efficiency
8. Implement Energy Efficiency Measures
eere.energy.gov

176 | Lawrence Berkeley National Laboratory

Data Center Best Practices
1. Measure and Benchmark Energy Use

Use metrics to measure efficiency
Benchmark performance
Establish continual improvement goals

177 | Lawrence Berkeley National Laboratory

eere.energy.gov

Data Center Best Practices
2. Identify IT Opportunities

Specify efficient servers (incl. power supplies)
Virtualize
Refresh IT equipment
Turn off unused equipment

178 | Lawrence Berkeley National Laboratory

eere.energy.gov

Data Center Best Practices
3. Use IT to Control IT Energy

Evaluate monitoring systems to enhance real-time management and efficiency
Use visualization tools (e.g. thermal maps)
Install dashboards to manage and sustain energy efficiency

179 | Lawrence Berkeley National Laboratory

eere.energy.gov

Data Center Best Practices
4. Manage Airflow

Implement hot and cold aisles
Fix leaks
Manage floor tiles
Isolate hot and cold airstreams

180 | Lawrence Berkeley National Laboratory

eere.energy.gov

Data Center Best Practices
5. Optimize Environmental Conditions

Follow ASHRAE guidelines or manufacturer specifications
Operate up to the maximum of the ASHRAE recommended range
Anticipate that servers will occasionally operate in the allowable range
Minimize or eliminate humidity control

181 | Lawrence Berkeley National Laboratory

eere.energy.gov

Data Center Best Practices
6. Evaluate Cooling Options

Use a centralized cooling system
Maximize central cooling plant efficiency
Provide liquid-based heat removal
Compressorless cooling

182 | Lawrence Berkeley National Laboratory

eere.energy.gov

Data Center Best Practices
7. Improve Electrical Efficiency

Select efficient UPS systems and topologies
Examine redundancy levels
Increase distribution voltage and reduce conversions

183 | Lawrence Berkeley National Laboratory

eere.energy.gov

Data Center Best Practices
8. Implement Standard Energy Efficiency Measures

Install premium efficiency motors
Upgrade the building automation control system
Integrate CRAC controls
Variable speed everywhere
Optimize cooling tower efficiency

184 | Lawrence Berkeley National Laboratory

eere.energy.gov

Data Center Best Practices

Most importantly: get IT and Facilities people talking and working together as a team!!!

185 | Lawrence Berkeley National Laboratory

eere.energy.gov

186 | Lawrence Berkeley National Laboratory

eere.energy.gov

Contact Information:
Dale Sartor, P.E.

Lawrence Berkeley National Laboratory Applications Team MS 90-3111 University of California Berkeley, CA 94720 DASartor@LBL.gov (510) 486-5988 http://Ateam.LBL.gov

187 | Lawrence Berkeley National Laboratory

eere.energy.gov
