
Virtualisation @ CIT

HEAnet National Networking Conference 2009

Aidan McDonald
Programme

1. Introduction

2. VMware ESX

3. SAN and ESX

4. Today and Future

5. Pitfalls

6. Additional work in CIT


1. Introduction
Introduction

 Rather than being planned as a single virtualisation project, CIT’s infrastructure evolved from various small IT initiatives in the college.
 It now represents a sizeable virtualised infrastructure, and our reliance on it has grown enormously.
 Administration, student services, IT services and academic departments all now depend on this infrastructure.
In the beginning …

 Cisco Academy’s Netlab system introduced support for virtual PCs in the pods using VMware GSX, capable of supporting 8 virtual PCs per server.
 VMware Workstation was used in the network labs to let students gain experience installing OSs and configuring administration settings without affecting the underlying lab PC.
 Approached VMware to customise ESX for Netlab – introduced to our neighbours in Ovens, whose MD is an ex-CIT graduate.
VMware ESX

CIT’s Cloud
[Diagram: ESX Server 1, a PowerEdge 2950]

 Initially installed ESX for research; had no interest in a large-scale virtualisation agenda.
 CIT became a VMware university, allowing academic licensing to be used for creating ESX servers.
 Research included virtualising an existing server – i.e. our TACACS server – as well as creating some new servers for NM (e.g. Nagios, Squid, DNS).
Computer Dept. SAN

CIT’s Cloud
[Diagram: VM ESX cluster – Virtual Infrastructure Center server (ESX management server) and ESX Server 1 (PowerEdge 2950), connected via a fibre channel switch to SAN disk stores 1 and 2 (EMC)]

 The initial ESX server was running out of local disk space.
 The School of Computing invested in a SAN.
 EMC CLARiiON CX300.
 Initially used in computing as storage.

CIT’s Cloud

[Diagram: VM ESX cluster – Virtual Infrastructure Center server (ESX management server) and ESX Servers 1 and 2 (PowerEdge 2950s), connected via two Cisco MDS 9124 multilayer fabric switches to SAN disk stores 1 and 2 (EMC)]

 A colleague in Finance managed an internal WWW server, which crashed.
 Convinced him to buy a fibre channel switch instead of a new server, on condition that we hosted finance.cit.ie – our 1st production server.
 An upgrade to the phone management server required a new server – we took this server as our 2nd ESX server and virtualised it.
 This provided a high-speed link to the SAN for storage and retrieval of data from the VMware servers.
Value of VM – AD Project

CIT’s Cloud
[Diagram: VM ESX cluster – Virtual Infrastructure Center server (ESX management server) and ESX Servers 1–3 (two PowerEdge 2950s and a PowerEdge R900), connected via two Cisco MDS 9124 multilayer fabric switches to SAN disk stores 1 and 2 (EMC)]

 The college undertook to centralise AD – 13 different departmental ADs into two central ones, Staff and Student.
 It was impossible to build the new, parallel infrastructure while the old one existed.
 Virtualised the whole old AD estate, all 13 ADs, and in this test environment mocked up the AD migration.
 Implemented VMotion.
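VMotion live-migrates a running VM between ESX hosts that share SAN storage: the memory and device state move between hosts, while the virtual disks stay put on the shared SAN. A toy model of the placement bookkeeping only (the class and host names below are our own invention, not VMware’s API, and nothing here reflects how the real migration works internally):

```python
class Host:
    """A physical ESX host with a simple VM-count capacity limit."""
    def __init__(self, name: str, capacity: int):
        self.name = name
        self.capacity = capacity  # max VMs this host can run
        self.vms: list[str] = []

def vmotion(vm: str, src: "Host", dst: "Host") -> bool:
    """Move `vm` from src to dst if dst has spare capacity.

    In the real product the VM keeps running throughout; here we
    only move the inventory record to illustrate the constraint.
    """
    if vm in src.vms and len(dst.vms) < dst.capacity:
        src.vms.remove(vm)
        dst.vms.append(vm)
        return True
    return False

esx1, esx2 = Host("esx1", 8), Host("esx2", 8)
esx1.vms = ["ad-staff", "ad-student"]
vmotion("ad-staff", esx1, esx2)
print(esx1.vms, esx2.vms)
```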
CIT Today

 Server consolidation – ban on new physical servers.
 SAN upgrade to a CX400 – iSCSI, mirroring.
 4 ESX servers, 63 VMs, 15 TB of disk, across two centres.
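Those headline figures translate into a quick back-of-the-envelope consolidation check. A minimal sketch (the function names are our own; the only inputs taken from the slide are 4 hosts, 63 VMs and 15 TB):

```python
def consolidation_ratio(vms: int, hosts: int) -> float:
    """Average number of VMs packed onto each physical ESX host."""
    return vms / hosts

def avg_storage_per_vm_gb(total_tb: float, vms: int) -> float:
    """Average SAN storage per VM, in GB (using 1 TB = 1000 GB)."""
    return total_tb * 1000 / vms

ratio = consolidation_ratio(vms=63, hosts=4)          # 15.75 VMs per host
per_vm = avg_storage_per_vm_gb(total_tb=15, vms=63)   # ~238 GB per VM
print(f"{ratio:.2f} VMs/host, ~{per_vm:.0f} GB/VM")
```

So every physical server retired into the cluster is carrying roughly sixteen workloads, which is exactly why the next slide’s sprawl warning matters.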
Pitfalls – Server for server’s sake

 VM servers/machines have an undoubted value and can be created almost instantly.
 However, there is a danger in creating VMs for the sake of it.
 This can lead to unstructured and unmanageable growth of the virtualisation infrastructure.
 The network admin loses visibility of “the port” to which servers are attached – the vSwitch.
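One way to win some of that visibility back is to keep an explicit inventory mapping each VM to its vSwitch, port group and physical uplink, then group VMs by uplink. A minimal sketch with invented example data (on classic ESX the real mapping could be gathered per host, e.g. from `esxcfg-vswitch -l` on the service console):

```python
from collections import defaultdict

# Hypothetical inventory: VM name -> (vSwitch, port group, physical uplink).
# In practice this would be populated from each ESX host, not hand-written.
INVENTORY = {
    "finance-www": ("vSwitch0", "Production", "vmnic0"),
    "nagios":      ("vSwitch0", "Management", "vmnic0"),
    "ad-staff":    ("vSwitch1", "Production", "vmnic1"),
}

def vms_behind_uplink(inventory: dict) -> dict:
    """Group VM names by the physical NIC carrying their traffic,
    restoring the 'which servers sit behind this port' view."""
    by_nic = defaultdict(list)
    for vm, (_vswitch, _portgroup, nic) in sorted(inventory.items()):
        by_nic[nic].append(vm)
    return dict(by_nic)

print(vms_behind_uplink(INVENTORY))
```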
Future
 vSphere
 CIT plans to upgrade its virtualisation infrastructure to vSphere 4.0 in 2010.
 VDI
 CIT plans to investigate VDI as a means of rapid desktop deployment.
 It will also be used to deploy individual desktops to staff and students.
 Most likely pioneered in the Dept of Computing and IT Services.
 Project students and CIT’s Open Access facility for students could benefit.
 EMC/VMware/CIT Academy
 CIT has over the years enjoyed good relations with EMC and VMware.
 CIT has recently created a VMware Academy.
 Built a VMware training infrastructure to support delivery of VMware courses.
 Plans to offer a number of courses, including the VCP, in 2010.
Questions

 Aidan McDonald – IT Network Manager (aidan.mcdonald@cit.ie)
 Aaron Krawczyk – Virtualisation Administrator (aaron.krawczyk@cit.ie)
 Pat McCarthy – VMware Academy Coordinator (pat.mccarthy@cit.ie)

Thank You
