
Mosaic Project
Kuali Financial System Implementation
Conversion Strategy
Version 1.2

Revision History

Date      Version  Description                                    Author
12/10/08  1.0      Initial Document                               Rachel Serrano
1/21/09   1.1      Revision (various sections from the template)  Raj Kar
2/15/09   1.2      Updated based on feedback from Cindy DeMaio    Raj Kar

The University of Arizona

KFS Implementation Conversion Strategy

V 1.1 1/21/2009

Contents

1  Executive Summary
   1.1  Goal
   1.2  Strategy Approach
   1.3  Challenges
2  Scope
3  Conversion Deliverables
4  Assumptions/Background
5  Stakeholders/Roles
6  Environments
   6.1  Conversion Environment Build Process
7  Tactics
8  Success Factors
9  Strategy Roadmap
   9.1  High Level Timeline


Conversion Strategy

1  Executive Summary

1.1  Goal

This Data Conversion Requirements and Strategy document defines the requirements,
scope, objectives, and strategy for the KFS Project's conversion of Finance, Labor,
Project Accounting, Purchasing, and Payroll data from legacy systems to the Kuali
Financial System (KFS).
The Data Conversion Requirements and Strategy document is used as follows:
o The primary use of this document is to record and communicate the data conversion
scope, objectives, approach, and requirements.
o The conversion team uses this document to communicate the strategy for successfully
converting legacy data to the Kuali system.
o The conversion team uses this document as a road map for performing the
conversion. All members of the team should understand and follow the same
strategy.
o The project manager uses this document to understand how the conversion team plans
to perform the conversion, and how the conversion effort may impact the overall
project.
Distribute and communicate the Data Conversion Requirements and Strategy to the:
o Project manager from the implementation services provider
o Client project manager, who should sign-off on the conversion requirements and
strategy
o Conversion team members (where applicable)
o The leads of the production development and support teams.
o Other process leaders who are responsible for tasks that are prerequisites for
conversion tasks, or whose tasks are dependent on output from conversion tasks
Use the following criteria to ensure the quality of this deliverable:
o Are the project scope and objectives clearly identified?
o Are specific critical success factors documented?
o Is the impact of each dependent task from other processes considered?
o Are the conversion requirements clearly defined, including all legacy applications and
business objects? Do the definitions indicate the level of detail and history to be
converted?
o Is a disposition path for every existing business object/data element clearly defined?
o Is the strategy understood by those on the distribution list for this deliverable?
o Are all assumptions, constraints, and risks that could impact the conversion stated,
understood, and mitigated?

1.2  Strategy Approach

The conversion approach will include both automated and manual processes. The
majority of the data (reference and transactional) will be converted using automated
scripts, while a small portion will be entered manually.
The following diagram provides a graphical description of the standard conversion
approach that will be used to convert legacy systems data to the Kuali Financials System.
[Diagram: Standard conversion approach. Legacy system data (mainframe legacy data,
relational/non-relational databases and indexed files, spreadsheets and external
feeds) is extracted to ASCII flat files, transferred over the network to an
intermediate staging area in the Oracle database, validated and transformed, and
loaded into the Kuali Systems 3.0 application tables, which clients access with
browsers through the application server.]

Legacy data will be cleansed either at the source or after extraction to ASCII flat
files. Data cleansing will include such things as closing open items and deleting
duplicate data. These flat files will then be transferred to the staging environment
using a custom SQL*Loader process. Once in the Oracle environment, the data will be
transformed in the staging area and validated prior to loading into the KFS
application tables.
Open interfaces exist for most subsets of data; these ensure that loaded data will
have no data integrity issues and will ultimately support subsequent business
processes. Error records will be captured in separate tables for troubleshooting and
fixes.

[Diagram: Conversion process steps - 1. Conversion Data Mapping / Legacy Data
Cleansing; 2. Download Programs; 3. ASCII Flat File; 4. Upload Programs to Staging
Area; 5. Description and Creation of Intermediate Staging Tables; 6. Data Validation,
Translation and Load into Staging Tables; 7. Custom Load Programs; 8. KFS Application
Production Tables; 9. Reconciliation/Testing; 10. Write/Perform Conversion Execution
Plan.]

1.3  Challenges

Among the many factors affecting successful system implementations, data conversion is
critical from both an accuracy and a timeline perspective. In fact, a go/no-go decision
for cutover is in many cases contingent upon successful conversion of the required
data. The following is a brief list of major conversion challenges facing most
implementations:
1. Poor quality data
Risk: High
Consequence: Inability to convert legacy data with resultant delay in implementation
of new Kuali system.
Mitigation: Ensure that resources are identified at the earliest opportunity to analyze
and understand the requirements of the new system. Retain legacy systems until data
can be cleansed.
2. Lack of understanding of required key data translation, clean-up and
transformation criteria
Risk: Moderate
Consequence: Possible loss of data during conversion or failure to link data to
relevant entities in Kuali environment. On-going problems with converted data in the
new system.
Contingency: Maintain legacy systems until such time as the data in the new system can
be verified. Again, early communication and analysis of the legacy data systems is
imperative to avoid this.


3. Systems/environment refreshes
Risk: Medium
Consequence: Incorporation of weekly refreshes from the Kuali Foundation might
require additional testing and potentially delay conversion tasks, with impacts on
project milestones.
Contingency: Ensure stand-by backups are available
4. Timeline
Risk: High (timescales are aggressive)
Consequence: Missed project deadlines and milestones, and delay in the
implementation of Kuali 3.0. Delay in dependent tasks could have repercussions in
terms of go-live/cutover and incur extra project costs.
Contingency: Ensure that project plan is watertight and that all phases of the data
conversion process are included. Project plan and associated milestones should be
communicated to all data conversion team members.
5. Availability/Retention of Legacy technical resources to analyze, identify and extract
legacy data
Risk: Low
Consequence: Based on initial extracts, it is assumed that this may not be an issue,
given that the extraction programs already tested can be modified for all historical/
current data extraction.
Contingency: If Legacy personnel are not made available then this should be brought
to the attention of the proper level of command in the project. Suitable personnel
should be earmarked and made available at the earliest opportunity.
6. Availability of Database Administrator resource
Risk: Low
Consequence: Possible non-availability of environment for both test and production
conversions. Inconsistency of patch-levels across database instances.
Contingency: Ensure that a DBA resource is on site for the entire conversion process
and for support after cutover to Kuali.
7. Kuali Financials functionality/gaps not fully defined, and therefore the scope of
data to be migrated may still be uncertain
Risk: Low
Consequence: If the Kuali Functionality is not tightly defined, the scope of the data
conversion cannot be certain as the type and quantity of data to be migrated would
not be known.


Contingency: All Kuali business processes and functionality, as they relate to
conversion requirements, must be fully defined and agreed unequivocally before data
migration can be executed successfully.
8. Baseline Application set-up incomplete or not agreed
Risk: Low
Consequence: The data migration is dependent on the application configuration and
set-up being completed. Any missing set-up data will impact the migration of legacy
data and consequently the project deadlines.
Contingency: Application and configuration set-up should be agreed and signed-off
prior to commencement of any data conversion processes.
2  Scope
The scope of data conversion for the KFS implementation is to ensure that all data for
the last 3 (?) years up to the system cutover in FRS is brought over to the new Kuali
Financial System. The detailed scope at the time of writing is listed below; further
information may become available during analysis/design and may be incorporated in
future versions of this document. It is assumed that no historical data other than
that mentioned will require conversion. Refer to the Conversion Inventory and
Conversion Requirements / Specification documents for details on candidate data
entities. Data will be converted based on the requirements provided by SMEs and
Business Area leads.
Data Entities

The following is a list of the entities categorized into Reference and Transactional
data that will be converted as part of the KFS Project implementation.
Reference Data Entities (Please refer to the conversion inventory for details.)
o Chart of Accounts (COA)
- Source is FRS, however, a spreadsheet is created to provide the logic for
transformation of old object codes to new ones prior to loading in KFS
o Vendor
- Source is FRS, but requires an alias table to cross-reference old vendor# to new
vendor#
o Labor Distribution Reference
- Data supplied by SMEs in spreadsheet serve as the source.
o Purchasing Tables
o Billing Addresses
o Vendor Reference Tables
o Core Tables
o Country Codes / Zip code
o Assets
o Accounts Receivable Tables
o Cross-Walk


o Parameters
- to be entered manually through Kuali User Interface (U/I)
o Workgroups
- to be entered manually through Kuali User Interface (U/I)
o Users
- NetID and other user information extracted from UA LDAP (applicable for initial
load only)
Transactional Data Entities (Please refer to conversion inventory for details)
o Beginning Fund Balance
o Contract and Grant Account Balances Project-to-date
o Beginning Balances A/C Encumbrances
o Labor Ledger Beginning Balance
o Labor Ledger Encumbrances
o Account Budget
o Open Purchase Orders (PO)
o Payroll Feed
o Detail Monthly Transactional Data (BI Team)
o Interfaces (locate the Matrix of external feeds)
3  Conversion Deliverables
The major tasks and corresponding deliverables provided in this conversion project are:

Task: Define Data Conversion Requirements and Strategy
Description: Analyze and document the conversion scope, objectives, approach,
requirements, and the strategy for how the conversion will be performed.
Deliverable: Data Conversion Requirements and Strategy Document

Task: Prepare Conversion Environment
Description: Prepare the conversion environment for the development and testing of
the conversion programs.
Deliverable: Conversion Environment

Task: Perform Conversion Data Mapping
Description: Map legacy data files and elements to the KFS table(s) and columns. This
map includes an explanation of business rules, translation rules, foreign key rules,
and default settings where applicable.
Deliverable: Conversion Data Mapping

Task: Define Manual Conversion Procedures
Description: Define procedures for manually converting applicable business objects
through the KFS User Interface.
Deliverable: Manual Conversion Procedures

Task: Design Conversion Programs
Description: Design how the conversion programs should be coded using conventional
programming techniques, including any translation modules.
Deliverable: Conversion Program Designs

Task: Prepare Conversion Execution Plan
Description: Specify test procedures to be followed for performing conversion unit,
business object, and validation tests.
Deliverable: Conversion Test Plans

Task: Develop Conversion Programs
Description: Build conversion programs based on the Conversion Program Design.
Deliverable: Conversion Programs, to include conversion table creation scripts,
SQL*Loader scripts, translation scripts, PL/SQL modules, reconciliation scripts, and
unit test scripts/documentation?

Task: Perform Conversion Unit Tests
Description: Test the performance of each of the individual conversion program
modules.
Deliverable: Unit-Tested Conversion Programs

Task: Perform Conversion Business Object Tests
Description: Test how the converted data performs within the target application.
Deliverable: Business Object-Tested Conversion Programs - System Test

Task: Perform Conversion Validation Tests
Description: Test how the converted data performs within the entire suite of target
applications.
Deliverable: Validation-Tested Conversion Programs - Integration Test

Task: Install Conversion Programs
Description: Install the conversion programs that have been coded and tested. Scripts
used for conversions should remain installed in the repository until the final
conversion is performed.
Deliverable: Installed Conversion Programs

Task: Convert and Verify Data
Description: Convert data that has been verified by the users before commencement of
production operations.
Deliverable: Converted and Verified Data

4  Assumptions/Background
The following assumptions have been made in the development of this strategy:
o Conversion team will include resource(s) with knowledge of legacy data in FRS.
o Conversion requirements, including clean-up, transformation, and business rules,
will be??

5  Stakeholders/Roles
For a detailed list of roles and responsibilities, see the document entitled Roles and
Responsibilities. The following roles are involved in the processes described in this
strategy:
o Conversion Lead (F/T): Provides direction, day-to-day management, and coordination
with business leads. Also helps with ETL mapping and with the building and loading of
data as appropriate.
o Technical Specialist (F/T): Responsible for extracting, scrubbing, and mapping
data, developing conversion programs, and loading and testing data.
o Business System Analyst (P/T): Provides area-specific functional expertise,
documents conversion requirements, and helps in mapping and testing of data.

6  Environments
The following software and hardware requirements are considered components of this
conversion effort:
Software
The application software used as part of this project includes:
Database Server - RDBMS 11g
Application - Kuali Financial Systems 3.0
Application Server - Tomcat 5.5.16
Web Server - Apache xxx

Hardware Environment (please refer to the environment strategy document for details)
The hardware and operating software used as part of this project include:
Hardware - Dell <Model no>


Operating Systems - Redhat Linux <XX>

Instances
Data conversion will be done in the CNV environment, which will be maintained by the
environment management team per instructions from the conversion team.
6.1  Conversion Environment Build Process

The conversion environment (CONV) will initially be built using CFG as a source prior
to data conversion. Subsequently, CONV will be rebuilt on a periodic basis from CFG
based on milestone changes to the configuration. These configurations might be
manually entered by BSAs to rebuild the environment.

7  Tactics
The Conversion Inventory (a deliverable) will serve as a living document that
identifies the scope of entities to be converted and their data sources. Please refer
to that document for an up-to-date list of entities.


1. Conversion Data Mapping/Legacy data cleansing


The data mapping process provides detailed lists of the data sets and data elements that
will need to be moved into KFS tables during the data conversion. During this process,
some decisions will need to be made with regard to obtaining information needed by the
target application that may not be present in the old system. Default settings, user input,
and new data entries are some of the issues that must be addressed during this phase.
The output of this activity is a Data Mapping document that shows what is needed for the
KFS target module processing to meet business operational requirements and where these
data elements will come from in the legacy system. This mapping will form the basis for
the legacy data extract requirements. All scripts and mapping documents will be stored in
shared network drive (P).
Once the legacy data to be extracted has been identified, cleansing can begin. This
will be an ongoing process. Legacy data cleansing should be done, where applicable,
before the data is extracted from the current systems. If cleansing is done within the
legacy system, resource(s) from the source systems (e.g. FRS) should perform the
cleansing operation, as they are the owners of the data and therefore have the
greatest knowledge of the data structures, formats, and means of access to the data.
By contrast, when data is cleansed after extraction (e.g. the chart of accounts), it
should be done in coordination with the Business System Analysts.
2. Download Programs
Resources from the source systems are responsible for writing these programs (in this
case, this person is from the project, correct?). These programs are used to extract the
identified conversion data elements from the current systems in the form of an ASCII flat
file or Spreadsheet. It is important to remember how the flat file will be structured (the
order of the records as they are pulled), type of delimitation used, number of records, etc.
The flat files must match how the interim tables are set up. The output from a download
program is an ASCII flat file as described in the next section.
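In spirit, a download program is a small extract script. The sketch below is hypothetical (the field names, values, and file names are invented); it simply writes identified data elements to a pipe-delimited ASCII flat file, along with a record count to support later reconciliation:

```python
import csv

# Hypothetical extract of vendor records to a pipe-delimited ASCII flat
# file. The field order must match the layout the staging loader expects.
FIELDS = ["vendor_nbr", "vendor_name", "city", "state"]

legacy_rows = [
    {"vendor_nbr": "000123", "vendor_name": "ACME SUPPLY", "city": "TUCSON", "state": "AZ"},
    {"vendor_nbr": "000456", "vendor_name": "DESERT PAPER", "city": "PHOENIX", "state": "AZ"},
]

with open("vendor_extract.dat", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS, delimiter="|")
    for row in legacy_rows:
        writer.writerow(row)

# A record count written alongside the extract supports reconciliation later.
with open("vendor_extract.cnt", "w") as f:
    f.write(str(len(legacy_rows)))
```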
3. ASCII Flat File
Most database or file systems output data in text form. A comma or space delimited,
variable or fixed format data file from the existing system should be generated. If you
cannot find a way to produce clean text data, try generating a report to disk, using a text
editor to format your data.
One of the outputs of conversion data mapping is to identify the legacy data element data
types and precision. After the conversion data mapping is complete, you should know if
there are inconsistencies between the legacy data types and the requirements for the
Oracle data types. If there are translations that need to take place, these translations can
be performed on the legacy system prior to creating the extract or in an interface table.
Also, if you are creating a fixed length extract file, you need to take into account the
precision level of decimal numbers.
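As an illustration of the precision issue, the following sketch parses one hypothetical fixed-length record whose amount field carries an implied two decimal places; the layout and field names are assumptions for illustration only:

```python
from decimal import Decimal

# Hypothetical fixed-length layout: account (6 chars), object code (4),
# amount (11 digits with an implied 2 decimal places).
LAYOUT = [("account", 0, 6), ("object_code", 6, 10), ("amount", 10, 21)]

def parse_record(line):
    rec = {name: line[start:end].strip() for name, start, end in LAYOUT}
    # Implied-decimal amounts: "00000012345" means 123.45.
    rec["amount"] = Decimal(rec["amount"]) / 100
    return rec

rec = parse_record("102400053300000012345")
```

Agreeing on such a layout between the extract and load sides avoids silent scale errors when the data reaches the Oracle staging tables.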


4. Upload Program

Once the data has been extracted to a series of ASCII flat files and physically moved
onto the same computer as the Oracle RDBMS for the staging area, the next step is to
load the data into the relational database environment.
The conversion team will write programs to move data, validate data, and insert/update
standard values into default fields. Usually a single loader program is written for
each staging table, using scripts/tools (e.g. SQL*Loader).
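Conceptually, the loader step behaves like the sketch below, with SQLite standing in for the Oracle staging schema (the real process would use SQL*Loader control files; the table name and file layout are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # stand-in for the Oracle staging instance
conn.execute("""CREATE TABLE stg_vendor (
    vendor_nbr TEXT, vendor_name TEXT, load_status TEXT DEFAULT 'NEW')""")

# Simulate the ASCII flat file produced by the download program.
flat_file = ["000123|ACME SUPPLY", "000456|DESERT PAPER"]

# Each line becomes one staging row; defaulted columns (load_status) are
# the "standard values" inserted during the load.
rows = [line.split("|") for line in flat_file]
conn.executemany(
    "INSERT INTO stg_vendor (vendor_nbr, vendor_name) VALUES (?, ?)", rows)

loaded = conn.execute(
    "SELECT COUNT(*) FROM stg_vendor WHERE load_status = 'NEW'").fetchone()[0]
```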
5. Description and Creation of Intermediary Staging Tables

Before loading the KFS production tables, the legacy data should first be loaded into
temporary or interface tables. The interface tables provide a location for you to
manipulate and translate the data as needed before validating the data and loading the
application production tables. These temporary interface tables need to be built before
you run the loader script to populate these tables. The interface tables may be standard
Oracle Application interface tables or may be custom interface tables. Scripts will be
written to create these tables and any associated indexes and to allocate storage for them.
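Such a creation script might look like the following sketch, with SQLite DDL standing in for Oracle (the staging table name, columns, and index are hypothetical; the Oracle version would also allocate storage):

```python
import sqlite3

# Scripts create the interface (staging) tables and their indexes before
# any loader runs against them.
DDL = [
    """CREATE TABLE stg_coa (
        account_nbr  TEXT,
        object_code  TEXT,
        fiscal_year  TEXT,
        load_status  TEXT DEFAULT 'NEW')""",
    "CREATE INDEX stg_coa_ix1 ON stg_coa (account_nbr)",
]

conn = sqlite3.connect(":memory:")
for stmt in DDL:
    conn.execute(stmt)

tables = [r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table'")]
```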
6. Data Validation, Translation and Transformation

Prior to loading the data into the KFS production tables, there will be a need to
validate and tidy up certain data items. Scripts will be developed to translate data
from the existing system format into useful data for the target module where
necessary. Some data items may need translating into codes in order to satisfy KFS
application look-up validation. There may be several or no translation programs,
depending on both the type of data coming across and the format of that data. The
conversion team is responsible for the transformation and translation and for the
programs that carry these out, while BSAs will provide the business rules for
translation/transformation. Spreadsheets provided/updated by users will be stored on
the network shared drive (P) under the data conversion folder.
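For example, translating legacy object codes to new KFS codes against a BSA-supplied crosswalk, and routing unmapped records aside for correction, might be sketched as follows (all codes and field names here are invented):

```python
# Hypothetical crosswalk from legacy FRS object codes to KFS object codes,
# as supplied by BSAs in a spreadsheet.
CROSSWALK = {"0533": "3100", "0710": "5215"}

def translate(record, errors):
    """Translate the object code; route unmapped records to an error list."""
    new_code = CROSSWALK.get(record["object_code"])
    if new_code is None:
        errors.append(record)
        return None
    return {**record, "object_code": new_code}

errors = []
out = [translate(r, errors) for r in [
    {"account": "102400", "object_code": "0533"},
    {"account": "102400", "object_code": "9999"},  # no mapping -> error list
]]
```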
7. Load Programs

The Load programs will be used to load data from the staging tables into KFS tables. The
various Programs will also provide error capturing and reporting mechanisms for any
records rejected during the upload process. Record rejection tends to be caused by either
failure of validation within the Import routine or because of an earlier failure that has
gone unnoticed. The conversion team is responsible for writing these load programs as
well as any error reporting routines. Programs will be stored and maintained in SVN.
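The error-capture behavior of a load program can be sketched as follows; the validation rules and field names are hypothetical:

```python
# Sketch of a load routine: valid records go to the target table, and
# rejects are captured with a reason for later correction and reload.
target, rejects = [], []

def load(record):
    if not record.get("account"):
        rejects.append({**record, "reject_reason": "MISSING ACCOUNT"})
    elif record["amount"] < 0:
        rejects.append({**record, "reject_reason": "NEGATIVE AMOUNT"})
    else:
        target.append(record)

for rec in [
    {"account": "102400", "amount": 500.00},
    {"account": "", "amount": 25.00},  # fails validation
]:
    load(rec)
```

Keeping the reject reason with the record makes the error-reporting routine a simple query over the reject store.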
8. Application (Module) Production Tables

These are the target production tables where the converted data resides. These tables are
identified early during the data mapping process. These tables drive some of the
translation programs that must ultimately ensure that 100% of the information that the
target applications require is present in the final data structures.
9. Reconciliation/Testing

Even during the pre-conversion steps, tools or custom reports will be used to
generate validation reports from the legacy systems, to be compared later with the
converted data.


The approach taken is to use as many standard reports as possible that are
available in the legacy and target system for the final data validation. If no
reports support the validation requirements, then custom scripts will need to be
created for specific validation purposes.
Data reconciliation should form part of the testing. This process will compare, at
all stages, the number of records extracted with the number of records currently
undergoing processing, and eventually the number of records in the target system.
Reconciliation will include hash totals of certain identified data items.
Testing will include such things as record counts and random sampling, as well as
functional and process testing.
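The record-count and hash-total checks can be sketched as follows (the PO line amounts are invented for illustration):

```python
# Reconciliation sketch: compare record counts and a hash total (control
# sum) of an identified data item, here PO line dollars, between the
# source extract and the converted data.
source_po_lines = [1200.00, 350.50, 75.25]
converted_po_lines = [1200.00, 350.50, 75.25]

checks = {
    "record_count": len(source_po_lines) == len(converted_po_lines),
    "hash_total": round(sum(source_po_lines), 2) == round(sum(converted_po_lines), 2),
}
assert all(checks.values()), f"Reconciliation failed: {checks}"
```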
(rk) I agree that listing the items for validation and the types of tests should
help in audits. However, the types of tests seem to be a mixture of activities from
different areas. For example, transaction tests may not validate data migration and
are usually, if not always, outside of data validation.
We need to clarify who will be responsible for these validations, and the expected
time for validations. Users in some areas have already been doing the validation.
We need to make sure these are accounted for and not a repeat of what's already been
performed.
For logical errors on record loads, who (which roles) will be responsible for making
sure the data is corrected?
Testing will include validation of each of the components that are converted. This
is not to say that 100% of the detail will be validated, but that a sample of each
item will be validated for accuracy of the conversion in terms of what was to be
converted for each row of data (eg are all the fields there and represented
correctly). Further, row counts will be employed to validate completeness of the
data, to ensure that all that was to be converted actually made it to the target
database. The following table shows a sample list of what will be performed.
Each area will have detailed conversion reviews included for those items that
impact their area.
This list is intended to represent most of what is being validated. For a complete
list of conversion entities, please refer to conversion inventory and the project
plan. It is possible that there are other items that are being converted that are not
covered here, but will be tested nonetheless for completeness, accuracy, and
ability to be transacted.
Conversion Topic: Chart of Accounts
Related items to validate: Chart; Accounts; Object Codes
Accuracy test: All fields related to each item were pulled appropriately from FRS.
Completeness test: Row count of each field ties out to either match what was in FRS
or reconciles based on known exceptions (eg GL to SL conversion process).
Transaction test: Transactions are able to be entered for these items and post
successfully.

Conversion Topic: Purchase Orders
Related items to validate: PO Header; PO Line; PO Line Accounting
Accuracy test: All fields related to each item were pulled appropriately from FRS.
Completeness test: PO header and line counts will be validated against row counts
from FRS. Hash total of PO line dollars matches that of FRS (this may only be for the
outstanding amount, as opposed to the original amount).
Transaction test: PO change and cancel functionality will be tested.

Conversion Topic: Vendors
Related items to validate: Vendor header information; Vendor address information;
Vendor divisions
Accuracy test: All fields related to each item were pulled appropriately from FRS.
Vendor consolidation occurred as expected (creating parent and divisions).
Completeness test: Vendor and vendor address counts will match up to what was
expected to be converted (eg no employees or students will be converted). Reporting
is able to be generated as expected.
Transaction test: Invoice processes will be tested (payment requests). Receiving
processes will be tested. Vendor change and inactivation testing. Requisition
creation. Purchase order creation. Disbursement voucher creation. Payment request
creation. Pre-Disbursement Processor payment creation.

Conversion Topic: Accounts Receivable
Related items to validate: Customer; Invoice header; Invoice line; Invoice line
accounting
Accuracy test: Customers will be converted manually and will therefore be validated
at time of entry. Invoice information will come in the form of a predefined
upload-able format; each converted field will need to be tested to ensure the upload
was successful.
Completeness test: Row counts match for customers. Row counts match for invoices and
invoice lines. Outstanding AR total for converted invoices agrees to source.
Transaction test: Invoice writeoff for converted items. Invoice writeoff/credit memo
processing for converted items. Invoice payment receipt for converted items. Invoice
partial payment receipt for converted items.

Conversion Topic: Contracts and Grants
Related items to validate: Similar to accounts?

Conversion Topic: Budgeting
Related items to validate: Budget balances
Accuracy test: Budget figures match for selected combinations.
Completeness test: Number of budget records agrees to total to be converted. Total
budget dollars equals amount from FRS.
Transaction test: Transaction budget validation, when enabled, appropriately
evaluates transaction amount against available budget (eg not previously accounted
for via encumbrance, pre-encumbrance, or expense).

Conversion Topic: Assets
Related items to validate: Asset header information; Asset depreciation information
Accuracy test: Asset information matches the source system. Depreciation converted
correctly.
Completeness test: Number of assets converted agrees. Total value of assets matches.
Total amount of depreciation matches.
Transaction test: Converted assets can be changed or inactivated/disposed. Converted
assets can successfully process through depreciation. Depreciation calculates
correctly.
10. Write and Perform Conversion Execution Plan

The detailed execution plan will be part of the conversion project plan in MS Project
and stored on the network shared drive (P).
8  Success Factors
The following is a brief list of major critical success factors impacting conversion:
o Early conversion planning and co-ordination across all University of Arizona business
groups to maximize timely and cost-effective conversion and migration development
investments
o Well defined technical architecture strategy, requirements, and application
configurations that are agreed upon and stable
o Participation of representatives from each business entity/legacy system group as part
of the project team helping to ensure consideration and analysis of all business and
system interface points
o Timely availability of planned KFS environments
o Early identification and completion of key data translations, clean up, and
transformations as a consequence of planned meetings and liaison with the owners of
the legacy systems.
o Availability of Kuali functional resources and legacy resources to plan, define
requirements, map, execute, and support the migration process.
o Successful completion of set-up of all required applications
o Availability of a fully patched and tested database environment to receive the
newly converted data.

9  Strategy Roadmap

9.1  High Level Timeline

o Deliverable #1 - date
o Deliverable #2 - date
Insert Pipeline Timeline Here

