
Professional Summary:

He has 8 years of experience in Business Intelligence and Data Warehousing with data warehouse management systems, with successful project deliverables. Well versed in project management and involved in several high-level project management activities such as project scope, plans, schedules, cost and resource estimation, risk management, process rationalization, monitoring, tracking and delivery. Project roles have covered the full project life cycle of feasibility study, analysis, design, build and testing. Excellent experience in Extraction, Transformation and Loading (ETL) processes using the IBM InfoSphere DataStage ETL tool, UNIX shell scripting (loading scripts) and DataStage user-defined routines.

DataStage Administration 7.5/8.1, including creation of new projects, project permissions, installation of the IBM InfoSphere DataStage 8.1 software and license maintenance, as part of DataStage server support.
Extensively worked on various stages such as Oracle Connector, Teradata Connector, XML Input, External Source, Transformer, Sequential File, Dataset, Join, Lookup, Sort, Aggregator, Copy, Merge, Remove Duplicates, Column Import, Column Export, SCD and Change Capture for building staging areas and data marts/warehouses.
Expertise in working with various operational sources such as XML files, Oracle, Teradata, SQL Server and flat files (CSV), loading them into a staging area.
Experienced in all stages of the software life cycle and in architecting a data warehouse with Star and Snowflake schemas using the data modeling tool Erwin 7.0/4.1.
Proven onsite experience with daily customer interaction for more than 3 years.
Hands-on experience in writing, testing and implementing procedures and functions at the database level using PL/SQL.
Exceptional verbal and written communication skills and an excellent team player, comfortable dealing with personnel at all levels of an organization with varying degrees of technical ability, and ensuring that business goals are accurately translated into requirements and then into systems that fulfill those stated goals.

Business Intelligence: DataStage 8.x/7.x, OWB
Operating Systems: IBM AIX 5.2/5.1, Sun Solaris V8.0, HP-UX V11.0, Windows XP/NT/2000/98
Languages: PL/SQL, Java, C, C++, VC++ (MFC classes), Visual Basic 6.0/5.0
Tools & Utilities: StarTeam, Synergy CM, Visual SourceSafe, Visual Café, Perforce
Domain Knowledge: Pharma, Banking, Insurance, Telecom, Logistics, Healthcare
Databases: Oracle, Teradata, SQL Server
Other Areas: Team Management, Project Management, Customer Interaction at Onsite/Offshore, System Analysis and Design, Dimensional Data Modeling, Relational Data Modeling, ETL Design and Implementation

Projects Profile:
Project #1:

Project Name: GDM ODS
Client: GSK
Role: Project Architect
Organization: Satyam Computer Services Limited, India
Duration: Mar 2010 to date
Team Size: 15
Environment (with skill versions): Database: Oracle 10g, XML sources; Tools: TOAD, DataStage 8.1; O/S: Windows XP, Sun UNIX

Project Description:

The GDM ODS warehouse system is built in the form of an Operational Data Store (ODS), an Integrated Data Store (IDS) and data marts/DW system. The architecture uses the ODS to extract data from LIFT and MERPS; incremental data is extracted from the LIFT ODS on a daily basis. The IDS is dimensionally modeled as a snowflake schema wherein the dimensions are loaded first by generating surrogate keys. After referential integrity checks have been performed on the data according to the sample type, valid data is loaded into the fact table; a sample can be one of three types: QC, Stability or non-routine development. In parallel, Certificate of Analysis data from the ODS is loaded into the IDS independently of the dimensions and facts.

The Stability Shared Service (SSS) has been set up under the agile initiative to provide stability testing to 8 GMS donor sites. The SSS operates out of 2 hubs, Barnard Castle and Zebulon. In addition to the 8 sites that use the shared service, a further 5 sites use LIFT to record their stability data.

Currently, much of this data collation, transformation and reporting is generated manually, resulting in resource-intensive activities, multiple opportunities for data error and, ultimately, a negative impact on the time taken to deliver improved performance. The reports are either hand-crafted or, where a system can generate the data tables, still require full data checking before they can be submitted; this process also introduces human errors into the data. In addition, LIFT is a transactional system and does not allow easy access to or simple interrogation of data. The current LIFT data in the ODS also does not enable the most effective trending. Finally, there is a requirement to generate reports that combine LIFT and MERPS data, which is not possible in either source system.

To enable effective stability data reporting, an aggregation and reporting solution is required so that product stability performance can be evaluated for accurate, knowledge-based decision making. All data extracted from LIFT relating to stability is prepared and aggregated with MERPS data in the Integrated Data Store (IDS). Specifically, the solution provides reporting capability for stability data within LIFT to support the Stability Shared Service hubs and the other sites using LIFT to store their stability data. From the IDS, 2 data marts are generated to pre-process the combined data into a series of tables from which reports can be generated quickly and effectively.
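To make the load order concrete, the following is a minimal UNIX shell sketch of the daily ODS-to-IDS run; the project name, job names and dsjob path are hypothetical placeholders, not the actual project artifacts, and only illustrate the extract, dimension-then-fact sequencing described above.

    #!/bin/ksh
    # daily_ids_load.sh -- illustrative sketch of the daily GDM ODS -> IDS load order.
    # Project name, job names and the dsjob path below are hypothetical placeholders.

    PROJECT=GDM_ODS
    DSJOB=/opt/IBM/InformationServer/Server/DSEngine/bin/dsjob

    run_job () {
        # Run a DataStage job and stop the sequence if it does not finish cleanly.
        "$DSJOB" -run -jobstatus "$PROJECT" "$1"
        status=$?
        # With -jobstatus the exit code reflects the job status: 1 = OK, 2 = finished with warnings.
        if [ $status -ne 1 ] && [ $status -ne 2 ]; then
            echo "Job $1 failed with status $status" >&2
            exit 1
        fi
    }

    run_job job_extract_lift_incremental   # incremental extract from the LIFT ODS
    run_job job_load_ids_dimensions        # snowflake dimensions, generating surrogate keys
    run_job job_load_ids_facts             # referential-integrity-checked fact load
    run_job job_load_coa                   # Certificate of Analysis data, independent of dims/facts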

Project Contribution:
Prepared the SSS solution architecture, data architecture, key business requirements, system requirements, project scope, project plans, cost and resource estimates, schedules, BDF documents, risk logs and data health check reports for the project.
Involved in the GMS IT risk assessment carried out with IT Risk Management, covering design risks and mitigations.
Involved in requirements gathering and source data analysis, and identified business rules for data migration and for developing the data warehouse and data marts.

Involved in identifying the data inputs from various source systems depending on the business requirements.
Created source-to-target mapping documents from the staging area to the data warehouse and from the data warehouse to the various data marts.
Involved in low-level design; developed various jobs and performed data loads and transformations using different DataStage stages and pre-built routines, functions and macros.
Played a key role in developing the conceptual data models for the ODS, IDS and data marts.
Used IBM InfoSphere DataStage 8.1 to develop various jobs for extracting, cleansing, transforming, integrating and loading data into the data warehouse.
Developed various UNIX shell scripts for XML/CSV file extraction, validation and loading from the different source systems LIFT, MDES and MERPS. Created UNIX STOP-file scripts for each source system.
Developed DataStage routines and BuildOps to implement custom requirements.
Used crontab to schedule the parallel jobs, and monitored and validated the scheduled runs and their components.
Mentored team members on tool, domain and application knowledge for development and support, and guided them technically to resolve issues.
Managed the project in all areas of service with high customer appreciation.
Parameterized DataStage jobs using parameter sets to allow portability and flexibility at runtime.
Responsible for preparing design documents, test case specifications, performance reviews and code, and getting them signed off by the client.
Planned and conducted conference calls from offshore; planned and distributed work among the offshore team members.
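As an illustration of the loading scripts described above, the following is a minimal UNIX shell sketch of a source-file load wrapper with a STOP-file check and a crontab entry; the paths, file patterns, project and job names are hypothetical placeholders rather than the actual project scripts.

    #!/bin/ksh
    # load_lift_files.sh -- illustrative sketch of a source-file load wrapper.
    # Paths, file patterns, project and job names are hypothetical placeholders.

    LANDING_DIR=/data/gdm/landing/lift
    STOP_FILE=/data/gdm/control/lift.stop
    PROJECT=GDM_ODS
    DSJOB=/opt/IBM/InformationServer/Server/DSEngine/bin/dsjob

    # Honour the per-source STOP file: skip the load when the feed has been halted.
    if [ -f "$STOP_FILE" ]; then
        echo "STOP file present for LIFT; skipping load." >&2
        exit 0
    fi

    # Basic arrival check: at least one CSV or XML file must be present in the landing area.
    files=$(find "$LANDING_DIR" -type f \( -name '*.csv' -o -name '*.xml' \))
    if [ -z "$files" ]; then
        echo "No LIFT source files found in $LANDING_DIR" >&2
        exit 1
    fi

    # Hand the files to the parameterised DataStage load sequence.
    "$DSJOB" -run -jobstatus -param SourceDir="$LANDING_DIR" "$PROJECT" seq_load_lift

    # Example crontab entry for a daily 02:00 run:
    # 0 2 * * * /home/dsadm/scripts/load_lift_files.sh >> /home/dsadm/logs/load_lift.log 2>&1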
