
Ramesh Bachu

Summary:
9+ years of IT experience across different tools and technologies, with strong project development
skills including the ability to work through the entire software development life cycle, from
requirements gathering through development, production support and documentation of the complete project.
Strong experience in designing and developing Extraction, Transformation and Loading (ETL)
processes using Ab Initio software.
Strong hands-on experience with Ab Initio GDE (3.1x/1.15/1.14/1.13) and Co>Op (3.1x/2.15/2.14/2.13).
Excellent knowledge and experience in creating source-to-target mappings, edit rules and validations,
transformations, and business rules.
Well versed in Ab Initio parallelism techniques; implemented Ab Initio graphs using
data, component and pipeline parallelism and Multi File System (MFS) techniques.
Configured the Ab Initio environment to connect to different databases using DB config files and the
Input Table, Output Table and Update Table components.
Experience in providing production support for various Ab Initio ETL jobs and developing
UNIX shell wrappers to run Ab Initio and database jobs.
Implementation of Physical and Logical Data warehouses and Data Marts.
Created ETL mappings and analyzed and documented OLAP report requirements. Solid understanding of
OLAP concepts and challenges, especially with large data sets.
Experience interacting directly with clients and working in an onshore/offshore model.
Experience in Change Management and Code Migration processes.
Expertise in data warehouse testing methodologies. Extensive experience in developing UAT test
plans and test cases, creating and editing test datasets, and generating/executing SQL test scripts and
test results.
Small-team experience: strong experience as an independent senior developer.
SOFTWARE EXPERTISE:
Data warehousing: Ab Initio (GDE 3.x/1.15/1.14/1.12, App Hub, Co>Operating System 3.x/2.15/2.14/2.12)
Data Modeling: Star-Schema Modeling, Snowflake Modeling, Erwin 4.0, Visio.
RDBMS: DB2, Oracle 9i/8i/8.0/7.x, Teradata V2R6/V2R5, MS SQL Server 6.5/7.0/2000
Programming: UNIX Shell Scripting, Korn Shell, SQL, SQL*Plus, PL/SQL.
Operating Systems: Windows 2000/NT, MVS, UNIX
BI tools: Business Objects suite 6.1/5.1, Crystal Reports 8.0/8.5
Work Experience:
Client: Key Bank
Sr. Ab Initio Consultant

Apr 2014 - Till Date

Data from different operational sources was extracted, transformed and loaded into the dimension and
fact tables of the data warehouse. Credit Profile, Customer Profile, Funds Transfer and Billing Profile were
generated regularly. Many UNIX scripts and Ab Initio graphs were developed to provide a best-fit solution
for varying data sources.

Worked with business users to understand the business requirements and coordinated the changes
with the development and testing teams.
Created high-level design documents and technical design documents for ETL graphs.
Involved in creating source-to-target mappings and low-level design documents.

Developed generic graphs to validate incoming data using built-in functions provided by Ab Initio.

Performed data cleansing using Ab Initio functions such as is_valid, is_error, is_defined.
Developed complex graphs using various Ab Initio components such as Join, Rollup, Lookup,
Partition by Key, Partition by Round Robin, Gather, Merge, Dedup Sorted, Scan and Validate.
Responsible for automating the ETL process through scheduling and exception-handling routines
Used BI tool Business Objects for reporting purposes.
Developed parameterized generic graphs for data extraction and load.
Used the FTP components to migrate data from different servers to facilitate subsequent
transformation and loading processes.
Developed several Korn shell scripts to interface with the incoming files and to automate the daily
and monthly routines.
Developed graphs using partitioning components to take advantage of data parallelism and used
functions like lookup_local wherever necessary instead of lookup.
Developed Crystal Reports to address diverse levels of organizational requirements, ranging from
individual employees to a complete line of operations.
Wrote a wrapper utility for invoking Ab Initio graphs with appropriate parameters (see the sketch after this list).
Responsible for creating a job execution schedule and coordinating it with the scheduling team.
Produced extensive documentation including technical specifications, interface specifications, source-to-target
mappings, implementation plans, run guides, etc.
Used Control Center for monitoring jobs and analyzed issues and performance-improvement
opportunities.
Extensively involved in Ab Initio graph design, development and performance tuning.
Gathered knowledge of existing operational sources for future enhancements and performance
optimization of graphs.
Used UNIX environment variables in all the Ab Initio graphs to specify the locations of the source
and target files.
Worked with EME / sandbox for version control and did impact analysis for various Ab Initio projects
across the organization.
Documentation of complete Ab Initio Graphs
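
A minimal sketch of the kind of Korn shell wrapper referenced above, assuming the graph is deployed as a .ksh script that reads its parameters from environment variables; the graph name, parameter and paths are hypothetical placeholders, not actual project values.

#!/bin/ksh
# run_graph.ksh - hypothetical wrapper for a deployed Ab Initio graph.
# The graph name, parameter and log location below are placeholders.

RUN_DATE=${1:?"Usage: $0 <run_date YYYYMMDD>"}
GRAPH=/apps/etl/deploy/load_customer_profile.ksh    # deployed graph script (assumed path)
LOG=/apps/etl/logs/load_customer_profile_${RUN_DATE}.log

export RUN_DATE        # assumed: the graph picks up RUN_DATE from the environment

"$GRAPH" > "$LOG" 2>&1
RC=$?

if [ $RC -ne 0 ]; then
    echo "Graph failed with return code $RC - see $LOG" >&2
    exit $RC
fi
echo "Graph completed successfully for $RUN_DATE"

A scheduler or operator would typically call this as run_graph.ksh 20140401, with the return code driving downstream job control.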

Environment: Ab Initio GDE 3.x, EME, Co>Op 3.x, UNIX (Sun Solaris), Teradata V2R6/V2R5, SAP,
SQL, Oracle 9i, Erwin 3.5, Business Objects, UNIX Shell Scripting.
Merck and Company, NC
Sr. Ab Initio Consultant

Jan 2012 - Mar 2014

GIS (Global Integration System): This application was initiated to consolidate all warehouse application
data and migrate it to the GIS application.

Designed and developed various Ab Initio graphs for EDW enhancements.


Produced internal design documents for processing each feed and participated in design review
meetings.
Participated in Software Requirements Specification (SRS) meetings with various downstream
business and technical teams.
Developed a number of Ab Initio graphs for the ETL processes based on business requirements,
using various Ab Initio components such as Partition by Key, Partition by Round Robin, Reformat, Join,
Rollup, Gather, Replicate, etc.
Extensively used the Ab Initio features of component, data and pipeline parallelism.
Developed batch graphs for Monthly Ground revenue for sales.
Used PSETS and PLAN to create Ab Initio jobs.
Responsible for code review and Performance-tuning of existing Ab Initio graph changes.
Worked extensively on modifying the existing wrapper scripts and Ab Initio code during the migration from UNIX to Linux.


Developed shell scripts to automate file manipulation and data loading.


Created and automated a comparison process to compare the results generated by the old and new
applications and to capture the unmatched records (see the sketch after this list).
Worked on Autosys for scheduling the Batch Jobs.
Analyzed the issues with the unmatched records and provided code fixes for the issues.
Developed and automated load utility graphs that refresh the data across databases.
Worked extensively with the EME Management Console to manage and administer the EME and promote
code.
Created test plans and test cases and was responsible for unit testing.
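
A minimal sketch of the old-versus-new comparison approach referenced above, assuming both applications write flat-file extracts; the file names and working directory are hypothetical placeholders.

#!/bin/ksh
# compare_feeds.ksh - hypothetical old-vs-new output comparison.

OLD_FILE=${1:?"old application extract"}
NEW_FILE=${2:?"new application extract"}
WORK_DIR=${3:-/tmp}

# Sort both extracts so they can be compared record by record
sort "$OLD_FILE" > "$WORK_DIR/old.sorted"
sort "$NEW_FILE" > "$WORK_DIR/new.sorted"

# comm -3 suppresses lines common to both inputs, leaving only the unmatched records
comm -3 "$WORK_DIR/old.sorted" "$WORK_DIR/new.sorted" > "$WORK_DIR/unmatched_records.txt"

echo "Unmatched records: $(wc -l < "$WORK_DIR/unmatched_records.txt")"

The unmatched_records.txt output is what would then be analyzed and fed back as code fixes.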

Environment: Ab Initio (Co>Op 3.1.5, GDE 3.1.5), UNIX, EME, Teradata, TOAD, Autosys and Windows
UBS, NJ
Ab Initio Developer

Jul 2011 - Jan 2012

PRV (Payments & Receivables Vision): The objective of this assignment was to complete the payments and
receivables data acquisition for North America region customers of Capital One; the vision is to
normalize the data and load it into the Global Information Warehouse and the marts so that the data can be
displayed on the front end.

Analyzed requirements with end users and business analysts and developed strategies for ETL processes.
Gathered and Documented Business requirements in PSDs (Physical Solution Document)
Performed tuning and optimization of database configuration and application SQL.
Implemented history preservation methods (SCD Type 2 and SCD Type 3) for individual applications in the
application data mart area using Ab Initio, and participated in data modeling meetings.
Designed and developed numerous graphs using Ab Initio and wrote numerous wrapper scripts
(Korn shell) to automate jobs.
Worked on the EDW (Enterprise Data Warehouse); developed Ab Initio graphs to extract data from the
Enterprise Data Warehouse and create data marts (full load and incremental load).
Extensively used Ab Initio DB components to load data from heterogeneous as well as homogeneous
source systems (DB2, Oracle, etc.) into the target Teradata data warehouse and other file systems.
Performed transformations at the staging area, such as cleansing the data (dealing with
missing elements, parsing into standard formats), combining data from multiple sources, de-duplicating
the data and assigning surrogate keys.
Enabled component folding for optimal performance; used phases and checkpoints effectively, and used
watchers and debug mode while debugging the graphs.
Tuned Teradata development and production systems for optimal performance.
Experience in SQL Tuning using EXPLAIN PLAN.
Statistics collection and tuning based on UPI, NUPI, USI and NUSI.
Mapped metadata from the legacy source system to target database fields and was involved in creating Ab
Initio DMLs.
Used the BTEQ and SQL Assistant (Queryman) front-end tools to issue SQL commands against the
Teradata RDBMS to meet the business requirements.
Developed and worked on numerous BTEQ/SQL scripts, made changes to existing scripts and
completed unit and system testing (see the sketch after this list).
Worked with Remedy (change management system); created CRs (Change Requests) and RFCs
(Requests for Change).
Performed unit testing and wrote test cases for every object.
Worked with QA, UAT and SIT teams to resolve incidents during object testing phase.
Worked on Tivoli scheduling tool. Created Tivoli Web Forms for scheduling jobs.
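
A minimal sketch of driving BTEQ from a Korn shell script, as referenced above; the TD_LOGON variable, database and table names are hypothetical assumptions, not the actual project objects.

#!/bin/ksh
# run_bteq.ksh - hypothetical BTEQ wrapper; logon string and table names are placeholders.

TD_LOGON=${TD_LOGON:?"set TD_LOGON to tdpid/user,password before running"}
LOG=bteq_run.log

bteq <<EOF > "$LOG" 2>&1
.LOGON ${TD_LOGON}

/* illustrative row-count check against an assumed mart table */
SELECT COUNT(*) FROM PRV_MART.PAYMENTS WHERE LOAD_DT = CURRENT_DATE;

.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF
.QUIT 0
EOF

RC=$?
if [ $RC -ne 0 ]; then
    echo "BTEQ failed with return code $RC - see $LOG" >&2
    exit $RC
fi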


Environment: Ab Initio Co>Op 2.15/2.14, GDE 1.15/1.14, EME, DB2, Teradata V2R12/V2R6, Oracle 10g, UNIX,
Tivoli, SQL Assistant, TOAD
Client: AT&T, WI
Ab Initio Developer

Nov 2010 - Jul 2011

BOB (Book of Business): This project is an enhancement to the existing Book of Business project, introducing
new business rules that help the RM engage with customers and drive revenue generation.

Wrote processes to produce dynamic xfrs and dmls using meta-programming features.
Developed a fully metadata-driven architecture.
Worked with the business to translate business requirements into high-level design, detailed-level
design and functional code.
Developed highly generic graphs to serve the instant requests from the business.
Developed and modified subject-area graphs based on business requirements using various Ab Initio
components such as Filter by Expression, Partition by Expression, Reformat, Join, Gather, Merge, Rollup,
Normalize, Denormalize, Scan, Replicate, etc.
Worked extensively in the UNIX environment using shell scripts and wrapper scripts.
Responsible for writing the wrapper scripts to invoke the deployed Ab Initio Graphs.
Used phases and checkpoints in the graphs to avoid the deadlocks, improve the performance and
recover the graphs from the last successful checkpoint.
Worked with the infrastructure team to write custom shell scripts serving daily needs such as
collecting logs and cleaning up data (see the sketch after this list).
Involved in writing processes to continuously capture data from different servers across the
country.
Involved in Creating dynamic compare reports for the users.
Extensively used Ab Initio built in string, math, and date functions.
Used Enterprise Meta Environment (EME) for version control, Control-M for scheduling purposes.
Involved in Unit testing, System testing and debugging during testing phase
Worked on database connections, SQL joins, loops, materialized views, indexes, aggregate
conditions and parsing of objects, and wrote PL/SQL procedures and functions for processing business
logic in the database.
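
A minimal sketch of the kind of daily log-collection and cleanup script referenced above; directories, file patterns and the retention window are hypothetical assumptions.

#!/bin/ksh
# daily_housekeeping.ksh - hypothetical log collection and cleanup script.

LOG_DIR=/apps/bob/logs          # assumed application log directory
ARCHIVE_DIR=/apps/bob/archive   # assumed archive location
RETENTION_DAYS=30
STAMP=$(date +%Y%m%d)

# Bundle the current logs into a dated archive
cd "$LOG_DIR" || exit 1
tar -cf "$ARCHIVE_DIR/logs_${STAMP}.tar" *.log

# Remove archives that have aged past the retention window
find "$ARCHIVE_DIR" -type f -name "logs_*.tar" -mtime +$RETENTION_DAYS -exec rm -f {} \;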

Environment: Ab Initio 2.12, UNIX shell scripting, Control-M, Oracle 9i, Sun Solaris,
Windows 2K
Client: Accenture @ Verizon, VA
UNIX Admin

Apr 2008 - Oct 2010

Responsible for the installation, configuration and administration of UNIX/Sun Solaris servers.


Applied UNIX skills to monitor user-related logins and issues.
Responsible for patches, package installations, upgrades and backups for servers.
Performed daily morning health checks of network and server performance along with log monitoring.
Developed health check scripts to monitor the status of all the applications (see the sketch after this list).
Experience working with virtualization technologies: Solaris Zones/Containers and LDOMs.
Performed hardware maintenance and upgrades, including component-level work.
Implemented middleware packages (WebLogic/iPlanet and VisiBroker/CORBA).
Performed capacity planning of hardware, memory and OS for enhancement of application servers.
Provided on-call support for technicians using the administered software and hardware
applications.
Troubleshot unintended results by investigating error logs to determine their cause.


Disk and File system management through VERITAS Volume Manager


Actively participated in team meetings to discuss and analyze production issues.
Automated builds and deployments through the IPM tool at Verizon.
Used the CMIS ticket system (solved user problems or escalated problems in a timely manner).
Created CAs (Change Availability) in Vcop and Nchange (Verizon CA tools) and TPs (Task Plans) for
UAT, PTE, DR and PRODUCTION deployments.
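
A minimal sketch of the kind of morning health-check script referenced above; the process names, disk threshold and mail alias are hypothetical assumptions.

#!/bin/ksh
# health_check.ksh - hypothetical morning health-check script.

ALERT="unix-oncall@example.com"   # placeholder mail alias
DISK_THRESHOLD=90
REPORT=/tmp/health_check.$$

# Flag any file system above the usage threshold
df -k | awk -v t=$DISK_THRESHOLD 'NR > 1 { gsub("%", "", $5); if ($5 + 0 > t) print "Filesystem " $6 " at " $5 "%" }' > "$REPORT"

# Confirm key application processes are running (process names are assumptions)
for proc in weblogic iplanet
do
    ps -ef | grep "$proc" | grep -v grep > /dev/null
    if [ $? -ne 0 ]; then
        echo "Process $proc is not running" >> "$REPORT"
    fi
done

# Mail the findings only when something was flagged
[ -s "$REPORT" ] && mailx -s "Health check alerts on $(hostname)" "$ALERT" < "$REPORT"
rm -f "$REPORT"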

Client: T-Mobile, TX
UNIX Admin

Oct 2007 - Apr 2008

Installed, configured and upgraded Solaris and Linux operating systems.


Installed Red Hat Server 4 & 5 and patched Linux servers.
Managed user accounts for team access to Red Hat Linux servers.
User Administration, management and archiving.
Installed operating systems, patches, hardware and vendor software packages; performed
customization and documentation.
Monitored system resources, logs and disk usage; scheduled backups and restores.
Working knowledge of zone creation, deletion and maintenance.
Software and Hardware Installation, Backups, Cabling, Network.
Monitored and ran batch jobs (Control-M).
Developed cron scripts for error reporting and debugging (see the sketch after this list).
Checked and repaired file system superblocks with the fsck and dd utilities.
Used the remedy ticket system to open and solve tickets.
Developed various startup scripts for the Tomcat and WebSphere servers.
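
A minimal sketch of the kind of cron-driven error report referenced above; the log locations, search patterns, mail alias and script path are hypothetical assumptions.

#!/bin/ksh
# log_error_report.ksh - hypothetical cron-driven error report.

LOGS="/var/log/messages /apps/tomcat/logs/catalina.out"   # assumed log locations
REPORT=/tmp/error_report_$(date +%Y%m%d).txt
ALERT="linux-admins@example.com"                          # placeholder mail alias

for f in $LOGS
do
    [ -r "$f" ] || continue
    echo "=== Errors in $f ===" >> "$REPORT"
    egrep -i "error|fail|panic" "$f" | tail -50 >> "$REPORT"
done

# Mail the report only when something was found
[ -s "$REPORT" ] && mailx -s "Daily error report from $(hostname)" "$ALERT" < "$REPORT"

# Example crontab entry (runs daily at 06:00; path is an assumption):
# 0 6 * * * /usr/local/scripts/log_error_report.ksh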


Environment: Solaris 10/9/8, Sun Enterprise Servers, Red Hat Linux, WebSphere, Tomcat, CVS, Oracle,
AWK, Network Appliance NFS/CIFS server.

