
Informatica

Profile:
Around 8+ years of experience designing and developing large-scale data integration solutions for data warehouses and application migrations across a variety of platforms and databases.
Good knowledge of Data Warehousing and Business Intelligence concepts with emphasis on Extraction, Transformation and Loading of data.
Extensive experience in analysis, design, development, testing and implementation across the complete project life cycle, and contributed to the development of the system architecture.
Extensive experience in ETL methodology for developing data extraction, transformation and loading; strong development experience in Informatica with various databases like Oracle 11g, MySQL, Teradata 12 and MS SQL Server.
Good understanding of Dimensional Modeling such as Star Schema and Snowflake Schema.
Good understanding of Informatica architecture and extensive experience using the Informatica components Designer, Workflow Manager and Workflow Monitor.
Extensive experience in implementing SCD (Type I, Type II and Type III) logic.
Good exposure to partition points, cache requirements, commit points, pushdown optimization and session recovery.
Good exposure to Teradata architecture, writing BTEQ scripts, and hands-on experience using Teradata utilities like FastExport, MultiLoad, TPump and FastLoad.
Good exposure to creating macros, views, procedures, query writing and performance tuning depending on requirements.
Loading data into Teradata production and development warehouse using Teradata Parallel
Transporter (TPT) with Informatica.
Hands-on knowledge of UNIX operating systems (Linux and Solaris) and Windows 98/2000/2003.
Good exposure to shell scripting and Perl scripting.
Good experience and knowledge of Business Objects and MicroStrategy.
Built OLAP reports and dashboards in MicroStrategy 8.1.1; created the logical schema by defining attributes, facts, metrics, filters, custom groups etc.
Using Narrowcast Administration 8.1.1, created services, subscribers and subscriptions with the personalization characteristics of each subscription, and scheduled and monitored the services.
Experience Details:
Working as a Team Leader at Accenture, Hyderabad, from July 2010 till date.
Worked as a Senior Software Engineer at Surgical Information Systems, Hyderabad, from May 2006 to June 2010.
Worked as a Software Engineer at ICSA, Hyderabad, from Nov 2004 to April 2006.
Certifications:
1. Informatica Certified Developer 8.x
2. Informatica Certified Administrator 8.x

3. Teradata Certified Professional 12.0


Education Details:
1. M.C.A. from Nagarjuna University, passed out in 2003.
Technical Skills:
ETL               : Informatica Power Center 9.1.0/8.6.1/7.x
OLAP Tools        : Business Objects 6.5, MicroStrategy 9.0, QlikView
Databases         : SQL Server 2005, Oracle 9i, Teradata 12.0/13.0, SQL Data Modeler 3.3
Operating Systems : Windows NT, Windows 2003, UNIX
Project #1:
Title       : Novartis AMAC CI to Informatica migration
Client      : Novartis
Role        : ETL Lead
Duration    : Oct 2014 till date
Environment : Informatica Power Center 9.5.0, Informatica Cloud Services, Cast Iron, Flat Files, Oracle 11g, SQL, Salesforce.com, Windows, UNIX
Project Description:
The main intention of this project is to migrate from the current platform to a new platform while scaling up the current system. Flat-file data is fetched from Salesforce.com (SFDC), and database tables are loaded from the SFDC system into the Oracle base using Informatica Cloud Services. Data is loaded from multiple source systems (flat files and Veeva-related data in Salesforce.com), and from the cloud the data is loaded into Oracle tables.
Responsibilities:
Involved in requirement gathering and analysis of BRDs, and worked with system analysts to create source-to-target documents.
Analyzed data in source databases before loading into data warehouse and created technical
documents according to BRD.
Created complex mappings using technical documents and loaded data into various databases.
Used dynamic cache in lookups and created slowly changing dimensions according to the
requirements.
Developed mappings involving complex business logic using mapping parameters, mapping variables, unconnected lookups, SQL overrides, Normalizer, Union etc.
Wrote pre- and post-session SQL scripts while loading data into the Oracle database.
Identified bottlenecks at mapping level using debugger and resolved them to increase
performance.
Optimized the performance of mappings by using appropriate transformations like Source Qualifier, Aggregator, connected lookups etc.
Worked with data architects in analyzing loaded data in the database and modified transformation logic where necessary.
Created pre-session and post-session shell scripts and e-mail notifications.
Optimized query performance, DTM buffer size and buffer block size to tune session performance.
Used parameter files to initialize workflow variables, mapping parameters and mapping variables, and used system variables in mappings for filtering records.
Performed data validation, reconciliation and error handling in the load process.

Created, optimized, reviewed, and executed SQL test queries to validate transformation rules
used in source to target mappings/source views, and to verify data in target tables.
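The parameter files mentioned above follow the usual section-plus-key=value layout. As a rough sketch of how such a file can be parsed for validation or tooling purposes (the section and parameter names below are illustrative assumptions, not the project's actual values):

```python
def parse_param_file(text):
    """Parse an Informatica-style parameter file into a
    {section_header: {parameter: value}} dict (illustrative sketch)."""
    params, section = {}, None
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or line.startswith(";"):
            continue  # skip blank lines and comments
        if line.startswith("[") and line.endswith("]"):
            # Section header, e.g. [Folder.WF:wf_load.ST:s_m_load]
            section = line[1:-1]
            params[section] = {}
        elif "=" in line and section is not None:
            name, _, value = line.partition("=")
            params[section][name.strip()] = value.strip()
    return params
```

Mapping parameters conventionally carry a `$$` prefix and connection variables a single `$`; the parser above treats both uniformly as plain keys.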
Project #2:
Title       : Star Aggregator (Analytics)
Client      : Star TV
Role        : ETL Lead
Duration    : July 2014 to Sep 2014
Environment : Flat Files, MySQL, SQL, Windows, QlikView, UNIX, Erwin 4.1
Description:
The AVS BE source systems make data available in the form of interfaces/extracts to the Star Aggregator System as input for the Staging Area. The source data is composed of flat CSV files produced by the AVS BE source system. The Star DataMart solution contains an aggregated level, with the purpose of creating pre-aggregated data tables from existing target tables as input for the QlikView dashboarding system. The QlikView reporting tool takes as input the four tables provided by the MySQL database after the Reporting Area has been filled. Every night a reload of newly available data is scheduled so that the customer can see each KPI updated to the latest available date.
Roles and Responsibilities:
Performed customer requirements gathering, requirements analysis, design, development, testing, end-user acceptance presentations, implementation and post-production support of BI projects.
Designed source-to-target mappings primarily from flat files and MySQL tables using UNIX jobs.
Developed transformation logic, using procedures, to cleanse the source data of inconsistencies before loading it into the staging area, which is the source for the stage load.
Responsible for performance tuning of several ETL processes.
Created UNIX shell scripts to read and archive files from source directory.
Designed and developed error handling strategies to re-route bad data.
Involved in conceptual, logical and physical data modeling and designed the star schema for the data warehouse.
Used Crontab to schedule and run sessions, as well as to check logs for all activities.
Worked with migration team, testing team to fix defects in various environments like DEV and QA.
Involved in the optimization of SQL queries which resulted in substantial performance improvement
for the conversion processes.
Used mapping variables and mapping parameters and created parameter files.
Participated in code reviews and modified procedures according to the feedback from client.
Developed various ad hoc reports and created documents using Report Services Documents.
Deployed procedure code to Integration, QA and production environments and supported post-production issues.
Documented technical mapping specifications and reviewed them with architects.
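The read-and-archive step described above (shell scripts moving processed files out of the source directory) can be sketched as follows. Directory names, the file pattern and the function name are assumptions for illustration, not the project's actual script:

```python
import shutil
from pathlib import Path

def archive_processed_files(src_dir, archive_dir, pattern="*.csv"):
    """Move processed source files into an archive folder
    (a sketch of the archiving step, not the actual project script)."""
    src, dest = Path(src_dir), Path(archive_dir)
    dest.mkdir(parents=True, exist_ok=True)
    moved = []
    for f in sorted(src.glob(pattern)):
        # Move each matching file out of the source directory
        # so the next scheduled load does not pick it up again.
        shutil.move(str(f), str(dest / f.name))
        moved.append(f.name)
    return moved
```

In the project this kind of step would typically run from cron after the nightly load, with the archive folder often suffixed by the run date.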

Project #3:
Title       : BWIN-PARTY-EDWH
Client      : BWIN
Role        : Team Lead (ETL Developer)
Duration    : July 2012 to June 2014
Environment : Informatica Power Center 9.1.0, ODI, Flat Files, Oracle 9i, SQL Server 2005, Teradata 12.0, Windows, UNIX

Project & Role Description:
Bwin.Party Digital Entertainment is an online gambling company formed by the March 2011 merger of PartyGaming plc and Bwin Interactive Entertainment AG. It is the world's largest publicly traded online gambling firm and is headquartered in Gibraltar. Bwin.Party has four key products/verticals: Sports Betting, Poker, Casino and Bingo. Sports betting is a core business, with three main brands: bwin, Gamebookers and PartyBets. Poker is another core business; its main brand is PartyPoker.

Responsibilities:
Wrote technical requirements and specifications for the modifications after interacting with customers/end users to obtain the requirements.
Worked with Business Analysts and Data Architects in gathering the requirements and designed
the Mapping Specification Documents.
Prepared technical requirements documents which include both macro-level and micro-level
design documents
Used Erwin Data Modeler to design the data marts and to generate the necessary DDL scripts for object creation for DBA review.
Involved in preparing & documenting Test cases and Test procedures. Involved in developing
these for Unit test, Integration Test and System Tests
Used various Informatica transformations like Union, Joiner, Expression, Lookup, Aggregator, Filter, Router, Normalizer, Update Strategy etc.
Involved in the performance tuning of Informatica mappings and reusable transformations, and analyzed the target-based commit interval for optimum session performance.
Wrote Pre-session and Post-session shell scripts for dropping, creating indexes for tables, Email
tasks and various other applications
Used Sequence Generator to create dimension keys and Update Strategy to insert records into the target tables in staging and the Data Mart.
