
Big Data Analyst

Data & Business Analyst, Data Architect, ETL Designer & Data Migration Lead

Toronto, ON - Email me on Indeed: indeed.com/r/f31611a7d1c7a121

• Overall 11+ years of experience in Data Management and Data Warehousing projects in various roles as
a Data Modeler and Data Analyst
• Highly disciplined, committed and self-motivated professional, having worked on a wide range of data-driven
and Business Intelligence projects across multiple domains
• Implemented end-to-end BI delivery solutions based on industry tools such as Erwin, Informatica, Oracle and
Big Data Hive
• Excellent communication skills to manage and liaise with business champions and technical consultants to
explore business process improvement opportunities
• An accomplished and fluent communicator with strong investigation, problem-solving and decision-making
skills, combined with a pragmatic approach and sound business acumen
• Team player able to articulate, facilitate and negotiate, working collaboratively and managing conflict
across diverse stakeholder groups
• Ability to manage client expectations and negotiate a win-win situation

WORK EXPERIENCE

Big Data Analyst

TD Bank - Toronto, ON -

May 2016 to Present

Building a new enterprise Data Lake on a Big Data platform, integrating data from disparate sources such as
COBOL and delimited files to support Data-as-a-Service data shopping, data analytics, and downstream
EDW and BI reporting at TD
• Engaged source owners to create requirements specifications for data ingestion from sources such
as Auto Finance and TSYS Credit Cards into the Data Lake, and to provide the extraction file structure to be
consumed downstream, thereby integrating these systems into MDM as a single golden record
• Analyzed COBOL copybooks and delimited files to create and manage Hive tables, and mapped and
ingested data into the enterprise Data Lake using Podium Data as the data ingestion tool
• Handled copybook REDEFINES, OCCURS clauses and modifications of data segments to match
standard fixed-record-length formats and suit tailored ingestion requirements in Podium Data
• Analyzed and troubleshot various data quality issues encountered during ingestion, such
as alphanumeric values in numeric fields, date format conversions, EBCDIC-to-ASCII character set
conversions, French non-ASCII and control characters, header and trailer ingestion issues,
record reconciliation, and variable-length block records
• Analyzed an already-ingested source, FISA Loan Serve, implemented on Hadoop using MapReduce,
in order to re-ingest it using Podium Data
• Extracted data to generate flat-file extracts for downstream consumption, combining, filtering,
joining and pivoting data sets using the Podium Prepare tool, with Apache Pig and HQL scripts for ETL
• Collected business metadata to populate the metadata information for all ingested sources, held in
the Podium Meta Store, which Podium Data manages internally in a PostgreSQL database
• Created PostgreSQL scripts to extract metadata from the Podium Meta Store to facilitate business
acceptance testing via TD's internal Marketplace, a tool used for shopping data ingested into the Data Lake
• Assisted other DAs within the team in resolving data ingestion issues involving COBOL copybook
modifications, HQL validation or data file validation
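The copybook-driven ingestion described above can be sketched in miniature: a fixed-length EBCDIC record is sliced by field offsets taken from the copybook and decoded into text. The field layout, sample bytes and code page below are illustrative assumptions, not the actual TD copybooks or Podium configuration.

```python
# Illustrative sketch: decode a fixed-length EBCDIC record into named fields.
# The layout (name, offset, length) is a made-up example, not a real copybook.
LAYOUT = [("acct_id", 0, 5), ("name", 5, 10)]

def parse_record(raw: bytes, codec: str = "cp037") -> dict:
    """Slice a fixed-length record by copybook offsets and decode from EBCDIC."""
    out = {}
    for field, start, length in LAYOUT:
        out[field] = raw[start:start + length].decode(codec).strip()
    return out

# 'cp037' is one common EBCDIC code page; French data may need e.g. 'cp297'.
record = b"\xf0\xf0\xf1\xf2\xf3" + b"\xc8\x85\x93\x93\x96" + b"\x40" * 5
print(parse_record(record))  # {'acct_id': '00123', 'name': 'Hello'}
```

Real copybooks add REDEFINES, OCCURS and packed-decimal fields, which is where an ingestion tool such as Podium Data does the heavy lifting.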

Data Modeler - Hive Big Data Platform

CIBC Bank - Toronto, ON -

July 2015 to April 2016

Responsibilities
• Analyzed existing legacy COBOL files and small relational sources to produce physical data
models in a Big Data Hive database using Erwin, for a new Enterprise Data Hub that stages and serves
customer-centric risk data across various customer subject areas, as needed by the various regulatory
teams
• Developed a basic understanding of the "as-is" relational model by reverse engineering the SQL scripts
from the existing data warehouse, in order to integrate the relational source into the Hive database
• Analyzed various legacy COBOL sources to retrofit copybook REDEFINES and OCCURS clauses into the design
of the related Hive tables, making use of complex data types such as maps and structs
• Organized the data models into subject areas aligned to the various compliance teams, to generate data
definition reports
• Developed mapping spreadsheets, analyzing data samples from the existing legacy COBOL files and relational
sources against the new Hive tables, organizing the file partitioning and translating the legacy storage
types to appropriate Hive data types
• Engaged the source team owners to resolve gaps found in the source artifacts, i.e. between the
COBOL copybooks and the stored file technical metadata, clearly articulating the differences between them
in order to finalize the target tables and their data types in Hive
• Supported the development team on data ingestion issues, retrofitting the existing data models based on
actual data trials
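The copybook-to-Hive type translation described above can be sketched as a small rule set; the mapping rules below are illustrative assumptions only (real copybooks also carry COMP/COMP-3 usage, signs and scaling that need case-by-case decisions).

```python
import re

def pic_to_hive(pic: str) -> str:
    """Map a simplified COBOL PICTURE clause to a Hive data type (illustrative rules only)."""
    pic = pic.upper().replace("PIC", "").strip()
    # Decimal: 9(p)V9(s) -> DECIMAL(p+s, s)
    if m := re.fullmatch(r"S?9\((\d+)\)V9\((\d+)\)", pic):
        return f"DECIMAL({int(m.group(1)) + int(m.group(2))},{m.group(2)})"
    # Integer: 9(n) -> INT, or BIGINT when it would overflow 32 bits
    if m := re.fullmatch(r"S?9\((\d+)\)", pic):
        return "BIGINT" if int(m.group(1)) > 9 else "INT"
    # Character: X(n) -> STRING
    if re.fullmatch(r"X\(\d+\)", pic):
        return "STRING"
    return "STRING"  # fallback for clauses this sketch does not model

print(pic_to_hive("PIC X(20)"))       # STRING
print(pic_to_hive("PIC 9(5)"))        # INT
print(pic_to_hive("PIC S9(7)V9(2)"))  # DECIMAL(9,2)
```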

Skills Used
Erwin Data Modeling Tool, Data Analysis,
Data Type Conversion from Primitive to Hive Data Types, Requirements Analysis,
Toad for Oracle, Mainframe COBOL Copybook Understanding and Interpretation

Data & Business Analyst, Data Architect, ETL Designer & Data Migration Lead

Capgemini Consulting India Pvt Ltd -

June 2008 to June 2015

As an ETL Lead, managed offshore delivery for one of Sweden's largest telecom giants on one of its
telecom data warehouses, implemented in Oracle PL/SQL and MS BI
o Managed weekly reporting, issues and client escalations with 2 resources
o Handled change requests and annual maintenance of the existing data marts covering subject areas
such as subscriber talk plan usage and insurance, customer billing, customer churn, customer
score, and customer queries and feedback
o Functionally analyzed and implemented several new complex requests to integrate data into the
existing warehouse
* As a Data Architect, travelled onshore to engage with stakeholders to understand the application
requirements surrounding the re-architecting of the core pricing application for a well-known beverage company
based in Atlanta, US
o Played a key role in designing the application data model and data mappings, combining the new enhanced
use cases with the existing data model of the legacy system
o Ensured smooth migration of key referenced business entities by building appropriate cross-references
across sources to mitigate data quality issues
o Instrumental in building ETL design components and error handling using Oracle PL/SQL routines
o Managed weekly reporting, issues and client escalations with a single resource
* As a Data Designer, travelled onshore to support a core application for debt recovery
management at one of the UK's leading gas suppliers
o Modified the application data model based on changed requirements or on issues that emerged from source
data profiling
o Applied go-live data fixes to migrated debt data, minimizing data quality and mapping issues
o Proactively coordinated across multiple teams to report on the data issues found daily in the
batch load
o Resolved data quality issues arising in the data exchange from the debt application database to the
Business Warehouse
* As an ETL Designer and Lead, travelled onshore to successfully deliver an automated project management
solution for Britain's leading railway, using the WhereScape RED ETL tool and Oracle Database
o Conducted change request workshops and helped deliver ETL and reporting needs using Agile
methodologies
o Reduced manual intervention in ETL processes by designing automation wherever possible, and helped
reduce the number of existing reporting defects, thereby improving the user experience
* As a Technical Business Analyst, travelled onshore for requirements gathering to re-architect the existing
data warehouse for one of the popular betting giants in London, UK
o Engaged the business champions to translate reporting requirements and build the new enhanced
data warehouse
o Instrumental in identifying the existing feeds for the key critical business processes and mapping them against
the new DWH platform, thereby aiding "as-is" business continuity using Agile methodologies
o Aided the data modeling and data mapping activity by continually reviewing, refining and highlighting
missing data elements, ensuring smooth capture of the necessary business requirements
o Supported the UAT and ETL test activities by producing technical reconciliation scripts
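Reconciliation scripts of the kind mentioned above typically compare row counts and control totals between source and target. A minimal sketch, using invented in-memory extracts rather than the actual warehouse queries:

```python
# Minimal reconciliation sketch: compare row counts and a control total
# between a source extract and a target extract. The sample data is invented;
# in practice both sides would come from SQL queries against each system.
def reconcile(source_rows, target_rows, amount_key="amount"):
    """Return a small report of count and sum differences between two extracts."""
    src_total = sum(r[amount_key] for r in source_rows)
    tgt_total = sum(r[amount_key] for r in target_rows)
    return {
        "count_diff": len(source_rows) - len(target_rows),
        "amount_diff": round(src_total - tgt_total, 2),
        "matched": len(source_rows) == len(target_rows) and src_total == tgt_total,
    }

source = [{"id": 1, "amount": 10.0}, {"id": 2, "amount": 5.5}]
target = [{"id": 1, "amount": 10.0}, {"id": 2, "amount": 5.5}]
print(reconcile(source, target))  # {'count_diff': 0, 'amount_diff': 0.0, 'matched': True}
```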
* As a Data Analyst, travelled onshore to carry out detailed data analysis for a UK
Government legal aid institution, to aid its migration from legacy Oracle systems to Oracle ERP
o Carried out initial financial data analysis of the legacy sources, outlining the dependencies of various legal
cases on each other and engaging the stakeholders to define the scope of work
o Produced financial transactional data mappings and process flow documents to help the development
team migrate data from the legacy systems to Oracle Apps ERP
* As an ETL Designer, liaised with the onshore lead coordinator to understand the clinical trials requirements
for migrating clinical data for a US-based clinical trials company
o Produced detailed technical specifications based on the functional requirements for migrating clinical studies
o Played a key role in building a migration tool framework with transformation rules to migrate clinical
studies based on the source application APIs, using Oracle Database and CA Erwin

Developer Analyst

Teradata India Pvt Ltd -

April 2008 to June 2008

As an ETL Developer Analyst, worked on a major release consisting of change requests and defect
fixes for one of the UK's telecom clients using Teradata
o Trained in Teradata SQL and basics
o As a team member, analyzed, developed, documented, unit tested and deployed data integration
components
o Supported and fixed go-live defects

Data Analyst, Designer Analyst, Developer Analyst

Tata Consultancy Services Ltd -

November 2005 to March 2008

As a Junior Data Analyst, aided the data modeling exercise by carrying out the data mapping activity for a
Fortune 500 supply chain manufacturing company implementing a supply chain data warehouse
o Aided the data modeling team by profiling the source data to establish the validity of the data model
o Aided the data mapping activity by producing the mapping spreadsheets
* As a Designer Analyst, implemented a data migration strategy for archiving and purging growing
transactional tables in Oracle Financials for a leading financial consulting company
o Designed the migration approach strategy
o Developed code prototypes and error handling components in Oracle PL/SQL for migrating growing
transactional tables in Oracle Financials
o Supported and fixed go-live defects
* As a Developer Analyst, implemented change requests for a major US airline parts manufacturing
company to aid its decision support system and reporting, implemented through Oracle PL/SQL and Oracle Reports
o As a team member, analyzed, developed, documented, unit tested and deployed various data integration
components
o Supported and fixed go-live defects
KEY COMPETENCIES AND TECHNICAL SKILLS
1. Data & Business Analysis, Data Modeling,
2. Data Mappings, Data Quality, Data Migration, Data Warehousing
3. Change Management
4. Agile, Waterfall and Iterative Methodologies
5. Oracle SQL & PL/SQL, Toad, SQL Developer, WhereScape Red ETL tool,
Informatica PowerCenter, Hive Database 13.0
6. MS BI - SSIS
7. Team Handling
8. Analytical Problem Solving

DOMAIN
• Aerospace & Engineering Bill Of Materials
• Supply Chain Management
• Oracle Financials - E-Business Suite (AR, AP, PO)
• Pharmacy - Clinical Trials
• Beverage Manufacturing, Supply & Pricing
• Online Gambling & Betting
• Project Management & Budgeting
• Debt Recovery Management
• Telecom
EDUCATION

Post Graduate Diploma in Computing

CDAC - Mumbai, Maharashtra

2002

Bachelor of Engineering in Construction Engineering

Shah and Kutchi Anchor College of Engineering - Mumbai, Maharashtra

2001

SKILLS

Data Warehousing (10+ years), Data Migration (9 years), Data Modeling (6 years), Business Analysis (6
years), Oracle PL/SQL (5 years), Informatica PowerCenter (4 years), WhereScape RED (1 year), Agile
Methodologies (4 years), Podium Data (less than 1 year), Data Ingestion (2 years), Data Lake (2 years),
Apache Hive (2 years), Apache Pig (1 year), COBOL Copybook Analysis (2 years)

CERTIFICATIONS/LICENSES

Oracle Certified Associate

August 2005 to Present

Coursera Certification – UC San Diego: Introduction to Big Data

Coursera Certification – UC San Diego: Big Data Modelling and Management Systems

ADDITIONAL INFORMATION

• Overall 11+ years of experience in Data Management and Data Warehousing projects in various roles such
as Data Analyst, Business Analyst, Data Architect, Data Modeler, Data Migration Lead, ETL Designer and ETL
Senior Developer
• Highly disciplined, committed and self-motivated professional, having worked on a wide range of data-driven
and Business Intelligence projects across multiple domains
• Ability to identify and visualize business and systems process improvements, from big picture to detail level,
thereby helping translate requirements into technical specifications
• Excellent communication skills to manage and liaise with business champions and technical consultants to
explore business process improvement opportunities
• An accomplished and fluent communicator with strong investigation, problem-solving and decision-making
skills, combined with a pragmatic approach and sound business acumen
• Team player working collaboratively, with a good understanding of different cultural values and
sensitivities
• Ability to manage client expectations and negotiate a win-win situation
• Ability to facilitate, negotiate, gain consensus and manage conflict across diverse stakeholder groups
