
SWAMY K

Email: kyerriswamy3121@gmail.com
Mobile: +91-9494460907
Skills: Hadoop, Big Data, HDFS, Hive, Pig, Sqoop, Oozie, YARN
OBJECTIVE:
To work in a challenging environment, learn new technologies, and build a successful career in designing
and developing software applications.
PROFESSIONAL EXPERIENCE SUMMARY:

• 3 years of experience in the Information Technology industry.
• Hadoop Developer with around 3 years of experience developing Hive and Pig scripts to solve
customer-specific Big Data problems and create new business insights.
• Expertise in Hadoop ecosystem components such as HDFS, Hive, Pig, and Sqoop for data storage
and analysis.
• Experience in writing Hive queries and Pig scripts.
• Writing Hive UDFs and Pig UDFs based on requirements (a sample UDF sketch follows this list).
• Experience with the Oozie Workflow Engine, running workflow jobs with actions that execute Hive and
Pig jobs.
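Below is a minimal sketch of the kind of Hive UDF mentioned above, assuming a simple string-normalization requirement; the package, class, and column names are illustrative only and are not taken from the projects described later.

```java
package com.example.hive.udf; // hypothetical package name

import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

/**
 * Illustrative Hive UDF: trims and upper-cases a string value,
 * e.g. to normalise codes before they are used in joins.
 */
public class NormalizeCode extends UDF {
    public Text evaluate(Text input) {
        if (input == null) {
            return null; // preserve NULLs, as Hive expects
        }
        return new Text(input.toString().trim().toUpperCase());
    }
}
```

Once packaged into a JAR, a UDF like this is registered in Hive with ADD JAR and CREATE TEMPORARY FUNCTION before being called from a query.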

Experience:

• Working as a Software Engineer at HCL Infosystems Pvt Ltd, Bangalore, from April 2015 till date.

Education:
• Graduated from SK University.

PROJECTS PROFILE:

1. Client: Société Internationale de Télécommunications Aéronautiques (SITA)


Project: Migration of the SITA Referential to Cloudera Hadoop
Project Synopsis:

Current Solution

• Data is ingested from a variety of data sources, including CRM, Siebel, and other systems.
• Several transformations are applied to the data, which is then loaded into an Oracle RDBMS.
• Transformation jobs are defined and managed using Talend.

Targeted Solution

• Use distributed storage and processing by leveraging the existing Cloudera Hadoop ecosystem.
• Transform the existing ETL jobs into distributed data processing.
• Migrate from the traditional ETL approach to the Big Data ecosystem.

Migration Project

• Migration of the existing Referential DB (approximately 300 GB) to the Cloudera Hadoop infrastructure.
• Migration of 95+ Talend ETLs.
• Alteration of interfaces to the web applications to support Hadoop.
• Migration of the existing Referential APIs to Hadoop.
• Migration of Product (PMS), Customer (CMS), and Project applet data sources, including backend queries
and stored procedures, followed by complete system testing and deployment.

2. Client: Société Internationale de Télécommunications Aéronautiques (SITA)


Project: SITA Gabriel MIPS Reduction
Project Synopsis:

Integrating a Hadoop-based cluster with the mainframe: reducing MIPS usage on the mainframe by
moving all batch processing onto the Hadoop cluster. Development of an audit tracing system and a
SYSLOG manager used for tracing system-specific information and delivering messages between
various entities. Developed a MapReduce application (a minimal sketch follows this synopsis) to enable
the settlement of accounting and billing for SITA Finance and Revenue Accounting, so that revenue
settlement could be fully automated with ATPCO (Airline Tariff Publishing Company).
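The settlement logic itself is not detailed in this resume; the following is only a minimal MapReduce sketch, under the assumption that billing records are CSV lines carrying an airline code and an amount. The class names and record layout are hypothetical.

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.DoubleWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

/** Illustrative job: totals billing amounts per airline code from CSV records. */
public class SettlementTotals {

    public static class BillingMapper
            extends Mapper<LongWritable, Text, Text, DoubleWritable> {
        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            // Hypothetical record layout: airlineCode,ticketNumber,amount
            String[] fields = value.toString().split(",");
            if (fields.length >= 3) {
                context.write(new Text(fields[0]),
                        new DoubleWritable(Double.parseDouble(fields[2])));
            }
        }
    }

    public static class TotalReducer
            extends Reducer<Text, DoubleWritable, Text, DoubleWritable> {
        @Override
        protected void reduce(Text key, Iterable<DoubleWritable> values, Context context)
                throws IOException, InterruptedException {
            double total = 0;
            for (DoubleWritable v : values) {
                total += v.get();   // sum all amounts for this airline code
            }
            context.write(key, new DoubleWritable(total));
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "settlement-totals");
        job.setJarByClass(SettlementTotals.class);
        job.setMapperClass(BillingMapper.class);
        job.setReducerClass(TotalReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(DoubleWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```

Because the aggregation is a simple sum, the same reducer class could also be set as a combiner to cut shuffle volume.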

3. Client: Société Internationale de Télécommunications Aéronautiques (SITA)


Project: SITA Passenger Handling System
Project Synopsis:

All datasets were loaded from two different sources, Oracle and MySQL, into HDFS and Hive
respectively. The dataset contains flight details for various airlines, such as: airport ID, name of the
airport, main city served by the airport, country or territory where the airport is located, airport code,
decimal degrees, hours offset from UTC, time zone, etc.

Roles & Responsibilities:


• Used Sqoop for data transfer between MS-SQL and HDFS.
• Used Impala for ad-hoc query testing.
• Wrote Hive queries to read from HBase (a short connectivity sketch follows this list).
• Wrote shell scripts to automate the process flow.
• Wrote Hive and Pig queries and UDFs on different datasets and joined them.
• Used bucketed tables for join optimization and sampling.
• Wrote Hadoop job workflows and scheduled them using Oozie.
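As an illustration of the Hive-over-HBase item above, here is a minimal sketch that runs a Hive query through the HiveServer2 JDBC driver; the host, credentials, and table name are assumptions, and the table itself would be one defined with the Hive HBase storage handler.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

/** Illustrative HiveServer2 client querying an HBase-backed Hive table. */
public class HiveHBaseQuery {
    public static void main(String[] args) throws Exception {
        Class.forName("org.apache.hive.jdbc.HiveDriver");

        // Hypothetical HiveServer2 endpoint and database.
        String url = "jdbc:hive2://hiveserver2.example.com:10000/default";

        try (Connection conn = DriverManager.getConnection(url, "hive", "");
             Statement stmt = conn.createStatement()) {
            // "airports_hbase" is an assumed Hive table created with the
            // HBaseStorageHandler, mapping HBase columns to Hive columns.
            ResultSet rs = stmt.executeQuery(
                    "SELECT airport_code, airport_name FROM airports_hbase LIMIT 10");
            while (rs.next()) {
                System.out.println(rs.getString(1) + "\t" + rs.getString(2));
            }
        }
    }
}
```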

4. Client: Société Internationale de Télécommunications Aéronautiques (SITA)


Project: SITA Passenger Handling System/CKI
Project Synopsis:
The project involves maintenance of three major applications for SITA, namely e-ticketing, passenger
reservation, and the check-in solution. SITA provides a multi-host environment in which nearly 110 airlines
are hosted, with additional airlines hosted as sub-hosts on major carriers. All three applications (TKT, RES,
and CKI) function in a non-linked environment and communicate with each other through internal message
passing. The work involves: monitoring and troubleshooting various NFMs and data extraction jobs that are
critical to the functioning of the system; 24x7 on-call support for all three applications (TKT, RES, and CKI)
by resolving urgent production issues; bug fixing (resolving Trillium tickets) through root cause analysis;
and programming new development and enhancements to suit the business needs of the customer.

Declaration:

I hereby declare that all the information provided above is true to the best of my knowledge, and I bear
responsibility for the correctness of the above-mentioned particulars.

Date:

Place: Bangalore (Swamy)
