
NAVEEN REDDY

PH NO : 571-527-8252
OBJECTIVE

To obtain a challenging position in a fast-paced environment in the field of Software Quality Assurance
that would best utilize my technical and interpersonal skills.

CAREER SUMMARY

 Over 5 years of experience in Information Technology with emphasis on Software Quality Assurance.
 Excellent at analyzing Business Requirements Specifications and developing Test Plans, Test
Cases and Test Scripts.
 Hands on experience with all phases of Software Development Life Cycle (SDLC).
 Good expertise in Manual and Automation Testing tools like Quick Test Pro, Soap UI tool,
FitNesse, Service Test Tool, Load Runner, Quality Center, Test Director and TSC (ERIE
Insurance in house tool). Also used QTP 10 to test web services.
 Extensive experience in following QA Methodologies, preparing Test Plans, writing Test
Cases and executing them; performed Defect Reporting and Tracking through the entire defect
life cycle.
 Expertise in testing Client-Server and Web Applications.
 Executed and updated SQL queries for Manual Back-End Testing.
 Expertise in Regression, Functional, System, Performance, Load, Stress, GUI and Data-
Driven testing.
 Very good understanding of WSDL, XML, HTML, XHTML.
 Experienced using Test Director, Quality Center and Rational Clear Quest defect reporting
tools.
 Experience in Black Box, Positive, Negative, Data Driven, Unit, System, Integrated and
Back End Testing.
 Ability to work in team environment as well as individually.
 Strong analytical skills coupled with Good Communication and Interpersonal Skills.
 Expertise working in Agile methodology and Test-Driven Development.
 Good expertise in the Insurance and Banking industry domains.

TECHNICAL SKILLS

Testing Tools: Load Runner, QTP, Win Runner, Rational Team Test
Test Reporting Tools: Test Director, Quality Center, Rational Clear Quest, MQC
Programming Languages: C, C++, VB, .NET, Visual Basic, Java, XML
Web Technologies: ASP.NET, ADO.NET, ASP, HTML, MS FrontPage
Databases: SQL Server, MS Access, Oracle
Operating Systems: Windows XP/2000/NT/98, UNIX, DOS, MAC OSX
Web Servers: WebSphere, IIS, Apache, Sonic MQ
GUI: Visual Basic 5.0/6.0, Business Objects Designer 5.x/XI, Toad 7.6

PROFESSIONAL EXPERIENCE

ERIE INSURANCE GROUP - Erie, PA Sep 2008 – Present

Quality Assurance Analyst

Erie Insurance offers a broad range of services to meet family insurance needs, including a
variety of Home and tenant insurance policies and Boat insurance. EIG also offers a variety of
business insurance products to meet the needs of both small and large businesses.
The objective of the integration testing team is to demonstrate that the components comprising the
system interface and interact together in a stable and coherent way before system test, and to verify
that the integrated programs interact correctly. This testing validates the technical design, the
application architecture of the web services, and the mappings made to the web services by the
developers in the integration layer.

• As a Quality Assurance Analyst, was involved in QA analysis, software testing and
documentation for the integration team.
• As part of the integration test team, tested interfaces between different vendor applications and
web services wherever communication occurs.
• Closely worked with Business Analyst and System Analyst to set up the mapping for IAA
business enterprise code for all the interfaces in integration.
• Performed Development Integration, System Integration, End to End and User Acceptance
Testing for the services.
• Analyzed and reviewed the scope document individually and as a team to discuss potential
questions and finalized the scope document for each quarter.
• Test Planning – Focused on identifying the conditions that will validate the system
components interoperate and developed the approach on how integration test has to be
structured and administered.
• Documented detailed Test cases and created Test Data based on Use case requirements
and Functional Specifications.
• Executed Test Scripts, comparing actual and expected results, logged defects and tracked
through Quality Center.
• Actively participated in daily stand-ups, weekly walkthroughs for project updates, and sprint
planning meetings.
• Very good experience testing the web services using tools like FitNesse, Soap UI, QTP (web
services plug-in).
• Created Test data and Test cases for FitNesse to parameterize the XML, invoke the service
and validate the output with the expected results.
• Used Soap UI tool for inspecting WSDL, invoking web services, mocking WSDL and
functional testing of the web services over HTTP.
• Created a new Test Suite for the project and added Test Cases, Sent custom SOAP requests
and checked the responses with assertions in Soap UI.
• Used Quick Test Professional to invoke the operations of the web service and verify
returned XML data using special functionality that has been customized for web services.
• Logged all defects; SIR (defect) call meetings were held every day during the test
window to discuss the status of open defects and to confirm each defect's validity and
severity.
• Performance testing was done early, as developers construct web services and deploy them.
• Performance testing of the services was done using Load Runner’s web service test wizard.
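The SoapUI work described above centers on sending SOAP requests and checking the responses with assertions. The sketch below shows that idea in Python, parsing a SOAP response envelope and asserting on values in its body; the service, namespace, and fields are hypothetical examples, not the actual ERIE web services.

```python
# Illustrative sketch of the kind of response assertion SoapUI performs:
# parse a SOAP response envelope and assert on values in the body.
# The quote service, namespace, and fields below are hypothetical.
import xml.etree.ElementTree as ET

SAMPLE_RESPONSE = """<?xml version="1.0"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/"
               xmlns:q="http://example.com/quote">
  <soap:Body>
    <q:GetQuoteResponse>
      <q:PolicyType>HOME</q:PolicyType>
      <q:Premium>425.50</q:Premium>
    </q:GetQuoteResponse>
  </soap:Body>
</soap:Envelope>"""

NS = {
    "soap": "http://schemas.xmlsoap.org/soap/envelope/",
    "q": "http://example.com/quote",
}

def assert_response(xml_text):
    """Validate expected values in the SOAP body, like SoapUI assertions."""
    root = ET.fromstring(xml_text)
    resp = root.find("soap:Body", NS).find("q:GetQuoteResponse", NS)
    policy = resp.find("q:PolicyType", NS).text
    premium = float(resp.find("q:Premium", NS).text)
    assert policy == "HOME", f"unexpected policy type: {policy}"
    assert premium > 0, "premium must be positive"
    return policy, premium

policy, premium = assert_response(SAMPLE_RESPONSE)
print(policy, premium)
```

In SoapUI the same checks would be expressed as XPath or content assertions attached to the test step.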

QA Tester OCT ’07 – Sep ’08
Nextel Telecommunications, VA
Environment: VB 6.0, SQL Server 2000, Windows 2000/XP, XML, Perl, QTP, Mercury Quality
Center, LoadRunner, C++, VOIP billing system, Amdocs (Ensemble), IVR System, IIS

Worked as QA Tester/Analyst for Nextel Telecommunications, which provides wireless services.
The project involved customizing and testing the billing system used by customer representatives.
This application allows customers and CSRs to check minutes used, due dates, services, new
features, billing information and payment information. I worked on SmartPay, a prepaid
wireless billing system based on the Amdocs Enabler system, used by customers (wireless carriers)
with Rating, Billing, Reporting, and Service units (RSUs, VRSUs) for pre-rating wireless calls,
generating call detail records, administering customers, allocating and creating rate plans, and
generating customized carrier-level reports.

Responsibilities:

• Involved in the review of the Requirements Specification with the QA manager and technical
specialists of the application.
• Created Test Plans and Test Cases; developed and executed Test Scripts.
• Worked along with Amdocs technical support to troubleshoot new releases and production
issues of the billing system.
• Implemented the whole life cycle of QA methodology starting from planning, capturing,
creating, executing, reporting and tracking the defects using Mercury Quality Center.
• Involved in writing test cases for testing IVR System.
• Designed, Communicated, and enhanced QA testing plan for the application.
• Performed Manual and automated testing.
• Connected to SQL in UNIX and created and executed SQL queries.
• Conducted System, Integration and Regression testing of the AUT.
• Created and executed SQL scripts and Unix Shell Scripts to perform Back-end testing on
the Oracle database.
• Wrote numerous Perl scripts that used Perl DBI to perform data verification against an
Oracle database.
• Prepared Test Cases, according to the business specification and wrote scripts using QTP
according to that.
• Tested user interface and navigation controls of the application using QTP.
• Worked with Functions and Library Files in QTP.
• Performed Data- Driven Testing using QTP.
• Extensively used VB language to modify scripts in QTP.
• Created Parameterization, Rendezvous and Scheduling Operation (ramp up/ramp down)
using Load Runner.
• Used Mercury Virtual User Generator to capture user actions.
• Customized the VuGen scripts by creating parameters, handling correlations and
creating transactions wherever needed.
• Used Mercury Controller to check response times on the application with 685
concurrent users.
• Performed Load, Stress and Performance testing Using Load Runner.
• Conducted Functional testing for Middleware services and API.
• Conducted data and referential integrity testing of the database manually.
• Performed Usability and Security testing manually.
• Performed manual and automated testing in IVR application
• Maintained Traceability Matrix and performed Gap Analysis.
• Performed back-end testing by extensively using SQL commands to verify the database
integrity.
• Manually Conducted Positive and Negative testing.
• Coordinated with the developer on defect status on regular basis.
• Used Mercury Quality Center to track and analyze defects.
• Analyzed test results using reports and graphs generated in Mercury Quality Center.
• Worked on Mercury Quality Center in setting up and Customizing Project entities for Defect
Module Screens.

QA Tester JAN ’07 – SEP ’07
Choice One Communications, Rochester, NY
Environment: Java, UNIX, Oracle, QTP, Mercury Quality Center, Windows 2000, Business Objects,
IVR, LoadRunner, Amdocs (Ensemble), IIS, Perl

Choice One Communications is a leading integrated communications provider offering voice and data
services to small and medium sized businesses. I was involved in testing of Order Management
module and Bill Payment module of the operation support system. The project was targeted to
maximize efficiency through automation, and to achieve excellence in customer care. The order
management module was built to organize and track the progress of customer orders, ensuring that
all steps in the process are completed on time, in correct sequence and in accordance to the
inventory to confirm that necessary assets are available. The Front End of the software was
developed in JAVA using SWING and the Back End was Oracle database server. The bill payment
middle tier module was an asynchronous multi-tier application using Vitria Business Ware as the
middleware. An RMI layer acted as the link between the Vitria Business Ware and the EJB layer on
the front end. COC specializes in developing IVRS (Interactive Voice Response Systems) for data
collection, expediting clinical trial development on a global scale. I worked as a
Business Analyst for the IVRS application.

Responsibilities:

• Worked with the Business Analyst to review test requirements and developed detailed Test
Plan for the testing effort of the User Interface and also accordingly for billing portion of
Amdocs Ensemble.
• Involved in preparing test plan and test cases based on the business requirements.
• Involved in developing test cases for manual testing.
• Implemented the SDLC for the testing life cycle and followed the standards process in the
application.
• Created several Perl scripts to parse one-minute statistical entries from application log files and
report the min, max, and average processing times for 'findmin' and 'pricecall' transactions
from subscribers making or receiving calls during load testing.
• Tested the properties of tables using table checkpoints in QTP.
• Created page checkpoints to test the properties and contents of web pages in QTP.
• Performed back-end testing manually by writing SQL queries and also by using QTP DB
checkpoints.
• Conducted User Acceptance Testing (UAT) of IVR system.
• Conducted manual testing on the application to a large extent.
• Created Parameterization using LoadRunner.
• Performed Load, Stress and Performance testing Using LoadRunner.
• Configured rendezvous points and scheduling operations (ramp-up/ramp-down) using LoadRunner.
• Developed test cases and scripts for Functionality and Security testing.
• Performed Back-end testing by writing SQL queries using PL/SQL commands and UNIX
Shell Scripting.
• Performed execution of test cases manually to verify the expected results.
• Conducted Security, Integration, functional testing, regression and GUI testing on each
build or version of the application.
• Conducted Positive and Negative testing manually.
• Performed Data Driven testing in WinRunner using parameterization.
• Performed back-end testing manually by writing SQL queries and also by using WinRunner DB
checkpoints.
• Conducted class mapping on the objects of non-standard class using WinRunner.
• Maintained traceability matrix and performed gap analysis.
• Actively involved in discussions related to Enhancement requests and Modification
requests.
• Isolated and reported bugs and made sure that the bugs were fixed efficiently and thoroughly.
• Used MQC for error reporting and communicating between developers, product support and
test team members.
• Reported and Tracked bugs using MQC.
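The log-statistics scripts described above parsed per-transaction processing times out of application logs and reported min, max, and average per transaction type. A short Python sketch of the same idea follows; the log line format is a hypothetical example, not the actual Choice One log layout.

```python
# Sketch of the log-statistics idea described above: pull per-transaction
# processing times out of application log lines and report min/max/avg
# per transaction type. The log line format here is hypothetical.
import re
from collections import defaultdict

SAMPLE_LOG = """\
2007-03-01 10:00:01 TXN=findmin TIME_MS=12
2007-03-01 10:00:02 TXN=pricecall TIME_MS=40
2007-03-01 10:00:03 TXN=findmin TIME_MS=18
2007-03-01 10:00:04 TXN=pricecall TIME_MS=36
"""

LINE_RE = re.compile(r"TXN=(\w+)\s+TIME_MS=(\d+)")

def summarize(log_text):
    """Return {txn: (min, max, avg)} processing times in milliseconds."""
    times = defaultdict(list)
    for match in LINE_RE.finditer(log_text):
        times[match.group(1)].append(int(match.group(2)))
    return {
        txn: (min(vals), max(vals), sum(vals) / len(vals))
        for txn, vals in times.items()
    }

stats = summarize(SAMPLE_LOG)
print(stats)
```

The original scripts did the equivalent with Perl regexes over rotated log files, emitting one summary row per one-minute window.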
Performance Analyst May ’06 – Dec ’06
UBS Banking and Investments, NJ
Environment: Java, UNIX, Load Runner, Win Runner, Oracle 8i, Test Director, MQ Series

UBS is one of the leading banking and investment service providers in the world, handling more than
100 business objects that include Personal Accounts, Auto Accounts, Mortgage Accounts, Corporate
Accounts, Insurance, etc. The objective of this project was to test the client/server system and run
performance tests, take an analysis of the Green Box (Cache Box), then install the Coues Box and
analyze it. This project involved Performance, Stress, Load, and Functional testing on
the client/server system.

Responsibilities:

• Performed Load/Stress testing to predict system behavior and performance using LoadRunner.
• Performed performance testing by creating Controller scenarios.
• Developed scripts using LoadRunner by recording, placing checkpoints, performing
parameterization and correlation, and adding custom code as needed to enhance the scripts.
• Developed Vuser scripts using LoadRunner Virtual user generator for various business
processes; performing correlation, data parameterization and customization of scripts.
• Performed Load and Stress testing for various applications using LoadRunner.
• Involved in preparing test data required for running the tests and executed test cases.
• Analyzed system usage information such as task distribution diagram and load model to
create effective test scenarios
• Scheduled 1100 virtual users in a stress test scenario to simulate the peak load to identify
and isolate system performance bottlenecks.
• Configured Web/Application/Database server monitoring setup using Load Runner
Controller.
• Checked the performance of the mainframe server by running a sniff test.
• Performed SQL querying to validate the data in the back end data base, and also to check
the data flow between different modules
• Monitored the CPU, memory, and network utilizations on the Unix server using Site Scope
monitor
• Analyzed results for Transaction Response Time, Transactions under Load, Transaction
Summary by Vusers, Hits per Second and Throughput; determined the source of bottlenecks
by correlating the performance data with end-user loads and response times.
• Used Mercury Win Runner to write automated test scripts.
• Developed Automated Test scripts using Win Runner for performing Regression Testing.
• Based on user requirements, documented the test requirements in Test Director.
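The result analysis above boils down to summarizing recorded transaction response times into the figures LoadRunner's Analysis module reports. A minimal sketch of that computation follows, assuming invented sample data; average, 90th percentile, and throughput are the metrics typically correlated against Vuser load to locate bottlenecks.

```python
# Sketch of the kind of summary LoadRunner's Analysis module produces:
# given recorded transaction response times for a test window, compute
# average and 90th-percentile response time plus throughput. The
# sample data below is invented.
import statistics

def analyze(response_times_s, test_duration_s):
    """Summarize a load-test run: avg and p90 response time, throughput."""
    ordered = sorted(response_times_s)
    p90_index = int(0.9 * (len(ordered) - 1))  # nearest-rank percentile
    return {
        "avg_s": statistics.mean(ordered),
        "p90_s": ordered[p90_index],
        "throughput_tps": len(ordered) / test_duration_s,
    }

# e.g. 8 transactions completed in a 4-second measurement window
times = [0.8, 0.9, 1.0, 1.1, 1.2, 1.3, 2.0, 3.5]
result = analyze(times, test_duration_s=4.0)
print(result)
```

The p90 figure matters more than the average under load: a single slow outlier (3.5 s above) barely moves the mean but is exactly what end users notice.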
Performance Analyst March ’05 – Dec ’05
AFLAC Inc, Columbus, GA
Environment: PL/SQL, Oracle 9i, Java, J2EE, Windows 2000/NT, XML, Load Runner, Win
Runner, Test Director 7.6.

AFLAC (American Family Life Assurance Corporation of Columbus) Web Enrollment System (AWES)
is a multifunctional web-based application designed to provide administrator capabilities as well as
allow employees to enroll in products offered by AFLAC through their employer's intranet. I worked
as a Performance Analyst; the main objective of the project was Performance, Stress, Load, and
Functional testing on the client/server system. Worked on the customer personal account
application of the client.

Responsibilities
• Involved in analyzing the business requirements and system specifications
• Performed Load/Stress testing to predict system behavior and performance using
LoadRunner.
• Developed Vuser scripts using LoadRunner Virtual user generator for various business
processes.
• Performed Data parameterization in order to facilitate the Data-driven tests.
• Added Manual correlations in the script to make it more robust and to take care of the
Dynamic values (Session Id’s, Row Id’s) during replay of the script.
• Conducted Performance Testing using LoadRunner controller and generated Analytical
Performance reports.
• Defined rendezvous points to create intense load on the server, thereby measuring
server performance under heavy load.
• Involved in preparing test data required for running the tests and executed test cases.
• Interacted with developers to resolve issues regarding the application.
• Documented defects accurately and with sufficient information to enable developers to
analyze and reproduce them.
• Performed automated functional testing using WinRunner.
• Created and enhanced test scripts using WinRunner.
• Performed Data Driven testing in WinRunner using parameterization.
• Actively participated in Daily status and weekly review meetings
• Tested Workflow policies and Assignment rules

Database QA Tester Jan ’04 – Feb ’05
TTK Health Care Services Private Limited, Chennai, INDIA
Environment: Test Director, Oracle 8i & 9i, TOAD for Oracle 8.6.1.0, Business Objects 5.x,
Business Objects Designer 5.x, Business Objects 5.1.8, Business Objects 6.1.7, UNIX, & MS
OFFICE.

TTK provides insurance protection for business needs of all sizes, from the small business owner to
the large commercial operation. BOF Reports is one of the major reporting projects created at TTK;
it comprises the following major phases:

• Quote Treaty
• Technical Accounting
• Policy Exhibit
• Life Index
• Policy
• Group Scheme
• Claims
• Underwriting
• Message Management
• Actuarial
• Financial
• Stat

Responsibilities:
Handled the testing team and coordinated with both on-site and off-shore testing team members on
the following tasks for BOF Reports at TTK:

• Implemented task assignments & task distributions to QA team members.
• Performed complete business & technical interactions with Business Analysts and the Reports
development team for every release of the BOF build(s).
• Accomplished test plan design, validation & implementation based on the business
requirements & mockup documents for every BOF release & build.
• Validated each major phase of BOF Reports data & verified its calculations in every build,
based on:
o Various Data Mart table(s)
o Various DWH table(s)
• Coordinated with QA team members & managed defect-tracking tasks such as logging, re-
testing & re-assigning new/tested defects to the corresponding development team
member(s) & business team member(s).

QA Analyst JAN ’03 – Dec ’03
ICICI Bank, Bangalore, INDIA
Environment: Windows 98, Client/Server application, Oracle, UNIX, Load Runner
ICICI Bank uses a system originally called "PC Manager" (later upgraded as a product called
"EBanker") which allows ICICI commercial customers to perform many business-banking operations
directly from a PC running Windows. Manual and automated testing was done on the advanced
modules for stopping payment on checks, inquiring on the status of checks and other transactions,
and getting photocopies of checks. The system contains extensive security features. The
communications module handles movement of information between the customer's PC and the
bank's Tandem computer and communicates with the main application over the Internet.

Responsibilities:
• Involved in the creation of test plans and test cases based on functional specifications
required for manual testing of marketing campaigns (promotions).
• Determined user requirements and goals by conducting meetings.
• Designed, scheduled and executed test plans within the pre-defined timeframe.
• Performed GUI testing manually.
• Performed Functionality testing of the application.
• Conducted Backend testing on the Oracle Database using SQL.
• Regression tests were performed after every bug fix or system enhancement.
• Performed Integration testing and Configuration testing.
• Used Load Runner, for Performance, Stress testing and Load testing.
• Developed Base line scripts in order to test the future releases of the application.
• Performed Unit and System Testing.
• Participated in various meetings and discussed Enhancement and Modification Request
issues.
• Coordinated with the developers on Result and Defects Status on a regular basis.
• Executed Test Cases manually and used locally developed tool for bug and defect tracking.
• Participated in Inspection and walkthroughs.

Education:

Bachelor's in Computer Science.

References: available upon request

Email: salukutinaveen@gmail.com

Resident Status: H1B
