Project Experience
Redesigned and refactored the existing system using design patterns such as Flyweight, Template,
Factory, Strategy, DAO, and Singleton to support various patent-ingestion process flows, and tuned
existing processes for better performance using Java 8 parallel streams, lambdas, CompletableFuture,
and ConcurrentHashMap.
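The tuning techniques named above can be sketched with plain JDK code: a parallel stream fans work across cores, a CompletableFuture overlaps an independent step, and a ConcurrentHashMap collects results safely from multiple threads. This is a minimal, self-contained illustration, not the original system; `normalize` and the document strings are hypothetical stand-ins for the real patent-parsing step.

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ConcurrentHashMap;

public class ParallelIngestSketch {

    // Stand-in for the real per-document parsing/cleanup work.
    public static String normalize(String raw) {
        return raw.trim().toUpperCase();
    }

    // Parallel stream fans the documents across the common fork-join pool;
    // ConcurrentHashMap tolerates concurrent puts from the worker threads.
    public static Map<String, String> ingest(List<String> rawDocs) {
        Map<String, String> byDoc = new ConcurrentHashMap<>();
        rawDocs.parallelStream()
               .map(ParallelIngestSketch::normalize)
               .forEach(doc -> byDoc.put(doc, doc)); // keyed by the normalized doc (stand-in for an id)
        return byDoc;
    }

    public static void main(String[] args) {
        // An independent step (e.g. an audit count) can run asynchronously
        // while ingestion proceeds, then be joined at the end.
        CompletableFuture<Integer> audit = CompletableFuture.supplyAsync(() -> 42);
        Map<String, String> out = ingest(Arrays.asList(" us123 ", " ep456 "));
        System.out.println(out.size() + " docs, audit=" + audit.join());
    }
}
```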
Implemented a data-refresh process using Spark-DSE RDDs, JDBC RDDs, Spark SQL, and Scala to
transfer data from the legacy system to Cassandra, from which it can be indexed into Apache Solr. Used
the SolrJ client to push low-volume data from DSE Cassandra to Apache Solr.
POC/spike for log4j changes to speed up logging using AsyncLogger and Chronicle loggers,
improving ingestion performance.
Implemented a ring buffer/streaming queue for an async MySQL listener to improve the performance
of data tracking. Performed data modeling for Cassandra tables to support business data-view needs.
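The staging-queue pattern above can be sketched with the JDK's `ArrayBlockingQueue`, which is a fixed-capacity ring buffer: the listener blocks when the buffer is full (back-pressure) while a writer thread drains it. This is a hedged, self-contained sketch; the event strings stand in for tracking rows, and a dedicated ring-buffer library such as the LMAX Disruptor would be the higher-throughput variant.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class TrackingQueueSketch {
    private static final String POISON = "__STOP__"; // sentinel that ends the writer loop

    public static List<String> drain(List<String> events) {
        // Bounded queue: put() blocks when full, giving natural back-pressure.
        BlockingQueue<String> ring = new ArrayBlockingQueue<>(64);
        List<String> stored = new ArrayList<>();

        Thread writer = new Thread(() -> {
            try {
                for (String e; !(e = ring.take()).equals(POISON); ) {
                    stored.add(e); // stand-in for the MySQL tracking insert
                }
            } catch (InterruptedException ignored) {
                Thread.currentThread().interrupt();
            }
        });
        writer.start();

        try {
            for (String e : events) ring.put(e); // listener side
            ring.put(POISON);
            writer.join(); // join gives safe visibility of `stored`
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return stored;
    }

    public static void main(String[] args) {
        System.out.println(drain(Arrays.asList("row1", "row2", "row3")));
    }
}
```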
Implemented the business layer using the DataStax Cassandra driver and POJO Java classes, using
Java 8 (Optional, functional interfaces) for Cassandra CRUD operations.
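The Optional / functional-interface shape of such a CRUD layer can be sketched as follows. The real code would call the DataStax Cassandra driver; here an in-memory map stands in so the example is self-contained, and the `Finder` interface and user table are hypothetical.

```java
import java.util.Map;
import java.util.Optional;
import java.util.concurrent.ConcurrentHashMap;

public class CrudSketch {

    // The lookup is a functional interface, so the backing store is injectable
    // (a Cassandra-driver-based implementation in production, a map in tests).
    @FunctionalInterface
    interface Finder<K, V> {
        Optional<V> findById(K id);
    }

    static final Map<String, String> TABLE = new ConcurrentHashMap<>();
    static { TABLE.put("u1", "alice"); }

    static final Finder<String, String> USERS =
            id -> Optional.ofNullable(TABLE.get(id));

    // Optional makes the "row missing" case explicit instead of null checks.
    public static String displayName(String id) {
        return USERS.findById(id)
                    .map(String::toUpperCase)
                    .orElse("UNKNOWN");
    }

    public static void main(String[] args) {
        System.out.println(displayName("u1") + " / " + displayName("u2")); // prints: ALICE / UNKNOWN
    }
}
```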
Experienced in handling large datasets during ingestion using partitions, Spark in-memory capabilities,
broadcasts, and effective, efficient joins and transformations.
Implemented parallel processing of back files using the Java 8 work-stealing pool executor,
CountDownLatch, and Semaphore, with Cassandra driver async writes/reads, and TIFF image parsing
using Tika Parser and Apache Imaging.
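The coordination pattern named above can be sketched with the JDK alone: a work-stealing pool runs the per-file tasks, a Semaphore caps how many touch the store at once (standing in for throttling async Cassandra writes), and a CountDownLatch lets the caller wait for completion. The file names and the counter are hypothetical stand-ins.

```java
import java.util.Arrays;
import java.util.List;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Semaphore;
import java.util.concurrent.atomic.AtomicInteger;

public class BackfileSketch {

    public static int process(List<String> files) {
        ExecutorService pool = Executors.newWorkStealingPool();
        Semaphore writePermits = new Semaphore(4);          // at most 4 in-flight "writes"
        CountDownLatch done = new CountDownLatch(files.size());
        AtomicInteger written = new AtomicInteger();

        for (String f : files) {
            pool.execute(() -> {
                try {
                    writePermits.acquire();                 // throttle the store
                    written.incrementAndGet();              // stand-in for an async write
                    writePermits.release();
                } catch (InterruptedException ie) {
                    Thread.currentThread().interrupt();
                } finally {
                    done.countDown();                       // always signal completion
                }
            });
        }
        try {
            done.await();                                   // wait for every file
        } catch (InterruptedException ie) {
            Thread.currentThread().interrupt();
        }
        pool.shutdown();
        return written.get();
    }

    public static void main(String[] args) {
        System.out.println(process(Arrays.asList("a.tif", "b.tif", "c.tif")) + " files written");
    }
}
```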
Analyzed Cassandra tombstones to provide a solution and a protocol for following the data-repair pattern.
Implemented indexing changes using Spark RDDs and Scala to push data into Apache Solr. Used Spark
SQL in the Spark shell to create ingestion-data reports for users.
Developed Cassandra table joins using Spark's Cassandra RDDs, and used accumulators/broadcasts
to capture data-indexing failures for final reports.
Used Spring Boot, Spring Cloud, Zuul, Eureka, and REST to create microservices for reports and audits.
Implemented Spring REST microservices to support image locations, names, and "No Image Found" responses.
Used Spring IoC, caching, transactions, MVC, and JPA/Hibernate to support API and UI business
requirements. Developed the core search component using Apache Solr for staging views.
Performed procedures such as text analytics and processing using the in-memory computing capabilities
of DSE Spark.
Implemented performance tuning for Cassandra using cassandra.yaml and the DSE Cassandra driver.
Used the Akka framework (actors/routers) for maintenance-file creation to support high-volume data.
Used the MVC framework AngularJS for data binding and consuming RESTful web services.
Used Apache Kafka with the DataStax connector sink to store real-time classifications in Cassandra.
POC/spike for Spring Apache Kafka integration to provide ingestion logs to the AdminOps UI.
Technical Environment: Java 8, DSE Cassandra 2.x/3.x, Apache Solr 5.x, AngularJS, Oracle 12c,
Spark 1.6, Scala, ZooKeeper, Hibernate 4.0, JPA 2.1, Spring Boot/Cloud/Eureka/Zuul/Ribbon, JAX-RS,
Oracle, MySQL, Rally, JBoss, Jenkins, Maven, Ubuntu, XPath, XML, VTD/DOM parser, SVN, Apache
Kafka 2.10, PowerMock.
Client Name Delta Dental of Michigan. Mar 2013 - May 2015.
Location Lansing, MI.
Domain Dental Health Care.
Project: CESR (Claim Engine Support and Re-Design) is the backbone of dental claim
processing for US end users and also for non-US providers. Since billing and payments are tied to
claim approvals, the current system will be refactored, with the approach focused on supporting and
processing claims.
Role in project
Redesigned and refactored the existing system using design patterns like Chain of Responsibility (COR) and Service Locator.
Developed core Java objects such as Spring DAOs, handlers, and controllers for the Pricing Claims module.
Designed and configured business transactions using the Spring Framework transaction API.
Extensively used IoC (for inheritance and interfaces via lookup injection) with autowiring and the AOP
concepts of the Spring Framework during development, mapped it to the Hibernate configuration file,
and established data integrity across all tables.
Implemented the business layer using Hibernate, Spring, and POJO Java classes with Hibernate
mapping annotations and Hibernate Criteria.
Implemented Jersey RESTful web services for the search-claim operation for existing clients.
Used EJB 3.0 session beans and MDBs for the claims batch application, and Oracle ESB for data exchange.
Implemented SOAP web services to support Manual Pricing WS operation.
Designed a new mocking framework using the core PowerMock jar, and was involved in a POC for MongoDB.
Mentored junior and mid-level developers in following TDD and created the core framework for it. Used
stored procedures/triggers, PL/SQL, and Oracle packages to crunch data and maintain history for
multiple tables on Oracle 11g.
Used Core Spring Batch Framework for claims batch processing.
Used Maven for building the project, and handled QA/production support for this project.
Involved in bug fixing for functionality and design issues with JavaScript and jQuery.
Technical Environment: Java 7, JMS, MongoDB, WLI, Spring 3.x, Hibernate 3.5, EJB 3.0, JAX-WS,
JSF, Toad 10.x, JVisualVM, Mingle, VisualGC, MAT, Oracle 11g, VersionOne, ILOG Rule Engine, Git,
jQuery, WebLogic 11g, Jenkins, Selenium 3.x, Maven, Unix, AccuRev, PowerMock, SOAPUI 4.5,
Spring Batch, JUnit 4.0, JAXB, XSL, DTD, TopLink.
Client Name Think Or Swim. Feb 2012 - Feb 2013.
Location Chicago, IL.
Domain Online Trading.
Project: Core TOS (TOS Support Management) is the backbone of the Think or Swim technical
framework, which processes all transactions submitted by end clients and, on approval, forwards them
to clearance systems. The system is designed to handle cancellation/reorder for complex
options/futures transactions.
Role in the project:
Installed/configured and was responsible for setting up the environment for the core Spring controller.
Used J2EE Design patterns like VO, Builder, Prototype, and Command across the application.
Coded the message sender to publish messages and a Spring JMS POJO to consume them.
Developed core Java 5 code (multithreading, collections, beans) in conjunction with JDBC to get data
from multiple account and allocation systems and store it in Oracle. Used SQuirreL SQL for creating
stored procedures, functions, views, etc.
Developed multithreaded listener logic to process client options/futures trade files using
BufferedReader, Properties, and ClassLoader.
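The parsing step of such a listener can be sketched with a plain `BufferedReader` walking a trade file line by line. This is a self-contained sketch; the pipe-delimited layout (symbol|side|qty) is a hypothetical stand-in for the real file format, and a `StringReader` stands in for the file.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;

public class TradeFileSketch {

    // Reads one trade record per line and splits it into fields.
    public static List<String[]> parse(BufferedReader in) {
        List<String[]> trades = new ArrayList<>();
        try {
            for (String line; (line = in.readLine()) != null; ) {
                if (!line.isEmpty()) trades.add(line.split("\\|"));
            }
        } catch (IOException e) {
            throw new RuntimeException(e); // sketch: surface I/O errors unchecked
        }
        return trades;
    }

    public static void main(String[] args) {
        BufferedReader in = new BufferedReader(
                new StringReader("AAPL|BUY|100\nSPY|SELL|50\n"));
        List<String[]> trades = parse(in);
        System.out.println(trades.size() + " trades, first=" + trades.get(0)[0]);
    }
}
```

In the real listener, each parsed file would be handed to a worker thread; the parsing itself stays single-threaded per file.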
Refactored existing account-transfer processing/derivatives (for paperMoney mode) to enable
multithreaded operation, and used Spring transactions.
Used JAX-WS web services for bookkeeping requests.
Loaded raw/market data into the DB (for integration testing) to integrate TOS with TD Ameritrade for
equities, exchange-traded funds, futures, mutual funds, and bonds.
Developed stored procedures, triggers, and functions to process trades in Oracle, mapped them to the
Hibernate configuration file, and established data integrity across all tables.
Technical Environment: Java 6, Struts 2.0, Spring 2.x, Hibernate 3.5, JBoss, ClearCase, Mockito, JMS,
MAT, JConsole, Maven, WebSphere 7.x, Oracle 10g, DB2 9.x, Scala.
Client Name JPMorgan Chase (Bear Stearns). Sep 2009 - Jan 2012.
Location Whippany, NJ.