
Rohan Surve 240-485-8895

J2EE/Spark Cassandra Solr rohansurve26@gmail.com


Summary of Professional Experience
Master's degree in Computer Engineering with around 12 years of experience in full-cycle software
projects, including Internet and intranet applications.
Expertise in frameworks and tools such as Struts, Spring, Hibernate, TopLink, iBATIS, Ant, Maven,
and Cactus. Strong experience with various web/application servers, including IBM WebSphere
6.0/5.1/4.0, BEA WebLogic 11g/10.3/8.1, JBoss, and Apache Tomcat 7.0/6.0/5.0/4.0.
Handled the requirements analysis, design, and development of a mid-size development project
with the team at JPMC and an offshore team.
Experience in writing SQL queries, functions, and stored procedures in Oracle 11g/10g.
Extensive experience with J2EE design patterns and technologies such as JAX-WS/JAX-RS and EJB.
Solid domain knowledge in equities, online trading, fixed income/futures,
healthcare, and telecom. Excellent analytical skills in defect fixing and solving complex problems.
Self-starter: highly motivated, technically sound, with training and mentoring skills for driving development
teams to successful resolution. Received in-person appreciation from the Federal CIO at USPTO.

Project Experience

Client Name USPTO. May 2015 to present.


Location Fairfax, VA.
Domain Federal/Search Engine.
Project: The EST-Search Application (data ingestion/search UI/API) is the backbone of the EST
search engine at USPTO. Data ingestion is the single source that ingests backend data from various
systems into Cassandra, and from Cassandra into Apache Solr using Apache Spark.
Role in project

Redesigned and refactored the existing system using design patterns such as Flyweight, Template,
Factory, Strategy, DAO, and Singleton to support the various patent ingestion process flows, and tuned
the existing process for better performance using Java 8 parallel streams, lambdas, CompletableFuture,
and ConcurrentHashMap.
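
A minimal sketch of this kind of Java 8 fan-out; the class, method, and input names are hypothetical, not from the project:

```java
import java.util.List;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class ParallelIngest {
    private static final ExecutorService POOL = Executors.newFixedThreadPool(8);

    // Fan each patent out to the pool; the returned future completes when all finish.
    static CompletableFuture<Void> ingestAsync(List<String> patentIds) {
        CompletableFuture<?>[] futures = patentIds.stream()
                .map(id -> CompletableFuture.runAsync(() -> ingest(id), POOL))
                .toArray(CompletableFuture<?>[]::new);
        return CompletableFuture.allOf(futures);
    }

    static void ingest(String patentId) {
        // parse, transform, and write one record (omitted)
    }
}
```
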
Implemented the data refresh process using Spark DSE RDDs, JDBC RDDs, Spark SQL, and Scala to
transfer data from the legacy system to Cassandra, from which it could be indexed into Apache Solr. Used the SolrJ
client to push low-volume data from DSE Cassandra to Apache Solr.
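
A minimal SolrJ push along those lines, assuming a local Solr core named patents and illustrative field names:

```java
import org.apache.solr.client.solrj.SolrClient;
import org.apache.solr.client.solrj.impl.HttpSolrClient;
import org.apache.solr.common.SolrInputDocument;

public class SolrPush {
    public static void main(String[] args) throws Exception {
        // SolrJ 5.x-style client; URL, core name, and fields are placeholders
        SolrClient solr = new HttpSolrClient("http://localhost:8983/solr/patents");
        SolrInputDocument doc = new SolrInputDocument();
        doc.addField("id", "US1234567");
        doc.addField("title_t", "Example patent title");
        solr.add(doc);   // send the document to Solr
        solr.commit();   // make it searchable
        solr.close();
    }
}
```
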
POC/spike for Log4j changes to make the logging process faster using the async logger and Chronicle
logger, increasing ingestion performance.
Implemented a ring buffer/streaming queue for an async MySQL listener to increase the performance of
data tracking. Performed data modeling for Cassandra tables to support business data view needs.
Implemented the business layer using the DataStax Cassandra driver and POJO Java classes, with
Java 8 features (Optional, functional interfaces) for Cassandra CRUD operations.
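
A small sketch of that style of DAO against the DataStax Java driver, with Optional wrapping a possibly-missing row; the keyspace, table, and column names are assumptions:

```java
import com.datastax.driver.core.Cluster;
import com.datastax.driver.core.ResultSet;
import com.datastax.driver.core.Session;
import java.util.Optional;

public class PatentDao {
    private final Session session;

    public PatentDao(String contactPoint) {
        // keyspace/table/column names below are illustrative only
        Cluster cluster = Cluster.builder().addContactPoint(contactPoint).build();
        this.session = cluster.connect("patents_ks");
    }

    // Wrap a possibly-missing row in Optional instead of returning null.
    public Optional<String> findTitle(String patentId) {
        ResultSet rs = session.execute("SELECT title FROM patent WHERE id = ?", patentId);
        return Optional.ofNullable(rs.one()).map(row -> row.getString("title"));
    }
}
```
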
Experienced in handling large datasets during the ingestion process itself, using partitioning, Spark's
in-memory capabilities, Spark broadcasts, and effective, efficient joins and transformations.
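
One common Spark technique behind such tuning is a map-side join via a broadcast variable; the sketch below uses the Spark Java API with illustrative paths and reference data:

```java
import java.util.HashMap;
import java.util.Map;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.broadcast.Broadcast;
import scala.Tuple2;

public class BroadcastJoin {
    public static void main(String[] args) {
        JavaSparkContext sc = new JavaSparkContext(new SparkConf().setAppName("broadcast-join"));

        // Broadcast a small lookup table instead of shuffling the large RDD (map-side join).
        Map<String, String> lookup = new HashMap<>();
        lookup.put("A61B", "Medical devices");            // illustrative reference data
        Broadcast<Map<String, String>> bc = sc.broadcast(lookup);

        JavaPairRDD<String, String> patents = sc
                .textFile("hdfs:///patents.csv")          // illustrative path
                .mapToPair(line -> {
                    String[] f = line.split(",");
                    return new Tuple2<>(f[0], f[1]);      // (classCode, patentId)
                });

        patents.map(t -> t._2() + " -> " + bc.value().getOrDefault(t._1(), "unknown"))
               .saveAsTextFile("hdfs:///joined");
        sc.stop();
    }
}
```
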
Implemented parallel back-file processing using Java 8's work-stealing pool executor, CountDownLatch,
and Semaphore together with Cassandra driver async writes/reads, and parsed TIFF images using the
Tika parser and Apache Commons Imaging.
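
A compact sketch of that bounded-parallelism pattern; the pool size, permit count, and processing stub are assumptions:

```java
import java.util.List;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Semaphore;

public class BackfileProcessor {
    public static void process(List<String> files) throws InterruptedException {
        ExecutorService pool = Executors.newWorkStealingPool();
        Semaphore inFlight = new Semaphore(16);            // cap concurrent async writes
        CountDownLatch done = new CountDownLatch(files.size());

        for (String file : files) {
            pool.execute(() -> {
                try {
                    inFlight.acquire();
                    try {
                        // parse TIFF and issue async Cassandra writes (omitted)
                    } finally {
                        inFlight.release();
                    }
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                } finally {
                    done.countDown();
                }
            });
        }
        done.await();                                      // block until every file is handled
        pool.shutdown();
    }
}
```
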
Analyzed Cassandra tombstones to provide a solution and a protocol for following the data repair pattern.
Implemented indexing changes using Spark RDDs and Scala to push data into Apache Solr. Used Spark
SQL in the Spark shell to create ad hoc reports on ingestion data for users.
Developed Cassandra table joins using Spark's Cassandra RDDs, and used accumulators/broadcasts
to capture data indexing failures for the final reports.
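
A sketch of failure counting with a Spark 1.x accumulator; the indexing helper and input path are hypothetical:

```java
import org.apache.spark.Accumulator;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class FailureCounter {
    public static void main(String[] args) {
        JavaSparkContext sc = new JavaSparkContext(new SparkConf().setAppName("index"));
        Accumulator<Integer> failures = sc.accumulator(0);

        sc.textFile("hdfs:///to-index").foreach(line -> {
            try {
                indexIntoSolr(line);          // hypothetical helper
            } catch (Exception e) {
                failures.add(1);              // executors add; only the driver reads the value
            }
        });
        System.out.println("Indexing failures: " + failures.value());
        sc.stop();
    }

    static void indexIntoSolr(String line) { /* omitted */ }
}
```
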
Used Spring Boot, Spring Cloud, Zuul, Eureka, and REST to create microservices for reports and audits.
Implemented Spring REST microservices to support image locations, image names, and "No Image Found" responses.
Used Spring IoC, caching, transactions, MVC, and JPA/Hibernate to support API and UI business
requirements. Developed the core search component using Apache Solr for staging views.
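
A minimal Spring Boot REST microservice of the sort described above; the endpoint path and payload are placeholders, not the project's actual contract:

```java
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;

@SpringBootApplication
@RestController
public class ReportService {

    @GetMapping("/reports/{id}")
    public String report(@PathVariable String id) {
        return "report " + id;   // a real service would return an audit/report payload
    }

    public static void main(String[] args) {
        SpringApplication.run(ReportService.class, args);
    }
}
```
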
Performed procedures such as text analytics and processing using the in-memory computing capabilities
of DSE Spark.
Implemented performance tuning for Cassandra via cassandra.yaml and the DSE Cassandra driver.
Used the Akka framework (actors/routers) for maintenance file creation to support high-volume data.
Used the MVC framework AngularJS for data binding and to consume RESTful web services.
Used Apache Kafka with the DataStax connector sink to store real-time classifications in Cassandra.
POC/spike for Spring for Apache Kafka integration to provide ingestion logs to the AdminOps UI.
Technical Environment: Java 8, DSE Cassandra 2.x/3.x, Apache Solr 5.x, AngularJS, Oracle 12c,
Spark 1.6, Scala, ZooKeeper, Hibernate 4.0, JPA 2.1, Spring Boot/Cloud/Eureka/Zuul/Ribbon, JAX-RS,
Oracle, MySQL, Rally, JBoss, Jenkins, Maven, Ubuntu, XPath, XML, VTD/DOM parsers, SVN, Apache
Kafka 2.10, PowerMock.
Client Name Delta Dental of Michigan. Mar 2013 to May 2015.
Location Lansing, MI.
Domain Dental Health Care.
Project: CESR (Claim Engine Support and Re-Design) is the backbone of dental claim
processing for US end users as well as non-US providers. Since billing and payments depend on
claim approvals, the current system is being refactored with an approach that supports and processes claims.
Role in project
Redesigned and refactored the existing system using design patterns such as Chain of Responsibility and Service Locator.
Developed core Java objects such as Spring DAOs, handlers, and controllers for the pricing claims module.
Designed and configured business transactions using the Spring Framework transaction API.
Extensively used Spring IoC (including lookup-method injection for inheritance and interfaces) with
autowiring, along with Spring AOP concepts, as part of development; mapped entities in the Hibernate
configuration file and established data integrity among all tables.
Implemented the business layer using Hibernate, Spring, and POJO Java classes with Hibernate
mapping annotations and Hibernate Criteria.
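
A small sketch of a Hibernate Criteria query in that style; the Claim entity and its properties are stand-ins for the real mapped entities:

```java
import java.util.List;
import org.hibernate.Criteria;
import org.hibernate.Session;
import org.hibernate.criterion.Restrictions;

// Minimal stand-in for a mapped entity; the real mapping uses annotations/hbm.xml.
class Claim {
    private String status;
    public String getStatus() { return status; }
    public void setStatus(String s) { this.status = s; }
}

public class ClaimDao {
    private final Session session;

    public ClaimDao(Session session) { this.session = session; }

    @SuppressWarnings("unchecked")
    public List<Claim> findByStatus(String status) {
        Criteria c = session.createCriteria(Claim.class)
                .add(Restrictions.eq("status", status))   // WHERE status = :status
                .setMaxResults(100);
        return c.list();
    }
}
```
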
Implemented Jersey RESTful web services for the search-claim operation for existing clients.
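
A minimal JAX-RS resource of the kind Jersey would serve; the path and response shape are illustrative only:

```java
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.PathParam;
import javax.ws.rs.Produces;
import javax.ws.rs.core.MediaType;

@Path("/claims")
public class ClaimResource {

    @GET
    @Path("/{claimId}")
    @Produces(MediaType.APPLICATION_JSON)
    public String searchClaim(@PathParam("claimId") String claimId) {
        // a real implementation would delegate to the claim-search service
        return "{\"claimId\":\"" + claimId + "\",\"status\":\"APPROVED\"}";
    }
}
```
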
Used EJB 3.0 session beans and MDBs for the claims batch application, and Oracle ESB for data exchange.
Implemented SOAP web services to support the manual pricing WS operation.
Designed a new mocking framework using the core PowerMock jar, and was involved in a POC for MongoDB.
Mentored junior and mid-level developers in following TDD and created the core framework for it. Used
stored procedures/triggers, PL/SQL, and Oracle packages to crunch data and maintain history for
multiple tables on Oracle 11g.
Used the core Spring Batch framework for claims batch processing.
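
A sketch of a single Spring Batch processing step; the claim types are minimal stand-ins:

```java
import org.springframework.batch.item.ItemProcessor;

// Claim/PricedClaim are stand-ins; the real entities live elsewhere.
class Claim { boolean eligible; }
class PricedClaim { PricedClaim(Claim c) { /* apply pricing rules (omitted) */ } }

public class ClaimPricingProcessor implements ItemProcessor<Claim, PricedClaim> {

    @Override
    public PricedClaim process(Claim claim) {
        if (!claim.eligible) {
            return null;               // returning null filters the item out of the step
        }
        return new PricedClaim(claim);
    }
}
```
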
Used Maven for building the project, and handled QA/production support for the project.
Involved in fixing functionality and design bugs using JavaScript and jQuery.
Technical Environment: Java 7, JMS, MongoDB, WLI, Spring 3.x, Hibernate 3.5, EJB 3.0, JAX-WS,
JSF, Toad 10.x, JVisualVM, Mingle, VisualGC, MAT, Oracle 11g, VersionOne, ILOG Rule Engine, Git,
jQuery, WebLogic 11g, Jenkins, Selenium 3.x, Maven, Unix, AccuRev, PowerMock, SOAPUI 4.5,
Spring Batch, JUnit 4.0, JAXB, XSL, DTD, TopLink.
Client Name Think Or Swim. Feb 2012 to Feb 2013.
Location Chicago, IL.
Domain Online Trading.
Project: Core TOS (TOS Support Management) is the backbone of the Think or Swim technical
framework, which processes all transactions submitted by end clients and, upon approval, passes them to
clearance systems. The system is designed to handle cancellation/reorder for complex options/futures transactions.
Role in the Project:
Installed, configured, and was responsible for setting up the environment for the core Spring controller.
Used J2EE design patterns such as Value Object, Builder, Prototype, and Command across the application.
Coded the message sender to publish messages and a Spring JMS POJO to consume them.
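
A minimal Spring JMS sender along those lines; the destination name and payload are placeholders:

```java
import org.springframework.jms.core.JmsTemplate;

public class TradeMessageSender {
    private final JmsTemplate jmsTemplate;

    public TradeMessageSender(JmsTemplate jmsTemplate) {
        this.jmsTemplate = jmsTemplate;
    }

    public void publish(String tradeEvent) {
        // convertAndSend serializes the payload and sends it to the named queue
        jmsTemplate.convertAndSend("trade.events", tradeEvent);
    }
}
```
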
Developed core Java 5 code (multithreading, collections, beans) in conjunction with JDBC to pull data from
multiple account and allocation systems and store it in Oracle. Used SQuirreL SQL for creating stored
procedures, functions, views, etc.
Developed multithreaded listener logic to process client options/futures trade files using BufferedReader,
Properties, and class loaders.
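
A sketch of that listener pattern in the Java 5/6 idiom of the project; the one-trade-per-line layout and the pool size are assumptions:

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class TradeFileListener {
    private static final ExecutorService POOL = Executors.newFixedThreadPool(4);

    public static void submit(final String file) {
        POOL.execute(new Runnable() {              // Java 5/6-era anonymous class
            public void run() {
                BufferedReader reader = null;
                try {
                    reader = new BufferedReader(new FileReader(file));
                    String line;
                    while ((line = reader.readLine()) != null) {
                        parseTrade(line);          // one trade per line (assumed)
                    }
                } catch (IOException e) {
                    e.printStackTrace();           // real code would use the error flow
                } finally {
                    try { if (reader != null) reader.close(); } catch (IOException ignored) { }
                }
            }
        });
    }

    static void parseTrade(String line) { /* omitted */ }
}
```
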
Refactored the existing account transfer/derivatives processing (for paper money mode) to enable
multithreaded operation, and used Spring transactions.
Used JAX-WS web services for bookkeeping requests.
Loaded raw/market data into the database (for integration testing) to integrate TOS with TD Ameritrade for
equities, exchange-traded funds, futures, mutual funds, and bonds.
Developed stored procedures, triggers, and functions to process the trades in Oracle, mapped them
in the Hibernate configuration file, and established data integrity among all tables.
Technical Environment: Java 6, Struts 2.0, Spring 2.x, Hibernate 3.5, JBoss, ClearCase, Mockito, JMS,
MAT, JConsole, Maven, WebSphere 7.x, Oracle 10g, DB2 9.x, Scala.

Client Name JPMorgan Chase (Bear Stearns). Sep 2009 to Jan 2012.
Location Whippany, NJ.

Domain Fixed Income and Futures.


Project: Geneva Integration Fabric, Fixed Income & Geneva RPF
(Task Orchestration) is a JPMorgan technical framework that integrates
existing systems and data with the Advent Geneva system. Various business rules, such as cover
short/revenue/GAI/long for fixed income (maturity, payment in kind, gross amount income,
coupon processing, gross dividend, exemption, default, corporate bonds, ABS/MBS bonds,
government hybrid bonds), are implemented along with equities and futures.
Project: Geneva - Equities Account Cross (EAC). Requests for brand-new accounts
can be entered anytime during the week but are processed only at EOD Friday. The system handles new
activities/trades that are already coming in but would otherwise be lost or not processed because the new
account is not yet defined in the Geneva database, and supports pricing market changes and stock-split transactions (TBD).
Role in the Project:
Created use case, class, and sequence diagrams using Rational Rose for maturity, money market
fund, and bond redemption processing. Used various design patterns such as DAO, Transformer, Chain
of Responsibility, Builder, Singleton, and Data Mapper across the application.
Designed and developed the core engine for handling business logic using Spring bean wiring, with caching
strategies such as Ehcache for instrument maturity date/coupon date and currency.
Developed the data extractor, data loader, and batch status handler for Geneva RPF to process default
bond transactions and asset-backed/mortgage-backed securities products. Refactored the
existing equities BOD/EOD file processing to enable multithreaded operation.
Refactored the existing account transfer/derivatives processing (for paper money mode) to enable
multithreaded operation using a thread pool and ExecutorService.
Developed a multithreaded process for Geneva RPF using the concurrency package for default bond/money
market fund/municipal bond fixed-rate coupon/convertible bond processing, fixing a double-checked locking (DCL) design.
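
For reference, the standard fix for a broken double-checked-locking design is a volatile field, valid under the Java 5+ memory model; the held data here is hypothetical:

```java
public class RateTableHolder {
    private static volatile RateTableHolder instance;

    private RateTableHolder() {
        // load coupon/rate reference data (omitted)
    }

    public static RateTableHolder getInstance() {
        RateTableHolder local = instance;      // single volatile read on the fast path
        if (local == null) {
            synchronized (RateTableHolder.class) {
                local = instance;
                if (local == null) {
                    instance = local = new RateTableHolder();
                }
            }
        }
        return local;
    }
}
```

Without the volatile keyword, a second thread can observe a partially constructed instance, which is why the pre-Java-5 DCL idiom was broken.
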
Implemented business rules (Spring AOP) for maturity, payment in kind, coupon interest payment,
gross amount dividend, and spot FX processing to generate Geneva-loadable data. Developed
multithreaded Spring JMS listener logic to process notifications for cross-currency reporting using
Spring listeners and Thread classes.
Set up various financial instruments in Geneva (default bonds/futures/bonds/mutual funds/ABS)
for QA and pre-QA testing.
Worked on bug fixing for Geneva futures and equities integration (production support) using
ConcurrentHashMap and PriorityQueue. Used a user interface for monitoring and invoking batch
uploads, and created listeners for money market, paydown, maturity processing, spot FX,
bookkeeping transactions, and the Geneva market data format.
Utilized the Collections framework extensively, along with a data access architecture, to develop the data
component, which persists data in Sybase. Used Korn shell scripts to start/stop the Geneva components.
Technical Environment: Java 1.5, JMX, iBATIS 2.3, Spring 3.x, JMS, Tibco, Advent Geneva 7.6,
ClearCase, Mercury Quality Center 9.0, Hibernate 3.x, MQ 6.x, Oracle 10g, Sybase 15.5.
Client Name Verizon Business. June 2007 to Aug 2009.
Location Boston, MA.
Domain Telecom.
Project: International PIP and EVPL GOPA (IOrder) provides the sales and
billing user communities with the capability to order products, automating the ordering process to reduce cost.
Role in the Projects:
Involved in the analysis, design, and development phases of the software development lifecycle (SDLC, JAD).
Used message queues to connect to legacy systems and to communicate between BPM and ESB.
Used the Castor XML data binding framework for marshalling and unmarshalling data to and from XML,
and created Jet scripts as a retry mechanism to avoid urgent patches supporting production drops.
Developed CRUD workflow templates using Spring, HSQL scripts, and Hibernate configuration.
Implemented web services for the PIP portal to interact with the ILOG Rules workflow.
Developed a web-based interface through which clients can update their personal information,
built with HTML, CSS, and MS FrontPage.
Developed message-driven beans (MDBs) for workflow templates to consume messages posted by the
presentation tier, and established new routing rules (ILOG Rules) using the routing engine for users in
different environments. Used Unix K-shell scripting for bulk order processing.
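
A minimal EJB 2.x-style MDB skeleton of that shape; the routing hand-off is stubbed out:

```java
import javax.ejb.EJBException;
import javax.ejb.MessageDrivenBean;
import javax.ejb.MessageDrivenContext;
import javax.jms.Message;
import javax.jms.MessageListener;
import javax.jms.TextMessage;

public class OrderWorkflowMDB implements MessageDrivenBean, MessageListener {
    private MessageDrivenContext ctx;

    public void setMessageDrivenContext(MessageDrivenContext ctx) { this.ctx = ctx; }
    public void ejbCreate() { }     // required by the EJB 2.x spec, even though empty
    public void ejbRemove() { }

    public void onMessage(Message message) {
        try {
            if (message instanceof TextMessage) {
                String order = ((TextMessage) message).getText();
                // hand the order off to the routing engine (omitted)
            }
        } catch (Exception e) {
            throw new EJBException(e);  // let the container handle redelivery/rollback
        }
    }
}
```
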
Used XSLT to generate HTML reports from backend system responses. Used CVS for version control.
Technical Environment: XMLSpy, LDAP, Castor, Struts, Spring 2.5, JavaScript, Ajax, JSP, EJB 2.0,
Toad 9.x, WLI 7.0, JMS, Hibernate, Tibco, MyEclipse 6.x, web services, ILOG, Oracle 9.0, WebLogic 10.3,
MQSeries 5.x.
Client Name Reliance Info Solutions Pvt. Ltd. May 2004 to Jan 2006.
Location Mumbai, India.
Domain Telecom and Finance.
Project: TRAP (Trade Reporting & Approval Workflow) captures critical fixed-income portfolio
data and generates reports for the front office (traders and portfolio managers) and the middle office.
Supports current products such as Treasury bonds, strip bonds, and municipal bonds.
Role in project
Developed the user interface on the Jakarta Struts framework using JSP, JavaScript, and HTML.
Developed Struts Action classes that route submissions via a Business Delegate to the appropriate
business components and render the retrieved information.
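
A sketch of a Struts 1.x Action in that style; the delegate call and forward name are illustrative:

```java
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import org.apache.struts.action.Action;
import org.apache.struts.action.ActionForm;
import org.apache.struts.action.ActionForward;
import org.apache.struts.action.ActionMapping;

public class TradeReportAction extends Action {

    public ActionForward execute(ActionMapping mapping, ActionForm form,
                                 HttpServletRequest request, HttpServletResponse response)
            throws Exception {
        // route the submission through a business delegate (stubbed)
        request.setAttribute("report", "trade report data");
        return mapping.findForward("success");   // resolves to a JSP in struts-config.xml
    }
}
```
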
Developed session beans to perform user authentication. Prepared program specs and test plans for
various features such as logistics. Used a user interface for monitoring/invoking batch uploads.
Utilized the Collections framework extensively to implement suitable data structures and algorithms, and
used a data access architecture to develop the component that persists data in Sybase.
Technical Environment: Unix, Java, MQSeries, Struts 1.2, Express framework, DB2 7.2, Toad 7.x,
Servlets, JSP, JDBC, JMS, TopLink, HTML, JBoss, JavaScript, Sybase 12.5, Tomcat 4.0, EJB, SAX.
Academic Qualification & Certification

Degree School Year


Master's in Computer Science. SUNY New Paltz, NY. May 2007.

Bachelor of Engineering (Computer Science). University of Mumbai, India. May 2004.

Certificate School Year


Functional Programming Principles in Scala. Coursera 2014.
