MOPURU BABU
Mobile: +1-321-210-3823
Email: babu.m@sunshineconsulting.co
EXPERIENCE SUMMARY
Over 9 years of professional experience in software development in Java technologies, with 2 years
in Hadoop development using Hadoop, Hive, HBase, MapReduce, Pig, Sqoop, Oozie, shell scripting,
YARN, Scala and Spark, including the design, development and deployment of n-tier, enterprise-level
distributed applications.
• Working as a Hadoop Developer for 3 years on various Hadoop platforms such as Cloudera and
Hortonworks, with over six years of experience in software development on Java and Spring
frameworks (Spring IoC/Core, Spring DAO support, Spring ORM, Spring AOP, Spring Security,
Spring MVC, Spring Cache and Spring Integration).
• Expert knowledge of J2EE design patterns such as MVC architecture, Front Controller, Session
Facade, Business Delegate and Data Access Object for building J2EE applications.
• Designed and developed several multi-tier web-based, client-server and multithreaded
applications using Object-Oriented Analysis and Design concepts and Service-Oriented
Architecture (SOA), mostly in cross-platform environments.
• Excellent working knowledge of popular frameworks like Struts, Hibernate, and Spring MVC.
• Developed core modules in large cross-platform applications using Java, J2EE, Spring, Struts,
Hibernate, JAX-WS (SOAP)/JAX-RS (REST) web services and JMS.
• Good understanding of HDFS Designs, Daemons, federation and HDFS high availability (HA).
• Experience in developing Hive and Pig scripts.
• Extensive knowledge of creating Hive tables with different file compression formats.
• Hands-on experience in installing, configuring and administering Hadoop cluster components
such as MapReduce, HDFS, HBase, Hive, Sqoop, Spark, Pig, ZooKeeper, Oozie and Flume using
the Apache code base.
• Experience in managing scalable Hadoop clusters, including cluster design, provisioning,
custom configuration, monitoring and maintenance, using the Cloudera CDH distribution.
• Good experience using Apache SPARK.
• Worked on a prototype Apache Spark Streaming project, converting an existing Java Storm
topology.
• Experience managing Cloudera distribution of Hadoop(Cloudera Manager).
• Excellent understanding of NoSQL databases like HBase.
• Extensive working knowledge of setting up and running clusters, monitoring, data analytics,
sentiment analysis, predictive analysis and data presentation in the big data world.
• Hands-on experience working with structured and unstructured data in various file formats such
as XML, JSON and sequence files using MapReduce programs.
• Extensive experience writing SQL queries using HiveQL to perform analytics on structured
data.
• Expertise in Data load management, importing & exporting data using SQOOP & FLUME.
• Performed different Pig operations and transformations to join, clean, aggregate and analyze
data.
• Excellent interpersonal and communication skills, creative, research-minded, technically
competent and result-oriented with problem solving and leadership skills.
• Expert in database and RDBMS concepts using MS SQL Server and Oracle 10g.
• Expertise in working with web development technologies such as HTML, CSS, and JavaScript.
• Expertise in working with different methodologies like Waterfall and Agile.
• Proficient in using databases such as MySQL, MS SQL Server and DB2.
SOFTWARE SKILLS
Elements Particulars
Primary Skills Analysis, Design, Development, Implementation, Testing & Packaging.
Languages Java
Big Data Skills Hadoop, Map Reduce, Hive, Pig, Sqoop, Oozie, Scala, Spark
RDBMS MS SQL server 2000/2005, DB2, Oracle 8.x/9i/10g
NoSQL HBase, MarkLogic
Internet Technology JSP, HTML, XML & CSS
Scripting Language JavaScript, JSON, AngularJS
Application Server WebSphere 6.0, WebLogic 8.1, JBoss 5.0
Web Server Tomcat 5.0
Frameworks Struts, Spring, Hibernate, Log4j
CM Tools IBM ClearCase, IBM Rational Team Concert (Thick client & Web versions),
WinCVS, SVN, GIT
Defect Tracking Tools IBM TSRM, IBM Rational Team Concert, IBM Rational Quality Manager
Build Tools Apache Ant - 1.6.5
Testing Tools JUnit
IDE & GUI Eclipse 3.3, IBM RAD, IBM RSA & Net Beans
Operating System Windows 7, Windows 95/98/ME/NT/XP, Unix & Linux
UML Modeling Tools Star UML
Web Technologies Servlets, JSP
INDUSTRY EXPERIENCE
• UK Public Sector
• Finance Sector
• Retail Sector
Achievements at Workplace
• Received the Star Performer award at IBM.
• Appreciated for committed and reliable work.
• Received Value awards and Deep Skill Adder awards at IBM.
• Received much appreciation for teamwork and for adapting to new technologies, along with
many client appreciations.
Interpersonal Competencies
• Ability to interact successfully with multiple teams across the global organization, including
services and support for all regions.
• Strong mathematical, analytical background with innovative thoughts and creative action.
• Sound technical skills and a strong grasp of business flow; a quick, self-learning professional
with a value-added attitude.
• Zeal to learn New Technologies.
Project #1 : SPST (Service Pac Product Selector Tool)
Client : IBM, Raleigh, NC
Environment : Hadoop, Hive, Pig, SQOOP, Map Reduce, Java(jdk1.7), LINUX, MySQL,
NoSQL, Cloudera, Spring
Duration : Apr’15 - Till date
Description : This is a web-based online tool designed to help Offering and Sales
Managers and global geos create, maintain and distribute Service Pac product offerings data
worldwide. Along with release enhancements and maintenance activities, business analytics were
also implemented.
Delivered multiple big data use cases to support custom sales offerings using Hadoop, delivering
KPIs and metrics such as the following:
Per country
- Which offer description has the least and the most occurrences
- Which offer number has the least and the most occurrences
- Which offer type has the least and the most occurrences
- Which part pin has the least and the most occurrences
- Which offer pin has the least and the most occurrences
- What the highest and lowest costs are; whether they are repeated, and if so, how many times
and for which offer types
- List the top 10 offer descriptions
- List the top 10 offer types
- Which offer description has the highest price and which has the lowest
- The difference between the highest and the lowest price
- How many offers are released each year
- Which part pins to suggest to the customer (suggest at least 3 categories)
- Which offer pins not to suggest to the customer (do not suggest 3 categories)
- For how many records the offer is at a loss
- For the combination of the top offer description and the top offer code, how many part pins
are repeated
- How many offer numbers each country code has
- For each year, prepare a tab-separated file as below:
year maxopencost maxclosecost offernumber offercode country
Zone wide
- All of the above repeated for the entire zone
- How many country codes each zone has
Total
- All of the above repeated across all zones
- How many country codes there are in total and per zone, produced as a CSV-formatted file
Ex: zone1,AM,300
zone1,AT,365
Different zones
- From each zone, which country has the most machine types
- From each zone, which country has the highest price, with complete details
- Within a zone, what the highest price is, with complete details
- The latest announce date in a country and zone
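Most of the per-country KPIs above are group-and-count aggregations. As an illustrative sketch only (the project computed these with HiveQL and MapReduce; the data and method names here are hypothetical), the "top 10 offer descriptions" metric reduces to counting occurrences and sorting by frequency, shown in plain Java:

```java
import java.util.*;
import java.util.stream.*;

public class TopOffers {
    // Count occurrences of each offer description and return the top N by frequency.
    // In the project this was a HiveQL GROUP BY / ORDER BY ... LIMIT query.
    public static List<String> topDescriptions(List<String> descriptions, int n) {
        Map<String, Long> counts = descriptions.stream()
                .collect(Collectors.groupingBy(d -> d, Collectors.counting()));
        return counts.entrySet().stream()
                .sorted(Map.Entry.<String, Long>comparingByValue().reversed())
                .limit(n)
                .map(Map.Entry::getKey)
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<String> offers = Arrays.asList(
                "Warranty Upgrade", "Warranty Upgrade", "On-Site Repair",
                "Warranty Upgrade", "On-Site Repair", "Installation");
        System.out.println(topDescriptions(offers, 2)); // most frequent first
    }
}
```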
As a Senior Team Member, was responsible for
• Installation & configuration of a Hadoop cluster along with Hive.
• Developed MapReduce applications using the Hadoop MapReduce programming framework
to process large data sets in parallel across the Hadoop cluster for pre-processing.
• Developed the code for Importing and exporting data into HDFS and Hive using Sqoop.
• Responsible for writing Hive Queries for analysing data in Hive warehouse using Hive Query
Language(HQL).
• Involved in defining job flows using Oozie, scheduling jobs to manage Apache Hadoop jobs as
directed graphs of actions.
• Developed Hive user-defined functions in Java, compiled them into JARs, added them to
HDFS and executed them from Hive queries.
• Experienced in managing and reviewing Hadoop log files.
• Responsible to manage data coming from different sources.
• Assisted in monitoring the Hadoop cluster.
• Dealt with high volumes of data in the cluster.
• Tested and reported defects in an Agile Methodology perspective.
• Consolidated all defects and reported them to PMs/leads for prompt fixes by development
teams, driving them to closure.
• Installed Hadoop ecosystem components (Hive, Pig, Sqoop, HBase, Oozie) on top of the
Hadoop cluster.
• Imported data from SQL to HDFS and Hive for analytical purposes.
• Involved in developing the Controller, Service and DAO layers of Spring Framework for
developing dashboard for SPST project.
• Attending Business requirement Meetings, UAT Support, Onsite- Offshore Co-ordination,
Work Assignment.
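The Hive UDF workflow in the bullets above (write an evaluate() method in Java, compile it into a JAR, add it to HDFS, register and call it from Hive queries) can be sketched as below. This is a hypothetical example: a real UDF would extend org.apache.hadoop.hive.ql.exec.UDF and typically use Hadoop Text types; that dependency is left out here so the core logic stays self-contained.

```java
public class NormalizeCountryCode {
    // Hive resolves a UDF's entry point by reflection on evaluate().
    // This hypothetical UDF trims and upper-cases a country code,
    // returning null for null input, as Hive expects.
    public String evaluate(String code) {
        if (code == null) {
            return null;
        }
        return code.trim().toUpperCase();
    }

    public static void main(String[] args) {
        System.out.println(new NormalizeCountryCode().evaluate(" uk "));
    }
}
```

In Hive the packaged class would then be registered with `ADD JAR` and `CREATE TEMPORARY FUNCTION` before being used in a HiveQL query.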
Project #2 : Inventory Management
Client : Wal-Mart, Bentonville, AR
Environment : CDH4,Eclipse,HDFS,Hive, Map Reduce, Spark, Spark-SQL, Oozie, Sqoop, Pig
Duration : 22 Months (Jun’13 – Mar’15)
Description : The application's tasks include data extraction from various input sources,
such as DB2 and XML, into a Cassandra database. The incoming data relates to Wal-Mart store
details such as address, alignments, divisions, departments, etc. The data is filtered according to
the business logic and stored in the respective column families. Services were written to retrieve
the data from Cassandra.
As a Senior Team Member, was responsible for
• Developed Spark code using Scala and Spark-SQL/Streaming for faster testing and processing
of data.
• Imported data from different sources such as HDFS/HBase into Spark RDDs. Developed a data
pipeline using Kafka and Storm to store data into HDFS, and performed real-time analysis on
the incoming data.
• Automated the process for extraction of data from warehouses and weblogs by developing
work-flows and coordinator jobs in OOZIE.
• Developed Scala scripts and UDFs using both DataFrames/SQL and RDD/MapReduce in Spark
for data aggregation and queries, writing data back to the OLTP system directly or through
Sqoop.
• Explored Spark to improve the performance and optimization of existing algorithms in
Hadoop using Spark Context, Spark-SQL and Spark on YARN.
• Optimized performance on large datasets using partitions, Spark's in-memory capabilities,
broadcasts, effective and efficient joins, transformations and other heavy lifting during the
ingestion process itself.
• Used the Spark API over Hortonworks Hadoop YARN to perform analytics on data in Hive.
• Performed transformations like event joins, filter bot traffic and some pre-aggregations
using Pig.
• Developed Map Reduce jobs to convert data files into Parquet file format.
• Developed business specific Custom UDF's in Hive, Pig.
• Configured Oozie workflow to run multiple Hive and Pig jobs which run independently with
time and data availability.
• Optimized MapReduce code and Pig scripts; performed performance tuning and analysis.
• Involvement in design, development and testing phases of Software Development Life Cycle.
• Performed Hadoop installation, updates, patches and version upgrades when required.
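The event-join and bot-traffic-filter transformations above were done in Pig and Spark; as a minimal sketch of the same logic (the data and names such as clicksByUser and botUsers are hypothetical), in plain Java:

```java
import java.util.*;
import java.util.stream.*;

public class EventJoin {
    // Filter out bot traffic, then inner-join click events to purchase events
    // on user id, mirroring a Pig FILTER + JOIN pre-aggregation step.
    public static List<String> joinedUsers(Map<String, String> clicksByUser,
                                           Map<String, String> purchasesByUser,
                                           Set<String> botUsers) {
        return clicksByUser.keySet().stream()
                .filter(user -> !botUsers.contains(user))  // filter bot traffic
                .filter(purchasesByUser::containsKey)      // inner join on user id
                .sorted()
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        Map<String, String> clicks = Map.of("u1", "adA", "u2", "adB", "u3", "adC");
        Map<String, String> purchases = Map.of("u1", "sku9", "u3", "sku7");
        System.out.println(joinedUsers(clicks, purchases, Set.of("u3"))); // u3 dropped as a bot
    }
}
```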
Project #3 : DWP CIS (Customer Information System)
Client : Public Sector, UK Government, Preston, UK
Environment : Core Java, J2ee, JSP, Struts, Web services, Altova XML Spy, WAS 6.0, RAD,
Apache ANT, Oracle SQL Developer, RTC 3.0
RDBMS : Oracle
Duration : 19 Months (Nov’ 11 – May’ 13)
Description : CIS is the Customer Information System, which holds the personal details of
UK citizens. Along with the personal details, the benefits they are registered for and entitled to
are also stored in CIS. CIS is the primary system for personal details and a slave for the benefit
and award details stored in the database.
CIS was formed in 2004 by merging two systems: DCI (Department Central Index) and PDCS
(Personal Data Computer Store). These were two entirely different systems that never
communicated with each other, so whenever a change happened in one system, it was not
reflected in the other. As part of Release 1, the CIS database was built by having information flow
from DCI and PDCS into CIS. PDCS was COBOL-based, and all systems that talked to PDCS now
talk to CIS through batch and online interfaces. Dialogues were replaced by SEF screens and
functions, meaning the mainframe dialogues were replaced by browser-based screens. The CIS
online functions that replace DCI can be decomposed into a number of discrete function types,
such as Primary Access, Secondary Access, Data Link Requests and Inter-Service Access.
As a Senior Team Member, was responsible for
• Implementing GUI as per requirement.
• Involved in creating the Functional Design and Technical Design.
• Involved in coding business logic, persisting data with Struts and JDBC, Unit Testing and
Integration Testing.
• Developed reusable components that can be used across all modules.
• Involved in supporting UAT activities and production issues, fixing bugs and Defect Tracking.
• Involved in creating XSDs.
• Involved in debugging, troubleshooting and defect fixing.
• Involved in updating the XSDs based on the business requirements.
• Involved in testing of the Web services and integrating with external vendors and internal
clients.
• Assist developers in the technical design phase, construction and unit testing phase.
• Analyzing and understanding the architectural requirements.
• Proposed new design solutions that exceed client expectations.
• Involved in conducting peer code reviews and reviewing functional documents.
Project #4 : E Referrals
Client : Public Sector, UK Government, Preston, UK
Environment : Java, HTML, JSP, J2ee, JDBC, EDS Tool
RDBMS : My SQL 5.5
Servers : Tomcat
Duration : 12 Months (Nov’ 10 to Oct’ 11)
Description : The eReferrals overpayments application provides a solution that allows
the user to complete online debt referrals. The service pre-populates, where possible, debt referral
screens with client data held in CIS, retrieved using a keyed NINO for the case.
The application at its most basic will:
 Create an eReferral to Debt Manager via user input and data scrape from CIS.
 Route the referral through the approval process
 Collate daily approved referrals into a Debt Referral Batch file.
 Dispatch file to Debt Manager
 A copy of the file is held on eReferrals for a 7-day period only.
Incoming data consists of client data from CIS in XML format. Outgoing data destined for Debt
Manager consists of XML documents made up of debt referrals, plus various reports in CSV format.
As a Senior Team Member, was responsible for
• Involved in implementing GUI as per requirement.
• Involved in LLD, Functional Design and Technical Design.
• Involved in creating the web pages using JavaServer Pages.
• Implemented controller logic in Servlets.
• Implemented client side validations using Java Script.
• Implemented JDBC components.
• Preparing Unit Test Cases.
Project #5 : E forms
Client : Public Sector, UK Government, Preston, UK
Environment : Core Java, HTML, Java script, JSP, Servlets, JDBC, RTC
RDBMS : My SQL 5.5
Servers : Tomcat
Duration : 9 Months (Feb’ 10 to Oct’ 10)
Description : E-forms is a service available to all staff via the DWP intranet that allows
DWP staff to submit various personal forms online. It was initially developed to allow staff access
to their pay information and to submit expense and overtime claims online. However, the majority
of this work has since been taken over by the strategic resource management system; E-forms
continues to provide services for a few pay/expenses forms.
As a Senior Team Member, was responsible for
• Involved in implementing GUI as per requirement.
• Involved in LLD, Functional Design and Technical Design.
• Involved in creating the web pages using JavaServer Pages.
• Implemented controller logic in Servlets.
• Implemented client side validations using Java Script.
• Implemented JDBC components.
• Preparing Unit Test Cases
Project #6 : ACG Italy
Client : ACG VISION4, Italy
Environment : Java, JSON, Struts, Hibernate, Web Sphere, RSA, Clear Case, DB2, JUnit,
RQM
RDBMS : DB2
Duration : 18 Months (Aug ’08 – Jan’ 10)
Description : ACG Vision4 is an ERP product that covers all processes of the company. It is
organized by process, designed to be used easily by all the different types of users, and flexible in
its ability to integrate and interact with other systems inside and outside the company; its
functions and data are accessible from the most popular office-automation systems, and it is
data-independent, intuitive and simple.
It has modules such as Finance, Controlling, Supply Chain Management and SVM. ACG developed
Vision4 taking advantage of IBM's best technology and the most popular open-source technology
on the market. The database and platform infrastructure is built on IBM DB2 and IBM WebSphere,
and query, reporting and analysis are done with Cognos. The architecture, adhering to the
principles of SOA, was built using proven industry standards such as Hibernate, Struts and Dojo
Web 2.0; development was carried out entirely in Java, installed on multiple platforms: Linux,
OS/400 and Windows.
As a Team Member, was responsible for
• Responsible for leading the project through various phases, including design, development and
unit testing of the application modules, and management of a team.
• Worked as a functional group leader implementing the functional use cases.
• Developed reusable components that can be used across all modules.
• Developed proofs of concept for the ACG framework and provided technical solutions.
• Developed the UI using JSON.
• Implemented the Action classes using Struts.
• Wrote Hibernate components and conducted peer code reviews.
• Performed requirement analysis with the business users and onshore counterparts.
• Involved in enhancements, debugging, troubleshooting and defect fixing.
• Involved in Unit Testing, White Box Testing and Integration Testing of the application.
• Customized and explored new features of the ACG framework.
• Involved in assisting junior team members.
Project #7 : eCIS (electronic Customer Information System)
Client : Service Master & MerryMaids, United States
Environment : Java, XML, Spring, Hibernate, JMS, Web Services and EXTJS
RDBMS : MSSQL Server 2005, Oracle
Servers : Tomcat 6 & JBoss 5.4
Duration : 11 Months (Sep '07 – July ’08)
Description : The eCIS (Electronic Customer Information System) provides services to
customers and maintains the list of services for different branches and franchises. eCIS is a
web-based, distributed web application that maintains customer and employee information for
different services. The end user is the main active participant in the system.
The following are the modules of the system:
Customer / Employee / Service / Account Receivables / Utilities / Maintenance
As a Team Member, was responsible for
• Involved in Analysis, Effort estimation, Design & Development in the part of Employee Audit,
Email Reminders, Audit Rules and Workflow features enhancements.
• Involved in Analysis, Design and Development of Customization Framework enhancements
such as Allocation Verification on Save.
• Involved in debugging, troubleshooting and defect fixing.
• Involved in setting up and configuring the development and deployment of the application.
• Designed and developed customized Ext JS components for the application.
• Implemented Spring Security for the application, the EXT delegate/facade data object
controller, domain locking and exception handling using Spring.
EDUCATION
• Bachelor of Computer Science (BSc) from S.V. University.
• Master of Computer Applications (MCA) from S.V. University.
 
EASEUS Partition Master Crack + License Code
EASEUS Partition Master Crack + License CodeEASEUS Partition Master Crack + License Code
EASEUS Partition Master Crack + License Code
aneelaramzan63
 
Designing AI-Powered APIs on Azure: Best Practices& Considerations
Designing AI-Powered APIs on Azure: Best Practices& ConsiderationsDesigning AI-Powered APIs on Azure: Best Practices& Considerations
Designing AI-Powered APIs on Azure: Best Practices& Considerations
Dinusha Kumarasiri
 
What Do Contribution Guidelines Say About Software Testing? (MSR 2025)
What Do Contribution Guidelines Say About Software Testing? (MSR 2025)What Do Contribution Guidelines Say About Software Testing? (MSR 2025)
What Do Contribution Guidelines Say About Software Testing? (MSR 2025)
Andre Hora
 
Douwan Crack 2025 new verson+ License code
Douwan Crack 2025 new verson+ License codeDouwan Crack 2025 new verson+ License code
Douwan Crack 2025 new verson+ License code
aneelaramzan63
 
WinRAR Crack for Windows (100% Working 2025)
WinRAR Crack for Windows (100% Working 2025)WinRAR Crack for Windows (100% Working 2025)
WinRAR Crack for Windows (100% Working 2025)
sh607827
 
How to Batch Export Lotus Notes NSF Emails to Outlook PST Easily?
How to Batch Export Lotus Notes NSF Emails to Outlook PST Easily?How to Batch Export Lotus Notes NSF Emails to Outlook PST Easily?
How to Batch Export Lotus Notes NSF Emails to Outlook PST Easily?
steaveroggers
 
PDF Reader Pro Crack Latest Version FREE Download 2025
PDF Reader Pro Crack Latest Version FREE Download 2025PDF Reader Pro Crack Latest Version FREE Download 2025
PDF Reader Pro Crack Latest Version FREE Download 2025
mu394968
 
FL Studio Producer Edition Crack 2025 Full Version
FL Studio Producer Edition Crack 2025 Full VersionFL Studio Producer Edition Crack 2025 Full Version
FL Studio Producer Edition Crack 2025 Full Version
tahirabibi60507
 
Revolutionizing Residential Wi-Fi PPT.pptx
Revolutionizing Residential Wi-Fi PPT.pptxRevolutionizing Residential Wi-Fi PPT.pptx
Revolutionizing Residential Wi-Fi PPT.pptx
nidhisingh691197
 
Adobe After Effects Crack FREE FRESH version 2025
Adobe After Effects Crack FREE FRESH version 2025Adobe After Effects Crack FREE FRESH version 2025
Adobe After Effects Crack FREE FRESH version 2025
kashifyounis067
 
Adobe Lightroom Classic Crack FREE Latest link 2025
Adobe Lightroom Classic Crack FREE Latest link 2025Adobe Lightroom Classic Crack FREE Latest link 2025
Adobe Lightroom Classic Crack FREE Latest link 2025
kashifyounis067
 
Maxon CINEMA 4D 2025 Crack FREE Download LINK
Maxon CINEMA 4D 2025 Crack FREE Download LINKMaxon CINEMA 4D 2025 Crack FREE Download LINK
Maxon CINEMA 4D 2025 Crack FREE Download LINK
younisnoman75
 
Top 10 Client Portal Software Solutions for 2025.docx
Top 10 Client Portal Software Solutions for 2025.docxTop 10 Client Portal Software Solutions for 2025.docx
Top 10 Client Portal Software Solutions for 2025.docx
Portli
 
TestMigrationsInPy: A Dataset of Test Migrations from Unittest to Pytest (MSR...
TestMigrationsInPy: A Dataset of Test Migrations from Unittest to Pytest (MSR...TestMigrationsInPy: A Dataset of Test Migrations from Unittest to Pytest (MSR...
TestMigrationsInPy: A Dataset of Test Migrations from Unittest to Pytest (MSR...
Andre Hora
 
Automation Techniques in RPA - UiPath Certificate
Automation Techniques in RPA - UiPath CertificateAutomation Techniques in RPA - UiPath Certificate
Automation Techniques in RPA - UiPath Certificate
VICTOR MAESTRE RAMIREZ
 
How can one start with crypto wallet development.pptx
How can one start with crypto wallet development.pptxHow can one start with crypto wallet development.pptx
How can one start with crypto wallet development.pptx
laravinson24
 
Not So Common Memory Leaks in Java Webinar
Not So Common Memory Leaks in Java WebinarNot So Common Memory Leaks in Java Webinar
Not So Common Memory Leaks in Java Webinar
Tier1 app
 
Explaining GitHub Actions Failures with Large Language Models Challenges, In...
Explaining GitHub Actions Failures with Large Language Models Challenges, In...Explaining GitHub Actions Failures with Large Language Models Challenges, In...
Explaining GitHub Actions Failures with Large Language Models Challenges, In...
ssuserb14185
 
How Valletta helped healthcare SaaS to transform QA and compliance to grow wi...
How Valletta helped healthcare SaaS to transform QA and compliance to grow wi...How Valletta helped healthcare SaaS to transform QA and compliance to grow wi...
How Valletta helped healthcare SaaS to transform QA and compliance to grow wi...
Egor Kaleynik
 
Get & Download Wondershare Filmora Crack Latest [2025]
Get & Download Wondershare Filmora Crack Latest [2025]Get & Download Wondershare Filmora Crack Latest [2025]
Get & Download Wondershare Filmora Crack Latest [2025]
saniaaftab72555
 
EASEUS Partition Master Crack + License Code
EASEUS Partition Master Crack + License CodeEASEUS Partition Master Crack + License Code
EASEUS Partition Master Crack + License Code
aneelaramzan63
 
Designing AI-Powered APIs on Azure: Best Practices& Considerations
Designing AI-Powered APIs on Azure: Best Practices& ConsiderationsDesigning AI-Powered APIs on Azure: Best Practices& Considerations
Designing AI-Powered APIs on Azure: Best Practices& Considerations
Dinusha Kumarasiri
 
What Do Contribution Guidelines Say About Software Testing? (MSR 2025)
What Do Contribution Guidelines Say About Software Testing? (MSR 2025)What Do Contribution Guidelines Say About Software Testing? (MSR 2025)
What Do Contribution Guidelines Say About Software Testing? (MSR 2025)
Andre Hora
 
Douwan Crack 2025 new verson+ License code
Douwan Crack 2025 new verson+ License codeDouwan Crack 2025 new verson+ License code
Douwan Crack 2025 new verson+ License code
aneelaramzan63
 
WinRAR Crack for Windows (100% Working 2025)
WinRAR Crack for Windows (100% Working 2025)WinRAR Crack for Windows (100% Working 2025)
WinRAR Crack for Windows (100% Working 2025)
sh607827
 
How to Batch Export Lotus Notes NSF Emails to Outlook PST Easily?
How to Batch Export Lotus Notes NSF Emails to Outlook PST Easily?How to Batch Export Lotus Notes NSF Emails to Outlook PST Easily?
How to Batch Export Lotus Notes NSF Emails to Outlook PST Easily?
steaveroggers
 
PDF Reader Pro Crack Latest Version FREE Download 2025
PDF Reader Pro Crack Latest Version FREE Download 2025PDF Reader Pro Crack Latest Version FREE Download 2025
PDF Reader Pro Crack Latest Version FREE Download 2025
mu394968
 
FL Studio Producer Edition Crack 2025 Full Version
FL Studio Producer Edition Crack 2025 Full VersionFL Studio Producer Edition Crack 2025 Full Version
FL Studio Producer Edition Crack 2025 Full Version
tahirabibi60507
 
Revolutionizing Residential Wi-Fi PPT.pptx
Revolutionizing Residential Wi-Fi PPT.pptxRevolutionizing Residential Wi-Fi PPT.pptx
Revolutionizing Residential Wi-Fi PPT.pptx
nidhisingh691197
 
Adobe After Effects Crack FREE FRESH version 2025
Adobe After Effects Crack FREE FRESH version 2025Adobe After Effects Crack FREE FRESH version 2025
Adobe After Effects Crack FREE FRESH version 2025
kashifyounis067
 
Adobe Lightroom Classic Crack FREE Latest link 2025
Adobe Lightroom Classic Crack FREE Latest link 2025Adobe Lightroom Classic Crack FREE Latest link 2025
Adobe Lightroom Classic Crack FREE Latest link 2025
kashifyounis067
 
Maxon CINEMA 4D 2025 Crack FREE Download LINK
Maxon CINEMA 4D 2025 Crack FREE Download LINKMaxon CINEMA 4D 2025 Crack FREE Download LINK
Maxon CINEMA 4D 2025 Crack FREE Download LINK
younisnoman75
 
Top 10 Client Portal Software Solutions for 2025.docx
Top 10 Client Portal Software Solutions for 2025.docxTop 10 Client Portal Software Solutions for 2025.docx
Top 10 Client Portal Software Solutions for 2025.docx
Portli
 

Sunshine consulting mopuru babu cv_java_j2_ee_spring_bigdata_scala_Spark

  • 1. MOPURU BABU Mob no: +1-321-210-3823 E-mail: [email protected] EXPERIENCE SUMMARY Over 9 years of professional experience in software development with Java technologies, including 2 years of Hadoop development using Hadoop, Hive, HBase, MapReduce, Pig, Sqoop, Oozie, shell scripting, YARN, Scala and Spark, covering the design, development and deployment of n-tiered, enterprise-level distributed applications. • Working as a Hadoop Developer for the past 3 years on Hadoop platforms such as Cloudera and Hortonworks, with over six years of experience in software development on Java and the Spring framework (Spring IoC/Core, Spring DAO support, Spring ORM, Spring AOP, Spring Security, Spring MVC, Spring Cache and Spring Integration). • Expert knowledge of J2EE design patterns such as MVC, Front Controller, Session Facade, Business Delegate and Data Access Object for building J2EE applications. • Designed and developed several multi-tier web-based, client-server and multithreaded applications using Object-Oriented Analysis and Design concepts and Service-Oriented Architecture (SOA), mostly in cross-platform environments. • Excellent working knowledge of popular frameworks such as Struts, Hibernate and Spring MVC. • Developed core modules in large cross-platform applications using Java, J2EE, Spring, Struts, Hibernate, JAX-WS (SOAP)/JAX-RS (REST) web services and JMS. • Good understanding of HDFS design, daemons, federation and HDFS high availability (HA). • Experience in developing Hive and Pig scripts. • Extensive knowledge of creating Hive tables with different file compression formats. • Hands-on experience in installing, configuring and administering Hadoop cluster components such as MapReduce, HDFS, HBase, Hive, Sqoop, Spark, Pig, ZooKeeper, Oozie and Flume using the Apache code base. • Experience in managing scalable Hadoop clusters, including cluster design, provisioning, custom configuration, monitoring and maintenance, using the Cloudera CDH distribution.
• Good experience using Apache Spark. • Worked on a prototype Apache Spark Streaming project, converting an existing Java Storm topology. • Experience managing the Cloudera distribution of Hadoop (Cloudera Manager). • Excellent understanding of NoSQL databases such as HBase. • Extensive working knowledge of setting up and running clusters, monitoring, data analytics, sentiment analysis, predictive analysis and data presentation in the big-data world. • Hands-on experience working with structured and unstructured data in various file formats, such as XML, JSON and sequence files, using MapReduce programs. • Extensive experience writing SQL queries in HiveQL to perform analytics on structured data. • Expertise in data-load management, importing and exporting data using Sqoop and Flume. • Performed various Pig operations, joins and transformations to join, clean, aggregate and analyze data. • Excellent interpersonal and communication skills; creative, research-minded, technically competent and result-oriented, with problem-solving and leadership skills. • Expert in database and RDBMS concepts using MS SQL Server and Oracle 10g. • Expertise in web development technologies such as HTML, CSS and JavaScript. • Expertise in working with different methodologies, such as Waterfall and Agile. • Proficient in using databases such as MySQL, MS SQL Server and DB2. 1
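Several of the bullets above describe processing structured and unstructured files with MapReduce programs. As a hedged illustration only, the core map/reduce counting pattern can be sketched in plain Java without a Hadoop cluster (class and method names below are hypothetical, not from the projects described):

```java
import java.util.*;
import java.util.stream.*;

public class WordCountSketch {
    // "Map" phase: split each line into lowercase tokens (emitting (word, 1) pairs,
    // conceptually); "Reduce" phase: sum the counts per key.
    public static Map<String, Long> countWords(List<String> lines) {
        return lines.stream()
                .flatMap(line -> Arrays.stream(line.toLowerCase().split("\\W+")))
                .filter(w -> !w.isEmpty())
                .collect(Collectors.groupingBy(w -> w, Collectors.counting()));
    }

    public static void main(String[] args) {
        Map<String, Long> counts = countWords(List.of("Hadoop hive hadoop", "hive pig"));
        System.out.println(counts.get("hadoop")); // prints 2
    }
}
```

On a real cluster the same logic is split across a Mapper and a Reducer class and the shuffle phase performs the grouping; this sketch only shows the data flow.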
  • 2. SOFTWARE SKILLS
Primary Skills | Analysis, Design, Development, Implementation, Testing & Packaging
Languages | Java
Big Data Skills | Hadoop, MapReduce, Hive, Pig, Sqoop, Oozie, Scala, Spark
RDBMS | MS SQL Server 2000/2005, DB2, Oracle 8.x/9i/10g
NoSQL | HBase, MarkLogic
Internet Technology | JSP, HTML, XML & CSS
Scripting Language | JavaScript, JSON, AngularJS
Application Server | WebSphere 6.0, WebLogic 8.1, JBoss 5.0
Web Server | Tomcat 5.0
Frameworks | Struts, Spring, Hibernate, Log4j
CM Tools | IBM ClearCase, IBM Rational Team Concert (thick client & web), WinCVS, SVN, GIT
Defect Tracking Tools | IBM TSRM, IBM Rational Team Concert, IBM Rational Quality Manager
Build Tools | Apache Ant 1.6.5
Testing Tools | JUnit
IDE & GUI | Eclipse 3.3, IBM RAD, IBM RSA & NetBeans
Operating System | Windows 7, Windows 95/98/ME/NT/XP, Unix & Linux
UML Modeling Tools | StarUML
Web Technologies | Servlets, JSP
INDUSTRY EXPERIENCE • UK Public Sector • Finance Sector • Retail Sector
Achievements at Workplace • Received the Star Performer award at IBM. • Appreciated for committed and reliable work. • Received Value awards and Deep Skill Adder awards at IBM. • Received much appreciation for teamwork, for adapting to new technologies, and from many clients.
Interpersonal Competencies • Ability to interact successfully with multiple teams across the global organization, including services and support for all regions. • Strong mathematical and analytical background with innovative and creative thinking. • Sound technical skills, a good grasp of business flow, and a quick-learning, self-driven approach. • Value-added attitude. • Zeal to learn new technologies. 2
  • 3. Project #1 : SPST (Service Pac Product Selector Tool)
Client : IBM, Raleigh, NC
Environment : Hadoop, Hive, Pig, Sqoop, MapReduce, Java (JDK 1.7), Linux, MySQL, NoSQL, Cloudera, Spring
Duration : Apr '15 - till date
Description : This is a web-based online tool designed to help Offering and Sales Managers and global Geos create, maintain and distribute Service Pac product-offering data worldwide. Along with release enhancements and maintenance activities, business analytics was also implemented. Delivered multiple big-data use cases to support custom sales offerings using Hadoop, covering KPIs and metrics such as:
Country-wide
- Which offer description occurred the least and the most times
- Which offer number occurred the least and the most times
- Which offer type occurred the least and the most times
- Which part pin occurred the least and the most times
- Which offer pin occurred the least and the most times
- The highest and the lowest cost; whether they repeat, how many times, and for which offer types
- The top 10 offer descriptions
- The top 10 offer types
- Which offer description has the highest price and which the lowest
- The difference between the highest and the lowest price
- How many offers were released in each year
- Which part pin to suggest for the customer (at least 3 categories)
- Which offer pin not to suggest for the customer (3 categories)
- How many records are at a loss
- For the combination of the top offer description and the top offer code, how many part pins repeat
- How many offer numbers each country code has
- For each year, prepare a tab-separated file with the columns: year maxopencost maxclosecost offernumber offercode country
Zone-wide
- All of the above, repeated for the entire zone
- How many country codes each zone has
Total
- All of the above, repeated across all zones
- The total number of country codes and how many country codes each zone has, output as a CSV-formatted file, e.g.: zone1,AM,300 zone1,AT,365
Different zones
- From each zone, which country has the most machine types
- From each zone, which country has the highest price, with complete details
- Within the zone, the highest price, with complete details
- The latest announce date in a country and in a zone. 3
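The frequency and top-N KPIs listed above (most/least common offer descriptions, top-10 lists) reduce to a group-by-count followed by a sort. In the project this was done with Hive/MapReduce on the cluster; the following is only a hedged local sketch of the same logic in plain Java (class, method and sample values are hypothetical):

```java
import java.util.*;
import java.util.stream.*;

public class OfferKpis {
    // Count how many times each offer description occurs (the group-by-count step).
    public static Map<String, Long> frequency(List<String> descriptions) {
        return descriptions.stream()
                .collect(Collectors.groupingBy(d -> d, Collectors.counting()));
    }

    // Return the n most frequent descriptions, most frequent first (the sort/limit step).
    public static List<String> topN(List<String> descriptions, int n) {
        return frequency(descriptions).entrySet().stream()
                .sorted(Map.Entry.<String, Long>comparingByValue().reversed())
                .limit(n)
                .map(Map.Entry::getKey)
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<String> descs = List.of("Warranty", "Install", "Warranty",
                                     "Support", "Warranty", "Install");
        System.out.println(topN(descs, 2)); // prints [Warranty, Install]
    }
}
```

In HiveQL the equivalent would be a `GROUP BY ... ORDER BY count DESC LIMIT n` query; the per-zone variants simply add the zone column to the grouping key.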
  • 4. As a Senior Team Member, was responsible for:
• Installation and configuration of a Hadoop cluster along with Hive.
• Developed MapReduce applications using the Hadoop MapReduce programming framework to pre-process large data sets in parallel across the Hadoop cluster.
• Developed the code for importing and exporting data into HDFS and Hive using Sqoop.
• Responsible for writing Hive queries to analyze data in the Hive warehouse using Hive Query Language (HQL).
• Involved in defining job flows with Oozie to schedule and manage Apache Hadoop jobs as directed acyclic graphs of actions.
• Developed Hive user-defined functions in Java, compiled them into JARs, added them to HDFS and executed them from Hive queries.
• Experienced in managing and reviewing Hadoop log files.
• Responsible for managing data coming from different sources.
• Assisted in monitoring the Hadoop cluster and dealing with high volumes of data in the cluster.
• Tested and reported defects from an Agile methodology perspective.
• Consolidated all defects, reported them to the PM/leads for prompt fixes by the development teams and drove them to closure.
• Installed Hadoop ecosystem components (Hive, Pig, Sqoop, HBase, Oozie) on top of the Hadoop cluster.
• Imported data from SQL to HDFS and Hive for analytical purposes.
• Involved in developing the Controller, Service and DAO layers of the Spring Framework for the SPST project dashboard.
• Attended business-requirement meetings, provided UAT support, and handled onsite-offshore coordination and work assignment.

Project #2 : Inventory Management
Client : Wal-Mart, Bentonville, AR
Environment : CDH4, Eclipse, HDFS, Hive, MapReduce, Spark, Spark SQL, Oozie, Sqoop, Pig
Duration : 22 Months (Jun '13 - Mar '15)
Description : The application tasks include data extraction from various input sources, such as DB2 and XML, into a Cassandra database. The incoming data relates to Wal-Mart store details such as address, alignments, divisions, departments, etc.
These data are filtered according to the business logic and stored in the respective column families. Wrote services to retrieve the data from Cassandra.

As a Senior Team Member, was responsible for:
• Developed Spark code using Scala and Spark SQL/Streaming for faster testing and processing of data.
• Imported data from different sources such as HDFS/HBase into Spark RDDs; developed a data pipeline using Kafka and Storm to store data in HDFS; performed real-time analysis on the incoming data.
• Automated the extraction of data from warehouses and weblogs by developing workflows and coordinator jobs in Oozie.
• Developed Scala scripts and UDFs using both DataFrames/SQL and RDD/MapReduce in Spark for data aggregation and queries, and for writing data back to the OLTP system directly or through Sqoop. 4
  • 5. • Explored Spark for improving the performance and optimization of the existing algorithms in Hadoop using Spark Context, Spark SQL and Spark on YARN.
• Performance optimization for large datasets using partitions, Spark in-memory capabilities, broadcasts, effective and efficient joins, transformations and other heavy lifting during the ingestion process itself.
• Used the Spark API over Hortonworks Hadoop YARN to perform analytics on data in Hive.
• Performed transformations such as event joins, bot-traffic filtering and some pre-aggregations using Pig.
• Developed MapReduce jobs to convert data files into Parquet file format.
• Developed business-specific custom UDFs in Hive and Pig.
• Configured Oozie workflows to run multiple Hive and Pig jobs, which run independently based on time and data availability.
• Optimized MapReduce code and Pig scripts; performance tuning and analysis.
• Involved in the design, development and testing phases of the software development life cycle.
• Performed Hadoop installation, updates, patches and version upgrades when required.

Project #3 : DWP CIS (Customer Information System)
Client : Public Sector, UK Government, Preston, UK
Environment : Core Java, J2EE, JSP, Struts, web services, Altova XMLSpy, WAS 6.0, RAD, Apache Ant, Oracle SQL Developer, RTC 3.0
RDBMS : Oracle
Duration : 19 Months (Nov '11 - May '13)
Description : CIS is the Customer Information System, which holds the personal details of UK citizens along with the benefits they are entitled to and registered for. CIS is the primary system for personal data and a slave for the benefit and award details stored in the database. CIS was formed in 2004 by merging two systems: DCI (Department Central Index) and PDCS (Personal Data Computer Store). These were two entirely separate systems that never communicated with each other.
Because of that, whenever a change happened in one system, it was not reflected in the other. As part of Release 1, the CIS database was built with information flowing from DCI and PDCS into CIS. PDCS was COBOL-based, and all systems that used to talk to PDCS now talk to CIS through batch and online interfaces. Dialogues were replaced by SEF screens and functions, meaning the mainframe dialogues were replaced by browser-based screens. The CIS online functions that replace DCI can be decomposed into a number of discrete function types, such as primary access, secondary access, data-link requests and inter-service access.

As a Senior Team Member, was responsible for:
• Implementing the GUI as per requirements.
• Involved in creating the functional design and technical design.
• Involved in coding business logic, persisting data with Struts and JDBC, unit testing and integration testing.
• Developed reusable components that can be used in all modules.
• Involved in supporting UAT activities and production issues, fixing bugs and tracking defects.
• Involved in creating XSDs and updating them based on business requirements.
• Involved in debugging, troubleshooting and defect fixing.
• Involved in testing the web services and integrating with external vendors and internal clients. 5
  • 6. • Assisted developers in the technical design, construction and unit-testing phases.
• Analyzed and understood the architectural requirements.
• Proposed a new design solution that exceeded client expectations.
• Involved in conducting peer code reviews and reviewing functional documents.

Project #4 : E Referrals
Client : Public Sector, UK Government, Preston, UK
Environment : Java, HTML, JSP, J2EE, JDBC, EDS Tool
RDBMS : MySQL 5.5
Servers : Tomcat
Duration : 12 Months (Nov '10 - Oct '11)
Description : The eReferrals overpayments application provides a solution that allows the user to complete online debt referrals. The service pre-populates, where possible, debt-referral screens with client data held in CIS, retrieved using a keyed NINO for the case. At its most basic, the application will:
• Create an eReferral to Debt Manager via user input and a data scrape from CIS.
• Route the referral through the approval process.
• Collate daily approved referrals into a debt-referral batch file.
• Dispatch the file to Debt Manager.
• Hold a copy of the file on eReferrals for a 7-day period only.
Incoming data consists of client data from CIS in XML format. Outgoing data destined for Debt Manager consists of an XML document made up of debt referrals and various reports in CSV format.

As a Senior Team Member, was responsible for:
• Involved in implementing the GUI as per requirements.
• Involved in the LLD, functional design and technical design.
• Involved in creating the web pages using JavaServer Pages.
• Implemented controller logic in servlets.
• Implemented client-side validations using JavaScript.
• Implemented JDBC components.
• Prepared unit test cases.
Project #5 : E forms
Client : Public Sector, UK Government, Preston, UK
Environment : Core Java, HTML, JavaScript, JSP, Servlets, JDBC, RTC
RDBMS : MySQL 5.5
Servers : Tomcat
Duration : 9 Months (Feb '10 - Oct '10)
Description : E forms is a service available to all staff via the DWP intranet that allows DWP staff to submit various personal forms online. It was initially developed to give staff access to their pay information and to submit expense and overtime claims online. However, the majority of this work has now been taken over by the strategic resource management system. E forms continues to provide services for a few pay/expense forms.

As a Senior Team Member, was responsible for:
• Involved in implementing the GUI as per requirements. 6
  • 7. • Involved in the LLD, functional design and technical design.
• Involved in creating the web pages using JavaServer Pages.
• Implemented controller logic in servlets.
• Implemented client-side validations using JavaScript.
• Implemented JDBC components.
• Prepared unit test cases.

Project #6 : ACG Italy
Client : ACG VISION4, Italy
Environment : Java, JSON, Struts, Hibernate, WebSphere, RSA, ClearCase, DB2, JUnit, RQM
RDBMS : DB2
Duration : 18 Months (Aug '08 - Jan '10)
Description : ACG Vision4 is an ERP product that covers all of a company's processes. It is designed to be used easily by all the different types of users, is flexible in its ability to integrate and interact with other systems inside and outside the company, and makes its functions and data accessible from the most popular office-automation systems; it is data-independent, intuitive and simple. It has modules such as Finance, Controlling, Supply Chain Management and SVM. ACG developed Vision4 taking advantage of IBM's best technology and the most popular open-source technology on the market. The database and platform infrastructure are built on IBM DB2 and IBM WebSphere, and querying, reporting and analysis are done with Cognos. The architecture adheres to the principles of SOA, using proven industry standards such as Hibernate, Struts and Dojo (Web 2.0). Development was carried out entirely in Java, installed on multiple platforms: Linux, OS/400 and Windows.

As a Team Member, was responsible for:
• Leading the project in various phases, including design, development and unit testing of the application modules, and management of a team.
• Worked as a functional group leader implementing the functional use cases.
• Developed reusable components that can be used in all modules.
• Developed proofs of concept for the ACG framework and provided technical solutions.
• Developed the UI using JSON.
• Implemented the Action classes using Struts.
• Wrote Hibernate components and conducted peer code reviews.
• Requirement analysis with the business users and the onshore counterpart.
• Involved in enhancements, debugging, troubleshooting and defect fixing.
• Involved in unit testing, white-box testing and integration testing of the application.
• Customized and explored new features of the ACG framework.
• Involved in assisting juniors.

Project #7 : eCIS (Electronic Customer Information System)
Client : ServiceMaster & MerryMaids, United States
Environment : Java, XML, Spring, Hibernate, JMS, Web Services and Ext JS
RDBMS : MS SQL Server 2005, Oracle
Servers : Tomcat 6 & JBoss 5.4
Duration : 11 Months (Sep '07 - Jul '08)
Description : The eCIS (Electronic Customer Information System) provides services to customers and maintains the list of services for different branches and franchises. 7
  • 8. eCIS is a web-based, distributed web application that maintains customer and employee information for different services. The end user is the main active participant in the system. The modules are: Customer / Employee / Service / Accounts Receivable / Utilities / Maintenance.

As a Team Member, was responsible for:
• Involved in analysis, effort estimation, design and development of the Employee Audit, Email Reminders, Audit Rules and workflow feature enhancements.
• Involved in analysis, design and development of Customization Framework enhancements, such as Allocation Verification on Save.
• Involved in debugging, troubleshooting and defect fixing.
• Involved in setting up and configuring the development and deployment of the application.
• Designed and developed customized Ext JS components for the application.
• Implemented Spring Security for the application, the Ext delegate/facade data-object controller, domain locking, and exception handling using Spring.

EDUCATION
• Bachelor of Science in Computer Science (B.Sc.) from S.V. University.
• Master of Computer Applications (MCA) from S.V. University. 8