Suresh B-Snowflake
[email protected]
+91-8885875024
Career Objective:
Seeking a challenging position in an organization that offers opportunities to apply my innovative
technical ideas and analytical skills and to enhance my abilities, thereby contributing effectively
to the organization's growth.
Profile:
Total experience: 5.5 years Relevant Experience: 5 years 5 months
Current Job Title: Data Engineer Time in Current Role: 1 year 1 month
Visa/Work permit: India Validity Date of Visa/Work Permit: N/A
Willing to relocate: Yes Willing to travel: Yes
Biography/Profile presentation:
5.5 years of experience in Information Technology in application support and enhancement on
Informatica 9.x, Oracle SQL & PL/SQL, and the Snowflake data warehouse.
Around 2.9 years of professional experience working with the Snowflake cloud data
warehouse.
Hands-on experience in bulk loading and unloading data into and out of Snowflake tables using
the COPY command.
Good knowledge of various Snowflake features (Time Travel, Fail-safe, Streams, Tasks, Cloning,
Data Masking, Data Sharing, etc.).
Good exposure to the Snowflake INSERT, UPDATE, and MERGE commands.
Good knowledge of Snowflake stages (user stage, table stage, named stage, external
stage).
Good exposure to the different types of Snowflake tables and views.
Experience handling large and complex data sets in CSV file formats.
Knowledge of loading JSON files.
Good knowledge of UNIX commands.
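The Snowflake skills listed above can be illustrated with a short SnowSQL sketch; all object
names (stage, tables, columns) here are hypothetical, for illustration only:

```sql
-- Bulk load a CSV file from a named stage (assumed: @MY_STAGE, table CUSTOMERS)
COPY INTO CUSTOMERS
  FROM @MY_STAGE/customers.csv
  FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);

-- Upsert staged rows into the target table with MERGE
MERGE INTO CUSTOMERS t
USING CUSTOMERS_STG s
  ON t.CUSTOMER_ID = s.CUSTOMER_ID
WHEN MATCHED THEN UPDATE SET t.EMAIL = s.EMAIL
WHEN NOT MATCHED THEN INSERT (CUSTOMER_ID, EMAIL) VALUES (s.CUSTOMER_ID, s.EMAIL);

-- Time Travel: query the table as it was one hour ago
SELECT * FROM CUSTOMERS AT (OFFSET => -3600);

-- Zero-copy clone, e.g. to create a test copy
CREATE TABLE CUSTOMERS_DEV CLONE CUSTOMERS;
```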
2.5 years of data warehousing experience using the Informatica ETL (Extraction,
Transformation and Loading) tools PowerCenter/PowerMart and PowerExchange.
Strong experience working with Informatica 9.x, including the PowerCenter Designer,
Workflow Manager, Workflow Monitor, Informatica Server, and Repository Manager
components.
Experience working in both individual and team environments.
Strong ability to understand new concepts and applications.
Motivated to take on independent responsibility, with the ability to contribute as a
productive team member.
Technical Skills:
Programming Languages: SQL, SnowSQL
Databases: Oracle, MySQL
Cloud Data Warehouse: Snowflake
ETL Tools: Snowflake, Informatica 9.x
Other Tools: GitHub, ServiceNow, Jira
Education Qualification:
B.Sc. from Sri Krishnadevaraya University, Ananthapur, Andhra Pradesh.
Work Experience:
Worked as a Snowflake Data Engineer at HCL Technologies from Jan 2022 to Jan 2023.
Worked as a Software Engineer at Accenture from July 2017 to Dec 2021.
Professional Experience:
Latest Project: Nespresso
Client: Nestle
Environment: Snowflake, Power BI, AWS S3
Description:
The NESSOFT ERP software, used by all the NESPRESSO markets, runs on market-dedicated
databases. The NESSOFT database stores the market's information regarding products, stock,
warehouses, customers, etc.
We also have to consider that the NESIMPORTS schema might become the archive
database once old data is deleted from the market production databases to improve their
performance as data volumes grow too large. In that case, the programs will have to be adapted
so that the data is not deleted from the central database.
Project 2:
Project Name: RDNG (REALDoc® NextGen)
Client: Ocwen Financial Services.
Technologies used: Snowflake, AWS S3
Description:
REALDoc® NextGen is a comprehensive data management program that includes letter-logic
capture automation, a data dictionary, and data quality. It is a complete redesign of the data
architecture, with a real-time operational data store (ODS) handling all data-driven components of
the software; this ensures high-quality, timely availability of data, resulting in consistent,
guaranteed, and accurate letter generation and increased overall system efficiency.
The optimal REALDoc® setup runs eligibility criteria and data collections from the
correspondence request engine against the customer's near-real-time ODS, which is loaded from
REALServicing®'s Progress database (OLTP). Based on the information obtained, the
correspondence request engine submits the data to REALDocNG®, which uses MongoDB to
generate the letters.
Responsibilities:
Project 3:
Project Name: Cash Reconciliation
Client: JPMorgan
Environment: Informatica 9.5, MS SQL, Oracle 10g, Unix.
Description:
HFS is a sub-group of JPMorgan Worldwide Security Services; it provides mid-office and back-office
solutions to various hedge funds, including cash and position reconciliation, NAV calculation,
MIS reporting, etc.
Cash Reconciliation is a part of HFS that enables hedge funds to reconcile trade and cash details (i)
provided to JPMorgan by the funds against (ii) details provided to JPMorgan by executing brokers.
Responsibilities:
Supporting all the client's data warehouse applications.
Working on enhancements and change requests raised by users against the production
environment.
Monitoring the ETL processes regularly and resolving any data-load job failures.
Coordinating with the source system team in case of any issues in receiving or retrieving the
source files.
Ensuring that data is loaded smoothly into the data warehouse and then into the data mart.
Analyzing data discrepancy issues raised by users, suggesting corrective actions, and
implementing the fixes in the production environment.
Migrating all database, ETL, and report modifications to Production after UAT.