
Ramalinga

Mail id: [email protected]
Mobile: +91-7075875091

OBJECTIVE:

Seeking a position where I can apply my skills and abilities in a challenging
environment that offers professional growth, while remaining resourceful,
innovative, and flexible.

PROFESSIONAL SUMMARY:

● 2.5 years of IT experience in data warehousing technologies, working as a team member on various data warehouse and data mart projects as an ETL Developer using Informatica, Oracle, Snowflake cloud, and IDMC.
● Created Informatica IDMC mappings for different plans using various transformations.
● Working experience with Informatica Intelligent Cloud Services (IDMC) components: Application Integration, Data Integration, Informatica Data Quality, and Informatica PowerCenter.
● Worked on SCD Type 1 and SCD Type 2 implementations in IICS.
● Worked on Mappings, Mapping Tasks, Mapplets, and Taskflows.
● Good knowledge of Data Warehousing Concepts.
● Worked on different tasks such as Synchronization, Replication, and PowerCenter tasks.
● Knowledge of the Secure Agent and runtime environments.
● Worked with Input and In-Out parameters.
● Imported and exported data between the Snowflake internal stage and an external stage (AWS S3); see the sketch after this list.
● Developed Snowflake stored procedures implementing branching and looping (also sketched after this list).
● Created clone objects using Snowflake's zero-copy cloning.
● Worked on all phases of the data warehouse development lifecycle: design, development, testing, and implementation.
● Extensive experience in creating ETL mappings and transformations that reflect business rules, using Informatica PowerCenter to move data from multiple sources into the target.
● Extensively worked with Informatica Designer components: Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, and Mapping Designer.
● Extensive work in the ETL process, covering data sourcing, transformation, mapping, conversion, and loading.
● Designed various mappings for extracting data from sources including relational tables and flat files.
● Created reusable objects such as Mapplets and reusable transformations for business logic.
● Created various transformations for loading data into the Oracle database, including Connected and Unconnected Lookup, Joiner, Router, Filter, Expression, Sequence Generator, Aggregator, and Update Strategy.
● Experience with Workflow Manager tools: Task Developer, Worklet Designer, and Workflow Designer.
● Experience in performance tuning of targets, sources, mappings, and sessions.
● Used the Debugger to test mappings and fix bugs.
● Responsible for creating business solutions for Incremental and Full loads.
● Optimized Informatica mappings and sessions to improve performance.
● Worked with set operators such as INTERSECT, UNION, UNION ALL, and MINUS (example in the sketch below).
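
As a quick illustration of the Snowflake items above, here is a minimal sketch. The stage, table, and column names (sales_s3_stage, sales_raw, sales_fact, order_id) are hypothetical, and a real external stage would also need credentials or a storage integration:

    -- Hypothetical external stage over an S3 bucket (credentials omitted).
    CREATE OR REPLACE STAGE sales_s3_stage
      URL = 's3://example-bucket/sales/'
      FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

    -- Bulk load from the external stage into a Snowflake table.
    COPY INTO sales_raw
      FROM @sales_s3_stage
      PATTERN = '.*[.]csv';

    -- Zero-copy clone: a metadata-only copy; no storage is duplicated at creation.
    CREATE TABLE sales_raw_backup CLONE sales_raw;

    -- Set operator example: rows staged but not yet present in the target.
    SELECT order_id FROM sales_raw
    MINUS
    SELECT order_id FROM sales_fact;

And a minimal Snowflake Scripting procedure showing branching and looping, again with hypothetical table and column names (sales_fact, load_batch, load_date):

    CREATE OR REPLACE PROCEDURE purge_old_batches(days_to_keep INTEGER)
    RETURNS STRING
    LANGUAGE SQL
    AS
    $$
    DECLARE
      deleted_total INTEGER DEFAULT 0;
    BEGIN
      -- Loop over a fixed set of batch partitions, deleting expired rows.
      FOR batch_no IN 1 TO 3 DO
        DELETE FROM sales_fact
        WHERE load_batch = :batch_no
          AND load_date < DATEADD(day, -:days_to_keep, CURRENT_DATE());
        deleted_total := deleted_total + SQLROWCOUNT;
      END FOR;
      -- Branch on the outcome to build the return message.
      IF (deleted_total > 0) THEN
        RETURN 'Deleted ' || deleted_total || ' rows';
      ELSE
        RETURN 'Nothing to delete';
      END IF;
    END;
    $$;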

EDUCATION DETAILS:

● B.Tech from Jawaharlal Nehru Technological University, Anantapur (JNTUA)

EXPERIENCE DETAILS:

● Worked as an IICS Developer (Support) at SG Analytics, from Nov '24 to Mar '25.

● Worked as an IICS Developer at Brane Enterprises Pvt Ltd, from Aug '23 to Sept '24.

● Worked as an IICS Developer at Infosys Ltd, from Apr '22 to May '23.

TECHNICAL SKILLS:

ETL/ELT Tools       IICS, Snowflake cloud, Informatica, CDI
Databases           Oracle, Snowflake Cloud, SQL Server
Middleware          TIBCO
Operating Systems   Windows, Unix
Languages           SQL, PL/SQL
Cloud Storage       AWS S3

PROJECT DETAILS:

Project # 1:
Title: WEBU (Western Europe Business Unit)
Client: YUM
Role: IICS Developer (Production Support)
Duration: Nov 2024 to Mar 2025
Domain: Sales
Environment: IICS, Snowflake cloud, Oracle, AWS, Informatica, CDI, Flat files, SQL Server

Description: Yum, one of the world's largest restaurant brands with over 50,000 restaurants across more than 150 countries, has KFC as one of its largest divisions. As part of its strategic business initiative, YUM has identified a data transformation need: to leverage its global scale and enable global, real-time reporting of sales and a view of its online customers. The KFC Western Europe Business Unit (WEBU) is looking to stand up a common, regional cloud data platform in Western Europe to serve the data needs of the WEBU. The regional data platform for Western Europe will bring in best-in-class data design and technology assets from Helios and modernize the reporting and analytics ecosystems for the Western Europe markets.
Responsibilities:

● Managed and resolved various types of production issues, including data inconsistencies, ETL failures, and performance bottlenecks.
● Wrote update scripts to correct data based on tickets.
● Provided timely updates to clients regarding the status of their tickets, detailing the impact of issues such as soft deletes, data reloads, or query changes, and ensuring transparency during issue resolution.
● Addressed and resolved issues related to soft deletes in production, ensuring that records were appropriately flagged and excluded from downstream systems without permanent deletion (see the sketch after this list).
● Coordinated and executed data reloads for affected data sets, ensuring the
restoration of accurate sales data in the event of data corruption or
inconsistencies.
● Worked on Mappings, Mapping Tasks, Mapplets, and Taskflows.
● Strong knowledge of data warehouse concepts such as star schemas, data marts, and dimension and fact tables. Good knowledge of developing SQL against relational databases such as Oracle.
● Extensively used Input and In-Out parameters.
● Worked on Synchronization, Replication, Dynamic Mapping, and PowerCenter tasks.
● Bulk loaded data from the external stage (AWS S3) and the internal stage into Snowflake using the COPY command.
● Managed issues arising from changes in source queries, ensuring the proper
extraction of data when source systems or queries were modified
● Identified and addressed failures in the ETL workflows due to changes in source data, missing records, or transformation issues, leading to successful ticket resolution and ensuring business continuity.
● Worked with DDL, DML, TCL, and DCL commands.
● Worked closely with business analysts, developers, and database administrators to
analyze changes in source systems and queries, ensuring that any required
adjustments to ETL workflows were implemented efficiently.
● Documented recurring issues and solutions related to soft deletes, data reloads, and
source query changes, creating knowledge-sharing materials to enhance future
troubleshooting and support efficiency.
● Performed extensive testing and wrote SQL queries to validate the data loads.
● Involved in unit testing.
● Delivered on-time production migrations without defects.
● Involved in post-production support.
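
A minimal sketch of the soft-delete pattern described above. The table and column names (sales_transactions, is_deleted, bad_batch_ids) are hypothetical: the record is flagged rather than physically deleted, and downstream systems read through a filtered view.

    -- Flag affected records instead of physically deleting them.
    UPDATE sales_transactions
    SET    is_deleted = TRUE,
           updated_at = CURRENT_TIMESTAMP()
    WHERE  transaction_id IN (SELECT transaction_id FROM bad_batch_ids);

    -- Downstream consumers read a view that excludes soft-deleted rows.
    CREATE OR REPLACE VIEW sales_transactions_active AS
    SELECT *
    FROM   sales_transactions
    WHERE  is_deleted = FALSE;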

Project # 2:
Title: Progressive Insurance
Client: Progressive Corporation
Role: ETL/IICS Developer
Duration: Aug 2023 to Sept 2024
Domain: Insurance
Environment: IICS, Snowflake cloud, Oracle, Unix, Informatica, CDI

Description: The Progressive Corporation is an American insurance company. In late 2022, Progressive became the largest motor insurance carrier in the U.S. The company was co-founded in 1937 by Jack Green and Joseph M. Lewis, and is headquartered in Mayfield Village, Ohio. The company insures passenger vehicles, motorcycles, RVs, trailers, boats, PWC, and commercial vehicles. Progressive also provides home, life, pet, and other insurance through select companies, and additionally offers auto insurance in Australia. To gain a competitive edge and provide better services to their clients, companies like Progressive rely on robust ETL (Extract, Transform, Load) processes. ETL development is the backbone of their data management strategy, enabling them to extract valuable insights, make informed decisions, and enhance customer experiences. The project generates claims data for daily, weekly, monthly, and quarterly reports.

Responsibilities:

● Understood the client requirements.
● Created Informatica IICS mappings for different plans using various transformations.
● Working experience with Informatica Intelligent Cloud Services (IICS) components: Application Integration, Data Integration, Informatica Data Quality, and Informatica PowerCenter, as well as the Salesforce CRM application.
● Worked on SCD Type 1 and SCD Type 2 implementations in IICS (see the MERGE sketch after this list).
● Worked on Mappings, Mapping Tasks, Mapplets, and Taskflows.
● Strong knowledge of data warehouse concepts such as star schemas, data marts, and dimension and fact tables. Good knowledge of developing SQL against relational databases such as Oracle.
● Extensively used Input and In-Out parameters.
● Worked on Synchronization, Replication, Dynamic Mapping, and PowerCenter tasks.
● Bulk loaded data from the external stage (AWS S3) and the internal stage into Snowflake using the COPY command.
● Loaded data into Snowflake tables from the internal stage using SnowSQL (see the PUT/COPY sketch after this list).
● Imported and exported data between the Snowflake internal stage and the external stage (AWS S3).
● Developed Snowflake stored procedures implementing branching and looping.
● Created clone objects using Snowflake's zero-copy cloning.
● Worked with DDL, DML, TCL, and DCL commands.
● Developed complex mappings using transformations such as Source Qualifier, Joiner, Aggregator, Expression, Connected Lookup, Unconnected Lookup, and Router.
● Created Informatica mappings for stage, dimension, and fact table loads.
● Performed extensive testing and wrote SQL queries to validate the data loads.
● Created SCD Type 1 and Type 2 mappings for loading the dimension tables.
● Developed and implemented Informatica mappings for the different stages of ETL.
● Involved in unit testing.
● Delivered on-time production migrations without defects.
● Involved in post-production support.
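
A minimal sketch of the SCD Type 2 load mentioned above, with hypothetical names (dim_customer, stg_customer, is_current, effective_start, effective_end). Step 1 expires the current version when tracked attributes change; step 2 inserts the new current version:

    -- Step 1: expire the current version of customers whose attributes changed.
    MERGE INTO dim_customer d
    USING stg_customer s
      ON d.customer_id = s.customer_id AND d.is_current = TRUE
    WHEN MATCHED AND (d.address <> s.address OR d.phone <> s.phone) THEN
      UPDATE SET is_current = FALSE,
                 effective_end = CURRENT_DATE();

    -- Step 2: insert a new current version for new or just-expired customers.
    INSERT INTO dim_customer
      (customer_id, address, phone, effective_start, effective_end, is_current)
    SELECT s.customer_id, s.address, s.phone, CURRENT_DATE(), NULL, TRUE
    FROM stg_customer s
    LEFT JOIN dim_customer d
      ON d.customer_id = s.customer_id AND d.is_current = TRUE
    WHERE d.customer_id IS NULL;

And the SnowSQL internal-stage load, again with illustrative names (claims.csv, claims_raw): PUT uploads a local file to the table's internal stage, and COPY loads it into the table.

    -- Run from SnowSQL: upload a local file to the table stage, then load it.
    PUT file:///tmp/claims.csv @%claims_raw;
    COPY INTO claims_raw
      FROM @%claims_raw
      FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);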

Project # 3:
Title: Batch Releasing Hub (BRH)
Client: Novartis
Role: Technology Analyst (ETL developer)
Duration: April 2022 to May 2023
Environment: IICS, Snowflake cloud, TIBCO, Unix, Informatica, CDI

Description: Novartis is a leading medicines company. The Batch Releasing Hub (BRH) project will deliver OBBR (One Button Batch Release), which has the objective of implementing a global batch release process. Currently, many systems hold batch-related information, and information from all of these systems needs to be combined for a final batch release decision. This data is made available through TIBCO, and views have been created on top of the Oracle tables. Finally, Informatica consumes the data in these views and loads it into Snowflake tables.

Responsibilities:

● Understood the requirement specifications.
● Coordinated with the on-site lead for daily activities.
● Extracted data from Oracle tables and views and loaded it into the data warehouse, developing Informatica mappings (see the view sketch after this list).
● Made effective use of transformations such as SQL, Expression, Filter, Aggregator, Lookup, Update Strategy, Joiner, Sorter, and Router.
● Resolved issues in transformations and mappings.
● Developed mappings to implement slowly changing dimensions as per the requirements.
● Prepared Unit and System test cases.
● Used mapping parameters, variables, and tasks such as Command, Decision, and Timer tasks.
● Involved in preparing the release notes and job dependency documents.
● Involved in production deployment.
● Raised non-prod change requests and incidents using ServiceNow for migrations, database issues, Informatica relational connection updates, and parameter file updates.
● Analyzed mappings and sessions for better performance.
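
As an illustration of the view layer mentioned above, a minimal sketch of an Oracle view consolidating batch information for Informatica to consume; all table and column names (batches, qa_results, release_log) are hypothetical:

    -- Hypothetical Oracle view combining batch data from several source
    -- tables into one shape for the Informatica mapping to read.
    CREATE OR REPLACE VIEW v_batch_release AS
    SELECT b.batch_id,
           b.product_code,
           q.qa_status,
           r.release_decision
    FROM   batches b
    JOIN   qa_results q ON q.batch_id = b.batch_id
    LEFT JOIN release_log r ON r.batch_id = b.batch_id;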
