
M Naresh Kumar

Azure Data Engineer


Hyderabad, India
+91-9908413504 | [email protected] | linkedin

SUMMARY
• Azure Data Engineer with 6+ years of experience in designing and implementing efficient ETL processes.
• Expertise in Azure Data Factory, Azure Synapse Analytics, Azure Data Lake Storage, and Databricks.
• Proficient in SQL, PySpark, and cloud-based data platforms for scalable data solutions.
• Skilled in building and optimizing data pipelines to streamline data workflows and enhance performance.
• Proven ability to collaborate with cross-functional teams to deliver high-quality, impactful solutions.
• Strong track record of reducing project costs by approximately 20% through optimized data strategies.
• Adept at integrating cloud-based technologies to support data storage, processing, and transformation needs.
• Focused on delivering solutions that align with business objectives and drive efficiency.

SKILLS
• Azure Data Factory • Spark SQL • PySpark
• Azure Data Lake Storage • Python • Azure SQL
• Databricks • Logic Apps • Power BI
• Delta Lake • Synapse Analytics • Unity Catalog
• Microsoft Fabric

WORK EXPERIENCE
IT Monk Inc., Hyderabad July 2018 – Present
As a Data Engineer August 2020 – Present
Project #1: EBM | Client: Trust Mark Health Benefits
Tools & Technologies: Azure Data Factory, Azure Databricks, Python, PySpark, Azure SQL, Logic Apps, Key Vault, Azure
Synapse, and Microsoft Fabric.
• Created ETL pipelines in the Azure ecosystem using Azure Data Factory, building simple to complex pipelines, activities,
datasets, and data flows.
• Developed and maintained data ingestion and processing systems using Azure Data Factory, Databricks, PySpark, Python,
Logic Apps, and SQL.
• Ensured data consistency and accuracy through data validation and cleansing techniques.
• Implemented scalable ETL pipelines using Azure Data Factory and Databricks.
• Performed a variety of transformations based on requirements using Python, PySpark, and Spark SQL.
• Designed a metadata-driven framework to load hundreds of tables from on-premises systems to the cloud using Azure Data Factory
activities such as Get Metadata, Lookup, Stored Procedure, If Condition, Copy, Switch, and Validation.
• Created dynamic pipelines, datasets, and linked services in Azure Data Factory for data movement and transformation.
• Migrated data from on-premises SQL Server to cloud databases (Azure Synapse Analytics (DW) and Azure SQL DB).
• Implemented data ingestion, data processing (transformations, enrichment, and aggregation), and reporting.
• Achieved proficiency in Azure Data Factory, Databricks, ADLS Gen2, Delta Lake, and other Azure services.
• Led and managed data migration projects using Microsoft Fabric.
• Extracted, transformed, and analysed data from Azure Data Lake using Microsoft Fabric.
• Migrated data from multiple on-premises systems to Azure cloud using Azure Data Factory, Azure Synapse, Microsoft Fabric,
Azure Data Lake, and SQL.
• Set up Microsoft Fabric Data Gateway to securely connect on-premises data sources with cloud services, enabling seamless
data integration and real-time analytics.
• Successfully implemented scalable data pipelines within Microsoft Fabric for seamless integration and data processing across
Microsoft services.
• Implemented Microsoft Fabric Dataflow Gen2 for transformation, leveraging advanced tools for scalable processing and
seamless data movement across multiple sources.
• Integrated data lake and warehouse capabilities with Microsoft Fabric Lakehouse, enabling unified storage and high-
performance analytics across structured and unstructured data.
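The metadata-driven loading pattern described above can be sketched as follows. This is a minimal, illustrative control-flow sketch in plain Python of what ADF's Get Metadata / Lookup / Copy activities orchestrate; the table names, config fields, and watermark column are hypothetical examples, not the actual framework:

```python
# Hypothetical metadata table: one row per source table to be copied.
# In the real framework this would come from a Lookup activity over a
# control table, not a hard-coded list.
TABLE_CONFIG = [
    {"source": "dbo.Customers", "target": "raw/customers", "watermark": "ModifiedDate"},
    {"source": "dbo.Orders",    "target": "raw/orders",    "watermark": "OrderDate"},
]

def build_copy_tasks(config):
    """Turn metadata rows into per-table copy-task descriptions."""
    tasks = []
    for row in config:
        tasks.append({
            # Incremental extract filtered on the watermark column.
            "query": f"SELECT * FROM {row['source']} WHERE {row['watermark']} > @last_run",
            "sink": row["target"],
        })
    return tasks

if __name__ == "__main__":
    for task in build_copy_tasks(TABLE_CONFIG):
        print(task["sink"], "<-", task["query"])
```

The point of the pattern is that adding a new table only requires a new metadata row, not a new pipeline.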

Project #2: Tishman Speyer


Tools & Technologies: Azure Data Factory, Azure Databricks, Python, PySpark, and Azure SQL.
• Created ETL pipelines in the Azure ecosystem using Azure Data Factory, building simple to complex pipelines, activities,
datasets, and data flows.
• Assisted in the development of Logic Apps, Azure Function apps, and PowerShell workflows.
• Utilized Azure compute services (Databricks, Spark) to design transformation logic and stage transformed data.
• Built ETL pipelines using PySpark and Spark SQL, performing the required transformations.
• Ran Python with Apache Spark to design and build data engineering workloads on cloud compute, applying DevOps tooling and
frameworks to build data pipelines.
• Extracted, transformed, and loaded data from source systems to Azure data storage services using Azure Data Factory.
• Utilized various ADF connectors and linked services, including Blob Storage, Azure DWH, Azure Data Lake, and Azure SQL Server.
• Implemented SCD Type 1 and Type 2 loads, email notifications using Logic Apps, rollback using batch ID, and error handling.
• Created Logic Apps to integrate applications, data, services, and systems across enterprises and organizations.
• Created Logic Apps to send email notifications to different users when events occur in various applications, services, and
systems.
• Designed and deployed ETL/ELT jobs to curate, transform, and aggregate data into source models for end-user analytics use
cases.
• Applied data modeling and SQL development skills, writing stored procedures, functions, and transformations.
• Worked extensively with MS SQL Server.
• Worked with Azure Key Vault; scheduled automation and monitoring instrumentation for data movement jobs.
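The SCD Type 2 handling mentioned above can be illustrated with a small plain-Python sketch (the project itself used ADF and Databricks; the column names, the `id`/`value` keying, and the date format here are hypothetical, chosen only to show the expire-and-append mechanics):

```python
from datetime import date

def apply_scd2(current, incoming, today=None):
    """SCD Type 2: expire changed rows and append new versions, keyed by 'id'.

    Rows are dicts; history is preserved by closing 'valid_to' on the old
    version and inserting a new current version.
    """
    today = today or date.today().isoformat()
    by_id = {r["id"]: r for r in current if r["is_current"]}
    result = list(current)
    for row in incoming:
        existing = by_id.get(row["id"])
        if existing is None:
            # Brand-new key: insert as the first current version.
            result.append({**row, "valid_from": today, "valid_to": None, "is_current": True})
        elif existing["value"] != row["value"]:
            # Changed attribute: expire the old version, append a new one.
            existing["valid_to"] = today
            existing["is_current"] = False
            result.append({**row, "valid_from": today, "valid_to": None, "is_current": True})
        # Unchanged rows are left untouched (history preserved).
    return result
```

A Type 1 load would instead overwrite `value` in place and keep no history; the rollback-by-batch-ID scheme would additionally stamp each appended row with the load's batch identifier.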

As an Analyst July 2018 – August 2020


Project #1: Liquid Analytics | Client: CSL
Role: Power BI Developer
Tools & Technologies: Power BI Desktop, Power BI Service, DAX Functions
• Imported data from data sources and prepared it per user requirements using Power Query Editor.
• Built relationships between the various datasets in Power Pivot.
• Created new columns, measures, and conditional columns using DAX functions.
• Created visual-level, page-level, and report-level filters as per reporting needs.
• Built a range of visualizations, including donut, pie, bar, tree map, and waterfall charts, and card visuals.
• Designed diverse report types, including drill-down, drill-through, and sync-slicer reports.
• Created various customized, interactive reports and dashboards in Power BI.
• Published reports to the Power BI service and provided access to required users.

EDUCATION
• B.Tech from Jawaharlal Nehru Technological University | 2009

CERTIFICATIONS
• Certified Microsoft Azure Data Engineer Associate
• Google Professional Cloud Architect Certification
https://www.credly.com/badges/ccc1eb65-85be-483d-bcf6-dd42fea36d3a/linked_in_profile
