
Mukeshchandra Patel (Sr. SQL/SSIS/Informatica Developer)
Cell: (551) 358-5786
Email ID: [email protected]

SUMMARY:
• Over 12 years of experience as a SQL/BI Developer in OLTP/OLAP/MSBI/Informatica PowerCenter/Informatica Cloud environments for very large SQL Server databases.
• Experience with relational and dimensional data modeling using the Ralph Kimball and Bill Inmon dimensional-modeling methodologies, including Star and Snowflake schemas.
• Strong in SQL development, SQL reporting, analytics, and business intelligence (SSIS, SSRS, SSAS, Informatica PowerCenter), with strong database optimization skills.
• Created Star and Snowflake schemas and built OLAP cubes.
• Expertise in data Extraction, Transformation, and Loading (ETL) using SQL Server Integration Services (SSIS).
• Experience in Informatica PowerCenter 9.x: created repositories, mappings, and workflows.
• Experience in migrating data from Excel, flat files, and Oracle to MS SQL Server using BCP, DTS, and SSIS.
• Validated database integrity after ETL loads and troubleshot nightly ETL load failures.
• Worked with multiple operational information sources, including flat files/Excel, Oracle, SQL Server, Redshift, Informatica, BigQuery, and MySQL.
• Experience in SSRS/SSAS and Tableau.
• Processed output XML files, removing empty delta files and FTPing the output XML files to a different server.
• Expert in creating and using stored procedures, SSIS packages, views, user-defined functions, and complex queries involving multiple tables and partitions for better ETL performance.
• Experience in Informatica PowerCenter and Informatica Cloud.
• Excellent experience in creating indexes and indexed views, query optimization, and performance tuning.
• Deployed SSRS reports to a report server and implemented security at the report server through authentication and authorization methods.
• Experience with version control/configuration management tools, including CI/CD pipelines and Git.
• Extensive experience in writing parameterized queries for generating tabular reports, formatting report layouts, and building sub-reports using global variables, expressions, functions, sorting, data-source definitions, and subtotals in SSRS 2016/2014/2012.
• Used Power BI Power Pivot to develop data analysis prototypes, and used Power View and Power Map to visualize reports.
• Experience with job scheduling tools such as JAMS and Control-M, and with UNIX shell scripting.
• Well experienced in data Extraction, Transformation, and Loading (ETL) using tools such as SQL Server Integration Services (SSIS 2017/2019), replication, Always On features, and in-memory tables.
• Excellent experience in maintaining batch logging and error logging with event handlers and in configuring connection managers using SSIS, C#, and Python.
• Experience creating and maintaining SSIS packages using C#, Java, and Python.
• Experience with big data migration.
• Excellent experience in developing polished drill-down, cascading, parameterized, drill-through, and sub-reports and charts using global variables, expressions, and functions for business users.
• Created jobs, SQL Server Agent mail, alerts, SSIS schedules, and environment variables in SQL Server 2012/2014; migrated SQL Server 2008 R2 to SQL Server 2012/2014/2016.
• Experience in configuring registered servers, linked servers, and multi-server administration.
• Self-motivated to learn new technologies, with excellent verbal and written communication skills combined with strong interpersonal, conflict-resolution, and analytical skills.
• Working experience with the Agile software development methodology.
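The post-load integrity checks mentioned above (validating database integrity after ETL loads) can be sketched as a row-count reconciliation. This is an illustrative sketch only: the table names are hypothetical, and an in-memory SQLite database stands in for the actual SQL Server staging and warehouse tables.

```python
import sqlite3

def validate_load(conn, source_table, target_table):
    """Compare row counts between a staging table and its warehouse target.
    A minimal post-ETL integrity check; real loads would also compare
    checksums or key coverage."""
    src = conn.execute(f"SELECT COUNT(*) FROM {source_table}").fetchone()[0]
    tgt = conn.execute(f"SELECT COUNT(*) FROM {target_table}").fetchone()[0]
    return {"source": src, "target": tgt, "match": src == tgt}

# In-memory demo standing in for SQL Server staging/warehouse tables.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stg_orders (id INTEGER)")
conn.execute("CREATE TABLE dw_orders (id INTEGER)")
conn.executemany("INSERT INTO stg_orders VALUES (?)", [(i,) for i in range(5)])
conn.executemany("INSERT INTO dw_orders VALUES (?)", [(i,) for i in range(5)])
result = validate_load(conn, "stg_orders", "dw_orders")
```

A nightly job would run a check like this after each load and raise an alert when `match` is false.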

TECHNICAL SKILLS:

Tools SQL Server Integration Services (SSIS 2017/2019), SQL Server Reporting Services (SSRS), SQL Server Analysis Services (SSAS), Data Integrator, Data Transformation Services (DTS), Informatica PowerCenter, Informatica Cloud, Tableau
Database Tools SQL Profiler, SQL Query Analyzer, Index Analyzer, SQL Server Management Studio, Erwin, Redgate, LiteSpeed, Foglight, SCOM, OEM, SQL Server 2008/2012/2014/2019, MySQL, T-SQL
Source Control TFS, GitHub, SVN
Ticketing Tools Service Desk, BMC Remedy, FogBugz, JAMS, Snow
Domain Knowledge Taxation, BFSI, Manufacturing
Languages T-SQL, SQL, PL/SQL, Java, UNIX/Linux shell script, Python, C#
Applications Microsoft Office Suite, Adobe Photoshop, Acrobat Reader/Writer
Databases MS SQL Server, MongoDB, MySQL, Hadoop, DB2, ODS, Azure, AWS
Operating Systems Windows Server, UNIX, Linux

PROFESSIONAL EXPERIENCE:

Client Infosys

Role Technology Lead -SQL/SSIS-Informatica 9.x

Duration Oct 2022 to May 2023

Location Warren, NJ

Responsibilities:
• Supported the team on data loads, manually checking multiple data validations and various job failures.
• Updated and prepared stored procedures (scripts) for all data loads following business requirements.
• Created models in Informatica PowerCenter.
• Maintained and updated pre-created jobs, monitored auto-run jobs, and maintained stored procedures, views, and SSIS packages.
• Validated record counts, data quality, business-logic implementation, and data population, and performed data analysis.
• Knowledge of Azure and cloud environments with Agile/Scrum methodology.
• Created and updated Tableau and matrix reports.
• Supported and created month-end reporting for business users.
• Identified errors, verified the affected records, and determined failure root causes.
• Experience in OLTP/OLAP system study, analysis, and modeling, developing data warehouse schemas such as the Star and Snowflake schemas used in dimensional and multi-dimensional modeling.
• Followed business logic and updated/designed/developed SSIS (ETL) packages and Informatica mappings to extract, transform, and load data from non-prod servers to staging tables.
• Developed ETL programs using Informatica 9.x to implement the business requirements.
• Created shell scripts to fine-tune the ETL flow of the Informatica workflows.
• Experience with the Repository Manager, Designer, Workflow Monitor, and Workflow Manager.
• Created and updated C#, Python, and UNIX shell scripts.
• Updated/created SSIS packages to load data from XML/flat files to the data warehouse using transformations such as Lookup, Derived Column, Conditional Split, Sort, Multicast, Aggregate, and Slowly Changing Dimension.
• Managed and created database objects such as tables, views, stored procedures, triggers, and functions using T-SQL to provide structure for storing and maintaining the database efficiently.
• Updated/created SSIS packages to perform filtering operations and to import data daily from the OLTP system to SQL Server, using a master config for automated data-load runs.
• Performed basic QA testing in the staging and production environments.
• Managed the BI environment, which includes the data model, data sources, ETL tools, target data warehouse, data marts, and OLAP analysis.
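The error-identification work described above (verifying failed records and finding failure root causes) follows the same pattern as an SSIS error output: rows failing validation are redirected to an error bucket with a reason attached. A minimal Python sketch, with invented field names:

```python
def split_rows(rows, required=("id", "amount")):
    """Route rows that fail basic validation to an error bucket with a
    reason, mirroring how an SSIS error output captures failed records
    for root-cause analysis. Field names are illustrative."""
    good, errors = [], []
    for row in rows:
        missing = [f for f in required if row.get(f) in (None, "")]
        if missing:
            errors.append({"row": row, "reason": "missing " + ", ".join(missing)})
        else:
            good.append(row)
    return good, errors

rows = [
    {"id": 1, "amount": 10.0},
    {"id": 2, "amount": None},   # fails validation: no amount
    {"id": 3, "amount": 7.5},
]
good, errors = split_rows(rows)
```

Grouping the captured reasons then points directly at the root cause of a failed load (a malformed source file, a schema drift, and so on).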

Tools: MS SQL Integration Services 2015, MS SQL Server Reporting Services 2015, MS SQL Server 2015, Crystal Reports, Windows Server 2012, MS Office 2013, Visual Studio 2015/19, Tableau, Informatica PowerCenter
Client RSI

Role SQL/MSBI Developer – SSIS/Informatica

Duration Oct 2019 to May 2022

Location Providence, RI

Responsibilities:

• Received the specification book (SPEC) from RSI, generated business requirements, and prepared requirements documentation following the SPEC book for each data load of individual taxpayer and business tax returns.
• Managed and loaded all the BMF-BRTF, IMF-IRTF, IRMF, CP2K, SOS, DLT Employee/Employer, DMV License, DMV Registration, FEIN, PTIN, and LEVY data loads (annual, monthly, and quarterly).
• Created SSIS packages to load data into the final destination using C# and Python.
• Created Python and C# scripts for business logic.
• Prepared stored procedures (scripts) for each data load.
• Supported the QA team and end users on how to use the data warehouse and perform unit testing, and assisted them in pointing their existing queries to unit testing with 'STAARS' vs. PW.
• Created QA scripts for unit testing of individual data loads.
• Knowledge of Azure and cloud environments with Agile/Scrum methodology.
• Identified errors, associated and researched records, and verified them.
• Created Tableau and drill-down reports.
• Worked extensively on developing ETL programs supporting data extraction, transformation, and loading in Informatica PowerCenter.
• Created UNIX shell scripts to run the Informatica workflows and control ETL flow.
• Effectively used Informatica parameter files for defining mapping variables, workflow variables, FTP connections, and relational connections.
• Experience with the auto-scheduling tools Control-M and AutoSys, and with shell scripting.
• Used various transformations, including Filter, Expression, Sequence Generator, Update Strategy, Joiner, Router, and Aggregator, to create robust mappings in Informatica PowerCenter Designer 9.x.
• Implemented Slowly Changing Dimensions in ETL to maintain historical data in the data warehouse.
• Experience in OLTP/OLAP system study, analysis, and modeling, developing data warehouse schemas such as the Star and Snowflake schemas used in dimensional and multi-dimensional modeling.
• Created a centralized data warehouse (ODS) and developed de-normalized structures based on the requirements and to improve query performance for end users.
• Designed and developed SSIS 2014/2017/2019 (ETL) packages to extract, transform, and load data from non-prod servers to staging tables, and used the A&G validation process from staging to the data warehouse.
• Updated/created SSIS packages to load data from flat files to the data warehouse using transformations such as Lookup, Derived Column, Conditional Split, Sort, Multicast, Aggregate, and Slowly Changing Dimension.
• Responsible for creating database objects such as tables, views, stored procedures, triggers, and functions using T-SQL to provide structure for storing and maintaining the database efficiently.
• Updated/created SSIS packages to perform filtering operations and to import data daily from the OLTP system to SQL Server, using a master config for automated data-load runs via JAMS.
• Managed the BI environment, which includes the data model, data sources, ETL tools, target data warehouse, data marts, and OLAP analysis.
• Created a new TFS project and followed the TFS workflow.
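The flat-file-to-staging loads above combine a Conditional Split (dropping incomplete rows) with a Derived Column (computing a new field). A minimal Python sketch of that data flow, assuming an invented tax-return extract and using an in-memory SQLite table in place of the SQL Server staging table:

```python
import csv
import io
import sqlite3

# Hypothetical flat-file extract standing in for one of the tax data loads.
flat_file = io.StringIO("ssn,income\n111,50000\n222,\n333,120000\n")

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stg_returns (ssn TEXT, income REAL, bracket TEXT)")

for rec in csv.DictReader(flat_file):
    if not rec["income"]:          # conditional split: skip incomplete rows
        continue
    income = float(rec["income"])
    bracket = "high" if income > 100000 else "standard"   # derived column
    conn.execute("INSERT INTO stg_returns VALUES (?, ?, ?)",
                 (rec["ssn"], income, bracket))

loaded = conn.execute("SELECT ssn, bracket FROM stg_returns ORDER BY ssn").fetchall()
```

The incomplete row is dropped and the derived `bracket` column is populated during the load, just as the corresponding SSIS transformations would do in the package's data flow.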

Tools: MS SQL Integration Services 2015, MS SQL Server Reporting Services 2015, MS SQL Server 2015, Informatica, Crystal Reports, Windows Server 2012, MS Office 2013, JAMS, Visual Studio 2015/19, Tableau Desktop, Tableau Reader, Power BI Pro
Client Geico Insurance

Role Sr.SQL/MSBI Developer

Duration Jun 2018 to Oct 2019

Location Chevy Chase, MD

Responsibilities:

▪ Met with business users, gathered business requirements, and prepared the requirement-analysis documentation.
▪ Provided training to the business users on how to use the new data warehouse, browsing cubes from Excel, and assisted them in rewriting their existing queries to point to the new warehouse.
▪ Implemented Slowly Changing Dimensions in ETL to maintain historical data in the data warehouse.
▪ Experience in OLTP/OLAP system study, analysis, and modeling, developing data warehouse schemas such as the Star and Snowflake schemas used in dimensional and multi-dimensional modeling.
▪ Created a centralized data warehouse (ODS) and developed de-normalized structures based on the requirements and to improve query performance for end users.
▪ Designed and developed SSIS 2014/2017/2019 (ETL) packages to extract, transform, and load data from the OLTP system to the data warehouse using C# and Python scripts.
▪ Created SSIS packages to load data from flat files to the data warehouse using transformations such as Lookup, Derived Column, Conditional Split, Sort, Multicast, Aggregate, and Slowly Changing Dimension.
▪ Used various SSIS tasks such as Fuzzy Lookup and Fuzzy Grouping for data scrubbing, including data validation checks during staging, before loading the data into the data warehouse.
▪ Worked with the Informatica PowerCenter tools: Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
▪ Involved in building the ETL architecture and source-to-target mappings to load data into the data warehouse using Informatica 9.x.
▪ Documented Informatica mappings in an Excel spreadsheet.
▪ Worked with the UNIX team to write UNIX shell scripts customizing server job scheduling.
▪ Responsible for creating database objects such as tables, views, stored procedures, triggers, and functions using T-SQL to provide structure for storing and maintaining the database efficiently.
▪ Created views and indexed views for performance tuning.
▪ Created SQL Server Analysis Services projects and built multi-dimensional databases, OLAP cubes, and dimensions for analysis in specific areas using SSAS.
▪ Developed ETL programs using Informatica to implement the business requirements, and created shell scripts to fine-tune the ETL flow of the Informatica workflows.
▪ Created SSIS packages to perform filtering operations and to import data daily from the OLTP system to SQL Server; responsible for implementing data viewers and SSIS logging for package error handling.
▪ Responsible for managing the BI environment, which includes the data model, data sources, ETL tools, target data warehouse, data marts, OLAP analysis, and reporting tools.
▪ Created tabular reports, drill-downs, charts, functions, and maps using SSRS; created reports that retrieve data using stored procedures that accept parameters.
▪ Worked with the business to create various dashboards covering different areas of the business to re-establish the business model.
▪ Responsible for performance tuning of SQL commands using SQL Profiler and the Database Engine Tuning Advisor.
▪ Supported and administered TFS, including creating new user groups.
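The Slowly Changing Dimension work above (Type 2: keep history by expiring the old row and inserting a new current one) can be sketched in Python. The customer dimension schema is invented, and an in-memory SQLite table stands in for the warehouse dimension table; a real implementation would also carry effective-date columns:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE dim_customer (
    customer_id INTEGER, city TEXT, is_current INTEGER)""")

def scd2_upsert(conn, customer_id, city):
    """Type 2 SCD: when a tracked attribute changes, expire the current
    row (is_current = 0) and insert a new current row, preserving history."""
    cur = conn.execute(
        "SELECT city FROM dim_customer WHERE customer_id=? AND is_current=1",
        (customer_id,)).fetchone()
    if cur is None or cur[0] != city:
        conn.execute(
            "UPDATE dim_customer SET is_current=0 "
            "WHERE customer_id=? AND is_current=1", (customer_id,))
        conn.execute("INSERT INTO dim_customer VALUES (?, ?, 1)",
                     (customer_id, city))

scd2_upsert(conn, 42, "Chevy Chase")
scd2_upsert(conn, 42, "Bethesda")   # attribute change creates a new version
history = conn.execute(
    "SELECT city, is_current FROM dim_customer ORDER BY rowid").fetchall()
```

After the second call the old row remains as history with `is_current = 0`, which is exactly what lets warehouse queries reconstruct the attribute as of any point in time.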

Tools: MS SQL Integration Services 2012, Informatica, MS SQL Server Reporting Services 2012, MS SQL Server 2012, Crystal Reports, Windows Server 2012, MS Office 2013, Visual Studio 2013/2014, Tableau Desktop, Tableau Reader

Client National General Insurance

Role MSBI Developer

Duration Jan 2017 to Mar 2018

Location Fort Worth, TX

Responsibilities:

▪ Participated in requirements gathering, cube design, application design, and SSAS development.
▪ Constructed cubes based on data volumes, mostly adhering to a Star schema.
▪ Used SSIS (2014/2017) to create ETL packages (.dtsx files) to validate, extract, transform, and load data into data warehouse databases.
▪ Extracted data from Excel files, high-volume data sets from data files, Oracle, DB2, and Salesforce.com (SFDC) using Informatica ETL mappings and SQL/PL-SQL scripts, and loaded it into the data-store area.
▪ Used Informatica PowerCenter for extraction, transformation, and loading (ETL) of data in the data warehouse.
▪ Created and scheduled an SSIS package to process the SSAS cube nightly based on updates to the Oracle data mart tables.
▪ Worked in development/staging/production environments and used Team Foundation Server/Visual SourceSafe for version control and communication, as well as defining actions, translations, and perspectives.
▪ Developed stored procedures, tables, views, triggers, and functions, implementing best practices to maintain optimal performance.
▪ Created automated T-SQL scripts to process summary database transactions that run off-hours.
▪ Created complex SSAS cubes with multiple fact measure groups and multiple dimension hierarchies based on the OLAP reporting needs.
▪ Implemented SQL Server OLAP Services (SSAS) for building the data cubes.
▪ Created complex cubes for viewing business data from different perspectives by slicing and dicing the data.
▪ Defined MDX scripts for querying the data from OLAP cubes (SSAS) and building reports on top of the cubes.
▪ Developed MDX scripts to create datasets for reporting, including interactive drill-down reports, report models, and dashboard reports.
▪ Designed and implemented facts, dimensions, and SSAS cubes using dimensional data modeling standards in SQL Server 2012 / Business Intelligence Development Studio.
▪ Expert in calculating measures and dimension members using Multidimensional Expressions (MDX), mathematical formulas, and user-defined functions.
▪ Created and designed SSAS cubes and promoted them to production using XMLA scripts.
▪ Designed and implemented parameterized, drill-through, drill-down, and tabular reports in SSRS using the SSAS cube for PerformancePoint.
▪ Implemented and developed multi-parameterized reports in Report Builder and SSRS using the SSAS cube, allowing users to make selections before executing the reports and making them more user-friendly.
▪ Worked with SQL Profiler to improve and enhance MDX query execution time as part of the performance tuning process.
▪ Wrote several complex stored procedures for efficient handling of the application.
▪ Transferred data from various OLTP data sources, such as Oracle, MS Access, MS Excel, flat files, and CSV files, into SQL Server 2012.
▪ Involved in designing, developing, modifying, and enhancing database structures and database objects.
Tools: SQL Server 2012/2008 R2, T-SQL, Windows 7, SSRS, SSAS, SSIS, MS Visio, PerformancePoint Services, Informatica, Power Pivot, Report Builder, Excel

Client Infinite

Role Data Analyst

Duration May 2013 – Jan 2016

Location Hyd - INDIA

Responsibilities:
▪ Monitored the database for duplicate records.
▪ Performed metrics reporting, data mining, and trend analysis in a helpdesk environment using Access.
▪ Gathered data from the Help Desk ticketing system and wrote ad hoc reports, charts, and graphs for analysis.
▪ Identified and reported on various computer problems within the company to upper management.
▪ Reported on emerging trends to identify changes or trouble within the systems, using Access and Crystal Reports.
▪ Guided, trained, and supported teammates in testing processes, procedures, analysis, and quality control of data, drawing on past experience and training in Oracle, SQL, Unix, and relational databases.
▪ Maintained Excel workbooks, including developing pivot tables, exporting data from external SQL databases, producing reports, and updating spreadsheet information.
▪ Ran workflows created in Informatica by developers, then compared the data before and after transformation to ensure that the transformation was successful.
▪ Deleted users from cost centers, removed users' authority to grant certain monetary amounts to certain departments, and deleted certain cost centers and profit centers from the database.
▪ Loaded new or modified data into the back-end Oracle database.
▪ Merged duplicate records and ensured that the information was associated with company records.
▪ Standardized company names and addresses and ensured that the necessary data fields were populated.
▪ Reviewed the database proactively to identify inconsistencies in the data, conducting research using internal and external sources to determine whether information was accurate.
▪ Resolved data issues by following up with the end user.
▪ Coordinated activities and workflow with other data stewards in the firm to ensure data changes were made effectively and efficiently.
▪ Reviewed the database to identify and recommend adjustments and enhancements, including external systems and types of data that could add value to the system.
▪ Extracted data from the database and provided data analysis using SQL to business users based on their requirements; created pivots and charts in Excel sheets to report data in the requested format.
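The duplicate-detection and name-standardization steps above typically work by normalizing each company name to a canonical key and then grouping records that share a key. A minimal Python sketch; the normalization rules and sample records are illustrative only:

```python
def standardize(name):
    """Normalize a company name for duplicate matching: lowercase,
    strip punctuation and extra spaces, drop common legal suffixes.
    The rule set here is a small illustrative sample."""
    cleaned = " ".join(name.strip().lower().replace(",", "").split())
    for suffix in (" inc", " incorporated", " llc"):
        if cleaned.endswith(suffix):
            cleaned = cleaned[: -len(suffix)]
    return cleaned.strip()

def find_duplicates(records):
    """Group record ids by standardized name; groups of >1 are duplicates
    that a data steward would review and merge."""
    groups = {}
    for rec_id, name in records:
        groups.setdefault(standardize(name), []).append(rec_id)
    return {key: ids for key, ids in groups.items() if len(ids) > 1}

records = [(1, "Acme, Inc"), (2, "ACME Incorporated"), (3, "Globex LLC")]
dups = find_duplicates(records)
```

Here records 1 and 2 normalize to the same key and are flagged for merging, while the single Globex record is left alone.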
Environment: PL/SQL, Informatica PowerCenter (PowerCenter Designer, Workflow Manager, Workflow Monitor), SQL*Loader, Cognos, Oracle, SQL Server, Erwin, Windows, TOAD, MS Outlook, MS Project, MS Word, MS Excel, MS Visio, MS Access, Power MHS, Citrix, Clarity, MS SharePoint

Client SoftSol

Role Data Analyst with ETL

Duration Oct 2010 – Apr 2013

Location Hyd - INDIA

Responsibilities:
▪ Involved in building scalable, highly available, and reliable business intelligence systems.
▪ Developed Extraction, Transformation, and Loading (ETL) processes to acquire and load data spanning various applications (enterprise, cloud), data centers, and social graphs.
▪ Experience in performance tuning and cache optimization with Informatica and OBIEE.
▪ Used staged mappings to perform asynchronous web-service requests and responses as part of Informatica mappings.
▪ Participated in setting the strategy for data warehousing systems; set standards and best practices for the programming, security, and operations areas.
▪ Collaborated with developers, stakeholders, and subject-matter experts to establish the technical vision and analyze trade-offs between usability and performance needs.
▪ Played a key role in setting standards for both the physical and logical database schema and table design.
▪ Modified UNIX scripts as part of an upgrade, ensuring they pointed to the correct directories.
▪ Participated in the design and specification of the hardware and storage architecture used to support data warehousing systems.
▪ Involved in collaborating with global delivery teams.
▪ Involved in data modeling using ER and dimensional modeling techniques.
▪ Experience in defining security standards aligned with corporate data governance and compliance policies.
▪ Involved in designing prototypes and working closely with implementation teams in engineering and operations.
▪ Developed ETL procedures to transform the data in intermediate tables according to business rules and functional requirements.
▪ Implemented various workflows using transformations such as SQL Transformation, XML Transformation, Normalizer, Lookup, Aggregator, and Stored Procedure, and scheduled jobs for different sessions.
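Two of the transformations named above, Lookup and Aggregator, compose into the common enrich-then-summarize pattern: each source row is enriched from a reference table, then measures are summed per enrichment key. A small Python sketch of that flow; the product lookup and source rows are invented:

```python
# Hypothetical reference data: a Lookup transformation's source
# (product code -> category), with unmatched codes defaulting out.
lookup = {"P1": "Hardware", "P2": "Software"}

# Hypothetical source rows: (product code, sale amount).
source_rows = [("P1", 10.0), ("P2", 20.0), ("P1", 5.0), ("P9", 1.0)]

totals = {}
for product, amount in source_rows:
    category = lookup.get(product, "Unknown")             # Lookup step
    totals[category] = totals.get(category, 0.0) + amount  # Aggregator step
```

The unmatched code `P9` falls into an "Unknown" bucket rather than being silently dropped, which is the usual choice so that reconciliation totals still balance against the source.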

Education: 1. MS in Advanced Diploma in Computer Science - NIIT - 2001


2. BA - North Gujarat University, India - 2007
