
John K

====================================================================================================
SUMMARY

 15+ years of overall experience in the Information Technology industry, including 8+ years in the healthcare domain.
 Experience building Snowpipe ingestion and data sharing in Snowflake (see the sketch after this list).
 Strong data engineering experience with Snowflake zero-copy cloning and Time Travel.
 2 years of experience in Hadoop cluster development.
 3+ years of experience working with DataStage.
 7 years of UNIX shell scripting and experience with third-party scheduling tools such as AutoSys.
 5+ years of experience working with IICS (Informatica Intelligent Cloud Services).
 Extensively worked with Teradata utilities (BTEQ, FastExport, FastLoad, MultiLoad) to export and load data to/from different source systems, including flat files.
 Extensive knowledge of dimensional data modeling, star and snowflake schemas, and fact and dimension tables; experience in data migration.
 Expertise in Matillion ETL mappings, scheduling, backups, networking, and infrastructure setup.
 Experience with AWS components and APIs.
 Sound knowledge of Netezza SQL.
 12 years of extensive experience with databases (MS SQL Server 2000/2005/2008, Oracle 8i/9i/10g/11g, Teradata).
 12+ years of ETL and data warehousing experience using Informatica PowerCenter.
 Automated multiple manual processes using Matillion, AWS Lambda, and Airflow as needed.
 Experience migrating on-premises applications to Azure, configuring VNets and subnets per project requirements, and performing PowerShell scripting for patching, imaging, and deployments in Azure.
 Good understanding of star schemas, dimensional modeling, E-R modeling, Slowly Changing Dimensions (SCD), and data warehousing concepts.
 Experience with dbt (data build tool), Azure Data Factory, and Kubernetes.
 End-to-end experience designing and deploying data visualizations using Tableau.
 Expertise in various ETL and reporting tools: Talend and the Pentaho BI Suite (Pentaho Data Integration/Kettle, Pentaho Report Designer, Pentaho Schema Workbench, Pentaho Enterprise Console, Pentaho Analyzer, and Mondrian OLAP).
 Extensive work with PL/SQL and Oracle performance tuning using SQL plans, SQL hints, Oracle partitioning, and various index and join types.
 Experienced with Informatica B2B Data Transformation and Data Exchange; knowledge of change data capture (CDC).
 Performed technical design reviews and code reviews.
 Created stored procedures and methods that read semi-structured data such as JSON and Parquet and flatten it into a data mart built in Snowflake (see the sketch after this list).
 Created real-time data streams using Snowpipe in Snowflake.
 Created multiple SSRS reports with external subscriptions.
 Created a translation service in AWS to translate Snowflake data by integrating Snowflake with the service via Lambda.
 Very good knowledge of the FACETS tool and the healthcare domain; worked on modules such as Subscriber, Groups, Enrollment, EDI, HL7, HEDIS, EMR, Claims, Billing, Accounting, Provider, and Utilization Management.
 Expertise in business intelligence tools such as Tableau, Alteryx, Ab Initio, and ThoughtSpot.
 Good experience with FACETS CTP (Claims Test Pro) and FACETS testing.
 Good experience manipulating EDI X12 transaction files per HIPAA guidelines: 837 (health care claim), 835 (payment/remittance advice), 834 (benefit enrollment), 270/271 (health care benefits inquiry/response), and 276/277 (claim status).
 Experience with Informatica Enterprise Data Catalog (EDC), Informatica Data Transformation, and Data Privacy Management (DPM).
 Experience with Informatica Data Quality (IDQ) and the Informatica MDM suite of tools: designing, developing, optimizing, and supporting MDM processes, including performance tuning of existing processes.
 Experience with data warehousing applications using ETL tools such as Talend and Informatica.
 Mapped EDI X12 files per HIPAA guidelines for various EDI transactions.
 Experience in backend testing of SQL queries using Oracle PL/SQL, SQL Server, and DB2.
 Strong skills in data analysis, data architecture, data requirements analysis, and data mapping for ETL processes.
 Experience with IICS Cloud Application Integration (CAI).
 Ability to prioritize and execute tasks in a high-pressure environment. Reliable, proactive, responsible,
hardworking, and good team player.
 Experience in mentoring and providing knowledge transfer to team members, support teams, and customers.
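
Illustrative sketch of the Snowflake features named above (Snowpipe, streams, Time Travel cloning, and flattening semi-structured data). This is a minimal example, not production code; every object name (raw_db, claims_stage, claims_pipe, mart_db, and so on) is hypothetical:

    -- Hypothetical external stage over an S3 bucket that receives JSON files.
    CREATE OR REPLACE STAGE raw_db.public.claims_stage
      URL = 's3://example-bucket/claims/'
      FILE_FORMAT = (TYPE = JSON);

    -- Landing table with a single VARIANT column for the raw JSON.
    CREATE OR REPLACE TABLE raw_db.public.claims_raw (v VARIANT);

    -- Snowpipe: AUTO_INGEST loads new files as S3 event notifications arrive.
    CREATE OR REPLACE PIPE raw_db.public.claims_pipe AUTO_INGEST = TRUE AS
      COPY INTO raw_db.public.claims_raw
      FROM @raw_db.public.claims_stage;

    -- Stream that captures newly loaded rows for incremental processing.
    CREATE OR REPLACE STREAM raw_db.public.claims_stream
      ON TABLE raw_db.public.claims_raw;

    -- Flatten the JSON into a mart table; FLATTEN explodes the lines array.
    INSERT INTO mart_db.public.claim_lines (claim_id, line_no, amount)
    SELECT s.v:claim_id::STRING,
           f.value:line_no::INT,
           f.value:amount::NUMBER(12, 2)
    FROM raw_db.public.claims_stream s,
         LATERAL FLATTEN(INPUT => s.v:lines) f;

    -- Time Travel: zero-copy clone of the mart table as of one hour ago.
    CREATE OR REPLACE TABLE mart_db.public.claim_lines_backup
      CLONE mart_db.public.claim_lines AT (OFFSET => -3600);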

SKILLS AND INTERESTS:


ETL Tools: IICS, SSIS, dbt (data build tool), Databricks, IDQ, DataStage, Informatica Cloud/MDM, Informatica Data Transformation, Matillion, Talend, CAI (Cloud Application Integration)
BI Tools: MSBI Suite, Power BI, Tableau, Alteryx, Ab Initio, ThoughtSpot, SFDC CRM, Azure, SSRS
Databases: MS SQL Server 2005/2008, RDBMS, Airflow, Redshift, NoSQL, Oracle PL/SQL, Hadoop, Java & API, T-SQL, Netezza, Snowflake, MongoDB
Database Skills: Cursors, Stored Procedures, Functions, Views, Triggers, and Packages
Client-Side Skills: SQL, PL/SQL, Python, PySpark, Apache Spark, Glue, Kafka, Elasticsearch, Lambda, RDS, C# & C++, Angular, MongoDB/DynamoDB
OS: Windows 2000/2003/XP/Vista/Windows 7, UNIX, Red Hat Linux
Methodologies: Data Modeling (Logical/Physical), Star/Snowflake Schema, Fact & Dimension Tables, ETL, OLAP, Agile, Software Development Life Cycle (SDLC)

PROFESSIONAL EXPERIENCE
Lead ETL Data Architect, Anthem BCBS, Richmond, VA – Nov 2016 – Current
 Worked on various SSIS packages and Matillion jobs to load data into MS SQL Server and Snowflake.
 Responsible for data delivery via triggers such as SNS notifications or API calls to external applications such as Salesforce.
 Responsible for developing and maintaining stored procedures that validate and copy data from the stage schema to the live schema (see the sketch after this list).
 Developed complex mappings using Informatica PowerCenter.
 Responsible for developing real-time data ingestion into Snowflake using Lambda triggers and AWS SNS.
 Handled app and API integrations with IICS.
 Developed third-party vendor integrations using IICS CDI services.
 Developed Informatica Cloud mappings, data synchronization tasks, and data replication tasks on IICS.
 Conducted ETL development in the Netezza environment using standard design methodologies.
 Responsible for creating and maintaining Snowflake streams and Snowpipe ingestion.
 Used Pentaho Data Integration Designer to extract data from various sources, including fixed-format flat files, Excel, XML, and CSV files, and databases such as IBM DB2, MySQL, and SQL Server.
 Responsible for performance tuning using Snowflake Auto-Clustering and the Search Optimization Service.
 Responsible for maintaining data sharing and data masking for external customers.
 Coached a team of developers building Pentaho Reports, Dashboards, XActions, and Analyzer Reports for the client.
 Brought the required Clarity experience and HEDIS business knowledge to the role.
 Worked with health plan staff to ensure compliance with HEDIS requirements.
 Used dbt tests (schema tests and custom tests) to ensure data quality (see the dbt sketch after this list).
 Developed complex custom reports using Pentaho Report Designer, including developing business views.
 Experience assigning Azure services to specific regions to integrate with web apps and key vaults.
 Solid understanding of relational (RDBMS) and NoSQL (MongoDB) databases.
 Responsible for maintaining and developing the Snowflake data warehouse and the pipelines that feed it.
 Worked as project lead, involved in all phases of the SDLC.
 Extensive knowledge of Matillion setup, job creation, scheduling, and troubleshooting.
 Experience setting up Python libraries and LDAP configuration in Matillion.
 Experience setting up SNS topics and Lambda functions for PagerDuty and other notifications.
 Experience setting up a Matillion job to refresh Power BI datasets after the warehouse load.
 Experience creating Power BI reports and shared datasets and configuring the on-premises gateway.
 Embedded Power BI reports for external users with row-level security enabled.
 Advised business users on data quality requirements for ThoughtSpot.
 Worked with product managers and business analysts to create Jira stories and align stakeholder expectations with technical needs.
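
Illustrative sketch of a stage-to-live stored procedure like the one described above (Snowflake Scripting; the schema, table, and column names are hypothetical):

    CREATE OR REPLACE PROCEDURE live.copy_members_from_stage()
    RETURNS STRING
    LANGUAGE SQL
    AS
    $$
    DECLARE
      bad_rows INTEGER;
    BEGIN
      -- Validation: reject the whole load if required keys are missing.
      SELECT COUNT(*) INTO :bad_rows FROM stg.members WHERE member_id IS NULL;
      IF (bad_rows > 0) THEN
        RETURN 'Rejected: ' || bad_rows || ' rows with NULL member_id';
      END IF;
      -- Copy validated rows from the stage schema into the live schema.
      MERGE INTO live.members t
      USING stg.members s
        ON t.member_id = s.member_id
      WHEN MATCHED THEN UPDATE SET t.name = s.name, t.plan_code = s.plan_code
      WHEN NOT MATCHED THEN INSERT (member_id, name, plan_code)
        VALUES (s.member_id, s.name, s.plan_code);
      RETURN 'Loaded';
    END;
    $$;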
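
A minimal dbt singular test of the kind mentioned in the dbt bullet above; the model name (claims) and the rule itself are invented for illustration. Saved under tests/, it fails the build if the query returns any rows:

    -- tests/assert_no_future_service_dates.sql
    SELECT claim_id, service_date
    FROM {{ ref('claims') }}
    WHERE service_date > CURRENT_DATE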

ETL Developer, IBM, Dublin, OH – Oct 2015 – Nov 2016


 Developed and maintained ETL mappings using Informatica Designer 9.6 to extract data from multiple source systems, comprising databases such as Oracle 10g and SQL Server 7.2 and flat files, to the staging area, the EDW, and then the data marts.
 Implemented Informatica MDM workflows, including data profiling, configuration specification, match rule coding and tuning, and migration.
 Built out best practices for data staging, data cleansing, and data transformation routines within the Informatica MDM solution.
 Defined and built best practices for creating business rules within the Informatica MDM solution.
 Overcame challenges such as data migration from MySQL to MongoDB.
 Created mappings using connected, unconnected, and dynamic lookups with different cache types.
 Developed SnowSQL scripts in Matillion to load the final core tables into Snowflake.
 Used the Matillion blob storage component to load tables into the Snowflake stage layer (see the sketch after this list).
 Developed PowerCenter workflows and sessions and set up PowerExchange connections to databases and mainframe files.
 Built interactive dashboards and stories using Tableau Desktop for accuracy in report generation, applying advanced Tableau functionality: parameters, actions, and tooltips.
 Extensively used cursors, ref cursors, user-defined object types, collections, joins, subqueries, records, and tables in PL/SQL programming.
 Extensively used SQL statements to query the Oracle database for data validation and data integrity.
 Analyzed source data quality using Talend Data Quality.
 Developed reusable transformations, worklets, mapplets, and mappings; built data quality plan mapplets and data quality scorecards.
 Used SQL queries and other data analysis methods, as well as Talend Enterprise Data Quality.
 Worked with Adobe Analytics and Salesforce.com CRM.
 Extensively worked with transformations such as Lookup, Update Strategy, Expression, Filter, Stored Procedure, and Router.
 Worked on the design, development, and testing of Talend mappings.
 Created ETL job infrastructure using Talend Open Studio.
 Knowledge of working with Hadoop as a source system.
 Used Cassandra CQL and Java APIs to retrieve data from Cassandra tables.
 Adapted a web application to run in Windows Azure against SQL Azure, using Azure Queues for background processing.
 Interacted with various upstream and downstream customers, interfacing various systems for data extraction, ETL, analytics, and reporting needs.
 Coordinated and monitored project progress to ensure timely flow and complete delivery of the project.
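
Illustrative sketch of what the blob-storage-to-stage-layer load above boils down to in Snowflake SQL (the Matillion component generates roughly equivalent statements; the account, container, token, and table names are hypothetical placeholders):

    -- External stage over an Azure Blob Storage container.
    CREATE OR REPLACE STAGE stg.orders_blob_stage
      URL = 'azure://exampleacct.blob.core.windows.net/orders'
      CREDENTIALS = (AZURE_SAS_TOKEN = '<sas-token>')
      FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

    -- Bulk load the staged files into the stage-layer table.
    COPY INTO stg.orders
    FROM @stg.orders_blob_stage
    ON_ERROR = 'ABORT_STATEMENT';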

Database Developer, Anthem BCBS, Richmond, VA – May 2015 – Oct 2015


 Owned the change request process and coordinated with other teams as necessary.
 Developed and defined application scope and objectives and prepared the technical and functional specifications from which programs were written.
 Developed, tested, and supported the ETL processes necessary to load and validate the data warehouse.
 Used ETL (SSIS) to develop jobs for extracting, cleaning, transforming, and loading data into the data warehouse.
 Utilized software tools including SAP BusinessObjects (including Data Services and the Information Design Tool), Oracle SQL Developer, Microsoft SQL Server 2008/2012/2014, SAP HANA, and Attunity.
 Ensured unit testing was completed and met test plan requirements, system testing was completed, and the system was implemented according to plan.
 Coordinated on-call support and ensured effective monitoring of the system.

ETL/SQL Developer, DirecTV, El Segundo, CA – Sept 2013 – May 2015


 Evaluated, planned, managed, tracked, and provided status on system maintenance, enhancement, and support activities at the task level.
 Defined, documented, and executed test cases to validate new components of the data warehouse being developed, enhanced, or otherwise maintained.
 Developed and followed standards for the data warehouse.
 Worked in Informatica B2B Data Transformation.
 Completed mandatory application documentation as specified by the project.
 Acted as intermediary between the user community and technical support teams.
 Managed change via an established change management process.
 Worked with Oracle and Teradata databases, Informatica, and the AutoSys batch job scheduler.

SQL Developer, MetLife, Wilmington, DE – Jan 2013 – Sept 2013


 Involved in gathering business requirements from the client and writing and maintaining technical/design documentation.
 Involved in creating endpoints for file transfer using Informatica B2B.
 Analyzed the business requirements for the ETL process and created schema flow diagrams in Visio.
 Created various reusable transformations in Designer, such as Slowly Changing Dimensions, using the Informatica PowerCenter Transformation Developer.
 Developed ETL routines using Informatica PowerCenter 9.5.

Informatica Developer, CareFirst BCBS, Washington, DC – Jan 2012 – Jan 2013

 Responsible for verification of functional specifications and review of deliverables. Created data cleansing, validation, and loading scripts for an Oracle data warehouse using Informatica PowerCenter 8.6.1.
 Extensively used Router, Aggregator, Lookup, Source Qualifier, Joiner, Expression, and Sequence Generator transformations to extract data in compliance with the business logic developed.
 Involved in version control of the jobs to keep track of changes during development.
 Extensively used PL/SQL programming in backend and front-end functions, procedures, and packages to efficiently implement business rules, security, and test cases.
 Implemented SCD Type 1 and Type 2 methodologies in ODS table loads to keep historical data in the data warehouse (see the sketch after this list).
 Developed shell scripts to set up the runtime environment and to run stored procedures and packages that populate data in staging tables.
 Documented the purpose of each mapping so personnel could understand the process and incorporate changes as necessary; developed unit test cases for the jobs.
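
Illustrative sketch of the Type 2 half of the SCD pattern referenced above, as generic Oracle-style SQL (in the actual work the equivalent logic lived in Informatica mappings; dim_member, stg_member, and their columns are hypothetical):

    -- Step 1: expire the current version of any member whose attributes changed.
    UPDATE dim_member d
       SET d.eff_end_date = CURRENT_DATE,
           d.is_current   = 'N'
     WHERE d.is_current = 'Y'
       AND EXISTS (SELECT 1
                     FROM stg_member s
                    WHERE s.member_id = d.member_id
                      AND (s.plan_code <> d.plan_code OR s.status <> d.status));

    -- Step 2: insert changed and brand-new members as the new current version.
    INSERT INTO dim_member
      (member_id, plan_code, status, eff_start_date, eff_end_date, is_current)
    SELECT s.member_id, s.plan_code, s.status,
           CURRENT_DATE, DATE '9999-12-31', 'Y'
      FROM stg_member s
      LEFT JOIN dim_member d
        ON d.member_id = s.member_id AND d.is_current = 'Y'
     WHERE d.member_id IS NULL
        OR s.plan_code <> d.plan_code
        OR s.status <> d.status;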

ETL/SQL Developer, Phillips, Pittsburgh, PA – Jan 2008 – Mar 2012


 Involved in gathering business requirements from the client and writing and maintaining technical/design documentation.
 Involved in creating endpoints for file transfer using Informatica B2B.
 Analyzed the business requirements for the ETL process and created schema flow diagrams in Visio.
 Created various reusable transformations in Designer, such as Slowly Changing Dimensions, using the Informatica PowerCenter Transformation Developer.
 Developed ETL routines using Informatica PowerCenter 9.5.

EDUCATION
Master of Science in Electrical and Computer Engineering - Stevens Institute of Technology, NJ. Graduation Date: May 2012
Bachelor of Engineering in Electronics & Communication - North Gujarat University, India. Graduation Date: Jun 2009
