John, ETL Data Engineer
====================================================================================================
SUMMARY
15+ years of overall experience in the Information Technology industry, including 8+ years in the healthcare domain.
Experience building Snowpipe pipelines and data sharing in Snowflake.
Strong data engineering experience with Snowflake cloning and Time Travel.
2 years of experience in Hadoop cluster development.
3+ years of experience working with DataStage.
7 years of UNIX shell scripting and experience using third-party scheduling tools such as AutoSys.
5+ years of experience working with IICS (Informatica Intelligent Cloud Services).
Extensively worked with Teradata utilities such as BTEQ, FastExport, FastLoad, and MultiLoad to export and load data
to/from different source systems, including flat files.
Extensive knowledge of dimensional data modeling, star and snowflake schemas, and fact and dimension tables;
experience in data migration.
Expertise in Matillion ETL mappings, scheduling, backups, networking, and infrastructure setup.
Experience with AWS components and APIs.
Sound knowledge of Netezza SQL.
12 years of extensive experience with databases (MS SQL Server 2000/2005/2008, Oracle 8i/9i/10g/11g, Teradata).
12+ years of experience with ETL data warehouses and Informatica PowerCenter development.
Automated multiple manual processes using the Matillion ETL tool, AWS Lambda, and Airflow as needed.
Experience migrating on-premises applications to Azure; configured VNets and subnets per project requirements
and performed PowerShell scripting for patching, imaging, and deployments in Azure.
Good understanding of star schema and dimensional modeling, E-R modeling, Slowly Changing
Dimensions (SCD), and data warehousing concepts.
Experience with dbt (data build tool), Azure Data Factory, and Kubernetes.
End-to-end experience designing and deploying data visualizations using Tableau.
Expertise in various ETL and reporting tools: Talend and the Pentaho BI Suite (Pentaho Data Integration/Kettle,
Pentaho Report Designer, Pentaho Schema Workbench, Pentaho Enterprise Console, Pentaho Analyzer, and
Mondrian OLAP).
Extensive work with PL/SQL and Oracle performance tuning using SQL execution plans, SQL hints, Oracle
partitioning, and various index and join types.
Experienced with Informatica B2B Data Transformation and Data Exchange; knowledge of change data capture (CDC).
Perform technical design reviews and code reviews.
Created stored procedures and methods that read data in formats such as JSON and Parquet and flatten it into a
data mart built in Snowflake.
Created real-time data streams using Snowpipe in Snowflake (see the sketch at the end of this summary).
Created multiple SSRS reports with external subscriptions.
Created a translation service in AWS to translate Snowflake data by integrating Snowflake with the service via
Lambda.
Very good knowledge of the FACETS tool and the healthcare domain; worked on modules such as
Subscriber, Groups, Enrollment, EDI, HL7, HEDIS, EMR, Claims, Billing, Accounting, Provider, and Utilization
Management.
Expertise in business intelligence tools such as Tableau, Alteryx, Ab Initio, and ThoughtSpot.
Good experience with FACETS CTP (Claims Test Pro) and FACETS testing.
Good experience manipulating EDI X12 transaction files per HIPAA guidelines: 837 (healthcare claim),
835 (payment/remittance advice), 834 (benefit enrollment), 270/271 (healthcare benefits inquiry/response), and
276/277 (claim status).
Experience with Informatica Enterprise Data Catalog (EDC), Informatica Data Transformation, and Data Privacy
Management (DPM).
Experience with Informatica Data Quality (IDQ) and the Informatica MDM suite of tools: designing, developing,
optimizing, and supporting MDM processes, including performance tuning of existing processes.
Experience with data warehousing applications using ETL tools such as Talend and Informatica.
Mapped EDI X12 files per HIPAA guidelines for various EDI transactions.
Experience in backend testing of SQL queries using Oracle PL/SQL, SQL Server, and DB2.
Strong skills in data analysis, data architecture, data requirements analysis, and data mapping for ETL processes.
Experience with IICS Informatica Cloud Application Integration.
Ability to prioritize and execute tasks in a high-pressure environment; reliable, proactive, responsible,
hardworking, and a good team player.
Experience in mentoring and providing knowledge transfer to team members, support teams, and customers.
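For illustration, a minimal sketch of the Snowpipe ingestion and JSON-flattening pattern referenced above; all object names (claims_stage, raw_claims, mart.claim_lines) and the field structure are hypothetical, not taken from any actual project:

    -- Landing table for raw JSON documents loaded from the external stage.
    -- (Hypothetical names throughout; a sketch, not production code.)
    CREATE TABLE IF NOT EXISTS raw_claims (payload VARIANT);

    -- Snowpipe that auto-ingests new files as they land in cloud storage.
    CREATE PIPE IF NOT EXISTS claims_pipe AUTO_INGEST = TRUE AS
        COPY INTO raw_claims
        FROM @claims_stage
        FILE_FORMAT = (TYPE = 'JSON');

    -- Flatten the nested array of claim lines into a data-mart table.
    INSERT INTO mart.claim_lines (claim_id, line_no, amount)
    SELECT payload:claimId::STRING,
           f.value:lineNumber::INT,
           f.value:amount::NUMBER(12,2)
    FROM raw_claims,
         LATERAL FLATTEN(input => payload:lines) f;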
PROFESSIONAL EXPERIENCE
Lead ETL Data Architect, Anthem BCBS, Richmond, VA (Nov 2016 – Present)
Worked on various SSIS packages and Matillion jobs to load data into MS SQL Server and Snowflake.
Responsible for data delivery via triggers such as SNS or API calls to external applications such as Salesforce.
Responsible for development and maintenance of stored procedures that validate and copy data from the stage
schema to the live schema.
Developed complex Mappings using Informatica PowerCenter.
Responsible for development of real-time data ingestion into Snowflake using Lambda triggers and AWS SNS
notifications.
Handled application and API integrations with IICS.
Developed third-party vendor integrations using IICS Cloud Data Integration (CDI) services.
Developed Informatica Cloud mappings, data synchronization tasks, and data replication tasks on IICS.
Conducted ETL development in the Netezza environment using standard design methodologies.
Responsible for creating and maintaining data streams via Snowpipe.
Used Pentaho Data Integration Designer to extract data from various sources, including fixed-format flat files,
Excel, XML, CSV, and databases such as IBM DB2, MySQL, and SQL Server.
Responsible for performance tuning using Snowflake Auto-Clustering and the Search Optimization Service.
Responsible for maintaining data sharing and data masking for external customers.
Coached a team of developers to develop Pentaho Reports, Dashboards, XActions and Analyzer Reports for the
client.
Applied Clarity experience and HEDIS business knowledge.
Worked with health plan staff to ensure compliance with HEDIS requirements.
Used dbt to test the data (schema tests, custom tests) and ensure data quality.
Developed complex custom reports using Pentaho Report Designer, including developing Business Views.
Experience assigning Azure services to specific regions to integrate with web apps and key vaults.
Solid understanding of NoSQL databases such as MongoDB, as well as relational databases.
Responsible for maintaining and developing the Snowflake data warehouse and the pipelines that feed it.
Worked as lead on projects, involved in all phases of the SDLC.
Extensive knowledge of Matillion setup, job creation, scheduling, and troubleshooting.
Experience setting up Python libraries and LDAP configuration in Matillion.
Experience setting up SNS and Lambda functions for PagerDuty and other notifications.
Experience setting up Matillion jobs to refresh Power BI datasets after the warehouse load.
Experience creating Power BI reports, creating shared datasets, and configuring the on-premises gateway.
Embedded Power BI reports for external users with row-level security enabled.
Advised business users on data quality requirements for ThoughtSpot.
Worked with product managers and business analysts to create Jira stories and align stakeholder expectations
with technical needs.
Responsible for verification of functional specifications and review of deliverables. Created data cleansing,
validation, and loading scripts for the Oracle data warehouse using Informatica PowerCenter 8.6.1.
Extensively used Router, Aggregator, Lookup, Source Qualifier, Joiner, Expression, and Sequence Generator
transformations to extract data in compliance with the business logic developed.
Involved in version control of the jobs to keep track of changes during development.
Extensively used PL/SQL programming in backend and front-end functions, procedures, and packages to
efficiently implement business rules, security, and test cases.
Implemented SCD Type 1 and Type 2 methodologies in ODS table loading to keep historical data in the data
warehouse (a sketch of the Type 2 pattern follows these bullets).
Developed shell scripts to set up the runtime environment and to run stored procedures and packages that
populate the staging tables.
Documented the purpose of each mapping so that personnel can understand the process and incorporate changes
as necessary; developed unit test cases for the jobs.
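For illustration, a minimal sketch of the SCD Type 2 load pattern referenced above, written as two set-based steps; table and column names (stg_member, dim_member, plan_code) are hypothetical:

    -- Step 1: expire the current dimension row when a tracked attribute changed.
    UPDATE dim_member
    SET    effective_end = CURRENT_DATE, is_current = 'N'
    WHERE  is_current = 'Y'
      AND EXISTS (SELECT 1
                  FROM   stg_member s
                  WHERE  s.member_id = dim_member.member_id
                  AND    s.plan_code <> dim_member.plan_code);

    -- Step 2: insert a fresh current row for new and changed members
    -- (changed members no longer have a current row after step 1).
    INSERT INTO dim_member (member_id, plan_code, effective_start, effective_end, is_current)
    SELECT s.member_id, s.plan_code, CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM   stg_member s
    LEFT JOIN dim_member d
           ON d.member_id = s.member_id AND d.is_current = 'Y'
    WHERE  d.member_id IS NULL;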
EDUCATION
Master of Science in Electrical and Computer Engineering, Stevens Institute of Technology, NJ. Graduation: May 2012
Bachelor of Engineering in Electronics & Communication, North Gujarat University, India. Graduation: Jun 2009