
Arunkumar Pichaimuthu

PROFESSIONAL SUMMARY:

 Hands-on AWS Technical Architect with 6+ years of experience developing and architecting enterprise-level, large-scale, multi-tier solutions that require complex architectural decisions.
 Over 20 years of industry experience in all aspects of the Software Life Cycle, including System Analysis, Design, Development, Testing and Implementation, spanning Amazon Web Services, Data Engineering, Big Data and Java technologies.
 Exceptional leadership skills with a results-oriented approach.
 Hands-on experience on implementing Cloud Solutions using various AWS Services like
S3, Event Bridge, Step Function, Glue, Document DB, SQS, SNS, ECS, EMR, Databricks, MWAA, Route 53, EKS, IAM (roles and policies), Secrets Manager, KMS, API-
Gateway, CMK, Athena, Redshift, Serverless, Lambda, Kinesis, EC2, DynamoDB,
CloudFormation, CloudWatch, EFS, EBS, etc.
 Good knowledge of application migrations and data migrations from on-premises to the AWS Cloud.
 Identifying appropriate use of AWS Architectural best practices.
 Selecting the appropriate AWS services based on containerization, data, compute, database and security requirements.
 Estimating AWS Costs and identifying cost control mechanisms.
 Extensive experience defining IT roadmaps, Cloud Strategy and Enterprise Solutions Architecture assessments.
 Successfully completed Internet of Things (IOT) certification from MIT Professional
Education.
 Strong familiarity with Data Architecture including Data Ingestion Pipeline Design, Data
Modeling and Data Mining, Machine Learning and Advanced Data Processing.
 Highly capable of processing large sets of structured, semi-structured and unstructured
data and supporting systems application architecture using Hadoop Eco Systems.
 Architected and coded MapReduce programs and was responsible for storing data into the HDFS file system.
 Proven skills in project design and implementation of object-oriented principles, with good implementation knowledge of Java/J2EE design patterns.
 Firsthand experience with database design using Erwin, and programming skills including PL/SQL, JDBC and SQL with DB2, Oracle and SQL Server.
 Solid experience working with methodologies like Agile SCRUM, RUP and SDLC Waterfall.

TECHNICAL SKILLS:

AWS Services: S3, Event Bridge, Step Function, Glue (ETL, Crawler), Document DB, DynamoDB, Lambda Functions, SQS, SNS, ECS, EMR, Databricks, Apache Airflow (MWAA), Route 53, EKS, Nginx Controller, Secrets Manager, KMS, API-Gateway, EC2, AWS SDK for Java, Python and Node.js, Serverless Framework, Kinesis, RDS, CloudFormation, Terraform, CloudWatch, Cloud Trail, Amazon MSK, Athena, Redshift, Snowflake

Messaging: Zookeeper 3.3.0, Kafka 0.8, JMS.


Big Data Ecosystems: Hadoop, SQOOP, OOZIE, Yarn, Cassandra, DataStax, MapReduce, HDFS,
Hive, Attivio, Vivo, Apache Tika.

Design Skills: DBeaver, UML (Rational Rose 2003), Microsoft VISIO, Object Oriented Analysis and
Design (OOAD), GOF and J2EE Patterns.

Java & Technologies: Java 1.8/1.7, JAX-WS, JAX-RS, Apache, J2EE- JAXB, JSP, Servlet, EJB,
JDBC, Swing, Memcached 2.7.1, JBoss jBPM.

Frameworks: Java Quarkus Framework 3.7.2, Spring 3.0, Hibernate 3.0, Struts 2.0, JTA, JPA, JAF,
JNDI, LDAP, Metrics-Graphite 2.2.0.

Database: HDFS, Oracle 10g, DB2, MS SQL Server 6.5/7.0/2000, MySQL, MS Access.

Servers: GlassFish, JBoss 6.0, IBM WebSphere 5.0, WebLogic 6.1, Tomcat 7.2.24, iPlanet Web Server 6.1.

XML Technologies: HTTP, SOAP, XML, XSLT, XSL, SAX, DOM, HTML.

Scripting Language: jQuery, Shell Scripting, JavaScript, and CSS.

Version Control & Build/Test Tools: GitLab, GitHub, Maven, ClearCase 2002, Visual SourceSafe 6.0, CVS, SVN, Merant Dimensions 8.0.5, JUnit, JTest, and LoadRunner.

Methodologies: Agile SCRUM, Unified process, RUP (Rational Unified Process) and SDLC
Waterfall.

EDUCATIONAL QUALIFICATION:

Bachelor of Engineering (Computer Science and Engineering)


Madurai Kamarajar University, Madurai, India, 1998.

CERTIFICATIONS:

 Internet of Things: Roadmap to a Connected World from MIT Educational Services


 Project Management Professional (PMP), Certified through Project Management Institute
(PMI), 2011
 Brain Bench Certified Programmer for the Java 2 Platform
 Oracle Certified Point of Sale Developer

PROFESSIONAL EXPERIENCE:

 Working as AWS Technical Architect at SunRay Enterprise Inc., Atlanta, GA, USA, from October 2018 to present.
 Worked as Application Technical Architect at SunRay Enterprise Inc., Atlanta, GA, USA, from September 2014 to September 2018.
 Worked as Architect/Sr. Big Data Developer at GAVS Technologies PVT LTD. Chennai,
India, from March 2013 to September 2014.
 Worked as J2EE Architect at Ecare Technologies Labs PVT LTD. Chennai, India, from
October 2012 to February 2013.
 Worked as Assistant Technical Manager at Calsoft Labs, Chennai, India, from December 2011 to October 2012.
 Worked as Technical Leader at Data Software Research Company. Chennai, India, from
October 2003 to October 2011
 Worked as Software Engineer at Race Technologies. Chennai, India, from December 2001
to October 2003.
 Worked as Programmer at Child Star Pvt Ltd, Chennai, India, from March 1999 to
December 2001.

PROJECT EXPERIENCE:

Project: Financial Integration (FI)


Role: AWS Solutions Architect / Lead Data Engineer
Client: CIGNA HealthCare January 2023 to Present

Project Description: The Financial Integration service processes payment files from vendors and creates allocations for all the line items as part of this process. The data is processed through various services deployed on-prem as well as in the cloud, which generate daily and monthly job files that are further processed by downstream systems.

Responsibilities:
 The tech stack considered for modernization leverages AWS Event Bridge for event-driven architecture, AWS Step Function for workflow orchestration, AWS ECS for container orchestration, AWS SQS to decouple data access, AWS Glue for ETL and Document DB for data storage.
 Evaluated various AWS Services and Java Quarkus framework to be implemented.
 Designing, configuring and implementing Glue jobs on AWS to extract, load and transform data using PySpark.
 Demonstrated POC on AWS Event Bridge and Step Function.
 Have done POC on AWS Glue to crawl data from S3, transform using PySpark, and store
data in JSON format.
 Identify the configuration needs for all AWS resources to set up various environments for the
project.
 Integrate S3 with Event Bridge to trigger the Step Function that starts the Glue ETL job and stores the JSON output into Document DB (a minimal sketch of the Glue step follows this list).
 Provision required AWS Services using Terraform scripts.
 Identified, documented and coded the required IAM service roles and policies for service-to-service integration.
 Used Secrets Manager to store credentials for DBs to be accessed by application code.
 Validate and review IaC and application code developed by the team and approve the code.
 Process data stored in Document DB through Task definitions running on ECS cluster.
 Customize build configurations for all the resources needed with Terraform and Jenkins.
 Collaborating with the business team and other stakeholders to understand requirements and design data solutions that meet their needs.
 Worked on Terraform scripts to provision various AWS resources as needed for the project.
 Worked on Java Microservices to build various services to process the data at various
stages.
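
As referenced above, the following is a minimal, illustrative sketch of the Glue PySpark step in this pipeline. Bucket names, paths and field names are hypothetical placeholders; the actual transformation logic is client specific.

# Illustrative Glue PySpark job: read vendor payment files landed in S3,
# keep the fields needed for allocations, and write JSON output that a later
# step loads into Document DB. All names below are hypothetical placeholders.
import sys
from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read the raw payment file(s) from the landing bucket (CSV assumed here).
payments = glue_context.create_dynamic_frame.from_options(
    connection_type="s3",
    connection_options={"paths": ["s3://example-fi-raw/payments/"]},
    format="csv",
    format_options={"withHeader": True},
)

# Keep only the columns the allocation step needs.
allocations = payments.select_fields(["payment_id", "vendor_id", "line_item", "amount"])

# Write the transformed records as JSON for the Document DB load step.
glue_context.write_dynamic_frame.from_options(
    frame=allocations,
    connection_type="s3",
    connection_options={"path": "s3://example-fi-curated/allocations/"},
    format="json",
)
job.commit()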

Environment: Event Bridge, Step Function, S3, Glue, Document DB, SNS, SQS, Lambda, ECS, ECR, CloudWatch, CloudTrail, IAM Roles and Policies, saml2aws, Terraform, Python, Jenkins, GitHub, Java, Quarkus Framework.

Project: Delivery Infrastructure


Role: AWS Solutions Architect / Lead
Client: Verizon October 2020 to December 2022

Responsibilities:
 Plan, design and develop Microservice Architectures based on application and business
needs.
 Implemented single and multi-tenancy EKS cluster application deployments and migrated 42+ on-prem Kubernetes clusters to the cloud.
 Design and develop Terraform code to automate EKS clusters for multi-region deployment.
 Migrated EKS cluster from 1.15 to 1.20 and 1.23.
 Achieved high availability and DR using ALBs and NLBs with the NGINX controller and ingress for various load balancer needs (private and public facing). Developed Terraform code to support these deployments.
 Provisioned Prometheus for metrics collection and EKS cluster health through IaC code.
 Customized the service mesh for inter-service communication through Istio for EKS nodes.
 Customized open-source Fluentd for log management from EKS nodes to CloudWatch.
 Deploy API gateway, VPC LINK and custom domain creation using Terraform.
 Demonstrate working knowledge on AWS Services creation through automated Terraform
scripts.
 Develop Terraform code to store secrets using Secrets Manager and KMS for data at rest (an application-side retrieval sketch follows this list).
 Configured the Twistlock cloud-native cybersecurity platform, which provides lifecycle security for containerized environments.
 Possess in-depth understanding and knowledge of IaC tools like Terraform and CloudFormation templates.
 Identify, document and work on IAM roles, policies and their permissions for various accounts and their cross-account resource access.
 Working on defining security and compliance for various projects with teams.
 Design and develop runner architecture for DevOps team to build, deploy applications and
provision resources on AWS.
 Provision Multiple AWS Accounts and services.
 Work with various Application Teams to understand and document their needs and provide
containerization solutions.
 Troubleshoot issues as they occur to remove roadblocks for the application teams and other development resources.
 Good understanding and working knowledge in different versions of Terraform.
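
A minimal sketch of the application-side read of a secret provisioned by the Terraform code mentioned above; the secret name and region below are hypothetical placeholders.

# Illustrative boto3 read of a Secrets Manager secret provisioned via Terraform.
# The secret name and region are hypothetical placeholders.
import json
import boto3

def get_db_credentials(secret_id="example/app/db-credentials"):
    client = boto3.client("secretsmanager", region_name="us-east-1")
    response = client.get_secret_value(SecretId=secret_id)
    # The secret value is stored as a JSON string; KMS decryption happens server side.
    return json.loads(response["SecretString"])

if __name__ == "__main__":
    creds = get_db_credentials()
    print(sorted(creds))  # print only the key names, never the secret values
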
Environment: Route 53, ALB, NLB, Amazon EKS, API-Gateway, ACM, SQS, SNS, SES, Secrets Manager, KMS, CMK, CloudWatch, Lambda, Terraform, Helm, Prometheus, Twistlock, Datadog, ECR, IAM, S3, RDS PostgreSQL, DynamoDB, EC2, VPC, Gitlab, Gitlab Runner

Project: AWS Data Lake Data Migration (FINERP)
Role: AWS Architect / Technical Lead
Client: Advance Auto Parts, Raleigh, NC January 2020 to October 2020

Project Description: FINERP is a financial data lake system built on AWS. Data is extracted and migrated from various ERP systems to the data lake using ETL services like Glue, S3, Lambda, Step Functions and Snowflake. Data moves into the data lake in stages using S3 buckets and is stored in S3 or Snowflake, where it is analyzed by business teams.

Responsibilities:
 Identify, design and document ETL process from source to AWS Data Lake.
 Define the various stages of data in the data pipeline using AWS Glue jobs, CloudWatch, Terraform, Step Functions and Lambdas.
 Design and develop Glue transformations using the AWS SDK for Python (boto3) with PySpark to implement the business logic required to transform data for business-team analysis (see the boto3 job-run sketch after this list).
 Identify and document transformation logic based on the business requirement collaborating
with business teams.
 Create technical and functional specification, collaborating with the business team for
development and testing.
 Design and work closely with technical teams on data transformation from source to data lake in the required format and storage (S3, Excel and Snowflake).
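
A minimal boto3 sketch of how such a Glue transformation job can be started and monitored; the job name and arguments are hypothetical placeholders.

# Illustrative boto3 snippet: start a Glue transformation job and wait for a
# terminal state. Job name and arguments are hypothetical placeholders.
import time
import boto3

glue = boto3.client("glue", region_name="us-east-1")

def run_glue_job(job_name="example-finerp-erp-to-datalake"):
    run_id = glue.start_job_run(
        JobName=job_name,
        Arguments={"--source_system": "example_erp", "--target_stage": "curated"},
    )["JobRunId"]

    # Poll the run until Glue reports a terminal state.
    while True:
        state = glue.get_job_run(JobName=job_name, RunId=run_id)["JobRun"]["JobRunState"]
        if state in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT"):
            return state
        time.sleep(30)
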
Environment: Amazon IAM, S3, CloudWatch, Glue ETL Jobs, Lambda, Step function, PySpark,
Data Frames, Snowflake, Terraform v0.10, Amazon SDK for Python, Java, SQL Server, Teradata,
cloud9, PyCharm, and Eclipse.

Project: Data Lake on Cloud – Ingestion Pipeline


Role: AWS Architect/ Data /Java Engineer
Client: Liberty Mutual, Dover, NH April 2019 to January 2020

Project Description: The Yoda ingestion framework will provide users the ability to ingest many
diverse types, sources, volumes, and velocities of data into a common, secure, and optimized format
in the Raw Consumable area of the Data Lake. The ingestion pipeline is intended for file preparation from various sources via daily snapshot or manual initiation. The goal is to make incoming data
accessible to end users as quickly as possible, while keeping security, cost, and performance
considerations top of mind. With that, the base operations of the process are the following:
 Convert the incoming file(s), both non-streaming and streaming, into the highly optimized Parquet format.
 Consolidate files as needed to optimize storage costs and query performance.
 Compress and encrypt files to adhere to security standards that all data must be encrypted in
transit and at rest.
 Provide an interactive SQL interface, leveraging the AWS Athena service, which will provide end users queryable access to "raw" data sets.
 Automatically log operational metadata.

Responsibilities:
 Worked on prototypes to demonstrate various features and functionality of various AWS
technologies for data ingestion pipeline.
 Evaluated various AWS Services like Step Functions, Lambda, Managed Kafka, S3, CloudWatch, AWS Glue (ETL, Crawler), etc., for the Data Lake ingestion pipeline.
 Evaluate and develop ETL processes that convert various input file types (CSV, JSON) into Parquet (a PySpark conversion sketch follows this list).
 Coded the ingestion pipeline using Step Functions, Lambdas, SQS queues, SNS notifications, Glue ETL, Crawlers and Athena.
 Assisted the team in developing various Step Functions, Lambdas, CFTs, SNS topics, SQS queues, and Glue Crawlers for several types of data flow, including streaming, non-streaming, CSV and JSON formats.
 Designed and developed Lambda functions using AWS SDK for Java.
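
As noted above, a minimal PySpark sketch of the CSV-to-Parquet conversion; bucket names and the partition column are hypothetical placeholders.

# Illustrative PySpark conversion of an incoming CSV snapshot into
# snappy-compressed Parquet in the Raw Consumable area.
# Bucket names and the partition column are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("example-ingest-to-parquet").getOrCreate()

# Read the daily snapshot file(s) from the landing location.
incoming = spark.read.option("header", "true").csv("s3://example-landing/daily-snapshot/")

# Write consolidated, compressed Parquet; partitioning assumes a snapshot_date column.
(
    incoming.write.mode("append")
    .option("compression", "snappy")
    .partitionBy("snapshot_date")
    .parquet("s3://example-raw-consumable/dataset-name/")
)
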
Environment: Amazon IAM, CloudFormation, Step Function Service, Lambda Functions, VPC,
EC2, S3, CloudWatch, Glue ETL, Glue Crawlers, Managed Kafka, Git, Maven, AWS SDK for Java
(1.8), Eclipse, Jenkins, Cloud Forge, and Bamboo.

Project: Integrated Data Foundation Platform (Enterprise Data Lake on Cloud)


Role: AWS Architect/Data/Developer
Client: Southwest Airlines, Irving, TX October 2018 to March 2019

Responsibilities:
 Worked on prototypes to demonstrate the features and functionality of various AWS technologies.
 Evaluated various AWS services like Managed Kafka, Elasticsearch, Kinesis, Flink, S3, Glacier, DynamoDB, CloudWatch, AWS Glue, etc., for the Data Lake.
 Evaluate various data formats like JSON, Parquet and Avro for data storage and transfer.
 Coded REST APIs for the data catalog and data indexing service using the Serverless Framework, Lambda, the AWS SDK for Node.js and DynamoDB (an analogous Python sketch follows this list).
 Established an Elasticsearch cluster using CloudFormation and the Serverless Framework.
 Document the findings for the above-mentioned services which includes:
o Architecture and guidelines
o Cost incurred for the project if used.
o Cost comparison
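
The catalog and indexing APIs were written with the AWS SDK for Node.js; as an illustrative sketch (kept in Python to match the other sketches in this document), an equivalent Lambda-backed catalog lookup might look like the following. Table, key and parameter names are hypothetical.

# Illustrative Lambda handler for a data-catalog lookup backed by DynamoDB.
# Table, key and parameter names are hypothetical placeholders.
import json
import boto3

TABLE_NAME = "example-data-catalog"
table = boto3.resource("dynamodb").Table(TABLE_NAME)

def handler(event, context):
    # API Gateway proxy integration: the dataset id arrives as a path parameter.
    dataset_id = event["pathParameters"]["datasetId"]
    result = table.get_item(Key={"dataset_id": dataset_id})

    if "Item" not in result:
        return {"statusCode": 404, "body": json.dumps({"message": "dataset not found"})}
    return {"statusCode": 200, "body": json.dumps(result["Item"], default=str)}
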
Environment: Amazon IAM, CloudFormation, Serverless Framework, Lambda Functions, Kinesis,
Flink, VPC, EC2, DynamoDB, S3, Glacier, CloudWatch, Glue, Managed Kafka, AWS SDK for
Node.js, Git, Gradle 4.2, Java 1.8, Eclipse, Jenkins, and SAFe Agile.

Project: IROPS R2
Role: Messaging Architect/Developer
Client: Southwest Airlines, Irving, TX March 2017 to October 2018

Project: Tech tablet 2.0


Role: Product Architect / Developer
Client: Verizon, Tampa, FL June 2016 to February 2017

Project: XFinity-Home Stream


Role: Product Developer/Architect
Client: Comcast Inc, Philadelphia, PA September 2014 to June 2016

GAVS Technologies PVT LTD. Chennai, India March 2013 to September 2014

Project: ISys
Role: J2EE Architect / Senior Developer
Client: PSCU Financial Services, St Petersburg, FL USA March 2013 to September 2014

ECare Tech Labs India PVT LTD October 2012 to February 2013
Calsoft Labs Chennai India PVT LTD December 2011 to October 2012
Data Software Research Company, Chennai, INDIA October 2003 to November 2011
RACE Technologies, Chennai, India December 2001 to October 2003
Child Start LTD Chennai INDIA March 1999 to December 2001
