QUAISH ABUZER
Email: abuzer.bit@gmail.com Mobile: +91-9619588402
Qualifications:
Degree: Master of Computer Application (MCA)
Institute: Birla Institute of Technology (BIT), Mesra, Ranchi
Year of Passing: 2010
Experience Summary:
• 5.8 years of experience in Informatica.
• Experience in analysis, design, development, and implementation of business applications using
Informatica PowerCenter.
• Strong design and development skills.
• Working experience on Informatica Power Center Tools – Designer, Workflow Manager and
Workflow Monitor.
• Excellent knowledge of data warehousing concepts.
• Significant experience with the ETL process across the data warehouse lifecycle.
• Developed different Mappings/ Mapplets using different transformations.
• Possess excellent learning ability and practical implementation of the same.
• Significant experience in preparing the Unit Test Cases and design document.
• Experience in Performance Tuning the mappings.
• Experience with Oracle SQL. 3 years of experience in Teradata (multiset and volatile tables,
FastLoad, MultiLoad, TPT, TPump, diagnostic stats, join indexes, loader connections, Teradata
SQL Assistant, statistics, skew factor, DBC tables, export/import, releasing MLoad, error tables,
Viewpoint, etc.).
• Performance tuning in SQL and Informatica.
• Hands-on experience with procedures, functions, and triggers.
• Error handling in Informatica.
• Worked with a broad range of Unix commands; good exposure to Unix shell scripting.
• Reviewing the code and documents prepared by the team.
• Can adapt to new technologies quickly, with the capacity to work hard and meet deadlines.
• Excellent logical, analytical & debugging skills.
• Good communication and interpersonal skills; hardworking and result-oriented, both
individually and in a team.
• A self-starter with a clear understanding of business needs, radical and tactical
problem-solving approaches, and excellent leadership and motivation skills.
• Client interaction experience.
• Good domain knowledge in Banking and Financial Services (BFS).
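The Unix shell scripting and error-handling points above typically come together as wrapper scripts around PowerCenter's pmcmd utility. The sketch below is a minimal illustration only; the integration service, domain, user, folder, and workflow names are hypothetical placeholders, not taken from any of the projects listed:

```shell
#!/bin/sh
# Minimal sketch of a pmcmd wrapper with basic error handling.
# IS_DEV, Domain_Dev, infa_user, FOLDER_SALES, and wf_load_sales are
# hypothetical placeholder names.

INFA_CMD="pmcmd startworkflow -sv IS_DEV -d Domain_Dev -u infa_user \
-pv INFA_PASSWD -f FOLDER_SALES -wait wf_load_sales"

run_workflow() {
    # Run the given command and report success or failure by exit code.
    $1
    rc=$?
    if [ "$rc" -ne 0 ]; then
        echo "workflow failed with exit code $rc" >&2
    else
        echo "workflow completed successfully"
    fi
    return "$rc"
}

# Uncomment on a machine with the PowerCenter client installed:
# run_workflow "$INFA_CMD"
```

In practice such wrappers also capture run details (for example via pmcmd getworkflowdetails) and mail out failure alerts; that is omitted here for brevity.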
Additional experience:
• Informatica administration: deployment/migration, code release, tool release, configuration and
setup, repository backup/restore, connection types in Workflow Manager, etc.
• Good experience with Cloudera big data/Hadoop: HDFS, LFS, YARN, and the Hadoop ecosystem
(Pig, Hive, Sqoop, Flume, HBase, Oozie, ZooKeeper, MapReduce).
• Connected Hadoop with Informatica to load data LFS→HDFS, HDFS→LFS, and HDFS→HDFS.
• Loading data from HDFS to Hive and LFS to Hive using Informatica.
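Outside of Informatica, the same data movements can be sketched with stock Hadoop tooling. The lines below are illustrative command fragments, not a runnable script; every path, JDBC URL, database, and table name is an invented placeholder:

```shell
# Hypothetical examples -- all paths, connections, and names are placeholders.

# LFS -> HDFS: copy a local extract into HDFS
hadoop fs -put /data/extracts/sales_20150701.csv /landing/sales/

# RDBMS -> HDFS with Sqoop (JDBC URL and credentials are placeholders)
sqoop import \
  --connect jdbc:oracle:thin:@dbhost:1521/ORCL \
  --username etl_user -P \
  --table SALES --target-dir /landing/sales_oracle

# HDFS -> Hive: load the landed file into a Hive table
hive -e "LOAD DATA INPATH '/landing/sales/sales_20150701.csv' INTO TABLE stg.sales;"
```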
Technology:
Operating Systems: Unix, Windows
Databases: Oracle, Teradata, SQL Server, Ingres
Technologies: Informatica PowerCenter
Tools/IDE: Informatica, Toad, PuTTY, WinSCP, SQL Developer, HP QC
Career Profile:
Dates | Organization | Designation
Dec 2012 – Present | Capgemini | Consultant
July 2010 – Dec 2012 | Tata Consultancy Services | Systems Engineer
Projects:
The details of the various assignments that I have handled are listed here, in chronological order.
1. Capgemini:
Project SIMS
Customer Schlumberger, Houston, Texas, United States
Period July 2015 – Present
Description The source systems are SAP and MS SQL Server. A BI
extractor comes into the picture when the source system is
SAP; for MS SQL Server sources we create a mapping and call
a procedure. Pulled data is loaded directly into an Oracle DB
work table. The work table is loaded into the staging table
after the business requirements are applied; staging to
target is a one-to-one mapping with some business-suggested
columns added. We maintain history tables for the work table
and the target table. Once the data is ready in the target
table (Oracle DB), the reporting team uses that table as a
source.
An audit framework is maintained across all tables: work
table → staging table → target table → history table.
Role • Client owner, responsible for all tracking and
follow-up with the business for my clients.
• Developed code by interacting with the business to
understand requirements and implemented the ETL
logic.
• Implemented cross-reference (Xref) logic.
• Expertise in developing mappings using suitable
transformations as per requirements.
• Involved in performance tuning.
• Understood the whole process and suggested changes
whenever required.
• Involved in analysis before acceptance.
• Provided guidance/training to new joiners as and
when required.
• Provided effective technical solutions within the
team.
• Enhanced the existing ETL mappings in Informatica
to meet the new requirements.
• Also part of deployments; good knowledge of
Informatica administration.
Implemented the following scenarios as a shared
resource:
• Mapping variables and mapping parameters
• Global parameters
• PowerCenter utilities (pmcmd and pmrep commands)
• Target load plan
• Indirect load
• Event wait
• Dynamic parameter file
• Performance tuning:
Pushdown optimization
Key range partitioning
Pass-through partitioning
Hash auto-key partitioning
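The dynamic parameter file item above refers to PowerCenter parameter files generated fresh before each run. A minimal hand-written example of the file format follows; the folder, workflow, session, and parameter names are hypothetical:

```
[FOLDER_SALES.WF:wf_load_sales.ST:s_m_load_sales]
$$LOAD_DATE=2015-07-01
$DBConnection_Source=ORA_SRC_DEV
$InputFile_Sales=/data/incoming/sales_20150701.csv
```

A shell script typically regenerates a file like this before each run, and the workflow picks it up through the session or workflow Parameter Filename property.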
Environment Unix
Tools Informatica 9.6.1, SQL Developer, WinSCP, PuTTY, DVO
Project Banking Solution
Customer SNS Bank, Netherlands
Period December 2012 to June 2015
Description The project covers all types of banking models: net
banking, loans, mortgages, risk management, etc.
Work involved creating new models, changing existing
code through CRs, and fixing issues raised as incident
requests.
Role • Client owner, responsible for all tracking and
follow-up with the business for my clients.
• Developed code by interacting with the business to
understand requirements and implemented the ETL
logic.
• Expertise in developing mappings using suitable
transformations as per requirements.
• Involved in performance tuning.
• Understood the whole process and suggested changes
whenever required.
• Involved in analysis before acceptance.
• Provided guidance/training to new joiners as and
when required.
• Provided effective technical solutions within the
team.
• Enhanced the existing ETL mappings in Informatica
to meet the new requirements.
• Mapping variables and mapping parameters
• Global parameters
• PowerCenter utilities (pmcmd and pmrep commands)
• Target load plan
• Indirect load
• Event wait
• Dynamic parameter file
• Performance tuning:
Pushdown optimization
Key range partitioning
Pass-through partitioning
Hash auto-key partitioning
Environment Unix
Tools Informatica 8.6, SQL Developer, WinSCP, PuTTY, Teradata, Morpheus
2. Tata Consultancy Services:
Project # 1
Project Vehicle upload
Customer GE Capital, France
Period July 2012 – December 2012
Description GE has purchased a large number of vehicles in Europe.
All vehicles need to be classified by type and model.
If a model changes, the vehicle is discontinued; the
return indicator then decides which vehicles go for
servicing.
The project loads data from flat-file source systems
into the Revel staging layer and then into the target,
applying all calculations at the transformation level.
Source files arrive from the customer on business days.
Roles & Responsibilities  Developing and debugging Informatica mappings to
resolve bugs and identify the causes of failures.
 Interacting with users to identify issues with the
data loaded through the application.
 Developed different Mappings/ Mapplets using
different transformations.
 Provided effective technical solutions within the
team.
 Responsible for handling changes.
 Responsible for code check-in.
 Participated in the design documentation.
 Enhanced the existing ETL mappings in Informatica
to meet the new requirements.
 Reviewing the code and documents prepared by
other developers.
 Extensively worked on performance tuning.
 Maintaining all the trackers.
 Taking project level initiatives.
 Mapping variables and mapping parameters
 Incremental load
 Global parameters
 Target load plan
 Indirect load
Tools Informatica 8.6, Unix, Ingres DB
Environment Unix
Project # 2
Project NIKE
Customer British Petroleum (BP),UK
Period March 2011 – June 2012
Description The NIKE application used heterogeneous sources from
different countries; each source is the daily oil-sales
invoice for its country. Various transformations are
applied per the business requirements, and the data is
finally populated into the target table (Oracle DB), so
the business can generate reports (invoices) in their
standard format.
Roles & Responsibilities  Developing and debugging Informatica mappings to
resolve bugs and identify the causes of failures.
 Provided effective technical solutions within the
team.
 Worked on Informatica tools – Source Analyzer,
Target Designer, Mapping Designer, Mapplet
Designer, and Transformation Developer.
 Created the partition for newly inserted records.
 Participated in the design documentation.
 The Mappings were Unit tested to check for the
expected results.
 Study and analysis of the mapping document
indicating the source tables, columns, data types,
transformations required, business rules to be
applied, target tables, columns and data types.
 Loaded data into the data warehouse; there were two
kinds of loading processes (daily and monthly),
depending on the frequency of the source data.
 Reviewed the code and documents prepared by the
team.
Tools Informatica 8.6, Toad, Putty
Environment Unix
Project # 3
Project Bulk Data Extracts
Customer PNC, US
Period Sept 2010 – Feb 2011
Description PNC Financial Services Group, headquartered in
Pittsburgh, PA, is one of the nation's largest financial
holding companies. The company operates through an
extensive banking network primarily in Ohio, Illinois,
Indiana, Kentucky, Michigan, Missouri and
Pennsylvania, and also serves customers in selected
markets nationally.
Bulk Data Extract: extracting zipped files and placing
them in the directory from which Informatica can pick
them up and perform the desired transformations for
loading.
Roles & Responsibilities  Understanding the requirements and preparing the
Low Level design doc based on understanding.
 Involved in designing the mappings between
sources and targets
 Developing and debugging Informatica mappings to
resolve bugs and identify the causes of failures.
 Involved in preparing the Unit Test Cases.
Tools Informatica 8.6, Toad
Environment Unix
Passport Details:
Name as on passport: Quaish Abuzer
Relationship: Self
Passport Number: J4168356
Place of Issue: Patna, Bihar
Onsite Experience:
Country | Client | Start date | End date
Netherlands | SNS Bank | 03-Mar-2013 | 27-Apr-2013
Netherlands | SNS Bank | 10-Nov-2014 | 08-Dec-2014
Current Address:
1204, A-Wing, Shiv Om Tower, Chandivali, Powai, Mumbai – 400072
Signature: