JD 3
Competitive compensation package including benefits that support your well-being and
professional development
Everything we do boils down to ‘Why’ – our purpose – to shape a world where people and communities
thrive. We are just as focused on seeing our people thrive as our customers. We will give you every
opportunity to develop your career. We are responding faster to changing customer requirements,
focusing on what matters most, energizing our people, eliminating waste, and reducing
bureaucracy.
A happy workplace is a thriving one. To attract and keep the best talent and to thank our people for
their hard work, we make sure all our employees are rewarded.
We work flexibly and encourage you to talk to us about how this role can be flexible for you, and about
any adjustments you may require to our recruitment process or to the role itself. If you are a candidate
with a disability, please let us know how we can provide additional support.
JOB REQUIREMENTS
We are reimagining the way we help and interact with our customers, so we are looking for
candidates with creativity, an open mind and positive energy. Our detailed requirements are listed below:
At least 8 years of experience in Data Engineering or a similar role
Must-have experience:
Experience with data warehouses: on-premises (MySQL, SQL Server, Oracle, Teradata, DB2,
Databricks, MongoDB, PostgreSQL, etc.) or on cloud (AWS Redshift, GCP BigQuery, Azure
Synapse, etc.)
Experience with Business Intelligence: Power BI, Tableau, Oracle BI, Looker Studio, QuickSight,
SSRS, Qlik, IBM Cognos, SAS
Experience with ETL tools: SSIS, Informatica, Pentaho Data Integration, InfoSphere/DataStage,
Talend, SAS, Oracle Data Integrator, DBT, AWS Glue, Azure Data Factory, GCP DataFlow, etc.
Expert-level SQL (problem solving, profiling)
Data modelling experience (Dimensional Modeling, Data Vault)
Experience with a programming language (Python)
Experience in automating data collection and data analysis
Experience with Git source control
Hands-on experience working on data projects with a complete CI/CD system
Experience with unit testing for data (ETL processes, data pipelines)
Experience in building data analytics solutions on one of these cloud platforms: AWS, Azure,
GCP
Ability to influence both technical and business peers and stakeholders:
Lead by example – jump ‘on the tools’ when required to support the team
Good command of verbal English communication
Strong educational background in Information Technology (IT), Computer Science, or Software
Engineering
Preferred experience:
GCP BigQuery
DBT
GitHub Actions
Infrastructure as code (Terraform)
SALARY AND BENEFITS
Hybrid working mode (3 working days at office, flexible time)
Salary: negotiable – please feel free to tell us your expected number!
18 paid leaves/year (12 annual leaves and 6 personal leaves)
Insurance plan based on full salary
Attractive Package including 13th month salary and Performance bonus
100% full salary and benefits as an official employee from the first day of work
Medical benefits for employees and their families
Working in a fast-paced, flexible, and multinational environment, with the chance to travel onsite
(to 49 countries)
Free snacks, refreshments, and parking
Internal training (Technical & Functional & English)
Working time: 08:30 AM - 06:00 PM, Monday to Friday (meal breaks included)