
Position Description

Enterprise Applications Services

Date: 21st June, 2022


Position title: Data Engineer

Department: Marcel

Location: Position has a global remit

Reports to (role): Senior Manager - Data

Main purpose:
 Be the direct contact for internal clients on all technical aspects during the
initial engagement and kick-off stages of application development projects.
 Drive high-level solution architecture design (conceptual, logical, physical).
 Support the project team on all Data Platform, Data Science Models, Data
Feeds and other data management related issues.
 Provide knowledge of the full-scale continuous delivery process and solutions,
and understand continuous integration and automated test (unit, front-end,
load) frameworks, scripts and other artefacts to accelerate data feed
development and integration with the Data Lake and front-end features.
 Be responsible for the setup, continuous maintenance, monitoring and
troubleshooting of development, testing, staging and production
environments across internal clients' accounts, in collaboration with the
technology team.
Dimensions: Employees managed: Direct: 0; Indirect: 0
Key responsibilities: The key accountabilities for this role include, but are not limited to:

 Lead and mentor a team of Data Engineers across complex Data Pipelines
for a variety of consumers.
 Create and maintain optimal data pipeline architecture.
 Assemble large, complex data sets that meet functional / non-functional
business requirements.
 Identify, design, and implement internal process improvements: automating
manual processes, optimizing data delivery, re-designing infrastructure for
greater scalability, etc.
 Build the infrastructure required for optimal extraction, transformation, and
loading of data from a wide variety of data sources using SQL and Azure
‘big data’ technologies.
 Build analytics tools that utilize the data pipeline to provide actionable
insights into customer acquisition, operational efficiency and other key
business performance metrics.
 Work with stakeholders including the Executive, Product, Data and Design
teams to assist with data-related technical issues and support their data
infrastructure needs.
 Create data tools for analytics and data scientist team members that assist
them in building and optimizing our product into an innovative industry
leader.
 Work with data and analytics experts to strive for greater functionality in our
data systems.
 Ensure that the deliverables for each Sprint are clearly understood by the Agile
Team(s).
 Ensure that the Agile team(s) delivers working software of sufficient quality
to deliver to clients at the end of each development sprint.
 Ensure that source control repositories are appropriately managed.
 Ensure that the Agile team receives sufficient resourcing to be able to complete its
objectives.

Specific responsibilities:
 Write maintainable and effective data feeds and pipelines (a minimal sketch follows this list).
 Follow best practices for a test-driven environment and continuous integration.
 Design, develop, test and implement end-to-end requirements.
 Contribute to all phases of the development life cycle.
 Perform unit testing and troubleshoot applications.
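
For illustration only, a minimal sketch of the kind of maintainable, unit-tested PySpark feed transformation these responsibilities describe. The function, column names and test data are hypothetical; pyspark and pytest are assumed to be available.

# Hypothetical sketch: a small, testable PySpark feed transformation and a
# pytest-style unit test. Column and function names are invented.
from pyspark.sql import DataFrame, SparkSession
from pyspark.sql import functions as F

def deduplicate_feed(df: DataFrame) -> DataFrame:
    """Keep the latest record per customer_id, based on updated_at."""
    latest = df.groupBy("customer_id").agg(F.max("updated_at").alias("updated_at"))
    return df.join(latest, on=["customer_id", "updated_at"], how="inner")

def test_deduplicate_feed():
    spark = SparkSession.builder.master("local[1]").appName("feed-test").getOrCreate()
    rows = [("c1", "2022-06-01"), ("c1", "2022-06-20"), ("c2", "2022-06-10")]
    df = spark.createDataFrame(rows, ["customer_id", "updated_at"])
    assert deduplicate_feed(df).count() == 2  # one latest row per customer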

Business Compliance: Ensure a sound understanding of, demonstrate commitment to, and comply with
all legislation and Publicis Groupe policies, e.g. Janus, GSO and IT policies.

Personal & Team Accountabilities: Actively develop and maintain strong working relationships with all Re:Sources
personnel, both at an interpersonal level and across all business processes within the
wider business environment.
Actively maintain communication and behaviour standards that foster a culture of
strong customer service excellence, both within Re:Sources and across all
customer and supplier organisations.

Key relationships (internal &/or external): Re:Sources Teams, particularly Business Improvement and IT;
Agile Development Teams; External System Suppliers
Key competencies:

Minimum experience (relevant): 5 years
Maximum experience (relevant): 9 years

Must have skills:


 Strong experience in Azure, ADF, PySpark, Scala, Databricks, SQL (an illustrative pipeline sketch follows this list).
 Experience with ETLs, JSON, Hop or ETL orchestration tools.
 Experience working with EventHub and streaming data.
 Understanding of ML models and experience building ML pipelines with
MLflow and Airflow.
 Experience with big data tools: Hadoop, Spark, Kafka, etc.
 Experience with relational SQL and NoSQL databases, including Postgres
and Cassandra.
 Experience with data pipeline and workflow management tools: Azkaban,
Luigi, Airflow, etc.
 Experience with stream-processing systems: Storm, Spark-Streaming, etc.
 Understanding of graph data; Neo4j is a plus.
 Strong knowledge of Azure-based services.
 Strong understanding of RDBMS data structure, Azure Tables, Blob, and
other data sources
 Experience with test-driven development.
 Experience in Power BI or other BI tools.
 Understanding of Jenkins and CI/CD processes using ADF and Databricks
preferred.
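
By way of illustration, a minimal sketch of the kind of PySpark/Databricks batch pipeline the skills above refer to, assuming a Delta-enabled Spark environment. All storage paths, container names and columns are hypothetical placeholders, not part of this position description.

# Illustrative sketch only: a simple batch pipeline from raw JSON in a
# (hypothetical) data lake location to a curated Delta table.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-feed").getOrCreate()

# Land raw JSON from a placeholder data lake path.
raw = spark.read.json("abfss://raw@examplelake.dfs.core.windows.net/orders/")

# Basic cleansing and typing before publishing to the curated zone.
curated = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_date", F.to_date("order_date"))
       .filter(F.col("amount") > 0)
)

# Write as Delta, partitioned by date, so downstream SQL and BI tools can query it.
(curated.write
        .format("delta")
        .mode("overwrite")
        .partitionBy("order_date")
        .save("abfss://curated@examplelake.dfs.core.windows.net/orders/"))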

Good to have skills:


 Bachelor's degree in engineering, computer science, information systems,
or a related field from an accredited college or university; Master's degree
from an accredited college or university is preferred
 Advanced working SQL knowledge and experience with relational databases,
including query authoring (SQL) and working familiarity with a variety of
databases.
 Experience building and optimizing ADF and PySpark based data pipelines,
architectures and data sets on Graph and Azure Datalake.
 Experience performing root cause analysis on internal and external data
and processes to answer specific business questions and identify
opportunities for improvement.
 Strong analytic skills related to working with unstructured datasets.
 Experience building processes supporting data transformation, data structures, metadata,
dependency and workload management.
 A successful history of manipulating, processing and extracting value from
large disconnected datasets.
 Working knowledge of message queuing, stream processing, and highly
scalable Azure-based data stores (see the streaming sketch after this list).
 Strong project management and organizational skills.
 Experience supporting and working with cross-functional teams in a
dynamic environment.
 Understanding of Node.js is a plus, but not required.
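
Again for illustration only, a minimal Spark Structured Streaming sketch standing in for the stream-processing and message-queuing experience listed above. The built-in rate source substitutes for a real EventHub or Kafka feed; the window size and console sink are illustrative assumptions.

# Hypothetical sketch: windowed aggregation over a streaming source.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("stream-demo").getOrCreate()

# The built-in "rate" source emits (timestamp, value) rows; a real job
# would read from a broker such as EventHub or Kafka instead.
events = spark.readStream.format("rate").option("rowsPerSecond", 10).load()

# Tumbling one-minute window count, a common streaming aggregation.
counts = events.groupBy(F.window("timestamp", "1 minute")).count()

query = (counts.writeStream
               .outputMode("complete")
               .format("console")
               .trigger(processingTime="30 seconds")
               .start())
query.awaitTermination()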

Attributes/behaviours
 Ability to design, develop and implement complex requirements.
 Building reusable components and front-end libraries for future use.
 Translating designs and wireframes into high-quality code.
 Pro-active support to the business is a key attribute for this role, with a
customer service focus that links systems requirements to business
outcomes.

Declaration

I have fully read the above position description. I declare that:


1. I accept and understand my responsibilities listed in this position description and agree to carry
them out to the best of my ability.
2. I accept and understand the potential implications of breaching my responsibilities listed in this
position description, including the possibility of disciplinary action and / or dismissal.

Date:
Employee Name:

Employee Signature:

Date:
Manager Name:

Manager Signature:

This job description in no way states or implies that these are the only duties to be performed by the employee(s)
currently in this position. Employee(s) will be required to follow any other job-related instructions and to perform
any other job-related duties requested by any person authorized to give instructions or assignments.

A review of this position has excluded the marginal functions of the position that are incidental to the
performance of fundamental job duties. All duties and responsibilities are essential job functions and
requirements and are subject to possible modification to reasonably accommodate individuals with disabilities.
To perform this job successfully, the incumbent(s) will possess the skills, aptitudes, and abilities to perform each
duty proficiently. Some requirements may exclude individuals who pose a direct threat or significant risk to the
health or safety of themselves or others. The requirements listed in this document are the minimum levels of
knowledge, skills, or abilities.

This document does not create an employment contract, implied or otherwise, other than an "at-will"
relationship.

Re:Sources is an Equal Opportunity Employer

