
SPE-188962-MS

Upstream Data Architecture and Data Governance Framework for Efficient


Integrated Upstream Workflows and Operations

Richard Mohan David, Dr. Luigi Saputelli, Hafez Hafez, Dr. Ram Narayanan, Pascal Colombani, and Tayba Al
Naqbi, ADNOC

Copyright 2017, Society of Petroleum Engineers

This paper was prepared for presentation at the Abu Dhabi International Petroleum Exhibition & Conference held in Abu Dhabi, UAE, 13-16 November 2017.

This paper was selected for presentation by an SPE program committee following review of information contained in an abstract submitted by the author(s). Contents
of the paper have not been reviewed by the Society of Petroleum Engineers and are subject to correction by the author(s). The material does not necessarily reflect
any position of the Society of Petroleum Engineers, its officers, or members. Electronic reproduction, distribution, or storage of any part of this paper without the written
consent of the Society of Petroleum Engineers is prohibited. Permission to reproduce in print is restricted to an abstract of not more than 300 words; illustrations may
not be copied. The abstract must contain conspicuous acknowledgment of SPE copyright.

Abstract
Integrated upstream workflows require efficient sharing of data across business disciplines, involving planning, studies, development and operations, to carry out effective monitoring and surveillance of reservoirs and operations for better reservoir management and production optimization. ADNOC requires quality, complete and timely multi-disciplinary data from its operating companies to efficiently monitor its assets and provide informed direction.
An Integrated Data Architecture and Data Governance Framework is being established to enable efficient delivery of data for executing integrated upstream workflows in a multi-disciplinary and multi-vendor technology environment. The framework helps standardize and streamline G&G, reservoir, production and operations data and processes, and enables integrated reservoir management workflows and inter-disciplinary collaboration in a multi-vendor technology environment.
The framework provides guidance and best practices for building a common data architecture foundation for integrated upstream workflows. It enables efficient governance of data across its lifecycle, and makes the relevant business teams and data custodians take responsibility for validating, storing and delivering static and real-time data in a timely manner for integrated upstream workflows. The framework components form a key foundation for efficient integrated workflows towards realizing integrated operations, closed-loop optimization and the full benefit of smart field investments.

Introduction
Integrated Upstream workflows in ADNOC include workflows under integrated processes and initiatives
such as IAM, IRM, ICM, PEI, PAP, RPM and reservoir and production performance management. A
common data foundation delivering quality, complete and timely data from different data sources is essential
to execute these workflows efficiently in an integrated manner.
Typical data-related challenges in the upstream industry include data incompleteness, lack of clear data ownership, multiple copies of data scattered across different source systems, data held in numerous spreadsheets, time-consuming manual data loading and validation processes, data integrity issues between systems, and real-time data integration and its associated quality issues. Further challenges include integrating the disparate systems and technologies that deliver integrated upstream workflows, including smart field workflows, and continuously keeping the models up to date to match actual field conditions.
Streamlined access to all subsurface data, production figures, well and facilities status, business plans, economic data, and regulatory and technical guidelines is essential for the upstream workflows and for keeping the technical models up to date. ADNOC is establishing an Integrated Data Architecture and Governance Framework that enables efficient execution of integrated upstream workflows, realized through the Upstream Data Hub initiative. The Upstream Data Hub brings together ADNOC's various upstream information management capabilities and technologies, leveraging the integrated data architecture and data governance framework to provide one integrated place for all upstream data so the business can carry out its processes efficiently.
As a use case, IRM is discussed below to illustrate the importance of establishing IRM workflows on the common integrated data and governance framework.

Use Case - Integrated Reservoir Management (IRM)


Integrated Reservoir Management (IRM) is an upstream petroleum industry term for the supervision and control activities used to manage subsurface hydrocarbon reservoirs (Al Marzouqi et al., 2016). Integrated reservoir management frameworks have been deployed across assets around the world to ensure reservoir management business goals are met. The main motivation for embracing a new framework for reservoir management was the need to promote innovative opportunities for reservoir management improvement. Reservoir management is a process involving data acquisition, validation, analysis, integration, opportunity generation and execution that can mitigate the outcome of decisions made in the presence of uncertainty.
IRM brings multi-disciplinary reservoir performance management processes, workflows, technology and teams together, and creates synergy through interactive workflows and collaboration for better management of reservoir performance, maximum production rates and enhanced recovery. Fundamental to the IRM strategy is keeping the various reservoir and production processes and workflows integrated, and keeping the models (well, static, dynamic, network and facilities models) as close as possible to field reality. It is important for asset teams to have standard, integrated reservoir management processes, workflows and data in order to manage reservoirs efficiently and optimally. The following sections of the paper discuss the various components of the integrated framework.

Integrated Data Architecture and Governance Framework


Integrated upstream workflows require efficient sharing and integration of data across business disciplines involving planning, studies, development and operations. The Integrated Data Architecture and Governance Framework (Figure 1) provides guidelines and best practices to manage and orchestrate integrated upstream workflows seamlessly in a multi-vendor technology environment, driven by industry data standards and data governance processes.

Figure 1—Integrated Data Architecture and Governance Framework Components

The standards-based framework enables seamless, consistent delivery of integrated static, operational and analytical data for the integrated upstream workflows through a common data model. The framework comprises a data dictionary, data models, data standards, data workflows, data integration, dataflow automation, a data governance organization, data quality management and service level agreements as its key components.

E&P Ontology and Upstream Data Dictionary


An E&P ontology is a controlled, structured vocabulary (data taxonomy, data dictionary) describing oil field data objects and their inter-relationships, enabling search, data integration and interoperability. Standard naming conventions for wells, reservoirs and facilities, units of measurement, activity and cost codes, failure codes, asset specifications, location details and so on enable tracking metadata across the variety of systems that capture data across E&P business processes. ADNOC is building its common upstream data dictionary. The data dictionary provides detailed information about business data, such as the standard data definition and common terminology for business data elements, their attributes, the business usage of the data, the frequency with which the data is required for business workflows, allowable values, the system of record/source database, and the data owner and custodian. The common upstream data dictionary is a fundamental artifact, developed and centrally managed for upstream projects, to ensure all projects use a common set of data and thereby ensure analysis, workflow and reporting integrity. A data dictionary also enables effective communication with business stakeholders on requirements when building the upstream data architecture.
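The attributes listed above can be sketched as a minimal data dictionary entry. This is an illustrative structure only; the field names and the example values are assumptions, not ADNOC's actual dictionary schema.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a data dictionary entry capturing the attributes
# described in the text; names and fields are illustrative.
@dataclass
class DataDictionaryEntry:
    name: str                    # standard business name of the data element
    definition: str              # standard data definition
    unit_of_measure: str         # standard unit, e.g. "STB/d"
    update_frequency: str        # how often workflows need the data
    allowable_values: list = field(default_factory=list)  # empty = unconstrained
    system_of_record: str = ""   # source database holding the master copy
    data_owner: str = ""         # accountable business role
    data_custodian: str = ""     # role responsible for day-to-day upkeep

entry = DataDictionaryEntry(
    name="Well Status",
    definition="Current operational status of a well",
    unit_of_measure="n/a",
    update_frequency="daily",
    allowable_values=["flowing", "shut-in", "injecting", "abandoned"],
    system_of_record="production_db",
    data_owner="Production Engineering",
    data_custodian="Data Management Team",
)
```

Centrally managing entries like this one is what lets every project resolve a term such as "Well Status" to the same definition, source system and allowable values.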

Data Standards & Integration


Data standards and ontologies provide a robust and reliable data exchange mechanism that enables data integration and interoperability across a multitude of databases and systems. Standards such as PPDM, ProdML, WITSML, OPC, OPC-UA, Modbus, HART, ISA-95, MIMOSA, ISO 15926, etc. form an essential component of the integrated data architecture. The integrated data architecture provides a common data model and foundation for implementing multi-vendor and/or in-house upstream workflows and visualization, enabling interoperability and integrity of data analysis amongst them, including existing applications. The architecture makes use of industry-standard data models, data quality, metadata, data lineage, master data management and data governance to facilitate upstream process and workflow automation. An upstream data integration hub is being established to deliver data seamlessly from different data sources leveraging the framework.
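As a rough illustration of standards-based exchange, the sketch below reads a ProdML-style XML fragment into a common internal model. The fragment is deliberately simplified and not schema-conformant: real ProdML documents are namespace-qualified and far richer, so the element names here are assumptions for illustration only.

```python
import xml.etree.ElementTree as ET

# Simplified, non-conformant ProdML-style fragment (illustrative only).
doc = """
<productionVolume>
  <well uid="W-001">
    <oilRate uom="STB/d">1250.5</oilRate>
    <waterRate uom="STB/d">310.0</waterRate>
  </well>
</productionVolume>
"""

# Map each well's rates into a common internal model keyed by well uid,
# keeping the value together with its unit of measure.
root = ET.fromstring(doc)
common_model = {}
for well in root.findall("well"):
    uid = well.get("uid")
    common_model[uid] = {
        child.tag: (float(child.text), child.get("uom")) for child in well
    }
```

The point of the standard is that any vendor system emitting the same exchange format can be mapped by the same adapter into the common data model, rather than each integration being bespoke.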

Figure 2 presents the different integrated data architecture components orchestrated to systematically harness the business value of data. The data governance model oversees the different data management activities and work processes, ensuring delivery of quality, complete data and its effective usage.

Figure 2—Integrated Data Architecture and Governance Framework

Data and Model Governance


Data governance across the source data environment ensures that the data inputs for integrated upstream processes and models are governed to maintain data validity and completeness. This is enabled by having data policies and procedures, and by defining data definitions, data sources, a data governance committee, data ownership, data custodianship, data flows, business rules, a data issue resolution process and service level agreements to deliver data for integrated upstream workflows and automation. Engineering models need to be continuously governed by defining clear roles, responsibilities, user privileges and role-based security for model creation, updating, change tracking, validation, auditing, data source lineage and version management. With the increase in automation and machine learning processes, clear governance and guidelines must delineate which processes go through automation and which require human intervention. ADNOC is establishing group-level upstream data governance. The data governance committee shall consist of a steering committee comprising the leadership team, and a focus group comprising data champions, data stewards and custodians, to set data standards, processes and workflows and oversee their implementation.
Any technology providers participating in the Integrated Upstream Workflow in the company should
adhere to the framework to support interoperability and avoid creating silos of information. Data should
go in once in the smart field environment and be usable throughout the asset's lifecycle and by various
workflows and applications. Data governance initiatives improve data quality.
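Role-based security for model governance, as described above, can be sketched as a simple permission table. The role names and actions below are hypothetical, not ADNOC's actual governance scheme.

```python
# Hypothetical role-based permission check for engineering model governance.
MODEL_PERMISSIONS = {
    "model_owner":        {"create", "update", "validate", "audit"},
    "reservoir_engineer": {"update", "validate"},
    "viewer":             set(),   # read-only role: no governed actions
}

def can(role: str, action: str) -> bool:
    """Return True if the given role is allowed to perform the action."""
    return action in MODEL_PERMISSIONS.get(role, set())
```

In practice such rules would live in the organization's identity and access management layer; the value of stating them explicitly is that model changes become auditable against a single governed policy.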

Data Quality Management


Analyzing trends and patterns, performing current business operations, making operational decisions, and evaluating future business options and opportunities are possible only with high-quality data. Data quality management addresses defining, monitoring and maintaining data integrity, and improving data quality. During the planning and design phases it is important to define the data elements and the quality required for each workflow's intended use. Specific characteristics of the data that can be measured or assessed must be identified to understand its quality; DAMA-UK (2013), for example, recommends completeness, uniqueness, timeliness, validity, accuracy and consistency. The impacts of low quality should be identified, and data profiling and correction processes must be defined. During the operational phase, it is necessary to ensure compliance with data quality requirements, including metrics and adherence to defined standards (Data Management Association International, 2014).
Data quality needs to be addressed primarily in the source systems on which the business operates. Continuous availability of the quality, complete data required for integrated upstream workflows is fundamental to integrated upstream processes. Data quality management processes and procedures have to be established to ensure the input data (real-time, static or historical) is validated, extracted, filtered, conditioned and reconciled, and that it accurately represents the asset, before it is used in any business workflow, calculation or business logic. It is critical to ensure there is sufficient data, continuously updated with new field data and field status, to keep the models (physics-based or analytics) up to date and matching actual conditions. Invalid sensor data, wrong assumptions, and insufficient or inconsistent data lead to poor predictability and uncertainty in the models. Exception-based data validation enables real-time quality control for high-frequency real-time data from the field. The data quality management processes should encompass the reservoir data acquisition and surveillance needs of integrated upstream workflows and the associated data management activities. A dashboard showing the quality status of key data items, together with a quality management workflow, is essential to continuously monitor, report and resolve data quality issues.
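Exception-based validation of high-frequency sensor data can be sketched as follows: readings pass silently, and only missing or out-of-range values generate exception records for review. The tag names and limits are hypothetical; the completeness ratio at the end illustrates one of the DAMA-UK quality dimensions mentioned above.

```python
# Hypothetical valid ranges per sensor tag (illustrative values).
VALID_RANGES = {
    "wellhead_pressure_psi": (0.0, 10000.0),
    "oil_rate_stb_d": (0.0, 50000.0),
}

def validate_reading(tag, value):
    """Return None if the reading passes, else an exception record."""
    if value is None:
        return {"tag": tag, "value": value, "issue": "missing"}
    lo, hi = VALID_RANGES[tag]
    if not (lo <= value <= hi):
        return {"tag": tag, "value": value, "issue": "out_of_range"}
    return None

readings = [
    ("wellhead_pressure_psi", 3200.0),   # normal reading
    ("wellhead_pressure_psi", -15.0),    # faulty sensor
    ("oil_rate_stb_d", None),            # missing value
]
# Only failing readings surface as exceptions.
exceptions = [e for t, v in readings if (e := validate_reading(t, v))]
# Completeness: one of the DAMA-UK quality dimensions.
completeness = sum(v is not None for _, v in readings) / len(readings)
```

The exception list is exactly what a quality dashboard would surface, so engineers review a handful of flagged readings instead of the full data stream.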

Data Workflows and Automation


A streamlined data workflow, directed by established data governance and data quality management practices, is a key building block of the framework. Data held in unstructured formats such as spreadsheets and documents, and the manual loading of data, are time-consuming and error-prone and need to be eliminated for effective data management. Standardizing and automating the entire data life cycle for key technical datasets (such as well tests, well logs and PVT), from data requisition and acquisition through data delivery, quality control and loading into the relevant source repositories, enables data assurance and timely delivery of data for integrated upstream workflows. Continuous effort should be made to bring such technical data from spreadsheets and documents into the relevant source databases.
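An automated loading step of the kind described above might look like the sketch below: well-test records are parsed, validated against a business rule, and loaded into a repository in one pass, with failures routed to review rather than silently loaded. The column names, rule and in-memory database are illustrative assumptions.

```python
import csv
import io
import sqlite3

# Illustrative well-test file content (would normally come from acquisition).
raw = io.StringIO(
    "well,test_date,oil_rate_stb_d\n"
    "W-001,2017-06-01,1250.5\n"
    "W-002,2017-06-01,-40.0\n"   # fails validation, routed to review
)

# In-memory stand-in for the source repository.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE well_test (well TEXT, test_date TEXT, oil_rate REAL)")

rejected = []
for row in csv.DictReader(raw):
    rate = float(row["oil_rate_stb_d"])
    if rate < 0:                 # simple business rule: rates are non-negative
        rejected.append(row)     # exception queue for manual review
        continue
    conn.execute(
        "INSERT INTO well_test VALUES (?, ?, ?)",
        (row["well"], row["test_date"], rate),
    )

loaded = conn.execute("SELECT COUNT(*) FROM well_test").fetchone()[0]
```

Because validation and loading run as one automated step, the repository never receives unvalidated records and the rejected rows carry enough context to be corrected at the source.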

Upstream Data Visualization and Analytics


Upstream workflows require integrated visualization of information such as reservoir and production facts and figures (both planned and actual), trends, well and equipment status, performance information, economics and KPIs, all of which should be consistent across workflows and reporting levels without integrity issues or inconsistencies. Establishing a common upstream visualization library enables sharing of common visualizations across upstream workflows, leveraging the common data foundation (the Upstream Data Hub). The framework recommends building common KPI definitions, common data sources, and common business logic and calculations for upstream dashboards, data analytics and other smart field workflows.
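The idea of a common KPI definition can be sketched as a single registry that every dashboard computes from, so "production efficiency" means the same thing everywhere. The KPI name and its actual/planned formula are assumptions for illustration, not the paper's defined KPIs.

```python
# Hypothetical central KPI registry: one definition, used by all dashboards.
KPI_DEFINITIONS = {
    # Assumed simple form: actual production over planned production.
    "production_efficiency": lambda actual, planned: actual / planned,
}

def compute_kpi(name, **inputs):
    """Evaluate a KPI by name using its single, shared definition."""
    return KPI_DEFINITIONS[name](**inputs)

# Every consumer gets the identical figure for the same inputs.
eff = compute_kpi("production_efficiency", actual=9500.0, planned=10000.0)
```

Keeping the formula in one governed place removes the inconsistency that arises when each dashboard re-implements the same KPI with slightly different logic or data sources.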

Conclusion and Way Forward


A common upstream data foundation, along with an effective data governance protocol, is essential for ADNOC to implement and execute its various upstream integrated workflows and initiatives efficiently. A structured approach and systematic management of the data architecture foundation enable a move towards higher levels of integrated workflows, visualization, smart field automation and advanced analytics. ADNOC looks forward to progressively implementing the framework and continuously improving the upstream data foundation to facilitate current and future upstream workflows and initiatives.

Nomenclature
ADNOC = Abu Dhabi National Oil Company
IRM = Integrated Reservoir Management
IAM = Integrated Asset Management
ICM = Integrated Capacity Management

PEI = Production Efficiency Improvement
PAP = Production Assurance Improvement
KPI = Key Performance Indicator

References
Al Marzouqi, M. A., Bahamaish, J., Al Jenaibi, H., Al Hammadi, H., and Saputelli, L. (2016). "Building ADNOC's Pillars for Process Standardization and Best Practices through an Integrated Reservoir Management Framework." Paper SPE-183421-MS, presented at the Abu Dhabi International Petroleum Exhibition and Conference, Abu Dhabi, UAE, 7–10 November 2016.
Al Marzouqi, M. A., Bahamaish, J., and Saputelli, L. (2017). "Benefits of Implementing Integrated Reservoir Management (IRM) Framework by Closing Gaps in People, Process and Technology – Second Act." Paper SPE-188348-MS, presented at the Abu Dhabi International Petroleum Exhibition and Conference, Abu Dhabi, UAE, 13–16 November 2017.
David, R. M., et al. (2015). "Managing and Orchestrating Multi-Vendor Intelligent Oil Field Technology Environment to Enable Efficient Next Generation Production and Reservoir Management Workflows." Paper 185440, presented at the SPE Middle East Intelligent Oil & Gas Conference & Exhibition, Abu Dhabi, UAE, 15–16 September 2015.
David, R. M. (2016). "Approach towards Establishing Unified Petroleum Data Analytics Environment to Enable Data Driven Operations Decisions." Paper 185440, presented at the SPE Annual Technical Conference and Exhibition, Dubai, UAE, 26–28 September 2016.
Kalaydjian, F., and Bourbiaux, B. (2002). "Integrated Reservoir Management: A Powerful Method to Add Value to Companies' Assets." Oil & Gas Science and Technology – Rev. IFP, Vol. 57, No. 3.
Satter, A., et al. (1994). "Integrated Reservoir Management." Journal of Petroleum Technology, December 1994.
Data Management Association International – DAMA (2014). DAMA DMBOK-2. www.dama.org
DAMA-UK (2013). "The Six Primary Dimensions for Data Quality." http://damauk.org/
