
Presentation abstracts

------------------------ Monday November 22nd. (UK time) ------------------------

11:00 Becoming a New Energy Data Professional – What are your transferable skills? Jess Kozman, Senior Principal Consultant, Katalyst DM

The transition of resource industries to a lower-carbon future has already had a profound impact on data management professionals, as many data subject matter experts with years or decades of experience in exploration, global new ventures, or regional basin and play analysis have been transferred into roles supporting carbon sequestration, hydrogen storage, geothermal, or renewables. In this presentation we will look at new data types associated with low-carbon energy domains; efforts to leverage open-source cloud data platforms for the standardization of greenhouse gas measurement, monitoring, and compliance; and lessons learned and industry-accepted optimum practices from exploration and production that can be applied to new energy data platforms. We will conclude with some upskilling pathways for those with expertise and business acumen in petro-technical data management to position themselves as champions for the Data Fit Organization of the future.

15:00 The Impact of Digitalisation on Data Professionals Fionnuala Cousins & Prof. Simon Burnett, Lecturer & Professor, Robert Gordon University

Digitalisation is impacting all industry sectors. Advances in information technologies are enabling industries, including the upstream oil and gas industry, to leverage data and information more effectively. The effective use of these technologies relies on the skills and knowledge of data and information professionals. At present, however, it is unclear whether the training and education of data professionals needed to complement these technological advances are keeping pace.

Commissioned by OGUK and gathering data from practitioners, this research developed a model of the current and possible future digital ecosystems and a digital skills roadmap to address issues identified.

SPDM Online will be the first international presentation of this project.

------------------------ Tuesday November 23rd. (UK time) ------------------------

11:00 Rethinking Data & Digital Platforms in Oil & Gas Gunnar Staff, Production Optimization, Industry Solutions, Cognite

Unified access to data across data silos is not delivering digitalization success, despite cloud data lakes being commonplace in Oil & Gas. Traditional data integration is failing to meet new business requirements that demand a combination of real-time connected data, self-service, and a high degree of automation, speed, and intelligence. While collecting data from various sources is often straightforward, E&P organizations struggle to fuse individual data sets with other data sources to deliver a comprehensive view of their real-world assets and processes - one that is not only accessible but truly understandable to all data consumers.

The best E&P data and digital platforms focus on data contextualization and democratization, enabling business users to easily discover and navigate data assets in live operational context to support various workloads and use cases.

The secret to successful E&P data and digital platforms is to place data operations and AI-based data augmentation, not cloud data storage, at their center. The goal of all data and digital platforms needs to be the creation and provisioning of highly contextualized data that engineers, data scientists, and others can intuitively understand and apply to the use cases they are solving.

The focus needs to be on automating the integration, transformation, preparation, curation, security, governance, and orchestration of a unified, contextualized data representation of the industrial reality, to enable analytics and insights quickly for business success. Those who remain focused on 'lifting and shifting' data from on-premise source-system silos to the cloud, without addressing the critical need for data augmentation by means of AI-generated active metadata, risk ending up with a cloud data swamp instead of the desired digital transformation engine.
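
As a minimal illustration of the contextualization idea described above (a sketch of my own, not Cognite's implementation; all tag names and the matching rule are made up), the toy step below links raw time-series tags from a process historian to equipment in an asset hierarchy by normalizing names, producing the kind of relationship metadata a data consumer can then navigate:

```python
import re

# Hypothetical asset tags and historian time-series tags.
assets = ["23-PT-92531", "23-TT-92532", "45-FT-11001"]
timeseries = ["IA_23PT92531.PV", "IA_23TT92532.PV", "X_UNKNOWN"]

def normalize(name):
    """Strip a historian prefix/suffix and punctuation so naming variants compare equal."""
    name = re.sub(r"^IA_|\.PV$", "", name)
    return re.sub(r"[^A-Z0-9]", "", name.upper())

def contextualize(timeseries, assets):
    """Return {timeseries_tag: asset_tag} for every normalized-name match found."""
    index = {normalize(a): a for a in assets}
    return {ts: index[normalize(ts)] for ts in timeseries if normalize(ts) in index}

links = contextualize(timeseries, assets)
print(links)  # two of the three tags resolve to assets; X_UNKNOWN stays unmatched
```

Real platforms replace the exact-match rule with fuzzy matching and ML scoring, but the output is the same in kind: metadata edges between previously siloed records.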

11:30 Offshore Daily Production Data Dashboard Robert Swiergon and Bee Smith, The Oil and Gas Authority

Since 2018, field operators have had an obligation to report daily production data per wellbore, covering the entire life of the field, following cessation of production of the field. The OGA will present how this process has been implemented, from defining reporting requirements to initial reporting, acceptance, and subsequent disclosure. The challenges of processing large data sets covering a substantial period of time, for fields with individual characteristics, and through changes in field operator will be discussed. These are some of the factors that can determine the effective and timely reporting of this data.

15:00 The Importance of Seismic Data Quality and Standards Jill Lewis, Managing Director, Troika

I have volunteered at the Technical Standards Committee (TSC) of the Society of Exploration Geophysicists (SEG) for over 22 years. My work there has been to improve data standards and enable better data quality, especially to help in the ingestion of historic data. Unfortunately, the quality of historic data has not improved, so what, in the modern computing world, can we do about this? Without addressing the quality issue we cannot look to efficiency in the Cloud, utilise AI or ML efficiently, or adopt solutions such as OSDU, and this is a matter that must be addressed. The SEG is currently updating SEG-Y_r2 to enable the auto-read of historic information that has gone through a QC process, and the SEG has signed an MOU with the OSDU group; they are working on the ability for SEG-formatted data to be used within the OSDU system, alongside VDS-formatted data. This will ensure that the system can host international data regulated by NDRs, most of which require data to be in SEG standard formats.

One of my main concerns is that data may be lost as the energy focus changes, unless it is cleaned up. With all the debate about the future of exploration, why would the data be worth keeping? Yet I feel that, as scientific data and as a resource, we do not know how it may yet be used. The examination of the near earth's crust will probably never be repeated in the same way; we may never again drill and take core samples in the same numbers. As yet no-one can predict how this information may be used, but already we see moves for carbon capture and other innovations. This data is almost certainly extremely precious in ways that we have not even thought about.

Let us look at how we can improve historic data quality and improve access to this precious information.
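
A small concrete example of why auto-reading historic data needs a QC step (this sketch is my illustration, not SEG or Troika software): the 3200-byte SEG-Y textual header is 40 "card images" of 80 characters, historically EBCDIC-encoded, while many later files use ASCII, so an ingestion tool must first detect the encoding before it can even read the header.

```python
# The classic heuristic: every card starts with the letter 'C',
# which is byte 0xC3 in EBCDIC (cp037) but 0x43 in ASCII.

def make_textual_header(lines, encoding="cp037"):
    """Pack up to 40 card images into a 3200-byte SEG-Y textual header."""
    cards = [line[:80].ljust(80) for line in lines]
    cards += [" " * 80] * (40 - len(cards))
    return "".join(cards).encode(encoding)

def detect_encoding(header):
    """Guess EBCDIC vs ASCII from the first card's leading 'C'."""
    return "cp037" if header[0] == 0xC3 else "ascii"

def read_cards(header):
    """Decode the 3200-byte header into 40 stripped card-image strings."""
    text = header.decode(detect_encoding(header))
    return [text[i:i + 80].rstrip() for i in range(0, 3200, 80)]

hdr = make_textual_header(["C 1 CLIENT: EXAMPLE SURVEY 1987", "C 2 SAMPLE INTERVAL: 4 MS"])
cards = read_cards(hdr)
print(detect_encoding(hdr), "|", cards[0])
```

Historic tapes that violate even this simple convention are exactly the files that defeat automated ingestion and motivate the SEG-Y_r2 work described above.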

15:30 Can AI accelerate adoption of the OSDU Data Platform? Jamie Cruise, Head of Products, Data, Schlumberger

Come and discover how our ready-to-run service enables seamless connection to AI and domain applications for faster and improved decisions.

------------------------ Wednesday November 24th. (UK time) ------------------------

11:00 Data Mesh Explained – how we utilize data mesh in Equinor Sun Maria Lehmann & Jorn Olmheim, Enterprise Data Architect, Equinor

Presenting some thoughts on what Data Products are and what defines them in Equinor; the importance of ownership, lifecycle management, and governance as a foundation in the organization; how we add value through design thinking methodology and techniques; and the Data Mesh explained from an Enterprise Data Architecture perspective, showing how we utilize it in Equinor through building our Data Products.

11:30 But I don’t want to migrate my data (again)! Michael van der Haven, Director Consulting Expert, CGI

Utilizing OSDU as a hub in a data mesh is a low-risk approach to opening up data and having it available at your fingertips, especially if your organization is still in the early stages of digitalization. In fact, within our organization we see quite a few customers that have made (commercial) choices in the past to move data to a new platform and are heavily invested in it, both from a monetary and an organizational perspective. These investments are often a significant hurdle to reinvesting in a platform like OSDU. Still, those organizations are very much interested in how an open platform like OSDU progresses and truly offers 'data at your fingertips'. The same goes for other organizations that do want to liberate their on-premise data from traditional silos, but also realize that a cloud journey can be long, risky, and expensive, and can even compromise security. This leads to an interesting dilemma: take a gamble and invest heavily in these challenging times, or do not invest and risk falling behind the competition.

In this talk we'll focus on a lightweight approach to OSDU, retaining your original investments while opening up new possibilities with OSDU.

15:00 Technical Information and Data Management – a Critical Enabler for Effective End-to-End Delivery of Subsurface Excellence Joshua Ukoha, Technical Data Management Lead, The Shell Petroleum Development Company of Nigeria

Proper management of subsurface technical information and data continues to play a crucial role in driving business performance and employee effectiveness, leading to sustained wins in process excellence, quality assurance, and cost improvement across business functions. Even with the best tools and equipment available in the industry, subpar data management can render the best technologies useless and can lead to undesirable safety and reputational consequences. As most oil and gas companies continue to aspire to a leading role in the emerging energy transition through an agile and nimble workforce, upstream data management and insightful visualizations have become a very important enabler for maximum value realization. However, the vast volume of data generated in the hydrocarbon maturation process makes it very difficult to achieve precise decision-making in an accurate and timely manner. Additionally, data assimilation can be made more difficult when data is not digitized, resides with different owners (or does not have clear ownership), is of poor quality, is accessible by unauthorized persons, or is not governed by a common data model. Although these challenges are frequently encountered in a typical subsurface operational environment, there are opportunities for removing data silos and effectively integrating, standardizing, and structuring our data to provide a unified platform for subsurface excellence. This paper outlines the effective, integrated approaches that have been deployed in subsurface data management and visualization efforts via several data management initiatives. These initiatives have in turn improved collaborative efficiency, the validity and reliability of subsurface realizations, and business decision-making.
Leveraging the Do-It-Yourself (DIY) framework, in-house technologies and digital solutions have also been developed and deployed across the business to enhance productivity and drive automation of routine business processes, creating significant value for the organization (~USD 4 million in savings) in areas like field management optimization and hydrocarbon discovery, maturation, and recovery. Furthermore, to establish the desired technology-enabled workforce of the future and preserve the gains, dedicated data engineering and visualization capabilities have been identified and integrated into the development plans for subsurface staff to effectively embed a data-centric culture.

15:30 A Start-to-End Data Quality Evaluation Dr. Leila Belabbassi, Lecturer, Texas A&M University

If data quality is measured by accuracy, reliability, usability, and suitability, then the process of validating data quality should start at the early stage of data collection. In practice, however, the starting point differs: quality is frequently evaluated on data at delivery, at the end of the data pipeline, after the data has already undergone a series of transformations. To ensure data quality is not lost in the transformations along the pipeline, it is important to design the evaluation process to include the entire data pipeline - from collection to delivery. The development of a data quality process starts with defining the data evaluation process. This presentation steps through the main elements that go into building this process: (1) the use of standards and constituents of the data management system to unify the parameters used in the evaluation process, (2) the adoption of automated testing to make the evaluation of large volumes of data feasible, and (3) the practice of a two-way communication process to solve data issues discovered during the evaluation. This last element ties directly to the importance of having the teams comprising the data management infrastructure commit to executing the data evaluation plan efficiently. The start-to-end data quality evaluation depends on the commitment of two types of teams: (1) the teams that collected and managed the data, who are usually responsible for reviewing issues and creating solutions, and (2) the teams that handle the data, who are responsible for fixing issues and annotating data. The end goal of the start-to-end data quality evaluation is to transform the data into an asset by easing access and increasing usability.
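
To make element (2), automated testing, concrete, here is a hedged sketch of two common per-sample quality checks; the flag scheme (1 = pass, 3 = suspect, 4 = fail) and the thresholds are illustrative assumptions of mine, not taken from the presentation:

```python
PASS, SUSPECT, FAIL = 1, 3, 4

def gross_range_test(values, fail_lo, fail_hi, suspect_lo, suspect_hi):
    """Flag values outside physically plausible (fail) or locally expected (suspect) ranges."""
    flags = []
    for v in values:
        if v < fail_lo or v > fail_hi:
            flags.append(FAIL)
        elif v < suspect_lo or v > suspect_hi:
            flags.append(SUSPECT)
        else:
            flags.append(PASS)
    return flags

def flat_line_test(values, window=3, tol=1e-9):
    """Flag a value as suspect when the last `window` values are (near-)identical,
    a typical signature of a stuck sensor."""
    flags = [PASS] * len(values)
    for i in range(window - 1, len(values)):
        recent = values[i - window + 1:i + 1]
        if max(recent) - min(recent) <= tol:
            flags[i] = SUSPECT
    return flags

temps = [11.2, 11.3, 99.0, 11.4, 11.4, 11.4]
print(gross_range_test(temps, -5, 40, 0, 30))  # [1, 1, 4, 1, 1, 1]
print(flat_line_test(temps))                   # [1, 1, 1, 1, 1, 3]
```

Because every sample carries a machine-readable flag, checks like these can screen large volumes automatically, and only flagged samples need the two-way review loop described in element (3).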

------------------------ Thursday November 25th. (UK time) ------------------------

11:00 The Superpowers to thrive in a world of constant change Siti Zubaidah Abu Bakar, Technical Data Analyst, Repsol

Living in a world of rapid change in society and industry demands the utmost adaptability and flexibility. How do we achieve that with calm and full clarity? We hope to discuss ways we may develop to endure these uncertainties and this chaos: to see with new perspectives and to shift our mindset in order to embrace the chaos, despite being overwhelmed by more and more data.

11:30 Enabling Spatial Data through SLIMM (Spatial Location Information Model and Media files) Håvard Gustad, Specialist IM, Equinor

How a spatial information model can be a key enabler for unlocking new value in media files, making the data available through data services and data products. SLIMM is an agile project, set up as a scrum team that works in sprints. To be successful we start small after thinking big: rather than pursuing one potentially big product, we've been breaking the ideas down into smaller pieces for quick testing and iteration.

15:00 Capturing assay tables in mining documents Henri Blondelle, CEO, AgileDD

Mining companies store millions of documents on file systems. Unfortunately, the information contained in those documents is frequently locked in unstructured formats such as PDF. One of our customers, Barrick Gold, a major gold mining company, has trained our AI solution to capture assay tables and associated metadata from subsurface documents. The presentation explains the open-source technology we have developed to train a hybrid model able to detect and segment assay tables at scale. It also details the implementation of the solution, from OCR of handwritten documents to the astonishing results obtained. According to our customer, confidently accessing massive amounts of legacy geochemical data reduces the number of drill holes and trenches needed, and hence the cost and environmental impact of exploration.

------------------------ Friday November 26th. (UK time) ------------------------

11:00 How Intelligent Search can support Digital Transformation Lee Hatfield, Senior IM Consultant, Flare Solutions

Digital Transformation is now considered by most organisations to be essential to creating value, improving efficiency, and building an innovative environment. The vast majority of organisations have now embarked on the Digital Transformation journey, and this presentation details how that journey can be eased and supported using Intelligent Search as a driver for change.

11:30 A Response to the UKCS Data & Digital Maturity Survey Dave Mackinnon, Technology & Innovation Manager, Technology Leadership Board

Data & Digital are the next step change in performance for our industry, and one that we cannot ignore. When we think about Data & Digital we often think of specific data solutions and analytic tools: an app, a wearable device, a remote visual inspection of an offshore operation. Instead, we should be thinking about the entire ecosystem that brings all of these solutions together in pursuit of the real goals: increased value, efficiency, reliability, and of course the reduced emissions and transformational technologies that will enable net zero.

15:00 The Future of Industry Standards Ross Philo, Chief Executive Officer, Energistics

… followed by a brief panel session with guests to be announced.
