
Presentation abstracts

------------------------ Monday November 28th. (UK time) ------------------------

11:00 Dynamic Digital Twin – Creating real value from structured data – What are your transferable skills? - Endre Nisja, IM Advisor, Shell. Over the last 5 years, Shell has implemented a Dynamic Digital Twin for the Ormen Lange gas field and the Nyhamna gas processing plant, which spans the whole process from reservoir to market.

11:30 The NDA’s Digital Transformation Journey - Dave Clark, AM&CI Manager, NDA. Transforming the knowledge of the past into insight for the future.

15:00 Sustainable utilization of the subsurface on the Norwegian Continental Shelf, Ying Guo, Senior Advisor / Business Developer, CCUS, IOR, Norce.

This presentation gives the background to the need for a transition to sustainable oil and gas exploration and production on the NCS. The newly established Petrosenter NCS2030, hosted by the University of Stavanger together with its research partners the Norwegian Research Centre (NORCE), the Institute for Energy Technology (IFE) and the University of Bergen (UiB), will assist the transition through research, education and innovation, with a focus on the mature areas of the Norwegian Continental Shelf.

Sustainable utilization of the subsurface means using oil and gas reservoirs beyond what we currently do: for example, storing CO2 to reduce CO2 emissions, using H2 as a battery for energy storage, and harvesting geothermal heat resources. Understanding subsurface properties, and the digital information that describes them, will be important. Access to quality data for subsurface evaluations with advanced tools such as models, AI and ML is critical for this transition.

15:30 Making complex problems simple with design thinking and creating great user experiences - Lene Regine Sorskar, UX designer, Okse; and Trond Roar Eide, Senior User Experience Designer, These Ways AS.

The purpose of this presentation is to teach people about the value of a UX designer and design thinking. We focus on giving the user a voice so that we can create a valuable solution they will actually use. For those who don't have a designer on their team, the presentation also offers a hands-on approach they can use to solve complex problems.

------------------------ Tuesday November 29th. (UK time) ------------------------

11:00 AVATARA-p: Advanced Augmented Analysis Robot for Palynology - Robert Williams, Paleontologist: marine microplankton, NPD.

Operators and consulting laboratories routinely produce palynological slides from drill cores and cuttings. Analysis of these slides is used to establish three geological factors: 1) age dating of strata, 2) environment of deposition, and 3) fossil biopolymer sources.

The Norwegian Petroleum Directorate (NPD) currently houses approximately 150,000 palynological slides in the Geobank. This archive of biopolymer fossils sheds light on important events in Norway's geological history in the form of fossil pollen, spores and microplankton that lived during the last four hundred million years.

To expand the industry's understanding of the subsurface, the NPD has shared its archive of palynological slides from released wells with companies and academia for over forty years. Increased geological understanding reduces margins of error inherent in geological mapping and thus reduces the economic risk of exploration and production. Therefore, both industry and the nation benefit from electronic access to an abundance of palynological raw data.

During the 14th European Congress on Digital Pathology held in Helsinki in 2018, the NPD gained insight into technological advances in slide scanning and artificial-intelligence-assisted medical diagnosis. Palynologists can now reap the benefits of twenty years of research and development driven by medical research. This has resulted in optical resolution of less than 0.29 micrometres, digital resolution of more than twenty gigapixels per slide, seamless image mosaics and focal plane stacking.

Through the AVATARA project, the NPD applies technological advances in digital pathology to geological disciplines based on traditional microscopy. Microfossils are scanned at an optical resolution typical of high-end light microscopes. Large 4K or 8K screens give the user an immersive visual experience that provides more information than narrow fields of view characteristic of traditional optical microscopy. The user can tag microfossils to improve taxonomic consensus and discussion between working groups. The user interface exports images of annotated specimens as training data for machine learning. The medical pathologist’s tools for improving diagnoses through interprofessional communication and decision-making are now available to biostratigraphers through the Avatara project.

11:30 Subsurface samples: To retain or not to retain, that is the question - Kerry Blinston, Head of Energy Solutions Group, Oasis.

How long should we retain subsurface core and cuttings samples? As areas mature and fields are decommissioned, the ongoing cost of storage is frequently questioned and often drives a review of the retention requirement. However, new techniques and energy transition activities can spark renewed interest in subsurface samples. Additionally, regulators frequently require the submission of a subset of each sample and retain that set themselves; they must also be notified of, and permit, the disposal of the operator's set. The arguments for retention include the uniqueness of the data, the cost of acquisition and the probability of new technologies and use cases emerging. The arguments against relate to cost and preservation issues. At a point in time, the argument for disposal is compelling. In this presentation, drawing on our recent experience providing these services to clients, we put forward what we consider best practice for core and cuttings disposal.

15:00 Equinor OSDU Adaptation Experience - Einar Landre, Lead Analyst IT, Equinor.

15:30 OSDU challenges and the inter-relation to pre-existing standards such as PPDM and Energistics - Simon Kendall, Director of Data Management Strategy, Petrosys Interica.

As the pace of OSDU adoption increases, several trends, or needs, are emerging across the industry. These include, but are not limited to, the following prerequisites as companies stage their adoption of cloud and OSDU-based working.

  • The requirement to move not only the data but also its audit trail, to ensure the data remains trusted
  • The need, in documenting data lineage, to ensure both that data is freed from the application and that an audit trail exists for “trusted” data

An OSDU data migration faces several challenges: for example, the need to avoid duplicating data, to protect data in legacy systems that are not migrated, and to move trusted data only. In addition, use cases demonstrate that, to date, not all data types are supported. The need to hold some trusted data outside OSDU will therefore persist for perhaps another 3-5 years.
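A minimal sketch of the first two prerequisites above, under stated assumptions: the record model, field names and `migrate` helper below are hypothetical illustrations, not an OSDU schema or API. The point is only the data flow: trusted records move together with their audit trail, while untrusted or already-migrated records are skipped.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class AuditEvent:
    actor: str
    action: str

@dataclass
class DataRecord:
    record_id: str
    trusted: bool
    audit_trail: list = field(default_factory=list)

def migrate(record: DataRecord, migrated_ids: set) -> Optional[DataRecord]:
    """Move only trusted, not-yet-migrated records, carrying the audit trail along."""
    if not record.trusted:
        return None                        # untrusted data stays in the legacy system
    if record.record_id in migrated_ids:
        return None                        # avoid duplicating already-migrated data
    migrated_ids.add(record.record_id)
    record.audit_trail.append(AuditEvent("migration-service", "migrated"))
    return record

migrated_ids: set = set()
rec = DataRecord("well-123", trusted=True,
                 audit_trail=[AuditEvent("geologist", "qc-approved")])
out = migrate(rec, migrated_ids)
print(out is not None, len(out.audit_trail))   # the lineage now records the move itself
```

Appending the migration itself to the audit trail is one simple way to keep the lineage intact across systems, so the record's "trusted" status can still be justified after the move.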

This talk will look historically at the development of standards, their current application, and how they will support OSDU in the future.

------------------------ Wednesday November 30th. (UK time) ------------------------

11:00 Data to the Many – Implementing data mesh in Equinor - Nina Reiersgaard, Advisor Enterprise Data Management, Equinor. How we approach data products, data platform, and a federated model for data governance.

11:30 SecondLife for Distributed Data - Jess Kozman, Senior Principal Consultant, Katalyst Data Management.

Just when we thought we were seeing a decline in seismic exploration, and therefore lowered interest in storing and managing large volumes of data, the business environment has shifted again. This shouldn’t be a surprise to anyone who has had experience with the cyclical nature of the resource industry, and if anything, it reinforces how our data management strategies must be deliberately resilient to factors that make situational awareness for technical data users Volatile, Uncertain, Complex, and Ambiguous (VUCA).

Recent action on permitting practices, and the evolution of requirements for measurement, monitoring and verification (MMV) on carbon capture and storage sites and projects, for example, have driven a resurgence of interest in the data management community in passive acoustic seismic data acquisition and delivery technologies. At the same time, increased investment in unconventional drilling technology has spurred the use of distributed sensing for well integrity and sand control. Mining operations are investigating the use of fibre optic technology for routine seismic monitoring, as well as vibration and blast monitoring, rock mass characterization, and remote equipment and containment site analytics. And every publicly held resource sector company now has to address a strategy for managing data that supports their Environmental, Social and Governance (ESG) goals and investments in low-carbon projects and plans. More exotic techniques such as muon tomography also demonstrate how data management platforms will have to be future-proofed to accommodate new data types as these technologies gain acceptance.

In this talk we will discuss some of our real-world experience with preparing data management strategies that can deliver Findable, Accessible, Interoperable and Reusable (FAIR) data to support emerging data analytic and interpretation needs.

15:00 OSDU as an intermediary for digitally driven collaboration and innovation in the oil and gas industry - Mahdis Moradi, PhD Candidate, Norwegian University of Science and Technology (NTNU).

Collaboration across both intra- and inter-organizational boundaries could be key to tackling challenges such as breaking organizational and data silos and pursuing digitally enabled process innovation initiatives in the oil and gas industry. However, a lack of effective collaboration and governance mechanisms can hinder the effective sharing, accumulation and creation of knowledge, and the desired outcome. It is therefore important to have a connectivity-based business model in order to manage cross-boundary collaborations. The development of the Open Subsurface Data Universe (OSDU) is a great example of inter-organizational and industrial collaboration. The diversity of players and the complex dynamics of interactions in the OSDU community make it a good case for studying the risks of coopetition in complex innovation projects.

An in-depth study of the inter-organizational collaboration on the development of the OSDU data platform helps the industry’s stakeholders better understand how the shift from knowledge monopolies to knowledge sharing results in the co-creation of value and builds industrial competitive advantage. This study is based on a longitudinal case study of the evolution of the OSDU data platform and focuses on the social and relational side (e.g., knowledge sharing, collaborative knowledge formation, trust and distrust, cultural differences, power dynamics, and governance mechanisms) of cross-boundary collaboration through digital platforms. Forty interviews with experts from across the oil and gas industry helped us investigate how formal and informal coordination mechanisms are combined through online platforms to solve a complex task. We find that, to tackle complicated industrial challenges, it is better to set the competition mindset aside and strike a balance between collaboration and competition in order to develop ecosystems that support innovation. The results will be useful for facilitating further collaboration and co-creation in the oil and gas industry to foster digital transformation.

To explore further the innovation potential that can be driven from OSDU, a case study of an oil operator driving drilling performance through OSDU will be presented. The results from this study emphasize the important role of people, and the significance of inter- and intra-organizational collaboration, in the successful adoption of new digital solutions in organizations.

15:30 Made Smarter Innovation: Centre for People-Led Digitalisation - Susan Lattanzio, Research and Industry Engagement Manager, University of Bath.

According to a number of influential reports, the early adoption of digital technologies in manufacturing could realise significant economic, environmental, and societal value. The challenge is that the UK is not adopting these technologies as quickly as its competitors. Studies have shown that although there are many obstacles to adoption, people and culture are consistently identified as the greatest barrier. The £5 million Made Smarter Innovation: Centre for People-Led Digitalisation aims to improve the outcome of the adoption of digital technologies by understanding the human aspect of digitalisation. Within this presentation we will give a quick background to the Centre, explain how we are collaborating with industry, and describe our current research streams. For those who are interested, we will describe how you might get involved in the work we are doing.

------------------------ Thursday December 1st. (UK time) ------------------------

11:00 Carbon Accounting, Monitoring, and Quantification to Foster Corporate Social Responsibility - Prashant Badaltjawdharie, CGI.

Organizations experience increasing pressure from governments, investors, lenders, and consumers to be able to prove greenhouse gas mitigation efforts. Being able to do so requires a sound ‘carbon accounting’ reporting strategy, allowing communication of reliable carbon accounting reports to internal and external stakeholders. With more stringent ESG policies, the pressure on organizations increases to have a reliable reporting strategy. This presentation will cover:

  • Carbon accounting concepts
  • Emission scopes and relevant data
  • Basic carbon quantification methods
  • ESG Policies and how to act upon them
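As a hedged illustration of the “basic carbon quantification methods” bullet, the standard activity-based approach multiplies activity data by an emission factor. The factors below are placeholder values chosen for the sketch, not official figures, and the `emissions_kg` helper is hypothetical:

```python
# Activity-based quantification: emissions = activity data * emission factor.
EMISSION_FACTORS = {              # kg CO2e per unit of activity (illustrative only)
    ("diesel", "litre"): 2.68,
    ("electricity", "kWh"): 0.23,
}

def emissions_kg(activity: str, unit: str, amount: float) -> float:
    """Quantify emissions for one activity as amount * emission factor."""
    factor = EMISSION_FACTORS[(activity, unit)]
    return amount * factor

# Example: a site's monthly fuel use plus purchased electricity.
total = emissions_kg("diesel", "litre", 1000) + emissions_kg("electricity", "kWh", 5000)
print(round(total, 1))  # 3830.0 kg CO2e
```

In practice the factor table is the hard part: factors differ by country, year and scope, which is why a sound reporting strategy pins down which published factor sets are used and when they are refreshed.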

11:30 Fission for compliments - Richard Thompson, Enterprise Data Manager, Sellafield Ltd. The grass roots of the ‘Sellafield Data Journey’.

15:00 Digital Skills and Transformation at the North Sea Transition Authority - Patrick Rickles, Head of Digital Skills & Innovation NSTA.

As set out by the Digital Strategy 2020-2025 for the North Sea Transition Authority (NSTA), it is the organisation’s ambition to enable digital services that ensure digital, data and technology work for everyone. Colleagues will have varying levels of digital capability as well as confidence, which makes delivery against this challenging. Patrick Rickles (Head of Digital Skills & Innovation) has worked with researchers from Robert Gordon University to assess existing digital skills and initiatives within the NSTA, to understand what is and isn’t working and the next steps to advance the organisation’s digital maturity. In this presentation, Patrick will discuss that work, as well as the new programmes coming out of it to empower individuals in their digital learning journey and establish a change-mindset culture.

15:30 Panel session on Data Architecture - Back to the Future? - Led by members of the Board of the SPDM. Are we heading down architectural blind alleys? Is the Data Mesh concept only of use to the largest organisations? Will APIs save us? How about OSDU? What is going on? And what can we do about it?

------------------------ Friday December 2nd. (UK time) ------------------------

11:00 Automatic detection of information in unstructured documents to benefit the energy transition - Peter Faasse & Jurrien Dijk, Data Scientist & Data Manager, Geological Survey of the Netherlands, TNO.

The Geological Survey of the Netherlands is currently investigating the application of AI in database processes. As the NDR, we receive large, unstructured datasets containing a variety of mostly old, scanned documents, such as well reports, which require lengthy manual processing. As companies scale down their operations in the Netherlands, the number of these unstructured datasets increases. Furthermore, relevant aquifer property data can be found within these documents but is not readily available to (potential) stakeholders.

By using multiple supervised machine learning techniques (text and image classification), we aim to construct a data processing pipeline that allows for automated processing and storage of documents and the information within them.

Preliminary results show good predictive capabilities for the object detection algorithm, with 91% precision, and optimization continues. OCR allowed for extremely reliable results in well detection (98% precision) and reliable results in document type classification (89% precision). There are no results yet for detecting aquifer properties. Storing OCR results in an Elasticsearch database greatly improved the accessibility and searchability of the documents.
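The shape of such a pipeline can be sketched in a few lines, assuming OCR text is already available. The document types, keyword profiles and scoring below are toy stand-ins for the trained classifiers described in the abstract, not TNO's actual models:

```python
from dataclasses import dataclass

# Hypothetical keyword profiles per document type; a real pipeline would
# use trained text/image classifiers here rather than keyword overlap.
DOC_TYPE_KEYWORDS = {
    "well_report": {"well", "depth", "casing", "drilling"},
    "core_description": {"core", "lithology", "porosity", "grain"},
}

@dataclass
class ProcessedDocument:
    name: str
    text: str          # OCR output
    doc_type: str      # predicted document type
    score: float       # fraction of profile keywords matched

def classify(text: str) -> tuple:
    """Score each document type by keyword overlap; return the best match."""
    tokens = set(text.lower().split())
    best_type, best_score = "unknown", 0.0
    for doc_type, keywords in DOC_TYPE_KEYWORDS.items():
        score = len(tokens & keywords) / len(keywords)
        if score > best_score:
            best_type, best_score = doc_type, score
    return best_type, best_score

def process(name: str, ocr_text: str) -> ProcessedDocument:
    """One pipeline pass: take OCR output, classify, and wrap for indexing."""
    doc_type, score = classify(ocr_text)
    return ProcessedDocument(name, ocr_text, doc_type, score)

doc = process("15-9-19A_report.pdf", "Final well report: drilling reached total depth; casing set.")
print(doc.doc_type)  # well_report
```

The `ProcessedDocument` output would then be the unit stored in the search index, which is where the reported gains in accessibility and searchability come from.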

Current questions revolve around the following topics:

  • What is an acceptable level of accuracy and precision for the results?
  • Which data points are (most) relevant for detection?
  • To what extent do you include human controls in the process?

Scalable end-user access to subsurface data with domain ontologies - Adnan Latif, Center Executive Manager, SIRIUS SFI, UiO.

Subsurface digital transformation is about overcoming the bottleneck of data access and increasing the quality of interpretations by means of the better use of data. The data access bottleneck is substantial, as up to 70% of subsurface experts' time is spent finding, accessing, integrating, and cleaning data before analysis can even start (Putting the FOCUS on Data, W3C Workshop on Semantic Web in Oil & Gas Industry, Jim Crompton).

Viewed from the domain experts' perspective, it is hard to get an overview of all available data related to an area of interest, as the data is spread over different applications and many internal and external data sources. No unified view is, as a rule, available up front, though Project Data Managers (PDMs) assist. It is difficult to extract data from databases; should complex queries have to be written, Central Data Managers (CDMs) typically assist. It is challenging to extract data and information based on geological and petrophysical attributes, as it is not possible to execute these types of queries simultaneously on multiple data sources. It is challenging to integrate datasets before analysis can start: this is often tedious manual work that the domain experts must do themselves. It is incredibly difficult to extract data and knowledge from text documents, as there are very few tools that can deal with the contents of unstructured documents and reports. Domain experts are well aware of the limitations of the workflow. As a result, valuable analyses of data are too often not performed, and possibilities in the data too often go undetected.

One solution to this challenge is to build competence and tools for subsurface data wrangling. These tools can make domain experts less dependent on the CDMs and PDMs than they are today. This can be achieved by capturing the specialist knowledge of the CDMs and PDMs and building it into data wrangling tools. A successful attempt in this direction was Optique, a 14M Euro EU project that finished in 2016. Optique showed that geoscience knowledge could be reliably captured in a knowledge graph (or an ontology), and that reusable mappings from CDMs could efficiently connect this knowledge graph to data in databases. Optique then demonstrated that complex queries over several federated data sources (including EPDS, NPD FactPages, OpenWorks installations, GeoChemDB, CoreDB and DDR) could be easily written and efficiently executed. Since the process was fully automated, tasks that would normally take several days could, with the Optique platform, be performed in minutes.
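The OBDA idea behind Optique can be caricatured in a few lines: the user asks for a domain concept, and stored mappings expand it into source-specific SQL. The concept name, source names and SQL strings below are hypothetical placeholders; real OBDA systems use ontologies, SPARQL and R2RML-style mappings rather than this toy dictionary:

```python
# Hypothetical mappings from one domain (G&G) concept to per-source SQL.
MAPPINGS = {
    "Wellbore": {
        "npd_factpages": "SELECT wlbname FROM wellbore",
        "openworks":     "SELECT well_name FROM ow_well",
    },
}

def translate(concept: str) -> dict:
    """Expand one domain concept into the SQL for every mapped source."""
    if concept not in MAPPINGS:
        raise KeyError(f"no mapping for concept: {concept}")
    return MAPPINGS[concept]

queries = translate("Wellbore")
print(sorted(queries))   # the sources a federated query would run against
```

The key property this toy preserves is that the end user only ever names the domain concept; the per-source SQL dialects and schemas stay hidden behind the mappings, which is what lets one query federate over many databases.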

Optique showed the potential to transform the way data is gathered and analyzed by streamlining the workflow and making it more user-friendly. However, Optique has also revealed shortcomings that impede the realization of its full potential: (i) limitation to relational databases, (ii) lack of built-in support for quantitative analytics, (iii) lack of access to unstructured data, and (iv) limited tool support for constructing and maintaining the necessary ontology and mappings.

Researchers in SIRIUS are actively working on addressing these shortcomings and aim to significantly broaden the applicability of the approach for use in subsurface projects.

In this session, we will demonstrate how OBDA technology can enable end-users to formulate and run complex queries without worrying about the underlying database architecture; queries are formulated in the G&G language instead of SQL. It provides extreme flexibility in query formulation by enabling end-users to combine hundreds of combinations in their queries based on G&G concepts and vocabulary encoded into the Geoscience

15:00 Defining the basics to be prepared for change - Liv Stordahl Borud, Leading Advisor Data Management, Equinor.

Access to data with known quality is a key enabler for digitalization and the current transition of the energy industry. In the digital age, data has become a critical input factor for value creation and must be managed throughout the life cycle to generate value based on business needs. To do this, we need the resources, competencies and skills, and we also need to have a common understanding of the roles, responsibilities and requirements for managing data and information across our organisation. Having a fit for purpose governance model is an essential part. We also need a data profession supporting individual development of competence and skills, and to ensure a coordinated approach when the organisation is growing. The aim is to present the governance model for data in Equinor, and the development of a professional pipeline for data professionals.

15:30 Cloud Archiving & AI-based image classification - Jesse Lord, Lead - Product Strategy, Kadme.

The presentation will focus on two recent projects: cloud archiving and the application of world-class artificial intelligence technologies to reveal and extract knowledge from E&P datasets. Whereoil is a data integration platform for the oil and gas sector that acts as a single point of search for filesystems, Petrel projects, SharePoint and more. While Whereoil can connect to many different data sources, the one most pertinent to this project is the unstructured filesystem. Whereoil automatically crawls filesystems, then ingests, geo-references and categorises the content and keeps it up to date.

This source data provides an interesting use case for archiving this data to the cloud.

  • Files and folders are moved over the internet from on-premises to cloud storage
  • Their content, geometries and metadata remain in the Whereoil index, so people can still find the files as before
  • Data Managers can approve or reject requests for archive or restore
  • Target data size for this project: big enough to be a cost impact, but not big enough to wind up on tapes in a box

These files are also the source data needed for the AI algorithms we applied for image classification.

The objective of the project is to gain a better understanding of images contained within the unstructured data realm and to utilise the enriched data for a quicker understanding of their context. Not all knowledge can be located using search terms and metadata; a domain expert can learn a lot about a document just by looking at the figures contained within it.

The image extraction and classification workflow happens in the following way:

  • Images of document pages are sent to Whereoil
  • Labels are applied first to the pages (Document Layout Analysis) and then to the extracted images to classify them by type
  • Extracted, classified images are sent back to Whereoil for visualisation to the end user

To close the loop on this pipeline, users provide feedback to the model via the Whereoil interface. This way the model is kept up to date and grows over time to suit the unique dataset found in each environment.
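That feedback loop can be sketched minimally as follows. The class names, the stubbed model and the corrections store are hypothetical illustrations of the data flow, not Whereoil's actual API or model:

```python
# Minimal sketch of the extract -> classify -> user-feedback loop.
IMAGE_CLASSES = ["seismic_section", "well_log", "map", "photo"]

def model_predict(image_id: str) -> str:
    # Stand-in for the real classifier; here a fixed placeholder answer.
    return "photo"

def classify_image(image_id: str, corrections: dict) -> str:
    """Return the user-corrected label if one exists, else the model's guess."""
    if image_id in corrections:
        return corrections[image_id]          # user feedback overrides the model
    return model_predict(image_id)

def record_feedback(image_id: str, label: str, corrections: dict) -> None:
    """Store an end-user correction; the next training run consumes these."""
    if label not in IMAGE_CLASSES:
        raise ValueError(f"unknown class: {label}")
    corrections[image_id] = label

corrections: dict = {}
print(classify_image("fig_12", corrections))          # model's initial guess
record_feedback("fig_12", "well_log", corrections)    # user corrects the label
print(classify_image("fig_12", corrections))          # corrected label wins
```

The accumulated corrections are what allow the model to "grow over time": they serve both as immediate overrides in the interface and as labelled training data for the next retraining cycle.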
