PNEC 2019

Pre-Conference Workshop: Large-Volume Cloud Data Storage Strategies - ADDITIONAL REGISTRATION REQUIRED (US$295) (Room Salon A-D)

THIS WORKSHOP IS RUN UNDER THE CHATHAM HOUSE RULE: When a meeting, or part thereof, is held under the Chatham House Rule, participants are free to use the information received, but neither the identity nor the affiliation of the speaker(s), nor that of any other participant, may be revealed.

A half-day participatory, facilitated workshop designed to capture best practices and lessons learned from different industries in managing large data volumes on enterprise-level cloud storage.

Historically, professional petroleum data management practitioners have not fully leveraged their geographic proximity to, and innovative synergies with, other data-, asset-, and capital-intensive industries. This workshop will capture and deliver practical methodologies that all participants should be able to take back to their working environments and apply to business-critical problems.

Agenda:

15 Min: Introduction and recap of participants’ key industry challenges

15 Min: Report on Asia-Pacific events and activities

15 Min: Issue priority exercise

45 Min: Deliverables Workshop

Quantifying value (participants to choose 3 use cases)

Business Priorities – Internal

·         Delivering data directly from instrumentation to cloud storage

·         Reduction in risk of storing high value data on magnetic tape media

·         Reducing access time to business critical data (weeks to days)

·         Scale and cost reductions for processing of legacy datasets

·         Increased resilience & security

·         Protecting historical data investment

·         Reduction of on-premises footprint for specialized processing with cloud native tools

·         Energy benefits of reduced onsite storage

·         Accelerated speed of data acquisition

·         Rapid e-discovery and analysis from stored data inventories

Technology Priorities – Getting Data Volumes to the Cloud

·         Optimizing processes and workflows for available bandwidth

·         Increasing and supporting ingestion rates

·         Improving data retrieval performance

·         Maximizing compute capability (CPU/GPU)

·         Using increased bandwidth for processed data products
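As a rough illustration of the bandwidth constraint behind the priorities above, transfer time for a large archive can be estimated with back-of-envelope arithmetic (the figures below are illustrative assumptions, not workshop data):

```python
# Back-of-envelope cloud transfer-time estimate (illustrative only).

def transfer_days(dataset_tb: float, bandwidth_gbps: float, efficiency: float = 0.7) -> float:
    """Days to move `dataset_tb` terabytes over a `bandwidth_gbps` gigabit/s
    link, assuming only `efficiency` of the nominal bandwidth is usable."""
    bits = dataset_tb * 1e12 * 8                      # dataset size in bits
    seconds = bits / (bandwidth_gbps * 1e9 * efficiency)
    return seconds / 86400                            # seconds per day

# Example: a 500 TB legacy archive over a 10 Gbps link at 70% efficiency
print(round(transfer_days(500, 10), 1))  # → 6.6 (days)
```

Estimates like this make clear why optimizing workflows for available bandwidth, and maximizing sustained ingestion rates, dominate the economics of moving legacy datasets.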

Highlight reporting: 15 min

Break and facilitated networking: 15 min

Networking report: 15 min

15 min mini-tutorial: Psychology of accepting lossy compression for oversampled datasets
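The mini-tutorial topic can be illustrated with a minimal sketch (an assumed example, not workshop content): when a signal is sampled far above its highest frequency, aggressive lossy steps such as 8-bit quantization cut storage severalfold while the introduced error stays strictly bounded.

```python
import math

# Assumed illustration of lossy compression for oversampled data:
# quantize a smooth, heavily oversampled signal from float to one byte/sample.

# A 5 Hz sine sampled at 1000 samples per period-window (heavily oversampled)
signal = [math.sin(2 * math.pi * 5 * t / 1000) for t in range(1000)]

# Lossy step: map each sample in [-1, 1] to one of 256 levels (1 byte/sample)
quantized = [round((s + 1) / 2 * 255) for s in signal]
restored = [q / 255 * 2 - 1 for q in quantized]

# Worst-case error introduced by the 8-bit quantization
max_err = max(abs(a - b) for a, b in zip(signal, restored))
print(max_err <= 1 / 255)  # → True: error bounded by one quantization step
```

The psychological hurdle is accepting that a bounded, quantifiable error on oversampled data is a reasonable trade for a 4–8x reduction in stored volume.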

45 Min: Deliverables Workshop (participants to build a business case)

Business Priorities – External

·         Sharing datasets with multiple partners/stakeholders

·         Proprietary vs. public data entitlements

·         Facilitated sharing and collaboration with 3rd parties

·         Cloud data workflows associated with asset transfers

·         Automation of workflows applied across third party data and toolsets

·         Sharing enterprise level reference information models

Technology Priorities – Using Data in the Cloud

·         Visualization of data cubes with disparate dimensions and scales

·         Compression and filtering technologies

·         Application of machine learning and artificial intelligence for pattern recognition

·         Extracting and QC'ing metadata at scale

·         Enabling new insights from legacy data with analytics

·         Automated tiering of project data using business rules

·         Accelerated correlation of structured and unstructured data

·         Comparison of standardized data across multiple assets and vintages

·         Creation of testbeds for artificial intelligence and machine learning algorithms

·         Interoperability with spatial and temporal datasets

·         Use of immersive displays and voice activated user interfaces

·         Tagging metadata in the cloud
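Automated tiering of project data using business rules, one of the priorities above, can be sketched as a simple rule function (names, thresholds, and tier labels are illustrative assumptions, not workshop recommendations):

```python
from datetime import date, timedelta

# Hypothetical sketch of rule-based cloud storage tiering for project data.

def storage_tier(last_accessed: date, business_critical: bool, today: date) -> str:
    """Map a dataset to a storage tier via simple business rules."""
    age = today - last_accessed
    if business_critical or age < timedelta(days=30):
        return "hot"        # critical or recently used: fast, costly tier
    if age < timedelta(days=365):
        return "cool"       # occasional access: cheaper, slower tier
    return "archive"        # legacy data: cheapest, retrieval latency accepted

today = date(2019, 5, 1)
print(storage_tier(date(2019, 4, 20), False, today))  # → hot
print(storage_tier(date(2018, 9, 1), False, today))   # → cool
print(storage_tier(date(2015, 1, 1), False, today))   # → archive
```

In practice such rules would run as lifecycle policies in the cloud platform itself, but encoding them explicitly makes the business logic reviewable.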

Highlight reporting: 15 min

Final exercise: 20 min

Controlling the scope of a pilot cloud project:

·         Data Type

·         Spatial or Temporal Scope

·         Maturity Modelling