Seismic Workflow Optimization
The Seismic Workflow Optimization solution standardizes seismic data storage on the OSDU OpenVDS format, eliminating physical storage duplication. Because OpenVDS makes seismic data streamable, data no longer needs to be copied down to individual workstations. The solution enables new, innovative ways to optimize the seismic data workflow:
- Consolidation of all seismic data into a tiered, redundant Amazon S3 storage architecture that addresses both archival and high-performance needs and can also be integrated with an OSDU Data Platform.
- Improved ROI through optimized utilization of existing applications and elimination of data copying, conversion, and duplication, leveraging in-flight format transcoding.
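As a rough illustration of the tiered-storage idea, an S3 lifecycle rule can transition cold survey data to cheaper storage classes automatically. A minimal sketch of such a rule follows; the bucket name, prefix, and transition days are hypothetical and would come from your data consolidation strategy:

```python
# Hypothetical lifecycle rule set for archiving cold seismic surveys.
# The prefix "surveys/legacy/" and the day thresholds are illustrative only.
lifecycle_rules = {
    "Rules": [
        {
            "ID": "archive-cold-surveys",
            "Status": "Enabled",
            "Filter": {"Prefix": "surveys/legacy/"},
            "Transitions": [
                # Move to infrequent access after 90 days,
                # then to deep archive after one year.
                {"Days": 90, "StorageClass": "STANDARD_IA"},
                {"Days": 365, "StorageClass": "DEEP_ARCHIVE"},
            ],
        }
    ]
}

# Such a rule set would be applied with, e.g.:
# boto3.client("s3").put_bucket_lifecycle_configuration(
#     Bucket="seismic-data-bucket", LifecycleConfiguration=lifecycle_rules)
```

Active surveys stay in a high-performance tier for streaming access, while dormant data migrates to archive tiers without any manual copy step.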
Additionally, the Seismic Workflow Optimization solution delivers the following capabilities:
- OpenVDS data can be consumed directly by AI/ML engines such as TensorFlow, removing costly data preparation.
- AI/ML can be applied interactively, enabling a powerful synergy between expert geoscientists and computational AI that drastically accelerates the workflow while maintaining QC and validation requirements.
- Seismic data can be conditioned in flight, with mathematical algorithms applied instantly, leveraging the compute elasticity of the cloud.
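To make the in-flight conditioning idea concrete, here is a small sketch of one common conditioning step, automatic gain control (AGC), applied to a single trace with numpy. The synthetic trace and the window length are illustrative; in practice the samples would be streamed from an OpenVDS volume rather than generated locally:

```python
import numpy as np

def agc(trace: np.ndarray, window: int = 64) -> np.ndarray:
    """Automatic gain control: divide each sample by the mean
    absolute amplitude of a sliding window around it."""
    # Running mean of |amplitude| via a simple convolution.
    envelope = np.convolve(np.abs(trace), np.ones(window) / window, mode="same")
    # Guard against division by zero in dead zones.
    return trace / np.maximum(envelope, 1e-12)

# Synthetic decaying trace standing in for one streamed from storage.
t = np.linspace(0.0, 1.0, 500)
trace = np.sin(40 * np.pi * t) * np.exp(-3 * t)
balanced = agc(trace)
```

Because each trace (or brick) is conditioned independently as it streams, a step like this parallelizes naturally across elastic cloud compute, with no intermediate copies written to disk.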
Drilling & Completions
Achieve faster and more accurate drilling operations and more cost-optimized completions.
Geology & Geophysics
Reduce exploration risk and reduce time to decision.
Increase production and safety, reduce lifting costs and GHG emissions.
Improve end-to-end refinery optimization around throughput, reliability, costs and emissions.
Learn from and create a better experience for retail customers at gas stations.
Improve storage and delivery of products with streamlined terminal operations and gathering line systems.
Pipelines & Logistics
Increase visibility into condition monitoring and optimize transport costs across your network of pipelines.
Trading & Risk
Optimize commodity trades by delivering diverse computing capabilities to trading platforms and developing new ones.
How to get started
Step 1: Deployment Readiness Assessment (DRA)
- Data Consolidation Strategy
- Data Tiering Strategy
- Data Streaming Strategy
- Readiness and Maturity Assessment
- Infrastructure inputs into Planning
- Engagement Scope
Step 2: Deployment planning
- Tag Discovery Workshop
- IT and Apps workflow review
- Architecture design
- Reference Architecture
- End-to-end engagement plan
- Prioritized Development backlog
- Prioritized Use case backlog
Step 3: Deployment execution
- Set up data ingestion zones
- Set up apps testing
- Implement Delivery pipelines
- End-to-end testing of delivery pipelines
- Implemented Data Lake
- Implemented Streaming and Compute
- Deployed Use Cases