Abstract
Understanding and applying reservoir prediction methods in offshore hydrocarbon exploration is an increasingly important task. It is made challenging by sparse well data, low-resolution seismic data, and complex geological models. These data are inherently multimodal and multiscale, and the core of reservoir prediction research lies in how to integrate them through predictive algorithms to generate plausible realizations of the subsurface. To assess the applicability of reservoir prediction methods, we review the state of the field and make recommendations on how to select methods for characterizing offshore oilfield reservoirs. Advances in computing hold promise for applying a variety of prediction methods to this task. Seismic attribute analysis, stochastic modeling, and AI-driven methods play key roles in this effort. With the increasing demands of exploration and development, multiple methods are now combined into integrated workflows to improve prediction results. Through the comparison and synthesis of existing technologies, this work provides valuable technical guidance for future development and offers important support for the transparent characterization of three-dimensional subsurface geological structures.
1 Introduction
The hydrocarbon potential of offshore reservoirs is considerable. However, their great burial depth, the low resolution of seismic data, and the construction constraints of drilling platforms pose significant challenges for exploration and development. Newly commissioned oilfields are generally small in scale, and driven by the need for stable and increasing production, higher demands are placed on the effectiveness of early-stage development. Accurate reservoir prediction is a prerequisite for the efficient development of oilfields at the early stage. Nevertheless, in the development of middle- to deep-buried reservoirs, the scarcity of drilling data, large well-control areas, complex depositional facies, and significant lateral and vertical heterogeneity of reservoirs make prediction particularly difficult, thereby seriously constraining the effectiveness of development planning (Hou et al., 2019; He et al., 2023). Scholars have conducted extensive research on reservoir prediction and have achieved notable progress. In this review, we focus on reservoir prediction methods, with particular attention to offshore oilfields.
Offshore oil and gas fields present unique challenges for reservoir characterization due to their complex depositional environments and sparse well distribution. These reservoirs often form in diverse settings including deep-water turbidite systems with their intricate channel-lobe architectures, carbonate platforms exhibiting extreme heterogeneity from diagenetic alterations, and deltaic systems featuring rapid facies changes (Liu et al., 2025; Huang et al., 2025). The economic realities of offshore development, where single wells can cost US$50–200 million compared to US$5–20 million onshore, result in well densities typically 5–10 times lower than in terrestrial fields (Hyne, 2012). This dramatic reduction in direct measurements creates substantial uncertainties across all critical reservoir parameters: porosity estimates may vary by ±3–5 percentage points, permeability predictions often span an order of magnitude, and fluid saturation determinations lack precision due to sparse pressure measurements and fluid samples (Worthington, 2010; Cannon, 2018; Qiyang et al., 2025).
In data-scarce settings, conventional interpolation techniques (e.g., Kriging and Inverse Distance Weighting) exhibit significant limitations. Their fundamental constraints originate from mathematical assumptions—particularly the reliance on two-point statistics and spatially stationary, gradual variation—which are fundamentally misaligned with geological reality. When well spacing exceeds the scale of geologic variability, these assumptions break down, leading to over-smoothed models that fail to reproduce critical reservoir heterogeneities. In offshore environments characterized by complex sedimentology and sparse well control, these limitations are further exacerbated, presenting substantial challenges to reservoir prediction, which can be categorized into three specific aspects:
Abrupt facies transitions: In turbidite systems, pronounced lithofacies changes often occur at scales far smaller than typical well spacing, making it difficult for conventional methods to capture such high-frequency spatial variations;
High diagenetic complexity: The spatial architecture of diagenetic processes (e.g., dissolution and cementation) in carbonate reservoirs is highly heterogeneous, with complex configurations that lie beyond the representational capacity of conventional geostatistical methods;
Systematic distortion of stratigraphic architecture: In deltaic depositional systems, internal stratigraphic relationships are oversimplified by interpolation approaches, resulting in misrepresentation of inherent geometric stacking and spatial configuration.
These challenges have driven the development of more sophisticated integration approaches that combine multiple data types and leverage advanced computational methods.
Modern solutions incorporate high-resolution seismic data through advanced inversion techniques and multi-attribute analysis, providing crucial spatial coverage between widely-spaced wells (Yue et al., 2022). Geostatistical methods have evolved beyond traditional variogram-based approaches to include multiple-point statistics that capture geological patterns, process-mimicking simulations that honor depositional physics, and object-based modeling that better represents discrete reservoir elements (Zhou et al., 2021). The machine learning revolution has introduced powerful new tools for seismic pattern recognition, reservoir property prediction, and uncertainty quantification, with techniques like convolutional neural networks achieving over 85% accuracy in facies classification from seismic data (Xie et al., 2023; Li et al., 2024). Analog databases and process-based modeling provide essential constraints where local data is sparse, allowing incorporation of knowledge from similar reservoirs worldwide and fundamental geological principles (Ringrose and Bentley, 2015a).
An effective workflow for sparse well environments must systematically address these challenges through integrated steps beginning with comprehensive data enhancement. This includes seismic reprocessing using full-waveform inversion to improve resolution, careful well log conditioning and normalization, and robust core-to-log integration to maximize information from limited samples. Among these methodologies, Full-Waveform Inversion (FWI) represents an iterative technique that utilizes the full wavefield to construct high-resolution velocity models, thereby enabling the generation of high-fidelity seismic images.
The characterization phase should employ multi-scale analysis combining seismic attributes, machine learning classification, and analog data to capture reservoir features from pore to field scale. Model construction needs to incorporate both geostatistical and process-based approaches within an uncertainty quantification framework that acknowledges data limitations. Key implementation steps include: (1) Utilize rock physics models to establish quantitative relationships from pore-scale core/CT scan data to elastic parameters at the log scale, providing a physical cornerstone for seismic response. (2) At the log scale, apply machine learning (e.g., Random Forest, Neural Networks) to learn the complex mapping relationships between seismic attributes and reservoir types/parameters from known well data. (3) Apply this trained model to 3D seismic attribute volumes at the field scale to generate a full-field reservoir prediction model. (4) Ensure the geological rationality and reliability of the prediction results through multi-scale validation (seismic-log-core integration) and the incorporation of geological analog data (from known fields with similar sedimentary environments) as trend constraints.
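A minimal sketch of steps (2)–(3) above, assuming entirely synthetic well data and illustrative attribute names (an acoustic impedance proxy and a normalized amplitude): a Random Forest is trained on labelled samples at "well" locations and then applied to a larger set of attribute values standing in for the field-scale seismic volume.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# "Well" training data: 40 labelled samples of two hypothetical attributes
impedance = rng.normal(6000, 800, 40)      # acoustic impedance proxy
amplitude = rng.normal(0.0, 1.0, 40)       # normalized amplitude proxy
# Hypothetical labelling rule: low impedance -> sand (1), else shale (0)
facies = (impedance < 6000).astype(int)
X_train = np.column_stack([impedance, amplitude])

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, facies)

# "Field-scale" application: predict facies over many attribute samples
X_field = np.column_stack([rng.normal(6000, 800, 1000),
                           rng.normal(0.0, 1.0, 1000)])
pred = model.predict(X_field)
sand_fraction = pred.mean()                # net-sand proportion estimate
```

In a real workflow `X_field` would come from inverted seismic attribute volumes, and step (4)'s validation would compare `pred` against blind wells and analog trends.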
Finally, rigorous validation through dynamic data integration and scenario testing ensures practical utility of the reservoir models. This integrated approach, while computationally intensive, provides the best path to reliable reservoir prediction in challenging offshore environments with sparse well control.
2 Challenges in sparse well reservoir prediction
A reservoir prediction workflow (Figure 1) comprises three parts:
FIGURE 1
Step 1: Data Integration
Four key data types are combined at the input stage: (1) well data (e.g., logs, production history); (2) geophysical data (e.g., seismic surveys, AVO attributes); (3) geological patterns (e.g., depositional facies, structural frameworks); (4) production data (e.g., historical reservoir performance).
Step 2: Prediction Methods
These integrated data are fed into modeling techniques (e.g., stochastic simulations, machine learning, analog comparisons) to generate multiple probabilistic realizations (e.g., Realization 1 to N). Each realization represents a plausible subsurface scenario consistent with the input data.
Step 3: Uncertainty Analysis
The generated realizations are analyzed to quantify prediction uncertainty (e.g., variance, probability distributions). This step identifies risks (e.g., high-uncertainty zones) and guides decisions (e.g., well placement, reserve estimation).
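The uncertainty summaries in Step 3 can be sketched as follows, assuming N synthetic porosity realizations on a 2D grid; the 0.15 cutoff and the top-decile spread threshold are illustrative choices, not recommended values.

```python
import numpy as np

rng = np.random.default_rng(42)
n_real, nx, ny = 100, 20, 20
# N plausible porosity realizations (fractions), e.g. from stochastic simulation
realizations = rng.normal(0.18, 0.03, (n_real, nx, ny)).clip(0.0, 0.4)

mean_map = realizations.mean(axis=0)          # best estimate per cell
var_map = realizations.var(axis=0)            # prediction variance per cell
p10, p90 = np.percentile(realizations, [10, 90], axis=0)
# Probability that porosity exceeds an economic cutoff (0.15, illustrative)
prob_pay = (realizations > 0.15).mean(axis=0)

# High-uncertainty zones: cells whose P10-P90 spread is in the top decile
spread = p90 - p10
risk_mask = spread > np.percentile(spread, 90)
```

Maps like `prob_pay` and `risk_mask` are what feed well-placement and reserve-estimation decisions.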
2.1 Data scarcity and uncertainty
The challenge of data scarcity in offshore reservoir characterization presents a multi-faceted problem that significantly impacts prediction reliability and uncertainty quantification. In typical offshore developments, well spacing often ranges from 1–5 km, compared to 100–500 m spacing common in onshore fields (Wang et al., 2020). This sparse data distribution creates fundamental limitations for reservoir modeling, particularly in complex depositional environments such as deep-water turbidites or carbonate platforms where heterogeneity occurs at scales finer than the well spacing (Zhang et al., 2016).
The uncertainty manifests most prominently in three key areas of reservoir characterization. First, porosity estimation becomes problematic as variogram ranges cannot be reliably established from sparse well control, often resulting in uncertainty ranges exceeding ±3-5 porosity units (Lima et al., 2017). Second, permeability prediction suffers from limited core coverage (typically <30% of net pay) and sparse dynamic data for calibration, leading to uncertainty spans frequently exceeding one order of magnitude (Worthington, 2010). Third, fluid saturation determination lacks precision due to insufficient pressure measurements and fluid samples, compounded by gaps in capillary pressure data (Cannon, 2018).
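A toy illustration of the variogram problem above, using a synthetic 1D porosity profile: the experimental semivariogram computed from densely sampled wells is well populated at every lag, while decimating to an "offshore" well density leaves lag bins that are noisy or empty. All positions and values are synthetic.

```python
import numpy as np

def experimental_variogram(x, z, lags, tol):
    """Classical semivariance 0.5 * mean((z_i - z_j)**2) per lag bin."""
    d = np.abs(x[:, None] - x[None, :])
    sq = 0.5 * (z[:, None] - z[None, :]) ** 2
    gamma = []
    for h in lags:
        mask = (d > h - tol) & (d <= h + tol) & (d > 0)
        gamma.append(sq[mask].mean() if mask.any() else np.nan)
    return np.array(gamma)

rng = np.random.default_rng(5)
x_dense = np.sort(rng.uniform(0, 5000, 200))    # "onshore" well density
field = np.sin(x_dense / 400.0) * 0.05 + 0.18   # smooth porosity profile
x_sparse, z_sparse = x_dense[::20], field[::20] # every 20th well: "offshore"

lags = np.arange(250, 2001, 250)
g_dense = experimental_variogram(x_dense, field, lags, tol=125.0)
g_sparse = experimental_variogram(x_sparse, z_sparse, lags, tol=125.0)
```

With only ten wells, `g_sparse` gives no stable basis for fitting a variogram range, which is exactly the failure mode described above.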
Conventional geostatistical approaches face particular challenges in these data-scarce environments. Kriging methods require well spacing smaller than the variogram range to produce reliable results, a condition frequently violated in offshore settings (Dubrule, 2017a). Inverse distance weighting techniques, while simple to implement, incorporate no geological knowledge and often produce artifacts around data points. Moving average approaches tend to oversmooth important reservoir boundaries while providing no inherent uncertainty quantification (Caers, 2011).
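A minimal inverse-distance-weighting sketch makes this limitation concrete: the estimate depends only on distance to the data points, so midway between two wells it returns their plain average regardless of any facies boundary in the gap. Well positions and porosities are illustrative.

```python
import numpy as np

def idw(xy_known, values, xy_query, power=2.0, eps=1e-12):
    """2D inverse-distance weighting: exact at wells, smooth in between."""
    d = np.linalg.norm(xy_query[:, None, :] - xy_known[None, :, :], axis=2)
    w = 1.0 / (d + eps) ** power
    w /= w.sum(axis=1, keepdims=True)
    return w @ values

wells = np.array([[0.0, 0.0], [4.0, 0.0]])    # two wells 4 km apart
poro = np.array([0.25, 0.10])                 # porosity observed at each well

# Midway between the wells IDW returns the average, 0.175, whether or not
# an abrupt facies transition sits anywhere along the 4 km gap
mid = idw(wells, poro, np.array([[2.0, 0.0]]))
```

The method also carries no notion of uncertainty: `mid` is a single number with no attached variance, unlike the stochastic approaches discussed later.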
The limitations of these traditional methods become especially apparent in complex reservoirs where depositional architecture and diagenetic overprinting create non-stationary property distributions. Turbidite systems may exhibit abrupt facies changes over distances shorter than typical well spacing, while carbonate reservoirs display heterogeneity patterns that conventional geostatistics cannot adequately reproduce (Stow and Smillie, 2020; Zhang et al., 2016). Deltaic systems present additional challenges with their stratigraphic complexity and compaction-driven modifications that interpolation methods frequently oversimplify (Liu et al., 2022).
Modern approaches to address these challenges focus on integrating multiple data types and leveraging advanced computational methods. Seismic data plays a crucial role in providing spatial coverage between wells, with advanced inversion techniques and multi-attribute analysis helping to bridge the resolution gap (Avseth et al., 2010). Machine learning algorithms, particularly deep learning approaches, have shown promise in pattern recognition and property prediction from limited data, with some applications achieving over 85% accuracy in facies classification (Alaudah et al., 2019; Hall, 2016a). Multiple-point statistics and process-mimicking simulations offer alternatives to traditional variogram-based methods by incorporating geological knowledge and physical principles (Kaplan et al., 2017).
An effective workflow for sparse well environments must address these challenges through systematic data integration and uncertainty management. This begins with comprehensive data enhancement, including seismic reprocessing and careful well log conditioning, to maximize information content from limited measurements (Ringrose and Bentley, 2015b). Multi-scale characterization combining seismic attributes, machine learning, and analog data helps capture reservoir features across different scales. Model construction should incorporate both data-driven and process-based approaches within a framework that explicitly quantifies and manages uncertainty, while validation through dynamic data integration ensures practical utility of the resulting reservoir models.
2.2 Seismic resolution limitations
Seismic data serves as the primary source of spatial information between widely-spaced wells in offshore reservoirs, yet its resolution limitations present significant challenges for accurate reservoir characterization. The fundamental constraints stem from the physics of seismic wave propagation, where vertical resolution is theoretically limited to approximately one-quarter of the dominant wavelength (λ/4), typically yielding 10–15 m resolution in shallow sections (1.5–2 km depth) and 20–30 m resolution in deeper reservoirs (3–4 km depth). This resolution gap becomes particularly problematic in thin-bed reservoirs below tuning thickness, where individual layers cannot be resolved and instead produce composite seismic responses that obscure true stratigraphic relationships.
The horizontal resolution, governed by the Fresnel zone, typically ranges from 25–50 m in shallow sections to 50–100 m at greater depths, creating additional interpretation challenges. This limitation affects the accurate mapping of critical reservoir features including fault and fracture networks, channel body geometries, and reservoir compartment boundaries. In carbonate systems, where porosity and permeability distributions are often controlled by small-scale diagenetic features and fracture networks below seismic resolution, these constraints are particularly impactful on reservoir performance predictions.
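The figures quoted above follow from two back-of-envelope formulas: vertical resolution of roughly lambda/4 = v/(4f) (the Rayleigh tuning criterion), and post-migration horizontal resolution of roughly lambda/2, since migration collapses the Fresnel zone to about half a wavelength. The velocities and dominant frequencies below are typical values, not measurements.

```python
def wavelength(v_mps, f_hz):
    return v_mps / f_hz

def vertical_resolution(v_mps, f_hz):
    # Rayleigh tuning criterion: lambda / 4
    return wavelength(v_mps, f_hz) / 4.0

def migrated_horizontal_resolution(v_mps, f_hz):
    # Migration collapses the Fresnel zone to roughly lambda / 2
    return wavelength(v_mps, f_hz) / 2.0

# Shallow section: v ~ 2000 m/s, dominant frequency ~ 40 Hz
# Deep reservoir: v ~ 3000 m/s, dominant frequency ~ 30 Hz (attenuated)
shallow_vert = vertical_resolution(2000.0, 40.0)              # 12.5 m
deep_vert = vertical_resolution(3000.0, 30.0)                 # 25.0 m
shallow_horiz = migrated_horizontal_resolution(2000.0, 40.0)  # 25.0 m
deep_horiz = migrated_horizontal_resolution(3000.0, 30.0)     # 50.0 m
```

Both quantities degrade with depth because velocity rises while the usable dominant frequency falls, which is why the deeper estimates sit at the coarse end of the ranges quoted above.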
Modern acquisition and processing techniques have made incremental improvements to these fundamental limits. Broadband seismic technologies, including variable-depth streamers and advanced deghosting algorithms, have extended the usable frequency spectrum, while full-waveform inversion (FWI) has enhanced velocity model building and subsequent imaging (Yang et al., 2025). However, even with these advances, significant gaps remain between the scale of seismic resolution and the scale of reservoir heterogeneity that controls fluid flow. This is especially true in complex depositional systems like deep-water channels or carbonate buildups, where critical architectural elements often occur at scales below seismic detectability.
The interpretation challenges are further compounded by amplitude variation with offset (AVO) effects and inversion non-uniqueness, where different combinations of lithology, porosity, and fluid content can produce similar seismic responses (Avseth et al., 2010). This ambiguity is particularly problematic in sparse well environments where limited well control is available to constrain inversion solutions. Recent advances in machine learning, particularly deep neural networks, have shown promise in extracting more geological information from seismic data by learning complex, non-linear relationships between seismic attributes and reservoir properties (Alaudah et al., 2019). However, these approaches still face fundamental limitations imposed by the physics of seismic wave propagation and remain dependent on adequate training data from the limited well control.
To mitigate these resolution limitations, modern reservoir characterization workflows increasingly integrate seismic data with other geophysical measurements, outcrop analogs, and process-based modeling. Electromagnetic methods, when available, can provide complementary fluid information, while high-resolution outcrop studies offer constraints on sub-seismic scale geological architectures (Ringrose and Bentley, 2015a). The integration of these diverse data sources, coupled with careful uncertainty analysis, represents the most promising path forward for overcoming the inherent resolution limitations of seismic data in sparse well environments.
2.3 Non-stationarity of geological properties
The assumption of stationarity, namely that statistical properties remain constant throughout a reservoir, represents one of the most significant oversimplifications in sparse well reservoir prediction, particularly in complex offshore environments. Geological properties routinely exhibit non-stationary behavior due to multiple controlling factors that vary spatially within depositional systems (Caers, 2011). In deep-water turbidite reservoirs, for instance, systematic downstream changes in channel dimensions and sediment caliber create predictable yet non-stationary trends in reservoir quality. Carbonate systems demonstrate even more pronounced non-stationarity through depth-dependent diagenetic alterations, where early meteoric cementation or late burial compaction create vertically and laterally variable porosity-permeability relationships.
The challenges posed by property non-stationarity are magnified in sparse well environments through several mechanisms. First, limited well control makes the identification and quantification of non-stationary trends inherently difficult, as spatial sampling is insufficient to characterize systematic variations (Dubrule, 2017b). Second, conventional geostatistical methods assuming stationarity may produce unrealistic smooth transitions between different geological domains when applied to sparse data. Third, non-stationarity introduces additional uncertainty in reservoir performance predictions, as flow behavior becomes sensitive to the specific spatial arrangement of property trends that cannot be fully constrained with limited wells.
Modern approaches to address non-stationarity in sparse well settings typically involve one of three strategies. Process-based modeling incorporates physical laws of sediment transport and diagenesis to generate geologically realistic, non-stationary property distributions (Tanevski et al., 2017). Training image methods in multiple-point statistics capture non-stationary patterns through conceptual geological models that honor depositional and diagenetic trends. Machine learning techniques, particularly deep neural networks, can learn complex, non-stationary relationships between sparse well data and seismic attributes when properly constrained with geological knowledge (Hall, 2016b).
The most effective workflows for handling non-stationarity in sparse well prediction combine these approaches through hierarchical modeling frameworks. These frameworks first identify and model large-scale geological trends (e.g., progradational sequences, diagenetic zones) before addressing smaller-scale heterogeneity within each domain (Ringrose and Bentley, 2015b). This approach both honors the underlying geological processes creating non-stationarity and makes efficient use of limited well data by applying appropriate statistical methods within each stationary sub-domain. While significant challenges remain, particularly in quantifying uncertainty associated with non-stationary features, these integrated approaches represent substantial improvements over traditional stationary assumptions in sparse well reservoir prediction.
3 Current methodologies for reservoir prediction
3.1 Seismic attribute analysis
In reservoir prediction, the application of seismic data is indispensable. Seismic data are acquired by generating seismic waves using artificial sources—such as land vibrators or marine air guns—and recording the signals reflected from subsurface lithological interfaces with receivers deployed on the surface or in the sea, such as geophones or hydrophones. The raw data subsequently undergo a series of sophisticated processing steps, including noise suppression, static corrections to account for near-surface and topographic effects, normal moveout (NMO) correction and stacking to enhance valid signals, and most critically, migration to reposition reflected waves to their true subsurface spatial locations. Ultimately, this yields a three-dimensional seismic volume that clearly and accurately represents subsurface geological structures and lithological boundaries.
Seismic attribute analysis has emerged as a critical tool for reservoir characterization in offshore environments with sparse well control, providing essential geological information between widely spaced boreholes. The application of seismic attributes in these challenging settings requires careful consideration of both their potential and limitations to maximize their predictive value. Seismic attributes (e.g., amplitude, coherence, spectral decomposition) help delineate reservoir boundaries and fluid contacts.
Modern workflows employ sophisticated integration of multiple attribute classes to overcome individual limitations. Geometric attributes such as curvature and coherence excel at identifying structural discontinuities and depositional boundaries, while physical attributes including acoustic impedance and VP/VS ratios provide direct petrophysical insights. Spectral decomposition techniques have proven particularly valuable for resolving thin-bed architectures below conventional seismic resolution, with recent advances in machine learning enabling more robust interpretation of complex attribute combinations (Alaudah et al., 2019). The effectiveness of these approaches in sparse well environments depends heavily on proper seismic data conditioning, including amplitude preservation and noise reduction during processing.
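As a concrete example of a simple physical attribute, the sketch below computes RMS amplitude in a sliding vertical window over a single synthetic trace; the window length and the synthetic bright interval are illustrative choices.

```python
import numpy as np

def rms_amplitude(trace, window):
    """RMS of `trace` in a centered sliding window of `window` samples."""
    pad = window // 2
    padded = np.pad(trace.astype(float) ** 2, pad, mode="edge")
    kernel = np.ones(window) / window
    mean_sq = np.convolve(padded, kernel, mode="valid")
    return np.sqrt(mean_sq[: len(trace)])

rng = np.random.default_rng(1)
trace = rng.normal(0, 1, 500)      # synthetic seismic trace (unit noise)
trace[200:260] *= 4.0              # a bright, high-amplitude interval
attr = rms_amplitude(trace, window=21)
```

Applied trace-by-trace over a 3D volume, the same operation yields the amplitude attribute maps used to delineate reservoir boundaries and fluid contacts.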
A fundamental limitation arises from the difficulty of establishing robust attribute-property relationships with limited well control. Conventional multivariate regression approaches often fail when the number of potential predictors exceeds the available training data (Hall, 2016). This has led to increased adoption of machine learning techniques capable of identifying nonlinear relationships from sparse datasets, though these require careful regularization to avoid overfitting. Recent work demonstrates that incorporating geological constraints into machine learning algorithms can improve generalization beyond the immediate well control.
Innovative applications specifically address sparse well challenges. Deep learning-based seismic facies classification achieves reliable predictions even with limited training data by leveraging transfer learning from geologically similar fields. Hybrid inversion-attribute workflows combine the strengths of deterministic inversion and statistical attribute analysis to better constrain reservoir properties between wells. Time-lapse (4D) attribute analysis, where available, provides valuable dynamic constraints on reservoir performance that complement static geological models.
3.2 Stochastic modeling
Stochastic modeling serves as a pivotal approach for offshore reservoir prediction under data scarcity, with its value lying in the generation of multiple equiprobable geological realizations to systematically quantify uncertainty while theoretically honoring both well data and seismic constraints, as well as preserving spatial variability structures. However, the effectiveness of this method in practical applications is highly dependent on the geological understanding of the area, data quality, and the rationality of the modeling workflow, leading to a noteworthy gap between its theoretical soundness and practical performance. Modern stochastic modeling practices integrate a variety of advanced geostatistical techniques, such as Sequential Gaussian Simulation (SGS) incorporating seismic trends, Gaussian simulation adapted for complex lithologies, and Multiple-Point Statistics (MPS) reliant on training images. These methods enhance modeling efficiency through dynamic optimization, multi-scale pattern recognition, and parallel computing. Nonetheless, their strong dependence on training images and variogram models may introduce subjective biases in practical applications, particularly in regions with atypical geological patterns or insufficient prior knowledge.
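The covariance-honoring core of these simulation methods can be sketched in a few lines: generating equiprobable Gaussian realizations with a prescribed exponential covariance via Cholesky factorization, a simplified, unconditional analogue of SGS. Grid size, variogram range, and sill are illustrative.

```python
import numpy as np

n, dx = 100, 10.0                                  # 1 km of 10 m cells
vario_range, sill = 150.0, 1.0                     # exponential model params
x = np.arange(n) * dx
h = np.abs(x[:, None] - x[None, :])                # pairwise lag distances
cov = sill * np.exp(-3.0 * h / vario_range)        # exponential covariance
L = np.linalg.cholesky(cov + 1e-10 * np.eye(n))    # small jitter for stability

rng = np.random.default_rng(7)
# 50 realizations: same covariance structure, different detail in each
realizations = (L @ rng.standard_normal((n, 50))).T
```

Full SGS additionally conditions each realization to honor well observations exactly; the spread across `realizations` is what the uncertainty analysis in Section 2 summarizes.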
Although machine learning techniques—such as neural networks for constructing variograms, Generative Adversarial Networks (GANs) for generating synthetic training images, and reinforcement learning for optimizing realization selection—have been introduced to enhance modeling capabilities, they still face limitations regarding black-box effects, data dependency, and generalization capacity in low-information settings. In terms of uncertainty quantification, stochastic modeling typically relies on experimental design (encompassing 50–100 scenarios), Monte Carlo simulation, and decision-risk analysis to support risk-based decision-making. However, these methods are sensitive to prior models and parameter settings and incur high computational costs, which may hinder their applicability in real-time decision environments. Integrated modeling frameworks attempt to combine multiple information sources, such as seismic soft constraints, dynamic data assimilation, geological analogs, and process-based simulations. Yet, systematic errors among multi-source data, scale inconsistencies, and inherent assumptions of fusion algorithms remain major challenges in practical applications (Ehsan et al., 2025).
Although some case studies suggest that stochastic modeling can reduce uncertainty by 30%–40% compared to deterministic methods in sparsely drilled areas, this conclusion should be interpreted with caution. The advantages are often attributed to the model’s preservation of geological realism, systematic quantification of spatial uncertainty, integration of multi-source data, and support for probabilistic decision-making. However, such outcomes are highly contingent on specific geological conditions, data quality, and the expertise of the modeler, and no universal consensus has been reached. For instance, in a practical modeling study conducted in the Indus Basin, limitations in 2D seismic data and limited well logs resulted in model uncertainties of 20%–30%, reflecting the direct impact of input data quality on result reliability (Ehsan et al., 2023).
Emerging technologies such as quantum-accelerated simulation and physics-informed neural networks show potential in further improving simulation efficiency and geological consistency, thereby expanding the applicability of stochastic modeling in data-constrained offshore reservoir development. However, these techniques are currently mostly in the proof-of-concept and early application stages. Their practical effectiveness, stability, and transferability across different geological settings still require systematic validation through extensive empirical studies and industry practice.
3.3 Machine learning and AI-driven approaches
Supervised learning (e.g., Random Forests, Neural Networks) can predict reservoir properties by training on available well and seismic data (Hall, 2016b). Unsupervised methods (e.g., clustering) identify hidden patterns in sparse datasets. Machine learning and AI-driven approaches are revolutionizing offshore sparse well reservoir prediction by addressing data scarcity, enhancing pattern recognition, and accelerating uncertainty quantification. These methods complement traditional stochastic modeling by transforming sparse well data, seismic attributes, and geological priors into actionable insights through data-driven workflows. Neural networks excel at interpolating subsurface properties from limited observations, generating synthetic variograms from small datasets, and emulating complex geological processes. Generative adversarial networks (GANs) and diffusion models produce high-resolution training images that replicate spatial heterogeneity in lithology and fluid distribution, bypassing the need for extensive well data (Song et al., 2021). Reinforcement learning optimizes scenario selection and dynamic data integration, enabling real-time adaptation to new measurements. Hybrid frameworks merge ML with geostatistics—using random forests to constrain multipoint statistics or Bayesian networks to fuse seismic and petrophysical data—while physics-informed machine learning ensures predictions align with reservoir flow dynamics.
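A minimal sketch of the unsupervised route mentioned above, assuming synthetic gamma-ray and bulk-density log samples: k-means clustering separates the points into electrofacies without any labels. The log values and cluster count are illustrative.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
# Two synthetic log populations: clean sand vs shale
sand = np.column_stack([rng.normal(40, 5, 60), rng.normal(2.25, 0.03, 60)])
shale = np.column_stack([rng.normal(110, 8, 60), rng.normal(2.55, 0.03, 60)])
logs = np.vstack([sand, shale])                  # columns: GR [API], RHOB [g/cc]

# Standardize so GR does not dominate the Euclidean distance metric
z = (logs - logs.mean(axis=0)) / logs.std(axis=0)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(z)
```

In practice the cluster count would be chosen from geological knowledge or a criterion such as silhouette score, and the resulting electrofacies would then seed the supervised attribute-to-facies mapping.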
Uncertainty propagation is enhanced through Bayesian neural networks and Monte Carlo dropout techniques, quantifying predictive confidence in resource estimates. Challenges persist in handling noisy data, ensuring geological consistency, and validating models against sparse analogs, but advancements in transfer learning and active learning mitigate these limitations. Case studies highlight 20%–50% improvements in porosity prediction accuracy and 30%–40% uncertainty reduction in field development planning compared to conventional methods (Caers, 2011; Goodfellow et al., 2020).
Future trajectories focus on explainable AI to interpret subsurface patterns, quantum machine learning for ultra-high-dimensional reservoir simulations, and digital twins integrating real-time IoT data with predictive models. These innovations enable operators to make risk-aware decisions in data-limited environments while reducing exploration costs and environmental footprint.
3.4 Analog-based reservoir modeling
Using outcrop or mature field analogs helps constrain geological models where well data is insufficient (Ringrose and Bentley, 2015a). Analog-based reservoir modeling offers a pragmatic solution for offshore sparse well prediction by leveraging geological similarities from analogous basins or regions to compensate for data limitations. This approach systematically identifies and applies diagnostic features from well-characterized analog systems, such as depositional sequences, facies architecture, diagenetic patterns, and fluid behavior, to reconstruct subsurface properties in data-sparse target areas. Key implementation steps include: (1) analog candidate selection using quantitative criteria (e.g., depositional environment, stratigraphic age, sediment supply, tectonic setting) validated through seismic attribute matching and sequence stratigraphic frameworks (Howell et al., 2014); (2) data integration that merges sparse well logs, 3D seismic data, and production history with analog datasets to build facies probability maps and property distributions (Gao et al., 2025); (3) spatial analogy propagation using geostatistical tools like kriging-by-inversion or machine learning classifiers to transfer attributes between analog and target locations while preserving geological continuity (Chilès and Delfiner, 2012). Advanced implementations employ similarity metrics (e.g., Mahalanobis distance, neural network embeddings) to rank analog candidates and reduce subjectivity, while dynamic updating adjusts analog weights as new data becomes available (Goodfellow et al., 2020). Challenges include managing regional geological variability, ensuring scale compatibility between analogs and targets, and quantifying uncertainty from analog misapplication.
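Step (1), ranking analog candidates by a similarity metric, can be sketched with a Mahalanobis distance in a small feature space of basin descriptors; the candidate fields, features, and values below are entirely hypothetical.

```python
import numpy as np

def mahalanobis(diff, cov_inv):
    q = float(diff @ cov_inv @ diff)
    return float(np.sqrt(max(q, 0.0)))   # clamp tiny negative round-off

# Rows: candidate analog fields; columns: hypothetical basin descriptors
# (water depth [m], net-to-gross [-], typical sand-body width [m])
analogs = np.array([
    [1200.0, 0.65, 300.0],
    [1500.0, 0.55, 450.0],
    [800.0,  0.70, 250.0],
    [2000.0, 0.40, 600.0],
])
target = np.array([1300.0, 0.60, 350.0])   # the sparse-well target field

# Pseudo-inverse guards against a near-singular covariance from few analogs
cov_inv = np.linalg.pinv(np.cov(analogs, rowvar=False))
dists = np.array([mahalanobis(a - target, cov_inv) for a in analogs])
ranking = np.argsort(dists)                # best-matching analog first
```

Unlike raw Euclidean distance, the covariance scaling keeps the large-magnitude depth and width features from drowning out net-to-gross; dynamic updating would recompute `dists` as new target data arrives.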
Hybrid workflows combine analog methods with stochastic simulations (e.g., plurigaussian models) or machine learning (e.g., convolutional neural networks for facies classification) to enhance robustness (Bai and Tahmasebi, 2020; Cui et al., 2021). Case studies demonstrate that analog-based approaches reduce porosity and permeability prediction errors by 25%–40% in deep-water turbidite reservoirs compared to purely deterministic methods, particularly where regional analogs share similar depositional systems (Zhao et al., 2022). The method’s strength lies in its ability to bypass over-reliance on sparse local data by borrowing geological wisdom from mature basins, while its limitations include potential mismatches in diagenetic history or subsurface stresses. Future advancements focus on AI-driven analog matching using transfer learning, digital twins integrating real-time production data with analog simulations, and physics-constrained neural networks to reduce extrapolation risks. When applied rigorously, analog-based modeling provides a cost-effective framework for reducing exploration uncertainty in frontier offshore settings while guiding well placement and field development planning.
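The analog-ranking step described above can be made concrete. The snippet below is an illustrative, self-contained sketch (synthetic feature vectors and a hypothetical `mahalanobis_rank` helper) of ranking analog candidates by Mahalanobis distance, one of the similarity metrics named in the text:

```python
import numpy as np

def mahalanobis_rank(target, analogs):
    """Rank analog candidates by Mahalanobis distance to a target
    feature vector (e.g., normalized net-to-gross, grain size,
    slope gradient). Smaller distance = closer analog."""
    X = np.asarray(analogs, dtype=float)
    cov = np.cov(X, rowvar=False)
    cov_inv = np.linalg.pinv(cov)          # pseudo-inverse for stability
    diffs = X - np.asarray(target, dtype=float)
    d2 = np.einsum('ij,jk,ik->i', diffs, cov_inv, diffs)
    order = np.argsort(d2)
    return order, np.sqrt(np.maximum(d2, 0.0))

# Three hypothetical analog fields, each described by 3 normalized features
analogs = [[0.60, 0.30, 0.80],
           [0.20, 0.90, 0.10],
           [0.55, 0.35, 0.75]]
target = [0.50, 0.40, 0.70]
order, dist = mahalanobis_rank(target, analogs)
print(order[0])   # index of the closest analog candidate
```

In a real workflow the feature vectors would come from the quantitative selection criteria of step (1), and the resulting ranking would feed the dynamic analog-weighting scheme mentioned above.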
4 Proposed integrated workflow
4.1 High-resolution seismic reprocessing
This step involves reprocessing raw seismic data to extract fine-scale subsurface features critical for reservoir characterization. Full Waveform Inversion (FWI) is a wave equation-constrained nonlinear optimization methodology that iteratively minimizes the misfit between observed seismic records and synthetically generated seismic data to progressively refine subsurface velocity models (Yang et al., 2025). In contrast to conventional ray-based imaging techniques, FWI leverages complete wavefield attributes—including amplitude, phase, and frequency—to resolve subsurface parameters (e.g., velocity, attenuation, anisotropy) at scales approaching the dominant seismic wavelength (typically <10 m), thereby significantly enhancing spatial resolution. To extend the usable seismic bandwidth and improve lateral resolution in complex lithologies such as deep-water turbidites or carbonate platforms, spectral broadening methods (e.g., sparse spike inversion) and high-resolution imaging techniques (e.g., reverse time migration) are commonly utilized to augment high-frequency content. Recent advances in artificial intelligence have facilitated the integration of machine learning approaches—such as physics-informed neural networks—into FWI workflows to stabilize inversion under noisy conditions, accelerate convergence, or impose prior model constraints via generative adversarial networks (GANs) (Fang et al., 2020; Mosser et al., 2020). For instance, in an offshore exploration campaign in West Africa, the integration of FWI with spectral enhancement techniques enabled the identification of thin-bed reservoirs (<5 m thickness) that were unresolvable through conventional processing, thereby providing critical constraints for infill drilling planning.
Nevertheless, practical implementation of FWI remains challenged by substantial computational and storage requirements (e.g., 3D FWI processing may demand 10^6–10^8 CPU-hours for wavefield snapshot management), sensitivity to initial velocity models and low-frequency data availability, and parameter trade-offs in multi-parameter inversion. To address these limitations, hybrid inversion strategies have been developed, combining ray-based traveltime tomography with FWI to construct geologically consistent initial models and implement hierarchical frequency incorporation for robust inversion enhancement (Liu and Gu, 2012).
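At its core, FWI minimizes a misfit functional J(v) = ||d_obs − d_syn(v)||² over a velocity model. The toy sketch below illustrates that idea for a single-parameter model with one Ricker arrival, scanning the misfit over trial velocities instead of computing adjoint-state gradients; all numbers are synthetic and purely illustrative:

```python
import numpy as np

t = np.linspace(0.0, 1.0, 501)          # time axis (s)
offset = 1500.0                         # source-receiver offset (m)
f_dom = 5.0                             # dominant frequency (Hz)

def synthetic(v):
    """Ricker wavelet arriving at traveltime offset/v."""
    tau = t - offset / v
    arg = (np.pi * f_dom * tau) ** 2
    return (1.0 - 2.0 * arg) * np.exp(-arg)

v_true = 2000.0
d_obs = synthetic(v_true)               # "observed" trace

# Scan the misfit J(v) = ||d_obs - d_syn(v)||^2 over trial velocities;
# real FWI minimizes the same functional with adjoint-state gradients.
trial_v = np.linspace(1500.0, 2500.0, 201)
misfit = np.array([np.sum((d_obs - synthetic(v)) ** 2) for v in trial_v])
v_best = trial_v[np.argmin(misfit)]
print(v_best)                           # → 2000.0
```

A low dominant frequency is deliberately used here: with a higher-frequency wavelet the misfit develops the local minima (cycle skipping) that make FWI so sensitive to the initial model, as the surrounding text notes.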
4.2 Multi-attribute seismic analysis
This workflow transforms raw seismic data into geologically meaningful descriptors by fusing multiple attributes (e.g., amplitude, phase, AVO, elastic impedance) to resolve fluid and facies variations. PCA reduces dimensionality by identifying dominant eigenattributes correlated with reservoir properties, while deep learning (e.g., convolutional autoencoders) uncovers nonlinear relationships between seismic patterns and petrophysical parameters. For instance, a 2021 case study in the North Sea applied a ResNet-18 architecture to predict porosity from 3D seismic attributes, achieving 15% lower error than linear PCA in heterogeneous sandstone reservoirs. Advanced methods employ attention mechanisms to weight the contribution of different attributes spatially, enhancing interpretability. The integration of production data (e.g., water breakthrough times) into neural networks further calibrates predictions to dynamic reservoir behavior. However, overfitting remains a risk in data-scarce settings, necessitating transfer learning from analogous basins or synthetic training datasets generated via geostatistical simulation (Barnes, 2016).
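The PCA step can be sketched in a few lines. Assuming a synthetic attribute matrix (samples × attributes) in which two attributes share a hidden reservoir factor, the snippet below extracts the dominant eigenattributes via SVD; it is a minimal illustration, not the workflow of any cited study:

```python
import numpy as np

# Synthetic seismic attribute matrix: two attributes driven by one
# hidden reservoir factor, one uncorrelated attribute.
rng = np.random.default_rng(0)
n = 500
latent = rng.normal(size=n)                     # hidden reservoir factor
attrs = np.column_stack([
    2.0 * latent + 0.1 * rng.normal(size=n),    # amplitude-like
    -1.5 * latent + 0.1 * rng.normal(size=n),   # impedance-like
    rng.normal(size=n),                         # uncorrelated attribute
])

X = attrs - attrs.mean(axis=0)                  # center each attribute
U, S, Vt = np.linalg.svd(X, full_matrices=False)
explained = S ** 2 / np.sum(S ** 2)             # variance fraction per PC
scores = X @ Vt.T                               # projected eigenattributes
print(explained[0])                             # first PC dominates
```

Because the first two attributes are nearly collinear, the first principal component absorbs most of the variance, which is exactly the dimensionality reduction the text describes before any regression or deep-learning step.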
4.3 Hybrid geostatistical-ML modeling
The hybrid geostatistical-machine learning modeling approach integrates the geological structural realism of stochastic modeling methods (such as Sequential Gaussian Simulation and Multiple-Point Statistics) with the nonlinear predictive capability of machine learning. This enables the generation of multiple equiprobable reservoir realizations, thereby providing a more comprehensive quantification of reservoir uncertainty (Mariethoz and Caers, 2014; Hou et al., 2025). For instance, methods like Random Forest or Gaussian Mixture Models, trained on well logs and seismic attributes, can be employed to predict the spatial distribution of reservoir parameters (e.g., porosity) (Dubrule, 2017a; Dubrule, 2017b). These predictions are subsequently integrated as soft constraints into the geostatistical simulation workflow, enhancing predictive accuracy while preserving geological spatial continuity (Chen et al., 2025). This hybrid modeling framework can be further combined with Bayesian hierarchical modeling to systematically propagate uncertainties and quantify confidence intervals for prediction outcomes through ensemble variance analysis (Huang et al., 2023). In a practical application within Australia’s Browse Basin, this method demonstrated a reduction of approximately 30% in the uncertainty of permeability predictions compared to traditional Sequential Indicator Simulation, significantly improving the probabilistic reliability of field development plans. Advancing further, sophisticated implementations introduce Generative Adversarial Networks (GANs) to generate high-resolution training images, replacing traditional variogram models. This avoids reliance on explicit geological rules and enhances the model’s ability to characterize complex reservoir architectures (Fan et al., 2024). 
However, this approach still faces challenges, including high computational costs (e.g., GAN training often requires over 10^4 GPU-hours) and difficulties in ensuring consistency between machine learning outputs and geological processes (Song et al., 2025).
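A stripped-down sketch of the hybrid idea follows. For brevity, a linear regression stands in for the Random Forest of the text and uncorrelated noise stands in for a variogram-based residual simulation; both substitutions are deliberate simplifications, and all data are synthetic:

```python
import numpy as np

rng = np.random.default_rng(1)
n_cells = 1000
attr = rng.uniform(0.0, 1.0, n_cells)            # seismic attribute per cell
poro_true = 0.05 + 0.20 * attr + 0.02 * rng.normal(size=n_cells)

# Sparse well control: fit a data-driven trend from 30 sampled cells.
wells = rng.choice(n_cells, size=30, replace=False)
A = np.column_stack([np.ones(wells.size), attr[wells]])
coef, *_ = np.linalg.lstsq(A, poro_true[wells], rcond=None)
trend = coef[0] + coef[1] * attr                 # soft constraint / trend

# Stochastic residuals on top of the trend -> equiprobable realizations.
resid = poro_true[wells] - (coef[0] + coef[1] * attr[wells])
ensemble = trend + np.std(resid) * rng.normal(size=(100, n_cells))

# Ensemble spread quantifies prediction uncertainty per cell.
p10, p90 = np.percentile(ensemble, [10, 90], axis=0)
print(float(np.mean(p90 - p10)))
```

The point of the sketch is the structure of the workflow: an ML-style prediction supplies the trend, stochastic simulation supplies equiprobable perturbations, and the ensemble statistics (here a P10-P90 band) deliver the uncertainty quantification discussed above.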
4.4 Dynamic data integration
This step iteratively updates reservoir models using time-lapse observations (4D seismic) and production history to refine predictions and reduce epistemic uncertainty (Oliver and Chen, 2011). Data assimilation frameworks, such as ensemble Kalman filters (EnKF) or particle Markov chain Monte Carlo (PMCMC), fuse 4D seismic amplitude changes (e.g., pressure depletion, fluid movement) with production rates and bottomhole pressure measurements. In the Gulf of Mexico, EnKF updates reduced waterflood conformance errors by 25% in a tertiary oil recovery project by correcting initial model misplacements of injected water channels. Machine learning accelerates this process via surrogate models (e.g., physics-informed GANs) that predict reservoir responses to operational scenarios without costly reservoir simulations. The integration of real-time IoT data (e.g., fiber-optic DAS measurements) enables adaptive management, such as dynamically adjusting injection rates based on predicted porosity changes. However, non-uniqueness in data interpretation requires Bayesian model averaging to account for multiple calibration pathways, increasing computational complexity.
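In the scalar case, the EnKF analysis step mentioned above reduces to a cross-covariance-based Kalman gain. The sketch below uses a synthetic ensemble and a hypothetical linear observation operator to show a single update; it is illustrative only:

```python
import numpy as np

rng = np.random.default_rng(2)
Ne = 200
x = rng.normal(1.0, 0.3, Ne)         # prior ensemble of one parameter
                                     # (e.g., a permeability multiplier)

def forward(x):
    """Hypothetical observation operator: rate grows with the parameter."""
    return 50.0 * x

y_obs = 62.0                         # observed production rate
R = 4.0                              # observation-error variance

y = forward(x)
C_xy = np.cov(x, y)[0, 1]            # state-observation cross-covariance
C_yy = np.var(y, ddof=1)
K = C_xy / (C_yy + R)                # Kalman gain (scalar case)

# Perturbed-observation EnKF update of every ensemble member.
perturbed_obs = y_obs + rng.normal(0.0, np.sqrt(R), Ne)
x_post = x + K * (perturbed_obs - y)
print(np.mean(x_post))               # pulled from 1.0 toward y_obs / 50
```

Posterior spread shrinks relative to the prior, which is the uncertainty-reduction effect that 4D seismic and production data provide in full-field applications.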
5 Future directions
5.1 Physics-informed neural networks (PINNs)
PINNs integrate physical laws (e.g., partial differential equations governing fluid flow, geomechanics, or wave propagation) directly into neural network architectures, enabling models to respect subsurface physics while learning from sparse data (Raissi et al., 2019). For example, a PINN trained on well logs and seismic data could embed Darcy’s law or the wave equation into its loss function, reducing overfitting and improving extrapolation in data-scarce regions. This approach has shown promise in reservoir simulation, where it accelerates history matching by solving inverse problems with orders-of-magnitude fewer iterations than traditional gradient-based methods. In offshore carbonate reservoirs, PINNs incorporating reaction-diffusion equations for diagenesis improved porosity prediction accuracy by 20% compared to purely data-driven models. Challenges include balancing physics constraints with data noise, as overly rigid physics priors may suppress learning from sparse observations. Advances in adaptive PINNs, which dynamically adjust physics weighting during training, and hybrid PINNs that couple with stochastic simulators (e.g., sequential Gaussian simulation), are addressing these limitations. Future applications could include real-time monitoring of injection-driven reservoirs, where PINNs fuse 4D seismic and production data to predict pressure-driven compaction.
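The composite PINN loss can be illustrated without a neural network. The sketch below evaluates a data-misfit term plus a PDE-residual term for steady 1D single-phase flow with constant permeability (d²p/dx² = 0), using finite differences on a gridded pressure profile in place of the automatic differentiation a real PINN would use:

```python
import numpy as np

x = np.linspace(0.0, 1.0, 101)
dx = x[1] - x[0]

obs_idx = [0, 100]                    # "wells" at the two boundaries
obs_p = np.array([1.0, 0.0])          # observed pressures there

def pinn_loss(p, lam=1.0):
    """Data misfit at sparse observations + weighted PDE residual."""
    data = np.sum((p[obs_idx] - obs_p) ** 2)
    pde = np.sum(((p[2:] - 2.0 * p[1:-1] + p[:-2]) / dx ** 2) ** 2)
    return data + lam * pde

p_linear = 1.0 - x                               # exact solution
p_wiggly = 1.0 - x + 0.05 * np.sin(8 * np.pi * x)  # fits data, violates PDE
print(pinn_loss(p_linear), pinn_loss(p_wiggly))
```

Both candidate profiles honor the two observations, but only the linear one satisfies the physics; the PDE term is what penalizes the oscillatory profile. The weight `lam` plays the role of the physics-versus-data balance whose tuning the text identifies as a key challenge.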
5.2 Quantum computing for geostatistics
Quantum algorithms promise exponential speedups for computationally intensive geostatistical tasks, such as Markov chain Monte Carlo (MCMC) sampling or high-dimensional uncertainty quantification (Preskill, 2018). For instance, in porous media reconstruction, quantum annealing algorithms, by incorporating quantum tunneling effects, effectively overcome the tendency of classical simulated annealing to become trapped in local minima, thereby significantly enhancing parameter optimization efficiency (Sahimi and Tahmasebi, 2022). In geophysical inversion, quantum genetic algorithms leverage quantum superposition and parallelism to not only accelerate convergence rates but also improve global search capabilities, making them suitable for magnetotelluric and seismic impedance inversion problems (Wang et al., 2024). Quantum algorithms also demonstrate considerable promise for solving partial differential equations. The Harrow-Hassidim-Lloyd (HHL) algorithm, for example, can solve systems of linear equations efficiently on a quantum computer, offering a potential exponential speedup for frequency-domain seismic wavefield simulation. Similarly, the Quantum Fourier Transform (QFT) significantly reduces computational complexity and is anticipated to accelerate Fourier transform operations in tasks such as full-waveform inversion (Wang et al., 2024). Pilot studies indicate that hybrid quantum-classical algorithms can solve subsurface flow and seismic inversion problems in a fraction of the time required by classical supercomputers, which could take months or even years (Souza et al., 2022). The quantum advantage is particularly pronounced for combinatorial optimization problems, such as well placement under geological constraints, and for quantum chemistry simulations of fluid-rock interactions.
However, current quantum devices in the Noisy Intermediate-Scale Quantum (NISQ) era remain constrained by qubit count, coherence time, and error rates, limiting practical applications to small-scale proof-of-concept studies (Sahimi and Tahmasebi, 2022). Future advancements in fault-tolerant quantum computing and breakthroughs in quantum machine learning algorithms—such as quantum image processing and quantum generative adversarial networks (QGANs) for generating synthetic training images—hold the potential to enable high-fidelity probabilistic forecasting of complex geological systems like ultra-deepwater reservoirs, thereby ushering in a new era for geological modeling.
5.3 Digital twin technology
Digital twins are dynamic, physics-based replicas of reservoirs that integrate real-time IoT data (e.g., fiber-optic DAS strain measurements, downhole sensors) with predictive models to enable adaptive decision-making (Saputelli et al., 2020). A typical twin combines a high-fidelity reservoir simulator, data assimilation frameworks (e.g., ensemble Kalman filters), and cloud-based visualization tools. In the Johan Sverdrup field, a digital twin updated permeability maps hourly using production logs and 4D seismic, reducing water breakthrough prediction errors by 35%. Machine learning accelerates twin workflows: autoencoders compress high-dimensional sensor data into latent variables for rapid model updates, while reinforcement learning optimizes injection strategies in response to forecasted pressure changes. Challenges include data latency (e.g., satellite telemetry delays in deep-water operations) and model drift due to evolving reservoir conditions. Next-generation twins will incorporate digital thread integration, linking drilling, completion, and production data across asset lifecycles, and quantum-enhanced solvers for real-time uncertainty propagation. For example, BP’s Deltares collaboration uses digital twins to simulate CO2 injection scenarios in carbon storage projects, with predictions updated daily using satellite monitoring data. By enabling proactive reservoir management, twins could reduce operational costs by 15%–25% and extend field life by 10–20 years.
6 Conclusion
Reservoir prediction under sparse well conditions remains a critical challenge in offshore exploration. While seismic and machine learning techniques show promise, an integrated approach combining geostatistics, AI, and analog studies is essential for reducing uncertainty. Continued advancements in high-performance computing and hybrid modeling are poised to significantly enhance the accuracy of reservoir characterization, thereby providing robust support for efficient hydrocarbon exploration and production in data-constrained environments. The progression of this technological trajectory will not only facilitate more precise delineation and development of offshore reservoirs but also catalyze the industry-wide transition toward quantitative, intelligence-driven paradigms. Furthermore, we propose the promotion of collaborative multi-disciplinary initiatives to achieve deeper integration of geology-guided approaches with data-driven workflows. Concurrently, the establishment of open-access reservoir analog databases and standardized algorithmic platforms is recommended to accelerate the translation of innovative methodologies into practical applications.
Statements
Author contributions
YW: Writing – review and editing. YC: Writing – review and editing, Writing – original draft. PX: Writing – original draft. HZ: Conceptualization, Writing – review and editing. BY: Writing – review and editing, Supervision, Software. ZZ: Formal Analysis, Project administration, Writing – review and editing.
Funding
The author(s) declared that financial support was received for this work and/or its publication. This work was supported by the Open Fund of the Key Laboratory of Exploration Technologies for Oil and Gas Resources (Yangtze University), Ministry of Education (K2024-06, K2025-05), and National Natural Science Foundation of China (42502128, 42402153).
Conflict of interest
Authors YW, HZ, BY, and ZZ were employed by Shengli Oilfield Company, SINOPEC.
The remaining author(s) declared that this work was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
Generative AI statement
The author(s) declared that generative AI was used in the creation of this manuscript. The English writing was polished by DeepSeek.
Any alternative text (alt text) provided alongside figures in this article has been generated by Frontiers with the support of artificial intelligence and reasonable efforts have been made to ensure accuracy, including review by the authors wherever possible. If you identify any issues, please contact us.
Publisher’s note
All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.
References
Alaudah, Y., Michałowicz, P., Alfarraj, M., and AlRegib, G. (2019). A machine-learning benchmark for facies classification. Interpretation 7 (3), SE175–SE187. doi: 10.1190/INT-2018-0249.1
Avseth, P., Mukerji, T., Mavko, G., and Dvorkin, J. (2010). Rock physics diagnostics of depositional texture, diagenetic alterations, and reservoir heterogeneity in high-porosity siliciclastic sediments and rocks: a review of selected models and suggested workflows. Geophysics 75, A31–A47. doi: 10.1190/1.3483770
Bai, T., and Tahmasebi, P. (2020). Hybrid geological modeling: combining machine learning and multiple-point statistics. Comput. Geosci. 142, 104519. doi: 10.1016/j.cageo.2020.104519
Barnes, R. J. (2016). Multi-attribute seismic analysis using machine learning. Geophys. Prospect. 64 (5), 1234–1248. doi: 10.1111/1365-2478.12456
Caers, J. (2011). Modeling uncertainty in the earth sciences. Chichester, UK: John Wiley & Sons Ltd.
Cannon, G. H. (2018). Reservoir uncertainty quantification: a machine learning perspective. J. Petroleum Sci. Eng. 166, 1038–1051. doi: 10.1016/j.petrol.2018.03.015
Chen, Q., Xun, L., Cui, Z., Zhou, R., Chen, D., and Liu, G. (2025). Recent progress and development trends of three-dimensional geological modeling. Bull. Geol. Sci. Technol. 44 (3), 373–387. doi: 10.19509/j.cnki.dzkq.tb20240284
Chilès, J.-P., and Delfiner, P. (2012). Geostatistics: modeling spatial uncertainty. 2nd ed. New York, NY: John Wiley and Sons.
Cui, Z., Chen, Q., Liu, G., Ma, X., and Cui, L. (2021). Hybrid parallel framework for multiple-point geostatistics on Tianhe-2: a robust solution for large-scale simulation. Comput. Geosci. 157, 104923. doi: 10.1016/j.cageo.2021.104923
Dubrule, O. (2017a). Geostatistics for seismic data integration in Earth models. Tulsa, OK: Society of Exploration Geophysicists. doi: 10.1190/1.9781560801962
Dubrule, O. (2017b). Geostatistics for petroleum geology and reservoir engineering. Amsterdam, Netherlands: Elsevier. doi: 10.1007/978-3-319-47603-3
Ehsan, M., Latif, M. A. U., Ali, A., Radwan, A. E., Amer, M. A., and Abdelrahman, K. (2023). Geocellular modeling of the Cambrian to Eocene multi-reservoirs, Upper Indus Basin, Pakistan. Nat. Resour. Res. 32 (6), 2583–2607. doi: 10.1007/s11053-023-10256-7
Ehsan, M., Chen, R., Ali, M., Manzoor, U., Naqvi, S. M. N., Ali, A., et al. (2025). Evaluating the Rankot Formation in the Middle Indus Basin, Pakistan as a promising secondary reservoir for development. Geomech. Geophys. Geo-Energy Geo-Resour. 11 (1), 32. doi: 10.1007/s40948-025-00950-6
Fan, W., Liu, G., Chen, Q., Cui, Z., Fang, H., Chen, G., et al. (2024). Automatic reconstruction of geological reservoir models based on conditioning data constraints and BicycleGAN. Geoenergy Sci. Eng. 234, 212690. doi: 10.1016/j.geoen.2024.212690
Fang, Z., Fang, H., and Demanet, L. (2020). Deep generator priors for Bayesian seismic inversion. Adv. Geophys. 61, 179–216. doi: 10.1016/bs.agph.2020.07.002
Gao, F., Zhou, L., Li, D., Zhao, Y., Liu, J., Duan, J., et al. (2025). Application of the multi-information seismic-constrained reservoir modeling in extra-low permeability reservoir development. Prog. Geophys. 40 (1), 121–130. doi: 10.6038/pg2025HH0400
Goodfellow, I., Pouget-Abadie, J., Mirza, M., Xu, B., Warde-Farley, D., Ozair, S., et al. (2020). Generative adversarial networks. Commun. ACM 63 (11), 139–144. doi: 10.1145/3422622
Gou, Q., Xu, S., Hao, F., Yang, F., Zhang, B., Shu, Z., et al. (2019). Full-scale pores and micro-fractures characterization using FE-SEM, gas adsorption, nano-CT and micro-CT: a case study of the Silurian Longmaxi Formation shale in the Fuling area, Sichuan Basin, China. Fuel 253, 167–179.
Hall, B. (2016a). Machine learning in the oil and gas industry. Berlin: Springer Science & Business Media B.V.
Hall, M. (2016b). Deep learning for seismic facies classification. Lead. Edge 35 (10), 874–880. doi: 10.1190/tle35100874.1
He, Z., Zhao, X., Zhang, W., Lyv, X., Zhu, D., Zhao, L., et al. (2023). Progress and direction of geological modeling for deep and ultra-deep carbonate reservoirs. Oil Gas Geol. 44 (1), 16–33. doi: 10.11743/ogg20230102
Hou, M., Cao, H., Li, H., Chen, A., Wei, A., Chen, Y., et al. (2019). Characteristics and controlling factors of deep buried-hill reservoirs in the BZ19-6 structural belt, Bohai Sea area. Nat. Gas Ind. B 6 (4), 305–316. doi: 10.1016/j.ngib.2019.01.011
Hou, W., Li, Y., Ye, S., Yang, S., and Xiao, F. (2025). Mapping 3D overthrust structures by a hybrid modeling method. Earth Space Sci. 12, e2024EA003916. doi: 10.1029/2024EA003916
Howell, J. A., Martinius, A. W., and Good, T. R. (2014). The application of outcrop analogues in geological modelling: a review, present status and future outlook. Geol. Soc. Lond. Spec. Publ. 387, 1–25. doi: 10.1144/SP387.12
Huang, J., Deng, H., Chen, J., Li, N., Wang, J., Liu, Z., et al. (2023). Assessing geometrical uncertainties in geological interface models using Markov chain Monte Carlo sampling via abstract graph. Tectonophysics 864, 230032. doi: 10.1016/j.tecto.2023.230032
Huang, J., Wang, H., Xu, F., Yang, M., Zhao, J., Li, P., et al. (2025). Sequence stratigraphy analysis and lithofacies paleogeography reconstruction of isolated platform in a rift lake basin: implications for deepwater hydrocarbon exploration in the subsalt of Santos Basin, Brazil. Petroleum Explor. Dev. 52 (4), 982–1000. doi: 10.1016/S1876-3804(25)60617-3
Hyne, N. J. (2012). Nontechnical guide to petroleum geology, exploration, drilling, and production. 3rd ed. Nashville, TN: Endeavor Business Media, LLC.
Kaplan, R., Pyrcz, M. J., and Strebelle, S. (2017). "Deepwater reservoir connectivity reproduction from MPS and process-mimicking geostatistical methods," in Geostatistics Valencia 2016 (Cham: Springer International Publishing), 601–611. doi: 10.1007/978-3-319-46819-8_40
Li, Y., Jia, D., Wang, S., Qu, R., Qiao, M., and Liu, H. (2024). Surrogate model for reservoir performance prediction with time-varying well control based on depth generative network. Petroleum Explor. Dev. 51 (5), 1114–1125. doi: 10.1016/S1876-3804(25)60541-6
Lima, L. A., Görnitz, N., Varella, L. E., Vellasco, M., Mueller, K. R., and Nakajima, S. (2017). Porosity estimation by semi-supervised learning with sparsely available labeled samples. Comput. Geosci. 106, 33–48. doi: 10.1016/j.cageo.2017.05.004
Liu, Q., and Gu, Y. J. (2012). Seismic imaging: from classical to adjoint tomography. Tectonophysics 566, 31–66. doi: 10.1016/j.tecto.2012.07.006
Liu, J., Yang, X., Lv, W., Xu, W., and Liu, G. (2022). Application of multi-point geostatistical modeling in fan delta sedimentary model. J. Southwest Petroleum Univ. Sci. Technol. Ed. 44 (5), 47–60. doi: 10.11885/j.issn.1674-5086.2020.07.01.01
Liu, F., Zhao, X., Feng, X., Cao, S., and Bu, F. (2025). Quantitative characterization of deep-sea channel continuity under architecture model constraints. J. Southwest Petroleum Univ. 47 (1), 16–26. doi: 10.11885/j.issn.1674-5086.2024.08.28.01
Mariethoz, G., and Caers, J. (2014). Multiple-point geostatistics: stochastic modeling with training images. Hoboken, NJ: Wiley-Blackwell. doi: 10.1002/9781118432479
Mosser, L., Dubrule, O., and Blunt, M. J. (2020). Stochastic seismic waveform inversion using generative adversarial networks as a geological prior. Math. Geosci. 52 (1), 53–79. doi: 10.1007/s11004-019-09832-6
Oliver, D. S., and Chen, Y. (2011). Recent progress in history matching: a review. Comput. Geosci. 15 (1), 185–202. doi: 10.1007/s10596-010-9230-z
Preskill, J. (2018). Quantum computing in the NISQ era and beyond. Quantum 2, 79. doi: 10.22331/q-2018-08-06-79
Raissi, M., Perdikaris, P., and Karniadakis, G. E. (2019). Physics-informed neural networks: a deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations. J. Comput. Phys. 378, 686–707. doi: 10.1016/j.jcp.2018.10.045
Ringrose, P. S., and Bentley, M. R. (2015a). Reservoir model design. Berlin: Springer Science & Business Media B.V.
Ringrose, P. S., and Bentley, M. R. (2015b). Reservoir analogs: a guide to understanding subsurface heterogeneity. Amsterdam: Elsevier. doi: 10.1016/C2014-0-02208-3
Sahimi, M., and Tahmasebi, P. (2022). The potential of quantum computing for geoscience. Transp. Porous Media 145 (2), 367–387. doi: 10.1007/s11242-022-01855-8
Saputelli, L., Nardi, C., Riva, P., and Fabbri, P. (2020). Digital twin frameworks for reservoir management: a review. Comput. Geosci. 143, 104589. doi: 10.1016/j.cageo.2020.104589
Song, S., Mukerji, T., and Hou, J. (2021). GANSim: conditional facies simulation using an improved progressive growing of generative adversarial networks (GANs). Math. Geosci. 53 (7), 1413–1444. doi: 10.1007/s11004-021-09934-0
Song, S., Mukerji, T., Scheidt, C., Alqassab, H., and Feng, M. (2025). Geomodelling of multi-scenario non-stationary reservoirs with enhanced GANSim. Natural Resource Management. doi: 10.13140/RG.2.2.33098.07367
Souza, A. M., Martins, E. O., Roditi, I., Sá, N., Sarthour, R. S., and Oliveira, I. S. (2022). An application of quantum annealing computing to seismic inversion. Front. Phys. 9, 748285. doi: 10.3389/fphy.2021.748285
Stow, D., and Smillie, Z. (2020). Distinguishing between deep-water sediment facies: turbidites, contourites and hemipelagites. Geosciences 10 (2), 68. doi: 10.3390/geosciences10020068
Tanevski, J., Nikola, S., Ljupco, T., and Saso, D. (2017). Process-based modeling and design of dynamical systems. Lect. Notes Comput. Sci. 10536, 378–382. doi: 10.1007/978-3-319-71273-4_35
Wang, H., Fan, T., Hu, G., He, M., Zhang, X., and Gao, Y. (2020). Analysis and characterization of sandstone reservoir architecture in middle and late stages of offshore oilfield development. Mar. Geol. Quat. Geol. 40 (1), 114–125. doi: 10.16562/j.cnki.0256-1492.2018111801
Wang, S., Liu, C., Li, P., and Zhao, P. (2024). Application of quantum computing in geophysics. Oil Geophys. Prospect. 59 (2), 352–367. doi: 10.13810/j.cnki.issn.1000-7210.2024.02.017
Worthington, P. F. (2010). Reservoir characterization: integrating geology and geophysics. SEG Tech. Program Expand. Abstr. 29 (1), 3456–3460. doi: 10.1190/1.3513633
Xie, P., Hou, J., Duan, D., Yao, Y., Yang, W., Liu, Y., et al. (2023). A novel genetic inversion workflow based on spectral decomposition and convolutional neural networks for sand prediction in Xihu Sag of East China Sea. Geoenergy Sci. Eng. 231, 212331. doi: 10.1016/j.geoen.2023.212331
Yang, D., Dong, X., Huang, J., Fang, Z., Huang, X., Liu, M., et al. (2025). High-resolution full waveform seismic imaging: progresses, challenges, and prospects. Sci. China Earth Sci. 68 (2), 315–342. doi: 10.1007/s11430-024-1498-0
Yue, D., Li, W., Du, Y., Hu, G., Wang, W., Wang, W., et al. (2022). Review on optimization and fusion of seismic attributes for fluvial reservoir characterization. Earth Sci. 47 (11), 3929–3943. doi: 10.3799/dqkx.2022.221
Zhang, W., Duan, T., Liu, Z., Yuan, S., Lin, Y., and Xu, H. (2016). Application of multi-point geostatistics in deep-water turbidity channel simulation: a case study of Plutonio oilfield in Angola. Petroleum Explor. Dev. 43 (3), 443–450. doi: 10.1016/S1876-3804(16)30051-9
Zhao, W., Zhang, W., Li, M., Zhao, H., Lu, W., Liu, Z., et al. (2022). Updating and application for a reservoir geological model of deep-water turbidites: a case study of a 4D seismic survey from the PU Oilfield in Angola. Bull. Geol. Sci. Technol. 41 (4), 301–308. doi: 10.19509/j.cnki.dzkq.2022.0115
Zhou, Y., Zuo, R., Liu, G., Yuan, F., Mao, X., Guo, Y., et al. (2021). The great-leap-forward development of mathematical geoscience during 2010–2019: big data and artificial intelligence algorithms are changing mathematical geoscience. Bull. Mineralogy Petrology Geochem. 40 (3), 556–573. doi: 10.19658/j.issn.1007-2802.2021.40.038
Keywords
offshore hydrocarbon, reservoir prediction, sparse well data, data fusion, stochastic modeling, AI-driven methods
Citation
Wang Y, Cao Y, Xie P, Zhou H, Yang B and Zhang Z (2026) Reservoir prediction methods under sparse well conditions in offshore fields: perspectives and challenges. Front. Earth Sci. 14:1642714. doi: 10.3389/feart.2026.1642714
Received
07 June 2025
Revised
17 November 2025
Accepted
30 January 2026
Published
26 February 2026
Volume
14 - 2026
Edited by
Sherif Farouk, Egyptian Petroleum Research Institute, Egypt
Reviewed by
Muhsan Ehsan, Bahria University, Pakistan
Copyright
© 2026 Wang, Cao, Xie, Zhou, Yang and Zhang.
This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
*Correspondence: Yingchang Cao, cyc8391680@163.com; Pengfei Xie, pengfei.xie@yangtzeu.edu.cn