10 results for Timed and Probabilistic Automata
in Publishing Network for Geoscientific
Abstract:
Coastal managers require reliable spatial data on the extent and timing of potential coastal inundation, particularly in a changing climate. Most sea level rise (SLR) vulnerability assessments are undertaken using the easily implemented bathtub approach, where areas adjacent to the sea and below a given elevation are mapped using a deterministic line dividing potentially inundated from dry areas. This method only requires elevation data, usually in the form of a digital elevation model (DEM). However, inherent errors in the DEM and in the spatial analysis of the bathtub model propagate into the inundation mapping. The aim of this study was to assess the impacts of spatially variable and spatially correlated elevation errors in high-spatial-resolution DEMs on mapping coastal inundation. Elevation errors were best modelled using regression-kriging. This geostatistical model takes the spatial correlation in elevation errors into account, which has a significant impact on analyses that include spatial interactions, such as inundation modelling. The spatial variability of elevation errors was partially explained by land cover and terrain variables. Elevation errors were simulated using sequential Gaussian simulation, a Monte Carlo probabilistic approach. 1,000 error simulations were added to the original DEM and reclassified using a hydrologically correct bathtub method. The probability of inundation under a scenario combining a 1-in-100-year storm event with a 1 m SLR was calculated by counting the proportion of times, out of the 1,000 simulations, that a location was inundated. This probabilistic approach can be used in a risk-averse decision-making process by planning for scenarios with different probabilities of occurrence. For example, results showed that when considering a 1% exceedance probability, the inundated area was approximately 11% larger than that mapped using the deterministic bathtub approach.
The probabilistic approach provides visually intuitive maps that convey uncertainties inherent to spatial data and analysis.
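The counting procedure described above can be sketched in a few lines. This is a minimal toy illustration, not the study's implementation: the DEM, flood level, and error standard deviation below are invented, and independent Gaussian noise stands in for the sequential Gaussian simulation the authors used to preserve spatial error correlation.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical toy DEM (elevations in metres); the study used a
# high-resolution LiDAR-derived DEM.
dem = np.array([[0.5, 0.9, 1.4],
                [0.8, 1.2, 1.8],
                [1.1, 1.6, 2.2]])

flood_level = 1.5   # assumed combined storm-surge + SLR water level (m)
n_sim = 1000        # number of Monte Carlo error realisations
error_sd = 0.15     # assumed DEM error standard deviation (m)

# For each realisation, add a simulated error field to the DEM and flag
# cells at or below the flood level; the per-cell inundation probability
# is the fraction of realisations in which the cell was flagged.
inundated_count = np.zeros_like(dem)
for _ in range(n_sim):
    error = rng.normal(0.0, error_sd, size=dem.shape)
    inundated_count += (dem + error <= flood_level)

prob_inundation = inundated_count / n_sim
```

A planner applying a 1% exceedance criterion would then map every cell with `prob_inundation >= 0.01` as potentially inundated, which yields a larger footprint than the deterministic `dem <= flood_level` mask.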
Abstract:
Particulate matter concentration and water temperature at the 5 m depth level are compared in the Canary upwelling region to the east of Cape Blanc. It was found that the accumulation of particulate matter was confined to hydrofrontal zones. Particle size distributions for particulate matter obtained using the Coulter counter follow a hyperbolic law (of the Junge type) with two values of the size parameter, which changes at particle diameters of 5-6 microns. Average values of the size parameter in the upwelling region are significantly lower than in the open ocean. The specific surface area of particulate matter, associated with its reactivity, differs significantly on the two sides of the upwelling front and increases beyond the upwelling.
Abstract:
The large discrepancy between field and laboratory measurements of mineral reaction rates is a long-standing problem in the earth sciences, often attributed to factors extrinsic to the mineral itself. Nevertheless, differences in reaction rate are also observed within laboratory measurements, raising the possibility of intrinsic variations as well. Critical insight is available from analysis of the relationship between the reaction rate and its distribution over the mineral surface. This analysis recognizes the fundamental variance of the rate. The resulting anisotropic rate distributions are completely obscured by the common practice of surface area normalization. In a simple experiment using a single crystal and its polycrystalline counterpart, we demonstrate the sensitivity of dissolution rate to grain size, results that undermine the use of "classical" rate constants. Comparison of selected published crystal surface step retreat velocities (Jordan and Rammensee, 1998) as well as large single crystal dissolution data (Busenberg and Plummer, 1986) provides further evidence of this fundamental variability. Our key finding is that the use of a single-valued "mean" rate or rate constant as a function of environmental conditions is unsubstantiated. Reactivity predictions and long-term reservoir stability calculations based on laboratory measurements are thus not directly applicable to natural settings without a probabilistic approach. Such an approach must incorporate both the variation of surface energy as a general range (intrinsic variation) and constraints on this variation owing to the heterogeneity of complex material (e.g., density of domain borders). We suggest the introduction of surface energy spectra (or the resulting rate spectra) containing information about the probability of existing rate ranges and the critical modes of surface energy.
Abstract:
We introduce two probabilistic, data-driven models that predict a ship's speed and the situations in which a ship is likely to get stuck in ice, based on the joint effect of ice features such as the thickness and concentration of level ice, ice ridges, and rafted ice; ice compression is also considered. Two datasets were utilized to develop the models. First, data from the Automatic Identification System about the performance of a selected ship were used. Second, a numerical ice model, HELMI, developed at the Finnish Meteorological Institute, provided information about the ice field. The relations between the ice conditions and ship movements were established using Bayesian learning algorithms. The case study presented in this paper considers a single, unassisted trip of an ice-strengthened bulk carrier between two Finnish ports in the presence of challenging ice conditions, which varied in time and space. The obtained results show good predictive power of the models: on average, 80% accuracy for predicting the ship's speed within specified bins, and above 90% for predicting cases where a ship may get stuck in ice. We expect this new approach to facilitate safe and effective route selection in ice-covered waters, where the ship's performance is reflected in the objective function.
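The abstract does not specify which Bayesian learning algorithm was used, so the following is only a sketch of the general idea: learning class-conditional distributions of ice features and inverting them with Bayes' rule to estimate the probability of getting stuck. The training data, feature choices, and Gaussian naive Bayes model below are all illustrative assumptions, not the authors' model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training data: level-ice thickness (m) and concentration (0-1)
# versus a binary outcome, 1 = ship gets stuck, 0 = ship keeps moving.
# In the study, such inputs came from AIS records and the HELMI ice model.
thickness = np.concatenate([rng.normal(0.3, 0.10, 200), rng.normal(0.9, 0.15, 200)])
concentration = np.concatenate([rng.normal(0.5, 0.10, 200), rng.normal(0.95, 0.03, 200)])
stuck = np.concatenate([np.zeros(200), np.ones(200)])

X = np.column_stack([thickness, concentration])

def fit(X, y):
    # Gaussian naive Bayes: per-class feature means, variances, and priors.
    params = {}
    for c in (0, 1):
        Xc = X[y == c]
        params[c] = (Xc.mean(axis=0), Xc.var(axis=0) + 1e-9, len(Xc) / len(X))
    return params

def predict_proba_stuck(params, x):
    # Log-posterior for each class, normalised to P(stuck | x).
    logp = {}
    for c, (mu, var, prior) in params.items():
        ll = -0.5 * np.sum(np.log(2 * np.pi * var) + (x - mu) ** 2 / var)
        logp[c] = np.log(prior) + ll
    m = max(logp.values())
    w = {c: np.exp(v - m) for c, v in logp.items()}
    return w[1] / (w[0] + w[1])

params = fit(X, stuck)
p_easy = predict_proba_stuck(params, np.array([0.2, 0.40]))  # thin, sparse ice
p_hard = predict_proba_stuck(params, np.array([1.0, 0.97]))  # thick, compact ice
```

The same posterior machinery extends to multi-bin speed prediction by replacing the binary outcome with discretised speed classes.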
Abstract:
Existing models estimating oil spill costs at sea are based on historical data and usually lack a systematic approach. This makes them passive and limits their ability to forecast how changes in the oil-combating fleet or the location of a spill affect the oil spill costs. In this paper we take a step towards the development of a probabilistic and systematic model for estimating the costs of clean-up operations in the Gulf of Finland. For this purpose we utilize expert knowledge along with available data and information from the literature. The obtained information is then combined into a framework with the use of Bayesian Belief Networks. Due to the lack of data, we validate the model by comparing its results with those of existing models, with which we found good agreement. We anticipate that the presented model can contribute to cost-effective oil-combating fleet optimization for the Gulf of Finland. It can also facilitate the estimation of accident consequences within the framework of formal safety assessment (FSA).
Abstract:
The selection of metrics for ecosystem restoration programs is critical for improving the quality of monitoring programs and characterizing project success. Moreover, it is often difficult to balance the importance of multiple ecological, social, and economic metrics. The metric selection process is complex and must simultaneously take into account monitoring data, environmental models, socio-economic considerations, and stakeholder interests. We propose multicriteria decision analysis (MCDA) methods, broadly defined, for the selection of optimal sets of metrics to enhance the evaluation of ecosystem restoration alternatives. Two MCDA methods, multiattribute utility analysis (MAUT) and probabilistic multicriteria acceptability analysis (ProMAA), are applied and compared for a hypothetical case study of a river restoration involving multiple stakeholders. Overall, the MCDA results in a systematic, unbiased, and transparent solution, informing the evaluation of restoration alternatives. The two methods provide comparable results in terms of selected metrics. However, because ProMAA can consider probability distributions for the weights and utility values of metrics for each criterion, it is suggested as the best option if data uncertainty is high. Despite the added complexity of the metric selection process, MCDA improves upon the current ad hoc decision practice based on consultations with stakeholders and experts, and encourages quantitative aggregation of data and judgement, increasing the transparency of decision making in restoration projects. We believe that MCDA can enhance the overall sustainability of ecosystems by addressing both ecological and societal needs.
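The MAUT half of the comparison reduces, in its simplest additive form, to a weighted sum of single-criterion utilities. The sketch below shows that aggregation step only; the criteria names, weights, and utility values are invented for illustration and do not come from the study's case study.

```python
# Illustrative criterion weights (must sum to 1 in the additive model).
weights = {"ecological": 0.5, "social": 0.3, "economic": 0.2}

# Hypothetical utilities of each candidate metric set on each criterion,
# already rescaled to [0, 1].
candidates = {
    "metric_set_A": {"ecological": 0.9, "social": 0.4, "economic": 0.6},
    "metric_set_B": {"ecological": 0.6, "social": 0.8, "economic": 0.7},
}

def maut_score(utilities, weights):
    # Additive MAUT aggregation: sum of weight * utility over all criteria.
    return sum(weights[c] * utilities[c] for c in weights)

scores = {name: maut_score(u, weights) for name, u in candidates.items()}
best = max(scores, key=scores.get)
```

ProMAA differs from this deterministic scoring by treating the weights and utilities as probability distributions and reporting, for each alternative, the probability of attaining each rank, which is why it copes better with high data uncertainty.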
Abstract:
Probabilistic climate data have become available for the first time through the UK Climate Projections 2009, so that the risk of changes in tree growth can be quantified. We assess drought risk spatially and temporally using drought probabilities and tree species vulnerabilities across Britain. We assessed the drought impact on the potential yield class of three major tree species (Picea sitchensis, Pinus sylvestris, and Quercus robur), which presently cover around 59% (400,700 ha) of state-managed forests, across lowland and upland sites. Here we show that drought impacts result mostly in reduced tree growth over the next 80 years under the B1, A1B, and A1FI IPCC emissions scenarios. We found a maximum reduction of 94% but also a maximum increase of 56% in potential stand yield class in the 2080s relative to the baseline climate (1961-1990). Furthermore, potential production over the national forest estate for all three species in the 2080s may decrease due to drought by 42% in the lowlands and 32% in the uplands in comparison to the baseline climate. Our results reveal that potential tree growth and forest production on the national forest estate in Britain are likely to decline, and indicate where and when adaptation measures are required. Moreover, this paper demonstrates the value of probabilistic climate projections for an important economic and environmental sector.
Abstract:
A probabilistic function (the integrated source contribution function, ISCF) based on backward air mass trajectory calculations was developed to track the sources and atmospheric pathways of polycyclic aromatic hydrocarbons (PAHs) reaching the Canadian High Arctic station of Alert. In addition to the movement of air masses, the emission intensities at the sources and the major processes of partitioning, indirect photolysis, and deposition occurring on the way to the Arctic were incorporated into the ISCF. The predicted temporal trend of PAHs at Alert was validated against measured PAH concentrations throughout 2004. PAH levels in the summer are orders of magnitude lower than those in the winter and spring, when long-range atmospheric transport events occur more frequently. PAHs observed at Alert come mostly from East Asia (including the Russian Far East), northern Europe (including European Russia), and North America. These sources account for 25%, 45%, and 27% of the atmospheric PAH level at Alert, respectively. Source regions and transport pathways contributing to PAH contamination in the Canadian High Arctic vary seasonally. In the winter, Russia and Europe are the major sources. PAHs from these sources travel eastward and turn to the north at approximately 120°E before reaching Alert, in conjunction with the well-known Arctic haze events. In the spring, PAHs from Russia and Europe first migrate to the west and then turn to the north at 60°W toward Alert. The majority of PAHs in the summer come from northern Canada, from which they are carried to Alert via low-level transport pathways. In the fall, 70% of PAHs arriving at Alert are delivered from North American sources.
Abstract:
Assessing the frequency and extent of mass movement at continental margins is crucial to evaluating risks for offshore constructions and coastal areas. A multidisciplinary approach including geophysical, sedimentological, geotechnical, and geochemical methods was applied to investigate multistage mass transport deposits (MTDs) off Uruguay, on top of which no surficial hemipelagic drape was detected based on echosounder data. Non-steady-state pore water conditions are evidenced by a distinct gradient change in the sulfate (SO4^2-) profile at 2.8 m depth. A sharp sedimentological contact at 2.43 m coincides with an abrupt downward increase in shear strength from approx. 10 to >20 kPa. This boundary is interpreted as a paleosurface (and the top of an older MTD) that has recently been covered by a sediment package during a younger landslide event. This youngest MTD supposedly originated from an upslope position and carried its initial pore water signature downward. The kink in the SO4^2- profile approx. 35 cm below the sedimentological and geotechnical contact indicates that bioirrigation affected the paleosurface before deposition of the youngest MTD. Based on modeling of the diffusive re-equilibration of SO4^2-, the age of the most recent MTD is estimated to be <30 years. The mass movement was possibly related to an earthquake in 1988 (approx. 70 km southwest of the core location). Probabilistic slope stability back analysis of general landslide structures in the study area reveals that slope failure initiation requires additional ground accelerations. Therefore, we consider the earthquake a reasonable trigger, provided that additional weakening processes (e.g., erosion by previous retrogressive failure events or excess pore pressures) preconditioned the slope for failure. Our study reveals the necessity of multidisciplinary approaches to accurately recognize and date recent slope failures in complex settings such as the investigated area.
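The age bound from diffusive re-equilibration can be checked with a simple order-of-magnitude scaling, t ~ L^2 / (2D), using the ~35 cm offset between the geochemical kink and the sedimentological contact. This is only a back-of-the-envelope sketch, not the transport model used in the study, and the diffusion coefficient is an assumed textbook-range value for sulfate in marine pore water.

```python
# Characteristic diffusion timescale t ~ L^2 / (2 D).
L = 0.35                    # m, offset of the SO4^2- kink below the contact
D = 5.0e-10                 # m^2/s, assumed effective sulfate diffusivity
seconds_per_year = 3.156e7  # seconds in one year

t_seconds = L**2 / (2 * D)
t_years = t_seconds / seconds_per_year
```

The resulting timescale of a few years is consistent with the abstract's estimate of <30 years for the youngest MTD and with the proposed 1988 earthquake trigger.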
Abstract:
The scatterometer SeaWinds on QuikSCAT provided regular Ku-band measurements from 1999 to 2009. Although it was designed for ocean applications, it has frequently been used for the assessment of seasonal snowmelt patterns, alongside other terrestrial applications such as ice cap monitoring, phenology, and urban mapping. This paper discusses the general data characteristics of SeaWinds and reviews relevant change detection algorithms. Depending on the complexity of the method, parameters such as long-term noise and multiple-event analyses were incorporated. Temporal averaging is a commonly accepted preprocessing step, using diurnal, multi-day, or seasonal averages.