30 results for Geospatio-temporal Conceptual Models
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
In studies related to deep geological disposal of radioactive waste, it is current practice to transfer external information (e.g. from other sites, from underground rock laboratories or from natural analogues) to safety cases for specific projects. Transferable information most commonly includes parameters, investigation techniques, process understanding, conceptual models and high-level conclusions on system behaviour. Prior to transfer, the basis of transferability needs to be established. In argillaceous rocks, the most relevant common feature is the microstructure of the rocks, essentially determined by the properties of clay minerals. Examples from the Swiss and French programmes show how the transfer of information was handled and justified. These examples illustrate how transferability depends on the stage of development of a repository safety case and highlight the need for adequate system understanding at all sites involved to support the transfer.
Abstract:
Water-conducting faults and fractures were studied in the granite-hosted Äspö Hard Rock Laboratory (SE Sweden). On a scale of decametres and larger, steeply dipping faults dominate and contain a variety of different fault rocks (mylonites, cataclasites, fault gouges). On a smaller scale, somewhat less regular fracture patterns were found. Conceptual models of the fault and fracture geometries and of the properties of rock types adjacent to fractures were derived and used as input for the modelling of in situ dipole tracer tests that were conducted in the framework of the Tracer Retention Understanding Experiment (TRUE-1) on a scale of metres. After the identification of all relevant transport and retardation processes, blind predictions of the breakthroughs of conservative to moderately sorbing tracers were calculated and then compared with the experimental data. This paper provides the geological basis and model calibration, while the predictive and inverse modelling work is the topic of the companion paper [J. Contam. Hydrol. 61 (2003) 175]. The TRUE-1 experimental volume is highly fractured and contains the same types of fault rocks and alterations as on the decametric scale. The experimental flow field was modelled on the basis of a 2D-streamtube formalism with an underlying homogeneous and isotropic transmissivity field. Tracer transport was modelled using the dual porosity medium approach, which is linked to the flow model by the flow porosity. Given the substantial pumping rates in the extraction borehole, the transport domain has a maximum width of a few centimetres only. It is concluded that neither the uncertainty regarding the length of individual fractures nor the detailed geometry of the network along the flowpath between injection and extraction boreholes is critical, because flow is largely one-dimensional, whether through a single fracture or a network. Process identification and model calibration were based on a single uranine breakthrough (test PDT3), which clearly showed that matrix diffusion had to be included in the model even over the short experimental time scales, evidenced by the characteristic shape of the trailing edge of the breakthrough curve. Using the geological information and therefore considering limited matrix diffusion into a thin fault gouge horizon resulted in a good fit to the experiment. On the other hand, fresh granite was found not to interact noticeably with the tracers over the time scales of the experiments. While fracture-filling gouge materials are very efficient in retarding tracers over short periods of time (hours to days), their volume is very small and, as time progresses, retardation will be dominated by altered wall rock and, finally, by fresh granite. In such rocks, both the porosity (and therefore the effective diffusion coefficient) and the sorption Kd values are more than one order of magnitude smaller than in fault gouge, indicating that long-term retardation is expected to occur but to be less pronounced.
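The matrix-diffusion diagnostic mentioned above can be made concrete with a short sketch (not the authors' code; all parameter values are invented). For a Dirac tracer pulse advected through a single fracture with one-dimensional diffusion into an effectively unlimited matrix, the classical flux response is h(t) = beta/(2*sqrt(pi)) * (t - tw)^(-3/2) * exp(-beta^2/(4*(t - tw))), whose trailing edge decays with the characteristic -3/2 log-log slope referred to in the abstract:

```python
# Sketch of the matrix-diffusion signature in a breakthrough curve
# (illustrative only; tw and beta are assumed values, not from the paper).
import numpy as np

tw = 5.0    # advective water travel time [h] (assumed)
beta = 3.0  # lumped matrix-diffusion parameter [h^0.5] (assumed); it groups
            # matrix porosity, effective diffusivity, sorption and aperture

t = np.linspace(tw + 1e-3, 500.0, 200_000)
tau = t - tw
h = beta / (2.0 * np.sqrt(np.pi)) * tau**-1.5 * np.exp(-beta**2 / (4.0 * tau))

# Mass recovery by t = 500 h; the heavy tail keeps it below the full unit mass.
mass = np.sum(0.5 * (h[1:] + h[:-1]) * np.diff(t))
print(f"recovered mass fraction by t=500 h: {mass:.2f}")

# The late-time log-log slope approaches -1.5, the matrix-diffusion signature;
# a purely advective-dispersive breakthrough tail would decay much faster.
tail = t > 50.0
slope = np.polyfit(np.log(t[tail]), np.log(h[tail]), 1)[0]
print(f"late-time log-log slope: {slope:.2f} (expected ~ -1.5)")
```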
Abstract:
Since European settlement, there has been a dramatic increase in the density, cover and distribution of woody plants in former grassland and open woodland. There is a widespread belief that shrub encroachment is synonymous with declines in ecosystem functions, and often it is associated with landscape degradation or desertification. Indeed, this decline in ecosystem functioning is considered to be driven largely by the presence of the shrubs themselves. This prevailing paradigm has been the basis for an extensive program of shrub removal, based on the view that it is necessary to reinstate the original open woodland or grassland structure from which shrublands are thought to have been derived. We review existing scientific evidence, particularly focussed on eastern Australia, to question the notion that shrub encroachment leads to declines in ecosystem functions. We then summarise this scientific evidence into two conceptual models aimed at optimising landscape management to maximise the services provided by shrub-encroached areas. The first model seeks to reconcile the apparent conflicts between the patch- and landscape-level effects of shrubs. The second model identifies the ecosystem services derived from different stages of shrub encroachment. We also examined six ecosystem services provided by shrublands (biodiversity, soil C, hydrology, nutrient provision, grass growth and soil fertility) by using published and unpublished data. We demonstrated the following: (1) shrub effects on ecosystems are strongly scale-, species- and environment-dependent and, therefore, no standardised management should be applied to every case; (2) overgrazing dampens the generally positive effect of shrubs, leading to the misleading relationship between encroachment and degradation; (3) woody encroachment per se does not hinder any of the functions or services described above, rather it enhances many of them; (4) no single shrub-encroachment state (including grasslands without shrubs) will maximise all services; rather, the provision of ecosystem goods and services by shrublands requires a mixture of different states; and (5) there has been little rigorous assessment of the long-term effectiveness of removal and no evidence that this improves land condition in most cases. Our review provides the basis for an improved, scientifically based understanding and management of shrublands, so as to balance the competing goals of providing functional habitats, maintaining soil processes and sustaining pastoral livelihoods.
Abstract:
The fatality risk caused by avalanches on road networks can be analysed using a long-term approach, resulting in a mean value of risk, and with emphasis on short-term fluctuations due to the temporal variability of both the hazard potential and the damage potential. In this study, the approach for analysing the long-term fatality risk was adapted by modelling the highly variable short-term risk. The emphasis was on the temporal variability of the damage potential and the related risk peaks. For defined hazard scenarios resulting from classified amounts of snow accumulation, the fatality risk was calculated by modelling the hazard potential and observing the traffic volume. The avalanche occurrence probability was calculated using a statistical relationship between new snow height and observed avalanche releases. The number of persons at risk was determined from the recorded traffic density. The method yielded a value for the fatality risk within the observed time frame for the studied road segment. Both the long-term and the short-term fatality risk due to snow avalanches were compared with the average fatality risk due to traffic accidents. The application of the method showed that the long-term avalanche risk is lower than the fatality risk due to traffic accidents. The analyses of short-term avalanche-induced fatality risk, however, revealed risk peaks 50 times higher than the statistical accident risk. Apart from situations with a high hazard level and a high traffic density, risk peaks result both from a high hazard level combined with a low traffic density and from a high traffic density combined with a low hazard level. This provides evidence for the importance of the temporal variability of the damage potential for risk simulations on road networks. The assumed dependence of the risk calculation on the three-day sum of precipitation is a simplified model, so further research is needed for an improved determination of the diurnal avalanche probability. Nevertheless, the presented approach may serve as a conceptual step towards risk-based decision-making in risk management.
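The risk structure described above can be reduced to a minimal sketch (illustrative only, not the study's implementation; all probabilities and counts below are invented). The short-term fatality risk of a scenario is essentially the product of its avalanche occurrence probability and the momentary number of exposed persons, which is why peaks can arise from either factor:

```python
# Minimal sketch of scenario-based fatality risk on a road segment.
# p_hit and lethality are invented placeholders for the exposure and
# consequence terms of a full risk equation.
def fatality_risk(p_avalanche, n_persons_exposed, p_hit=0.05, lethality=0.2):
    """Expected fatalities for one hazard scenario on one road segment.

    p_avalanche       avalanche occurrence probability of the scenario
                      (in the study: derived from classified new-snow heights)
    n_persons_exposed persons in the endangered stretch, from traffic counts
    p_hit             probability that an exposed person is hit (assumed)
    lethality         probability of death given a hit (assumed)
    """
    return p_avalanche * n_persons_exposed * p_hit * lethality

# Risk peaks arise from either factor: high hazard with little traffic,
# or moderate hazard combined with dense traffic.
quiet_night = fatality_risk(p_avalanche=0.30, n_persons_exposed=2)
rush_hour = fatality_risk(p_avalanche=0.05, n_persons_exposed=80)
print(f"quiet night: {quiet_night:.3f}  rush hour: {rush_hour:.3f}")
```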
Abstract:
Vestibular cognition has recently gained attention. Despite numerous experimental and clinical demonstrations, it is not yet clear what vestibular cognition really is. For future research in vestibular cognition, adopting a computational approach will make it easier to explore the underlying mechanisms. Indeed, most modeling approaches in vestibular science include a top-down or a priori component. We review recent Bayesian optimal observer models, and discuss in detail the conceptual value of prior assumptions, likelihood and posterior estimates for research in vestibular cognition. We then consider forward models in vestibular processing, which are required in order to distinguish between sensory input that is induced by active self-motion, and sensory input that is due to passive self-motion. We suggest that forward models are used not only in the service of estimating sensory states but they can also be drawn upon in an offline mode (e.g., spatial perspective transformations), in which interaction with sensory input is not desired. A computational approach to vestibular cognition will help to discover connections across studies, and it will provide a more coherent framework for investigating vestibular cognition.
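The prior/likelihood/posterior logic reviewed above can be illustrated with a minimal Gaussian sketch (the noise parameters are assumptions for illustration, not values from the paper): a noisy semicircular-canal measurement of rotation velocity is combined with a prior favouring slow self-motion, so the posterior estimate is shrunk toward zero:

```python
# Minimal Bayesian optimal-observer sketch for vestibular velocity estimation.
import numpy as np

sigma_prior = 20.0  # prior over rotation velocity: N(0, sigma_prior^2) [deg/s]
sigma_canal = 10.0  # canal measurement noise sd [deg/s] (assumed)

def posterior(measurement):
    """Posterior mean and sd for a Gaussian prior times a Gaussian likelihood."""
    w = sigma_prior**2 / (sigma_prior**2 + sigma_canal**2)  # likelihood weight
    mean = w * measurement         # estimate shrunk toward the prior mean of 0
    sd = np.sqrt(w) * sigma_canal  # posterior uncertainty
    return mean, sd

# A 60 deg/s measurement is systematically underestimated because the prior
# pulls the estimate toward zero.
mean, sd = posterior(60.0)
print(f"posterior estimate: {mean:.1f} +/- {sd:.1f} deg/s")
```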
Abstract:
Rho-family GTPases are molecular switches that transmit extracellular cues to intracellular signaling pathways. Their activity is likely to be tightly regulated in space and time, but most of what is known about Rho-family GTPase signaling has been derived from techniques that do not resolve these dimensions. New imaging technologies now allow the visualization of Rho GTPase signaling with high spatio-temporal resolution. This has led to insights that significantly extend classic models and call for a novel conceptual framework. These approaches clearly show three things. First, Rho GTPase signaling dynamics occur on micrometer length scales and subminute timescales. Second, multiple subcellular pools of a given Rho GTPase can operate simultaneously in time and space to regulate a wide variety of morphogenetic events (e.g. leading-edge membrane protrusion, tail retraction, membrane ruffling). These different Rho GTPase subcellular pools might be described as 'spatio-temporal signaling modules' and might involve the specific interaction of one GTPase with different guanine nucleotide exchange factors (GEFs), GTPase-activating proteins (GAPs) and effectors. Third, complex spatio-temporal signaling programs that involve precise crosstalk between multiple Rho GTPase signaling modules regulate specific morphogenetic events. The next challenge is to decipher the molecular circuitry underlying this complex spatio-temporal modularity to produce integrated models of Rho GTPase signaling.
Abstract:
Background: External validity of study results is an important issue from a clinical point of view. From a methodological point of view, however, the concept of external validity is more complex than it seems at first glance. Methods: Methodological review to address the concept of external validity. Results: External validity refers to the question of whether results are generalizable to persons other than the population in the original study. The only formal way to establish external validity would be to repeat the study for that specific target population. We propose a three-way approach for assessing the external validity for specified target populations. (i) The study population might not be representative of the eligibility criteria that were intended. It should be addressed whether the study population differs from the intended source population with respect to characteristics that influence outcome. (ii) The target population will, by definition, differ from the study population with respect to geographical, temporal and ethnic conditions. Pondering external validity means asking whether these differences may influence study results. (iii) It should be assessed whether the study's conclusions can be generalized to target populations that do not meet all the eligibility criteria. Conclusion: Judging the external validity of study results cannot be done by applying given eligibility criteria to a single target population. Rather, it is a complex reflection in which prior knowledge, statistical considerations, biological plausibility and eligibility criteria all have a place.
Abstract:
In learning from trial and error, animals need to relate behavioral decisions to environmental reinforcement even though it may be difficult to assign credit to a particular decision when outcomes are uncertain or subject to delays. When considering the biophysical basis of learning, the credit-assignment problem is compounded because the behavioral decisions themselves result from the spatio-temporal aggregation of many synaptic releases. We present a model of plasticity induction for reinforcement learning in a population of leaky integrate-and-fire neurons which is based on a cascade of synaptic memory traces. Each synaptic cascade correlates presynaptic input first with postsynaptic events, next with the behavioral decisions and finally with external reinforcement. For operant conditioning, learning succeeds even when reinforcement is delivered with a delay so large that temporal contiguity between decision and pertinent reward is lost due to intervening decisions which are themselves subject to delayed reinforcement. This shows that the model provides a viable mechanism for temporal credit assignment. Further, learning speeds up with increasing population size, so the plasticity cascade simultaneously addresses the spatial problem of assigning credit to synapses in different population neurons. Simulations on other tasks, such as sequential decision making, serve to contrast the performance of the proposed scheme with that of temporal difference-based learning. We argue that, due to their comparative robustness, synaptic plasticity cascades are attractive basic models of reinforcement learning in the brain.
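The cascade idea can be caricatured in a few lines of code. The sketch below is a deliberately simplified scalar version with invented constants, not the paper's population model: a pre/post coincidence trace feeds a decision-gated trace, and only the arrival of (possibly delayed) reward converts the slow trace into an actual weight change:

```python
# Scalar caricature of a cascade of synaptic memory traces for delayed
# reinforcement (all time constants, rates and probabilities are invented).
import numpy as np

rng = np.random.default_rng(0)
dt, tau1, tau2 = 1.0, 20.0, 200.0  # ms; trace time constants (assumed)
eta = 0.01                         # learning rate (assumed)

w, trace1, trace2 = 0.5, 0.0, 0.0
for step in range(1000):
    pre = rng.random() < 0.1        # presynaptic spike this step?
    post = rng.random() < 0.1       # postsynaptic spike this step?
    decision = rng.random() < 0.05  # behavioral decision event?
    reward = 1.0 if rng.random() < 0.01 else 0.0  # sparse, delayed reward

    # Stage 1: pre/post coincidence trace (fast synaptic tag).
    trace1 += dt * (-trace1 / tau1 + float(pre and post))
    # Stage 2: decision-gated trace bridges the delay to reinforcement.
    trace2 += dt * (-trace2 / tau2 + trace1 * float(decision))
    # Stage 3: reward converts the slow trace into a weight change.
    w += eta * reward * trace2

print(f"final weight: {w:.3f}")
```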
Abstract:
Fully coupled climate carbon cycle models are sophisticated tools used to predict future climate change and its impact on the land and ocean carbon cycles. These models should be able to adequately represent natural variability, which requires model validation against observations. The present study focuses on the ocean carbon cycle component, in particular the spatial and temporal variability in net primary productivity (PP) and export production (EP) of particulate organic carbon (POC). Results from three coupled climate carbon cycle models (IPSL, MPIM, NCAR) are compared with observation-based estimates derived from satellite measurements of ocean colour and results from inverse modelling (data assimilation). Satellite observations of ocean colour have shown that temporal variability of PP on the global scale is largely dominated by the permanently stratified, low-latitude ocean (Behrenfeld et al., 2006), with stronger stratification (higher sea surface temperature; SST) being associated with negative PP anomalies. Results from all three coupled models confirm the role of the low-latitude, permanently stratified ocean for anomalies in globally integrated PP, but only one model (IPSL) also reproduces the inverse relationship between stratification (SST) and PP. An adequate representation of iron and macronutrient co-limitation of phytoplankton growth in the tropical ocean turns out to be the crucial mechanism determining whether the models can reproduce observed interactions between climate and PP.
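The model test described above amounts to an anomaly correlation. The following sketch uses synthetic data purely to illustrate the diagnostic (no real model output or satellite data are involved): a model reproduces the Behrenfeld et al. (2006) relationship if its low-latitude SST and PP anomalies correlate negatively:

```python
# Illustrative anomaly-correlation diagnostic with synthetic monthly data.
import numpy as np

rng = np.random.default_rng(1)
months = 120
sst_anom = rng.normal(0.0, 0.3, months)                    # K (synthetic)
pp_anom = -2.0 * sst_anom + rng.normal(0.0, 0.4, months)   # PgC/yr (synthetic)

r = np.corrcoef(sst_anom, pp_anom)[0, 1]
print(f"SST-PP anomaly correlation: r = {r:.2f}")
# A negative r means stronger stratification (warmer SST) coincides with
# reduced productivity, the observed inverse relationship.
```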
Abstract:
The Wetland and Wetland CH4 Intercomparison of Models Project (WETCHIMP) was created to evaluate our present ability to simulate large-scale wetland characteristics and corresponding methane (CH4) emissions. A multi-model comparison is essential to evaluate the key uncertainties in the mechanisms and parameters leading to methane emissions. Ten modelling groups joined WETCHIMP to run eight global and two regional models with a common experimental protocol using the same climate and atmospheric carbon dioxide (CO2) forcing datasets. We reported the main conclusions from the intercomparison effort in a companion paper (Melton et al., 2013). Here we provide technical details for the six experiments, which included an equilibrium, a transient, and an optimized run plus three sensitivity experiments (temperature, precipitation, and atmospheric CO2 concentration). The diversity of approaches used by the models is summarized through a series of conceptual figures, and is used to evaluate the wide range of wetland extent and CH4 fluxes predicted by the models in the equilibrium run. We discuss relationships among the various approaches and patterns in consistencies of these model predictions. Within this group of models, there are three broad classes of methods used to estimate wetland extent: prescribed based on wetland distribution maps, prognostic relationships between hydrological states based on satellite observations, and explicit hydrological mass balances. A larger variety of approaches was used to estimate the net CH4 fluxes from wetland systems. Even though modelling of wetland extent and CH4 emissions has progressed significantly over recent decades, large uncertainties still exist when estimating CH4 emissions: there is little consensus on model structure or complexity due to knowledge gaps, different aims of the models, and the range of temporal and spatial resolutions of the models.
Abstract:
BACKGROUND: Rising levels of overweight and obesity are important public-health concerns worldwide. The purpose of this study is to elucidate their prevalence and trends in Switzerland by analyzing variations in the Body Mass Index (BMI) of Swiss conscripts. METHODS: The conscription records were provided by the Swiss Army. This study focussed on conscripts 18.5-20.5 years of age from the seven one-year birth cohorts spanning the period 1986-1992. BMI was analyzed across professional status, area-based socioeconomic position (abSEP), urbanicity and regions. Two piecewise quantile regression models with linear splines for three birth-cohort groups were used to examine the association of median BMI with explanatory variables and to determine the extent to which BMI has varied over time. RESULTS: The study population consisted of 188,537 individuals. Median BMI was 22.51 kg/m2 (95% confidence interval (CI): 22.45-22.57). BMI was lower among conscripts of high professional status (-0.46 kg/m2; 95% CI: -0.50, -0.42; compared with low), living in areas of high abSEP (-0.11 kg/m2; 95% CI: -0.16, -0.07; compared with medium) and from urban communities (-0.07 kg/m2; 95% CI: -0.11, -0.03; compared with peri-urban). Compared with the Midland region, median BMI was highest in the North-West (0.25 kg/m2; 95% CI: 0.19-0.30) and Central regions (0.11 kg/m2; 95% CI: 0.05-0.16) and lowest in the East (-0.19 kg/m2; 95% CI: -0.24, -0.14) and Lake Geneva regions (-0.15 kg/m2; 95% CI: -0.20, -0.09). Trajectories of regional BMI growth varied across birth cohorts, with median BMI remaining high in the Central and North-West regions, whereas stabilization and in some cases a decline were observed elsewhere. CONCLUSIONS: The BMI of Swiss conscripts is associated with individual and area-based socioeconomic position and with urbanicity. The results show regional variation in the levels and temporal trajectories of BMI growth and signal a possible slowdown among recent birth cohorts.
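The regression approach described above can be sketched as follows (simulated data; the knot position, effect sizes and noise level are invented, and this is not the study's code). A median regression on a linear-spline basis lets the cohort slope change at a knot, which is what makes the fit piecewise:

```python
# Sketch of piecewise median (0.5 quantile) regression with a linear spline
# over birth cohorts, on simulated data with invented parameters.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 5000
cohort = rng.integers(1986, 1993, n).astype(float)  # birth year
high_status = (rng.random(n) < 0.3).astype(float)   # high professional status

# Simulated BMI: rising trend, a slope change at the 1989 knot, a status effect.
bmi = (22.5 + 0.10 * (cohort - 1986)
       - 0.05 * np.maximum(cohort - 1989.0, 0.0)
       - 0.46 * high_status
       + rng.normal(0.0, 1.8, n))

# Design matrix: overall cohort trend plus a hinge term at the 1989 knot.
X = sm.add_constant(np.column_stack([
    cohort - 1986,
    np.maximum(cohort - 1989.0, 0.0),
    high_status,
]))

res = sm.QuantReg(bmi, X).fit(q=0.5)  # median regression
print(res.params)  # intercept, pre-knot slope, slope change, status effect
```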
Abstract:
The present study investigated the relationship between psychometric intelligence and temporal resolution power (TRP) as simultaneously assessed by auditory and visual psychophysical timing tasks. In addition, three different theoretical models of the functional relationship between TRP and psychometric intelligence as assessed by means of the Adaptive Matrices Test (AMT) were developed. To test the validity of these models, structural equation modeling was applied. Empirical data supported a hierarchical model that assumed auditory and visual modality-specific temporal processing at a first level and amodal temporal processing at a second level. This second-order latent variable was substantially correlated with psychometric intelligence. Therefore, the relationship between psychometric intelligence and psychophysical timing performance can be explained best by a hierarchical model of temporal information processing.
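The hierarchical structure supported by these data can be written down directly as a structural equation model. The sketch below simulates toy data and specifies the model in lavaan-style syntax using the semopy package; variable names, loadings and the sample size are invented, and the snippet illustrates the model structure rather than reproducing the study's analysis:

```python
# Hierarchical SEM sketch: auditory and visual first-order timing factors,
# an amodal second-order factor (TRP), and a path from TRP to the
# intelligence score (here called "amt" after the Adaptive Matrices Test).
import numpy as np
import pandas as pd
from semopy import Model

rng = np.random.default_rng(3)
n = 500
trp = rng.normal(size=n)                    # latent amodal timing ability
aud = 0.8 * trp + 0.6 * rng.normal(size=n)  # auditory timing factor
vis = 0.8 * trp + 0.6 * rng.normal(size=n)  # visual timing factor

data = pd.DataFrame(
    {f"aud{i}": 0.8 * aud + 0.6 * rng.normal(size=n) for i in (1, 2, 3)}
    | {f"vis{i}": 0.8 * vis + 0.6 * rng.normal(size=n) for i in (1, 2, 3)}
    | {"amt": 0.5 * trp + 0.9 * rng.normal(size=n)}
)

desc = """
Auditory =~ aud1 + aud2 + aud3
Visual   =~ vis1 + vis2 + vis3
TRP      =~ Auditory + Visual
amt      ~ TRP
"""

model = Model(desc)
model.fit(data)
print(model.inspect())  # factor loadings and the TRP -> AMT path
```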