31 results for Framework Model
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
Initialising the ocean internal variability for decadal predictability studies is a new area of research, and a variety of ad hoc methods are currently proposed. In this study, we explore how nudging with sea surface temperature (SST) and salinity (SSS) can reconstruct the three-dimensional variability of the ocean in a perfect model framework. This approach builds on the hypothesis that oceanic processes themselves will transport the surface information into the ocean interior, as seen in ocean-only simulations. Five nudged simulations are designed to reconstruct a 150-year "target" simulation, defined as a portion of a long control simulation. The nudged simulations differ by the variables restored to, SST or SST + SSS, and by the area where the nudging is applied. The strength of the heat flux feedback is diagnosed from observations, and the restoring coefficients for SSS use the same time-scale. We observed that this choice prevents spurious convection at high latitudes and near the sea-ice border when nudging both SST and SSS. In the tropics, nudging the SST is enough to reconstruct the tropical atmosphere circulation and the associated dynamical and thermodynamical impacts on the underlying ocean. In the tropical Pacific Ocean, the profiles for temperature show a significant correlation from the surface down to 2,000 m, due to dynamical adjustment of the isopycnals. At mid-to-high latitudes, SSS nudging is required to reconstruct both the temperature and the salinity below the seasonal thermocline. This is particularly true in the North Atlantic, where adding SSS nudging enables reconstruction of the deep convection regions of the target. By initiating a previously documented 20-year cycle of the model, the SST + SSS nudging is also able to reproduce most of the AMOC variations, a key source of decadal predictability. Reconstruction at depth does not significantly improve with the amount of time spent nudging; the efficiency of the surface nudging rather depends on the period/events considered. The joint SST + SSS nudging applied everywhere is the most efficient approach. It ensures that the right water masses are formed at the right surface density, with the subsequent circulation, subduction and deep convection further transporting them to depth. The results of this study underline the potential key role of SSS for decadal predictability and further make the case for sustained large-scale observations of this field.
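The restoring at the heart of such nudging is a simple Newtonian relaxation toward the target fields. Below is a minimal sketch of one surface-restoring step, assuming a single-level SST array on a regular grid; the grid size, the 40 W m^-2 K^-1 feedback strength and the 50 m mixed-layer depth are illustrative stand-ins, not the study's configuration.

```python
import numpy as np

# Minimal sketch of surface nudging (Newtonian restoring) toward a target
# field; illustrative values, not the study's configuration.

def nudge_surface(field, target, gamma, dt, mask=None):
    """One restoring step: d(field)/dt += -gamma * (field - target).

    gamma : restoring rate (1/s), e.g. a heat flux feedback (W m^-2 K^-1)
            divided by rho * cp * h_mix for SST.
    mask  : optional boolean array restricting where nudging is applied
            (e.g. tropics only, or excluding the sea-ice zone).
    """
    increment = -gamma * (field - target) * dt
    if mask is not None:
        increment = np.where(mask, increment, 0.0)
    return field + increment

# Example: a 40 W m^-2 K^-1 feedback over a 50 m mixed layer gives
# gamma = 40 / (1025 * 3990 * 50) ~ 2e-7 s^-1, i.e. a ~60-day timescale.
rho, cp, h_mix = 1025.0, 3990.0, 50.0
gamma = 40.0 / (rho * cp * h_mix)
sst = np.full((10, 10), 290.0)           # model SST (K)
sst_target = np.full((10, 10), 291.0)    # "target" SST (K)
sst = nudge_surface(sst, sst_target, gamma, dt=3600.0)
```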
Abstract:
Changes in the nature of work and organizations have led to an increased need for self-directed career management (SDCM). However, there is no consensus in the literature on what constitutes SDCM, and many related concepts have been proposed. Integrating previous research across different conceptualizations of SDCM, the paper proposes four critical career resources that are essential for career development in the modern context: human capital resources, social resources, psychological resources, and identity resources. Implications of this framework for counselling practice are presented.
Abstract:
We investigate the transition from unitary to dissipative dynamics in the relativistic O(N) vector model with the λ(φ²)² interaction using the nonperturbative functional renormalization group in the real-time formalism. In thermal equilibrium, the theory is characterized by two scales, the interaction range for coherent scattering of particles and the mean free path determined by the rate of incoherent collisions with excitations in the thermal medium. Their competition determines the renormalization group flow and the effective dynamics of the model. Here we quantify the dynamic properties of the model in terms of the scale-dependent dynamic critical exponent z in the limit of large temperatures and in 2≤d≤4 spatial dimensions. We contrast our results to the behavior expected at vanishing temperature and address the question of the appropriate dynamic universality class for the given microscopic theory.
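For orientation, the dynamic critical exponent z quantifies how the relaxation time of long-wavelength fluctuations scales near criticality; the following is the textbook definition, not a formula quoted from the paper.

```latex
% Textbook definition (not quoted from the paper): near criticality the
% relaxation time of long-wavelength modes scales with the correlation
% length, and the dispersion of critical modes follows accordingly,
\tau \sim \xi^{z}, \qquad \omega(k) \sim k^{z} .
% Unitary relativistic dynamics corresponds to z = 1; purely dissipative
% (diffusive) relaxational dynamics corresponds to z = 2.
```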
Abstract:
This paper presents a kernel density correlation based non-rigid point set matching method and shows its application in statistical model based 2D/3D reconstruction of a scaled, patient-specific model from an uncalibrated x-ray radiograph. In this method, both the reference point set and the floating point set are first represented using kernel density estimates. A correlation measure between these two kernel density estimates is then optimized to find a displacement field such that the floating point set is moved to the reference point set. Regularizations based on the overall deformation energy and the motion smoothness energy are used to constrain the displacement field for a robust point set matching. Incorporating this non-rigid point set matching method into a statistical model based 2D/3D reconstruction framework, we can reconstruct a scaled, patient-specific model from noisy edge points that are extracted directly from the x-ray radiograph by an edge detector. Our experiments, conducted on datasets from two patients and six cadavers, demonstrate a mean reconstruction error of 1.9 mm.
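A minimal sketch of the core idea, kernel-density correlation between a reference and a displaced floating point set maximized by gradient ascent, is given below; the isotropic Gaussian kernel, the simple quadratic deformation penalty and all parameter values are illustrative simplifications of the paper's formulation.

```python
import numpy as np

# Minimal sketch of kernel-density-correlation point matching with a
# quadratic penalty on the displacement field; illustrative only.

def match(ref, floating, sigma=1.0, lam=0.1, lr=0.05, iters=200):
    """Move `floating` toward `ref` by gradient ascent on the Gaussian
    kernel correlation minus a deformation-energy penalty."""
    disp = np.zeros_like(floating)
    for _ in range(iters):
        mov = floating + disp
        diff = ref[:, None, :] - mov[None, :, :]        # (N_ref, N_mov, D)
        w = np.exp(-(diff ** 2).sum(-1) / (2 * sigma ** 2))
        grad_corr = (w[..., None] * diff).sum(0) / sigma ** 2
        disp += lr * (grad_corr - lam * disp)           # ascent step
    return floating + disp

ref = np.random.rand(80, 2)
recovered = match(ref, ref + 0.1)   # toy check: undo a small uniform shift
```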
Abstract:
Synaptic strength depresses for low and potentiates for high activation of the postsynaptic neuron. This feature is a key property of the Bienenstock–Cooper–Munro (BCM) synaptic learning rule, which has been shown to maximize the selectivity of the postsynaptic neuron, and thereby offers a possible explanation for experience-dependent cortical plasticity such as orientation selectivity. However, the BCM framework is rate-based and a significant amount of recent work has shown that synaptic plasticity also depends on the precise timing of presynaptic and postsynaptic spikes. Here we consider a triplet model of spike-timing–dependent plasticity (STDP) that depends on the interactions of three precisely timed spikes. Triplet STDP has been shown to describe plasticity experiments that the classical STDP rule, based on pairs of spikes, has failed to capture. In the case of rate-based patterns, we show a tight correspondence between the triplet STDP rule and the BCM rule. We analytically demonstrate the selectivity property of the triplet STDP rule for orthogonal inputs and perform numerical simulations for nonorthogonal inputs. Moreover, in contrast to BCM, we show that triplet STDP can also induce selectivity for input patterns consisting of higher-order spatiotemporal correlations, which exist in natural stimuli and have been measured in the brain. We show that this sensitivity to higher-order correlations can be used to develop direction and speed selectivity.
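A minimal sketch of a triplet STDP update in discrete time is given below, assuming all-to-all spike interactions; the time constants and amplitudes are indicative values of the kind fitted in the triplet-STDP literature, not results of this paper.

```python
import numpy as np

# Minimal sketch of a triplet STDP rule in discrete time for one synapse,
# assuming all-to-all spike interactions; parameter values are indicative
# literature-style values, not this paper's results.

def triplet_stdp(pre, post, dt=1e-3,
                 tau_plus=16.8e-3, tau_x=101e-3,
                 tau_minus=33.7e-3, tau_y=125e-3,
                 A2p=5e-10, A3p=6.2e-3, A2m=7e-3, A3m=2.3e-4):
    """pre, post: 0/1 spike trains sampled at dt. Returns the weight trace."""
    r1 = r2 = o1 = o2 = 0.0          # pre (r) and post (o) spike detectors
    w, trace = 0.5, []
    for x, y in zip(pre, post):
        if x:                        # pre spike: pair + triplet depression
            w -= o1 * (A2m + A3m * r2)
        if y:                        # post spike: pair + triplet potentiation
            w += r1 * (A2p + A3p * o2)
        # decay the detectors, then register this step's spikes
        r1 += -r1 / tau_plus * dt + x
        r2 += -r2 / tau_x * dt + x
        o1 += -o1 / tau_minus * dt + y
        o2 += -o2 / tau_y * dt + y
        trace.append(w)
    return np.array(trace)

rng = np.random.default_rng(0)
pre = (rng.random(10_000) < 10 * 1e-3).astype(int)    # ~10 Hz Poisson
post = (rng.random(10_000) < 20 * 1e-3).astype(int)   # ~20 Hz Poisson
w_trace = triplet_stdp(pre, post)
```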
Abstract:
Stimulation of human epileptic tissue can induce rhythmic, self-terminating responses on the EEG or ECoG. These responses play a potentially important role in localising tissue involved in the generation of seizure activity, yet the underlying mechanisms are unknown. However, in vitro evidence suggests that self-terminating oscillations in nervous tissue are underpinned by non-trivial spatio-temporal dynamics in an excitable medium. In this study, we investigate this hypothesis in spatial extensions to a neural mass model for epileptiform dynamics. We demonstrate that spatial extensions to this model in one and two dimensions display propagating travelling waves but also more complex transient dynamics in response to local perturbations. The neural mass formulation, with local excitatory and inhibitory circuits, allows the direct incorporation of spatially distributed, functional heterogeneities into the model. We show that such heterogeneities can lead to prolonged reverberating responses to a single pulse perturbation, depending upon the location at which the stimulus is delivered. This leads to the hypothesis that prolonged rhythmic responses to local stimulation in epileptogenic tissue result from repeated self-excitation of regions of tissue with diminished inhibitory capabilities. Combined with previous models of the dynamics of focal seizures, this macroscopic framework is a first step towards an explicit spatial formulation of the concept of the epileptogenic zone. Ultimately, an improved understanding of the pathophysiologic mechanisms of the epileptogenic zone will help to improve diagnostic and therapeutic measures for treating epilepsy.
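The following is a minimal sketch of such a spatial extension in one dimension, using generic Wilson-Cowan-style excitatory/inhibitory populations with diffusive excitatory coupling and a patch of weakened inhibition; the equations and parameters are illustrative, not the paper's epileptiform neural mass model.

```python
import numpy as np

# Minimal sketch of a 1D spatially extended neural mass: Wilson-Cowan-style
# excitatory (E) / inhibitory (I) populations, diffusive E-coupling, and a
# patch of weakened inhibition; illustrative, not the paper's model.

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def simulate(n=200, steps=4000, dt=0.1, d_e=0.5,
             w_ee=16.0, w_ei=12.0, w_ie=15.0, w_ii=3.0):
    E, I = np.zeros(n), np.zeros(n)
    lap = lambda u: np.roll(u, 1) - 2 * u + np.roll(u, -1)  # periodic ring
    E[n // 2] = 0.8                       # single local pulse perturbation
    w_ie_loc = np.full(n, w_ie)
    w_ie_loc[50:70] = 9.0                 # heterogeneity: weak inhibition
    for _ in range(steps):
        dE = -E + sigmoid(w_ee * E - w_ie_loc * I - 4.0) + d_e * lap(E)
        dI = -I + sigmoid(w_ei * E - w_ii * I - 3.7)
        E, I = E + dt * dE, I + dt * dI
    return E, I

E, I = simulate()   # responses can outlast the stimulus near the weak patch
```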
Abstract:
BACKGROUND: Drugs are routinely combined in anesthesia and pain management to obtain an enhancement of the desired effects. However, a parallel enhancement of the undesired effects might take place as well, resulting in limited therapeutic usefulness. Therefore, when addressing the question of optimal drug combinations, side effects must be taken into account. METHODS: By extending a previously published interaction model, the authors propose a method to study drug interactions that also considers their side effects. A general outcome parameter, identified as the patient's well-being, is defined by superposition of positive and negative effects. Well-being response surfaces are computed and analyzed for varying drug pharmacodynamics and interaction types. In particular, the existence of multiple maxima and of optimal drug combinations is investigated for the combination of two drugs. RESULTS: Both drug pharmacodynamics and interaction type affect the well-being surface and the resulting optimal combinations. The effect of the interaction parameters can be explained in terms of synergy and antagonism and remains unchanged for varying pharmacodynamics. In all simulations performed for the combination of two drugs, the presence of more than one maximum was never observed. CONCLUSIONS: The model is consistent with clinical knowledge and supports previously published experimental results on optimal drug combinations. This new framework improves understanding of the characteristics of drug combinations used in clinical practice and can be used in clinical research to identify optimal drug dosing.
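A minimal sketch of the well-being construction is given below, assuming Hill-type concentration-effect curves and a single Greco-style interaction parameter per effect; the authors' interaction model is richer, so treat this only as an illustration of superposing a desired and an undesired response surface.

```python
import numpy as np

# Minimal sketch of a well-being surface for two interacting drugs: one
# desired and one undesired Hill-type response surface are superposed.
# The single interaction parameter alpha is a Greco-style stand-in.

def surface_effect(ca, cb, alpha, ec50a=1.0, ec50b=1.0, gamma=2.0):
    """Combined effect in [0, 1); alpha > 0 synergy, alpha < 0 antagonism."""
    u = ca / ec50a + cb / ec50b + alpha * (ca / ec50a) * (cb / ec50b)
    return u ** gamma / (1.0 + u ** gamma)

ca, cb = np.meshgrid(np.linspace(0, 3, 61), np.linspace(0, 3, 61))
desired = surface_effect(ca, cb, alpha=1.0)                 # synergistic
side = surface_effect(ca, cb, alpha=0.0, ec50a=2.0, ec50b=1.5)
well_being = desired - side            # superposition of +/- effects
i, j = np.unravel_index(np.argmax(well_being), well_being.shape)
best_combo = (ca[i, j], cb[i, j])      # a single optimum, as in the paper
```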
Abstract:
Data visualization is the process of representing data as pictures to support reasoning about the underlying data. For the interpretation to be as easy as possible, we need to be as close as possible to the original data. As most visualization tools have an internal meta-model that is different from the one of the presented data, they usually need to duplicate the original data to conform to their meta-model. This leads to an increase in the resources needed, an increase which is not always justified. In this work we argue for an engine that is as close as possible to the data, and we present our solution of moving the visualization tool to the data, instead of moving the data to the visualization tool. Our solution also emphasizes the necessity of reusing basic blocks to express complex visualizations and of allowing programmers to script the visualization using their preferred tools, rather than a third-party format. As a validation of the expressiveness of our framework, we show how we express several already published visualizations and describe the pros and cons of the approach.
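A minimal sketch of the "move the tool to the data" idea follows: shapes hold references to live domain objects and compute their visual properties through user-supplied scripts, so nothing is copied into a tool-specific meta-model. All class and method names here are hypothetical, not the framework's actual API.

```python
from dataclasses import dataclass, field

# Minimal sketch of "moving the tool to the data": shapes keep references
# to live domain objects and derive visual properties via user scripts, so
# no data is duplicated into a tool meta-model. All names are hypothetical.

@dataclass
class Cls:                              # stand-in domain object
    name: str
    methods: list = field(default_factory=list)

class View:
    def __init__(self):
        self.layers = []
    def nodes(self, objects, width, height):
        # Metrics are closures evaluated on demand against the originals.
        self.layers.append([(o, width(o), height(o)) for o in objects])
        return self                     # chainable, composable basic block

model = [Cls("Parser", ["parse", "reset"]), Cls("Lexer", ["next"])]
view = View().nodes(model,
                    width=lambda c: 5 * len(c.methods),
                    height=lambda c: 10)
```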
Abstract:
We present a framework for statistical finite element analysis that combines shape and material properties and allows statistical statements about biomechanical performance to be made across a given population. In this paper, we focus on the design of orthopaedic implants that fit a maximum percentage of the target population, both in terms of geometry and biomechanical stability. CT scans of the bone under consideration are registered non-rigidly to obtain correspondences in position and intensity between them. A statistical model of shape and intensity (bone density) is computed by means of principal component analysis. Afterwards, finite element analysis (FEA) is performed to analyse the biomechanical performance of the bones. Realistic forces are applied to the bones, and the resulting displacement and bone stress distribution are calculated. The mechanical behaviour of different PCA bone instances is compared.
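A minimal sketch of building such a combined shape-and-intensity model is shown below, assuming registered corresponding points and resampled densities are already available; the array layouts and the SVD-based PCA are illustrative.

```python
import numpy as np

# Minimal sketch of a combined statistical shape-and-intensity model:
# each training bone contributes one vector of registered point
# coordinates plus sampled densities; PCA yields joint variation modes.

def build_ssim(shapes, intensities, n_modes=5):
    """shapes: (n, p, 3) corresponding points; intensities: (n, v)."""
    X = np.hstack([shapes.reshape(len(shapes), -1), intensities])
    mean = X.mean(axis=0)
    _, s, Vt = np.linalg.svd(X - mean, full_matrices=False)
    modes = Vt[:n_modes]                      # principal components
    var = (s ** 2 / (len(X) - 1))[:n_modes]   # variance per mode
    return mean, modes, var

def sample_instance(mean, modes, var, b):
    """New instance from standardized mode weights b (e.g. within +/-3 sd);
    the result can be meshed and passed to an FEA solver."""
    return mean + (b * np.sqrt(var)) @ modes
```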
Abstract:
Drug-induced respiratory depression is a common side effect of the agents used in anesthesia practice to provide analgesia and sedation. Depression of the ventilatory drive in the spontaneously breathing patient can lead to severe cardiorespiratory events and it is considered a primary cause of morbidity. Reliable predictions of respiratory inhibition in the clinical setting would therefore provide a valuable means to improve the safety of drug delivery. Although multiple studies have investigated the regulation of breathing in man both in the presence and absence of ventilatory depressant drugs, a unified description of respiratory pharmacodynamics is not available. This study proposes a mathematical model of human metabolism and cardiorespiratory regulation integrating several isolated physiological and pharmacological aspects of acute drug-induced ventilatory depression into a single theoretical framework. The description of respiratory regulation has a parsimonious yet comprehensive structure with substantial predictive capability. Simulations of the synergistic interaction of the hypercarbic and hypoxic respiratory drive and the global effect of drugs on the control of breathing are in good agreement with published experimental data. Besides providing clinically relevant predictions of respiratory depression, the model can also serve as a test bed to investigate issues of drug tolerability and dose finding/control under non-steady-state conditions.
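A minimal sketch of the kind of controller being integrated is shown below, assuming a classic multiplicative hypercarbic-hypoxic drive and a Hill-type drug-depression term; the functional forms and parameter values are illustrative, not the model's actual equations.

```python
# Minimal sketch of drug-depressed ventilatory drive, assuming a classic
# multiplicative hypercarbic-hypoxic controller and a Hill-type drug term;
# forms and numbers are illustrative, not the model's actual equations.

def ventilation(p_co2, p_o2, c_drug, g=2.0, b=36.0, a=28.0, c=32.0,
                ec50=1.5, gamma=2.5):
    """Minute ventilation (L/min) from arterial PCO2/PO2 (mmHg) and a
    normalized drug concentration."""
    hypoxic_gain = 1.0 + a / max(p_o2 - c, 1e-6)   # grows as PO2 falls
    drive = g * hypoxic_gain * (p_co2 - b)          # CO2 x O2 synergy
    depression = 1.0 - c_drug ** gamma / (ec50 ** gamma + c_drug ** gamma)
    return max(drive * depression, 0.0)

v_awake = ventilation(40.0, 100.0, 0.0)   # ~11 L/min with these numbers
v_drug = ventilation(40.0, 100.0, 1.5)    # at EC50 the drive is halved
```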
Abstract:
Vitamin C (L-ascorbic acid) is an essential micronutrient that serves as an antioxidant and as a cofactor in many enzymatic reactions. Intestinal absorption and renal reabsorption of the vitamin is mediated by the epithelial apical L-ascorbic acid cotransporter SVCT1 (SLC23A1). We explored the molecular mechanisms of SVCT1-mediated L-ascorbic acid transport using radiotracer and voltage-clamp techniques in RNA-injected Xenopus oocytes. L-ascorbic acid transport was saturable (K0.5 ≈ 70 µM), temperature dependent (Q10 ≈ 5), and energized by the Na⁺ electrochemical potential gradient. We obtained a Na⁺:L-ascorbic acid coupling ratio of 2:1 from simultaneous measurement of currents and fluxes. L-ascorbic acid and Na⁺ saturation kinetics as a function of cosubstrate concentrations revealed a simultaneous transport mechanism in which binding is ordered Na⁺, L-ascorbic acid, Na⁺. In the absence of L-ascorbic acid, SVCT1 mediated pre-steady-state currents that decayed with time constants of 3-15 ms. Transients were described by single Boltzmann distributions. At 100 mM Na⁺, maximal charge translocation (Qmax) was ≈ 25 nC, around a midpoint (V0.5) of -9 mV, and with an apparent valence of ≈ -1. Qmax was conserved upon progressive removal of Na⁺, whereas V0.5 shifted to more hyperpolarized potentials. Model simulation predicted that the pre-steady-state current predominantly results from an ion-well effect on binding of the first Na⁺ partway within the membrane electric field. We present a transport model for SVCT1 that will provide a framework for investigating the impact of specific mutations and polymorphisms in SLC23A1 and help us better understand the contribution of SVCT1 to vitamin C metabolism in health and disease.
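The single-Boltzmann description of the charge movement can be written down directly; the sketch below uses the fitted values quoted above, while the sign convention for the apparent valence is our assumption, not the paper's fitting code.

```python
import math

# Minimal sketch of the single-Boltzmann description of pre-steady-state
# charge movement, using the fitted values quoted in the abstract.

F, R, T = 96485.0, 8.314, 295.0            # C/mol, J/(mol K), ~22 degC

def boltzmann_charge(v_mV, q_max=25e-9, v_half=-9.0, z_app=-1.0):
    """Charge moved (C) at membrane potential v_mV; saturates at q_max."""
    u = z_app * F * (v_mV - v_half) * 1e-3 / (R * T)
    return q_max / (1.0 + math.exp(-u))

q_hyperpol = boltzmann_charge(-150.0)      # approaches Q_max ~ 25 nC
q_depol = boltzmann_charge(+50.0)          # falls toward zero
```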
Abstract:
The present distribution of freshwater fish in the Alpine region has been strongly affected by colonization events occurring after the last glacial maximum (LGM), some 20,000 years ago. We use here a spatially explicit simulation framework to model and better understand their colonization dynamics in the Swiss Rhine basin. This approach is applied to the European bullhead (Cottus gobio), which is an ideal model organism for studying past demographic processes in fish since it has not been managed by humans. The molecular diversity of eight sampled populations is simulated and compared to observed data at six microsatellite loci under an approximate Bayesian computation framework to estimate the parameters of the colonization process. Our demographic estimates fit well with current knowledge about the biology of this species, but they suggest that the Swiss Rhine basin was colonized very recently, after the Younger Dryas, some 6600 years ago. We discuss the implications of this result, as well as the strengths and limits of the spatially explicit approach coupled to the approximate Bayesian computation framework.
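A minimal sketch of the rejection-sampling ABC loop underlying such parameter estimation is shown below; the spatially explicit colonization simulator and the microsatellite summary statistics are replaced by caller-supplied placeholders.

```python
import numpy as np

# Minimal sketch of rejection-sampling ABC; the spatially explicit
# colonization simulator and the microsatellite summary statistics are
# replaced by caller-supplied placeholders.

def abc_rejection(observed_stats, simulate, prior_draw,
                  n_sims=100_000, tol=0.02):
    accepted = []
    for _ in range(n_sims):
        theta = prior_draw()                 # e.g. founders, migration rate
        stats = simulate(theta)              # summary stats of simulated data
        if np.linalg.norm(stats - observed_stats) < tol:
            accepted.append(theta)
    return np.array(accepted)                # approximate posterior sample

# Toy usage with a stand-in one-parameter simulator:
rng = np.random.default_rng(1)
posterior = abc_rejection(
    observed_stats=np.array([0.6]),
    simulate=lambda th: np.array([rng.normal(th, 0.05)]),
    prior_draw=lambda: rng.uniform(0.0, 1.0),
    n_sims=20_000)
```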
Evolutionary demography of long-lived monocarpic perennials: a time-lagged integral projection model
Abstract:
1. The evolution of flowering strategies (when and at what size to flower) in monocarpic perennials is determined by balancing current reproduction against expected future reproduction, both of which are largely determined by size-specific patterns of growth and survival. However, because of the difficulty of following long-lived individuals throughout their lives, this theory has largely been tested using short-lived species (< 5 years).
2. Here, we tested this theory using the long-lived monocarpic perennial Campanula thyrsoides, which can live up to 16 years. We used a novel approach that combined permanent plot and herb chronology data from a 3-year field study to parameterize and validate integral projection models (IPMs).
3. As in other monocarpic species, the rosette leaves of C. thyrsoides wither over winter, and so size cannot be measured in the year of flowering. We therefore extended the existing IPM framework to incorporate an additional time delay that arises because flowering demography must be predicted from rosette size in the year before flowering.
4. We found that all main demographic functions (growth, survival probability, flowering probability and fecundity) were strongly size-dependent, and there was a pronounced threshold size of flowering. There was good agreement between the predicted distribution of flowering ages obtained from the IPMs and that estimated in the field. Mostly, there was good agreement between the IPM predictions and the direct quantitative field measurements of the demographic parameters λ, R₀ and T. We therefore conclude that the model captures the main demographic features of the field populations.
5. Elasticity analysis indicated that changes in the survival and growth function had the largest effect (c. 80%) on λ, considerably larger than in short-lived monocarps. We found only weak selection pressure operating on the observed flowering strategy, which was close to the predicted evolutionarily stable strategy.
6. Synthesis. The extended IPM accurately described the demography of a long-lived monocarpic perennial using data collected over a relatively short period. We show that the evolution of flowering strategies in short- and long-lived monocarps seems to follow the same general rules, but with a longevity-related emphasis on survival over fecundity.
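For orientation, a size-structured IPM iterates the population density through a kernel assembled from exactly the demographic functions listed in point 4; the form below is the textbook one, with the paper's time lag described in the comment rather than reproduced exactly.

```latex
% Textbook size-structured IPM (not the paper's exact formulation):
% survival/growth of non-flowering plants plus recruitment from
% flowering plants map this year's size distribution to next year's,
n(y,\,t+1) \;=\; \int_\Omega \Big[\, s(x)\,\big(1 - p_f(x)\big)\,
g(y \mid x) \;+\; p_f(x)\, f(x)\, c_0(y) \,\Big]\, n(x,\,t)\,\mathrm{d}x ,
% with s survival, p_f flowering probability, g growth, f fecundity and
% c_0 the recruit-size distribution. The time-lagged extension evaluates
% p_f and f at the rosette size of the year before flowering, since the
% leaves wither before flowering size can be measured.
```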