24 results for Process models

in BORIS: Bern Open Repository and Information System - Bern - Switzerland


Relevance:

100.00%

Publisher:

Abstract:

Several methods based on Kriging have recently been proposed for calculating a probability of failure involving costly-to-evaluate functions. A closely related problem is to estimate the set of inputs leading to a response exceeding a given threshold. Now, estimating such a level set—and not solely its volume—and quantifying uncertainties on it are not straightforward. Here we use notions from random set theory to obtain an estimate of the level set, together with a quantification of estimation uncertainty. We give explicit formulae in the Gaussian process set-up and provide a consistency result. We then illustrate how space-filling versus adaptive design strategies may sequentially reduce level set estimation uncertainty.
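
As a rough illustration of the plug-in side of this idea (not the random-set machinery itself), the sketch below fits a one-dimensional Gaussian process to toy data and marks as the estimated level set all points whose posterior probability of exceeding the threshold is at least 0.5; the kernel, length scale, threshold, and data are all hypothetical.

```python
import numpy as np
from math import erf, sqrt

def rbf(a, b, ls=0.3):
    # Squared-exponential kernel with unit prior variance (assumed).
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ls) ** 2)

# Toy observations of a costly-to-evaluate function (hypothetical stand-in).
X = np.array([0.0, 0.2, 0.4, 0.6, 0.8, 1.0])
y = np.sin(6 * X)

K = rbf(X, X) + 1e-8 * np.eye(len(X))       # jitter for numerical stability
grid = np.linspace(0.0, 1.0, 201)
Ks = rbf(grid, X)

alpha = np.linalg.solve(K, y)
mean = Ks @ alpha                            # posterior mean on the grid
V = np.linalg.solve(K, Ks.T)
sd = np.sqrt(np.maximum(1.0 - np.sum(Ks.T * V, axis=0), 1e-12))

T = 0.5                                      # excursion threshold
# Pointwise posterior probability that f(x) exceeds T.
p_exc = np.array([0.5 * (1.0 + erf(z / sqrt(2.0))) for z in (mean - T) / sd])
level_set = grid[p_exc >= 0.5]               # plug-in level-set estimate
```

Sequential designs would then add evaluations where `p_exc` is close to 0.5, i.e. where the set membership is most uncertain.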

Relevance:

100.00%

Publisher:

Abstract:

Sound knowledge of the spatial and temporal patterns of rockfalls is fundamental for the management of this very common hazard in mountain environments. Process-based, three-dimensional simulation models are nowadays capable of reproducing the spatial distribution of rockfall occurrences with reasonable accuracy through the simulation of numerous individual trajectories on highly resolved digital terrain models. At the same time, however, simulation models typically fail to quantify the ‘real’ frequency of rockfalls (in terms of return intervals). The analysis of impact scars on trees, in contrast, yields real rockfall frequencies, but trees may not be present at the location of interest and rare trajectories may not necessarily be captured due to the limited age of forest stands. In this article, we demonstrate that coupling modeling with tree-ring techniques may overcome the limitations inherent in both approaches. Based on the analysis of 64 cells (40 m × 40 m) of a rockfall slope located above a 1631-m long road section in the Swiss Alps, we illustrate results from 488 rockfalls detected in 1260 trees. We show that tree impact data can not only be used (i) to reconstruct the real frequency of rockfalls for individual cells, but also serve (ii) to calibrate the rockfall model Rockyfor3D, and (iii) to transform simulated trajectories into real frequencies. Calibrated simulation results are in good agreement with real rockfall frequencies and exhibit significant differences in rockfall activity between the cells (zones) along the road section. Real frequencies, expressed as rock passages per meter of road section, also enable quantification and direct comparison of the hazard potential between the zones.
The contribution provides an approach for hazard zoning procedures that complements traditional methods with a quantification of rockfall frequencies in terms of return intervals through a systematic inclusion of impact records in trees.
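
The transformation of simulated trajectory counts into real frequencies reduces, in essence, to a scaling step: tree-ring records fix the number of events per year in the calibration cells, and that rate rescales the unitless simulated passage counts. The numbers below are purely hypothetical placeholders, not values from the study.

```python
# Hypothetical calibration numbers (NOT the study's values).
observed_rockfalls = 488        # events reconstructed from tree-ring records
observation_years = 100         # assumed age of the forest stand
sim_passages_calib = 250_000    # simulated trajectories crossing the calibration cells

# Events per year represented by one simulated trajectory.
events_per_trajectory = observed_rockfalls / observation_years / sim_passages_calib

# Rescale simulated trajectories crossing the road into a real frequency,
# expressed per meter of road section per year.
sim_passages_road = 40_000
road_length_m = 1631
passages_per_m_per_year = sim_passages_road * events_per_trajectory / road_length_m
```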

Relevance:

100.00%

Publisher:

Abstract:

We present a novel surrogate model-based global optimization framework allowing a large number of function evaluations. The method, called SpLEGO, is based on a multi-scale expected improvement (EI) framework relying on both sparse and local Gaussian process (GP) models. First, a bi-objective approach relying on a global sparse GP model is used to determine potential next sampling regions. Local GP models are then constructed within each selected region. The method subsequently employs the standard expected improvement criterion to deal with the exploration-exploitation trade-off within the selected local models, leading to a decision on where to perform the next function evaluation(s). The potential of our approach is demonstrated using the so-called Sparse Pseudo-input GP as the global model. The algorithm is tested on four benchmark problems, whose number of starting points ranges from 10² to 10⁴. Our results show that SpLEGO is effective and capable of solving problems with a large number of starting points, and it even provides significant advantages when compared with state-of-the-art EI algorithms.
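
The standard expected improvement criterion used inside each local model has a closed form under a Gaussian posterior. A minimal, self-contained version for minimization (independent of SpLEGO's sparse-GP machinery) might look like:

```python
from math import erf, exp, pi, sqrt

def norm_pdf(z):
    return exp(-0.5 * z * z) / sqrt(2.0 * pi)

def norm_cdf(z):
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def expected_improvement(mu, sigma, f_best):
    """EI at a candidate point for minimization, given the GP posterior
    mean mu, posterior standard deviation sigma, and the current best
    observed value f_best."""
    if sigma <= 0.0:
        return max(f_best - mu, 0.0)   # degenerate (noise-free, known) case
    z = (f_best - mu) / sigma
    return (f_best - mu) * norm_cdf(z) + sigma * norm_pdf(z)
```

The next evaluation is placed at the candidate maximizing this quantity, which balances low predicted mean (exploitation) against high predictive uncertainty (exploration).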

Relevance:

60.00%

Publisher:

Abstract:

Perennial snow and ice (PSI) extent is an important parameter of mountain environments with regard to its involvement in the hydrological cycle and the surface energy budget. We investigated interannual variations of PSI in nine mountain regions of interest (ROI) between 2000 and 2008. For that purpose, a novel MODIS data set processed at the Canada Centre for Remote Sensing at 250 m spatial resolution was utilized. The extent of PSI exhibited significant interannual variations, with coefficients of variation ranging from 5% to 81% depending on the ROI. A strong negative relationship was found between PSI and positive degree-days (threshold 0°C) during the summer months in most ROIs, with linear correlation coefficients (r) being as low as r = −0.90. In the European Alps and Scandinavia, PSI extent was significantly correlated with annual net glacier mass balances, with r = 0.91 and r = 0.85, respectively, suggesting that MODIS-derived PSI extent may be used as an indicator of net glacier mass balances. Validation of PSI extent in two land surface classifications for the years 2000 and 2005, GLC-2000 and Globcover, revealed significant discrepancies of up to 129% for both classifications. With regard to the importance of such classifications for land surface parameterizations in climate and land surface process models, this is a potential source of error to be investigated in future studies. The results presented here provide an interesting insight into variations of PSI in several ROIs and are instrumental for our understanding of sensitive mountain regions in the context of global climate change assessment.
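
The two statistics carrying most of the weight in this analysis, the coefficient of variation of PSI extent and the Pearson correlation with degree-days, are straightforward to compute; the sketch below uses invented numbers for nine hypothetical ROIs, not the MODIS data.

```python
import numpy as np

# Invented yearly PSI extents (km²) and summer positive degree-days for
# nine hypothetical regions; constructed to be negatively related.
psi_extent = np.array([120., 95., 140., 80., 110., 70., 130., 100., 90.])
degree_days = np.array([310., 420., 250., 480., 350., 520., 280., 390., 450.])

cv = psi_extent.std(ddof=1) / psi_extent.mean() * 100  # coefficient of variation, %
r = np.corrcoef(psi_extent, degree_days)[0, 1]         # Pearson correlation
```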

Relevance:

60.00%

Publisher:

Abstract:

Background: This study addressed the temporal properties of personality disorders and their treatment by schema-centered group psychotherapy. It investigated the change mechanisms of psychotherapy using a novel method by which psychotherapy can be modeled explicitly in the temporal domain. Methodology and Findings: 69 patients were assigned to a specific schema-centered behavioral group psychotherapy, and 26 to social skills training as a control condition. The largest diagnostic subgroups were narcissistic and borderline personality disorder. Both treatments offered 30 group sessions of 100 min duration each, at a frequency of two sessions per week. The therapy process was described by components resulting from a principal component analysis of patients' session reports obtained after each session. These patient-assessed components were Clarification, Bond, Rejection, and Emotional Activation. The statistical approach focused on time-lagged associations of components using time-series panel analysis. This method provided a detailed quantitative representation of the therapy process. It was found that Clarification played a core role in schema-centered psychotherapy, reducing Rejection and regulating the emotion of patients. This was also a change mechanism linked to therapy outcome. Conclusions/Significance: The introduced process-oriented methodology allowed us to highlight the mechanisms by which psychotherapeutic treatment became effective. Additionally, process models depicted the actual patterns that differentiated specific diagnostic subgroups. Time-series analysis explores Granger causality, a non-experimental approximation of causality based on temporal sequences. This methodology, resting upon naturalistic data, can explicate mechanisms of action in psychotherapy research and illustrate the temporal patterns underlying personality disorders.
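
Time-lagged associations of the kind used here can be illustrated with an ordinary lag-1 vector autoregression estimated by least squares; the simulated "Clarification" and "Rejection" series below are invented for the sketch and merely stand in for the session-report components.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
clar = np.zeros(n)
rej = np.zeros(n)
for t in range(1, n):
    # Invented data-generating process: Clarification at session t-1
    # reduces Rejection at session t (cross-lagged effect of -0.3).
    clar[t] = 0.5 * clar[t - 1] + rng.normal(scale=0.5)
    rej[t] = 0.4 * rej[t - 1] - 0.3 * clar[t - 1] + rng.normal(scale=0.5)

# Lag-1 VAR: regress both series at t on both series at t-1.
Y = np.column_stack([clar[1:], rej[1:]])
X = np.column_stack([np.ones(n - 1), clar[:-1], rej[:-1]])
B, *_ = np.linalg.lstsq(X, Y, rcond=None)
# B[1, 1] estimates the lagged effect of Clarification on Rejection;
# a non-zero value is the Granger-style evidence of a directed association.
```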

Relevance:

60.00%

Publisher:

Abstract:

OBJECTIVE Intense alcohol consumption is a risk factor for a number of health problems. Dual-process models assume that self-regulatory behavior such as drinking alcohol is guided by both reflective and impulsive processes. Evidence suggests that (a) impulsive processes such as implicit attitudes are more strongly associated with behavior when executive functioning abilities are low, and (b) higher neural baseline activation in the lateral prefrontal cortex (PFC) is associated with better inhibitory control. The present study integrates these 2 strands of research to investigate how individual differences in neural baseline activation in the lateral PFC moderate the association between implicit alcohol attitudes and drinking behavior. METHOD Baseline cortical activation was measured with resting electroencephalography (EEG) in 89 moderate drinkers. In a subsequent behavioral testing session they completed measures of implicit alcohol attitudes and self-reported drinking behavior. RESULTS Implicit alcohol attitudes were related to self-reported alcohol consumption. Most centrally, implicit alcohol attitudes were more strongly associated with drinking behavior in individuals with low as compared with high baseline activation in the right lateral PFC. CONCLUSIONS These findings are in line with predictions made on the basis of dual-process models. They provide further evidence that individual differences in neural baseline activation in the right lateral PFC may contribute to executive functioning abilities such as inhibitory control. Moreover, individuals with strongly positive implicit alcohol attitudes coupled with a low baseline activation in the right lateral PFC may be at greater risk of developing unhealthy drinking patterns than others.
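
The moderation effect reported here corresponds statistically to an interaction term in a regression of drinking behavior on implicit attitude, baseline activation, and their product. A toy version with simulated data (all variables and effect sizes are invented, not the study's):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 89                                         # matching the sample size
attitude = rng.normal(size=n)                  # implicit alcohol attitude (invented)
pfc = rng.normal(size=n)                       # right lateral PFC baseline activation (invented)
# Negative interaction: attitudes drive drinking more when activation is low.
drinking = 0.4 * attitude - 0.3 * attitude * pfc + rng.normal(scale=0.5, size=n)

X = np.column_stack([np.ones(n), attitude, pfc, attitude * pfc])
b, *_ = np.linalg.lstsq(X, drinking, rcond=None)
# b[3] is the interaction coefficient; b[3] < 0 means the attitude-drinking
# association weakens as baseline PFC activation increases.
```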

Relevance:

30.00%

Publisher:

Abstract:

Cross-cultural comparisons may increase our understanding of different models of substance use treatment and help identify consistent associations between patients' characteristics, treatment conditions, and outcomes.

Relevance:

30.00%

Publisher:

Abstract:

Development of novel implants in orthopaedic trauma surgery is based on limited datasets from cadaver trials or artificial bone models. A method has been developed whereby implants can be constructed in an evidence-based manner, founded on a large anatomic database consisting of more than 2,000 datasets of bones extracted from CT scans. The aim of this study was the development and clinical application of an anatomically pre-contoured plate for the treatment of distal fibular fractures based on this anatomical database. 48 Caucasian and Asian bone models (left and right) from the database were used for the preliminary optimization process and validation of the fibula plate. The implant was constructed to fit bilaterally in a lateral position on the fibula. A biomechanical comparison of the designed implant to the current gold standard in the treatment of distal fibular fractures (locking 1/3 tubular plate) was then conducted. Finally, a clinical surveillance study was performed to evaluate the grade of implant fit achieved. The results showed that with a virtual anatomic database it was possible to design a fibula plate with an optimized fit for a large proportion of the population. Biomechanical testing showed the novel fibula plate to be superior to 1/3 tubular plates in 4-point bending tests. The clinical application showed a very high degree of primary implant fit; only in a small minority of cases was further intra-operative implant bending necessary. Therefore, the goal of developing an implant for the treatment of distal fibular fractures based on the evidence of a large anatomical database was attained. Biomechanical testing showed good results regarding stability, and the clinical application confirmed the high grade of anatomical fit.

Relevance:

30.00%

Publisher:

Abstract:

Neurons generate spikes reliably with millisecond precision if driven by a fluctuating current; is it then possible to predict the spike timing knowing the input? We determined the parameters of an adapting threshold model using data recorded in vitro from 24 layer-5 pyramidal neurons from rat somatosensory cortex, stimulated intracellularly by a fluctuating current simulating synaptic bombardment in vivo. The model generates output spikes whenever the membrane voltage (a filtered version of the input current) reaches a dynamic threshold. We find that for input currents with large fluctuation amplitude, up to 75% of the spike times can be predicted with a precision of ±2 ms. Some of the intrinsic neuronal unreliability can be accounted for by a noisy threshold mechanism. Our results suggest that, under random current injection into the soma, (i) neuronal behavior in the subthreshold regime can be well approximated by a simple linear filter; and (ii) most of the nonlinearities are captured by a simple threshold process.
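
A minimal variant of such an adapting-threshold model can be written in a few lines: the membrane voltage is a leaky (linear) filter of the injected current, and a spike is emitted whenever it crosses a threshold that jumps after each spike and relaxes back to baseline. All constants below are arbitrary illustrations, not the fitted parameters from the recordings.

```python
import numpy as np

rng = np.random.default_rng(1)
dt, n_steps = 1.0, 1000                        # ms; 1 s of simulated input
current = rng.normal(0.0, 1.0, n_steps)        # fluctuating injected current

tau_m, tau_th = 10.0, 50.0                     # membrane / threshold time constants (ms)
theta0, jump = 0.8, 1.0                        # baseline threshold and post-spike jump
v, theta = 0.0, theta0
spikes = []

for t in range(n_steps):
    v += dt / tau_m * (-v + 3.0 * current[t])  # leaky linear filter of the input
    theta += dt / tau_th * (theta0 - theta)    # threshold relaxes back to baseline
    if v >= theta:
        spikes.append(t)                       # predicted spike time (ms)
        theta += jump                          # adaptation: threshold jumps up
```

Fitting such a model then amounts to choosing the filter and threshold parameters so that these predicted spike times align with the recorded ones.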

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Cyclic recruitment during mechanical ventilation contributes to ventilator-associated lung injury. Two different pathomechanisms in acute respiratory distress syndrome (ARDS) are currently discussed: alveolar collapse vs persistent flooding of small airways and alveoli. We compare two different ARDS animal models by computed tomography (CT) to describe different recruitment and derecruitment mechanisms at different airway pressures: (i) lavage-ARDS, favouring alveolar collapse by surfactant depletion; and (ii) oleic acid ARDS, favouring alveolar flooding by capillary leakage. METHODS: In 12 pigs [25 (1) kg], ARDS was randomly induced, either by saline lung lavage or by oleic acid (OA) injection; 3 animals served as controls. A respiratory breathhold manoeuvre without spontaneous breathing was applied at different continuous positive airway pressures (CPAP) in random order (CPAP levels of 5, 10, 15, 30, 35 and 50 cm H2O), and spiral-CT scans of the total lung were acquired at each CPAP level (slice thickness = 1 mm). In each spiral-CT the volumes of total lung parenchyma, tissue, gas, and non-aerated, well-aerated, poorly aerated, and over-aerated lung were calculated. RESULTS: In both ARDS models non-aerated lung volume decreased significantly from CPAP 5 to CPAP 50 [oleic acid lung injury (OAI): 346.9 (80.1) to 96.4 (48.8) ml, P<0.001; lavage-ARDS: 245 (17.6) to 42.7 (4.8) ml, P<0.001]. In lavage-ARDS poorly aerated lung volume decreased at higher CPAP levels [232 (45.2) ml at CPAP 10 to 84 (19.4) ml at CPAP 50, P<0.001], whereas in OAI poorly aerated lung volume did not vary at different airway pressures. CONCLUSIONS: In both ARDS models well-aerated and non-aerated lung volume respond to different CPAP levels in a comparable fashion; thus, cyclical alveolar collapse seems to be part of the derecruitment process also in OA-ARDS. In OA-ARDS, the increase in poorly aerated lung volume reflects the specific initial lesion, that is, capillary leakage with interstitial and alveolar oedema.
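
The aeration compartments in quantitative CT are conventionally defined by Hounsfield-unit ranges; the thresholds below are the commonly used ones and are assumed here, since the abstract does not state which cut-offs were applied.

```python
import numpy as np

def classify_aeration(hu):
    """Count lung voxels per aeration compartment from Hounsfield units.
    Thresholds are the conventional quantitative-CT cut-offs (assumed;
    the study's exact values may differ)."""
    hu = np.asarray(hu)
    return {
        "over_aerated":   int(np.sum((hu >= -1000) & (hu < -900))),
        "well_aerated":   int(np.sum((hu >= -900) & (hu < -500))),
        "poorly_aerated": int(np.sum((hu >= -500) & (hu < -100))),
        "non_aerated":    int(np.sum((hu >= -100) & (hu <= 100))),
    }

counts = classify_aeration([-950, -700, -300, 0])
```

Multiplying each voxel count by the voxel volume then yields the compartment volumes compared across CPAP levels in the abstract.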

Relevance:

30.00%

Publisher:

Abstract:

Monte Carlo (code GEANT) produced 6 and 15 MV phase space (PS) data were used to define several simple photon beam models. For creating the PS data, the energy of the starting electrons hitting the target was tuned to obtain correct depth dose data compared with measurements. The modeling process used the full PS information within the geometrical boundaries of the beam, including all scattered radiation of the accelerator head; scattered radiation outside the boundaries was neglected. Photons and electrons were assumed to be radiated from point sources. Four different models were investigated, which involved different ways to determine the energies and locations of beam particles in the output plane. Depth dose curves, profiles, and relative output factors were calculated with these models for six field sizes from 5×5 to 40×40 cm² and compared to measurements. Model 1 uses a photon energy spectrum independent of location in the PS plane and a constant photon fluence in this plane. Model 2 takes into account the spatial particle fluence distribution in the PS plane. A constant fluence is used again in model 3, but the photon energy spectrum depends upon the off-axis position. Model 4, finally, uses the spatial particle fluence distribution and off-axis-dependent photon energy spectra in the PS plane. Depth dose curves and profiles for field sizes up to 10×10 cm² were not model sensitive. Good agreement between measured and calculated depth dose curves and profiles for all field sizes was reached for model 4. However, increasing deviations were found for increasing field sizes for models 1-3. Large deviations resulted for the profiles of models 2 and 3, due to the fact that these models overestimate and underestimate the energy fluence at large off-axis distances. Relative output factors consistent with measurements resulted only for model 4.
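
The difference between the four models is simply which quantities are allowed to vary across the phase-space plane. Model 1, the simplest, can be sketched as sampling a position uniformly within the field and an energy from a single, position-independent spectrum; the spectrum and field size below are invented for the sketch.

```python
import random

random.seed(0)

def sample_photon_model1(field_cm, spectrum):
    """Model-1-style sampling: uniform fluence over the field and a photon
    energy drawn from one spectrum, independent of off-axis position.
    (`spectrum` is a hypothetical list of (energy_MeV, weight) pairs.)"""
    x = random.uniform(-field_cm / 2.0, field_cm / 2.0)
    y = random.uniform(-field_cm / 2.0, field_cm / 2.0)
    energies, weights = zip(*spectrum)
    e = random.choices(energies, weights=weights, k=1)[0]
    return x, y, e

# Invented 6 MV-like spectrum: mean energy well below the nominal 6 MeV.
spectrum = [(0.5, 0.2), (1.0, 0.35), (2.0, 0.3), (4.0, 0.1), (6.0, 0.05)]
photons = [sample_photon_model1(10.0, spectrum) for _ in range(1000)]
```

Models 2-4 would replace the uniform position and/or the single spectrum with a measured fluence map and off-axis-dependent spectra, respectively.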

Relevance:

30.00%

Publisher:

Abstract:

This review deals with an important aspect of organ transplantation, namely the process of psychic organ integration and organ-related fantasies. The body schema and the body self are two important concepts in the integration of a transplanted organ. Different models and theories of organ integration are presented and discussed. There is evidence that, beside the emotional impact and the influence on well-being, organ integration depends closely on the psychic processes involved in the incorporation of the transplanted organ and the respective organ-related fantasies. These organ fantasies, whether unconscious or conscious, may therefore play an important role in the future development of the instinctive and highly individual relation that patients elaborate with the new organ. Beside the concern with the new organ, grief over the lost old and diseased organ may also influence the patient's thoughts. Moreover, successfully resolving all these issues fosters the "good practice" patients develop towards the new situation, which bears on issues such as compliance, infections, rejection episodes and, most importantly, organ survival.

Relevance:

30.00%

Publisher:

Abstract:

Virtual machines emulating hardware devices are generally implemented in low-level languages and in a low-level style for performance reasons. This trend results in systems that are largely difficult to understand, difficult to extend, and hard to maintain. As new general techniques for virtual machines arise, it becomes harder to incorporate or test these techniques because of early design and optimization decisions. In this paper we show how such decisions can be postponed to later phases by separating virtual machine implementation issues from the high-level machine-specific model. We construct compact models of whole-system VMs in a high-level language, which exclude all low-level implementation details. We use the pluggable translation toolchain PyPy to translate those models to executables. During the translation process, the toolchain reintroduces the VM implementation and optimization details for specific target platforms. As a case study we implement an executable model of a hardware gaming device. We show that our approach to VM building increases understandability, maintainability and extensibility while preserving performance.

Relevance:

30.00%

Publisher:

Abstract:

Systems must co-evolve with their context. Reverse engineering tools are a great help in this process of required adaptation. In order for these tools to be flexible, they work with models, abstract representations of the source code. The extraction of such information from source code can be done using a parser. However, building new parsers is fairly tedious, and this is made worse by the fact that it has to be done over and over again for every language we want to analyze. In this paper we propose a novel approach which minimizes the knowledge of a language required to extract models from code implemented in that language, by reflecting on the implementation of pre-parsed ASTs provided by an IDE. In a second phase we use a technique referred to as Model Mapping by Example to map platform-dependent models onto domain-specific models.