57 results for Probabilities.
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
Determination of future risk of exacerbations is a key issue in the management of asthma. We previously developed a method to calculate conditional probabilities (π) of future decreases in lung function by using the daily fluctuations in peak expiratory flow (PEF).
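A minimal sketch of the kind of calculation involved: estimating, from a daily PEF record, the empirical probability of a clinically relevant fall in PEF conditional on a large preceding day-to-day fluctuation. The thresholds, horizon, and function name below are illustrative assumptions, not the published definition of π.

    import numpy as np

    def cond_drop_prob(pef, drop_frac=0.15, fluct_frac=0.10, horizon=1):
        """Empirical P(PEF falls by more than drop_frac within `horizon` days,
        given that the preceding day-to-day fluctuation exceeded fluct_frac).
        Thresholds are illustrative, not the published definition of pi."""
        pef = np.asarray(pef, dtype=float)
        fluct = np.abs(np.diff(pef)) / pef[:-1]                              # fluctuation observed on day t
        future = (pef[1 + horizon:] - pef[1:-horizon]) / pef[1:-horizon]     # relative change day t -> t+horizon
        fluct = fluct[:len(future)]                                          # align the two series
        conditioning = fluct > fluct_frac
        if not conditioning.any():
            return float("nan")
        return float((future[conditioning] <= -drop_frac).mean())

    # Synthetic daily PEF record (L/min), for illustration only
    rng = np.random.default_rng(0)
    pef = 450 + np.cumsum(rng.normal(0, 10, 300)).clip(-100, 100)
    print(cond_drop_prob(pef))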
Abstract:
Background: Although individuals vulnerable to psychosis show brain volumetric abnormalities, the structural alterations underlying different probabilities of later transition are unknown. The present study addresses this issue by means of voxel-based morphometry (VBM). Method: We investigated grey matter volume (GMV) abnormalities by comparing four neuroleptic-free groups: individuals with a first episode of psychosis (FEP) and individuals with an at-risk mental state (ARMS), of either long-term (ARMS-LT) or short-term (ARMS-ST) duration, with a healthy control (HC) group. Using three-dimensional (3D) magnetic resonance imaging (MRI), we examined 16 FEP subjects, 31 ARMS subjects clinically followed up for on average 3 months (ARMS-ST, n=18) or 4.5 years (ARMS-LT, n=13), and 19 HC subjects. Results: The ARMS-ST group showed less GMV in the right and left insula than the ARMS-LT group (Cohen's d = 1.67) and the FEP group (Cohen's d = 1.81), respectively. These GMV differences were positively correlated with global functioning in the whole ARMS group. Insular alterations were associated with negative symptomatology in the whole ARMS group, and also with hallucinations in the ARMS-ST and ARMS-LT subgroups. We found a significant effect of previous antipsychotic medication use on GMV abnormalities in the FEP group. Conclusions: GMV abnormalities in subjects at high clinical risk for psychosis are associated with negative and positive psychotic symptoms and with global functioning. Alterations in the right insula are associated with a higher risk of transition to psychosis and may thus be related to different transition probabilities.
Abstract:
SUMMARY: Remaining lifetime and absolute 10-year probabilities of osteoporotic fracture were determined by gender, age, and BMD value. The remaining lifetime probability at age 50 years was 20.2% in men and 51.3% in women and increased with advancing age and decreasing BMD. The study validates the elements required to populate a Swiss-specific FRAX model. INTRODUCTION: Switzerland is among the countries at high risk for osteoporosis, and demographic projections indicate that the burden will increase further. We assessed remaining lifetime and absolute 10-year probabilities of osteoporotic fracture by gender, age, and BMD in order to populate the FRAX algorithm for Switzerland. METHODS: Osteoporotic fracture incidence was determined from national epidemiological data on hospitalised fracture patients from the Swiss Federal Office of Statistics in 2000 and from the results of a prospective Swiss cohort of almost 5,000 fracture patients in 2006. Validated BMD-associated fracture risk was used together with national death incidence and risk tables to determine remaining lifetime and absolute 10-year fracture probabilities for hip and major osteoporotic (hip, spine, distal radius, proximal humerus) fractures. RESULTS: The incidence of major osteoporotic fractures was 773 and 2,078 per 100,000 in men and women aged 50 and older, respectively. The corresponding remaining lifetime probabilities at age 50 were 20.2% and 51.3%. Hospitalisation rates for clinical spine, distal radius, and proximal humerus fractures reached 25%, 30%, and 50%, respectively. The absolute 10-year probability of osteoporotic fracture increased with advancing age and decreasing BMD and was higher in women than in men. CONCLUSION: This study validates the elements required to populate a Swiss-specific FRAX model for a country at high risk of osteoporotic fractures.
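As context for how 10-year probabilities of this kind are typically assembled from annual fracture and death hazards, here is a generic competing-risks sketch (hypothetical hazards, not the Swiss incidence data or the calibrated FRAX model):

    import math

    def ten_year_fracture_probability(fracture_hazard, death_hazard):
        """Absolute 10-year fracture probability under the competing risk of death,
        assuming piecewise-constant annual hazards (events per person-year)."""
        prob, cum_hazard = 0.0, 0.0
        for h, m in zip(fracture_hazard, death_hazard):
            total = h + m
            # P(fracture-free and alive at start of year) * P(fracture during the year)
            prob += math.exp(-cum_hazard) * (h / total) * (1.0 - math.exp(-total))
            cum_hazard += total
        return prob

    # Hypothetical hazards rising with age over the next 10 years
    h = [0.010 * 1.05 ** i for i in range(10)]   # annual fracture hazard
    m = [0.015 * 1.08 ** i for i in range(10)]   # annual death hazard
    print(f"10-year fracture probability: {ten_year_fracture_probability(h, m):.1%}")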
Abstract:
Statistical physicists assume a probability distribution over micro-states to explain thermodynamic behavior. The question of this paper is whether these probabilities are part of a best system and can thus be interpreted as Humean chances. I consider two strategies: a globalist one, as suggested by Loewer, and a localist one, as advocated by Frigg and Hoefer. Both strategies fail because the systems they are part of have rivals that are roughly equally good, whereas ontic probabilities should be part of a clearly winning system. I conclude with the diagnosis that well-defined micro-probabilities underestimate the robust character of explanations in statistical physics.
Abstract:
The talk starts out with a short introduction to the philosophy of probability. I highlight the need to interpret probabilities in the sciences and motivate objectivist accounts of probabilities. Very roughly, according to such accounts, ascriptions of probabilities have truth-conditions that are independent of personal interests and needs. But objectivist accounts are pointless if they do not provide an objectivist epistemology, i.e., if they do not determine well-defined methods to support or falsify claims about probabilities. In the rest of the talk I examine recent philosophical proposals for an objectivist methodology. Most of them take up ideas well-known from statistics. I nevertheless find some proposals incompatible with objectivist aspirations.
Abstract:
How do probabilistic models represent their targets and how do they allow us to learn about them? The answer to this question depends on a number of details, in particular on the meaning of the probabilities involved. To classify the options, a minimalist conception of representation (Suárez 2004) is adopted: Modelers devise substitutes ("sources") of their targets and investigate them to infer something about the target. Probabilistic models allow us to infer probabilities about the target from probabilities about the source. This leads to a framework in which we can systematically distinguish between different models of probabilistic modeling. I develop a fully Bayesian view of probabilistic modeling, but I argue that, as an alternative, Bayesian degrees of belief about the target may be derived from ontic probabilities about the source. Remarkably, some accounts of ontic probabilities can avoid problems if they are supposed to apply to sources only.
Abstract:
This article proposes computing sensitivities of upper tail probabilities of random sums by the saddlepoint approximation. The considered sensitivity is the derivative of the upper tail probability with respect to the parameter of the summation index distribution. Random sums with Poisson or Geometric distributed summation indices and Gamma or Weibull distributed summands are considered. The score method with importance sampling is considered as an alternative approximation. Numerical studies show that both the saddlepoint approximation and the score method with importance sampling are very accurate, but the saddlepoint approximation is substantially faster. Thus, the suggested saddlepoint approximation can be conveniently used in various scientific problems.
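For one of the cases named here, a Poisson summation index with Gamma summands, the Lugannani-Rice form of the saddlepoint approximation to the upper tail probability can be sketched as follows; the sensitivity with respect to the Poisson parameter is then illustrated with a simple finite difference rather than the analytic formula developed in the article (parameter values are arbitrary):

    import math
    from scipy.stats import norm

    def sp_tail_poisson_gamma(s, lam, alpha, beta):
        """Lugannani-Rice saddlepoint approximation of P(S > s) for a compound
        Poisson(lam) sum of Gamma(alpha, rate=beta) summands (s above the mean)."""
        if s <= lam * alpha / beta:
            raise ValueError("use s above the mean for the upper tail")
        # CGF of S: K(t) = lam * ((1 - t/beta)**(-alpha) - 1) for t < beta;
        # the saddlepoint t_hat solves K'(t_hat) = s and has a closed form here.
        t_hat = beta * (1.0 - (lam * alpha / (beta * s)) ** (1.0 / (alpha + 1.0)))
        K = lam * ((1.0 - t_hat / beta) ** (-alpha) - 1.0)
        K2 = lam * alpha * (alpha + 1.0) / beta ** 2 * (1.0 - t_hat / beta) ** (-(alpha + 2.0))
        w = math.sqrt(2.0 * (t_hat * s - K))
        u = t_hat * math.sqrt(K2)
        return norm.sf(w) + norm.pdf(w) * (1.0 / u - 1.0 / w)

    def sensitivity_lam(s, lam, alpha, beta, h=1e-4):
        """Derivative of the tail probability w.r.t. lam via a central finite difference."""
        return (sp_tail_poisson_gamma(s, lam + h, alpha, beta)
                - sp_tail_poisson_gamma(s, lam - h, alpha, beta)) / (2.0 * h)

    print(sp_tail_poisson_gamma(30.0, lam=10.0, alpha=2.0, beta=1.0))
    print(sensitivity_lam(30.0, lam=10.0, alpha=2.0, beta=1.0))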
Abstract:
This article provides importance sampling algorithms for computing the probabilities of various types of ruin of spectrally negative Lévy risk processes, namely ruin over the infinite time horizon, ruin within a finite time horizon, and ruin past a finite time horizon. For the special case of the compound Poisson process perturbed by diffusion, algorithms are provided for computing the probabilities of ruin by creeping (i.e. induced by the diffusion term) and by jumping (i.e. induced by a claim amount). It is shown that these algorithms have either bounded relative error or logarithmic efficiency as t, x → ∞, where t > 0 is the time horizon and x > 0 is the starting point of the risk process, with y = t/x held constant and assumed to lie either below or above a certain constant.
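The flavour of such importance sampling schemes can be illustrated in the classical Cramér-Lundberg special case (compound Poisson claims with exponential severities and no diffusion term), where exponential tilting with the Lundberg adjustment coefficient yields an estimator with bounded relative error and a closed-form answer to check against. This is a textbook sketch under those simplifying assumptions, not the algorithms of the article:

    import math
    import numpy as np

    def ruin_prob_is(x, lam=1.0, mu=1.0, c=1.5, n_paths=10000, seed=0):
        """Importance-sampling estimate of the infinite-horizon ruin probability
        for a compound Poisson risk process with Exp(mu) claims, Poisson(lam)
        claim arrivals and premium rate c, via exponential tilting with the
        Lundberg adjustment coefficient (Siegmund's algorithm)."""
        gamma = mu - lam / c                    # adjustment coefficient (requires c*mu > lam)
        lam_tilt, mu_tilt = c * mu, lam / c     # arrival and claim rates under the tilted measure
        rng = np.random.default_rng(seed)
        est = np.empty(n_paths)
        for i in range(n_paths):
            s = 0.0                             # claim-surplus process, checked at claim epochs
            while s <= x:
                s -= c * rng.exponential(1.0 / lam_tilt)   # premiums earned until the next claim
                s += rng.exponential(1.0 / mu_tilt)        # claim amount under the tilt
            est[i] = math.exp(-gamma * s)       # likelihood ratio evaluated at the ruin time
        return est.mean(), est.std(ddof=1) / math.sqrt(n_paths)

    x = 20.0
    psi_hat, se = ruin_prob_is(x)
    psi_exact = (1.0 / 1.5) * math.exp(-(1.0 - 1.0 / 1.5) * x)   # closed form for Exp(1) claims
    print(psi_hat, se, psi_exact)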
Abstract:
We review various inequalities for Mills' ratio (1 − Φ)/φ, where φ and Φ denote the standard Gaussian density and distribution function, respectively. Elementary considerations involving finite continued fractions lead to a general approximation scheme which implies and refines several known bounds.
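A small numerical illustration of the approach: Laplace's finite continued fraction for Mills' ratio, whose successive truncations bracket the exact value and reproduce the classical bounds x/(x²+1) ≤ (1 − Φ(x))/φ(x) ≤ 1/x for x > 0 (a sketch, assuming SciPy for the exact reference value):

    from scipy.stats import norm

    def mills_exact(x):
        """Mills' ratio (1 - Phi(x)) / phi(x), computed via SciPy."""
        return norm.sf(x) / norm.pdf(x)

    def mills_cf(x, n):
        """n-term truncation of Laplace's continued fraction for x > 0:
        R(x) = 1 / (x + 1/(x + 2/(x + 3/(x + ...)))).
        n = 0 gives the upper bound 1/x, n = 1 the lower bound x/(x^2 + 1),
        and successive truncations alternately over- and under-estimate R(x)."""
        acc = 0.0
        for k in range(n, 0, -1):
            acc = k / (x + acc)
        return 1.0 / (x + acc)

    x = 1.5
    print(x / (x * x + 1.0), mills_exact(x), 1.0 / x)       # classical two-sided bound
    print([round(mills_cf(x, n), 5) for n in range(5)])     # n = 0..4 truncations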
Abstract:
Information on the relationship between cumulative fossil CO2 emissions and multiple climate targets is essential to design emission mitigation and climate adaptation strategies. In this study, the transient response of a climate or environmental variable per trillion tonnes of CO2 emissions, termed TRE, is quantified for a set of impact-relevant climate variables and from a large set of multi-forcing scenarios extended to year 2300 towards stabilization. An ∼ 1000-member ensemble of the Bern3D-LPJ carbon–climate model is applied and model outcomes are constrained by 26 physical and biogeochemical observational data sets in a Bayesian, Monte Carlo-type framework. Uncertainties in TRE estimates include both scenario uncertainty and model response uncertainty. Cumulative fossil emissions of 1000 Gt C result in a global mean surface air temperature change of 1.9 °C (68 % confidence interval (c.i.): 1.3 to 2.7 °C), a decrease in surface ocean pH of 0.19 (0.18 to 0.22), and a steric sea level rise of 20 cm (13 to 27 cm until 2300). Linearity between cumulative emissions and transient response is high for pH and reasonably high for surface air and sea surface temperatures, but less pronounced for changes in Atlantic meridional overturning, Southern Ocean and tropical surface water saturation with respect to biogenic structures of calcium carbonate, and carbon stocks in soils. The constrained model ensemble is also applied to determine the response to a pulse-like emission and in idealized CO2-only simulations. The transient climate response is constrained, primarily by long-term ocean heat observations, to 1.7 °C (68 % c.i.: 1.3 to 2.2 °C) and the equilibrium climate sensitivity to 2.9 °C (2.0 to 4.2 °C). This is consistent with results by CMIP5 models but inconsistent with recent studies that relied on short-term air temperature data affected by natural climate variability.
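A minimal sketch of the generic Bayesian, Monte Carlo-type weighting described here: prior ensemble members are weighted by their likelihood given an observational constraint, and constrained quantiles of a derived quantity (a stand-in for TRE) are read off the weighted distribution. All numbers, error assumptions, and variable names are illustrative, not the Bern3D-LPJ configuration:

    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical 1000-member prior ensemble: a modelled observable (e.g. ocean heat
    # uptake, arbitrary units) and a correlated derived quantity standing in for TRE
    # (warming in degC per 1000 Gt C); both are synthetic.
    n = 1000
    obs_model = rng.normal(10.0, 3.0, n)
    tre_model = 1.0 + 0.15 * obs_model + rng.normal(0.0, 0.3, n)

    # Observational constraint with Gaussian error: weight each member by its likelihood
    y_obs, sigma_obs = 9.0, 1.0
    weights = np.exp(-0.5 * ((obs_model - y_obs) / sigma_obs) ** 2)
    weights /= weights.sum()

    def weighted_quantile(values, q, w):
        order = np.argsort(values)
        return np.interp(q, np.cumsum(w[order]), values[order])

    # Median and 68 % range of the constrained (posterior) TRE distribution
    print([round(weighted_quantile(tre_model, q, weights), 2) for q in (0.16, 0.5, 0.84)])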