951 results for exceedance probabilities
Abstract:
In occupational exposure assessment of airborne contaminants, exposure levels can be estimated through repeated measurements of the pollutant concentration in air, through expert judgment, or through exposure models that use information on the conditions of exposure as input. In this report, we propose an empirical hierarchical Bayesian model to unify these approaches. Prior to any measurement, the hygienist conducts an assessment to generate prior distributions of exposure determinants. Monte Carlo samples from these distributions feed two level-2 models: a physical, two-compartment model, and a non-parametric, neural network model trained with existing exposure data. The outputs of these two models are weighted according to the expert's assessment of their relevance to yield predictive distributions of the long-term geometric mean and geometric standard deviation of the worker's exposure profile (level-1 model). Bayesian inferences are then drawn iteratively from subsequent measurements of worker exposure. Any traditional decision strategy based on a comparison with occupational exposure limits (e.g. mean exposure or exceedance strategies) can then be applied. Data on 82 workers exposed to 18 contaminants in 14 companies were used to validate the model with cross-validation techniques. A user-friendly program running the model is available upon request.
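A minimal sketch of the weighting-and-updating scheme described above might look as follows. This is not the authors' program: the two stand-in model functions, the expert weight `w`, the measurement values and the fixed log-scale SD `sigma` are all assumptions made for illustration.

```python
# Sketch: weight two level-2 models into a predictive prior over ln(GM),
# then update it on a grid with (hypothetical) exposure measurements.
import numpy as np

rng = np.random.default_rng(0)

def physical_model(n):   # stand-in for the two-compartment model
    return rng.normal(np.log(0.5), 0.6, n)   # samples of ln(GM)

def nn_model(n):         # stand-in for the neural network model
    return rng.normal(np.log(0.8), 0.4, n)

w = 0.7                  # assumed expert weight on the physical model
n = 10_000
pick = rng.random(n) < w
prior_samples = np.where(pick, physical_model(n), nn_model(n))

# Grid-based Bayesian update, assuming lognormal exposures with a fixed
# log-scale SD purely for illustration.
measurements = np.log([0.4, 0.7, 0.5])       # hypothetical ln(concentrations)
sigma = 0.5
edges = np.linspace(prior_samples.min(), prior_samples.max(), 201)
grid = 0.5 * (edges[:-1] + edges[1:])
prior_pdf, _ = np.histogram(prior_samples, bins=edges, density=True)
loglik = -0.5 * ((measurements[:, None] - grid[None, :]) / sigma) ** 2
posterior = prior_pdf * np.exp(loglik.sum(axis=0))
posterior /= np.trapz(posterior, grid)
print("GM estimate (exp of posterior mean ln GM):",
      np.exp(np.trapz(grid * posterior, grid)))
```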
Abstract:
OBJECTIVES: Skin notations are used as a hazard identification tool to flag chemicals associated with a potential risk related to transdermal penetration. The transparency and rigorousness of the skin notation assignment process have recently been questioned. We compared different approaches proposed as criteria for these notations as a starting point for improving and systematizing current practice. METHODS: In this study, skin notations, dermal acute lethal dose 50 in mammals (LD(50)s) and two dermal risk indices derived from previously published work were compared using the lists of Swiss maximum allowable concentrations (MACs) and threshold limit values (TLVs) from the American Conference of Governmental Industrial Hygienists (ACGIH). The indices were both based on quantitative structure-activity relationship (QSAR) estimation of transdermal fluxes. One index compared the cumulative dose received through skin, given a specific exposure surface and duration, to that received through the lungs following inhalation for 8 h at the MAC or TLV. The other index estimated the blood level increase caused by adding skin exposure to the inhalation route at kinetic steady state. Dermal-to-other-route ratios of LD(50) were calculated as secondary indices of dermal penetrability. RESULTS: The working data set included 364 substances. Depending on the subdataset, agreement between the Swiss and ACGIH skin notations varied between 82 and 87%. Chemicals with a skin notation were more likely to have higher dermal risk indices and lower dermal LD(50) than chemicals without a notation (probabilities between 60 and 70%). The risk indices, based on cumulative dose and kinetic steady state, respectively, appeared proportional up to a constant independent of chemical-specific properties. They agreed well with dermal LD(50)s (Spearman correlation coefficients -0.42 to -0.43). Dermal-to-other-route LD(50) ratios were moderately associated with QSAR-based transdermal fluxes (Spearman correlation coefficients -0.2 to -0.3). CONCLUSIONS: The plausible but variable relationship between current skin notations and the different approaches tested confirms the need to improve current skin notations. QSAR-based risk indices and dermal toxicity data might be successfully integrated into a systematic alternative to current skin notations for detecting chemicals associated with potential dermal risk in the workplace.
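The cumulative-dose index lends itself to a back-of-the-envelope illustration. The sketch below is not the authors' exact formulation; the 10 m³ per 8-h shift breathing volume is a common default assumption, and all input values are hypothetical.

```python
# Sketch: ratio of dermal dose over one exposure episode to the dose
# inhaled over an 8-h shift at the occupational exposure limit (OEL).
def dermal_index(flux_mg_cm2_h, area_cm2, duration_h, oel_mg_m3,
                 inhaled_m3_per_shift=10.0):
    dermal_dose = flux_mg_cm2_h * area_cm2 * duration_h   # mg through skin
    inhaled_dose = oel_mg_m3 * inhaled_m3_per_shift       # mg through lungs
    return dermal_dose / inhaled_dose

# Example: QSAR flux 0.01 mg/cm2/h over both hands (~840 cm2) for 1 h,
# against an OEL of 50 mg/m3 (all values hypothetical).
print(dermal_index(0.01, 840, 1.0, 50.0))   # ~0.017
```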
Abstract:
BACKGROUND: Comparative effectiveness research in spine surgery is still a rarity. In this study, pain alleviation and quality of life (QoL) improvement after lumbar total disc arthroplasty (TDA) and anterior lumbar interbody fusion (ALIF) were anonymously compared by surgeon and implant. METHODS: A total of 534 monosegmental TDAs from the SWISSspine registry were analyzed. Mean age was 42 years (19-65 years); 59% were females. Fifty cases with ALIF were documented in the international Spine Tango registry and used as a concurrent comparator group for the pain analysis. Mean age was 46 years (21-69 years); 78% were females. The average follow-up time in both samples was 1 year. Back/leg pain alleviation and QoL improvement were compared. Unadjusted and adjusted probabilities for achievement of a minimum clinically relevant improvement of 18 VAS points or 0.25 EQ-5D points were calculated for each surgeon. RESULTS: Mean preoperative back pain decreased from 69 to 30 points at 1 year (ØΔ 39 pts) after TDA, and from 66 to 27 points after ALIF (ØΔ 39 pts). Mean preoperative QoL improved from 0.34 to 0.74 points at 1 year (ØΔ 0.40 pts). Some surgeons showed better patient selection, indicated by lower adjusted probabilities, which reflect a worsening of outcomes had they treated an average patient sample. ALIF showed pain alleviation similar to that of TDA. CONCLUSIONS: Pain alleviation after TDA and ALIF was similar. Differences in surgeons' patient selection based on pain and QoL were revealed. Some surgeons seem to miss the full therapeutic potential of TDA by selecting patients with lower symptom severity.
Abstract:
Debris flow hazard modelling at medium (regional) scale has been the subject of various studies in recent years. In this study, hazard zonation was carried out, incorporating information about debris flow initiation probability (spatial and temporal) and the delimitation of the potential runout areas. Debris flow hazard zonation was carried out in the area of the Consortium of Mountain Municipalities of Valtellina di Tirano (Central Alps, Italy). The complexity of the phenomenon, the scale of the study, the variability of local conditioning factors, and the lack of data limited the use of process-based models for the runout zone delimitation. Firstly, a map of hazard initiation probabilities was prepared for the study area, based on the available susceptibility zoning information and the analysis of two sets of aerial photographs for the temporal probability estimation. Afterwards, the hazard initiation map was used as one of the inputs for an empirical GIS-based model (Flow-R), developed at the University of Lausanne (Switzerland). An estimation of debris flow magnitude was not attempted, as the main aim of the analysis was to prepare a debris flow hazard map at medium scale. A digital elevation model with a 10 m resolution was used together with land use, geology and debris flow hazard initiation maps as inputs of the Flow-R model to restrict potential areas within each hazard initiation probability class to locations where debris flows are most likely to initiate. Afterwards, runout areas were calculated using multiple flow direction and energy-based algorithms. Maximum probable runout zones were calibrated using documented past events and aerial photographs. Finally, two debris flow hazard maps were prepared. The first simply delimits five hazard zones, while the second incorporates information about debris flow spreading direction probabilities, showing areas more likely to be affected by future debris flows. Limitations of the modelling arise mainly from the models applied and the analysis scale, which neglect local controlling factors of debris flow hazard. The presented approach to debris flow hazard analysis, associating automatic detection of the source areas with a simple assessment of debris flow spreading, provided results suitable for subsequent hazard and risk studies. However, for the validation and transferability of the parameters and results to other study areas, more testing is needed.
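The spreading step can be illustrated with a much-simplified multiple-flow-direction rule (flow apportioned in proportion to slope, without Flow-R's actual algorithms or the energy component); the DEM and all values below are hypothetical.

```python
# Sketch: distribute flow from a cell to its lower 8-neighbours in
# proportion to slope (simplified MFD, not the Flow-R implementation).
import numpy as np

dem = np.array([[10., 9., 8.],
                [ 9., 7., 6.],
                [ 8., 6., 4.]])   # hypothetical 3x3 DEM, 10 m cells
cell = 10.0

def spread_weights(dem, r, c, cell):
    """Fraction of flow sent to each lower 8-neighbour of (r, c)."""
    weights = {}
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr == dc == 0:
                continue
            rr, cc = r + dr, c + dc
            if 0 <= rr < dem.shape[0] and 0 <= cc < dem.shape[1]:
                dist = cell * (2 ** 0.5 if dr and dc else 1.0)
                slope = (dem[r, c] - dem[rr, cc]) / dist   # tan(slope angle)
                if slope > 0:
                    weights[(rr, cc)] = slope
    total = sum(weights.values())
    return {k: v / total for k, v in weights.items()}

print(spread_weights(dem, 0, 0, cell))  # steepest neighbour gets most flow
```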
Abstract:
The article is intended to improve our understanding of the reasons underlying the intellectual migration of scientists from well-known cognitive domains to nascent scientific fields. To that end we present, first, a number of findings from the sociology of science that give different insights into this phenomenon. We then attempt to bring some of these insights together under the conceptual roof of an actor-based approach linking expected utility and diffusion theory. Intellectual migration is regarded as the rational choice of scientists who decide under uncertainty and on the basis of a number of decision-making variables, which define the probabilities, costs, and benefits of the migration.
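The decision rule implicit in the final sentence can be written compactly (the notation below is ours, not the article's): a scientist migrates when

```latex
\[
\mathrm{EU}(\text{migrate}) \;=\; p\,B \;-\; C \;>\; \mathrm{EU}(\text{stay}),
\]
```

where p is the subjective probability of succeeding in the nascent field, B the benefit of success, and C the cost of migration.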
Abstract:
AIMS: To investigate empirically the hypothesized relationship between counsellor motivational interviewing (MI) skills and patient change talk (CT) by analysing the articulation between counsellor behaviours and patient language during brief motivational interventions (BMI) addressing at-risk alcohol consumption. DESIGN: Sequential analysis of psycholinguistic codes obtained by two independent raters using the Motivational Interviewing Skill Code (MISC), version 2.0. SETTING: Secondary analysis of data from a randomized controlled trial evaluating the effectiveness of BMI in an emergency department. PARTICIPANTS: A total of 97 patients tape-recorded when receiving BMI. MEASUREMENTS: MISC variables were categorized into three counsellor behaviours (MI-consistent, MI-inconsistent and 'other') and three kinds of patient language (CT, counter-CT (CCT) and utterances not linked with the alcohol topic). Observed transition frequencies, conditional probabilities and significance levels based on odds ratios were computed using sequential analysis software. FINDINGS: MI-consistent behaviours were the only counsellor behaviours that were significantly more likely to be followed by patient CT. Those behaviours were significantly more likely to be followed by patient change exploration (CT and CCT) while MI-inconsistent behaviours and 'other' counsellor behaviours were significantly more likely to be followed by utterances not linked with the alcohol topic and significantly less likely to be followed by CT. MI-consistent behaviours were more likely after change exploration, whereas 'other' counsellor behaviours were more likely only after utterances not linked with the alcohol topic. CONCLUSIONS: Findings lend support to the hypothesized relationship between MI-consistent behaviours and CT, highlight the importance of patient influence on counsellor behaviour and emphasize the usefulness of MI techniques and spirit during brief interventions targeting change enhancement.
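The lag-1 transition analysis can be sketched in a few lines; the code labels and the toy utterance sequence below are illustrative, not the MISC data.

```python
# Sketch: transition frequencies and conditional probabilities between
# coded utterances, as in lag-1 sequential analysis.
from collections import Counter
from itertools import pairwise   # Python 3.10+

# MICO = MI-consistent, MIIN = MI-inconsistent, OTH = other counsellor talk;
# CT = change talk, CCT = counter-change talk, NL = not alcohol-linked.
sequence = ["MICO", "CT", "MICO", "CCT", "OTH", "NL",
            "MIIN", "NL", "MICO", "CT"]

transitions = Counter(pairwise(sequence))
given = Counter(a for a, _ in pairwise(sequence))

# Conditional transition probability P(next = CT | current = MICO)
p = transitions[("MICO", "CT")] / given["MICO"]
print(f"P(CT | MI-consistent) = {p:.2f}")   # 0.67 with this toy sequence
```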
Abstract:
Compositional data analysis motivated the introduction of a complete Euclidean structure in the simplex of D parts. This was based on the early work of J. Aitchison (1986) and completed recently when the Aitchison distance in the simplex was associated with an inner product and orthonormal bases were identified (Aitchison and others, 2002; Egozcue and others, 2003). A partition of the support of a random variable generates a composition by assigning the probability of each interval to a part of the composition. One can imagine that the partition can be refined and the probability density would represent a kind of continuous composition of probabilities in a simplex of infinitely many parts. This intuitive idea would lead to a Hilbert space of probability densities by generalizing the Aitchison geometry for compositions in the simplex to the set of probability densities.
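For reference, the Aitchison inner product on the D-part simplex, together with its generalization to densities on an interval I of length η, can be written as follows (standard notation from this literature, not quoted from the abstract):

```latex
\[
\langle \mathbf{x},\mathbf{y}\rangle_a
  \;=\; \frac{1}{2D}\sum_{i=1}^{D}\sum_{j=1}^{D}
        \ln\frac{x_i}{x_j}\,\ln\frac{y_i}{y_j},
\qquad
\langle f,g\rangle
  \;=\; \frac{1}{2\eta}\int_{I}\!\int_{I}
        \ln\frac{f(s)}{f(t)}\,\ln\frac{g(s)}{g(t)}\,\mathrm{d}s\,\mathrm{d}t .
\]
```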
Abstract:
Interview with Carles Cuadras Avellana
Abstract:
In this paper, a novel methodology is introduced that aims at minimizing the probability of network failure and the failure impact (in terms of QoS degradation) while optimizing resource consumption. A detailed study of MPLS recovery techniques and their GMPLS extensions is also presented. In this scenario, some features for reducing the failure impact while offering minimum failure probabilities are also analyzed. Novel two-step routing algorithms using this methodology are proposed. Results show that these methods offer high protection levels with optimal resource consumption.
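The quantity such algorithms minimize can be illustrated simply: assuming independent link failures (an assumption of this sketch, not necessarily of the paper), the failure probability of a path is one minus the product of its links' survival probabilities. Link probabilities below are hypothetical.

```python
# Sketch: path failure probability under independent link failures,
# used to rank candidate paths.
import math

def path_failure_probability(link_fail_probs):
    """P(path fails) = 1 - prod(1 - p_i) for independent link failures."""
    return 1.0 - math.prod(1.0 - p for p in link_fail_probs)

candidate_paths = {
    "A-B-D": [1e-3, 2e-3],
    "A-C-D": [5e-4, 5e-4, 5e-4],
}
best = min(candidate_paths,
           key=lambda p: path_failure_probability(candidate_paths[p]))
print(best, path_failure_probability(candidate_paths[best]))
```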
Abstract:
Aim: This study compares the direct, macroecological approach (MEM) for modelling species richness (SR) with the more recent approach of stacking predictions from individual species distributions (S-SDM). We implemented both approaches on the same dataset and discuss their respective theoretical assumptions, strengths and drawbacks. We also tested how both approaches performed in reproducing observed patterns of SR along an elevational gradient. Location: Two study areas in the Alps of Switzerland. Methods: We implemented MEM by relating the species counts to environmental predictors with statistical models, assuming a Poisson distribution. S-SDM was implemented by modelling each species distribution individually and then stacking the obtained prediction maps in three different ways - summing binary predictions, summing random draws of binomial trials and summing predicted probabilities - to obtain a final species count. Results: The direct MEM approach yields nearly unbiased predictions centred around the observed mean values, but with a lower correlation between predictions and observations than that achieved by the S-SDM approaches. This method also cannot provide any information on species identity and, thus, community composition. It does, however, accurately reproduce the hump-shaped pattern of SR observed along the elevational gradient. The S-SDM approach summing binary maps can predict individual species and thus communities, but tends to overpredict SR. The two other S-SDM approaches - summing binomial trials based on predicted probabilities and summing predicted probabilities - do not overpredict richness, but they predict many competing end points of assembly or lose the individual species predictions, respectively. Furthermore, all S-SDM approaches fail to appropriately reproduce the observed hump-shaped pattern of SR along the elevational gradient. Main conclusions: The macroecological approach and S-SDM have complementary strengths. We suggest that both could be used in combination to obtain better SR predictions, following the suggestion of constraining S-SDM by MEM predictions.
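The three stacking rules compared here are easy to state in code; the per-species probabilities and the threshold below are hypothetical.

```python
# Sketch: three ways of stacking per-species SDM predictions at one site
# into a species richness (SR) estimate.
import numpy as np

rng = np.random.default_rng(1)
p = np.array([0.9, 0.6, 0.3, 0.1])   # SDM probabilities for 4 species
threshold = 0.5

sr_binary   = int((p >= threshold).sum())     # sum of thresholded predictions
sr_binomial = int(rng.binomial(1, p).sum())   # one random draw of assembly
sr_prob     = float(p.sum())                  # sum of probabilities = expected SR

print(sr_binary, sr_binomial, sr_prob)        # e.g. 2, (varies), 1.9
```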
Abstract:
Over the last decade, the development of statistical models in support of forensic fingerprint identification has been the subject of increasing research attention, spurred on recently by commentators who claim that the scientific basis for fingerprint identification has not been adequately demonstrated. Such models are increasingly seen as useful tools in support of the fingerprint identification process within or in addition to the ACE-V framework. This paper provides a critical review of recent statistical models from both a practical and theoretical perspective. This includes analysis of models of two different methodologies: Probability of Random Correspondence (PRC) models that focus on calculating probabilities of the occurrence of fingerprint configurations for a given population, and Likelihood Ratio (LR) models which use analysis of corresponding features of fingerprints to derive a likelihood value representing the evidential weighting for a potential source.
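For reference, LR models evaluate the standard likelihood ratio (general form, not specific to any one model reviewed):

```latex
\[
\mathrm{LR} \;=\; \frac{\Pr(E \mid H_p)}{\Pr(E \mid H_d)},
\]
```

where E denotes the observed correspondence of features, H_p the proposition that mark and print come from the same source, and H_d the proposition that they come from different sources.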
Abstract:
This paper studies the limits of discrete-time repeated games with public monitoring. We solve and characterize the Abreu, Milgrom and Pearce (1991) problem. We find that for the "bad" ("good") news model the lower (higher) magnitude events suggest cooperation, i.e., zero punishment probability, while the higher (lower) magnitude events suggest defection, i.e., punishment with probability one. Public correlation is used to connect these two sets of signals and to make enforceability bind. The dynamic and limit behavior of the punishment probabilities for variations in ... (the discount rate) and ... (the time interval) are characterized, as well as the limit payoffs for all these scenarios (we also introduce uncertainty in the time domain). The obtained ... limits are, to the best of our knowledge, new. The obtained ... limits coincide with Fudenberg and Levine (2007) and Fudenberg and Olszewski (2011), with the exception that we clearly state the precise informational conditions that cause the limit to converge from above, to converge from below or to degenerate. JEL: C73, D82, D86. KEYWORDS: Repeated Games, Frequent Monitoring, Random Public Monitoring, Moral Hazard, Stochastic Processes.
Abstract:
The log-ratio methodology makes available powerful tools for analyzing compositional data. Nevertheless, the use of this methodology is only possible for those data sets without null values. Consequently, in those data sets where zeros are present, a previous treatment becomes necessary. Recent advances in the treatment of compositional zeros have centred especially on zeros of a structural nature and on rounded zeros. These tools do not contemplate the particular case of count compositional data sets with null values. In this work we deal with "count zeros" and we introduce a treatment based on a mixed Bayesian-multiplicative estimation. We use the Dirichlet probability distribution as a prior and we estimate the posterior probabilities. Then we apply a multiplicative modification to the non-zero values. We present a case study where this new methodology is applied. Key words: count data, multiplicative replacement, composition, log-ratio analysis
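A sketch of a Bayesian-multiplicative replacement in this spirit follows; the uniform Dirichlet prior strength `s` and the counts are illustrative, and the authors' estimator may differ in detail.

```python
# Sketch: replace zero parts of a count composition by Dirichlet posterior
# estimates, then rescale the non-zero parts multiplicatively so the
# composition still sums to 1.
import numpy as np

def bayes_mult_replace(counts, s=0.5):
    counts = np.asarray(counts, dtype=float)
    n, D = counts.sum(), len(counts)
    posterior = (counts + s) / (n + s * D)  # Dirichlet(s,...,s) posterior mean
    comp = counts / n                       # observed closed composition
    zeros = counts == 0
    out = np.where(zeros, posterior, 0.0)   # imputed values for zero parts
    out[~zeros] = comp[~zeros] * (1 - out[zeros].sum())  # multiplicative step
    return out

print(bayes_mult_replace([12, 0, 5, 3]))   # zero part gets a small value
```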
Abstract:
This paper focuses on one of the methods for bandwidth allocation in an ATM network: the convolution approach. The convolution approach permits an accurate study of the system load in statistical terms by accumulated calculations, since probabilistic results of the bandwidth allocation can be obtained. Nevertheless, the convolution approach has a high cost in terms of calculation and storage requirements. This aspect makes real-time calculations difficult, so many authors do not consider this approach. With the aim of reducing this cost, we propose to use the multinomial distribution function: the enhanced convolution approach (ECA). This permits direct computation of the associated probabilities of the instantaneous bandwidth requirements and makes a simple deconvolution process possible. The ECA is used in connection acceptance control, and some results are presented.
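The core of the convolution approach can be sketched as follows: the aggregate-demand distribution of independent on/off sources is the convolution of their individual distributions, from which the overload probability is read off directly. Source parameters and the capacity below are hypothetical.

```python
# Sketch: convolve per-source bandwidth PMFs to get the aggregate demand
# distribution, then compute the probability of exceeding link capacity.
import numpy as np

def source_pmf(peak_units, p_on, size):
    """PMF over 0..size-1 bandwidth units for one on/off source."""
    pmf = np.zeros(size)
    pmf[0] = 1 - p_on
    pmf[peak_units] = p_on
    return pmf

size = 64                          # bandwidth axis, in units
agg = np.zeros(size)
agg[0] = 1.0                       # start with "no demand"
for peak, p_on in [(2, 0.3)] * 10 + [(5, 0.1)] * 4:
    agg = np.convolve(agg, source_pmf(peak, p_on, size))[:size]

capacity = 15
print("P(demand > capacity) =", agg[capacity + 1:].sum())
```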
Abstract:
This paper presents and discusses further aspects of the subjectivist interpretation of probability (also known as the 'personalist' view of probabilities) as initiated in earlier forensic and legal literature. It shows that operational devices to elicit subjective probabilities - in particular the so-called scoring rules - provide additional arguments in support of the standpoint according to which categorical claims of forensic individualisation do not follow from a formal analysis under that view of probability theory.
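A proper scoring rule makes the point concrete: under the quadratic (Brier) rule, an examiner whose belief is p minimizes the expected penalty by reporting p itself, so a categorical report of 1 is suboptimal whenever p < 1. The numbers below are illustrative.

```python
# Sketch: expected Brier penalty E[(q - X)^2] = p(1-q)^2 + (1-p)q^2 for a
# binary event X with belief p and reported probability q; minimized at q = p.
def expected_brier(belief_p, report_q):
    return belief_p * (report_q - 1) ** 2 + (1 - belief_p) * report_q ** 2

p = 0.95   # strong but not categorical belief in same-source
print(expected_brier(p, p))    # 0.0475 -- honest report
print(expected_brier(p, 1.0))  # 0.05   -- categorical "individualisation"
```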