952 results for SURE threshold
Abstract:
The exchange of gluons between heavy quarks produced in e+e- interactions results in an enhancement of their production near threshold. We study QCD threshold effects in these collisions. The results are relevant to heavy quark production by beamstrahlung and laser backscattering in future linear collider experiments. Detailed predictions for top-, bottom-, and charm-quark production are presented.
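For context, the gluon-exchange enhancement near threshold is usually described by the QCD analogue of the Sommerfeld factor; the following is a standard textbook form, not necessarily the exact expression used in this paper. For a quark pair produced with relative velocity $\beta$,

```latex
S(\beta) = \frac{X}{1 - e^{-X}}, \qquad
X = \frac{C_F \, \pi \, \alpha_s}{\beta}, \qquad
C_F = \tfrac{4}{3},
```

so the Born cross section is multiplied by a factor that grows like $C_F \pi \alpha_s / \beta$ as $\beta \to 0$, producing the threshold enhancement.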
Abstract:
PURPOSE: To assess how different diagnostic decision aids perform in terms of sensitivity, specificity, and harm. METHODS: Four diagnostic decision aids were compared, as applied to a simulated patient population: two findings-based algorithms (following a linear or a branched pathway), a serial threshold-based strategy, and a parallel threshold-based strategy. Headache in immune-compromised HIV patients in a developing country was used as an example. Diagnoses included cryptococcal meningitis, cerebral toxoplasmosis, tuberculous meningitis, bacterial meningitis, and malaria. Data were derived from the literature and expert opinion. Diagnostic strategies' validity was assessed in terms of sensitivity, specificity, and harm related to mortality and morbidity. Sensitivity analyses and Monte Carlo simulation were performed. RESULTS: The parallel threshold-based approach led to a sensitivity of 92% and a specificity of 65%. Sensitivities of the serial threshold-based approach and the branched and linear algorithms were 47%, 47%, and 74%, respectively, and the specificities were 85%, 95%, and 96%. The parallel threshold-based approach resulted in the least harm, with the serial threshold-based approach, the branched algorithm, and the linear algorithm being associated with 1.56-, 1.44-, and 1.17-times higher harm, respectively. Findings were corroborated by sensitivity and Monte Carlo analyses. CONCLUSION: A threshold-based diagnostic approach is designed to find the optimal trade-off that minimizes expected harm, enhancing sensitivity and lowering specificity when appropriate, as in the given example of a symptom pointing to several life-threatening diseases. Findings-based algorithms, in contrast, solely consider clinical observations. A parallel workup, as opposed to a serial workup, additionally allows for all potential diseases to be reviewed, further reducing false negatives. The parallel threshold-based approach might, however, not perform as well in other disease settings.
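The threshold logic that distinguishes these strategies from findings-based algorithms can be sketched as follows; this is a schematic illustration in the expected-harm tradition, and the helper names and harm/benefit numbers are assumptions, not the study's model:

```python
# Hedged sketch of threshold-based diagnosis: treat when the disease
# probability exceeds the point at which treating minimizes expected harm.
# Function names and all numbers are hypothetical, not from the study.

def treatment_threshold(harm_of_treating_healthy, benefit_of_treating_sick):
    """Disease probability above which treating minimizes expected harm."""
    return harm_of_treating_healthy / (
        harm_of_treating_healthy + benefit_of_treating_sick)

def decide(p_disease, harm, benefit):
    """Single-disease threshold decision."""
    t = treatment_threshold(harm, benefit)
    return "treat" if p_disease >= t else "withhold"

def parallel_workup(posteriors, thresholds):
    """Parallel threshold strategy: every candidate disease is reviewed
    at once against its own treatment threshold."""
    return [d for d, p in posteriors.items() if p >= thresholds[d]]
```

A serial workup would instead evaluate one candidate diagnosis at a time and stop early, which is consistent with the higher false-negative rate the abstract reports for the serial strategy.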
Abstract:
Random scale-free networks have the peculiar property of being prone to the spreading of infections. Here we provide for the susceptible-infected-susceptible model an exact result showing that a scale-free degree distribution with diverging second moment is a sufficient condition to have null epidemic threshold in unstructured networks with either assortative or disassortative mixing. Degree correlations result therefore irrelevant for the epidemic spreading picture in these scale-free networks. The present result is related to the divergence of the average nearest neighbors degree, enforced by the degree detailed balance condition.
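In the heterogeneous mean-field picture underlying this result, the SIS epidemic threshold is λ_c = ⟨k⟩/⟨k²⟩, so a diverging second moment drives it to zero. A small numerical sketch (the power-law exponent and cutoff are illustrative assumptions):

```python
import numpy as np

def sis_threshold(gamma, k_min=3, k_max=10**6):
    """Mean-field SIS epidemic threshold <k>/<k^2> for a power-law
    degree distribution P(k) ~ k^-gamma truncated at k_max (sketch)."""
    k = np.arange(k_min, k_max + 1, dtype=float)
    p = k ** -gamma
    p /= p.sum()
    return (k * p).sum() / (k * k * p).sum()
```

For gamma ≤ 3 the second moment diverges with the cutoff, so the threshold shrinks toward zero as k_max grows, vanishing in the thermodynamic limit.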
Abstract:
PURPOSE: Neurophysiological monitoring aims to improve the safety of pedicle screw placement, but few quantitative studies assess specificity and sensitivity. In this study, screw placement within the pedicle was measured (post-op CT scan; horizontal and vertical distance from the screw edge to the surface of the pedicle) and correlated with intraoperative neurophysiological stimulation thresholds. METHODS: A single surgeon placed 68 thoracic and 136 lumbar screws in 30 consecutive patients during instrumented fusion under EMG control. The female-to-male ratio was 1.6 and the average age was 61.3 years (SD 17.7). Radiological measurements, blinded to stimulation threshold, were done on reformatted CT reconstructions using OsiriX software. A standard deviation of the screw position of 2.8 mm was determined from pilot measurements, and a screw-pedicle edge distance of 1 mm was considered the difference of interest (standardised difference of 0.35), giving the study a power of 75% (significance level 0.05). RESULTS: Correct placement and stimulation thresholds above 10 mA were found in 71% of screws. Twenty-two percent of screws caused a cortical breach; 80% of these had stimulation thresholds above 10 mA (sensitivity 20%, specificity 90%). Correct prediction of screw position was more frequent for lumbar than for thoracic screws. CONCLUSION: A screw stimulation threshold of >10 mA does not indicate correct pedicle screw placement. The hypothesised gradual decrease of screw stimulation thresholds as screw placement approaches the nerve root was not observed. Aside from a robust threshold of 2 mA indicating direct contact with nervous tissue, a secondary threshold appears to depend on the patient's pathology and surgical conditions.
Abstract:
In a compound Poisson model, we define a threshold proportional reinsurance strategy: a retention level k1 is applied whenever the reserves are below a given threshold b, and a retention level k2 otherwise. We obtain the integro-differential equation for the Gerber-Shiu function, defined in Gerber and Shiu (1998), in this model, which allows us to derive expressions for the ruin probability and for the Laplace transform of the time of ruin for several distributions of the individual claim amount. Finally, we present some numerical results.
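The threshold strategy can be illustrated with a Monte Carlo sketch of the (finite-horizon) ruin probability; the premium rule, exponential claims, and all parameter values below are illustrative assumptions, not the paper's analytical setup:

```python
import random

def ruin_probability(u, b, k1, k2, lam=1.0, mu=1.0, theta=0.2,
                     horizon=200.0, n_paths=4000, seed=1):
    """Monte Carlo estimate of the finite-horizon ruin probability in a
    compound Poisson model under a threshold proportional reinsurance
    strategy: retention k1 while reserves are below b, k2 otherwise.
    Claims are Exponential(mean mu); the premium rate (1+theta)*lam*mu*k
    is an illustrative assumption, and the retention is only refreshed
    at claim epochs (a simplification: reserves may cross b between
    claims)."""
    rng = random.Random(seed)
    ruined = 0
    for _ in range(n_paths):
        reserves, t = u, 0.0
        while t < horizon:
            w = rng.expovariate(lam)                    # inter-claim time
            k = k1 if reserves < b else k2              # threshold rule
            reserves += (1 + theta) * lam * mu * k * w  # premium income
            reserves -= k * rng.expovariate(1.0 / mu)   # retained claim
            t += w
            if reserves < 0:
                ruined += 1
                break
    return ruined / n_paths
```

As expected, the estimated ruin probability decreases as the initial reserve u grows, matching the qualitative behavior of the analytical ruin probability.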
Abstract:
Division of labor in social insects is a key determinant of their ecological success. Recent models emphasize that division of labor is an emergent property of the interactions among nestmates obeying simple behavioral rules. However, the role of evolution in shaping these rules has been largely neglected. Here, we investigate a model that integrates the perspectives of self-organization and evolution. Our point of departure is the response threshold model, in which we allow thresholds to evolve. We ask whether the thresholds will evolve to a state where division of labor emerges in a form that fits the needs of the colony. We find that division of labor can indeed evolve through the evolutionary branching of thresholds, leading to workers that differ in their tendency to take on a given task. However, the conditions under which division of labor evolves depend on the strength of selection on the two fitness components considered: the amount of work performed and the distribution of workers over tasks. When selection is strongest on the amount of work performed, division of labor evolves if switching tasks is costly. When selection is strongest on worker distribution, division of labor is less likely to evolve. Furthermore, we show that a biased distribution (such as 3:1) of workers over tasks is not easily achievable by a threshold mechanism, even under strong selection. Contrary to expectation, multiple matings of colony foundresses impede the evolution of specialization. Overall, our model sheds light on the importance of considering the interaction between specific mechanisms and ecological requirements to better understand the evolutionary scenarios that lead to division of labor in complex systems. ELECTRONIC SUPPLEMENTARY MATERIAL: The online version of this article (doi:10.1007/s00265-012-1343-2) contains supplementary material, which is available to authorized users.
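The response threshold model at the heart of this abstract is commonly written as an engagement probability P = s^n/(s^n + θ^n) for stimulus s and threshold θ. A minimal sketch (the stimulus dynamics and parameter values are illustrative assumptions, not the paper's evolutionary model) showing how heterogeneous thresholds produce specialization:

```python
import random

def engage_probability(stimulus, threshold, n=2):
    """Fixed response-threshold rule: P = s^n / (s^n + theta^n)."""
    if stimulus <= 0:
        return 0.0
    return stimulus ** n / (stimulus ** n + threshold ** n)

def simulate_colony(thresholds, demand=0.5, alpha=1.0, steps=500, seed=0):
    """The task stimulus grows with colony demand and is reduced by each
    act of work (illustrative dynamics). Workers with lower thresholds
    engage more often, so heterogeneous thresholds yield specialists."""
    rng = random.Random(seed)
    s = 0.0
    work = [0] * len(thresholds)
    for _ in range(steps):
        s += demand
        for i, th in enumerate(thresholds):
            if rng.random() < engage_probability(s, th):
                work[i] += 1
                s = max(0.0, s - alpha)
    return work
```

With one low-threshold and one high-threshold worker, almost all of the work ends up done by the low-threshold worker, which is the basic specialization effect the evolutionary branching of thresholds builds on.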
Abstract:
Aims: Several studies have questioned the validity of separating the diagnosis of alcohol abuse from that of alcohol dependence, and the DSM-5 task force has proposed combining the criteria from these two diagnoses to assess a single category of alcohol use disorders (AUD). Furthermore, the DSM-5 task force has proposed including a new 2-symptom threshold and a severity scale based on symptom counts for the AUD diagnosis. The current study aimed to examine these modifications in a large population-based sample.
Method: Data stemmed from an adult sample (N=2588; mean age 51.3 years (s.d. 0.2), 44.9% female) of current and lifetime drinkers from the PsyCoLaus study, conducted in the Lausanne area in Switzerland. AUDs and validating variables were assessed using a semi-structured diagnostic interview for the assessment of alcohol and other major psychiatric disorders. First, the adequacy of the proposed 2-symptom threshold was tested by comparing threshold models at each possible cutoff and a linear model, in relation to different validating variables. The model with the smallest Akaike Information Criterion (AIC) value was established as the best model for each validating variable. Second, models with varying subsets of individual AUD symptoms were created to assess the associations between each symptom and the validating variables. The subset of symptoms with the smallest AIC value was established as the best subset for each validator.
Results: 1) For the majority of validating variables, the linear model was found to be the best-fitting model. 2) Among the various subsets of symptoms, the symptoms most frequently associated with the validating variables were: a) drinking despite having knowledge of a physical or psychological problem, b) having had a persistent desire or unsuccessful efforts to cut down or control drinking, and c) craving. The least frequent symptoms were: d) drinking in larger amounts or over a longer period than was intended, e) spending a great deal of time obtaining, using or recovering from alcohol use, and f) failing to fulfill major role obligations.
Conclusions: The proposed DSM-5 2-symptom threshold did not receive support in our data. Instead, a linear AUD diagnosis was supported, with individuals receiving an increasingly severe AUD diagnosis as their symptom count increases. Moreover, certain symptoms were more frequently associated with the validating variables, which suggests that these symptoms should be considered more severe.
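The model comparison described here (a threshold model at every possible cutoff vs. a linear model, smallest AIC wins) can be sketched with a Gaussian-likelihood AIC for a continuous validating variable; the data, effect sizes, and function names below are synthetic assumptions, not the study's data:

```python
import numpy as np

def aic_ols(y, X):
    """Gaussian AIC for an OLS fit: n*ln(RSS/n) + 2*(p+1)."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = ((y - X @ beta) ** 2).sum()
    n, p = X.shape
    return n * np.log(rss / n) + 2 * (p + 1)

def compare_threshold_vs_linear(symptoms, validator):
    """Fit a linear-in-symptom-count model and a step model at every
    possible cutoff; return the name of the smallest-AIC model."""
    n = len(symptoms)
    ones = np.ones(n)
    models = {"linear": np.column_stack([ones, symptoms])}
    for c in range(1, int(symptoms.max()) + 1):
        models[f"cutoff>={c}"] = np.column_stack([ones, symptoms >= c])
    aics = {name: aic_ols(validator, X) for name, X in models.items()}
    return min(aics, key=aics.get), aics
```

When the validator truly varies linearly with symptom count, the linear model wins the AIC comparison over every single-cutoff step model, which mirrors the study's main finding.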
Abstract:
A simple and promising oxide-assisted, catalyst-free method is used to prepare silicon nitride nanowires in high yield in a short time. After a brief analysis of the state of the art, we reveal the crucial role played by the oxygen partial pressure: when the oxygen partial pressure is slightly below the threshold of passive oxidation, a high yield is obtained while the formation of any silica layer covering the nanowires is inhibited, and the synthesis temperature can be used to control the nanowire dimensions.
Abstract:
INTRODUCTION: Perfusion-CT (PCT) processing involves deconvolution, a mathematical operation that computes the perfusion parameters from the PCT time density curves and an arterial curve. Delay-sensitive deconvolution does not correct for the arrival delay of contrast, whereas delay-insensitive deconvolution does. The goal of this study was to compare delay-sensitive and delay-insensitive deconvolution PCT in terms of delineation of the ischemic core and penumbra. METHODS: We retrospectively identified 100 patients with acute ischemic stroke who underwent admission PCT and CT angiography (CTA), a follow-up vascular study to determine recanalization status, and a follow-up noncontrast head CT (NCT) or MRI to calculate final infarct volume. PCT datasets were processed twice, once using delay-sensitive deconvolution and once using delay-insensitive deconvolution. Regions of interest (ROIs) were drawn, and cerebral blood flow (CBF), cerebral blood volume (CBV), and mean transit time (MTT) in these ROIs were recorded and compared. Volume and geographic distribution of ischemic core and penumbra using both deconvolution methods were also recorded and compared. RESULTS: MTT and CBF values are affected by the deconvolution method used (p < 0.05), while CBV values remain unchanged. Optimal thresholds to delineate ischemic core and penumbra differ between delay-sensitive (145 % MTT, CBV 2 ml × 100 g(-1) × min(-1)) and delay-insensitive deconvolution (135 % MTT, CBV 2 ml × 100 g(-1) × min(-1)). When applying these different thresholds, however, the predicted ischemic core (p = 0.366) and penumbra (p = 0.405) were similar with both methods. CONCLUSION: Both delay-sensitive and delay-insensitive deconvolution methods are appropriate for PCT processing in acute ischemic stroke patients. The predicted ischemic core and penumbra are similar with both methods when using different sets of thresholds, specific to each deconvolution method.
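Delay-sensitive deconvolution is commonly implemented as truncated SVD on a lower-triangular Toeplitz matrix built from the arterial input function (delay-insensitive variants use a block-circulant matrix instead). A hedged sketch of the delay-sensitive version; the curves, truncation level, and parameter extraction below are illustrative assumptions, not this study's pipeline:

```python
import numpy as np

def tsvd_deconvolution(aif, tissue, dt, p_svd=0.2):
    """Delay-sensitive truncated-SVD deconvolution (sketch).
    Builds the lower-triangular Toeplitz matrix of the arterial input
    function and inverts it, dropping singular values below
    p_svd * s_max. Returns (CBF, MTT) estimated from the flow-scaled
    residue function k(t): CBF ~ max k(t), MTT ~ CBV / CBF."""
    n = len(aif)
    A = np.zeros((n, n))
    for i in range(n):
        A[i, : i + 1] = aif[i::-1] * dt     # A[i, j] = aif[i-j] * dt
    U, s, Vt = np.linalg.svd(A)
    s_inv = np.where(s >= p_svd * s.max(), 1.0 / s, 0.0)
    k = Vt.T @ (s_inv * (U.T @ tissue))     # regularized residue
    cbf = k.max()
    cbv = tissue.sum() / aif.sum()          # ratio of curve areas (sketch)
    return cbf, cbv / cbf
```

Because the Toeplitz matrix is causal, a delayed tissue curve biases the recovered residue, which is exactly why a delay difference between the two deconvolution families shifts the optimal MTT thresholds.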
Abstract:
This study evaluated the effect of initial pH values of 4.5, 6.5 and 8.5 of the attractant (protein bait) Milhocina® and borax (sodium borate) in the field on the capture of fruit flies in McPhail traps, using 1, 2, 4 and 8 traps per hectare, in order to estimate control thresholds in a Hamlin orange grove in the central region of the state of São Paulo. The most abundant fruit fly species was Ceratitis capitata, comprising almost 99% of the fruit flies captured, of which 80% were females. The largest captures of C. capitata occurred in traps baited with Milhocina® and borax at pH 8.5. Captures per trap were similar across the four densities, indicating that the population can be estimated with one trap per hectare in areas with high populations. Positive relationships were found between captures of C. capitata and the number of Hamlin oranges damaged 2 and 3 weeks after capture. Equations were obtained that correlate captures with damage levels and can be used to estimate control thresholds. The average loss caused in Hamlin orange fruits by C. capitata was 2.5 tons per hectare, or 7.5% of production.
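Using a capture-damage equation to set a control threshold amounts to fitting a line and inverting it at the tolerable damage level. A minimal sketch; the data and function name are hypothetical, not the study's fitted equations:

```python
import numpy as np

def control_threshold(captures, damaged_fruit, tolerable_damage):
    """Fit damage = a + b * captures by least squares and invert it to
    get the capture level at which expected damage reaches the tolerable
    level. Inputs here are illustrative, not the study's data."""
    b, a = np.polyfit(captures, damaged_fruit, 1)   # slope, intercept
    return (tolerable_damage - a) / b
```

In practice, captures per trap per week would be monitored and control triggered once the estimate is exceeded.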
Abstract:
Gene filtering is a useful preprocessing technique often applied to microarray datasets. However, it is not common practice because clear guidelines are lacking and it bears the risk of excluding potentially relevant genes. In this work, we propose to model microarray data as a mixture of two Gaussian distributions, which allows us to obtain an optimal filter threshold in terms of gene expression level.
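The two-Gaussian idea can be sketched with a plain 1-D EM fit followed by locating the point where the weighted component densities cross, which is a natural "optimal" filter threshold under this model; the EM details and data below are illustrative assumptions, not the authors' estimator:

```python
import numpy as np

def fit_two_gaussians(x, n_iter=200):
    """Plain EM for a two-component 1-D Gaussian mixture (sketch)."""
    m = np.percentile(x, [25.0, 75.0])          # crude initialization
    s = np.full(2, x.std())
    w = np.array([0.5, 0.5])
    for _ in range(n_iter):
        # E-step: component responsibilities (shared constants cancel)
        d = np.stack([w[k] * np.exp(-0.5 * ((x - m[k]) / s[k]) ** 2) / s[k]
                      for k in range(2)])
        r = d / d.sum(axis=0)
        # M-step: update weights, means, standard deviations
        w = r.mean(axis=1)
        m = (r * x).sum(axis=1) / r.sum(axis=1)
        s = np.sqrt((r * (x - m[:, None]) ** 2).sum(axis=1) / r.sum(axis=1))
    return w, m, s

def filter_threshold(x):
    """Expression level where the weighted densities of the low
    ('filter out') and high ('keep') components cross."""
    w, m, s = fit_two_gaussians(x)
    lo, hi = np.argmin(m), np.argmax(m)
    grid = np.linspace(m[lo], m[hi], 10001)
    dens_hi = w[hi] / s[hi] * np.exp(-0.5 * ((grid - m[hi]) / s[hi]) ** 2)
    dens_lo = w[lo] / s[lo] * np.exp(-0.5 * ((grid - m[lo]) / s[lo]) ** 2)
    return grid[np.argmax(dens_hi >= dens_lo)]
```

Genes whose expression falls below the returned threshold would be filtered out as belonging to the low-expression component.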
Abstract:
We develop an analytical approach to the susceptible-infected-susceptible epidemic model that allows us to unravel the true origin of the absence of an epidemic threshold in heterogeneous networks. We find that a delicate balance between the number of high-degree nodes in the network and the topological distance between them dictates the existence or absence of such a threshold. In particular, small-world random networks with a degree distribution decaying more slowly than exponentially have a vanishing epidemic threshold in the thermodynamic limit.