906 results for automatically generated meta classifiers with large levels
Abstract:
A search for diphoton events with large missing transverse energy is presented. The data were collected with the ATLAS detector in proton-proton collisions at √s=7 TeV at the CERN Large Hadron Collider and correspond to an integrated luminosity of 3.1 pb⁻¹. No excess of such events is observed above the standard model background prediction. In the context of a specific model with one universal extra dimension with compactification radius R and gravity-induced decays, values of 1/R<729 GeV are excluded at 95% C.L., providing the most sensitive limit on this model to date.
Abstract:
In epidemiological work, outcomes are frequently non-normal, sample sizes may be large, and effects are often small. To relate health outcomes to geographic risk factors, fast and powerful methods for fitting spatial models, particularly for non-normal data, are required. We focus on binary outcomes, with the risk surface a smooth function of space. We compare penalized likelihood models, including the penalized quasi-likelihood (PQL) approach, and Bayesian models based on fit, speed, and ease of implementation. A Bayesian model using a spectral basis representation of the spatial surface provides the best tradeoff of sensitivity and specificity in simulations, detecting real spatial features while limiting overfitting, and is computationally more efficient than other Bayesian approaches. One contribution of this work is further development of this underused representation. The spectral basis model outperforms the penalized likelihood methods, which are prone to overfitting, but is slower to fit and less easily implemented. Conclusions based on a real dataset of cancer cases in Taiwan are similar, albeit less conclusive in comparing the approaches. The success of the spectral basis with binary data, and similar results with count data, suggest that it may be generally useful in spatial models and more complicated hierarchical models.
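The abstract does not give implementation details for the spectral basis model, so the sketch below is only a minimal illustration: it assumes a truncated 2D Fourier (spectral) basis for the spatial surface and a Gaussian prior on its coefficients, fitted by MAP optimization rather than a full Bayesian fit. The function names (make_spectral_basis, fit_spectral_logistic), the frequency truncation n_freq, and the synthetic data are hypothetical choices for illustration, not the authors' code.

import numpy as np
from scipy.optimize import minimize

def make_spectral_basis(coords, n_freq=4):
    """Truncated 2D Fourier (spectral) basis evaluated at unit-square coordinates.

    coords : (n, 2) array of locations scaled to [0, 1]^2
    n_freq : number of frequencies kept per axis (hypothetical truncation choice)
    """
    cols = [np.ones(len(coords))]
    for kx in range(n_freq + 1):
        for ky in range(n_freq + 1):
            if kx == 0 and ky == 0:
                continue
            arg = 2 * np.pi * (kx * coords[:, 0] + ky * coords[:, 1])
            cols.append(np.cos(arg))
            cols.append(np.sin(arg))
    return np.column_stack(cols)

def fit_spectral_logistic(coords, y, n_freq=4, prior_sd=1.0):
    """MAP fit of a logistic model with a Gaussian prior on the spectral-basis
    coefficients (a stand-in for the full Bayesian fit described in the paper)."""
    B = make_spectral_basis(coords, n_freq)

    def neg_log_post(beta):
        eta = B @ beta
        # Bernoulli log-likelihood with logit link
        loglik = np.sum(y * eta - np.logaddexp(0.0, eta))
        # Gaussian (ridge-like) prior shrinks the basis coefficients
        logprior = -0.5 * np.sum(beta**2) / prior_sd**2
        return -(loglik + logprior)

    beta0 = np.zeros(B.shape[1])
    res = minimize(neg_log_post, beta0, method="L-BFGS-B")
    return res.x, B

# Tiny synthetic example: binary outcomes over random locations
rng = np.random.default_rng(0)
coords = rng.uniform(size=(500, 2))
true_surface = np.sin(2 * np.pi * coords[:, 0]) + np.cos(2 * np.pi * coords[:, 1])
y = rng.binomial(1, 1 / (1 + np.exp(-true_surface)))
beta_hat, B = fit_spectral_logistic(coords, y)
risk_hat = 1 / (1 + np.exp(-(B @ beta_hat)))  # estimated spatial risk surface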
Abstract:
Use of microarray technology often leads to high-dimensional, low-sample-size data settings. Over the past several years, a variety of novel approaches have been proposed for variable selection in this context. However, only a small number of these have been adapted for time-to-event data where censoring is present. Among standard variable selection methods shown both to have good predictive accuracy and to be computationally efficient is the elastic net penalization approach. In this paper, an adaptation of the elastic net approach is presented for variable selection both under the Cox proportional hazards model and under an accelerated failure time (AFT) model. The two methods are assessed through simulation studies and through analysis of microarray data obtained from a set of patients with diffuse large B-cell lymphoma, where survival time is the outcome of interest. The approaches are shown to match or exceed the predictive performance of a Cox-based and an AFT-based variable selection method. The methods are moreover shown to be much more computationally efficient than their respective Cox- and AFT-based counterparts.
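The paper's own elastic net implementation is not shown in the abstract; as a rough illustration of elastic net variable selection under a Cox proportional hazards model, the sketch below uses the lifelines library, assuming a recent version in which CoxPHFitter accepts penalizer and l1_ratio arguments. The synthetic gene-expression data, penalty values, and column names are hypothetical.

import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# Synthetic stand-in for a high-dimensional gene-expression dataset:
# 120 patients, 150 expression features, right-censored survival times.
rng = np.random.default_rng(1)
n, p = 120, 150
X = rng.normal(size=(n, p))
risk = X[:, 0] - 0.8 * X[:, 1]                  # only two truly relevant genes
time = rng.exponential(np.exp(-risk))           # higher risk -> shorter survival
censor = rng.exponential(time.mean(), size=n)
df = pd.DataFrame(X, columns=[f"gene_{j}" for j in range(p)])
df["time"] = np.minimum(time, censor)
df["event"] = (time <= censor).astype(int)

# Elastic-net penalized Cox model: penalizer sets the overall strength,
# l1_ratio mixes the lasso (sparsity) and ridge (grouping) components.
cph = CoxPHFitter(penalizer=0.5, l1_ratio=0.7)
cph.fit(df, duration_col="time", event_col="event")

# Coefficients are shrunk toward zero; the largest surviving effects point
# to the genes the penalty selects.
coefs = cph.params_.sort_values(key=np.abs, ascending=False)
print(coefs.head(10))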
Abstract:
We report the case of a patient who presented with acute inferior myocardial infarction and embolic occlusion of the distal left anterior descending and proximal right coronary arteries. A large atrial septal defect (ASD) was seen on transesophageal echocardiography, and the ASD was closed during the same session as coronary angiography and percutaneous coronary intervention. The presence of embolic or thrombotic occlusions of coronary arteries should prompt interventional cardiologists to look for a patent foramen ovale or ASD and to perform percutaneous closure without delay.