997 results for GLAUCOMA PROBABILITY SCORE


Relevance: 20.00%

Abstract:

The analytical solution of a multidimensional Langevin equation in the overdamped limit is obtained, and the probability of particles passing over a two-dimensional saddle point is discussed. These results may pave the way for further study of fusion in the synthesis of superheavy elements.
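
As a hedged illustration of the passing-probability question above, the following Python sketch estimates, by Euler-Maruyama Monte Carlo rather than by the paper's analytical solution, the probability that overdamped particles cross a two-dimensional quadratic saddle; the potential, friction, and temperature values are illustrative assumptions.

```python
import numpy as np

# Monte Carlo sketch (not the paper's analytical result): overdamped Langevin
# dynamics dx = -(1/gamma) dV/dx dt + sqrt(2 kT / gamma) dW on the saddle
# potential V(x, y) = -0.5*a*x**2 + 0.5*b*y**2. All parameters are assumptions.
a, b = 1.0, 1.0            # saddle curvatures
gamma, kT = 1.0, 0.5       # friction coefficient and temperature
dt, n_steps, n_traj = 1e-3, 20_000, 5_000

rng = np.random.default_rng(0)
x = np.full(n_traj, -1.5)                      # start on the reactant side
y = rng.normal(0.0, np.sqrt(kT / b), n_traj)   # thermalized transverse coordinate
crossed = np.zeros(n_traj, dtype=bool)

for _ in range(n_steps):
    x += (a * x / gamma) * dt + np.sqrt(2 * kT * dt / gamma) * rng.normal(size=n_traj)
    y += (-b * y / gamma) * dt + np.sqrt(2 * kT * dt / gamma) * rng.normal(size=n_traj)
    crossed |= x > 0.0                         # passed over the saddle at x = 0

print("estimated passing probability:", crossed.mean())
```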

Relevance: 20.00%

Abstract:

The probability distribution of four-phase structure invariants (4PSIs) involving four pairs of structure factors is derived by combining direct methods with isomorphous replacement (IR). A simple expression for the reliability parameter of the 16 types of invariant is given for the case of a native protein and a heavy-atom derivative. Test calculations on a protein and its heavy-atom derivative using experimental diffraction data show that the reliability of the 4PSI estimates is comparable to that of three-phase structure invariants (3PSIs), and that a large-modulus-invariants method can be used to improve the accuracy.
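
The following toy Python sketch is not the paper's IR-based derivation; it merely illustrates, with simulated equal-atom structure factors, what a quartet invariant is and how a Cochran-type von Mises reliability parameter grows with the moduli involved. The concentration formula kappa = 2|E1 E2 E3 E4|/N is an assumed stand-in for the paper's expression.

```python
import numpy as np

# Toy quartet-invariant illustration with a random equal-atom structure.
# The reliability parameter below is an assumption, not the paper's formula.
rng = np.random.default_rng(1)
N = 200                               # number of atoms in the toy structure
xyz = rng.random((N, 3))              # random fractional coordinates

def E(h):
    """Normalized structure factor for Miller index h (equal-atom model)."""
    phases = np.exp(2j * np.pi * (xyz @ np.asarray(h)))
    return phases.sum() / np.sqrt(N)

# Four reflections whose indices sum to zero define a quartet invariant:
h1, h2, h3 = (1, 2, 0), (0, -1, 3), (2, 0, -1)
h4 = tuple(-(np.array(h1) + np.array(h2) + np.array(h3)))
Es = [E(h) for h in (h1, h2, h3, h4)]

phi4 = np.angle(np.prod(Es))                     # the 4PSI, modulo 2*pi
kappa = 2 * np.prod([abs(e) for e in Es]) / N    # assumed reliability parameter
print(f"invariant = {phi4:+.3f} rad, reliability kappa = {kappa:.3f}")
```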

Relevance: 20.00%

Abstract:

Based on the second-order solutions obtained for three-dimensional weakly nonlinear random waves propagating over a steady uniform current in finite water depth, the joint statistical distribution of the velocity and acceleration of a fluid particle in the current direction is derived using the characteristic function expansion method. From this joint distribution and the Morison equation, the theoretical distributions of drag forces, inertia forces, and total random forces caused by waves propagating over a steady uniform current are determined. The distribution of inertia forces is Gaussian, as in the linear wave model, whereas the distributions of drag forces and total random forces deviate slightly from their linear-model counterparts. The distributions presented are determined by the wavenumber spectrum of the ocean waves, the current speed, and the second-order wave-wave and wave-current interactions. As an illustrative example, for fully developed deep-ocean waves, the parameters appearing in the distributions near the still-water level are calculated for various wind and current speeds using the Donelan-Pierson-Banner spectrum, and the effects of the current and of wave nonlinearity on the distributions are studied.
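
As a concrete reference for the force model invoked above, here is a minimal Python sketch of the Morison equation on a slender vertical cylinder; the kinematics are a toy linear wave superposed on a steady current, not the paper's second-order solutions, and the coefficients are illustrative assumptions.

```python
import numpy as np

# Morison equation per unit length of a vertical cylinder:
#   f = 0.5*rho*Cd*D*u*|u| + rho*Cm*(pi*D^2/4)*du/dt
# Coefficients, diameter, and kinematics are illustrative assumptions.
rho, Cd, Cm, D = 1025.0, 1.0, 2.0, 0.5   # seawater density, coefficients, diameter
A = np.pi * D**2 / 4                     # cross-sectional area

t = np.linspace(0.0, 100.0, 10_000)
U_c = 0.5                                # steady uniform current (m/s)
u = U_c + 1.2 * np.cos(0.6 * t)          # horizontal velocity: current + wave
du = -1.2 * 0.6 * np.sin(0.6 * t)        # horizontal acceleration

f_drag = 0.5 * rho * Cd * D * u * np.abs(u)   # drag force per unit length (N/m)
f_inertia = rho * Cm * A * du                 # inertia force per unit length (N/m)
f_total = f_drag + f_inertia

print("std of drag, inertia, total force:",
      f_drag.std(), f_inertia.std(), f_total.std())
```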

Relevance: 20.00%

Abstract:

Aims: Surgery for infective endocarditis (IE) is associated with high mortality. Our objectives were to describe the experience with surgical treatment of IE in Spain and to identify predictors of in-hospital mortality. Methods: Prospective cohort of 1000 consecutive patients with IE, with data collected in 26 Spanish hospitals. Results: Surgery was performed in 437 patients (43.7%). Patients treated with surgery were younger and predominantly male; they had fewer comorbid conditions and more often had negative blood cultures and heart failure. In-hospital mortality after surgery was lower than in the medical-therapy group (24.3% vs. 30.7%, p = 0.02). Among surgically treated patients, endocarditis involved a native valve in 267 (61.1%), a prosthetic valve in 122 (27.9%), and a pacemaker lead with no clear further valve involvement in 48 (11.0%). The most common aetiologies were Staphylococcus (186, 42.6%), Streptococcus (97, 22.2%), and Enterococcus (49, 11.2%). The main indications for surgery were heart failure and severe valve regurgitation. A risk score for in-hospital mortality was developed from 7 prognostic variables with similar predictive value (OR between 1.7 and 2.3), summarized by the acronym PALSUSE: Prosthetic valve, Age ≥ 70, Large intracardiac destruction, Staphylococcus spp., Urgent surgery, Sex (female), EuroSCORE ≥ 10. In-hospital mortality ranged from 0% in patients with a PALSUSE score of 0 to 45.4% in patients with a score > 3. Conclusions: The prognosis of IE surgery is highly variable. The PALSUSE score could help identify patients at higher risk of in-hospital mortality.
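
A minimal sketch of the PALSUSE score as described in the abstract, assuming one point per variable; the published cut-offs and any weighting should be taken from the original paper.

```python
# One point per risk factor, per the abstract's description of PALSUSE.
def palsuse_score(prosthetic_valve, age_ge_70, large_destruction,
                  staphylococcus, urgent_surgery, female_sex,
                  euroscore_ge_10):
    """Return the 0-7 PALSUSE score from seven boolean risk factors."""
    factors = (prosthetic_valve, age_ge_70, large_destruction,
               staphylococcus, urgent_surgery, female_sex, euroscore_ge_10)
    return sum(bool(f) for f in factors)

# Example: a 74-year-old woman with native-valve Staphylococcus IE undergoing
# urgent surgery, EuroSCORE 8, no large destruction -> score 4.
print(palsuse_score(False, True, False, True, True, True, False))
```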

Relevance: 20.00%

Abstract:

A nonparametric probability estimation procedure using the fuzzy ARTMAP neural network is described. Because the procedure makes no a priori assumptions about the underlying probability distributions, it yields accurate estimates on a wide variety of prediction tasks. Fuzzy ARTMAP is used to perform probability estimation in two different modes. In a 'slow-learning' mode, input-output associations change slowly, with the strength of each association encoding a conditional probability estimate. In 'max-nodes' mode, a fixed number of categories are coded during an initial fast-learning interval, and the weights are then tuned by slow learning. Simulations illustrate system performance on tasks in which the set of input vectors mapped to a given class contains varying numbers of clusters.
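
The following Python toy, which is not an implementation of fuzzy ARTMAP itself, illustrates the slow-learning idea above: with a small learning rate, an association strength updated incrementally converges to the conditional probability it is meant to estimate.

```python
import numpy as np

# Slow-learning sketch: the association weight tracks P(class | category).
rng = np.random.default_rng(2)
true_p = 0.7            # P(class = 1 | category), unknown to the learner
alpha = 0.01            # slow learning rate
w = 0.5                 # association strength, arbitrary start

for _ in range(10_000):
    label = rng.random() < true_p
    w += alpha * (label - w)        # slow incremental update

print(f"learned association strength {w:.3f} ~ true probability {true_p}")
```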

Relevance: 20.00%

Abstract:

An incremental, nonparametric probability estimation procedure using the fuzzy ARTMAP neural network is introduced. In slow-learning mode, fuzzy ARTMAP searches for patterns of data on which to build ever more accurate estimates. In max-nodes mode, the network initially learns a fixed number of categories, and weights are then adjusted gradually.
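
Complementing the previous sketch, this toy illustrates the max-nodes mode described above under simplifying assumptions: a fixed budget of category nodes is committed up front (here by nearest-prototype coding, a schematic stand-in for ART category formation), after which only the per-node class weights are tuned slowly.

```python
import numpy as np

# Max-nodes sketch: fixed category budget, then gradual weight adjustment.
rng = np.random.default_rng(3)
X = rng.random((5_000, 2))                   # toy 2-D inputs
y = (X[:, 0] + X[:, 1] > 1).astype(float)    # toy binary labels

n_nodes, alpha = 16, 0.02
prototypes = X[rng.choice(len(X), n_nodes, replace=False)]  # fast commitment
w = np.full(n_nodes, 0.5)                    # per-node class-probability weights

for xi, yi in zip(X, y):
    j = np.argmin(((prototypes - xi) ** 2).sum(axis=1))     # winning node
    w[j] += alpha * (yi - w[j])                             # slow tuning

print("per-node class-1 probability estimates:", np.round(w, 2))
```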

Relevance: 20.00%

Abstract:

We present a neural network that adapts and integrates several preexisting or new modules to categorize events in short-term memory (STM), encode temporal order in working memory, and evaluate timing and probability context in medium- and long-term memory. The model shows how processed contextual information modulates event recognition and categorization, focal attention, and incentive motivation. It is based on a compendium of event-related potential (ERP) and behavioral results, either collected by the authors or compiled from the classical ERP literature. Its hallmark is, at the functional level, the interplay of memory registers endowed with widely different dynamic ranges and, at the structural level, the attempt to relate the different modules to known anatomical structures.

Relevance: 20.00%

Abstract:

The class of all Exponential-Polynomial-Trigonometric (EPT) functions is classical and equal to the Euler-d'Alembert class of solutions of linear differential equations with constant coefficients. The class of non-negative EPT functions defined on [0, ∞) was discussed in Hanzon and Holland (2010), of which EPT probability density functions are an important subclass. An EPT function can be represented as f(x) = c e^{Ax} b, where A is a square matrix, b a column vector, and c a row vector; the triple (A, b, c) is a minimal realization of the EPT function, unique only up to a basis transformation.

Here the class of 2-EPT probability density functions on R is defined and shown to be closed under a variety of operations. The class is also generalized to include mixtures with a point mass at zero, and then coincides with the class of probability density functions with rational characteristic functions. It is illustrated that the Variance Gamma density is a 2-EPT density under a parameter restriction. A discrete 2-EPT process is a process whose increments are stochastically independent 2-EPT random variables. It is shown that the distributions of the minimum and maximum of such a process are EPT densities mixed with a point mass at zero; the Laplace transforms of these distributions correspond to the discrete-time Wiener-Hopf factors of the discrete-time 2-EPT process.

A distribution of daily log-returns of a prominent US index, observed over the period 1931-2011, is approximated with a 2-EPT density function. Without the non-negativity condition, this problem transforms into a discrete-time rational approximation problem, carried out with the rational approximation software RARL2; the non-negativity constraint is then imposed via a convex optimization procedure after the unconstrained approximation. Necessary and sufficient conditions are derived to characterize infinitely divisible EPT and 2-EPT functions. Infinitely divisible 2-EPT density functions generate 2-EPT Lévy processes, so an asset's log-returns can be modelled as a 2-EPT Lévy process. Closed-form pricing formulae are derived for European options with specific times to maturity, along with formulae for discretely monitored lookback options and 2-period Bermudan options; certain Greeks of these options, including Delta and Gamma, are computed analytically. MATLAB scripts are provided for calculations involving 2-EPT functions, and numerical option-pricing examples illustrate the effectiveness of the 2-EPT approach to financial modelling.
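
As a small worked example of the (A, b, c) realization above, the following Python sketch evaluates f(x) = c e^{Ax} b with a matrix exponential and checks it against the Erlang(2, λ) density, a standard EPT density with an explicit minimal realization; this is an illustration, not the thesis's MATLAB code.

```python
import numpy as np
from scipy.linalg import expm

# Erlang(2, lam) density lam^2 * x * exp(-lam * x) realized as c e^{Ax} b.
lam = 1.5
A = np.array([[-lam, lam],
              [0.0, -lam]])     # square matrix
b = np.array([0.0, lam])        # column vector
c = np.array([1.0, 0.0])        # row vector

def ept(x):
    """Evaluate the EPT function f(x) = c @ expm(A x) @ b."""
    return c @ expm(A * x) @ b

for x in np.linspace(0.0, 5.0, 6):
    erlang = lam**2 * x * np.exp(-lam * x)   # closed form for comparison
    print(f"x={x:.1f}  EPT={ept(x):.6f}  Erlang={erlang:.6f}")
```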

Relevance: 20.00%

Abstract:

Ribosome profiling (ribo-seq) is a recently developed technique that provides genome-wide information on protein synthesis (GWIPS) in vivo; its sub-codon resolution is one of its most exciting properties. In Chapter 2, I present a computational method that uses the sub-codon precision and triplet periodicity of ribosome profiling data to detect transitions in the translated reading frame. Applying this method to ribosome profiling data from human HeLa cells allowed us to detect several human genes in which the same genomic segment is translated in more than one reading frame. Since the initial publication of the ribosome profiling technique in 2009, a proliferation of studies has used it to explore various questions concerning translation; a review of the many uses and adaptations of the technique is provided in Chapter 1. Owing to the increasing popularity of the technique and the growing number of published ribosome profiling datasets, we have developed GWIPS-viz (http://gwips.ucc.ie), a genome browser dedicated to ribo-seq data; details of its development and usage are provided in Chapter 3. One surprising finding of ribosome profiling of initiating ribosomes, carried out in three independent studies, was the widespread use of non-AUG codons as translation initiation sites in mammals. Although initiation at non-AUG codons in mammals has been documented for some time, the extent of non-AUG initiation reported by these ribo-seq studies was unexpected. In Chapter 4, I present an approach for estimating the strength of initiating codons based on the leaky-scanning model of translation initiation. Application of this approach to ribo-seq data shows that initiation at non-AUG codons is inefficient compared with initiation at AUG codons. In addition, our approach provides a probability-of-initiation score for each start site that allows its strength of initiation to be evaluated.
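
The following schematic Python sketch, with synthetic counts standing in for real alignments, illustrates the triplet-periodicity idea behind the Chapter 2 method: ribosome-protected 5' ends concentrate in one sub-codon phase, so a change in the dominant phase along a transcript suggests a reading-frame transition. The window size and phase bias are assumptions.

```python
import numpy as np

# Synthetic per-nucleotide 5'-end counts with a bias toward one sub-codon phase.
rng = np.random.default_rng(4)

def sample_counts(n_codons, frame):
    """Counts for n_codons codons, biased toward sub-codon phase `frame`."""
    probs = np.roll([0.75, 0.15, 0.10], frame)
    return rng.multinomial(30, probs, size=n_codons).reshape(-1)

# Transcript translated in frame 0, then shifting to frame +1 halfway through:
counts = np.concatenate([sample_counts(100, 0), sample_counts(100, 1)])

window = 60  # nucleotides per sliding window (a multiple of 3)
for start in range(0, len(counts) - window + 1, window):
    w = counts[start:start + window]
    phase_totals = w.reshape(-1, 3).sum(axis=0)   # counts per sub-codon phase
    print(f"nt {start:4d}-{start + window:4d}: dominant phase",
          int(np.argmax(phase_totals)))
```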

Relevance: 20.00%

Abstract:

The cross-scale probabilistic structure of rainfall intensity records collected over time scales ranging from hours to decades, at sites dominated by both convective and frontal systems, is investigated. Across these sites, intermittency build-up from slow to fast time scales is analyzed in terms of heavy-tailed and asymmetric signatures in the scale-wise evolution of the rainfall probability density functions (pdfs). The analysis demonstrates that rainfall records dominated by convective storms develop heavier-tailed power-law pdfs toward finer scales than their frontal-system counterparts, and that a concomitant marked asymmetry build-up emerges at these finer time scales. A scale-dependent probabilistic description of the appearance of such fat tails and asymmetry is proposed, based on a modified q-Gaussian model able to describe the cross-scale rainfall pdfs in terms of the nonextensivity parameter q, a lacunarity (intermittency) correction, and a tail asymmetry coefficient linked to the rainfall generation mechanism.
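
For reference, a minimal Python sketch of the plain q-Gaussian form underlying the modified model described above; the paper's lacunarity correction and asymmetry coefficient are not reproduced here.

```python
import numpy as np

# Plain q-Gaussian: f(x) ~ [1 - (1 - q)*beta*x^2]_+ ** (1/(1 - q)), recovering
# the Gaussian as q -> 1 and developing power-law tails for q > 1.
def q_exponential(u, q):
    if np.isclose(q, 1.0):
        return np.exp(u)
    return np.maximum(1.0 + (1.0 - q) * u, 0.0) ** (1.0 / (1.0 - q))

def q_gaussian(x, q, beta=1.0):
    f = q_exponential(-beta * x**2, q)
    return f / np.trapz(f, x)            # numerical normalization on the grid

x = np.linspace(-10.0, 10.0, 4001)
for q in (1.0, 1.3, 1.6):                # heavier tails as q grows
    f = q_gaussian(x, q)
    right = x > 3.0
    p_tail = 2.0 * np.trapz(f[right], x[right])   # symmetric two-sided tail
    print(f"q={q}: P(|x| > 3) ~ {p_tail:.4f}")
```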

Relevance: 20.00%

Abstract:

The objective of spatial downscaling strategies is to increase the information content of coarse datasets at smaller scales. In the case of quantitative precipitation estimation (QPE) for hydrological applications, the goal is to close the scale gap between the spatial resolution of coarse datasets (e.g., gridded satellite precipitation products at resolution L × L) and the high resolution (l × l, with L ≫ l) necessary to capture the spatial features that determine the variability of water flows and water stores in the landscape. In essence, the downscaling process consists of weaving subgrid-scale heterogeneity, over a desired range of wavelengths, into the original field. The defining question is: which properties, statistical and otherwise, of the target field (the known observable at the desired spatial resolution) should be matched, with the caveat that downscaling methods should be as general as possible and therefore ideally free of case-specific constraints and calibration requirements? Here, attention is focused on two simple fractal downscaling methods, using iterated function systems (IFS) and fractal Brownian surfaces (FBS), that meet this requirement. The two methods were applied to spatially disaggregate 27 summertime convective storms in the central United States during 2007, at three consecutive times (1800, 2100, and 0000 UTC, thus 81 fields overall), from the Tropical Rainfall Measuring Mission (TRMM) version 6 (V6) 3B42 precipitation product (~25-km grid spacing) to the resolution of the NCEP stage IV products (~4-km grid spacing). Results from bilinear interpolation are used as the control. A fundamental distinction between IFS and FBS is that the latter implies a distribution of downscaled fields, and thus an ensemble solution, whereas the former provides a single solution. Downscaling effectiveness is assessed using fractal measures (the spectral exponent β, fractal dimension D, Hurst coefficient H, and roughness amplitude R) and traditional operational skill scores [false alarm rate (FR), probability of detection (PD), threat score (TS), and Heidke skill score (HSS)], as well as bias and root-mean-square error (RMSE). The results show that both IFS and FBS fractal interpolation perform well with regard to operational skill scores and meet the additional requirement of generating structurally consistent fields; furthermore, confidence intervals can be generated directly from the FBS ensemble. The results were also used to diagnose errors relevant to hydrometeorological applications, in particular a spatial displacement with a characteristic length of at least 50 km (2500 km²) in the location of peak rainfall intensities for the cases studied.
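
As a pointer to how the operational scores above are computed, here is a minimal Python sketch building them from a rain/no-rain contingency table; the fields and threshold are synthetic stand-ins, and the false alarm rate is computed here as the usual false-alarm ratio.

```python
import numpy as np

# Categorical verification scores from a 2 x 2 rain/no-rain contingency table.
rng = np.random.default_rng(5)
obs = rng.gamma(2.0, 2.0, size=(100, 100))            # "observed" rainfall field
fcst = obs * rng.lognormal(0.0, 0.3, size=obs.shape)  # "downscaled" field

thr = 4.0                                             # rain/no-rain threshold (mm)
o, f = obs >= thr, fcst >= thr
hits = np.sum(f & o); misses = np.sum(~f & o)
false_al = np.sum(f & ~o); corr_neg = np.sum(~f & ~o)

pod = hits / (hits + misses)                          # probability of detection
far = false_al / (hits + false_al)                    # false-alarm ratio
ts = hits / (hits + misses + false_al)                # threat score
n = hits + misses + false_al + corr_neg
exp_correct = ((hits + misses) * (hits + false_al)
               + (corr_neg + misses) * (corr_neg + false_al)) / n
hss = (hits + corr_neg - exp_correct) / (n - exp_correct)  # Heidke skill score
print(f"POD={pod:.3f} FAR={far:.3f} TS={ts:.3f} HSS={hss:.3f}")
```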