996 results for normal probability


Relevance: 30.00%

Abstract:

In this thesis, we consider Bayesian inference on the detection of variance change points in models with scale mixtures of normal (SMN) distributions. This class of distributions is symmetric and heavy-tailed, and includes the Gaussian, Student-t, contaminated normal, and slash distributions as special cases. The proposed models provide greater flexibility for analyzing practical data, which often exhibit heavy tails and may not satisfy the normality assumption. For the Bayesian analysis, we specify prior distributions for the unknown parameters in the variance change-point models with SMN distributions. Because the joint posterior distribution is analytically intractable, we propose an efficient Gibbs-type sampling algorithm with Metropolis-Hastings steps for posterior inference. Following the idea of [1], we then consider the problems of single and multiple change-point detection. The performance of the proposed procedures is illustrated and analyzed through simulation studies, and a real application to closing-price data from the U.S. stock market is analyzed for illustrative purposes.
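A quick way to see how an SMN distribution works is the scale-mixture representation of the Student-t: draw a mixing precision from a Gamma distribution, then draw a normal with that precision. The sketch below illustrates this in plain Python; the degrees of freedom, seed, and sample size are illustrative choices, not taken from the thesis:

```python
import math
import random

def smn_student_t(nu: float, rng: random.Random) -> float:
    """Draw from Student-t(nu) via its scale-mixture-of-normals form:
    lambda ~ Gamma(nu/2, rate nu/2), then x | lambda ~ N(0, 1/lambda)."""
    lam = rng.gammavariate(nu / 2, 2 / nu)  # gammavariate takes (shape, scale)
    return rng.gauss(0.0, 1.0 / math.sqrt(lam))

rng = random.Random(7)
draws = [smn_student_t(5.0, rng) for _ in range(20000)]
mean = sum(draws) / len(draws)
var = sum((d - mean) ** 2 for d in draws) / (len(draws) - 1)
```

For nu = 5 the sample variance should be close to nu/(nu-2) ≈ 1.67; the heavy tails come entirely from the random scale, which is what makes Gibbs sampling over the mixing variables tractable.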

Relevance: 30.00%

Abstract:

BACKGROUND: Although well established for suspected lower-limb deep venous thrombosis, an algorithm combining a clinical decision score, d-dimer testing, and ultrasonography has not been evaluated for suspected upper-extremity deep venous thrombosis (UEDVT).
OBJECTIVE: To assess the safety and feasibility of a new diagnostic algorithm in patients with clinically suspected UEDVT.
DESIGN: Diagnostic management study (ClinicalTrials.gov: NCT01324037).
SETTING: 16 hospitals in Europe and the United States.
PATIENTS: 406 inpatients and outpatients with suspected UEDVT.
MEASUREMENTS: The algorithm consisted of the sequential application of a clinical decision score, d-dimer testing, and ultrasonography. Patients were first categorized as likely or unlikely to have UEDVT; in those with an unlikely score and normal d-dimer levels, UEDVT was excluded. All other patients underwent (repeated) compression ultrasonography. The primary outcome was the 3-month incidence of symptomatic UEDVT and pulmonary embolism in patients with a normal diagnostic work-up.
RESULTS: The algorithm was feasible and completed in 390 of the 406 patients (96%). In 87 patients (21%), an unlikely score combined with normal d-dimer levels excluded UEDVT. Superficial venous thrombosis and UEDVT were diagnosed in 54 (13%) and 103 (25%) patients, respectively. All 249 patients with a normal diagnostic work-up, including those with protocol violations (n = 16), were followed for 3 months. One patient developed UEDVT during follow-up, for an overall failure rate of 0.4% (95% CI, 0.0% to 2.2%).
LIMITATIONS: This study was not powered to show the safety of the substrategies. d-Dimer testing was done locally.
CONCLUSION: The combination of a clinical decision score, d-dimer testing, and ultrasonography can safely and effectively exclude UEDVT. If confirmed by other studies, this algorithm has potential as a standard approach to suspected UEDVT.
PRIMARY FUNDING SOURCE: None.
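The sequential rule-out logic described under MEASUREMENTS can be sketched as a small decision function; the parameter names and return strings here are illustrative, not part of the study protocol:

```python
def uedvt_triage(unlikely_score: bool, d_dimer_normal: bool) -> str:
    """Sequential work-up: clinical decision score first, then d-dimer;
    imaging only when the first two steps cannot exclude UEDVT."""
    if unlikely_score and d_dimer_normal:
        return "UEDVT excluded"  # no imaging needed
    # all other patients proceed to (repeated) compression ultrasonography
    return "compression ultrasonography"
```

The point of the sequencing is that the cheap, non-invasive steps spare a subset of patients (21% in this study) from imaging altogether.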

Relevance: 30.00%

Abstract:

It has been observed in various practical applications that data do not conform to the normal distribution, which is symmetric and has no skewness. The skew normal distribution proposed by Azzalini (1985) is appropriate for the analysis of data that are unimodal but exhibit some skewness. The skew normal distribution includes the normal distribution as the special case in which the skewness parameter is zero. In this thesis, we study the structural properties of the skew normal distribution, with an emphasis on the reliability properties of the model. More specifically, we obtain the failure rate, the mean residual life function, and the reliability function of a skew normal random variable. We also compare it with the normal distribution with respect to certain stochastic orderings. Appropriate machinery is developed to obtain the reliability of a component when the strength and stress follow the skew normal distribution. Finally, IQ score data from Roberts (1988) are analyzed to illustrate the procedure.
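The density underlying these reliability quantities is f(x; α) = 2φ(x)Φ(αx), where φ and Φ are the standard normal density and distribution function. A minimal sketch of the density and of the failure rate (hazard) computed by numerical integration follows; the integration step and truncation point are arbitrary choices for illustration:

```python
import math

def norm_pdf(x: float) -> float:
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

def norm_cdf(x: float) -> float:
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def skew_normal_pdf(x: float, alpha: float) -> float:
    """Azzalini (1985): f(x; alpha) = 2 * phi(x) * Phi(alpha * x)."""
    return 2.0 * norm_pdf(x) * norm_cdf(alpha * x)

def failure_rate(x: float, alpha: float, step: float = 1e-4) -> float:
    """Hazard h(x) = f(x) / R(x); the reliability R(x) = P(X > x) is
    approximated by midpoint-rule integration of the density."""
    upper = x + 12.0  # tail mass beyond this point is negligible
    t, surv = x, 0.0
    while t < upper:
        surv += skew_normal_pdf(t + step / 2.0, alpha) * step
        t += step
    return skew_normal_pdf(x, alpha) / surv
```

With α = 0 this reduces to the standard normal, so failure_rate(0, 0) ≈ φ(0)/0.5 ≈ 0.798, a handy sanity check.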

Relevance: 30.00%

Abstract:

The discovery that the epsilon 4 allele of the apolipoprotein E (apoE) gene is a putative risk factor for Alzheimer disease (AD) in the general population has highlighted the role of genetic influences in this extremely common and disabling illness. It has long been recognized that another genetic abnormality, trisomy 21 (Down syndrome), is associated with early and severe development of AD neuropathological lesions. It remains a challenge, however, to understand how these facts relate to the pathological changes in the brains of AD patients. We used computerized image analysis to examine the size distribution of one of the characteristic neuropathological lesions in AD, deposits of A beta peptide in senile plaques (SPs). Surprisingly, we find that a log-normal distribution fits the SP size distribution quite well, motivating a porous model of SP morphogenesis. We then analyzed SP size distribution curves in genotypically defined subgroups of AD patients. The data demonstrate that both apoE epsilon 4/AD and trisomy 21/AD lead to increased amyloid deposition, but by apparently different mechanisms. The size distribution curve is shifted toward larger plaques in trisomy 21/AD, probably reflecting increased A beta production. In apoE epsilon 4/AD, the size distribution is unchanged but the number of SP is increased compared to apoE epsilon 3, suggesting increased probability of SP initiation. These results demonstrate that subgroups of AD patients defined on the basis of molecular characteristics have quantitatively different neuropathological phenotypes.
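Fitting a log-normal to plaque sizes, as described above, amounts to checking that the log sizes are approximately normal; the fitted parameters are then simply the mean and standard deviation of the logs. A brief sketch on simulated data (the size scale and parameter values are made up for illustration, not taken from the study):

```python
import math
import random
import statistics

random.seed(42)
# hypothetical plaque "sizes" drawn from a log-normal, for illustration only
sizes = [math.exp(random.gauss(3.0, 0.5)) for _ in range(1000)]

logs = [math.log(s) for s in sizes]
mu_hat = statistics.fmean(logs)      # location parameter of the log-normal
sigma_hat = statistics.stdev(logs)   # shape parameter of the log-normal
```

In these terms, a shift toward larger plaques (as in trisomy 21/AD) shows up as a larger fitted location parameter, while more plaques with an unchanged size distribution (as in apoE epsilon 4/AD) leaves both parameters alone and only changes the plaque count.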

Relevance: 30.00%

Abstract:

Motivation: An important problem in microarray experiments is the detection of genes that are differentially expressed in a given number of classes. We provide a straightforward and easily implemented method for estimating the posterior probability that an individual gene is null. The problem can be expressed in a two-component mixture framework, using an empirical Bayes approach. Current methods of implementing this approach either have limitations because of the minimal assumptions made or, under more specific assumptions, are computationally intensive. Results: By converting the value of the test statistic used to test the significance of each gene to a z-score, we propose a simple two-component normal mixture that adequately models the distribution of this score. The usefulness of our approach is demonstrated on three real datasets.
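Given a mixture f(z) = π0 N(0, 1) + (1 − π0) N(μ1, σ1²), the posterior probability that a gene is null follows from Bayes' rule. The sketch below uses fixed, purely illustrative values for π0, μ1, and σ1; in the empirical Bayes approach these would be estimated from the observed z-scores:

```python
import math

def norm_pdf(z: float, mu: float = 0.0, sigma: float = 1.0) -> float:
    return math.exp(-0.5 * ((z - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

def posterior_null_prob(z: float, pi0: float = 0.9,
                        mu1: float = 2.0, sigma1: float = 1.0) -> float:
    """P(gene is null | z) under a two-component normal mixture.
    pi0, mu1, sigma1 are illustrative stand-ins for empirically estimated values."""
    f0 = norm_pdf(z)               # null component: theoretical N(0, 1)
    f1 = norm_pdf(z, mu1, sigma1)  # non-null component
    return pi0 * f0 / (pi0 * f0 + (1.0 - pi0) * f1)
```

Genes with a small posterior null probability are the ones flagged as differentially expressed.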

Relevance: 30.00%

Abstract:

This study examined the use of non-standard parameters to investigate the visual field, with particular reference to the detection of glaucomatous visual field loss. Evaluation of the new FASTPAC perimetric strategy for threshold estimation demonstrated a reduction in examination time for normal subjects compared with the standard strategy. Despite increased within-test variability, the FASTPAC strategy produced a mean sensitivity similar to that of the standard strategy while reducing the effects of patient fatigue. The new technique of Blue-Yellow perimetry was compared with White-White perimetry for the detection of glaucomatous field loss in ocular hypertension (OHT) and primary open-angle glaucoma (POAG). Using a database of normal subjects, confidence limits for normality were constructed to account for the increase in between-subject variability with age and eccentricity, and for the greater variability of the Blue-Yellow field compared with the White-White field. Individual differences in ocular media absorption had little effect on Blue-Yellow field variability. Total and pattern probability analysis revealed that five of 27 OHT patients exhibited focal Blue-Yellow abnormalities; two of these patients subsequently developed White-White loss. Twelve of the 24 POAG patients showed wider and/or deeper Blue-Yellow loss compared with the White-White field. Blue-Yellow perimetry showed good sensitivity and specificity characteristics; however, lack of perimetric experience and the presence of cataract influenced the Blue-Yellow visual field and may confound the interpretation of Blue-Yellow visual field loss. Visual field indices demonstrated a moderate relationship to the structural parameters of the optic nerve head measured by scanning laser tomography. No abnormalities in Blue-Yellow or Red-Green colour CS were apparent for the OHT patients. A greater vulnerability of the short-wavelength-sensitive (SWS) pathway in glaucoma was demonstrated using Blue-Yellow perimetry; however, predicting which patients may benefit from Blue-Yellow perimetric examination is difficult. Furthermore, cataract and the extent of the field loss may limit the extent to which the integrity of the SWS channels can be selectively examined.

Relevance: 30.00%

Abstract:

The lognormal distribution has abundant applications in various fields. In the literature, most inferences on the two parameters of the lognormal distribution are based on Type-I censored sample data. However, exact measurements are not always attainable, especially when an observation is below or above the detection limits, and only the numbers of measurements falling into predetermined intervals can be recorded instead; this is the so-called grouped data. In this paper, we show the existence and uniqueness of the maximum likelihood estimators of the two parameters of the underlying lognormal distribution with Type-I censored data and grouped data. The proof is first established for the normal distribution and then extended to the lognormal distribution through the invariance property. The results are applied to estimate the median and mean of the lognormal population.
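For grouped data, the likelihood depends only on the interval probabilities, so the normal-distribution case reduces to maximizing a sum of counts times log interval probabilities. A minimal sketch of that log-likelihood (the bins and counts in any real use would come from the data):

```python
import math

def norm_cdf(x: float) -> float:
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def grouped_loglik(mu: float, sigma: float, bins, counts) -> float:
    """Log-likelihood of grouped normal data.

    bins:   list of (a_j, b_j) interval endpoints (may be +/- math.inf)
    counts: number of observations recorded in each interval
    """
    ll = 0.0
    for (a, b), n in zip(bins, counts):
        p = norm_cdf((b - mu) / sigma) - norm_cdf((a - mu) / sigma)
        ll += n * math.log(p)
    return ll
```

For lognormal data, apply the same function to log-transformed interval endpoints; by the invariance property, the maximizers of this likelihood are also the MLEs of the lognormal parameters.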

Relevance: 30.00%

Abstract:

The present work proposes a hypothesis test to detect a shift in the variance of a series of independent normal observations, using a statistic based on the p-values of the F distribution. Since the probability distribution function of this statistic is intractable, critical values were estimated numerically through extensive simulation. A regression approach was used to simplify quantile evaluation and extrapolation. The power of the test was assessed using Monte Carlo simulation, and the results were compared with the Chen (1997) test to demonstrate its efficiency. Time series analysts might find the test useful for homoscedasticity studies where at most one change might be involved.
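The simulation side of this approach is easy to sketch. The statistic below uses the maximum absolute log variance ratio over candidate split points rather than F-distribution p-values (computing those would require the incomplete beta function), but the Monte Carlo estimation of critical values works the same way; the sample size, minimum segment length, and seeds are arbitrary choices:

```python
import math
import random
import statistics

def max_variance_ratio_stat(x, min_seg: int = 10) -> float:
    """Max over candidate split points of |log(s1^2 / s2^2)| - a simple
    stand-in for the p-value-based statistic described in the abstract."""
    best = 0.0
    for k in range(min_seg, len(x) - min_seg):
        v1 = statistics.variance(x[:k])   # variance before the split
        v2 = statistics.variance(x[k:])   # variance after the split
        best = max(best, abs(math.log(v1 / v2)))
    return best

def mc_critical_value(n: int, alpha: float = 0.05,
                      reps: int = 300, seed: int = 1) -> float:
    """Estimate the null critical value by simulating i.i.d. N(0, 1) series."""
    rng = random.Random(seed)
    stats = sorted(max_variance_ratio_stat([rng.gauss(0.0, 1.0) for _ in range(n)])
                   for _ in range(reps))
    return stats[int((1.0 - alpha) * reps)]
```

A series whose standard deviation jumps partway through should produce a statistic well above the simulated critical value, while an unchanged series usually will not.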

Relevance: 30.00%

Abstract:

In an influential paper, Pesaran ('A simple panel unit root test in the presence of cross-section dependence', Journal of Applied Econometrics, Vol. 22, pp. 265-312, 2007) proposes two unit root tests for panels with a common factor structure. These are the CADF and CIPS test statistics, which are amongst the most popular test statistics in the literature. One feature of these statistics is that their limiting distributions are highly non-standard, making for relatively complicated implementation. In this paper, we take this feature as our starting point and develop modified CADF and CIPS test statistics that support standard chi-squared and normal inference.