910 results for Chi-Square Distribution
Abstract:
"NAVWEPS report 7770. NOTS TP 2749."
Abstract:
Throughout this article, it is assumed that the non-central chi-square chart with two-stage sampling (TSS Chi-square chart) is employed to monitor a process where the observations of the quality characteristic of interest X are independent and identically normally distributed with mean μ and variance σ². The process is considered to start with the mean and the variance on target (μ = μ0; σ² = σ0²), but at some random time in the future an assignable cause shifts the mean from μ0 to μ1 = μ0 ± δσ0, δ > 0, and/or increases the variance from σ0² to σ1² = γ²σ0², γ > 1. Before the assignable cause occurs, the process is considered to be in a state of statistical control (the in-control state). As with the Shewhart charts, samples of size n0 + 1 are taken from the process at regular time intervals. The sampling is performed in two stages. At the first stage, the first item of the i-th sample is inspected. If its X value, say X_i1, is close to the target value (|X_i1 − μ0| < w0σ0, w0 > 0), then the sampling is interrupted. Otherwise, at the second stage, the remaining n0 items are inspected and the following statistic is computed: W_i = Σ_{j=2}^{n0+1} (X_ij − μ0 + ξ_i σ0)², i = 1, 2, … Let d be a positive constant; then ξ_i = d if X_i1 > μ0, otherwise ξ_i = −d. A signal is given at sample i if |X_i1 − μ0| > w0σ0 and W_i > k_Chi σ0², where k_Chi is the factor used in determining the upper control limit for the non-central chi-square chart. If devices such as go/no-go gauges can be used, then measurements are not required except when the sampling goes to the second stage. Let P be the probability of deciding that the process is in control, and P_i, i = 1, 2, the probability of deciding that the process is in control at stage i of the sampling procedure.
Thus P = P1 + P2 − P1P2, with P1 = Pr[μ0 − w0σ0 ≤ X ≤ μ0 + w0σ0] and P2 = Pr[W ≤ k_Chi σ0²]. (3) During the in-control period, W/σ0² is distributed as a non-central chi-square distribution with n0 degrees of freedom and non-centrality parameter λ0 = n0d², i.e., W/σ0² ~ χ²_{n0}(λ0). During the out-of-control period, W/σ1² is distributed as a non-central chi-square distribution with n0 degrees of freedom and non-centrality parameter λ1 = n0(δ + ξ)²/γ². The effectiveness of a control chart in detecting a process change can be measured by the average run length (ARL), the expected number of samples taken before a process shift is detected. The ARL for the proposed chart is easily determined because, in this case, the number of samples before a signal is a geometrically distributed random variable with parameter 1 − P, so that ARL = 1/(1 − P). It is shown that the performance of the proposed chart is better than that of the joint X̄ and R charts. Furthermore, if the TSS Chi-square chart is used for monitoring diameters, volumes, weights, etc., then appropriate devices, such as go/no-go gauges, can be used to decide whether the sampling should go to the second stage. When the process is stable and the joint X̄ and R charts are in use, the monitoring becomes monotonous because an X̄ or R value rarely falls outside the control limits. The natural consequence is that the user pays less and less attention to the steps required to obtain the X̄ and R values. In some cases, this lack of attention can result in serious mistakes. The TSS Chi-square chart has the advantage that most samplings are interrupted; consequently, most of the time the user will be working with attributes. Our experience shows that inspecting one item by attribute is much less monotonous than measuring four or five items at each sampling.
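The two-stage signal rule and the geometric run-length argument above can be sketched with a small Monte Carlo simulation. This is a minimal sketch: the parameter values (n0, w0, d, k_chi) are illustrative assumptions, not values taken from the paper.

```python
import random

def simulate_P(n0=4, w0=1.0, d=1.0, k_chi=20.0, mu0=0.0, sigma0=1.0,
               reps=200_000, seed=1):
    """Estimate P, the probability of deciding the process is in control,
    for an in-control process monitored by the TSS Chi-square chart."""
    rng = random.Random(seed)
    in_control = 0
    for _ in range(reps):
        x1 = rng.gauss(mu0, sigma0)
        # Stage 1: if the first item is close to target, sampling stops.
        if abs(x1 - mu0) < w0 * sigma0:
            in_control += 1
            continue
        # Stage 2: inspect the remaining n0 items and compute W.
        xi = d if x1 > mu0 else -d
        w = sum((rng.gauss(mu0, sigma0) - mu0 + xi * sigma0) ** 2
                for _ in range(n0))
        if w <= k_chi * sigma0 ** 2:  # no signal
            in_control += 1
    return in_control / reps

P = simulate_P()
# Run length until a (false) signal is geometric with parameter 1 - P.
ARL = 1.0 / (1.0 - P)
```

With these illustrative settings, stage 1 interrupts roughly 68% of samplings (the Pr[|X − μ0| < w0σ0] band of a standard normal), so most inspections never require the full sample.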
Abstract:
When the data are counts or frequencies of particular events and can be expressed as a contingency table, they can be analysed using the chi-square distribution. When applied to a 2 × 2 table, the test is approximate, and care needs to be taken in analysing tables when the expected frequencies are small, either by applying Yates' correction or by using Fisher's exact test. Larger contingency tables can also be analysed using this method. Note that it is a serious statistical error to use any of these tests on measurement data!
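As a rough sketch of the approximate 2 × 2 test described above, the Pearson statistic with Yates' continuity correction can be computed directly from the table; the counts used here are made up for illustration.

```python
def chi2_yates_2x2(a, b, c, d):
    """Pearson chi-square statistic for a 2x2 table [[a, b], [c, d]]
    with Yates' continuity correction (1 degree of freedom)."""
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    # Expected frequencies under independence: (row total * col total) / n
    expected = [row1 * col1 / n, row1 * col2 / n,
                row2 * col1 / n, row2 * col2 / n]
    observed = [a, b, c, d]
    # Yates' correction: subtract 0.5 from each |O - E| before squaring
    return sum((abs(o - e) - 0.5) ** 2 / e
               for o, e in zip(observed, expected)), expected

stat, expected = chi2_yates_2x2(20, 10, 5, 25)
# Compare stat against the chi-square critical value with 1 df
# (3.841 at the 5% level); here every expected count is >= 12.5,
# so the approximation is reasonable.
```

When any expected frequency is small (a common rule of thumb is below 5), Fisher's exact test is the safer choice, as the abstract notes.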
Abstract:
The vibro-acoustic response of built-up structures, consisting of stiff components with low modal density and flexible components with high modal density, is sensitive to small imperfections in the flexible components. In this paper, the uncertainty of the response is considered by modeling the low modal density master system as deterministic and the high modal density subsystems in a nonparametric stochastic way, i.e., carrying a diffuse wave field, and by subsequently computing the response probability density function. The master system's mean squared response amplitude follows a singular noncentral complex Wishart distribution conditional on the subsystem energies. For a single degree of freedom, this is equivalent to a chi-square or an exponential distribution, depending on the loading conditions. The subsystem energies follow approximately a chi-square distribution when their relative variance is smaller than unity. The results are validated by application to plate structures, and good agreement with Monte Carlo simulations is found. © 2012 Acoustical Society of America.
Abstract:
Student’s t-distribution has found various applications in mathematical statistics. One of the main properties of the t-distribution is to converge to the normal distribution as the number of samples tends to infinity. In this paper, by using a Cauchy integral we introduce a generalization of the t-distribution function with four free parameters and show that it converges to the normal distribution again. We provide a comprehensive treatment of mathematical properties of this new distribution. Moreover, since the Fisher F-distribution has a close relationship with the t-distribution, we also introduce a generalization of the F-distribution and prove that it converges to the chi-square distribution as the number of samples tends to infinity. Finally some particular sub-cases of these distributions are considered.
Abstract:
In this paper, we consider the non-central chi-square chart with two-stage sampling. During the first stage, one item of the sample is inspected and, depending on the result, the sampling is either interrupted or goes on to the second stage, where the remaining sample items are inspected and the non-central chi-square statistic is computed. The proposed chart is not only more sensitive than the joint X̄ and R charts, but operationally simpler too, particularly when appropriate devices, such as go/no-go gauges, can be used to decide whether the sampling should go on to the second stage. (c) 2004 Elsevier B.V. All rights reserved.
Abstract:
Traditionally, an X̄-chart is used to control the process mean and an R-chart to control the process variance. However, these charts are not sensitive to small changes in the process parameters. A good alternative to these charts is the exponentially weighted moving average (EWMA) control chart for controlling the process mean and variability, which is very effective in detecting small process disturbances. In this paper, we propose a single chart based on the non-central chi-square statistic, which is more effective than the joint X̄ and R charts in detecting assignable cause(s) that change the process mean and/or increase variability. It is also shown that the EWMA control chart based on a non-central chi-square statistic is more effective in detecting both increases and decreases in the mean and/or variability.
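The EWMA chart referred to above smooths successive chart statistics with the recursion Z_i = λX_i + (1 − λ)Z_{i−1}, starting from the target value, which is what makes it sensitive to small sustained shifts. A minimal sketch, with an assumed smoothing constant λ = 0.2 and made-up observations (neither value is from the paper):

```python
def ewma(values, target, lam=0.2):
    """Return the EWMA sequence Z_i = lam*X_i + (1 - lam)*Z_{i-1},
    initialised at the process target."""
    z = target
    out = []
    for x in values:
        z = lam * x + (1 - lam) * z
        out.append(z)
    return out

# Small sustained shifts accumulate in the smoothed statistic even
# when each raw observation stays inside Shewhart-style limits.
zs = ewma([10.2, 9.8, 10.5, 11.0], target=10.0)
```

In practice each Z_i is compared against control limits that narrow as λ decreases; smaller λ gives more weight to history and hence more sensitivity to small shifts, at the cost of slower response to large ones.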
Abstract:
We introduce a diagnostic test for the mixing distribution in a generalised linear mixed model. The test is based on the difference between the marginal maximum likelihood and conditional maximum likelihood estimates of a subset of the fixed effects in the model. We derive the asymptotic variance of this difference, and propose a test statistic that has a limiting chi-square distribution under the null hypothesis that the mixing distribution is correctly specified. For the important special case of the logistic regression model with random intercepts, we evaluate via simulation the power of the test in finite samples under several alternative distributional forms for the mixing distribution. We illustrate the method by applying it to data from a clinical trial investigating the effects of hormonal contraceptives in women.
Abstract:
The purpose of this research is to develop a new statistical method to determine the minimum set of rows (R) in an R x C contingency table of discrete data that explains the dependence of observations. The statistical power of the method will be empirically determined by computer simulation to judge its efficiency over the presently existing methods. The method will be applied to data on DNA fragment length variation at six VNTR loci in over 72 populations from five major racial groups of humans (total sample size over 15,000 individuals, each sample having at least 50 individuals). DNA fragment lengths grouped in bins will form the basis for studying inter-population DNA variation within the racial groups and, where such variation is significant, will provide a rigorous re-binning procedure for forensic computation of DNA profile frequencies that takes into account intra-racial DNA variation among populations.
Abstract:
Previous literature has focused on the need for support of undergraduate nursing students during clinical placements. Little is known about the support provided by employers for registered nurses (RNs) who pursue further education. This study sought to identify and describe the types, levels and perceived need for support in the workplace for RNs as they undertake further postgraduate nursing study by distance education (DE). Using an exploratory descriptive design, a self-report questionnaire was distributed to a convenience sample of 270 RNs working in one acute care public hospital in Tasmania, Australia. 92 questionnaires (response rate 34%) were returned, with 26 (28%) respondents reporting being currently enrolled in further study by DE and a further 50 (54%) planning future study. Results revealed that 100% of participants with a Masters degree completed it by DE. There were differences between the support sought by RNs and that offered by employers, and 16 (34%) of those who had done or were currently doing DE study received no support to undertake DE. There was an overwhelming desire by RNs for support (87; 94%), with a majority believing some support should be mandatory (76; 83%). This study may encourage employers to introduce structured support systems that will actively assist nurses to pursue further study. © 2010.
Abstract:
Background: Although the potential to reduce hospitalisation and mortality in chronic heart failure (CHF) is well reported, the feasibility of receiving healthcare by structured telephone support or telemonitoring is not. Aims: To determine adherence, adaptation and acceptability to a national nurse-coordinated telephone-monitoring CHF management strategy: the Chronic Heart Failure Assistance by Telephone Study (CHAT). Methods: Triangulation of descriptive statistics, feedback surveys and qualitative analysis of clinical notes. The cohort comprised standard care plus intervention (SC + I) participants who completed the first year of the study. Results: 30 GPs (70% rural) randomised to SC + I recruited 79 eligible participants, of whom 60 (76%) completed the full 12-month follow-up period. During this time 3619 calls were made into the CHAT system (mean 45.81, SD ± 79.26, range 0-369). Overall, adherence to the study protocol was 65.8% (95% CI 0.54-0.75; p = 0.001); however, among the 60 participants who completed the 12-month follow-up period, adherence was significantly higher at 92.3% (95% CI 0.82-0.97, p ≤ 0.001). Only 3% of this elderly group (mean age 74.7 ± 9.3 years) were unable to learn or competently use the technology. Participants rated CHAT with a total acceptability rate of 76.45%. Conclusion: This study shows that elderly CHF patients can adapt quickly, find telephone monitoring an acceptable part of their healthcare routine, and are able to maintain good adherence for at least 12 months. © 2007.
Abstract:
1. The low density lipoprotein receptor is an important regulator of serum cholesterol which may have implications for the development of both hypertension and obesity. In this study, genotypes for a low density lipoprotein receptor gene (LDLR) dinucleotide polymorphism were determined in both lean and obese normotensive populations. 2. In previous cross-sectional association studies an ApaLI and a HincII polymorphism for LDLR were shown to be associated with obesity in essential hypertensives. However, these polymorphisms did not show an association with obesity in normotensives. 3. In contrast, this study reports that preliminary results for an LDLR microsatellite marker, located more towards the 3' end of the gene, show a significant association with obesity in the normotensive population studied. These results indicate that LDLR could play an important role in the development of obesity, which might be independent of hypertension.
Abstract:
Objective: Modern series from high-volume esophageal centers report an approximate 40% 5-year survival in patients treated with curative intent and postoperative mortality rates of less than 4%. An objective analysis of factors that underpin current benchmarks within high-volume centers has not been performed. Methods: Three time periods were studied, 1990 to 1998 (period 1), 1999 to 2003 (period 2), and 2004 to 2008 (period 3), in which 471, 254, and 342 patients, respectively, with esophageal cancer were treated with curative intent. All data were prospectively recorded, and staging, pathology, treatment, operative, and oncologic outcomes were compared. Results: Five-year disease-specific survival was 28%, 35%, and 44%, and in-hospital postoperative mortality was 6.7%, 4.4%, and 1.7% for periods 1 to 3, respectively (P < .001). Period 3, compared with periods 1 and 2, respectively, was associated with significantly (P < .001) more early tumors (17% vs 4% and 6%), higher nodal yields (median 22 vs 11 and 18), and a higher R0 rate in surgically treated patients (81% vs 73% and 75%). The use of multimodal therapy increased (P < .05) across time periods. By multivariate analysis, age, T stage, N stage, vascular invasion, R status, and time period were significantly (P < .0001) associated with outcome. Conclusions: Improved survival with localized esophageal cancer in the modern era may reflect an increase of early tumors and optimized staging. Important surgical and pathologic standards, including a higher R0 resection rate and nodal yields, and lower postoperative mortality, were also observed. Copyright © 2012 by The American Association for Thoracic Surgery.
Abstract:
Purpose The role played by the innate immune system in determining survival from non-small-cell lung cancer (NSCLC) is unclear. The aim of this study was to investigate the prognostic significance of macrophage and mast-cell infiltration in NSCLC. Methods We used immunohistochemistry to identify tryptase+ mast cells and CD68+ macrophages in the tumor stroma and tumor islets in 175 patients with surgically resected NSCLC. Results Macrophages were detected in both the tumor stroma and islets in all patients. Mast cells were detected in the stroma and islets in 99.4% and 68.5% of patients, respectively. Using multivariate Cox proportional hazards analysis, increasing tumor islet macrophage density (P < .001) and tumor islet/stromal macrophage ratio (P < .001) emerged as favorable independent prognostic indicators. In contrast, increasing stromal macrophage density was an independent predictor of reduced survival (P = .001). The presence of tumor islet mast cells (P = .018) and increasing islet/stromal mast-cell ratio (P = .032) were also favorable independent prognostic indicators. Macrophage islet density showed the strongest effect: 5-year survival was 52.9% in patients with an islet macrophage density greater than the median versus 7.7% when less than the median (P < .0001). In the same groups, respectively, median survival was 2,244 versus 334 days (P < .0001). Patients with a high islet macrophage density but incomplete resection survived markedly longer than patients with a low islet macrophage density but complete resection. Conclusion The tumor islet CD68+ macrophage density is a powerful independent predictor of survival from surgically resected NSCLC. The biologic explanation for this and its implications for the use of adjunctive treatment requires further study. © 2005 by American Society of Clinical Oncology.