12 results for Variable sample size X- control chart

in Aston University Research Archive


Relevance: 100.00%

Abstract:

Statistical software is now commonly available to calculate power (P') and sample size (N) for most experimental designs. In many circumstances, however, sample size is constrained by lack of time, by cost and, in research involving human subjects, by the problems of recruiting suitable individuals. In addition, the calculation of N is often based on erroneous assumptions about variability, and such estimates are therefore often inaccurate. At best, we would suggest that such calculations provide only a very rough guide to how to proceed in an experiment. Nevertheless, calculation of P' is very useful, especially in experiments that have failed to detect a difference which the experimenter thought was present. We would recommend that P' always be calculated in these circumstances to determine whether the experiment was actually too small to test null hypotheses adequately.
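As a minimal sketch of the kind of post-hoc power calculation recommended above, the snippet below estimates P' for a two-sample t-test with statsmodels; the effect size, group size and alpha are hypothetical values chosen only for illustration, not figures from the article.

```python
# Post-hoc power for a two-sample t-test (illustrative values only).
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()

effect_size = 0.5   # assumed standardised effect (Cohen's d)
n_per_group = 15    # subjects per group actually recruited
alpha = 0.05

power = analysis.solve_power(effect_size=effect_size,
                             nobs1=n_per_group,
                             alpha=alpha,
                             ratio=1.0,
                             alternative='two-sided')
print(f"Estimated power P' = {power:.2f}")  # a low value flags an underpowered study
```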

Relevance: 100.00%

Abstract:

The concept of sample size and statistical power estimation is now something that optometrists who want to perform research, whether in practice or in an academic institution, can no longer avoid. Ethics committees, journal editors and grant-awarding bodies increasingly request that all research be backed up with sample size and statistical power estimation in order to justify any study and its findings. This article presents a step-by-step guide to the process of determining sample size and statistical power. It builds on statistical concepts presented in earlier articles in Optometry Today by Richard Armstrong and Frank Eperjesi.
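As a minimal sketch of the step-by-step sample-size estimate described above, the snippet below solves for the required group size with statsmodels; the target effect size and power are assumptions made purely for illustration.

```python
# Prospective sample-size estimate for a two-sample t-test (illustrative values only).
from statsmodels.stats.power import TTestIndPower

# Step 1: specify the smallest effect worth detecting (Cohen's d).
effect_size = 0.4
# Step 2: choose the significance level and the power required.
alpha, power = 0.05, 0.80
# Step 3: solve for the number of subjects needed per group.
n_required = TTestIndPower().solve_power(effect_size=effect_size,
                                         alpha=alpha,
                                         power=power,
                                         ratio=1.0,
                                         alternative='two-sided')
print(f"Approximately {n_required:.0f} subjects per group are needed.")
```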

Relevance: 100.00%

Abstract:

The performance of seven minimization algorithms is compared on five neural network problems. These include a variable-step-size algorithm, conjugate gradient, and several methods with explicit analytic or numerical approximations to the Hessian.
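As a minimal sketch in the spirit of such a comparison, but not reproducing the paper's benchmarks, the snippet below minimises a simple convex stand-in objective with conjugate gradient and two approximate-Hessian methods from SciPy; the objective and starting point are invented for illustration.

```python
# Comparing minimisation algorithms on a toy objective (illustrative only).
import numpy as np
from scipy.optimize import minimize

TARGET = np.array([1.0, -2.0, 0.5])

def objective(w):
    # Simple convex surrogate for a network training loss.
    return np.sum((w - TARGET) ** 2)

def gradient(w):
    return 2.0 * (w - TARGET)

w0 = np.zeros(3)
for method in ("CG", "BFGS", "Newton-CG"):
    res = minimize(objective, w0, method=method, jac=gradient)
    print(method, "function evaluations:", res.nfev, "final loss:", res.fun)
```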

Relevance: 100.00%

Abstract:

The Vapnik-Chervonenkis (VC) dimension is a combinatorial measure of a certain class of machine learning problems, which may be used to obtain upper and lower bounds on the number of training examples needed to learn to prescribed levels of accuracy. Most of the known bounds apply to the Probably Approximately Correct (PAC) framework, which is the framework within which we work in this paper. For a learning problem with some known VC dimension, much is known about the order of growth of the problem's sample-size requirement as a function of the PAC parameters. The exact value of the sample-size requirement is, however, less well known, and depends heavily on the particular learning algorithm being used. This is a major obstacle to the practical application of the VC dimension. Hence it is important to know exactly how the sample-size requirement depends on the VC dimension and, with that in mind, we describe a general algorithm for learning problems having VC dimension 1. Its sample-size requirement is minimal (as a function of the PAC parameters) and turns out to be the same for all non-trivial learning problems having VC dimension 1. While the method used cannot be naively generalised to higher VC dimension, it suggests that optimal algorithm-dependent bounds may improve substantially on current upper bounds.
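As a minimal sketch of the order-of-growth relationship mentioned above, the snippet below evaluates the generic VC-based sample-size estimate m = O((1/ε)(d ln(1/ε) + ln(1/δ))). The constants in published bounds differ between proofs; the constant used here is illustrative only and is not taken from the paper, which concerns exact algorithm-dependent requirements rather than this generic form.

```python
# Generic VC-based PAC sample-size estimate (illustrative constant).
import math

def pac_sample_bound(d, epsilon, delta, c=1.0):
    """Order-of-growth estimate m ~ (c/eps) * (d*ln(1/eps) + ln(1/delta))."""
    return math.ceil(c / epsilon * (d * math.log(1.0 / epsilon)
                                    + math.log(1.0 / delta)))

# Example: a VC-dimension-1 class with epsilon = delta = 0.05.
print(pac_sample_bound(d=1, epsilon=0.05, delta=0.05))
```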

Relevance: 100.00%

Abstract:

Researchers often use 3-way interactions in moderated multiple regression analysis to test the joint effect of 3 independent variables on a dependent variable. However, further probing of significant interaction terms varies considerably and is sometimes error prone. The authors developed a significance test for slope differences in 3-way interactions and illustrated its importance for testing psychological hypotheses. Monte Carlo simulations revealed that sample size, the magnitude of the slope difference, and data reliability affected the power of the test. Application of the test to published data yielded detection of some slope differences that had gone undetected by alternative probing techniques and led to changes in results and conclusions. The authors conclude by discussing the test's applicability for psychological research. Copyright 2006 by the American Psychological Association.
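As a minimal sketch of the kind of model being probed, the snippet below fits a moderated regression with a full 3-way interaction in statsmodels on simulated data and inspects the x:z:w term. It does not implement the authors' slope-difference significance test; the variable names and coefficients are invented for illustration.

```python
# Fitting a 3-way interaction model on simulated data (illustrative only).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 300
df = pd.DataFrame({"x": rng.normal(size=n),
                   "z": rng.normal(size=n),
                   "w": rng.normal(size=n)})
df["y"] = 0.5 * df["x"] + 0.3 * df["x"] * df["z"] * df["w"] + rng.normal(size=n)

model = smf.ols("y ~ x * z * w", data=df).fit()  # expands all lower-order terms
print(model.summary().tables[1])                 # inspect the x:z:w coefficient
```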

Relevance: 100.00%

Abstract:

Multiple regression analysis is a complex statistical method with many potential uses. It has also become one of the most abused of all statistical procedures, since anyone with a database and suitable software can carry it out. An investigator should always have a clear hypothesis in mind before carrying out such a procedure, and should know the limitations of each aspect of the analysis. In addition, multiple regression is probably best used in an exploratory context, identifying variables that might profitably be examined by more detailed studies. Where there are many variables potentially influencing Y, they are likely to be intercorrelated and to account for relatively small amounts of the variance. Any analysis in which R squared is less than 50% should be suspect as probably not indicating the presence of significant variables. A further problem relates to sample size. It is often stated that the number of subjects or patients must be at least 5-10 times the number of variables included in the study. This advice should be taken only as a rough guide, but it does indicate that the variables included should be selected with great care, as inclusion of an obviously unimportant variable may have a significant impact on the sample size required.
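As a minimal sketch of the cautions above, the snippet below fits a multiple regression on simulated data, inspects R squared, and applies the rough 5-10 subjects-per-variable rule of thumb; the variable counts and data are invented for illustration.

```python
# Multiple regression diagnostics on simulated data (illustrative only).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n_subjects, n_vars = 60, 6
X = rng.normal(size=(n_subjects, n_vars))
y = 0.8 * X[:, 0] + rng.normal(size=n_subjects)

fit = sm.OLS(y, sm.add_constant(X)).fit()
print(f"R-squared = {fit.rsquared:.2f}")                     # suspect if well below 0.5
print(f"Subjects per variable = {n_subjects / n_vars:.1f}")  # rule of thumb: at least 5-10
```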

Relevance: 100.00%

Abstract:

The use of antibiotics was investigated in twelve acute hospitals in England. Data were collected electronically and by questionnaire for the financial years 2001/2, 2002/3 and 2003/4. Hospitals were selected on the basis of their Medicines Management Self-Assessment Scores (MMAS) and included a cohort of three hospitals with integrated electronic prescribing systems. The total sample size was 6.65% of English NHS activity for 2001/2 based on Finished Consultant Episode (FCE) numbers. Data collected included all antibiotics dispensed (ATC category J01), hospital activity (FCEs and bed-days), Medicines Management Self-Assessment scores, Antibiotic Medicines Management scores (AMS), the Primary Care Trust (PCT) of origin of referral populations, PCT antibiotic prescribing rates, and the Index of Multiple Deprivation (IMD) for each PCT. The DDD/FCE (Defined Daily Dose per FCE) was found to correlate with the DDD per 100 bed-days (r = 0.74), with the sample mean at 3.48 DDD/FCE in 2001/2 and 3.34 DDD/FCE in 2003/4. The MMAS and AMS were found to correlate (r = 0.74), linking overall medicines management to the control of antibiotic use. No correlation was found between the MMAS and a range of qualitative indicators of antibiotic use. A number of indicators are proposed as triggers for further investigation, including a proportion of 0.24 for the ratio of third-generation to first/second-generation cephalosporin use, and five percent as the limit for parenteral quinolone DDD as a proportion of total quinolone DDD usage. It was possible to demonstrate a correlation between the IMD 2000 and primary care antibiotic prescribing rates, but not between primary and secondary care antibiotic prescribing rates for the same referral population, or between the weighted mean IMD 2000 for each hospital's referral population and the hospital antibiotic prescribing rate.

Relevance: 100.00%

Abstract:

Citation information: Armstrong RA, Davies LN, Dunne MCM & Gilmartin B. Statistical guidelines for clinical studies of human vision. Ophthalmic Physiol Opt 2011, 31, 123-136. doi: 10.1111/j.1475-1313.2010.00815.x

ABSTRACT: Statistical analysis of data can be complex, and different statisticians may disagree as to the correct approach, leading to conflict between authors, editors, and reviewers. The objective of this article is to provide some statistical advice for contributors to optometric and ophthalmic journals, to provide advice specifically relevant to clinical studies of human vision, and to recommend statistical analyses that could be used in a variety of circumstances. In submitting an article in which quantitative data are reported, authors should describe clearly the statistical procedures that they have used and justify each stage of the analysis. This is especially important if more complex or 'non-standard' analyses have been carried out. The article begins with some general comments relating to data analysis concerning sample size and 'power', hypothesis testing, parametric and non-parametric variables, 'bootstrap methods', one- and two-tailed testing, and the Bonferroni correction. More specific advice is then given with reference to particular statistical procedures that can be used on a variety of types of data. Where relevant, examples of correct statistical practice are given with reference to recently published articles in the optometric and ophthalmic literature.
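As a minimal sketch of one of the general points listed above, the Bonferroni correction for multiple comparisons, the snippet below adjusts a set of hypothetical p-values with statsmodels; the p-values are invented and not drawn from the article.

```python
# Bonferroni correction of several raw p-values (hypothetical values).
from statsmodels.stats.multitest import multipletests

p_values = [0.012, 0.030, 0.049, 0.21]
reject, p_adjusted, _, _ = multipletests(p_values, alpha=0.05, method="bonferroni")
print(reject)       # which null hypotheses survive the correction
print(p_adjusted)   # Bonferroni-adjusted p-values
```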

Relevance: 100.00%

Abstract:

Over the last few years, zonisamide has been proposed as a potentially useful medication for patients with focal seizures, with or without secondary generalization. Since psychiatric adverse effects, including mania, psychosis, and suicidal ideation, have been associated with its use, it was suggested that the presence of antecedent psychiatric disorders is an important factor associated with the discontinuation of zonisamide therapy in patients with epilepsy. We, therefore, set out to assess the tolerability profile of zonisamide in a retrospective chart review of 23 patients with epilepsy and comorbid mental disorders, recruited from two specialist pediatric (n=11) and adult (n=12) neuropsychiatry clinics. All patients had a clinical diagnosis of treatment-refractory epilepsy after extensive neurophysiological and neuroimaging investigations. The vast majority of patients (n=22/23, 95.7%) had tried previous antiepileptic medications, and most adult patients (n=9/11, 81.8%) were on concomitant medication for epilepsy. In the majority of cases, the psychiatric adverse effects of zonisamide were not severe. Four patients (17.4%) discontinued zonisamide because of lack of efficacy, whereas only one patient (4.3%) discontinued it because of the severity of psychiatric adverse effects (major depressive disorder). The low discontinuation rate of zonisamide in a selected population of patients with epilepsy and neuropsychiatric comorbidity suggests that this medication is safe and reasonably well-tolerated for use in patients with treatment-refractory epilepsy. Given the limitations of the present study, including the relatively small sample size, further research is warranted to confirm this finding. © 2013 Elsevier Inc.

Relevance: 100.00%

Abstract:

The principal theme of this thesis is the identification of additional factors affecting soft contact lens fit and, consequently, the better prediction of that fit. Various models have been put forward in an attempt to predict the parameters that influence soft contact lens fit dynamics; however, the factors that influence variation in soft lens fit are still not fully understood. The investigations in this body of work involved the use of a variety of different imaging techniques to both quantify the anterior ocular topography and assess lens fit. The use of Anterior-Segment Optical Coherence Tomography (AS-OCT) allowed for a more complete characterisation of the cornea and corneoscleral profile (CSP) than either conventional keratometry or videokeratoscopy alone, and for the collection of normative data relating to the CSP for a substantial sample size. The scleral face was identified as being rotationally asymmetric, the mean corneoscleral junction (CSJ) angle being sharpest nasally and becoming progressively flatter at the temporal, inferior and superior limbal junctions. Additionally, 77% of all CSJ angles were within ±5° of 180°, demonstrating an almost tangential extension of the cornea to form the paralimbal sclera. Use of AS-OCT allowed for a more robust determination of corneal diameter than white-to-white (WTW) measurement, which is highly variable and dependent on changes in peripheral corneal transparency. Significant differences in ocular topography were found between different ethnicities and sexes, most notably for corneal diameter and corneal sagittal height variables. Lens tightness was found to be significantly correlated with the difference between horizontal CSJ angles (r = +0.40, P = 0.0086). Modelling of the CSP data gained allowed for the prediction of up to 24% of the variance in contact lens fit; however, it is likely that stronger associations, and an increase in the modelled prediction of variance in fit, would have been found had an objective method of lens fit assessment been available. A subsequent investigation to determine the validity and repeatability of objective contact lens fit assessment using digital video capture showed no significant benefit over subjective evaluation. The technique was, however, employed in the ensuing investigation to show significant changes in lens fit between 8 hours (the longest duration of wear previously examined) and 16 hours, demonstrating that wearing time is an additional factor driving lens fit dynamics. Modelling of data from enhanced videokeratoscopy composite maps alone allowed up to 77% of the variance in soft contact lens fit to be predicted, and up to almost 90% when used in conjunction with OCT. The investigations provided further insight into the ocular topography and the factors affecting soft contact lens fit.

Relevance: 100.00%

Abstract:

The microstructure and thermoelectric properties of Yb-doped Ca0.9-xYbxLa0.1MnO3 (0 ≤ x ≤ 0.05) ceramics, prepared using powders derived from the Pechini method, have been investigated. X-ray diffraction analysis has shown that all samples exhibit a single phase with an orthorhombic perovskite structure. All ceramic samples possess high relative densities, ranging from 97.04% to 98.65%. The Seebeck coefficient is negative, indicating n-type conduction in all samples. The substitution of Yb for Ca leads to a marked decrease in the electrical resistivity, along with a moderate decrease in the absolute value of the Seebeck coefficient. The highest power factor is obtained for the sample with x = 0.05. The electrical conduction in these compounds is due to electrons hopping between Mn3+ and Mn4+, which is enhanced by increasing the Yb content.

Relevance: 100.00%

Abstract:

Privately owned water utilities typically operate under a regulated monopoly regime. Price-cap regulation has been introduced as a means to enhance efficiency and innovation. The main objective of this paper is to propose a methodology for measuring productivity change across companies and over time when the sample size is limited. An empirical application is developed for the UK water and sewerage companies (WaSCs) for the period 1991-2008. A panel index approach is applied to decompose and derive unit-specific productivity growth as a function of the productivity growth achieved by benchmark firms and the catch-up to the benchmark firm achieved by less productive firms. The results indicated that significant gains in productivity occurred after 2000, when the regulator set tighter reviews. However, the average WaSC must still improve towards the benchmark firm by 2.69% over a period of five years to achieve comparable performance. This study is relevant to regulators who are interested in developing comparative performance measurement when the number of water companies that can be evaluated is limited. Moreover, setting an appropriate X factor is essential to improving the efficiency of water companies, and this study helps to meet that challenge.
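As a minimal, stylised sketch of the decomposition described above, the snippet below splits a firm's productivity growth into the growth achieved by the benchmark firm (frontier shift) and the firm's catch-up to that benchmark. The figures are invented, and the calculation is far simpler than the paper's panel index methodology.

```python
# Stylised decomposition: firm growth = frontier shift x catch-up (multiplicatively).
productivity = {                    # productivity index by firm and year (invented)
    "benchmark": {2007: 1.00, 2008: 1.03},
    "firm_a":    {2007: 0.85, 2008: 0.90},
}

def growth(series, t0, t1):
    return series[t1] / series[t0] - 1.0

frontier_shift = growth(productivity["benchmark"], 2007, 2008)
firm_growth = growth(productivity["firm_a"], 2007, 2008)
catch_up = (1.0 + firm_growth) / (1.0 + frontier_shift) - 1.0

print(f"firm growth {firm_growth:.3f} = frontier shift {frontier_shift:.3f} "
      f"combined with catch-up {catch_up:.3f}")
```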