948 results for coefficient of variance
Abstract:
The two-way design has been variously described as a matched-sample F-test, a simple within-subjects ANOVA, a one-way within-groups ANOVA, a simple correlated-groups ANOVA, and a one-factor repeated measures design! This confusion of terminology is likely to lead to problems in correctly identifying this analysis within commercially available software. The essential feature of the design is that each treatment is allocated by randomization to one experimental unit within each group or block. The block may be a plot of land, a single occasion on which the experiment was performed, or a human subject. The ‘blocking’ is designed to remove an aspect of the error variation and increase the ‘power’ of the experiment. If there is no significant source of variation associated with the ‘blocking’, then the two-way design is at a disadvantage because the DF of the error term are reduced compared with a fully randomised design, thus reducing the ‘power’ of the analysis.
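A minimal sketch of how such a randomised-blocks (two-way) analysis might be run in Python with statsmodels is shown below; the block labels, treatment labels and responses are invented for illustration, and the residual (treatment × block) term serves as the error term.

```python
# Sketch of a randomised-blocks (two-way) ANOVA with pandas/statsmodels;
# the data frame and column names below are hypothetical.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# One observation per treatment within each block (a block could be a plot
# of land, an occasion, or a human subject).
data = pd.DataFrame({
    "block":     ["B1"] * 3 + ["B2"] * 3 + ["B3"] * 3 + ["B4"] * 3,
    "treatment": ["T1", "T2", "T3"] * 4,
    "response":  [5.1, 6.3, 7.0, 4.8, 6.0, 6.9, 5.4, 6.5, 7.3, 5.0, 6.1, 7.1],
})

# Treatment and block both enter as factors; the residual term (treatment x
# block) is the error term of this design.
model = smf.ols("response ~ C(treatment) + C(block)", data=data).fit()
print(sm.stats.anova_lm(model, typ=2))
```

If the block mean square turns out no larger than the residual mean square, the loss of error DF relative to a fully randomised design can outweigh the benefit of blocking, as noted above.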
Abstract:
There is an alternative model of the one-way ANOVA, called the ‘random effects’ model or ‘nested’ design, in which the objective is not to test specific effects but to estimate the degree of variation of a particular measurement and to compare different sources of variation that influence the measurement in space and/or time. The most important statistics from a random effects model are the components of variance, which estimate the variance associated with each of the sources of variation influencing a measurement. The nested design is particularly useful in preliminary experiments designed to estimate different sources of variation and in the planning of appropriate sampling strategies.
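As an illustration of the components of variance mentioned above, the following hedged sketch estimates the between-unit and within-unit variance components of a balanced one-way random effects design by the expected-mean-squares (method-of-moments) approach; the unit names and measurements are hypothetical.

```python
# Hedged sketch: components of variance for a balanced one-way
# 'random effects' ANOVA, estimated from the mean squares.
import numpy as np

groups = {
    "unit1": [12.1, 11.8, 12.4],
    "unit2": [14.0, 13.6, 14.3],
    "unit3": [11.2, 11.5, 11.0],
    "unit4": [13.1, 12.8, 13.4],
}

k = len(groups)                       # number of randomly sampled units
n = len(next(iter(groups.values())))  # replicates per unit (balanced design)
grand_mean = np.mean([x for vals in groups.values() for x in vals])

ss_between = n * sum((np.mean(v) - grand_mean) ** 2 for v in groups.values())
ss_within = sum(((np.array(v) - np.mean(v)) ** 2).sum() for v in groups.values())

ms_between = ss_between / (k - 1)
ms_within = ss_within / (k * (n - 1))

# E[MS_within] = sigma^2_within; E[MS_between] = sigma^2_within + n * sigma^2_between
var_within = ms_within
var_between = max((ms_between - ms_within) / n, 0.0)

print(f"within-unit variance component:  {var_within:.3f}")
print(f"between-unit variance component: {var_between:.3f}")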
Abstract:
Experiments combining different groups or factors are a powerful method of investigation in applied microbiology. ANOVA enables not only the effect of individual factors to be estimated but also their interactions; information which cannot be obtained readily when factors are investigated separately. In addition, combining different treatments or factors in a single experiment is more efficient and often reduces the number of replications required to estimate treatment effects adequately. Because of the treatment combinations used in a factorial experiment, the degrees of freedom (DF) of the error term in the ANOVA is a more important indicator of the ‘power’ of the experiment than simply the number of replicates. A good method is to ensure, where possible, that sufficient replication is present to achieve 15 DF for each error term of the ANOVA. Finally, in a factorial experiment, it is important to define the design of the experiment in detail because this determines the appropriate type of ANOVA. We will discuss some of the common variations of factorial ANOVA in future statnotes. If there is doubt about which ANOVA to use, the researcher should seek advice from a statistician with experience of research in applied microbiology.
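By way of illustration, the sketch below fits a two-factor factorial ANOVA with interaction in Python using statsmodels; the factor names, levels and simulated responses are assumptions made purely for the example. With three and two levels and four replicates per cell, the error term has a·b·(n−1) = 18 DF, which satisfies the 15-DF guideline mentioned above.

```python
# Sketch of a two-factor factorial ANOVA with interaction (statsmodels);
# factor names and simulated data are hypothetical.
import itertools
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
levels_a, levels_b, n_rep = ["A1", "A2", "A3"], ["B1", "B2"], 4
rows = [
    {"A": a, "B": b, "y": rng.normal(10 + (a == "A2") * 2 + (b == "B2") * 1.5)}
    for a, b in itertools.product(levels_a, levels_b)
    for _ in range(n_rep)
]
df = pd.DataFrame(rows)

# Main effects of A and B plus their interaction; the residual (error) term
# has a*b*(n-1) = 3*2*3 = 18 DF.
model = smf.ols("y ~ C(A) * C(B)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```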
Abstract:
In some experimental situations, the factors may not be equivalent to each other and replicates cannot be assigned at random to all treatment combinations. A common case, called a ‘split-plot design’, arises when one factor can be considered to be a major factor and the other a minor factor. Investigators need to be able to distinguish a split-plot design from a fully randomised design, as it is a common mistake for researchers to analyse a split-plot design as if it were a fully randomised factorial experiment.
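One common way to respect a split-plot structure (not necessarily the approach used in the statnote) is a linear mixed model in which the whole plot enters as a random grouping factor, so that the major factor is judged against whole-plot variation rather than the sub-plot error. The sketch below illustrates this with invented data and hypothetical factor names.

```python
# Hedged sketch: a split-plot structure handled as a linear mixed model with
# the whole plot as a random grouping factor (statsmodels MixedLM).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
rows = []
plot_id = 0
for block in range(4):
    for major in ["M1", "M2"]:            # major factor applied to whole plots
        plot_id += 1
        plot_effect = rng.normal(0, 1)    # error shared within a whole plot
        for minor in ["m1", "m2", "m3"]:  # minor factor applied to sub-plots
            rows.append({
                "wholeplot": f"P{plot_id}",
                "major": major,
                "minor": minor,
                "y": 10 + (major == "M2") * 2 + (minor == "m3") * 1
                     + plot_effect + rng.normal(0, 0.5),
            })
df = pd.DataFrame(rows)

# Random intercept for each whole plot; fixed effects for both factors and
# their interaction.
result = smf.mixedlm("y ~ C(major) * C(minor)", df, groups=df["wholeplot"]).fit()
print(result.summary())
```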
Abstract:
Experiments combining different groups or factors and which use ANOVA are a powerful method of investigation in applied microbiology. ANOVA enables not only the effect of individual factors to be estimated but also their interactions; information which cannot be obtained readily when factors are investigated separately. In addition, combining different treatments or factors in a single experiment is more efficient and often reduces the sample size required to estimate treatment effects adequately. Because of the treatment combinations used in a factorial experiment, the degrees of freedom (DF) of the error term in the ANOVA is a more important indicator of the ‘power’ of the experiment than the number of replicates. A good method is to ensure, where possible, that sufficient replication is present to achieve 15 DF for the error term of the ANOVA testing effects of particular interest. Finally, it is important to always consider the design of the experiment because this determines the appropriate ANOVA to use. Hence, it is necessary to be able to identify the different forms of ANOVA appropriate to different experimental designs and to recognise when a design is a split-plot or incorporates a repeated measure. If there is any doubt about which ANOVA to use in a specific circumstance, the researcher should seek advice from a statistician with experience of research in applied microbiology.
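The 15-DF guideline for the error term can be checked before running an experiment; the small helper below, which is illustrative rather than taken from the statnotes, computes the residual DF of a fully randomised two-factor factorial, a·b·(n−1), and the smallest number of replicates per cell that reaches a target of 15.

```python
# Illustrative helper for the 15-DF guideline in a fully randomised
# two-factor factorial design (error DF = a*b*(n-1)).
def error_df(levels_a: int, levels_b: int, n_rep: int) -> int:
    """Residual degrees of freedom for an a x b factorial with n replicates."""
    return levels_a * levels_b * (n_rep - 1)

def replicates_needed(levels_a: int, levels_b: int, target_df: int = 15) -> int:
    """Smallest number of replicates per cell giving at least target_df."""
    n = 2
    while error_df(levels_a, levels_b, n) < target_df:
        n += 1
    return n

if __name__ == "__main__":
    for a, b in [(2, 2), (3, 2), (3, 3)]:
        n = replicates_needed(a, b)
        print(f"{a}x{b} factorial: {n} replicates -> error DF = {error_df(a, b, n)}")
```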
Abstract:
Experiments combining different groups or factors and which use ANOVA are a powerful method of investigation in applied microbiology. ANOVA enables not only the effect of individual factors to be estimated but also their interactions; information which cannot be obtained readily when factors are investigated separately. In addition, combining different treatments or factors in a single experiment is more efficient and often reduces the number of replications required to estimate treatment effects adequately. Because of the treatment combinations used in a factorial experiment, the DF of the error term in the ANOVA is a more important indicator of the ‘power’ of the experiment than the number of replicates. A good method is to ensure, where possible, that sufficient replication is present to achieve 15 DF for each error term of the ANOVA. Finally, it is important to consider the design of the experiment because this determines the appropriate ANOVA to use. Some of the most common experimental designs used in the biosciences and their relevant ANOVAs are discussed by. If there is doubt about which ANOVA to use, the researcher should seek advice from a statistician with experience of research in applied microbiology.
Abstract:
In any investigation in optometry involving more than two treatment or patient groups, an investigator should be using ANOVA to analyse the results, assuming that the data conform reasonably well to the assumptions of the analysis. Ideally, specific null hypotheses should be built into the experiment from the start so that the treatment variation can be partitioned to test these effects directly. If 'post-hoc' tests are used, then an experimenter should examine the degree of protection offered by the test against the possibilities of making either a type 1 or a type 2 error. All experimenters should be aware of the complexity of ANOVA. The present article describes only one common form of the analysis, viz., that which applies to a single classification of the treatments in a randomised design. There are many different forms of the analysis, each of which is appropriate to the analysis of a specific experimental design. The uses of some of the most common forms of ANOVA in optometry have been described in a further article. If in any doubt, an investigator should consult a statistician with experience of the analysis of experiments in optometry since, once embarked upon an experiment with an unsuitable design, there may be little that a statistician can do to help.
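A hedged sketch of the single-classification (one-way) ANOVA described here, followed by a Tukey HSD post-hoc test that controls the family-wise type 1 error rate across pairwise comparisons, is given below; the group names and simulated data are assumptions for illustration only.

```python
# Sketch: one-way ANOVA followed by a Tukey HSD post-hoc test.
import numpy as np
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(2)
groups = {
    "control":     rng.normal(10.0, 1.0, 8),
    "treatment_1": rng.normal(11.5, 1.0, 8),
    "treatment_2": rng.normal(10.2, 1.0, 8),
}

# Overall F-test of the null hypothesis that all group means are equal.
f_stat, p_val = stats.f_oneway(*groups.values())
print(f"one-way ANOVA: F = {f_stat:.2f}, p = {p_val:.4f}")

# Pairwise post-hoc comparisons with family-wise error protection.
values = np.concatenate(list(groups.values()))
labels = np.repeat(list(groups.keys()), [len(v) for v in groups.values()])
print(pairwise_tukeyhsd(values, labels, alpha=0.05).summary())
```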
Abstract:
The key to the correct application of ANOVA is careful experimental design and matching the correct analysis to that design. The following points should therefore be considered before designing any experiment: 1. In a single-factor design, ensure that the factor is identified as a 'fixed' or 'random effect' factor. 2. In more complex designs with more than one factor, there may be a mixture of fixed and random effect factors present, so ensure that each factor is clearly identified. 3. Where replicates can be grouped or blocked, the advantages of a randomised blocks design should be considered. There should be evidence, however, that blocking can sufficiently reduce the error variation to counter the loss of DF compared with a randomised design. 4. Where different treatments are applied sequentially to a patient, the advantages of a three-way design in which the different orders of the treatments are included as an 'effect' should be considered. 5. Combining different factors to make a more efficient experiment and to measure possible factor interactions should always be considered. 6. The effect of 'internal replication' should be taken into account in a factorial design when deciding the number of replications to be used. Where possible, each error term of the ANOVA should have at least 15 DF. 7. Consider carefully whether a particular factorial design can be considered to be a split-plot or a repeated measures design. If such a design is appropriate, consider how to continue the analysis, bearing in mind the problem of using post hoc tests in this situation.
Abstract:
The purpose of the following studies was to explore the effect of systemic vascular and endothelial dysfunction upon the ocular circulation and the functionality of the retina. There are six principal sections to the present work. Retinal vessel activity in smokers and non-smokers: the principal findings of this work were: chronic smoking affects retinal vessel motion at baseline and during stimulation with flickering light; chronic smoking leads to a vasoconstrictory shift in retinal arteriolar reactivity to flicker; retinal arteriolar elasticity is decreased in chronic smokers. The effect of acute smoking on retinal vessel dynamics in smokers and non-smokers: the principal finding of this work was that retinal reactivity in chronic smokers is blunted when exposed to flicker light provocation immediately after smoking one cigarette. Ocular blood flow in coronary artery disease (CAD): the principal findings of this work were: retrobulbar and retinal blood flow is preserved in CAD patients, despite a change in pulse wave transmission; the arterial retinal response to flickering light provocation is significantly delayed in CAD patients; retinal venular diameters are significantly dilated in CAD patients. Autonomic nervous system function and peripheral circulation in CAD: the principal findings in this work were: CAD patients demonstrate sympathetic overdrive over a 24-hour period; a delay in peripheral vascular reactivity (nail-fold capillaries) as observed in CAD patients could be caused either by arteriosclerotic changes of the vascular walls or by systemic haemodynamic changes. Visual function in CAD: the principal findings in this work were: overall visual function in CAD patients is preserved, despite a decrease in contrast sensitivity; after applying a filtering technique selecting those with a greater coefficient of variance, which in turn represents a decrease in reliability, some patients appear to have impaired visual function as assessed using FDT visual field evaluation. Multiple functional, structural and biochemical vascular endothelial dysfunctions in patients suffering from CAD: relationships and possible implications: the principal findings of this work were: BMI significantly correlated with vWF (a marker of endothelial function) in CAD patients; retinal vascular reactivity showed a significant correlation with peripheral reactivity parameters in controls, a correlation that was lacking in the CAD group and could reflect a loss of vascular endothelial integrity; visual field parameters as assessed by frequency doubling technology were strongly related to systemic vascular elasticity (ambulatory arterial stiffness index) in controls but not in CAD patients.
Abstract:
In Statnote 9, we described a one-way analysis of variance (ANOVA) ‘random effects’ model in which the objective was to estimate the degree of variation of a particular measurement and to compare different sources of variation in space and time. The illustrative scenario involved the role of computer keyboards in a University communal computer laboratory as a possible source of microbial contamination of the hands. The study estimated the aerobic colony count of ten selected keyboards with samples taken from two keys per keyboard determined at 9am and 5pm. This type of design is often referred to as a ‘nested’ or ‘hierarchical’ design and the ANOVA estimated the degree of variation: (1) between keyboards, (2) between keys within a keyboard, and (3) between sample times within a key. An alternative to this design is a 'fixed effects' model in which the objective is not to measure sources of variation per se but to estimate differences between specific groups or treatments, which are regarded as 'fixed' or discrete effects. This statnote describes two scenarios utilizing this type of analysis: (1) measuring the degree of bacterial contamination on 2p coins collected from three types of business property, viz., a butcher’s shop, a sandwich shop, and a newsagent and (2) the effectiveness of drugs in the treatment of a fungal eye infection.
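For scenario (1), a 'fixed effects' one-way ANOVA could be set up as in the sketch below; the colony counts are invented, and the log transformation is an assumed variance-stabilising step rather than a detail taken from the statnote.

```python
# Sketch of a 'fixed effects' one-way ANOVA comparing bacterial counts on
# coins from three types of premises; counts are invented for illustration.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

data = pd.DataFrame({
    "premises": ["butcher"] * 5 + ["sandwich_shop"] * 5 + ["newsagent"] * 5,
    "count": [3200, 2800, 4100, 3600, 2900,
              1500, 1700, 1300, 1600, 1400,
              600, 750, 500, 820, 640],
})
data["log_count"] = np.log10(data["count"])  # assumed variance-stabilising step

# Premises type is a fixed, discrete effect; the F-test compares the
# between-premises mean square with the within-premises (error) mean square.
model = smf.ols("log_count ~ C(premises)", data=data).fit()
print(sm.stats.anova_lm(model, typ=1))
```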
Abstract:
We measure the radial profile of the photoelastic coefficient C(r) in single-mode polymer optical fibers (POFs), and we determine the evolution of C(r) after annealing the fibers at temperatures from 40°C to 80°C. We demonstrate that C(r) in the fibers drawn from a preform without specific thermal pre-treatment changes and converges to values between 1.2 and 1.6 × 10⁻¹² Pa⁻¹ following annealing at 80°C. The annealed fibers display a smoothed radial profile of C(r) and a lowered residual birefringence. In contrast, the mean value of C(r) of the fiber drawn from a preform that has been pre-annealed remains constant after our annealing process and is significantly higher, i.e., 4 × 10⁻¹² Pa⁻¹. The annealing process also decreases the residual birefringence, though to a lesser extent. These measurements indicate the impact of annealing on the thermal stability of the photoelastic coefficient of POFs, which is an essential characteristic in view of developing POF-based thermomechanical sensors.
Abstract:
A human genome contains more than 20 000 protein-encoding genes. A human proteome, in contrast, has been estimated to be much more complex and dynamic. The most powerful tool for studying proteins today is mass spectrometry (MS). MS-based proteomics is based on the measurement of the masses of charged peptide ions in the gas phase. The peptide amino acid sequence can be deduced, and matching proteins found, using software to correlate MS data with sequence database information. Quantitative proteomics allows the estimation of the absolute or relative abundance of a certain protein in a sample. Label-free quantification methods use the intrinsic MS peptide signals in the calculation of the quantitative values, enabling the comparison of peptide signals from numerous patient samples. In this work, a quantitative MS methodology was established to study aromatase-overexpressing (AROM+) male mouse liver and ovarian endometriosis tissue samples. The workflow of label-free quantitative proteomics was optimized in terms of sensitivity and robustness, allowing the quantification of 1500 proteins with a low coefficient of variance in both sample types. Additionally, five statistical methods were evaluated for use with label-free quantitative proteomics data. The proteome data were integrated with other omics data sets, such as mRNA microarray and metabolite data sets. As a result, an altered lipid metabolism in the liver was discovered in male AROM+ mice. The results suggest reduced beta oxidation of long-chain phospholipids in the liver and increased levels of pro-inflammatory fatty acids in the circulation in these mice. Conversely, in the endometriosis tissues, a set of proteins highly specific for ovarian endometrioma was discovered, many of which were under the regulation of the growth factor TGF-β1. This finding supports subsequent biomarker verification in a larger number of endometriosis patient samples.
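The coefficient of variance referred to here is conventionally computed as the standard deviation divided by the mean across replicate runs; the sketch below shows this for a few hypothetical protein intensities, with an assumed 20% filter threshold that is not taken from the thesis.

```python
# Sketch: per-protein coefficient of variance (CV = std / mean, reported as
# a percentage) across replicate label-free MS runs; data are invented.
import pandas as pd

# Rows: proteins; columns: replicate MS runs (arbitrary intensity units).
intensities = pd.DataFrame(
    {
        "run1": [2.1e6, 8.4e5, 5.3e7, 1.2e6],
        "run2": [2.3e6, 7.9e5, 5.1e7, 1.6e6],
        "run3": [2.0e6, 8.8e5, 5.6e7, 0.9e6],
    },
    index=["P12345", "Q67890", "P11111", "O22222"],
)

cv_percent = intensities.std(axis=1, ddof=1) / intensities.mean(axis=1) * 100
print(cv_percent.round(1))

# A common quality filter keeps proteins quantified with a low CV,
# e.g. below 20% (threshold assumed here).
print(cv_percent[cv_percent < 20].index.tolist())
```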
Abstract:
We carry out quasi-classical trajectory calculations for the C + CH⁺ → C₂⁺ + H reaction on an ad hoc computed high-level ab initio potential energy surface. Thermal rate coefficients at the temperatures of relevance in cold interstellar clouds are derived and compared with the assumed, temperature-independent estimates publicly available in the kinetic databases KIDA and UDfA. For a temperature of 10 K the database value overestimates the one obtained by us by a factor of two (thus improperly enhancing the destruction route of CH⁺ in astrochemical kinetic models); our computed rate coefficient roughly doubles over the temperature range 5–300 K, with a sharp increase in the first 50 K. The computed values are fitted via the popular Arrhenius–Kooij formula, and best-fitting parameters α = 1.32 × 10⁻⁹ cm³ s⁻¹, β = 0.10 and γ = 2.19 K to be included in the above-mentioned online databases are provided. Further investigation shows that the temperature dependence of the thermal rate coefficient better conforms to the recently proposed so-called ‘deformed Arrhenius’ law by Aquilanti and Mundim.
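For reference, the Arrhenius–Kooij (modified Arrhenius) form used by KIDA and UDfA is k(T) = α (T/300 K)^β exp(−γ/T); the sketch below evaluates it with the best-fitting parameters quoted above.

```python
# Sketch evaluating the Arrhenius-Kooij (modified Arrhenius) form,
# k(T) = alpha * (T / 300 K)**beta * exp(-gamma / T),
# with the best-fitting parameters quoted in the abstract.
import math

ALPHA = 1.32e-9   # cm^3 s^-1
BETA = 0.10
GAMMA = 2.19      # K

def rate_coefficient(temperature_k: float) -> float:
    """Thermal rate coefficient k(T) in cm^3 s^-1 for C + CH+ -> C2+ + H."""
    return ALPHA * (temperature_k / 300.0) ** BETA * math.exp(-GAMMA / temperature_k)

for T in (5, 10, 50, 100, 300):
    print(f"T = {T:3d} K  ->  k = {rate_coefficient(T):.2e} cm^3 s^-1")
```

With these parameters the fit rises from roughly 5.7 × 10⁻¹⁰ cm³ s⁻¹ at 5 K to about 1.3 × 10⁻⁹ cm³ s⁻¹ at 300 K, with most of the increase below 50 K, consistent with the behaviour described in the abstract.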