31 results for n-way analysis
in Aston University Research Archive
Abstract:
The two-way design has been variously described as a matched-sample F-test, a simple within-subjects ANOVA, a one-way within-groups ANOVA, a simple correlated-groups ANOVA, and a one-factor repeated measures design! This confusion of terminology is likely to lead to problems in correctly identifying this analysis within commercially available software. The essential feature of the design is that each treatment is allocated by randomization to one experimental unit within each group or block. The block may be a plot of land, a single occasion on which the experiment was performed, or a human subject. The ‘blocking’ is designed to remove an aspect of the error variation and so increase the ‘power’ of the experiment. If there is no significant source of variation associated with the ‘blocking’, then the two-way design is at a disadvantage: the degrees of freedom (DF) of the error term are reduced compared with a fully randomised design, thus reducing the ‘power’ of the analysis.
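To make the DF penalty concrete, here is a short worked comparison for a design with t treatments and b blocks, one experimental unit per treatment within each block (a generic illustration, not taken from the statnote itself):

```latex
% Error degrees of freedom with t treatments and b blocks (one unit per cell)
\begin{align*}
\text{Completely randomised design:}\quad & \mathrm{DF}_{\text{error}} = tb - t = t(b-1)\\
\text{Randomised blocks design:}\quad     & \mathrm{DF}_{\text{error}} = (t-1)(b-1)\\
\text{DF lost to blocking:}\quad          & t(b-1) - (t-1)(b-1) = b-1
\end{align*}
% Example: t = 4 treatments and b = 6 blocks give 20 vs. 15 error DF, so
% blocking is only worthwhile if the block sum of squares removes enough
% error variation to offset the loss of b - 1 = 5 degrees of freedom.
```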
Abstract:
There is an alternative model of the one-way ANOVA, called the 'random effects' model or ‘nested’ design, in which the objective is not to test specific effects but to estimate the degree of variation of a particular measurement and to compare different sources of variation that influence the measurement in space and/or time. The most important statistics from a random effects model are the components of variance, which estimate the variance associated with each of the sources of variation influencing a measurement. The nested design is particularly useful in preliminary experiments designed to estimate different sources of variation and in the planning of appropriate sampling strategies.
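As an illustration of how components of variance are obtained, the standard balanced one-way random effects layout (a groups sampled at random, n replicates per group; generic textbook notation rather than the statnote's own) gives:

```latex
% Expected mean squares for a balanced one-way random effects ANOVA
% (a groups, n replicates per group)
\begin{align*}
E(\mathrm{MS}_{\text{between}}) &= \sigma^2_e + n\,\sigma^2_A\\
E(\mathrm{MS}_{\text{within}})  &= \sigma^2_e
\end{align*}
% Equating observed and expected mean squares gives the component estimates
\begin{align*}
\hat{\sigma}^2_e &= \mathrm{MS}_{\text{within}}\\
\hat{\sigma}^2_A &= \frac{\mathrm{MS}_{\text{between}} - \mathrm{MS}_{\text{within}}}{n}
\end{align*}
```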
Abstract:
In Statnote 9, we described a one-way analysis of variance (ANOVA) ‘random effects’ model in which the objective was to estimate the degree of variation of a particular measurement and to compare different sources of variation in space and time. The illustrative scenario involved the role of computer keyboards in a University communal computer laboratory as a possible source of microbial contamination of the hands. The study estimated the aerobic colony count of ten selected keyboards with samples taken from two keys per keyboard determined at 9am and 5pm. This type of design is often referred to as a ‘nested’ or ‘hierarchical’ design and the ANOVA estimated the degree of variation: (1) between keyboards, (2) between keys within a keyboard, and (3) between sample times within a key. An alternative to this design is a 'fixed effects' model in which the objective is not to measure sources of variation per se but to estimate differences between specific groups or treatments, which are regarded as 'fixed' or discrete effects. This statnote describes two scenarios utilizing this type of analysis: (1) measuring the degree of bacterial contamination on 2p coins collected from three types of business property, viz., a butcher’s shop, a sandwich shop, and a newsagent and (2) the effectiveness of drugs in the treatment of a fungal eye infection.
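A minimal sketch of the fixed effects analysis for the first scenario, with made-up colony counts and scipy's one-way ANOVA as one convenient implementation (the values and group sizes below are hypothetical, not the statnote's data):

```python
# Minimal sketch of a one-way fixed effects ANOVA for the coin-contamination
# scenario: colony counts from 2p coins collected at three types of premises.
# The counts below are made-up illustrative values, not data from the statnote.
from scipy import stats

butcher   = [120, 95, 140, 110, 130]   # aerobic colony counts (hypothetical)
sandwich  = [80, 60, 75, 90, 70]
newsagent = [40, 55, 35, 50, 45]

# H0: the three types of premises share a common mean contamination level.
f_stat, p_value = stats.f_oneway(butcher, sandwich, newsagent)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```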
A comparison of U.S. and Japanese management systems and their transferability to Singapore industry
Abstract:
This research compares U.S. and Japanese management systems and evaluates their transferability to the Singaporean manufacturing industry. The objectives were: a) to determine the effectiveness of U.S. and Japanese management systems when applied to Singapore; b) to determine the extent of transferability of U.S. and Japanese management systems to Singapore; and c) to survey general problems encountered in the application of U.S. and Japanese management systems to Singapore industry. The study, using a questionnaire survey and interviews, covered a total of eighty companies from four groups of firms in four industrial sectors, comprising U.S. and Japanese subsidiaries based in Singapore and their respective parent companies. Data from the questionnaires and interviews were used to investigate environmental conditions, management philosophy, management functions/practices, management effectiveness, and firm productivity. Two-way analysis of variance was used to analyse the questionnaire data. The analysis of the perceptual data from the questionnaire survey and interviews suggested that both U.S. and Japanese parent companies performed better in almost all the management variables studied when compared to their subsidiaries in Singapore. U.S. subsidiaries had less difficulty in adjusting to Singapore's environmental conditions and obtained better results than the Japanese subsidiaries in management functions/practices and management philosophy. In addition, the firm productivity (in terms of labour and capital productivity) of U.S. subsidiaries in Singapore was found to be higher than that of the Japanese subsidiaries. The Japanese parent companies returned the highest scores among the four groups of firms in all four industrial sectors for all four management variables (i.e. environmental conditions, management philosophy, management functions/practices, and management effectiveness) surveyed using questionnaires. In contrast, the average score for Japanese subsidiaries in Singapore was generally the lowest among the four groups of firms. Thus the results of this study suggest that the transfer of the U.S. management system to Singapore industry has been more successful than that of the Japanese management system. The problems encountered in the application of U.S. and Japanese management in Singapore were identified and discussed. General recommendations for the Singaporean manufacturing industry were then made based on the findings of the questionnaire survey and interview analysis.
Abstract:
Objectives - The absence of pathophysiologically relevant diagnostic markers of bipolar disorder (BD) leads to its frequent misdiagnosis as unipolar depression (UD). We aimed to determine whether whole brain white matter connectivity differentiated BD from UD depression. Methods - We employed a three-way analysis of covariance, covarying for age, to examine whole brain fractional anisotropy (FA), and corresponding longitudinal and radial diffusivity, in currently depressed adults: 15 with BD type I (mean age 36.3 years, SD 12.0 years), 16 with recurrent UD (mean age 32.3 years, SD 10.0 years), and 24 healthy control adults (HC) (mean age 29.5 years, SD 9.43 years). Depressed groups did not differ in depression severity, age of illness onset, or illness duration. Results - There was a main effect of group in left superior and inferior longitudinal fasciculi (SLF and ILF) (all F ≥ 9.8; p ≤ .05, corrected). Whole brain post hoc analyses (all t ≥ 4.2; p ≤ .05, corrected) revealed decreased FA in left SLF in BD adults, versus UD adults, in inferior temporal cortex and, versus HC, in primary sensory cortex (associated with increased radial and decreased longitudinal diffusivity, respectively); and decreased FA in left ILF in UD adults versus HC. A main effect of group in right uncinate fasciculus (in orbitofrontal cortex) just failed to meet significance in all participants but was present in women. Post hoc analyses revealed decreased right uncinate fasciculus FA in BD versus HC, both in all participants and in women. Conclusions - White matter FA in left occipitotemporal and primary sensory regions supporting visuospatial and sensory processing differentiates BD from UD depression. Abnormally reduced FA in right fronto-temporal regions supporting mood regulation might underlie predisposition to depression in BD. These measures might help differentiate the pathophysiologic processes of BD versus UD depression.
Abstract:
The judicial interest in ‘scientific’ evidence has driven recent work to quantify results for forensic linguistic authorship analysis. Through a methodological discussion and a worked example, this paper examines the issues which complicate attempts to quantify results in such work. The solution suggested to some of the difficulties is a sampling and testing strategy which helps to identify potentially useful, valid and reliable markers of authorship. An important feature of the sampling strategy is that markers identified as being generally valid and reliable are retested for use in specific authorship analysis cases. The suggested approach for drawing quantified conclusions combines discriminant function analysis and Bayesian likelihood measures. The worked example starts with twenty comparison texts for each of three potential authors and then uses a progressively smaller comparison corpus, reducing to fifteen, ten, five and finally three texts per author. This worked example demonstrates how reducing the amount of data affects the way conclusions can be drawn. With greater numbers of reference texts, quantified and safe attributions are shown to be possible, but as the number of reference texts is reduced, the analysis shows that the conclusion which should be reached is that no attribution can be made. At no point does the testing process result in a misattribution.
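The following is a minimal sketch of the general idea of combining discriminant analysis with a likelihood-ratio style summary; it is not the paper's actual procedure or feature set, and the stylometric features, author labels and likelihood-ratio summary are all hypothetical:

```python
# Minimal sketch (not the paper's exact procedure): discriminant analysis over
# stylometric features, with the resulting class probabilities expressed as a
# likelihood ratio for a questioned text. All feature values are made up.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
# 20 comparison texts per author, 5 stylometric markers per text (hypothetical).
X_train = np.vstack([rng.normal(loc=m, scale=1.0, size=(20, 5)) for m in (0.0, 0.7, 1.4)])
y_train = np.repeat(["author_A", "author_B", "author_C"], 20)

lda = LinearDiscriminantAnalysis().fit(X_train, y_train)

questioned = rng.normal(loc=0.7, scale=1.0, size=(1, 5))  # questioned document
posterior = lda.predict_proba(questioned)[0]

# Likelihood-ratio style summary: support for author_B against the best
# competing candidate (equal priors assumed, as in LDA's default).
idx_b = list(lda.classes_).index("author_B")
lr = posterior[idx_b] / max(p for i, p in enumerate(posterior) if i != idx_b)
print(dict(zip(lda.classes_, posterior.round(3))), f"LR(author_B vs best rival) = {lr:.2f}")
```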
Abstract:
It has been argued that a single two-dimensional visualization plot may not be sufficient to capture all of the interesting aspects of complex data sets, and therefore a hierarchical visualization system is desirable. In this paper we extend an existing locally linear hierarchical visualization system, PhiVis [Bishop98a], in several directions: (1) We allow for non-linear projection manifolds. The basic building block is the Generative Topographic Mapping (GTM). (2) We introduce a general formulation of hierarchical probabilistic models consisting of local probabilistic models organized in a hierarchical tree. General training equations are derived, regardless of the position of the model in the tree. (3) Using tools from differential geometry we derive expressions for local directional curvatures of the projection manifold. Like PhiVis, our system is statistically principled and is built interactively in a top-down fashion using the EM algorithm. It enables the user to interactively highlight those data in the ancestor visualization plots which are captured by a child model. We also incorporate into our system a hierarchical, locally selective representation of magnification factors and directional curvatures of the projection manifolds. Such information is important for further refinement of the hierarchical visualization plot, as well as for controlling the amount of regularization imposed on the local models. We demonstrate the principle of the approach on a toy data set and apply our system to two more complex 12- and 18-dimensional data sets.
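For reference, the density model underlying each local GTM building block, in its standard published form (generic notation, not necessarily that of this paper):

```latex
% Standard GTM density model used as the local building block: latent grid
% points x_k are mapped through basis functions phi into data space, giving a
% constrained Gaussian mixture with shared inverse variance beta.
\begin{equation*}
p(\mathbf{t}\mid \mathbf{W},\beta)
  = \frac{1}{K}\sum_{k=1}^{K}
    \left(\frac{\beta}{2\pi}\right)^{D/2}
    \exp\!\left(-\frac{\beta}{2}\,
      \bigl\lVert \mathbf{W}\boldsymbol{\phi}(\mathbf{x}_k)-\mathbf{t}\bigr\rVert^{2}\right)
\end{equation*}
% In the hierarchical extension, each node of the tree carries such a local
% model and the parent combines its children through mixing weights.
```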
Abstract:
Purpose. To use anterior segment optical coherence tomography (AS-OCT) to analyze ciliary muscle morphology and changes with accommodation and axial ametropia. Methods. Fifty prepresbyopic volunteers, aged 19 to 34 years, were recruited. High-resolution images were acquired of nasal and temporal ciliary muscles in the relaxed state and at stimulus vergence levels of -4 and -8 D. Objective accommodative responses and axial lengths were also recorded. Two-way, mixed-factor analyses of variance (ANOVAs) were used to assess the changes in ciliary muscle parameters with accommodation and determine whether these changes are dependent on the nasal–temporal aspect or axial length, whereas linear regression analysis was used to analyze the relationship between axial length and ciliary muscle length. Results. The ciliary muscle was longer (r = 0.34, P = 0.02), but not significantly thicker (F = 2.84, P = 0.06), in eyes with greater axial length. With accommodation, the ciliary muscle showed a contractile shortening (F = 42.9, P < 0.001), particularly anteriorly (F = 177.2, P < 0.001), and a thickening of the anterior portion (F = 46.2, P < 0.001). The ciliary muscle was thicker (F = 17.8, P < 0.001) and showed a greater contractile response on the temporal side. Conclusions. The accommodative changes observed support an anterior, as well as centripetal, contractile shift of ciliary muscle mass.
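A minimal sketch of a two-way mixed-factor ANOVA of this general shape, with a within-subject accommodation factor and a hypothetical between-subject grouping; the data are made up and pingouin is used purely as one convenient implementation:

```python
# Minimal sketch of a two-way mixed-factor ANOVA: a within-subject factor
# (accommodative stimulus level) crossed with a hypothetical between-subject
# grouping. All values are made up; pingouin is one convenient implementation.
import numpy as np
import pandas as pd
import pingouin as pg

rng = np.random.default_rng(1)
rows = []
for i in range(20):
    group = "long_axial" if i < 10 else "short_axial"   # between-subject factor
    for stim in ("0D", "4D", "8D"):                     # within-subject factor
        thickness = 0.7 + 0.05 * "048".index(stim[0]) + rng.normal(0, 0.02)
        rows.append({"subject": f"s{i}", "group": group, "stimulus": stim,
                     "ciliary_thickness_mm": thickness})
df = pd.DataFrame(rows)

aov = pg.mixed_anova(data=df, dv="ciliary_thickness_mm",
                     within="stimulus", subject="subject", between="group")
print(aov)
```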
Abstract:
Researchers often use 3-way interactions in moderated multiple regression analysis to test the joint effect of 3 independent variables on a dependent variable. However, further probing of significant interaction terms varies considerably and is sometimes error prone. The authors developed a significance test for slope differences in 3-way interactions and illustrate its importance for testing psychological hypotheses. Monte Carlo simulations revealed that sample size, magnitude of the slope difference, and data reliability affected test power. Application of the test to published data yielded detection of some slope differences that were undetected by alternative probing techniques and led to changes of results and conclusions. The authors conclude by discussing the test's applicability for psychological research. Copyright 2006 by the American Psychological Association.
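For context, the standard moderated regression model with a 3-way interaction and the simple slope of the focal predictor, in generic textbook form rather than the authors' own notation:

```latex
% Moderated multiple regression with a three-way interaction term
\begin{equation*}
\hat{Y} = b_0 + b_1 X + b_2 Z + b_3 W + b_4 XZ + b_5 XW + b_6 ZW + b_7 XZW
\end{equation*}
% Simple slope of Y on X at chosen moderator values Z and W
% (e.g., one SD above and below their means):
\begin{equation*}
\frac{\partial \hat{Y}}{\partial X} = b_1 + b_4 Z + b_5 W + b_7 ZW
\end{equation*}
% A slope-difference test compares two such simple slopes, for instance at
% (Z_high, W_high) versus (Z_high, W_low), against a null of zero difference.
```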
Abstract:
Recent discussion of the knowledge-based economy draws increasing attention to the role that the creation and management of knowledge plays in economic development. Development of human capital, the principal mechanism for knowledge creation and management, becomes a central issue for policy-makers and practitioners at the regional, as well as national, level. Facing competition both within and across nations, regional policy-makers view human capital development as a key to strengthening the positions of their economies in the global market. Against this background, the aim of this study is to go some way towards answering the question of whether, and how, investment in education and vocational training at regional level provides these territorial units with comparative advantages. The study reviews literature in economics and economic geography on economic growth (Chapter 2). In growth model literature, human capital has gained increased recognition as a key production factor along with physical capital and labour. Although leaving technical progress as an exogenous factor, neoclassical Solow-Swan models have improved their estimates through the inclusion of human capital. In contrast, endogenous growth models place investment in research at centre stage in accounting for technical progress. As a result, they often focus upon research workers, who embody high-order human capital, as a key variable in their framework. An issue of discussion is how human capital facilitates economic growth: is it the level of its stock or its accumulation that influences the rate of growth? In addition, these economic models are criticised in economic geography literature for their failure to consider spatial aspects of economic development, and particularly for their lack of attention to tacit knowledge and urban environments that facilitate the exchange of such knowledge. Our empirical analysis of European regions (Chapter 3) shows that investment by individuals in human capital formation has distinct patterns. Those regions with a higher level of investment in tertiary education tend to have a larger concentration of information and communication technology (ICT) sectors (including provision of ICT services and manufacture of ICT devices and equipment) and research functions. Not surprisingly, regions with major metropolitan areas where higher education institutions are located show a high enrolment rate for tertiary education, suggesting a possible link to the demand from high-order corporate functions located there. Furthermore, the rate of human capital development (at the level of vocational type of upper secondary education) appears to have significant association with the level of entrepreneurship in emerging industries such as ICT-related services and ICT manufacturing, whereas such association is not found with traditional manufacturing industries. In general, a high level of investment by individuals in tertiary education is found in those regions that accommodate high-tech industries and high-order corporate functions such as research and development (R&D). These functions are supported through the urban infrastructure and public science base, facilitating exchange of tacit knowledge. They also enjoy a low unemployment rate. However, the existing stock of human and physical capital in those regions with a high level of urban infrastructure does not lead to a high rate of economic growth.
Our empirical analysis demonstrates that the rate of economic growth is determined by the accumulation of human and physical capital, not by level of their existing stocks. We found no significant effects of scale that would favour those regions with a larger stock of human capital. The primary policy implication of our study is that, in order to facilitate economic growth, education and training need to supply human capital at a faster pace than simply replenishing it as it disappears from the labour market. Given the significant impact of high-order human capital (such as business R&D staff in our case study) as well as the increasingly fast pace of technological change that makes human capital obsolete, a concerted effort needs to be made to facilitate its continuous development.
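As background to the growth-model discussion above, the standard human-capital-augmented neoclassical production function (the Mankiw-Romer-Weil form, shown only as a generic illustration, not the study's estimated specification):

```latex
% Human-capital-augmented Solow-Swan production function
% (Mankiw-Romer-Weil form): K physical capital, H human capital,
% A labour-augmenting technology, L labour.
\begin{equation*}
Y(t) = K(t)^{\alpha}\, H(t)^{\beta}\, \bigl(A(t)\,L(t)\bigr)^{1-\alpha-\beta},
\qquad \alpha + \beta < 1
\end{equation*}
% In this framework it is the accumulation rates of K and H (their investment
% shares), rather than their existing stocks alone, that determine steady-state
% income per effective worker -- the distinction the empirical analysis draws.
```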
Abstract:
The advent of Internet banking and phone banking is changing the role of bank branches from a predominantly transaction-based one to a sales-oriented role. This paper reports on an assessment of the branches of a Portuguese bank in terms of their performance in their new roles in three different areas: Their efficiency in fostering the use of new transaction channels, their efficiency in increasing sales and their customer base, and their efficiency in generating profits. Service quality is also a major issue in service organisations like bank branches, and therefore we analyse the way this dimension of performance has been accounted for in the literature and take it into account in our empirical application. We have used data envelopment analysis (DEA) for the different performance assessments, but we depart from traditional DEA models in some cases. Performance comparisons on each dimension allowed us to identify benchmark bank branches and also problematic bank branches. In addition, we found positive links between operational and profit efficiency and also between transactional and operational efficiency. Service quality is positively related with operational and profit efficiency. © 2006 Elsevier B.V. All rights reserved.
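For reference, the basic input-oriented CCR envelopment model that standard DEA assessments start from (generic form; as noted above, the paper itself departs from traditional DEA models in places):

```latex
% Input-oriented CCR envelopment model for branch o, with inputs x_{ij} and
% outputs y_{rj} over branches j = 1..n and intensity weights lambda_j.
\begin{align*}
\min_{\theta,\ \lambda}\quad & \theta\\
\text{s.t.}\quad & \sum_{j=1}^{n} \lambda_j x_{ij} \le \theta\, x_{io},
  && i = 1,\dots,m\\
& \sum_{j=1}^{n} \lambda_j y_{rj} \ge y_{ro},
  && r = 1,\dots,s\\
& \lambda_j \ge 0, && j = 1,\dots,n
\end{align*}
% theta* = 1 identifies benchmark (efficient) branches; theta* < 1 flags
% branches with scope for proportional input reduction.
```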
Abstract:
Analysis of variance (ANOVA) is the most efficient method available for the analysis of experimental data. Analysis of variance is a method of considerable complexity and subtlety, with many different variations, each of which applies in a particular experimental context. Hence, it is possible to apply the wrong type of ANOVA to data and, therefore, to draw an erroneous conclusion from an experiment. This article reviews the types of ANOVA most likely to arise in clinical experiments in optometry including the one-way ANOVA ('fixed' and 'random effect' models), two-way ANOVA in randomised blocks, three-way ANOVA, and factorial experimental designs (including the varieties known as 'split-plot' and 'repeated measures'). For each ANOVA, the appropriate experimental design is described, a statistical model is formulated, and the advantages and limitations of each type of design discussed. In addition, the problems of non-conformity to the statistical model and determination of the number of replications are considered. © 2002 The College of Optometrists.
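A minimal sketch of one of the designs reviewed, a two-way ANOVA in randomised blocks, using hypothetical data and statsmodels as one convenient implementation:

```python
# Minimal sketch of a two-way ANOVA in randomised blocks (each treatment
# applied once per block). Data are hypothetical; statsmodels is used as one
# convenient implementation.
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

data = pd.DataFrame({
    "block":     ["b1"] * 3 + ["b2"] * 3 + ["b3"] * 3 + ["b4"] * 3,
    "treatment": ["control", "drug_A", "drug_B"] * 4,
    "response":  [10.1, 12.3, 11.0,
                  9.8, 12.9, 11.4,
                  10.5, 13.1, 11.8,
                  9.9, 12.0, 10.9],
})

# Additive model: treatment and block main effects, no interaction term,
# since there is only one observation per treatment-block combination.
model = smf.ols("response ~ C(treatment) + C(block)", data=data).fit()
print(anova_lm(model, typ=2))
```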
Abstract:
The aim of this paper is to provide managers and Human Resource executives with the basis for making drug testing policy in their organisations by presenting a critical review of existing literature on Workplace Drug Testing (WDT) and related areas, structured into key areas. The key finding is that, whilst WDT is becoming more and more widely used, the rationale for this in terms of organisational effectiveness and safety is far from clear. Also there are significant ethical issues associated with WDT which are not always fully considered by organisations. Similarly, a cost/benefit analysis for particular organisations may well show little reason to embark on a testing policy. As a result of our review, we recommend that practitioners take a critical view of proposals introducing WDT, since in many cases there is little upside to such a policy and a largely under-researched downside. There are also wider implications for society as a whole, since the issue of drug taking is clearly a matter of great importance to practically every country in the world. The workplace is not at all immune from the impact of drug taking, and perhaps a knee-jerk response by managers is to attempt to exclude anyone with any sort of drug habit through the use of WDT. This type of review with a specific HR focus has not been carried out before, despite several calls for a more rational approach to the area.
Abstract:
Studies of the determinants and effects of innovation commonly make an assumption about the way in which firms make the decision to innovate, but rarely test this assumption. Using a panel of Irish manufacturing firms we test the performance of two alternative models of the innovation decision, and find that a two-stage model (the firm decides whether to innovate, then whether to perform product only, process only or both) outperforms a one-stage, simultaneous model. We also find that external knowledge sourcing affects the innovation decision and the type of innovation undertaken in a way not previously recognised in the literature. © 2007 Elsevier Ltd. All rights reserved.
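A minimal sketch of what a two-stage estimation of the innovation decision could look like (a probit for whether the firm innovates, then a multinomial logit for the type of innovation among innovators); this is an illustrative guess at the structure, not the paper's specification, and all variable names and data are hypothetical:

```python
# Minimal sketch of a two-stage innovation decision (not the paper's exact
# specification): stage 1 models whether a firm innovates at all; stage 2,
# estimated on innovators only, models the type of innovation
# (product only / process only / both). All data are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 500
firms = pd.DataFrame({
    "log_size": rng.normal(4.0, 1.0, n),     # log employment (hypothetical)
    "ext_know": rng.integers(0, 2, n),       # external knowledge sourcing (0/1)
    "exporter": rng.integers(0, 2, n),
})
# Hypothetical data-generating process, for illustration only.
latent = -2.0 + 0.4 * firms.log_size + 0.8 * firms.ext_know + rng.normal(0, 1, n)
firms["innovates"] = (latent > 0).astype(int)
# Innovation type among innovators: 0 = product only, 1 = process only, 2 = both
firms["inno_type"] = rng.integers(0, 3, n)

# Stage 1: probit for the decision to innovate at all.
stage1 = smf.probit("innovates ~ log_size + ext_know + exporter", data=firms).fit(disp=False)

# Stage 2: multinomial logit for the type of innovation, innovators only.
innovators = firms[firms.innovates == 1]
stage2 = smf.mnlogit("inno_type ~ log_size + ext_know + exporter", data=innovators).fit(disp=False)

print(stage1.summary().tables[1])
print(stage2.summary())
```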
Abstract:
A new surface analysis technique has been developed which has a number of benefits compared to conventional Low Energy Ion Scattering Spectrometry (LEISS). A major potential advantage, arising from the absence of charge exchange complications, is the possibility of quantification. The instrumentation that has been developed also offers the possibility of unique studies concerning the interaction between low energy ions and atoms and solid surfaces. From these studies it may also be possible, in principle, to generate sensitivity factors to quantify LEISS data. The instrumentation, which is referred to as a Time-of-Flight Fast Atom Scattering Spectrometer, has been developed to investigate these conjectures in practice. The development involved a number of modifications to an existing instrument, allowed samples to be bombarded with a monoenergetic pulsed beam of either atoms or ions, and provided the capability to analyse the spectra of scattered atoms and ions separately. Further to this, a system was designed and constructed to allow the incident, exit and azimuthal angles of the particle beam to be varied independently. The key development was that of a pulsed and mass-filtered atom source, which was developed by a cyclic process of design, modelling and experimentation. Although it was possible to demonstrate the unique capabilities of the instrument, problems relating to surface contamination prevented the measurement of the neutralisation probabilities. However, these problems appear to be technical rather than scientific in nature, and could be readily resolved given the appropriate resources. Experimental spectra obtained from a number of samples demonstrate some fundamental differences between the scattered ion and neutral spectra. For practical non-ordered surfaces, the ToF spectra are more complex than their LEISS counterparts. This is particularly true for helium scattering, where it appears, in the absence of detailed computer simulation, that quantitative analysis is limited to ordered surfaces. Despite this limitation, the ToFFASS instrument opens the way for quantitative analysis of the 'true' surface region for a wider range of surface materials.