26 results for Subject analysis
in Aston University Research Archive
Abstract:
Cost functions are estimated, using random effects and stochastic frontier methods, for English higher education institutions. The article advances the existing literature by employing finer disaggregation by subject, institution type and location, and by introducing consideration of quality effects. Estimates are provided of the average incremental costs attached to each output type, and of returns to scale and scope. Implications for the policy of expanding higher education are discussed.
Abstract:
Derivational morphology proposes meaningful connections between words and is largely unrepresented in lexical databases. This thesis presents a project to enrich a lexical database with morphological links and to evaluate their contribution to disambiguation. A lexical database with sense distinctions was required; WordNet was chosen because of its free availability and widespread use. Its suitability was assessed through critical evaluation with respect to specifications and criticisms, using a transparent, extensible model. The identification of serious shortcomings suggested a portable enrichment methodology, applicable to alternative resources. Although 40% of the most frequent words are prepositions, they have been largely ignored by computational linguists, so the addition of prepositions was also required. The preferred approach to morphological enrichment was to infer relations from phenomena discovered algorithmically. Both existing databases and existing algorithms can capture regular morphological relations but cannot capture exceptions correctly; neither provides any semantic information. Some morphological analysis algorithms are subject to the fallacy that morphological analysis can be performed simply by segmentation. Morphological rules, grounded in observation and etymology, govern associations between, and attachment of, suffixes, and contribute to defining the meaning of morphological relationships. Specifying character substitutions circumvents the segmentation fallacy. Morphological rules are prone to undergeneration, minimised through a variable lexical validity requirement, and to overgeneration, minimised by rule reformulation and by restricting monosyllabic output. The rules take into account the morphology of ancestor languages through co-occurrences of morphological patterns. Where multiple rules are applicable to an input suffix, their precedence must be established.
The resistance of prefixations to segmentation has been addressed by identifying linking vowel exceptions and irregular prefixes. The automatic affix discovery algorithm applies heuristics to identify meaningful affixes and is combined with morphological rules into a hybrid model, fed only with empirical data, collected without supervision. Further algorithms apply the rules optimally to automatically pre-identified suffixes and break words into their component morphemes. To handle exceptions, stoplists were created in response to initial errors and fed back into the model through iterative development, leading to 100% precision, contestable only on lexicographic criteria. Stoplist length is minimised by special treatment of monosyllables and reformulation of rules. 96% of words and phrases are analysed. 218,802 directed derivational links have been encoded in the lexicon rather than the wordnet component of the model because the lexicon provides the optimal clustering of word senses. Both links and analyser are portable to an alternative lexicon. The evaluation uses the extended gloss overlaps disambiguation algorithm. The enriched model outperformed WordNet in terms of recall without loss of precision. Failure of all experiments to outperform disambiguation by frequency reflects on WordNet sense distinctions.
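The character-substitution approach described above, with its lexical validity requirement and rule precedence, can be sketched as follows. The rules, their ordering, and the mini-lexicon here are invented for illustration and are far smaller than the thesis's actual resources:

```python
# Sketch of suffix-substitution morphological rules with a lexical
# validity check. A rule maps a derived-form suffix to a base-form
# suffix via character substitution, avoiding naive segmentation
# ("happiness" -> "happy", not "happi" + "ness"). All rule data and
# the toy lexicon are illustrative inventions.
RULES = [
    ("iness", "y"),    # happiness -> happy
    ("ation", "ate"),  # creation -> create
    ("ness", ""),      # kindness -> kind
]

LEXICON = {"happy", "kind", "create", "bus"}  # toy validity check

def derive_base(word, rules=RULES, lexicon=LEXICON):
    """Apply the first rule whose output is lexically valid.

    Rule order encodes precedence: longer suffixes are tried first,
    so 'iness' wins over 'ness' for words like 'happiness'. The
    lexical validity requirement curbs overgeneration: a candidate
    base form is accepted only if it is a known word.
    """
    for suffix, replacement in rules:
        if word.endswith(suffix):
            candidate = word[: -len(suffix)] + replacement
            if candidate in lexicon:
                return candidate
    return None

print(derive_base("happiness"))  # -> happy
print(derive_base("creation"))   # -> create
print(derive_base("business"))   # -> None ('busy' absent from toy lexicon)
```

Note how "business" is correctly rejected rather than mis-analysed as "busi" + "ness"; in the full model such exceptions are handled by stoplists developed iteratively.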
Abstract:
The two-way design has been variously described as a matched-sample F-test, a simple within-subjects ANOVA, a one-way within-groups ANOVA, a simple correlated-groups ANOVA, and a one-factor repeated measures design! This confusion of terminology is likely to lead to problems in correctly identifying this analysis within commercially available software. The essential feature of the design is that each treatment is allocated by randomisation to one experimental unit within each group or block. The block may be a plot of land, a single occasion on which the experiment was performed, or a human subject. The ‘blocking’ is designed to remove an aspect of the error variation and so increase the ‘power’ of the experiment. If there is no significant source of variation associated with the ‘blocking’, however, the two-way design is at a disadvantage: the DF of the error term are reduced compared with a fully randomised design, thus reducing the ‘power’ of the analysis.
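The F-test for this randomised block design can be computed from first principles; the data below are invented, with rows as blocks (e.g. subjects) and columns as treatments:

```python
import numpy as np

# Randomised block ANOVA: one observation per block x treatment cell.
# Rows are blocks, columns are treatments; data invented for illustration.
data = np.array([
    [4.0, 6.0, 5.0],
    [5.0, 7.0, 7.0],
    [3.0, 5.0, 4.0],
    [6.0, 9.0, 8.0],
])
b, t = data.shape
grand = data.mean()

ss_treat = b * ((data.mean(axis=0) - grand) ** 2).sum()
ss_block = t * ((data.mean(axis=1) - grand) ** 2).sum()
ss_total = ((data - grand) ** 2).sum()
ss_error = ss_total - ss_treat - ss_block  # blocking removes block variation

df_treat, df_block = t - 1, b - 1
df_error = df_treat * df_block  # reduced error DF: the cost of blocking
F_treat = (ss_treat / df_treat) / (ss_error / df_error)
print(F_treat, df_treat, df_error)
```

In this toy data the blocks differ substantially, so removing the block sum of squares from the error term sharply increases the treatment F ratio, illustrating the gain in power when blocking is effective.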
Abstract:
This research identifies factors which influence the consumption of potable water supplied to customers' properties. A complete spectrum of the customer base is examined, including household, commercial and industrial properties. The research considers information from around the world, particularly demand management and tariff-related projects from North America. A device termed the Flow Moderator was developed and proven, through extensive trials, to conserve water at a rate equivalent to 40 litres/property/day whilst maintaining standards-of-service considerably in excess of regulatory requirements. A detailed appraisal of the Moderator underlines the costs and benefits available to the industry through deliberate application of even mild demand management. More radically, the concept of a charging policy utilising the Moderator is developed and appraised. Advantages include the lower costs of conventional fixed-price charging systems coupled with the conservation and equitability aspects associated with metering. Explanatory models linking consumption to a range of variables were developed; these demonstrated that households served by a communal water service-pipe (known in the UK as a shared supply) are subject to associated restrictions equivalent to -180 litres/property/day. The research confirmed that occupancy levels were a significant predictive element for household, commercial and industrial customers. The occurrence of on-property leakage, recorded as an event, was also demonstrated to be a significant factor, one which offers considerable scope for demand management in its own right.
Abstract:
This book is aimed primarily at microbiologists who are undertaking research and who require a basic knowledge of statistics to analyse their experimental data. Computer software employing a wide range of data analysis methods is widely available to experimental scientists. The availability of this software, however, makes it essential that investigators understand the basic principles of statistics. Statistical analysis of data can be complex, with many different methods of approach, each of which applies in a particular experimental circumstance. Hence, it is possible to apply an incorrect statistical method to data and to draw the wrong conclusions from an experiment. The purpose of this book, which has its origin in a series of articles published in the Society for Applied Microbiology journal ‘The Microbiologist’, is to present the basic logic of statistics as clearly as possible and thereby to dispel some of the myths that often surround the subject. The 28 ‘Statnotes’ deal with various topics that are likely to be encountered, including the nature of variables, the comparison of means of two or more groups, non-parametric statistics, analysis of variance, correlating variables, and more complex methods such as multiple linear regression and principal components analysis. In each case, the relevant statistical method is illustrated with examples drawn from experiments in microbiological research. The text incorporates a glossary of the most commonly used statistical terms, and there are two appendices designed to aid the investigator in the selection of the most appropriate test.
Abstract:
This study makes two main contributions to knowledge. First, it rigorously investigates the relationships among a number of factors believed to affect climate perception, classified into three types, in order to test the hypothesis that qualification and personal factors, in contrast with situational factors, play an important role in shaping climate perception. Second, the study reclusters the items of a widely applied climate measurement scale, the HAY scale, in order to overcome cross-cultural differences between Kuwaiti and American society and to arrive at modified dimensions of climate for a civil service organisation in Kuwait. Furthermore, the study carries out a diagnostic test of the climate of the Ministry of Public Health (MoPH) in Kuwait, aiming to diagnose the perceived characteristics of the MoPH organisation, and suggests a number of areas requiring attention if an improvement is to be introduced. Statistical and computing facilities were used extensively to make the analysis more representative of the field data, and the study is characterised by the very high response rate of the main survey, which strengthens the reliability of the findings. Three main field studies are included: the first conducted the main questionnaire; the second measured the "should be" climate as judged by MoPH experts using the Delphi technique; and the third comprised extensive meetings with the very top management team in MoPH. Results of the first stage were subject to cluster analysis for the reconstruction of the HAY tool, and comparative analysis was carried out between the results of the second and third stages on one side and the first on the other.
Abstract:
The purlin-sheeting system has been the subject of numerous theoretical and experimental investigations over the past 30 years, but the complexity of the problem has made it difficult to develop a sound and general model. The primary aim of the thesis is to investigate the failure behaviour of cold-formed zed and channel sections used in purlin-sheeting systems. Both the energy method and the finite strip method are used to develop an approach for investigating cold-formed zed and channel section beams with partial lateral restraint from the metal sheeting when subjected to a uniformly distributed transverse load. The stress analysis of such beams is first investigated using an analytical model based on the energy method, in which the restraint actions of the sheeting are represented by two springs modelling the translational and rotational restraints. The numerical results show that the two springs have significantly different influences on the stresses in the beams; their influence has also been found to depend on the anti-sag bar and the position of the loading line. A novel method is then presented for analysing the elastic local buckling behaviour of these beams, in which the cross-sectional stresses with the largest compressive stress are input into the finite strip analysis. Using this method, the individual influences on the buckling behaviour of warping stress, partial lateral restraint from the sheeting, the dimensions of the cross-section and the position of the loading line are investigated.
Abstract:
This thesis describes work undertaken in order to fulfil a need experienced in the Department of Educational Enquiry at the University of Aston in Birmingham for speech analysis facilities suitable for use in teaching and research work within the Department. The hardware and software developed during the research project provides displays of speech fundamental frequency and intensity in real time. The system is suitable for the provision of visual feedback of these parameters of a subject's speech in a learning situation, and overcomes the inadequacies of equipment currently used for this task in that it provides a clear indication of fundamental frequency contours as the subject is speaking. The thesis considers the use of such equipment in several related fields, and the approaches that have been reported to one of the major problems of speech analysis, namely pitch-period estimation. A number of different systems are described, and their suitability for the present purposes is discussed. Finally, a novel method of pitch-period estimation is developed, and a speech analysis system incorporating this method is described. Comparison is made between the results produced by this system and those produced by a conventional speech spectrograph.
Abstract:
Previous studies into student volunteering have shown how formally organized volunteering activities have social, economic and practical benefits for student volunteers and the recipients of their volunteerism (Egerton, 2002; Vernon & Foster, 2002); moreover, student volunteering provides the means by which undergraduates are able to acquire and hone transferable skills sought by employers following graduation (Eldridge & Wilson, 2003; Norris et al., 2006). Although much is known about the benefits of student volunteering, few previous studies have focused on the pedagogical value of student mentoring from the perspectives of both student mentee and mentor. Utilising grounded theory methodology, this paper provides a critical analysis of an exploratory study analysing students’ perceptions of the pedagogical and social outcomes of student mentoring. It looks at students’ perceptions of mentoring, and of being mentored, in terms of the learning experience and the development of knowledge and skills. In doing so, the paper considers how volunteering in a mentoring capacity adds ‘value’ to students’ experiences of higher education. From a public policy perspective, the economic, educational, vocational and social outcomes of student volunteering in general, and student mentoring in particular, make this an important subject meriting investigation. In terms of employability, the role of mentoring in equipping mentors and mentees with transferable employability competencies has not previously been investigated. By critiquing the mentoring experiences of undergraduates within a single institution, this paper makes an important contribution to policy debates regarding the pedagogical and employability-related outcomes of student volunteering and mentoring.
Abstract:
Purpose - Measurements obtained from the right and left eye of a subject are often correlated whereas many statistical tests assume observations in a sample are independent. Hence, data collected from both eyes cannot be combined without taking this correlation into account. Current practice is reviewed with reference to articles published in three optometry journals, viz., Ophthalmic and Physiological Optics (OPO), Optometry and Vision Science (OVS), Clinical and Experimental Optometry (CEO) during the period 2009–2012. Recent findings - Of the 230 articles reviewed, 148/230 (64%) obtained data from one eye and 82/230 (36%) from both eyes. Of the 148 one-eye articles, the right eye, left eye, a randomly selected eye, the better eye, the worse or diseased eye, or the dominant eye were all used as selection criteria. Of the 82 two-eye articles, the analysis utilized data from: (1) one eye only rejecting data from the adjacent eye, (2) both eyes separately, (3) both eyes taking into account the correlation between eyes, or (4) both eyes using one eye as a treated or diseased eye, the other acting as a control. In a proportion of studies, data were combined from both eyes without correction. Summary - It is suggested that: (1) investigators should consider whether it is advantageous to collect data from both eyes, (2) if one eye is studied and both are eligible, then it should be chosen at random, and (3) two-eye data can be analysed incorporating eyes as a ‘within subjects’ factor.
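Recommendation (3) above, treating eye as a ‘within subjects’ factor, can be contrasted with incorrectly pooling both eyes as independent observations. A minimal sketch with invented right/left-eye measurements for five subjects:

```python
import numpy as np
from scipy import stats

# Invented right/left-eye measurements for five subjects; values and
# the left-eye shift are illustrative only. Eyes within a subject are
# highly correlated, so the 10 values are not independent observations.
right = np.array([20.1, 19.8, 21.0, 20.5, 19.9])
left = right + np.array([0.5, 0.7, 0.6, 0.4, 0.8])  # small left-eye shift

# Correct: eye as a 'within subjects' factor (paired comparison).
t_paired, p_paired = stats.ttest_rel(left, right)

# Incorrect: pooling both eyes as independent samples; the shared
# between-subject variation inflates the error term.
t_pooled, p_pooled = stats.ttest_ind(left, right)

print(round(p_paired, 4), round(p_pooled, 4))
```

With these data the paired analysis detects the between-eye difference (p < 0.01) while the incorrectly pooled analysis does not (p > 0.05), illustrating why the correlation between eyes must be taken into account.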
Abstract:
As student numbers in higher education in the UK have expanded in recent years, it has become increasingly important to understand its cost structure. This study applies Data Envelopment Analysis (DEA) to higher education institutions in England to assess their cost structure, efficiency and productivity. The paper complements an earlier study that used parametric methods to analyse the same panel data. Interestingly, DEA provides estimates of subject-specific unit costs that are in the same ballpark as those provided by the parametric methods. The paper then extends the previous analysis and finds that further increases in student numbers of the order of 20–27% are feasible through exploiting operating and scale efficiency gains and adjusting the student mix. Finally, the paper uses a Malmquist index approach to assess productivity change in UK higher education. The results reveal that for a majority of institutions productivity actually decreased during the study period.
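A standard building block of such DEA studies is the input-oriented CCR efficiency score, obtained from the envelopment linear programme min θ subject to Σ λ_j x_j ≤ θ x_0, Σ λ_j y_j ≥ y_0, λ ≥ 0. A hedged sketch with scipy.optimize.linprog on toy single-input/single-output data (not the study's institutional panel):

```python
import numpy as np
from scipy.optimize import linprog

# Toy data: 3 decision-making units, 1 input, 1 output (invented).
X = np.array([[2.0, 4.0, 3.0]])  # inputs, shape (m, n)
Y = np.array([[2.0, 2.0, 3.0]])  # outputs, shape (s, n)

def ccr_efficiency(j0, X, Y):
    """Input-oriented CCR efficiency of unit j0 via the envelopment LP.

    Decision variables: theta followed by lambda_1..lambda_n.
    """
    m, n = X.shape
    s = Y.shape[0]
    c = np.zeros(1 + n)
    c[0] = 1.0  # minimise theta
    # Input rows: -theta * x_{i,j0} + sum_j lambda_j x_{ij} <= 0
    A_in = np.hstack([-X[:, [j0]], X])
    b_in = np.zeros(m)
    # Output rows: -sum_j lambda_j y_{rj} <= -y_{r,j0}
    A_out = np.hstack([np.zeros((s, 1)), -Y])
    b_out = -Y[:, j0]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([b_in, b_out]),
                  bounds=[(0, None)] * (1 + n))
    return res.fun

effs = [ccr_efficiency(j, X, Y) for j in range(3)]
print([round(e, 4) for e in effs])  # unit 2 wastes half its input
```

Units 1 and 3 lie on the efficient frontier (score 1), while unit 2 could produce its output with half its input (score 0.5), the kind of operating efficiency gain the study quantifies at sector scale.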
Abstract:
Purpose – This paper seeks answers to four questions. Two of these questions have been borrowed (but adapted) from the work of Defee et al.: RQ1. To what extent is theory used in purchasing and supply chain management (P&SCM) research? RQ2. What are the prevalent theories to be found in P&SCM research? Following on from these questions, an additional question is posed: RQ3. Are theory-based papers more highly cited than papers with no theoretical foundation? Finally, drawing on the work of Harland et al., the authors have added a fourth question: RQ4. To what extent does P&SCM meet the tests of coherence, breadth and depth, and quality necessary to make it a scientific discipline? Design/methodology/approach – A systematic literature review was conducted, in accordance with the model outlined by Tranfield et al., for three journals within the field of “purchasing and supply chain management”. In total 1,113 articles were reviewed. In addition, a citation analysis covering 806 articles was completed. Findings – The headline features from the results suggest that nearly a decade-and-a-half on from its development, the field still lacks coherence. Theory is absent from much of the work, and although theory-based articles achieved on average a higher number of citations than non-theoretical papers, there is no obvious contender as an emergent paradigm for the discipline. Furthermore, it is evident that P&SCM does not meet Fabian's test necessary to make it a scientific discipline and is still some way from being a normal science. Research limitations/implications – This study would have benefited from the analysis of further journals; however, the analysis of 1,113 articles from three leading journals in the field of P&SCM was deemed sufficient in scope. In addition, a further significant line of enquiry to follow is the rigour vs relevance debate.
Practical implications – This article is of interest to both an academic and a practitioner audience as it highlights the use of theories in P&SCM. Furthermore, this article raises a number of important questions: should research in this area draw more heavily on theory and, if so, which theories are appropriate? Social implications – The broader social implications relate to the discussion of how a scientific discipline develops, building on the work of Fabian and Amundson. Originality/value – The data set for this study is significant and builds on a number of previous literature reviews. This review is both greater in scope and broader in its subject focus than previous reviews. In addition, the citation analysis (not previously conducted in any of the reviews) and the accompanying statistical test highlight that theory-based articles are more highly cited than non-theoretically based papers. This could indicate that researchers are attempting to build on one another's work.
Abstract:
Selecting the best alternative in group decision making is the subject of many recent studies. The most popular method proposed for ranking the alternatives is based on the distance of each alternative from the ideal alternative. The ideal alternative may never exist, however, so the ranking results are biased towards the ideal point. The main aim of this study is to calculate a fuzzy ideal point that is more realistic than the crisp ideal point. In addition, Data Envelopment Analysis (DEA) has recently been used to find the optimal weights for ranking the alternatives. This paper proposes a four-stage approach, based on DEA in a fuzzy environment, to aggregate preference rankings. An application to a preferential voting system shows how the new model can be applied to rank a set of alternatives. Two further examples demonstrate the advantages of the proposed method compared with other suggested methods.
Abstract:
WHAT IS ALREADY KNOWN ABOUT THIS SUBJECT • The cytotoxic effects of 6-mercaptopurine (6-MP) were found to be due to drug-derived intracellular metabolites (mainly 6-thioguanine nucleotides and to some extent 6-methylmercaptopurine nucleotides) rather than the drug itself. • Current empirical dosing methods for oral 6-MP result in highly variable drug and metabolite concentrations and hence variability in treatment outcome. WHAT THIS STUDY ADDS • The first population pharmacokinetic model has been developed for 6-MP active metabolites in paediatric patients with acute lymphoblastic leukaemia, and the potential demographic and genetically controlled factors that could lead to interpatient pharmacokinetic variability in this population have been assessed. • The model shows a large reduction in interindividual variability of pharmacokinetic parameters when body surface area and thiopurine methyltransferase polymorphism are incorporated into the model as covariates. • The developed model offers a more rational dosing approach for 6-MP than the traditional empirical method (based on body surface area) by combining it with pharmacogenetically guided dosing based on thiopurine methyltransferase genotype. AIMS - To investigate the population pharmacokinetics of 6-mercaptopurine (6-MP) active metabolites in paediatric patients with acute lymphoblastic leukaemia (ALL) and examine the effects of various genetic polymorphisms on the disposition of these metabolites. METHODS - Data were collected prospectively from 19 paediatric patients with ALL (n = 75 samples, 150 concentrations) who received 6-MP maintenance chemotherapy (titrated to a target dose of 75 mg m⁻² day⁻¹). All patients were genotyped for polymorphisms in three enzymes involved in 6-MP metabolism. Population pharmacokinetic analysis was performed with the nonlinear mixed-effects modelling program NONMEM to determine the population mean parameter estimate of clearance for the active metabolites.
RESULTS - The developed model revealed considerable interindividual variability (IIV) in the clearance of 6-MP active metabolites [6-thioguanine nucleotides (6-TGNs) and 6-methylmercaptopurine nucleotides (6-mMPNs)]. Body surface area explained a significant part of the IIV in 6-TGN clearance when incorporated into the model (IIV reduced from 69.9% to 29.3%). The most influential covariate examined, however, was thiopurine methyltransferase (TPMT) genotype, which resulted in the greatest reduction in the model's objective function (P < 0.005) when incorporated as a covariate affecting the fractional metabolic transformation of 6-MP into 6-TGNs. The other genetic covariates tested were not statistically significant and were therefore not included in the final model. CONCLUSIONS - The developed pharmacokinetic model (if successful at external validation) would offer a more rational dosing approach for 6-MP than the traditional empirical method, since it combines the current practice of using body surface area in 6-MP dosing with pharmacogenetically guided dosing based on TPMT genotype.
Abstract:
Background: Poor diet is thought to be a risk factor for many diseases, including age-related macular disease (ARMD), which is the leading cause of blind registration in those aged over 60 years in the developed world. The aims of this study were 1) to evaluate the dietary food intake of three subject groups: participants under the age of 50 years without ARMD (U50), participants over the age of 50 years without ARMD (O50), and participants with ARMD, and 2) to obtain information on nutritional supplement usage. Methods: A prospective cross-sectional study designed in a clinical practice setting. Seventy-four participants were divided into three groups: U50, 20 participants aged under 50 years (range 21 to 40; mean ± SD, 37.7 ± 10.1 years); O50, 27 participants aged over 50 years (range 52 to 77; 62.7 ± 6.8 years); and ARMD, 27 participants aged over 50 years with ARMD (range 55 to 79; 66.0 ± 5.8 years). Participants were issued with a three-day food diary and were also asked to provide details of any daily nutritional supplements. The diaries were analysed using FoodBase 2000 software. Data were input by one investigator and statistically analysed using Microsoft Excel for Microsoft Windows XP software, employing unpaired t-tests. Results: Group O50 consumed significantly more vitamin C (t = 3.049, p = 0.005) and significantly more fibre (t = 2.107, p = 0.041) than group U50. Group ARMD consumed significantly more protein (t = 3.487, p = 0.001) and zinc (t = 2.252, p = 0.029) than group O50. The ARMD group consumed the highest percentage of specific ocular health supplements, and the U50 group consumed the most multivitamins. Conclusions: We did not detect a deficiency of any specific nutrient in the diets of those with ARMD compared with age- and gender-matched controls. ARMD patients may be aware of research into the use of nutritional supplementation to prevent progression of their condition.
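The group comparisons above rely on unpaired (independent-samples) t-tests. A minimal sketch with invented daily vitamin C intakes for two illustrative groups (not the study's data):

```python
from scipy import stats

# Invented daily vitamin C intakes (mg) for two illustrative groups of
# six participants each; values are not from the study.
u50 = [45.0, 60.0, 52.0, 70.0, 48.0, 55.0]
o50 = [75.0, 90.0, 82.0, 68.0, 95.0, 88.0]

# Unpaired t-test: the two groups contain different, unrelated subjects.
t_stat, p_value = stats.ttest_ind(u50, o50)
print(round(t_stat, 3), round(p_value, 4))
```

The unpaired form is appropriate here because the groups comprise different individuals; had the same subjects been measured twice, a paired test would be required instead.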