45 results for automation of fit analysis
in Aston University Research Archive
Abstract:
An ultrasonic thermometer has been developed for high-temperature measurement over a wide temperature range. It is particularly suitable for use in measuring nuclear fuel rod centerline temperatures in advanced liquid metal and high-flux nuclear reactors. The thermometer, which was designed to determine fuel temperature up to the fuel melting point, utilizes the temperature dependence of the ultrasonic propagation velocity (related to the elastic modulus) in a thin rod sensor as the temperature-transducing mechanism. A pulse excitation technique has been used, in which the mechanical resonator at the remote end of the acoustic line is made to vibrate. Its natural frequency is proportional to the ultrasonic velocity in the material. This is measured by the electronic instrumentation and enables a frequency-temperature or period-temperature calibration to be obtained. A completely digital automatic instrument has been designed, constructed and tested to track the resonance frequency of the temperature sensors. It operates smoothly over a frequency range of about 30%, more than the maximum working range of most probe materials. The control uses the basic property of a resonator that the stored energy decays exponentially at the natural frequency of the resonator. The operation of the electronic system is based on a digital multichannel transmitter that is capable of operating with a predefined number of cycles in the burst. This overcomes a basic defect in the previous design, where the analogue time-delayed circuits failed to hold synchronization and hence automatic control could be lost. Development of a particular type of temperature probe, small enough to fit into a standard 2 mm reactor tube, has made the ultrasonic thermometer a practicable device for measuring fuel temperature. The bulkiness of previous probes has been overcome: the new design consists of a tuning fork, integral with a 1 mm line, while maintaining a frequency of no more than 100 kHz. A magnetostrictive rod, acoustically matched to the probe, is used to launch and receive the acoustic oscillations. This requires a magnetic bias, and the previously used bulky magnets have been replaced by a direct-current coil. The probe is supported by terminating the launcher with a short, heavy isolating rod which can be secured to the reactor structure. This support, the bias and launching coil, and the launcher are made up into a single compact unit. On the materials side, an extensive study of a wide range of refractory materials identified molybdenum, iridium, rhenium and tungsten as satisfactory for a number of applications, but most exhibited to some degree a calibration drift with thermal cycling. When attention was directed to ceramic materials, sapphire (single-crystal alumina) was found to have numerous advantages, particularly in respect of stability of calibration, which remained within ±2°C after many cycles to 1800°C. Tungsten and thoriated tungsten (W – 2% ThO2) were also found to be quite satisfactory to 1600°C, the specification for a Euratom application.
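A minimal numerical sketch of the transduction principle described in this abstract, not the instrument's firmware or design values: extensional-wave velocity in a thin rod is v = sqrt(E/rho), and the fundamental longitudinal resonance of a simple free-free rod of length L is f = v/(2L), so a temperature-dependent elastic modulus E(T) appears as a measurable shift in resonance frequency or period. All numbers below are illustrative assumptions, not the probe's actual geometry (the real sensor is a tuning fork on a 1 mm line).

```python
import math

def rod_resonance_hz(E_pa: float, rho_kg_m3: float, length_m: float) -> float:
    """Fundamental longitudinal resonance of a thin free-free rod."""
    velocity = math.sqrt(E_pa / rho_kg_m3)   # extensional wave speed, m/s
    return velocity / (2.0 * length_m)

rho = 3980.0      # kg/m^3, roughly single-crystal alumina (sapphire)
length = 0.05     # hypothetical 50 mm rod, keeping f near 100 kHz

# A falling modulus with rising temperature maps onto a falling resonance frequency
for E_gpa in (400.0, 380.0, 360.0):
    f = rod_resonance_hz(E_gpa * 1e9, rho, length)
    print(f"E = {E_gpa:5.1f} GPa  ->  f = {f/1e3:6.2f} kHz")
```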
Abstract:
This study was concerned with the computer automation of land evaluation. This is a broad subject with many issues to be resolved, so the study concentrated on three key problems: knowledge-based programming; the integration of spatial information from remote sensing and other sources; and the inclusion of socio-economic information in the land evaluation analysis. Land evaluation and land use planning were considered in the context of overseas projects in the developing world. Knowledge-based systems were found to provide significant advantages over conventional programming techniques for some aspects of the land evaluation process. Declarative languages, in particular Prolog, were ideally suited to the integration of social information, which changes with every situation. Rule-based expert system shells were also found to be suitable for this role, including knowledge acquisition at the interview stage. All the expert system shells examined suffered from severe constraints on problem size, but new products now overcome this. Inductive expert system shells were useful as a guide to knowledge gaps and possible relationships, but the number of examples required was unrealistic for typical land use planning situations. The accuracy of classified satellite imagery was significantly enhanced by integrating spatial information on soil distribution for the Thailand data. Estimates of the rice-producing area were substantially improved (a 30% change in area) by the addition of soil information. Image-processing work on Mozambique showed that satellite remote sensing was a useful tool for stratifying vegetation cover at provincial level to identify key development areas, but its full utility could not be realised on typical planning projects without treatment as part of a complete spatial information system.
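A minimal rule-based sketch, not the study's Prolog code, of how declarative land-evaluation rules of the kind discussed above might be expressed: each rule maps observed land characteristics to a suitability class for a given land use. The attribute names, thresholds and the land use are invented for illustration.

```python
def land_suitability(land: dict) -> str:
    """Classify a mapping unit for a hypothetical rain-fed rice land use."""
    if land["slope_pct"] > 8 or land["soil_depth_cm"] < 30:
        return "not suitable"
    if land["drainage"] == "poor" and land["rainfall_mm"] < 1000:
        return "marginally suitable"
    if land["soil_depth_cm"] >= 50 and land["rainfall_mm"] >= 1200:
        return "highly suitable"
    return "moderately suitable"

unit = {"slope_pct": 2, "soil_depth_cm": 60,
        "drainage": "moderate", "rainfall_mm": 1350}
print(land_suitability(unit))   # -> "highly suitable"
```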
Abstract:
Recently, Drǎgulescu and Yakovenko proposed an analytical formula for computing the probability density function of stock log returns, based on the Heston model, which they tested empirically. Their research design inadvertently biased the fit of the data towards the Heston model, thus overstating their empirical results. Furthermore, Drǎgulescu and Yakovenko did not perform any goodness-of-fit statistical tests. This study employs a research design that facilitates statistical tests of the goodness-of-fit of the Heston model to empirical returns. Robustness checks are also performed. In brief, the Heston model outperformed the Gaussian model only at high frequencies, and even then it did not provide a statistically acceptable fit to the data. The Gaussian model performed (marginally) better at medium and low frequencies, at which the extra parameters of the Heston model had an adverse impact on the test statistics. © 2005 Taylor & Francis Group Ltd.
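A minimal sketch, not the study's code, of the kind of goodness-of-fit test described above: fit a Gaussian to log returns and apply a Kolmogorov–Smirnov test; the Heston density would replace the Gaussian CDF in the same framework. The input file name and data are hypothetical.

```python
import numpy as np
from scipy import stats

prices = np.loadtxt("prices.csv")          # hypothetical series of closing prices
log_returns = np.diff(np.log(prices))      # log returns at the chosen frequency

mu, sigma = log_returns.mean(), log_returns.std(ddof=1)   # Gaussian fit
ks_stat, p_value = stats.kstest(log_returns, "norm", args=(mu, sigma))

print(f"KS statistic = {ks_stat:.4f}, p-value = {p_value:.4f}")
# A small p-value indicates the fitted Gaussian is statistically rejected,
# mirroring the per-frequency model comparisons reported in the abstract.
```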
Abstract:
This article explains, first, why a knowledge of statistics is necessary and describes the role that statistics plays in an experimental investigation. Second, the normal distribution is introduced, which describes the natural variability shown by many measurements in optometry and vision sciences. Third, the application of the normal distribution to some common statistical problems is described, including how to determine whether an individual observation is a typical member of a population and how to determine the confidence interval for a sample mean.
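A minimal illustration, not taken from the article, of the two problems the abstract lists: judging whether a single observation is typical of a normally distributed population, and constructing a confidence interval for a sample mean. The population values and sample are invented.

```python
import numpy as np
from scipy import stats

# Is a single observation typical of a hypothetical reference population?
pop_mean, pop_sd = 16.0, 2.0
observation = 21.5
z = (observation - pop_mean) / pop_sd
two_tailed_p = 2 * stats.norm.sf(abs(z))
print(f"z = {z:.2f}, P(|Z| >= {abs(z):.2f}) = {two_tailed_p:.3f}")

# 95% confidence interval for the mean of a small hypothetical sample
sample = np.array([15.2, 17.1, 16.4, 18.0, 15.9, 16.8])
mean, sem = sample.mean(), stats.sem(sample)
ci_low, ci_high = stats.t.interval(0.95, df=len(sample) - 1, loc=mean, scale=sem)
print(f"mean = {mean:.2f}, 95% CI = ({ci_low:.2f}, {ci_high:.2f})")
```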
Abstract:
This thesis seeks to describe the development of an inexpensive and efficient clustering technique for multivariate data analysis. The technique starts from a multivariate data matrix and ends with a graphical representation of the data and a pattern recognition discriminant function. The technique also produces a distances frequency distribution that may be useful in detecting clustering in the data or for estimating parameters useful in discriminating between the different populations in the data. The technique can also be used in feature selection. The technique is essentially for the discovery of data structure by revealing the component parts of the data. The thesis offers three distinct contributions to cluster analysis and pattern recognition techniques. The first contribution is the introduction of a transformation function into the technique of nonlinear mapping. The second contribution is the use of the distances frequency distribution instead of the distances time-sequence in nonlinear mapping. The third contribution is the formulation of a new generalised and normalised error function together with its optimal step-size formula for gradient-method minimisation. The thesis consists of five chapters. The first chapter is the introduction. The second chapter describes multidimensional scaling as an origin of the nonlinear mapping technique. The third chapter describes the first development step in the technique of nonlinear mapping: the introduction of the "transformation function". The fourth chapter describes the second development step of the nonlinear mapping technique: the use of the distances frequency distribution instead of the distances time-sequence. The chapter also includes the formulation of the new generalised and normalised error function. Finally, the fifth chapter, the conclusion, evaluates all the developments and proposes a new program for cluster analysis and pattern recognition that integrates all the new features.
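A minimal sketch, not the thesis implementation, of one idea described above: summarising all pairwise inter-point distances by their frequency distribution (a histogram), whose shape can suggest clustering in the data. The simulated two-cluster data are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
# Two hypothetical clusters in 5-dimensional space
data = np.vstack([rng.normal(0.0, 1.0, (30, 5)),
                  rng.normal(6.0, 1.0, (30, 5))])

# All pairwise Euclidean distances (upper triangle only, no duplicates)
diff = data[:, None, :] - data[None, :, :]
dists = np.sqrt((diff ** 2).sum(axis=-1))
upper = dists[np.triu_indices(len(data), k=1)]

# Frequency distribution of distances; a bimodal shape (within-cluster vs
# between-cluster distances) suggests the data contain distinct groups.
counts, edges = np.histogram(upper, bins=20)
for c, lo, hi in zip(counts, edges[:-1], edges[1:]):
    print(f"{lo:6.2f} - {hi:6.2f}: {'*' * int(c)}")
```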
Abstract:
The concept of a task is fundamental to the discipline of ergonomics. Approaches to the analysis of tasks began in the early 1900s. These approaches have evolved and developed to the present day, when there is a vast array of methods available. Some of these methods are specific to particular contexts or applications, others more general. However, whilst many of these analyses allow tasks to be examined in detail, they do not act as tools to aid the design process or the designer. The present thesis examines the use of task analysis in a process control context, and in particular the use of task analysis to specify operator information and display requirements in such systems. The first part of the thesis examines the theoretical aspect of task analysis and presents a review of the methods, issues and concepts relating to task analysis. A review of over 80 methods of task analysis was carried out to form a basis for the development of a task analysis method to specify operator information requirements in industrial process control contexts. Of the methods reviewed, Hierarchical Task Analysis was selected to provide such a basis and was developed to meet the criteria outlined for such a method of task analysis. The second section outlines the practical application and evolution of the developed task analysis method. Four case studies were used to examine the method in an empirical context. The case studies represent a range of plant contexts and types: complex and simple, batch and continuous, and high-risk and low-risk processes. The theoretical and empirical issues are drawn together, and a method is developed to provide a task analysis technique to specify operator information requirements and to provide the first stages of a tool to aid the design of VDU displays for process control.
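A minimal sketch, not the thesis method itself, of representing a hierarchical task analysis as a tree in which each subtask records the information the operator needs to perform it, so that display requirements can be collected by walking the hierarchy. The task names and information items are invented for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class Task:
    name: str
    information: list = field(default_factory=list)   # operator information needs
    subtasks: list = field(default_factory=list)

    def information_requirements(self) -> list:
        """Collect information requirements from this task and all its subtasks."""
        items = list(self.information)
        for sub in self.subtasks:
            items.extend(sub.information_requirements())
        return items

plan = Task("Start up plant section", subtasks=[
    Task("Establish coolant flow", information=["coolant flow rate", "pump status"]),
    Task("Raise temperature to setpoint", information=["vessel temperature", "setpoint"]),
])
print(plan.information_requirements())
```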
Abstract:
It has been widely recognised that an in-depth textual analysis of a source text is relevant for translation. This book discusses the role of discourse analysis for translation and translator training. One particular model of discourse analysis is presented in detail, and its application in the context of translator training is critically examined.
Abstract:
DUE TO COPYRIGHT RESTRICTIONS ONLY AVAILABLE FOR CONSULTATION AT ASTON UNIVERSITY LIBRARY AND INFORMATION SERVICES WITH PRIOR ARRANGEMENT
Abstract:
The accurate in silico identification of T-cell epitopes is a critical step in the development of peptide-based vaccines, reagents, and diagnostics. It has a direct impact on the success of subsequent experimental work. Epitopes arise as a consequence of complex proteolytic processing within the cell. Prior to being recognized by T cells, an epitope is presented on the cell surface as a complex with a major histocompatibility complex (MHC) protein. A prerequisite for T-cell recognition is therefore that an epitope is also a good MHC binder. Thus, T-cell epitope prediction overlaps strongly with the prediction of MHC binding. In the present study, we compare discriminant analysis and multiple linear regression as algorithmic engines for the definition of quantitative matrices for binding affinity prediction. We apply these methods to peptides which bind the well-studied human MHC allele HLA-A*0201. A matrix obtained by combining the results of the two methods proved powerfully predictive under cross-validation. The new matrix was also tested on an external set of 160 binders to HLA-A*0201; it was able to recognize 135 (84%) of them.
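A minimal sketch, not the study's matrix, of how an additive quantitative matrix of the kind described above predicts binding: each residue at each peptide position contributes an independent coefficient, and the sum is the predicted binding score. All coefficient values here are random placeholders, not fitted parameters.

```python
import random

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
PEPTIDE_LENGTH = 9  # HLA-A*0201 binders are typically 9-mers

# Hypothetical 9 x 20 matrix of per-position, per-residue contributions
random.seed(0)
matrix = [{aa: random.uniform(-0.5, 0.5) for aa in AMINO_ACIDS}
          for _ in range(PEPTIDE_LENGTH)]

def score(peptide: str) -> float:
    """Additive score of a 9-mer under the quantitative matrix."""
    assert len(peptide) == PEPTIDE_LENGTH
    return sum(matrix[i][aa] for i, aa in enumerate(peptide))

# Example: score a well-known HLA-A*0201-binding 9-mer (illustrative only)
print(f"{score('LLFGYPVYV'):.3f}")
```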
Abstract:
Purpose – This paper aims to clarify what ‘narrative analysis’ may entail when it is assumed that interview accounts can be treated as (collections of) narratives. What is considered a narrative and how these may be analyzed is open to debate. After suggesting an approach of how to deal with narrative analysis, the authors critically discuss how far it might offer insights into a particular accounting case. Design/methodology/approach – After having explained what the authors’ view on narrative analysis is, and how this is linked with the extant literature, the authors examine the socialisation processes of two early career accountants that have been articulated in an interview context. Findings – The approach to narrative analysis set out in this paper could help to clarify how and why certain interpretations from an interview are generated by a researcher. The authors emphasise the importance of discussing a researcher’s process of discovery when an interpretive approach to research is adopted. Research limitations/implications – The application of any method, and what a researcher thinks can be distilled from this, depends on the research outlook he/she has. As the authors adopt an interpretive approach to research in this paper, they acknowledge that the interpretations of narratives, and what they deem to be narratives, will be infused by their own perceptions. Practical implications – The authors believe that the writing-up of qualitative research from an interpretive stance would benefit from an explicit acceptance of the equivocal nature of interpretation. The way in which they present and discuss the narrative analyses in this paper intends to bring this to the fore. Originality/value – Whenever someone says he/she engages in narrative analysis, both the “narrative” and “analysis” part of “narrative analysis” need to be explicated. The authors believe that this only happens every so often. This paper puts forward an approach of how more clarity on this might be achieved by combining two frameworks in the extant literature, so that the transparency of the research is enhanced.
Abstract:
This study examined the link between employees’ adult attachment orientations and perceptions of line managers’ interpersonal justice behaviors, and the moderating effect of national culture (collectivism). Participants from countries categorized as low collectivistic (N = 205) and high collectivistic (N = 136) completed an online survey. Attachment anxiety and avoidance were negatively related to interpersonal justice perceptions. Cultural differences did not moderate the effects of avoidance. However, the relationship between attachment anxiety and interpersonal justice was non-significant in the Southern Asia (more collectivistic) cultural cluster. Our findings indicate the importance of ‘fit’ between cultural relational values and individual attachment orientations in shaping interpersonal justice perceptions, and highlight the need for more non-western organizational justice research.
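A minimal sketch, not the study's analysis, of how a moderation effect of the kind described above is commonly tested: regress justice perceptions on attachment anxiety, a collectivism indicator, and their interaction, and examine the interaction coefficient. The data below are simulated purely for illustration.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 341                                        # matches the combined sample size
anxiety = rng.normal(0, 1, n)
collectivism = rng.integers(0, 2, n)           # 0 = low, 1 = high collectivism
justice = -0.3 * anxiety + 0.25 * anxiety * collectivism + rng.normal(0, 1, n)

X = sm.add_constant(np.column_stack([anxiety, collectivism, anxiety * collectivism]))
model = sm.OLS(justice, X).fit()
print(model.summary())   # the interaction term carries the moderation test
```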
Abstract:
As the pressure continues to grow on Diamond and the world's synchrotrons for higher throughput of diffraction experiments, new and novel techniques are required for presenting micron-dimension crystals to the X-ray beam. Currently this task is both labour-intensive and primarily a serial process. Diffraction measurements typically take milliseconds, but sample preparation and presentation can reduce throughput to as few as 4 measurements an hour. With beamline waiting times as long as two years, it is of key importance for researchers to capitalize on available beam time, generating as much data as possible. Other approaches detailed in the literature [1] [2] [3] are very much skewed towards automating, with robotics, the actions of human protocols. The work detailed here is the development and discussion of a bottom-up approach relying on SSAW self-assembly, including material selection, microfluidic integration and tuning of the acoustic cavity to order the protein crystals.
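A minimal sketch, not the experimental design, of the basic acoustofluidic relation behind tuning the cavity: in a standing acoustic wave the pressure nodes, where suspended particles such as crystals collect, are spaced half a wavelength apart, so the drive frequency sets the spacing of the ordered rows. The sound speed and frequency below are illustrative assumptions.

```python
def node_spacing_um(sound_speed_m_s: float, frequency_hz: float) -> float:
    """Half-wavelength spacing of pressure nodes in a standing wave, in micrometres."""
    wavelength_m = sound_speed_m_s / frequency_hz
    return 0.5 * wavelength_m * 1e6

# e.g. a water-filled channel (~1480 m/s) driven at 10 MHz
print(f"node spacing ~ {node_spacing_um(1480.0, 10e6):.1f} micrometres")
```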