44 results for practical epistemology analysis
in Aston University Research Archive
Abstract:
In some contexts data envelopment analysis (DEA) gives poor discrimination on the performance of units. While this may reflect genuine uniformity of performance between units, it may also reflect a lack of sufficient observations or other factors limiting discrimination. In this paper, we present an overview of the main approaches that can be used to improve the discrimination of DEA. These include simple methods, such as the aggregation of inputs or outputs and the use of longitudinal data; more advanced methods, such as the use of weight restrictions, production trade-offs and unobserved units; and a relatively new method based on the use of selective proportionality between the inputs and outputs. © 2007 Springer Science+Business Media, LLC.
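To make the starting point of these refinements concrete, the sketch below solves the basic input-oriented CCR envelopment model with SciPy. The data, unit count and names are illustrative assumptions, not taken from the paper; weight restrictions and trade-offs would enter as additional constraints on the multiplier form of this programme.

```python
# Minimal input-oriented CCR DEA model (the baseline the surveyed
# refinements extend), solved with SciPy; data are illustrative.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 3.0, 6.0],     # input 1 for DMUs A, B, C
              [3.0, 1.0, 4.0]])    # input 2
Y = np.array([[4.0, 3.0, 5.0]])    # single output

def ccr_efficiency(k):
    # min theta  s.t.  X @ lam <= theta * x_k,  Y @ lam >= y_k,  lam >= 0
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                  # variables: [theta, lam]
    A_ub = np.r_[np.c_[-X[:, [k]], X],           # X lam - theta x_k <= 0
                 np.c_[np.zeros((s, 1)), -Y]]    # -Y lam <= -y_k
    b_ub = np.r_[np.zeros(m), -Y[:, k]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (n + 1))
    return res.fun                               # efficiency score theta*

for k, name in enumerate("ABC"):
    print(f"DMU {name}: efficiency = {ccr_efficiency(k):.3f}")
```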
Abstract:
Growth in the availability and capability of modern statistical software has resulted in a greater number of research techniques being applied across the marketing discipline. However, with such advances come concerns that techniques may be misinterpreted by researchers. This issue is critical, since misinterpretation could cause erroneous findings. This paper investigates some assumptions regarding: 1) the assessment of discriminant validity; and 2) what confirmatory factor analysis accomplishes. Examples that address these points are presented, and some procedural remedies are suggested based upon the literature. This paper is, therefore, primarily concerned with the development of measurement theory and practice. If advances in theory development are not based upon sound methodological practice, we as researchers could be basing our work upon shaky foundations.
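The abstract does not name a specific procedure, but one widely used discriminant-validity check in this literature is the Fornell-Larcker criterion: each construct's average variance extracted (AVE) should exceed its squared correlation with every other construct. The sketch below illustrates the arithmetic with invented CFA estimates.

```python
# Illustrative Fornell-Larcker check with invented standardised loadings.
import numpy as np

loadings_A = np.array([0.82, 0.78, 0.75])   # construct A loadings
loadings_B = np.array([0.80, 0.71, 0.69])   # construct B loadings
phi_AB = 0.55                               # inter-construct correlation

def ave(loadings):
    # AVE = mean of squared standardised loadings
    return float(np.mean(loadings ** 2))

shared = phi_AB ** 2                        # variance shared by A and B
print(f"AVE(A) = {ave(loadings_A):.3f}, AVE(B) = {ave(loadings_B):.3f}")
print(f"shared variance = {shared:.3f}")
print("discriminant validity supported:",
      min(ave(loadings_A), ave(loadings_B)) > shared)
```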
Abstract:
The Retinal Vessel Analyser (RVA) is a commercially available ophthalmoscopic instrument capable of acquiring vessel diameter fluctuations in real time and at high temporal resolution. Visual stimulation by means of flickering light is a unique tool for exploring neurovascular coupling in the human retina. Vessel reactivity, as mediated by local vascular endothelial vasodilators and vasoconstrictors, can be assessed non-invasively, in vivo. In brief, the work in this thesis:
• deals with interobserver and intraobserver reproducibility of the flicker responses in healthy volunteers
• explains the superiority of individually analysed reactivity parameters over vendor-generated output
• links static retinal measures with dynamic ones
• highlights practical limitations in the use of the RVA that may undermine its clinical usefulness
• provides recommendations for standardising measurements in terms of vessel location and vessel segment length, and
• presents three case reports of essential hypertensives in a -year follow-up.
Strict standardisation of measurement procedures is a necessity when utilising the RVA system. Agreement between research groups on implemented protocols needs to be reached before the RVA can be considered a clinically useful tool for detecting or predicting microvascular dysfunction.
Abstract:
The security and reliability of a class of public-key cryptosystems against attacks by unauthorized parties, who have acquired partial knowledge of one or more of the private-key components and/or of the message, are discussed. The standard statistical-mechanical methods for dealing with diluted spin systems under replica-symmetric considerations are applied. The dynamical transition that defines decryption success in practical situations is studied, and phase diagrams showing the dynamical threshold as a function of the partially acquired knowledge of the private key are presented.
Abstract:
A novel approach to watermarking of audio signals using Independent Component Analysis (ICA) is proposed. It exploits the statistical independence of components obtained by practical ICA algorithms to provide a robust watermarking scheme with high information rate and low distortion. Numerical simulations have been performed on audio signals, showing good robustness of the watermark against common attacks with unnoticeable distortion, even for high information rates. An important aspect of the method is its domain independence: it can be used to hide information in other types of data, with minor technical adaptations.
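As a toy sketch of the general idea (not the authors' algorithm), the snippet below frames a signal, estimates independent components with scikit-learn's FastICA, embeds a low-amplitude keyed watermark in one component, and inverts back to the time domain; all data, parameters and the detection rule are invented for illustration.

```python
# Toy ICA-domain watermarking sketch (illustrative assumptions throughout).
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
audio = rng.laplace(size=8192)                   # non-Gaussian stand-in signal
frame = 8
Xf = audio.reshape(-1, frame)                    # frames as observations

ica = FastICA(n_components=frame, random_state=0)
S = ica.fit_transform(Xf)                        # independent components

key = rng.choice([-1.0, 1.0], size=S.shape[0])   # secret spreading key
alpha = 0.1                                      # strength: robustness vs distortion
S[:, 0] += alpha * key                           # embed in one component

marked = ica.inverse_transform(S).ravel()        # marked time-domain signal

# Detection: project the received signal back and correlate with the key.
detected = ica.transform(marked.reshape(-1, frame))[:, 0]
print("correlation with key:",
      round(float(np.corrcoef(detected, key)[0, 1]), 3))
```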
Abstract:
Using analytical methods of statistical mechanics, we analyse the typical behaviour of a multiple-input multiple-output (MIMO) Gaussian channel with binary inputs under low-density parity-check (LDPC) network coding and joint decoding. The saddle point equations for the replica symmetric solution are found in particular realizations of this channel, including a small and large number of transmitters and receivers. In particular, we examine the cases of a single transmitter, a single receiver and symmetric and asymmetric interference. Both dynamical and thermodynamical transitions from the ferromagnetic solution of perfect decoding to a non-ferromagnetic solution are identified for the cases considered, marking the practical and theoretical limits of the system under the current coding scheme. Numerical results are provided, showing the typical level of improvement/deterioration achieved with respect to the single transmitter/receiver result, for the various cases. © 2007 IOP Publishing Ltd.
Abstract:
Understanding the cultural value systems of nations is a key factor in anticipating the behaviour of business managers and employees in a specific business environment. Many research studies have acknowledged the impact of culture on communication across nations and on business operations; however, no study has attempted to measure and quantify the cultural orientations of people originating from one nation but working in two different national settings. This study adopted Kluckhohn and Strodtbeck's framework to examine the cultural dimensions of a total of 580 Indian respondents comprising two groups: 429 Indian natives living and working in India and 151 Indian migrants living and working in the USA. It first compares the cultural orientations of the two groups as a whole and then examines cultural differences within each group by demographic characteristics: occupation, gender, age, and level of education. The study found significant cultural value differences between the two groups at both levels of analysis. The theoretical and practical implications of these findings are discussed in detail.
Abstract:
Purpose - The purpose of this paper is to develop an integrated quality management model that identifies problems, suggests solutions, develops a framework for implementation and helps to evaluate healthcare service performance dynamically. Design/methodology/approach - This study used logical framework analysis (LFA) to improve the performance of healthcare service processes. LFA has three major steps: problem identification, solution derivation, and formation of a planning matrix for implementation. LFA has been applied in a case-study environment to three acute healthcare services (Operating Room utilisation, Accident and Emergency, and Intensive Care) in order to demonstrate its effectiveness. Findings - The paper finds that LFA is an effective method of quality management for hospital-based healthcare services. Research limitations/implications - This study shows LFA applied to three service processes in one hospital. This very limited sample needs to be extended. Practical implications - The proposed model can be implemented in hospital-based healthcare services in order to improve performance. It may also be applied to other services. Originality/value - Quality improvement in healthcare services is a complex and multi-dimensional task. Although various quality management tools are routinely deployed for identifying quality issues in healthcare delivery, they are not without flaws, and there is an absence of an integrated approach that can identify and analyse issues, provide solutions to resolve those issues, and develop a project management framework to implement those solutions. This study introduces an integrated and uniform quality management tool for healthcare services. © Emerald Group Publishing Limited.
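For readers unfamiliar with LFA, the planning matrix it produces is conventionally a four-level "logframe" crossing a narrative summary with indicators, means of verification and assumptions. The sketch below shows such a matrix as a plain data structure; the entries are illustrative and not drawn from the paper's case studies.

```python
# A minimal logframe planning matrix as a data structure (entries invented).
logframe = {
    "Goal":       {"summary": "Improve acute-care service performance",
                   "indicators": "Patient throughput and outcome rates",
                   "verification": "Hospital information system",
                   "assumptions": "Stable case mix and funding"},
    "Purpose":    {"summary": "Raise operating-room utilisation",
                   "indicators": "% of scheduled sessions fully used",
                   "verification": "Theatre management records",
                   "assumptions": "Staff availability is maintained"},
    "Outputs":    {"summary": "Revised scheduling procedure",
                   "indicators": "Fewer late starts and overruns",
                   "verification": "Audit of session logs",
                   "assumptions": "Clinicians adopt the new schedule"},
    "Activities": {"summary": "Map process, identify delays, pilot changes",
                   "indicators": "Milestones met on time",
                   "verification": "Project progress reports",
                   "assumptions": "Data access is granted"},
}

for level, cells in logframe.items():
    print(f"{level}: {cells['summary']}")
```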
Abstract:
A new surface analysis technique has been developed which has a number of benefits compared to conventional Low Energy Ion Scattering Spectrometry (LEISS). A major potential advantage arising from the absence of charge-exchange complications is the possibility of quantification. The instrumentation that has been developed also offers the possibility of unique studies concerning the interaction between low-energy ions and atoms and solid surfaces. From these studies it may also be possible, in principle, to generate sensitivity factors to quantify LEISS data. The instrumentation, which is referred to as a Time-of-Flight Fast Atom Scattering Spectrometer (ToFFASS), has been developed to investigate these conjectures in practice. The development involved a number of modifications to an existing instrument, allowing samples to be bombarded with a monoenergetic pulsed beam of either atoms or ions, and providing the capability to analyse the spectra of scattered atoms and ions separately. Further to this, a system was designed and constructed to allow the incident, exit and azimuthal angles of the particle beam to be varied independently. The key development was that of a pulsed, mass-filtered atom source, which was developed by a cyclic process of design, modelling and experimentation. Although it was possible to demonstrate the unique capabilities of the instrument, problems relating to surface contamination prevented the measurement of the neutralisation probabilities. However, these problems appear to be technical rather than scientific in nature, and could be readily resolved given the appropriate resources. Experimental spectra obtained from a number of samples demonstrate some fundamental differences between the scattered ion and neutral spectra. For practical non-ordered surfaces the ToF spectra are more complex than their LEISS counterparts. This is particularly true for helium scattering, where it appears, in the absence of detailed computer simulation, that quantitative analysis is limited to ordered surfaces. Despite this limitation the ToFFASS instrument opens the way for quantitative analysis of the 'true' surface region for a wider range of surface materials.
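For context, the basic relation behind any such time-of-flight measurement (standard kinematics, not specific to this instrument) converts a measured flight time t over a known drift length L into the energy of a scattered particle of mass m:

$$ E \;=\; \tfrac{1}{2}\, m v^{2} \;=\; \tfrac{1}{2}\, m \left(\frac{L}{t}\right)^{2} $$

which is why timing neutral atoms directly yields energy spectra without the charge-exchange corrections that complicate LEISS.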
Abstract:
A novel biosensing system based on a micromachined rectangular silicon membrane is proposed and investigated in this paper. A distributive sensing scheme is designed to monitor the dynamics of the sensing structure, and an artificial neural network is used to process the measured data and to identify cell presence and density. Without specifying any particular bio-application, the investigation concentrates on performance testing of this kind of biosensor as a general biosensing platform. The biosensing experiments on the microfabricated membranes involve seeding different cell densities onto the sensing surface of the membrane and measuring the corresponding dynamics of each tested silicon membrane in the form of a series of frequency response functions (FRFs). All of these experiments are carried out in cell culture medium to simulate a practical working environment. The EA.hy 926 endothelial cell line was chosen for the bio-experiments; it represents a class of biological particles with irregular shapes, non-uniform density and uncertain growth behaviour, which are difficult to monitor using traditional biosensors. The final predicted results reveal that a neural-network-based algorithm for identifying cell features from distributive sensory measurements has great potential in biosensing applications.
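The pattern-recognition step can be sketched as follows: FRF magnitude vectors are the network inputs and seeded cell density the target. Everything below (the synthetic FRFs, network size and library choice) is an illustrative assumption, not the paper's actual data or architecture.

```python
# Hedged sketch: a small neural network maps FRF vectors to cell density.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n_tests, n_freqs = 200, 64
density = rng.uniform(0.0, 1.0, n_tests)             # normalised cell density
freqs = np.linspace(0.0, 1.0, n_freqs)
# Toy physics: added cell mass shifts the resonance peak downwards
peak = 0.5 - 0.2 * density[:, None]
frf = np.exp(-((freqs[None, :] - peak) ** 2) / 0.01)
frf += 0.02 * rng.standard_normal(frf.shape)          # measurement noise

X_tr, X_te, y_tr, y_te = train_test_split(frf, density, random_state=0)
net = MLPRegressor(hidden_layer_sizes=(32,), max_iter=5000, random_state=0)
net.fit(X_tr, y_tr)
print("held-out R^2:", round(net.score(X_te, y_te), 3))
```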
Abstract:
The concept of a task is fundamental to the discipline of ergonomics. Approaches to the analysis of tasks began in the early 1900s and have evolved and developed to the present day, when there is a vast array of methods available. Some of these methods are specific to particular contexts or applications; others are more general. However, whilst many of these analyses allow tasks to be examined in detail, they do not act as tools to aid the design process or the designer. The present thesis examines the use of task analysis in a process control context, and in particular its use to specify operator information and display requirements in such systems. The first part of the thesis examines the theoretical aspect of task analysis and presents a review of the methods, issues and concepts relating to it. A review of over 80 methods of task analysis was carried out to form a basis for the development of a task analysis method to specify operator information requirements in industrial process control contexts. Of the methods reviewed, Hierarchical Task Analysis was selected to provide such a basis and was developed to meet the criteria outlined for such a method. The second section outlines the practical application and evolution of the developed task analysis method. Four case studies were used to examine the method in an empirical context. The case studies represent a range of plant contexts and types: complex and simple, batch and continuous, and high-risk and low-risk processes. The theoretical and empirical issues are drawn together, and a method is developed to provide a task analysis technique to specify operator information requirements and to provide the first stages of a tool to aid the design of VDU displays for process control.
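Hierarchical Task Analysis decomposes a goal into subtasks plus a "plan" stating when each subtask is performed. A minimal representation is sketched below; the example content is illustrative rather than taken from the thesis's case studies.

```python
# Minimal HTA representation: tasks, subtasks and an ordering plan.
from dataclasses import dataclass, field

@dataclass
class Task:
    goal: str
    plan: str = ""
    subtasks: list["Task"] = field(default_factory=list)

    def show(self, depth=0, index="0"):
        # Print the hierarchy with HTA-style numbering
        print("  " * depth + f"{index} {self.goal}"
              + (f"  [plan: {self.plan}]" if self.plan else ""))
        for i, t in enumerate(self.subtasks, 1):
            t.show(depth + 1, f"{index}.{i}")

hta = Task("Keep reactor temperature within band",
           plan="Do 1 continuously; on deviation, do 2 then 3",
           subtasks=[
               Task("Monitor temperature display"),
               Task("Diagnose deviation from trend"),
               Task("Adjust coolant flow setpoint"),
           ])
hta.show()
```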
Abstract:
The aim of this research was to improve the quantitative support to project planning and control principally through the use of more accurate forecasting for which new techniques were developed. This study arose from the observation that in most cases construction project forecasts were based on a methodology (c.1980) which relied on the DHSS cumulative cubic cost model and network based risk analysis (PERT). The former of these, in particular, imposes severe limitations which this study overcomes. Three areas of study were identified, namely growth curve forecasting, risk analysis and the interface of these quantitative techniques with project management. These fields have been used as a basis for the research programme. In order to give a sound basis for the research, industrial support was sought. This resulted in both the acquisition of cost profiles for a large number of projects and the opportunity to validate practical implementation. The outcome of this research project was deemed successful both in theory and practice. The new forecasting theory was shown to give major reductions in projection errors. The integration of the new predictive and risk analysis technologies with management principles, allowed the development of a viable software management aid which fills an acknowledged gap in current technology.
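The cumulative cubic model of the DHSS type that the study improves on can be fitted by ordinary least squares; the sketch below does this for an invented cumulative cost profile over normalised project time (the thesis's own forecasting techniques are more elaborate than this baseline).

```python
# Fitting a cumulative cubic cost profile by least squares (data invented).
import numpy as np

t = np.linspace(0.1, 1.0, 10)                        # normalised project time
observed = np.array([0.03, 0.09, 0.18, 0.30, 0.44,
                     0.58, 0.72, 0.84, 0.93, 1.00])  # cumulative cost share

A = np.column_stack([t, t ** 2, t ** 3])             # cubic through the origin
coef, *_ = np.linalg.lstsq(A, observed, rcond=None)
fitted = A @ coef
print("coefficients (a, b, c):", np.round(coef, 3))
print("max |residual|:", float(np.abs(fitted - observed).max()))
```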
Abstract:
Much research is currently centred on the detection of damage in structures using vibrational data. The work presented here examined several areas of interest in support of a practical technique for identifying and locating damage within bridge structures using apparent changes in their vibrational response to known excitation. The proposed goals of such a technique included the need for the measurement system to be operated on site by a minimum number of staff and for the procedure to be as non-invasive to bridge traffic flow as possible. Initially the research investigated changes in the vibrational bending characteristics of two series of large-scale model bridge beams in the laboratory, which included ordinary-reinforced and post-tensioned, prestressed designs. Each beam was progressively damaged at predetermined positions and its vibrational response to impact excitation was analysed. For the load regime utilised, the results suggested that the induced damage manifested itself as a function of the span of a beam rather than as a localised area. A power law relating apparent damage to the applied loading and prestress levels was then proposed, together with a qualitative vibrational measure of structural damage. In parallel with the laboratory experiments, a series of tests was undertaken at the sites of a number of highway bridges. The bridges selected had differing types of construction and geometric design, including composite-concrete, concrete slab-and-beam, and concrete-slab with supporting steel-troughing constructions, together with regular-rectangular, skewed and heavily-skewed geometries. Initial investigations were made of the feasibility and reliability of various methods of structure excitation, including traffic and impulse methods. It was found that localised impact using a sledgehammer was ideal for the purposes of this work and that a cartridge 'bolt-gun' could be used in some specific cases.
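The vibrational response to impact excitation is typically summarised as a frequency response function; a standard H1 estimate divides the force-response cross-spectrum by the force auto-spectrum. The sketch below illustrates this on a synthetic single-mode structure, a stand-in rather than the project's bridge data.

```python
# H1 FRF estimate from a force/response record (synthetic stand-in data).
import numpy as np
from scipy.signal import bilinear, lfilter, csd, welch

fs = 1024.0
rng = np.random.default_rng(2)
force = rng.standard_normal(16384)                     # broadband excitation

wn, zeta = 2 * np.pi * 60.0, 0.02                      # 60 Hz mode, 2% damping
b, a = bilinear([1.0], [1.0, 2 * zeta * wn, wn ** 2], fs)
response = lfilter(b, a, force)                        # simulated response
response += 1e-5 * rng.standard_normal(response.size)  # sensor noise

f, Pxy = csd(force, response, fs=fs, nperseg=2048)     # cross-spectrum
f, Pxx = welch(force, fs=fs, nperseg=2048)             # force auto-spectrum
H1 = Pxy / Pxx                                         # H1 FRF estimate
print("resonance located near", f[np.argmax(np.abs(H1))], "Hz")
```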
Abstract:
In this thesis, details of a proposed method for the elastic-plastic failure load analysis of complete building structures are given. In order to handle the problem, a computer programme in Atlas Autocode was produced. The structures consist of a number of parallel shear walls and intermediate frames connected by floor slabs. The results of an experimental investigation are given to verify the theoretical results and to demonstrate various factors that may influence the behaviour of these structures. Large full-scale practical structures are also analysed by the proposed method, and suggestions are made for achieving design economy as well as for extending research in various aspects of this field. The existing programme for elastic-plastic analysis of large frames was modified to allow for the effect of composite action of structural members, i.e. reinforced concrete floor slabs and the supporting steel beams. This modified programme was used to analyse some framed-type structures with composite action as well as those which incorporate plates and shear walls. The results obtained were studied to ascertain the influence of composite action and other factors on the load-carrying capacity of both bare frames and complete building structures. The theoretical failure load presented in this thesis does not predict the overall failure load of the structure, nor does it predict the partial failure load of the shear walls and slabs; it merely predicts the partial failure load of a single frame and assumes that the loss of stiffness of such a frame renders the overall structure unusable. For most structures the analysis proposed in this thesis is likely to break down prematurely due to the failure of the slab and shear wall system, and this factor must be taken into account in any future work on such structures. The experimental work reported in this thesis is acknowledged to be unsatisfactory as a verification of the limited theory proposed. In particular, perspex was not found to be a suitable material for testing at high loads; micro-concrete may be more suitable.
Abstract:
This thesis describes the development of a simple and accurate method for estimating the quantity and composition of household waste arisings. The method is based on the fundamental tenet that waste arisings can be predicted from information on the demographic and socio-economic characteristics of households, thus reducing the need for the direct measurement of waste arisings to that necessary for the calibration of a prediction model. The aim of the research is twofold: firstly to investigate the generation of waste arisings at the household level, and secondly to devise a method for supplying information on waste arisings to meet the needs of waste collection and disposal authorities, policy makers at both national and European level and the manufacturers of plant and equipment for waste sorting and treatment. The research was carried out in three phases: theoretical, empirical and analytical. In the theoretical phase specific testable hypotheses were formulated concerning the process of waste generation at the household level. The empirical phase of the research involved an initial questionnaire survey of 1277 households to obtain data on their socio-economic characteristics, and the subsequent sorting of waste arisings from each of the households surveyed. The analytical phase was divided between (a) the testing of the research hypotheses by matching each household's waste against its demographic/socioeconomic characteristics (b) the development of statistical models capable of predicting the waste arisings from an individual household and (c) the development of a practical method for obtaining area-based estimates of waste arisings using readily available data from the national census. The latter method was found to represent a substantial improvement over conventional methods of waste estimation in terms of both accuracy and spatial flexibility. The research therefore represents a substantial contribution both to scientific knowledge of the process of household waste generation, and to the practical management of waste arisings.
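The modelling step described here, predicting a household's waste arisings from its demographic and socio-economic characteristics and then applying the model to census-style area averages, can be sketched as a linear regression; the variables, coefficients and data below are invented for illustration.

```python
# Sketch of the waste-prediction model on a synthetic calibration sample.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(3)
n = 500
household_size = rng.integers(1, 7, n)
income_band = rng.integers(1, 6, n)
has_garden = rng.integers(0, 2, n)
X = np.column_stack([household_size, income_band, has_garden])

# Invented "true" generation process standing in for the sorted-waste data
kg_per_week = (3.0 + 2.5 * household_size + 0.8 * income_band
               + 1.2 * has_garden + rng.normal(0.0, 1.5, n))

model = LinearRegression().fit(X, kg_per_week)
area_profile = np.array([[2.4, 3.1, 0.6]])    # area-average characteristics
print("predicted arisings (kg/household/week):",
      round(float(model.predict(area_profile)[0]), 2))
```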