33 results for Calibration methodologies
in Aston University Research Archive
Abstract:
In developing neural network techniques for real-world applications it is still very rare to see estimates of confidence placed on neural network predictions. This is a major deficiency, especially in safety-critical systems. In this paper we explore three distinct methods of producing point-wise confidence intervals using neural networks: Bayesian, Gaussian process and predictive error bars, which we compare and contrast on real data. The problem domain is the calibration of a real automotive engine management system for both air-fuel ratio determination and on-line ignition timing. This problem requires real-time control and, given its safety-critical nature, is a good candidate for exploring the use of confidence predictions.
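As a concrete illustration of the third of these approaches, the sketch below trains one network to model the mean and a second on squared residuals to provide input-dependent error bars. It is a minimal reading of the "predictive error bars" idea under stated assumptions, not the paper's exact formulation; the toy data, network sizes and the 1.96-sigma band are illustrative choices.

```python
# A minimal sketch of input-dependent error bars for a neural network regressor:
# one network models the mean, a second models the noise variance from squared
# residuals. Toy, heteroscedastic data; not the paper's engine-mapping data.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(400, 1))
y = np.sin(X[:, 0]) + rng.normal(0, 0.1 + 0.1 * np.abs(X[:, 0]))  # noise grows with |x|

mean_net = MLPRegressor(hidden_layer_sizes=(32,), max_iter=3000, random_state=0).fit(X, y)
residual_sq = (y - mean_net.predict(X)) ** 2
var_net = MLPRegressor(hidden_layer_sizes=(32,), max_iter=3000, random_state=1).fit(X, residual_sq)

X_new = np.linspace(-3, 3, 7).reshape(-1, 1)
mu = mean_net.predict(X_new)
sigma = np.sqrt(np.clip(var_net.predict(X_new), 1e-9, None))  # keep variance positive
for x, m, s in zip(X_new[:, 0], mu, sigma):
    print(f"x={x:+.2f}  prediction={m:+.3f}  95% band=({m - 1.96*s:+.3f}, {m + 1.96*s:+.3f})")
```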
Abstract:
This paper aims to contribute to the debate about the role of the public sector in stimulating greater use of private sector equity for business start-up and growth in two ways: first, by examining the extent to which the provision of public sector equity finance enables individual firms to raise additional funds in the private sector market place; and second, by considering the methodological implications for an economic impact assessment of industrial policy interventions (especially those which include an equity component) at the level of the individual firm. We assess the extent to which there may be indirect positive effects (externalities) associated with public sector financial assistance to individual firms and, if so, how they distort standard evaluation methodologies designed to estimate the level of additionality of that support. The paper draws upon the results of a recent study of the impact of Enterprise Ireland (EI) financial assistance to indigenous Irish industry in the period 2000 to 2002. The paper demonstrates that a process of re-calibration is necessary in estimates of economic impact in order to account for these positive externalities; in this study the result was a 'boost' to additionality. In operational and conceptual terms, the study underlines the relationship between private and public sector sources of equity finance as an important dynamic in the attempt by industrial and regional policy to increase the number of firms with viable investment proposals accessing external equity finance.
Abstract:
Few works address methodological issues of how to conduct strategy-as-practice research and even fewer focus on how to analyse the subsequent data in ways that illuminate strategy as an everyday, social practice. We address this gap by proposing a quantitative method for analysing observational data, which can complement more traditional qualitative methodologies. We propose that rigorous but context-sensitive coding of transcripts can render everyday practice analysable statistically. Such statistical analysis provides a means for analytically representing patterns and shifts within the mundane, repetitive elements through which practice is accomplished. We call this approach the Event Database (EDB); it consists of five basic coding categories that help us capture the stream of practice. Indexing codes categorise the data in order to give context and offer some basic information about the event under discussion; they are descriptive codes which allow us to catalogue and classify events according to their assigned characteristics. Content codes concern the qualitative nature of the event, its essence: a description that helps to inform judgements about the phenomenon. Nature codes distinguish between discursive and tangible events; we include this code to acknowledge that some events differ qualitatively from others. Type codes are abstracted from the data in order to help us classify events based on their description or nature; this involves significantly more judgement than the indexing codes but is consequently also more meaningful. Dynamics codes capture some of the movement or fluidity of events; this category has been included to let us capture the flow of activity over time.
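To make the five coding categories concrete, the sketch below shows one way an EDB event record could be represented in code. The field names and the example values are illustrative assumptions, not the authors' actual schema.

```python
# A minimal sketch of an Event Database (EDB) record with one field per coding
# category described in the abstract. Field names and values are hypothetical.
from dataclasses import dataclass

@dataclass
class EDBEvent:
    indexing: dict   # descriptive catalogue data: meeting, date, participants, etc.
    content: str     # qualitative essence of the event
    nature: str      # "discursive" or "tangible"
    type: str        # analyst-assigned classification abstracted from the data
    dynamics: str    # how the event moves the stream of activity along

event = EDBEvent(
    indexing={"meeting": 3, "date": "2004-05-12", "participants": ["CEO", "CFO"]},
    content="CFO questions the cost assumptions behind the expansion plan",
    nature="discursive",
    type="challenge to strategic proposal",
    dynamics="re-opens a decision that had been treated as settled",
)
print(event.type)
```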
Abstract:
Two types of prediction problem can be solved using a regression line, viz. prediction of the 'population' regression line at the point x, and prediction of an 'individual' new member of the population, y1, for which x1 has been measured. The second problem is probably the most commonly encountered and the most relevant to calibration studies. A regression line is likely to be most useful for calibration if the range of values of the X variable is large, if there is good representation of the x, y values across the range of X, and if several estimates of y are made at each x. It is poor statistical practice to use a regression line for calibration or prediction beyond the limits of the data.
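The distinction between the two prediction problems can be made concrete numerically: the confidence interval for the population line at x0 uses the standard error sqrt(s^2 (1/n + (x0 - xbar)^2 / Sxx)), while the prediction interval for a new individual adds a further unit of residual variance inside the square root. The sketch below, on invented data, shows both.

```python
# A minimal sketch, with made-up data, of the two interval types: a confidence
# interval for the regression line at x0, and a wider prediction interval for a
# single new observation at the same x0.
import numpy as np
from scipy import stats

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1, 12.2, 13.8, 16.1])
n = len(x)

b, a = np.polyfit(x, y, 1)                       # slope, intercept
resid = y - (a + b * x)
s2 = np.sum(resid**2) / (n - 2)                  # residual variance
sxx = np.sum((x - x.mean())**2)
t = stats.t.ppf(0.975, df=n - 2)

x0 = 4.5
y0 = a + b * x0
se_mean = np.sqrt(s2 * (1/n + (x0 - x.mean())**2 / sxx))      # population line at x0
se_pred = np.sqrt(s2 * (1 + 1/n + (x0 - x.mean())**2 / sxx))  # new individual at x0
print(f"CI for the line:     {y0:.2f} +/- {t*se_mean:.2f}")
print(f"PI for a new member: {y0:.2f} +/- {t*se_pred:.2f}")
```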
Abstract:
Purpose: The use of PHMB as a disinfectant in contact lens multipurpose solutions has been at the centre of much debate in recent times, particularly in relation to the issue of solution-induced corneal staining. Clinical studies have been carried out which suggest different effects with individual contact lens materials used in combination with specific PHMB-containing care regimes. There does not appear to be, however, a reliable analytical technique that would detect and quantify with any degree of accuracy the specific levels of PHMB that are taken up and released from individual solutions by the various contact lens materials. Methods: PHMB is a mixture of positively charged polymer units of varying molecular weight that has a maximum absorbance wavelength of 236 nm. On the basis of these properties a range of assays including capillary electrophoresis, HPLC, a nickel-nioxime colorimetric technique, mass spectrometry, UV spectroscopy and ion chromatography were assessed, paying particular attention to their constraints and detection limits. Particular interest was focused on the relative advantage of contactless conductivity compared to UV and mass spectrometry detection in capillary electrophoresis (CE). This study provides an overview of the comparative performance of these techniques. Results: The UV absorbance of PHMB solutions ranging from 0.0625 to 50 ppm was measured at 236 nm. Within this range the calibration curve appears to be linear; however, absorbance values below 1 ppm (0.0001%) were extremely difficult to reproduce. The concentration of PHMB in solutions is in the range of 0.0002-0.00005%, and our investigations suggest that levels of PHMB below 0.0001% (levels encountered in uptake and release studies) cannot be accurately estimated, particularly when analysing complex lens care solutions, which can contain competitively absorbing, and thus interfering, species. The use of separative methodologies such as CE with UV detection alone is similarly limited. Alternative techniques, including contactless conductivity detection, offer greater discrimination in complex solutions together with the opportunity for dual-channel detection. Preliminary results achieved by TraceDec contactless conductivity detection (gain 150%, offset 150) in conjunction with the Agilent capillary electrophoresis system, using a bare fused silica capillary (extended light path, 50 µm i.d., total length 64.5 cm, effective length 56 cm) and a cationic buffer at pH 3.2, exhibit great potential with reproducible PHMB split peaks. Conclusions: PHMB-based solutions are commonly associated with the potential to invoke corneal staining in combination with certain contact lens materials. However, the terminology 'PHMB-based solution' is used primarily because PHMB itself has yet to be adequately implicated as the causative agent of the staining and compromised corneal cell integrity. The lack of well-characterised, adequately sensitive assays, coupled with the range of additional components that characterise individual care solutions, poses a major barrier to the investigation of PHMB interactions in the lens-wearing eye.
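For context, the sketch below fits a generic linear UV calibration curve and applies the common 3.3 x s/slope rule of thumb for a detection limit, which is the kind of reasoning behind the reproducibility limit discussed above. The absorbance values are invented for illustration; they are not data from the study.

```python
# A minimal sketch of a linear calibration curve (absorbance vs. concentration)
# and an approximate detection limit via the 3.3*s/slope rule. Invented data.
import numpy as np

conc_ppm   = np.array([0.0625, 0.125, 0.25, 0.5, 1.0, 5.0, 10.0, 25.0, 50.0])
absorbance = np.array([0.002, 0.004, 0.009, 0.018, 0.035, 0.177, 0.352, 0.881, 1.760])

slope, intercept = np.polyfit(conc_ppm, absorbance, 1)
fit = slope * conc_ppm + intercept
s_resid = np.sqrt(np.sum((absorbance - fit)**2) / (len(conc_ppm) - 2))

lod = 3.3 * s_resid / slope          # approximate limit of detection, in ppm
print(f"slope = {slope:.4f} AU/ppm, intercept = {intercept:.4f} AU")
print(f"approximate detection limit = {lod:.2f} ppm")
```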
Abstract:
This comparative study considers the main causative factors for change in recent years in the teaching of modern languages in England and France and seeks to contribute, in a general sense, to the understanding of change in comparable institutions. In England by 1975 the teaching of modern languages in the comprehensive schools was seen to be inappropriate to the needs of children of the whole ability range. A combination of the external factor of the Council of Europe initiative in devising a needs-based learning approach for adult learners, and the internal factor of teacher-based initiatives in developing a graded-objectives learning approach for the less able, has reversed this situation to some extent. The study examines and evaluates this reversal and, in addition, assesses teachers' attitudes towards, and understanding of, the changes involved. In France the imposition of 'la réforme Haby' in 1977 and the creation of 'le collège unique' were the main external factors for change. The subsequent failure of the reform and the socialist government's support of decentralisation policies returning the initiative for renewal to schools are examined and evaluated, as are the internal factors for change in language teaching: 'groupes de niveau' and the creation of 'équipes pédagogiques'. In both countries changes in the function of examinations at 15/16-plus are examined. The final chapter compares the changes in both education systems.
Abstract:
This thesis first considers the calibration and signal processing requirements of a neuromagnetometer for the measurement of human visual function. Gradiometer calibration using straight wire grids is examined and optimal grid configurations determined, given realistic constructional tolerances. Simulations show that for a gradiometer balance of 1:10⁴ and a wire spacing error of 0.25 mm the achievable calibration accuracy is 0.3% in gain, 0.3 mm in position and 0.6° in orientation. Practical results with a 19-channel second-order gradiometer-based system exceed this performance. The real-time application of adaptive reference noise cancellation filtering to running-average evoked response data is examined. In the steady state, the filter can be assumed to be driven by a non-stationary step input arising at epoch boundaries. Based on empirical measures of this driving step, an optimal progression for the filter time constant is proposed which improves upon fixed-time-constant filter performance. The incorporation of the time-derivatives of the reference channels was found to improve the performance of the adaptive filtering algorithm by 15-20% for unaveraged data, falling to 5% with averaging. The thesis concludes with a neuromagnetic investigation of evoked cortical responses to chromatic and luminance grating stimuli. The global magnetic field power of evoked responses to the onset of sinusoidal gratings was shown to have distinct chromatic and luminance sensitive components. Analysis of the results, using a single equivalent current dipole model, shows that these components arise from activity within two distinct cortical locations. Co-registration of the resulting current source localisations with MRI shows a chromatically responsive area lying along the midline within the calcarine fissure, possibly extending onto the lingual and cuneal gyri. It is postulated that this area is the human homologue of the primate cortical area V4.
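The adaptive reference noise cancellation idea can be sketched with a plain LMS filter: the reference channel is filtered to predict the interference appearing in the signal channel, and the prediction is subtracted. This is only the textbook core, not the running-average, time-constant-scheduled scheme developed in the thesis; the synthetic data and filter settings below are assumptions.

```python
# A minimal LMS adaptive reference noise canceller on synthetic data.
import numpy as np

def lms_cancel(signal, reference, n_taps=8, mu=0.01):
    w = np.zeros(n_taps)
    cleaned = np.zeros_like(signal)
    for n in range(n_taps - 1, len(signal)):
        x = reference[n - n_taps + 1:n + 1][::-1]   # current and recent reference samples
        interference = w @ x                        # estimate of reference pickup in the signal channel
        e = signal[n] - interference                # error = cleaned sample
        w += 2 * mu * e * x                         # LMS weight update
        cleaned[n] = e
    return cleaned

rng = np.random.default_rng(0)
t = np.arange(5000) / 1000.0
reference = rng.normal(size=t.size)                          # e.g. a reference noise channel
pickup = np.convolve(reference, [0.5, 0.3, 0.1])[:t.size]    # causal leakage into the signal channel
evoked = np.sin(2 * np.pi * 10 * t)                          # stand-in for the evoked response
cleaned = lms_cancel(evoked + pickup, reference)
print("noise power before:", np.var(pickup), "after:", np.var(cleaned[500:] - evoked[500:]))
```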
Abstract:
The primary objective of this research was to understand what kinds of knowledge and skills people use in 'extracting' relevant information from text and to assess the extent to which expert systems techniques could be applied to automate the process of abstracting. The approach adopted in this thesis is based on research in cognitive science, information science, psycholinguistics and text linguistics. The study addressed the significance of domain knowledge and heuristic rules by developing an information extraction system, called INFORMEX. This system, implemented partly in SPITBOL and partly in PROLOG, used a set of heuristic rules to analyse five scientific papers of expository type, to interpret the content in relation to the key abstract elements and to extract a set of sentences recognised as relevant for abstracting purposes. The analysis of these extracts revealed that an adequate abstract could be generated. Furthermore, INFORMEX showed that a rule-based system is a suitable computational model for representing experts' knowledge and strategies. This computational technique provided the basis for a new approach to the modelling of cognition: it showed how experts tackle the task of abstracting by integrating formal knowledge as well as experiential learning. This thesis demonstrated that empirical and theoretical knowledge can be effectively combined in expert systems technology to provide a valuable starting point for automatic abstracting.
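A rule-based extractive approach of this kind can be sketched with a few cue-phrase heuristics, as below. The cue words, weights and scoring rule are invented for illustration; they are not the INFORMEX rules.

```python
# A minimal sketch of heuristic, cue-phrase-based sentence extraction.
import re

CUES = {"conclude": 3, "results": 2, "we propose": 3, "this paper": 2, "significant": 1}

def extract(text, top_n=2):
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    def score(s):
        low = s.lower()
        return sum(w for cue, w in CUES.items() if cue in low)   # sum weights of matched cues
    return sorted(sentences, key=score, reverse=True)[:top_n]

paper = ("This paper describes a new sensor. The housing is made of steel. "
         "Results show a significant gain in accuracy. We conclude that the design is viable.")
print(extract(paper))
```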
Abstract:
Optical coherence tomography (OCT) is a non-invasive three-dimensional imaging technique capable of producing high-resolution in vivo images. OCT is approved for use in clinical trials in Japan, the USA and Europe. For OCT to be used effectively in clinical diagnosis, a method of standardisation is required to assess performance across different systems. This standardisation can be implemented using highly accurate and reproducible artefacts for calibration, both at installation and throughout the lifetime of a system. Femtosecond lasers can write highly reproducible and highly localised micro-structured calibration artefacts within a transparent medium. We report on the fabrication of high-quality OCT calibration artefacts in fused silica using a femtosecond laser. The artefacts were written in fused silica because of its high purity and its ability to withstand high-energy femtosecond pulses. An Amplitude Systemes s-Pulse Yb:YAG femtosecond laser with an operating wavelength of 1026 nm was used to inscribe three-dimensional patterns within the highly optically transmissive substrate. Four unique artefacts have been designed to measure a wide variety of parameters, including the point spread function (PSF), modulation transfer function (MTF), sensitivity, distortion and resolution: key parameters which define the performance of an OCT system. The calibration artefacts have been characterised using an optical microscope and tested on a swept-source OCT. The results demonstrate that the femtosecond laser inscribed artefacts have the potential to validate, both quantitatively and qualitatively, the performance of any OCT system.
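One of the parameters mentioned, the MTF, can be derived from a measured PSF profile as the normalised magnitude of its Fourier transform. The sketch below illustrates the computation on a synthetic Gaussian PSF standing in for a profile measured from such an artefact; the sample spacing and widths are assumptions.

```python
# A minimal sketch: modulation transfer function from a 1-D point spread function.
import numpy as np

dx = 0.5e-3                                   # sample spacing, mm (0.5 um)
z = np.arange(-0.05, 0.05, dx)                # 0.1 mm window
psf = np.exp(-z**2 / (2 * (5e-3)**2))         # synthetic PSF, sigma = 5 um

otf = np.fft.rfft(psf)
mtf = np.abs(otf) / np.abs(otf[0])            # normalise so MTF(0) = 1
freqs = np.fft.rfftfreq(len(psf), d=dx)       # spatial frequency, cycles/mm

cutoff = freqs[np.argmax(mtf < 0.1)]          # first frequency where contrast drops below 10%
print(f"10% MTF cut-off = {cutoff:.0f} cycles/mm")
```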
Abstract:
The recent expansion of clinical applications for optical coherence tomography (OCT) is driving the development of approaches for consistent image acquisition. There is a simultaneous need for time-stable, easy-to-use imaging targets for calibration and standardization of OCT devices. We present calibration targets consisting of three-dimensional structures etched into nanoparticle-embedded resin. Spherical iron oxide nanoparticles with a predominant particle diameter of 400 nm were homogeneously dispersed in a two-part polyurethane resin and allowed to harden overnight. These samples were then etched using a precision micromachining femtosecond laser with a center wavelength of 1026 nm, a 100 kHz repetition rate and a 450 fs pulse duration. A series of lines in depth was etched, varying the percentage of inscription energy and the speed of the translation stage moving the target with respect to the laser. Samples were imaged with a dual-wavelength spectral-domain OCT system and the point spread function of nanoparticles within the target was measured.
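A point spread function measurement of this kind is commonly reduced to a resolution figure by fitting a Gaussian to an isolated particle's intensity profile and reporting the full width at half maximum. The sketch below illustrates this on a synthetic profile; the numbers are assumptions, not data from the study.

```python
# A minimal sketch: estimate resolution (FWHM) from a nanoparticle intensity profile.
import numpy as np
from scipy.optimize import curve_fit

def gaussian(z, amp, z0, sigma, offset):
    return amp * np.exp(-(z - z0)**2 / (2 * sigma**2)) + offset

z = np.linspace(0, 60, 121)                                # depth axis, micrometres
profile = gaussian(z, 1.0, 30.0, 4.0, 0.05) + np.random.default_rng(1).normal(0, 0.01, z.size)

popt, _ = curve_fit(gaussian, z, profile, p0=[1.0, 30.0, 5.0, 0.0])
fwhm = 2 * np.sqrt(2 * np.log(2)) * abs(popt[2])           # convert sigma to FWHM
print(f"axial resolution (FWHM) = {fwhm:.1f} um")
```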
Abstract:
Insights from the stream of research on knowledge calibration, which refers to the correspondence between accuracy and confidence in knowledge, enable a better understanding of the consequences of managers' inaccurate perceptions. This paper examines the consequences of inaccurate managerial knowledge through the lens of knowledge calibration. Specifically, it examines the antecedent role of miscalibration of knowledge in strategy formation. It is postulated that miscalibrated managers who overestimate external factors and display a high level of confidence in their estimates are likely to enact strategies that are relatively more evolutionary and incremental in nature, whereas miscalibrated managers who overestimate internal factors and display a high level of confidence in their estimates are likely to enact strategies that are relatively more discontinuous and disruptive in nature. Perspectives from social cognitive theory provide support for the underlying processes. The paper, in part, explains the paradox of the prevalence of inaccurate managerial perceptions alongside efficacious performance. It also advances the literature on strategy formation through the application of the construct of knowledge calibration.
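For readers unfamiliar with the construct, knowledge calibration is usually quantified by comparing stated confidence with realised accuracy. The sketch below computes two generic indices from the judgement literature, an over/underconfidence score and a Brier-style score; these are illustrative measures with invented numbers, not the paper's own scales.

```python
# A minimal sketch of generic calibration indices: overconfidence and Brier score.
import numpy as np

confidence = np.array([0.9, 0.8, 0.95, 0.7, 0.6, 0.85])   # stated confidence per judgement
correct    = np.array([1,   0,   1,    1,   0,   0  ])     # whether the judgement proved accurate

overconfidence = confidence.mean() - correct.mean()        # > 0 means overconfident
brier = np.mean((confidence - correct) ** 2)                # 0 = perfect calibration and resolution
print(f"overconfidence index = {overconfidence:+.2f}, Brier score = {brier:.2f}")
```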
Abstract:
Calibration of consumer knowledge of the web refers to the correspondence between accuracy and confidence in knowledge of the web. Being well calibrated means that a person is realistic in his or her assessment of the level of knowledge that he or she possesses. This study finds that involvement leads to better calibration and that calibration is higher for procedural knowledge and common knowledge than for declarative knowledge and specialized knowledge. Neither usage nor experience has any effect on calibration of knowledge of the web. No difference in calibration is observed between genders, although, in agreement with previous findings, this study also finds that males are more confident in their knowledge of the web. The results suggest that calibration is more a function of knowledge-specific factors and less a function of individual-specific factors. The study also identifies flow and frustration with the web as consequences of calibration of knowledge of the web and invites future researchers to examine these aspects.
Abstract:
Objectives and Methods: Contact angle, as a representative measure of surface wettability, is often employed to interpret contact lens surface properties. The literature is often contradictory and can lead to confusion. This literature review is part of a series regarding the analysis of hydrogel contact lenses using contact angle techniques. Here we present an overview of contact angle terminology, methodology, and analysis. Having discussed this background material, subsequent parts of the series will discuss the analysis of contact lens contact angles and evaluate differences in published laboratory results. Results: The concepts of contact angle, wettability and wetting are presented as an introduction. Contact angle hysteresis is outlined, highlighting the advantages of dynamic analytical techniques over static methods. The surface free energy of a material illustrates how contact angle analysis can provide supplementary surface characterization. Although single values are able to distinguish individual material differences, surface free energy and dynamic methods provide an improved understanding of material behavior. The frequently used sessile drop, captive bubble, and Wilhelmy plate techniques are discussed. Their use as both dynamic and static methods, along with the advantages and disadvantages of each technique, is explained. Conclusions: No single contact angle technique fully characterizes the wettability of a material surface, and the application of complementary methods allows increased characterization. At present, there is no ISO standard method designed for soft materials. It is important that each contact angle technique has a standard protocol, as small protocol differences between laboratories often contribute to a variety of published data that are not easily comparable. © 2013 Contact Lens Association of Ophthalmologists.
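As an example of the supplementary characterization mentioned above, surface free energy can be estimated from two sessile drop contact angles with the Owens-Wendt relation, gamma_L (1 + cos theta) = 2 sqrt(gd_S gd_L) + 2 sqrt(gp_S gp_L), as sketched below. The probe liquid components are commonly quoted literature values and the contact angles are invented; both are assumptions for illustration only.

```python
# A minimal sketch of an Owens-Wendt surface free energy estimate from two
# sessile drop contact angles, using two probe liquids.
import numpy as np

# probe liquids: total, dispersive and polar surface tension components (mN/m)
liquids = {
    "water":         {"gamma": 72.8, "d": 21.8, "p": 51.0, "theta_deg": 75.0},
    "diiodomethane": {"gamma": 50.8, "d": 50.8, "p": 0.0,  "theta_deg": 40.0},
}

# One Owens-Wendt equation per liquid, linear in sqrt(gd_S) and sqrt(gp_S)
A, rhs = [], []
for liq in liquids.values():
    A.append([np.sqrt(liq["d"]), np.sqrt(liq["p"])])
    rhs.append(liq["gamma"] * (1 + np.cos(np.radians(liq["theta_deg"]))) / 2)

x = np.linalg.solve(np.array(A), np.array(rhs))   # x = [sqrt(gd_S), sqrt(gp_S)]
gd_s, gp_s = x[0] ** 2, x[1] ** 2
print(f"dispersive = {gd_s:.1f} mN/m, polar = {gp_s:.1f} mN/m, total = {gd_s + gp_s:.1f} mN/m")
```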