18 results for Detection, Optimisation, Assessment, Highway

in the Aston University Research Archive


Relevance:

40.00%

Publisher:

Abstract:

A technique for direct real-time assessment of a distributed feedback fibre laser's cavity conditions during operation is demonstrated and used to provide a cavity mode conditioning feedback mechanism to optimise output performance. Negligible wavelength drift is demonstrated over a 52 mW pump power range.

Relevance:

40.00%

Publisher:

Abstract:

Phospholipid oxidation by adventitious damage generates a wide variety of products with potentially novel biological activities that can modulate inflammatory processes associated with various diseases. Understanding the biological importance of oxidised phospholipids (OxPLs) and their potential role as disease biomarkers requires precise information about the abundance of these compounds in cells and tissues. Many chemiluminescence and spectrophotometric assays are available for detecting oxidised phospholipids, but they all have limitations. Mass spectrometry coupled with liquid chromatography is a powerful and sensitive approach that can provide detailed information about the oxidative lipidome, but challenges still remain. The aim of this work is to develop improved methods for detection of OxPLs by optimising chromatographic separation, through testing several reverse phase columns and solvent systems, and by using targeted mass spectrometry approaches. Initial experiments were carried out using oxidation products generated in vitro to optimise the chromatographic separation and mass spectrometry parameters. We evaluated the chromatographic separation of oxidised phosphatidylcholines (OxPCs) and oxidised phosphatidylethanolamines (OxPEs) using C8, C18 and C30 reverse phase, polystyrene-divinylbenzene-based monolithic, and mixed-mode hydrophilic interaction (HILIC) columns, interfaced with mass spectrometry. Our results suggest that the monolithic column was best able to separate short-chain OxPCs and OxPEs from long-chain oxidised and native PCs and PEs. However, variation in the charge of polar head groups and the extreme diversity of oxidised species make analysis of several classes of OxPLs within one analytical run impractical. We therefore evaluated and optimised the chromatographic separation of OxPLs by serially coupling two columns, HILIC and monolithic, which provided larger coverage of OxPL species in a single analytical run.

Relevance:

40.00%

Publisher:

Abstract:

The goal of FOCUS, which stands for Frailty Management Optimization through EIPAHA Commitments and Utilization of Stakeholders’ Input, is to reduce the burden of frailty in Europe. The partners are working on advancing knowledge of frailty detection, assessment, and management, including biological, clinical, cognitive and psychosocial markers, in order to change the paradigm of frailty care from acute intervention to prevention. FOCUS partners are working on ways to integrate the best available evidence from frailty-related screening tools, epidemiological and interventional studies into the care of frail people and their quality of life. Frail citizens in Italy, Poland and the UK and their caregivers are being called to express their views and their experiences with treatments and interventions aimed at improving quality of life. The FOCUS Consortium is developing pathways to leverage the knowledge available and to put it in the service of frail citizens. In order to reach out to the broadest audience possible, the FOCUS Platform for Knowledge Exchange and the platform for Scaling Up are being developed with the collaboration of stakeholders. The FOCUS project is a development of the work being done by the European Innovation Partnership on Active and Healthy Ageing (EIPAHA), which aims to increase the average healthy lifespan in Europe by 2020 while fostering sustainability of health/social care systems and innovation in Europe. The knowledge and tools developed by the FOCUS project, with input from stakeholders, will be deployed to all EIPAHA participants dealing with frail older citizens to support activities and optimize performance.

Relevance:

30.00%

Publisher:

Abstract:

This paper addresses the problem of novelty detection in the case where the observed data are a mixture of a known 'background' process contaminated with an unknown other process, which generates the outliers, or novel observations. The framework described here is quite general, employing univariate classification with incomplete information, based on knowledge of the distribution (the probability density function, pdf) of the data generated by the 'background' process. The relative proportion of this 'background' component (the prior 'background' probability), and the pdfs and prior probabilities of all other components, are all assumed unknown. The main contribution is a new classification scheme that identifies the maximum proportion of observed data following the known 'background' distribution. The method exploits the Kolmogorov-Smirnov test to estimate the proportions, after which the data are Bayes-optimally separated. Results, demonstrated with synthetic data, show that this approach can produce more reliable results than a standard novelty detection scheme. The classification algorithm is then applied to the problem of identifying outliers in the SIC2004 data set, in order to detect the radioactive release simulated in the 'joker' data set. We propose this method as a reliable means of novelty detection in the emergency situation, which can also be used to identify outliers prior to the application of a more general automatic mapping algorithm. © Springer-Verlag 2007.
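As a toy illustration of the scheme described above (not the authors' exact estimator), the sketch below assumes a standard-normal 'background', bounds the background proportion from the empirical CDF with a KS-style slack, and then separates the data Bayes-optimally under an assumed outlier density:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 2000
true_p = 0.8                                    # true 'background' proportion
data = np.concatenate([
    rng.normal(0.0, 1.0, int(true_p * n)),      # known 'background' process
    rng.normal(5.0, 1.0, n - int(true_p * n)),  # unknown contaminating process
])

bg = stats.norm(0.0, 1.0)

# For any mixture F = p*F_bg + (1-p)*F_out, the empirical CDF satisfies
# F_n(x) >= p*F_bg(x) - noise and 1 - F_n(x) >= p*(1 - F_bg(x)) - noise,
# so the largest p consistent with both bounds estimates the proportion.
xs = np.sort(data)
Fn = np.arange(1, n + 1) / n
Fbg = bg.cdf(xs)
slack = 1.36 / np.sqrt(n)                       # KS-style confidence slack
p_hat = min(1.0,
            np.min((Fn + slack) / np.maximum(Fbg, 1e-12)),
            np.min((1.0 - Fn + slack) / np.maximum(1.0 - Fbg, 1e-12)))

# Bayes-optimal separation, assuming (for this toy only) that the outlier
# density is known; the paper estimates the split from the data instead.
out = stats.norm(5.0, 1.0)
is_background = p_hat * bg.pdf(data) >= (1.0 - p_hat) * out.pdf(data)
```

With well-separated components the estimated proportion lands close to the true 80%, and the posterior comparison recovers the background/outlier split.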

Relevance:

30.00%

Publisher:

Abstract:

The purpose of the work described here has been to seek methods of narrowing the present gap between currently realised heat pump performance and the theoretical limit. The single most important prerequisite to this objective is the identification and quantitative assessment of the various non-idealities and degradative phenomena responsible for the present shortfall. The use of availability analysis has been introduced as a diagnostic tool, and applied to a few very simple, highly idealised Rankine cycle optimisation problems. From this work, it has been demonstrated that the scope for improvement through optimisation is small in comparison with the extensive potential for improvement by reducing the compressor's losses. A fully instrumented heat pump was assembled and extensively tested. This furnished performance data and led to an improved understanding of the system's behaviour. From a very simple analysis of the resulting compressor performance data, confirmation of the compressor's low efficiency was obtained. In addition, in order to obtain experimental data concerning specific details of the heat pump's operation, several novel experiments were performed. The experimental work was concluded with a set of tests which attempted to obtain definitive performance data for a small set of discrete operating conditions. These tests included an investigation of the effect of two compressor modifications. The resulting performance data were analysed by a sophisticated calculation which used the measurements to quantify each degradative phenomenon occurring in the compressor, and so indicate where the greatest potential for improvement lies. Finally, in the light of everything that was learnt, specific technical suggestions have been made to reduce the losses associated with both the refrigerant circuit and the compressor.
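The availability viewpoint can be illustrated with a minimal sketch comparing a measured coefficient of performance against the reversible limit between the same temperatures; all temperatures and powers below are assumed example values, not the thesis's measurements:

```python
# Minimal availability-style figure of merit for a heat pump: compare the
# measured COP with the reversible (Carnot) limit between the same
# reservoir temperatures. All numbers are illustrative.
T_sink, T_source = 333.0, 273.0   # K: heat delivery and heat source temperatures
Q_delivered = 10.0e3              # W of heat delivered to the sink (assumed)
W_input = 3.5e3                   # W of compressor work drawn (assumed)

cop_actual = Q_delivered / W_input
cop_carnot = T_sink / (T_sink - T_source)       # reversible heating COP
second_law_efficiency = cop_actual / cop_carnot
# The shortfall (1 - second_law_efficiency) is what an availability
# analysis attributes to the individual degradative phenomena.
```

Here the machine achieves roughly half of the reversible limit, which is the kind of gap the thesis traces component by component.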

Relevance:

30.00%

Publisher:

Abstract:

We investigate the feasibility of simultaneously suppressing amplification noise and nonlinearity, which represent the most fundamental limiting factors in modern optical communication. To accomplish this task we developed a general design optimisation technique based on concepts of noise and nonlinearity management. We demonstrate the efficiency of the novel approach by applying it to the design optimisation of transmission lines with periodic dispersion compensation using Raman and hybrid Raman-EDFA amplification. Moreover, we show, using nonlinearity management considerations, that the optimal performance in high-bit-rate dispersion-managed fibre systems with hybrid amplification is achieved at a certain amplifier spacing, which differs from the commonly known optimal noise performance corresponding to fully distributed amplification. Complete knowledge of the signal statistics, required for an accurate estimation of the bit error rate (BER), is crucial for modern transmission links with strong inherent nonlinearity. We therefore implemented the advanced multicanonical Monte Carlo (MMC) method, acknowledged for its efficiency in estimating distribution tails. We have accurately computed marginal probability density functions for soliton parameters by numerical modelling of the Fokker-Planck equation using the MMC simulation technique. Moreover, applying the MMC method, we have studied the BER penalty caused by deviations from the optimal decision level in systems employing in-line 2R optical regeneration. We have demonstrated that in such systems an analytical linear approximation that better fits the central part of the regenerator's nonlinear transfer function produces a more accurate approximation of the BER and BER penalty. Finally, we present a statistical analysis of the RZ-DPSK optical signal at a direct detection receiver with Mach-Zehnder interferometer demodulation.
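MMC adaptively re-biases the sampler, but the underlying motivation, that naive Monte Carlo cannot resolve distribution tails, can be shown with a much simpler importance-sampling sketch (illustrative only, not the thesis's method):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
# P(X > 4) for X ~ N(0, 1) is ~3.17e-5: naive sampling sees almost no hits.
naive = float((rng.normal(0.0, 1.0, n) > 4.0).mean())

# Bias the sampler into the tail and reweight by the density ratio --
# the same motivation (if not the same machinery) as multicanonical MC.
y = rng.normal(4.0, 1.0, n)
w = np.exp(-0.5 * y**2 + 0.5 * (y - 4.0)**2)   # N(0,1) pdf / N(4,1) pdf
tail_est = float(np.mean(w * (y > 4.0)))
```

The biased estimator resolves the tail probability to a few significant figures with the same sample budget that leaves the naive estimate essentially useless.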

Relevance:

30.00%

Publisher:

Abstract:

A re-examination of fundamental concepts and a formal structuring of the waveform analysis problem is presented in Part I: for example, the nature of frequency is examined, and a novel alternative to the classical methods of detection is proposed and implemented, which has the advantages of speed and independence from amplitude. Waveform analysis provides the link between Parts I and II. Part II is devoted to Human Factors and the Adaptive Task Technique. The historical, technical and intellectual development of the technique is traced in a review which examines the evidence of its advantages relative to non-adaptive fixed-task methods of training, skill assessment and man-machine optimisation. A second review examines research evidence on the effect of vibration on manual control ability. Findings are presented in terms of the percentage increment or decrement in performance relative to performance without vibration in the range 0-0.6 Rms 'g'. Primary task performance was found to vary by as much as 90% between tasks at the same Rms 'g'; differences in task difficulty accounted for this variation. Within tasks, vibration-added difficulty accounted for the effects of vibration intensity. Secondary tasks were found to be largely insensitive to vibration, except those which involved fine manual adjustment of minor controls. Three experiments are reported next, in which an adaptive technique was used to measure the percentage task difficulty added by vertical random and sinusoidal vibration to a 'Critical Compensatory Tracking' task. At vibration intensities between 0 and 0.09 Rms 'g', it was found that random vibration added (24.5 x Rms'g')/7.4 x 100% to the difficulty of the control task. An equivalence relationship between random and sinusoidal vibration effects was established based upon added task difficulty.
Waveform analyses applied to the experimental data served to validate phase-plane analysis and uncovered the development of a control strategy and possibly a vibration isolation strategy. The submission ends with an appraisal of the subjects mentioned in the thesis title.

Relevance:

30.00%

Publisher:

Abstract:

This thesis sets out to examine in detail the condition of systemic hypertension (high blood pressure) in relation to optometric practice in the United Kingdom. Systemic hypertension, which is asymptomatic in the early stages, is diagnosed from the blood pressure (BP) measurement recorded by a sphygmomanometer and/or from the complications that have developed in target organs. Optometric practice-based surveys revealed that diagnosed systemic hypertension was the most prevalent cardiovascular medical condition (20.5%). Measurement of the BP of patients in this sample revealed that if an optometrist included sphygmomanometry in the sight examination, then at least one patient each day would be referred for suspected systemic hypertension. Optometric opinion was that the measurement of BP in optometric practice would advance the profession and be appreciated by both patients and General Practitioners (GPs), but that it was an unnecessary routine procedure. The present sight examination for the systemic hypertensive is similar to that of the normotensive patient, but may involve an altered fundus examination and a visual field test. The GPs were in favour of optometric BP measurement and a future role in the shared care management of the systemic hypertensive. The application of a new pictorial grading scale for the grading of vascular changes associated with pre-malignant systemic hypertension was found to be both accurate and reliable. A clinical trial of the grading scale in optometric practice found positive correlations between BP and increasing severity of the retinal vascular features. The subtle pre-malignant vascular changes require reliable, accurate detection and analysis to assist in the management of the systemic hypertensive patient. Vessel width was shown to decrease with increasing age. Image analysis of the A/V ratio, arteriolar tortuosity and focal calibre changes revealed a positive correlation to the patient's BP (p<0.001).
The retinal vasculature is relatively stable longitudinally with only minor changes in response to early disease states. Age and elevated BP increased a patient's risk of developing systemic medical conditions over a two-year period. The application of the pictorial grading scale to optometric practice and training the optometrist in the use of sphygmomanometry would improve the management of the systemic hypertensive patient in optometric practice. Future advances in image analysis hold substantial benefits for the detection and monitoring of subtle vascular changes associated with systemic hypertension.
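As a minimal illustration of one vascular index mentioned above, the arteriovenous (A/V) ratio can be computed from measured vessel widths; the numbers below are hypothetical, not measurements from the thesis:

```python
# Toy arteriovenous (A/V) ratio from hypothetical vessel-width
# measurements (pixels); narrowing arterioles lower the ratio in
# hypertensive retinopathy.
arteriole_widths = [12.0, 11.5, 12.5]   # assumed example measurements
venule_widths = [18.0, 17.5, 18.5]

def mean(xs):
    return sum(xs) / len(xs)

av_ratio = mean(arteriole_widths) / mean(venule_widths)
```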

Relevance:

30.00%

Publisher:

Abstract:

This thesis presents research within empirical financial economics with a focus on liquidity and portfolio optimisation in the stock market. The discussion of liquidity is focused on measurement issues, including TAQ data processing and the measurement of systematic liquidity factors, while portfolio optimisation uses the full-scale optimisation (FSO) approach; a framework for treating the two topics in combination is also provided. The liquidity part of the thesis gives a conceptual background to liquidity and discusses several different approaches to liquidity measurement. It contributes to liquidity measurement by providing detailed guidelines on the data processing needed to apply TAQ data to liquidity research. The main focus, however, is the derivation of systematic liquidity factors. The principal component approach to systematic liquidity measurement is refined by the introduction of moving and expanding estimation windows, allowing for time-varying liquidity covariances between stocks. Under several liquidity specifications, this improves the ability to explain stock liquidity and returns, as compared to static-window PCA and market-average approximations of systematic liquidity. The highest ability to explain stock returns is obtained when using inventory cost as the liquidity measure and moving-window PCA as the systematic liquidity derivation technique. Systematic factors from this setting also explain cross-sectional liquidity variation well. Portfolio optimisation in the FSO framework is tested in two empirical studies. These contribute to the assessment of FSO by expanding its applicability to stock indexes and individual stocks, by considering a wide selection of utility function specifications, and by showing explicitly how the full-scale optimum can be identified using either grid search or the heuristic search algorithm of differential evolution.
The studies show that relative to mean-variance portfolios, FSO performs well in these settings and that the computational expense can be mitigated dramatically by application of differential evolution.
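A minimal FSO-style sketch of the differential-evolution route, with assumed return parameters rather than the thesis's data: maximise average CRRA utility over an empirical return distribution using SciPy's differential evolution:

```python
import numpy as np
from scipy.optimize import differential_evolution

# Simulated monthly returns for 4 assets (parameters are assumptions).
rng = np.random.default_rng(0)
T, N = 500, 4
returns = rng.multivariate_normal(
    mean=[0.010, 0.008, 0.012, 0.006],
    cov=np.diag([0.05, 0.03, 0.08, 0.02]) ** 2,
    size=T)

def neg_utility(w, gamma=5.0):
    """Negative average power (CRRA) utility of end-of-period wealth."""
    w = np.clip(w, 0.0, 1.0)
    w = w / (w.sum() + 1e-12)           # long-only, fully invested
    wealth = 1.0 + returns @ w          # gross portfolio return per period
    return -np.mean(wealth ** (1.0 - gamma) / (1.0 - gamma))

# FSO works on the full empirical distribution, so the objective needs no
# mean-variance approximation; DE searches the weight space heuristically.
res = differential_evolution(neg_utility, bounds=[(0.0, 1.0)] * N,
                             seed=1, maxiter=200, tol=1e-7)
w_star = np.clip(res.x, 0.0, 1.0)
w_star /= w_star.sum()
```

Grid search over the same bounds would also find the optimum, but its cost grows exponentially in the number of assets, which is the computational point the studies make in favour of differential evolution.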

Relevance:

30.00%

Publisher:

Abstract:

Development of mass spectrometry techniques to detect protein oxidation, which contributes to signalling and inflammation, is important. Label-free approaches have the advantage of reduced sample manipulation, but are challenging in complex samples owing to the undirected analysis of large data sets using statistical search engines. To identify oxidised proteins in biological samples, we previously developed a targeted approach involving precursor ion scanning for diagnostic MS3 ions from oxidised residues. Here, we tested this approach for other oxidations, and compared it with an alternative approach involving the use of extracted ion chromatograms (XICs) generated from high-resolution MS/MS data using very narrow mass windows. This accurate-mass XIC methodology was effective at identifying nitrotyrosine, chlorotyrosine, and oxidative deamination of lysine, and for tyrosine oxidations it highlighted more modified peptide species than precursor ion scanning or statistical database searches. Although some false positive peaks still occurred in the XICs, these could be identified by comparative assessment of the peak intensities. The method has the advantage that a number of different modifications can be analysed simultaneously in a single LC-MS/MS run. This article is part of a Special Issue entitled: Posttranslational Protein Modifications in Biology and Medicine. Biological significance: The use of accurate-mass extracted product ion chromatograms to detect oxidised peptides could improve the identification of oxidatively damaged proteins in inflammatory conditions. © 2013 Elsevier B.V.
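The accurate-mass XIC idea can be sketched as follows: sum intensity per scan within a narrow ppm window around a diagnostic product-ion m/z. The scan data and the target mass below are invented for illustration:

```python
import numpy as np

# Toy centroided product-ion data: retention time (s), m/z, intensity.
rt = np.array([10.0, 10.0, 12.0, 12.0, 14.0, 14.0])
mz = np.array([181.061, 300.120, 181.062, 450.300, 181.300, 300.119])
intensity = np.array([1.0e4, 5.0e3, 2.0e4, 1.0e3, 8.0e3, 6.0e3])

def xic(target_mz, ppm=10.0):
    """Extracted ion chromatogram: per-scan intensity within +/- ppm of target."""
    tol = target_mz * ppm * 1e-6          # window half-width in Da
    hits = np.abs(mz - target_mz) <= tol
    times = np.unique(rt)
    trace = np.array([intensity[hits & (rt == t)].sum() for t in times])
    return times, trace

# A hypothetical diagnostic m/z: the narrow window keeps 181.061/181.062
# but rejects the near-isobaric 181.300 that a wide window would admit.
times, trace = xic(181.061)
```

The narrowness of the window is what suppresses near-isobaric interferences, which is why the approach works on high-resolution data but not on unit-resolution instruments.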

Relevance:

30.00%

Publisher:

Abstract:

Visual field assessment is a core component of glaucoma diagnosis and monitoring, and the Standard Automated Perimetry (SAP) test is still considered the gold standard of visual field assessment. Although SAP is a subjective assessment with many pitfalls, it is used routinely in the diagnosis of visual field loss in glaucoma. The multifocal visual evoked potential (mfVEP) is a newly introduced method for objective visual field assessment. Several analysis protocols have been tested to identify early visual field losses in glaucoma patients using the mfVEP technique; some were successful in detecting field defects comparable to standard SAP visual field assessment, while others were less informative and needed further adjustment and research. In this study, we implemented a novel analysis approach and evaluated its validity and whether it could be used effectively for early detection of visual field defects in glaucoma. OBJECTIVES: The purpose of this study is to examine the effectiveness of a new analysis method for the multifocal visual evoked potential (mfVEP) when used for the objective assessment of the visual field in glaucoma patients, compared to the gold standard technique. METHODS: Three groups were tested in this study: normal controls (38 eyes), glaucoma patients (36 eyes) and glaucoma suspect patients (38 eyes). All subjects underwent two standard Humphrey Field Analyzer (HFA) 24-2 visual field tests and a single mfVEP test in one session. Analysis of the mfVEP results was done using the new analysis protocol, the Hemifield Sector Analysis (HSA) protocol; analysis of the HFA was done using the standard grading system. RESULTS: Analysis of the mfVEP results showed a statistically significant difference between the three groups in the mean signal-to-noise ratio (SNR) (ANOVA, p<0.001 with a 95% CI).
The differences between the superior and inferior hemispheres were statistically significant in the glaucoma patient group in all 11 sectors (t-test, p<0.001), partially significant in 5/11 sectors (t-test, p<0.01), and not statistically significant between most sectors in the normal group (only 1/11 was significant; t-test, p<0.9). Sensitivity and specificity of the HSA protocol in detecting glaucoma were 97% and 86% respectively, while for glaucoma suspects they were 89% and 79%. DISCUSSION: The results showed that the new analysis protocol was able to confirm existing field defects detected by standard HFA and to differentiate between the three study groups, with a clear distinction between normal subjects and patients with suspected glaucoma; the distinction between normal subjects and glaucoma patients was especially clear and significant. CONCLUSION: The new HSA protocol used in mfVEP testing can be used to detect glaucomatous visual field defects in both glaucoma and glaucoma suspect patients. Using this protocol can provide information about focal visual field differences across the horizontal midline, which can be utilised to differentiate between glaucoma and normal subjects. Sensitivity and specificity of the mfVEP test showed very promising results and correlated with other anatomical changes in glaucoma field loss.
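For reference, sensitivity and specificity follow directly from a 2x2 confusion table; the counts below are hypothetical, chosen only to land near the reported figures, and are not the study's data:

```python
# Sensitivity/specificity from a 2x2 confusion table (hypothetical counts).
tp, fn = 35, 1    # glaucomatous eyes flagged / missed by the protocol
tn, fp = 33, 5    # normal eyes correctly passed / falsely flagged

sensitivity = tp / (tp + fn)   # fraction of diseased eyes detected
specificity = tn / (tn + fp)   # fraction of healthy eyes correctly passed
```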

Relevance:

30.00%

Publisher:

Abstract:

In the face of global population growth and the uneven distribution of water supply, a better knowledge of the spatial and temporal distribution of surface water resources is critical. Remote sensing provides a synoptic view of ongoing processes, which addresses the intricate nature of water surfaces and allows an assessment of the pressures placed on aquatic ecosystems. However, the main challenge in identifying water surfaces from remotely sensed data is the high variability of spectral signatures, both in space and time. In the last 10 years only a few operational methods have been proposed to map or monitor surface water at continental or global scale, and each of them shows limitations. The objective of this study is to develop and demonstrate the adequacy of a generic multi-temporal and multi-spectral image analysis method to detect water surfaces automatically, and to monitor them in near-real-time. The proposed approach, based on a transformation of the RGB color space into HSV, provides dynamic information at the continental scale. The validation of the algorithm showed very few omission errors and no commission errors, demonstrating the ability of the proposed algorithm to perform as effectively as human interpretation of the images. The validation of the permanent water surface product with an independent dataset derived from high-resolution imagery showed an accuracy of 91.5% and few commission errors. Potential applications of the proposed method have been identified and discussed. The methodology that has been developed is generic: it can be applied to sensors with similar bands with good reliability and minimal effort. Moreover, this experiment at continental scale showed that the methodology is efficient for a large range of environmental conditions. Additional preliminary tests over other continents indicate that the proposed methodology could also be applied at the global scale without too many difficulties.
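A minimal sketch of the RGB-to-HSV idea: in HSV space, water tends to separate as blue-hued, low-brightness pixels. The thresholds below are assumed for illustration, not the paper's calibrated decision rules:

```python
import colorsys
import numpy as np

# Tiny synthetic "scene": water pixels dark and blue-ish, land brighter.
rgb = np.array([[[0.10, 0.20, 0.35],    # water
                 [0.05, 0.15, 0.30]],   # water
                [[0.35, 0.45, 0.20],    # vegetation
                 [0.50, 0.40, 0.25]]])  # soil

# Per-pixel RGB -> HSV; colorsys returns hue, saturation, value in [0, 1].
hsv = np.array([[colorsys.rgb_to_hsv(*px) for px in row] for row in rgb])
hue, val = hsv[..., 0], hsv[..., 2]

# Illustrative rule: blue hues (~0.5-0.8) with low brightness suggest water.
water_mask = (hue > 0.5) & (hue < 0.8) & (val < 0.4)
```

Decoupling hue from brightness is what makes the rule robust to the illumination variability that defeats fixed RGB thresholds.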

Relevance:

30.00%

Publisher:

Abstract:

CONCLUSIONS: The new HSA protocol used in mfVEP testing can be applied to detect glaucomatous visual field defects in both glaucoma and glaucoma suspect patients. Using this protocol can provide information about focal visual field differences across the horizontal midline, which can be utilized to differentiate between glaucoma and normal subjects. Sensitivity and specificity of the mfVEP test showed very promising results and correlated with other anatomical changes in glaucoma field loss. PURPOSE: The multifocal visual evoked potential (mfVEP) is a newly introduced method for objective visual field assessment. Several analysis protocols have been tested to identify early visual field losses in glaucoma patients using the mfVEP technique; some were successful in detecting field defects comparable to standard automated perimetry (SAP) visual field assessment, while others were less informative and needed further adjustment and research. In this study we implemented a novel analysis approach and evaluated its validity and whether it could be used effectively for early detection of visual field defects in glaucoma. METHODS: Three groups were tested in this study: normal controls (38 eyes), glaucoma patients (36 eyes) and glaucoma suspect patients (38 eyes). All subjects underwent two standard Humphrey field analyzer (HFA) 24-2 tests and a single mfVEP test in one session. Analysis of the mfVEP results was done using the new analysis protocol, the hemifield sector analysis (HSA) protocol; analysis of the HFA was done using the standard grading system. RESULTS: Analysis of the mfVEP results showed a statistically significant difference between the three groups in the mean signal-to-noise ratio (ANOVA test, p < 0.001 with a 95% confidence interval).
The differences between the superior and inferior hemispheres were statistically significant in the glaucoma patient group in all 11 sectors (t-test, p < 0.001), partially significant in 5/11 sectors (t-test, p < 0.01), and not statistically significant in most sectors of the normal group (1/11 sectors was significant; t-test, p < 0.9). Sensitivity and specificity of the HSA protocol in detecting glaucoma were 97% and 86%, respectively, and for glaucoma suspect patients the values were 89% and 79%, respectively.
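The hemisphere comparison reported above is an ordinary two-sample t-test on sector signal-to-noise ratios; a sketch with synthetic numbers (not the study's data):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic SNRs: an intact hemifield vs one reduced by a field defect.
superior = rng.normal(2.0, 0.4, 36)   # SNR in the intact hemifield
inferior = rng.normal(1.2, 0.4, 36)   # SNR lowered by a defect

t, p = stats.ttest_ind(superior, inferior)
```

With a real hemifield defect the asymmetry produces a large t statistic, which is the signal the HSA protocol exploits across the horizontal midline.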