17 results for System failures (Engineering) Graphic methods
in Aston University Research Archive
Abstract:
The work described in this thesis is an attempt to elucidate the relationships between the pore system and a number of engineering properties of hardened cement paste, particularly tensile strength and resistance to carbonation and ionic penetration. By examining aspects such as the rate of carbonation, the pore size distribution, the concentration of ions in the pore solution and the phase composition of cement pastes, relationships between the pore system (pores and pore solution) and the resistance to carbonation were investigated. The study was carried out in two parts: first, cement pastes with different pore systems were compared; second, comparisons were made between the pore systems of cement pastes with different degrees of carbonation. Relationships between the pore structure and ionic penetration were studied by comparing kinetic data relating to the diffusion of various ions in cement pastes with different pore systems. Diffusion coefficients and activation energies for the diffusion of Cl⁻ and Na⁺ ions in carbonated and non-carbonated cement pastes were determined by a quasi-steady state technique. The effect of pore geometry on ionic diffusion was studied by comparing the mechanisms of ionic diffusion for ions with different radii. In order to investigate the possible relationship between tensile strength and macroporosity, cement paste specimens with cross-sectional areas of less than 1 mm² were produced so that the chance of a macropore existing within them was low. The tensile strengths of such specimens were then compared with those of larger specimens.
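As an illustration of the kind of calculation the quasi-steady state diffusion measurements support, the short Python sketch below estimates an activation energy from diffusion coefficients at two temperatures, assuming simple Arrhenius behaviour. The numerical values are hypothetical, not results from the thesis.

```python
import numpy as np

# Illustrative sketch (not thesis data): estimating an activation energy for
# ionic diffusion from diffusion coefficients measured at two temperatures,
# assuming Arrhenius behaviour D = D0 * exp(-Ea / (R * T)).
R = 8.314  # gas constant, J mol^-1 K^-1

def activation_energy(D1, T1, D2, T2):
    """Return Ea in kJ/mol from diffusion coefficients D1, D2 at temperatures T1, T2 (K)."""
    Ea = R * np.log(D1 / D2) / (1.0 / T2 - 1.0 / T1)
    return Ea / 1000.0

# Hypothetical chloride diffusion coefficients (m^2/s) at 25 C and 45 C.
print(activation_energy(3.0e-12, 298.15, 7.5e-12, 318.15))  # a few tens of kJ/mol
```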
Abstract:
We experimentally demonstrated a highly sensitive twist sensor system based on a 45° and an 81° tilted fibre grating (TFG). The 81°-TFG exhibits a pair of dual peaks arising from the birefringence induced by its extremely tilted structure. When the 81°-TFG is subjected to twist, the coupling to the two peaks interchanges, providing a mechanism to measure and monitor the twist. We have investigated the performance of the sensor system with three interrogation methods (spectral, power-measurement and voltage-measurement). The experimental results clearly show that the 81°-TFG and the 45°-TFG can be combined to form a full-fibre twist sensor system capable of not just measuring the magnitude but also recognising the direction of the applied twist.
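The following toy Python sketch illustrates, under assumed calibration values, how a power-measurement interrogation of the dual peaks could yield both the magnitude and the sign of the twist; it is not the published calibration of the sensor.

```python
# Illustrative sketch only: the calibration constant and the normalised-difference
# mapping below are assumptions, not the paper's calibration.

def twist_from_peak_powers(p1, p2, k_deg_per_unit=90.0):
    """Signed twist estimate from the two peak powers p1, p2 (linear units).

    As one peak grows the other shrinks under twist, so the normalised
    difference (p1 - p2)/(p1 + p2) carries both magnitude and direction.
    """
    r = (p1 - p2) / (p1 + p2)
    return k_deg_per_unit * r   # sign indicates twist direction

print(twist_from_peak_powers(0.8, 0.2))   # positive twist
print(twist_from_peak_powers(0.2, 0.8))   # same magnitude, opposite direction
```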
Computational mechanics reveals nanosecond time correlations in molecular dynamics of liquid systems
Abstract:
Statistical complexity, a measure introduced in computational mechanics, has been applied to MD-simulated liquid water and other molecular systems. It has been found that statistical complexity does not converge in these systems but grows logarithmically without limit. The coefficient of this growth has been introduced as a new molecular parameter which is invariant for a given liquid system. Using this new parameter, extremely long time correlations in the system, undetectable by traditional methods, are elucidated. The existence of correlations hundreds of picoseconds and even nanoseconds long in bulk water has been demonstrated. © 2008 Elsevier B.V. All rights reserved.
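A minimal Python sketch of the fitting step implied above: if the statistical complexity C(t) grows logarithmically, the invariant growth coefficient is the slope of a least-squares fit against ln t. The data here are synthetic, not from the simulations.

```python
import numpy as np

# Synthetic example: fit C(t) = a*ln(t) + b and read off the growth
# coefficient 'a', the invariant molecular parameter described above.
t = np.array([1, 2, 5, 10, 20, 50, 100, 200, 500, 1000], dtype=float)  # ps
C = 0.7 * np.log(t) + 2.0 + np.random.normal(0, 0.02, t.size)          # toy C(t)

a, b = np.polyfit(np.log(t), C, 1)
print(f"growth coefficient a = {a:.3f}")  # recovers roughly 0.7 for the toy data
```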
Abstract:
Exploratory analysis of data seeks to find common patterns to gain insights into the structure and distribution of the data. In geochemistry it is a valuable means of gaining insight into the complicated processes making up a petroleum system. Typically, linear visualisation methods such as principal components analysis, linked plots, or brushing are used. These methods cannot be employed directly when dealing with missing data, and they struggle to capture global non-linear structure in the data, although they can do so locally. This thesis discusses a complementary approach based on a non-linear probabilistic model. The generative topographic mapping (GTM) enables the visualisation of the effects of very many variables on a single plot, which is able to incorporate more structure than a two-dimensional principal components plot. The model can deal with uncertainty and missing data, and allows for the exploration of the non-linear structure in the data. In this thesis a novel approach to initialising the GTM with arbitrary projections is developed. This makes it possible to combine GTM with algorithms such as Isomap and to fit complex non-linear structures such as the Swiss roll. Another novel extension is the incorporation of prior knowledge about the structure of the covariance matrix. This extension greatly enhances the modelling capabilities of the algorithm, resulting in a better fit to the data and better imputation of missing data. Additionally, an extensive benchmark study of the missing-data imputation capabilities of GTM is performed. Further, a novel approach based on missing data is introduced to benchmark the fit of probabilistic visualisation algorithms on unlabelled data. Finally, the work is complemented by evaluating the algorithms on real-life datasets from geochemical projects.
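The missing-data benchmarking idea mentioned above can be sketched as follows; the imputer shown (column means) is only a placeholder standing in for GTM-based imputation, and the data are synthetic.

```python
import numpy as np

# Sketch of the benchmarking idea: hide a random subset of known entries,
# impute them, and score the reconstruction error on the held-out values.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))                      # toy geochemical-style data

mask = rng.random(X.shape) < 0.1                   # 10% artificially missing
X_missing = np.where(mask, np.nan, X)

# Placeholder imputer: column means. A GTM-based imputer would replace this step.
col_means = np.nanmean(X_missing, axis=0)
X_imputed = np.where(np.isnan(X_missing), col_means, X_missing)

rmse = np.sqrt(np.mean((X_imputed[mask] - X[mask]) ** 2))
print(f"imputation RMSE on held-out entries: {rmse:.3f}")
```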
Abstract:
This thesis describes advances in the characterisation, calibration and data processing of optical coherence tomography (OCT) systems. Femtosecond (fs) laser inscription was used for producing OCT-phantoms. Transparent materials are generally inert to infrared radiation, but with fs lasers material modification occurs via non-linear processes when the highly focused light source interacts with the material. This modification is confined to the focal volume and is highly reproducible. In order to select the best inscription parameters, combinations of different inscription parameters were tested, using three fs laser systems with different operating properties, on a variety of materials. This facilitated the understanding of the key characteristics of the produced structures, with the aim of producing viable OCT-phantoms. Finally, OCT-phantoms were successfully designed and fabricated in fused silica. The use of these phantoms to characterise many properties (resolution, distortion, sensitivity decay, scan linearity) of an OCT system was demonstrated. Quantitative methods were developed to support the characterisation of an OCT system collecting images from phantoms and also to improve the quality of the OCT images. Characterisation methods include the measurement of the spatially variant resolution (point spread function (PSF) and modulation transfer function (MTF)), sensitivity and distortion. Processing of OCT data is computationally intensive. Standard central processing unit (CPU) based processing might take several minutes to a few hours to process acquired data, so data processing is a significant bottleneck. An alternative is to use expensive hardware-based processing such as field programmable gate arrays (FPGAs). However, graphics processing unit (GPU) based data processing methods have recently been developed to minimise data processing and rendering time. These techniques include standard-processing methods, a set of algorithms that process the raw interference data obtained by the detector and generate A-scans. The work presented here describes accelerated data processing and post-processing techniques for OCT systems. The GPU-based processing developed during the PhD was later implemented into a custom-built Fourier domain optical coherence tomography (FD-OCT) system. This system currently processes and renders data in real time; processing throughput is currently limited by the camera capture rate. The OCT-phantoms have been used extensively for the qualitative characterisation and fine-tuning of the operating conditions of the OCT system, and investigations are under way to characterise OCT systems using these phantoms. The work presented in this thesis demonstrates several novel techniques for fabricating OCT-phantoms and for accelerating OCT data processing using GPUs. In the process of developing the phantoms and quantitative methods, a thorough understanding and practical knowledge of OCT and fs laser processing systems was developed. This understanding led to several novel pieces of research that are not only relevant to OCT but have broader importance. For example, an extensive understanding of the properties of fs-inscribed structures will be useful in other photonic applications such as the making of phase masks, waveguides and microfluidic channels. Acceleration of data processing with GPUs is also useful in other fields.
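For orientation, the sketch below shows the core of the "standard processing" chain that turns raw FD-OCT interference spectra into A-scans (background subtraction, windowing, FFT). It uses NumPy on the CPU; the thesis work maps the same steps onto GPU kernels, and refinements such as k-linearisation and dispersion compensation are omitted here.

```python
import numpy as np

# Minimal sketch of FD-OCT standard processing: background subtraction,
# windowing and an inverse FFT to obtain a log-magnitude A-scan.
def a_scan(spectrum, background):
    """Return a log-magnitude A-scan from one detector spectrum."""
    s = spectrum - background                  # remove DC / reference background
    s = s * np.hanning(s.size)                 # suppress side lobes
    depth_profile = np.abs(np.fft.ifft(s))     # interference fringes -> depth
    return 20 * np.log10(depth_profile[: s.size // 2] + 1e-12)

# Toy fringe pattern from a single reflector.
k = np.arange(2048)
raw = 1.0 + 0.5 * np.cos(2 * np.pi * 0.05 * k)
print(a_scan(raw, np.ones_like(raw)).argmax())   # depth bin of the reflector peak
```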
Abstract:
In this paper we propose a hybrid TCP/UDP transport, specifically for H.264/AVC encoded video, as a compromise between the delay-prone TCP and the loss-prone UDP. When implementing the hybrid approach, we argue that the playback at the receiver often need not be 100% perfect, provided that a certain level of quality is assured. Reliable TCP is used to transmit and guarantee delivery of the most important packets. This allows the use of additional features in the H.264/AVC standard which simultaneously provide enhanced playback quality and a reduction in throughput. These benefits are demonstrated through experimental results using a test-bed to emulate the hybrid proposal. We compare the proposed system with other protection methods, such as FEC, and in one case show that for the same bandwidth overhead, FEC is unable to match the performance of the hybrid system in terms of playback quality. Furthermore, we measure the delay associated with our approach, and examine its potential as an alternative to the conventional methods of transporting video by either TCP or UDP alone. © 2011 IEEE.
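A conceptual Python sketch of the hybrid routing idea follows; the priority test and the addresses are placeholders rather than details of the paper's test-bed.

```python
import socket

# Conceptual sketch of the hybrid transport: high-priority packets travel over
# a reliable TCP connection, the remaining packets over UDP.
def send_packet(tcp_sock: socket.socket,
                udp_sock: socket.socket,
                dest: tuple,
                payload: bytes,
                high_priority: bool) -> None:
    """Route a packetised H.264 payload according to its importance."""
    if high_priority:                   # e.g. parameter sets, IDR slices
        tcp_sock.sendall(payload)       # guaranteed delivery, delay-prone
    else:
        udp_sock.sendto(payload, dest)  # timely delivery, loss-prone

# Usage (receiver address is an example value):
# tcp = socket.create_connection(("203.0.113.10", 5004))
# udp = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# send_packet(tcp, udp, ("203.0.113.10", 5005), nal_unit, high_priority=True)
```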
Abstract:
The work presented in this thesis describes an investigation into the production and properties of thin amorphous C films, with and without Cr doping, as low wear / low friction coatings applicable to MEMS and other micro- and nano-engineering applications. Firstly, an assessment was made of the available testing techniques. Secondly, the optimised test methods were applied to a series of sputtered films of thickness 10 - 2000 nm in order to: (i) investigate the effect of thickness on the properties of the coatings and the coating process, (ii) investigate fundamental tribology at the nano-scale and (iii) provide a starting point for nanotribological coating optimisation at ultra-low thickness. The use of XPS was investigated for the determination of sp3/sp2 carbon bonding. Under C 1s peak analysis, significant errors were identified and attributed to the absence of sufficient instrument resolution to guide the component peak structure (even with a high-resolution instrument). A simple peak width analysis and correlation with the C KLL D value confirmed the errors. The use of XPS for sp3/sp2 determination was therefore limited to initial tentative estimations. Nanoindentation was shown to provide consistent hardness and reduced modulus results with depth (to < 7 nm) when replicate data were suitably statistically processed. No significant pile-up or cracking of the films was identified under nanoindentation. Nanowear experimentation by multiple nanoscratching provided some useful information; however, the test conditions were very different from those expected for MEMS and micro- / nano-engineering systems. A novel 'sample oscillated nanoindentation' system was therefore developed for testing nanowear under more relevant conditions. The films were produced in an industrial production coating line. In order to maximise the available information and to take account of uncontrolled process variation, a statistical design-of-experiment procedure was used to investigate the effect of four key process control parameters. Cr doping was the most significant control parameter at all thicknesses tested; it produced a softening effect and thus increased nanowear. Substrate bias voltage was also a significant parameter and produced a hardening, wear-reducing effect at all thicknesses tested. The use of a Cr adhesion layer produced beneficial results at 150 nm thickness, but was ineffective at 50 nm. Argon flow to the coating chamber produced a complex effect. All effects reduced significantly with reducing film thickness. Classic fretting wear was produced at low amplitude under nanowear testing. Reciprocating sliding at higher amplitude generated three-body abrasive wear, generally consistent with the Archard model. Specific wear rates were very low (typically 10⁻¹⁶ to 10⁻¹⁸ m³ N⁻¹ m⁻¹). Wear rates reduced exponentially with reduced film thickness and, below approximately 20 nm, thickness was identified as the most important factor controlling wear.
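For reference, the specific wear rate quoted above follows from the Archard relation k = V / (F s). The worked example below uses invented nano-scale values, chosen only so that the result lands in the reported range.

```python
# Worked example of the specific wear rate (illustrative values, not measured
# ones). In Archard-type wear, worn volume V scales with normal load F and
# sliding distance s, so the specific wear rate is k = V / (F * s).
V = 1.0e-23   # worn volume, m^3 (hypothetical nano-scale wear scar)
F = 5.0e-4    # normal load, N (500 uN)
s = 2.0e-3    # accumulated sliding distance, m

k = V / (F * s)
print(f"specific wear rate k = {k:.1e} m^3 N^-1 m^-1")   # 1.0e-17, within the range above
```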
Abstract:
In this thesis various mathematical methods of studying the transient and dynamic stability of practical power systems are presented. Certain long-established methods are reviewed and refinements of some are proposed. New methods are presented which remove some of the difficulties encountered in applying the powerful stability theories based on the concepts of Liapunov. Chapter 1 is concerned with numerical solution of the transient stability problem. Following a review and comparison of synchronous machine models, the superiority of a particular model from the point of view of combined computing time and accuracy is demonstrated. A digital computer program incorporating all the synchronous machine models discussed, together with an induction machine model, is described, and results of a practical multi-machine transient stability study are presented. Chapter 2 reviews certain concepts and theorems due to Liapunov. In Chapter 3 transient stability regions of single-, two- and multi-machine systems are investigated through the use of energy-type Liapunov functions. The treatment removes several mathematical difficulties encountered in earlier applications of the method. In Chapter 4 a simple criterion for the steady-state stability of a multi-machine system is developed and compared with established criteria and a state space approach. In Chapters 5, 6 and 7 dynamic stability and small-signal dynamic response are studied through a state space representation of the system. In Chapter 5 the state space equations are derived for single-machine systems. An example is provided in which the dynamic stability limit curves are plotted for various synchronous machine representations. In Chapter 6 the state space approach is extended to multi-machine systems. To draw conclusions concerning dynamic stability or dynamic response the system eigenvalues must be properly interpreted, and a discussion concerning correct interpretation is included. Chapter 7 presents a discussion of the optimisation of power system small-signal performance through the use of Liapunov functions.
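As a minimal illustration of the state-space approach of Chapters 5 to 7, the sketch below forms the linearised swing-equation model of a single machine (with invented parameters) and checks small-signal stability from the eigenvalues of the system matrix.

```python
import numpy as np

# Classical single-machine swing-equation model, linearised about an operating
# point. All parameter values are illustrative. Small-signal (dynamic)
# stability requires every eigenvalue of A to have a negative real part.
H, D, f0 = 3.5, 2.0, 50.0     # inertia constant (s), damping (pu), system frequency (Hz)
Ks = 1.2                      # synchronising power coefficient (pu/rad)
omega_s = 2 * np.pi * f0

# States: [delta (rotor angle deviation, rad), delta_omega (speed deviation, pu)]
A = np.array([[0.0,            omega_s],
              [-Ks / (2 * H), -D / (2 * H)]])

eigvals = np.linalg.eigvals(A)
print(eigvals)
print("small-signal stable:", bool(np.all(eigvals.real < 0)))
```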
Abstract:
This thesis is an exploration of the organisation and functioning of the human visual system using the non-invasive functional imaging modality magnetoencephalography (MEG). Chapters one and two provide an introduction to the human visual system and magnetoencephalographic methodologies. These chapters subsequently describe the methods by which MEG can be used to measure neuronal activity from the visual cortex. Chapter three describes the development and implementation of novel analytical tools, including beamforming-based analyses, spectrographic movies and an optimisation of group imaging methods. Chapter four focuses on the use of established and contemporary analytical tools in the investigation of visual function, beginning with an investigation of visually evoked and induced responses covering visual evoked potentials (VEPs) and event-related synchronisation/desynchronisation (ERS/ERD). Chapter five describes the employment of novel methods in the investigation of cortical contrast response and demonstrates distinct contrast response functions in striate and extra-striate regions of visual cortex. Chapter six uses synthetic aperture magnetometry (SAM) to investigate the phenomenon of visual cortical gamma oscillations in response to various visual stimuli, concluding that pattern is central to its generation and that it increases in amplitude linearly as a function of stimulus contrast, consistent with results from invasive electrode studies in the macaque monkey. Chapter seven describes the use of driven visual stimuli and tuned SAM methods in a pilot study of retinotopic mapping using MEG, finding that activity in the primary visual cortex can be distinguished in four quadrants and at two eccentricities of the visual field. Chapter eight is a novel implementation of the SAM beamforming method in the investigation of a subject with migraine visual aura; the method reveals desynchronisation of the alpha and gamma frequency bands in occipital and temporal regions contralateral to the observed visual abnormalities. The final chapter is a summary of the main conclusions and suggested further work.
Abstract:
The civil engineering industry generally regards new methods and technology with a high degree of scepticism, preferring to use traditional and trusted methods. During the 1980s, competition for civil engineering consultancy work worldwide became fierce. Halcrow recognised the need to maintain and improve their competitive edge over other consultants, and the use of new technology in the form of microcomputers was seen as one method of maintaining and improving their reputation in the world. This thesis examines the role of microcomputers in civil engineering consultancy, with particular reference to overseas projects. The involvement of civil engineers with computers, both past and present, has been investigated, and a survey of the use of microcomputers by consultancies was carried out; the results are presented and analysed. A resume of the state of the art of microcomputer technology was made. Various case studies were carried out in order to examine the feasibility of using microcomputers on overseas projects. One case study involved the examination of two projects in Bangladesh and is used to illustrate the requirements and problems encountered in such situations. Two programming applications were undertaken: a dynamic programming model of a single-site reservoir and a simulation of the Bangladesh gas grid system. A cost-benefit analysis of a water resources project using microcomputers in the Aguan Valley, Honduras, was carried out. Although the initial cost of microcomputers is often small, the overall costs can prove to be very high and are likely to exceed the costs of traditional computer methods. A planned approach to the use of microcomputers is essential in order to reap the expected benefits, and recommendations for the implementation of such an approach are presented.
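A highly simplified sketch of what a dynamic programming model for a single-site reservoir can look like is given below; the stage data, objective and discretisation are invented for illustration and are not taken from the thesis.

```python
import numpy as np

# Backward dynamic programming over monthly stages for a single reservoir:
# choose the release that minimises squared supply deficit, subject to the
# storage balance. All values are toy data.
capacity = 10                                   # discrete storage units
inflow  = [4, 3, 2, 1, 2, 5]                    # inflow per stage
demand  = [3, 3, 3, 3, 3, 3]                    # demand per stage
stages  = len(inflow)

states = np.arange(capacity + 1)
cost_to_go = np.zeros(capacity + 1)             # terminal cost = 0
policy = np.zeros((stages, capacity + 1), dtype=int)

for t in reversed(range(stages)):
    new_cost = np.full(capacity + 1, np.inf)
    for s in states:                            # current storage level
        for release in range(int(s) + inflow[t] + 1):
            s_next = s + inflow[t] - release
            if 0 <= s_next <= capacity:
                cost = (demand[t] - release) ** 2 + cost_to_go[s_next]
                if cost < new_cost[s]:
                    new_cost[s], policy[t, s] = cost, release
    cost_to_go = new_cost

print(policy[0])        # optimal first-stage release for each starting storage
```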
Abstract:
The aim of the research project was to gain a complete and accurate accounting of the needs and deficiencies of materials selection and design data, with particular attention given to the feasibility of a computerised materials selection system that would include application analysis, property data and screening techniques. The project also investigates and integrates the three major aspects of materials resources, materials selection and materials recycling. Consideration of the materials resource base suggests that, though our discovery potential has increased, geologic availability is the ultimate determinant and several metals may well become scarce at the same time, thus compounding the problem of substitution. With around 2 to 20 million units of engineering materials data, the use of a computer is the only logical answer for scientific selection of materials. The system developed at Aston is used for data storage, mathematical computation and output, and enables programs to be run in batch and interactive (on-line) mode. With modification, the program can also handle such variables as the quantity of mineral resources, the energy cost of materials, and the depletion and utilisation rates of strategic materials. The work also includes an in-depth study of copper recycling in the U.K. and concludes that somewhere in the region of 2 million tonnes of copper is missing from the recycling cycle. It also sets out guidelines on product design and conservation policies from the recyclability point of view.
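The screening stage of such a computerised selection system can be illustrated in a few lines of Python; the property records and thresholds below are invented examples, not data from the Aston system.

```python
# Toy sketch of property-based screening: candidate materials are filtered
# against minimum/maximum property requirements. All figures are illustrative.
materials = [
    {"name": "Al 6061",    "yield_MPa": 276, "density": 2.70, "cost_per_kg": 2.0},
    {"name": "Mild steel", "yield_MPa": 250, "density": 7.85, "cost_per_kg": 0.8},
    {"name": "Ti-6Al-4V",  "yield_MPa": 880, "density": 4.43, "cost_per_kg": 20.0},
]

def screen(records, min_yield, max_density, max_cost):
    """Return the names of materials meeting every requirement."""
    return [m["name"] for m in records
            if m["yield_MPa"] >= min_yield
            and m["density"] <= max_density
            and m["cost_per_kg"] <= max_cost]

print(screen(materials, min_yield=260, max_density=5.0, max_cost=25.0))
# -> ['Al 6061', 'Ti-6Al-4V']
```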
Abstract:
Mental-health risk assessment practice in the UK is mainly paper-based, with little standardisation in the tools that are used across the Services. The tools that are available tend to rely on minimal sets of items and unsophisticated scoring methods to identify at-risk individuals. This means the reasoning by which an outcome has been determined remains uncertain. Consequently, there is little provision for: including the patient as an active party in the assessment process, identifying underlying causes of risk, and effecting shared decision-making. This thesis develops a tool-chain for the formulation and deployment of a computerised clinical decision support system for mental-health risk assessment. The resultant tool, GRiST, will be based on consensual domain expert knowledge that will be validated as part of the research, and will incorporate a proven psychological model of classification for risk computation. GRiST will have an ambitious remit of being a platform that can be used over the Internet, by both the clinician and the layperson, in multiple settings, and in the assessment of patients with varying demographics. Flexibility will therefore be a guiding principle in the development of the platform, to the extent that GRiST will present an assessment environment that is tailored to the circumstances in which it finds itself. XML and XSLT will be the key technologies that help deliver this flexibility.
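As a minimal illustration of the XML/XSLT tailoring idea (using Python's lxml purely for convenience; the element names and the stylesheet are invented, not GRiST's schema), the same assessment document can be filtered for different audiences:

```python
from lxml import etree

# Toy assessment document with clinician- and layperson-facing questions.
assessment = etree.XML(
    "<assessment>"
    "<question audience='clinician'>Detail suicidal ideation</question>"
    "<question audience='layperson'>Have you had thoughts of harming yourself?</question>"
    "</assessment>")

# Stylesheet that keeps only the questions for the requested audience.
stylesheet = etree.XML("""<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:param name="audience"/>
  <xsl:template match="/assessment">
    <form><xsl:copy-of select="question[@audience=$audience]"/></form>
  </xsl:template>
</xsl:stylesheet>""")

transform = etree.XSLT(stylesheet)
print(transform(assessment, audience=etree.XSLT.strparam("layperson")))
```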
Abstract:
Completing projects faster than normal is always a challenge, as it often demands many paradigm shifts. Opportunities arising from globalisation, and competition from private-sector and multinational companies, are forcing the management of public sector organisations in India's petroleum industry to adopt various aggressive strategies to maintain profitability. These projects must be completed sooner than a typical schedule allows in order to remain competitive, achieve a faster return on investment and give a longer project life.