7 results for Analysis software
in Aston University Research Archive
Abstract:
Microbial transglutaminase (mTGase) is an enzyme that introduces a covalent bond between peptide-bound glutamine and lysine residues. Proteins cross-linked in this manner are often more resistant to proteolytic degradation and show increased tensile strength. This study evaluates the effects of mTGase-mediated cross-linking of collagen on the cellular morphology, behaviour and viability of murine 3T3 fibroblasts following their seeding into collagen scaffolds. Additionally, cell-mediated scaffold contraction, porosity and the level of cross-linking of the scaffold have been analysed using image analysis software, scanning electron microscopy (SEM), colorimetric assays, and Fourier transform infrared spectroscopy (FTIR). We demonstrate that biocompatibility and cellular morphology remained unaffected when cultures of fibroblasts integrated in mTGase cross-linked collagen scaffolds were compared with their native collagen counterparts. We also show that the structural characteristics of collagen were preserved while enzymatically resistant covalent bonds were introduced.
Abstract:
This doctoral research project examines the effects that geographical transience has on Royal Air Force families. The methodology employed in this exploratory and qualitative study consisted largely of open-ended interview questions but also included a series of demographic variables. In total, 29 RAF personnel without families, 33 RAF personnel with families, 33 RAF spouses, and 15 RAF children participated in this research (N = 110). All respondents volunteered to take part in the study and were based in the United Kingdom at the time of data collection. The interviews were transcribed and content coded according to six major relocation themes arising from the literature (change, tasks, support, coping, difficulty, and outcome). QSR NVIVO 2.0, a qualitative data analysis software package, was used to facilitate the process. Through the use of qualitative methodology, the researcher was able to identify various novel and recurring variables that appear to play an important role (at least subjectively) in relocation. Additionally, frequencies associated with these factors were presented. The findings were integrated with those from the literature in order to offer an initial comparison and differentiation between civilian and military samples. The main theoretical contributions were the introduction of the concept of mobile mentality, the creation of a novel relocation model that takes familial interaction into account, and the development of a taxonomy for the classification of relocation outcomes. Finally, additional observations, recommendations for future research, and practical implications are reviewed.
Abstract:
The ability to measure ocular surface temperature (OST) with thermal imaging offers potential insight into ocular physiology that has been acknowledged in the literature. The TH7102MX thermo-camera (NEC San-ei, Japan) continuously records dynamic information about OST without sacrificing spatial resolution. Using purpose-designed image analysis software, it was possible to select and quantify the principal components of absolute temperature values and the magnitude and rate of the temperature change that followed blinking. The technique was examined for repeatability, reproducibility and the effects of extrinsic factors: a suitable experimental protocol was thus developed. The precise source of the measured thermal radiation has previously been subject to dispute: in this thesis, the results of a study examining the relationships between physical parameters of the anterior eye and OST confirmed a principal role for the tear film in OST. The dynamic changes in OST were studied in a large group of young subjects: quantifying the post-blink changes in temperature with time also established a role for tear flow dynamics in OST. Using dynamic thermography, the effects of hydrogel contact lens wear on OST were investigated using a model eye for in vitro work, and both neophyte and adapted contact lens wearers for in vivo studies. Significantly greater OST was observed in contact lens wearers, particularly with silicone hydrogel lenses compared to etafilcon A, and it tended to be greatest when lenses had been worn continuously. This finding is important to understanding the ocular response to contact lens wear. In a group of normal subjects, dynamic thermography appeared to measure the ocular response to the application of artificial tear drops: this may prove to be a significant research and clinical tool.
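As a rough illustration of the kind of post-blink quantification described above (a minimal sketch only, not the purpose-designed software used in the thesis; the function name, parameters and example data are hypothetical), the following Python snippet estimates the magnitude and initial rate of OST change from a sampled temperature time series recorded after a blink.

import numpy as np

def post_blink_metrics(temps_c, fps, window_s=2.0):
    """Estimate the magnitude and initial rate of ocular surface
    temperature (OST) change after a blink.

    temps_c : 1-D array of mean OST samples (deg C), first sample at the blink
    fps     : frame rate of the thermo-camera (frames per second)
    window_s: seconds over which the initial rate of change is fitted
    """
    temps_c = np.asarray(temps_c, dtype=float)
    t = np.arange(len(temps_c)) / fps          # time axis in seconds

    # Magnitude of change: temperature immediately after the blink
    # minus the value it settles to at the end of the record.
    magnitude = temps_c[0] - temps_c[-1]

    # Initial rate: slope of a straight-line fit over the first window_s
    # seconds (deg C per second, negative for post-blink cooling).
    n = max(2, int(window_s * fps))
    rate = np.polyfit(t[:n], temps_c[:n], 1)[0]
    return magnitude, rate

# Example: a 25 Hz recording cooling from 36.0 towards 35.4 deg C over 8 s
samples = 36.0 - 0.6 * (1 - np.exp(-np.arange(0, 8, 0.04) / 2.5))
print(post_blink_metrics(samples, fps=25))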
Abstract:
This thesis describes an investigation into methods for controlling the mode distribution in multimode optical fibres. The major contributions presented in this thesis are summarised below. Emerging standards for Gigabit Ethernet transmission over multimode optical fibre have led to a resurgence of interest in the precise control, and specification, of modal launch conditions. In particular, commercial LED and OTDR test equipment does not, in general, comply with these standards. There is therefore a need for mode control devices which can ensure compliance with the standards. A novel device consisting of a point-load mode-scrambler in tandem with a mode-filter is described in this thesis. The device, which has been patented, may be tuned to achieve a wide range of mode distributions and has been implemented in a ruggedised package for field use. Various other techniques for mode control are described in this work, including the use of Long Period Gratings and air-gap mode-filters. Some of the methods have been applied to other applications, such as speckle suppression and sensor technology. A novel, self-referencing sensor comprising two modal groups in the Mode Power Distribution has been designed and tested. The feasibility of a two-channel Mode Group Diversity Multiplexed system has been demonstrated over 985 m. A test apparatus for measuring mode distribution has been designed and constructed. The apparatus consists of a purpose-built video microscope and comprehensive control and analysis software written in Visual Basic. The system may be fitted with a silicon camera or an InGaAs camera, for measurement in the 850 nm and 1300 nm transmission windows respectively. A limitation of the measurement method, when applied to well-filled fibres, has been identified and an improvement to the method has been proposed, based on modelled Laguerre-Gauss field solutions.
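The Laguerre-Gauss field solutions mentioned above follow a standard form; as a minimal sketch only (not the thesis's model or its Visual Basic software, and with hypothetical function names), the following Python snippet evaluates the normalised intensity profile of an LG(p, l) mode, the building block such a modelled field solution would use.

import numpy as np
from math import factorial
from scipy.special import genlaguerre

def lg_intensity(p, l, r, w):
    """Normalised intensity profile of a Laguerre-Gauss LG(p, l) mode
    with 1/e^2 radius w, evaluated at radial positions r (same units as w)."""
    x = 2.0 * r**2 / w**2
    norm = 2.0 * factorial(p) / (np.pi * w**2 * factorial(p + abs(l)))
    # Intensity ~ x^|l| * [L_p^|l|(x)]^2 * exp(-x), with x = 2 r^2 / w^2
    return norm * x**abs(l) * genlaguerre(p, abs(l))(x)**2 * np.exp(-x)

# Example: compare the fundamental LG(0,0) with a higher-order LG(3,2) mode
r = np.linspace(0.0, 30e-6, 200)           # radius in metres
profile_00 = lg_intensity(0, 0, r, w=10e-6)
profile_32 = lg_intensity(3, 2, r, w=10e-6)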
Abstract:
Nanoindentation has become a common technique for measuring the hardness and elastic-plastic properties of materials, including coatings and thin films. In recent years, different nanoindenter instruments have been commercialised and used for this purpose. Each instrument is equipped with its own analysis software for the derivation of the hardness and reduced Young's modulus from the raw data. These data are mostly analysed through the Oliver and Pharr method. In all cases, the calibration of compliance and area function is mandatory. The present work illustrates and describes a calibration procedure and an approach to raw data analysis carried out for six different nanoindentation instruments through several round-robin experiments. Three different indenters were used (Berkovich, cube corner and spherical), and three standardised reference samples were chosen (hard fused quartz, soft polycarbonate, and sapphire). It was clearly shown that the use of these common procedures consistently limited the spread of the hardness and reduced Young's modulus data compared to the same measurements performed using instrument-specific procedures. The following recommendations for nanoindentation calibration must be followed: (a) use only sharp indenters, (b) set an upper cut-off value for the penetration depth below which measurements must be considered unreliable, (c) perform nanoindentation measurements with limited thermal drift, (d) ensure that the load-displacement curves are as smooth as possible, (e) perform stiffness measurements specific to each instrument/indenter couple, (f) use fused quartz (Fq) and sapphire (Sa) as calibration reference samples for stiffness and area function determination, (g) use a function, rather than a single value, for the stiffness and (h) adopt a unique protocol and software for raw data analysis in order to limit the data spread related to the instruments (i.e. the level of drift or noise, defects of a given probe) and to make the H and Er data intercomparable. © 2011 Elsevier Ltd.
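The Oliver and Pharr method referred to above is widely documented; as a minimal sketch under simplifying assumptions (an ideal Berkovich area function and default geometry constants, not the round-robin protocol or any instrument's software; the function name and defaults are hypothetical), the following Python snippet derives hardness H and reduced modulus Er from an unloading load-displacement curve.

import numpy as np
from scipy.optimize import curve_fit

def oliver_pharr(h_unload, p_unload, area_fn=lambda hc: 24.5 * hc**2,
                 eps=0.75, beta=1.05):
    """Minimal Oliver-Pharr analysis of a nanoindentation unloading branch.

    h_unload, p_unload : depth (m) and load (N) samples of the unloading branch
    area_fn            : projected area function A(hc); ideal Berkovich by default
    eps, beta          : geometry constants (about 0.75 and 1.05 for a Berkovich tip)
    Returns hardness H (Pa) and reduced modulus Er (Pa).
    """
    h_max, p_max = h_unload.max(), p_unload.max()

    # Fit the unloading branch with the power law P = B * (h - hf)^m
    def power_law(h, B, hf, m):
        return B * np.clip(h - hf, 0.0, None)**m
    (B, hf, m), _ = curve_fit(power_law, h_unload, p_unload,
                              p0=(p_max / h_max, 0.5 * h_max, 1.5))

    S = B * m * (h_max - hf)**(m - 1)          # contact stiffness dP/dh at h_max
    hc = h_max - eps * p_max / S               # contact depth
    A = area_fn(hc)                            # projected contact area
    H = p_max / A                              # hardness
    Er = np.sqrt(np.pi) * S / (2.0 * beta * np.sqrt(A))  # reduced modulus
    return H, Er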
Abstract:
The book aims to introduce the reader to DEA in the most accessible manner possible. It is specifically aimed at those who have had no prior exposure to DEA and wish to learn its essentials, how it works, its key uses, and the mechanics of using it. The latter will include using DEA software. Students on degree or training courses will find the book especially helpful. The same is true of practitioners engaging in comparative efficiency assessments and performance management within their organisation. Examples are used throughout the book to help the reader consolidate the concepts covered. Table of contents: List of Tables. List of Figures. Preface. Abbreviations. 1. Introduction to Performance Measurement. 2. Definitions of Efficiency and Related Measures. 3. Data Envelopment Analysis Under Constant Returns to Scale: Basic Principles. 4. Data Envelopment Analysis under Constant Returns to Scale: General Models. 5. Using Data Envelopment Analysis in Practice. 6. Data Envelopment Analysis under Variable Returns to Scale. 7. Assessing Policy Effectiveness and Productivity Change Using DEA. 8. Incorporating Value Judgements in DEA Assessments. 9. Extensions to Basic DEA Models. 10. A Limited User Guide for Warwick DEA Software. Author Index. Topic Index. References.
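As a minimal sketch of the basic constant-returns-to-scale model covered in the book (an illustration only, unrelated to the Warwick DEA Software; the function name and example data are hypothetical), the following Python snippet computes input-oriented CCR efficiency scores by solving the standard envelopment linear program for each decision-making unit.

import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR (constant returns to scale) efficiency of DMU o.

    X : (m, n) array of m inputs for n decision-making units
    Y : (s, n) array of s outputs for n decision-making units
    o : index of the unit being assessed
    Returns the efficiency score theta in (0, 1].
    """
    m, n = X.shape
    s = Y.shape[0]
    # Decision variables: [theta, lambda_1, ..., lambda_n]
    c = np.r_[1.0, np.zeros(n)]                       # minimise theta

    # Input constraints:  sum_j lambda_j * x_ij - theta * x_io <= 0
    A_in = np.hstack([-X[:, [o]], X])
    b_in = np.zeros(m)
    # Output constraints: -sum_j lambda_j * y_rj <= -y_ro
    A_out = np.hstack([np.zeros((s, 1)), -Y])
    b_out = -Y[:, o]

    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[b_in, b_out],
                  bounds=[(0, None)] * (n + 1), method="highs")
    return res.x[0]

# Example: 4 units, 2 inputs, 1 output
X = np.array([[2.0, 3.0, 6.0, 4.0],
              [5.0, 4.0, 8.0, 3.0]])
Y = np.array([[10.0, 12.0, 15.0, 8.0]])
print([round(ccr_efficiency(X, Y, j), 3) for j in range(4)])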
Abstract:
Developers of interactive software are confronted by an increasing variety of software tools to help engineer the interactive aspects of software applications. Not only do these tools fall into different categories in terms of functionality, but within each category there is a growing number of competing tools with similar, although not identical, features. Choice of user interface development tool (UIDT) is therefore becoming increasingly complex.