40 results for Box girder bridges Design and construction Evaluation Data processing
in Aston University Research Archive
Abstract:
SINNMR (Sonically Induced Narrowing of the Nuclear Magnetic Resonance spectra of solids) is a novel technique being developed to enable the routine study of solids by nuclear magnetic resonance spectroscopy. SINNMR aims to narrow the broad resonances that are characteristic of solid state NMR by inducing rapid incoherent motion of solid particles suspended in a support medium, using high frequency ultrasound in the range 2-10 MHz. The widths of the normal broad resonances from solids are due to incomplete averaging of several components of the total spin Hamiltonian, caused by the restrictions placed on molecular motion within a solid. At present Magic Angle Spinning (MAS) NMR is the classical solid state technique used to reduce line broadening, but this has associated problems, not least of which is the appearance of many spinning sidebands which confuse the spectra. It is hoped that SINNMR will offer a simple alternative, particularly as it does not produce spinning sidebands. The fundamental question of whether the use of ultrasound within a cryo-magnet will cause quenching has been investigated with success: even under the most extreme conditions of power, frequency and irradiation time, the magnet does not quench. The objective of this work is to design and construct a SINNMR probe for use in a superconducting cryo-magnet NMR spectrometer. A cell for such a probe has been constructed and incorporated into an adapted high resolution broadband probe. It has been proved that the cell is capable of causing cavitation at frequencies up to 10 MHz, by running a series of ultrasonic reactions within it and observing the reaction products. It was found that the ultrasound heated the sample to unacceptable temperatures, which necessitated the incorporation of temperature stabilisation devices. Work has been performed on the narrowing of the solid state 23Na spectrum of tri-sodium phosphate using high frequency ultrasound. Work has also been completed on the signal enhancement and T1 reduction of a liquid mixture and a pure compound using ultrasound. Some preliminary "bench" experiments have been completed on a novel ultrasonic device designed to help minimise sample heating. The concept involves passing the ultrasound through a temperature stabilised, liquid filled funnel with a drum skin on the end that enables the passage of ultrasound into the sample. Bench experiments have proved that acoustic attenuation is low and that cavitation in the liquid beyond the device is still possible.
Abstract:
This study is concerned with quality and productivity aspects of traditional house building. The research focuses on these issues by concentrating on the services and finishing stages of the building process. These are work stages which have not been fully investigated in previous productivity-related studies. The primary objective of the research is to promote an integrated design and construction led approach to traditional house building, based on an original concept of 'development cycles'. This process involves the following: site monitoring; the analysis of work operations; implementing design and construction changes founded on unique information collected during site monitoring; and subsequent re-monitoring to measure and assess the effect of change. A volume house building firm has been involved in this applied research and has allowed access to its sites for production monitoring purposes. The firm also assisted in design detailing for a small group of 'experimental' production houses where various design and construction changes were implemented. Results from the collaborative research have shown certain quality and productivity improvements to be possible using this approach, albeit on a limited scale at this early experimental stage. The improvements have been possible because an improved activity sampling technique, developed for and employed by the study, has been able to describe why many quality and productivity related problems occur during site building work. Experience derived from the research has shown the following attributes to be important: positive attitudes towards innovation; effective communication; careful planning and organisation; and good coordination and control at site level. These are all essential aspects of quality led management and determine to a large extent the overall success of this approach. Future work recommendations must include a more widespread use of innovative practices so that further design and construction modifications can be made. By doing this, productivity can be improved, cost savings made and better quality afforded.
Abstract:
This thesis describes the development of a complete data visualisation system for large tabular databases, such as those commonly found in a business environment. A state-of-the-art 'cyberspace cell' data visualisation technique was investigated and a powerful visualisation system using it was implemented. Although allowing databases to be explored and conclusions drawn, it had several drawbacks, the majority of which were due to the three-dimensional nature of the visualisation. A novel two-dimensional generic visualisation system, known as MADEN, was then developed and implemented, based upon a 2-D matrix of 'density plots'. MADEN allows an entire high-dimensional database to be visualised in one window, while permitting close analysis in 'enlargement' windows. Selections of records can be made and examined, and dependencies between fields can be investigated in detail. MADEN was used as a tool for investigating and assessing many data processing algorithms, firstly data-reducing (clustering) methods, then dimensionality-reducing techniques. These included a new 'directed' form of principal components analysis, several novel applications of artificial neural networks, and discriminant analysis techniques which illustrated how groups within a database can be separated. To illustrate the power of the system, MADEN was used to explore customer databases from two financial institutions, resulting in a number of discoveries which would be of interest to a marketing manager. Finally, the database of results from the 1992 UK Research Assessment Exercise was analysed. Using MADEN allowed both universities and disciplines to be graphically compared, and supplied some startling revelations, including empirical evidence of the 'Oxbridge factor'.
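MADEN itself is not publicly available, but the core idea it describes, a 2-D matrix of 'density plots' covering every pair of fields in a tabular database, can be sketched in a few lines of Python. The sketch below is a minimal illustration using NumPy and Matplotlib; the field names and data are invented, and nothing here reproduces the thesis's actual system.

```python
# Minimal sketch of a MADEN-style matrix of 2-D density plots.
# Field names and data are hypothetical; this only illustrates visualising
# every pair of fields of a high-dimensional table in one window.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
data = rng.normal(size=(5000, 4))            # stand-in for a tabular database
fields = ["age", "income", "balance", "tenure"]

n = len(fields)
fig, axes = plt.subplots(n, n, figsize=(8, 8))
for i in range(n):
    for j in range(n):
        ax = axes[i, j]
        # Each cell is a 2-D histogram ('density plot') of one field pair.
        ax.hist2d(data[:, j], data[:, i], bins=40, cmap="viridis")
        ax.set_xticks([])
        ax.set_yticks([])
        if i == n - 1:
            ax.set_xlabel(fields[j], fontsize=8)
        if j == 0:
            ax.set_ylabel(fields[i], fontsize=8)
plt.tight_layout()
plt.show()
```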
Abstract:
This thesis describes advances in the characterisation, calibration and data processing of optical coherence tomography (OCT) systems. Femtosecond (fs) laser inscription was used to produce OCT-phantoms. Transparent materials are generally inert to infrared radiation, but with fs lasers material modification occurs via non-linear processes when the highly focused light source interacts with the material. This modification is confined to the focal volume and is highly reproducible. In order to select the best inscription parameters, combinations of inscription parameters were tested, using three fs laser systems with different operating properties, on a variety of materials. This facilitated an understanding of the key characteristics of the produced structures, with the aim of producing viable OCT-phantoms. Finally, OCT-phantoms were successfully designed and fabricated in fused silica. The use of these phantoms to characterise several properties (resolution, distortion, sensitivity decay, scan linearity) of an OCT system was demonstrated. Quantitative methods were developed to support the characterisation of an OCT system from phantom images and to improve the quality of the OCT images. Characterisation methods include the measurement of the spatially variant resolution (point spread function (PSF) and modulation transfer function (MTF)), sensitivity and distortion. Processing of OCT data is computationally intensive: standard central processing unit (CPU) based processing might take several minutes to a few hours to process acquired data, so data processing is a significant bottleneck. An alternative is expensive hardware-based processing such as field programmable gate arrays (FPGAs). Recently, however, graphics processing unit (GPU) based data processing methods have been developed to minimise data processing and rendering time. These include standard processing methods, a set of algorithms that process the raw interference data obtained by the detector and generate A-scans. The work presented here describes accelerated data processing and post-processing techniques for OCT systems. The GPU-based processing developed during the PhD was later implemented in a custom-built Fourier domain optical coherence tomography (FD-OCT) system. This system currently processes and renders data in real time, with processing throughput limited by the camera capture rate. OCT-phantoms have been heavily used for the qualitative characterisation and fine tuning of the operating conditions of the OCT system, and investigations are under way to characterise OCT systems using our phantoms. The work presented in this thesis demonstrates several novel techniques for fabricating OCT-phantoms and for accelerating OCT data processing using GPUs. In the process of developing the phantoms and quantitative methods, a thorough understanding and practical knowledge of OCT and fs laser processing systems was developed. This understanding led to several pieces of research that are not only relevant to OCT but of broader importance; for example, extensive understanding of the properties of fs-inscribed structures will be useful in other photonic applications such as the fabrication of phase masks, waveguides and microfluidic channels, and the acceleration of data processing with GPUs is also useful in other fields.
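The thesis's GPU implementation is not reproduced in the abstract, but the standard FD-OCT processing chain it refers to (background subtraction, windowing, and a Fourier transform of the spectral interferogram to obtain A-scans) can be sketched with CuPy standing in for a CUDA implementation. The array shapes and steps shown are generic assumptions; a real system would also include k-space resampling and dispersion compensation.

```python
# Minimal sketch of standard FD-OCT processing on a GPU using CuPy.
# Generic steps only; not the custom pipeline described in the thesis.
import cupy as cp

def process_bscan(raw):
    """raw: (n_ascans, n_pixels) spectral interferograms from the camera."""
    raw = cp.asarray(raw, dtype=cp.float32)
    spectra = raw - raw.mean(axis=0)            # remove the DC/background term
    window = cp.hanning(raw.shape[1]).astype(cp.float32)
    spectra *= window                           # suppress FFT side lobes
    ascans = cp.fft.fft(spectra, axis=1)        # depth profile per A-scan
    half = ascans[:, : raw.shape[1] // 2]       # keep the positive-depth half
    return 20 * cp.log10(cp.abs(half) + 1e-12)  # magnitude in dB
```

Because every step is an elementwise or batched-FFT operation, a whole B-scan is processed in a handful of GPU kernel launches, which is what makes real-time rendering feasible once the camera, rather than the processing, becomes the bottleneck.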
Abstract:
In 1974 Dr D M Bramwell published his research work at the University of Aston, part of which was the establishment of an elemental work study database covering drainage construction. The Transport and Road Research Laboratory decided to extend that work as part of their continuing research programme into the design and construction of buried pipelines, by placing a research contract with Bryant Construction. This research may be considered under two broad categories. In the first, site studies were undertaken to validate and extend the database. The studies showed good agreement with the existing data, with the exception of the excavation, trench shoring and pipelaying data, which were amended to incorporate new construction plant and methods. An interactive on-line computer system for drainage estimating was developed. This system stores the elemental data, synthesises the standard time of each drainage operation, and is used to determine the required resources and construction method of the total drainage activity. The remainder of the research addressed the general topic of construction efficiency. An on-line command-driven computer system was produced. This system uses a stochastic simulation technique, based on distributions of site efficiency measurements, to evaluate the effects of varying performance levels. The analysis of this performance data quantifies the variability inherent in construction and demonstrates how some of this variability can be explained by considering the characteristics of a contract. A long-term trend of decreasing efficiency with contract duration was also identified. The results obtained from the simulation suite were compared with site records collected from current contracts. This showed that the approach gives comparable answers, but that these are greatly affected by the site performance parameters.
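The simulation suite itself is not described in enough detail to reproduce, but the underlying technique, resampling from a distribution of measured site efficiencies to predict the spread of likely contract outcomes, can be sketched as follows. The efficiency figures and standard time are invented for illustration.

```python
# Toy stochastic simulation of construction performance by resampling
# site-efficiency measurements. All numbers are invented.
import numpy as np

rng = np.random.default_rng(42)
observed_efficiency = np.array([0.55, 0.62, 0.70, 0.48, 0.66, 0.58, 0.73])
standard_hours = 1200.0        # synthesised standard time for an operation

# Resample observed efficiencies to estimate the distribution of durations.
samples = rng.choice(observed_efficiency, size=10_000)
actual_hours = standard_hours / samples

print(f"mean duration  : {actual_hours.mean():7.1f} h")
print(f"90th percentile: {np.percentile(actual_hours, 90):7.1f} h")
```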
Abstract:
Mobile technology has been one of the major growth areas in computing over recent years (Urbaczewski, Valacich, & Jessup, 2003). Mobile devices are becoming increasingly diverse and are continuing to shrink in size and weight. Although this increases the portability of such devices, their usability tends to suffer. Users report high levels of frustration regarding interaction with mobile technologies, fuelled almost entirely by lack of usability (Venkatesh, Ramesh, & Massey, 2003). This will only worsen if interaction design for mobile technologies does not continue to receive increasing research attention. For the commercial benefit of mobility and mobile commerce (m-commerce) to be fully realized, users' interaction experiences with mobile technology cannot be negative. To ensure this, it is imperative that we design the right types of mobile interaction (m-interaction); an important prerequisite for this is ensuring that users' experience meets both their sensory and functional needs (Venkatesh, Ramesh, & Massey, 2003). Given the resource disparity between mobile and desktop technologies, successful electronic commerce (e-commerce) interface design and evaluation does not necessarily equate to successful m-commerce design and evaluation. It is, therefore, imperative that the specific needs of m-commerce are addressed, both in terms of design and evaluation. This chapter begins by exploring the complexities of designing interaction for mobile technology, highlighting the effect of context on the use of such technology. It then goes on to discuss how interaction design for mobile devices might evolve, introducing alternative interaction modalities that are likely to affect that future evolution. It is impossible, within a single chapter, to consider each and every potential mechanism for interacting with mobile technologies; to provide a forward-looking flavor of what might be possible, this chapter focuses on some more novel methods of interaction and does not, therefore, look at the typical keyboard and visual display-based interaction which, in essence, stems from the desktop interaction design paradigm. Finally, this chapter touches on issues associated with effective evaluation of m-interaction and mobile application designs. By highlighting some of the issues and possibilities for novel m-interaction design and evaluation, we hope that future designers will be encouraged to "think out of the box" in terms of their designs and evaluation strategies.
Abstract:
The evaluation of geospatial data quality and trustworthiness presents a major challenge to geospatial data users when making a dataset selection decision. The research presented here therefore focused on defining and developing a GEO label – a decision support mechanism to assist data users in efficient and effective geospatial dataset selection on the basis of quality, trustworthiness and fitness for use. This thesis thus presents six phases of research and development conducted to: (1) identify the informational aspects upon which users rely when assessing geospatial dataset quality and trustworthiness; (2) elicit initial user views on the GEO label role in supporting dataset comparison and selection; (3) evaluate prototype label visualisations; (4) develop a Web service to support GEO label generation; (5) develop a prototype GEO label-based dataset discovery and intercomparison decision support tool; and (6) evaluate the prototype tool in a controlled human-subject study. The results of the studies revealed, and subsequently confirmed, eight geospatial data informational aspects that were considered important by users when evaluating geospatial dataset quality and trustworthiness, namely: producer information, producer comments, lineage information, compliance with standards, quantitative quality information, user feedback, expert reviews, and citations information. Following an iterative user-centred design (UCD) approach, it was established that the GEO label should visually summarise availability and allow interrogation of these key informational aspects. A Web service was developed to support generation of dynamic GEO label representations and integrated into a number of real-world GIS applications. The service was also utilised in the development of the GEO LINC tool – a GEO label-based dataset discovery and intercomparison decision support tool. The results of the final evaluation study indicated that (a) the GEO label effectively communicates the availability of dataset quality and trustworthiness information and (b) GEO LINC successfully facilitates 'at a glance' dataset intercomparison and fitness for purpose-based dataset selection.
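The Web service's actual interface is not given in the abstract; the sketch below only illustrates the underlying idea of summarising, for one dataset, the availability of the eight informational aspects that a GEO label visualises. The data structure, function and example record are hypothetical.

```python
# Hypothetical sketch of a GEO-label-style availability summary over the
# eight informational aspects identified in the study. The record and the
# function are invented; the real Web service's API is not reproduced here.
ASPECTS = [
    "producer_information", "producer_comments", "lineage_information",
    "compliance_with_standards", "quantitative_quality_information",
    "user_feedback", "expert_reviews", "citations_information",
]

def geo_label_summary(metadata: dict) -> dict:
    """Map each aspect to 'available' or 'not available' for display."""
    return {a: ("available" if metadata.get(a) else "not available")
            for a in ASPECTS}

record = {"producer_information": "Example Agency", "user_feedback": ["useful"]}
for aspect, status in geo_label_summary(record).items():
    print(f"{aspect:33s} {status}")
```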
Abstract:
The primary goal of this research is to design and develop an education technology to support learning in global operations management. The research implements a series of studies to determine the right balance among user requirements, learning methods and applied technologies, from a student-centred learning perspective. This research is multidisciplinary by nature, involving topics from various disciplines such as global operations management, curriculum and contemporary learning theory, and computer-aided learning. Innovative learning models that emphasise technological implementation are employed and discussed throughout this research.
Abstract:
The purpose of this research is to propose a procurement system that works across disciplines and shares retrieved information with the relevant parties, so as to achieve better co-ordination between the supply and demand sides. This paper demonstrates how to analyze data with an agent-based procurement system (APS) to re-engineer and improve the existing procurement process. The intelligent agents take responsibility for searching for potential suppliers, negotiating with the short-listed suppliers, and evaluating the performance of suppliers against the selection criteria using a mathematical model. Manufacturing firms and trading companies spend more than half of their sales dollar on the purchase of raw materials and components. Efficient data collection with high accuracy is one of the key success factors for quality procurement, that is, purchasing the right material at the right quality from the right suppliers. In general, enterprises spend a significant amount of resources on data collection and storage, but too little on facilitating data analysis and sharing. To validate the feasibility of the approach, a case study on a manufacturing small and medium-sized enterprise (SME) has been conducted. APS supports data and information analysis techniques to facilitate decision making, such that the agent can improve negotiation and supplier evaluation efficiency by saving time and cost.
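The abstract does not specify the APS's mathematical model; a common choice for this kind of supplier evaluation is a weighted-sum score over the selection criteria, sketched below with invented criteria, weights and scores.

```python
# Hedged sketch of supplier evaluation by weighted scoring. Criteria,
# weights and scores are invented; the APS's actual model is not given
# in the abstract.
criteria_weights = {"price": 0.4, "quality": 0.3, "delivery": 0.2, "service": 0.1}

suppliers = {  # criterion scores normalised to the range 0..1
    "Supplier A": {"price": 0.8, "quality": 0.7, "delivery": 0.9, "service": 0.6},
    "Supplier B": {"price": 0.6, "quality": 0.9, "delivery": 0.7, "service": 0.8},
}

def weighted_score(scores: dict) -> float:
    return sum(w * scores[c] for c, w in criteria_weights.items())

ranked = sorted(suppliers, key=lambda s: weighted_score(suppliers[s]), reverse=True)
for s in ranked:
    print(f"{s}: {weighted_score(suppliers[s]):.2f}")
```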
Abstract:
The key to the correct application of ANOVA is careful experimental design and matching the correct analysis to that design. The following points should therefore be considered before designing any experiment: 1. In a single factor design, ensure that the factor is identified as a 'fixed' or 'random effect' factor. 2. In more complex designs with more than one factor, there may be a mixture of fixed and random effect factors present, so ensure that each factor is clearly identified. 3. Where replicates can be grouped or blocked, the advantages of a randomised blocks design should be considered. There should be evidence, however, that blocking can reduce the error variation sufficiently to counter the loss of DF compared with a fully randomised design. 4. Where different treatments are applied sequentially to a patient, the advantages of a three-way design in which the different orders of the treatments are included as an 'effect' should be considered. 5. Combining different factors to make a more efficient experiment and to measure possible factor interactions should always be considered. 6. The effect of 'internal replication' should be taken into account in a factorial design when deciding the number of replications to be used. Where possible, each error term of the ANOVA should have at least 15 DF. 7. Consider carefully whether a particular factorial design should be treated as a split-plot or a repeated measures design. If such a design is appropriate, consider how to continue the analysis, bearing in mind the problem of using post hoc tests in this situation.
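As a concrete illustration of point 5, combining factors and testing their interaction, the sketch below fits a two-factor ANOVA with an interaction term using statsmodels. The data are simulated and the factor names invented; it is a minimal example of matching the analysis to the design, not a prescription for any particular experiment.

```python
# Minimal two-factor ANOVA with interaction (illustrating point 5 above).
# Data are simulated; factor names are invented for the example.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
df = pd.DataFrame({
    "treatment": np.repeat(["A", "B"], 20),           # fixed effect factor
    "block": np.tile(np.repeat(["I", "II"], 10), 2),  # blocking factor
})
df["response"] = rng.normal(10, 1, 40) + (df["treatment"] == "B") * 1.5

model = smf.ols("response ~ C(treatment) * C(block)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))   # ANOVA table including interaction
```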
Abstract:
The objective of this work was to design, construct and commission a new ablative pyrolysis reactor and a high efficiency product collection system. The reactor was to have a nominal throughput of 10 kg/hr of dry biomass and be inherently scalable up to an industrial-scale application of 10 tonnes/hr. The whole process consists of a bladed ablative pyrolysis reactor, two high efficiency cyclones for char removal, and a disk and doughnut quench column combined with a wet walled electrostatic precipitator, mounted directly on top, for liquids collection. In order to aid design and scale-up calculations, detailed mathematical modelling of the reaction system was undertaken, enabling sizes, efficiencies and operating conditions to be determined. Specifically, a modular approach was taken due to the iterative nature of some of the design methodologies, with the output from one module being the input to the next. Separate modules were developed for the determination of the biomass ablation rate, specification of the reactor capacity, cyclone design, quench column design and electrostatic precipitator design. These models enabled a rigorous design protocol to be developed, capable of specifying the required reactor and product collection system size for specified biomass throughputs, operating conditions and collection efficiencies. The reactor proved capable of generating an ablation rate of 0.63 mm/s for pine wood at a temperature of 525 °C, with a relative velocity between the heated surface and the reacting biomass particle of 12.1 m/s. The reactor achieved a maximum throughput of 2.3 kg/hr, which was the maximum the biomass feeder could supply. The reactor is capable of being operated at a far higher throughput, but this would require a new feeder and drive motor to be purchased. Modelling showed that the reactor is capable of achieving a throughput of approximately 30 kg/hr. This should be considered in future work, as the reactor is currently operating well below its theoretical maximum. Calculations show that the current product collection system could operate efficiently up to a maximum feed rate of 10 kg/hr, provided the inert gas supply was adjusted accordingly to keep the vapour residence time in the electrostatic precipitator above one second. Operation above 10 kg/hr would require some modifications to the product collection system. Eight experimental runs were documented and considered successful; more were attempted but had to be abandoned due to equipment failure. This does not detract from the fact that the reactor and product collection system design was extremely efficient. The maximum total liquid yield was 64.9% on a dry wood fed basis. It is considered that the liquid yield would have been higher had there been sufficient development time to overcome certain operational difficulties, and if longer operating runs had been attempted to offset the product losses that occur when collecting all available product from a large scale collection unit. The liquids collection system was highly efficient: modelling determined a liquid collection efficiency of above 99% on a mass basis, and this was validated by a dry ice/acetone condenser and a cotton wool filter downstream of the collection unit, which enabled mass measurement of the condensable product exiting the product collection unit and confirmed a collection efficiency in excess of 99% on a mass basis.
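The design modules themselves are not reproduced in the abstract, but the link between the measured ablation rate and reactor capacity can be illustrated with a first-order mass balance: throughput is roughly ablation rate × contact area × wood density. The contact area and density below are invented/typical values, not figures from the thesis.

```python
# First-order illustrative estimate only; the thesis's actual capacity
# module is not reproduced here.
ablation_rate = 0.63e-3   # m/s, measured for pine at 525 degC (from abstract)
wood_density = 500.0      # kg/m^3, typical for pine (assumption)
contact_area = 25e-4      # m^2 pressed on the heated surface (invented)

throughput = ablation_rate * wood_density * contact_area   # kg/s
print(f"estimated throughput: {throughput * 3600:.1f} kg/hr")   # ~2.8 kg/hr
```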
Abstract:
This thesis describes the design and engineering of a pressurised biomass gasification test facility. A detailed examination of the major elements within the plant has been undertaken in relation to specification of equipment, evaluation of options and final construction. The retrospective project assessment was developed from consideration of relevant literature and theoretical principles. The literature review includes a discussion on legislation and applicable design codes. From this analysis, each of the necessary equipment units was reviewed and important design decisions and procedures highlighted and explored. Particular emphasis was placed on examination of the stringent demands of the ASME VIII design codes. The inter-relationship of functional units was investigated and areas of deficiency, such as biomass feeders and gas cleaning, have been commented upon. Finally, plant costing was summarized in relation to the plant design and proposed experimental programme. The main conclusion drawn from the study is that pressurised gasification of biomass is far more difficult and expensive to support than atmospheric gasification. A number of recommendations have been made regarding future work in this area.
Herbal medicines: physician's recommendation and clinical evaluation of St. John's Wort for depression
Abstract:
Why some physicians recommend herbal medicines while others do not is not well understood. We undertook a survey designed to identify factors which predict the recommendation of herbal medicines by physicians in Malaysia. About a third (206 out of 626) of the physicians working at the University of Malaya Medical Centre were interviewed face-to-face using a structured questionnaire. Physicians were asked about their personal use of, recommendation of, and perceived interest in herbal medicines, as well as their perceived usefulness and safety. Using logistic regression modelling we identified personal use, general interest, interest in receiving training, race and higher level of medical training as significant predictors of recommendation. St. John's wort is one of the most widely used herbal remedies. It is also probably the most widely evaluated herbal remedy, with no fewer than 57 randomised controlled trials. Evidence from the depression trials suggests that St. John's wort is more effective than placebo, while its efficacy relative to conventional antidepressants is not well established. We updated previous meta-analyses of St. John's wort, described the characteristics of the included trials, applied methods of data imputation and transformation for incomplete trial data, and examined sources of heterogeneity in the design and results of those trials. Thirty randomised controlled trials, which were heterogeneous in design, were identified. Our meta-analysis showed that St. John's wort was significantly more effective than placebo [pooled RR 1.90 (1.54 to 2.35)] and [pooled WMD 4.09 (2.33 to 5.84)]. However, the remedy was similar to conventional antidepressants in its efficacy [pooled RR 1.01 (0.93 to 1.10)] and [pooled WMD 0.18 (-0.66 to 1.02)]. Subgroup analyses of the placebo-controlled trials suggested that the use of different diagnostic classifications at the inclusion stage led to different estimates of effect. Similarly, a significant difference in the estimates of efficacy was observed when trials were categorised according to length of follow-up. Confounding between the variables diagnostic classification and length of trial was shown by loglinear analysis. Despite extensive study, there is still no consensus on how effective St. John's wort is in depression, though most experts would agree that it has some effect. Our meta-analysis highlights the problems associated with the clinical evaluation of herbal medicines when the active ingredients are poorly defined or unknown. The problem is compounded when the target disease (e.g. depression) is also difficult to define, and different instruments are available to diagnose and evaluate it.
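The pooled relative risks quoted above come from standard meta-analytic methods; the sketch below shows generic fixed-effect inverse-variance pooling of log relative risks on invented trial data. It is not the thesis's dataset or its exact method, which also handled data imputation and heterogeneity.

```python
# Generic fixed-effect inverse-variance pooling of log relative risks.
# Trial data are invented; this is not the thesis's dataset.
import numpy as np

rr = np.array([1.8, 2.1, 1.5, 2.4])             # per-trial relative risks
se_log_rr = np.array([0.20, 0.25, 0.18, 0.30])  # standard errors of log(RR)

w = 1.0 / se_log_rr**2                          # inverse-variance weights
pooled_log = np.sum(w * np.log(rr)) / np.sum(w)
pooled_se = np.sqrt(1.0 / np.sum(w))

lo = np.exp(pooled_log - 1.96 * pooled_se)
hi = np.exp(pooled_log + 1.96 * pooled_se)
print(f"pooled RR {np.exp(pooled_log):.2f} (95% CI {lo:.2f} to {hi:.2f})")
```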
Abstract:
The fossil arthropod Class Trilobita is characterised by the possession of a highly mineralised dorsal exoskeleton with an incurved marginal flange (doublure). This cuticle is usually the only part of the organism to be preserved. Despite the common occurrence of trilobites in Palaeozoic sediments, the original exoskeletal mineralogy has not been determined previously. Petrographic data involving over seventy trilobite species, ranging in age from Cambrian to Devonian, together with atomic absorption and stable isotope analyses, indicate a primary low-magnesian calcite composition. Trilobite cuticles exhibit a variety of preservational textures which are related to the different diagenetic realms through which they have passed. A greater knowledge of post-depositional processes and the specific features they produce, has enabled post-mortem artefacts to be distinguished from primary cuticular microstructures. Alterations of the cuticle can either enhance or destroy primary features, and their effects are best observed in thin-sections, both under transmitted light and cathodoluminescence. Well-preserved trilobites often retain primary microstructures such as laminations, canals, and tubercles. These have been examined in stained thin-sections and by scanning electron microscopy, from as wide a range of trilobites as possible. Construction of sensory field maps has shown that although the basic organisation of the exoskeleton is the same in all trilobites, the types of microstructures found, and their distribution is species-specific. The composition, microstructure, and architecture of the trilobite exoskeleton have also been studied from a biomechanical viewpoint. Total cuticle thickness, and the relative proportions of the different layers, together with the overall architecture all affected the mechanical properties of the exoskeleton.
Abstract:
Purine and pyrimidine triplex-forming oligonucleotides (TFOs), as potential antibacterial agents, were designed to bind by Hoogsteen and reverse Hoogsteen hydrogen bonds in a sequence-specific manner in the major groove of genomic DNA, at specific polypurine sites within the gyrA gene of E. coli and S. pneumoniae. Sequences were prepared by automated synthesis, with purification and characterisation by high performance liquid chromatography, capillary electrophoresis and mass spectrometry. Triplex stability was assessed using melting curves, in which the binding of the third strand to the duplex target was assessed over a temperature range of 0-80°C, at pH 6.4 and 7.2. The most successful of the unmodified TFOs (6) showed a Tm value of 26°C at both pH values, with binding via reverse Hoogsteen bonds. Binding to genomic DNA was also demonstrated by spectrofluorimetry using fluorescein-labelled TFOs, from which dissociation constants were determined. Modifications in the form of 5mC, 5' acridine attachment, phosphorothioation, 2'-O-methylation and phosphoramidation were made in order to increase Tm values. Phosphoramidate modification was the most successful, increasing Tm values to 42°C; however, the final purity of these sequences was poor due to their difficult syntheses. FACS (fluorescence-activated cell sorting) analysis was used to determine the uptake of a fluorescently labelled analogue of 6 via passive, cold shock mediated, and anionic liposome aided uptake, at 20°C and 37°C. At both temperatures, anionic lipid-mediated uptake produced unrivalled fluorescence, equivalent to 20 and 43% at 20 and 37°C respectively. Antibacterial activity of each oligonucleotide was assessed by viable count analysis relying on passive uptake, cold shocking techniques, chlorpromazine-mediated uptake, and cationic and anionic lipid-aided uptake. All oligonucleotides were assessed for their ability to enhance uptake, which is a major barrier to the effectiveness of these agents. Compound 6 under cold shocking conditions produced the greatest consistent decline in colony forming units per ml, although results for this compound were sometimes variable, indicating inconsistent uptake by this particular assay method.