35 results for Processing methods
in Aston University Research Archive
Abstract:
Queueing theory is an effective tool in the analysis of computer communication systems. Many results in queueing analysis have been derived in the form of Laplace and z-transform expressions. Accurate inversion of these transforms is very important in the study of computer systems, but the inversion is very often difficult. In this thesis, methods for solving some of these queueing problems by use of digital signal processing techniques are presented. The z-transform of the queue length distribution for the M/G^Y/1 system is derived. Two numerical methods for the inversion of the transform, together with the standard numerical technique for solving transforms with multiple queue-state dependence, are presented. Bilinear and Poisson transform sequences are presented as useful ways of representing continuous-time functions in numerical computations.
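The abstract does not reproduce the thesis's own inversion methods, but a minimal sketch of one standard DSP route to numerical z-transform inversion (sampling the transform on a circle inside the unit disc and applying an FFT) may clarify the idea. This assumes NumPy; the function name, radius and sample count are illustrative choices, and the M/M/1 generating function is used only as a known test case.

```python
import numpy as np

def invert_z_transform(P, n_terms, radius=0.9, n_samples=1024):
    """Recover p[0], ..., p[n_terms-1] from a probability generating
    function P(z) = sum_n p[n] z**n by sampling P on a circle of the
    given radius and applying a DFT (Cauchy's integral formula
    evaluated with the trapezoidal rule)."""
    k = np.arange(n_samples)
    z = radius * np.exp(2j * np.pi * k / n_samples)  # sample points on the circle
    samples = P(z)
    # The DFT recovers p[n] * radius**n up to a tiny aliasing error,
    # so undo the radius weighting term by term.
    coeffs = np.fft.fft(samples) / n_samples
    return (coeffs[:n_terms] / radius ** np.arange(n_terms)).real

# Known test case: the M/M/1 queue-length distribution p[n] = (1 - rho) * rho**n,
# whose generating function is (1 - rho) / (1 - rho * z).
rho = 0.5
p = invert_z_transform(lambda z: (1 - rho) / (1 - rho * z), n_terms=8)
print(p)  # approximately [0.5, 0.25, 0.125, ...]
```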
Abstract:
One of the main objectives of this study was to functionalise various rubbers (i.e. ethylene propylene copolymer (EP), ethylene propylene diene terpolymer (EPDM), and natural rubber (NR)) using the functional monomers maleic anhydride (MA) and glycidyl methacrylate (GMA) via reactive processing routes. The functionalisation of the rubber was carried out via different reactive processing methods in an internal mixer. GMA was free-radically grafted onto EP and EPDM in the melt state in the absence and presence of a comonomer, trimethylolpropane triacrylate (TRIS). To optimise the grafting conditions and the compositions, the effects of various parameters on the grafting yields and the extent of side reactions were investigated. Precipitation and Soxhlet extraction methods were established to purify the GMA-modified rubbers, and the grafting degree was determined by FTIR and titration. It was found that without TRIS the grafting degree of GMA increased with increasing peroxide concentration. However, grafting was low, and the homopolymerisation of GMA and crosslinking of the polymers were identified as the main side reactions competing with the desired grafting reaction for EP and EPDM, respectively. The use of the tri-functional comonomer, TRIS, was shown to greatly enhance the GMA grafting and reduce the side reactions, as seen in the higher GMA grafting degree, smaller alteration of the rheological properties of the polymer substrates and very little formation of polyGMA. The grafting mechanisms were investigated. MA was grafted onto NR using both thermal initiation and peroxide initiation. The results showed clearly that the reaction of MA with NR could be thermally initiated above 140°C in the absence of peroxide. At a preferable temperature of 200°C, the grafting degree increased with increasing MA concentration. The grafting reaction could also be initiated with peroxide. It was found that 2,5-dimethyl-2,5-bis(tert-butylperoxy)hexane (T101) was a suitable peroxide to initiate the reaction efficiently above 150°C. The second objective of the work was to utilise the functionalised rubbers in a second step to achieve in-situ compatibilisation of blends based on poly(ethylene terephthalate) (PET), in particular with GMA-grafted-EP and -EPDM; the reactive blending was carried out in an internal mixer. The effects of the GMA grafting degree, the viscosities of GMA-grafted-EP and -EPDM, and the presence of polyGMA in the rubber samples on the compatibilisation of PET blends in terms of morphology, dynamic mechanical properties and tensile properties were investigated. It was found that the GMA-modified rubbers were very efficient in compatibilising the PET blends, as supported by the much finer morphology and the better tensile properties. The evidence obtained from the analysis of the PET blends strongly supports the formation of copolymers through interfacial reactions between the grafted epoxy group in the GMA-modified rubber and the terminal groups of PET in the blends.
Abstract:
The trend in modal extraction algorithms is to use all the available frequency response function data to obtain a global estimate of the natural frequencies, damping ratios and mode shapes. Improvements in transducer and signal processing technology allow the simultaneous measurement of many hundreds of channels of response data. The quantity of data available and the complexity of the extraction algorithms make considerable demands on the available computing power and require a powerful computer or dedicated workstation to perform satisfactorily. An alternative to waiting for faster sequential processors is to implement the algorithm in parallel, for example on a network of Transputers. Parallel architectures are a cost-effective means of increasing computational power, and a larger number of response channels would simply require more processors. This thesis considers how two typical modal extraction algorithms, the Rational Fraction Polynomial method and the Ibrahim Time Domain method, may be implemented on a network of Transputers. The Rational Fraction Polynomial method is a well-known and robust frequency-domain 'curve fitting' algorithm. The Ibrahim Time Domain method is an efficient algorithm that 'curve fits' in the time domain. This thesis reviews the algorithms, considers the problems involved in a parallel implementation, and shows how they were implemented on a real Transputer network.
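As a rough illustration of the time-domain idea, here is a toy single-channel sketch in the spirit of the Ibrahim Time Domain method, not the thesis's multi-channel Transputer implementation: delayed copies of a free-decay response are stacked into blocks, a least-squares shift equation is solved, and modes are read off the eigenvalues. Function name and parameters are illustrative; NumPy is assumed.

```python
import numpy as np

def itd_modal_fit(y, dt, order=4):
    """Simplified ITD-style fit for one free-decay channel: build delayed
    Hankel blocks, solve A @ X0 ~= X1 in a least-squares sense, and
    extract frequencies and damping ratios from the eigenvalues of A."""
    n = len(y) - order
    X0 = np.array([y[i:i + n] for i in range(order)])          # delays 0..order-1
    X1 = np.array([y[i + 1:i + 1 + n] for i in range(order)])  # shifted by one step
    A = X1 @ np.linalg.pinv(X0)            # one-step system matrix
    mu = np.linalg.eigvals(A)              # discrete-time eigenvalues exp(lambda*dt)
    lam = np.log(mu) / dt                  # continuous-time poles
    freqs = np.abs(lam) / (2 * np.pi)      # natural frequencies (Hz)
    zetas = -lam.real / np.abs(lam)        # damping ratios
    return freqs, zetas

# Synthetic decaying mode: f = 5 Hz, zeta = 0.02.
dt = 0.01
t = np.arange(0, 4, dt)
wn = 2 * np.pi * 5.0
y = np.exp(-0.02 * wn * t) * np.cos(wn * np.sqrt(1 - 0.02**2) * t)
print(itd_modal_fit(y, dt, order=2))  # recovers ~5 Hz and ~0.02
```

A parallel implementation would distribute the rows of the Hankel blocks (i.e. the response channels) across processors, which is why more channels simply require more processors.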
Abstract:
Remote sensing data are routinely used in ecology to investigate the relationship between landscape pattern, as characterised by land use and land cover maps, and ecological processes. Multiple factors related to the representation of geographic phenomena have been shown to affect the characterisation of landscape pattern, resulting in spatial uncertainty. This study statistically investigated the effect of the interaction between landscape spatial pattern and geospatial processing methods, unlike most papers, which consider the effect of each factor in isolation only. This is important since the data used to calculate landscape metrics typically undergo a series of data abstraction processing tasks that are rarely performed in isolation. The geospatial processing methods tested were the aggregation method and the choice of pixel size used to aggregate data. These were compared with two components of landscape pattern: spatial heterogeneity and the proportion of landcover class area. The interactions and their effect on the final landcover map were described using landscape metrics to measure landscape pattern and classification accuracy (response variables). All landscape metrics and classification accuracy were shown to be affected by both landscape pattern and processing methods. Large variability in the response of those variables and interactions between the explanatory variables were observed. However, even though interactions occurred, they only affected the magnitude of the difference in landscape metric values. Thus, provided that the same processing methods are used, landscapes should retain their ranking when their landscape metrics are compared. For example, highly fragmented landscapes will always have larger values for the landscape metric "number of patches" than less fragmented landscapes. But the magnitude of difference between the landscapes may change, and therefore absolute values of landscape metrics may need to be interpreted with caution. The explanatory variables which had the largest effects were spatial heterogeneity and pixel size; these tended to result in large main effects and large interactions. The high variability in the response variables and the interaction of the explanatory variables indicate that it would be difficult to make generalisations about the impact of processing on landscape pattern: only two processing methods were tested, and untested processing methods will potentially result in even greater spatial uncertainty. © 2013 Elsevier B.V.
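A minimal sketch, assuming NumPy/SciPy and a synthetic two-class raster, of the two processing factors the study varies (aggregation by majority rule at increasing pixel sizes) and of one of the landscape metrics it discusses ("number of patches"); the helper names are illustrative, not the study's actual code.

```python
import numpy as np
from scipy import ndimage

def aggregate_majority(landcover, factor):
    """Aggregate a categorical landcover raster by majority rule,
    enlarging the pixel size by an integer factor."""
    h = (landcover.shape[0] // factor) * factor
    w = (landcover.shape[1] // factor) * factor
    blocks = landcover[:h, :w].reshape(h // factor, factor, w // factor, factor)
    blocks = blocks.transpose(0, 2, 1, 3).reshape(h // factor, w // factor, -1)
    # Modal (majority) class within each block becomes the coarse pixel.
    return np.apply_along_axis(lambda b: np.bincount(b).argmax(), 2, blocks)

def number_of_patches(landcover, cls):
    """Landscape metric 'number of patches': connected components of one class."""
    _, n = ndimage.label(landcover == cls)
    return n

rng = np.random.default_rng(0)
lc = (rng.random((120, 120)) < 0.3).astype(int)  # synthetic two-class map
for f in (1, 2, 4):                              # coarsen the pixel size
    agg = lc if f == 1 else aggregate_majority(lc, f)
    print(f, number_of_patches(agg, 1))          # metric value shifts with pixel size
```

Running this shows the abstract's point in miniature: the metric's absolute value changes with pixel size even though the underlying landscape does not.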
Abstract:
This thesis discusses the need for nondestructive testing and highlights some of the limitations in present-day techniques. Special attention is given to ultrasonic examination techniques and the problems encountered when they are applied to thick welded plates, and some suggestions are made using signal processing methods. Chapter 2 treats the need for nondestructive testing in the light of economy and safety, and gives a short review of present-day techniques in nondestructive testing. The special problems of using ultrasonic techniques for welded structures are discussed in Chapter 3, with some examples of elastic wave propagation in welded steel. The limitations in applying sophisticated signal processing techniques to ultrasonic NDT are mainly found in the transducers generating or receiving the ultrasound; Chapter 4 deals with the different transducers used. One of the difficulties with ultrasonic testing is the interpretation of the signals encountered. Similar problems are found with SONAR/RADAR techniques, and Chapter 5 draws some analogies between SONAR/RADAR and ultrasonic nondestructive testing. This chapter also includes a discussion of some of the techniques used in signal processing in general. A signal processing technique found particularly useful is cross-correlation detection, which is treated in Chapter 6. Electronic digital computers have made signal processing techniques easier to implement; Chapter 7 discusses the use of digital computers in ultrasonic NDT. Experimental equipment used to test cross-correlation detection of ultrasonic signals is described in Chapter 8. Chapter 9 summarises the conclusions drawn during this investigation.
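Cross-correlation detection, the technique of Chapter 6, is simple to sketch: correlate the noisy received trace against the known transmitted pulse and take the lag of the correlation peak as the echo delay. A hedged NumPy illustration, with an invented 2 MHz tone burst and sampling rate rather than the thesis's experimental parameters:

```python
import numpy as np

def echo_delay(received, reference, fs):
    """Cross-correlation detection: find the delay at which the known
    transmitted pulse best matches the received (noisy) trace."""
    corr = np.correlate(received, reference, mode="full")
    lag = np.argmax(corr) - (len(reference) - 1)  # samples of delay
    return lag / fs                               # seconds

fs = 10e6                                         # 10 MHz sampling rate
t = np.arange(0, 5e-6, 1 / fs)
pulse = np.sin(2 * np.pi * 2e6 * t) * np.hanning(len(t))  # 2 MHz tone burst
trace = np.zeros(2000)
trace[700:700 + len(pulse)] += 0.3 * pulse        # weak echo at sample 700
trace += 0.2 * np.random.default_rng(1).standard_normal(len(trace))
print(echo_delay(trace, pulse, fs) * 1e6, "µs")   # expect ~70 µs
```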
Abstract:
Functionalisation of polystyrene (PS) and ethylene-co-propylene-co-cyclopentadiene terpolymer (EPDM) with acrylic acid (AA) in a melt reactive processing procedure, in the presence of a peroxide, Trigonox 101, and the coagents divinylbenzene, DVB (for PS), and trimethylolpropane triacrylate, TRIS (for EPDM), was successfully carried out. The level of grafting of the AA, as determined by infrared analysis, was significantly enhanced by the coagents. The grafting reaction of AA takes place simultaneously with homopolymerisation of the monomers, melt degradation and crosslinking reactions of the polymers. The extent of these competing reactions was inferred from measurements of melt flow index and insoluble gel content. Through judicious use of both the peroxide and the coagent, particularly TRIS, unwanted side reactions were minimised. Five different processing methods were investigated for both functionalisation experiments; the direct addition of the pre-mixed polymer with peroxide and reactive modifiers was found to give the optimum conditions for grafting. The functionalised PS (F-PS) and EPDM (F-EPD), and maleinised polypropylene carrying a potential antioxidant, N-(4-anilinophenyl)maleimide (F-PP), were melt blended in binary mixtures of F-PS/F-EPD and F-PP/F-EPD in the presence (or absence) of organic diamines which act as interlinking agents, e.g. ethylene diamine (EDA) and hexamethylene diamine (HEMDA). The presence of an interlinking agent, particularly HEMDA, shows a significant enhancement in the mechanical properties of the blend, suggesting that the copolymer formed has acted as a compatibiliser for the otherwise incompatible polymer pairs. The functionalised and amidised blends, F- and A-PS/EPDM (SPD1) and F- and A-PP/EPDM (SPD2), were subsequently used as compatibiliser concentrates in the corresponding PS/EPDM and PP/EPDM blends containing various weight proportions of the homopolymers. SPD1 caused a general decrease in tensile strength, albeit an increase in drop impact strength, particularly in blends containing a high PS content (80%). SPD2 was particularly effective in enhancing impact strength in blends containing a low weight ratio of PP (<70%). SPD2 was also a good thermal antioxidant, albeit less effective than a commercial antioxidant. In all blends the evidence of compatibility was examined by scanning electron microscopy.
Abstract:
The main aim of this work was to study the effect of two comonomers, trimethylolpropane trimethacrylate (TRIS) and divinylbenzene (DVB), on the nature and efficiency of grafting of two different monomers, glycidyl methacrylate (GMA) and maleic anhydride (MA), on polypropylene (PP) and on natural rubber (NR) using reactive processing methods. Four different peroxides, benzoyl peroxide (BPO), dicumyl peroxide (DCP), 2,5-dimethyl-2,5-bis-(tert-butylperoxy)hexane (T-101), and 1,1-di(tert-butylperoxy)-3,3,5-trimethylcyclohexane (T-29B90), were examined as free radical initiators. An appropriate methodology was established, and the chemical composition and reactive processing parameters were examined and optimised. It was found that in the absence of the coagents DVB and TRIS, the grafting degree of GMA and MA increased with increasing peroxide concentration, but the level of grafting was low, and the homopolymerisation of GMA and the crosslinking of NR or chain scission of PP were identified as the main side reactions that competed with the desired grafting reaction in the polymers. At high concentrations of the peroxide T-101 (>0.02 mr), crosslinking of NR and chain scission of PP became dominant and unacceptable. An attempt to add a reactive coagent, e.g. TRIS, during grafting of GMA on natural rubber resulted in excessive crosslinking because of the very high reactivity of this comonomer with the C=C of the rubber. Therefore, a multifunctional and highly reactive coagent such as TRIS could not be applied in the grafting of GMA onto natural rubber. In the case of PP, however, the use of TRIS and DVB was shown to greatly enhance the grafting degree and reduce the chain scission, with very little monomer homopolymerisation taking place. The results showed that the grafting degree increased with increasing GMA and MA concentrations. It was also found that T-101 was a suitable peroxide to initiate the grafting reaction of these monomers on NR and PP, and the optimum temperature for this peroxide was ≈160°C. Very preliminary work was also conducted on the use of the functionalised PP (f-PP), in the absence and presence of the two comonomers (f-PP-DVB or f-PP-TRIS), for the purpose of compatibilising PP-PBT blends through reactive blending. Examination of the morphology of the blends suggested that effective compatibilisation had been achieved when using f-PP-DVB and f-PP-TRIS; however, more work is required in this area.
Abstract:
Removing noise from piecewise constant (PWC) signals is a challenging signal processing problem arising in many practical contexts. For example, in exploration geosciences, noisy drill hole records need to be separated into stratigraphic zones, and in biophysics, jumps between molecular dwell states have to be extracted from noisy fluorescence microscopy signals. Many PWC denoising methods exist, including total variation regularization, mean shift clustering, stepwise jump placement, running medians, convex clustering shrinkage and bilateral filtering; conventional linear signal processing methods are fundamentally unsuited. This paper (part I, the first of two) shows that most of these methods are associated with a special case of a generalized functional, minimized to achieve PWC denoising. The minimizer can be obtained by diverse solver algorithms, including stepwise jump placement, convex programming, finite differences, iterated running medians, least angle regression, regularization path following and coordinate descent. In the second paper, part II, we introduce novel PWC denoising methods and compare them on synthetic and real signals, showing that the new understanding of the problem gained in part I leads to new methods that have a useful role to play.
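One of the solvers named above, iterated running medians, is compact enough to sketch: apply a running median repeatedly until the signal stops changing (a "root" signal, which is piecewise constant). A hedged SciPy illustration with invented step levels and noise:

```python
import numpy as np
from scipy.signal import medfilt

def iterated_running_median(x, width=7, iters=50):
    """PWC denoising by iterated running medians: repeat the median
    filter until a fixed point ('root' signal) is reached."""
    y = x.copy()
    for _ in range(iters):
        z = medfilt(y, kernel_size=width)  # width must be odd
        if np.allclose(z, y):
            break
        y = z
    return y

rng = np.random.default_rng(2)
steps = np.repeat([0.0, 1.0, 0.2, 0.8], 100)        # piecewise constant truth
noisy = steps + 0.15 * rng.standard_normal(steps.size)
denoised = iterated_running_median(noisy, width=9)  # flat segments, sharp jumps
```

A linear filter applied to the same signal would smear the jumps, which is the sense in which conventional linear methods are fundamentally unsuited.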
Abstract:
This thesis describes advances in the characterisation, calibration and data processing of optical coherence tomography (OCT) systems. Femtosecond (fs) laser inscription was used to produce OCT-phantoms. Transparent materials are generally inert to infrared radiation, but with fs lasers material modification occurs via non-linear processes when the highly focused light source interacts with the material. This modification is confined to the focal volume and is highly reproducible. In order to select the best inscription parameters, combinations of different inscription parameters were tested, using three fs laser systems with different operating properties, on a variety of materials. This facilitated the understanding of the key characteristics of the produced structures, with the aim of producing viable OCT-phantoms. Finally, OCT-phantoms were successfully designed and fabricated in fused silica. The use of these phantoms to characterise many properties (resolution, distortion, sensitivity decay, scan linearity) of an OCT system was demonstrated. Quantitative methods were developed to support the characterisation of an OCT system collecting images from phantoms and also to improve the quality of the OCT images. Characterisation methods include the measurement of the spatially variant resolution (point spread function (PSF) and modulation transfer function (MTF)), sensitivity and distortion. Processing of OCT data is computationally intensive. Standard central processing unit (CPU) based processing might take several minutes to a few hours to process acquired data, so data processing is a significant bottleneck. An alternative is to use expensive hardware-based processing such as field programmable gate arrays (FPGAs). Recently, however, graphics processing unit (GPU) based data processing methods have been developed to minimise this processing and rendering time. These techniques include the standard processing methods, a set of algorithms that process the raw interference data obtained by the detector and generate A-scans. The work presented here describes accelerated data processing and post-processing techniques for OCT systems. The GPU-based processing developed during the PhD was later implemented in a custom-built Fourier domain optical coherence tomography (FD-OCT) system; this system currently processes and renders data in real time, with processing throughput limited by the camera capture rate. OCT-phantoms have been heavily used for the qualitative characterisation and adjustment/fine-tuning of the operating conditions of the OCT system, and investigations are under way to characterise OCT systems using our phantoms. The work presented in this thesis demonstrates several novel techniques for fabricating OCT-phantoms and accelerating OCT data processing using GPUs. In the process of developing the phantoms and quantitative methods, a thorough understanding and practical knowledge of OCT and fs laser processing systems was developed. This understanding led to several pieces of research that are not only relevant to OCT but have broader importance. For example, extensive understanding of the properties of fs-inscribed structures will be useful in other photonic applications such as the making of phase masks, waveguides and microfluidic channels. Acceleration of data processing with GPUs is also useful in other fields.
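The "standard processing" chain the abstract mentions (raw interference spectra in, A-scans out) follows a well-known pattern; here is a minimal CPU/NumPy sketch under simplifying assumptions (linear-in-wavenumber sampling, so resampling and dispersion-compensation steps are omitted). The thesis's GPU version would map each of these array operations to CUDA kernels; names and array sizes are illustrative.

```python
import numpy as np

def spectra_to_ascans(spectra, background):
    """Simplified FD-OCT standard processing: background subtraction,
    apodisation, and FFT from interference spectra to depth profiles."""
    frame = spectra - background                   # remove DC/reference term
    frame = frame * np.hanning(frame.shape[-1])    # apodise to suppress sidelobes
    depth = np.fft.fft(frame, axis=-1)             # spectrum -> depth profile
    half = frame.shape[-1] // 2                    # keep the non-mirrored half
    return 20 * np.log10(np.abs(depth[..., :half]) + 1e-12)  # log-magnitude A-scans

# 512 raw spectra of 2048 pixels -> one B-scan's worth of A-scans.
raw = np.random.default_rng(3).random((512, 2048))
bscan = spectra_to_ascans(raw, background=raw.mean(axis=0))
```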
Abstract:
The use of ontologies as representations of knowledge is widespread, but their construction, until recently, has been entirely manual. We argue in this paper for the use of text corpora and automated natural language processing methods for the construction of ontologies. We delineate the challenges and present criteria for the selection of appropriate methods. We distinguish three major steps in ontology building: associating terms, constructing hierarchies and labelling relations. A number of methods are presented for these purposes, but we conclude that the issue of data sparsity is still a major challenge. We argue for the use of resources external to the domain-specific corpus.
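The first of the three steps, associating terms, is commonly done with co-occurrence statistics; a minimal sketch using pointwise mutual information over sentence co-occurrence (not necessarily the paper's own method; the helper name and toy corpus are invented):

```python
import math
from collections import Counter
from itertools import combinations

def associated_terms(sentences, min_count=2):
    """Score term pairs by pointwise mutual information (PMI) over
    sentence-level co-occurrence in a corpus; data sparsity shows up
    as unreliable scores for low-count pairs, hence min_count."""
    term_counts, pair_counts = Counter(), Counter()
    for sent in sentences:
        terms = set(sent.lower().split())
        term_counts.update(terms)
        pair_counts.update(combinations(sorted(terms), 2))
    n = len(sentences)
    scores = {}
    for (a, b), c in pair_counts.items():
        if c >= min_count:
            scores[(a, b)] = math.log((c / n) / ((term_counts[a] / n) * (term_counts[b] / n)))
    return sorted(scores.items(), key=lambda kv: -kv[1])

corpus = ["the gene encodes a protein",
          "the protein binds the receptor",
          "a gene encodes each protein"]
print(associated_terms(corpus, min_count=2)[:3])
```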
Abstract:
Through the application of novel signal processing techniques we are able to measure physical measurands with both high accuracy and low noise susceptibility. The first interrogation scheme is based upon a CCD spectrometer. We compare different algorithms for resolving the Bragg wavelength from a low-resolution discrete representation of the reflected spectrum, and present optimal processing methods for providing a high-integrity measurement from the reflection image. Our second sensing scheme uses a novel network of sensors to measure the distributed strain response of a mechanical system. Using neural network processing methods we demonstrate the measurement capabilities of a scalable low-cost fibre Bragg grating sensor network. This network has been shown to be comparable in performance with existing fibre Bragg grating sensing techniques, at a greatly reduced implementation cost.
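One of the candidate algorithms for resolving the Bragg wavelength from a coarse CCD spectrum is a thresholded centroid, which gives a sub-pixel estimate from a few bright pixels. A hedged sketch (the abstract compares several algorithms without naming them; the parameters below are invented):

```python
import numpy as np

def bragg_centroid(wavelengths, intensities, threshold=0.5):
    """Sub-pixel Bragg wavelength estimate: intensity-weighted centroid
    of the pixels above a fraction of the peak value."""
    mask = intensities >= threshold * intensities.max()
    return np.sum(wavelengths[mask] * intensities[mask]) / np.sum(intensities[mask])

# Coarse 0.1 nm pixels around a grating reflecting at 1550.03 nm.
wl = np.arange(1549.0, 1551.0, 0.1)
spectrum = np.exp(-((wl - 1550.03) / 0.2) ** 2)
print(bragg_centroid(wl, spectrum))  # sub-pixel estimate near 1550.03
```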
Abstract:
This thesis documents the design, manufacture and testing of a passive and non-invasive micro-scale planar particle-from-fluid filter for segregating cell types from a homogeneous suspension. The microfluidics system can be used to separate spermatogenic cells from testis biopsy samples, providing a mechanism for filtrate retrieval for assisted reproduction therapy. The system can also be used for point-of-service diagnostics applications in hospitals, lab-on-a-chip pre-processing, and field applications such as clinical testing in the third world. Various design concepts were developed and manufactured, and were assessed based on etched structure morphology, robustness to variations in the manufacturing process, and design impacts on fluid flow and particle separation characteristics. Segregation was measured using image processing algorithms, which demonstrate an efficiency of more than 55% for 1 µl volumes at populations exceeding 1 × 10^7. The technique supports a significant reduction in time over conventional processing in the separation and identification of particle groups, offering a potential reduction in the associated cost of the targeted procedure. The thesis develops a model of quasi-steady wetting flow within the micro-channel and identifies the forces across the system during post-wetting equalisation. The model and its underlying assumptions are validated empirically in microfabricated test structures through a novel Micro-Particle Image Velocimetry technique. The prototype devices require neither ancillary equipment nor additional filtration media, and therefore offer fewer opportunities for sample contamination than conventional processing methods. The devices are disposable, with minimal reagent volumes and process waste. Optimal processing parameters and production methods are identified, along with improvements that could be made to enhance performance in a number of identified potential applications.
Abstract:
The thesis examines the possibilities for the beneficiation of steelmaking slags using mineral processing methods. Chemical and mineralogical investigations were carried out by SEM and EPMA to determine the most suitable separation methods in terms of crystal size, chemical composition and surface properties. Magnetic separation was applied in connection with size reduction for the extraction of the metallic iron prills and other iron-containing phases, and the results were related to the feed size and operating conditions. The behaviour of the slags in flotation tests was studied with respect to recovery and grade. It was found that the free lime present in the slags caused a high acid consumption of both weak and strong acids. It also reacted with the acids and consequently produced a white precipitate (CaSO4 for H2SO4). The poor response of the phases to flotation by different types of collector was found to be due to surface alteration caused by the free lime. The flocculation tests were carried out at the natural pH of the slags to prevent surface alterations. Settling tests were done to determine suitable flocculants for the separation tests. The effects of the settling period, flocculant concentration, conditioning period and number of cleaning cycles were determined to optimise the separation tests. The discussion brings together this study and previous theoretically based work cited in the literature to elucidate the factors governing the utilisation of steelmaking slags.
Abstract:
Scaffolds derived from processed tissues offer viable alternatives to synthetic polymers as biological scaffolds for regenerative medicine. Tissue-derived scaffolds provide an extracellular matrix (ECM) as the starting material for wound healing and the functional reconstruction of tissues, offering a potentially valuable approach for the replacement of damaged or missing tissues. Additionally, acellular tissue may provide a natural microenvironment for host-cell migration and the induction of stem cell differentiation to contribute to tissue regeneration. There are a number of processing methods that aim to stabilize and provide an immunologically inert tissue scaffold. Furthermore, these tissue-processing methods can often be applied to xenogenic transplants because the essential components of the ECM are often maintained between species. In this study, we applied several tissue-processing protocols to the cornea in order to obtain a decellularized cornea matrix that maintained the clarity and mechanical properties of the native tissue. Histology, mechanical testing and electron microscopy techniques were used to assess the cell extraction process and the organization of the remaining ECM. In vitro cell seeding experiments confirmed the processed corneas’ biocompatibility.
Abstract:
A major problem in modern probabilistic modeling is the huge computational complexity involved in typical calculations with multivariate probability distributions when the number of random variables is large. Because exact computations are infeasible in such cases and Monte Carlo sampling techniques may reach their limits, there is a need for methods that allow for efficient approximate computations. One of the simplest approximations is based on the mean field method, which has a long history in statistical physics. The method is widely used, particularly in the growing field of graphical models. Researchers from disciplines such as statistical physics, computer science, and mathematical statistics are studying ways to improve this and related methods and are exploring novel application areas. Leading approaches include the variational approach, which goes beyond factorizable distributions to achieve systematic improvements; the TAP (Thouless-Anderson-Palmer) approach, which incorporates correlations by including effective reaction terms in the mean field theory; and the more general methods of graphical models. Bringing together ideas and techniques from these diverse disciplines, this book covers the theoretical foundations of advanced mean field methods, explores the relation between the different approaches, examines the quality of the approximation obtained, and demonstrates their application to various areas of probabilistic modeling.
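The naive mean field method the blurb starts from is brief enough to illustrate. A minimal sketch for an Ising-type model, assuming NumPy (the function name, couplings and field values are invented): iterate the self-consistency equations m_i = tanh(h_i + sum_j J_ij m_j) to approximate the intractable marginal magnetisations of a factorised trial distribution.

```python
import numpy as np

def mean_field_ising(J, h, iters=200):
    """Naive mean field approximation for an Ising model
    p(s) ∝ exp(0.5 * s·J·s + h·s), s_i in {-1, +1}: fixed-point
    iteration of m_i = tanh(h_i + sum_j J_ij * m_j)."""
    m = np.zeros(len(h))
    for _ in range(iters):
        m = np.tanh(h + J @ m)  # update all magnetisations in parallel
    return m                    # approximate marginals E[s_i]

rng = np.random.default_rng(4)
n = 10
J = rng.normal(0, 0.1, (n, n))
J = (J + J.T) / 2               # symmetric couplings
np.fill_diagonal(J, 0)          # no self-coupling
h = rng.normal(0, 0.5, n)
print(mean_field_ising(J, h))   # approximate E[s_i] for each spin
```

The TAP approach mentioned above refines exactly this update by adding the Onsager reaction term, -m_i * sum_j J_ij**2 * (1 - m_j**2), inside the tanh, which is one concrete sense in which it "incorporates correlations" beyond the factorised approximation.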