967 results for Processing methods
Abstract:
The main objectives of this research were to develop optimised chemical compositions and reactive processing conditions for grafting the functional monomer maleic anhydride (MA) onto polypropylene (PP), ethylene propylene diene monomer (EPDM) and PP-EPDM mixtures, and to optimise synthetic routes for the production of PP/EPDM copolymers for the purpose of compatibilisation of PP/EPDM blends. The MA-functionalisation was achieved using an internal mixer in the presence of low concentrations (less than 0.01 molar ratio) of a free radical initiator. Various methods were used to purify the MA-functionalised PP, and the grafting yield was determined using either FTIR or titrimetry. Grafting of MA alone, limited by its low free-radical reactivity towards polymer macroradicals, was accompanied by severe degradation in the case of PP and crosslinking in the case of EPDM. In the case of MA-functionalised PP/EPDM, both degradation and crosslinking occurred, though not to a great extent. The use of a tri-functional coagent, e.g. trimethylolpropane triacrylate (TRIS), with MA led to a large improvement in the grafting yield of MA on the polymers. This is almost certainly due to the high free-radical activity of TRIS, leading to copolymerisation of MA and TRIS, which was followed by grafting of the copolymer onto the polymer backbone. In the case of PP, the use of the coagent was also found to reduce polymer degradation. PP/EPDM copolymers with optimum tensile properties were synthesised using a 'one-step' continuous reactive processing procedure. This was achieved firstly by functionalisation of a mixture of PP (higher w/w ratio) and EPDM (lower w/w ratio) with MA, in the presence of the coagent TRIS and a small concentration of a free radical initiator. This was then followed by an imidisation reaction with the interlinking agent hexamethylene diamine (HEMDA).
Small amounts of copolymer, up to 5 phr, interlinked with up to 15 phr of HEMDA, were sufficient to compatibilise PP/EPDM 75/25 blends, resulting in excellent tensile properties compared to the binary PP/EPDM 75/25 blend. Improvements in blend compatibility and phase stabilisation (observed through tensile and SEM analysis) were shown in all cases, with significant improvement in interphase adhesion between PP and EPDM and a reduction in domain size across the fractured surface, indicating efficient distribution of the compatibiliser.
Abstract:
Reproducible preparation of a number of modified clay and clay-like materials by both conventional and microwave-assisted chemistry, and their subsequent characterisation, has been achieved. These materials are designed as hydrocracking catalysts for the upgrading of liquids obtained by the processing of coal. Contact with both coal-derived liquids and heavy petroleum resids has demonstrated that these catalysts are superior to established proprietary catalysts in terms of both initial activity and deactivation resistance. Particularly active were a chromium-pillared montmorillonite and a tin-intercalated laponite, while layered double hydroxides (LDHs) have exhibited encouraging thermal stability. Development of novel methods for hydrocracking coal-derived liquids, using a commercial microwave oven, modified reaction vessels and coal model compounds, has been attempted. Whilst safe and reliable operation of a high-pressure microwave "bomb" apparatus employing hydrogen has been achieved, no hydrotreatment reactions occurred.
Abstract:
Introduction: The requirement of adjuvants in subunit protein vaccination is well known yet their mechanisms of action remain elusive. Of the numerous mechanisms suggested, cationic liposomes appear to fulfil at least three: the antigen depot effect, the delivery of antigen to antigen presenting cells (APCs) and finally the danger signal. We have investigated the role of antigen depot effect with the use of dual radiolabelling whereby adjuvant and antigen presence in tissues can be quantified. In our studies a range of cationic liposomes and different antigens were studied to determine the importance of physical properties such as liposome surface charge, antigen association and inherent lipid immunogenicity. More recently we have investigated the role of liposome size with the cationic liposome formulation DDA:TDB, composed of the cationic lipid dimethyldioctadecylammonium (DDA) and the synthetic mycobacterial glycolipid trehalose 6,6’-dibehenate (TDB). Vesicle size is a frequently investigated parameter which is known to result in different routes of endocytosis. It has been postulated that targeting different routes leads to different intracellular signaling pathway activation and it is certainly true that numerous studies have shown vesicle size to have an effect on the resulting immune responses (e.g. Th1 vs. Th2). Aim: To determine the effect of cationic liposome size on the biodistribution of adjuvant and antigen, the ensuing humoral and cell-mediated immune responses and the uptake and activation of antigen by APCs including macrophages and dendritic cells. Methods: DDA:TDB liposomes were made to three different sizes (~ 0.2, 0.5 and 2 µm) followed by the addition of tuberculosis antigen Ag85B-ESAT-6 therefore resulting in surface adsorption. Liposome formulations were injected into Balb/c or C57Bl/6 mice via the intramuscular route. The biodistribution of the liposome formulations was followed using dual radiolabelling. 
Tissues including muscle from the site of injection and local draining lymph nodes were removed, and liposome and antigen presence was quantified. Mice were also immunized with the different vaccine formulations, and cytokine production (from Ag85B-ESAT-6 restimulated splenocytes) and antibody presence in blood were assayed. Furthermore, splenocyte proliferation after restimulation with Ag85B-ESAT-6 was measured. Finally, APCs were compared for their ability to endocytose vaccine formulations, and the effect this had on the maturation status of the cell populations was compared. Flow cytometry and fluorescence labelling were used to investigate maturation marker up-regulation and efficacy of phagocytosis. Results: Our results show that for an efficient Ag85B-ESAT-6 antigen depot at the injection site, liposomes composed of DDA and TDB are required. There is no significant change in the presence of liposome or antigen at 6 or 24 hours post-injection, nor does liposome size have an effect. Approximately 0.05% of the injected liposome dose is detected in the local draining lymph node 24 hours post-injection; however, protein presence is low (<0.005% dose). Preliminary in vitro data show liposome and antigen endocytosis by macrophages; further studies on this will be presented in addition to the results of the immunisation study.
Abstract:
Recent advances in technology have produced a significant increase in the availability of free sensor data over the Internet. With affordable weather monitoring stations now available to individual meteorology enthusiasts a reservoir of real time data such as temperature, rainfall and wind speed can now be obtained for most of the United States and Europe. Despite the abundance of available data, obtaining useable information about the weather in your local neighbourhood requires complex processing that poses several challenges. This paper discusses a collection of technologies and applications that harvest, refine and process this data, culminating in information that has been tailored toward the user. In this case we are particularly interested in allowing a user to make direct queries about the weather at any location, even when this is not directly instrumented, using interpolation methods. We also consider how the uncertainty that the interpolation introduces can then be communicated to the user of the system, using UncertML, a developing standard for uncertainty representation.
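The interpolation step described in this abstract can be illustrated with a minimal sketch. Inverse distance weighting is one standard spatial interpolation method; the abstract does not specify which method the system actually uses, and the station coordinates and temperatures below are invented for illustration:

```python
import math

def idw_interpolate(stations, query, power=2.0):
    """Estimate a sensor value at `query` = (lat, lon) from nearby
    `stations`, a list of ((lat, lon), value) pairs, weighting each
    station by the inverse of its distance to the query raised to `power`."""
    num = den = 0.0
    for (lat, lon), value in stations:
        d = math.hypot(lat - query[0], lon - query[1])
        if d == 0.0:
            return value  # query coincides with a station
        w = 1.0 / d ** power
        num += w * value
        den += w
    return num / den

# Three hypothetical weather stations reporting temperature in deg C.
stations = [((52.0, -1.0), 10.0), ((52.2, -1.1), 12.0), ((52.1, -0.9), 11.0)]
print(idw_interpolate(stations, (52.1, -1.0)))  # a value between 10 and 12
```

A real deployment would also propagate an uncertainty estimate alongside the interpolated value, which is the kind of quantity UncertML is designed to encode.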
Abstract:
Removing noise from signals which are piecewise constant (PWC) is a challenging signal processing problem that arises in many practical scientific and engineering contexts. In the first paper (part I) of this series of two, we presented background theory, building on results from the image processing community, to show that the majority of these algorithms, and many others proposed in the wider literature, are each associated with a special case of a generalized functional that, when minimized, solves the PWC denoising problem, and to show how the minimizer can be obtained by a range of computational solver algorithms. In this second paper (part II), using the understanding developed in part I, we introduce several novel PWC denoising methods, which, for example, combine the global behaviour of mean shift clustering with the local smoothing of total variation diffusion, and show example solver algorithms for these new methods. Comparisons between these methods are performed on synthetic and real signals, revealing that our new methods have a useful role to play. Finally, overlaps between the generalized methods of these two papers and others such as wavelet shrinkage, hidden Markov models, and piecewise smooth filtering are touched on.
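The paper's novel methods are not reproduced here, but the family of PWC denoisers they generalise can be illustrated by the classic sliding-window median, which suppresses impulsive noise while preserving step edges (a property a moving average lacks). This is a baseline sketch, not one of the paper's algorithms:

```python
import statistics

def median_filter(signal, half_window=2):
    """Sliding-window median: replace each sample by the median of its
    neighbourhood, clipping the window at the signal boundaries."""
    n = len(signal)
    out = []
    for i in range(n):
        window = signal[max(0, i - half_window):min(n, i + half_window + 1)]
        out.append(statistics.median(window))
    return out

print(median_filter([0, 0, 0, 5, 0, 0, 0]))  # spike suppressed: all zeros
print(median_filter([0, 0, 0, 1, 1, 1]))     # step edge preserved
```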
Abstract:
Recent advances in our ability to watch the molecular and cellular processes of life in action, such as atomic force microscopy, optical tweezers and Förster fluorescence resonance energy transfer, raise challenges for digital signal processing (DSP) of the resulting experimental data. This article explores the unique properties of such biophysical time series that set them apart from other signals, such as the prevalence of abrupt jumps and steps, multi-modal distributions and autocorrelated noise. It exposes the problems with classical linear DSP algorithms applied to this kind of data, and describes new nonlinear and non-Gaussian algorithms that are able to extract information that is of direct relevance to biological physicists. It is argued that these new methods applied in this context typify the nascent field of biophysical DSP. Practical experimental examples are supplied.
Abstract:
This research investigates specific ash control methods to limit inorganic content within biomass prior to fast pyrolysis, and the effect of specific ash components on fast pyrolysis processing, mass balance yields, and bio-oil quality and stability. Inorganic content in miscanthus was naturally reduced over the winter period from June (7.36 wt.%) to February (2.80 wt.%) due to a combination of senescence and natural leaching by rain water. The September harvest produced similar mass balance yields, bio-oil quality and stability compared to the February harvest (conventional harvest), but the nitrogen content in the above-ground crop was too high (208 kg ha⁻¹) to maintain sustainable crop production. Deionised water, 1.00% HCl and 0.10% Triton X-100 washes were used to reduce the inorganic content of miscanthus. Miscanthus washed with 0.10% Triton X-100 resulted in the highest total liquid yield (76.21 wt.%) and the lowest char and reaction water yields (9.77 wt.% and 8.25 wt.% respectively). Concentrations of Triton X-100 were varied to study further effects on mass balance yields and bio-oil stability. All concentrations of Triton X-100 increased total liquid yield and decreased char and reaction water yields compared to untreated miscanthus. In terms of bio-oil stability, 1.00% Triton X-100 produced the most stable bio-oil, with the lowest viscosity index (2.43) and lowest water content index (1.01). Beech wood was impregnated with potassium and phosphorus, resulting in lower liquid yields and increased char and gas yields due to their catalytic effect on fast pyrolysis product distribution. Increased potassium and phosphorus concentrations produced less stable bio-oils, with viscosity and water content indexes increasing. Fast pyrolysis processing of phosphorus-impregnated beech wood was problematic, as the reactor bed material agglomerated into large clumps due to char formation within the reactor, affecting fluidisation and heat transfer.
Abstract:
The research presented in this thesis was developed as part of DIBANET, an EC funded project aiming to develop an energetically self-sustainable process for the production of diesel miscible biofuels (i.e. ethyl levulinate) via acid hydrolysis of selected biomass feedstocks. Three thermal conversion technologies, pyrolysis, gasification and combustion, were evaluated in the present work with the aim of recovering the energy stored in the acid hydrolysis solid residue (AHR). Mainly consisting of lignin and humins, the AHR can contain up to 80% of the energy in the original feedstock. Pyrolysis of AHR proved unsatisfactory, so attention focussed on gasification and combustion with the aim of producing heat and/or power to supply the energy demanded by the ethyl levulinate production process. A thermal processing rig consisting of a Laminar Entrained Flow Reactor (LEFR) equipped with solid and liquid collection and online gas analysis systems was designed and built to explore pyrolysis, gasification and air-blown combustion of AHR. The maximum liquid yield for pyrolysis of AHR was 30 wt% with a volatile conversion of 80%. The gas yield for AHR gasification was 78 wt%, with 8 wt% tar yields and conversion of volatiles close to 100%. 90 wt% of the AHR was transformed into gas by combustion, with volatile conversions above 90%. Gasification with 5 vol% O2 / 95 vol% N2 resulted in a nitrogen-diluted, low heating value gas (2 MJ/m3). Steam and oxygen-blown gasification of AHR were additionally investigated in a batch gasifier at KTH in Sweden. Steam promoted the formation of hydrogen (25 vol%) and methane (14 vol%), improving the gas heating value to 10 MJ/m3, below the typical value for steam gasification due to equipment limitations. Arrhenius kinetic parameters were calculated using data collected with the LEFR to provide reaction rate information for process design and optimisation.
The activation energy (EA) and pre-exponential factor (k0, in s-1) for pyrolysis (EA = 80 kJ/mol, ln k0 = 14), gasification (EA = 69 kJ/mol, ln k0 = 13) and combustion (EA = 42 kJ/mol, ln k0 = 8) were calculated after linearly fitting the data using the random pore model. Kinetic parameters for pyrolysis and combustion were also determined by dynamic thermogravimetric analysis (TGA), including studies of the original biomass feedstocks for comparison. Results obtained by differential and integral isoconversional methods for activation energy determination were compared. The activation energy calculated by the Vyazovkin method was 103-204 kJ/mol for pyrolysis of untreated feedstocks and 185-387 kJ/mol for AHRs. Combustion activation energy was 138-163 kJ/mol for biomass and 119-158 kJ/mol for AHRs. The non-linear least squares method was used to determine the reaction model and pre-exponential factor. Pyrolysis and combustion of biomass were best modelled by a combination of third-order reaction and three-dimensional diffusion models, while AHR decomposed following the third-order reaction model for pyrolysis and the three-dimensional diffusion model for combustion.
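The Arrhenius fitting step described above amounts to a linear least-squares fit of ln k against 1/T. A minimal sketch follows, using synthetic rate constants generated from the pyrolysis values quoted in the abstract (EA = 80 kJ/mol, ln k0 = 14); the temperatures are invented and this is not the thesis data or code:

```python
import math

R = 8.314  # universal gas constant, J/(mol K)

def arrhenius_fit(temperatures, rate_constants):
    """Recover (EA in J/mol, ln k0) from k(T) data via ordinary least
    squares on the linearised form ln k = ln k0 - EA / (R * T)."""
    xs = [1.0 / t for t in temperatures]
    ys = [math.log(k) for k in rate_constants]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    return -slope * R, intercept

# Synthetic k(T) data consistent with EA = 80 kJ/mol, ln k0 = 14.
temps = [600.0, 700.0, 800.0, 900.0]
ks = [math.exp(14 - 80000.0 / (R * t)) for t in temps]
ea, ln_k0 = arrhenius_fit(temps, ks)
print(round(ea / 1000.0, 3), round(ln_k0, 3))  # → 80.0 14.0
```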
Abstract:
The combination of the third-order optical nonlinearity with chromatic dispersion in optical fibers offers an extremely rich variety of possibilities for tailoring the temporal and spectral content of a light signal, depending on the regime of dispersion that is used. Here, we review recent progress on the use of third-order nonlinear processes in optical fibers for pulse shaping in the temporal and spectral domains. Various examples of practical significance will be discussed, spanning fields from the generation of specialized temporal waveforms to the generation of ultrashort pulses and stable continuum generation.
Abstract:
After many years of scholarly study, manuscript collections continue to be an important source of novel information for scholars, concerning both the history of earlier times and the development of cultural documentation over the centuries. The D-SCRIBE project aims to support and facilitate current and future efforts in manuscript digitization and processing. It strives toward the creation of a comprehensive software product which can assist content holders in turning an archive of manuscripts into a digital collection using automated methods. In this paper, we focus on the problem of recognizing early Christian Greek manuscripts. We propose a novel digital image binarization scheme for low-quality historical documents, allowing further content exploitation in an efficient way. Based on the existence of closed cavity regions in the majority of characters and character ligatures in these scripts, we propose a novel, segmentation-free, fast and efficient technique that assists the recognition procedure by tracing and recognizing the most frequently appearing characters or character ligatures.
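The paper's binarization scheme for degraded manuscripts is not reproduced here; as a point of reference, global grey-level binarization in its simplest form can be sketched with Otsu's classic threshold selection, which picks the threshold maximising the between-class variance of a 256-bin grey-level histogram:

```python
def otsu_threshold(histogram):
    """Return the grey level (0-255) that best separates the foreground
    and background classes of a 256-bin histogram (Otsu's method)."""
    total = sum(histogram)
    sum_all = sum(i * h for i, h in enumerate(histogram))
    sum_bg = 0.0       # running sum of grey levels below the threshold
    weight_bg = 0      # running pixel count below the threshold
    best_thresh, best_var = 0, 0.0
    for t in range(256):
        weight_bg += histogram[t]
        if weight_bg == 0:
            continue
        weight_fg = total - weight_bg
        if weight_fg == 0:
            break
        sum_bg += t * histogram[t]
        mean_bg = sum_bg / weight_bg
        mean_fg = (sum_all - sum_bg) / weight_fg
        between = weight_bg * weight_fg * (mean_bg - mean_fg) ** 2
        if between > best_var:
            best_var, best_thresh = between, t
    return best_thresh
```

Methods designed for low-quality historical documents, like the one proposed in the paper, generally need to outperform such a single global threshold on stained or faded pages.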
Abstract:
The problems and methods of adaptive control and multi-agent processing of information in global telecommunication and computer networks (TCN) are discussed. Criteria for the controllability and communication ability (routing ability) of dataflows are described. A multi-agent model for the exchange of divided information resources in global TCN is suggested. Peculiarities of adaptive and intelligent control of dataflows under uncertain conditions and network collisions are analyzed.
Abstract:
Fluoroscopic images exhibit severe signal-dependent quantum noise, generally modelled as Poisson-distributed, due to the reduced X-ray dose involved in image formation. However, image gray-level transformations, commonly applied by fluoroscopic devices to enhance contrast, modify the noise statistics and the relationship between image noise variance and expected pixel intensity. Image denoising is essential to improve the quality of fluoroscopic images and their clinical information content. Simple average filters are commonly employed in real-time processing, but they tend to blur edges and details. An extensive comparison of advanced denoising algorithms specifically designed for both signal-dependent noise (AAS, BM3Dc, HHM, TLS) and independent additive noise (AV, BM3D, K-SVD) is presented. Simulated test images degraded by various levels of Poisson quantum noise and real clinical fluoroscopic images were considered. Typical gray-level transformations (e.g. white compression) were also applied in order to evaluate their effect on the denoising algorithms. The performance of the algorithms was evaluated in terms of peak signal-to-noise ratio (PSNR), signal-to-noise ratio (SNR), mean square error (MSE), structural similarity index (SSIM) and computational time. On average, the filters designed for signal-dependent noise provided better image restorations than those assuming additive white Gaussian noise (AWGN). The collaborative denoising strategy was found to be the most effective in denoising both simulated and real data, also in the presence of image gray-level transformations. White compression, by inherently reducing the greater noise variance of brighter pixels, appeared to support the denoising algorithms in performing more effectively. © 2012 Elsevier Ltd. All rights reserved.
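Two of the quality metrics used in the comparison, MSE and PSNR, are straightforward to state in code. The following is a generic sketch for 8-bit images flattened to lists, with invented pixel values; it is not the authors' evaluation pipeline:

```python
import math

def mse(reference, test):
    """Mean square error between two equal-length pixel sequences."""
    return sum((r - t) ** 2 for r, t in zip(reference, test)) / len(reference)

def psnr(reference, test, peak=255.0):
    """Peak signal-to-noise ratio in dB; higher means closer to the reference."""
    err = mse(reference, test)
    if err == 0:
        return float("inf")  # identical images
    return 10.0 * math.log10(peak ** 2 / err)

ref = [100, 120, 130, 140]
noisy = [102, 118, 131, 139]
print(mse(ref, noisy))             # → 2.5
print(round(psnr(ref, noisy), 2))  # → 44.15
```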
Abstract:
2000 Mathematics Subject Classification: 62P10, 92C20
Abstract:
It has been proposed that language impairments in children with Autism Spectrum Disorders (ASD) stem from atypical neural processing of speech and/or nonspeech sounds. However, the strength of this proposal is compromised by the unreliable outcomes of previous studies of speech and nonspeech processing in ASD. The aim of this study was to determine whether there was an association between poor spoken language and atypical event-related field (ERF) responses to speech and nonspeech sounds in children with ASD (n = 14) and controls (n = 18). Data from this developmental population (ages 6-14) were analysed using a novel combination of methods to maximize the reliability of our findings while taking into consideration the heterogeneity of the ASD population. The results showed that poor spoken language scores were associated with atypical left hemisphere brain responses (200 to 400 ms) to both speech and nonspeech in the ASD group. These data support the idea that some children with ASD may have an immature auditory cortex that affects their ability to process both speech and nonspeech sounds. Their poor speech processing may impair their ability to process the speech of other people, and hence reduce their ability to learn the phonology, syntax, and semantics of their native language.