12 results for sequential injection analysis

in Aston University Research Archive


Relevance:

80.00%

Publisher:

Abstract:

In three experiments we investigated the impact that exposure to counter-stereotypes has on emotional reactions to outgroups. In Experiment 1, thinking about gender counter-stereotypes attenuated stereotyped emotions toward females and males. In Experiment 2, an immigrant counter-stereotype attenuated stereotyped emotions toward this outgroup and reduced dehumanization tendencies. Experiment 3 replicated these results using an alternative measure of humanization. In both Experiments 2 and 3, sequential mediational analysis revealed that counter-stereotypes produced feelings of surprise which, in turn, elicited a cognitive process of expectancy violation, resulting in attenuated stereotyped emotions and an enhanced use of uniquely human characteristics to describe the outgroup. The findings extend research supporting the usefulness of counter-stereotype exposure for reducing prejudice and highlight its positive impact on intergroup emotions.

Relevance:

40.00%

Publisher:

Abstract:

The objective of this study was to investigate the effects of circularity, comorbidity, prevalence and presentation variation on the accuracy of differential diagnoses made in optometric primary care using a modified form of naïve Bayesian sequential analysis. No such investigation has been reported before. Data were collected for 1422 cases seen over one year. Positive test outcomes were recorded for case history (ethnicity, age, symptoms and ocular and medical history) and clinical signs in relation to each diagnosis. Accordingly, only positive likelihood ratios were used in this modified form of Bayesian analysis, which was carried out with Laplacian correction and chi-square filtration. Accuracy was expressed as the percentage of cases for which the diagnoses made by the clinician appeared at the top of a list generated by Bayesian analysis.

Preliminary analyses were carried out on 10 diagnoses and 15 test outcomes. Accuracy of 100% was achieved in the absence of presentation variation but dropped by 6% when variation existed. Circularity artificially elevated accuracy by 0.5%. Surprisingly, removal of chi-square filtering increased accuracy by 0.4%. Decision tree analysis showed that accuracy was influenced primarily by prevalence, followed by presentation variation and comorbidity.

Analysis of 35 diagnoses and 105 test outcomes followed. This explored the use of positive likelihood ratios, derived from the case history, to recommend signs to look for. Accuracy of 72% was achieved when all clinical signs were entered. The drop in accuracy, compared to the preliminary analysis, was attributed to the fact that some diagnoses lacked strong diagnostic signs; the accuracy increased by 1% when only recommended signs were entered. Chi-square filtering improved recommended test selection. Decision tree analysis showed that accuracy was again influenced primarily by prevalence, followed by comorbidity and presentation variation.
Future work will explore the use of likelihood ratios based on positive and negative test findings prior to considering naïve Bayesian analysis as a form of artificial intelligence in optometric practice.
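The ranking scheme described above, naïve Bayes restricted to positive likelihood ratios with Laplacian correction, can be sketched as follows. The case data and condition names here are invented for illustration, and the chi-square filtering stage is omitted:

```python
import math

def rank_diagnoses(positive_findings, cases):
    """Rank diagnoses by prior odds times positive likelihood ratios.

    cases: list of (diagnosis, set_of_positive_findings) observed records.
    Returns diagnoses ordered from most to least likely."""
    diagnoses = {d for d, _ in cases}
    scores = {}
    for d in diagnoses:
        with_d = [f for dx, f in cases if dx == d]
        without_d = [f for dx, f in cases if dx != d]
        log_score = math.log(len(with_d) / len(cases))  # prevalence prior
        for finding in positive_findings:
            # Laplace-corrected sensitivity and false-positive rate,
            # combined as a positive likelihood ratio LR+ = sens / fpr.
            sens = (sum(finding in f for f in with_d) + 1) / (len(with_d) + 2)
            fpr = (sum(finding in f for f in without_d) + 1) / (len(without_d) + 2)
            log_score += math.log(sens / fpr)
        scores[d] = log_score
    return sorted(scores, key=scores.get, reverse=True)
```

Clinician accuracy in the study corresponds to how often the clinician's diagnosis appears first in such a ranked list.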

Relevance:

30.00%

Publisher:

Abstract:

The main aim of the work is to investigate sequential pyrolysis of willow SRC using two different heating rates (25 and 1500 °C/min) between 320 and 520 °C. Thermogravimetric analysis (TGA) and pyrolysis-gas chromatography-mass spectrometry (Py-GC-MS) have been used for this analysis. In addition, laboratory scale processing has been undertaken to compare product distribution from fast and slow pyrolysis at 500 °C. Fast pyrolysis was carried out using a 1 kg/h continuous bubbling fluidized bed reactor, and slow pyrolysis using a 100 g batch reactor. Findings from this study show that heating rate and pyrolysis temperature have a significant influence on the chemical content of decomposition products. From the analytical sequential pyrolysis, an inverse relationship was seen between the total yield of furfural (at high heating rates) and 2-furanmethanol (at low heating rates). The total yield of 1,2-dihydroxybenzene (catechol) was found to be significantly higher at low heating rates. The catechol intermediates 2-methoxy-4-(2-propenyl)phenol (eugenol), 2-methoxyphenol (guaiacol), 4-hydroxy-3,5-dimethoxybenzaldehyde (syringaldehyde) and 4-hydroxy-3-methoxybenzaldehyde (vanillin) were found to be highest at high heating rates. It was also found that laboratory scale processing alters the chemical composition of the pyrolysis bio-oil and the proportions of pyrolysis product yields. The GC-MS/FID analysis of fast and slow pyrolysis bio-oils reveals significant differences. © 2011 Elsevier Ltd. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

Grafting of antioxidants and other modifiers onto polymers by reactive extrusion has been performed successfully by the Polymer Processing and Performance Group at Aston University. Traditionally, the optimum conditions for the grafting process have been established within a Brabender internal mixer. Transfer of this batch process to a continuous processor, such as an extruder, has typically been empirical. To have more confidence in the success of direct transfer of the process requires knowledge of, and comparison between, residence times, mixing intensities, shear rates and flow regimes in the internal mixer and in the continuous processor. The continuous processor chosen for the current work is the closely intermeshing, co-rotating twin-screw extruder (CICo-TSE). CICo-TSEs contain screw elements that convey material with a self-wiping action and are widely used for polymer compounding and blending. Of the different mixing modules contained within the CICo-TSE, the trilobal elements, which impose intensive mixing, and the mixing discs, which impose extensive mixing, are of importance when establishing the intensity of mixing. In this thesis, the flow patterns within the various regions of the single-flighted conveying screw elements, and within both the trilobal element and mixing disc zones, of a Betol BTS40 CICo-TSE have been modelled using the computational fluid dynamics package Polyflow. A major obstacle encountered when solving the flow problem within all of these sets of elements arises from both the complex geometry and the time-dependent flow boundaries as the elements rotate about their fixed axes. The time-dependent boundaries were handled by selecting a number of sequential 2D and 3D geometries, used to represent partial mixing cycles.
The flow fields were simulated using the ideal rheological properties of polypropylene and characterised in terms of velocity vectors, generated shear stresses and a parameter known as the mixing efficiency. The majority of the large 3D simulations were performed on the Cray J90 supercomputer at the Rutherford Appleton Laboratory, with pre- and post-processing operations achieved via a Silicon Graphics Indy workstation. A mechanical model was constructed consisting of various CICo-TSE elements rotating within a transparent outer barrel, and a technique was developed using coloured viscous clays whereby the flow patterns and mixing characteristics within the CICo-TSE may be visualised. To test and verify the simulated predictions, the patterns observed within the mechanical model were compared with the flow patterns predicted by the computational model. The flow patterns within the single-flighted conveying screw elements in particular showed good agreement between the experimental and simulated results.

Relevance:

30.00%

Publisher:

Abstract:

The primary objective of this research was to examine chemical modification of polymer blends by reactive processing, using interlinking agents (multi-functional, activated vinyl compounds: trimethylolpropane triacrylate {TRIS} and divinylbenzene {DVB}) to target in-situ interpolymer formation between the immiscible polymers in PS/EPDM blends via peroxide-initiated free radical reactions during melt mixing. From a comprehensive survey of previous studies of compatibility enhancement in polystyrene blends, it was recognised that reactive processing offers opportunities for technological success that have not yet been fully realised; learning from this study is expected to assist in the development and application of this potential. In an experimental-scale operation for the simultaneous melt blending and reactive processing of both polymers, involving manual injection of precise reactive agent/free radical initiator mixtures directly into molten polymer within an internal mixer, torque changes were distinct, quantifiable and rationalised by ongoing physical and chemical effects. The EPDM content of PS/EPDM blends was the prime determinant of torque increases on addition of TRIS, itself liable to self-polymerisation at high additions, with little indication of PS reaction in initial reactively processed blends with TRIS, though blend compatibility, from visual assessment of morphology by SEM, was nevertheless improved. Suitable operating windows were defined for the optimisation of reactive blending, for use once routes to encourage PS reaction could be identified. The effectiveness of PS modification by reactive processing with interlinking agents was increased by selecting process conditions to target specific reaction routes, assessed by spectroscopy (FT-IR and NMR) and thermal analysis (DSC) coupled with dichloromethane extraction and fractionation of PS.
Initiator concentration was crucial in balancing desired PS modification against interlinking agent self-polymerisation, most particularly with TRIS. Pre-addition of initiator to PS was beneficial in enhancing TRIS binding to PS and minimising modifier polymerisation, believed to arise from direct formation of polystyryl radicals for addition to active unsaturation in TRIS. DVB was found to be a "compatible" modifier for PS, but its efficacy was not quantified. Application of routes for PS reaction in PS/EPDM blends was successful for in-situ formation of interpolymer (shown by sequential solvent extraction combined with FT-IR and DSC analysis), the predominant outcome depending on the degree of reaction of each component, with optimum "between-phase" interpolymer formed under conditions selected to equalise the differing component reactivities and avoid competitive processes. This was achieved for combined addition of TRIS+DVB at optimum initiator concentrations with initiator pre-addition to PS. Improvements in blend compatibility (by tensile testing, SEM and thermal analysis) were shown in all cases with significant interpolymer formation, though corresponding physical benefits were not; morphology and other reactive effects were also important factors. Interpolymer from specific "between-phase" reaction of the blend components and interlinking agent was vital for the realisation of positive performance on compatibilisation by chemical modification of polymer blends through reactive processing.

Relevance:

30.00%

Publisher:

Abstract:

The principled statistical application of Gaussian random field models used in geostatistics has historically been limited to data sets of a small size. This limitation is imposed by the requirement to store and invert the covariance matrix of all the samples to obtain a predictive distribution at unsampled locations, or to use likelihood-based covariance estimation. Various ad hoc approaches to solve this problem have been adopted, such as selecting a neighborhood region and/or a small number of observations to use in the kriging process, but these have no sound theoretical basis and it is unclear what information is being lost. In this article, we present a Bayesian method for estimating the posterior mean and covariance structures of a Gaussian random field using a sequential estimation algorithm. By imposing sparsity in a well-defined framework, the algorithm retains a subset of “basis vectors” that best represent the “true” posterior Gaussian random field model in the relative entropy sense. This allows a principled treatment of Gaussian random field models on very large data sets. The method is particularly appropriate when the Gaussian random field model is regarded as a latent variable model, which may be nonlinearly related to the observations. We show the application of the sequential, sparse Bayesian estimation in Gaussian random field models and discuss its merits and drawbacks.
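The sequential, sparsity-constrained idea can be illustrated with a toy online Gaussian process regressor that caps the number of retained basis vectors. This is a simplified sketch (novelty-based admission under a fixed budget, with an invented squared-exponential kernel), not the relative-entropy projection used in the article:

```python
import numpy as np

def rbf(a, b, ell=1.0):
    # Squared-exponential covariance between two 1-D point sets.
    d = np.asarray(a)[:, None] - np.asarray(b)[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

class SparseSequentialGP:
    """Online GP regression keeping at most `budget` basis vectors."""

    def __init__(self, budget=20, noise=1e-4, tol=1e-3):
        self.budget, self.noise, self.tol = budget, noise, tol
        self.X = np.empty(0)  # retained basis vector locations
        self.y = np.empty(0)  # associated targets

    def update(self, x, y):
        # Admit a point only if it is "novel" (poorly represented by
        # the current basis) and the budget is not yet exhausted.
        if self.X.size == 0:
            self.X, self.y = np.array([x]), np.array([y])
            return
        K = rbf(self.X, self.X) + 1e-9 * np.eye(self.X.size)
        k = rbf(self.X, [x])[:, 0]
        gamma = 1.0 - k @ np.linalg.solve(K, k)  # residual variance of x
        if gamma > self.tol and self.X.size < self.budget:
            self.X = np.append(self.X, x)
            self.y = np.append(self.y, y)

    def predict(self, Xs):
        # Posterior mean at new locations from the retained basis only.
        K = rbf(self.X, self.X) + self.noise * np.eye(self.X.size)
        return rbf(Xs, self.X) @ np.linalg.solve(K, self.y)
```

Because every linear solve involves only the retained basis, the per-update cost stays bounded regardless of how many observations stream past, which is the point of the sparse treatment for very large data sets.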

Relevance:

30.00%

Publisher:

Abstract:

The retrieval of wind vectors from satellite scatterometer observations is a non-linear inverse problem. A common approach to solving inverse problems is to adopt a Bayesian framework and to infer the posterior distribution of the parameters of interest given the observations by using a likelihood model relating the observations to the parameters, and a prior distribution over the parameters. We show how Gaussian process priors can be used efficiently with a variety of likelihood models, using local forward (observation) models and direct inverse models for the scatterometer. We present an enhanced Markov chain Monte Carlo method to sample from the resulting multimodal posterior distribution. We go on to show how the computational complexity of the inference can be controlled by using a sparse, sequential Bayes algorithm for estimation with Gaussian processes. This helps to overcome the most serious barrier to the use of probabilistic, Gaussian process methods in remote sensing inverse problems, which is the prohibitively large size of the data sets. We contrast the sampling results with the approximations that are found by using the sparse, sequential Bayes algorithm.
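A minimal sketch of sampling a multimodal posterior with a random-walk Metropolis chain; the toy two-mode target below stands in for the ambiguity-induced multimodality of the wind posterior and is not the paper's scatterometer model or its enhanced MCMC scheme:

```python
import numpy as np

def log_post(theta):
    # Toy bimodal log-posterior with modes at theta = -2 and theta = +2,
    # mimicking the directional ambiguity of scatterometer retrievals.
    return np.logaddexp(-0.5 * (theta - 2.0) ** 2,
                        -0.5 * (theta + 2.0) ** 2)

def metropolis(log_target, n_steps=20000, step=2.5, seed=0):
    # Random-walk Metropolis: propose a Gaussian jump, accept with
    # probability min(1, target ratio).  A large step size lets the
    # chain hop between the two modes.
    rng = np.random.default_rng(seed)
    theta, lp = 0.0, log_target(0.0)
    samples = np.empty(n_steps)
    for i in range(n_steps):
        prop = theta + step * rng.standard_normal()
        lp_prop = log_target(prop)
        if np.log(rng.random()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        samples[i] = theta
    return samples
```

With a poorly chosen step size the chain would stick in one mode, which is why specialised proposals are needed for real multimodal retrieval posteriors.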

Relevance:

30.00%

Publisher:

Abstract:

The reliability of printed circuit board assemblies under dynamic environments, such as those found onboard airplanes, ships and land vehicles, is receiving more attention. This research analyses the dynamic characteristics of a printed circuit board (PCB) supported by edge retainers and plug-in connectors. By modelling the wedge retainer and connector as simply supported boundary conditions with appropriate rotational spring stiffnesses along their respective edges, with the aid of finite element codes, natural frequencies for the board that agree well with experimental values are obtained. For a PCB supported by two opposite wedge retainers and a plug-in connector, with its remaining edge free of any restraint, it is found that these real supports behave somewhere between the simply supported and clamped boundary conditions, providing a percentage fixity 39.5% above the classical simply supported case. By using an eigensensitivity method, the rotational stiffnesses representing the boundary supports of the PCB can be updated effectively, and the updated model is capable of representing the dynamics of the PCB accurately. The result shows that the percentage error in the fundamental frequency of the PCB finite element model is substantially reduced from 22.3% to 1.3%. The procedure demonstrates the effectiveness of using only the vibration test frequencies as reference data when the mode shapes of the original untuned model are almost identical to the referenced modes/experimental data. When only modal frequencies are used in model improvement, the analysis is very much simplified; furthermore, the time taken to obtain the experimental data is substantially reduced as the experimental mode shapes are not required. In addition, this thesis advocates a relatively simple method for determining the support locations that maximise the fundamental frequency of vibrating structures.
The technique is simple and does not require any optimisation or sequential search algorithm in the analysis. The key to the procedure is to position the necessary supports so as to eliminate the lower modes of the original configuration. This is accomplished by introducing point supports along the nodal lines of the highest possible mode of the original configuration, so that all the lower modes are eliminated by the introduction of the new or extra supports to the structure. It is also proposed to inspect the average driving point residues along the nodal lines of vibrating plates to find the optimal locations of the supports. Numerical examples are provided to demonstrate the validity of the approach. Applying it to the PCB supported on three sides by two wedge retainers and a connector, it is found that the single point constraint yielding the maximum fundamental frequency is located at the mid-point of the nodal line, namely node 39. This point support has the effect of increasing the structure's fundamental frequency from 68.4 Hz to 146.9 Hz, or 115% higher.
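The frequency-only model-updating idea can be illustrated on a one-parameter toy: tuning a spring stiffness so that a single-degree-of-freedom model reproduces a measured natural frequency. The 68.4 Hz target is borrowed from the text purely as an example; the mass value and the bisection search are invented for illustration and are not the eigensensitivity method of the thesis:

```python
import math

def natural_frequency_hz(k, m):
    # Fundamental frequency of a single-DOF spring-mass model (Hz).
    return math.sqrt(k / m) / (2.0 * math.pi)

def update_stiffness(f_target_hz, m, k_lo=1e2, k_hi=1e9, tol=1e-6):
    """Bisect on spring stiffness until the model frequency matches a
    measured target -- a one-parameter analogue of updating the
    rotational support stiffnesses from test frequencies alone.
    Frequency increases monotonically with k, so bisection converges."""
    while k_hi - k_lo > tol * k_hi:
        k_mid = 0.5 * (k_lo + k_hi)
        if natural_frequency_hz(k_mid, m) < f_target_hz:
            k_lo = k_mid
        else:
            k_hi = k_mid
    return 0.5 * (k_lo + k_hi)
```

In the real problem several rotational stiffnesses are updated simultaneously via eigensensitivities, but the principle is the same: adjust the support parameters until the model frequencies match the test frequencies.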

Relevance:

30.00%

Publisher:

Abstract:

The trend in modal extraction algorithms is to use all the available frequency response function data to obtain a global estimate of the natural frequencies, damping ratios and mode shapes. Improvements in transducer and signal processing technology allow the simultaneous measurement of many hundreds of channels of response data. The quantity of data available and the complexity of the extraction algorithms make considerable demands on the available computer power and require a powerful computer or dedicated workstation to perform satisfactorily. An alternative to waiting for faster sequential processors is to implement the algorithm in parallel, for example on a network of Transputers. Parallel architectures are a cost-effective means of increasing computational power, and a larger number of response channels would simply require more processors. This thesis considers how two typical modal extraction algorithms, the Rational Fraction Polynomial method and the Ibrahim Time Domain method, may be implemented on a network of Transputers. The Rational Fraction Polynomial method is a well-known and robust frequency domain 'curve fitting' algorithm. The Ibrahim Time Domain method is an efficient algorithm that 'curve fits' in the time domain. This thesis reviews the algorithms, considers the problems involved in a parallel implementation, and shows how they were implemented on a real Transputer network.
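Because each response channel can be curve-fitted independently, the parallelisation is essentially a map over channels. The sketch below uses a thread pool in place of a Transputer network, with a plain least-squares polynomial fit standing in for the Rational Fraction Polynomial method; both substitutions are purely illustrative:

```python
from concurrent.futures import ThreadPoolExecutor
import numpy as np

def fit_channel(frf):
    # Stand-in for a per-channel curve fit: a least-squares polynomial
    # fit to one frequency response function (freqs, magnitudes).
    freqs, mags = frf
    return np.polyfit(freqs, mags, deg=4)

def fit_all(frfs, workers=4):
    # The per-channel fits are independent, so they can be farmed out
    # to a pool of workers, one channel per task -- the same way
    # channels would be distributed across processor nodes.
    with ThreadPoolExecutor(max_workers=workers) as ex:
        return list(ex.map(fit_channel, frfs))
```

Adding response channels then only requires adding workers, mirroring the thesis's point that a larger number of channels would simply require more processors.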

Relevance:

30.00%

Publisher:

Abstract:

An essential stage in endocytic coated vesicle recycling is the dissociation of clathrin from the vesicle coat by the molecular chaperone, 70-kDa heat-shock cognate protein (Hsc70), and the J-domain-containing protein, auxilin, in an ATP-dependent process. We present a detailed mechanistic analysis of clathrin disassembly catalyzed by Hsc70 and auxilin, using loss of perpendicular light scattering to monitor the process. We report that a single auxilin per clathrin triskelion is required for maximal rate of disassembly, that ATP is hydrolyzed at the same rate that disassembly occurs, and that three ATP molecules are hydrolyzed per clathrin triskelion released. Stopped-flow measurements revealed a lag phase in which the scattering intensity increased owing to association of Hsc70 with clathrin cages followed by serial rounds of ATP hydrolysis prior to triskelion removal. Global fit of stopped-flow data to several physically plausible mechanisms showed the best fit to a model in which sequential hydrolysis of three separate ATP molecules is required for the eventual release of a triskelion from the clathrin-auxilin cage.
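A requirement of several sequential hydrolysis steps before release implies Erlang (gamma-distributed) waiting times, which is exactly what produces a lag phase in the disassembly traces. A small sketch of that closed form, with illustrative (not fitted) rate constant and step count:

```python
import math
import numpy as np

def fraction_released(t, k, n=3):
    """Fraction of triskelia released by time t when release requires n
    sequential hydrolysis steps, each exponential with rate k.

    The waiting time is then Erlang(n, k), whose CDF is
        F(t) = 1 - exp(-k*t) * sum_{i=0}^{n-1} (k*t)**i / i!
    For n > 1 this rises with zero initial slope, reproducing the lag
    phase seen in the stopped-flow measurements, whereas a single-step
    mechanism (n = 1) would show no lag."""
    kt = k * np.asarray(t, dtype=float)
    partial = sum(kt**i / math.factorial(i) for i in range(n))
    return 1.0 - np.exp(-kt) * partial
```

Comparing the early-time behaviour of the n = 3 and n = 1 curves is one way a global fit can discriminate between sequential and single-step mechanisms.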

Relevance:

30.00%

Publisher:

Abstract:

Disclosed is a fluid sampling apparatus (12). The apparatus has a sample inlet port (14) in communication with a fluid space (10) containing the fluid to be sampled. An analysis port (16) is provided for communication with an analysis device such as a mass spectrometer. A dilution gas injection port (22) is provided to dilute fluid sampled from the fluid space via the sample inlet port. The diluted sample fluid is then conducted to the analysis port. The sampling apparatus is intended particularly for use in analysing biomass pyrolysis processes.

Relevance:

30.00%

Publisher:

Abstract:

We propose a fibre-based approach for the generation of optical frequency combs (OFCs) aimed at the calibration of astronomical spectrographs in the low- and medium-resolution range. The approach comprises two steps: in the first, an appropriate state of optical pulses is generated; in the second, this state is moulded to deliver the desired OFC. More precisely, the first step is realised by injection of two continuous-wave (CW) lasers into a conventional single-mode fibre, whereas the second step generates a broad OFC using the optical solitons generated in the first step as the initial condition. We investigate the conversion of a bichromatic input wave, produced by the two initial CW lasers, into a train of optical solitons in the first-step fibre. In particular, we are interested in the soliton content of the pulses created in this fibre. To this end, we study different initial conditions (a single cosine-hump, an Akhmediev breather, and a deeply modulated bichromatic wave) by means of soliton radiation beat analysis and compare the results to draw conclusions about the soliton content of the state generated in the first step. In the case of a deeply modulated bichromatic wave, we observed the formation of a collective soliton crystal at low input powers and the appearance of separated solitons at high input powers. An intermediate state showing the features of both the soliton crystal and the separated solitons turned out to be most suitable for the generation of OFCs for the calibration of astronomical spectrographs.
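Pulse evolution in such a fibre is governed by the nonlinear Schrödinger equation (NLSE). A minimal split-step Fourier sketch, in generic normalised units rather than the paper's fibre parameters, and not the soliton radiation beat analysis itself, is:

```python
import numpy as np

def split_step_nlse(u0, dt, dz, n_steps, beta2=-1.0, gamma=1.0):
    """Propagate u(z, t) under the NLSE
        du/dz = -1j*(beta2/2)*d2u/dt2 + 1j*gamma*|u|**2 * u
    using the first-order split-step Fourier scheme: the dispersive
    step is applied in the frequency domain, the nonlinear phase
    rotation in the time domain.  beta2 < 0 is the anomalous-dispersion
    (soliton-supporting) regime."""
    w = 2.0 * np.pi * np.fft.fftfreq(u0.size, d=dt)  # angular frequencies
    lin = np.exp(1j * (beta2 / 2.0) * w**2 * dz)     # dispersion over dz
    u = u0.astype(complex)
    for _ in range(n_steps):
        u = np.fft.ifft(lin * np.fft.fft(u))
        u *= np.exp(1j * gamma * np.abs(u)**2 * dz)  # Kerr nonlinearity
    return u
```

A quick sanity check of such a solver is that the fundamental sech soliton propagates with an invariant intensity profile, the same stationarity that underlies identifying the soliton content of more complicated input states.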