9 results for clean-up procedure for PBTs
in Aston University Research Archive
Abstract:
The procedure for successful scale-up of batchwise emulsion polymerisation has been studied. The relevant literature on liquid-liquid dispersion, on scale-up and on emulsion polymerisation has been critically reviewed. Batchwise emulsion polymerisation of styrene in a specially built 3 litre, unbaffled reactor confirmed that impeller speed had a direct effect on the latex particle size and on the reaction rate. This was noted to be more significant at low soap concentrations, and the phenomenon was related to the depletion of micelle-forming soap by soap adsorption onto the monomer emulsion surface. The scale-up procedure necessary to maintain constant monomer emulsion surface area in an unbaffled batch reactor was therefore investigated. Three geometrically similar vessels of 152, 229 and 305 mm internal diameter, and a range of impeller speeds (190 to 960 r.p.m.), were employed. The droplet sizes were measured either through photomicroscopy or via a Coulter Counter. The power input to the impeller was also measured. A scale-up procedure was proposed based on the governing relationship between droplet diameter, impeller speed and impeller diameter. The relationships between impeller speed, soap concentration, latex particle size and reaction rate were investigated in a series of polymerisations employing an amended commercial recipe for polystyrene. The particle size was determined via a light transmission technique. Two computer models, based on the Smith and Ewart approach but taking into account the adsorption/desorption of soap at the monomer surface, were successful in predicting the particle size and the progress of the reaction up to the end of stage II, i.e. to the end of the period of constant reaction rate.
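The governing droplet-size relationship itself is not reproduced in the abstract. As a purely illustrative sketch, the snippet below assumes the textbook turbulent-dispersion correlation d32/D ∝ We^-0.6 (with Weber number We = ρN²D³/σ) and shows how an impeller speed would then be chosen on scale-up so that the mean monomer droplet size, and hence the emulsion surface area, stays constant. The exponent, the 400 r.p.m. starting speed and the use of vessel diameters as the geometric ratio are assumptions for illustration, not values from the thesis.

    # Hypothetical sketch: constant droplet size on scale-up, assuming d32/D ~ We**(-0.6)
    def impeller_speed_for_constant_droplet_size(n1_rpm, d1_m, d2_m, exponent=-0.6):
        # d32 ~ N**(2*exponent) * D**(1 + 3*exponent), so holding d32 fixed gives
        # N2 = N1 * (D1/D2)**((1 + 3*exponent) / (2*exponent))
        power = (1.0 + 3.0 * exponent) / (2.0 * exponent)
        return n1_rpm * (d1_m / d2_m) ** power

    # e.g. scaling from the 152 mm vessel at an assumed 400 r.p.m. to the 305 mm vessel
    print(impeller_speed_for_constant_droplet_size(400, 0.152, 0.305))  # ~251 r.p.m.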
Abstract:
Ion exchange resins are used for many purposes in various areas of science and commerce. One example is the use of cation exchange resins in the nuclear industry for the clean-up of radioactively contaminated water (for example, the removal of 137Cs). However, during removal of radionuclides, the resin itself becomes radioactively contaminated and must be treated as Intermediate Level Waste. This radioactive contamination of the resin creates a disposal problem. Conventionally, there are two main avenues of disposal for industrial wastes, landfill burial or incineration. However, these are regarded as inappropriate for the disposal of the cation exchange resin involved in this project. Thus, a method involving the use of Fenton's Reagent (hydrogen peroxide with a soluble iron catalyst) to destroy the resin by wet oxidation has been developed. This process converts 95% of the solid resin to gaseous CO2, thus greatly reducing the volume of radioactive waste that has to be disposed of. However, hydrogen peroxide is an expensive reagent, and is a major component of the cost of any potential plant for the destruction of ion exchange resin. The aim of my project has been to discover a way of improving the efficiency of the destruction of the resin, thus reducing the cost involved in the use of hydrogen peroxide. The work on this problem has been concentrated in two main areas: (1) the use of analytical techniques such as NMR and IR to follow the process of the hydrogen peroxide destruction of both resin beads and model systems such as water-soluble calixarenes; and (2) the use of various physical and chemical techniques in an attempt to improve the overall efficiency of hydrogen peroxide utilization. Examples of these techniques include UV irradiation, both with and without a photocatalyst, oxygen-carrying molecules and various stirring regimes.
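For context, the chemistry usually invoked for Fenton's Reagent is the iron-catalysed generation of hydroxyl radicals, which then attack the organic resin; this is standard textbook chemistry rather than a detail taken from the abstract:

    Fe2+ + H2O2 → Fe3+ + OH− + •OH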
Abstract:
The objective of this study was to design, construct, commission and operate a laboratory-scale gasifier system that could be used to investigate the parameters that influence the gasification process. The gasifier is of the open-core variety and is fabricated from 7.5 cm bore quartz glass tubing. Gas cleaning is by a centrifugal contacting scrubber, with the product gas being flared. The system employs an on-line dedicated gas analysis system, monitoring the levels of H2, CO, CO2 and CH4 in the product gas. The gas composition data, as well as the gas flowrate, temperatures throughout the system and pressure data, are recorded using a BBC microcomputer-based data-logging system. Ten runs have been performed using the system, of which six were predominantly commissioning runs. The main emphasis in the commissioning runs was placed on the gas clean-up, the product gas cleaning and the reactor bed temperature measurement. The reaction was observed to occur in a narrow band, about 3 to 5 particle diameters thick. Initially the fuel was pyrolysed, with the volatiles produced being combusted and providing the energy to drive the process, and then the char product was gasified by reaction with the pyrolysis gases. Normally, the gasifier is operated with the reaction zone supported on a bed of char, although it has been operated for short periods without a char bed. At steady state the depth of char remains constant, but by adjusting the air inlet rate it has been shown that the depth of char can be increased or decreased. It has been shown that increasing the depth of the char bed effects some improvement in the product gas quality.
Abstract:
Here we report on a potential catalytic process for efficient clean-up of plastic pollution in waters, such as the Great Pacific Garbage Patch (GPGP). Detailed catalytic mechanisms of RuO2 during supercritical water gasification of common polyolefin plastics, including low-density polyethylene (LDPE), high-density polyethylene (HDPE), polypropylene (PP) and polystyrene (PS), have been investigated in a batch reactor at 450 °C for 60 min. All four plastics gave very high carbon gasification efficiencies (CGE) and hydrogen gasification efficiencies (HGE). Methane was the most abundant gas component, with a yield of up to 37 mol kg−1 of LDPE using the 20 wt% RuO2 catalyst. Evaluation of the gas yields, CGE and HGE revealed that the conversion of PS involved thermal degradation, steam reforming and methanation, whereas hydrogenolysis was a possible additional mechanism during the conversion of the aliphatic plastics. The process has the benefits of producing a clean, pressurized, methane-rich fuel gas as well as cleaning up hydrocarbon-polluted waters.
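The abstract does not define CGE and HGE; the sketch below uses the definition commonly adopted in supercritical water gasification work, namely the percentage of the carbon (or hydrogen) fed as plastic that is recovered in the product gas. The gas yields other than the 37 mol kg−1 methane figure, and the treatment of LDPE as a simple (CH2)n chain, are illustrative assumptions rather than data from the paper.

    # Hypothetical sketch of carbon gasification efficiency (CGE); HGE is analogous.
    def gasification_efficiency(gas_yield_mol_per_kg, atoms_per_molecule, feed_atoms_mol_per_kg):
        # percentage of the feed's C (or H) atoms recovered in the product gas
        recovered = sum(y * atoms_per_molecule[s] for s, y in gas_yield_mol_per_kg.items())
        return 100.0 * recovered / feed_atoms_mol_per_kg

    # Illustrative LDPE example: a (CH2)n repeat unit gives ~71.3 mol C per kg of plastic
    gas_yields = {"CH4": 37.0, "CO2": 5.0, "H2": 10.0}   # mol per kg plastic (CO2 and H2 assumed)
    carbon_atoms = {"CH4": 1, "CO2": 1, "H2": 0}
    print(gasification_efficiency(gas_yields, carbon_atoms, 1000.0 / 14.03))  # CGE in %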
Abstract:
Spectral and coherence methodologies are ubiquitous for the analysis of multiple time series. Partial coherence analysis may be used to try to determine graphical models for brain functional connectivity. The outcome of such an analysis may be considerably influenced by factors such as the degree of spectral smoothing, line and interference removal, matrix inversion stabilization and the suppression of effects caused by side-lobe leakage, the combination of results from different epochs and people, and multiple hypothesis testing. This paper examines each of these steps in turn and provides a possible path which produces relatively ‘clean’ connectivity plots. In particular we show how spectral matrix diagonal up-weighting can simultaneously stabilize spectral matrix inversion and reduce effects caused by side-lobe leakage, and use the stepdown multiple hypothesis test procedure to help formulate an interaction strength.
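As a minimal sketch of the diagonal up-weighting step described above (not the authors' code), assume a p × p spectral matrix S estimated at one frequency and a small user-chosen weight lam; partial coherences are then read off the regularized inverse:

    # Diagonal up-weighting to stabilize spectral matrix inversion (illustrative only)
    import numpy as np

    def partial_coherence(S, lam=0.01):
        # S: p x p complex (Hermitian) spectral matrix at one frequency
        S_reg = S + lam * np.diag(np.diag(S).real)   # up-weight the diagonal
        G = np.linalg.inv(S_reg)
        d = np.sqrt(np.diag(G).real)
        P = np.abs(G / np.outer(d, d)) ** 2          # squared partial coherence estimates
        np.fill_diagonal(P, 1.0)
        return P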
Abstract:
Nanotechnologies have been called the "Next Industrial Revolution." At the same time, scientists are raising concerns about the potential health and environmental risks related to the nano-sized materials used in nanotechnologies. Analyses suggest that current U.S. federal regulatory structures are not likely to adequately address these risks in a proactive manner. Given these trends, the premise of this paper is that state and local-level agencies will likely deal with many "end-of-pipe" issues as nanomaterials enter environmental media without prior toxicity testing, federal standards, or emissions controls. In this paper we (1) briefly describe potential environmental risks and benefits related to emerging nanotechnologies; (2) outline the capacities of the Toxic Substances Control Act, the Clean Air Act, the Clean Water Act, and the Resource Conservation and Recovery Act to address potential nanotechnology risks, and how risk data gaps challenge these regulations; (3) outline some of the key data gaps that challenge state-level regulatory capacities to address nanotechnologies' potential risks, using Wisconsin as a case study; and (4) discuss advantages and disadvantages of state versus federal approaches to nanotechnology risk regulation. In summary, we suggest some ways government agencies can be better prepared to address nanotechnology risk knowledge gaps and risk management.
Abstract:
A description of the background to testing friction materials for automotive brakes explains the need for a rapid, inexpensive means of assessing their behaviour in a way which is both accurate and meaningful. Various methods of controlling inertia dynamometers to simulate road vehicles are rejected in favour of programming by means of a commercially available XY plotter. Investigation of brake service conditions is used to set up test schedules, and a dynamometer programming unit is built to enable service conditions on vehicles to be simulated on a full-scale dynamometer. A technique is developed by which accelerated testing can be achieved without operating under overload conditions, saving time and cost without sacrificing validity. The development of programming by XY plotter is described, with a method of operating one XY plotter to programme the machine, monitor its own behaviour, and plot its own results in logical sequence. Commissioning trials are described and the generation of reproducible results in frictional behaviour and material durability is discussed. Techniques are developed to cross-check the operation of the machine in retrospect, and to retrospectively correct results in the event of malfunctions. Sensitivity errors in the measuring circuits are displayed between calibrations, whilst leaving the recorded results almost unaffected by error. Typical results of brake lining tests are used to demonstrate the range of performance parameters which can be studied by use of the machine. Successful test investigations completed on the machine are reported, including comments on the behaviour of cast iron drums and discs. The machine shows that materials can repeat their complex friction/temperature/speed/pressure relationships with a reproducibility of the order of ±0.003 in friction coefficient (μ) and ±0.0002 in. thickness loss during wear tests. Discussion of practical and academic implications completes the report with recommendations for further work in both fields.
Abstract:
Poly(Nε-trifluoroacetyl-l-lysine) was used as a model solute to investigate the potential of nonaqueous capillary electrophoresis (NACE) for the characterization of synthetic organic polymers. The information obtained by NACE was compared to that derived from size exclusion chromatography (SEC) experiments, and the two techniques were found to be complementary for polymer characterization. On one hand, NACE permitted (i) the separation of oligomers according to their molar mass and (ii) the separation of the polymers according to the nature of the end groups. On the other hand, SEC experiments were used for the characterization of the molar mass distribution for higher molar masses. Due to the tendency of the solutes (polypeptides) to adsorb onto the fused-silica capillary wall, careful attention was paid to the rinsing procedure of the capillary between runs in order to keep the capillary surface clean. For that purpose, the use of electrophoretic desorption under denaturing conditions was very effective. Optimization of the separation was performed by studying (i) the influence of the proportion of methanol in a methanol/acetonitrile mixture and (ii) the influence of acetic acid concentration in the background electrolyte. Highly resolved separation of the oligomers (up to a degree of polymerization n of ∼50) was obtained by adding trifluoroacetic acid to the electrolyte. Important information concerning the polymer conformations could be obtained from the mobility data. Two different plots relating the effective mobility data to the degree of polymerization were proposed for monitoring the changes in polymer conformations as a function of the number of monomers.
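The effective mobilities behind those plots are not defined in the abstract; the function below is the standard capillary electrophoresis expression for effective mobility corrected for electroosmotic flow, given here only as a generic illustration (the parameter names are assumptions, not quantities reported in the paper).

    # Generic CE effective-mobility calculation (illustrative, not from this paper)
    def effective_mobility(t_migration_s, t_eof_s, total_length_m, detector_length_m, voltage_V):
        # mobility in m^2 V^-1 s^-1, corrected using a neutral electroosmotic-flow marker
        return (detector_length_m * total_length_m / voltage_V) * (1.0 / t_migration_s - 1.0 / t_eof_s)

Plotting this quantity against the degree of polymerization n for each resolved oligomer peak gives the kind of conformation-sensitive plot described above.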
Abstract:
The thermal decomposition of propene over clean and sulphate-precovered Pt{111} has been followed by Fast XPS. The saturation propene coverage over the clean surface is 0.21 ML at 90 K. Propene is stable up to 200 K, above which molecular desorption and dehydrogenation result in the formation of a stable propylidyne intermediate adlayer at 300 K. Propylidyne decomposes above 400 K, eventually forming graphitic carbon above 800 K. Preadsorbed surface sulphate promotes room-temperature propene combustion associated with the decomposition of a thermally unstable alkyl–sulphate complex. Propylidyne also forms as on clean Pt{111}, but is less reactive, its decomposition above 450 K triggering partial oxidation with residual surface oxygen to liberate gas-phase CO.