934 results for Direct modified method


Relevance:

30.00%

Publisher:

Abstract:

A direct quadrupole ICP-MS technique has been developed for the analysis of the rare earth elements and yttrium in natural waters. The method has been validated by comparison of the results obtained for the river water reference material SLRS-4 with literature values. The detection limit of the technique was investigated by analysis of serial dilutions of SLRS-4, which revealed that single elements can be quantified at single-digit fg/g concentrations. A coherent normalised rare earth pattern was retained at concentrations two orders of magnitude below natural concentrations for SLRS-4, demonstrating the excellent inter-element accuracy and precision of the method. The technique was applied to the analysis of a diluted mid-salinity estuarine sample, which also displayed a coherent normalised rare earth element pattern, yielding the expected distinctive marine characteristics. (c) 2006 Published by Elsevier Ltd.
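The normalisation step behind a "coherent normalised rare earth pattern" can be sketched as follows; the reference-shale and sample concentrations below are hypothetical illustrative numbers, not values from the study.

```python
# Sketch of the shale-normalisation step behind a "normalised rare earth
# pattern": each measured concentration is divided by the corresponding
# value in a reference shale. All numbers below are illustrative only.

paas = {"La": 38290.0, "Ce": 79600.0, "Nd": 33900.0}   # PAAS-like shale, ng/g
sample = {"La": 0.215, "Ce": 0.360, "Nd": 0.181}       # hypothetical water, ng/g

def normalise(sample, reference):
    """Return shale-normalised concentrations, element by element."""
    return {el: sample[el] / reference[el] for el in sample}

pattern = normalise(sample, paas)
# A "coherent" pattern means adjacent elements vary smoothly, so ratios such
# as La_n/Ce_n stay nearly constant between a sample and its serial dilutions.
```

Retaining such inter-element ratios two orders of magnitude below natural levels is what demonstrates the inter-element accuracy the abstract describes.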

This paper presents the implementation of a modified particle filter for vision-based simultaneous localization and mapping of an autonomous robot in a structured indoor environment. Through this method, artificial landmarks such as multi-coloured cylinders can be tracked with a camera mounted on the robot, and the position of the robot can be estimated at the same time. Experimental results in simulation and in real environments show that this approach has advantages over the extended Kalman filter with ambiguous data association and various levels of odometric noise.
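The predict-weight-resample loop at the heart of any particle filter can be sketched minimally. This is an illustrative 1-D localisation example under assumed noise levels, not the paper's vision-based SLAM implementation with landmark tracking.

```python
import math
import random

random.seed(0)

# Minimal 1-D particle filter sketch: a robot moves along a corridor and a
# beacon at the origin returns its noisy position. All parameters are
# illustrative assumptions.
MOVE, MOVE_NOISE, SENSE_NOISE = 1.0, 0.2, 0.5
N = 500

def gauss_pdf(x, mu, sigma):
    """Gaussian likelihood used to weight particles."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

particles = [random.uniform(0.0, 20.0) for _ in range(N)]
true_pos = 2.0
for _ in range(8):
    true_pos += MOVE
    z = true_pos + random.gauss(0.0, SENSE_NOISE)           # noisy observation
    # Predict: propagate every particle through the motion model.
    particles = [p + MOVE + random.gauss(0.0, MOVE_NOISE) for p in particles]
    # Update: weight each particle by the observation likelihood.
    weights = [gauss_pdf(p, z, SENSE_NOISE) for p in particles]
    total = sum(weights)
    # Resample in proportion to weight (multinomial resampling).
    particles = random.choices(particles, weights=[w / total for w in weights], k=N)

estimate = sum(particles) / N   # posterior mean position
```

Ambiguous data association, where the extended Kalman filter struggles, is handled naturally here because the particle cloud can hold several hypotheses at once.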

Turbulent flow around a rotating circular cylinder has numerous applications, including wall shear stress and mass-transfer measurements related to corrosion studies. It is also of interest in the context of flow over convex surfaces, where standard turbulence models perform poorly. The main purpose of this paper is to elucidate the basic turbulence mechanism around a rotating cylinder at low Reynolds numbers to provide a better understanding of flow fundamentals. Direct numerical simulation (DNS) has been performed in a reference frame rotating at constant angular velocity with the cylinder. The governing equations are discretized by using a finite-volume method. As in fully developed channel, pipe, and boundary layer flows, a laminar sublayer, buffer layer, and logarithmic outer region were observed. The level of mean velocity is lower in the buffer and outer regions, but the logarithmic region still has a slope equal to the inverse of the von Karman constant. Instantaneous flow visualization revealed that the turbulence length scale typically decreases as the Reynolds number increases. Wavelet analysis provided some insight into the dependence of structural characteristics on wave number. The budget of the turbulent kinetic energy was computed and found to be similar to that in plane channel flow as well as in pipe and zero pressure gradient boundary layer flows. Coriolis effects appear as an equivalent production for the azimuthal and radial velocity fluctuations, leading to their ratio being lowered relative to similar nonrotating boundary layer flows.
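The logarithmic-law statement above can be made concrete. The von Karman constant and the additive constant below are conventional illustrative values, not the constants fitted in this simulation; the point is that the slope with respect to ln(y+) is 1/kappa regardless of the level.

```python
import math

# Log law of the wall in wall units: u+ = (1/kappa) * ln(y+) + B.
# The abstract reports the rotating cylinder keeps the 1/kappa slope but
# at a lowered level, i.e. a smaller additive constant B.
KAPPA = 0.41   # conventional von Karman constant (illustrative)

def u_plus(y_plus, B=5.0):
    """Mean velocity in wall units in the logarithmic region."""
    return math.log(y_plus) / KAPPA + B

# The slope in ln(y+) is independent of B:
slope = (u_plus(200.0) - u_plus(100.0)) / (math.log(200.0) - math.log(100.0))
# slope == 1/KAPPA, whether B = 5.0 or a lowered value
```

Lowering B shifts the whole profile down, which is exactly the "lower level, same slope" behaviour described.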

Freshwater is extremely precious; but even more precious than freshwater is clean freshwater. Although two thirds of our planet is covered in water, we have contaminated our globe with the chemicals used by industrial activities over the last century on an unprecedented scale, causing harm to humans and wildlife. We have to adopt a new scientific mindset in order to face this problem and protect this important resource. The Water Framework Directive (European Parliament and the Council, 2000) is a milestone legislative document that transformed the way water quality monitoring is undertaken across all Member States by introducing the Ecological and Chemical Status. A "good or higher" Ecological Status is expected to be achieved for all waterbodies in Europe by 2015. Yet for most European waterbodies, which are determined to be at risk or of moderate to bad quality, further information will be required so that adequate remediation strategies can be implemented. To date, water quality evaluation is based on five biological components (phytoplankton, macrophytes and benthic algae, macroinvertebrates and fishes) and various hydromorphological and physicochemical elements. The evaluation of the chemical status is principally based on 33 priority substances and on 12 xenobiotics considered dangerous for the environment. This approach takes into account only a fraction of the numerous xenobiotics that can be present in surface waters and cannot reveal all the possible causes of ecotoxicological stress that may act on a water section. Mixtures of toxic chemicals may constitute an ecological risk that is not predictable from the concentration of any single component. To improve water quality, sources of contamination and causes of ecological alterations need to be identified.
On the other hand, the analysis of community structure, which is the result of multiple processes including hydrological constraints and physico-chemical stress, gives back only a "photograph" of the actual status of a site without revealing the causes and sources of the perturbation. A multidisciplinary approach, able to integrate the information obtained by different methods, such as community structure analysis and eco-genotoxicological studies, could help overcome some of the difficulties in properly identifying the different causes of stress in risk assessment. In summary, the river ecological status is the result of a combination of multiple pressures that, for management purposes and quality improvement, have to be disentangled from each other. To reduce the present uncertainty in risk assessment, methods that establish quantitative links between levels of contamination and community alterations are needed. The analysis of macrobenthic invertebrate community structure has been widely used to identify sites subjected to perturbation, and trait-based descriptors of community structure constitute a useful tool in ecological risk assessment. The diagnostic capacity of freshwater biomonitoring could be improved by chronic sublethal toxicity testing of water and sediment samples. Because they require an exposure time that covers most of the species' life cycle, chronic toxicity tests are able to reveal negative effects on life-history traits at contaminant concentrations well below the acute toxicity level. Furthermore, the responses of high-level endpoints (growth, fecundity, mortality) can be integrated in order to evaluate the impact on population dynamics, a highly relevant endpoint from the ecological point of view. To gain more accurate information about the potential causes and consequences of environmental contamination, the evaluation of adverse effects at the physiological, biochemical and genetic levels is also needed.
The use of different biomarkers and toxicity tests can give information about the sub-lethal and toxic load of environmental compartments. Biomarkers give essential information about exposure to toxicants, such as endocrine disrupting compounds and genotoxic substances, whose negative effects cannot be detected using only high-level toxicological endpoints. The increasing presence of genotoxic pollutants in the environment has caused concern about the potential harmful effects of xenobiotics on human health, and interest in the development of new and more sensitive methods for the assessment of mutagenic and carcinogenic risk. Within the WFD, biomarkers and bioassays are regarded as important tools for gaining lines of evidence on cause-effect relationships in ecological quality assessment. Although the scientific community clearly recognises the advantages and necessity of an ecotoxicological approach within ecological quality assessment, a recent review reports that, more than a decade after the publication of the WFD, only a few studies have attempted to integrate ecological water status assessment and biological methods (namely biomarkers or bioassays), and none of the fifteen reviewed studies included both biomarkers and bioassays. The integrated approach developed in this PhD Thesis comprises a set of laboratory bioassays (Daphnia magna acute and chronic toxicity tests, Comet Assay and FPG-Comet) that were newly developed, adapted from existing standardized protocols, or applied to freshwater quality testing (ecotoxicological, genotoxicological and toxicogenomic assays), coupled with field investigations of macrobenthic community structure (SPEAR and EBI indices). Together with the development of new bioassays with Daphnia magna, the feasibility of eco-genotoxicological testing of freshwater and sediment quality with Heterocypris incongruens was evaluated (Comet Assay and a protocol for chronic toxicity).
However, the Comet Assay, although standardized, was not applied to freshwater samples because of the lack of sensitivity this species showed after 24 h of exposure to relatively high (and not environmentally relevant) concentrations of reference genotoxicants. Furthermore, this species proved unsuitable for chronic toxicity testing as well, owing to the difficulty of evaluating fecundity as a sub-lethal endpoint of exposure and to complications arising from its biology and behaviour. The study was applied to a pilot hydrographic sub-basin by selecting sections subjected to different levels of anthropogenic pressure: this allowed us to establish the reference conditions, to select the most significant endpoints, to evaluate the coherence of the responses of the different lines of evidence (alteration of community structure, eco-genotoxicological responses, alteration of gene expression profiles) and, finally, to assess the diagnostic capacity of the monitoring strategy. Significant correlations were found between the genotoxicological parameter Tail Intensity % (TI%) and the macrobenthic community descriptors SPEAR (p<0.001) and EBI (p<0.05); between the genotoxicological parameter describing DNA oxidative stress (ΔTI%) and mean levels of nitrates (p<0.01); and between reproductive impairment (Failed Development % from D. magna chronic bioassays) and TI% (p<0.001) as well as EBI (p<0.001). While the correlation among parameters demonstrates a general coherence in the response to increasing impacts, the concomitant ability of each single endpoint to respond to specific sources of stress underlies the diagnostic capacity of the integrated approach, as demonstrated by stations presenting a mismatch among the different lines of evidence.
The chosen set of bioassays and selected endpoints do not provide redundant indications of water quality status; on the contrary, they contribute complementary pieces of information about the several stressors acting simultaneously on a waterbody section, giving this monitoring strategy a solid diagnostic capacity. Our approach should provide opportunities for the integration of biological effects into monitoring programmes for surface water, especially in investigative monitoring. Moreover, it should provide a more realistic assessment of the impact and exposure of aquatic organisms to contaminants. Finally, this approach should provide an evaluation of the drivers of change in biodiversity and of their consequences for the provision of ecosystem functions and services, that is, the direct and indirect contributions to human well-being.
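The kind of correlation reported above, e.g. between Tail Intensity % and the SPEAR community descriptor, reduces to a standard Pearson coefficient over paired station values. The station data below are hypothetical, chosen only to illustrate the computation.

```python
import math

# Illustrative paired station values (hypothetical, not thesis data):
ti_percent = [4.1, 6.3, 8.0, 11.2, 14.5, 18.9]     # genotoxicity endpoint TI%
spear      = [62.0, 55.0, 49.0, 40.0, 33.0, 21.0]  # community descriptor

def pearson_r(x, y):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

r = pearson_r(ti_percent, spear)
# A strongly negative r is consistent with higher genotoxic damage at
# stations whose macrobenthic communities are more degraded.
```

The p-values quoted in the abstract would additionally come from a significance test on r given the number of stations.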

The effects of a 15-mer antisense c-myc phosphorothioate modified oligodeoxynucleotide (OdN) upon the volume-sensitive Cl- current in ROS 17/2.8 cells were investigated using the whole-cell configuration of the patch clamp technique. At 5 microM, the OdN reversibly inhibited the current in a voltage- and time-dependent fashion. This was evident from the reduction in the peak current as assessed at the termination of each voltage pulse and an acceleration of the time-dependent inactivation present at strongly depolarised potentials. The kinetic modifications induced by the OdN suggest it may act by blocking the pore of open channels when the cell membrane potential is depolarised.

This thesis consists of three empirical studies and one theoretical study. While China has received an increasing amount of foreign direct investment (FDI) and become the second largest host country for FDI in recent years, the absence of comprehensive studies on FDI inflows into this country drives this research. In the first study, an econometric model is developed to analyse the economic, political, cultural and geographic determinants of both pledged and realised FDI in China. The results of this study suggest that China's relatively cheaper labour force, high degree of international integration with the outside world (represented by its exports and imports) and bilateral exchange rates are the important economic determinants of both pledged FDI and realised FDI in China. The second study analyses the regional distribution of both pledged and realised FDI within China. The econometric properties of the panel data set are examined using a standardised 't-bar' test. The empirical results indicate that provinces with higher levels of international trade, lower wage rates, more R&D manpower, more preferential policies and closer ethnic links with overseas Chinese attract relatively more FDI. The third study constructs a dynamic equilibrium model to study the interactions among FDI, knowledge spillovers and long run economic growth in a developing country. The ideas of endogenous product cycles and trade-related international knowledge spillovers are modified and extended to FDI. The major conclusion is that, in the presence of FDI, economic growth is determined by the stock of human capital, the subjective discount rate and the knowledge gap, while unskilled labour cannot sustain growth. In the fourth study, the role of FDI in the growth process of the Chinese economy is investigated by using a panel of data for 27 provinces across China between 1986 and 1995.
In addition to FDI, domestic R&D expenditure, international trade and human capital are added to the standard convergence regressions to control for different structural characteristics in each province. The empirical results support endogenous innovation growth theory in which regional per capita income can converge given technological diffusion, transfer and imitation.
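The convergence regressions mentioned above can be sketched in their most reduced form: regress per-capita income growth on log initial income, where a negative slope indicates convergence. The provincial data below are hypothetical, and the full specification would add the FDI, trade, R&D and human-capital controls described in the abstract.

```python
# Hedged sketch of a beta-convergence regression (bivariate case only;
# the thesis uses a panel with additional structural controls).
# All data points below are hypothetical provinces.

log_initial_income = [7.2, 7.8, 8.1, 8.6, 9.0, 9.4]
avg_growth_rate    = [0.092, 0.081, 0.078, 0.066, 0.058, 0.051]

def simple_ols(x, y):
    """Return (intercept, slope) of y = a + b*x by least squares."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((a - mx) * (c - my) for a, c in zip(x, y)) / \
        sum((a - mx) ** 2 for a in x)
    return my - b * mx, b

a, b = simple_ols(log_initial_income, avg_growth_rate)
# b < 0: initially poorer provinces grow faster, consistent with conditional
# convergence once spillover channels are controlled for.
```

The 't-bar' panel test mentioned in the second study plays a different role: it checks the stationarity properties of the panel before such regressions are run.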

This work follows a feasibility study (187) which suggested that a process for purifying wet-process phosphoric acid by solvent extraction should be economically viable. The work was divided into two main areas: (i) chemical and physical measurements on the three-phase system, with or without impurities; (ii) process simulation and optimization. The object was to test the process technically and economically and to optimise the type of solvent. The chemical equilibria and distribution curves for the system water - phosphoric acid - solvent have been determined for the solvents n-amyl alcohol, tri-n-butyl phosphate, di-isopropyl ether and methyl isobutyl ketone. Both pure phosphoric acid and acid containing known amounts of naturally occurring impurities (FePO4, AlPO4, Ca3(PO4)2 and Mg3(PO4)2) were examined. The hydrodynamic characteristics of the systems were also studied. The experimental results obtained for drop size distribution were compared with those obtainable from Hinze's equation (32), and it was found that they deviated by an amount related to the turbulence. A comprehensive literature survey on the purification of wet-process phosphoric acid by organic solvents has been made. The literature regarding solvent extraction fundamentals and equipment, and optimization methods for the envisaged process, was also reviewed. A modified form of the Kremser-Brown and Souders equation to calculate the number of contact stages was derived; the modification takes into account the special nature of the phosphoric acid distribution curves in the studied systems. The process flow-sheet was developed and simulated. Powell's direct search optimization method was selected in conjunction with the linear search algorithm of Davies, Swann and Campey. The objective function was defined as the total annual manufacturing cost, and the program was employed to find the optimum operating conditions for any one of the chosen solvents.
The final results demonstrated the following order of feasibility for purifying wet-process acid: di-isopropyl ether, methyl isobutyl ketone, n-amyl alcohol and tri-n-butyl phosphate.
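The stage-count calculation underlying the approach above can be sketched with the unmodified Kremser-Brown-Souders equation for counter-current extraction with fresh solvent; the thesis's modified form for the curved phosphoric-acid distribution is not reproduced here, and the numbers in the example are illustrative.

```python
import math

# Unmodified Kremser-Brown-Souders stage count for counter-current
# extraction with solute-free entering solvent. E is the extraction
# factor m*S/F (distribution coefficient times solvent-to-feed ratio).

def kremser_stages(x_in, x_out, E):
    """Ideal stages to take the raffinate from x_in down to x_out (E != 1)."""
    return math.log((x_in / x_out) * (1.0 - 1.0 / E) + 1.0 / E) / math.log(E)

# Example: 95% recovery (x_out = 0.05 * x_in) with E = 2 needs about
# 3.4 ideal stages.
n = kremser_stages(1.0, 0.05, 2.0)
```

The thesis's modification replaces the constant-distribution assumption behind E with the measured, strongly curved phosphoric-acid distribution curves.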

AIMS To demonstrate the potential use of in vitro poly(lactic-co-glycolic acid) (PLGA) microparticles in comparison with triamcinolone suspension to aid visualisation of vitreous during anterior and posterior vitrectomy. METHODS PLGA microparticles (diameter 10-60 microm) were fabricated using single and/or double emulsion technique(s) and used untreated or following the surface adsorption of a protein (transglutaminase). Particle size, shape, morphology and surface topography were assessed using scanning electron microscopy (SEM) and compared with a standard triamcinolone suspension. The efficacy of these microparticles to enhance visualisation of vitreous against the triamcinolone suspension was assessed using an in vitro set-up exploiting porcine vitreous. RESULTS Unmodified PLGA microparticles failed to adequately adhere to porcine vitreous and were readily washed out by irrigation. In contrast, modified transglutaminase-coated PLGA microparticles demonstrated a significant improvement in adhesiveness and were comparable to a triamcinolone suspension in their ability to enhance the visualisation of vitreous. This adhesive behaviour also demonstrated selectivity by not binding to the corneal endothelium. CONCLUSION The use of transglutaminase-modified biodegradable PLGA microparticles represents a novel method of visualising vitreous and aiding vitrectomy. This method may provide a distinct alternative for the visualisation of vitreous whilst eliminating the pharmacological effects of triamcinolone acetonide suspension.

The aim of this investigation was to study the chemical reactions occurring during the batchwise production of a butylated melamine-formaldehyde resin, in order to optimise the efficiency and economics of the batch processes. The batch process models are largely empirical in nature as the reaction mechanism is unknown. The process chemistry and the commercial manufacturing method are described. A small scale system was established in glass and the ability to produce laboratory resins with the required quality was demonstrated, simulating the full scale plant. During further experiments the chemical reactions of methylolation, condensation and butylation were studied. The important process stages were identified and studied separately. The effects of variation of certain process parameters on the chemical reactions were also studied. A published model of methylolation was modified and used to simulate the methylolation stage. A major result of this project was the development of an indirect method for studying the condensation and butylation reactions occurring during the dehydration and acid reaction stages, as direct quantitative methods were not available. A mass balance method was devised for this purpose and used to collect experimental data. The reaction scheme was verified using this data. The reactions stages were simulated using an empirical model. This has revealed new information regarding the mechanism and kinetics of the reactions. Laboratory results were shown to be comparable with plant scale results. This work has improved the understanding of the batch process, which can be used to improve product consistency. Future work has been identified and recommended to produce an optimum process and plant design to reduce the batch time.

Computer simulated trajectories of bulk water molecules form complex spatiotemporal structures at the picosecond time scale. This intrinsic complexity, which underlies the formation of molecular structures at longer time scales, has been quantified using a measure of statistical complexity. The method estimates the information contained in the molecular trajectory by detecting and quantifying temporal patterns present in the simulated data (velocity time series). Two types of temporal patterns are found. The first, defined by the short-time correlations corresponding to the velocity autocorrelation decay times (≈0.1 ps), remains asymptotically stable for time intervals longer than several tens of nanoseconds. The second is caused by previously unknown longer-time correlations (found at longer than nanosecond time scales) leading to a value of statistical complexity that slowly increases with time. A direct measure, based on the notion of statistical complexity, is introduced that describes how the trajectory explores the phase space and is independent of the particular molecular signal used as the observed time series. © 2008 The American Physical Society.
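The paper's statistical-complexity estimator is not reproduced here; as an illustration of how temporal patterns in a (velocity) time series can be detected and quantified, the sketch below computes normalised permutation entropy, a related ordinal-pattern measure. Low values indicate strong temporal patterns; values near 1.0 indicate pattern-free noise.

```python
import math
import random
from collections import Counter

# Normalised permutation entropy: count ordinal patterns of consecutive
# samples and compute the Shannon entropy of their distribution.

def permutation_entropy(series, order=3):
    counts = Counter()
    for i in range(len(series) - order + 1):
        window = series[i:i + order]
        pattern = tuple(sorted(range(order), key=window.__getitem__))
        counts[pattern] += 1
    total = sum(counts.values())
    h = -sum((c / total) * math.log(c / total) for c in counts.values())
    return h / math.log(math.factorial(order))   # normalise to [0, 1]

random.seed(1)
monotone = list(range(100))                       # fully ordered signal
noisy = [random.random() for _ in range(5000)]    # structureless noise

pe_ordered = permutation_entropy(monotone)   # 0.0: a single pattern dominates
pe_noise = permutation_entropy(noisy)        # near 1.0: all patterns equally likely
```

A statistical-complexity measure of the kind the paper introduces goes further, combining such an entropy with a disequilibrium term so that both perfectly ordered and perfectly random signals score low.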

Having a fixed differential-group delay (DGD) term b′ in the coarse-step method results in a repetitive pattern in the autocorrelation function (ACF). We solve this problem by inserting a varying DGD term at each integration step. Furthermore, we compute the range of values needed for b′ and simulate the phenomenon of polarisation mode dispersion for different statistical distributions of b′. Through our simulation results, we systematically compare the modified coarse-step method with the analytical model. © 2006 Elsevier B.V. All rights reserved.
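The coarse-step idea can be sketched as follows: each fibre section contributes a DGD vector of magnitude b′ in a random direction, and the total PMD vector is their sum. Drawing a varying b′ per step (here Gaussian, one of the distributions one might examine) replaces the single fixed value that produces the repetitive ACF pattern; all numerical values are illustrative assumptions, not the paper's.

```python
import math
import random

random.seed(2)

def random_unit_vector():
    """Uniform random direction in 3-D (Stokes space)."""
    z = random.uniform(-1.0, 1.0)
    phi = random.uniform(0.0, 2.0 * math.pi)
    s = math.sqrt(1.0 - z * z)
    return (s * math.cos(phi), s * math.sin(phi), z)

def total_dgd(n_steps, mean_b, spread):
    """Magnitude of the summed per-section DGD vectors."""
    tx = ty = tz = 0.0
    for _ in range(n_steps):
        b = random.gauss(mean_b, spread)   # varying DGD term per step
        ux, uy, uz = random_unit_vector()
        tx, ty, tz = tx + b * ux, ty + b * uy, tz + b * uz
    return math.sqrt(tx * tx + ty * ty + tz * tz)

# DGD accumulates as a 3-D random walk, so the RMS total grows like sqrt(N):
samples = [total_dgd(400, 0.1, 0.02) for _ in range(200)]
rms = math.sqrt(sum(s * s for s in samples) / len(samples))
# expected RMS ~ sqrt(400 * (0.1**2 + 0.02**2)) ~ 2.0 (units illustrative)
```

A full coarse-step simulation would additionally apply random polarisation rotations between sections and track the frequency dependence needed to form the ACF.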

A method for inscribing fiber Bragg gratings (FBGs) using direct, point-by-point writing with an infrared femtosecond laser is described. The method requires neither phase-masks nor photosensitized fibers and hence offers remarkable technological flexibility, and it requires a very short inscription time of less than 60 s per grating. Gratings of first to third order were produced in non-photosensitized, standard telecommunication fiber (SMF) and dispersion shifted fiber (DSF). The gratings produced by this method showed low insertion loss, narrow linewidth and strong fundamental or high-order resonances.
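The inscription pitch for each grating order follows from the Bragg condition m·λ_B = 2·n_eff·Λ. The effective index and target wavelength below are typical values for standard telecom fibre and are illustrative assumptions, not the paper's measured parameters.

```python
# Point-by-point pitch from the Bragg condition m * lambda_B = 2 * n_eff * Lambda.
# Values below are typical/illustrative, not taken from the paper.

N_EFF = 1.447           # assumed effective index of standard telecom fibre
LAMBDA_BRAGG = 1.55e-6  # target resonance wavelength, metres

def pitch_for_order(m, lambda_bragg=LAMBDA_BRAGG, n_eff=N_EFF):
    """Inscription pitch (metres) for an order-m grating."""
    return m * lambda_bragg / (2.0 * n_eff)

pitches = {m: pitch_for_order(m) for m in (1, 2, 3)}
# First order needs a ~0.54 um spot spacing; third order relaxes this to
# ~1.6 um, one reason point-by-point femtosecond writing can favour
# higher-order gratings.
```

Since the pitch scales linearly with order, a third-order grating triples the spot spacing for the same resonance wavelength.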

This research was undertaken to develop a process for the direct solvent extraction of castor oil seeds. A literature survey confirmed the desirability of establishing such a process, with emphasis on the decortication, size reduction, detoxification-deallergenization, and solvent extraction operations. A novel process was developed for the dehulling of castor seeds which consists of pressurizing the beans and then suddenly releasing the pressure to vacuum. The degree of dehulling varied according to the pressure applied and the size of the beans. Some of the batches were difficult to hull, and this phenomenon was investigated using the scanning electron microscope and by thickness and compressive strength measurements. The other variables studied, to lesser degrees, included residence time, moisture content, and temperature. The method was successfully extended to cocoa beans and (with modifications) to peanuts. The possibility of continuous operation was looked into, and a mechanism was suggested to explain how the method works. The work on toxins and allergens included an extensive literature survey on the properties of these substances and the methods developed for their deactivation. Part of the work involved setting up an assay method for measuring their concentration in the beans and cake, but technical difficulties prevented the completion of this aspect of the project. An appraisal of the existing deactivation methods was made in the course of searching for new ones. A new method of reducing the size of oilseeds was introduced in this research; it involved freezing the beans in cardice and milling them in a coffee grinder, and it was found to be quick, efficient, and reliable. An application of the freezing technique was successful in dehulling soybeans and de-skinning peanut kernels. The literature on the solvent extraction of oilseeds, especially castor, was reviewed; the survey covered processes, equipment, solvents, and the mechanism of leaching.
Three solvents were experimentally investigated: cyclohexane, ethanol, and acetone. Extraction with liquid ammonia and liquid butane was not effective under the conditions studied. Based on the results of the research, a process has been suggested for the direct solvent extraction of castor seeds, the various sections of the process have been analysed, and the factors affecting the economics of the process were discussed.

Numerical techniques have been finding increasing use in all aspects of fracture mechanics, and often provide the only means for analyzing fracture problems. The work presented here is concerned with the application of the finite element method to cracked structures. The present work was directed towards the establishment of a comprehensive two-dimensional finite element, linear elastic, fracture analysis package. Significant progress has been made to this end, and features which can now be studied include multi-crack-tip mixed-mode problems involving partial crack closure. The crack tip core element was refined, and special local crack tip elements were employed to reduce the element density in the neighbourhood of the core region. The work builds upon experience gained by previous research workers and, as part of the general development, the program was modified to incorporate the eight-node isoparametric quadrilateral element. Also, a more flexible solving routine was developed, providing a very compact method of solving large sets of simultaneous equations stored in a segmented form. To complement the finite element analysis programs, an automatic mesh generation program has been developed, which enables complex problems involving fine element detail to be investigated with a minimum of input data. The scheme has proven to be versatile and reasonably easy to implement. Numerous examples are given to demonstrate the accuracy and flexibility of the finite element technique.

The aim of the work presented in this thesis is to produce a direct method to design structures subject to deflection constraints at the working loads. The work carried out can be divided into four main parts. In the first part, a direct design procedure for plane steel frames subjected to sway limitations is proposed. The stiffness equations are modified so that the sway in each storey is equal to some specified value. The modified equations are then solved by iteration to calculate the cross-sectional properties of the columns as well as the other joint displacements. The beam sections are selected initially and then altered in an effort to reduce the total material cost of the frame; a linear extrapolation technique is used to reduce this cost. In this design, stability functions are used so that the effect of axial loads in the members is taken into consideration. The final reduced-cost design is checked for strength requirements and the members are altered accordingly. In the second part, the design method is applied to the design of reinforced concrete frames in which the sway in the columns plays an active part in the design criteria. The second moment of area of each column is obtained by solving the modified stiffness equations and is then used to calculate the minimum column depth required. Again, the frame has to be checked for all the ultimate limit state load cases. In the third part, the method is generalised to design pin-jointed space frames for deflection limitations. In these, the member areas are calculated so that the deflection at a specified joint is equal to its specified value. In the final part, the Lagrange multiplier technique is employed to obtain an optimum design for plane rigidly jointed steel frames. The iteration technique is used here to solve the modified stiffness equations as well as the derivative equations obtained in accordance with the requirements of the optimisation method.
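The "direct" idea above, reduced to its simplest possible case: instead of choosing a section and then checking deflection, solve the stiffness relation for the section that yields exactly the specified deflection. For a single axially loaded member, δ = P·L/(E·A), so A = P·L/(E·δ_target). The load, length and limit below are illustrative; real frames require the coupled, iteratively solved modified stiffness equations described in the abstract.

```python
# Single-member direct design sketch (illustrative values):
# solve delta = P*L/(E*A) for the area A giving a specified deflection.

E = 210e9             # steel Young's modulus, Pa (typical value)
P = 150e3             # axial working load, N (assumed)
L = 4.0               # member length, m (assumed)
delta_target = 0.005  # specified deflection, m (e.g. a sway-type limit)

A_required = P * L / (E * delta_target)   # m^2
# ~5.7e-4 m^2 (about 571 mm^2); any larger section deflects less than
# the limit, so this is the minimum area meeting the constraint exactly.
```

The thesis's procedure does the multi-member analogue: the specified storey sways are substituted into the modified stiffness equations, which are then solved iteratively for the unknown column properties.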