887 results for path sampling
Abstract:
This paper describes the development of a sequential injection chromatography (SIC) procedure for the separation and quantification of the herbicides simazine, atrazine, and propazine, exploiting the low backpressure of a 2.5 cm long monolithic C18 column. The separation of the three compounds was achieved in less than 90 s with resolution > 1.5, using a mobile phase composed of ACN and 1.25 mmol/L acetate buffer (pH 4.5) in a 35:65 volumetric ratio at a flow rate of 40 µL/s. Detection was performed at 223 nm using a flow cell with a 40 mm optical path length. The LOD was 10 µg/L for the three triazines, and the quantification limits were 30 µg/L for simazine and propazine and 40 µg/L for atrazine. The sampling frequency is 27 samples per hour, consuming 1.1 mL of ACN per analysis. The proposed methodology was applied to spiked water samples, and no statistically significant differences were observed in comparison to a conventional HPLC-UV method. The major metabolites of atrazine and other herbicides did not interfere with the analysis, eluting from the column either with the unretained peak or at retention times well resolved from the studied compounds.
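The resolution criterion quoted above follows the usual chromatographic definition; as an illustrative reference (standard notation, not a formula taken from the paper itself), for two adjacent peaks with retention times $t_{R,1} < t_{R,2}$ and baseline peak widths $w_1$ and $w_2$:

$$R_s = \frac{2\,(t_{R,2} - t_{R,1})}{w_1 + w_2},$$

where $R_s \geq 1.5$ corresponds to essentially baseline separation.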
Abstract:
Compared to other volatile carbonyl compounds present in outdoor air, formaldehyde (CH2O) is the most toxic, deserving particular attention in indoor and outdoor air quality legislation and control. The analytical determination of CH2O in air still presents challenges due to its low concentration (in the sub-ppb range) and its variation with sampling site and time. Of the many available analytical methods for carbonyl compounds, the most widespread is the time-consuming collection in cartridges impregnated with 2,4-dinitrophenylhydrazine, followed by HPLC analysis of the hydrazones formed. The present work proposes the use of polypropylene hollow porous capillary fibers to achieve efficient CH2O collection. The Oxyphan® fiber (designed for blood oxygenation) was chosen for this purpose because it presents good mechanical resistance, a high density of very fine pores, and a high ratio of collection area to acceptor-fluid volume in the tube, all favorable for the development of an air sampling apparatus. The collector device consists of a Teflon pipe into which a bundle of polypropylene microporous capillary membranes was introduced. While the acceptor passes at a low flow rate through the capillaries, the sampled air circulates around the fibers, impelled by a low-flow membrane pump (of the type used for aquarium aeration). Coupling this sampling technique with the selective and quantitative determination of CH2O, in the form of hydroxymethanesulfonate (HMS) after derivatization with HSO3-, by capillary electrophoresis with capacitively coupled contactless conductivity detection (CE-C4D) enabled the development of a complete analytical protocol for evaluating CH2O in air.
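For reference, the derivatization step relies on the well-known bisulfite addition to formaldehyde, shown here for illustration (the stoichiometry is textbook chemistry, not a detail quoted from the abstract):

$$\mathrm{HCHO + HSO_3^- \longrightarrow HOCH_2SO_3^-}\ \text{(HMS)}$$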
Abstract:
A fast and reliable method for the direct determination of iron in sand by solid sampling graphite furnace atomic absorption spectrometry was developed. A Zeeman-effect 3-field background corrector was used to decrease the sensitivity of the spectrometer measurements. This strategy allowed working with up to 200 µg of sample, thus improving representativity. Using samples with small particle sizes (1-50 µm) and adding 5 µg Pd as chemical modifier, it was possible to obtain suitable calibration curves with aqueous reference solutions. The pyrolysis and atomization temperatures for the optimized heating program were 1400 and 2500 °C, respectively. The characteristic mass, based on integrated absorbance, was 56 pg, and the detection limit, calculated from the variability of 20 consecutive measurements of the platform inserted without sample, was 32 pg. The accuracy of the procedure was checked by analyzing two reference materials (IPT 62 and 63). The determined concentrations were in agreement with the recommended values (95% confidence level). Five sand samples were analyzed, and good agreement (95% confidence level) was observed between the proposed method and conventional flame atomic absorption spectrometry. The relative standard deviations were lower than 25% (n = 5). The tube and boat platform lifetimes were around 1000 and 250 heating cycles, respectively.
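The detection limit quoted above is consistent with the usual blank-variability criterion; as an illustrative formula (the authors' exact convention is not stated in the abstract), with $s_{\mathrm{blank}}$ the standard deviation of the 20 empty-platform firings in integrated absorbance and $S$ the calibration slope:

$$m_{\mathrm{LOD}} = \frac{3\, s_{\mathrm{blank}}}{S}.$$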
Abstract:
A method using a solid sampling device for the direct determination of Cr and Ni in fresh and used lubricating oils by graphite furnace atomic absorption spectrometry is proposed. The high organic content of the samples was minimized using a digestion step at 400 °C in combination with an oxidant mixture of 1.0% (v/v) HNO3 + 15% (v/v) H2O2 + 0.1% (m/v) Triton X-100 for in situ digestion. The 3-field mode Zeeman effect allowed spectrometer calibration up to 5 ng of Cr and Ni. The quantification limits were 0.86 µg/g for Cr and 0.82 mg/g for Ni. The analysis of reference materials showed no statistically significant difference between the recommended values and those obtained by the proposed method.
Abstract:
Connecting different electrical, network, and data devices at minimum cost along the shortest path is a complex job. In large buildings, where the devices are placed at different locations on different floors and only certain routes are available for running cables and buses, the shortest-path search becomes even more complex. The aim of this thesis project is to develop an application that identifies the best path to connect all objects or devices while following the permitted routes. To address this issue, we adopted three algorithms, a greedy algorithm, simulated annealing, and exhaustive search, and analyzed their results. The given problem is similar to the Travelling Salesman Problem. Exhaustive search is the most accurate algorithm for this problem, as it checks every possibility and gives the exact result, but it is an impractical solution because of its huge time consumption: if the number of objects grows beyond 12, it takes hours to find the shortest path. Simulated annealing emerged with promising results at lower time cost. Because of its probabilistic nature, simulated annealing may be non-optimal, but it gives a near-optimal solution in a reasonable duration. A greedy algorithm is not a good choice for this problem, so simulated annealing proved to be the best algorithm for this problem. The project has been implemented in the C language and takes its input from, and stores its output in, an Excel workbook; a minimal sketch of the annealing idea appears below.
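The abstract does not reproduce the thesis code; the following is a minimal, self-contained C sketch of simulated annealing on a small TSP-like tour. The toy distance matrix, the cooling schedule, and the swap neighborhood are illustrative assumptions, not the thesis's actual data structures (compile with -lm).

/* Minimal simulated-annealing sketch for a small TSP-like routing problem.
 * The distance matrix and cooling schedule are illustrative assumptions. */
#include <stdio.h>
#include <stdlib.h>
#include <math.h>

#define N 6  /* number of devices/objects to connect (toy size) */

/* Hypothetical symmetric distance matrix between the N devices. */
static const double dist[N][N] = {
    {0, 3, 4, 2, 7, 3}, {3, 0, 4, 6, 3, 5}, {4, 4, 0, 5, 8, 6},
    {2, 6, 5, 0, 6, 2}, {7, 3, 8, 6, 0, 4}, {3, 5, 6, 2, 4, 0}
};

/* Total length of a closed tour visiting every device once. */
static double tour_length(const int *tour) {
    double len = 0.0;
    for (int i = 0; i < N; i++)
        len += dist[tour[i]][tour[(i + 1) % N]];
    return len;
}

int main(void) {
    int tour[N], best[N];
    for (int i = 0; i < N; i++) tour[i] = best[i] = i;  /* trivial start */
    double cur = tour_length(tour), best_len = cur;

    srand(42);
    /* Geometric cooling: accept worse moves with probability exp(-delta/T). */
    for (double T = 10.0; T > 1e-3; T *= 0.995) {
        int a = rand() % N, b = rand() % N;     /* swap two tour positions */
        if (a == b) continue;
        int tmp = tour[a]; tour[a] = tour[b]; tour[b] = tmp;
        double delta = tour_length(tour) - cur;
        if (delta <= 0 || (double)rand() / RAND_MAX < exp(-delta / T)) {
            cur += delta;                       /* accept the move */
            if (cur < best_len) {
                best_len = cur;
                for (int i = 0; i < N; i++) best[i] = tour[i];
            }
        } else {                                /* reject: undo the swap */
            tmp = tour[a]; tour[a] = tour[b]; tour[b] = tmp;
        }
    }

    printf("best tour length: %.1f\ntour:", best_len);
    for (int i = 0; i < N; i++) printf(" %d", best[i]);
    printf("\n");
    return 0;
}

This captures why annealing beats the greedy approach here: occasionally accepting a worse tour (with probability falling as the temperature T cools) lets the search escape local minima that trap purely greedy moves.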
Abstract:
We discuss the development and performance of a low-power sensor node (hardware, software, and algorithms) that autonomously controls the sampling interval of a suite of sensors based on local state estimates and future predictions of water flow. The problem is motivated by the need to accurately reconstruct abrupt state changes in urban watersheds and stormwater systems. Presently, the detection of these events is limited by the temporal resolution of sensor data. It is often infeasible, however, to increase measurement frequency due to energy and sampling constraints. This is particularly true for real-time water quality measurements, where sampling frequency is limited by reagent availability, sensor power consumption, and, in the case of automated samplers, the number of available sample containers. These constraints pose a significant barrier to the ubiquitous and cost-effective instrumentation of large hydraulic and hydrologic systems. Each of our sensor nodes is equipped with a low-power microcontroller and a wireless module to take advantage of urban cellular coverage. The node persistently updates a local, embedded model of flow conditions, while IP-connectivity permits each node to continually query public weather servers for hourly precipitation forecasts. The sampling frequency is then adjusted to increase the likelihood of capturing abrupt changes in a sensor signal, such as the rise of the hydrograph, an event that is often difficult to capture through traditional sampling techniques; a simplified sketch of this adjustment logic follows the abstract. Our architecture forms an embedded processing chain, leveraging local computational resources to assess uncertainty by analyzing data as it is collected. A network is presently being deployed in an urban watershed in Michigan, and initial results indicate that the system accurately reconstructs signals of interest while significantly reducing energy consumption and the use of sampling resources. We also expand our analysis by discussing the role of this approach in the efficient real-time measurement of stormwater systems.
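The sketch below illustrates the forecast-aware adjustment described above. The thresholds, interval bounds, and function names are invented for illustration; the paper's actual embedded model, controller, and server protocol are not specified in the abstract.

/* Illustrative sketch of forecast-aware adaptive sampling.
 * All thresholds and interval bounds are hypothetical values. */
#include <stdio.h>

#define MIN_INTERVAL_S   60     /* fastest sampling: every minute */
#define MAX_INTERVAL_S 3600     /* slowest sampling: every hour   */

/* Choose the next sampling interval from a local estimate of how fast
 * the signal is changing and the forecast probability of precipitation. */
static int next_interval(double d_level_dt, double rain_prob) {
    /* Abrupt change already under way: sample as fast as allowed. */
    if (d_level_dt > 0.5)              /* hypothetical m/h threshold */
        return MIN_INTERVAL_S;
    /* Rain likely soon: pre-emptively tighten the interval so the
     * rising limb of the hydrograph is not missed. */
    if (rain_prob > 0.6)
        return MIN_INTERVAL_S * 5;
    /* Quiescent conditions: conserve energy, reagents, and samplers. */
    return MAX_INTERVAL_S;
}

int main(void) {
    /* Example: slow level change, but an 80% chance of rain forecast. */
    printf("next sample in %d s\n", next_interval(0.05, 0.8));
    return 0;
}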
Abstract:
The opening of the economy, the resumption of Japanese economic relations with the outside world, and the industrialization of the Japanese economy in the Meiji Era led Japan to invest abroad. Japanese foreign investments grew predominantly in East Asia, remaining concentrated in that region until the end of the Second World War. The thesis shows that there are institutional and historical constraints that limit the range of options firms have for investing abroad. Once Japanese firms had made the choice, imposed by those constraints, to invest in East Asia, positive feedbacks generated by historical and institutional events produced self-reinforcement, leading to an outcome of inflexibility, a lock-in, against leaving that region. The thesis demonstrates that explaining Japanese foreign investment with an evolutionary economics model, through the process of path dependence, which incorporates institutional and historical factors, is more plausible than the conventional interpretations.
Abstract:
Fischer (1979) and Asako (1983) analyze the sign of the correlation between the growth rate of money and the rate of capital accumulation on the transition path. Both plug a CRRA utility (based on a Cobb-Douglas and a Leontief function, respectively) into Sidrauski's model, yet reach contrasting conclusions. The present analysis, by using a more general CES utility, presents both of those settings and conclusions as limiting cases, and generates economic figures more consistent with reality (for instance, the interest-rate elasticity of the money demands derived from those previous works is necessarily 1 and 0, respectively).
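For reference, the nesting the abstract appeals to is the standard CES limiting behavior, stated here in generic notation (the paper's exact functional form and parameterization are not given in the abstract):

$$u(c, m) = \left[\alpha c^{\rho} + (1-\alpha)\, m^{\rho}\right]^{1/\rho}, \qquad 0 < \alpha < 1,\ \rho < 1,$$

which reduces to the Cobb-Douglas form $c^{\alpha} m^{1-\alpha}$ as $\rho \to 0$ and to the Leontief form $\min\{c, m\}$ as $\rho \to -\infty$.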
Abstract:
Convex combinations of long memory estimates obtained from the same data observed at different sampling rates can decrease the standard deviation of the estimates, at the cost of inducing a slight bias. The convex combination of such estimates requires a preliminary correction for the bias observed at lower sampling rates, reported by Souza and Smith (2002). Through Monte Carlo simulations, we investigate the bias and the standard deviation of the combined estimates, as well as the root mean squared error (RMSE), which takes both into account. Comparing standard methods with their combined versions, the latter achieve lower RMSE for the two semi-parametric estimators under study (by about 30% on average for ARFIMA(0,d,0) series).
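As an illustration of the trade-off described, in generic notation (the abstract does not specify the weights or estimators): given estimates $\hat{d}_1, \dots, \hat{d}_k$ of the memory parameter obtained at different sampling rates, a convex combination is

$$\hat{d}_w = \sum_{i=1}^{k} w_i \hat{d}_i, \qquad w_i \ge 0,\ \sum_{i=1}^{k} w_i = 1,$$

and the criterion balancing the induced bias against the reduced variance is

$$\mathrm{RMSE}(\hat{d}_w) = \sqrt{\mathrm{Bias}(\hat{d}_w)^2 + \mathrm{Var}(\hat{d}_w)}.$$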