156 results for SAMPLERS


Relevance: 10.00%

Abstract:

Passive samplers are a versatile tool for integrating environmental concentrations of pollutants while avoiding the use of live sentinel organisms in environmental monitoring. This study introduced magnetic silicone polymer composites (Fe-PDMS) as passive sampling media to pre-concentrate a wide range of analytes from environmental settings. The composite samplers were assessed for their accumulation properties in laboratory experiments with two model herbicides (atrazine and Irgarol 1051) and evaluated for their uptake from environmental media (waters and sediments). The Fe-PDMS composites showed good accumulation of herbicides and pesticides in both freshwater and saltwater settings, and accumulation was positively correlated with the log Kow of the individual analytes. These results show that the composites could be used in a wide range of applications, such as monitoring, cleanup, and bioaccumulation modeling, and as a non-intrusive, nondestructive monitoring tool for environmental forensics.
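
A minimal sketch of how such a log Kow correlation could be examined, assuming hypothetical uptake measurements and illustrative log Kow values (none of these numbers come from the study):

```python
# Hypothetical check of an uptake vs. log Kow relationship (illustrative only).
import numpy as np
from scipy import stats

log_kow = np.array([2.6, 3.9, 4.1, 4.8, 5.2])     # assumed literature-style values
log_uptake = np.array([1.1, 2.0, 2.3, 2.9, 3.4])  # made-up log sampler uptake

res = stats.linregress(log_kow, log_uptake)
print(f"slope={res.slope:.2f}, r={res.rvalue:.2f}, p={res.pvalue:.3f}")
# A positive slope with r near 1 would support hydrophobicity-driven accumulation.
```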

Relevance: 10.00%

Abstract:

This study assessed ambient air quality in an urban area of Natal, capital of Rio Grande do Norte (latitude 5º49'29'' S, longitude 35º13'34'' W), with the aim of determining metal concentrations in the particulate matter (PM10 and PM2.5) of the city's atmosphere. Sampling ran from January to December 2012. Samples were collected on glass fiber filters using two high-volume samplers, one for PM2.5 (AGV PM2.5) and another for PM10 (AGV PM10). Monthly PM10 averages ranged from 8.92 to 19.80 µg m-3, with an annual average of 16.21 µg m-3; monthly PM2.5 averages ranged from 2.84 to 7.89 µg m-3, with an annual average of 5.61 µg m-3. The PM2.5 and PM10 concentrations were related to meteorological variables and, to characterize the effect of these variables on PM concentration, an exploratory Principal Component Analysis (PCA) of the data was performed. The PCA showed that PM concentration decreases with increasing barometric pressure, wind direction, rainfall, and relative humidity, and that the day-of-week variable had little influence compared with the meteorological variables. Filters containing particulate matter from six days were selected and subjected to microwave digestion. After digestion, the samples were analyzed by Inductively Coupled Plasma Mass Spectrometry (ICP-MS), and the concentrations of the heavy metals vanadium, chromium, manganese, nickel, copper, arsenic, and lead were determined. The highest metal concentrations were for Pb and Cu, whose average values were, respectively, 5.34 and 2.34 ng m-3 in PM10 and 4.68 and 2.95 ng m-3 in PM2.5. Concentrations of V, Cr, Mn, Ni, and Cd were, respectively, 0.13, 0.39, 0.48, 0.45, and 0.03 ng m-3 in the PM10 fraction and 0.05, 0.10, 0.10, 0.34, and 0.01 ng m-3 in the PM2.5 fraction. The concentration of As was null in both fractions.
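
A minimal sketch of the kind of exploratory PCA described above, assuming a hypothetical matrix of daily PM and meteorological measurements (the variable list mirrors the abstract, but the data are stand-ins):

```python
# Illustrative PCA on PM + meteorology data (synthetic stand-in data).
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
# Columns: PM10, PM2.5, pressure, rainfall, relative humidity, day-of-week
X = rng.normal(size=(365, 6))            # one year of daily observations

Xs = StandardScaler().fit_transform(X)   # PCA is scale-sensitive: standardize first
pca = PCA(n_components=2).fit(Xs)

print("explained variance ratio:", pca.explained_variance_ratio_)
print("loadings (component x variable):\n", pca.components_)
# Opposite-signed loadings of PM and a meteorological variable on the same
# component would reflect the inverse relationship reported in the study.
```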

Relevance: 10.00%

Abstract:

Many modern applications fall into the category of "large-scale" statistical problems, in which both the number of observations n and the number of features or parameters p may be large. Many existing methods focus on point estimation, despite the continued relevance of uncertainty quantification in the sciences, where the number of parameters to estimate often exceeds the sample size even as the value of n grows enormously in many fields. Thus the tendency in some areas of industry to dispense with traditional statistical analysis, on the grounds that "n=all", has little relevance outside of certain narrow applications. The main result of the Big Data revolution in most fields has instead been to make computation much harder without reducing the importance of uncertainty quantification. Bayesian methods excel at uncertainty quantification but often scale poorly relative to alternatives. This conflict between the statistical advantages of Bayesian procedures and their substantial computational disadvantages is perhaps the greatest challenge facing modern Bayesian statistics, and it is the primary motivation for the work presented here.

Two general strategies for scaling Bayesian inference are considered. The first is the development of methods that lend themselves to faster computation; the second is the design and characterization of computational algorithms that scale better in n or p. In the first case, the focus is on joint inference beyond the standard problem of multivariate continuous data that has dominated previous theoretical work in this area. In the second, we pursue strategies for improving the speed of Markov chain Monte Carlo algorithms and for characterizing their performance in large-scale settings. Throughout, the focus is on rigorous theoretical evaluation combined with empirical demonstrations of performance and concordance with the theory.

One topic we consider is modeling the joint distribution of multivariate categorical data, often summarized in a contingency table. Contingency table analysis routinely relies on log-linear models, with latent structure analysis providing a common alternative. Latent structure models lead to a reduced-rank tensor factorization of the probability mass function for multivariate categorical data, while log-linear models achieve dimensionality reduction through sparsity. Little is known about the relationship between these two notions of dimensionality reduction. In Chapter 2, we derive several results relating the support of a log-linear model to nonnegative ranks of the associated probability tensor. Motivated by these findings, we propose a new collapsed Tucker class of tensor decompositions, which bridges the existing PARAFAC and Tucker decompositions and provides a more flexible framework for parsimoniously characterizing multivariate categorical data. Taking a Bayesian approach to inference, we illustrate the empirical advantages of the new decompositions.
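
For reference, the latent-class (nonnegative PARAFAC-type) factorization referred to above is the standard representation of a categorical pmf as a mixture over k latent classes (the notation here is generic, not the thesis's):

```latex
% y_j in {1,...,d_j} for j = 1,...,p; h indexes k latent classes.
\[
  \Pr(y_1 = c_1, \dots, y_p = c_p)
    \;=\; \sum_{h=1}^{k} \nu_h \prod_{j=1}^{p} \lambda^{(j)}_{h c_j},
  \qquad \nu_h \ge 0, \quad \sum_{h=1}^{k} \nu_h = 1,
\]
% and the smallest such k is the nonnegative rank of the probability tensor.
```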

Latent class models for the joint distribution of multivariate categorical data, such as the PARAFAC decomposition, play an important role in the analysis of population structure. In this context, the number of latent classes is interpreted as the number of genetically distinct subpopulations of an organism, an important factor in the analysis of evolutionary processes and conservation status. Existing methods focus on point estimates of the number of subpopulations and lack robust uncertainty quantification. Moreover, whether the number of latent classes in these models is even an identified parameter is an open question. In Chapter 3, we show that when the model is properly specified, the correct number of subpopulations can be recovered almost surely. We then propose an alternative method for estimating the number of latent subpopulations that provides good quantification of uncertainty, and we give a simple procedure for verifying that the proposed method is consistent for the number of subpopulations. The model's performance in estimating the number of subpopulations and in other common population structure inference problems is assessed in simulations and a real data application.

In contingency table analysis, sparse data are frequently encountered even for modest numbers of variables, resulting in non-existence of maximum likelihood estimates. A common solution is to obtain regularized estimates of the parameters of a log-linear model. Bayesian methods provide a coherent approach to regularization but are often computationally intensive. Conjugate priors ease computational demands, but the conjugate Diaconis--Ylvisaker priors for the parameters of log-linear models do not give rise to closed-form credible regions, complicating posterior inference. In Chapter 4, we derive the optimal Gaussian approximation to the posterior for log-linear models with Diaconis--Ylvisaker priors, and we provide convergence rates and finite-sample bounds for the Kullback-Leibler divergence between the exact posterior and the optimal Gaussian approximation. We demonstrate empirically, in simulations and a real data application, that the approximation is highly accurate even in relatively small samples. The proposed approximation provides a computationally scalable and principled approach to regularized estimation and approximate Bayesian inference for log-linear models.
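
One standard fact that makes "optimal Gaussian approximation" concrete (a textbook property of the KL divergence in this direction, stated as background rather than as the chapter's result): the Gaussian closest to a posterior p in KL(p || q) is the one matching the posterior's first two moments:

```latex
\[
  q^\star \;=\; \operatorname*{arg\,min}_{q = \mathcal{N}(\mu,\Sigma)}
    \int p(\theta)\,\log\frac{p(\theta)}{q(\theta)}\,d\theta
  \quad\Longrightarrow\quad
  \mu = \mathbb{E}_p[\theta], \qquad \Sigma = \operatorname{Cov}_p(\theta).
\]
```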

Another challenging and somewhat non-standard joint modeling problem is inference on tail dependence in stochastic processes. In applications where extreme dependence is of interest, data are almost always time-indexed. Existing methods for inference and modeling in this setting often cluster extreme events or choose window sizes with the goal of preserving temporal information. In Chapter 5, we propose an alternative paradigm for inference on tail dependence in stochastic processes with arbitrary temporal dependence structure in the extremes, based on the idea that both the strength of tail dependence and its temporal structure are encoded in the waiting times between exceedances of high thresholds. We construct a class of time-indexed stochastic processes with tail dependence by endowing the support points in de Haan's spectral representation of max-stable processes with velocities and lifetimes. We extend Smith's model to these max-stable velocity processes and obtain the distribution of waiting times between extreme events at multiple locations. Motivated by this result, we propose a new definition of tail dependence as a function of the distribution of waiting times between threshold exceedances, and we construct an inferential framework for estimating the strength of extremal dependence and quantifying uncertainty in this paradigm. The method is applied to climatological, financial, and electrophysiology data.
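
For orientation, the classical (static) notion that such waiting-time-based definitions generalize is the upper tail dependence coefficient; this standard definition is background, not the Chapter 5 definition:

```latex
% Upper tail dependence coefficient for (X, Y) with marginal CDFs F and G:
\[
  \chi \;=\; \lim_{u \to 1^{-}}
    \Pr\!\left( F(X) > u \;\middle|\; G(Y) > u \right),
\]
% chi > 0: asymptotic tail dependence; chi = 0: asymptotic independence.
```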

The remainder of this thesis focuses on posterior computation by Markov chain Monte Carlo (MCMC), the dominant paradigm for posterior computation in Bayesian analysis. It has long been common to control computation time by making approximations to the Markov transition kernel, but comparatively little attention has been paid to convergence and estimation error in the resulting approximating Markov chains. In Chapter 6, we propose a framework for assessing when to use approximations in MCMC algorithms and how much error in the transition kernel should be tolerated to obtain optimal estimation performance with respect to a specified loss function and computational budget. The results require only ergodicity of the exact kernel and control of the kernel approximation accuracy. The theoretical framework is applied to approximations based on random subsets of data, low-rank approximations of Gaussian processes, and a novel approximating Markov chain for discrete mixture models.
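
As a deliberately naive illustration of an approximating kernel built from random subsets of data, here is a Metropolis-Hastings step whose log-likelihood ratio is estimated from a subsample; the proposal, prior, and subsample size are assumptions of the sketch, not choices made in Chapter 6:

```python
# Subset-based (approximate) Metropolis-Hastings step -- illustrative only.
import numpy as np

def subset_mh_step(theta, data, loglik, logprior, rng, step=0.1, m=100):
    """One MH step in which the full-data log-likelihood ratio is replaced by
    n/m times the ratio on a random subsample of size m, giving an approximate
    (perturbed) transition kernel rather than an exact one."""
    n = len(data)
    prop = theta + step * rng.normal(size=theta.shape)   # random-walk proposal
    idx = rng.choice(n, size=m, replace=False)           # random subset of data
    llr_hat = (n / m) * np.sum(loglik(prop, data[idx]) - loglik(theta, data[idx]))
    log_alpha = llr_hat + logprior(prop) - logprior(theta)
    return prop if np.log(rng.uniform()) < log_alpha else theta
```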

Data augmentation Gibbs samplers are arguably the most popular class of algorithms for approximately sampling from the posterior distribution of the parameters of generalized linear models. The truncated normal and Polya-Gamma data augmentation samplers are standard examples for probit and logit links, respectively. Motivated by an important problem in quantitative advertising, in Chapter 7 we consider the application of these algorithms to modeling rare events. We show that when the sample size is large but the observed number of successes is small, these data augmentation samplers mix very slowly, with a spectral gap that converges to zero at a rate at least proportional to the reciprocal of the square root of the sample size, up to a log factor. In simulation studies, moderate sample sizes result in high autocorrelations and small effective sample sizes. Similar empirical results are observed for related data augmentation samplers for multinomial logit and probit models. When applied to a real quantitative advertising dataset, the data augmentation samplers mix very poorly. In contrast, Hamiltonian Monte Carlo and a type of independence-chain Metropolis algorithm show good mixing on the same dataset.
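
For concreteness, a minimal sketch of the truncated normal (Albert-Chib) data augmentation Gibbs sampler for probit regression discussed above, with an assumed N(0, tau^2 I) prior on the coefficients; in the rare-event regime of the abstract (large n, few successes), chains like this exhibit the slow mixing that Chapter 7 quantifies:

```python
# Albert-Chib data augmentation Gibbs sampler for probit regression (sketch).
import numpy as np
from scipy.stats import truncnorm

def probit_gibbs(X, y, n_iter=2000, tau2=100.0, seed=0):
    rng = np.random.default_rng(seed)
    n, p = X.shape
    V = np.linalg.inv(X.T @ X + np.eye(p) / tau2)   # Cov(beta | z)
    L = np.linalg.cholesky(V)
    beta = np.zeros(p)
    draws = np.empty((n_iter, p))
    for t in range(n_iter):
        mu = X @ beta
        # z_i ~ N(mu_i, 1) truncated to (0, inf) if y_i = 1, else (-inf, 0);
        # truncnorm takes bounds standardized as (bound - loc) / scale.
        lo = np.where(y == 1, -mu, -np.inf)
        hi = np.where(y == 1, np.inf, -mu)
        z = truncnorm.rvs(lo, hi, loc=mu, scale=1.0, random_state=rng)
        beta = V @ (X.T @ z) + L @ rng.normal(size=p)   # beta | z ~ N(V X'z, V)
        draws[t] = beta
    return draws
```

With n large and y mostly zeros, the autocorrelation of these draws becomes severe, which is the spectral-gap behavior described above.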

Relevance: 10.00%

Abstract:

Free energy calculations are computational methods for determining thermodynamic quantities, such as free energies of binding, via simulation.

Currently, due to computational and algorithmic limitations, free energy calculations are limited in scope.

In this work, we propose two methods for improving the efficiency of free energy calculations.

First, we expand the state space of alchemical intermediates, and show that this expansion enables us to calculate free energies along lower variance paths.

We use Q-learning, a reinforcement learning technique, to discover and optimize paths at low computational cost.
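
For readers unfamiliar with it, Q-learning improves a state-action value table by bootstrapping from the best next action; this is the standard update rule (learning rate α, discount γ), not a formula specific to this work:

```latex
% Q-learning update after observing transition (s, a, r, s'):
\[
  Q(s,a) \;\leftarrow\; Q(s,a)
    + \alpha \left[\, r + \gamma \max_{a'} Q(s', a') - Q(s,a) \,\right].
\]
```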

Second, we reduce the cost of sampling along a given path by using sequential Monte Carlo samplers.

We develop a new free energy estimator, pCrooks (pairwise Crooks), a variant on the Crooks fluctuation theorem (CFT), which enables decomposition of the variance of the free energy estimate for discrete paths, while retaining beneficial characteristics of CFT.
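
For context, the Crooks fluctuation theorem relates the forward and reverse work distributions of a driven process, which is what makes it usable as a free energy estimator (standard statement, with β = 1/k_BT):

```latex
\[
  \frac{P_F(W)}{P_R(-W)} \;=\; e^{\beta (W - \Delta F)},
  \qquad \beta = \frac{1}{k_B T},
\]
% so Delta F can be read off where the two distributions cross:
% P_F(W) = P_R(-W) exactly at W = Delta F.
```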

Combining these two advancements, we show that for some test models, optimal expanded-space paths have a nearly 80% reduction in variance relative to the standard path.

Additionally, our free energy estimator converges at a more consistent rate and on average 1.8 times faster when we enable path searching, even when the cost of path discovery and refinement is considered.

Relevance: 10.00%

Abstract:

Nodule samples obtained were described and studied on board for: 1) observation of occurrence and morphology in and outside the samplers, size classification, measurement of weight, and calculation of population density (kg/m2); 2) photographing of whole nodules on a plate marked with the frames of the unit areas of both the Ocean-70 grab (0.50 m2) and the free-fall grab (0.13 m2), and of typical samples on a plate with a 5 cm grid scale; 3) observation of internal structures of the nodules on cut sections; and 4) determination of mineral composition by X-ray diffractometer. The relation between nodule types and geological environment or chemical composition was examined with reference to data from related studies, such as sedimentology, acoustic surveys, and chemical analysis.
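
As a worked illustration of the population density calculation (the nodule mass is a made-up number; the unit areas are those quoted above):

```latex
% Population density = nodule mass / sampler unit area.
\[
  \rho \;=\; \frac{m}{A}
    \;=\; \frac{2.6\ \text{kg}}{0.50\ \text{m}^2}
    \;=\; 5.2\ \text{kg/m}^2
  \quad \text{(Ocean-70 grab)},
\]
% while the same 2.6 kg in a free-fall grab (0.13 m^2) would give 20 kg/m^2.
```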

Relevance: 10.00%

Abstract:

Rhizon samplers were originally designed as micro-tensiometers in soil science to sample seepage water in the unsaturated zone. This study shows applications of Rhizons for porewater sampling from sediments in aquatic systems and presents a newly developed Rhizon in situ sampler (RISS). With the inexpensive Rhizon sampling technique, porewater profiles can be sampled with minimal disturbance of both the sediment structure and any flow fields. Field experiments, tracer studies, and numerical modeling were combined to assess the suitability of Rhizons for porewater sampling. The low effort and simple application make Rhizons a powerful tool for porewater sampling and an alternative to classical methods. Our investigations show that Rhizons are well suited to sampling porewater on board a ship, in the laboratory, and in situ. The results revealed that horizontally aligned Rhizons can sample porewater at a vertical resolution of 1 cm. Combined with an in situ benthic chamber system, the RISS allows benthic fluxes and porewater profiles to be studied at the same location on the seafloor with negligible effect on the incubated sediment-water interface. Results from porewater sampling of sediment cores from the Southern Ocean (Atlantic sector) and from in situ sampling of tidal flat sediments of the Wadden Sea (Sahlenburg/Cuxhaven, Germany) are presented.

Relevance: 10.00%

Abstract:

As the development of a viable quantum computer nears, existing widely used public-key cryptosystems, such as RSA, will no longer be secure. Significant effort is therefore being invested in post-quantum cryptography (PQC). Lattice-based cryptography (LBC) is one promising area of PQC, offering versatile, efficient, and high-performance security services. However, the vulnerability of these implementations to side-channel attacks (SCA) remains significantly understudied. Most, if not all, lattice-based cryptosystems require noise samples generated from a discrete Gaussian distribution, and a successful timing-analysis attack can break the whole cryptosystem, making the discrete Gaussian sampler the module most vulnerable to SCA. This research proposes countermeasures against timing information leakage in the form of FPGA-based designs of CDT-based discrete Gaussian samplers with constant response time, targeting encryption and signature scheme parameters. The proposed designs are compared against the state of the art and are shown to significantly outperform existing implementations. For encryption, the proposed sampler is 9x faster than the only other existing time-independent CDT sampler design. For signatures, the first time-independent CDT sampler in hardware is proposed.
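
A minimal software sketch of the CDT idea with the timing countermeasure described above: rather than an early-exit binary search, every table entry is examined on every draw, so the number of comparisons is independent of the sampled value. This only illustrates the control flow; the paper's designs are FPGA hardware, the table below is a toy, and an interpreted language cannot give real constant-time guarantees:

```python
# Toy constant-scan CDT (cumulative distribution table) Gaussian sampler.
import secrets

# Hypothetical 8-entry CDT for |x|, 16-bit precision (toy values, not a real
# parameter set): CDT[i] ~= round(2^16 * P(|X| <= i)).
CDT = [26000, 52000, 60000, 64000, 65300, 65500, 65530, 65535]

def sample_discrete_gaussian():
    r = secrets.randbits(16)      # uniform 16-bit randomness
    x = 0
    for entry in CDT:             # scan the WHOLE table every time
        x += (r >= entry)         # branch-free accumulate; no early exit
    sign = secrets.randbits(1)    # (a real design corrects the double
    return -x if sign else x      #  counting of zero when symmetrizing)
```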

Relevance: 10.00%

Abstract:

The objective of this study was to assess worker exposure to mineral dust particles. A metabolic model based on the model adopted by the ICRP was applied to assess human exposure to Ta and to predict Ta concentrations in excreta. Occupational exposure to Th-, U-, Nb-, and Ta-bearing particles during routine tasks in the production of Fe-Nb alloys was estimated using air samplers and excreta samples. Ta concentrations in food samples and drinking water were also determined. The results support that workers were occupationally exposed to Ta-bearing particles and also indicate that a source of Ta exposure for both the workers and the control group was the ingestion of drinking water containing soluble Ta compounds. Some Ta compounds should therefore be considered soluble in the gastrointestinal tract. Consequently, the ICRP-based metabolic model and/or the transfer factor f1 for Ta should be reviewed, and the solubility of Ta compounds in the gastrointestinal tract should be determined.

Relevance: 10.00%

Abstract:

The chemical composition of rain and fog water was determined at three sites in the Monteverde Biological Reserve, Puntarenas, between October 2009 and January 2010. Owing to its conservation status and its geographic location on the continental divide, the Monteverde Biological Reserve offers an ideal site for studying the composition of atmospheric waters (rain and fog water). Fog water samples were collected using fog samplers with Teflon lines, while rain water was collected using simple rain samplers and one cascade sampler. In both types of water, the most relevant ionic species were analyzed: H3O+, NH4+, Ca2+, Mg2+, K+, Na+, Cl-, NO3-, and SO42-, using ion chromatography with electrical conductivity detection. The average concentrations of these species in rain water ranged from 0.54 ± 0.02 μeq L-1 to 101 ± 3 μeq L-1, while in fog water they varied from 1.00 ± 0.02 μeq L-1 to 93 ± 4 μeq L-1. In addition, the ion balance and the enrichment factors with respect to the sea and the soil are presented for both types of samples.
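
For reference, the marine enrichment factor mentioned at the end is conventionally computed against a seawater reference element (commonly Na+), with an analogous crustal reference used for soil; this is the standard definition, not a formula given in the abstract:

```latex
% Enrichment factor of species X relative to seawater, Na+ as reference:
\[
  \mathrm{EF}_{\mathrm{sea}}(X) \;=\;
    \frac{\bigl([X]/[\mathrm{Na}^{+}]\bigr)_{\mathrm{sample}}}
         {\bigl([X]/[\mathrm{Na}^{+}]\bigr)_{\mathrm{seawater}}},
\]
% EF near 1 suggests a marine origin; EF >> 1 points to another source.
```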

Relevance: 10.00%

Abstract:

There is increasing evidence of a causal link between airborne particles and ill health, and this study monitored exposure to both airborne particles and the gas-phase contaminants of environmental tobacco smoke (ETS) in a nightclub. The present study followed a number of pilot studies in which human exposure to airborne particles in a nightclub was assessed and the spatio-temporal distribution of gas-phase pollutants was evaluated in restaurants and pubs. The work reported here re-examined the nightclub environment and used concurrent, continuous monitoring with optical scattering samplers to measure particulates (PM10), together with multi-gas analysers. The analysis illustrated the highly episodic nature of both gaseous and particulate concentrations on the dance floor and in the bar area, although levels were well below the maximum recommended exposure levels. Short-term exposure to high concentrations may nevertheless be relevant when considering possible toxic effects on biological systems. The results give an indication of the problems associated with achieving acceptable indoor air quality (IAQ) in a complex space and identify some of the problems inherent in the design and operation of ventilation systems for such spaces.

Relevance: 10.00%

Abstract:

There is increasing evidence of a causal link between airborne particles and ill health, and this study examined exposure to both airborne particles and the gas-phase contaminants of environmental tobacco smoke (ETS) in a bar. The work reported here used concurrent, continuous monitoring with real-time optical scattering personal samplers to record particulate (PM10) concentrations at two internal locations. Much higher episodes were observed in seating areas than in the bar area. A photo-acoustic multi-gas analyser was used to record the gas phases (CO and CO2) at eight different locations throughout the bar and showed little spatial variation. This gave a clear indication of the problems associated with achieving acceptable indoor air quality in a public space and identified a fundamental problem with the simplistic design approach taken to ventilating the space. Both gaseous and particulate concentrations within the bar were below maximum recommended levels, although the time-series analysis illustrated the highly episodic nature of this exposure.

Relevance: 10.00%

Abstract:

In recent decades, the effects of air pollution have been increasing, especially on human health. To address this problem, scientists have been studying the components of the air. Among the water-soluble organic compounds, amino acids are present in the atmospheric environment as components of diverse living organisms and can be involved in spreading diseases through the air. Liquid chromatography is a technique capable of distinguishing the different amino acids from each other. In this work, aimed at separating the amino acids found in aerosol samples collected in Aveiro, the ability of four columns (Mixed-Mode WAX-1, Mixed-Mode HILIC-1, Luna HILIC, and Luna C18) to separate four amino acids (aspartic acid, lysine, glycine, and tryptophan) was assessed, along with how the interaction of the columns' stationary phases with the analytes is influenced by the organic solvent concentration and the presence/concentration of the buffer. In the Mixed-Mode WAX-1 column, the chromatograms of the individual amino acids revealed that the separation was not efficient, since the retention times were very similar; in the case of lysine, in the elution with 80% (V/V) MeOH, the peaks appeared within the void volume. In the Mixed-Mode HILIC-1 column, varying the organic solvent concentration did not affect the elution of the four studied amino acids. In the Luna HILIC column, the retention times of the amino acids were too close to each other to ensure a separation. Lastly, the Luna C18 column proved useful for separating amino acids in gradient mode, with the mobile phase composition varying in organic solvent (ACN) concentration. Luna C18 was therefore the column used to separate the amino acids in the real samples, with a mobile phase of acidified water and ACN. The gradient consisted of the following program: 0-2 min: 5% (V/V) ACN; 2-8 min: 5-2% (V/V) ACN; 8-16 min: 2% (V/V) ACN; 16-20 min: 2-20% (V/V) ACN; 20-35 min: 20-35% (V/V) ACN. The aerosol samples were collected using three passive samplers placed at two different locations in Aveiro, each sampler holding two filters, one facing up and the other facing down. After sampling, the water-soluble organic compounds were extracted by dissolution in ultra-pure water, sonication, and filtration. The resulting filtered solutions were diluted in acidified water for the chromatographic separation. The liquid chromatography results revealed the presence of amino acids, although it was not possible to identify each of them individually. The chromatograms and the fluorescence spectra showed some patterns: the samples corresponding to the upward-facing filters had more intense peaks and signals, revealing that these filters collected more organic matter.

Relevance: 10.00%

Abstract:

The recent approval of the ley Sinde amid a deep and extensive public debate has made clear that the ambitions of the culture industries are not in tune with those of consumers and citizens. The need to reformulate intellectual property law has been debated, setting creators' right to profit from their work against citizens' rights to the circulation of culture. It is necessary to analyze the terms of this debate, as well as the elements absent from the discussion. Citizen participation appears to be limited merely to circulating cultural materials outside commercial channels, while the increasingly common creative practices that also infringe intellectual property law go unaddressed: fan-run blogs about comic-book characters, dubbed versions of film trailers, the use of samplers or mashups in popular music, and so on. These practices involve the creative reworking of pre-existing texts and question the centrality of the author in the classical sense (the one protected by copyright legislation). We propose to contrast the discourses around citizens' right to culture that understand this exercise as mere file sharing with the aforementioned intertextual practices, which represent a form of empowerment for citizens who contest the limits of creativity and originality. The analysis aims to offer some clues for understanding two questions: a) the scope of the cultural rights of democratic citizenship in the information society, and b) the need to rework intellectual property law to accommodate the new practices of digital culture.

Relevance: 10.00%

Abstract:

Air pollution is a major factor in the degradation of the population's quality of life. BTEX (benzene, toluene, ethylbenzene, and xylenes) constitute the most important group of volatile organic compounds (VOCs) in the atmosphere because of their role in atmospheric chemistry and the risk they pose to human health, benzene in particular being highly carcinogenic. BTEX are released mainly by road traffic. Concentrations of BTEX were determined at nine sampling points in the city of Évora from 21 March to 1 July 2009, using Radiello™ passive samplers, followed by liquid desorption with CS2 and subsequent analysis by GC-MS. During the sampling period, the concentration of benzene in the outdoor air of Évora did not exceed 5 µg/m3, the maximum value admissible by legislation; the concentrations measured for the pollutants in general were very low and mostly below the LOQ of the analytical method.