941 results for Environmental monitoring Statistical methods
Abstract:
Water quality management programs are an essential and unavoidable means of preserving and sustainably using water resources. An important issue in determining river water quality is the design of effective quality-monitoring networks, such that the quality variables measured at the stations are, as far as possible, indicative of overall changes in water quality. One way to achieve this is to increase the number of monitoring stations and sampling occasions; since this dramatically increases the annual cost of monitoring, identifying the most important stations and parameters, and the sampling frequency that captures the greatest change in the system under study, can inform future decisions on optimizing the existing monitoring network, removing or adding stations or parameters, and decreasing or increasing sampling frequency. To this end, this thesis studied the efficiency of multivariate statistical procedures. Given their features, multivariate statistical procedures can serve as a practical and useful tool for recognizing and analyzing river pollution and, consequently, for understanding, reasoning about, controlling, and making sound decisions in water quality management. The research applied multivariate statistical techniques to analyze water quality, and to monitor the variables affecting it, in the Gharasou river in Ardabil province, northwest Iran. Over one year, 28 physical and chemical parameters were sampled at 11 stations. The measurements were analyzed with multivariate procedures including Cluster Analysis (CA), Principal Component Analysis (PCA), Factor Analysis (FA), and Discriminant Analysis (DA). Based on the findings from cluster analysis, principal component analysis, and factor analysis, the stations were divided into three groups: highly polluted (HP), moderately polluted (MP), and less polluted (LP). The study thus illustrates the usefulness of multivariate statistical techniques for the analysis and interpretation of complex data sets in water quality assessment, the identification of pollution sources/factors, and the understanding of spatial variations in water quality for effective river water quality management, as well as their effectiveness for obtaining better information about water quality and for designing monitoring networks for effective management of water resources. Based on these results, a water quality monitoring program for the Gharasou river was developed and presented.
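As an illustration of the station-grouping workflow described above (standardize the measurements, reduce them with PCA, then cluster the stations), the following Python sketch runs on synthetic data standing in for the 11-station by 28-parameter matrix; all sizes and names are assumptions, not the thesis's actual data or code.

```python
# Minimal sketch: PCA + hierarchical clustering of monitoring stations.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
X = rng.lognormal(mean=1.0, sigma=0.5, size=(11, 28))  # 11 stations x 28 parameters

Xz = StandardScaler().fit_transform(X)   # z-score so parameters are comparable

# PCA / factor-analysis step: keep components explaining ~90% of the variance
pca = PCA(n_components=0.9)
scores = pca.fit_transform(Xz)

# Hierarchical cluster analysis (Ward linkage) on the component scores,
# cut into three groups (e.g., HP / MP / LP pollution levels)
Z = linkage(scores, method="ward")
groups = fcluster(Z, t=3, criterion="maxclust")
print(groups)  # one cluster label per station
```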
Abstract:
Environmental pollution continues to be an emerging field of study, as thousands of anthropogenic compounds are mixed in the environment whose possible mechanisms of toxicity and physiological outcomes are of great concern. Developing methods to assess and prioritize the screening of these compounds at trace levels, in order to support regulatory efforts, is therefore very important. A methodology based on solid-phase extraction followed by derivatization and gas chromatography-mass spectrometry (GC-MS) analysis was developed for the assessment of four endocrine-disrupting compounds (EDCs) in water matrices: bisphenol A, estrone, 17β-estradiol, and 17α-ethinylestradiol. The study was performed simultaneously by two different laboratories in order to evaluate the robustness of the method and to increase quality control over its application in routine analysis. Validation followed the International Conference on Harmonisation recommendations and other international guidelines with specifications for GC-MS methodology. Matrix-induced chromatographic response enhancement was avoided by using matrix-matched calibration solutions, and heteroscedasticity was overcome by applying a weighted least squares linear regression model. Key analytical parameters such as extraction efficiency, sensitivity, specificity, linearity, limits of detection and quantification, precision, accuracy, and robustness were evaluated consistently against established acceptance criteria. Finally, the application of the optimized method to the assessment of the selected analytes in environmental samples suggested that it is an expeditious methodology for routine analysis of EDC residues in water matrices.
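The weighted least squares calibration step mentioned above can be sketched as follows; the concentrations, peak areas, and the common 1/x² weighting scheme are illustrative assumptions rather than the validated method's actual figures.

```python
# Illustrative weighted least squares calibration for heteroscedastic
# responses (variance growing with concentration). Values are made up.
import numpy as np
import statsmodels.api as sm

conc = np.array([0.5, 1, 5, 10, 25, 50])             # ng/L, hypothetical levels
area = np.array([0.8, 1.7, 8.9, 17.2, 44.8, 91.5])   # GC-MS peak areas

weights = 1.0 / conc**2                    # common 1/x^2 weighting scheme
model = sm.WLS(area, sm.add_constant(conc), weights=weights).fit()
print(model.params)                        # intercept, slope

# Quantify an unknown sample from its measured peak area
b0, b1 = model.params
print((30.0 - b0) / b1)                    # estimated concentration
```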
Abstract:
The development and application of a polyaniline/carbon nanotube (CNT) cyclodextrin matrix (PANI-β-CD/MWCNT)-based electrochemical sensor for the quantitative determination of the herbicide 4-chloro-2-methylphenoxyacetic acid (MCPA) and its main transformation product, 4-chloro-2-methylphenol, in natural waters are described. A simple cyclic voltammetry-based electrochemical methodology, in phosphate buffer solution at pH 6.0, was used to determine both MCPA and 4-chloro-2-methylphenol without any prior extraction or derivatization steps. A linear concentration range (10 to 50 μmol L−1) and detection limits of 1.1 and 1.9 μmol L−1, respectively, were achieved using optimized cyclic voltammetric parameters. The proposed method was successfully applied to the determination of MCPA and 4-chloro-2-methylphenol in natural water samples, with satisfactory recoveries (94 to 107%) and in good agreement with the results obtained by an established high-performance liquid chromatography technique, with no significant differences found between the methods. Interferences from ionic species and from other herbicides used for broad-leaf weed control were shown to be small. The newly developed methodology was also successfully applied to environmental studies of MCPA photodegradation.
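A hedged sketch of how detection limits like those reported above can be estimated from a calibration line (LOD ≈ 3.3·s/slope, LOQ ≈ 10·s/slope, following ICH-style guidance); the current values are invented for the example.

```python
# Estimate LOD/LOQ from the residual standard deviation of a calibration fit.
import numpy as np

conc = np.array([10, 20, 30, 40, 50], dtype=float)  # µmol/L, the linear range
current = np.array([0.52, 1.01, 1.55, 2.02, 2.49])  # peak current, µA (hypothetical)

slope, intercept = np.polyfit(conc, current, 1)
resid = current - (slope * conc + intercept)
s = resid.std(ddof=2)                     # residual standard deviation (n - 2)

print("LOD:", 3.3 * s / slope, "µmol/L")
print("LOQ:", 10 * s / slope, "µmol/L")
```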
Abstract:
The Spanish Government has established post-market environmental monitoring (PMEM) as mandatory for genetically modified (GM) crop varieties cultivated in Spain. To comply with this regulation, the effects of Bt maize varieties derived from event MON810 on the predatory fauna were monitored for two years in northeast and central Spain. The study used a randomized block design in maize fields of 3-4 ha, in which the abundance of plant-dwelling predators and the activity-density of soil-dwelling predators were compared between Bt and non-Bt near-isogenic varieties. To this end, the plots were sampled by visual inspection of a set number of plants and by pitfall traps, six or seven times over two seasons. No significant differences in predator densities on plants were found between Bt and non-Bt varieties. In the pitfall traps, significant differences between the two types of maize were found only for Staphylinidae, whose trap catches were higher in non-Bt than in Bt maize in central Spain. Based on the statistical power of the assays, surrogate arthropods for PMEM purposes are proposed: Orius spp. and Araneae for visual sampling, and Carabidae, Araneae, and Staphylinidae for pitfall trapping. The other predator groups recorded in the study (Nabis sp. and Coccinellidae in visual sampling, Dermaptera in pitfall trapping) gave very poor power results. To help establish a standardized protocol for PMEM of genetically modified crops, the effect-detecting capacity of each predator group at a power of 0.8 is given.
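The "effect-detecting capacity with a power of 0.8" idea can be illustrated with a standard power calculation; the replicate count and effect sizes below are assumptions for the sketch, not the study's design.

```python
# For a given number of Bt / non-Bt replicates, what standardized effect
# can a two-sample t-test detect at alpha = 0.05 and power = 0.8?
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()

# Smallest detectable standardized effect with 8 replicates per variety
d = analysis.solve_power(nobs1=8, alpha=0.05, power=0.8, ratio=1.0)
print(f"minimum detectable effect size: d = {d:.2f}")

# Conversely: the power achieved for an assumed effect of d = 1.0
print(analysis.solve_power(effect_size=1.0, nobs1=8, alpha=0.05))
```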
Abstract:
An important task in environmental monitoring is to assess the current state of the environment and the human-induced changes to it, and to analyze and search for the relationships between them. Environmental change can be managed by collecting and analyzing data. In this Master's thesis, changes observed in aquatic vegetation were studied using remotely acquired measurement data and image analysis methods. Aerial images of Lake Saimaa, the largest lake in Finland, taken in 1996 and 1999, were used for the environmental monitoring. The first stage of the image analysis is geometric correction, whose purpose is to align and scale the images to a common coordinate system. The second stage is to match the corresponding local areas and to detect changes in the vegetation. Various approaches were used for vegetation recognition, including supervised and unsupervised classification methods. The study used real, noisy measurement data, and the experiments performed on it gave good results regarding the success of the approach.
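A minimal sketch of the unsupervised change-detection stage, assuming the two images have already been geometrically co-registered to the same grid; the synthetic arrays below stand in for the 1996 and 1999 aerial images.

```python
# Cluster per-pixel band differences into "change" vs. "no change".
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
img_1996 = rng.random((100, 100, 3))                   # co-registered, 3 bands
img_1999 = img_1996 + rng.normal(0, 0.05, (100, 100, 3))
img_1999[40:60, 40:60] += 0.4                          # synthetic vegetation change

diff = (img_1999 - img_1996).reshape(-1, 3)            # per-pixel differences

# The cluster whose center has the larger difference magnitude is "change"
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(diff)
mags = [np.linalg.norm(km.cluster_centers_[k]) for k in range(2)]
change_mask = (km.labels_ == int(np.argmax(mags))).reshape(100, 100)
print(change_mask.sum(), "pixels flagged as changed")
```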
Abstract:
A number of recent works have introduced statistical methods for detecting genetic loci that affect phenotypic variability, which we refer to as variability-controlling quantitative trait loci (vQTL). These are genetic variants whose allelic state predicts how much phenotype values will vary about their expected means. Such loci are of great potential interest in both human and non-human genetic studies, one reason being that a detected vQTL could represent a previously undetected interaction with other genes or environmental factors. The simultaneous publication of these new methods in different journals has in many cases precluded opportunity for comparison. We survey some of these methods, the respective trade-offs they imply, and the connections between them. The methods fall into three main groups: classical non-parametric, fully parametric, and semi-parametric two-stage approximations. Choosing between alternatives involves balancing the need for robustness, flexibility, and speed. For each method, we identify important assumptions and limitations, including those of practical importance, such as their scope for including covariates and random effects. We show in simulations that both parametric methods and their semi-parametric approximations can give elevated false positive rates when they ignore mean-variance relationships intrinsic to the data generation process. We conclude that choice of method depends on the trait distribution, the need to include non-genetic covariates, and the population size and structure, coupled with a critical evaluation of how these fit with the assumptions of the statistical model.
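One of the simplest vQTL scans alluded to above, a classical variance-heterogeneity test across genotype groups, might look like the following sketch on simulated data; the Brown-Forsythe (median-centered Levene) test is used here as a robust example, not as any surveyed author's specific method.

```python
# Test whether phenotype *variance* differs across genotype groups at a locus.
import numpy as np
from scipy.stats import levene

rng = np.random.default_rng(2)
n = 600
geno = rng.integers(0, 3, size=n)          # 0/1/2 allele counts at one SNP

# Simulate a variance effect: higher genotype dose -> noisier phenotype
pheno = rng.normal(loc=0.0, scale=1.0 + 0.4 * geno)

groups = [pheno[geno == g] for g in (0, 1, 2)]
stat, p = levene(*groups, center="median")  # robust to non-normality
print(f"Brown-Forsythe statistic = {stat:.2f}, p = {p:.3g}")
```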
Abstract:
This paper defines and compares several models for describing excess influenza and pneumonia mortality in Houston. First, the methodology used by the Center for Disease Control is examined, along with several variations of it. All of the models examined highlight the difficulty of omitting epidemic weeks. In an attempt to find a better method of describing expected and epidemic mortality, time series methods are examined. Grouping the data in four-week periods, truncating the series to adjust for epidemic periods, and seasonally adjusting the series y(t) (the adjustment formula is omitted in the source; see DAI) is the best method examined. The resulting series w(t) is stationary, and a moving average model MA(1) gives a good fit for forecasting influenza and pneumonia mortality in Houston. Influenza morbidity, other causes of death, sex, race, age, climate variables, environmental factors, and school absenteeism are all examined in terms of their relationship to influenza and pneumonia mortality. Both influenza morbidity and ischemic heart disease mortality show a very strong relationship that remains when seasonal trends are removed from the data. However, when the three series are modeled jointly, it is clear that the simple MA(1) model of the truncated, seasonally adjusted four-week data gives a better forecast.
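The final model can be sketched directly: an MA(1) fit to a (here simulated) seasonally adjusted series, using statsmodels' ARIMA with order (0, 0, 1). The series and coefficient are stand-ins, not the Houston data.

```python
# Fit an MA(1) model and forecast the next four 4-week periods.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(3)
eps = rng.normal(size=200)
w = eps[1:] + 0.6 * eps[:-1]       # synthetic MA(1) series w_t

fit = ARIMA(w, order=(0, 0, 1)).fit()
print(fit.params)                  # constant, MA coefficient, innovation variance
print(fit.forecast(steps=4))       # forecasts for the next four periods
```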
Abstract:
Complex diseases such as cancer result from multiple genetic changes and environmental exposures. Owing to the rapid development of genotyping and sequencing technologies, we can now assess the causal effects of many genetic and environmental factors more accurately. Genome-wide association studies have localized many causal genetic variants predisposing to certain diseases; however, these studies explain only a small portion of the heritability of those diseases. More advanced statistical models are urgently needed to identify and characterize additional genetic and environmental factors and their interactions, enabling us to better understand the causes of complex diseases. In the past decade, thanks to increasing computational capability and novel statistical developments, Bayesian methods have been widely applied in genetics/genomics research and have demonstrated superiority over standard approaches in certain areas. Gene-environment and gene-gene interaction studies are among the areas where the functionality and advantages of Bayesian methods can be fully exploited. This dissertation focuses on developing new Bayesian statistical methods for analyzing data with complex gene-environment and gene-gene interactions, and on extending existing methods for gene-environment interactions to related areas. It comprises three parts: (1) deriving a Bayesian variable selection framework for hierarchical gene-environment and gene-gene interactions; (2) developing Bayesian Natural and Orthogonal Interaction (NOIA) models for gene-environment interactions; and (3) extending two Bayesian statistical methods developed for gene-environment interaction studies to related problems such as adaptive borrowing of historical data.

We propose a Bayesian hierarchical mixture model framework that allows us to investigate genetic and environmental effects, gene-gene interactions (epistasis), and gene-environment interactions within the same model. In many practical situations there is a natural hierarchical structure between the main effects and interactions in the linear model. We propose a model that incorporates this hierarchical structure into the Bayesian mixture model, so that irrelevant interaction effects can be removed more efficiently, yielding more robust, parsimonious, and powerful models. We evaluate both 'strong hierarchical' and 'weak hierarchical' models, which require, respectively, that both or at least one of the main effects of the interacting factors be present for the interaction to be included in the model. Extensive simulations show that the proposed strong and weak hierarchical mixture models control the proportion of false positive discoveries and provide a powerful approach to identifying predisposing main effects and interactions in studies with complex gene-environment and gene-gene interactions. We also compare these two models with an 'independent' model that does not impose the hierarchical constraint, and observe their superior performance in most of the situations considered. The proposed models are applied to real data analyses of gene-environment interactions in lung cancer and cutaneous melanoma case-control studies. Bayesian statistical models have the advantage of allowing useful prior information to be incorporated in the modeling process.

Moreover, the Bayesian mixture model outperforms the multivariate logistic model in parameter estimation and variable selection in most cases. Our proposed models add the hierarchical constraints, further improving the Bayesian mixture model by reducing the proportion of false positive findings among the identified interactions while successfully recovering the reported associations. This is practically appealing for studies investigating causal factors among a moderate number of candidate genetic and environmental factors together with a relatively large number of interactions. The natural and orthogonal interaction (NOIA) models of genetic effects were previously developed to provide an analysis framework in which the estimated effects for a quantitative trait are statistically orthogonal regardless of whether Hardy-Weinberg Equilibrium (HWE) holds within loci. Ma et al. (2012) recently developed a NOIA model for gene-environment interaction studies and showed its advantages for detecting true main effects and interactions compared with the usual functional model. In this project, we propose a novel Bayesian statistical model that combines the Bayesian hierarchical mixture model with the NOIA statistical model and the usual functional model. The proposed Bayesian NOIA model shows more power for detecting non-null effects, with higher marginal posterior probabilities. We also review two Bayesian statistical models (a Bayesian empirical shrinkage-type estimator and Bayesian model averaging) developed for gene-environment interaction studies. Inspired by these models, we develop two novel statistical methods that can handle related problems such as borrowing data from historical studies. The proposed methods are analogous to the gene-environment interaction methods in how they balance statistical efficiency and bias within a unified model. Through extensive simulation studies, we compare the operating characteristics of the proposed models with those of existing models, including the hierarchical meta-analysis model. The results show that the proposed approaches adaptively borrow the historical data in a data-driven way. These novel models may have a broad range of statistical applications in both genetic/genomic and clinical studies.
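The strong/weak hierarchy rule at the core of the proposed framework can be stated compactly in code. The sketch below is a deliberately non-Bayesian rendering of just that inclusion rule, with hypothetical factor names; the dissertation's actual models embed it in a Bayesian mixture prior.

```python
# Heredity rule: an interaction enters the model only if both (strong) or
# at least one (weak) of its main effects is already selected.
def allowed_interactions(selected_main, candidates, rule="strong"):
    """Filter candidate (i, j) interaction pairs by a heredity rule."""
    keep = []
    for i, j in candidates:
        in_i, in_j = i in selected_main, j in selected_main
        if rule == "strong" and in_i and in_j:
            keep.append((i, j))
        elif rule == "weak" and (in_i or in_j):
            keep.append((i, j))
    return keep

mains = {"G1", "E1"}   # main effects currently in the model (hypothetical)
pairs = [("G1", "E1"), ("G1", "G2"), ("G2", "E1"), ("G2", "E2")]
print(allowed_interactions(mains, pairs, "strong"))  # [('G1', 'E1')]
print(allowed_interactions(mains, pairs, "weak"))    # drops only ('G2', 'E2')
```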
Abstract:
Accurate quantitative estimation of exposure using retrospective data has been one of the most challenging tasks in the exposure assessment field. To improve these estimates, models have been developed using published exposure databases with their corresponding exposure determinants. These models are designed to be applied to reported exposure determinants obtained from study subjects, or to exposure levels assigned by an industrial hygienist, so that quantitative exposure estimates can be obtained. In an effort to improve the prediction accuracy and generalizability of these models, and considering that the limitations encountered in previous studies might stem from limitations in the applicability of traditional statistical methods and concepts, this study proposed and explored computer science-derived data analysis methods, predominantly machine learning approaches. The goal was to develop a set of models using decision tree/ensemble and neural network methods to predict occupational exposure outcomes from literature-derived databases, and to compare the resulting prediction capacity with that of traditional regression models using cross-validation and data-splitting techniques. Two cases were addressed: the categorical case, where the exposure level was measured as an exposure rating following the American Industrial Hygiene Association guidelines, and the continuous case, where the exposure result is expressed as a concentration value. Previously developed literature-based exposure databases for 1,1,1-trichloroethane, methylene dichloride, and trichloroethylene were used. When compared with regression estimates, the results showed better accuracy for decision tree/ensemble techniques in the categorical case, while neural networks were better for estimating continuous exposure values. Overrepresentation of classes and overfitting were the main causes of poor neural network performance and accuracy. Estimations from literature-based databases using machine learning techniques might provide an advantage when applied to methodologies that combine 'expert inputs' with current exposure measurements, such as the Bayesian Decision Analysis tool. The use of machine learning techniques to estimate exposures from literature-based exposure databases more accurately might represent a starting point for independence from expert judgment.
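The model comparison described above can be sketched with cross-validation on a synthetic stand-in for a literature-derived database (determinants X, continuous exposure y); the data generator, model settings, and scoring choice are assumptions for illustration.

```python
# Compare regression, tree ensemble, and neural network by cross-validated R^2.
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = make_regression(n_samples=300, n_features=10, noise=10.0, random_state=0)

models = {
    "linear regression": LinearRegression(),
    "tree ensemble": RandomForestRegressor(n_estimators=200, random_state=0),
    "neural network": make_pipeline(
        StandardScaler(),
        MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0),
    ),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5, scoring="r2")
    print(f"{name}: mean cross-validated R^2 = {scores.mean():.2f}")
```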
Abstract:
The purpose of this research is to examine design considerations for environmental monitoring platforms for the detection of hazardous materials using System-on-a-Chip (SoC) design. The design considerations focus on improving three key areas: (1) sampling methodology; (2) context awareness; and (3) sensor placement. These considerations for environmental monitoring platforms using wireless sensor networks (WSN) are applied to the detection of methylmercury (MeHg) and of the environmental parameters affecting its formation (methylation) and degradation (demethylation). The sampling methodology investigates a proof of concept for monitoring MeHg using three primary components: (1) chemical derivatization; (2) preconcentration using the purge-and-trap (P&T) method; and (3) sensing using quartz crystal microbalance (QCM) sensors. This part of the study focuses on the measurement of inorganic mercury (e.g., Hg2+) and applies the lessons learned to the detection of organic mercury (e.g., MeHg). Context awareness of a WSN and its sampling strategies are enhanced using spatial analysis techniques, namely geostatistical analysis (classical variography and ordinary point kriging), to help predict the phenomenon of interest at unmonitored locations (i.e., locations without sensors). This aids in making more informed decisions on the control of the WSN (e.g., communications strategy, power management, resource allocation, sampling rate and strategy), and it improves the precision of control by adding potentially significant information about unmonitored locations. Two types of sensors are investigated for near-optimal placement in a WSN: (1) environmental sensors (e.g., humidity, moisture, temperature) and (2) visual sensors (e.g., cameras). Near-optimal placement of environmental sensors is found with a strategy that minimizes the variance of the spatial analysis over randomly chosen candidate sensor locations; the spatial analysis uses geostatistics, and the optimization is carried out with Monte Carlo analysis. Visual sensor placement for omnidirectional cameras operating in a WSN uses an optimal placement metric (OPM), calculated for each grid point from the line-of-sight in a defined number of directions, taking known obstacles into consideration. Optimal areas for camera placement are those generating the largest OPMs. The statistical behavior is examined using Monte Carlo analysis with varying numbers of obstacles and cameras in a defined space.
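A simplified rendering of the visual-sensor OPM: for each free grid cell, rays are cast in a fixed number of directions and the cell is scored by its unobstructed line-of-sight; the grid size, range, and obstacle layout are arbitrary assumptions for the sketch.

```python
# Score each grid cell by visible cells along rays in N_DIRS directions.
import numpy as np

GRID, RANGE, N_DIRS = 20, 8, 16
obstacles = np.zeros((GRID, GRID), dtype=bool)
obstacles[5:9, 10:12] = True               # a hypothetical building footprint

def opm(x, y):
    """Line-of-sight score for an omnidirectional camera at cell (x, y)."""
    score = 0
    for theta in np.linspace(0, 2 * np.pi, N_DIRS, endpoint=False):
        for r in range(1, RANGE + 1):      # march outward along the ray
            i = int(round(x + r * np.cos(theta)))
            j = int(round(y + r * np.sin(theta)))
            if not (0 <= i < GRID and 0 <= j < GRID) or obstacles[i, j]:
                break                      # ray blocked or off the grid
            score += 1                     # one more visible cell
    return score

scores = np.array([[opm(x, y) if not obstacles[x, y] else -1
                    for y in range(GRID)] for x in range(GRID)])
print("best placement:", np.unravel_index(scores.argmax(), scores.shape))
```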
Abstract:
Time series analysis of multispectral satellite data offers an innovative way to extract valuable information about our changing planet. This is now a realistic option for scientists thanks to data availability as well as innovative cloud-computing platforms such as Google Earth Engine. Integrating different missions would mitigate known issues in constructing multispectral time series, such as gaps due to clouds or other atmospheric effects. To this end, harmonization among Landsat-like missions is possible through statistical analysis. This research offers an overview of the different instruments of the Landsat and Sentinel missions (TM, ETM, OLI, OLI-2, and MSI sensors) and product levels (Collection-2 Level-1 and Surface Reflectance for Landsat; Level-1C and Level-2A for Sentinel-2). Moreover, a cross-sensor comparison was performed to assess the interoperability of the sensors on board the Landsat and Sentinel-2 constellations, with a possible combined use for time series analysis in mind. First, more than 20,000 pairs of images acquired almost simultaneously all over Europe were selected over a period of several years. A cross-comparison analysis of these data provided an assessment of the calibration coefficients that can be used to minimize differences in combined use. Four of the most popular vegetation indices were selected for the study: NDVI, EVI, SAVI, and NDMI. As a result, it is possible to reconstruct a longer and denser harmonized time series extending back to 1984, useful for vegetation monitoring purposes. Second, the spectral characteristics of the recent Landsat-9 mission were assessed for combined use with Landsat-8 and Sentinel-2. A cross-sensor analysis of the common bands of more than 3,000 almost simultaneous acquisitions verified a high consistency between the datasets. The most relevant discrepancy was observed in the blue and SWIR bands, which are often used in vegetation- and water-related studies. This analysis was supported by spectroradiometer ground measurements.
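The cross-sensor harmonization step reduces to estimating linear calibration coefficients between near-simultaneous observations; the sketch below does this for NDVI, with synthetic paired values standing in for the actual image pairs.

```python
# Fit gain/offset between paired NDVI observations from two sensors.
import numpy as np

rng = np.random.default_rng(4)
ndvi_s2 = rng.uniform(0.1, 0.9, 500)                          # Sentinel-2 MSI NDVI
ndvi_l8 = 0.97 * ndvi_s2 + 0.01 + rng.normal(0, 0.02, 500)    # Landsat-8 OLI NDVI

gain, offset = np.polyfit(ndvi_l8, ndvi_s2, 1)   # ordinary least squares fit
print(f"NDVI_s2 ~= {gain:.3f} * NDVI_l8 + {offset:.3f}")

# Apply the coefficients to bring Landsat NDVI onto the Sentinel-2 scale,
# enabling one harmonized time series across both constellations
harmonized = gain * ndvi_l8 + offset
print(np.abs(harmonized - ndvi_s2).mean())       # mean residual difference
```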
Abstract:
Tree rings have frequently been used for dating trees and for determining annual growth increments and forest dynamics, but little is known about their use for environmental monitoring under tropical conditions. This paper presents the results of characterizing Araucaria columnaris tree rings by wood anatomy and X-ray densitometric analysis, and of determining their Pb concentration. Core samples were extracted from twelve araucaria trees at two sites exposed to air pollution from intense vehicle traffic and industrial activity. The tree rings distinctly presented radial variation in early/latewood thickness and density, and characteristics of juvenile and mature wood. Anatomical and X-ray densitometric analyses were useful for delimiting the tree-ring boundaries and dating the rings, as well as for proving their annual formation. The lead concentration in annual araucaria tree rings, analyzed with graphite furnace atomic absorption spectrometry, indicated the seasonal presence of this heavy metal in the environment during the 30 years studied, although the Pb did not affect tree growth.
Abstract:
Two major factors are likely to impact the utilisation of remotely sensed data in the near future: (1) an increase in the number and availability of commercial and non-commercial image data sets with a range of spatial, spectral, and temporal dimensions, and (2) increased access to image display and analysis software through GIS. A framework was developed to provide an objective approach to selecting remotely sensed data sets for specific environmental monitoring problems. Preliminary applications of the framework have provided successful approaches for monitoring disturbed and restored wetlands in southern California.