941 results for "Environmental monitoring Statistical methods"


Relevance: 100.00%

Abstract:

Snow samples collected from hand-dug pits at two sites in Simcoe County, Ontario, Canada were analysed for major and trace elements using the clean lab methods established for polar ice. Potentially toxic, chalcophile elements are highly enriched in snow, relative to their natural abundance in crustal rocks, with enrichment factor (EF) values (calculated using Sc) in the range 107 to 1081 for Ag, As, Bi, Cd, Cu, Mo, Pb, Sb, Te, and Zn. Relative to M/Sc ratios in snow, water samples collected at two artesian flows in this area are significantly depleted in Ag, Al, Be, Bi, Cd, Cr, Cu, Ni, Pb, Sb, Tl, V, and Zn at both sites, and in Co, Th and Tl at one of the sites. The removal of these elements from the waters is presumably due to such processes as physical retention (filtration) of metal-bearing atmospheric aerosols by organic and mineral soil components, as well as adsorption and surface complexation of ionic species onto organic, metal oxyhydroxide and clay mineral surfaces. In the case of Pb, the removal processes are so effective that apparently "natural" ratios of Pb to Sc are found in the groundwaters. Tritium measurements show that the groundwater at one of the sites is modern (i.e., not more than 30 years old), meaning that the inputs of Pb and other trace elements to the groundwaters may originally have been much higher than they are today; the M/Sc ratios measured in the groundwaters today, therefore, represent a conservative estimate of the extent of metal removal along the flow path. Lithogenic elements significantly enriched in the groundwaters at both sites include Ba, Ca, Li, Mg, Mn, Na, Rb, S, Si, Sr, and Ti. The abundance of these elements can largely be explained in terms of weathering of the dominant silicate (plagioclase, potassium feldspar, amphibole and biotite) and carbonate minerals (calcite, dolomite and ankerite) in the soils and sediments of the watershed.
As, Mo, Te, and especially U are also highly enriched in the groundwaters due to chemical weathering: these enrichments could easily be explained if there are small amounts of sulfides (As, Mo, Te) and apatite (U) in the soils of the source area. Elements neither significantly enriched nor depleted at both sites include Fe, Ga, Ge, and P.
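The Sc-normalized enrichment factor used in this abstract is a simple ratio of ratios, EF = (M/Sc)_sample / (M/Sc)_crust. A minimal sketch in Python; the concentrations below are hypothetical placeholders, not the study's measured values or its crustal reference data:

```python
def enrichment_factor(m_sample, sc_sample, m_crust, sc_crust):
    """Crustal enrichment factor of metal M, normalized to Sc:
    EF = (M/Sc)_sample / (M/Sc)_crust."""
    return (m_sample / sc_sample) / (m_crust / sc_crust)

# Illustrative (hypothetical) concentrations in ng/g
ef_pb = enrichment_factor(m_sample=50.0, sc_sample=0.05,
                          m_crust=17.0, sc_crust=22.0)
print(round(ef_pb))  # values far above 1 suggest a non-crustal source
```

EF values near 1 indicate purely crustal (dust-derived) metal; the values of 107 to 1081 reported above therefore point to strong atmospheric enrichment.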

Relevance: 100.00%

Abstract:

As human populations and resource consumption increase, it is increasingly important to monitor the quality of our environment. While laboratory instruments offer useful information, portable, easy to use sensors would allow environmental analysis to occur on-site, at lower cost, and with minimal operator training. We explore the synthesis, modification, and applications of modified polysiloxane in environmental sensing. Multiple methods of producing modified siloxanes were investigated. Oligomers were formed by using functionalized monomers, producing siloxane materials containing silicon hydride, methyl, and phenyl side chains. Silicon hydride-functionalized oligomers were further modified by hydrosilylation to incorporate methyl ester and naphthyl side chains. Modifications to the siloxane materials were also carried out using post-curing treatments. Methyl ester-functionalized siloxane was incorporated into the surface of a cured poly(dimethylsiloxane) film by siloxane equilibration. The materials containing methyl esters were hydrolyzed to reveal carboxylic acids, which could later be used for covalent protein immobilization. Finally, the siloxane surfaces were modified to incorporate antibodies by covalent, affinity, and adsorption-based attachment. These modifications were characterized by a variety of methods, including contact angle, attenuated total reflectance Fourier transform infrared spectroscopy, dye labels, and 1H nuclear magnetic resonance spectroscopy. The modified siloxane materials were employed in a variety of sensing schemes. Volatile organic compounds were detected using methyl, phenyl, and naphthyl-functionalized materials on a Fabry-Perot interferometer and a refractometer. The Fabry-Perot interferometer was found to detect the analytes upon siloxane extraction by deformation of the Bragg reflectors. 
The refractometer was used to determine that naphthyl-functionalized siloxanes had elevated refractive indices, rendering these materials more sensitive to some analytes. Antibody-modified siloxanes were used to detect biological analytes through a solid phase microextraction-mediated enzyme-linked immunosorbent assay (SPME ELISA). The SPME ELISA was found to have higher analyte sensitivity compared to a conventional ELISA system. The detection scheme was used to detect Escherichia coli at 8500 CFU/mL. These results demonstrate the variety of methods that can be used to modify siloxanes, and the wide range of applications of the modified materials has been demonstrated through chemical and biological sensing schemes.

Relevance: 100.00%

Abstract:

Hypertrophic cardiomyopathy (HCM) is a cardiovascular disease where the heart muscle is partially thickened and blood flow is - potentially fatally - obstructed. It is one of the leading causes of sudden cardiac death in young people. Electrocardiography (ECG) and Echocardiography (Echo) are the standard tests for identifying HCM and other cardiac abnormalities. The American Heart Association has recommended using a pre-participation questionnaire for young athletes instead of ECG or Echo tests due to considerations of cost and time involved in interpreting the results of these tests by an expert cardiologist. Initially we set out to develop a classifier for automated prediction of young athletes’ heart conditions based on the answers to the questionnaire. Classification results and further in-depth analysis using computational and statistical methods indicated significant shortcomings of the questionnaire in predicting cardiac abnormalities. Automated methods for analyzing ECG signals can help reduce cost and save time in the pre-participation screening process by detecting HCM and other cardiac abnormalities. Therefore, the main goal of this dissertation work is to identify HCM through computational analysis of 12-lead ECG. ECG signals recorded on one or two leads have been analyzed in the past for classifying individual heartbeats into different types of arrhythmia as annotated primarily in the MIT-BIH database. In contrast, we classify complete sequences of 12-lead ECGs to assign patients into two groups: HCM vs. non-HCM. The challenges and issues we address include missing ECG waves in one or more leads and the dimensionality of a large feature-set. We address these by proposing imputation and feature-selection methods. We develop heartbeat-classifiers by employing Random Forests and Support Vector Machines, and propose a method to classify full 12-lead ECGs based on the proportion of heartbeats classified as HCM. 
The results from our experiments show that the classifiers developed using our methods perform well in identifying HCM. Thus the two contributions of this thesis are the utilization of computational and statistical methods for discovering shortcomings in a current screening procedure and the development of methods to identify HCM through computational analysis of 12-lead ECG signals.
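The record-level decision described above, labelling a full 12-lead ECG from the proportion of its heartbeats classified as HCM, can be sketched as follows. The per-beat probabilities stand in for the output of the Random Forest or SVM heartbeat classifier, and the threshold is an illustrative tuning parameter, not the dissertation's fitted value:

```python
import numpy as np

def classify_record(beat_probs, threshold=0.5):
    """Patient-level decision from per-beat HCM probabilities:
    label the full 12-lead record HCM if the proportion of
    beats classified as HCM exceeds `threshold`."""
    hcm_beats = np.asarray(beat_probs) >= 0.5  # per-beat decision
    proportion = hcm_beats.mean()
    return ("HCM" if proportion > threshold else "non-HCM", proportion)

label, p = classify_record([0.9, 0.8, 0.2, 0.7, 0.6], threshold=0.5)
print(label, p)  # HCM 0.8
```

Aggregating beat-level decisions this way makes the record-level classifier robust to a few mislabelled or noisy beats.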

Relevance: 100.00%

Abstract:

The complexity of modern geochemical data sets is increasing in several aspects (number of available samples, number of elements measured, number of matrices analysed, geological-environmental variability covered, etc.), hence it is becoming increasingly necessary to apply statistical methods to elucidate their structure. This paper presents an exploratory analysis of one such complex data set, the Tellus geochemical soil survey of Northern Ireland (NI). This exploratory analysis is based on one of the most fundamental exploratory tools, principal component analysis (PCA) and its graphical representation as a biplot, albeit in several variations: the set of elements included (only major oxides vs. all observed elements), the prior transformation applied to the data (none, a standardization or a log-ratio transformation) and the way the covariance matrix between components is estimated (classical estimation vs. robust estimation). Results show that a log-ratio PCA (robust or classical) of all available elements is the most powerful exploratory setting, providing the following insights: the first two processes controlling the whole geochemical variation in NI soils are peat coverage and a contrast between "mafic" and "felsic" background lithologies; peat-covered areas are detected as outliers by a robust analysis, and can then be filtered out if required for further modelling; and peat coverage intensity can be quantified with the %Br in the subcomposition (Br, Rb, Ni).
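The log-ratio PCA setting the paper favours can be sketched minimally as a centered log-ratio (clr) transform followed by classical PCA via SVD; the robust covariance-estimation variant is omitted here, and the compositions are synthetic:

```python
import numpy as np

def clr(X):
    """Centered log-ratio transform: log of each part minus the
    row-wise mean of logs; rows are compositions (all parts > 0)."""
    L = np.log(X)
    return L - L.mean(axis=1, keepdims=True)

def pca(X):
    """Classical PCA via SVD of the column-centered data;
    returns scores and loadings, the two layers of a biplot."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return U * s, Vt.T  # scores (samples), loadings (elements)

rng = np.random.default_rng(0)
comp = rng.dirichlet(np.ones(5), size=100)  # synthetic 5-part compositions
scores, loadings = pca(clr(comp))
print(scores.shape, loadings.shape)  # (100, 5) (5, 5)
```

The clr transform frees the analysis from the constant-sum constraint of compositional data, which is why the log-ratio variant outperforms PCA on raw or standardized concentrations.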

Relevance: 100.00%

Abstract:

Sensor networks consist of a set of devices able to individually take measurements of a particular environment and to exchange information in order to obtain a high-level representation of the activities under way in the area of interest. Such distributed sensing, with many devices located close to the phenomena of interest, is relevant in fields such as surveillance, agriculture, environmental observation, industrial monitoring, etc. In this thesis we propose several approaches for optimizing the spatio-temporal operation of these devices, by determining where to place them in the environment and how to control them over time in order to detect the mobile targets of interest. The first contribution is a realistic detection model representing the coverage of a sensor network in its environment. For this we propose a probabilistic 3D model of a sensor's detection capability over its surroundings. This model also incorporates information about the environment through a visibility evaluation based on the field of view. Building on this detection model, spatial optimization is performed by searching for the best location and orientation of each sensor in the network. To do so, we propose a new gradient-descent-based algorithm that compared favourably with other generic "black-box" optimization methods in terms of terrain coverage, while being more computationally efficient. Once the sensors are placed in the environment, temporal optimization consists in properly covering a group of mobile targets in the environment. First, the future positions of the mobile targets detected by the sensors are predicted. 
The prediction is made either from the history of other targets that have crossed the same environment (long-term prediction) or using only the previous movements of the same target (short-term prediction). We propose new algorithms in each category that perform better than, or produce results comparable to, existing methods. Once the future target locations are predicted, the sensor parameters are optimized so that the targets are properly covered for some time, according to the predictions. To this end, we propose a heuristic sensor-control method based on the probabilistic trajectory predictions of the targets as well as on the sensors' probabilistic coverage of the targets. Finally, the proposed spatial and temporal optimization methods were integrated and applied successfully, demonstrating a complete and effective approach to the spatio-temporal optimization of sensor networks.
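The gradient-based spatial optimization can be illustrated with a toy sketch: numerical gradient ascent on a probabilistic coverage objective for a single sensor position. The Gaussian detection model, learning rate, and step count are hypothetical stand-ins for the thesis's 3D probabilistic model:

```python
import numpy as np

def coverage(sensor, targets, sigma=1.0):
    """Total probabilistic coverage: per-target detection probability
    decays as a Gaussian of distance from the sensor (toy model)."""
    d2 = ((targets - sensor) ** 2).sum(axis=1)
    return np.exp(-d2 / (2 * sigma ** 2)).sum()

def place_sensor(targets, start, lr=0.1, steps=200, eps=1e-5):
    """Numerical (central-difference) gradient ascent on coverage."""
    x = np.array(start, dtype=float)
    for _ in range(steps):
        g = np.zeros_like(x)
        for i in range(x.size):
            d = np.zeros_like(x)
            d[i] = eps
            g[i] = (coverage(x + d, targets)
                    - coverage(x - d, targets)) / (2 * eps)
        x += lr * g  # ascend the coverage objective
    return x

targets = np.array([[0.0, 0.0], [1.0, 0.0]])
best = place_sensor(targets, start=[0.3, 0.8])
print(best.round(2))  # settles between the two targets, near (0.5, 0)
```

A real placement problem would optimize position and orientation for every sensor jointly and account for terrain visibility, but the gradient-ascent skeleton is the same.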

Relevance: 100.00%

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-08

Relevance: 100.00%

Abstract:

The current anode quality-control strategy is inadequate for detecting defective anodes before they are set in the electrolysis cells. Previous work modelled the anode manufacturing process in order to predict anode properties directly after baking, using multivariate statistical methods. The anode coring strategy used at the partner plant means that this model can only be used to predict the properties of anodes baked at the hottest and coldest positions of the baking furnace. The present work proposes a strategy for taking into account the thermal history of anodes baked at any position, making it possible to predict their properties. It is shown that by combining binary variables defining the baking pit and position with routine data measured on the baking furnace, the temperature profiles of anodes baked at different positions can be predicted. These data were also included in the model for predicting anode properties. The predictions were validated through additional coring, and the model's performance is conclusive for apparent and real density, compressive strength, air reactivity, and Lc, regardless of baking position.
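The combination of binary pit/position indicators with routine furnace measurements can be sketched as a design-matrix construction feeding the multivariate model; the pit identifiers, position labels, and kiln values below are hypothetical:

```python
import numpy as np

def design_matrix(pit_ids, position_ids, kiln_data):
    """Combine one-hot (binary) indicators for baking pit and
    baking position with routine kiln measurements to form the
    regressors for anode-property prediction."""
    pits = sorted(set(pit_ids))
    poss = sorted(set(position_ids))
    P = np.array([[p == q for q in pits] for p in pit_ids], float)
    S = np.array([[s == q for q in poss] for s in position_ids], float)
    return np.hstack([P, S, np.asarray(kiln_data, float)])

# 3 anodes: pit id, position label, one routine kiln reading each
X = design_matrix([1, 1, 2], ["hot", "cold", "hot"],
                  [[1100.0], [950.0], [1080.0]])
print(X.shape)  # (3, 5): 2 pit columns + 2 position columns + 1 reading
```

A latent-variable regression (e.g. PLS) fitted on such a matrix can then interpolate the thermal history of anodes baked at positions that were never cored.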

Relevance: 100.00%

Abstract:

The widespread efforts to incorporate the economic values of oceans into national income accounts have reached a stage where coordination of national efforts is desirable. A symposium held in 2015 began this process by bringing together representatives from ten countries. The symposium concluded that a definition of core ocean industries was possible, but beyond that core the definition of ocean industries is in flux. Better coordination of ocean income accounts will require addressing issues of aggregation, geography, partial ocean industries, confidentiality, and imputation. Beyond the standard national income accounts, a need to incorporate environmental resource and ecosystem service values to gain a complete picture of the economic role of the oceans was identified. The U.N. System of Environmental and Economic Accounts and the Experimental Ecosystem Service Accounts provide frameworks for this expansion. This will require the development of physical accounts of environmental assets linked to the economic accounts, as well as the adaptation of transaction- and welfare-based economic valuation methods to environmental resources and ecosystem services. The future development of ocean economic data is most likely to require cooperative efforts at development of metadata standards and the use of multiple platforms of opportunity created by policy analysis, economic development, and conservation projects, both to collect new economic data and to sustain ocean economy data collection into the future by building capacity in economic data collection and use.

Relevance: 100.00%

Abstract:

Terrestrial remote sensing imagery involves the acquisition of information from the Earth's surface without physical contact with the area under study. Among the remote sensing modalities, hyperspectral imaging has recently emerged as a powerful passive technology. This technology has been widely used in the fields of urban and regional planning, water resource management, environmental monitoring, food safety, counterfeit drugs detection, oil spill and other types of chemical contamination detection, biological hazards prevention, and target detection for military and security purposes [2-9]. Hyperspectral sensors sample the reflected solar radiation from the Earth surface in the portion of the spectrum extending from the visible region through the near-infrared and mid-infrared (wavelengths between 0.3 and 2.5 µm) in hundreds of narrow (of the order of 10 nm) contiguous bands [10]. This high spectral resolution can be used for object detection and for discriminating between different objects based on their spectral characteristics [6]. However, this huge spectral resolution yields large amounts of data to be processed. For example, the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) [11] collects a 512 (along track) × 614 (across track) × 224 (bands) × 12 (bits) data cube in 5 s, corresponding to about 140 MB. Similar data collection rates are achieved by other spectrometers [12]. Such huge data volumes put stringent requirements on communications, storage, and processing. The problem of signal subspace identification of hyperspectral data represents a crucial first step in many hyperspectral processing algorithms such as target detection, change detection, classification, and unmixing. The identification of this subspace enables a correct dimensionality reduction (DR), yielding gains in data storage and retrieval and in computational time and complexity. 
Additionally, DR may also improve algorithm performance, since it reduces data dimensionality without losses in the useful signal components. The computation of statistical estimates is a relevant example of the advantages of DR, since the number of samples required to obtain accurate estimates increases drastically with the dimensionality of the data (Hughes phenomenon) [13].
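Signal-subspace identification and the resulting dimensionality reduction can be illustrated with a variance-based sketch, a simple stand-in for dedicated estimators such as HySime; the synthetic data cube and the energy threshold are assumptions:

```python
import numpy as np

def signal_subspace(X, energy=0.999):
    """Estimate the signal-subspace dimension of hyperspectral
    pixels X (pixels x bands): keep the fewest principal
    components whose cumulative variance reaches `energy`.
    Returns the dimension k and an orthonormal band-space basis."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    var = s ** 2
    cum = np.cumsum(var) / var.sum()
    k = int(np.searchsorted(cum, energy)) + 1
    return k, Vt[:k]

# Synthetic cube: 3 endmember spectra mixed over 500 pixels, plus noise
rng = np.random.default_rng(1)
E = rng.random((3, 50))                     # endmember spectra, 50 bands
A = rng.random((500, 3))                    # per-pixel abundances
X = A @ E + 1e-3 * rng.standard_normal((500, 50))
k, basis = signal_subspace(X)
print(k)  # recovers the 3-dimensional signal subspace
```

Projecting the 50-band pixels onto the k-dimensional basis reduces storage and computation while retaining essentially all of the signal, which is exactly the gain described above.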

Relevance: 100.00%

Abstract:

Internship report presented to the Escola Superior de Educação de Paula Frassinetti for the degree of Master in Pre-School Education and Teaching in the 1st Cycle of Basic Education.

Relevance: 100.00%

Abstract:

Many maritime countries in Europe have implemented marine environmental monitoring programmes which include the measurement of chemical contaminants and related biological effects. How best to integrate data obtained in these two types of monitoring into meaningful assessments has been the subject of recent efforts by the International Council for Exploration of the Sea (ICES) Expert Groups. Work within these groups has concentrated on defining a core set of chemical and biological endpoints that can be used across maritime areas, defining confounding factors, supporting parameters and protocols for measurement. The framework comprised markers for concentrations of, exposure to and effects from, contaminants. Most importantly, assessment criteria for biological effect measurements have been set and the framework suggests how these measurements can be used in an integrated manner alongside contaminant measurements in biota, sediments and potentially water. Output from this process resulted in OSPAR Commission (www.ospar.org) guidelines that were adopted in 2012 on a trial basis for a period of 3 years. The developed assessment framework can furthermore provide a suitable approach for the assessment of Good Environmental Status (GES) for Descriptor 8 of the European Union (EU) Marine Strategy Framework Directive (MSFD).

Relevance: 100.00%

Abstract:

The dinoflagellates of the Alexandrium genus are known producers of paralytic shellfish toxins that regularly impact the shellfish aquaculture industry and fisheries. Accurate detection of Alexandrium, including A. minutum, is crucial for environmental monitoring and sanitary issues. In this study, we first developed a quantitative lateral flow immunoassay (LFIA) using super-paramagnetic nanobeads for A. minutum whole cells. This dipstick assay relies on two distinct monoclonal antibodies used in a sandwich format and directed against surface antigens of this organism. No sample preparation is required. Either frozen or live cells can be detected and quantified. The specificity and sensitivity are assessed by using phytoplankton culture and field samples spiked with a known amount of cultured A. minutum cells. This LFIA is shown to be highly specific for A. minutum and able to detect 10⁵ cells/L reproducibly within 30 min. The test is applied to environmental samples already characterized by light microscopy counting. No significant difference is observed between the cell densities obtained by these two methods. This handy super-paramagnetic lateral flow immunoassay biosensor can greatly assist water quality monitoring programs as well as ecological research.

Relevance: 100.00%

Abstract:

The main aim of the research project "On the Contribution of Schools to Children's Overall Indoor Air Exposure" is to study associations between adverse health effects, namely, allergy, asthma, and respiratory symptoms, and the indoor air pollutants to which children are exposed in primary schools and homes. Specifically, this investigation reports on the design of the study and methods used for data collection within the research project and discusses factors that need to be considered when designing such a study. Further, preliminary findings concerning descriptors of selected characteristics in schools and homes, the study population, and clinical examination are presented. The research project was designed in two phases. In the first phase, 20 public primary schools were selected and a detailed inspection and indoor air quality (IAQ) measurements including volatile organic compounds (VOC), aldehydes, particulate matter (PM2.5, PM10), carbon dioxide (CO2), carbon monoxide (CO), bacteria, fungi, temperature, and relative humidity were conducted. A questionnaire survey of 1600 children of ages 8-9 years was undertaken, and a lung function test, exhaled nitric oxide (eNO), and tear film stability testing were performed. The questionnaire focused on children's health and on the environment in their school and homes. One thousand and ninety-nine questionnaires were returned. In the second phase, a subsample of 68 children was enrolled for further studies, including a walk-through inspection and checklist and an extensive set of IAQ measurements in their homes. The acquired data are relevant to assess children's environmental exposures and health status.

Relevance: 100.00%

Abstract:

Abstract: Quantitative Methods (QM) is a compulsory course in the Social Science program in CEGEP. Many QM instructors assign a number of homework exercises to give students the opportunity to practice the statistical methods, which enhances their learning. However, traditional written exercises have two significant disadvantages. The first is that the feedback process is often very slow. The second disadvantage is that written exercises can generate a large amount of correcting for the instructor. WeBWorK is an open-source system that allows instructors to write exercises which students answer online. Although originally designed to write exercises for math and science students, WeBWorK programming allows for the creation of a variety of questions which can be used in the Quantitative Methods course. Because many statistical exercises generate objective and quantitative answers, the system is able to instantly assess students’ responses and tell them whether they are right or wrong. This immediate feedback has been shown to be theoretically conducive to positive learning outcomes. In addition, the system can be set up to allow students to re-try the problem if they got it wrong. This has benefits both in terms of student motivation and reinforcing learning. Through the use of a quasi-experiment, this research project measured and analysed the effects of using WeBWorK exercises in the Quantitative Methods course at Vanier College. Three specific research questions were addressed. First, we looked at whether students who did the WeBWorK exercises got better grades than students who did written exercises. Second, we looked at whether students who completed more of the WeBWorK exercises got better grades than students who completed fewer of the WeBWorK exercises. Finally, we used a self-report survey to find out what students’ perceptions and opinions were of the WeBWorK and the written exercises. 
For the first research question, a crossover design was used in order to compare whether the group that did WeBWorK problems during one unit would score significantly higher on that unit test than the other group that did the written problems. We found no significant difference in grades between students who did the WeBWorK exercises and students who did the written exercises. The second research question looked at whether students who completed more of the WeBWorK exercises would get significantly higher grades than students who completed fewer of the WeBWorK exercises. The straight-line relationship between number of WeBWorK exercises completed and grades was positive in both groups. However, the correlation coefficients for these two variables showed no real pattern. Our third research question was investigated by using a survey to elicit students’ perceptions and opinions regarding the WeBWorK and written exercises. Students reported no difference in the amount of effort put into completing each type of exercise. Students were also asked to rate each type of exercise along six dimensions and a composite score was calculated. Overall, students gave a significantly higher score to the written exercises, and reported that they found the written exercises were better for understanding the basic statistical concepts and for learning the basic statistical methods. However, when presented with the choice of having only written or only WeBWorK exercises, slightly more students preferred or strongly preferred having only WeBWorK exercises. The results of this research suggest that the advantages of using WeBWorK to teach Quantitative Methods are variable. The WeBWorK system offers immediate feedback, which often seems to motivate students to try again if they do not have the correct answer. However, this does not necessarily translate into better performance on the written tests and on the final exam. 
What has been learned is that the WeBWorK system can be used by interested instructors to enhance student learning in the Quantitative Methods course. Further research may examine more specifically how this system can be used more effectively.
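The straight-line relationship examined in the second research question is a Pearson correlation between exercises completed and grades. A minimal sketch with hypothetical completion counts and grades (not the study's data):

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation coefficient: covariance of the two
    variables divided by the product of their standard deviations."""
    x = np.asarray(x, float)
    y = np.asarray(y, float)
    xc, yc = x - x.mean(), y - y.mean()
    return float((xc @ yc) / np.sqrt((xc @ xc) * (yc @ yc)))

# Hypothetical data: WeBWorK exercises completed vs. unit-test grade
done = [2, 5, 7, 8, 10]
grade = [55, 60, 72, 70, 78]
print(round(pearson_r(done, grade), 2))  # 0.97
```

A positive r indicates that students completing more exercises tended to score higher, but, as the study notes, a correlation alone cannot establish that the exercises caused the better grades.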

Relevance: 100.00%

Abstract:

Water pollution by metals, especially heavy metals, has been drawing attention worldwide, because these aquatic pollutants represent a potential risk due to their cumulative character. Among the metals, arsenic stands out for its toxic potential. Inorganic arsenic occurs in nature in four oxidation states: As⁵⁺, As³⁺, As⁰ and As³⁻. The oxidation state of arsenic plays an important role in its behaviour and toxicity in aquatic systems. Because arsenic is extremely dangerous and harmful to the environment, new analytical methods for its chemical speciation in the environment have been published. In this study, a method was optimized and validated to perform the chemical speciation of inorganic arsenic present in water samples collected in July and October 2010 in the Patos Lagoon estuary (RS, Brazil), as part of the activities of the Environmental Monitoring Program of the Port of Rio Grande-RS. Flow-injection hydride-generation atomic absorption spectrometry (FI-HG AAS) was used, allowing As³⁺ and As⁵⁺ species to be quantified in the estuarine water samples. The concentration of trivalent inorganic arsenic was determined after addition of a sodium citrate buffer solution (0.4 mol L⁻¹; pH = 6.0). The concentration of total inorganic arsenic was determined after a pre-reduction step of the pentavalent species to the trivalent form, using a mixture of potassium iodide and ascorbic acid in concentrated hydrochloric acid. The concentration of pentavalent arsenic was calculated from the difference between the total and trivalent inorganic arsenic concentrations. The results produced by the proposed method when analysing water samples collected in the Patos Lagoon estuary were interpreted by principal component analysis. 
The statistically treated data revealed a significant interaction in this study between arsenic, suspended matter (SM) and NH₄⁺ at the surface of the water column in the spring period.
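The speciation-by-difference arithmetic described in the abstract is straightforward: As(III) is measured directly, total inorganic As is measured after pre-reduction, and As(V) is the difference. A minimal sketch with hypothetical concentrations:

```python
def arsenic_speciation(as3_measured, as_total_measured):
    """Inorganic As speciation by difference: As(III) is measured
    directly (citrate buffer, pH 6); total inorganic As is measured
    after pre-reduction of As(V) to As(III); As(V) = total - As(III)."""
    as5 = as_total_measured - as3_measured
    if as5 < 0:
        raise ValueError("total As cannot be below As(III)")
    return as5

# Illustrative concentrations in ug/L (hypothetical values)
print(arsenic_speciation(0.5, 2.0))  # 1.5
```

The guard against negative values matters in practice: measurement noise can make the directly measured As(III) exceed the total, which signals an analytical problem rather than a real negative As(V) concentration.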