956 results for River monitoring network
Abstract:
Results of the monitoring network of the Posidonia oceanica meadows in the Valencia region (Spain) are analysed. For spatial comparison the whole data set was analysed, whereas for temporal trends only stations monitored for at least 6 years in the period 2002–2011 were selected (26 stations in 13 localities). In the south of the study area, meadows are larger and have higher density and cover than those in the Valencia Gulf, with the exception of the Oropesa meadow. Monitoring of P. oceanica meadows in the Valencia region indicates that most of them are stable or are increasing in density and cover, and no decline was observed in the studied meadows. These results indicate that there is no general decline of P. oceanica meadows and that the declines of P. oceanica reported in other studies are produced by local causes that may be managed at the local level. This study also reflects the importance of long series of direct data for analysing trends in the population dynamics of slow-growing species.
Abstract:
Objective: To assess the spatial variation of exposure to nitrogen dioxide (NO2) in the city of Valencia and its relationship with socioeconomic deprivation and age. Methods: Population counts by census section (CS) were obtained from the Spanish National Statistics Institute. NO2 levels were measured at 100 points of the study area with passive samplers in three campaigns between 2002 and 2004. Land-use regression (LUR) was used to map NO2 levels. LUR predictions were compared with those provided by: a) the nearest monitor of the surveillance network, b) the nearest passive sampler, c) the set of samplers within a surrounding buffer, and d) kriging. Pollution levels were assigned to each CS. The relationship between NO2 levels, a five-category deprivation index and age (≥65 years) was analysed. Results: The LUR model was the most accurate method. More than 99% of the population exceeded the safety levels proposed by the World Health Organization. An inverse relationship was found between NO2 levels and the deprivation index (β = –2.01 μg/m3 in the most deprived quintile relative to the least deprived, 95%CI: –3.07 to –0.95), and a direct relationship with age (β = 0.12 μg/m3 per one percentage-point increase in population ≥65 years, 95%CI: 0.08 to 0.16). Conclusions: The method made it possible to obtain pollution maps and to describe the relationship between NO2 levels and sociodemographic characteristics.
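As a rough illustration of the land-use regression (LUR) approach compared above, the sketch below fits NO2 measured at passive-sampler sites against land-use covariates and predicts levels for census sections; the covariate names and all values are hypothetical, not the study's data.

```python
# Minimal LUR sketch: regress NO2 measured at passive-sampler sites on land-use
# covariates, then predict NO2 for each census section from its own covariates.
# Covariates (traffic density, road length, distance to port) are assumed for
# illustration only; they are not the predictors used in the study.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

X_sites = rng.uniform(0, 1, size=(100, 3))                  # 100 sampler sites x 3 covariates
no2_sites = (30 + 25 * X_sites[:, 0] + 10 * X_sites[:, 1]
             - 8 * X_sites[:, 2] + rng.normal(0, 3, 100))   # synthetic NO2 (ug/m3)

lur = LinearRegression().fit(X_sites, no2_sites)

X_census = rng.uniform(0, 1, size=(600, 3))                 # covariates for 600 census sections
no2_census = lur.predict(X_census)
print(no2_census.mean(), (no2_census > 40).mean())          # mean level, fraction above 40 ug/m3
```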
Abstract:
The research developed in this work consists of proposing a set of techniques for the management of social networks and their integration into the educational process. The proposals are based on assumptions that have been tested with simple examples in a real university teaching scenario. The results show that social networks have a greater capacity to spread information than educational web platforms. Moreover, educational social networks develop in a context of freedom of expression intrinsically linked to Internet freedom. In that context, users can write opinions or comments that the staff of schools may not like. However, this feature can be exploited to enrich the educational process and improve the quality of its outcomes. The network has covered existing needs and created new ones. The figure of the Community Manager is therefore proposed as an agent in the educational context who monitors the network, channels opinions and provides a rapid response to academic problems.
Abstract:
The understanding of the continental carbon budget is essential to predict future climate change. In order to quantify CO₂ and CH₄ fluxes at the regional scale, a measurement system was installed at the former radio tower in Beromünster as part of the Swiss greenhouse gas monitoring network (CarboCount CH). We have been measuring the mixing ratios of CO₂, CH₄ and CO on this tower with sample inlets at 12.5, 44.6, 71.5, 131.6 and 212.5 m above ground level using a cavity ring down spectroscopy (CRDS) analyzer. The first 2-year (December 2012–December 2014) continuous atmospheric record was analyzed for seasonal and diurnal variations and interspecies correlations. In addition, storage fluxes were calculated from the hourly profiles along the tower. The atmospheric growth rates from 2013 to 2014 determined from this 2-year data set were 1.78 ppm yr⁻¹, 9.66 ppb yr⁻¹ and −1.27 ppb yr⁻¹ for CO₂, CH₄ and CO, respectively. After detrending, clear seasonal cycles were detected for CO₂ and CO, whereas CH₄ showed a stable baseline suggesting a net balance between sources and sinks over the course of the year. CO and CO₂ were strongly correlated (r² > 0.75) in winter (DJF), but almost uncorrelated in summer. In winter, anthropogenic emissions dominate over the biospheric CO₂ fluxes and the variations in mixing ratios are large due to reduced vertical mixing. The diurnal variations of all species showed distinct cycles in spring and summer, with the lowest sampling level showing the most pronounced diurnal amplitudes. The storage flux estimates exhibited reasonable diurnal shapes for CO₂, but underestimated the strength of the surface sinks during daytime. This seems plausible, keeping in mind that we were only able to calculate the storage fluxes along the profile of the tower but not the flux into or out of this profile, since no eddy covariance flux measurements were taken at the top of the tower.
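A minimal sketch of the storage-flux calculation from hourly tower profiles, assuming each inlet represents the layer bounded by the midpoints to its neighbours and a constant molar density of air; all concentration values below are invented for illustration.

```python
# Storage-flux sketch: integrate the hourly change of CO2 mixing ratio over the
# tower column. Layer thicknesses follow from the inlet heights; the molar
# density of air (~41.6 mol m-3 near the surface) is an assumed constant.
import numpy as np

inlet_heights = np.array([12.5, 44.6, 71.5, 131.6, 212.5])      # m above ground
co2_t0 = np.array([415.0, 412.0, 410.5, 409.8, 409.2])          # ppm at hour t (synthetic)
co2_t1 = np.array([413.0, 411.0, 410.0, 409.6, 409.1])          # ppm at hour t+1 (synthetic)

# thickness of the layer each inlet represents (midpoints between inlets)
edges = np.concatenate(([0.0], (inlet_heights[:-1] + inlet_heights[1:]) / 2,
                        [inlet_heights[-1]]))
dz = np.diff(edges)                                              # m

air_molar_density = 41.6   # mol m-3, assumed constant with height
dt = 3600.0                # s in one hour

# ppm change -> mol CO2 per m3 per hour, integrated over the column, then per second
d_conc = (co2_t1 - co2_t0) * 1e-6 * air_molar_density
storage_flux = np.sum(d_conc * dz) / dt                          # mol m-2 s-1
print(f"storage flux: {storage_flux * 1e6:.2f} umol m-2 s-1")
```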
Abstract:
In many Environmental Information Systems the actual observations arise from a discrete monitoring network which might be rather heterogeneous in both location and types of measurements made. In this paper we describe the architecture and infrastructure for a system, developed as part of the EU FP6 funded INTAMAP project, to provide a service oriented solution that allows the construction of an interoperable, automatic, interpolation system. This system will be based on the Open Geospatial Consortium’s Web Feature Service (WFS) standard. The essence of our approach is to extend the GML3.1 observation feature to include information about the sensor using SensorML, and to further extend this to incorporate observation error characteristics. Our extended WFS will accept observations, and will store them in a database. The observations will be passed to our R-based interpolation server, which will use a range of methods, including a novel sparse, sequential kriging method (only briefly described here) to produce an internal representation of the interpolated field resulting from the observations currently uploaded to the system. The extended WFS will then accept queries, such as ‘What is the probability distribution of the desired variable at a given point’, ‘What is the mean value over a given region’, or ‘What is the probability of exceeding a certain threshold at a given location’. To support information-rich transfer of complex and uncertain predictions we are developing schema to represent probabilistic results in a GML3.1 (object-property) style. The system will also offer more easily accessible Web Map Service and Web Coverage Service interfaces to allow users to access the system at the level of complexity they require for their specific application. Such a system will offer a very valuable contribution to the next generation of Environmental Information Systems in the context of real time mapping for monitoring and security, particularly for systems that employ a service oriented architecture.
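As an illustration of the threshold-exceedance query mentioned above, the probability can be read off the Gaussian predictive distribution produced by kriging; the numbers below are placeholders, not INTAMAP output.

```python
# Sketch: answer "what is the probability of exceeding a threshold at this
# location?" from a kriging predictive distribution (mean and variance).
from scipy.stats import norm

kriged_mean = 82.0    # predicted value at the query point (placeholder)
kriged_var = 36.0     # kriging variance at the query point (placeholder)
threshold = 90.0

p_exceed = 1.0 - norm.cdf(threshold, loc=kriged_mean, scale=kriged_var ** 0.5)
print(f"P(value > {threshold}) = {p_exceed:.3f}")
```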
Abstract:
The INTAMAP FP6 project has developed an interoperable framework for real-time automatic mapping of critical environmental variables by extending spatial statistical methods and employing open, web-based data exchange protocols and visualisation tools. This paper gives an overview of the underlying problem and of the project, and discusses which problems it has solved and which open problems are most relevant to address next. The interpolation problem that INTAMAP solves is the generic problem of spatial interpolation of environmental variables without user interaction, based on measurements of e.g. PM10, rainfall or gamma dose rate, at arbitrary locations or over a regular grid covering the area of interest. It deals with problems of varying spatial resolution of measurements, the interpolation of averages over larger areas, and with providing information on the interpolation error to the end-user. In addition, monitoring network optimisation is addressed in a non-automatic context.
Abstract:
Large monitoring networks are becoming increasingly common and can generate large datasets from thousands to millions of observations in size, often with high temporal resolution. Processing large datasets using traditional geostatistical methods is prohibitively slow, and in real-world applications different types of sensor can be found across a monitoring network. Heterogeneities in the error characteristics of different sensors, both in terms of distribution and magnitude, present problems for generating coherent maps. An assumption in traditional geostatistics is that observations are made directly of the underlying process being studied and that the observations are contaminated with Gaussian errors. Under this assumption, sub-optimal predictions will be obtained if the error characteristics of the sensor are effectively non-Gaussian. One method, model-based geostatistics, assumes that a Gaussian process prior is imposed over the (latent) process being studied and that the sensor model forms part of the likelihood term. One problem with this type of approach is that the corresponding posterior distribution will be non-Gaussian and computationally demanding, as Monte Carlo methods have to be used. An extension of a sequential, approximate Bayesian inference method enables observations with arbitrary likelihoods to be treated in a projected process kriging framework which is less computationally intensive. The approach is illustrated using a simulated dataset with a range of sensor models and error characteristics.
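For the purely Gaussian part of the problem, per-sensor error magnitudes can already be handled in a standard Gaussian-process (kriging) setting, as in the sketch below; non-Gaussian sensor likelihoods, which are the paper's focus, require the approximate Bayesian treatment described above. All data and noise levels here are synthetic.

```python
# Sketch: kriging with heterogeneous (but Gaussian) sensor errors. sklearn lets
# each observation carry its own noise variance via the `alpha` vector.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(1)
X = rng.uniform(0, 10, size=(50, 2))                  # synthetic sensor locations
noise_var = np.where(np.arange(50) < 25, 0.05, 1.0)   # two sensor types with different error magnitudes
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] + rng.normal(0, np.sqrt(noise_var))

gp = GaussianProcessRegressor(kernel=RBF(length_scale=2.0), alpha=noise_var)
gp.fit(X, y)

mean, std = gp.predict(np.array([[5.0, 5.0]]), return_std=True)
print(mean[0], std[0])                                # prediction and its uncertainty at a new point
```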
Abstract:
In 2001, a weather and climate monitoring network was established along the temperature and aridity gradient between the sub-humid Moroccan High Atlas Mountains and the former end lake of the Middle Drâa in a pre-Saharan environment. The highest Automated Weather Station (AWS) was installed just below the M'Goun summit at 3850 m, the lowest station, Lac Iriki, at 450 m. This network of 13 AWS stations was funded and maintained by the German IMPETUS project (BMBF Grant 01LW06001A, North Rhine-Westphalia Grant 313-21200200), and since 2011 five stations have been further maintained by the German DFG Fennec project (FI 786/3-1), so that some stations of the AWS network provided data for almost 12 years (2001–2012). Standard meteorological variables such as temperature, humidity and wind were measured at a height of 2 m above ground. Other variables comprise precipitation, station pressure, solar irradiance, soil temperature at different depths and, for the high-mountain stations, snow water equivalent. The stations produced 5-minute precipitation data, 10- or 15-minute data and daily summaries of all other variables. This network is a unique resource of multi-year weather data in the remote semi-arid to arid mountain region of the Saharan flank of the Atlas Mountains. The network is described in Schulz et al. (2010), and its continuation until 2012 is briefly discussed in Redl et al. (2015, doi:10.1175/MWR-D-15-0223.1) and Redl et al. (2016, doi:10.1002/2015JD024443).
Abstract:
Dissertation (Master's)
Abstract:
The long-term adverse effects on health associated with air pollution exposure can be estimated using either cohort or spatio-temporal ecological designs. In a cohort study, the health status of a cohort of people is assessed periodically over a number of years, and then related to estimated ambient pollution concentrations in the cities in which they live. However, such cohort studies are expensive and time consuming to implement, due to the long-term follow-up required for the cohort. Therefore, spatio-temporal ecological studies are also being used to estimate the long-term health effects of air pollution, as they are easy to implement due to the routine availability of the required data. Spatio-temporal ecological studies estimate the health impact of air pollution by utilising geographical and temporal contrasts in air pollution and disease risk across n contiguous small areas, such as census tracts or electoral wards, for multiple time periods. The disease data are counts of the numbers of disease cases occurring in each areal unit and time period, and thus Poisson log-linear models are typically used for the analysis. The linear predictor includes pollutant concentrations and known confounders such as socio-economic deprivation. However, as the disease data typically contain residual spatial or spatio-temporal autocorrelation after the covariate effects have been accounted for, these known covariates are augmented by a set of random effects. One key problem in these studies is estimating spatially representative pollution concentrations in each areal unit, which are typically estimated by applying kriging to data from a sparse monitoring network, or by computing averages over modelled concentrations (grid level) from an atmospheric dispersion model. The aim of this thesis is to investigate the health effects of long-term exposure to nitrogen dioxide (NO2) and particulate matter (PM10) in mainland Scotland, UK. In order to gain an initial impression of the air pollution health effects in mainland Scotland, chapter 3 presents a standard epidemiological study using a benchmark method. The remaining main chapters (4, 5, 6) cover the main methodological focus of this thesis, which is threefold: (i) how to better estimate pollution by developing a multivariate spatio-temporal fusion model that relates monitored and modelled pollution data over space, time and pollutant; (ii) how to simultaneously estimate the joint effects of multiple pollutants; and (iii) how to allow for the uncertainty in the estimated pollution concentrations when estimating their health effects. Specifically, chapters 4 and 5 are developed to achieve (i), while chapter 6 focuses on (ii) and (iii). In chapter 4, I propose an integrated model for estimating the long-term health effects of NO2 that fuses modelled and measured pollution data to provide improved predictions of areal-level pollution concentrations and hence health effects. The air pollution fusion model proposed is a Bayesian space-time linear regression model relating the measured concentrations to the modelled concentrations for a single pollutant, whilst allowing for additional covariate information such as site type (e.g. roadside, rural) and temperature. However, it is known that some pollutants might be correlated because they may be generated by common processes or driven by similar factors such as meteorology. The correlation between pollutants can help to predict one pollutant by borrowing strength from the others.
Therefore, in chapter 5, I propose a multi-pollutant model, a multivariate spatio-temporal fusion model extending the single-pollutant model in chapter 4, which relates monitored and modelled pollution data over space, time and pollutant to predict pollution across mainland Scotland. Since we are exposed to multiple pollutants simultaneously, because the air we breathe contains a complex mixture of particle- and gas-phase pollutants, the health effects of exposure to multiple pollutants are investigated in chapter 6, a natural extension of the single-pollutant health effects analysis in chapter 4. Given that NO2 and PM10 are highly correlated (a multicollinearity issue) in my data, I first propose a temporally-varying linear model to regress one pollutant (e.g. NO2) against another (e.g. PM10) and then use the residuals in the disease model together with PM10, thus investigating the health effects of exposure to both pollutants simultaneously. Another issue considered in chapter 6 is allowing for the uncertainty in the estimated pollution concentrations when estimating their health effects; in total, four approaches are developed to account for this exposure uncertainty. Finally, chapter 7 summarises the work contained within this thesis and discusses the implications for future research.
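A minimal sketch of the residual approach to the NO2/PM10 multicollinearity issue described above: one pollutant is regressed on the other and its residuals enter the Poisson disease model alongside the second pollutant. The data are synthetic and the sketch omits the random effects and temporally-varying coefficients of the actual thesis.

```python
# Sketch: handle collinear pollutants by regressing NO2 on PM10 and using the
# residuals (the part of NO2 not explained by PM10) in a Poisson disease model.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 300
pm10 = rng.normal(18, 4, n)
no2 = 1.2 * pm10 + rng.normal(0, 3, n)              # highly correlated pollutants (synthetic)
expected = rng.uniform(20, 80, n)                   # expected disease counts per areal unit
y = rng.poisson(expected * np.exp(0.01 * pm10 + 0.005 * no2 - 0.4))

# Step 1: regress NO2 on PM10 and keep the residuals
no2_resid = sm.OLS(no2, sm.add_constant(pm10)).fit().resid

# Step 2: Poisson log-linear disease model with PM10 and the NO2 residuals
X = sm.add_constant(np.column_stack([pm10, no2_resid]))
fit = sm.GLM(y, X, family=sm.families.Poisson(), offset=np.log(expected)).fit()
print(fit.params)                                   # intercept, PM10 effect, residual-NO2 effect
```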
Abstract:
Activity of 7-ethoxyresorufin-O-deethylase (EROD) in fish is certainly the best-studied biomarker of exposure applied in the field to evaluate biological effects of contamination in the marine environment. Since 1991, a feasibility study for a monitoring network using this biomarker of exposure has been conducted along French coasts. Using data obtained during several cruises, this study aims to determine the number of fish required to detect a given difference between 2 mean EROD activities, i.e. to achieve an a priori fixed statistical power (1−beta) given a significance level (alpha), variance estimates and a projected ratio of unequal sample sizes (k). Mean EROD activity and standard error were estimated at each of 82 sampling stations. The inter-individual variance component was dominant in estimating the variance of mean EROD activity. Influences of alpha, beta, k and variability on sample sizes are illustrated and discussed in terms of costs. In particular, sample sizes do not have to be equal, especially if such a requirement would lead to a significant cost in sampling extra material. Finally, the feasibility of long-term monitoring is discussed.
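Under a normal approximation, the sample-size question described above reduces to the standard two-sample power calculation sketched below; the difference delta, the variance and the allocation ratio k are illustrative values, not estimates from the 82 stations.

```python
# Sketch: number of fish per group needed to detect a difference `delta`
# between two mean EROD activities, given alpha, power 1-beta, a common
# variance estimate and an allocation ratio k = n2/n1 (normal approximation).
import math
from scipy.stats import norm

def sample_sizes(delta, sigma2, alpha=0.05, power=0.80, k=1.0):
    """Return (n1, n2) with n2 = k * n1."""
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    n1 = (1 + 1 / k) * z ** 2 * sigma2 / delta ** 2
    return math.ceil(n1), math.ceil(k * n1)

# e.g. detect a difference of 5 units with variance 60 and unequal allocation k = 2
print(sample_sizes(delta=5.0, sigma2=60.0, k=2.0))
```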
Abstract:
Stochastic methods based on time-series modeling combined with geostatistics can be useful tools to describe the variability of water-table levels in time and space and to account for uncertainty. Water-level monitoring networks can give information about the dynamics of the aquifer domain in both dimensions. Time-series modeling is an elegant way to treat monitoring data without the complexity of physical mechanistic models. Time-series model predictions can be interpolated spatially, with the spatial differences in water-table dynamics determined by the spatial variation in the system properties and the temporal variation driven by the dynamics of the inputs into the system. An integration of stochastic methods is presented, based on time-series modeling and geostatistics, as a framework to predict water levels for decision making in groundwater management and land-use planning. The methodology is applied in a case study in a Guarani Aquifer System (GAS) outcrop area located in the southeastern part of Brazil. Communication of results in a clear and understandable form via simulated scenarios is discussed as an alternative when translating scientific knowledge into applications of stochastic hydrogeology in large aquifers with limited monitoring-network coverage, such as the GAS.
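A minimal sketch of the workflow described above, assuming a simple autoregressive model per well and Gaussian-process regression (equivalent to simple kriging) for the spatial step; all coordinates and water levels are synthetic, and the study's actual time-series models are richer (e.g. with exogenous inputs such as recharge).

```python
# Sketch: fit a time-series model to each well's water-level record, forecast
# one step ahead, then interpolate the forecasts spatially.
import numpy as np
from statsmodels.tsa.ar_model import AutoReg
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(3)
n_wells, n_months = 15, 60
coords = rng.uniform(0, 50, size=(n_wells, 2))                       # well coordinates (km)
levels = np.cumsum(rng.normal(0, 0.1, size=(n_wells, n_months)), axis=1) + 600

# one-step-ahead forecast per well from an AR(2) model
forecasts = np.array([AutoReg(levels[i], lags=2).fit().forecast(1)[0]
                      for i in range(n_wells)])

# spatial interpolation of the forecasts (GP regression ~ simple kriging)
gp = GaussianProcessRegressor(kernel=RBF(length_scale=10.0), alpha=1e-2)
gp.fit(coords, forecasts)
print(gp.predict(np.array([[25.0, 25.0]])))                          # predicted level at an unmonitored point
```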
Abstract:
The coastal zone of the Nord-Pas de Calais / Picardie region shows patterns of ecosystem dysfunction considered to be linked to human activities along its shores. This results in regular massive developments of species such as the phytoplanktonic alga Phaeocystis sp., whose life cycle is partly linked to nutrient availability and consequently to anthropogenic inputs. As part of the evaluation of the influence of continental inputs (nitrates, phosphates, etc.) on the marine environment and on potential eutrophication processes, of the estimation of the efficiency of sewage treatment plants in eliminating discharges, and in order to establish a long-term survey of changes in coastal water quality, the regional nutrient monitoring network was implemented by Ifremer in collaboration with the Agence de l'Eau Artois-Picardie in 1992 to complement the REPHY (Phytoplankton and Phycotoxins) monitoring programme. This study reports the main results for the year 2015 in terms of the temporal evolution of the main physico-chemical and biological parameters characteristic of the water masses sampled along three transects off Dunkerque, Boulogne-sur-Mer and the Bay of Somme.
Abstract:
The Water Framework Directive (WFD) establishes Environmental Quality Standards (EQS) in marine water for 34 priority substances. Among these substances, 25 are hydrophobic and bioaccumulable (2 metals and 23 organic compounds). For these 25 substances, monitoring in the water matrix is not appropriate and an alternative matrix should be developed. Bivalve molluscs, particularly mussels (Mytilus edulis, Mytilus galloprovincialis), have been used by Ifremer in France as quantitative biological indicators since 1979 to assess marine water quality. This study was carried out to determine thresholds in mussels at least as protective as the EQS in marine water laid down by the WFD. Three steps are defined:
- Provide an overview of knowledge about the relations between the concentrations of contaminants in marine water and in mussels, through the bioaccumulation factor (BAF) and the bioconcentration factor (BCF). This allows an examination of how a BCF or a BAF can be determined: the BCF can be determined experimentally (according to US EPA or ASTM standards) or by Quantitative Structure-Activity Relationship (QSAR) models, for which four equations can be used for mussels. The BAF can be determined by field experiment, but no standard exists; it could be determined by QSAR, although this method is considered invalid for mussels, or by using an existing Dynamic Budget Model, which is complex to use.
- Collect concentration data in marine water (Cwater) from the literature for these 25 substances and compare them with concentrations in mussels (Cmussels) obtained through the French chemical contaminant monitoring network (ROCCH) and the biological integrator network RINBIO. Depending on the available data, this leads to field-based estimates of the BAF or the BCF (Cmussels/Cwater).
- Compare the BAF and BCF values (when available) obtained with the various methods for these substances: experimental BCF from the literature, BCF calculated by QSAR and BAF determined from field data.
The study shows that experimental BCF data are available for 3 substances (chlorpyrifos, HCH, pentachlorobenzene), BCF can be calculated by QSAR for 20 substances, and field data allow 4 BAFs to be evaluated for organic compounds and 2 for metals. Using these BAF or BCF values, thresholds in shellfish can be determined as an alternative to the EQS in marine water.
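A small numerical sketch of the final step, assuming a BAF estimated from paired field data is used to convert a water EQS into a mussel threshold; all concentrations below are invented for illustration, not values from the ROCCH/RINBIO networks.

```python
# Sketch: convert a water EQS into an equivalent mussel threshold using a
# field-derived bioaccumulation factor BAF = Cmussels / Cwater.
c_water = 0.0008      # ug/L, field concentration of a substance in marine water (illustrative)
c_mussels = 1.6       # ug/kg, matching concentration in mussels (illustrative)
eqs_water = 0.002     # ug/L, Environmental Quality Standard in water (illustrative)

baf = c_mussels / c_water                 # L/kg
threshold_mussels = eqs_water * baf       # ug/kg, threshold at least as protective as the water EQS
print(f"BAF = {baf:.0f} L/kg, mussel threshold = {threshold_mussels:.2f} ug/kg")
```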
Abstract:
The transport of fluids through pipes is widely used in the oil industry, with pipelines forming an important link in the fluid logistics chain. However, pipeline walls deteriorate due to several factors that may cause loss of fluids to the environment, justifying investment in leak-detection techniques and methods to minimize fluid loss and environmental damage. This work presents the development of a supervisory module whose purpose is to inform the operator of a leak in the monitored pipeline in the shortest possible time, so that the operator can start the procedure that stops the leak. This module is a component of a system designed to detect leaks in oil pipelines using sonic technology, wavelets and neural networks. The plant used to develop and test the module presented here was the LAMP tank system, with its LAN serving as the monitoring network. The proposal consists, basically, of two stages. First, the performance of the communication infrastructure of the supervisory module is assessed. Then, leaks are simulated so that the DSP sends information to the supervisory module, which calculates the leak location and indicates which sensor the leak is closest to; using the LAMP tank system, the pressure in the monitored pipeline is captured by piezoresistive sensors, processed by the DSP and sent to the supervisory module to be presented to the user in real time.
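As a sketch of the localisation step the supervisory module performs, the leak position along the pipeline can be estimated from the difference in arrival times of the pressure wave at two sensors; the pipe length, wave speed and timings below are assumed values, and the real system derives arrival times from DSP/wavelet processing of the piezoresistive sensor signals.

```python
# Sketch: locate a leak from the arrival-time difference of the pressure wave
# at two sensors bounding the pipeline section (values are illustrative).
def leak_position(length_m, wave_speed_ms, t_sensor_a, t_sensor_b):
    """Distance of the leak from sensor A, given arrival times at A and B."""
    dt = t_sensor_a - t_sensor_b
    return (length_m + wave_speed_ms * dt) / 2.0

length = 500.0                                 # m between sensors A and B (assumed)
x = leak_position(length, wave_speed_ms=1200.0, t_sensor_a=0.10, t_sensor_b=0.25)
nearest = "A" if x < length / 2 else "B"
print(f"leak ~{x:.1f} m from sensor A (nearest sensor: {nearest})")
```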