21 results for Smoothed bootstrap

at Universidad Politécnica de Madrid


Relevance:

20.00%

Abstract:

The theoretical formulation of the smoothed particle hydrodynamics (SPH) method deserves great care because of some inconsistencies occurring when considering free-surface inviscid flows. Actually, in SPH formulations one usually assumes that (i) surface integral terms on the boundary of the interpolation kernel support are neglected, (ii) free-surface conditions are implicitly verified. These assumptions are studied in detail in the present work for free-surface Newtonian viscous flow. The consistency of classical viscous weakly compressible SPH formulations is investigated. In particular, the principle of virtual work is used to study the verification of the free-surface boundary conditions in a weak sense. The latter can be related to the global energy dissipation induced by the viscous term formulations and their consistency. Numerical verification of this theoretical analysis is provided on three free-surface test cases including a standing wave, with the three viscous term formulations investigated.
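As a point of reference for the interpolation the abstract builds on, the following is a minimal 1D sketch of the standard SPH kernel summation, assuming a cubic-spline kernel and a uniform particle spacing chosen purely for illustration; it is not the paper's formulation, but it shows how the neglected surface-integral terms appear as a loss of accuracy near the ends of the domain.

```python
import numpy as np

def cubic_spline_kernel(r, h):
    """Standard 1D cubic-spline SPH kernel (normalisation 2/(3h))."""
    q = np.abs(r) / h
    sigma = 2.0 / (3.0 * h)
    return sigma * np.where(q < 1.0, 1.0 - 1.5 * q**2 + 0.75 * q**3,
                   np.where(q < 2.0, 0.25 * (2.0 - q)**3, 0.0))

# Illustrative 1D particle set: spacing dx, smoothing length h = 1.3*dx.
dx = 0.02
h = 1.3 * dx
x = np.arange(0.0, 1.0 + dx, dx)
f = np.sin(2 * np.pi * x)        # field sampled at the particle positions
volume = dx                      # particle "volume" in 1D

# SPH interpolation <f>(x_i) = sum_j f_j W(x_i - x_j, h) V_j
W = cubic_spline_kernel(x[:, None] - x[None, :], h)
f_sph = (W * f[None, :]).sum(axis=1) * volume

# Away from the ends the interpolant is close to f; near the ends the
# truncated kernel support degrades it -- the kind of neglected surface
# term the abstract refers to.
inner = slice(10, -10)
print("interior error:", np.max(np.abs(f_sph[inner] - f[inner])))
print("boundary error:", abs(f_sph[0] - f[0]))
```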

Relevance:

20.00%

Abstract:

In this work, we propose the Seasonal Dynamic Factor Analysis (SeaDFA), an extension of Nonstationary Dynamic Factor Analysis, through which one can deal with dimensionality reduction in vectors of time series in such a way that both common and specific components are extracted. Furthermore, common factors are able to capture not only regular dynamics (stationary or not) but also seasonal ones, by means of the common factors following a multiplicative seasonal VARIMA(p, d, q) × (P, D, Q)s model. Additionally, a bootstrap procedure that does not need a backward representation of the model is proposed to be able to make inference for all the parameters in the model. A bootstrap scheme developed for forecasting includes uncertainty due to parameter estimation, allowing enhanced coverage of forecasting intervals. A challenging application is provided. The new proposed model and a bootstrap scheme are applied to an innovative subject in electricity markets: the computation of long-term point forecasts and prediction intervals of electricity prices. Several appendices with technical details, an illustrative example, and an additional table are available online as Supplementary Materials.
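The following is a minimal sketch of the general idea of bootstrapping forecasts so that parameter-estimation uncertainty widens the prediction interval; an AR(1) model is used as a stand-in for the SeaDFA factor dynamics, and the series length and number of resamples are illustrative assumptions, not the paper's procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy series from an AR(1) process (a stand-in for the factor dynamics).
n, phi_true = 300, 0.7
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi_true * y[t - 1] + rng.normal()

def fit_ar1(z):
    """Least-squares AR(1) coefficient and residuals."""
    phi = np.dot(z[:-1], z[1:]) / np.dot(z[:-1], z[:-1])
    return phi, z[1:] - phi * z[:-1]

phi_hat, resid = fit_ar1(y)

# Bootstrap the one-step-ahead forecast: rebuild a series from resampled
# residuals, re-estimate the parameter and propagate a resampled innovation,
# so the interval carries parameter-estimation uncertainty as well.
fcasts = []
for _ in range(999):
    e = rng.choice(resid, size=n - 1, replace=True)
    yb = np.zeros(n)
    for t in range(1, n):
        yb[t] = phi_hat * yb[t - 1] + e[t - 1]
    phi_b, resid_b = fit_ar1(yb)
    fcasts.append(phi_b * y[-1] + rng.choice(resid_b))

lo, hi = np.percentile(fcasts, [2.5, 97.5])
print(f"95% bootstrap forecast interval for y[n+1]: [{lo:.2f}, {hi:.2f}]")
```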

Relevance:

20.00%

Abstract:

We review Web Mining techniques and describe a bootstrap statistics methodology applied to pattern-model classifier optimization and verification for supervised learning in Tour-Guide Robot knowledge repository management. It is virtually impossible to test Web page classifiers and many other Internet applications thoroughly with purely empirical data, owing to the human intervention needed to generate training and test sets. We propose using the computer-based bootstrap paradigm to design a test environment in which such classifiers can be checked with better reliability.
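A minimal sketch of the kind of bootstrap verification described here: train on a resample drawn with replacement, test on the out-of-bag items, and examine the error distribution. The synthetic "page features" and the nearest-centroid classifier are placeholders, not the paper's classifier.

```python
import numpy as np

rng = np.random.default_rng(1)

# Placeholder "web page" feature vectors for two classes (40 pages each).
X = np.vstack([rng.normal(0.0, 1.0, (40, 5)), rng.normal(1.0, 1.0, (40, 5))])
y = np.array([0] * 40 + [1] * 40)

def nearest_centroid_error(Xtr, ytr, Xte, yte):
    """Error rate of a nearest-centroid classifier (stand-in classifier)."""
    c0, c1 = Xtr[ytr == 0].mean(axis=0), Xtr[ytr == 1].mean(axis=0)
    pred = (np.linalg.norm(Xte - c1, axis=1) <
            np.linalg.norm(Xte - c0, axis=1)).astype(int)
    return float(np.mean(pred != yte))

# Bootstrap verification: train on a resample drawn with replacement and
# test on the out-of-bag pages; the spread of the error across resamples
# characterises the classifier's reliability without extra hand labelling.
errors = []
for _ in range(500):
    idx = rng.integers(0, len(y), len(y))
    oob = np.setdiff1d(np.arange(len(y)), idx)
    if oob.size == 0 or len(set(y[idx])) < 2:
        continue
    errors.append(nearest_centroid_error(X[idx], y[idx], X[oob], y[oob]))

print(f"OOB error: mean={np.mean(errors):.3f}, "
      f"5th-95th percentile={np.percentile(errors, [5, 95])}")
```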

Relevance:

20.00%

Abstract:

This work presents the results of applying the bootstrap methodology to data from 3,369 surveys carried out in 2009 at the national level among van drivers, in order to obtain interurban and total mobility data broken down by vehicle age, use, drivers and other characteristics of this type of vehicle. Point estimates and confidence intervals are obtained for the total mobility of vans, as well as for the four van types defined in the reference project. The results are compared with alternative estimates built from other data sources for the same fleet (surveys conducted during roadside inspections by the ATGC of the DGT and at ITV vehicle inspections) and with figures published by official sources. These mobility results (in vehicle-kilometres) will be used to estimate accident rates in a study comparing vans with other vehicle fleets.
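A minimal sketch of the percentile bootstrap applied to survey data of this kind, assuming a single vector of per-driver annual kilometres and a hypothetical fleet size; the real project works with stratified survey weights and vehicle classes, which are omitted here.

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative annual mileage (km) reported by the surveyed van drivers;
# a skewed (log-normal) distribution is typical of mobility data.
km = rng.lognormal(mean=9.6, sigma=0.6, size=3369)

FLEET_SIZE = 2_000_000  # hypothetical number of vans in the fleet

def total_mobility(sample):
    """Point estimate of total fleet mobility in vehicle-kilometres."""
    return sample.mean() * FLEET_SIZE

# Percentile bootstrap: resample drivers with replacement and recompute.
boot = np.array([total_mobility(rng.choice(km, size=km.size, replace=True))
                 for _ in range(2000)])
point = total_mobility(km)
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"point estimate = {point:.3e} veh-km, 95% CI = ({lo:.3e}, {hi:.3e})")
```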

Relevance:

10.00%

Abstract:

The objectives of this work are to analyse the relationship between seasonal and silvicultural factors and the presence of P. pini in the natural Scots pine stand located in the Sierra de Guadarrama, and to evaluate the ability to predict the situations most vulnerable to attack. According to the model, the probability of presence of P. pini decreases linearly with altitude. Aspect has little influence on the probability of finding affected trees, except on purely south-facing slopes, where the probability increases slightly. Both stand density and basal area have a positive effect on the probability of presence of P. pini; the effect of both variables is sigmoidal, with maximum probabilities of presence above 35 m2/ha of basal area and below 500 stems/ha. The internal evaluation (using bootstrap) of the predictive capacity of the model is acceptable for discrimination (area under the ROC curve of 0.71) and very good for calibration (slope = 0.95; intercept = -0.04). Pending validation of the model with an independent sample, the results suggest that the risk of damage by P. pini can be reliably forecast with the model developed.
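A hedged sketch of bootstrap internal validation of discrimination (optimism-corrected AUC), which is the kind of evaluation reported above; the logistic model, the synthetic stand variables (elevation, basal area, density) and their coefficients are illustrative assumptions, not the fitted model of the study.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(3)

# Synthetic stand data: elevation (m), basal area (m2/ha), density (stems/ha).
n = 400
X = np.column_stack([rng.uniform(1100, 2000, n),
                     rng.uniform(10, 50, n),
                     rng.uniform(200, 1500, n)])
logit = (-0.002 * (X[:, 0] - 1500) + 0.06 * (X[:, 1] - 30)
         - 0.002 * (X[:, 2] - 800) - 1.0)
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

model = LogisticRegression(max_iter=1000).fit(X, y)
auc_apparent = roc_auc_score(y, model.predict_proba(X)[:, 1])

# Bootstrap optimism: refit on each resample, compare its AUC on the resample
# with its AUC on the original data; the mean gap is the optimism, which is
# subtracted from the apparent AUC (internal validation).
optimism = []
for _ in range(200):
    idx = rng.integers(0, n, n)
    if len(set(y[idx])) < 2:
        continue
    m = LogisticRegression(max_iter=1000).fit(X[idx], y[idx])
    optimism.append(roc_auc_score(y[idx], m.predict_proba(X[idx])[:, 1])
                    - roc_auc_score(y, m.predict_proba(X)[:, 1]))

print(f"apparent AUC = {auc_apparent:.3f}, "
      f"optimism-corrected AUC = {auc_apparent - np.mean(optimism):.3f}")
```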

Relevance:

10.00%

Abstract:

Sloshing describes the movement of liquids inside partially filled tanks, generating dynamic loads on the tank structure. The resulting impact pressures are of great importance in assessing structural strength, and their correct evaluation still represents a challenge for the designer due to the high level of nonlinearities involved, with complex free-surface deformations, violent impact phenomena and the influence of air trapping. In the present paper, a set of two-dimensional cases, for which experimental results are available, is considered to assess the merits and shortcomings of different numerical methods for sloshing evaluation, namely two commercial RANS solvers (FLOW-3D and LS-DYNA) and two academic codes (a Smoothed Particle Hydrodynamics solver and a RANS solver). Impact pressures at various critical locations and the global moment induced by water motion in a partially filled rectangular tank, subject to a simple harmonic rolling motion, are evaluated and predictions are compared with experimental measurements.

Relevance:

10.00%

Abstract:

The implementation of boundary conditions is one of the points where the SPH methodology still has some work to do. The aim of the present work is to provide an in-depth analysis of the most representative mirroring techniques used in SPH to enforce boundary conditions (BC) along solid profiles. We specifically refer to dummy particles, ghost particles, and Takeda et al. [1] boundary integrals. A Poiseuille flow has been used as an example to gradually evaluate the accuracy of the different implementations. Our goal is to test the behavior of the second-order differential operator with the proposed boundary extensions when the smoothing length h and other discretization parameters, such as dx/h, tend simultaneously to zero. First, using a smoothed continuous approximation of the unidirectional Poiseuille problem, the evolution of the velocity profile has been studied, focusing on the values of the velocity and the viscous shear at the boundaries, where the exact solution should be recovered as h decreases. Second, to evaluate the impact of the discretization of the problem, an Eulerian SPH discrete version of the former problem has been implemented and similar results have been monitored. Finally, for the sake of completeness, a 2D Lagrangian SPH implementation of the problem has also been studied to compare the consequences of the particle movement.
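A minimal 1D sketch of the issue under study: a Brookshaw/Morris-type SPH estimate of the second derivative applied to a Poiseuille-like parabolic profile, with no boundary extension, so the loss of accuracy at the wall is visible. The kernel, spacing and ratio h/dx are illustrative choices, not the paper's implementations.

```python
import numpy as np

def w_prime(r_abs, h):
    """Derivative dW/dr of the 1D cubic-spline kernel (normalisation 2/(3h))."""
    q = r_abs / h
    sigma = 2.0 / (3.0 * h)
    dw = np.where(q < 1.0, -3.0 * q + 2.25 * q**2,
         np.where(q < 2.0, -0.75 * (2.0 - q)**2, 0.0))
    return sigma * dw / h

def sph_laplacian(x, u, h, dx):
    """Brookshaw/Morris-type SPH estimate of d2u/dx2 (particle volume = dx)."""
    r = np.abs(x[:, None] - x[None, :])
    safe_r = np.where(r > 0, r, 1.0)
    coef = np.where(r > 0, w_prime(r, h) / safe_r, 0.0)
    return 2.0 * dx * ((u[:, None] - u[None, :]) * coef).sum(axis=1)

# Poiseuille-like parabolic profile u(y) = y(1 - y): exact d2u/dy2 = -2.
dx = 0.01
h = 1.3 * dx
y = np.arange(0.0, 1.0 + dx, dx)
u = y * (1.0 - y)
lap = sph_laplacian(y, u, h, dx)

print("interior error:", np.max(np.abs(lap[5:-5] + 2.0)))
print("wall error    :", abs(lap[0] + 2.0), "(kernel truncation, no BC extension)")
```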

Relevance:

10.00%

Abstract:

This paper outlines the problems found in the parallelization of SPH (Smoothed Particle Hydrodynamics) algorithms using Graphics Processing Units. Results of several parallel GPU implementations are shown in terms of speed-up and scalability compared with the sequential CPU codes. The most problematic stage in the GPU-SPH algorithms is the one responsible for locating neighboring particles and building the vectors where this information is stored, since these specific algorithms raise many difficulties for a data-level parallelization. Because neighbor location using linked lists does not expose enough data-level parallelism, two new approaches have been proposed to minimize bank conflicts in the writing and subsequent reading of the neighbor lists. The first strategy proposes an efficient CPU-GPU coordination, using GPU algorithms for those stages that allow a straightforward parallelization and sequential CPU algorithms for those instructions that involve some kind of vector reduction. This coordination provides a relatively orderly reading of the neighbor lists in the interactions stage, achieving a speed-up factor of x47 in this stage. However, since the construction of the neighbor lists is quite expensive, the overall speed-up achieved is x41. The second strategy seeks to maximize the use of the GPU in the neighbor location process by executing a specific vector sorting algorithm that allows some data-level parallelism. Although this strategy has succeeded in improving the speed-up of the neighbor location stage, the global speed-up of the interactions stage falls, due to inefficient reading of the neighbor vectors. Some changes to these strategies are proposed, aimed at maximizing the computational load of the GPU and using the GPU texture units, in order to reach the maximum speed-up for such codes. Different practical applications have been added to the mentioned GPU codes. First, the classical dam-break problem is studied. Second, the wave impact of the sloshing fluid contained in LNG vessel tanks is also simulated as a practical example of particle methods.
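For context, this is a plain CPU sketch of the cell-list neighbour search whose GPU parallelization the abstract discusses; the cell side of 2h and the particle cloud are illustrative, and none of the GPU-specific machinery (bank-conflict handling, sorting, texture units) is represented.

```python
import numpy as np
from collections import defaultdict

def build_neighbor_lists(pos, h):
    """Cell-list neighbour search: bin particles into square cells of side 2h
    and test only the particles of the 3x3 surrounding cells (2D)."""
    cell = np.floor(pos / (2.0 * h)).astype(int)
    buckets = defaultdict(list)
    for i, key in enumerate(map(tuple, cell)):
        buckets[key].append(i)

    neighbors, r2max = [], (2.0 * h) ** 2
    for i, (cx, cy) in enumerate(cell):
        cand = []
        for ox in (-1, 0, 1):
            for oy in (-1, 0, 1):
                cand.extend(buckets.get((cx + ox, cy + oy), []))
        cand = np.array(cand)
        d2 = ((pos[cand] - pos[i]) ** 2).sum(axis=1)
        neighbors.append(cand[(d2 < r2max) & (cand != i)])
    return neighbors

# Illustrative particle cloud in the unit square.
rng = np.random.default_rng(4)
pos = rng.random((2000, 2))
nb = build_neighbor_lists(pos, h=0.02)
print("mean neighbours per particle:", np.mean([len(n) for n in nb]))
```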

Relevance:

10.00%

Abstract:

Background: Malignancies arising in the large bowel cause the second largest number of deaths from cancer in the Western world. Despite the progress made during the last decades, colorectal cancer remains one of the most frequent and deadly neoplasias in Western countries. Methods: A genomic study of human colorectal cancer has been carried out on a total of 31 tumoral samples, corresponding to different stages of the disease, and 33 non-tumoral samples. The study was carried out by hybridisation of the tumour samples against a reference pool of non-tumoral samples using Agilent Human 1A 60-mer oligo microarrays. The results obtained were validated by qRT-PCR. In the subsequent bioinformatics analysis, gene networks were built by means of Bayesian classifiers, variable selection and bootstrap resampling. The consensus among all the induced models produced a hierarchy of dependences and, thus, of variables. Results: After an exhaustive pre-processing step to ensure data quality (missing-value imputation, probe quality, data smoothing and intraclass variability filtering), the final dataset comprised a total of 8,104 probes. Next, a supervised classification approach and data analysis were carried out to obtain the most relevant genes. Two of them are directly involved in cancer progression and, in particular, in colorectal cancer. Finally, a supervised classifier was induced to classify new unseen samples. Conclusions: We have developed a tentative model for the diagnosis of colorectal cancer based on a biomarker panel. Our results indicate that the gene profile described herein can discriminate between non-cancerous and cancerous samples with 94.45% accuracy using different supervised classifiers (AUC values between 0.955 and 0.997).

Relevance:

10.00%

Abstract:

The main purpose of a gene interaction network is to map the relationships among genes that remain out of sight when a genomic study is tackled. DNA microarrays allow the expression of thousands of genes to be measured at the same time. These data constitute the numeric seed for the induction of the gene networks. In this paper, we propose a new approach to build gene networks by means of Bayesian classifiers, variable selection and bootstrap resampling. The interactions induced by the Bayesian classifiers are based both on the expression levels and on the phenotype information of the supervised variable. Feature selection and bootstrap resampling add reliability and robustness to the overall process, removing false-positive findings. The consensus among all the induced models produces a hierarchy of dependences and, thus, of variables. Biologists can define the depth level of the model hierarchy, so the set of interactions and genes involved can vary from a sparse to a dense set. Experimental results show how these networks perform well on classification tasks. The biological validation matches previous biological findings and opens new hypotheses for future studies.
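A minimal sketch of the bootstrap-consensus idea described here: a gene (or network edge) is retained only if it is selected in a large fraction of resamples. A univariate mean-difference filter on synthetic expression data stands in for the Bayesian classifiers and variable-selection wrapper of the paper.

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic expression matrix: 60 samples x 200 genes; genes 0-4 carry signal.
n, p = 60, 200
y = np.repeat([0, 1], n // 2)
X = rng.normal(size=(n, p))
X[y == 1, :5] += 1.5

def top_genes(Xb, yb, k=10):
    """Univariate filter: the k genes with largest absolute mean difference."""
    diff = np.abs(Xb[yb == 1].mean(axis=0) - Xb[yb == 0].mean(axis=0))
    return np.argsort(diff)[-k:]

# Bootstrap consensus: a gene (or an edge of the induced network) is kept only
# if it is selected in a high fraction of resamples, which filters out
# false-positive findings.
B = 500
counts = np.zeros(p)
for _ in range(B):
    idx = rng.integers(0, n, n)
    if len(set(y[idx])) < 2:
        continue
    counts[top_genes(X[idx], y[idx])] += 1

print("consensus genes (selected in >80% of resamples):",
      np.where(counts / B > 0.8)[0])
```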

Relevance:

10.00%

Abstract:

Background: Malignancies arising in the large bowel cause the second largest number of deaths from cancer in the Western world. Despite the progress made during the last decades, colorectal cancer remains one of the most frequent and deadly neoplasias in Western countries. Methods: A genomic study of human colorectal cancer has been carried out on a total of 31 tumoral samples, corresponding to different stages of the disease, and 33 non-tumoral samples. The study was carried out by hybridisation of the tumour samples against a reference pool of non-tumoral samples using Agilent Human 1A 60-mer oligo microarrays. The results obtained were validated by qRT-PCR. In the subsequent bioinformatics analysis, gene networks were built by means of Bayesian classifiers, variable selection and bootstrap resampling. The consensus among all the induced models produced a hierarchy of dependences and, thus, of variables. Results: After an exhaustive pre-processing step to ensure data quality (missing-value imputation, probe quality, data smoothing and intraclass variability filtering), the final dataset comprised a total of 8,104 probes. Next, a supervised classification approach and data analysis were carried out to obtain the most relevant genes. Two of them are directly involved in cancer progression and, in particular, in colorectal cancer. Finally, a supervised classifier was induced to classify new unseen samples. Conclusions: We have developed a tentative model for the diagnosis of colorectal cancer based on a biomarker panel. Our results indicate that the gene profile described herein can discriminate between non-cancerous and cancerous samples with 94.45% accuracy using different supervised classifiers (AUC values between 0.955 and 0.997).

Relevance:

10.00%

Abstract:

"System identification deals with the problem of building mathematical models of dynamical systems based on observed data from the system" [1]. In the context of civil engineering, the system refers to a large-scale structure such as a building, bridge, or offshore structure, and identification mostly involves the determination of modal parameters (the natural frequencies, damping ratios, and mode shapes). This paper presents some modal identification results obtained using a state-of-the-art time-domain system identification method (data-driven stochastic subspace algorithms [2]) applied to output-only data measured on a steel arch bridge. First, a three-dimensional finite element model was developed for the numerical analysis of the structure using ANSYS. Modal analysis was carried out and modal parameters were extracted in the frequency range of interest, 0-10 Hz. The results obtained from the finite element modal analysis were used to determine the location of the sensors. After that, ambient vibration tests were conducted during April 23-24, 2009. The response of the structure was measured using eight accelerometers. Two stations of three sensors were formed (triaxial stations); these sensors were held stationary for reference during the test. The two remaining sensors were placed at different measurement points along the bridge deck, where only vertical and transversal measurements were conducted (biaxial stations). Point and interval estimation have been carried out on the state-space model using these ambient vibration measurements. In the case of parametric models (like state space), the dynamic behaviour of a system is described using mathematical models; mathematical relationships can then be established between modal parameters and the estimated parameters (thus, it is common to use experimental modal analysis as a synonym for system identification). Stable modal parameters are found using a stabilization diagram. Furthermore, this paper proposes a method for assessing the precision of estimates of the parameters of state-space models (confidence intervals). This approach employs the nonparametric bootstrap procedure [3] and is applied to the subspace parameter estimation algorithm. Using the bootstrap results, a plot similar to a stabilization diagram is developed. These plots differentiate system modes from spurious noise modes for a given system order. Additionally, using the modal assurance criterion, the experimental modes obtained have been compared with those evaluated from a finite element analysis. Quite good agreement between numerical and experimental results is observed.
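A much-simplified sketch of the bootstrap idea applied to modal identification: the ambient record is split into segments, segments are resampled with replacement, a natural frequency is re-identified on each resample, and the spread gives a confidence interval. A spectral-peak estimator on a synthetic single-mode record stands in for the stochastic subspace algorithm of the paper.

```python
import numpy as np

rng = np.random.default_rng(6)

# Synthetic ambient response: a lightly damped 2.0 Hz mode driven by noise
# (an AR(2) filter acts as a cheap single-degree-of-freedom simulator).
fs, f0, zeta, n = 100.0, 2.0, 0.01, 60_000
w = 2.0 * np.pi * f0
a1 = 2.0 * np.exp(-zeta * w / fs) * np.cos(w * np.sqrt(1 - zeta**2) / fs)
a2 = -np.exp(-2.0 * zeta * w / fs)
x, e = np.zeros(n), rng.normal(size=n)
for t in range(2, n):
    x[t] = a1 * x[t - 1] + a2 * x[t - 2] + e[t]

def peak_frequency(sig):
    """Frequency of the spectral peak (stand-in for a subspace identifier)."""
    f = np.fft.rfftfreq(sig.size, 1.0 / fs)
    p = np.abs(np.fft.rfft(sig * np.hanning(sig.size))) ** 2
    band = (f > 0.5) & (f < 10.0)
    return f[band][np.argmax(p[band])]

# Bootstrap over non-overlapping segments of the record: resample segments
# with replacement, re-identify the frequency, and read off the interval.
segs = np.array_split(x, 60)
freqs = [peak_frequency(np.concatenate([segs[j] for j in
                                        rng.integers(0, len(segs), len(segs))]))
         for _ in range(200)]
print("f =", round(float(np.mean(freqs)), 3), "Hz,",
      "95% CI =", np.percentile(freqs, [2.5, 97.5]))
```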

Relevance:

10.00%

Abstract:

This work evaluates a spline-based smoothing method applied to the output of a glucose predictor. Methods: Our on-line prediction algorithm is based on a neural network model (NNM). We trained/validated the NNM with a prediction horizon of 30 minutes using 39/54 profiles of patients monitored with the Guardian® Real-Time continuous glucose monitoring system. The NNM output is smoothed by fitting a causal cubic spline. The assessment parameters are the error (RMSE), the mean delay (MD) and the high-frequency noise (HFCrms). The HFCrms is the root-mean-square value of the high-frequency components isolated with a zero-delay non-causal filter. HFCrms is 2.90±1.37 (mg/dl) for the original profiles.
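A minimal sketch of causal spline smoothing, assuming that each sample is smoothed using only a trailing window of past predictions; scipy's UnivariateSpline, the 5-minute sampling, the window length and the smoothing factor are illustrative choices, not the authors' implementation.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

def causal_spline_smooth(y, window=12, s=400.0):
    """Smooth each sample with a cubic smoothing spline fitted to the trailing
    `window` points only, so no future samples are used (causal).
    `s` is the spline smoothing factor, a tuning constant."""
    y = np.asarray(y, dtype=float)
    out = y.copy()
    t = np.arange(y.size, dtype=float)
    for i in range(window, y.size):
        lo = i - window + 1
        spl = UnivariateSpline(t[lo:i + 1], y[lo:i + 1], k=3, s=s)
        out[i] = float(spl(t[i]))
    return out

# Illustrative noisy glucose-like trace sampled every 5 minutes.
rng = np.random.default_rng(7)
t = np.arange(0, 24 * 60, 5)                          # one day, in minutes
g_true = 120 + 40 * np.sin(2 * np.pi * t / (6 * 60))
g_noisy = g_true + rng.normal(0, 6, t.size)
g_smooth = causal_spline_smooth(g_noisy)

print("RMSE raw   :", np.sqrt(np.mean((g_noisy - g_true) ** 2)))
print("RMSE spline:", np.sqrt(np.mean((g_smooth - g_true) ** 2)))
```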

Relevance:

10.00%

Abstract:

Since 2009, a group made up of Universidad Politécnica de Madrid (UPM; Spain) and Università degli Studi di Firenze (UniFi; Italy) has been taking part in a joint project called "Strategies for Monitoring CO2 and other Gases in Natural Analogues", coordinated by AMPHOS21 and financially supported by Fundación Ciudad de la Energía (CIUDEN; Spain) within the Compostilla OXYCFB300 project (European Energy Program for Recovery, EEPR; www.compostillaproject.eu). The main objective of the project was to develop and optimize surface monitoring methodologies for the surveillance and control of geological CO2 storage sites, analysing techniques able to detect and quantify possible CO2 leakages to the atmosphere. Investigations were carried out in natural analogues in Spain and Italy and at the Technology Development Plant for CO2 storage at Hontomín (Burgos, Spain). The techniques studied focus on the measurement of soil gases and of surface waters (runoff and springs). The soil-gas work comprised the measurement of the CO2 flux from the soil to the atmosphere and the applicability of natural tracers (such as radon) for the detection and identification of CO2 leakages. Regarding water chemistry, geochemical and isotopic data, and the gases dissolved in the waters around the Hontomín plant, were analysed to determine the most suitable parameters for detecting a possible migration of the injected CO2, or of the brine, to the surface environment.

The accumulation chamber method was used to measure the CO2 flux. Although this technique has been developed and applied in several scientific fields, the measurement and data-analysis protocol had to be adapted to the specific features of carbon capture and storage (CCS) sites, where the expected CO2 fluxes are low and, should a leakage occur, small variations in flux must be detected against a high background "noise" produced by the biological activity of the soil. The flux can be measured either without cleaning the surface where the chamber is placed, or after cleaning it and waiting for the flux to re-equilibrate after the disturbance; the results obtained after cleaning and waiting show less dispersion, which indicates that this is the better procedure for monitoring geological CO2 storage complexes. The resulting protocol, used to obtain the CO2 flux baseline at Hontomín, consists of the following steps: (a) the measurement point is prepared with a spatula, removing the vegetation cover or the first compact layer of soil; (b) a waiting period allows the gas flux to re-equilibrate after the disturbance; and (c) the CO2 flux is measured.

Once the flux survey is completed and any anomalous zones have been identified, the amount of CO2 escaping to the atmosphere (the total output) must be estimated in order to quantify the possible leakage. A wide range of methodologies exists for this estimation, so it is necessary to understand which one gives the most representative value for the system. Six statistical techniques were compared: the arithmetic mean, the unbiased estimator of the mean (Sichel's estimator), bootstrap resampling, the separation of the data into different populations by graphical and by maximum-likelihood procedures, and sequential Gaussian simulation. Eight sampling campaigns were carried out, both at the Hontomín Technology Development Plant and in Italian and Spanish natural analogues. The results show that sequential Gaussian simulation is usually the most accurate method, although in some situations other methods are more appropriate. Consequently, a procedure for selecting the best estimator was developed. The first step is a variogram analysis: if the spatial autocorrelation of the data can be modelled with the variogram, the best technique to compute the total output and its confidence interval is sequential Gaussian simulation (sGs). If the data are independent, their distribution is examined: the arithmetic mean and the unbiased (Sichel) estimator are applied to normal and log-normal data, respectively, while bootstrap resampling is the best choice when the data are neither normal nor log-normal or correspond to a mixture of populations. Following this procedure, the maximum confidence interval was of the order of ±20-25%, with most values between ±3.5% and ±8%. Identifying the different sample populations in the CO2 flux data helps to interpret the results, since the distribution is affected by several geochemical processes (for example, a geological or a biological source of CO2); it can therefore be a useful tool in a monitoring programme whose main goal is to demonstrate that there are no leakages from the reservoir to the atmosphere and, if they occur, to detect and quantify them. The results show that the separation of populations is best performed with maximum-likelihood criteria; graphical procedures, although guidelines exist, retain a degree of subjectivity in their interpretation, so their results are less reproducible.

The relationship between CO2 and the radon isotopes (222Rn and 220Rn) was studied in natural analogues. In all CO2 emission zones a positive relationship was observed between the 222Rn concentration in soil air and the CO2 flux. The relationship between 220Rn and the CO2 flux is less clear: in some cases the 220Rn activity increases with the CO2 flux, while in others it decreases, a behaviour that seems to be related to the depth of the radon source. These results support the possible use of the radon isotopes as tracers of the origin of the gases and their application to leakage detection.

With regard to the CO2 flux baseline at the Hontomín Technology Development Plant, accumulation chamber measurements were performed in the vicinity of the oil boreholes drilled in the eighties (H-1, H-2, H-3 and H-4), in the area where the injection (H-I) and monitoring (H-A) wells will be installed, and near the southern fault. Seven sampling campaigns were carried out from November 2009 to April 2011, acquiring more than 4,000 CO2 flux records with which the baseline and its seasonal variation were determined. The measured values were low (mean values between 5 and 13 g·m-2·d-1) and few anomalous values were detected, mainly near well H-2; these could not be associated with a deep CO2 source and are more likely related to biological processes such as soil respiration. No anomalous values were detected close to the fault system (Ubierna Fault), where the fluxes are as low as at the other measurement points. The CO2 fluxes therefore appear to be controlled by biological activity, as corroborated by the lowest values being recorded in the autumn-winter months and increasing in the warm periods. Two sets of reference values were calculated: UCL50 values of 5 g·m-2·d-1 for non-ploughed areas in autumn-winter, and of 3.5 and 12 g·m-2·d-1 in spring-summer for ploughed and non-ploughed areas, respectively; and UCL99 values of 26 g·m-2·d-1 for non-ploughed areas in autumn-winter, and of 34 and 42 g·m-2·d-1 in spring-summer for ploughed and non-ploughed areas, respectively. Fluxes higher than these reference values could be indicative of a possible leakage during and after the injection.

The first geochemical and isotopic data of the surface waters (runoff and springs) in the Hontomín-Huermeces area were also analysed. The data suggest that the studied waters are meteoric waters with a shallow hydrogeological circuit, characterized by relatively low TDS values (below 800 mg/L) and a Ca2+(Mg2+)-HCO3− hydrogeochemical facies. Some spring waters show high NO3− concentrations (up to 123 mg/L), suggesting anthropogenic contamination. Anomalous concentrations of Cl−, SO42−, As, B and Ba were measured in two springs close to the oil boreholes and in the Rio Ubierna; these components are possible indicators of mixing between deep and shallow aquifers. The dissolved gases also evidence the shallow circuit of the waters, which is generally dominated by the atmospheric component (N2, O2 and Ar); in some cases, however, the predominant gas was CO2 (with concentrations of up to 63% v/v), although its carbon isotopic values (<−17.7‰ V-PDB) indicate that it is most probably of biological origin. The geochemical and isotopic data obtained at Hontomín can be regarded as the background against which to compare during the operational, closure and post-closure phases. In this respect, the major and trace element composition, the carbon isotopic composition of the dissolved CO2 and of the TDIC (Total Dissolved Inorganic Carbon), and selected trace elements can be considered suitable parameters for detecting the migration of CO2 to the surface environment.
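A minimal sketch of the bootstrap branch of the estimator-selection procedure described above: when the flux data are independent but non-normal or a mixture of populations, the total output and its confidence interval are obtained by resampling. The flux values and the surveyed area are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(8)

# Illustrative CO2 flux sample (g m-2 d-1): a biological background population
# plus a sparser high-flux population (the kind of mixture discussed above).
flux = np.concatenate([rng.lognormal(1.6, 0.5, 180),
                       rng.lognormal(3.2, 0.4, 20)])
area_m2 = 50_000.0  # surveyed area, assumed for the example

def total_output(sample):
    """Total emission over the surveyed area, in kg of CO2 per day."""
    return sample.mean() * area_m2 / 1000.0

# Percentile bootstrap for the total output and its confidence interval.
boot = np.array([total_output(rng.choice(flux, flux.size, replace=True))
                 for _ in range(5000)])
point = total_output(flux)
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"total CO2 output: {point:.1f} kg/d, 95% CI {lo:.1f}-{hi:.1f} "
      f"(about ±{100 * (hi - lo) / (2 * point):.1f}%)")
```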

Relevance:

10.00%

Abstract:

We are living through an age of Internetification. Nowadays, Internet connections are a utility whose presence one can simply assume. The Web has become a place where content is generated by its users, and this information surpasses the notion with which the World Wide Web emerged because, in most cases, the content has been designed to be consumed by humans and not by machines. This implies a change of mindset in the way we design systems, which must be able to support a computational and storage load that apparently grows without end. At the same time, higher education is in a state of crisis: the high cost of high-quality education threatens the academic world. Through the use of technology, productivity can be increased and these costs reduced in a field that has remained largely unchanged since the Renaissance. In CloudRoom, a MOOC platform has been designed with an architecture that follows the latest conventions in Cloud Computing, which involves the use of REST services and NoSQL databases and adopts the latest W3C recommendations on web development and Linked Data. For its construction, agile Software Engineering methods, Human-Computer Interaction techniques, and state-of-the-art technologies such as Neo4j, Redis, Node.js, AngularJS, Bootstrap, HTML5, CSS3 and Amazon Web Services have been used. A comprehensive Computer Engineering effort has been carried out, combining virtually all of the fundamental areas of knowledge in Computer Science. 
In short, the foundations of a robust, maintainable, distributed system have been laid: a system with social and semantic capabilities, which runs on multiple devices and is able to respond to millions of users.