850 results for High-dimensional data visualization


Relevance: 100.00%

Abstract:

Quantifying the health effects associated with simultaneous exposure to many air pollutants is now a research priority of the US EPA. Bayesian hierarchical models (BHM) have been extensively used in multisite time series studies of air pollution and health to estimate the health effects of a single pollutant adjusted for potential confounding by other pollutants and other time-varying factors. However, when the scientific goal is to estimate the impacts of many pollutants jointly, a straightforward application of BHM is challenged by the need to specify a random-effect distribution on a high-dimensional vector of nuisance parameters, which often do not have an easy interpretation. In this paper we introduce a new BHM formulation, which we call the "reduced BHM", aimed at analyzing clustered data sets in the presence of a large number of random effects that are not of primary scientific interest. At the first stage of the reduced BHM, we calculate the integrated likelihood of the parameter of interest (e.g., the excess number of deaths attributable to simultaneous exposure to high levels of many pollutants). At the second stage, we specify a flexible random-effect distribution directly on the parameter of interest. The reduced BHM overcomes many of the challenges in the specification and implementation of a full BHM in the context of a large number of nuisance parameters. In simulation studies we show that the reduced BHM performs comparably to the full BHM in many scenarios, and even performs better in some cases. The methods are applied to estimate location-specific and overall relative risks of cardiovascular hospital admissions associated with simultaneous exposure to elevated levels of particulate matter and ozone in 51 US counties during the period 1999-2005.
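
The two-stage structure lends itself to a compact sketch. The following Python example is purely illustrative (not the authors' implementation): it assumes each county's integrated likelihood for its parameter of interest can be summarized by a point estimate and a standard error, places a normal random-effect distribution directly on that parameter, and estimates the overall effect and heterogeneity by marginal maximum likelihood.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)

# Stage 1 (assumed done upstream): county-specific estimates of the
# parameter of interest (e.g., log relative risk) and their standard
# errors, summarizing each county's integrated likelihood.
n_counties = 51
true_mu, true_tau = 0.02, 0.01                  # overall effect, heterogeneity SD
theta = rng.normal(true_mu, true_tau, n_counties)
se = rng.uniform(0.005, 0.03, n_counties)       # county-level standard errors
theta_hat = rng.normal(theta, se)

# Stage 2: normal random-effect distribution on the parameter of interest.
# Marginally, theta_hat_i ~ N(mu, se_i^2 + tau^2); profile out mu and
# maximize the marginal likelihood over tau.
def neg_profile_loglik(tau):
    w = 1.0 / (se**2 + tau**2)
    mu_hat = np.sum(w * theta_hat) / np.sum(w)
    return -0.5 * np.sum(np.log(w) - w * (theta_hat - mu_hat) ** 2)

res = minimize_scalar(neg_profile_loglik, bounds=(1e-6, 0.5), method="bounded")
tau_hat = res.x
w = 1.0 / (se**2 + tau_hat**2)
mu_hat = np.sum(w * theta_hat) / np.sum(w)

# Shrunken county-specific effects (posterior means given mu_hat, tau_hat).
shrink = tau_hat**2 / (tau_hat**2 + se**2)
theta_post = shrink * theta_hat + (1 - shrink) * mu_hat
print(f"overall effect: {mu_hat:.4f}, heterogeneity SD: {tau_hat:.4f}")
```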

Relevance: 100.00%

Abstract:

One of the original ocean-bottom time-lapse seismic studies was performed at the Teal South oil field in the Gulf of Mexico during the late 1990s. This work reexamines some aspects of previous work using modern analysis techniques to provide improved quantitative interpretations. Using three-dimensional volume visualization of legacy data and the two phases of post-production time-lapse data, I provide additional insight into the fluid migration pathways and the pressure communication between different reservoirs separated by faults. This work supports a conclusion from previous studies that production from one reservoir caused a regional pressure decline that in turn resulted in the liberation of gas from multiple surrounding unproduced reservoirs. I also provide an explanation for unusual time-lapse changes in amplitude-versus-offset (AVO) data related to the compaction of the producing reservoir, which, in turn, changed an isotropic medium to an anisotropic medium.

In the first part of this work, I examine regional changes in seismic response due to the production of oil and gas from one reservoir. The previous studies primarily used two post-production ocean-bottom surveys (Phase I and Phase II), and not the legacy streamer data, due to the unavailability of legacy prestack data and very different acquisition parameters. In order to incorporate the legacy data in the present study, all three poststack data sets were cross-equalized and examined using instantaneous amplitude and energy volumes. This approach appears quite effective and helps to suppress changes unrelated to production while emphasizing those large-amplitude changes that are related to production in this noisy (by current standards) suite of data. I examine the multiple data sets first by using the instantaneous amplitude and energy attributes, and then also examine specific apparent time-lapse changes through direct comparisons of seismic traces. In so doing, I identify time delays that, when corrected for, indicate water encroachment at the base of the producing reservoir. I also identify specific sites of leakage from various unproduced reservoirs, the result of regional pressure blowdown as explained in previous studies; those earlier studies, however, were unable to identify direct evidence of fluid movement. Of particular interest is the identification of one site where oil apparently leaked from one reservoir into a "new" reservoir that did not originally contain oil, but was ideally suited as a trap for fluids leaking through the neighboring reservoir's spill point. With continued pressure drop, the amount of oil in the new reservoir increased as more oil entered the reservoir and expanded, liberating gas from solution. Because of the limited volume available for oil and gas in that temporary trap, oil and gas also escaped from it into the surrounding formation. I also note that some of the reservoirs demonstrate time-lapse changes only in the "gas cap" and not in the oil zone, even though gas must be coming out of solution everywhere in the reservoir. This is explained by the interplay between pore-fluid modulus reduction as gas saturation increases and dry-frame modulus increase due to frame stiffening.

In the second part of this work, I examine various rock-physics models in an attempt to quantitatively account for the frame stiffening that results from reduced pore-fluid pressure in the producing reservoir, searching for a model that would predict the unusual AVO features observed in the time-lapse prestack and stacked data at Teal South. While several rock-physics models are successful at predicting the time-lapse response for initial production, most fail to match the observations for continued production between Phase I and Phase II. Because the reservoir was initially overpressured and unconsolidated, reservoir compaction was likely significant, and was probably accomplished largely by uniaxial strain in the vertical direction; this implies that an anisotropic model may be required. Using Walton's model for anisotropic unconsolidated sand, I successfully model the time-lapse changes for all phases of production. This observation may be of interest for application to other unconsolidated overpressured reservoirs under production.
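
The interplay just described is standard fluid-substitution territory. As a hedged illustration (isotropic Gassmann substitution with a Reuss fluid mix, not the anisotropic Walton model used in this thesis, and with generic assumed rock and fluid properties rather than Teal South measurements), the Python sketch below shows how a small free-gas fraction sharply lowers the saturated bulk modulus while a stiffer dry frame pushes the other way:

```python
import numpy as np

def gassmann_ksat(k_dry, k_min, k_fl, phi):
    """Isotropic Gassmann: saturated bulk modulus from the dry-frame modulus."""
    b = (1 - k_dry / k_min) ** 2
    return k_dry + b / (phi / k_fl + (1 - phi) / k_min - k_dry / k_min**2)

# Generic, assumed properties (GPa), not Teal South measurements.
k_min, phi = 36.0, 0.30          # quartz mineral modulus, porosity
k_oil, k_gas = 1.0, 0.04         # fluid bulk moduli
rho = 2.1                        # bulk density, g/cc (kept fixed for simplicity)

for k_dry, label in [(2.0, "initial dry frame"), (3.0, "stiffened dry frame")]:
    for s_gas in [0.0, 0.05, 0.20]:
        # Reuss (harmonic) average for a uniformly mixed gas/oil pore fluid.
        k_fl = 1.0 / (s_gas / k_gas + (1 - s_gas) / k_oil)
        k_sat = gassmann_ksat(k_dry, k_min, k_fl, phi)
        vp = np.sqrt((k_sat + 4/3 * 2.5) / rho)  # shear modulus assumed 2.5 GPa
        print(f"{label}, gas sat {s_gas:4.2f}: "
              f"K_sat = {k_sat:5.2f} GPa, Vp ~ {vp:4.2f} km/s")
```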

Relevance: 100.00%

Abstract:

Simbrain is a visually oriented framework for building and analyzing neural networks. It emphasizes the analysis of networks that control agents embedded in virtual environments, and the visualization of the structures that occur in the high-dimensional state spaces of these networks. The program was originally intended to facilitate the analysis of representational processes in embodied agents; however, it is also well suited to teaching neural network concepts to a broader audience than is traditional for neural networks courses. Simbrain was used to teach a course at a new university, UC Merced, in its inaugural year. Experiences from the course and sample lessons are provided.
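
Simbrain ships its own projection tools for exactly this purpose; purely as a hedged, generic stand-in (a toy network, not Simbrain's API), the Python sketch below shows the underlying idea: record a recurrent network's high-dimensional activation trajectory and project it onto its first two principal components so the trajectory's structure can be plotted.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy recurrent network: 50 units, tanh dynamics, random weights.
n_units, n_steps = 50, 500
W = rng.normal(0, 1 / np.sqrt(n_units), (n_units, n_units))
x = rng.normal(0, 1, n_units)

states = np.empty((n_steps, n_units))
for t in range(n_steps):
    x = np.tanh(W @ x)
    states[t] = x

# PCA via SVD of the centered state matrix: rows are time points.
centered = states - states.mean(axis=0)
U, S, Vt = np.linalg.svd(centered, full_matrices=False)
proj = centered @ Vt[:2].T        # 2-D coordinates of the trajectory

explained = S[:2] ** 2 / np.sum(S ** 2)
print("variance explained by 2 PCs:", explained.sum().round(3))
# proj[:, 0], proj[:, 1] can now be plotted to reveal attractor structure.
```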

Relevance: 100.00%

Abstract:

We present an algorithm for estimating dense image correspondences. Our versatile approach lends itself to various tasks typical of video post-processing, including image morphing, optical flow estimation, stereo rectification, disparity/depth reconstruction, and baseline adjustment. We incorporate recent advances in feature matching, energy minimization, stereo vision, and data clustering into our approach. At the core of our correspondence estimation we use Efficient Belief Propagation for energy minimization. While state-of-the-art algorithms only work on thumbnail-sized images, our novel feature downsampling scheme, in combination with a simple yet efficient data term compression, can cope with high-resolution data. The incorporation of SIFT (Scale-Invariant Feature Transform) features into the data term computation further resolves matching ambiguities, making long-range correspondence estimation possible. We detect occluded areas by evaluating the correspondence symmetry, and we apply geodesic matting to automatically determine plausible values in these regions.
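
To make the energy-minimization core concrete, here is a hedged toy sketch: exact min-sum dynamic programming on a single scanline, the 1-D analogue of the 2-D Efficient Belief Propagation the abstract names but does not detail. It combines an absolute-difference data term with a truncated-linear smoothness term over per-pixel disparity labels.

```python
import numpy as np

def scanline_disparity(left, right, max_disp=8, lam=2.0, trunc=3.0):
    """Exact min-sum (Viterbi) inference on a chain MRF over disparities."""
    n = len(left)
    labels = np.arange(max_disp + 1)
    # Data term: absolute intensity difference, huge cost if out of bounds.
    data = np.full((n, max_disp + 1), 1e9)
    for d in labels:
        data[d:, d] = np.abs(left[d:] - right[:n - d])
    # Pairwise term: truncated linear penalty on disparity jumps.
    pair = lam * np.minimum(np.abs(labels[:, None] - labels[None, :]), trunc)

    cost = data[0].copy()
    back = np.zeros((n, max_disp + 1), dtype=int)
    for i in range(1, n):
        total = cost[:, None] + pair          # (previous label, current label)
        back[i] = np.argmin(total, axis=0)
        cost = total[back[i], labels] + data[i]
    # Backtrack the minimum-energy labeling.
    disp = np.empty(n, dtype=int)
    disp[-1] = int(np.argmin(cost))
    for i in range(n - 1, 0, -1):
        disp[i - 1] = back[i, disp[i]]
    return disp

# Synthetic pair: the right scanline is the left shifted by 3 pixels.
rng = np.random.default_rng(2)
left = rng.uniform(0, 1, 64)
right = np.roll(left, -3)
print(scanline_disparity(left, right))       # mostly 3s away from the borders
```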

Relevance: 100.00%

Abstract:

High-quality data are essential for veterinary surveillance systems, and their quality can be affected by the source and the method of collection. Data recorded on farms could provide detailed information about the health of a population of animals, but the accuracy of the data recorded by farmers is uncertain. The aims of this study were to evaluate the quality of the data on animal health recorded on 97 Swiss dairy farms, to compare the quality of the data obtained by different recording systems, and to obtain baseline data on the health of the animals on the 97 farms. Data on animal health were collected from the farms for a year. Their quality was evaluated by assessing the completeness and accuracy of the recorded information, and by comparing farmers' and veterinarians' records. The quality of the data provided by the farmers was satisfactory, although electronic recording systems made it easier to trace the animals treated. The farmers tended to record more health-related events than the veterinarians, although this varied with the event considered, and some events were recorded only by the veterinarians. The farmers' attitude towards data collection was positive. Factors such as motivation, feedback, training, and simplicity and standardisation of data collection were important because they influenced the quality of the data.
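
The abstract does not name the statistic used to compare farmers' and veterinarians' records; a conventional choice for quantifying agreement between two record sources is Cohen's kappa. A minimal sketch with invented per-animal records:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters over the same items."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    ca, cb = Counter(rater_a), Counter(rater_b)
    expected = sum(ca[k] * cb[k] for k in set(ca) | set(cb)) / n**2
    return (observed - expected) / (1 - expected)

# Hypothetical per-animal records: did each source note a treatment event?
farmer = ["yes", "yes", "no", "yes", "no", "no", "yes", "no"]
vet    = ["yes", "no",  "no", "yes", "no", "yes", "yes", "no"]
print(f"kappa = {cohens_kappa(farmer, vet):.2f}")   # 0.50 for this toy data
```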

Relevance: 100.00%

Abstract:

High-throughput assays, such as the yeast two-hybrid system, have generated a huge amount of protein-protein interaction (PPI) data in the past decade. This tremendously increases the need for reliable methods to systematically and automatically suggest protein functions and the relationships between them. With the available PPI data, it is now possible to study functions and relationships in the context of a large-scale network. To date, several network-based schemes have been proposed to annotate protein functions on a large scale. However, because of the noise inherent in high-throughput data generation, new methods and algorithms are needed to increase the reliability of functional annotations. Previous work in a yeast PPI network (Samanta and Liang, 2003) has shown that the local connection topology, particularly for two proteins sharing an unusually large number of neighbors, can predict functional associations between proteins, and hence suggest their functions. One advantage of that work is that the algorithm is not sensitive to noise (false positives) in high-throughput PPI data. In this study, we improved their prediction scheme by developing a new algorithm and new methods, which we applied to a human PPI network to make a genome-wide functional inference. We used the new algorithm to measure and reduce the influence of hub proteins on detecting functionally associated proteins. We used the annotations of the Gene Ontology (GO) and the Kyoto Encyclopedia of Genes and Genomes (KEGG) as independent and unbiased benchmarks to evaluate our algorithms and methods within the human PPI network. We showed that, compared with the previous work of Samanta and Liang, the algorithm and methods developed in this study improved the overall quality of functional inferences for human proteins. By applying the algorithms to the human PPI network, we obtained 4,233 significant functional associations among 1,754 proteins. Further comparisons of their KEGG and GO annotations allowed us to assign 466 KEGG pathway annotations to 274 proteins and 123 GO annotations to 114 proteins, with estimated false discovery rates of <21% for KEGG and <30% for GO. We clustered 1,729 proteins by their functional associations and performed pathway analysis to identify several subclusters that are highly enriched in certain signaling pathways. In particular, we performed a detailed analysis of a subcluster enriched in the transforming growth factor β signaling pathway (P < 10^-50), which is important in cell proliferation and tumorigenesis. Analysis of four other subclusters also suggested potential new players in six signaling pathways worthy of further experimental investigation. Our study gives clear insight into the common-neighbor-based prediction scheme and provides a reliable method for large-scale functional annotation in this post-genomic era.
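
In outline, common-neighbor schemes of this kind score each protein pair by how improbable its count of shared neighbors would be in a random network with the same degrees. The Python sketch below uses a plain hypergeometric tail probability on an invented toy network; it illustrates the idea only and is not the exact statistic of Samanta and Liang or of this study.

```python
from itertools import combinations
from scipy.stats import hypergeom

# Toy PPI network as an adjacency dict; protein names are invented.
ppi = {
    "P1": {"P3", "P4", "P5", "P6"},
    "P2": {"P3", "P4", "P5", "P7"},
    "P3": {"P1", "P2"}, "P4": {"P1", "P2"}, "P5": {"P1", "P2"},
    "P6": {"P1"}, "P7": {"P2"},
}
N = len(ppi)

# For each pair: P(sharing >= m neighbors by chance), given the two
# degrees, via the hypergeometric survival function.
for a, b in combinations(sorted(ppi), 2):
    m = len(ppi[a] & ppi[b])
    if m == 0:
        continue
    n1, n2 = len(ppi[a]), len(ppi[b])
    p = hypergeom.sf(m - 1, N, n1, n2)   # n2 draws from N, n1 "marked"
    print(f"{a}-{b}: share {m} neighbors, p = {p:.3g}")
```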

Relevance: 100.00%

Abstract:

It is a challenge to measure the impact of releasing data to the public, since the effects may not be directly linked to particular open data activities, or substantial impact may only occur several years after publishing the data. This paper proposes a framework to assess the impact of releasing open data by applying the Social Return on Investment (SROI) approach. SROI was developed for organizations that aim to generate social and environmental benefits, and thus fits the purpose of most open data initiatives. We link the four steps of SROI (input, output, outcome, impact) with the 14 high-value data categories of the G8 Open Data Charter to create a matrix of open data examples, activities, and impacts in each of the data categories. This Impact Monitoring Framework helps data providers navigate the impact space of open data and lays out the conceptual basis for further research.

Relevance: 100.00%

Abstract:

High-frequency data collected continuously over a multiyear time frame are required for investigating the various agents that drive ecological and hydrodynamic processes in estuaries. Here, we present water quality and current in-situ observations from a fixed monitoring station operating from 2008 to 2014 in the lower Guadiana Estuary, southern Portugal (37°11.30' N, 7°24.67' W). The data were recorded by a multi-parametric probe providing hourly records (temperature, salinity, chlorophyll, dissolved oxygen, turbidity, and pH) at a water depth of ~1 m, and by a bottom-mounted acoustic Doppler current profiler measuring the pressure, near-bottom temperature, and flow velocity through the water column every 15 min. The time-series data, in particular those from the probe, contain substantial gaps arising from equipment failure and maintenance, which are unavoidable with this type of observation in harsh environments. However, prolonged (months-long) periods of multi-parametric observations under contrasting external forcing conditions are available. The raw data are reported together with flags indicating the quality status of each record. River discharge data from two hydrographic stations located near the estuary head are also provided to support data analysis and interpretation.
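
As a hedged usage sketch (the file name, column names, and flag convention below are assumptions for illustration, not the published data format), the flagged hourly probe records might be screened and gap-checked with pandas like this:

```python
import pandas as pd

# Hypothetical layout: one row per hourly record, one value column per
# parameter plus a matching "<param>_flag" column (assume 1 = good).
df = pd.read_csv("guadiana_probe.csv", parse_dates=["time"], index_col="time")

params = ["temperature", "salinity", "chlorophyll", "oxygen", "turbidity", "ph"]
for param in params:
    bad = df[f"{param}_flag"] != 1
    df.loc[bad, param] = float("nan")     # mask records not flagged as good

# Re-grid to a strict hourly axis so equipment gaps appear as NaN runs.
hourly = df[params].resample("1h").mean()
gap_fraction = hourly["salinity"].isna().mean()
print(f"fraction of hourly salinity records missing: {gap_fraction:.1%}")
```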

Relevance: 100.00%

Abstract:

Based on discrete samples, we report new high-resolution records of the ~185 kyr Iceland Basin (IB) geomagnetic excursion from Ocean Drilling Project (ODP) Site 1063 on the Bermuda Rise (sedimentation rate 32 cm/kyr) and from ODP Site 983 in the far North Atlantic (sedimentation rate 18 cm/kyr). Two records from Holes 1063A and 1063B are very consistent, and provide the highest resolution of the detailed field behaviour during the IB excursion obtained so far. Inclination records from Holes 983B and 983C in the far North Atlantic are also very consistent, whereas declination anomalies deviate more notably. The pseudo-Thellier (PT) technique was applied along with more conventional palaeointensity proxies (NRM/ARM and NRM/kappa) to recover relative palaeointensity (RPI) estimates from Hole 1063A and Hole 983B. As expected, these proxies indicate that the field intensity generally dropped at both sites during the IB excursion, but also that the history of RPI from the two sites is different. VGPs from Site 1063 indicate that the field at this location experienced some stop-and-go behaviour between patches of intense vertical flux over North America and the tip of South America, areas which coincide fairly well with patches of preferred transitional VGP clustering from reversals and zones of high seismic velocity in the lower mantle. Changes in RPI at this location were generally gradual, possibly due to the proximity of these flux patches, and the first period of VGP-clustering over North America was accompanied by a conspicuous increase in RPI. VGPs from Site 983 track along a different path, and the associated RPI changes are very abrupt and completely synchronous with the onset and termination of the excursion. The differing VGP paths from Sites 1063 and 983 indicate that the global field structure during the IB excursion was not dominated by a single dipole.
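
In outline, the pseudo-Thellier proxy is a regression slope: the NRM lost during stepwise alternating-field demagnetization is regressed against the ARM acquired at the same field steps, and the best-fit slope over a chosen coercivity window serves as the relative palaeointensity estimate. A minimal Python sketch with invented demagnetization data (the values and window are illustrative only, not measurements from these cores):

```python
import numpy as np

# Invented stepwise AF data: NRM remaining and ARM acquired at the same
# field steps (mT); real data would come from the discrete-sample runs.
af_steps = np.array([10, 15, 20, 25, 30, 35, 40, 45, 50])
nrm_remaining = np.array([8.1, 7.0, 6.0, 5.1, 4.3, 3.7, 3.1, 2.7, 2.3])
arm_acquired = np.array([1.0, 2.1, 3.2, 4.1, 4.9, 5.6, 6.3, 6.8, 7.2])

# Pseudo-Thellier RPI proxy: slope of NRM lost vs. ARM gained over a
# coercivity window (here 15-45 mT), fitted by least squares.
window = (af_steps >= 15) & (af_steps <= 45)
nrm_lost = nrm_remaining[0] - nrm_remaining[window]
slope, intercept = np.polyfit(arm_acquired[window], nrm_lost, 1)
print(f"relative palaeointensity proxy (slope): {slope:.3f}")
```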

Relevance: 100.00%

Abstract:

The mid-Piacenzian (MP) warm period (3.264-3.025 Ma) has been identified as the most recent time in geologic history during which mean global surface temperatures were considerably warmer than today for a sustained period. This interval has therefore been proposed as a potential (albeit imperfect) analog for future climate change and, as such, has received much scientific attention over the past two decades. Central to this research effort is the Pliocene Research, Interpretation, and Synoptic Mapping (PRISM) project, an iterative paleoenvironmental reconstruction of the MP focused on increasing our understanding of warm-period climate forcings, dynamics, and feedbacks by providing three-dimensional data sets for general circulation models. A mainstay of the PRISM project has been the development of a global sea surface temperature (SST) data set based primarily upon quantitative analyses of planktic foraminifer assemblages, supplemented with geochemical SST estimates wherever possible. In order to improve the spatial coverage of the PRISM faunal data set in the low- and mid-latitude North Atlantic, this study provides a description of the MP planktic foraminifer assemblages from five Ocean Drilling Program sites (951, 958, 1006, 1062, and 1063) in the subtropical gyre, a region critical to Atlantic Ocean circulation and tropical heat advection. Assemblages from each core provide evidence for a temperature- and circulation-driven 5-10° northward displacement of MP faunal provinces, as well as regional shifts in planktic foraminifer populations linked to species ecology and interactions. General biogeographic trends also indicate that, relative to modern conditions, gyre circulation was stronger (particularly the Gulf Stream, North Atlantic Current, and North Equatorial Current) and meridionally broader. A comparison of mid-Piacenzian and modern North Atlantic planktic foraminifer assemblages suggests that low-latitude western boundary currents were less than 1 °C warmer while eastern boundary currents were ~1-2 °C warmer, supporting the hypothesis of enhanced northward heat advection along western boundary currents and warming of the high-latitude Northeast Atlantic source regions of the Canary Current. These findings are consistent with a model of reduced meridional SST gradients, little-to-no low-latitude warming, and more vigorous ocean circulation. The results therefore support the theory that enhanced meridional overturning circulation and the associated northward heat advection made an important contribution, in conjunction with elevated atmospheric CO2 concentrations, to the 2-3 °C global surface temperature increase (relative to today) and the strong polar amplification of SST warmth during the MP warm period.
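
The abstract does not spell out the faunal transfer method, but a standard technique for assemblage-based SST estimation is the modern analog technique (MAT), sketched below with invented data purely as an illustration: squared chord distances are computed between a fossil assemblage and modern core-top assemblages, and the SSTs of the closest analogs are averaged.

```python
import numpy as np

def squared_chord(a, b):
    """Squared chord distance between two relative-abundance vectors."""
    return np.sum((np.sqrt(a) - np.sqrt(b)) ** 2)

rng = np.random.default_rng(3)

# Invented modern core-top data: 200 sites, 20 taxa, one known SST each.
n_sites, n_taxa = 200, 20
modern = rng.dirichlet(np.ones(n_taxa), n_sites)   # relative abundances
modern_sst = rng.uniform(5, 29, n_sites)

# Invented fossil assemblage to be "calibrated".
fossil = rng.dirichlet(np.ones(n_taxa))

# Modern analog technique: SST = mean SST of the k closest modern analogs.
k = 10
d = np.array([squared_chord(fossil, m) for m in modern])
best = np.argsort(d)[:k]
sst_estimate = modern_sst[best].mean()
print(f"MAT SST estimate: {sst_estimate:.1f} °C "
      f"(mean analog distance {d[best].mean():.3f})")
```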

Relevance: 100.00%

Abstract:

Pragmatism is the leading motivation of regularization. We can understand regularization as a modification of the maximum-likelihood estimator so that a reasonable answer can be given in an unstable or ill-posed situation. To mention some typical examples, this happens when fitting parametric or non-parametric models with more parameters than data, or when estimating large covariance matrices. Regularization is also commonly used to improve the bias-variance tradeoff of an estimation. The definition of regularization is therefore quite general, and, although the introduction of a penalty is probably the most popular form, it is just one among many. In this dissertation, we focus on applications of regularization for obtaining sparse or parsimonious representations, where only a subset of the inputs is used. A particular form of regularization, L1-regularization, plays a key role in reaching sparsity. Most of the contributions presented here revolve around L1-regularization, although other forms of regularization are explored (also pursuing sparsity in some sense). In addition to presenting a compact review of L1-regularization and its applications in statistics and machine learning, we devise methodology for regression, supervised classification, and structure induction of graphical models. Within the regression paradigm, we focus on kernel smoothing, proposing techniques for kernel design that are suitable for high-dimensional settings and sparse regression functions. We also present an application of regularized regression techniques to modeling the response of biological neurons. The advances in supervised classification deal, on the one hand, with the application of regularization for obtaining a naïve Bayes classifier and, on the other hand, with a novel algorithm for brain-computer interface design that uses group regularization in an efficient manner. Finally, we present a heuristic for inducing the structure of Gaussian Bayesian networks using L1-regularization as a filter.
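
Since L1-regularization is the thread running through the dissertation, a generic textbook sketch (not the dissertation's own methodology) may help: coordinate descent for the lasso, in which each coefficient update is a closed-form soft-thresholding step, yields exact zeros and hence a sparse model.

```python
import numpy as np

def soft_threshold(z, t):
    """Solution of min_b 0.5*(b - z)^2 + t*|b|."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    """Coordinate descent for 0.5/n * ||y - Xb||^2 + lam * ||b||_1."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n
    r = y.copy()                       # residual y - X @ b
    for _ in range(n_iter):
        for j in range(p):
            # Correlation of column j with the partial residual.
            rho = X[:, j] @ (r + X[:, j] * b[j]) / n
            b_new = soft_threshold(rho, lam) / col_sq[j]
            r += X[:, j] * (b[j] - b_new)
            b[j] = b_new
    return b

# Sparse ground truth: only 3 of 50 inputs matter.
rng = np.random.default_rng(4)
X = rng.normal(size=(200, 50))
beta = np.zeros(50)
beta[[3, 17, 41]] = [2.0, -1.5, 1.0]
y = X @ beta + 0.1 * rng.normal(size=200)

b_hat = lasso_cd(X, y, lam=0.1)
print("nonzero coefficients:", np.flatnonzero(np.round(b_hat, 2)))
```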

Relevance: 100.00%

Abstract:

The creation of atlases, or digital models where information from different subjects can be combined, is a field of increasing interest in biomedical imaging. When a single image does not contain enough information to appropriately describe the organism under study, it is necessary to acquire images of several individuals, each of them containing complementary data with respect to the rest of the cohort. This approach allows the creation of digital prototypes, ranging from anatomical atlases of human patients and organs, obtained for instance from Magnetic Resonance Imaging, to gene expression cartographies of embryo development, typically achieved with Light Microscopy. Within this context, in this PhD Thesis we propose, develop, and validate new dedicated image processing methodologies that, based on image registration techniques, bring information from multiple individuals into alignment within a single digital atlas model. We also elaborate a dedicated software visualization platform to explore the resulting wealth of multi-dimensional data, and novel analysis algorithms to automatically mine the generated resource in search of biological insights. In particular, this work focuses on gene expression data from developing zebrafish embryos imaged at the cellular resolution level with Two-Photon Laser Scanning Microscopy. Having quantitative measurements relating multiple gene expressions to cell position and their evolution in time is a fundamental prerequisite for understanding the multi-scale processes of embryogenesis. However, the number of gene expressions that can be simultaneously stained in one acquisition is limited by optical and labeling constraints. These limitations motivate the implementation of atlasing strategies that can recreate a virtual gene expression multiplex. The developed computational tools have been tested in two different scenarios. The first is early zebrafish embryogenesis, where the resulting atlas constitutes a link between the phenotype and the genotype at the cellular level. The second is the late zebrafish brain, where the resulting atlas allows studies relating gene expression to brain regionalization and neurogenesis. The proposed computational frameworks have been adapted to the requirements of both scenarios, such as the integration of partial views of the embryo into a whole-embryo model with cellular resolution, or the registration of anatomical traits with deformable transformation models not dependent on any specific labeling. The software implementation of the atlas generation tool (Match-IT) and the visualization platform (Atlas-IT), together with the gene expression atlas resources developed in this Thesis, are to be made freely available to the scientific community. Lastly, a novel proof-of-concept experiment integrates for the first time 3D gene expression atlas resources with cell lineages extracted from live embryos, opening up the door to correlating genetic and cellular spatio-temporal dynamics.
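
The registration step at the heart of such atlasing can be sketched with off-the-shelf tooling. Below is a hedged SimpleITK example (not the Thesis' Match-IT pipeline; the file names are placeholders) that aligns a moving 3-D volume to a fixed reference using an affine transform and a mutual-information metric:

```python
import SimpleITK as sitk

# Placeholder file names; any two roughly comparable 3-D volumes work.
fixed = sitk.ReadImage("reference_embryo.nii.gz", sitk.sitkFloat32)
moving = sitk.ReadImage("subject_embryo.nii.gz", sitk.sitkFloat32)

reg = sitk.ImageRegistrationMethod()
reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=50)
reg.SetMetricSamplingStrategy(reg.RANDOM)
reg.SetMetricSamplingPercentage(0.1)
reg.SetInterpolator(sitk.sitkLinear)
reg.SetOptimizerAsRegularStepGradientDescent(
    learningRate=1.0, minStep=1e-4, numberOfIterations=300)
reg.SetOptimizerScalesFromPhysicalShift()

# Start from a center-of-geometry alignment, then optimize a 3-D affine.
initial = sitk.CenteredTransformInitializer(
    fixed, moving, sitk.AffineTransform(3),
    sitk.CenteredTransformInitializerFilter.GEOMETRY)
reg.SetInitialTransform(initial, inPlace=False)

transform = reg.Execute(fixed, moving)
resampled = sitk.Resample(moving, fixed, transform, sitk.sitkLinear, 0.0)
sitk.WriteImage(resampled, "subject_in_atlas_space.nii.gz")
```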

Relevance: 100.00%

Abstract:

Multi-projector display systems have gained great popularity over the past years for use in a wide range of applications, such as virtual reality systems, simulators, and data visualization, where a high-resolution image over a large projection surface is required. Such systems are cheap for the resolutions they can provide, can be configured to project images on almost any kind of screen shape, and are easily scalable, but in order to provide a seamless image with no geometric or photometric discontinuities they require precise geometric and colour correction. In this thesis, we analyze all the problems that have to be faced in order to design and calibrate a multi-projector display. We propose a calibration methodology with optimizations that make the adjustment of this kind of display easier and faster. The results of implementing this methodology on the graphics output of a training simulator are presented and discussed.
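
A hedged sketch of one step of such a calibration (generic OpenCV, not the thesis' methodology; the point coordinates and overlap width are invented): estimate a homography from detected calibration points that maps the common screen reference into one projector's frame buffer, pre-warp the content accordingly, and feather the overlap region with a linear alpha ramp.

```python
import cv2
import numpy as np

# Hypothetical correspondences: where 4+ frame-buffer calibration points
# of this projector were observed on the screen (e.g., by a camera),
# expressed in a common screen coordinate system.
fb_pts = np.array([[0, 0], [1920, 0], [1920, 1080], [0, 1080]], np.float32)
screen_pts = np.array([[12, 8], [1900, 30], [1880, 1060], [30, 1075]],
                      np.float32)

# Homography mapping screen coordinates into this projector's frame buffer.
H, _ = cv2.findHomography(screen_pts, fb_pts)

# Pre-warp the content so it lands undistorted on the screen.
content = np.full((1080, 1920, 3), 128, np.uint8)   # placeholder image
warped = cv2.warpPerspective(content, H, (1920, 1080))

# Photometric half (simplified): linear alpha ramp over an assumed
# 200-pixel overlap with the neighboring projector on the right edge.
ramp = np.ones((1, 1920), np.float32)
ramp[0, -200:] = np.linspace(1.0, 0.0, 200)
blended = (warped.astype(np.float32) * ramp[..., None]).astype(np.uint8)
```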

Relevance: 100.00%

Abstract:

This doctoral dissertation discusses field research conducted to monitor heritage assets with sensor networks and other non-invasive techniques. The aim pursued was to contribute to preventive conservation by tracking or preventing decay-induced damage. Monitoring methodologies based on three-dimensional data-logger networks were used in short-term micro-climatic, comfort, and energy studies to draw conclusions about the energy efficiency of three heating systems widely used in churches of the central Iberian Peninsula. The impact of these systems on occupant comfort and on the decay of heritage and built elements was also explored. Different wireless sensor platforms were deployed and analysed to determine which delivered the best results for long-term heritage monitoring from the standpoints of communications, energy demand, and network architecture. A methodology was subsequently designed to study communication quality with the best-performing platform in a number of cultural and natural heritage scenarios, and to help establish the considerations to be borne in mind when deploying wireless sensor networks for heritage monitoring in the future. As with the data logger-based networks, the monitoring conducted in this research with wireless platforms identified many instances of decay, described hereunder; tracking those situations will help prevent damage in the respective scenarios. The research also contributes to preventive conservation through non-invasive monitoring techniques such as infrared thermography, protimeter-based surface damp measurements, high-resolution electrical resistivity surveys, and georadar analysis. The conclusions drawn address the advantages and drawbacks of each technique, its suitability for the various phases of analysis, and its capacity to detect or characterise damage. The dissertation also describes the intermeshed use of these techniques, which led to the identification of the origin of severe damp-induced damage in a real scenario.
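
As one concrete, hedged example of the kind of derived indicator such microclimate networks support (the formula is the standard Magnus approximation, not taken from the dissertation, and the records and threshold are invented): computing the dew point from logged temperature and relative humidity, and flagging records where a monitored wall surface approaches condensation.

```python
import math

def dew_point_c(temp_c, rh_pct):
    """Magnus approximation for the dew point (degrees C)."""
    a, b = 17.62, 243.12
    gamma = math.log(rh_pct / 100.0) + a * temp_c / (b + temp_c)
    return b * gamma / (a - gamma)

# Hypothetical hourly records: (air temp C, relative humidity %, wall temp C)
records = [(12.0, 85.0, 10.1), (14.5, 70.0, 12.9), (9.0, 92.0, 8.3)]

for t_air, rh, t_wall in records:
    td = dew_point_c(t_air, rh)
    at_risk = t_wall - td < 1.0          # wall within 1 C of the dew point
    print(f"air {t_air:5.1f} C, RH {rh:4.1f}% -> dew point {td:5.1f} C; "
          f"wall {t_wall:4.1f} C {'CONDENSATION RISK' if at_risk else 'ok'}")
```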

Relevance: 100.00%

Abstract:

Configuration tools based on high-level programming languages such as LabVIEW allow highly complex data acquisition systems based on reconfigurable FPGA hardware to be developed in a short period of time. The standardization of the hardware/software design cycle and the use of tools like EPICS ease integration with ITER's Linux-based data acquisition and control platform, the CODAC Core System (CCS). This project proposes a methodology that simplifies the full integration cycle of new platforms such as CompactRIO (cRIO), in which the functionality of the acquisition hardware can be reconfigured by the user to fit specific requirements. The main objective of this MSc final project is to integrate a cRIO NI-9159 system and its different analog and digital input/output modules with EPICS in a CCS. The CCS consists of a set of software tools that simplify the integration of instrumentation and control systems in the ITER experiment. To achieve this goal, the following tasks are carried out:
• Development of a DAQ system based on FPGA using the cRIO hardware platform. This task comprises the configuration of the system and the implementation, in LabVIEW for FPGA, of the hardware needed to communicate with the I/O modules NI9205, NI9264, NI9401, NI9477, NI9426, NI9425, and NI9476.
• Implementation of a software driver using the asynDriver methodology to integrate the cRIO system with EPICS. This task requires defining all the EPICS records needed and creating the appropriate interfaces that allow communication with the hardware.
• Description of the cRIO system and the EPICS driver in SDD, the ITER plant description tool, which automates the creation of the EPICS applications known as IOCs.
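
As a hedged illustration of the record-definition step (the record name, asyn port label, and address below are placeholders, not the project's actual database), an EPICS analog-input record bound to an asynDriver port might look like this:

```
# Hypothetical EPICS database entry: one analog input channel served by
# the asyn port registered by the cRIO driver (port name is a placeholder).
record(ai, "CRIO:AI0") {
    field(DTYP, "asynFloat64")                    # asynDriver float64 interface
    field(INP,  "@asyn(CRIO_PORT,0,1)ANALOG_IN")  # port, addr 0, 1 s timeout
    field(SCAN, "1 second")                       # periodic scan
    field(EGU,  "V")                              # engineering units
}
```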