938 results for alta risoluzione Trentino Alto Adige data-set climatologia temperatura giornaliera orografia complessa


Relevance:

100.00%

Publisher:

Abstract:

Visual cluster analysis provides valuable tools that help analysts understand large data sets in terms of representative clusters and the relationships among them. Often, the found clusters must be understood in the context of associated categorical, numerical, or textual metadata given for the data elements. While often not part of the clustering process itself, such metadata play an important role and need to be considered during interactive cluster exploration. Traditionally, linked views allow analysts to relate (or, loosely speaking, correlate) clusters with metadata or other properties of the underlying cluster data. Manually inspecting the distribution of metadata for each cluster in a linked-view approach is tedious, especially for large data sets, where a large search problem arises. Fully interactive search for potentially useful or interesting cluster-to-metadata relationships can be a cumbersome and lengthy process. To remedy this problem, we propose a novel approach for guiding users in discovering interesting relationships between clusters and associated metadata; its goal is to guide the analyst through the potentially huge search space. We focus on metadata of categorical type, which can be summarized for a cluster in the form of a histogram. Starting from a given visual cluster representation, we compute measures of interestingness defined on the distribution of metadata categories for the clusters. These measures are used to automatically score and rank the clusters for potential interestingness regarding the distribution of categorical metadata. Identified interesting relationships are highlighted in the visual cluster representation for easy inspection by the user. We present a system implementing an encompassing, yet extensible, set of interestingness scores for categorical metadata, which can also be extended to numerical metadata. Appropriate visual representations are provided for showing the visual correlations as well as the calculated ranking scores. Focusing on clusters of time series data, we test our approach on a large real-world data set of time-oriented scientific research data, demonstrating how specific interesting views are automatically identified, supporting the analyst in discovering interesting and visually understandable relationships.
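One concrete instance of such an interestingness measure (an illustrative choice, not necessarily one of the scores used by the authors) is a normalized-entropy score over a cluster's category histogram: clusters whose metadata concentrate in few categories rank high.

```python
import math

def interestingness(histogram):
    """Score a cluster's categorical metadata histogram in [0, 1].

    Uses 1 - normalized Shannon entropy: members concentrated in few
    categories (low entropy) score high; a uniform spread scores near 0.
    """
    total = sum(histogram.values())
    probs = [c / total for c in histogram.values() if c > 0]
    if len(probs) <= 1:
        return 1.0  # all members share one category: maximally skewed
    entropy = -sum(p * math.log2(p) for p in probs)
    return 1.0 - entropy / math.log2(len(probs))

# Rank clusters by how skewed their metadata distribution is.
clusters = {
    "A": {"sensor1": 48, "sensor2": 1, "sensor3": 1},    # concentrated
    "B": {"sensor1": 17, "sensor2": 17, "sensor3": 16},  # near-uniform
}
ranking = sorted(clusters, key=lambda k: interestingness(clusters[k]),
                 reverse=True)
```

The highest-ranked clusters would then be highlighted in the visual cluster representation for inspection.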

Relevance:

100.00%

Publisher:

Abstract:

Calcitic belemnite rostra are commonly used for paleoenvironmental studies based on geochemical data. However, several questions, such as their original porosity and microstructure, remain open, even though they are essential for making accurate interpretations based on geochemical analyses. This paper revisits and sheds light on some of these questions. Petrographic data demonstrate that calcite crystals of the rostrum solidum of belemnites grow from spherulites that successively develop along the apical line, resulting in a "regular spherulitic prismatic" microstructure. Radially arranged calcite crystals emerge and diverge from the spherulites: towards the apex, crystals grow until a new spherulite is formed; towards the external walls of the rostrum, the crystals become progressively bigger and prismatic. Adjacent crystals vary slightly in their c-axis orientation, resulting in undulose extinction. Concentric growth layering develops at different scales and is superimposed and traversed by a radial pattern, which produces the micro-fibrous texture observed in the calcite crystals of the rostra. Petrographic data demonstrate that single calcite crystals in the rostra have a composite nature, which strongly suggests that the belemnite rostra were originally porous. Single crystals consistently comprise two distinct zones or sectors in optical continuity: 1) the inner zone is fluorescent, has relatively low optical relief under transmitted light (TL) microscopy, a dark-grey color under backscattered electron microscopy (BSEM), a commonly triangular shape, a "patchy" appearance, and relatively high Mg and Na contents; 2) the outer sector is non-fluorescent, has relatively high optical relief under TL, a light-grey color under BSEM, and low Mg and Na contents. The inner, fluorescent sectors are interpreted to have formed first, as a product of biologically controlled mineralization during belemnite skeletal growth, and the non-fluorescent outer sectors as overgrowths of the former, filling the intra- and inter-crystalline porosity. This question has important implications for paleoenvironmental and/or paleoclimatic interpretations based on geochemical analyses of belemnite rostra. Finally, the petrographic features of composite calcite crystals in the rostra also point to non-classical crystallization of belemnite rostra, as previously suggested by other authors.

Relevance:

100.00%

Publisher:

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-08

Relevance:

100.00%

Publisher:

Abstract:

We analyze a real data set of reindeer fecal pellet-group counts obtained from a survey conducted in a forest area in northern Sweden. In the data set, over 70% of counts are zeros, and there is high spatial correlation. We use conditionally autoregressive random effects to model spatial correlation in a Poisson generalized linear mixed model (GLMM), a quasi-Poisson hierarchical generalized linear model (HGLM), a zero-inflated Poisson (ZIP) model, and hurdle models. The quasi-Poisson HGLM allows for both under- and overdispersion with excessive zeros, while the ZIP and hurdle models allow only for overdispersion. In analyzing the real data set, we see that quasi-Poisson HGLMs can perform better than other commonly used models, for example, ordinary Poisson HGLMs, spatial ZIP, and spatial hurdle models, and that underdispersed Poisson HGLMs with spatial correlation fit the reindeer data best. We develop R code for fitting these models using a unified algorithm for the HGLMs. Spatial count responses with an extremely high proportion of zeros and underdispersion can be successfully modeled using the quasi-Poisson HGLM with spatial random effects.
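As a minimal illustration of the diagnostics behind such a model choice (the simulated counts below are invented, not the reindeer data), one can inspect the zero fraction and the variance-to-mean dispersion index that distinguish Poisson from over- or underdispersed alternatives:

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy pellet-group counts: ~70% structural zeros plus a Poisson component,
# mimicking the zero-heavy survey data described above (illustrative only).
counts = np.where(rng.random(1000) < 0.7, 0, rng.poisson(3.0, 1000))

zero_fraction = np.mean(counts == 0)
# Dispersion index (variance/mean): >1 suggests overdispersion relative to
# a Poisson model, <1 underdispersion; excess zeros inflate it here.
dispersion = counts.var(ddof=1) / counts.mean()
```

A dispersion index far from 1 combined with a high zero fraction is what motivates ZIP, hurdle, or quasi-Poisson HGLM specifications over a plain Poisson GLMM.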

Relevance:

100.00%

Publisher:

Abstract:

This dissertation contains four essays that share a common purpose: developing new methodologies to exploit the potential of high-frequency data for the measurement, modeling, and forecasting of financial asset volatility and correlations. The first two chapters provide useful tools for univariate applications, while the last two chapters develop multivariate methodologies. In chapter 1, we introduce a new class of univariate volatility models named FloGARCH models. FloGARCH models provide a parsimonious joint model for low-frequency returns and realized measures, and are sufficiently flexible to capture long memory as well as asymmetries related to leverage effects. We analyze the performance of the models in a realistic numerical study and on the basis of a data set composed of 65 equities. Using more than 10 years of high-frequency transactions, we document significant statistical gains from the FloGARCH models in terms of in-sample fit, out-of-sample fit, and forecasting accuracy compared to classical and Realized GARCH models. In chapter 2, using 12 years of high-frequency transactions for 55 U.S. stocks, we argue that combining low-frequency exogenous economic indicators with high-frequency financial data improves the ability of conditionally heteroskedastic models to forecast the volatility of returns, their full multi-step-ahead conditional distribution, and the multi-period Value-at-Risk. Using a refined version of the Realized LGARCH model that allows for a time-varying intercept and is implemented with realized kernels, we document that nominal corporate profits and term spreads have strong long-run predictive ability and generate accurate risk-measure forecasts over long horizons. The results are based on several loss functions and tests, including the Model Confidence Set. Chapter 3 is joint work with David Veredas. We study the class of disentangled realized estimators for the integrated covariance matrix of Brownian semimartingales with finite-activity jumps. These estimators separate correlations and volatilities. We analyze different combinations of quantile- and median-based realized volatilities, and four estimators of realized correlations with three synchronization schemes. Their finite-sample properties are studied under four data-generating processes, in the presence and absence of microstructure noise, and under synchronous and asynchronous trading. The main finding is that the pre-averaged version of disentangled estimators based on Gaussian ranks (for the correlations) and median deviations (for the volatilities) provides a precise, computationally efficient, and easy alternative for measuring integrated covariances on the basis of noisy and asynchronous prices. Along these lines, a minimum-variance portfolio application shows the superiority of this disentangled realized estimator across numerous performance metrics. Chapter 4 is co-authored with Niels S. Hansen, Asger Lunde, and Kasper V. Olesen, all affiliated with CREATES at Aarhus University. We propose using the Realized Beta GARCH model to exploit the potential of high-frequency data in commodity markets. The model produces high-quality forecasts of pairwise correlations between commodities, which can be used to construct a composite covariance matrix. We evaluate the quality of this matrix in a portfolio context and compare it to models used in the industry. We demonstrate significant economic gains in a realistic setting including short-selling constraints and transaction costs.
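The realized measures these models build on can be sketched in a few lines: the sum of squared intraday returns estimates the day's integrated variance. The 5-minute sampling grid and volatility level below are illustrative assumptions, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(42)
# One trading day of 5-minute log returns (78 intervals) simulated with a
# "true" daily volatility of 1% -- purely illustrative numbers.
true_daily_vol = 0.01
returns = rng.normal(0.0, true_daily_vol / np.sqrt(78), size=78)

# Realized variance, the high-frequency measure fed to Realized GARCH-type
# models: the sum of squared intraday returns over the day.
realized_variance = np.sum(returns ** 2)
realized_vol = np.sqrt(realized_variance)
```

In practice these measures are computed from transaction prices and refined (e.g. with realized kernels) to mitigate microstructure noise, as the abstract notes.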

Relevance:

100.00%

Publisher:

Abstract:

In 2005, the University of Maryland acquired over 70 digital videos spanning 35 years of Jim Henson’s groundbreaking work in television and film. To support in-house discovery and use, the collection was cataloged in detail using AACR2 and MARC21, and a web-based finding aid was also created. In the past year, I created an "r-ball" (a linked data set described using RDA) of these same resources. The presentation will compare and contrast these three ways of accessing the Jim Henson Works collection, with insights gleaned from providing resource discovery using RIMMF (RDA in Many Metadata Formats).

Relevance:

100.00%

Publisher:

Abstract:

We analyze available heat flow data from the flanks of the Southeast Indian Ridge adjacent to or within the Australian-Antarctic Discordance (AAD), an area with patchy sediment cover and highly fractured seafloor dissected by ridge- and fracture-parallel faults. The data set includes 23 new data points collected along a 14-Ma-old isochron and 19 existing measurements from 20- to 24-Ma-old crust. Most measurement sites exhibit low heat flux (from 2 to 50 mW m(-2)) with near-linear temperature-depth profiles, except at a few sites where recent bottom-water temperature change may have caused nonlinearity toward the sediment surface. Because the igneous basement is expected to outcrop a short distance away from any measurement site, we hypothesize that horizontally channelized water circulation within the uppermost crust is the primary cause of the widespread low heat flow values. The process may be further influenced by vertical fluid flow along the numerous fault zones that crisscross the AAD seafloor. Systematic measurements along and across the fault zones of interest, as well as seismic profiling of sediment distribution, are required to confirm this suspected effect.
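The conductive heat flow implied by a near-linear temperature-depth profile follows Fourier's law, q = k dT/dz. A sketch with invented probe values (the conductivity and temperatures below are assumptions for illustration, not survey data):

```python
import numpy as np

# Hypothetical seafloor-probe readings: depths in m below seafloor,
# temperatures in deg C, forming a near-linear profile.
depth = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0])
temp = np.array([1.050, 1.065, 1.080, 1.095, 1.110, 1.125])

k = 1.0  # assumed sediment thermal conductivity, W m^-1 K^-1
gradient = np.polyfit(depth, temp, 1)[0]  # deg C per m, from a linear fit
heat_flow_mW = 1000.0 * k * gradient      # conductive heat flux, mW m^-2
```

A value like the ~30 mW m(-2) this toy profile yields falls in the low range reported above; hydrothermal circulation is inferred where such values sit well below the conductive prediction for the crustal age.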

Relevance:

100.00%

Publisher:

Abstract:

Pressure ulcers represent a major public health problem and have a substantial economic impact on health systems. Most studies on preventing pressure ulcers have been carried out in hospital settings using hyperoxygenated fatty acids (AGHO), and to date no specific study has been conducted with extra virgin olive oil (AOVE) in the home setting.

Material and methods. Primary objective: to assess whether AOVE is not inferior to AGHO in preventing pressure ulcers (PU) in immobilized patients in the home setting. Design: multicentre, parallel, triple-blind, non-inferiority randomized clinical trial. Setting: patients attending Andalusian health centres. Sample: 831 immobilized patients at risk of PU.

Results. The follow-up period was 16 weeks. In the per-protocol analysis, none of the areas evaluated showed differences in pressure ulcer incidence risk exceeding the established delta value (10%). Sacrum: AOVE 8 (2.55%) vs AGHO 8 (3.08%), RAR 0.53 (-2.2 to 3.26). Right heel: AOVE 4 (1.27%) vs AGHO 5 (1.92%), RAR 0.65 (-1.43 to 2.73). Left heel: AOVE 3 (0.96%) vs AGHO 3 (1.15%), RAR 0.20 (-1.49 to 1.88). Right trochanter: AOVE 0 (0%) vs AGHO 4 (1.54%), RAR 1.54 (0.04 to 3.03). Left trochanter: AOVE 1 (0.32%) vs AGHO 1 (0.38%), RAR 0.07 (-0.91 to 1.04).

Discussion. This clinical trial set out to assess whether preventing PU with an AOVE formulation was not inferior to prevention with AGHO in high-risk immobilized patients in the home setting. The results demonstrate this non-inferiority, as no differences exceeded the lower limit of the confidence interval, making olive oil an effective product for preventing PU in this type of patient. Further studies are needed to investigate the mechanism of action of AOVE in preventing PU and to relate it to the etiopathogenesis of these lesions.
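The reported absolute risk reductions (RAR) follow directly from the event counts and group percentages. For the sacrum figures above, the per-protocol group sizes (314 and 260 below) are back-calculated from the reported percentages and are therefore approximate assumptions:

```python
# Sacrum result: 8 ulcers in the AOVE arm vs 8 in the AGHO arm.
# Group sizes are inferred from the reported 2.55% and 3.08% rates.
events_aove, n_aove = 8, 314
events_agho, n_agho = 8, 260

risk_aove = 100.0 * events_aove / n_aove   # ~2.55%
risk_agho = 100.0 * events_agho / n_agho   # ~3.08%
rar = risk_agho - risk_aove                # ~0.53 percentage points
```

Non-inferiority holds because each RAR confidence interval stays above the prespecified -10% delta margin.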

Relevance:

100.00%

Publisher:

Abstract:

This dissertation comprises three chapters. The first chapter motivates the use of a novel data set combining survey and administrative sources for the study of internal labor migration. By following a sample of individuals from the American Community Survey (ACS) across their employment outcomes over time according to the Longitudinal Employer-Household Dynamics (LEHD) database, I construct a measure of geographic labor mobility that allows me to exploit information about individuals prior to their move. This enables me to explore aspects of the migration decision, such as homeownership and employment status, in ways that have not previously been possible. In the second chapter, I use this data set to test the theory that falling home prices affect a worker’s propensity to take a job in a different metropolitan area from where he is currently located. Employing a within-CBSA and time estimation that compares homeowners to renters in their propensities to relocate for jobs, I find that homeowners who have experienced declines in the nominal value of their homes are approximately 12% less likely than average to take a new job in a location outside of the metropolitan area where they currently reside. This evidence is consistent with the hypothesis that housing lock-in has contributed to the decline in labor mobility of homeowners during the recent housing bust. The third chapter focuses on a sample of unemployed workers in the same data set, in order to compare the unemployment durations of those who find subsequent employment by relocating to a new metropolitan area, versus those who find employment in their original location. Using an instrumental variables strategy to address the endogeneity of the migration decision, I find that out-migrating for a new job significantly reduces the time to re-employment. These results stand in contrast to OLS estimates, which suggest that those who move have longer unemployment durations. 
This implies that those who migrate for jobs in the data may be particularly disadvantaged in their ability to find employment, and thus have strong short-term incentives to relocate.

Relevance:

100.00%

Publisher:

Abstract:

The paper catalogues the procedures and steps involved in agroclimatic classification, which range from conventional descriptive methods to modern computer-based numerical techniques. There are three mutually independent numerical classification techniques, namely ordination, cluster analysis, and the minimum spanning tree, and under each technique several forms of grouping exist. The choice of numerical classification procedure differs with the type of data set. For numerical continuous data sets with both positive and negative values, the simplest and least controversial procedures are the unweighted pair-group method (UPGMA) and the weighted pair-group method (WPGMA) under clustering techniques, with a similarity measure obtained from either the Gower metric or the standardized Euclidean metric. Where the number of attributes is large, they can be reduced to fewer new attributes defined by the principal components or coordinates obtained by ordination. The first few components or coordinates explain the maximum variance in the data matrix. These derived attributes are less affected by noise in the data set. It is possible to check misclassifications using the minimum spanning tree.
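A minimal UPGMA run in the spirit described above, standardized Euclidean distances with 'average' linkage, can be sketched with SciPy on toy data (the station-by-attribute matrix below is invented, not from the paper):

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

rng = np.random.default_rng(1)
# Toy "stations x climate attributes" matrix: two groups of sites with
# distinct attribute profiles, standing in for agroclimatic data.
X = np.vstack([rng.normal(0.0, 0.3, (5, 4)),
               rng.normal(3.0, 0.3, (5, 4))])

# Standardize attributes, then cluster with UPGMA
# (SciPy's method="average" is the unweighted pair-group method).
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
Z = linkage(pdist(Xs, metric="euclidean"), method="average")
labels = fcluster(Z, t=2, criterion="maxclust")
```

Swapping `method="average"` for `method="weighted"` gives WPGMA; a Gower similarity would replace the Euclidean distances for mixed-type attributes.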

Relevance:

100.00%

Publisher:

Abstract:

Mestrado Vinifera Euromaster - Instituto Superior de Agronomia - UL

Relevance:

100.00%

Publisher:

Abstract:

Physiological and genetic information has been critical to the successful diagnosis and prognosis of complex diseases. In this paper, we introduce a support-confidence-correlation framework to accurately discover truly meaningful and interesting association rules between complex physiological and genetic data for disease factor analysis, such as type II diabetes (T2DM). We propose a novel Multivariate and Multidimensional Association Rule mining system based on Change Detection (MMARCD). Given a complex data set u_i (e.g. u_1 numerical data streams, u_2 images, u_3 videos, u_4 DNA/RNA sequences) observed at each time tick t, MMARCD incrementally finds correlations and hidden variables that summarise the key relationships across the entire system. Based upon MMARCD, we are able to construct a correlation network for human diseases. © 2012 Springer-Verlag.
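The support-confidence-correlation triple for a candidate rule can be computed directly. The toy transactions and item names below are hypothetical, purely to show the arithmetic, and lift stands in here for the correlation measure:

```python
# Hypothetical patient records: which markers co-occur per patient.
transactions = [
    {"high_glucose", "obesity", "T2DM"},
    {"high_glucose", "T2DM"},
    {"obesity"},
    {"high_glucose", "obesity", "T2DM"},
    {"high_glucose"},
]

def rule_metrics(antecedent, consequent, db):
    """Support, confidence, and lift of the rule antecedent -> consequent."""
    n = len(db)
    both = sum(1 for t in db if antecedent <= t and consequent <= t)
    a = sum(1 for t in db if antecedent <= t)
    c = sum(1 for t in db if consequent <= t)
    support = both / n
    confidence = both / a
    lift = confidence / (c / n)  # >1 indicates positive association
    return support, confidence, lift

support, confidence, lift = rule_metrics({"high_glucose"}, {"T2DM"},
                                         transactions)
```

A rule is retained only when all three measures clear their thresholds, which filters out rules that are frequent but not actually correlated.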

Relevance:

100.00%

Publisher:

Abstract:

Clustering of big data has received much attention recently. In this paper, we present the new clusiVAT algorithm and compare it with four other popular data clustering algorithms. Three of the four comparison methods are based on the well-known, classical batch k-means model. Specifically, we use k-means, single-pass k-means, online k-means, and clustering using representatives (CURE) for numerical comparisons. clusiVAT is based on sampling the data, imaging the reordered distance matrix to estimate the number of clusters in the data visually, clustering the samples using a relative of single linkage (SL), and then non-iteratively extending the labels to the rest of the data set using the nearest prototype rule. Previous work has established that clusiVAT produces true SL clusters in compact-separated data. We have performed experiments showing that k-means and its modified algorithms suffer from initialization issues that cause many failures. On the other hand, clusiVAT needs no initialization, and almost always finds partitions that accurately match ground truth labels in labeled data. CURE also finds SL-type partitions but is much slower than the other four algorithms. In our experiments, clusiVAT proves to be the fastest and most accurate of the five algorithms; e.g., it recovers 97% of the ground truth labels in the real-world KDD-99 cup data (4,292,637 samples in 41 dimensions) in 76 s.
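The final, non-iterative step of clusiVAT, extending sample labels to the full data set by the nearest prototype rule, can be sketched as follows (synthetic two-cluster data, not the KDD-99 set):

```python
import numpy as np

rng = np.random.default_rng(7)
# Full data set: two well-separated Gaussian blobs of 100 points each.
full = np.vstack([rng.normal(0.0, 0.5, (100, 2)),
                  rng.normal(5.0, 0.5, (100, 2))])

# Prototypes found on the small sample (e.g. sample-cluster means) and
# their labels from the single-linkage step.
prototypes = np.array([[0.0, 0.0], [5.0, 5.0]])
proto_labels = np.array([0, 1])

# Nearest prototype rule: assign each point the label of its closest
# prototype -- a single vectorized pass, no iteration or initialization.
dists = np.linalg.norm(full[:, None, :] - prototypes[None, :, :], axis=2)
labels = proto_labels[np.argmin(dists, axis=1)]
```

Because this pass is O(n) in the data size, the expensive visual/SL clustering only ever touches the small sample, which is what makes the scheme fast on millions of points.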

Relevance:

100.00%

Publisher:

Abstract:

A crosswell data set contains a range of angles limited only by the geometry of the source and receiver configuration, the separation of the boreholes, and the depth to the target. However, the wide-angle reflections present in crosswell imaging produce amplitude-versus-angle (AVA) features not usually observed in surface data. These features include reflections from angles that are near critical and beyond critical for many of the interfaces; some of these reflections are visible only for a small range of angles, presumably near their critical angle. High-resolution crosswell seismic surveys were conducted over a Silurian (Niagaran) reef at two fields in northern Michigan, Springdale and Coldspring. The Springdale wells extended to much greater depths than the reef, and imaging was conducted from above and from beneath the reef. Combining the results from images obtained from above with those from beneath provides additional information: first, by exhibiting ranges of angles that differ between the two images, especially for reflectors at shallow depths, and second, by providing additional constraints on the solutions of the Zoeppritz equations. Inversion of seismic data for impedance has become a standard part of the workflow for quantitative reservoir characterization. Inversion of crosswell data using either deterministic or geostatistical methods can, however, give poor results where phase changes beyond the critical angle occur, so simultaneous pre-stack inversion of partial angle stacks may be best conducted with angles restricted to less than critical. Deterministic inversion yields only a single best-fit model of elastic properties, while geostatistical inversion produces multiple models (realizations) of elastic properties, lithology, and reservoir properties, with far more detail than deterministic inversion. The difference in detail between the two types of inversion becomes increasingly pronounced for thinner reservoirs, particularly those beyond the vertical resolution of the seismic. For any interface imaged from above and from beneath, the resulting AVA characters must arise from identical contrasts in elastic properties in the two sets of images, albeit in reverse order. An inversion approach that handles both data sets simultaneously, at pre-critical angles, is demonstrated in this work. The main exploration problem for carbonate reefs is determining the porosity distribution. Images of elastic properties, obtained from deterministic and geostatistical simultaneous inversion of a high-resolution crosswell seismic survey, were used to obtain the internal structure and reservoir properties (porosity) of a Niagaran Michigan reef. The images obtained are the best of any Niagaran pinnacle reef to date.
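The critical angle that bounds the usable pre-critical range follows from Snell's law, theta_c = arcsin(v1/v2); the velocities below are illustrative assumptions, not values from the survey:

```python
import math

# P-wave velocities above (v1) and below (v2) a hypothetical interface, m/s.
# Beyond theta_c the transmitted wave becomes evanescent and reflection
# amplitudes undergo phase changes, which is why the pre-stack inversion
# described above restricts angle stacks to pre-critical angles.
v1 = 3500.0
v2 = 4500.0

theta_c = math.degrees(math.asin(v1 / v2))  # ~51 degrees for these values

def is_precritical(incidence_deg):
    """True if an incidence angle lies inside the invertible range."""
    return incidence_deg < theta_c
```

A critical angle only exists when v2 > v1; for a velocity decrease across the interface, all crosswell angles remain pre-critical.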

Relevance:

100.00%

Publisher:

Abstract:

Recent marine long-offset transient electromagnetic (LOTEM) measurements yielded the offshore delineation of a fresh groundwater body beneath the seafloor in the region of Bat Yam, Israel. The LOTEM application was effective in detecting this freshwater body underneath the Mediterranean Sea and allowed an estimation of its seaward extent. However, the measured data set was insufficient to establish the hydrogeological configuration and the mechanism controlling the occurrence of this fresh groundwater discovery. In particular, the lateral geometry of the freshwater boundary, important for hydrogeological modelling, could not be resolved. Without such an understanding, rational management of this unexploited groundwater reservoir is not possible. Two new high-resolution marine time-domain electromagnetic methods are theoretically developed to derive the hydrogeological structure of the western aquifer boundary. The first is called the Circular Electric Dipole (CED). It is the land-based analogue of the Vertical Electric Dipole (VED), which is commonly applied to detect resistive structures in the subsurface. Although the CED shows exceptional detectability characteristics in the step-off signal towards the sub-seafloor freshwater body, an actual application was not carried out within the scope of this study. The method was found to suffer from insufficient signal strength to adequately delineate the resistive aquifer under realistic noise conditions. Moreover, modelling studies demonstrated that severe signal distortions are caused by the slightest geometrical inaccuracies. As a result, a successful application of the CED in Israel proved rather doubtful. A second method, called the Differential Electric Dipole (DED), is developed as an alternative to the intended CED method.
Compared to the conventional marine time-domain electromagnetic system, which commonly applies a horizontal electric dipole transmitter, the DED is composed of two horizontal electric dipoles in an in-line configuration that share a common central electrode. Theoretically, the DED has detectability and resolution characteristics similar to those of the conventional LOTEM system, but its superior lateral resolution towards multi-dimensional resistivity structures makes an application desirable. Furthermore, the method is less susceptible to geometrical errors, making an application in Israel feasible. Within this thesis, the novel marine DED method is substantiated using several one-dimensional (1D) and multi-dimensional (2D/3D) modelling studies. The main emphasis lies on the application in Israel. Preliminary resistivity models are derived from the previous marine LOTEM measurements and tested for a DED application. The DED method is effective in locating the two-dimensional resistivity structure at the western aquifer boundary. Moreover, a prediction regarding the hydrogeological boundary conditions is feasible, provided a brackish-water zone exists at the head of the interface. A seafloor-based DED transmitter/receiver system was designed and built at the Institute of Geophysics and Meteorology at the University of Cologne. The first DED measurements were carried out in Israel in April 2016. The acquired data set is the first of its kind. The measured data were processed and subsequently interpreted using 1D inversion. The intended aim of interpreting both step-on and step-off signals failed owing to the insufficient data quality of the latter. Yet the 1D inversion models of the DED step-on signals clearly detect the freshwater body for receivers located close to the Israeli coast. Additionally, a lateral resistivity contrast observable in the 1D inversion models allows the seaward extent of this freshwater body to be constrained.
A large-scale 2D modelling study followed the 1D interpretation. In total, 425,600 forward calculations were conducted to find a sub-seafloor resistivity distribution that adequately explains the measured data. The results indicate that the western aquifer boundary is located 3600 m to 3700 m off the coast. Moreover, a brackish-water zone of 3 Ωm to 5 Ωm with a lateral extent of less than 300 m is likely located at the head of the freshwater aquifer. Based on these results, it is predicted that the sub-seafloor freshwater body is indeed open to the sea and may be vulnerable to seawater intrusion.
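The depth sensitivity of such time-domain measurements is often summarized by a diffusion-depth scaling: after transmitter turn-off, the induced current system reaches a depth of roughly d(t) ~ sqrt(2 t rho / mu0), so later times sense deeper resistivity. A rough sketch; the measurement time and the exact prefactor are illustrative assumptions, not survey parameters from the study:

```python
import math

mu0 = 4.0e-7 * math.pi  # vacuum magnetic permeability, H/m
rho = 3.0               # ohm-m, at the low end of the brackish-zone range above
t = 0.1                 # s after turn-off (assumed recording time)

# Approximate diffusion depth of the transient "smoke ring" of current.
depth = math.sqrt(2.0 * t * rho / mu0)  # metres, order-of-magnitude estimate
```

Higher resistivity or later recording times push this depth down, which is why a resistive freshwater body extends the depth range a transient system can interrogate.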