545 results for Smoothing


Relevance:

10.00%

Publisher:

Abstract:

Climate monitoring requires an operational, spatio-temporal analysis of climate variability. With the aim of producing ready-to-use maps on a regular basis, it is helpful to depict at a glance the spatial variability of the climate elements and their changes over time. For current and recently past years, the German Meteorological Service (Deutscher Wetterdienst) developed a standard procedure for producing such maps. The method of producing them varies across the different climate elements, depending on the underlying data, the natural variability, and the availability of in-situ data.

As part of the analysis of spatio-temporal variability in this dissertation, various interpolation methods are applied to the mean temperature of the five decades of the years 1951-2000 for a relatively large area, Region VI of the World Meteorological Organization (Europe and the Middle East). Climatologically, the region covers a relatively heterogeneous study area extending from Greenland in the northwest to Syria in the southeast.

The central goal of the dissertation is to develop a method for the spatial interpolation of decadal mean temperature values for Region VI. This method should be suitable in the future for operational monthly climate map production. The unified method should be transferable to other climate elements and, with the appropriate software, applicable anywhere. Two central databases are used in this dissertation: so-called CLIMAT data over land and ship data over the sea.

The transfer of the point temperature values to the surface by spatial interpolation is essentially carried out in three steps. The first step involves a multiple regression that reduces the station values to a uniform level using the four predictors geographic latitude, elevation above sea level, annual temperature amplitude, and thermal continentality. In the second step, the reduced temperature values, so-called residuals, are interpolated with the radial basis function method from the family of neural network models (NNM). In the last step, the interpolated temperature grids are projected back to their original level by inverting the multiple regression of step one with the help of the four predictors.

For all station values, the difference between the value estimated by the interpolation and the true measured value is computed and reported via the geostatistical measure of the root mean square error (RMSE). The central advantages are the faithful reproduction of values, the absence of generalization, and the avoidance of interpolation islands. The developed procedure is transferable to other climate elements such as precipitation, snow depth, or sunshine duration.
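The three-step procedure lends itself to a compact illustration. The following Python sketch mirrors the reduce-interpolate-restore pipeline on synthetic station data; the predictor values, the scikit-learn/SciPy tooling, and the target grid are illustrative assumptions, not the Deutscher Wetterdienst implementation.

```python
# Minimal sketch of the three-step interpolation: regression reduction,
# RBF interpolation of the residuals, inverse regression on the grid.
# All data below are synthetic placeholders.
import numpy as np
from sklearn.linear_model import LinearRegression
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(0)
n = 200
coords = rng.uniform([-60.0, 25.0], [60.0, 85.0], size=(n, 2))  # lon, lat
predictors = np.column_stack([
    coords[:, 1],                 # geographic latitude
    rng.uniform(0, 2000, n),      # elevation above sea level [m]
    rng.uniform(10, 40, n),       # annual temperature amplitude [K]
    rng.uniform(0, 100, n),       # thermal continentality index
])
temps = rng.normal(10, 8, n)      # decadal mean station temperatures [°C]

# Step 1: multiple regression reduces station values to a uniform level.
reg = LinearRegression().fit(predictors, temps)
residuals = temps - reg.predict(predictors)

# Step 2: interpolate the residuals with radial basis functions.
rbf = RBFInterpolator(coords, residuals)

# Step 3: invert the regression on the target grid to restore the level.
grid = np.column_stack([np.linspace(-60, 60, 50), np.full(50, 50.0)])
grid_predictors = np.column_stack([
    grid[:, 1], np.zeros(50), np.full(50, 20.0), np.full(50, 50.0)
])
grid_temps = rbf(grid) + reg.predict(grid_predictors)
```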

Relevance:

10.00%

Publisher:

Abstract:

Ocean acidification, a consequence of climate change, is still a poorly understood process. Naturally acidified environments, regarded as open-air laboratories, can be used to study this phenomenon. The aim of this thesis was to use the fumaroles of the island of Ischia to investigate the dynamics of acidification processes and to analyse possible interactions between pH and meteorological conditions. The data, provided by the Stazione Zoologica "Anton Dohrn" of Naples, were continuously recorded pH and wind series from two areas, north and south of the islet of the Castello Aragonese, and from three stations along an acidification gradient. The work proceeded in steps, with the result of each analysis suggesting the type and method of the analyses that followed. First, the two series were analysed individually to obtain their most salient parameters. The data were then correlated with each other to estimate the influence of the wind on pH. Overall, acidification was shown to be correlated with the wind, but the response appears to be site-specific, depending on other factors that interact at the local scale, such as the geomorphology of the territory, the marine currents, and the bathymetry of the sea floor. It also proved difficult to find clear correlations between the two series, which are very complex, owing both to the large number of zeros in the wind series and to the strong natural variability of pH across the stations examined. In general, this work demonstrates how to apply time series analysis techniques, and how regression, autocorrelation, cross-correlation, and smoothing methods can complement models that incorporate exogenous variables alongside the variable of interest.
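As a concrete illustration of the series techniques named above, the following Python sketch cross-correlates a synthetic wind series (with many zeros, mimicking calm periods) against a synthetic pH series; the data and the lag structure are invented, not the Ischia measurements.

```python
# Illustrative cross-correlation between synthetic pH and wind series.
import numpy as np
from scipy.signal import correlate

rng = np.random.default_rng(1)
wind = np.clip(rng.normal(3, 2, 1000), 0, None)   # zeros during calm periods
ph = 8.0 - 0.05 * np.roll(wind, 6) + rng.normal(0, 0.02, 1000)  # lagged response

# Standardize, then cross-correlate to locate the lag of maximum coupling.
w = (wind - wind.mean()) / wind.std()
p = (ph - ph.mean()) / ph.std()
xcorr = correlate(p, w, mode="full") / len(w)
lags = np.arange(-len(w) + 1, len(w))
print("lag of strongest (anti)correlation:", lags[np.argmax(np.abs(xcorr))])
```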

Relevance:

10.00%

Publisher:

Abstract:

Statistical shape models (SSMs) have been used widely as a basis for segmenting and interpreting complex anatomical structures. The robustness of these models is sensitive to the registration procedure, i.e., the establishment of a dense correspondence across a training data set. In this work, two SSMs based on the same training data set of scoliotic vertebrae but different registration procedures were compared. The first model was constructed from the original binary masks without any image pre- or post-processing, and the second was obtained by applying a feature-preserving smoothing method to the original training data set, followed by a standard rasterization algorithm. The accuracy of the correspondences was assessed quantitatively by means of the maximum of the mean minimum distance (MMMD) and the Hausdorff distance (HD). The anatomical validity of the models was quantified by means of three criteria: compactness, specificity, and model generalization ability. The objective of this study was to compare quasi-identical models using standard metrics. Preliminary results suggest that the MMMD and the eigenvalues are not sensitive metrics for evaluating the performance and robustness of SSMs.
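For readers unfamiliar with the two correspondence metrics, the following sketch computes a mean minimum distance and a symmetric Hausdorff distance on hypothetical point sets; the toy shapes stand in for the vertebra surfaces and are not the study's data.

```python
# Toy computation of MMMD and Hausdorff distance between point sets.
import numpy as np
from scipy.spatial.distance import cdist

def mean_min_distance(a, b):
    """Mean over points in a of the distance to the nearest point in b."""
    return cdist(a, b).min(axis=1).mean()

def hausdorff(a, b):
    """Symmetric Hausdorff distance between point sets a and b."""
    d = cdist(a, b)
    return max(d.min(axis=1).max(), d.min(axis=0).max())

rng = np.random.default_rng(2)
shapes = [rng.normal(size=(100, 3)) for _ in range(5)]  # toy surfaces
reference = shapes[0]

# MMMD: maximum over shapes of the mean minimum distance to the reference.
mmmd = max(mean_min_distance(s, reference) for s in shapes[1:])
hd = max(hausdorff(s, reference) for s in shapes[1:])
print(f"MMMD={mmmd:.3f}, HD={hd:.3f}")
```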

Relevance:

10.00%

Publisher:

Abstract:

Osteoarticular allograft transplantation is a popular treatment method for wide surgical resections with large defects, and for this reason hospitals are building bone data banks. Selecting the optimal allograft from a bone bank is crucial to the surgical outcome and patient recovery; however, current approaches are very time-consuming, hindering an efficient selection. We present an automatic method based on the registration of femur bones to overcome this limitation. We introduce a new regularization term for the log-domain demons algorithm that replaces the standard Gaussian smoothing with a femur-specific polyaffine model. The polyaffine femur model is constructed from two affine transformations (femoral head and condyles) and one rigid transformation (shaft). Our main contribution in this paper is to show that the demons algorithm can be improved in specific cases with an appropriate model. We are not trying to find the optimal polyaffine model of the femur, but the simplest model with a minimal number of parameters. There is no need to optimize over the number of regions, the boundaries, or the choice of weights, since this fine tuning is done automatically by a final demons relaxation step with Gaussian smoothing. The newly developed approach provides a clear, anatomically motivated modeling contribution through the specific three-component transformation model, and shows a performance improvement (in terms of anatomically meaningful correspondences) on 146 CT images of femurs compared with a standard multiresolution demons. In addition, this simple model improves the robustness of the demons algorithm while preserving its accuracy. The ground truth consists of manual measurements performed by medical experts.
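The polyaffine idea, blending a small number of regional transformations with smooth weights, can be sketched in a few lines. The following one-dimensional Python toy uses hypothetical region centres and transformation parameters along the femur axis; it illustrates the weighting scheme only, not the authors' log-domain demons implementation.

```python
# Toy 1-D polyaffine fusion: three regional transforms (head, shaft,
# condyles) blended with smooth Gaussian weights that sum to one.
import numpy as np

z = np.linspace(0.0, 1.0, 200)   # normalized position along the femur axis
centers = {"head": 0.1, "shaft": 0.5, "condyles": 0.9}   # assumed anchors

raw = {k: np.exp(-0.5 * ((z - c) / 0.15) ** 2) for k, c in centers.items()}
total = sum(raw.values())
weights = {k: w / total for k, w in raw.items()}          # partition of unity

# Per-region 1-D "affine" velocities v(z) = a*z + b; the shaft is rigid (a=0).
params = {"head": (0.3, -0.02), "shaft": (0.0, 0.01), "condyles": (-0.2, 0.15)}
velocity = sum(weights[k] * (a * z + b) for k, (a, b) in params.items())
```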

Relevance:

10.00%

Publisher:

Abstract:

Three comprehensive one-dimensional simulators were used on the same PC to simulate the dynamics of different electrophoretic configurations, including two migrating hybrid boundaries, an isotachophoretic boundary, and the zone-electrophoretic separation of ten monovalent anions. Two simulators, SIMUL5 and GENTRANS, use a uniform grid, while SPRESSO uses a dynamic adaptive grid. The simulators differ in the way components are handled: SIMUL5 and SPRESSO feature one equation for all components, whereas GENTRANS is based on separate modules for the different types of monovalent components, a module for multivalent components, and a module for proteins. The code for multivalent components executes more slowly than that for monovalent components. Furthermore, with SIMUL5 the computational time interval becomes smaller when it is operated with a reduced calculation space featuring moving borders, whereas GENTRANS offers the possibility of data smoothing (removal of negative concentrations), which can avoid numerical oscillations and speed up a simulation. SPRESSO with its adaptive grid can simulate the same configurations with fewer grid points and is thus faster in certain, but not all, cases. The data reveal that simulations featuring a large number of monovalent components, distributed such that a dense mesh is required throughout a large proportion of the column, are executed fastest with GENTRANS.
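The data-smoothing step mentioned for GENTRANS, removal of negative concentrations, can be illustrated with a minimal mass-conserving clip; this sketch conveys the general idea only and is not the GENTRANS code.

```python
# Clip negative grid values produced by numerical oscillation and rescale
# so that the total (signed) mass on the grid is conserved.
import numpy as np

def remove_negative_concentrations(c):
    """Zero out negative grid values and rescale to conserve total mass."""
    total = c.sum()
    c = np.clip(c, 0.0, None)
    if c.sum() > 0:
        c *= total / c.sum()
    return c

profile = np.array([0.0, -0.01, 0.12, 0.30, 0.12, -0.01, 0.0])
print(remove_negative_concentrations(profile))
```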

Relevance:

10.00%

Publisher:

Abstract:

A new technique for on-line, high-resolution isotopic analysis of liquid water, tailored for ice core studies, is presented. We built an interface between a Wavelength Scanned Cavity Ring Down Spectrometer (WS-CRDS) purchased from Picarro Inc. and a Continuous Flow Analysis (CFA) system. The system offers the possibility of performing simultaneous water isotopic analysis of δ18O and δD on a continuous stream of liquid water generated from a continuously melted ice rod. Injection of sub-μl amounts of liquid water is achieved by pumping the sample through a fused silica capillary and instantaneously vaporizing it with 100% efficiency in a home-made oven at a temperature of 170 °C. A calibration procedure allows for proper reporting of the data on the VSMOW-SLAP scale. We apply the necessary corrections based on the assessed performance of the system regarding instrumental drifts and the dependence on the water concentration in the optical cavity. The melt rates are monitored in order to assign a depth scale to the measured isotopic profiles. Application of spectral methods yields a combined uncertainty of the system below 0.1‰ and 0.5‰ for δ18O and δD, respectively. This performance is comparable to that achieved with mass spectrometry. Dispersion of the sample in the transfer lines limits the temporal resolution of the technique; in this work we investigate and assess these dispersion effects. Using an optimal filtering method, we show how the measured profiles can be corrected for the smoothing effects resulting from sample dispersion. Considering the significant advantages the technique offers, i.e., simultaneous measurement of δ18O and δD, potentially in combination with chemical components traditionally measured on CFA systems, and a notable reduction in analysis time and power consumption, we consider it an alternative to traditional isotope ratio mass spectrometry, with the possibility of deployment for field ice core studies. We present data acquired in the field during the 2010 season as part of the NEEM deep ice core drilling project in North Greenland.
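The correction for dispersion smoothing is in essence a deconvolution. The sketch below applies a Wiener-type optimal filter to a synthetic profile; the Gaussian transfer function, mixing length, and signal-to-noise ratio are assumptions for illustration, not the values derived in this work.

```python
# Wiener-type deconvolution of dispersion smoothing on a synthetic profile.
import numpy as np

n = 512
depth = np.arange(n) * 0.005                       # 5 mm per sample [m]
true_signal = np.sin(2 * np.pi * depth / 0.2)      # assumed 20 cm cycle

# Dispersion modeled as convolution with a Gaussian impulse response.
sigma = 0.02                                       # assumed mixing length [m]
freqs = np.fft.rfftfreq(n, d=0.005)
transfer = np.exp(-2 * (np.pi * freqs * sigma) ** 2)  # Gaussian transfer fn

measured = np.fft.irfft(np.fft.rfft(true_signal) * transfer, n)
measured += np.random.default_rng(3).normal(0, 0.05, n)

# Wiener filter: amplify attenuated frequencies, damped by an assumed
# signal-to-noise ratio so measurement noise is not blown up.
snr = 100.0
wiener = np.conj(transfer) / (np.abs(transfer) ** 2 + 1.0 / snr)
restored = np.fft.irfft(np.fft.rfft(measured) * wiener, n)
```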

Relevance:

10.00%

Publisher:

Abstract:

The present study assessed the effects of abrasion, salivary proteins, and measurement angle on the quantification of early dental erosion by the analysis of reflection intensities from enamel. Enamel from 184 caries-free human molars was used for in vitro erosion in citric acid (pH 3.6). Abrasion of the eroded enamel resulted in a 6% to 14% increase in specular reflection intensity compared with enamel that was only eroded, with the increase depending on the degree of erosion. Nevertheless, monitoring of early erosion by reflection analysis was possible even in abraded, eroded teeth. The presence of the salivary pellicle induced up to 22% higher reflection intensities owing to the smoothing of the eroded enamel by the adhered proteins. However, this measurement artifact could be significantly reduced (p<0.05) by removing the pellicle layer with a 3% NaOCl solution. Changing the measurement angle from 45 to 60 deg did not improve the sensitivity of the analysis at late erosion stages. The applicability of the method for monitoring the remineralization of eroded enamel remained unclear in a demineralization/remineralization cycling model of early dental erosion in vitro.

Relevance:

10.00%

Publisher:

Abstract:

When different markers are responsive to different aspects of a disease, a combination of multiple markers can provide a better screening test for early detection. It is also reasonable to assume that the risk of disease changes smoothly as the biomarker values change and that the change in risk is monotone with respect to each biomarker. In this paper, we propose a boundary-constrained tensor-product B-spline method to estimate the risk of disease by maximizing a penalized likelihood. To choose the optimal amount of smoothing, two scores are proposed, which extend the GCV score (O'Sullivan et al. (1986)) and the GACV score (Xiang and Wahba (1996)) to incorporate linear constraints. Simulation studies are carried out to investigate the performance of the proposed estimator and the selection scores. In addition, sensitivities and specificities based on approximate leave-one-out estimates are proposed to generate more realistic ROC curves. Data from a pancreatic cancer study are used for illustration.
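The role of the GCV score in choosing the amount of smoothing can be seen in a stripped-down setting. The sketch below fits a one-dimensional penalized B-spline (no constraints, no tensor products, unlike the paper's method) and selects the penalty by GCV; the data, knots, and penalty order are illustrative assumptions.

```python
# Penalized B-spline smoothing with GCV selection of the penalty (1-D toy).
import numpy as np
from scipy.interpolate import BSpline

rng = np.random.default_rng(4)
x = np.sort(rng.uniform(0, 1, 150))
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.3, 150)

# Cubic B-spline basis on equally spaced knots.
k, n_knots = 3, 20
knots = np.concatenate([[0.0] * k, np.linspace(0, 1, n_knots), [1.0] * k])
n_basis = len(knots) - k - 1
B = BSpline.design_matrix(x, knots, k).toarray()

# Second-order difference penalty on the spline coefficients.
D = np.diff(np.eye(n_basis), n=2, axis=0)
P = D.T @ D

def gcv(lam):
    """GCV score: n * RSS / (n - trace(hat matrix))^2."""
    H = B @ np.linalg.solve(B.T @ B + lam * P, B.T)
    resid = y - H @ y
    return len(y) * (resid @ resid) / (len(y) - np.trace(H)) ** 2

lams = 10.0 ** np.arange(-6, 3)
best = lams[np.argmin([gcv(l) for l in lams])]
coef = np.linalg.solve(B.T @ B + best * P, B.T @ y)
fit = B @ coef
```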

Relevance:

10.00%

Publisher:

Abstract:

We investigate the interplay of smoothness and monotonicity assumptions when estimating a density from a sample of observations. The nonparametric maximum likelihood estimator of a decreasing density on the positive half line attains a rate of convergence at a fixed point if the density has a negative derivative. The same rate is obtained by a kernel estimator, but the limit distributions are different. If the density is both differentiable and known to be monotone, then a third estimator is obtained by isotonization of a kernel estimator. We show that this again attains the rate of convergence and compare the limit distributions of the three types of estimators. It is shown that both isotonization and smoothing lead to a more concentrated limit distribution, and we study the dependence on the proportionality constant in the bandwidth. We also show that isotonization does not change the limit behavior of a kernel estimator with a larger bandwidth, in the case that the density is known to have more than one derivative.
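The third estimator, isotonization of a kernel estimator, is easy to demonstrate: smooth first, then project onto decreasing functions. In the sketch below the sample, bandwidth, and evaluation grid are illustrative assumptions.

```python
# Kernel density estimate followed by isotonization (decreasing constraint).
import numpy as np
from scipy.stats import gaussian_kde
from sklearn.isotonic import IsotonicRegression

rng = np.random.default_rng(5)
sample = rng.exponential(scale=1.0, size=500)    # true density is decreasing

grid = np.linspace(0.01, 5, 200)
kde = gaussian_kde(sample, bw_method=0.3)(grid)  # plain kernel estimate

# Least-squares projection of the kernel estimate onto decreasing functions.
iso = IsotonicRegression(increasing=False)
monotone_kde = iso.fit_transform(grid, kde)
```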

Relevance:

10.00%

Publisher:

Abstract:

In this paper we propose methods for smooth hazard estimation of a time variable that is interval-censored. These methods allow one to model the transformed hazard in terms of either smooth (smoothing spline) or linear functions of time and other relevant time-varying predictor variables. We illustrate the use of this method on a dataset of hemophiliacs in which the outcome, time to HIV seroconversion, is interval-censored and left-truncated.
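To make the interval-censored likelihood concrete, the sketch below fits a simple log-linear hazard to synthetic data by maximum likelihood; it omits the paper's spline terms and left truncation, and all data and parameter values are invented.

```python
# Maximum likelihood for a log-linear hazard lambda(t) = exp(a + b*t)
# with interval-censored observations: the likelihood contribution of an
# interval (L, R] is S(L) - S(R), where S is the survival function.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(6)
event_times = rng.weibull(1.5, 300) * 5.0
left = np.floor(event_times)          # events observed only to 1-unit intervals
right = left + 1.0

def cum_hazard(t, a, b):
    # Integral of exp(a + b*s) ds from 0 to t.
    return np.exp(a) * np.expm1(b * t) / b

def neg_log_lik(params):
    a, b = params
    s_left = np.exp(-cum_hazard(left, a, b))     # survival at interval ends
    s_right = np.exp(-cum_hazard(right, a, b))
    return -np.sum(np.log(s_left - s_right + 1e-12))

fit = minimize(neg_log_lik, x0=[-1.0, 0.1], method="Nelder-Mead")
print("fitted (a, b):", fit.x)
```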

Relevance:

10.00%

Publisher:

Abstract:

The last two decades have seen intense scientific and regulatory interest in the health effects of particulate matter (PM). Influential epidemiological studies that characterize the chronic exposure of individuals rely on monitoring data that are sparse in space and time, so they often assign the same exposure to participants across large geographic areas and across time. We estimate monthly PM during 1988-2002 in a large spatial domain for use in studying health effects in the Nurses' Health Study. We develop a conceptually simple spatio-temporal model that uses a rich set of covariates. The model is used to estimate concentrations of PM10 for the full time period and PM2.5 for a subset of the period. For the earlier part of the period, 1988-1998, few PM2.5 monitors were operating, so we develop a simple extension to the model that represents PM2.5 conditionally on PM10 model predictions. In the epidemiological analysis, model predictions of PM10 are more strongly associated with health effects than exposures derived from simpler estimation approaches. Our modeling approach supports the application by estimating both fine-scale and large-scale spatial heterogeneity and by capturing space-time interaction through the use of monthly-varying spatial surfaces. At the same time, the model is computationally feasible, implementable with standard software, and readily understandable to the scientific audience. Despite its simplifying assumptions, the model has good predictive performance and uncertainty characterization.
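The conditional extension for the sparse-monitor years can be sketched as a simple regression of PM2.5 on the PM10 model's predictions plus covariates; the covariates, coefficients, and data below are invented for illustration and do not reproduce the paper's model.

```python
# Toy conditional model: PM2.5 given PM10 predictions and two covariates.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(7)
n = 1000
pm10_pred = rng.uniform(10, 60, n)          # predictions from the PM10 model
covariates = np.column_stack([
    rng.uniform(0, 1, n),                    # e.g. distance to major road
    rng.uniform(0, 1, n),                    # e.g. population density
])
pm25 = 0.5 * pm10_pred + 3 * covariates[:, 0] + rng.normal(0, 2, n)

X = np.column_stack([pm10_pred, covariates])
model = LinearRegression().fit(X, pm25)
pm25_hat = model.predict(X)                  # PM2.5 estimates where unmonitored
```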