9 results for SMOOTHING SPLINE
in AMS Tesi di Dottorato - Alm@DL - Università di Bologna
Abstract:
The thesis is concerned with local trigonometric regression methods. The aim was to develop a method for extracting cyclical components from time series. The main results of the thesis are the following. First, a generalization of the filter proposed by Christiano and Fitzgerald is furnished for the smoothing of ARIMA(p,d,q) processes. Second, a local trigonometric filter is built and its statistical properties are derived. Third, the convergence properties of trigonometric estimators are discussed, together with the problem of choosing the order of the model. A large-scale simulation experiment was designed to assess the performance of the proposed models and methods. The results show that local trigonometric regression may be a useful tool for periodic time series analysis.
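The basic building block of such filters can be illustrated with a short sketch (not the thesis's Christiano-Fitzgerald generalization, just the plain projection step): a window of the series is projected onto sines and cosines at the Fourier frequencies lying in a chosen band, and the fitted trigonometric polynomial is returned as the cyclical component. The band limits `j_low`/`j_high` (in cycles per window) are illustrative parameters.

```python
import math

def trig_bandpass(y, j_low, j_high):
    """Extract the cyclical component of y carried by the Fourier frequencies
    2*pi*j/n with j_low <= j <= j_high (j in cycles per window).

    At Fourier frequencies the sines/cosines are orthogonal over the window,
    so the coefficients are simple averages, not a full least-squares solve.
    """
    n = len(y)
    component = [0.0] * n
    for j in range(j_low, j_high + 1):
        if not 1 <= j < n / 2:   # skip frequencies where the simple (2/n)
            continue             # coefficient formula does not apply
        w = 2 * math.pi * j / n
        a = 2 / n * sum(y[t] * math.cos(w * t) for t in range(n))
        b = 2 / n * sum(y[t] * math.sin(w * t) for t in range(n))
        for t in range(n):
            component[t] += a * math.cos(w * t) + b * math.sin(w * t)
    return component

# A pure cycle at j = 5 is recovered exactly by a band that contains it.
n = 64
y = [3.0 * math.cos(2 * math.pi * 5 * t / n) for t in range(n)]
cyc = trig_bandpass(y, 4, 6)
print(max(abs(cyc[t] - y[t]) for t in range(n)))  # ~1e-14
```

In a local version the window slides along the series and only the central fitted value is kept at each step.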
Abstract:
In this work we propose a new approach for preliminary epidemiological studies on Standardized Mortality Ratios (SMR) collected over many spatial regions. A preliminary study on SMRs aims to formulate hypotheses to be investigated via individual epidemiological studies, which avoid the bias introduced by aggregated analyses. Starting from the collected disease counts and the expected disease counts computed from reference population disease rates, in each area an SMR is derived as the MLE under the Poisson assumption on each observation. Such estimators have high standard errors in small areas, i.e. where the expected count is low either because the population underlying the area is small or because the disease under study is rare. Disease mapping models and other techniques for screening disease rates across the map, aiming to detect anomalies and possible high-risk areas, have been proposed in the literature under both the classical and the Bayesian paradigm. We approach this issue with a decision-oriented method that focuses on multiple testing control, without abandoning the preliminary-study perspective that an analysis of SMR indicators is expected to keep. We implement control of the FDR, a quantity widely used to address multiple comparison problems in the field of microarray data analysis but not usually employed in disease mapping. Controlling the FDR means providing an estimate of the FDR for a set of rejected null hypotheses. The small-area issue raises difficulties in applying traditional methods for FDR estimation, which are usually based only on knowledge of the p-values (Benjamini and Hochberg, 1995; Storey, 2003). Tests evaluated by a traditional p-value have low power in small areas, where the expected number of disease cases is small. Moreover, the tests cannot be assumed independent when spatial correlation between SMRs is expected, nor are they identically distributed when the population underlying the map is heterogeneous.
The Bayesian paradigm offers a way to overcome the inappropriateness of p-value based methods. Another peculiarity of the present work is to propose a fully Bayesian hierarchical model for FDR estimation when testing many null hypotheses of absence of risk. We use concepts from Bayesian disease mapping models, referring in particular to the Besag, York and Mollié model (1991), often used in practice for its flexible prior assumption on the distribution of risks across regions. The borrowing of strength between prior and likelihood, typical of a hierarchical Bayesian model, allows a single test (i.e. a test in a single area) to be evaluated by means of all the observations in the map under study, rather than just the single observation. This improves the power of the test in small areas and addresses more appropriately the spatial correlation issue, which suggests that relative risks are closer in spatially contiguous regions. The proposed model estimates the FDR by means of the MCMC-estimated posterior probabilities b_i of the null hypothesis (absence of risk) for each area. An estimate of the expected FDR conditional on the data (denoted FDR-hat) can be calculated for any set of b_i's relative to areas declared at high risk (where the null hypothesis is rejected) by averaging the b_i's themselves. FDR-hat can be used to provide an easy decision rule for selecting high-risk areas, i.e. selecting as many areas as possible such that FDR-hat does not exceed a prefixed value; we call these FDR-hat based decision (or selection) rules. The sensitivity and specificity of such a rule depend on the accuracy of the FDR estimate: over-estimation of the FDR causes a loss of power, while under-estimation produces a loss of specificity. Moreover, our model has the interesting feature of still providing an estimate of the relative risk values, as in the Besag, York and Mollié model (1991).
A simulation study was set up to evaluate the model's performance in terms of FDR estimation accuracy, sensitivity and specificity of the decision rule, and goodness of the relative risk estimates. We chose a real map from which we generated several spatial scenarios whose disease counts vary according to the degree of spatial correlation, the area sizes, the number of areas where the null hypothesis is true and the risk level in the latter areas. In summarizing the simulation results we always consider FDR estimation in sets constituted by all the b_i's lower than a threshold t. We show graphs of FDR-hat and the true FDR (known by simulation) plotted against the threshold t to assess the FDR estimation. By varying the threshold we can learn which FDR values can be accurately estimated by a practitioner willing to apply the model (from the closeness between FDR-hat and the true FDR). By plotting the calculated sensitivity and specificity (both known by simulation) against FDR-hat we can check the sensitivity and specificity of the corresponding FDR-hat based decision rules. To investigate the over-smoothing of the relative risk estimates we compare box-plots of such estimates in high-risk areas (known by simulation), obtained both by our model and by the classic Besag, York and Mollié model. All the summary tools are worked out for all simulated scenarios (54 in total). Results show that the FDR is well estimated (in the worst case we get an over-estimation, hence conservative FDR control) in scenarios with small areas, low risk levels and spatially correlated risks, which are our primary aims. In such scenarios we obtain good estimates of the FDR for all values less than or equal to 0.10. The sensitivity of FDR-hat based decision rules is generally low, but specificity is high. In these scenarios selection rules based on FDR-hat = 0.05 or FDR-hat = 0.10 can be suggested.
In cases where the number of true alternative hypotheses (the number of true high-risk areas) is small, FDR values up to 0.15 are also well estimated, and decision rules based on FDR-hat = 0.15 gain power while maintaining a high specificity. On the other hand, in scenarios with non-small areas and non-small risk levels the FDR is under-estimated except for very small values (much lower than 0.05), resulting in a loss of specificity of a decision rule based on FDR-hat = 0.05. In such scenarios decision rules based on FDR-hat = 0.05 or, even worse, FDR-hat = 0.10 cannot be suggested because the true FDR is actually much higher. As regards relative risk estimation, our model achieves almost the same results as the classic Besag, York and Mollié model. For this reason, our model is interesting for its ability to perform both the estimation of relative risk values and FDR control, except in scenarios with non-small areas and large risk levels. A case study is finally presented to show how the method can be used in epidemiology.
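The selection rule based on averaging posterior null probabilities can be sketched in a few lines. The numbers below are hypothetical stand-ins for the MCMC output, and the greedy ranked scan is one straightforward way to realize "as many areas as possible" under the FDR constraint:

```python
def fdr_select(b, level):
    """Select the largest set of areas whose estimated FDR (the mean of the
    posterior null probabilities over the selected areas) stays <= level.

    b: dict area -> posterior probability of the null (absence of risk).
    Returns (selected_areas, estimated_fdr).
    """
    ranked = sorted(b.items(), key=lambda kv: kv[1])  # most suspicious first
    selected, total = [], 0.0
    for area, bi in ranked:
        if (total + bi) / (len(selected) + 1) > level:
            break            # adding this area would push the estimate over
        selected.append(area)
        total += bi
    fdr_hat = total / len(selected) if selected else 0.0
    return selected, fdr_hat

# Hypothetical posterior null probabilities for six areas.
b = {"A": 0.01, "B": 0.02, "C": 0.04, "D": 0.20, "E": 0.60, "F": 0.90}
areas, fdr_hat = fdr_select(b, 0.05)
print(areas, round(fdr_hat, 4))  # ['A', 'B', 'C'] 0.0233
```

Since the b_i's are sorted, the running mean is non-decreasing, so stopping at the first violation indeed yields the largest admissible set.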
Abstract:
Some fundamental biological processes, such as embryonic development, have been preserved during evolution and are common to species belonging to different phylogenetic positions, yet remain largely unknown. Understanding the cell morphodynamics that lead to the formation of organized spatial distributions of cells, such as tissues and organs, can be achieved through the reconstruction of cell shapes and positions during the development of a live animal embryo. In this work we design a chain of image processing methods to automatically segment and track cell nuclei and membranes during the development of a zebrafish embryo, which has been widely validated as a model organism for understanding vertebrate development, gene function and healing/repair mechanisms in vertebrates. The embryo is first labeled through the ubiquitous expression of fluorescent proteins addressed to cell nuclei and membranes, and temporal sequences of volumetric images are acquired with laser scanning microscopy. Cell positions are detected by processing the nuclei images either through the generalized form of the Hough transform or by identifying nuclei positions as local maxima after a smoothing preprocessing step. Membrane and nucleus shapes are reconstructed using PDE-based variational techniques such as Subjective Surfaces and the Chan-Vese method. Cell tracking is performed by combining the previously detected information on cell shapes and positions with biological regularization constraints. Our results are manually validated and reconstruct the formation of the zebrafish brain at the 7-8 somite stage, with all cells tracked from the late sphere stage onward with less than 2% error for at least 6 hours. Our reconstruction opens the way to a systematic investigation of cellular behaviors, of the clonal origin and clonal complexity of brain organs, and of the contribution of cell proliferation modes and cell movements to the formation of local patterns and morphogenetic fields.
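The second detection strategy mentioned above, smoothing followed by a local-maximum search, can be sketched on a toy 2D image. The real pipeline works on 3D volumes and a different smoothing kernel; the 3x3 binomial kernel and the toy image here are illustrative stand-ins:

```python
def smooth(img):
    """3x3 binomial smoothing; weights are renormalized at the image border."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            num = den = 0.0
            for di in (-1, 0, 1):
                for dj in (-1, 0, 1):
                    if 0 <= i + di < h and 0 <= j + dj < w:
                        wgt = (2 - abs(di)) * (2 - abs(dj))  # center 4, edge 2, corner 1
                        num += wgt * img[i + di][j + dj]
                        den += wgt
            out[i][j] = num / den
    return out

def local_maxima(img, threshold):
    """Pixels above threshold that are strictly greater than all 8 neighbours."""
    h, w = len(img), len(img[0])
    peaks = []
    for i in range(h):
        for j in range(w):
            if img[i][j] <= threshold:
                continue
            neigh = [img[i + di][j + dj]
                     for di in (-1, 0, 1) for dj in (-1, 0, 1)
                     if (di, dj) != (0, 0)
                     and 0 <= i + di < h and 0 <= j + dj < w]
            if all(img[i][j] > v for v in neigh):
                peaks.append((i, j))
    return peaks

# Toy "nuclei" image: two bright spots on a dark background.
img = [[0.0] * 9 for _ in range(9)]
img[2][2], img[6][6] = 10.0, 8.0
print(local_maxima(smooth(img), 0.5))  # [(2, 2), (6, 6)]
```

Smoothing before the maximum search prevents single noisy pixels from being reported as nuclei centers.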
Abstract:
This thesis identifies and characterizes irreversible transformations occurring in specific organic and oligomeric/polymeric thin films. These transformations are dewetting in discotic liquid crystal thin films, and dewetting and smoothing in oligomeric and polymeric films. The irreversible transformations are extensively characterized by means of optical and atomic force microscopy. In the case of discotic liquid crystal films, the morphological characterization is performed synchronously with electrical measurements of the current during dewetting.
Abstract:
Introduction: Nocturnal frontal lobe epilepsy (NFLE) is a distinct syndrome of partial epilepsy whose clinical features comprise a spectrum of paroxysmal motor manifestations of variable duration and complexity, arising from sleep. Cardiovascular changes during NFLE seizures have previously been observed; however, the extent of these modifications and their relationship with seizure onset have not been analyzed in detail. Objective: The aim of the present study is to evaluate NFLE seizure-related changes in heart rate (HR) and in the sympathetic/parasympathetic balance through wavelet analysis of HR variability (HRV). Methods: We evaluated the whole-night digitally recorded video-polysomnography (VPSG) of 9 patients diagnosed with NFLE, with no history of cardiac disorders and normal cardiac examinations. Events with features of NFLE seizures were selected independently by three examiners and included in the study only if a consensus was reached. Heart rate was evaluated by measuring the interval between two consecutive R-waves of the QRS complexes (RRi). RRi series were digitally calculated over a period of 20 minutes including the seizures, and resampled at 10 Hz using cubic spline interpolation. A multiresolution analysis was performed (Daubechies-16 wavelet), and the squared level-specific amplitude coefficients were summed across appropriate decomposition levels in order to compute total band powers in the bands of interest (LF: 0.039062-0.156248 Hz, HF: 0.156248-0.624992 Hz). A general linear model was then applied to estimate changes in RRi and in LF and HF powers during three different periods: a basal period (Basal; 30 sec, at least 30 sec before seizure onset, during which no movements occurred and autonomic conditions were stationary), a pre-seizure period (preSP; the 10 sec preceding seizure onset) and a seizure period (SP) corresponding to the clinical manifestations. For one of the patients (patient 9) three seizures associated with ictal asystole (IA) were recorded, hence he was treated separately.
Results: Group analysis performed on 8 patients (41 seizures) showed that RRi remained unchanged during the preSP, while a significant tachycardia was observed in the SP. A significant increase in the LF component was instead observed during both the preSP and the SP (p<0.001), while the HF component decreased only in the SP (p<0.001). For patient 9, during the preSP and the first part of the SP a significant tachycardia was observed, associated with increased sympathetic activity (increased LF absolute values and LF%). In the second part of the SP a progressive decrease in HR, gradually exceeding basal values, occurred before the IA. Bradycardia was associated with an increase in parasympathetic activity (increased HF absolute values and HF%) contrasted by a further increase in LF until the occurrence of the IA. Conclusions: These data suggest that changes in the autonomic balance toward sympathetic prevalence always precede clinical seizure onset in NFLE, even when HR changes are not yet evident, confirming that wavelet analysis is a sensitive technique for detecting sudden variations of the autonomic balance occurring during transient phenomena. Finally, we demonstrated that epileptic asystole is associated with parasympathetic hypertonus counteracted by a marked sympathetic activation.
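The band-power computation described in the Methods can be sketched as follows. To keep the example dependency-free it uses the Haar wavelet instead of the Daubechies-16 used in the study, but the level-to-band mapping is the same: at a 10 Hz sampling rate, detail levels 6-7 cover the LF band (0.039-0.156 Hz) and levels 4-5 the HF band (0.156-0.625 Hz).

```python
import math

def haar_dwt(x):
    """One Haar analysis step: returns (approximation, detail) coefficients."""
    a = [(x[2 * k] + x[2 * k + 1]) / math.sqrt(2) for k in range(len(x) // 2)]
    d = [(x[2 * k] - x[2 * k + 1]) / math.sqrt(2) for k in range(len(x) // 2)]
    return a, d

def band_powers(x, levels):
    """Sum of squared detail coefficients per level; at sampling rate fs,
    level j nominally spans the band [fs / 2**(j+1), fs / 2**j] Hz."""
    powers = {}
    a = list(x)
    for j in range(1, levels + 1):
        a, d = haar_dwt(a)
        powers[j] = sum(c * c for c in d)
    return powers

# Synthetic RRi-like series resampled at 10 Hz: a 0.1 Hz oscillation,
# i.e. inside the LF band.
fs, n = 10.0, 1024
x = [math.sin(2 * math.pi * 0.1 * t / fs) for t in range(n)]
p = band_powers(x, 7)
lf = p[6] + p[7]   # 0.039-0.156 Hz
hf = p[4] + p[5]   # 0.156-0.625 Hz
print(lf > hf)     # True: power concentrates in the LF band
```

Daubechies-16 gives much sharper band separation than Haar; the summation over levels is identical.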
Abstract:
In recent years, radars have been used in many applications, such as precision agriculture and advanced driver assistance systems. Optimal techniques for estimating the number of targets and their coordinates require solving multidimensional optimization problems that entail huge computational efforts. This has motivated the development of sub-optimal estimation techniques able to achieve good accuracy at a manageable computational cost. Another technical issue in advanced driver assistance systems is the tracking of multiple targets. Even if various filtering techniques have been developed, new efficient and robust algorithms for target tracking can be devised by exploiting a probabilistic approach based on the use of factor graphs and the sum-product algorithm. The two contributions provided by this dissertation are the investigation of the filtering and smoothing problems from a factor graph perspective and the development of efficient algorithms for two- and three-dimensional radar imaging. Concerning the first contribution, a new factor graph for filtering is derived and the sum-product rule is applied to this graphical model; this makes it possible to reinterpret known algorithms and to develop new filtering techniques. Then, a general method based on graphical modelling is proposed to derive filtering algorithms that involve a network of interconnected Bayesian filters. Finally, the proposed graphical approach is exploited to devise a new smoothing algorithm. Numerical results for dynamic systems show that our algorithms can achieve a better complexity-accuracy tradeoff and better tracking capability than other techniques in the literature. Regarding radar imaging, various algorithms are developed for frequency-modulated continuous-wave radars; these algorithms rely on novel and efficient methods for the detection and estimation of multiple superimposed tones in noise.
The accuracy achieved in the presence of multiple closely spaced targets is assessed on the basis of both synthetically generated data and measurements acquired with two commercial multiple-input multiple-output radars.
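As a point of reference for the tone detection and estimation problem (a naive baseline, not one of the thesis's novel estimators), the frequencies of superimposed tones can be located by picking the strongest local maxima of the DFT magnitude spectrum:

```python
import cmath
import math

def periodogram_peaks(x, k):
    """Return the k strongest local maxima of the DFT magnitude spectrum
    (bins 1..n/2-1), as (bin, magnitude) pairs, strongest first."""
    n = len(x)
    mag = [abs(sum(x[t] * cmath.exp(-2j * math.pi * f * t / n)
                   for t in range(n))) for f in range(n // 2 + 1)]
    peaks = [(f, mag[f]) for f in range(1, n // 2)
             if mag[f] > mag[f - 1] and mag[f] > mag[f + 1]]
    return sorted(peaks, key=lambda p: -p[1])[:k]

# Two superimposed tones at bins 10 and 25 (amplitudes 1.0 and 0.6).
n = 128
x = [math.cos(2 * math.pi * 10 * t / n)
     + 0.6 * math.sin(2 * math.pi * 25 * t / n) for t in range(n)]
found = periodogram_peaks(x, 2)
print(sorted(f for f, _ in found))  # [10, 25]
```

The resolution of this baseline is limited by the DFT bin width, which is exactly what motivates the more refined estimators the dissertation develops for closely spaced targets.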
Abstract:
Additive Manufacturing (AM) is nowadays considered an important alternative to traditional manufacturing processes. The literature attributes several advantages to AM technology, such as design flexibility, and its use is increasing in automotive, aerospace and biomedical applications. As a systematic literature review suggests, AM is sometimes coupled with voxelization, mainly for representation and simulation purposes. Voxelization can be defined as a volumetric representation technique based on the discretization of a model with hexahedral elements, analogous to pixels in a 2D image. Voxels are used to simplify the geometric representation, store intricate interior details and speed up geometric and algebraic manipulation. Compared to the boundary representation used in common CAD software, the inherent advantages of voxels are magnified in specific applications such as lattice or topologically optimized structures for visualization or simulation purposes. Such structures can only be manufactured with AM because of their complex topology. After a thorough review of the existing literature, this project aims to exploit the potential of voxelization algorithms to develop optimized Design for Additive Manufacturing (DfAM) tools. The final aim is to manipulate, and support mechanical simulations of, lightweight optimized structures ready to be manufactured with AM, with particular attention to automotive applications. A voxel-based methodology is developed for the efficient structural simulation of lattice structures. Moreover, thanks to an optimized smoothing algorithm specific to voxel-based geometries, a topologically optimized, voxelized structure can be transformed into a surface-triangulated mesh file ready for the AM process. Finally, a modified panel code is developed for simple CFD simulations that use voxels as the discretization unit, in order to understand the fluid-dynamic behavior of industrial components for preliminary aerodynamic performance evaluation.
The developed design tools and methodologies perfectly fit the automotive industry’s needs to accelerate and increase the efficiency of the design workflow from the conceptual idea to the final product.
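The core voxelization step, reduced here to a point cloud rather than a full CAD boundary representation, amounts to snapping coordinates onto a regular hexahedral grid; the grid origin and voxel edge length are the only parameters:

```python
def voxelize(points, origin, size):
    """Map 3D points onto a regular voxel grid.

    points: iterable of (x, y, z); origin: grid corner; size: voxel edge length.
    Returns the set of occupied voxel indices (i, j, k).
    """
    ox, oy, oz = origin
    occupied = set()
    for x, y, z in points:
        occupied.add((int((x - ox) // size),
                      int((y - oy) // size),
                      int((z - oz) // size)))
    return occupied

# Three sample points with 0.5-unit voxels: two fall in the same cell.
pts = [(0.1, 0.1, 0.1), (0.4, 0.2, 0.3), (1.2, 0.0, 0.7)]
print(sorted(voxelize(pts, (0.0, 0.0, 0.0), 0.5)))
# [(0, 0, 0), (2, 0, 1)]
```

Voxelizing a solid model additionally requires sampling its surface or interior before this snapping step; the resulting set of occupied cells is what downstream simulation and smoothing operate on.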
Abstract:
The COVID-19 pandemic, sparked by the SARS-CoV-2 virus, stirred global comparisons to historical pandemics. Initially presenting a high mortality rate, the disease later stabilized globally at around 0.5-3%. Patients manifest a spectrum of symptoms, necessitating efficient triaging for appropriate treatment strategies, ranging from symptomatic relief to antivirals or monoclonal antibodies. Beyond traditional approaches, emerging research suggests a potential link between COVID-19 severity and alterations in gut microbiota composition, which impact inflammatory responses. However, most studies focus on severe hospitalized cases without standardized criteria for severity. Addressing this gap, the first study in this thesis spans diverse COVID-19 severity levels, utilizing 16S rRNA amplicon sequencing on fecal samples from 315 subjects. The findings highlight significant microbiota differences correlated with severity. Machine learning classifiers, including a multi-layer convolutional neural network, demonstrated the potential of microbiota compositional data to predict patient severity, achieving an 84.2% mean balanced accuracy starting one week post-symptom onset. These preliminary results underscore the gut microbiota's potential as a biomarker in clinical decision-making for COVID-19. The second study delves into mild COVID-19 cases, exploring their implications for ‘long COVID’, or Post-Acute COVID-19 Syndrome (PACS). Employing longitudinal analysis, the study unveils dynamic shifts in microbial composition during the acute phase, akin to severe cases. Innovative techniques, including network approaches and spline-based longitudinal analysis, were deployed to assess microbiota dynamics and potential associations with PACS. The research suggests that even in mild cases, mechanisms similar to those in hospitalized patients are established regarding changes in the intestinal microbiota during the acute phase of the infection.
These findings lay the foundation for potential microbiota-targeted therapies to mitigate inflammation, potentially preventing long COVID symptoms in the broader population. In essence, these studies offer valuable insights into the intricate relationships between COVID-19 severity, gut microbiota, and the potential for innovative clinical applications.