307 results for MULTISCALE


Relevance:

10.00%

Publisher:

Abstract:

Heart rate variability (HRV) analysis uses time series of the intervals between successive heartbeats to assess autonomic regulation of the cardiovascular system. These series are obtained from the electrocardiogram (ECG) signal, which can be affected by different types of artifacts, leading to incorrect interpretations of the HRV signals. The classic approach to dealing with these artifacts is to apply correction methods, some based on interpolation, substitution, or statistical techniques. However, few studies have examined the accuracy and performance of these correction methods on real HRV signals. This study aims to determine the performance of several linear and nonlinear correction methods on HRV signals with induced artifacts, by quantifying their linear and nonlinear HRV parameters. ECG signals from rats, recorded by telemetry, were used to generate real, artifact-free heart rate variability series. Missing points (beats) were then simulated in these series in different quantities, to emulate a real experimental situation as closely as possible. To compare recovery efficiency, deletion (DEL), linear interpolation (LI), cubic spline interpolation (CI), moving average window (MAW), and nonlinear predictive interpolation (NPI) were applied as correction methods to the series with induced artifacts. The accuracy of each correction method was assessed by computing the mean of the series (AVNN), the standard deviation (SDNN), the root mean square of successive differences between heartbeats (RMSSD), Lomb's periodogram (LSP), detrended fluctuation analysis (DFA), multiscale entropy (MSE), and symbolic dynamics (SD) on each HRV signal with and without artifacts.
The results show that, at low levels of missing points, the performance of all correction techniques is very similar, with very close values for each HRV parameter. At higher levels of loss, however, only the NPI method yields HRV parameters with low error and few significant differences compared with the values calculated for the same signals without missing points.
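The time-domain parameters and the cubic spline correction (CI) mentioned above can be sketched as follows; this is a minimal illustration, not the authors' implementation, and the function names and the 1 ms test tolerance are assumptions.

```python
import numpy as np
from scipy.interpolate import CubicSpline

def hrv_time_domain(rr_ms):
    """Time-domain HRV parameters from an RR-interval series (ms)."""
    rr = np.asarray(rr_ms, dtype=float)
    avnn = rr.mean()                             # mean RR interval (AVNN)
    sdnn = rr.std(ddof=1)                        # standard deviation (SDNN)
    rmssd = np.sqrt(np.mean(np.diff(rr) ** 2))   # RMS of successive differences
    return avnn, sdnn, rmssd

def cubic_spline_correction(t, rr, missing_idx):
    """Replace artifact beats by cubic-spline interpolation (CI method)."""
    keep = np.ones(len(rr), dtype=bool)
    keep[missing_idx] = False
    spline = CubicSpline(t[keep], rr[keep])      # fit on the valid beats only
    corrected = np.array(rr, dtype=float)
    corrected[missing_idx] = spline(t[missing_idx])
    return corrected
```

For a smooth RR series, the spline recovers simulated missing beats to well under a millisecond, which is why CI performs well at low loss levels.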

Abstract:

This work proposes a concurrent multiscale modeling technique for concrete considering two distinct scales: the mesoscale, where concrete is modeled as a heterogeneous material, and the macroscale, where concrete is treated as a homogeneous material. The heterogeneity of the mesoscopic structure of concrete is idealized by considering three distinct phases: the coarse aggregates and the mortar (matrix), both treated as homogeneous materials, and the interfacial transition zone (ITZ), treated as the weakest of the three phases. The coarse aggregate is generated from a grading curve and placed randomly in the matrix. Its mechanical behavior is described by a linear-elastic constitutive model, owing to its higher strength compared with the other two phases of the concrete. Continuum finite elements with a high aspect ratio, together with a damage constitutive model, are used to represent the nonlinear behavior of the concrete, resulting from crack initiation in the ITZ and subsequent propagation into the matrix, which leads to the formation of macrocracks. Interface finite elements with a high aspect ratio are inserted between all regular elements of the matrix and between those of the matrix and the aggregates, representing the ITZ and becoming potential crack propagation paths. In the limit, when the thickness of the interface element tends to zero (h → 0) and, consequently, the aspect ratio tends to infinity, these elements exhibit the same kinematics as the continuum strong discontinuity approach (CSDA), making them suitable for representing the formation of discontinuities associated with cracks, similarly to cohesive models. A tension-damage model is proposed to represent the nonlinear mechanical behavior of the interfaces, associated with crack formation or even eventual crack closure.
To circumvent the problems caused by the finite element transition mesh between the macroscale and mesoscale meshes, which in general differ markedly in refinement, a recent technique for coupling non-conforming meshes is used. This technique is based on the definition of coupling finite elements (CFEs), which can establish displacement continuity between meshes generated completely independently, without increasing the total number of degrees of freedom of the problem, and which can be used to couple both overlapping and non-overlapping meshes. To make multiscale analysis possible in cases where the strain localization region cannot be defined a priori, an adaptive multiscale technique is proposed. In this approach, the macroscale stress distribution is used as an indicator for changing the modeling of critical regions, replacing the macroscale with the mesoscale during the analysis. Consequently, the macroscopic mesh is automatically replaced by a mesoscopic mesh wherever nonlinear behavior is about to occur. Numerical tests are carried out to show the ability of the proposed model to represent crack initiation and propagation in the tensile region of the concrete. The numerical results are compared with experimental results or with those obtained by direct mesoscale simulation (DMS).
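A tension-damage interface law of the kind described above can be illustrated with a one-dimensional sketch. This is not the thesis's actual constitutive model; the exponential softening form and the parameter values (Young's modulus `E`, damage threshold strain `eps0`, softening rate `beta`) are illustrative assumptions.

```python
import numpy as np

def damage_stress(strain, E=30e3, eps0=1e-4, beta=500.0):
    """1-D exponential tension-damage law (illustrative parameters only).

    The effective stress E*strain is degraded by a factor (1 - d);
    damage d starts growing once the strain exceeds the threshold eps0,
    producing a softening branch typical of cohesive/interface models.
    """
    strain = np.asarray(strain, dtype=float)
    d = np.where(strain > eps0,
                 1.0 - (eps0 / strain) * np.exp(-beta * (strain - eps0)),
                 0.0)
    return (1.0 - d) * E * strain, d
```

Below the threshold the response is linear elastic; beyond it the stress decays monotonically, mimicking crack opening in the ITZ.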

Abstract:

This research aims to understand the dynamic behavior of the soil at the macro- and micromorphological scales, as visualized along a toposequence, with respect to the morphological agents that condition and contribute to the triggering of erosive processes. The study area lies within the Laranja Azeda sub-basin, located in the center-east region of the state of São Paulo, in the municipality of São Carlos/SP, and is of fundamental importance because it belongs to the Ribeirão Feijão watershed, an important urban water source for the city. Land-use and occupation planning appropriate to the physical factors that make up the dynamics of this landscape is essential for the conservation and preservation of the water resources found there, where the widespread occurrence of erosive processes is a matter of concern, since they can cause silting of rivers and reservoirs. Using a multiscale methodology to select the detailed study area and to understand the organization and dynamics of the soil cover, the procedures proposed by the Structural Analysis of the Soil Cover were employed, together with concepts and techniques of soil micromorphology. The distribution of soils along the Manacá Toposequence is strictly correlated with the vertical transformation of the parent material into soil; along this slope there is a lithological differentiation that conditions distinct morphologies at both the macromorphological and micromorphological scales. The upper and middle thirds of the slope are associated with colluvial-eluvial deposits of the Itaqueri Formation, where a Latossolo Vermelho Amarelo (Red-Yellow Latosol) develops. The lower third of the slope corresponds to a soil formed from the sandstones of the Botucatu Formation, classified as a Neossolo Quartzarênico (Quartzarenic Neosol).
With the aid of two-dimensional image analysis of soil thin sections, it was possible to visualize and quantify the macroporosity along the slope, an important morphological attribute that controls water flows and is a conditioning agent for the development of erosive processes. It is concluded that the occurrence of gullies in the middle-lower third of the slope is the materialization, in the form of erosive processes, of this differential behavior of the soil mass. Along the Manacá Toposequence, the search for dynamic equilibrium on the slope is induced by the genetic-evolutionary dynamics of the geological formations that sustain the landscape, triggering erosive processes that tend to progress out of equilibrium, depending on the management practices established for the site.
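The macroporosity quantification step can be sketched as a simple thresholding of a grayscale thin-section image. This is only an illustrative stand-in for the study's image-analysis procedure, and the assumption that pore space (resin-impregnated voids) appears as the brightest phase is ours.

```python
import numpy as np

def macroporosity(image, pore_threshold):
    """Fraction of pixels classified as pore space by thresholding.

    `image` is a 2-D grayscale array from a thin-section photograph;
    pixels brighter than `pore_threshold` are treated as pores
    (assuming resin-impregnated voids are the brightest phase).
    """
    img = np.asarray(image)
    return float((img > pore_threshold).mean())
```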

Abstract:

In the first chapter, we test some stochastic volatility models using options on the S&P 500 index. First, we demonstrate the presence of a short time-scale, on the order of days, and a long time-scale, on the order of months, in the S&P 500 volatility process using the empirical structure function, or variogram. This result is consistent with the findings of previous studies. The main contribution of our paper is to estimate the two time-scales in the volatility process simultaneously, using a nonlinear weighted least-squares technique. To test the statistical significance of the rates of mean-reversion, we bootstrap pairs of residuals using the circular block bootstrap of Politis and Romano (1992), choosing the block length according to the automatic procedure of Politis and White (2004). We then calculate a first-order correction to the Black-Scholes prices using three different corrections: (i) a fast time-scale correction; (ii) a slow time-scale correction; and (iii) a multiscale (fast and slow) correction. To test the ability of our model to price options, we simulate option prices using five different specifications for the rates of mean-reversion. We did not find any evidence that these asymptotic models perform better, in terms of RMSE, than the Black-Scholes model. In the second chapter, we use Brazilian data to compute monthly idiosyncratic moments (expected skewness, realized skewness, and realized volatility) for equity returns and assess whether they are informative for the cross-section of future stock returns. Since there is evidence that lagged skewness alone does not adequately forecast skewness, we estimate a cross-sectional model of expected skewness that uses additional predictive variables. We then sort stocks each month according to their idiosyncratic moments, forming quintile portfolios. We find a negative relationship between higher idiosyncratic moments and next-month stock returns.
The trading strategy that sells stocks in the top quintile of expected skewness and buys stocks in the bottom quintile generates a significant monthly return of about 120 basis points. Our results are robust across sample periods and portfolio weightings, and to the Fama and French (1993) risk-adjustment factors. Finally, we identify a return reversal in stocks with high idiosyncratic skewness: such stocks have high contemporaneous returns that tend to reverse, resulting in negative abnormal returns in the following month.
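The empirical structure function (variogram) used in the first chapter can be sketched as follows. The two-factor Ornstein-Uhlenbeck variogram form and parameter names are our assumptions; in practice, the two rates of mean-reversion would be obtained by fitting `two_scale_model` to the empirical variogram with nonlinear weighted least squares (e.g. `scipy.optimize.curve_fit`).

```python
import numpy as np

def variogram(x, max_lag):
    """Empirical structure function V(tau) = E[(x_{t+tau} - x_t)^2]."""
    x = np.asarray(x, dtype=float)
    lags = np.arange(1, max_lag + 1)
    v = np.array([np.mean((x[lag:] - x[:-lag]) ** 2) for lag in lags])
    return lags, v

def two_scale_model(lag, var_fast, rate_fast, var_slow, rate_slow):
    """Variogram of the sum of two independent OU volatility factors:
    each factor contributes 2*sigma^2*(1 - exp(-alpha*tau))."""
    return (2 * var_fast * (1 - np.exp(-rate_fast * lag))
            + 2 * var_slow * (1 - np.exp(-rate_slow * lag)))
```

For white noise the variogram is flat at twice the variance, while a fast and a slow mean-reversion rate show up as two distinct "shoulders" in the empirical curve.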

Abstract:

Flows of complex fluids need to be understood at both macroscopic and molecular scales, because it is the macroscopic response that controls the fluid behavior, but the molecular scale that ultimately gives rise to rheological and solid-state properties. Here the flow field of an entangled polymer melt through an extended contraction, typical of many polymer processes, is imaged optically and by small-angle neutron scattering. The dual-probe technique samples both the macroscopic stress field in the flow and the microscopic configuration of the polymer molecules at selected points. The results are compared with a recent tube model molecular theory of entangled melt flow that is able to calculate both the stress and the single-chain structure factor from first principles. The combined action of the three fundamental entangled processes of reptation, contour length fluctuation, and convective constraint release is essential to account quantitatively for the rich rheological behavior. The multiscale approach unearths a new feature: Orientation at the length scale of the entire chain decays considerably more slowly than at the smaller entanglement length.

Abstract:

The Wet Tropics World Heritage Area in Far North Queensland, Australia consists predominantly of tropical rainforest and wet sclerophyll forest in areas of variable relief. Previous maps of vegetation communities in the area were produced by a labor-intensive combination of field survey and air-photo interpretation. Thus, the aim of this work was to develop a new vegetation mapping method based on imaging radar that incorporates topographical corrections, could be repeated frequently, and would reduce the need for detailed field assessments and associated costs. The method employed a topographic correction and mapping procedure developed to enable vegetation structural classes to be mapped from satellite imaging radar. Eight JERS-1 scenes covering the Wet Tropics area for 1996 were acquired from NASDA under the auspices of the Global Rainforest Mapping Project. The JERS scenes were geometrically corrected for topographic distortion using an 80 m DEM and a combination of polynomial warping and radar viewing-geometry modeling. An image mosaic was created to cover the Wet Tropics region, and a new image-smoothing technique was applied to the JERS texture bands and DEM before a maximum likelihood classification was applied to identify major land-cover and vegetation communities. Despite these efforts, dominant vegetation community classes could only be classified to low levels of accuracy (57.5 percent), which was partly explained by the significantly larger pixel size of the DEM in comparison to the JERS image (12.5 m). In addition, the spatial and floristic detail contained in the classes of the original validation maps was much finer than the JERS classification product was able to distinguish. In comparison to field and aerial-photo-based approaches for mapping the vegetation of the Wet Tropics, appropriately corrected SAR data provide a more regional-scale, all-weather mapping technique for broader vegetation classes.
Further work is required to establish an appropriate combination of imaging radar with elevation data and other environmental surrogates to accurately map vegetation communities across the entire Wet Tropics.
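The maximum likelihood classification step mentioned above can be sketched as a per-class multivariate Gaussian classifier over the stacked texture bands. This is a generic illustration of the technique, not the study's software; the class name and regularization constant are assumptions.

```python
import numpy as np

class GaussianMLC:
    """Per-class multivariate Gaussian maximum-likelihood classifier,
    of the kind commonly applied to smoothed radar texture bands."""

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.params_ = []
        for c in self.classes_:
            Xc = X[y == c]
            mu = Xc.mean(axis=0)
            # small diagonal term guards against singular covariances
            cov = np.cov(Xc, rowvar=False) + 1e-6 * np.eye(X.shape[1])
            self.params_.append((mu, np.linalg.inv(cov),
                                 np.log(np.linalg.det(cov))))
        return self

    def predict(self, X):
        scores = []
        for mu, icov, logdet in self.params_:
            diff = X - mu
            # Gaussian log-likelihood up to an additive constant
            mahal = np.einsum('ij,jk,ik->i', diff, icov, diff)
            scores.append(-0.5 * (mahal + logdet))
        return self.classes_[np.argmax(scores, axis=0)]
```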

Abstract:

Summarizing topological relations is fundamental to many spatial applications, including spatial query optimization. In this article, we present several novel techniques to effectively construct cell-density-based spatial histograms for range (window) summarizations restricted to the four most important level-two topological relations: contains, contained, overlap, and disjoint. We first present a novel framework to construct a multiscale Euler histogram in 2D space with the guarantee of exact summarization results for aligned windows in constant time. To minimize the storage space of such a multiscale Euler histogram, an approximate algorithm with approximation ratio 19/12 is presented, while the general problem is shown to be NP-hard. To conform to a limited storage space, where a multiscale histogram may be allowed only k Euler histograms, an effective algorithm is presented to construct multiscale histograms that achieve high accuracy in approximately summarizing aligned windows. Then, we present a new approximate algorithm for querying an Euler histogram that cannot guarantee exact answers; it runs in constant time. We also investigate the problem of nonaligned windows and the problem of effectively partitioning the data space to support nonaligned window queries. Finally, we extend our techniques to 3D space. Our extensive experiments against both synthetic and real-world datasets demonstrate that the approximate multiscale histogram techniques may improve the accuracy of existing techniques by several orders of magnitude while retaining cost efficiency, and that the exact multiscale histogram technique requires storage space only linearly proportional to the number of cells for many popular real datasets.
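The core idea of an Euler histogram, exact summarization of aligned windows, can be sketched as follows. Counts are kept for cells, interior edges, and interior vertices covered by each object's footprint; since each object's footprint clipped to an aligned window is a connected rectangle with Euler characteristic 1, the alternating sum counts each intersecting object exactly once. This is a minimal single-scale sketch (the class name is ours), not the paper's multiscale construction.

```python
import numpy as np

class EulerHistogram2D:
    """Single-scale Euler histogram over an n x m grid of cells."""

    def __init__(self, n, m):
        self.F2 = np.zeros((n, m))          # cells
        self.F1v = np.zeros((n - 1, m))     # edges between vertical neighbors
        self.F1h = np.zeros((n, m - 1))     # edges between horizontal neighbors
        self.F0 = np.zeros((n - 1, m - 1))  # interior vertices

    def insert(self, i1, i2, j1, j2):
        """Object covering cell rows i1..i2 and columns j1..j2 (inclusive)."""
        self.F2[i1:i2 + 1, j1:j2 + 1] += 1
        self.F1v[i1:i2, j1:j2 + 1] += 1
        self.F1h[i1:i2 + 1, j1:j2] += 1
        self.F0[i1:i2, j1:j2] += 1

    def query(self, i1, i2, j1, j2):
        """Number of distinct objects intersecting the aligned window:
        V - E + F = 1 for each object's clipped footprint."""
        f2 = self.F2[i1:i2 + 1, j1:j2 + 1].sum()
        f1 = (self.F1v[i1:i2, j1:j2 + 1].sum()
              + self.F1h[i1:i2 + 1, j1:j2].sum())
        f0 = self.F0[i1:i2, j1:j2].sum()
        return int(round(f0 - f1 + f2))
```

A multiscale histogram, as in the paper, would maintain several such grids at different resolutions and pick the coarsest one that still aligns with the query window.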

Abstract:

Ecological and genetic studies of marine turtles generally support the hypothesis of natal homing, but leave open the question of the geographical scale of genetic exchange and the capacity of turtles to shift breeding sites. Here we combine analyses of mitochondrial DNA (mtDNA) variation and recapture data to assess the geographical scale of individual breeding populations and the distribution of such populations through Australasia. We conducted multiscale assessments of mtDNA variation among 714 samples from 27 green turtle rookeries and of adult female dispersal among nesting sites in eastern Australia. Many of these rookeries are on shelves that were flooded by rising sea levels less than 10 000 years (c. 450 generations) ago. Analyses of sequence variation within the mtDNA control region revealed 25 haplotypes, and their frequency distributions indicated 17 genetically distinct breeding stocks (Management Units), consisting either of individual rookeries or of groups of rookeries that are generally separated by more than 500 km. The population structure inferred from mtDNA was consistent with the scale of movements observed in long-term mark-recapture studies of east Australian rookeries. Phylogenetic analysis of the haplotypes revealed five clades with significant partitioning of sequence diversity (Phi = 68.4) between Pacific Ocean and Southeast Asian/Indian Ocean rookeries. Isolation by distance was indicated for rookeries separated by up to 2000 km but explained only 12% of the genetic structure. The emerging general picture is one of dynamic population structure influenced by the capacity of females to relocate among proximal breeding sites, although this may be conditional on large population sizes such as existed historically across this region.

Abstract:

We are concerned with the problem of image segmentation, in which each pixel is assigned to one of a predefined finite number of classes. In Bayesian image analysis, this requires fusing local predictions for the class labels with a prior model of segmentations. Markov Random Fields (MRFs) have been used to incorporate some of this prior knowledge, but this is not entirely satisfactory, as inference in MRFs is NP-hard. The multiscale quadtree model of Bouman and Shapiro (1994) is an attractive alternative, as it is a tree-structured belief network in which inference can be carried out in linear time (Pearl 1988). It is a hierarchical model in which the bottom-level nodes are pixels and higher levels correspond to downsampled versions of the image. The conditional-probability tables (CPTs) in the belief network encode the knowledge of how the levels interact. In this paper we discuss two methods of learning the CPTs given training data, using (a) maximum likelihood and the EM algorithm and (b) conditional maximum likelihood (CML). Segmentations obtained using networks trained by CML show a statistically significant improvement in performance on synthetic images. We also demonstrate the methods on a real-world outdoor-scene segmentation task.
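Exact inference in a tree-structured belief network of this kind can be sketched for a one-level quadtree: one parent node with several leaf children sharing a CPT, with local evidence at the leaves. This is a minimal upward/downward message-passing illustration in the style of Pearl (1988), not the paper's full multilevel model; function and variable names are ours.

```python
import numpy as np

def quadtree_posteriors(prior, cpt, leaf_likelihoods):
    """Exact posteriors in a one-level tree-structured belief network.

    prior:            (K,) distribution over the parent label
    cpt:              (K, K) P(child = j | parent = i), shared by children
    leaf_likelihoods: (N, K) local evidence p(obs_n | label_n)
    """
    like = np.asarray(leaf_likelihoods, dtype=float)
    # upward pass: each leaf sends lambda_n(i) = sum_j cpt[i, j] * like[n, j]
    lam = like @ cpt.T                       # shape (N, K), indexed by parent
    # parent posterior: prior times the product of all upward messages
    root_post = prior * lam.prod(axis=0)
    root_post /= root_post.sum()
    # downward pass: for each leaf, combine the parent belief that
    # excludes that leaf's own message with the CPT and local evidence
    post = np.empty_like(like)
    for n in range(len(like)):
        pi = prior * np.delete(lam, n, axis=0).prod(axis=0)
        joint = (pi @ cpt) * like[n]         # marginalize over parent states
        post[n] = joint / joint.sum()
    return root_post, post
```

With a near-diagonal CPT, an ambiguous leaf inherits the consensus of its siblings through the parent, which is exactly the smoothing the quadtree prior provides.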

Abstract:

This thesis presents an investigation of synchronisation and causality, motivated by problems in computational neuroscience. The thesis addresses both theoretical and practical signal-processing issues regarding the estimation of interdependence from a set of multivariate data generated by a complex underlying dynamical system. This topic is driven by a series of problems in neuroscience, which represent the principal background motive behind the material in this work. The underlying system is the human brain, and the generative process of the data is based on modern electromagnetic neuroimaging methods. In this thesis, the brain's underlying functional mechanisms are described using the recent mathematical formalism of dynamical systems on complex networks. This is justified principally on the grounds of the complex hierarchical and multiscale nature of the brain, and it offers new methods of analysis with which to model its emergent phenomena. A fundamental approach to studying neural activity is to investigate the connectivity pattern developed by the brain's complex network. Three types of connectivity are important to study: 1) anatomical connectivity, referring to the physical links forming the topology of the brain network; 2) effective connectivity, concerned with the way the neural elements communicate with each other using the brain's anatomical structure, through phenomena of synchronisation and information transfer; and 3) functional connectivity, an epistemic concept which refers to the interdependence between data measured from the brain network. The main contribution of this thesis is to present, apply, and discuss novel algorithms for functional connectivity, designed to extract different specific aspects of the interaction between the underlying generators of the data. Firstly, a univariate statistic is developed to allow indirect assessment of synchronisation in the local network from a single time series.
This approach is useful in inferring the coupling in a local cortical area as observed by a single measurement electrode. Secondly, different existing methods of phase synchronisation are considered from the perspective of experimental data analysis and the inference of coupling from observed data. These methods are designed to address the estimation of medium- to long-range connectivity, and their differences are particularly relevant in the context of volume conduction, which is known to produce spurious detections of connectivity. Finally, an asymmetric temporal metric is introduced in order to detect the direction of the coupling between different regions of the brain. The method developed in this thesis is based on a machine-learning extension of the well-known concept of Granger causality. The discussion is developed alongside examples of synthetic and real experimental data. The synthetic data are simulations of complex dynamical systems intended to mimic the behaviour of simple cortical neural assemblies; they are helpful for testing the techniques developed in this thesis. The real datasets illustrate the problem of brain connectivity in the case of important neurological disorders such as epilepsy and Parkinson's disease. The methods of functional connectivity in this thesis are applied to intracranial EEG recordings in order to extract features which characterize the underlying spatiotemporal dynamics before, during, and after an epileptic seizure, and to predict seizure location and onset prior to conventional electrographic signs. The methodology is also applied to a MEG dataset containing healthy, Parkinson's, and dementia subjects, with the aim of distinguishing pathological from physiological patterns of connectivity.
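The classical (linear) Granger causality that the thesis extends can be sketched as a comparison of residual variances between two autoregressive models. This is the textbook formulation, not the thesis's machine-learning variant; the AR order and the log-variance-ratio statistic are conventional choices.

```python
import numpy as np

def granger_causality(x, y, p=2):
    """Log-variance-ratio Granger causality from x to y, AR order p.

    Compares the residual variance of an AR(p) model of y against a
    model that also includes p lags of x; a clearly positive value
    indicates that past x helps predict y.
    """
    n = len(y)
    Y = y[p:]
    lags_y = np.column_stack([y[p - k:n - k] for k in range(1, p + 1)])
    lags_x = np.column_stack([x[p - k:n - k] for k in range(1, p + 1)])
    ones = np.ones((n - p, 1))

    def resid_var(X):
        beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
        return (Y - X @ beta).var()

    v_restricted = resid_var(np.hstack([ones, lags_y]))
    v_full = resid_var(np.hstack([ones, lags_y, lags_x]))
    return np.log(v_restricted / v_full)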

Abstract:

To make vision possible, the visual nervous system must represent the most informative features in the light pattern captured by the eye. Here we use Gaussian scale-space theory to derive a multiscale model for edge analysis and we test it in perceptual experiments. At all scales there are two stages of spatial filtering. An odd-symmetric, Gaussian first derivative filter provides the input to a Gaussian second derivative filter. Crucially, the output at each stage is half-wave rectified before feeding forward to the next. This creates nonlinear channels selectively responsive to one edge polarity while suppressing spurious or "phantom" edges. The two stages have properties analogous to simple and complex cells in the visual cortex. Edges are found as peaks in a scale-space response map that is the output of the second stage. The position and scale of the peak response identify the location and blur of the edge. The model predicts remarkably accurately our results on human perception of edge location and blur for a wide range of luminance profiles, including the surprising finding that blurred edges look sharper when their length is made shorter. The model enhances our understanding of early vision by integrating computational, physiological, and psychophysical approaches. © ARVO.
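The two-stage filtering described above can be sketched for a 1-D luminance profile: a Gaussian first-derivative filter, half-wave rectification, then a Gaussian second-derivative filter, with edges read off as peaks of the scale-space response. This is a schematic sketch of the model's structure (the sign convention and the use of equal sigmas at both stages are our simplifying assumptions).

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

def edge_scale_space(profile, scales):
    """Two-stage rectified Gaussian-derivative response map.

    Stage 1: Gaussian first derivative, half-wave rectified
             (keeps one edge polarity, suppresses 'phantom' edges).
    Stage 2: Gaussian second derivative of the rectified output.
    Returns one response row per scale; edges are peaks in this map.
    """
    profile = np.asarray(profile, dtype=float)
    rows = []
    for s in scales:
        d1 = gaussian_filter1d(profile, sigma=s, order=1)
        d1 = np.maximum(d1, 0.0)                    # half-wave rectification
        d2 = -gaussian_filter1d(d1, sigma=s, order=2)
        rows.append(d2)
    return np.array(rows)
```

For a blurred rising edge, the rectified first-derivative response is a single bump, and the (negated) second derivative peaks at the bump's center, i.e. at the edge location, across all scales.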

Abstract:

We describe a template model for perception of edge blur and identify a crucial early nonlinearity in this process. The main principle is to spatially filter the edge image to produce a 'signature', and then find which of a set of templates best fits that signature. Psychophysical blur-matching data strongly support the use of a second-derivative signature, coupled to Gaussian first-derivative templates. The spatial scale of the best-fitting template signals the edge blur. This model predicts blur-matching data accurately for a wide variety of Gaussian and non-Gaussian edges, but it suffers a bias when edges of opposite sign come close together in sine-wave gratings and other periodic images. This anomaly suggests a second general principle: the region of an image that 'belongs' to a given edge should have a consistent sign or direction of luminance gradient. Segmentation of the gradient profile into regions of common sign is achieved by implementing the second-derivative 'signature' operator as two first-derivative operators separated by a half-wave rectifier. This multiscale system of nonlinear filters predicts perceived blur accurately for periodic and aperiodic waveforms. We also outline its extension to 2-D images and infer the 2-D shape of the receptive fields.
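The template-matching principle above can be sketched in 1-D: compute the second-derivative signature of a luminance profile, correlate it against Gaussian first-derivative templates over a range of scales, and report the best-fitting template's scale as the blur estimate. This is a minimal illustration of the principle (discrete derivatives via `np.gradient`, normalized correlation as the fit measure), not the paper's exact fitting procedure.

```python
import numpy as np

def estimate_blur(profile, candidate_scales):
    """Estimate edge blur by template matching.

    The 'signature' is the second derivative of the luminance profile;
    it is compared, by normalized correlation, against Gaussian
    first-derivative templates, and the scale of the best-fitting
    template is returned as the blur estimate.
    """
    profile = np.asarray(profile, dtype=float)
    x = np.arange(len(profile)) - (len(profile) - 1) / 2.0
    signature = np.gradient(np.gradient(profile))
    best_scale, best_r = None, -np.inf
    for s in candidate_scales:
        # derivative of a Gaussian of scale s (constants cancel below)
        template = -x / s**3 * np.exp(-x**2 / (2.0 * s**2))
        r = np.dot(signature, template) / (
            np.linalg.norm(signature) * np.linalg.norm(template))
        if r > best_r:
            best_scale, best_r = s, r
    return best_scale
```

For a step blurred by a Gaussian of scale sigma, the second derivative of the edge is exactly a Gaussian first derivative of the same scale, so the correlation is maximized when the template scale equals sigma.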

Abstract:

Understanding a complex network's structure holds the key to understanding its function. The physics community has contributed a multitude of methods and analyses to this cross-disciplinary endeavor. Structural features exist on both the microscopic level, resulting from differences between single-node properties, and the mesoscopic level, resulting from properties shared by groups of nodes. Disentangling the determinants of network structure on these different scales has remained a major, and so far unsolved, challenge. Here we show how multiscale generative probabilistic exponential random graph models, combined with efficient distributed message-passing inference techniques, can be used to achieve this separation of scales, leading to improved detection accuracy of latent classes, as demonstrated on benchmark problems. This approach sheds new light on the statistical significance of motif distributions in neural networks and improves link-prediction accuracy, as exemplified for gene-disease associations in the highly consequential Online Mendelian Inheritance in Man database. © 2011 Reichardt et al.

Abstract:

This study considers the application of image analysis in petrography and investigates the possibilities for advancing existing techniques by introducing feature extraction and analysis capabilities of a higher level than those currently employed. The aim is to construct relevant, useful descriptions of crystal form and inter-crystal relations in polycrystalline igneous rock sections. Such descriptions cannot be derived until the 'ownership' of boundaries between adjacent crystals has been established: this is the fundamental problem of crystal boundary assignment. An analysis of this problem establishes key image features which reveal boundary ownership; a set of explicit analysis rules is presented. A petrographic image analysis scheme based on these principles is outlined and the implementation of key components of the scheme considered. An algorithm for the extraction and symbolic representation of image structural information is developed. A new multiscale analysis algorithm which produces a hierarchical description of the linear and near-linear structure on a contour is presented in detail. Novel techniques for symmetry analysis are developed. The analyses considered contribute both to the solution of the boundary assignment problem and to the construction of geologically useful descriptions of crystal form. The analysis scheme which is developed employs grouping principles such as collinearity, parallelism, symmetry and continuity, so providing a link between this study and more general work in perceptual grouping and intermediate-level computer vision. Consequently, the techniques developed in this study may be expected to find wider application beyond the petrographic domain.
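A description of the near-linear structure on a contour, of the kind mentioned above, can be sketched with a recursive split at the point of maximum deviation from the chord. This is a Douglas-Peucker-style analogy, not the thesis's actual algorithm; applying it with a sequence of decreasing tolerances would yield a hierarchical (multiscale) description.

```python
import numpy as np

def split_linear(points, tol):
    """Split a contour into maximal near-linear runs.

    Returns (start, end) index pairs of runs whose points all lie
    within `tol` of the chord joining the run's endpoints; the split
    point is the point of maximum perpendicular deviation.
    """
    pts = np.asarray(points, dtype=float)

    def deviation(i, j):
        chord = pts[j] - pts[i]
        normal = np.array([-chord[1], chord[0]])
        normal = normal / (np.linalg.norm(normal) + 1e-12)
        return np.abs((pts[i:j + 1] - pts[i]) @ normal)

    def rec(i, j):
        d = deviation(i, j)
        if d.max() <= tol or j - i < 2:
            return [(i, j)]
        k = i + int(d.argmax())   # strictly interior: endpoints deviate by 0
        return rec(i, k) + rec(k, j)

    return rec(0, len(pts) - 1)
```

On an L-shaped crystal boundary the split lands on the corner, so the two straight faces are reported as separate near-linear segments, the kind of primitive from which face-by-face symmetry and parallelism analyses can proceed.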