924 results for probabilistic Hough transform
Abstract:
Objective - For patients with medication-refractory medial temporal lobe epilepsy (MTLE), surgery offers the hope of a cure. However, up to 30% of patients with MTLE continue to experience disabling seizures after surgery. The reasons why some patients do not achieve seizure freedom are poorly understood. A promising theory suggests that epileptogenic networks are broadly distributed in surgically refractory MTLE, involving regions beyond the medial temporal lobe. In this retrospective study, we aimed to investigate the distribution of epileptogenic networks in MTLE using Bayesian distributed EEG source analysis from preoperative ictal onset recordings. This analysis has the advantage of generating maps of source probability, which can be subjected to voxel-based statistical analyses. Methods - We compared 10 patients who achieved post-surgical seizure freedom with 10 patients who continued experiencing seizures after surgery. Voxel-based Wilcoxon tests were employed with correction for multiple comparisons. Results - We observed that ictal EEG source intensities were significantly more likely to occur in lateral temporal and posterior medial temporal regions in patients with continued seizures post-surgery. Conclusions - Our findings support the theory of a broader spatial distribution of epileptogenic networks at seizure onset in patients with surgically refractory MTLE.
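The voxel-based statistical step described in this abstract can be sketched as follows. This is a minimal, hypothetical illustration only (toy intensities, a Wilcoxon rank-sum test via the normal approximation, and Bonferroni correction as one possible multiple-comparison scheme), not the authors' actual pipeline.

```python
import math

def rank_sum_p(x, y):
    """Two-sided Wilcoxon rank-sum (Mann-Whitney U) p-value, normal approximation."""
    values = list(x) + list(y)
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):                      # assign average ranks over ties
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        for k in range(i, j + 1):
            ranks[order[k]] = (i + j) / 2 + 1
        i = j + 1
    n1, n2 = len(x), len(y)
    u = sum(ranks[:n1]) - n1 * (n1 + 1) / 2
    mu = n1 * n2 / 2
    sd = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    return math.erfc(abs((u - mu) / sd) / math.sqrt(2))

# Toy "voxels": source intensities for 10 non-seizure-free vs 10 seizure-free patients.
voxels = {
    "lateral_temporal": (list(range(11, 21)), list(range(1, 11))),   # shifted group
    "frontal":          ([1, 3, 5, 7, 9, 11, 13, 15, 17, 19],
                         [2, 4, 6, 8, 10, 12, 14, 16, 18, 20]),      # no shift
}
m = len(voxels)                                # Bonferroni correction factor
significant = {name: min(1.0, m * rank_sum_p(a, b)) < 0.05
               for name, (a, b) in voxels.items()}
print(significant)
```

With these toy numbers only the shifted voxel survives correction, mirroring the kind of voxel-wise contrast the study describes.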
Abstract:
Non-Hodgkin lymphomas are of many distinct types, and different classification systems make it difficult to diagnose them correctly. Many of these systems classify lymphomas based only on their appearance under a microscope. In 2008 the World Health Organisation (WHO) introduced the most recent system, which also considers the chromosomal features of the lymphoma cells and the presence of certain proteins on their surface. The WHO system is the one we apply in this work. Here we present an automatic method to classify histological images of three types of non-Hodgkin lymphoma. Our method is based on the Stationary Wavelet Transform (SWT) and consists of three steps: 1) extracting sub-bands from the histological image through the SWT, 2) applying Analysis of Variance (ANOVA) to remove noise and select the most relevant information, 3) classifying the result with the Support Vector Machine (SVM) algorithm. The Linear, RBF and Polynomial kernels were evaluated with our method on 210 lymphoma images from the National Institute on Aging. We concluded that the following combination led to the most relevant results: detail sub-band, ANOVA and SVM with the Linear and RBF kernels.
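The first two steps of this pipeline can be sketched on toy 1-D data. This is a hedged illustration: a one-level undecimated Haar transform stands in for the full SWT, a two-class ANOVA F-score ranks the detail coefficients, and the SVM step is deliberately omitted; the signals and parameters are invented.

```python
import numpy as np

def swt_detail(x):
    """One-level stationary (undecimated) Haar transform: detail sub-band.
    Circular differencing with no downsampling -- a minimal stand-in for SWT."""
    return (x - np.roll(x, 1, axis=-1)) / np.sqrt(2)

def anova_f(class_a, class_b):
    """Per-feature one-way ANOVA F statistic for two classes (k = 2)."""
    n1, n2 = len(class_a), len(class_b)
    m1, m2 = class_a.mean(axis=0), class_b.mean(axis=0)
    grand = (n1 * m1 + n2 * m2) / (n1 + n2)
    between = n1 * (m1 - grand) ** 2 + n2 * (m2 - grand) ** 2        # df = 1
    within = (((class_a - m1) ** 2).sum(axis=0)
              + ((class_b - m2) ** 2).sum(axis=0)) / (n1 + n2 - 2)
    return between / within

t = np.arange(16)
# Toy "images" (1-D signals): class A has an edge at t = 8, class B does not;
# per-sample slopes create some within-class variability.
slopes_a = np.array([0.01, 0.02, 0.03, 0.04])
slopes_b = np.array([0.015, 0.025, 0.035, 0.045])
A = np.stack([s * t + (t >= 8) for s in slopes_a])
B = np.stack([s * t for s in slopes_b])

# Step 1: extract the detail sub-band; Step 2: rank coefficients by ANOVA F.
F = anova_f(swt_detail(A), swt_detail(B))
# Step 3 (omitted here): feed the top-ranked coefficients to an SVM classifier.
print(int(np.argmax(F)))
```

The most discriminative detail coefficient lands exactly on the edge that separates the two classes, which is the effect the ANOVA selection step exploits.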
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
This paper presents two diagnostic methods for the online detection of broken bars in induction motors with squirrel-cage rotors. The wavelet representation of a function is a relatively recent technique, and the wavelet transform can be viewed as an improvement over the Fourier transform. The Fourier transform is a powerful tool for analyzing the components of a stationary signal, but it fails for non-stationary signals, whereas the wavelet transform allows the components of a non-stationary signal to be analyzed. In this paper, our main goal is to show the advantages of the wavelet transform over the Fourier transform in rotor failure diagnosis of induction motors.
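The Fourier-versus-wavelet point above can be demonstrated on a toy non-stationary signal whose frequency changes halfway through the record. All sampling values below are invented for illustration: the FFT magnitude reveals both frequencies but not when each occurs, while a one-level Haar detail sub-band localizes the change in time.

```python
import numpy as np

fs = 128                                   # assumed sampling rate (Hz)
t = np.arange(fs) / fs                     # one second of signal
half = fs // 2
# 4 Hz for the first half-second, 30 Hz for the second half-second.
x = np.where(t < 0.5, np.sin(2 * np.pi * 4 * t), np.sin(2 * np.pi * 30 * t))

# Fourier view: both the 4 Hz and 30 Hz peaks appear, with no hint of *when*
# each frequency is present.
spectrum = np.abs(np.fft.rfft(x))
peaks = set(int(k) for k in np.argsort(spectrum)[-2:])   # bins are in Hz here

# Wavelet view (one-level Haar detail): energy is localized in time, so the
# high-frequency half of the record is clearly identified.
detail = (x - np.roll(x, 1)) / np.sqrt(2)
e_first = (detail[1:half] ** 2).sum()      # skip the circular-wrap sample
e_second = (detail[half:] ** 2).sum()
print(peaks, e_first < e_second)
```

The spectrum alone would look the same if the two tones overlapped in time; only the wavelet (time-localized) view distinguishes the two cases, which is the core of the rotor-fault argument.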
Abstract:
Research on image processing has shown that combining segmentation methods may lead to a solid approach for extracting semantic information from different sorts of images. Within this context, the Normalized Cut (NCut) is usually used as a final partitioning tool for graphs modeled by some chosen method. This work explores the Watershed Transform as a modeling tool, using different criteria of the hierarchical Watershed to convert an image into an adjacency graph. The Watershed is combined with an unsupervised distance-learning step that redistributes the graph weights and redefines the similarity matrix before the final segmentation step using NCut. Adopting the Berkeley Segmentation Data Set and Benchmark as a reference, our goal is to compare the results obtained with this method against previous work to validate its performance.
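The final NCut partitioning step can be illustrated numerically on a tiny hand-built similarity matrix (standing in for a graph of watershed regions). This is a hedged sketch of the standard spectral relaxation of Normalized Cut, not the watershed modeling or distance-learning steps; the six nodes and their weights are hypothetical.

```python
import numpy as np

# Hypothetical adjacency graph: two groups of "watershed regions" {0,1,2} and
# {3,4,5}, strongly similar within each group, weakly linked between groups.
W = np.zeros((6, 6))
for group in ([0, 1, 2], [3, 4, 5]):
    for i in group:
        for j in group:
            if i != j:
                W[i, j] = 1.0
W[2, 3] = W[3, 2] = 0.05                   # weak inter-group similarity

# Normalized Cut (spectral relaxation): take the second-smallest eigenvector
# of the symmetric normalized Laplacian, map it back through D^{-1/2}, and
# split the nodes by sign.
d = W.sum(axis=1)
D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
L_sym = np.eye(6) - D_inv_sqrt @ W @ D_inv_sqrt
eigvals, eigvecs = np.linalg.eigh(L_sym)   # ascending eigenvalues
indicator = D_inv_sqrt @ eigvecs[:, 1]     # Fiedler-like relaxed indicator
partition = frozenset(int(i) for i in np.flatnonzero(indicator > 0))
print(sorted(partition))
```

The sign split recovers the two weakly connected groups, which is exactly the role NCut plays after the watershed graph has been built and reweighted.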
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Vortex-induced motion (VIM) is a highly nonlinear dynamic phenomenon. Usual spectral analysis methods, based on the Fourier transform, rely on the hypotheses of linear and stationary dynamics. The Hilbert-Huang transform (HHT) is a method for treating nonstationary signals that emerge from nonlinear systems. The development of an analysis methodology based on the HHT to study the VIM of a monocolumn production, storage, and offloading system is presented. The purpose of the present methodology is to improve the statistical analysis of VIM. The results were comparable to those obtained from a traditional analysis (mean of the 10% highest peaks), particularly for the motions in the transverse direction, whereas for the motions in the in-line direction the two analyses differed by around 25%. The results from the HHT analysis are more reliable than the traditional ones, owing to the larger number of points used to calculate the statistical characteristics. These results may be used to design risers and mooring lines, as well as to obtain VIM parameters to calibrate numerical predictions. [DOI: 10.1115/1.4003493]
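The HHT couples empirical mode decomposition (EMD) with Hilbert spectral analysis. The sketch below shows only the Hilbert step (analytic signal via the FFT, giving instantaneous amplitude and frequency) on a toy monochromatic "motion" record; the EMD stage is omitted and every numeric value is an assumption for illustration.

```python
import numpy as np

fs, f0 = 256, 8.0                          # assumed sample rate and frequency (Hz)
t = np.arange(fs) / fs                     # one second, integer number of cycles
x = np.cos(2 * np.pi * f0 * t)             # stand-in for one intrinsic mode function

# Analytic signal via the FFT: zero the negative frequencies, double the
# positive ones, keep DC and Nyquist as they are.
X = np.fft.fft(x)
h = np.zeros(fs)
h[0] = 1.0
h[1:fs // 2] = 2.0
h[fs // 2] = 1.0
z = np.fft.ifft(X * h)                     # x + i * HilbertTransform(x)

envelope = np.abs(z)                       # instantaneous amplitude
phase = np.unwrap(np.angle(z))
inst_freq = np.diff(phase) * fs / (2 * np.pi)   # instantaneous frequency (Hz)
print(round(envelope.mean(), 3), round(inst_freq.mean(), 3))
```

For this bin-aligned tone the envelope is constant at 1 and the instantaneous frequency is constant at 8 Hz; on a real VIM record each EMD mode would yield time-varying amplitude and frequency, which is what feeds the improved statistics described above.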
Abstract:
In this paper, a definition of the Hilbert transform operating on Colombeau's tempered generalized functions is given. Results similar to some theorems that hold in the classical theory, or in certain subspaces of Schwartz distributions, have been obtained in this framework.
Abstract:
Patterns of species interactions affect the dynamics of food webs. An important component of species interactions that is rarely considered with respect to food webs is the strengths of interactions, which may affect both structure and dynamics. In natural systems, these strengths are variable, and can be quantified as probability distributions. We examined how variation in strengths of interactions can be described hierarchically, and how this variation impacts the structure of species interactions in predator-prey networks, both of which are important components of ecological food webs. The stable isotope ratios of predator and prey species may be particularly useful for quantifying this variability, and we show how these data can be used to build probabilistic predator-prey networks. Moreover, the distribution of variation in strengths among interactions can be estimated from a limited number of observations. This distribution informs network structure, especially the key role of dietary specialization, which may be useful for predicting structural properties in systems that are difficult to observe. Finally, using three mammalian predator-prey networks (two African and one Canadian) quantified from stable isotope data, we show that exclusion of link-strength variability results in biased estimates of nestedness and modularity within food webs, whereas the inclusion of body size constraints only marginally increases the predictive accuracy of the isotope-based network. We find that modularity is the consequence of strong link-strengths in both African systems, while nestedness is not significantly present in any of the three predator-prey networks.
Abstract:
We present a method of generation of exact and explicit forms of one-sided, heavy-tailed Levy stable probability distributions g_alpha(x), 0 <= x < infinity, 0 < alpha < 1. We demonstrate that the knowledge of one such distribution g_alpha(x) suffices to obtain exactly g_{alpha^p}(x), p = 2, 3, .... Similarly, from known g_alpha(x) and g_beta(x), 0 < alpha, beta < 1, we obtain g_{alpha beta}(x). The method is based on the construction of an integral operator, called the Levy transform, which implements the above operations. For rational alpha = l/k with l < k, we reproduce in this manner many of the recently obtained exact results for g_{l/k}(x). This approach can also be recast as an application of the Efros theorem for generalized Laplace convolutions. It relies solely on efficient definite integration. (C) 2012 American Institute of Physics. [http://dx.doi.org/10.1063/1.4709443]
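The composition property quoted in this abstract has a standard Laplace-transform reading, stated here under the conventional normalization (an assumption, since the abstract does not fix one):

```latex
% One-sided stable density characterized by its Laplace transform:
\int_0^{\infty} e^{-sx}\, g_{\alpha}(x)\, dx = e^{-s^{\alpha}},
\qquad 0 < \alpha < 1 .

% The composition rule corresponds to the subordination identity
g_{\alpha\beta}(x)
 = \int_0^{\infty} t^{-1/\alpha}\,
   g_{\alpha}\!\left(x\, t^{-1/\alpha}\right) g_{\beta}(t)\, dt ,

% whose Laplace transform is
\int_0^{\infty} e^{-t s^{\alpha}}\, g_{\beta}(t)\, dt
 = e^{-\left(s^{\alpha}\right)^{\beta}} = e^{-s^{\alpha\beta}} .

% Explicit example for alpha = 1/2:
g_{1/2}(x) = \frac{1}{2\sqrt{\pi}}\, x^{-3/2}\, e^{-1/(4x)} ,
\qquad \int_0^{\infty} e^{-sx}\, g_{1/2}(x)\, dx = e^{-\sqrt{s}} .
```

Iterating the subordination identity with beta = alpha gives the indices alpha^p, p = 2, 3, ..., which matches the claim that one known g_alpha generates the whole family.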
Abstract:
This paper addresses the numerical solution of random crack propagation problems by coupling the boundary element method (BEM) with reliability algorithms. The crack propagation phenomenon is efficiently modelled using the BEM, due to its mesh-reduction features. The BEM model is based on the dual BEM formulation, in which singular and hyper-singular integral equations are adopted to construct the system of algebraic equations. Two reliability algorithms are coupled with the BEM model. The first is the well-known response surface method, in which local, adaptive polynomial approximations of the mechanical response are constructed in search of the design point. Different experiment designs and adaptive schemes are considered. The alternative approach, direct coupling, in which the limit state function remains implicit and its gradients are calculated directly from the numerical mechanical response, is also considered. The performance of both coupling methods is compared in application to some crack propagation problems. The investigation shows that the direct coupling scheme converged for all problems studied, irrespective of the problem nonlinearity. The computational cost of direct coupling was shown to be a fraction of the cost of the response surface solutions, regardless of the experiment design or adaptive scheme considered. (C) 2012 Elsevier Ltd. All rights reserved.
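The reliability side of such a coupling can be illustrated on a closed-form toy problem, so that a reliability-index (FORM-style) result can be checked against crude Monte Carlo. This is a hedged sketch with a linear limit state g = R - S and invented Gaussian statistics, not the BEM crack model; for this linear Gaussian case the reliability index is exact.

```python
import math
import random

# Toy limit state g = R - S (resistance minus load), both Gaussian.
# The statistics below are hypothetical, not taken from the paper.
mu_r, sd_r = 10.0, 2.0
mu_s, sd_s = 6.0, 1.5

# FORM-style result: for a linear limit state with Gaussian variables,
# beta = E[g] / std[g] exactly, and pf = Phi(-beta).
beta = (mu_r - mu_s) / math.sqrt(sd_r**2 + sd_s**2)
pf_form = 0.5 * math.erfc(beta / math.sqrt(2))

# Crude Monte Carlo estimate of the same failure probability.
random.seed(0)
n = 200_000
fails = sum(random.gauss(mu_r, sd_r) - random.gauss(mu_s, sd_s) < 0
            for _ in range(n))
pf_mc = fails / n

print(f"beta={beta:.3f} pf_form={pf_form:.4f} pf_mc={pf_mc:.4f}")
```

When the limit state is implicit (as with a BEM mechanical response), the closed-form beta is unavailable and must be found iteratively from gradients of the numerical response, which is the "direct coupling" idea compared in the paper.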
Abstract:
This study aimed to evaluate the chemical interaction of collagen with some substances usually applied in dental treatments to increase the durability of adhesive restorations to dentin. Initially, the similarity between human dentin collagen and type I collagen obtained from commercial bovine membranes of the Achilles deep tendon was assessed by the Attenuated Total Reflectance technique of Fourier Transform Infrared (ATR-FTIR) spectroscopy. The effects of the application of 35% phosphoric acid, 0.1M ethylenediaminetetraacetic acid (EDTA), 2% chlorhexidine, and 6.5% proanthocyanidin solution on the microstructure of collagen and on the integrity of its triple helix were then also evaluated by ATR-FTIR. It was observed that commercial type I collagen can be used as an efficient substitute for demineralized human dentin in studies that use spectroscopic analysis. The 35% phosphoric acid significantly altered the organic content of amides, proline and hydroxyproline of type I collagen. The surface treatment with 0.1M EDTA, 2% chlorhexidine, or 6.5% proanthocyanidin did not promote deleterious structural changes to the collagen triple helix. The application of 6.5% proanthocyanidin to collagen promoted hydrogen bond formation. (c) 2012 Wiley Periodicals, Inc. J Biomed Mater Res Part B: Appl Biomater, 2012.
Abstract:
Fraud is a global problem that has demanded more attention due to the accentuated expansion of modern technology and communication. When statistical techniques are used to detect fraud, a critical factor is whether the fraud detection model is accurate enough to correctly classify a case as fraudulent or legitimate. In this context, the concept of bootstrap aggregating (bagging) arises. The basic idea is to generate multiple classifiers by obtaining the predicted values from models fitted to several replicated datasets, and then to combine them into a single predictive classification in order to improve the classification accuracy. In this paper, we present a pioneering study of the performance of discrete and continuous k-dependence probabilistic networks within the context of bagging predictors for classification. Via a large simulation study and various real datasets, we found that the probabilistic networks are a strong modeling option with high predictive capacity, and that the bagging procedure yields a substantial improvement over traditional techniques. (C) 2012 Elsevier Ltd. All rights reserved.
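The bagging mechanism described above (bootstrap replicates, one classifier per replicate, majority vote) can be sketched with a deliberately simple base learner. This is a hedged toy: a one-dimensional decision stump stands in for the paper's k-dependence probabilistic networks, and the "transaction scores" are invented.

```python
import random

def train_stump(xs, ys):
    """Best single-threshold classifier on a 1-D feature."""
    best = None
    for cut in sorted(set(xs)):
        for sign in (1, -1):
            pred = [int(sign * (x - cut) > 0) for x in xs]
            err = sum(p != y for p, y in zip(pred, ys))
            if best is None or err < best[0]:
                best = (err, cut, sign)
    _, cut, sign = best
    return lambda x: int(sign * (x - cut) > 0)

def bagging(xs, ys, n_models=25, seed=0):
    """Bootstrap-aggregated stumps combined by majority vote."""
    rng = random.Random(seed)
    models = []
    for _ in range(n_models):
        idx = [rng.randrange(len(xs)) for _ in range(len(xs))]   # bootstrap resample
        models.append(train_stump([xs[i] for i in idx], [ys[i] for i in idx]))
    return lambda x: int(sum(m(x) for m in models) > n_models / 2)

# Toy transaction scores: low -> legitimate (0), high -> fraudulent (1).
xs = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0,
      1.6, 1.7, 1.8, 1.9, 2.0, 2.1, 2.2, 2.3, 2.4, 2.5]
ys = [0] * 10 + [1] * 10
classify = bagging(xs, ys)
print(classify(0.3), classify(2.2))
```

Each bootstrap replicate yields a slightly different model; the vote aggregates them into one prediction, which is the accuracy-improving step the paper evaluates for probabilistic networks.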
Abstract:
Structural durability is an important criterion that must be evaluated for every type of structure. For reinforced concrete members, the chloride diffusion process is widely used to evaluate durability, especially when these structures are built in aggressive atmospheres. Chloride ingress triggers the corrosion of the reinforcement; therefore, by modelling this phenomenon, the corrosion process, and hence the structural durability, can be better evaluated. Corrosion begins when a threshold chloride concentration is reached at the steel bars of the reinforcement. Despite the robustness of several models proposed in the literature, deterministic approaches fail to predict the corrosion initiation time accurately, due to the inherent randomness of this process. In this regard, structural durability can be represented more realistically using probabilistic approaches. This paper addresses the analysis of the probabilistic corrosion initiation time in reinforced concrete structures exposed to chloride penetration. Chloride penetration is modelled using Fick's diffusion law, which simulates the chloride diffusion process considering time-dependent effects. The probability of failure is calculated using Monte Carlo simulation and the first-order reliability method, with a direct coupling approach. Some examples are considered in order to study these phenomena. Moreover, a simplified method is proposed to determine optimal values for the concrete cover.
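The Monte Carlo side of this analysis can be sketched with the classical error-function solution of Fick's second law. All parameter values below (surface and critical chloride contents, diffusivity, cover statistics, design life) are hypothetical placeholders for illustration, not the paper's data.

```python
import math
import random

def chloride_at_cover(cover, D, t, c_s):
    """Fick's second law, semi-infinite medium, constant surface concentration:
    C(x, t) = C_s * erfc(x / (2 * sqrt(D * t)))."""
    return c_s * math.erfc(cover / (2.0 * math.sqrt(D * t)))

def pf_corrosion(cover_mean, n=20_000, seed=0):
    """Monte Carlo probability that the critical chloride content is reached
    at the reinforcement depth within the design life."""
    rng = random.Random(seed)
    c_s, c_crit = 0.9, 0.4             # surface / critical content (% binder weight)
    t = 50 * 365.25 * 86400.0          # 50-year design life in seconds
    fails = 0
    for _ in range(n):
        D = 1e-12 * math.exp(rng.gauss(0.0, 0.2))         # lognormal diffusivity (m^2/s)
        cover = max(0.005, rng.gauss(cover_mean, 0.005))  # random cover depth (m)
        if chloride_at_cover(cover, D, t, c_s) >= c_crit:
            fails += 1
    return fails / n

# A thicker concrete cover lowers the corrosion-initiation probability,
# which is the trade-off behind the paper's optimal-cover proposal.
pf_30, pf_60 = pf_corrosion(0.030), pf_corrosion(0.060)
print(pf_30, pf_60)
```

The same failure event (critical content reached at the bar within the design life) is what a FORM analysis would evaluate through a reliability index instead of sampling.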
Abstract:
Semi-qualitative probabilistic networks (SQPNs) merge two important graphical model formalisms: Bayesian networks and qualitative probabilistic networks. They provide a very general modeling framework by allowing the combination of numeric and qualitative assessments over a discrete domain, and can be compactly encoded by exploiting the same factorization of joint probability distributions that underlies Bayesian networks. This paper explores the computational complexity of inference in semi-qualitative probabilistic networks, taking polytree-shaped networks as its main target. We show that the inference problem is coNP-complete for binary polytrees with multiple observed nodes. We also show that inferences can be performed in time linear in the number of nodes if there is a single observed node. Because our proof is constructive, we obtain an efficient linear-time algorithm for SQPNs under such assumptions. To the best of our knowledge, this is the first exact polynomial-time algorithm for SQPNs. Together these results provide a clear picture of the inferential complexity in polytree-shaped SQPNs.