762 results for Viterbi-based algorithm


Relevance:

30.00%

Publisher:

Abstract:

This work explores the design of piezoelectric transducers based on functional material gradation, here named functionally graded piezoelectric transducers (FGPTs). Depending on the application, FGPTs must achieve several goals, which are essentially related to the transducer resonance frequencies, vibration modes, and excitation strength at specific resonance frequencies. Several approaches can be used to achieve these goals; however, this work focuses on finding the optimal material gradation of FGPTs by means of topology optimization. Three objective functions are proposed: (i) to obtain the optimal material gradation for maximizing specified resonance frequencies; (ii) to design piezoelectric resonators, where the optimal material gradation is found to achieve desirable eigenvalues and eigenmodes; and (iii) to find the optimal material distribution that maximizes specified excitation strength. To track the desired vibration mode, a mode-tracking method based on the modal assurance criterion is applied. The continuous change of piezoelectric, dielectric, and elastic properties is achieved by using the graded finite element concept. The optimization algorithm is built on sequential linear programming and the concept of continuum approximation of material distribution. To illustrate the method, 2D FGPTs are designed for each objective function, and their performance is compared with that of non-graded transducers.
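For readers unfamiliar with the mode-tracking step, the modal assurance criterion (MAC) compares two mode-shape vectors and equals 1 when they coincide up to scale. A minimal sketch (function and variable names are ours, not the paper's):

```python
import numpy as np

def mac(phi_i: np.ndarray, phi_j: np.ndarray) -> float:
    """Modal Assurance Criterion: |phi_i^H phi_j|^2 / (|phi_i|^2 |phi_j|^2).

    Returns a value in [0, 1]; 1 means identical shapes up to scaling.
    """
    num = np.abs(np.vdot(phi_i, phi_j)) ** 2
    den = np.vdot(phi_i, phi_i).real * np.vdot(phi_j, phi_j).real
    return float(num / den)

def track_mode(reference: np.ndarray, modes: np.ndarray) -> int:
    """Pick the column of `modes` that best matches the reference shape.

    After each optimization step, the eigenmode with the highest MAC
    against the previous iteration's mode is taken to be the same mode.
    """
    scores = [mac(reference, modes[:, k]) for k in range(modes.shape[1])]
    return int(np.argmax(scores))
```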

Relevance:

30.00%

Publisher:

Abstract:

We consider brightness/contrast-invariant and rotation-discriminating template matching that searches an image A (the image to analyze) for a query image Q. We propose using the complex coefficients of the discrete Fourier transform of radial projections to compute new rotation-invariant local features; these coefficients can be obtained efficiently via the FFT. We classify templates as "stable" or "unstable" and argue that any local feature-based template matching may fail to find unstable templates. We extract several stable sub-templates of Q and find them in A by comparing the features. The matches of the sub-templates are combined using the Hough transform. As the features of A are computed only once, the algorithm can quickly find many different sub-templates in A, making it suitable for finding many query images in A, multi-scale searching, and partial-occlusion-robust template matching. (C) 2009 Elsevier Ltd. All rights reserved.
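To illustrate why DFT coefficients of radial projections are rotation-invariant: rotating the patch circularly shifts the angle-indexed projection vector, which changes only the phases of its DFT. A simplified sketch (nearest-neighbour sampling, magnitude-only features; the paper works with the complex coefficients and also handles contrast normalization, which is omitted here):

```python
import numpy as np

def radial_projections(patch: np.ndarray, n_angles: int = 36) -> np.ndarray:
    """Mean gray level along rays from the patch centre, one per angle."""
    h, w = patch.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    radius = min(cy, cx)
    ts = np.linspace(0.0, 1.0, max(int(radius), 1), endpoint=False)
    proj = np.empty(n_angles)
    for k in range(n_angles):
        theta = 2.0 * np.pi * k / n_angles
        ys = (cy + ts * radius * np.sin(theta)).astype(int)
        xs = (cx + ts * radius * np.cos(theta)).astype(int)
        proj[k] = patch[ys, xs].mean()
    return proj

def rotation_invariant_features(patch: np.ndarray, n_coeffs: int = 8) -> np.ndarray:
    # Rotating the patch circularly shifts `proj`, which only changes the
    # phases of its DFT, so the magnitudes are rotation-invariant.
    # Skipping the DC term also removes the mean-brightness dependence.
    spectrum = np.fft.fft(radial_projections(patch))
    return np.abs(spectrum[1:n_coeffs + 1])
```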

Relevance:

30.00%

Publisher:

Abstract:

Higher-order (2,4) FDTD schemes used for numerical solutions of Maxwell's equations aim at diminishing the truncation errors caused by the Taylor series expansion of the spatial derivatives. These schemes use a larger computational stencil, which generally relies on two constant coefficients, C1 and C2, for the four-point central-difference operators. In this paper we propose a novel way to diminish these truncation errors in order to obtain more accurate numerical solutions of Maxwell's equations. For this purpose, we present a method to optimize the pair of coefficients C1 and C2 individually, based on any desired grid resolution and time-step size. In particular, we are interested in using coarser grid discretizations to be able to simulate electrically large domains. The results of our optimization algorithm show a significant reduction in dispersion error and numerical anisotropy for all modeled grid resolutions. Numerical simulations of free-space propagation verify the very promising theoretical results, and the model is also shown to perform well in more complex, realistic scenarios.
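For reference, the conventional (2,4) scheme differentiates on a staggered grid with the Taylor-derived pair C1 = 9/8, C2 = -1/24; the paper's contribution is to re-optimize this pair per grid resolution and time step. A sketch of the four-point operator (our own illustration, not the authors' code):

```python
import numpy as np

# Taylor-derived coefficients of the standard (2,4) scheme; the paper's
# method replaces this fixed pair with values optimized for each grid
# resolution and time-step size.
C1, C2 = 9.0 / 8.0, -1.0 / 24.0

def d_dx_staggered(f: np.ndarray, dx: float,
                   c1: float = C1, c2: float = C2) -> np.ndarray:
    """Four-point central difference on a staggered grid.

    Approximates df/dx at the midpoints between samples; the result is
    three samples shorter than f because the stencil needs two points
    on each side.
    """
    return (c1 * (f[2:-1] - f[1:-2]) + c2 * (f[3:] - f[:-3])) / dx

# Quick check against an analytic derivative (midpoints sit at x + dx/2):
x = np.arange(0.0, 2.0 * np.pi, 0.1)
err = d_dx_staggered(np.sin(x), 0.1) - np.cos(x[1:-2] + 0.05)
```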

Relevance:

30.00%

Publisher:

Abstract:

This work proposes the use of evolutionary computation to jointly solve the multiuser channel estimation (MuChE) and maximum-likelihood detection problems in direct-sequence code division multiple access (DS/CDMA) systems. The effectiveness of the proposed heuristic approach is demonstrated by comparing performance and complexity figures of merit with those obtained by traditional methods found in the literature. Simulation results for a genetic algorithm (GA) applied to multipath DS/CDMA MuChE and multiuser detection (MuD) show that the proposed genetic-algorithm multiuser channel estimator (GAMuChE) yields a normalized mean square error (nMSE) below 11% under slowly varying multipath fading channels, a large range of Doppler frequencies, and medium system load, while exhibiting lower complexity than both the maximum-likelihood multiuser channel estimator (MLMuChE) and the gradient descent method (GrdDsc). A near-optimum multiuser detector based on the genetic algorithm (GAMuD), also proposed in this work, provides a significant reduction in computational complexity compared to the optimum multiuser detector (OMuD). In addition, the complexities of the GAMuChE and GAMuD algorithms were jointly analyzed in terms of the number of operations necessary to reach convergence and compared with other joint MuChE and MuD strategies. The joint GAMuChE-GAMuD scheme can be regarded as a promising alternative for implementing third-generation (3G) and fourth-generation (4G) wireless systems in the near future. Copyright (C) 2010 John Wiley & Sons, Ltd.
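A generic real-coded GA skeleton of the kind underlying GAMuChE might look as follows; the encoding, operators, and the DS/CDMA likelihood supplied as `fitness` are paper-specific, so everything here is an illustrative assumption:

```python
import numpy as np

rng = np.random.default_rng(0)

def ga_estimate(fitness, dim, pop_size=40, generations=200,
                mutation_scale=0.05, elite=4):
    """Minimal real-coded genetic algorithm (illustrative only).

    `fitness` maps a candidate channel vector to a score to maximize,
    e.g. the likelihood of the received DS/CDMA signal (not shown).
    """
    pop = rng.standard_normal((pop_size, dim))
    for _ in range(generations):
        scores = np.array([fitness(p) for p in pop])
        pop = pop[np.argsort(scores)[::-1]]        # best first
        parents = pop[:pop_size // 2]
        # Uniform crossover among the fitter half, plus Gaussian mutation;
        # the `elite` best individuals survive unchanged.
        idx = rng.integers(0, len(parents), size=(pop_size - elite, 2))
        mask = rng.random((pop_size - elite, dim)) < 0.5
        children = np.where(mask, parents[idx[:, 0]], parents[idx[:, 1]])
        children += mutation_scale * rng.standard_normal(children.shape)
        pop = np.vstack([pop[:elite], children])
    scores = np.array([fitness(p) for p in pop])
    return pop[np.argmax(scores)]
```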

Relevance:

30.00%

Publisher:

Abstract:

This paper presents the design and implementation of an embedded soft sensor, i.e., a generic and autonomous hardware module that can be applied to many complex plants in which a certain variable cannot be measured directly. It is implemented using a fuzzy identification algorithm called "Limited Rules", employed to model continuous nonlinear processes. The fuzzy model has a Takagi-Sugeno-Kang structure, and the premise parameters are defined using the Fuzzy C-Means (FCM) clustering algorithm. The firmware contains the soft sensor and runs online, estimating the target variable from the other available variables. Tests have been performed using a simulated pH neutralization plant, and the results of the embedded soft sensor were considered satisfactory. A complete embedded inferential control system is also presented, including the soft sensor and a PID controller. (c) 2007, ISA. Published by Elsevier Ltd. All rights reserved.
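The FCM step that places the premise (antecedent) parameters can be sketched compactly; this is the textbook algorithm, not the authors' firmware code:

```python
import numpy as np

def fuzzy_c_means(X, n_clusters, m=2.0, n_iter=100, seed=0):
    """Plain Fuzzy C-Means, as used to position the TSK premise sets.

    X: (n_samples, n_features). Returns (centers, memberships), where
    memberships has shape (n_clusters, n_samples), columns summing to 1.
    """
    rng = np.random.default_rng(seed)
    U = rng.random((n_clusters, len(X)))
    U /= U.sum(axis=0)                          # normalize memberships
    for _ in range(n_iter):
        Um = U ** m
        centers = (Um @ X) / Um.sum(axis=1, keepdims=True)
        # Squared distance of every sample to every centre.
        d2 = ((X[None, :, :] - centers[:, None, :]) ** 2).sum(-1) + 1e-12
        U = d2 ** (-1.0 / (m - 1.0))            # standard FCM update
        U /= U.sum(axis=0)
    return centers, U
```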

Relevance:

30.00%

Publisher:

Abstract:

This paper analyzes the complexity-performance trade-off of several heuristic near-optimum multiuser detection (MuD) approaches applied to the uplink of synchronous single/multiple-input multiple-output multicarrier code division multiple access (S/MIMO MC-CDMA) systems. Genetic algorithm (GA), short-term tabu search (STTS) and reactive tabu search (RTS), simulated annealing (SA), particle swarm optimization (PSO), and 1-opt local search (1-LS) heuristic multiuser detection algorithms (Heur-MuDs) are analyzed in detail, using a single-objective, antenna-diversity-aided optimization approach. Monte Carlo simulations show that, after convergence, the performances reached by all near-optimum Heur-MuDs are similar. However, their computational complexities may differ substantially, depending on the system operating conditions. These complexities are carefully analyzed in order to obtain a general complexity-performance comparison framework and to show that unitary-Hamming-distance-search MuD (uH-ds) approaches (1-LS, SA, RTS, and STTS) reach the best convergence rates and that, among them, the 1-LS-MuD provides the best trade-off between implementation complexity and bit error rate (BER) performance.
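The 1-LS detector, singled out here for the best complexity-performance trade-off, searches only the unit-Hamming-distance neighbourhood of the current decision. A minimal sketch, with the MC-CDMA likelihood abstracted away as a user-supplied `metric` function:

```python
import numpy as np

def one_opt_mud(metric, b0: np.ndarray) -> np.ndarray:
    """1-opt local search over the unit-Hamming-distance neighbourhood.

    Starting from an initial antipodal bit vector b0 in {-1, +1}^K
    (e.g. the matched-filter decision), repeatedly flips the single bit
    that most increases the likelihood metric until no flip helps.
    """
    b = b0.copy()
    best = metric(b)
    improved = True
    while improved:
        improved = False
        for k in range(len(b)):
            b[k] = -b[k]                 # try flipping bit k
            score = metric(b)
            if score > best:
                best, improved = score, True
            else:
                b[k] = -b[k]             # revert the flip
    return b
```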

Relevance:

30.00%

Publisher:

Abstract:

The present work reports the fabrication of porous alumina structures and the study of their quantitative structural characteristics based on mathematical morphology analysis of SEM images. The algorithm used in this work was implemented in MATLAB 6.2. Using the algorithm, it was possible to obtain the distributions of the maximum, minimum, and average pore radii in the porous alumina structures. Additionally, by calculating the area occupied by the pores, it was possible to obtain the porosity of the structures. The quantitative results could be related to the characteristics of the fabrication process, proving reliable and promising for controlling the pore formation process. This technique could therefore provide a more accurate determination of pore sizes and pore distributions. (C) 2008 Elsevier Ltd. All rights reserved.
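The original pipeline was written in MATLAB 6.2; a roughly equivalent sketch with scikit-image, under the assumption that pores appear darker than the alumina matrix, would be:

```python
import numpy as np
from skimage import io, filters, measure

def pore_statistics(sem_image_path: str) -> dict:
    """Estimate pore radii and porosity from a grayscale SEM image.

    Simplified morphology pipeline: threshold, label connected pore
    regions, then measure them.
    """
    img = io.imread(sem_image_path, as_gray=True)
    pores = img < filters.threshold_otsu(img)     # assumes dark pores
    labels = measure.label(pores)
    radii = np.array([r.equivalent_diameter / 2.0
                      for r in measure.regionprops(labels)])
    porosity = pores.mean()                       # pore area / total area
    return {"r_min": radii.min(), "r_max": radii.max(),
            "r_mean": radii.mean(), "porosity": porosity}
```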

Relevance:

30.00%

Publisher:

Abstract:

The task of segmenting cell nuclei from cytoplasm in conventional Papanicolaou (Pap) stained cervical cell images is a classical image analysis problem that may prove crucial to the development of successful systems for automating the analysis of Pap smears for the detection of cancer of the cervix. Although simple thresholding techniques will extract the nucleus in some cases, accurate unsupervised segmentation of very large image databases remains elusive. Conventional active contour models, as introduced by Kass, Witkin and Terzopoulos (1988), offer a number of advantages in this application but suffer from the well-known drawbacks of initialisation and minimisation. Here we show that a Viterbi search-based dual active contour algorithm is able to overcome many of these problems and achieve over 99% accurate segmentation on a database of 20,130 Pap stained cell images. (C) 1998 Elsevier Science B.V. All rights reserved.
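The core of a Viterbi search-based contour algorithm is dynamic programming over a trellis of candidate positions. The sketch below is a generic open-chain simplification (a closed contour additionally requires the first and last points to be consistent), with hypothetical `unary` and `pairwise` energies standing in for the image and smoothness terms:

```python
import numpy as np

def viterbi_contour(unary: np.ndarray, pairwise) -> np.ndarray:
    """Viterbi (dynamic-programming) search over a contour trellis.

    unary[t, s]: image energy of placing contour point t at radial state s.
    pairwise(s, s2): smoothness penalty between adjacent contour points.
    Returns the state sequence with minimal total energy.
    """
    T, S = unary.shape
    cost = unary[0].copy()
    back = np.zeros((T, S), dtype=int)
    for t in range(1, T):
        # trans[s, s2] = cost of arriving at state s2 from state s.
        trans = cost[:, None] + np.array(
            [[pairwise(s, s2) for s2 in range(S)] for s in range(S)])
        back[t] = trans.argmin(axis=0)
        cost = trans.min(axis=0) + unary[t]
    states = np.zeros(T, dtype=int)
    states[-1] = int(cost.argmin())
    for t in range(T - 1, 0, -1):       # backtrack the optimal path
        states[t - 1] = back[t, states[t]]
    return states
```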

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVE: To evaluate a diagnostic algorithm for pulmonary tuberculosis based on smear microscopy and objective response to a trial of antibiotics. SETTING: Adult medical wards, Hlabisa Hospital, South Africa, 1996-1997. METHODS: Adults with chronic chest symptoms and an abnormal chest X-ray had sputum examined for Ziehl-Neelsen stained acid-fast bacilli by light microscopy. Those with negative smears were treated with amoxycillin for 5 days and assessed. Those who had not improved were treated with erythromycin for 5 days and reassessed. Response was compared with mycobacterial culture. RESULTS: Of 280 suspects who completed the diagnostic pathway, 160 (57%) had a positive smear, 46 (17%) responded to amoxycillin, 34 (12%) responded to erythromycin, and 40 (14%) were treated as smear-negative tuberculosis. The sensitivity (89%) and specificity (84%) of the full algorithm for culture-positive tuberculosis were high. However, 11 patients (positive predictive value [PPV] 95%) were incorrectly diagnosed with tuberculosis, and 24 cases of tuberculosis (negative predictive value [NPV] 70%) were missed. The NPV improved to 75% when anaemia was included as a predictor. Algorithm performance was independent of human immunodeficiency virus status. CONCLUSION: Among a selected group of tuberculosis suspects, an algorithm of sputum smear microscopy plus a trial of antibiotics may increase diagnostic accuracy in district hospitals in developing countries.

Relevance:

30.00%

Publisher:

Abstract:

An equivalent algorithm is proposed to simulate thermal effects of the magma intrusion in geological systems, which are composed of porous rocks. Based on the physical and mathematical equivalence, the original magma solidification problem with a moving boundary between the rock and intruded magma is transformed into a new problem without the moving boundary but with a physically equivalent heat source. From the analysis of an ideal solidification model, the physically equivalent heat source has been determined in this paper. The major advantage in using the proposed equivalent algorithm is that the fixed finite element mesh with a variable integration time step can be employed to simulate the thermal effect of the intruded magma solidification using the conventional finite element method. The related numerical results have demonstrated the correctness and usefulness of the proposed equivalent algorithm for simulating the thermal effect of the intruded magma solidification in geological systems. (C) 2003 Elsevier B.V. All rights reserved.
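To illustrate the idea of a fixed mesh with an equivalent source: the latent heat released as the melt fraction drops is re-injected as a heat source, so no interface needs to be tracked. The 1D explicit finite-difference sketch below uses a simple linear temperature-melt-fraction relation; the paper derives its equivalent source from an ideal solidification model and uses finite elements, so this is only a schematic analogue:

```python
import numpy as np

def solidification_step(T, phi, dt, dx, alpha, T_sol, T_liq, L_over_c):
    """One explicit step of 1D conduction with an equivalent heat source.

    Rather than tracking the rock/magma interface, the latent heat
    released while the melt fraction `phi` drops is fed back as a source
    term on a fixed grid. `dt` must respect the explicit stability limit
    dt <= dx**2 / (2 * alpha); boundary values of T are held fixed.
    """
    # Melt fraction from a linear relation between solidus and liquidus.
    phi_new = np.clip((T - T_sol) / (T_liq - T_sol), 0.0, 1.0)
    lap = (T[2:] - 2.0 * T[1:-1] + T[:-2]) / dx ** 2
    source = L_over_c * (phi[1:-1] - phi_new[1:-1]) / dt  # latent heat released
    T_next = T.copy()
    T_next[1:-1] += dt * (alpha * lap + source)
    return T_next, phi_new
```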

Relevance:

30.00%

Publisher:

Abstract:

A two-component survival mixture model is proposed to analyse a set of ischaemic stroke-specific mortality data. The survival experience of stroke patients after the index stroke may be described by a subpopulation of patients in the acute condition and another subpopulation in the chronic phase. To adjust for the inherent correlation of observations due to random hospital effects, a mixture model of two survival functions with random effects is formulated. Assuming a Weibull hazard in both components, an EM algorithm is developed for the estimation of fixed-effect parameters and variance components. A simulation study is conducted to assess the performance of the two-component survival mixture model estimators. Simulation results confirm the applicability of the proposed model in a small-sample setting. Copyright (C) 2004 John Wiley & Sons, Ltd.
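Ignoring the random hospital effects and censoring handled by the full model, the EM iteration for a plain two-component Weibull mixture can be sketched as follows (initialization and names are our own):

```python
import numpy as np
from scipy.stats import weibull_min
from scipy.optimize import minimize

def em_weibull_mixture(t: np.ndarray, n_iter: int = 50):
    """EM for a two-component Weibull mixture of survival times t > 0.

    Returns the mixing weight of component 1 ("acute") and the
    (shape, scale) pairs of both components.
    """
    pi = 0.5
    params = [(1.0, np.quantile(t, 0.25)), (1.0, np.quantile(t, 0.75))]
    for _ in range(n_iter):
        # E-step: posterior probability that each time belongs to component 1.
        f1 = weibull_min.pdf(t, params[0][0], scale=params[0][1])
        f2 = weibull_min.pdf(t, params[1][0], scale=params[1][1])
        w = pi * f1 / (pi * f1 + (1.0 - pi) * f2 + 1e-300)
        # M-step: update mixing weight and run weighted Weibull fits.
        pi = w.mean()
        for j, wj in enumerate((w, 1.0 - w)):
            nll = lambda p: -np.sum(wj * weibull_min.logpdf(
                t, np.exp(p[0]), scale=np.exp(p[1])))
            res = minimize(nll, np.log(params[j]), method="Nelder-Mead")
            params[j] = tuple(np.exp(res.x))
    return pi, params
```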

Relevance:

30.00%

Publisher:

Abstract:

On the basis of a spatially distributed sediment budget across a large basin, the costs of achieving given sediment reduction targets in rivers were estimated. A range of investment prioritization scenarios was tested to identify the most cost-effective strategy for controlling suspended sediment loads. The scenarios were based on successively introducing more information from the sediment budget. The relationship between the spatial heterogeneity of contributing sediment sources and the cost effectiveness of prioritization was investigated. Cost effectiveness was shown to increase with the sequential introduction of sediment budget terms. The lowest-cost solution was achieved by including spatial information linking sediment sources to the downstream target location; this solution produced cost curves similar to those derived using a genetic algorithm formulation. Appropriate investment prioritization can offer large cost savings, because costs can vary severalfold depending on which type of erosion source or sediment delivery mechanism is targeted. Target setting that considers only erosion source rates can result in spending more money than random management intervention to achieve downstream targets. Coherent spatial patterns of contributing sediment emerge from the budget model and its many inputs, and the heterogeneity in these patterns can be summarized in a succinct form. This summary was shown to be consistent with the cost difference between local and regional prioritization for three of the four test catchments; to explain the effect in the fourth catchment, the detail of the individual sediment sources needed to be taken into account.

Relevance:

30.00%

Publisher:

Abstract:

The identification, modeling, and analysis of interactions between nodes of neural systems in the human brain have become the aim of many studies in neuroscience. The complex structure of neural networks and its correlation with brain function play a role in all areas of neuroscience, including the comprehension of cognitive and emotional processing. Indeed, understanding how information is stored, retrieved, processed, and transmitted is one of the ultimate challenges in brain research. In this context, in functional neuroimaging, connectivity analysis is a major tool for the exploration and characterization of the information flow between specialized brain regions. In most functional magnetic resonance imaging (fMRI) studies, connectivity analysis is carried out by first selecting regions of interest (ROI) and then calculating an average BOLD time series (across the voxels in each cluster). Some studies have shown that the average may not be a good choice and have suggested, as an alternative, the use of principal component analysis (PCA) to extract the principal eigen-time-series from the ROI(s). In this paper, we introduce a novel approach called cluster Granger analysis (CGA) to study connectivity between ROIs. The main aim of this method is to employ multiple eigen-time-series in each ROI to avoid the temporal information loss, inherent in averaging (e.g., to yield a single "representative" time series per ROI), during identification of Granger causality; such loss may, in turn, lead to a lack of power in detecting connections. The proposed approach is based on multivariate statistical analysis and integrates PCA and partial canonical correlation in a framework of Granger causality for clusters (sets) of time series. We also describe an algorithm for statistical significance testing based on bootstrapping. Using Monte Carlo simulations, we show that the proposed approach outperforms conventional Granger causality analysis (i.e., using representative time series extracted by signal averaging or first-principal-component estimation from ROIs). The usefulness of the CGA approach on real fMRI data is illustrated in an experiment using human faces expressing emotions. With this data set, the proposed approach suggested the presence of significantly more connections between the ROIs than were detected using a single representative time series in each ROI. (c) 2010 Elsevier Inc. All rights reserved.
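A much-simplified stand-in for CGA: extract several eigen-time-series per ROI and run a block Granger test on the joint VAR. The paper's partial-canonical-correlation formulation and bootstrap test are not reproduced here; `k` and `lags` are illustrative choices:

```python
import numpy as np
from statsmodels.tsa.api import VAR

def roi_components(voxel_ts: np.ndarray, k: int = 3) -> np.ndarray:
    """First k principal eigen-time-series of one ROI.

    voxel_ts: (time, voxels) BOLD array; PCA via SVD of the centred data.
    """
    X = voxel_ts - voxel_ts.mean(axis=0)
    U, s, _ = np.linalg.svd(X, full_matrices=False)
    return U[:, :k] * s[:k]

def granger_p_value(ts_a: np.ndarray, ts_b: np.ndarray,
                    k: int = 3, lags: int = 2) -> float:
    """p-value for 'ROI A Granger-causes ROI B' using k components each."""
    data = np.hstack([roi_components(ts_a, k), roi_components(ts_b, k)])
    res = VAR(data).fit(lags)
    # Block F-test: do A's components jointly help predict B's components?
    test = res.test_causality(caused=list(range(k, 2 * k)),
                              causing=list(range(k)), kind="f")
    return float(test.pvalue)
```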

Relevance:

30.00%

Publisher:

Abstract:

Here, we examine morphological changes in the cortical thickness of patients with Alzheimer's disease (AD) using image analysis algorithms for brain structure segmentation, and we study the automatic classification of AD patients using cortical and volumetric data. The cortical thickness of AD patients (n = 14) was measured using MRI cortical surface-based analysis and compared with healthy subjects (n = 20). Data were analyzed using an automated algorithm for tissue segmentation and classification. A support vector machine (SVM) was applied to the volumetric measurements of subcortical and cortical structures to separate AD patients from controls. The group analysis showed cortical thickness reduction in the superior temporal lobe, parahippocampal gyrus, and entorhinal cortex in both hemispheres. We also found cortical thinning in the isthmus of the cingulate gyrus and the middle temporal gyrus in the right hemisphere, as well as a reduction of the cortical mantle in areas previously shown to be associated with AD. We also confirmed that automatic classification algorithms (SVM) can help distinguish AD patients from healthy controls; moreover, the same areas implicated in the pathogenesis of AD were the main parameters driving the classification algorithm. While the patient sample used in this study was relatively small, we expect that using a database of regional volumes derived from the MRI scans of a large number of subjects will increase the power of SVM-based AD patient identification.
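The classification step can be reproduced in outline with scikit-learn; the linear kernel and leave-one-out validation below are our assumptions, as the abstract does not specify them:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import LeaveOneOut, cross_val_score

def classify_ad(X: np.ndarray, y: np.ndarray) -> float:
    """Mean leave-one-out accuracy of an SVM on regional measurements.

    X: one row per subject of regional cortical-thickness / volume
    features; y: 1 for AD patients, 0 for controls. With n = 34
    subjects, leave-one-out cross-validation keeps estimates honest.
    """
    model = make_pipeline(StandardScaler(), SVC(kernel="linear"))
    scores = cross_val_score(model, X, y, cv=LeaveOneOut())
    return float(scores.mean())
```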

Relevance:

30.00%

Publisher:

Abstract:

Background: Although various techniques have been used for reconstruction after breast conservation surgery, there are few studies describing a logical approach to the reconstruction of these defects. The objectives of this study were to establish a classification system for partial breast defects and to develop a reconstructive algorithm. Methods: The authors reviewed a 7-year experience with 209 immediate breast conservation surgery reconstructions. Mean follow-up was 31 months. Type I defects involve tissue resection in smaller breasts (bra size A/B) and include type IA, minimal defects that do not cause distortion; type IB, moderate defects that cause moderate distortion; and type IC, large defects that cause significant deformities. Type II involves tissue resection in medium-sized breasts with or without ptosis (bra size C), and type III involves tissue resection in large breasts with ptosis (bra size D). Results: Eighteen percent of patients presented with type I defects, for which a lateral thoracodorsal flap and a latissimus dorsi flap were performed in 68 percent of cases. Forty-five percent presented with type II defects, for which bilateral mastopexy was performed in 52 percent. Thirty-seven percent presented with type III distortion, for which bilateral reduction mammaplasty was performed in 67 percent. Thirty-five percent of patients presented complications, most of them minor. Conclusions: An algorithm based on breast size in relation to tumor location and extent of resection can be followed to determine the best approach to reconstruction. The authors' results demonstrate complication rates similar to those of other clinical series. Success depends on patient selection, coordinated planning with the oncologic surgeon, and careful intraoperative management.