762 results for Viterbi-based algorithm
Abstract:
This paper reports research evaluating the potential and the effects of using annotated paraconsistent logic in automatic indexing. Paraconsistent logic deals with contradictions and is concerned with studying and developing inconsistency-tolerant systems of logic. Because this logic is flexible and admits logical states beyond the yes/no dichotomy, it allows us to advance the hypothesis that indexing results could be better than those obtained by traditional methods. Interactions between different disciplines, such as information retrieval, automatic indexing, information visualization, and non-classical logics, were considered in this research. From the methodological point of view, an algorithm for the treatment of uncertainty and imprecision, developed under paraconsistent logic, was used to modify the values of the weights assigned to the indexing terms of the text collections. The tests were performed on an information visualization system named Projection Explorer (PEx), created at the Institute of Mathematics and Computer Science (ICMC - USP São Carlos), whose source code is available. PEx uses the traditional vector space model to represent the documents of a collection. The results were evaluated by criteria built into the information visualization system itself and demonstrated measurable gains in the quality of the displays, confirming the hypothesis that the para-analyser, under the conditions of the experiment, can generate more effective clusters of similar documents. This point draws attention because more meaningful clusters can be used to enhance information indexing and retrieval. It can be argued that the adoption of non-dichotomous (non-exclusive) parameters opens new possibilities for relating similar information.
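For illustration only, the following Python sketch shows how a para-analyser over two-valued annotations (degree of favourable evidence mu and of unfavourable evidence lambda) might be used to rescale an indexing-term weight. The certainty and contradiction degrees follow the usual definitions of annotated paraconsistent logic; the weight-adjustment rule and all names are assumptions, not the algorithm actually used in the paper.

    # Hedged sketch: a minimal para-analyser over annotations (mu, lam), where mu is
    # favourable evidence and lam is unfavourable evidence, both in [0, 1]. The rule
    # used to adjust the vector-space term weight is an illustrative assumption.

    def para_analyser(mu, lam):
        """Return (certainty degree, contradiction degree) for an annotation."""
        certainty = mu - lam            # Gc in [-1, 1]: belief minus disbelief
        contradiction = mu + lam - 1.0  # Gct in [-1, 1]: 0 means consistent evidence
        return certainty, contradiction

    def adjust_weight(weight, mu, lam):
        """Scale a term weight by how certain and how consistent its evidence is."""
        certainty, contradiction = para_analyser(mu, lam)
        factor = max(0.0, certainty) * (1.0 - abs(contradiction))
        return weight * factor

    if __name__ == "__main__":
        # a tf-idf weight of 0.8 with strong favourable and weak unfavourable evidence
        print(adjust_weight(0.8, mu=0.9, lam=0.2))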
Abstract:
Non-Hodgkin lymphomas are of many distinct types, and different classification systems make it difficult to diagnose them correctly. Many of these systems classify lymphomas only based on what they look like under a microscope. In 2008 the World Health Organisation (WHO) introduced the most recent system, which also considers the chromosome features of the lymphoma cells and the presence of certain proteins on their surface. The WHO system is the one we apply in this work. Here we present an automatic method to classify histological images of three types of non-Hodgkin lymphoma. Our method is based on the Stationary Wavelet Transform (SWT) and consists of three steps: 1) extracting sub-bands from the histological image through the SWT, 2) applying Analysis of Variance (ANOVA) to remove noise and select the most relevant information, and 3) classifying the selected features with the Support Vector Machine (SVM) algorithm. The Linear, RBF and Polynomial kernels were evaluated with our method applied to 210 lymphoma images from the National Institute on Aging. We concluded that the following combination led to the most relevant results: detail sub-band, ANOVA and SVM with the Linear and RBF kernels.
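A minimal sketch of a pipeline in the spirit of the three steps above (SWT sub-band extraction, ANOVA-based feature selection, SVM classification), assuming grayscale images whose sides are divisible by 2**level. The use of PyWavelets and scikit-learn, the band statistics and all parameter values are assumptions for illustration, not the authors' implementation.

    # Hedged sketch of an SWT -> ANOVA -> SVM pipeline (illustrative only).
    import numpy as np
    import pywt
    from sklearn.feature_selection import SelectKBest, f_classif
    from sklearn.pipeline import make_pipeline
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    def swt_detail_features(image, wavelet="db1", level=1):
        """Simple statistics of the detail sub-bands of a stationary wavelet transform."""
        coeffs = pywt.swt2(image, wavelet, level=level)  # sides must be divisible by 2**level
        feats = []
        for _approx, (ch, cv, cd) in coeffs:
            for band in (ch, cv, cd):
                feats.extend([band.mean(), band.std(), np.mean(band ** 2)])
        return np.array(feats)

    def evaluate(images, labels, kernel="rbf", k_best=6):
        X = np.vstack([swt_detail_features(img) for img in images])
        model = make_pipeline(SelectKBest(f_classif, k=k_best),  # ANOVA F-test selection
                              SVC(kernel=kernel))
        return cross_val_score(model, X, labels, cv=5).mean()

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        imgs = [rng.random((64, 64)) for _ in range(60)]  # stand-ins for histology images
        y = np.array([i % 3 for i in range(60)])          # three lymphoma classes
        print("cross-validated accuracy:", evaluate(imgs, y))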
Abstract:
In this work, a new method for sensor deployment on 3D surfaces is presented. The method is structured in several steps. The first step discretizes the relief of interest with the Delaunay algorithm. The tetrahedra and their associated values (the spatial coordinates of each vertex and the faces) are the input for the construction of the 3D Voronoi diagram. Each circumcenter is taken as a candidate position for a sensor node, and the corresponding circular coverage area is computed from a radius r; the value of r can be adjusted to simulate different kinds of sensors. The Dijkstra algorithm and a selection method are then applied to eliminate candidate positions with overlapping coverage areas or positions lying beyond the surface of interest. Performance evaluation measures were defined using coverage area and communication as criteria. The results were relevant, since the mean coverage rates achieved on three different surfaces were between 91% and 100%.
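The sketch below, in Python, illustrates only the candidate-generation part of such a method: tetrahedra from a 3D Delaunay triangulation, their circumcenters as candidate sensor positions, and a greedy filter that discards candidates whose coverage of radius r would overlap. The Dijkstra-based communication check and the test for positions lying beyond the surface are omitted; the selection rule and all names are assumptions, not the paper's implementation.

    # Hedged sketch: candidate sensor positions from tetrahedra circumcenters.
    import numpy as np
    from scipy.spatial import Delaunay

    def circumcenter(p):
        """Circumcenter of a tetrahedron given as a 4x3 array of vertices."""
        a = p[0]
        A = 2.0 * (p[1:] - a)                              # 3x3 linear system
        b = np.sum(p[1:] ** 2, axis=1) - np.sum(a ** 2)
        return np.linalg.solve(A, b)

    def candidate_positions(points, r):
        """Greedy selection: keep a circumcenter only if it lies at least 2*r away
        from every candidate already kept (assumed non-overlap rule)."""
        tri = Delaunay(points)                             # 3D points -> tetrahedra
        centers = [circumcenter(points[s]) for s in tri.simplices]
        kept = []
        for c in centers:
            if all(np.linalg.norm(c - k) >= 2 * r for k in kept):
                kept.append(c)
        return np.array(kept)

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        xy = rng.random((200, 2)) * 10
        surface = np.column_stack([xy, np.sin(xy[:, 0]) + np.cos(xy[:, 1])])  # toy relief
        print(candidate_positions(surface, r=1.0).shape)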
Abstract:
In this letter, a semiautomatic method for road extraction in object space is proposed that combines a stereoscopic pair of low-resolution aerial images with a digital terrain model (DTM) structured as a triangulated irregular network (TIN). First, we formulate an objective function in the object space to allow the modeling of roads in 3-D. In this model, the TIN-based DTM allows the search for the optimal polyline to be restricted along a narrow band that is overlaid upon it. Finally, the optimal polyline for each road is obtained by optimizing the objective function using the dynamic programming optimization algorithm. A few seed points need to be supplied by an operator. To evaluate the performance of the proposed method, a set of experiments was designed using two stereoscopic pairs of low-resolution aerial images and a TIN-based DTM with an average resolution of 1 m. The experimental results showed that the proposed method worked properly, even when faced with anomalies along roads, such as obstructions caused by shadows and trees.
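Since the optimal polyline is obtained by dynamic programming over candidate positions inside the narrow band, a generic sketch of that optimization step may help; the data and smoothness cost terms below are illustrative assumptions, not the objective function formulated in the letter.

    # Hedged sketch: dynamic-programming (Viterbi-style) optimization of a polyline.
    # Stages are cross-sections of the narrow band; states are candidate vertices.
    import numpy as np

    def optimal_polyline(unary, candidates, smooth_weight=1.0):
        """unary[t][j]: data cost of candidate j at stage t (e.g. image evidence);
        candidates[t][j]: 3-D coordinates of that candidate.
        Returns the index of the chosen candidate at every stage."""
        T = len(unary)
        cost = [np.asarray(unary[0], dtype=float)]
        back = []
        for t in range(1, T):
            u = np.asarray(unary[t], dtype=float)
            prev_pts = np.asarray(candidates[t - 1], dtype=float)
            cur_pts = np.asarray(candidates[t], dtype=float)
            # transition cost: penalize large jumps between consecutive vertices
            trans = smooth_weight * np.linalg.norm(
                cur_pts[:, None, :] - prev_pts[None, :, :], axis=2)
            total = u[:, None] + trans + cost[-1][None, :]   # shape (n_t, n_{t-1})
            back.append(total.argmin(axis=1))
            cost.append(total.min(axis=1))
        # backtrack the cheapest sequence of candidate indices
        path = [int(cost[-1].argmin())]
        for t in range(T - 2, -1, -1):
            path.append(int(back[t][path[-1]]))
        return path[::-1]

    if __name__ == "__main__":
        cands = [[(0, 0, 0), (0, 1, 0)], [(1, 0, 0), (1, 2, 0)], [(2, 0, 0), (2, 1, 0)]]
        data = [[0.0, 1.0], [0.5, 0.0], [0.2, 0.3]]
        print(optimal_polyline(data, cands))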
Abstract:
Wavelength division multiplexing (WDM) offers a solution to the problem of exploiting the large bandwidth of optical links; it is the current favorite multiplexing technology for optical communication networks. Due to the high cost of an optical amplifier, it is desirable to place the amplifiers strategically throughout the network in a way that guarantees that all signals are adequately amplified while minimizing the total number of amplifiers used. Previous studies all consider star-based networks. This paper demonstrates an original approach to solving the problem in a switch-based WDM optical network, assuming the traffic matrix is always a permutation of the nodes. First, we formulate the problem by choosing typical permutations that maximize the traffic load on individual links; then a genetic algorithm (GA) is used to search for feasible amplifier placements. Finally, by setting up all the lightpaths without violating the power constraints, we confirm the feasibility of the solution.
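As a rough illustration of the GA search step, the sketch below evolves binary amplifier placements on a single chain of fibre spans with a fixed span loss, a fixed amplifier gain and a minimum acceptable power; this simplified power model and all parameter values are assumptions, and the permutation-based traffic modelling and lightpath setup of the paper are not reproduced.

    # Hedged sketch: a genetic algorithm over binary amplifier placements on a chain
    # of fibre spans. The power model is a simplifying assumption, not the paper's.
    import random

    SPAN_LOSS_DB = 8.0      # loss per span (assumption)
    AMP_GAIN_DB = 20.0      # gain of one amplifier (assumption)
    P_LAUNCH_DBM = 0.0
    P_MIN_DBM = -25.0       # minimum acceptable power anywhere on the chain

    def feasible(placement):
        """True if the signal never drops below P_MIN_DBM along the chain."""
        p = P_LAUNCH_DBM
        for has_amp in placement:
            if has_amp:
                p += AMP_GAIN_DB
            p -= SPAN_LOSS_DB
            if p < P_MIN_DBM:
                return False
        return True

    def fitness(placement):
        # fewer amplifiers is better; infeasible placements are heavily penalized
        return sum(placement) + (0 if feasible(placement) else 1000)

    def ga(n_spans=20, pop_size=40, generations=200, p_mut=0.05):
        pop = [[random.randint(0, 1) for _ in range(n_spans)] for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=fitness)
            survivors = pop[: pop_size // 2]
            children = []
            while len(children) < pop_size - len(survivors):
                a, b = random.sample(survivors, 2)
                cut = random.randrange(1, n_spans)          # one-point crossover
                child = a[:cut] + b[cut:]
                child = [1 - g if random.random() < p_mut else g for g in child]
                children.append(child)
            pop = survivors + children
        best = min(pop, key=fitness)
        return best, sum(best), feasible(best)

    if __name__ == "__main__":
        print(ga())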
Abstract:
We explore the problem of budgeted machine learning, in which the learning algorithm has free access to the training examples' labels but has to pay for each attribute value it acquires. This learning model is appropriate in many areas, including medical applications. We present new algorithms, based on algorithms for the multi-armed bandit problem, for choosing which attributes of which examples to purchase in the budgeted learning model. All of our approaches outperformed the current state of the art. Furthermore, we present a new means of selecting an example to purchase after the attribute is selected, instead of selecting an example uniformly at random, as is typically done. Our new example selection method improved the performance of all the algorithms we tested, both ours and those in the literature.
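The sketch below illustrates the bandit view of attribute purchasing: each attribute is an arm, and the budget is spent with the standard UCB1 rule. The reward here is a synthetic stand-in for how useful a purchased value turns out to be; the actual purchasing criteria and example-selection method of the paper are not reproduced.

    # Hedged sketch: attributes as bandit arms, budget spent with UCB1.
    import math
    import random

    def ucb1_attribute_purchases(usefulness, budget):
        """usefulness[a]: probability that buying a value of attribute a is useful
        (synthetic stand-in). Returns how often each attribute was purchased."""
        n_arms = len(usefulness)
        counts = [0] * n_arms
        sums = [0.0] * n_arms
        for t in range(1, budget + 1):
            if t <= n_arms:
                arm = t - 1                       # play every arm once first
            else:
                arm = max(range(n_arms), key=lambda a: sums[a] / counts[a]
                          + math.sqrt(2 * math.log(t) / counts[a]))
            reward = 1.0 if random.random() < usefulness[arm] else 0.0
            counts[arm] += 1
            sums[arm] += reward
        return counts

    if __name__ == "__main__":
        random.seed(0)
        print(ucb1_attribute_purchases([0.2, 0.5, 0.8, 0.3], budget=200))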
Abstract:
An extension of some standard likelihood-based procedures to heteroscedastic nonlinear regression models under scale mixtures of skew-normal (SMSN) distributions is developed. This novel class of models provides a useful generalization of the heteroscedastic symmetrical nonlinear regression models (Cysneiros et al., 2010), since the random-term distributions cover symmetric as well as asymmetric and heavy-tailed distributions such as the skew-t, skew-slash and skew-contaminated normal, among others. A simple EM-type algorithm for iteratively computing maximum likelihood estimates of the parameters is presented, and the observed information matrix is derived analytically. To examine the performance of the proposed methods, simulation studies are presented that show the robustness of this flexible class against outlying and influential observations and that the maximum likelihood estimates based on the EM-type algorithm have good asymptotic properties. Furthermore, local influence measures and the one-step approximations of the estimates in the case-deletion model are obtained. Finally, an illustration of the methodology is given considering a data set previously analyzed under the homoscedastic skew-t nonlinear regression model. (C) 2012 Elsevier B.V. All rights reserved.
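The EM-type algorithm for the full SMSN class is not reproduced here; as a much-simplified illustration of the underlying idea (iteratively reweighting observations via the scale-mixture representation), the sketch below fits a homoscedastic linear regression with Student-t errors and fixed degrees of freedom, a symmetric special case.

    # Hedged sketch: EM for linear regression with Student-t errors (fixed nu), shown
    # only to illustrate the reweighting idea behind EM for scale-mixture models.
    import numpy as np

    def t_regression_em(X, y, nu=4.0, n_iter=50):
        n, p = X.shape
        beta = np.linalg.lstsq(X, y, rcond=None)[0]          # OLS start
        sigma2 = np.mean((y - X @ beta) ** 2)
        for _ in range(n_iter):
            r = y - X @ beta
            w = (nu + 1.0) / (nu + r ** 2 / sigma2)          # E-step: expected mixing weights
            W = np.diag(w)
            beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y) # M-step: weighted least squares
            sigma2 = np.sum(w * (y - X @ beta) ** 2) / n
        return beta, sigma2

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        X = np.column_stack([np.ones(200), rng.normal(size=200)])
        y = X @ np.array([1.0, 2.0]) + rng.standard_t(df=4, size=200)
        print(t_regression_em(X, y))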
Abstract:
Purpose - The purpose of this paper is to develop an efficient numerical algorithm for the self-consistent solution of the Schrödinger and Poisson equations in one-dimensional systems. The goal is to compute the charge-control and capacitance-voltage characteristics of quantum wire transistors. Design/methodology/approach - The paper presents a numerical formulation employing a non-uniform finite difference discretization scheme, in which the wavefunctions and electronic energy levels are obtained by solving the Schrödinger equation through the split-operator method, while a relaxation method in the FTCS ("Forward Time Centered Space") scheme is used to solve the two-dimensional Poisson equation. Findings - The numerical model is validated by taking previously published results as a benchmark and is then applied to yield the charge-control characteristics and the capacitance-voltage relationship for a split-gate quantum wire device. Originality/value - The paper helps to fulfill the need for C-V models of quantum wire devices. To do so, the authors implemented a straightforward calculation method for the two-dimensional electronic carrier density n(x,y). The formulation reduces the computational procedure to a much simpler problem, similar to the one-dimensional quantization case, significantly diminishing running time.
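A toy version of the self-consistent loop may clarify the overall scheme. The sketch below works in one dimension with dimensionless units and, unlike the paper, replaces the split-operator method with direct diagonalization of a finite-difference Hamiltonian and the FTCS relaxation with a direct solve of the 1-D Poisson equation; the confining potential, coupling strength, occupation rule and mixing factor are all assumptions.

    # Hedged sketch: a self-consistent Schroedinger-Poisson loop in 1-D, dimensionless
    # units (hbar = m = 1). Illustrative only; not the paper's numerical scheme.
    import numpy as np

    def self_consistent_loop(n_grid=200, length=20.0, n_occ=2, coupling=0.05,
                             mix=0.3, n_iter=100):
        dx = length / (n_grid - 1)
        x = np.linspace(0, length, n_grid)
        v_ext = 0.005 * (x - length / 2) ** 2          # assumed confining potential
        lap = (np.diag(np.full(n_grid - 1, 1.0), -1) - 2 * np.eye(n_grid)
               + np.diag(np.full(n_grid - 1, 1.0), 1)) / dx ** 2
        v_h = np.zeros(n_grid)
        for _ in range(n_iter):
            # Schroedinger step: lowest eigenstates of H = -0.5*Lap + V
            h = -0.5 * lap + np.diag(v_ext + v_h)
            energies, states = np.linalg.eigh(h)
            psi = states[:, :n_occ] / np.sqrt(dx)       # normalize so sum |psi|^2 dx = 1
            density = np.sum(psi ** 2, axis=1)          # one electron per occupied state
            # Poisson step: V_H'' = -coupling * density, zero boundary values
            v_new = np.linalg.solve(lap, -coupling * density)
            v_h = (1 - mix) * v_h + mix * v_new         # simple mixing for stability
        return energies[:n_occ], density

    if __name__ == "__main__":
        e, n = self_consistent_loop()
        print("occupied levels:", e)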
Abstract:
A direct reconstruction algorithm for complex conductivities in W^{2,∞}(Ω), where Ω is a bounded, simply connected Lipschitz domain in R^2, is presented. The framework is based on the uniqueness proof by Francini (2000 Inverse Problems 16 107-19), but equations relating the Dirichlet-to-Neumann map to the scattering transform and the exponentially growing solutions are not present in that work and are derived here. The algorithm constitutes the first D-bar method for the reconstruction of conductivities and permittivities in two dimensions. Reconstructions of numerically simulated chest phantoms with discontinuities at the organ boundaries are included.
Abstract:
Background/purpose: Gallstones and cholelithiasis are being increasingly diagnosed in children owing to the widespread use of ultrasonography. The treatment of choice is cholecystectomy, and routine intraoperative cholangiography is recommended to explore the common bile duct. The objectives of this study were to describe our experience with the management of gallstone disease in childhood over the last 18 years and to propose an algorithm to guide the approach to cholelithiasis in children based on clinical and ultrasonographic findings. Methods: The data for this study were obtained by reviewing the records of all patients with gallstone disease treated between January 1994 and October 2011. The patients were divided into the following 5 groups based on their symptoms: group 1, asymptomatic; group 2, nonbiliary obstructive symptoms; group 3, acute cholecystitis symptoms; group 4, a history of biliary obstructive symptoms that were completely resolved by the time of surgery; and group 5, ongoing biliary obstructive symptoms. Patients were treated according to an algorithm based on their clinical, ultrasonographic, and endoscopic retrograde cholangiopancreatography (ERCP) findings. Results: A total of 223 patients were diagnosed with cholelithiasis, and comorbidities were present in 177 patients (79.3%). The most common comorbidities were hemolytic disorders in 139 patients (62.3%) and previous bariatric surgery in 16 (7.1%). Although symptoms were present in 134 patients (60.0%), cholecystectomy was performed for all patients with cholelithiasis, even if they were asymptomatic; the surgery was laparoscopic in 204 patients and open in 19. Fifty-six patients (25.1%) presented with complications as the first sign of cholelithiasis (eg, pancreatitis, choledocolithiasis, or acute calculous cholecystitis). Intraoperative cholangiography was indicated in 15 children, and it was positive in only 1 (0.4%) for whom ERCP was necessary to extract the stone after a laparoscopic cholecystectomy (LC). Preoperative ERCP was performed in 11 patients to extract the stones, and a hepaticojejunostomy was indicated in 2 patients. There were no injuries to the hepatic artery or common bile duct in our series. Conclusions: Based on our experience, we can propose an algorithm to guide the approach to cholelithiasis in the pediatric population. The final conclusion is that LC results in limited postoperative complications in children with gallstones. When a diagnosis of choledocolithiasis or dilation of the choledocus is made, ERCP is necessary if obstructive symptoms persist either before or after an LC. Intraoperative cholangiography and laparoscopic common bile duct exploration are not mandatory. Published by Elsevier Inc.
Abstract:
A power transformer needs continuous monitoring and fast protection, since it is a very expensive piece of equipment and an essential element of an electrical power system. The most common protection technique is percentage differential logic, which provides discrimination between an internal fault and other operating conditions. Unfortunately, some operating conditions of power transformers can mislead the conventional protection, negatively affecting power system stability. This study proposes a new algorithm to improve protection performance by using fuzzy logic, artificial neural networks and genetic algorithms. An electrical power system was modelled using the Alternative Transients Program (ATP) software to obtain the operating conditions and fault situations needed to test the developed algorithm, as well as a commercial differential relay. The results show improved reliability, as well as a faster response of the proposed technique, when compared with conventional ones.
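For reference, conventional percentage differential logic for a single phase can be sketched as below; the slope and pickup values are illustrative, not taken from the paper or from any particular relay.

    # Hedged sketch of percentage differential logic for one phase: trip when the
    # differential current exceeds a restraint-proportional threshold.

    def percentage_differential_trip(i_primary, i_secondary, slope=0.3, pickup=0.2):
        """i_primary, i_secondary: per-unit current phasors (complex) referred to the
        same base, with CT polarity such that they cancel under normal through load."""
        i_diff = abs(i_primary + i_secondary)             # operating (differential) current
        i_rest = (abs(i_primary) + abs(i_secondary)) / 2  # restraint current
        return i_diff > pickup + slope * i_rest

    if __name__ == "__main__":
        # normal through load: currents cancel, no trip
        print(percentage_differential_trip(1.0 + 0j, -1.0 + 0j))
        # internal fault: currents add, trip
        print(percentage_differential_trip(1.0 + 0j, 0.8 + 0j))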
Abstract:
In this study, a dynamic programming approach to the unconstrained two-dimensional non-guillotine cutting problem is presented. The method extends the recently introduced recursive partitioning approach for the manufacturer's pallet loading problem. The approach involves two phases and uses bounds based on unconstrained two-staged and non-staged guillotine cutting. The method is able to find the optimal cutting pattern for a large number of problem instances of moderate size known in the literature, and no counterexample was found for which the approach fails to reach a known optimal solution. For the instances in which the required computer runtime is excessive, the approach is combined with simple heuristics to reduce its running time. Detailed numerical experiments show the reliability of the method. Journal of the Operational Research Society (2012) 63, 183-200. doi: 10.1057/jors.2011.6. Published online 17 August 2011.
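One of the bounds mentioned above, the unconstrained non-staged guillotine value, can be computed by the classic dynamic program sketched below (unlimited copies of each piece, integer dimensions, no rotation); this is only the bounding sub-problem, not the two-phase non-guillotine method of the paper.

    # Hedged sketch: dynamic program for UNconstrained (non-staged) guillotine cutting.
    from functools import lru_cache

    def guillotine_value(plate_l, plate_w, pieces):
        """pieces: list of (length, width, value). Returns the best total value."""

        @lru_cache(maxsize=None)
        def best(l, w):
            # best single piece that fits (no rotation in this sketch)
            value = max([v for (pl, pw, v) in pieces if pl <= l and pw <= w], default=0)
            # try every vertical guillotine cut
            for x in range(1, l // 2 + 1):
                value = max(value, best(x, w) + best(l - x, w))
            # try every horizontal guillotine cut
            for y in range(1, w // 2 + 1):
                value = max(value, best(l, y) + best(l, w - y))
            return value

        return best(plate_l, plate_w)

    if __name__ == "__main__":
        print(guillotine_value(10, 10, [(3, 4, 5), (5, 5, 9), (2, 2, 1)]))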
Abstract:
In this paper, we address the problem of defining the product mix in order to maximise a system's throughput. This problem is well known to be NP-complete, and therefore most contributions to the topic focus on developing heuristics that are able to obtain good solutions for the problem in a short CPU time. In particular, constructive heuristics are available for the problem, such as those by Fredendall and Lea and by Aryanezhad and Komijan. We propose a new constructive heuristic based on the Theory of Constraints and the Knapsack Problem. The computational results indicate that the proposed heuristic yields better results than the existing heuristics.
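For context, the classic Theory-of-Constraints constructive step (rank products by throughput per unit of bottleneck time and load the bottleneck greedily) is sketched below; the knapsack-based refinement proposed in the paper is not shown, and the data layout is an assumption.

    # Hedged sketch of the classic TOC product-mix heuristic (illustrative only).

    def toc_product_mix(products, capacity):
        """products: list of dicts with keys
             'margin'     - throughput (price minus raw material cost) per unit,
             'bottleneck' - bottleneck time consumed per unit,
             'demand'     - maximum demand in units.
           capacity: available bottleneck time. Returns units to make of each product."""
        order = sorted(range(len(products)),
                       key=lambda i: products[i]['margin'] / products[i]['bottleneck'],
                       reverse=True)
        mix, remaining = [0] * len(products), capacity
        for i in order:
            units = min(products[i]['demand'],
                        int(remaining // products[i]['bottleneck']))
            mix[i] = units
            remaining -= units * products[i]['bottleneck']
        return mix

    if __name__ == "__main__":
        prods = [dict(margin=45, bottleneck=15, demand=100),
                 dict(margin=60, bottleneck=30, demand=50),
                 dict(margin=40, bottleneck=10, demand=60)]
        print(toc_product_mix(prods, capacity=2400))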
Abstract:
XML similarity evaluation has become a central issue in the database and information communities, its applications ranging over document clustering, version control, data integration and ranked retrieval. Various algorithms for comparing hierarchically structured data, XML documents in particular, have been proposed in the literature. Most of them make use of techniques for finding the edit distance between tree structures, XML documents being commonly modeled as Ordered Labeled Trees. Yet, a thorough investigation of current approaches led us to identify several similarity aspects, i.e., sub-tree related structural and semantic similarities, which are not sufficiently addressed while comparing XML documents. In this paper, we provide an integrated and fine-grained comparison framework to deal with both structural and semantic similarities in XML documents (detecting the occurrences and repetitions of structurally and semantically similar sub-trees), and to allow the end-user to adjust the comparison process according to her requirements. Our framework consists of four main modules for (i) discovering the structural commonalities between sub-trees, (ii) identifying sub-tree semantic resemblances, (iii) computing tree-based edit operations costs, and (iv) computing tree edit distance. Experimental results demonstrate higher comparison accuracy with respect to alternative methods, while timing experiments reflect the impact of semantic similarity on overall system performance.
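As a small illustration of edit-distance-based comparison of ordered labeled trees, the sketch below computes a simplified top-down variant in which child lists are aligned by a sequence DP and deleting a node removes its whole subtree; this restricted variant is for illustration only and is neither the general tree edit distance nor the framework described in the paper.

    # Hedged sketch: a simplified, top-down edit distance for XML-like ordered,
    # labeled trees represented as (label, [children]).

    def size(tree):
        label, children = tree
        return 1 + sum(size(c) for c in children)

    def tree_dist(t1, t2):
        (l1, c1), (l2, c2) = t1, t2
        relabel = 0 if l1 == l2 else 1
        return relabel + align_children(c1, c2)

    def align_children(a, b):
        """Sequence alignment of two child lists; deleting or inserting a child
        costs the size of its entire subtree."""
        n, m = len(a), len(b)
        d = [[0] * (m + 1) for _ in range(n + 1)]
        for i in range(1, n + 1):
            d[i][0] = d[i - 1][0] + size(a[i - 1])
        for j in range(1, m + 1):
            d[0][j] = d[0][j - 1] + size(b[j - 1])
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                d[i][j] = min(d[i - 1][j] + size(a[i - 1]),                   # delete subtree
                              d[i][j - 1] + size(b[j - 1]),                   # insert subtree
                              d[i - 1][j - 1] + tree_dist(a[i - 1], b[j - 1]))  # match nodes
        return d[n][m]

    if __name__ == "__main__":
        # toy stand-ins for parsed XML elements
        doc1 = ("article", [("title", []), ("body", [("sec", []), ("sec", [])])])
        doc2 = ("article", [("title", []), ("body", [("sec", [])]), ("ref", [])])
        print(tree_dist(doc1, doc2))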