87 results for MEANS ALGORITHM


Relevance: 20.00%

Abstract:

The properties of complex networks are highly influenced by border effects, frequently found as a consequence of the finite nature of real-world networks as well as of network sampling. Therefore, it becomes critical to devise effective means for sound estimation of network topological and dynamical properties while avoiding these types of artifacts. In the current work, an algorithm for minimization of border effects is proposed and discussed, and its potential is illustrated with respect to two real-world networks, namely bone canals and air transportation. (C) 2009 Elsevier B.V. All rights reserved.

Relevance: 20.00%

Abstract:

Two targets, reverse transcriptase (RT) and protease from HIV-1, have been used during the past two decades for the discovery of non-nucleoside reverse transcriptase inhibitors (NNRTIs) and protease inhibitors (PIs) that belong to the arsenal of antiretroviral therapy. Herein these enzymes were chosen as templates for conducting computer-aided ligand design. Ligand- and structure-based drug design were the starting points for selecting compounds from a database bearing more than five million compounds by means of cheminformatic tools. New promising lead structures were retrieved from the database and are available for acquisition and testing. Classes of molecules already described as NNRTIs or PIs in the literature also came out and were useful to prove the reliability of the workflow, thus validating the work carried out so far. (c) 2007 Elsevier Masson SAS. All rights reserved.
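As a rough illustration of the database-filtering step described above, the sketch below applies a Lipinski-style rule-of-five pre-filter to candidate compound records; the `Candidate` fields and the thresholds are illustrative assumptions, not the actual screening criteria used in the study.

```python
# Minimal sketch of the kind of cheminformatic pre-filter used to narrow a
# multimillion-compound database before docking. The descriptor set and the
# Lipinski-style thresholds are hypothetical stand-ins for the paper's tools.
from dataclasses import dataclass
from typing import List

@dataclass
class Candidate:
    name: str
    mol_weight: float   # molecular weight in Da
    logp: float         # octanol-water partition coefficient
    h_donors: int       # hydrogen-bond donors
    h_acceptors: int    # hydrogen-bond acceptors

def passes_rule_of_five(c: Candidate) -> bool:
    """Keep compounds with drug-like physicochemical properties."""
    return (c.mol_weight <= 500 and c.logp <= 5
            and c.h_donors <= 5 and c.h_acceptors <= 10)

def prefilter(database: List[Candidate]) -> List[Candidate]:
    return [c for c in database if passes_rule_of_five(c)]

if __name__ == "__main__":
    db = [Candidate("cmpd-1", 431.9, 3.2, 2, 7),
          Candidate("cmpd-2", 812.4, 6.8, 5, 12)]   # fails MW and logP
    print([c.name for c in prefilter(db)])          # ['cmpd-1']
```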

Relevance: 20.00%

Abstract:

This paper presents a study of wavelets and their characteristics for the specific purpose of serving as a feature extraction tool for speaker verification (SV), considering a Radial Basis Function (RBF) classifier, which is a particular type of Artificial Neural Network (ANN). Examining characteristics such as support size and frequency and phase responses, among others, we show how Discrete Wavelet Transforms (DWTs), particularly the ones derived from Finite Impulse Response (FIR) filters, can be used to extract from a speech signal important features that are useful for SV. Lastly, an SV algorithm based on the concepts presented is described.
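To make the feature-extraction idea concrete, here is a minimal sketch that turns a speech frame into a vector of DWT subband log-energies suitable for feeding an RBF classifier. The PyWavelets library, the Daubechies-4 wavelet (an FIR-derived wavelet), and the frame length are our illustrative choices, not necessarily the paper's.

```python
# Minimal sketch: DWT subband log-energies as speaker-verification features.
# The wavelet ('db4'), decomposition depth, and frame length are assumptions
# for illustration; the paper studies several wavelets and does not fix these.
import numpy as np
import pywt

def dwt_features(frame: np.ndarray, wavelet: str = "db4", level: int = 5) -> np.ndarray:
    """Decompose one speech frame and return one log-energy per subband."""
    coeffs = pywt.wavedec(frame, wavelet, level=level)   # [cA_L, cD_L, ..., cD_1]
    energies = [np.sum(c ** 2) + 1e-12 for c in coeffs]  # avoid log(0)
    return np.log(np.asarray(energies))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    frame = rng.standard_normal(512)   # stand-in for a 512-sample speech frame
    print(dwt_features(frame))         # 6 features: 1 approximation + 5 detail bands
```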

Relevance: 20.00%

Abstract:

This paper proposes an improved voice activity detection (VAD) algorithm using wavelets and a support vector machine (SVM) for the European Telecommunications Standards Institute (ETSI) adaptive multi-rate (AMR) narrow-band (NB) and wide-band (WB) speech codecs. First, based on the wavelet transform, the original IIR filter bank and pitch/tone detector are implemented, respectively, via a wavelet filter bank and a wavelet-based pitch/tone detection algorithm. The wavelet filter bank divides the input speech signal into several frequency bands so that the signal power level at each sub-band can be calculated. In addition, the background noise level can be estimated in each sub-band by using the wavelet de-noising method. The wavelet filter bank is also used to detect correlated complex signals such as music. The proposed algorithm then applies an SVM to train an optimized non-linear VAD decision rule involving the sub-band power, noise level, pitch period, tone flag, and complex-signal warning flag of the input speech signal. By using the trained SVM, the proposed VAD algorithm produces more accurate detection results. Various experimental results on the Aurora speech database under different noise conditions show that the proposed algorithm achieves VAD performance considerably superior to that of the AMR-NB VAD Options 1 and 2 and the AMR-WB VAD. (C) 2009 Elsevier Ltd. All rights reserved.
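The decision stage can be sketched as follows. This uses scikit-learn's SVC as a stand-in for the paper's SVM training; the feature-vector layout (sub-band powers, noise level, pitch period, tone flag, complex-signal flag), the RBF kernel, and all the data are illustrative assumptions.

```python
# Sketch of the SVM-based VAD decision rule: each frame is summarized by the
# features named in the abstract and classified speech/non-speech. Training
# data, kernel, and hyperparameters here are placeholders, not the paper's.
import numpy as np
from sklearn.svm import SVC

def frame_features(subband_power, noise_level, pitch_period, tone_flag, complex_flag):
    """Pack the per-frame measurements into one feature vector."""
    return np.concatenate([np.atleast_1d(subband_power),
                           [noise_level, pitch_period, tone_flag, complex_flag]])

rng = np.random.default_rng(1)
# Hypothetical training set: 200 frames, 4 sub-band powers + 4 scalar features.
X = rng.standard_normal((200, 8))
y = rng.integers(0, 2, size=200)          # 1 = speech, 0 = noise (placeholder labels)

vad = SVC(kernel="rbf", C=1.0).fit(X, y)  # non-linear decision rule
frame = frame_features(rng.standard_normal(4), 0.3, 52.0, 0.0, 0.0)
print("speech" if vad.predict(frame.reshape(1, -1))[0] else "noise")
```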

Relevance: 20.00%

Abstract:

The focus of study in this paper is the class of packing problems. More specifically, it deals with the placement of a set of N circular items of unit radius inside an object with the aim of minimizing its dimensions. Differently shaped containers are considered, namely circles, squares, rectangles, strips and triangles. By means of the resolution of systems of non-linear equations through the Newton-Raphson method, the algorithm presented herein succeeds in improving, up to numerical machine precision, the accuracy of previous results attained by continuous optimization approaches. The computer implementation and the data sets are available at http://www.ime.usp.br/~egbirgin/packing/. (C) 2009 Elsevier Ltd. All rights reserved.
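For context, a Newton-Raphson iteration for a system F(x) = 0 looks like the sketch below; the toy two-equation system and the finite-difference Jacobian are illustrative choices, not the packing equations solved in the paper.

```python
# Generic Newton-Raphson iteration for a system F(x) = 0, with a
# finite-difference Jacobian. The example system (intersection of a unit
# circle and a line) is a toy stand-in for the packing equations.
import numpy as np

def newton_raphson(F, x0, tol=1e-12, max_iter=50, h=1e-7):
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        fx = F(x)
        if np.linalg.norm(fx) < tol:            # converged to machine precision
            break
        J = np.empty((len(fx), len(x)))
        for j in range(len(x)):                 # forward-difference Jacobian
            xp = x.copy()
            xp[j] += h
            J[:, j] = (F(xp) - fx) / h
        x = x - np.linalg.solve(J, fx)          # Newton step
    return x

def F(x):
    # x[0]^2 + x[1]^2 = 1 (unit circle) and x[0] = x[1] (line)
    return np.array([x[0] ** 2 + x[1] ** 2 - 1.0, x[0] - x[1]])

print(newton_raphson(F, [1.0, 0.5]))  # ~ [0.7071, 0.7071]
```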

Relevance: 20.00%

Abstract:

This paper presents the formulation of a combinatorial optimization problem with the following characteristics: (i) the search space is the power set of a finite set structured as a Boolean lattice; (ii) the cost function forms a U-shaped curve when applied to any lattice chain. This formulation applies to feature selection in the context of pattern recognition. The known approaches to this problem are branch-and-bound algorithms and heuristics that explore the search space only partially. Branch-and-bound algorithms are equivalent to the full search, while heuristics are not. This paper presents a branch-and-bound algorithm that differs from the known ones by exploiting the lattice structure and the U-shaped chain curves of the search space. The main contribution of this paper is the architecture of this algorithm, which is based on the representation and exploration of the search space through new lattice properties proven here. Several experiments with well-known public data indicate the superiority of the proposed method over sequential floating forward selection (SFFS), a popular heuristic that gives good results in very short computational time. In all experiments, the proposed method obtained better or equal results in similar or even smaller computational time. (C) 2009 Elsevier Ltd. All rights reserved.
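The pruning idea enabled by the U-shaped chain property can be sketched as follows: on any chain through the lattice, once the cost rises when a feature is added, every further superset on that chain can only cost more and is pruned. The toy quadratic cost below is an assumption chosen only to be U-shaped; it is not the paper's selection criterion.

```python
# Sketch of branch-and-bound feature selection exploiting a U-shaped cost
# along lattice chains. Each depth-first path (features added in increasing
# order) is a chain, so pruning once the cost rises is sound under the
# U-shape assumption. The cost function is a hypothetical illustration.
FEATURES = range(6)

def cost(subset: frozenset) -> float:
    # Toy U-shaped cost: minimized at size-3 subsets containing feature 0.
    penalty = 0.0 if 0 in subset else 1.0
    return (len(subset) - 3) ** 2 + penalty

def branch_and_bound():
    best = (float("inf"), frozenset())

    def explore(subset: frozenset, next_feature: int, prev_cost: float):
        nonlocal best
        c = cost(subset)
        if c < best[0]:
            best = (c, subset)
        if c > prev_cost:        # U-shape: cost keeps rising on this chain
            return
        for f in range(next_feature, len(FEATURES)):
            explore(subset | {f}, f + 1, c)

    explore(frozenset(), 0, float("inf"))
    return best

print(branch_and_bound())  # a size-3 subset containing feature 0, cost 0.0
```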

Relevance: 20.00%

Abstract:

Large-scale simulations of parts of the brain using detailed neuronal models to improve our understanding of brain functions are becoming a reality with the use of supercomputers and large clusters. However, the high acquisition and maintenance cost of these computers, including the physical space, air conditioning, and electrical power, limits the number of simulations of this kind that scientists can perform. Modern commodity graphics cards based on the CUDA platform contain graphics processing units (GPUs) composed of hundreds of processors that can simultaneously execute thousands of threads and thus constitute a low-cost solution for many high-performance computing applications. In this work, we present a CUDA algorithm that enables the execution, on multiple GPUs, of simulations of large-scale networks composed of biologically realistic Hodgkin-Huxley neurons. The algorithm represents each neuron as a CUDA thread, which solves the set of coupled differential equations that model each neuron. Communication among neurons located in different GPUs is coordinated by the CPU. We obtained speedups of 40 for the simulation of 200k neurons receiving random external input, and speedups of 9 for a network with 200k neurons and 20M neuronal connections, on a single computer with two graphics boards with two GPUs each, compared with a modern quad-core CPU. Copyright (C) 2010 John Wiley & Sons, Ltd.
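The per-neuron work in such a simulation is the numerical integration of the Hodgkin-Huxley equations. The sketch below performs one forward-Euler state update, vectorized with NumPy over all neurons as a CPU stand-in for the one-CUDA-thread-per-neuron mapping; the constants are the classic HH values, and the paper's neuron models and integration scheme may differ.

```python
# One forward-Euler step of the Hodgkin-Huxley equations for N neurons at
# once. NumPy vectorization stands in for the paper's one-thread-per-neuron
# CUDA mapping; parameter values are the standard HH constants.
import numpy as np

C, gNa, gK, gL = 1.0, 120.0, 36.0, 0.3       # uF/cm^2, mS/cm^2
ENa, EK, EL = 50.0, -77.0, -54.387           # reversal potentials, mV

def hh_step(V, m, h, n, I_ext, dt=0.01):
    """Advance membrane potential and gating variables by one Euler step."""
    a_m = 0.1 * (V + 40.0) / (1.0 - np.exp(-(V + 40.0) / 10.0))
    b_m = 4.0 * np.exp(-(V + 65.0) / 18.0)
    a_h = 0.07 * np.exp(-(V + 65.0) / 20.0)
    b_h = 1.0 / (1.0 + np.exp(-(V + 35.0) / 10.0))
    a_n = 0.01 * (V + 55.0) / (1.0 - np.exp(-(V + 55.0) / 10.0))
    b_n = 0.125 * np.exp(-(V + 65.0) / 80.0)
    I_ion = gNa * m**3 * h * (V - ENa) + gK * n**4 * (V - EK) + gL * (V - EL)
    V = V + dt * (I_ext - I_ion) / C
    m = m + dt * (a_m * (1.0 - m) - b_m * m)
    h = h + dt * (a_h * (1.0 - h) - b_h * h)
    n = n + dt * (a_n * (1.0 - n) - b_n * n)
    return V, m, h, n

N = 1000                                     # small stand-in for 200k neurons
V, m = np.full(N, -65.0), np.full(N, 0.05)
h, n = np.full(N, 0.6), np.full(N, 0.32)
for _ in range(1000):                        # 10 ms of simulated time
    V, m, h, n = hh_step(V, m, h, n, I_ext=np.random.uniform(0.0, 10.0, N))
print(V[:5])
```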

Relevance: 20.00%

Abstract:

One of the key issues in e-learning environments is the possibility of creating and evaluating exercises. However, the lack of tools supporting the authoring and automatic checking of exercises for specific topics (e.g., geometry) drastically reduces the advantages of using e-learning environments on a larger scale, as usually happens in Brazil. This paper describes an algorithm, and a tool based on it, designed for the authoring and automatic checking of geometry exercises. The algorithm dynamically compares the distances between the geometric objects of the student's solution and those of the template solution provided by the author of the exercise. Each solution is a geometric construction, which is considered a function receiving geometric objects (input) and returning other geometric objects (output). Thus, for a given problem, if we know one function (construction) that solves the problem, we can compare it to any other function to check whether they are equivalent. Two functions are equivalent if, and only if, they have the same output when the same input is applied. If the student's solution is equivalent to the template solution, then we consider the student's solution correct. Our software utility provides both authoring and checking tools that work directly on the Internet, together with learning management systems. These tools are implemented using the dynamic geometry software iGeom, which has been used in a geometry course since 2004 and has a successful track record in the classroom. Empowered with these new features, iGeom simplifies teachers' tasks, solves non-trivial problems in checking student solutions, and helps to increase student motivation by providing feedback in real time. (c) 2008 Elsevier Ltd. All rights reserved.
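The construction-as-function equivalence test can be illustrated as below: evaluate both constructions on many randomized inputs and compare outputs within a tolerance. The midpoint constructions and the tolerance are hypothetical examples, not taken from iGeom.

```python
# Sketch of the equivalence test described above: treat each construction as
# a function from input points to output points and compare outputs on many
# randomized inputs. The example constructions are hypothetical.
import math
import random

def template_solution(ax, ay, bx, by):
    """Author's construction: midpoint of segment AB."""
    return ((ax + bx) / 2.0, (ay + by) / 2.0)

def student_solution(ax, ay, bx, by):
    """Student's construction, written differently but geometrically equivalent."""
    return (ax + (bx - ax) / 2.0, ay + (by - ay) / 2.0)

def equivalent(f, g, trials=100, tol=1e-9):
    """Declare f and g equivalent if their outputs agree on random inputs."""
    for _ in range(trials):
        args = [random.uniform(-10, 10) for _ in range(4)]
        (x1, y1), (x2, y2) = f(*args), g(*args)
        if math.hypot(x1 - x2, y1 - y2) > tol:
            return False
    return True

print(equivalent(template_solution, student_solution))  # True
```

Sampling random inputs is of course only a probabilistic check; it flags any construction whose outputs diverge anywhere in the sampled region, which is the practical behavior an automatic grader needs.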

Relevance: 20.00%

Abstract:

Given two strings A and B of lengths n_a and n_b, n_a <= n_b, respectively, the all-substrings longest common subsequence (ALCS) problem obtains, for every substring B' of B, the length of the longest string that is a subsequence of both A and B'. The ALCS problem has many applications, such as finding approximate tandem repeats in strings, solving the circular alignment of two strings, and finding the alignment of one string with several others that share a common substring. We present an algorithm that prepares the basic data structure for ALCS queries in O(n_a n_b) time and O(n_a + n_b) space. After this preparation, it is possible to build a matrix of size O(n_b^2) that allows any LCS length to be retrieved in constant time. Some trade-offs between the space required and the querying time are discussed. To our knowledge, this is the first algorithm in the literature for the ALCS problem. (C) 2007 Elsevier B.V. All rights reserved.
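As a correctness baseline for the problem statement, the classic O(n_a n_b) dynamic program below computes the LCS length of A against a single substring B'. The paper's ALCS algorithm answers this for every substring of B far more efficiently; this naive routine only fixes the definition.

```python
# Baseline: classic rolling-array dynamic program for the LCS length of two
# strings. The ALCS algorithm in the paper subsumes this for all substrings
# of B; this routine is a reference for what one query computes.
def lcs_length(a: str, b: str) -> int:
    """Length of the longest common subsequence of a and b."""
    prev = [0] * (len(b) + 1)
    for ch in a:
        curr = [0]
        for j, bj in enumerate(b, start=1):
            curr.append(prev[j - 1] + 1 if ch == bj else max(prev[j], curr[j - 1]))
        prev = curr
    return prev[len(b)]

A, B = "ACGGTAG", "CGTTAGC"
# Answering one query for the substring B' = B[1:5] the naive way:
print(lcs_length(A, B[1:5]))   # LCS of "ACGGTAG" and "GTTA" -> 3
```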

Relevance: 20.00%

Abstract:

The usual tests for comparing variances and means (e.g., Bartlett's test and the F-test) assume that the sample comes from a normal distribution. In addition, the test for equality of means requires the assumption of homogeneity of variances. In some situations these assumptions are not satisfied, and hence we may face problems such as inflated test size and low power. In this paper, we describe two tests: Levene's test for equality of variances, which is robust under non-normality, and Brown and Forsythe's test for equality of means. We also present some modifications of Levene's test and of Brown and Forsythe's test proposed by different authors. We analyzed and applied one modified form of Brown and Forsythe's test to a real data set. This test is a robust alternative under non-normality and heteroscedasticity, and also when the data set has influential observations. Equality of variances can be well tested by Levene's test with centering at the sample median.
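The median-centered Levene test recommended at the end is available in SciPy; the sketch below shows the call on synthetic samples, with `center="median"` selecting the robust variant.

```python
# Median-centered Levene test for equality of variances (the robust variant
# recommended above). The two samples are synthetic; only the call pattern
# matters.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
group1 = rng.normal(loc=10.0, scale=1.0, size=50)
group2 = rng.normal(loc=10.0, scale=3.0, size=50)   # deliberately larger spread

stat, pvalue = stats.levene(group1, group2, center="median")
print(f"W = {stat:.3f}, p = {pvalue:.4f}")           # small p: variances differ
```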

Relevance: 20.00%

Abstract:

Tetrapyridylporphyrins containing four chloro(2,2'-bipyridine)platinum(II) complexes attached at the meta (3-H(2)TPtPyP) and para (4-H(2)TPtPyP) positions of the peripheral pyridine ligands were synthesized and their interaction with DNA was investigated. The compounds were isolated in the solid state and characterized by means of spectroscopic and analytical techniques. According to molecular simulations, the two isomers exhibit contrasting structural characteristics, consistent with a saddle-shaped configuration for 3-H(2)TPtPyP and a planar geometry for 4-H(2)TPtPyP. Surface plasmon resonance studies were carried out on the interaction of the complexes with calf thymus DNA, revealing preferential binding of 3-H(2)TPtPyP, presumably at the DNA major grooves. (C) 2008 Elsevier Inc. All rights reserved.

Relevance: 20.00%

Abstract:

A dosing algorithm including genetic (VKORC1 and CYP2C9 genotypes) and nongenetic factors (age, weight, therapeutic indication, and cotreatment with amiodarone or simvastatin) explained 51% of the variance in stable weekly warfarin doses in 390 patients attending an anticoagulant clinic in a Brazilian public hospital. The VKORC1 3673G>A genotype was the most important predictor of warfarin dose, with a partial R^2 value of 23.9%. Replacing the VKORC1 3673G>A genotype with the VKORC1 diplotype did not increase the algorithm's predictive power. We suggest that three other single-nucleotide polymorphisms (SNPs) (5808T>G, 6853G>C, and 9041G>A) that are in strong linkage disequilibrium (LD) with 3673G>A would be equally good predictors of the warfarin dose requirement. The algorithm's predictive power was similar across the self-identified "race/color" subsets. "Race/color" was not associated with stable warfarin dose in the multiple regression model, although the required warfarin dose was significantly lower (P = 0.006) in white (29 +/- 13 mg/week, n = 196) than in black patients (35 +/- 15 mg/week, n = 76).
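The dosing model is a multiple linear regression; the sketch below fits one with scikit-learn on synthetic data. The feature encoding (genotype as 0/1/2 allele counts, a binary cotreatment flag) and all coefficients are hypothetical illustrations of the model form, not the published algorithm.

```python
# Sketch of a warfarin-dose regression of the kind described above. All data
# are synthetic and the feature encoding is a hypothetical illustration; the
# published algorithm's variables and coefficients differ.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(7)
n = 390
X = np.column_stack([
    rng.integers(0, 3, n),        # VKORC1 3673G>A variant allele count (0/1/2)
    rng.integers(0, 3, n),        # CYP2C9 variant allele count (0/1/2)
    rng.normal(55, 15, n),        # age in years
    rng.normal(75, 12, n),        # weight in kg
    rng.integers(0, 2, n),        # amiodarone cotreatment (0/1)
])
# Synthetic "true" dose: lower with variant alleles and age, higher with weight.
dose = (45 - 8 * X[:, 0] - 5 * X[:, 1] - 0.2 * X[:, 2] + 0.1 * X[:, 3]
        - 6 * X[:, 4] + rng.normal(0, 5, n))

model = LinearRegression().fit(X, dose)
print(f"R^2 = {model.score(X, dose):.2f}")   # share of dose variance explained
```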