960 results for Weighted Query
Abstract:
In this article we show that for corank 1, quasi-homogeneous and finitely determined map germs $f : (\mathbb{C}^n, 0) \to (\mathbb{C}^3, 0)$, $n \ge 3$, one can obtain formulae for the polar multiplicities defined on the following stable types of $f$, namely $f(\Delta(f))$ and $f(\Sigma^{n-2,1}(f))$, in terms of the weights and degrees of $f$. As a consequence we show how to compute the Euler obstruction of these stable types, also in terms of the weights and degrees of $f$.
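For reference, the quasi-homogeneity assumption can be written in the standard weighted form below; this is the textbook definition, and the paper's own normalization of weights and degrees may differ.

```latex
% Quasi-homogeneous (weighted homogeneous) map germ: there exist positive
% integer weights w_1, ..., w_n and degrees d_1, d_2, d_3 such that
f_i\!\left(t^{w_1} x_1, \dots, t^{w_n} x_n\right) = t^{d_i}\, f_i(x_1, \dots, x_n),
\qquad i = 1, 2, 3, \quad t \in \mathbb{C}.
```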
Abstract:
Let $(a, b) \subset (0, \infty)$ and, for any positive integer $n$, let $S_n$ be the Chebyshev space in $[a, b]$ defined by $S_n := \operatorname{span}\{x^{-n/2+k},\ k = 0, \dots, n\}$. The unique (up to a constant factor) function $\tau_n \in S_n$ which satisfies the orthogonality relation $\int_a^b \tau_n(x)\, q(x)\, \bigl(x(b - x)(x - a)\bigr)^{-1/2}\, dx = 0$ for every $q \in S_{n-1}$ is called the orthogonal Chebyshev $S_n$-polynomial. This paper exhibits some interesting properties of the orthogonal Chebyshev $S_n$-polynomials and demonstrates their importance for the problem of approximation by $S_n$-polynomials. A simple proof of a Jackson-type theorem is given and the Lagrange interpolation problem by functions from $S_n$ is discussed. It is also shown that $\tau_n$ obeys an extremal property in $L^q$, $1 \le q \le \infty$. Natural analogues of some inequalities for algebraic polynomials, which we expect to hold for the $S_n$-polynomials, are conjectured.
Abstract:
An extended version of HIER, a query-the-user facility for expert systems, is presented. HIER was developed to run over Prolog programs and has been incorporated into systems that support the design of large and complex applications. The framework of the extended version is described, as well as the major features of the implementation. An example involving the design of a specific database application is included to illustrate the use of the tool.
Abstract:
In this paper we introduce the notion of a $\mathcal{G}$-pre-weighted homogeneous map germ ($\mathcal{G}$ is one of Mather's groups $\mathcal{A}$ or $\mathcal{K}$) and show that any $\mathcal{G}$-pre-weighted homogeneous map germ is $\mathcal{G}$-finitely determined. We also give an explicit order, based on the Newton polyhedron of a pre-weighted homogeneous germ of a function, such that the topological structure is preserved after perturbations by terms of higher order.
Abstract:
The development of new technologies that use peer-to-peer networks grows every day, driven by the need to share information, resources, and database services around the world. Among them are peer-to-peer databases, which take advantage of peer-to-peer networks to manage distributed knowledge bases, allowing the sharing of information that is semantically related but syntactically heterogeneous. However, given the structural characteristics of these networks, it is a challenge to ensure efficient search for information without compromising the autonomy of each node and the flexibility of the network. On the other hand, some studies propose the use of ontology semantics to assign a standardized categorization to information. The main original contribution of this work is to approach this problem with a proposal for query optimization supported by the Ant Colony algorithm and classification through ontologies. The results show that this strategy enables semantic support for searches in peer-to-peer databases, expanding the results without compromising network performance. © 2011 IEEE.
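The abstract does not give the algorithm's details, so the following is only a minimal sketch of how ant-colony-style pheromone routing can steer queries in a peer-to-peer overlay; the `Peer` class, the ontology-category labels, and the update constants are illustrative assumptions, not the paper's implementation.

```python
import random

EVAPORATION = 0.1   # pheromone decay applied per routing decision (assumed)
DEPOSIT = 1.0       # reinforcement for an edge that led to an answer (assumed)

class Peer:
    def __init__(self, pid, categories):
        self.pid = pid
        self.categories = set(categories)   # ontology classes stored at this node
        self.neighbors = []                 # other Peer objects
        # pheromone[(neighbor_id, category)] -> learned desirability of the edge
        self.pheromone = {}

    def route(self, category, ttl=5):
        """Forward a query 'ant' for `category`; return the answering peer, if any."""
        if category in self.categories:
            return self
        if ttl == 0 or not self.neighbors:
            return None
        # choose the next hop with probability proportional to pheromone
        weights = [self.pheromone.get((n.pid, category), 1.0)
                   for n in self.neighbors]
        nxt = random.choices(self.neighbors, weights=weights)[0]
        hit = nxt.route(category, ttl - 1)
        # evaporate, then reinforce the chosen edge only if it found an answer
        key = (nxt.pid, category)
        self.pheromone[key] = (1 - EVAPORATION) * self.pheromone.get(key, 1.0)
        if hit is not None:
            self.pheromone[key] += DEPOSIT
        return hit
```

Under this scheme, repeated queries concentrate pheromone on edges that actually lead to peers holding a given ontology category, so later queries reach answers in fewer hops without any node giving up local autonomy.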
Abstract:
A number of studies have demonstrated that simple elastic network models can reproduce experimental B-factors, providing insights into the structure-function properties of proteins. Here, we report a study on how to improve an elastic network model and explore its performance by predicting experimental B-factors. Elastic network models are built on the experimental Cα coordinates, and they take into account only the pairs of Cα atoms within a given cutoff distance $r_c$. These models describe the interactions by elastic springs with the same force constant. We have developed a method, based on numerical simulations with a simple coarse-grained force field, to attribute weights to these spring constants. This method considers the time that two Cα atoms remain connected in the network during partial unfolding, establishing a means of measuring the strength of each link. We examined two different coarse-grained force fields and explored the computation of these weights by unfolding the native structures. Proteins 2014; 82:119-129. (c) 2013 Wiley Periodicals, Inc.
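As a concrete reference point, here is a minimal weighted Gaussian network model in NumPy; the uniform-spring case is the textbook model the abstract starts from, while the `weights` argument stands in for the simulation-derived link strengths (the paper's actual force fields and weighting procedure are not reproduced here).

```python
import numpy as np

def gnm_bfactors(coords, weights=None, rc=7.0):
    """Predict relative B-factors with a (weighted) Gaussian network model.

    coords : (N, 3) array of C-alpha coordinates
    weights: optional (N, N) array of per-contact spring strengths;
             uniform springs if None (illustrative stand-in for the
             simulation-derived weights described in the abstract)
    rc     : cutoff distance in angstroms
    """
    n = len(coords)
    d = np.linalg.norm(coords[:, None] - coords[None, :], axis=-1)
    contact = (d < rc) & ~np.eye(n, dtype=bool)
    k = contact.astype(float) if weights is None else contact * weights
    # Kirchhoff (connectivity) matrix: row sums on the diagonal, -k off it
    kirchhoff = np.diag(k.sum(axis=1)) - k
    # pseudo-inverse discards the zero mode (rigid-body translation)
    g_inv = np.linalg.pinv(kirchhoff)
    # B_i is proportional to the i-th diagonal element of the inverse
    return np.diag(g_inv)
```

Correlating the returned vector with experimental B-factors (e.g. by Pearson correlation) is the kind of evaluation the abstract describes.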
Abstract:
This paper proposes a technique for solving the multiobjective environmental/economic dispatch problem using the weighted-sum and ε-constraint strategies, which transform the problem into a set of single-objective problems. In the first strategy, the objective function is a weighted sum of the environmental and economic objective functions. The second strategy treats one of the objective functions, in this case the environmental function, as a problem constraint, bounded above by a constant. A specific predictor-corrector primal-dual interior point method that uses the modified log barrier is proposed for solving the set of single-objective problems generated by these strategies. The purpose of the modified barrier approach is to solve the problem with a relaxation of its original feasible region, enabling the method to be initialized with infeasible points. The tests involving the proposed solution technique indicate (i) the efficiency of the proposed method with respect to initialization with infeasible points, and (ii) its ability to find a set of efficient solutions for the multiobjective environmental/economic dispatch problem.
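To make the two scalarization strategies concrete, here is a small sketch on a hypothetical two-generator system; the quadratic cost and emission curves, the load value, and the use of scipy's general-purpose solver are all illustrative assumptions (the paper uses its own interior point method).

```python
import numpy as np
from scipy.optimize import minimize, NonlinearConstraint

# Hypothetical stand-ins for the fuel-cost and emission objectives
def cost(p):      # economic objective
    return 0.01 * p[0]**2 + 2.0 * p[0] + 0.02 * p[1]**2 + 1.5 * p[1]

def emission(p):  # environmental objective
    return 0.03 * p[0]**2 + 0.5 * p[0] + 0.01 * p[1]**2 + 0.8 * p[1]

demand = 100.0    # total load the two units must meet (MW, assumed)
power_balance = NonlinearConstraint(lambda p: p.sum(), demand, demand)
bounds = [(10.0, 80.0)] * 2
p0 = np.array([50.0, 50.0])

# Weighted-sum strategy: sweep the trade-off weight w in [0, 1].
for w in (0.0, 0.5, 1.0):
    res = minimize(lambda p: w * cost(p) + (1 - w) * emission(p), p0,
                   bounds=bounds, constraints=[power_balance])
    print(f"w={w:.1f}  P={res.x.round(2)}  cost={cost(res.x):.1f}")

# epsilon-constraint strategy: minimize cost with emission bounded above.
eps = 160.0
emis_cap = NonlinearConstraint(emission, -np.inf, eps)
res = minimize(cost, p0, bounds=bounds,
               constraints=[power_balance, emis_cap])
print(f"eps={eps}  P={res.x.round(2)}  emission={emission(res.x):.1f}")
```

Each weight (or each bound ε) yields one efficient point, so sweeping them traces an approximation of the Pareto front described in the abstract.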
Abstract:
In this paper we address the "skull-stripping" problem in 3D MR images. We propose a new method that employs an efficient and unique histogram analysis. A fundamental component of this analysis is an algorithm for partitioning a histogram based on the position of the maximum deviation from a Gaussian fit. In our experiments we use a comprehensive image database, including both synthetic and real MRI, and compare our method with two other well-known methods, namely BSE and BET. For all datasets we achieved superior results. Our method is also largely independent of parameter tuning and very robust across considerable variations of the noise ratio.
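A minimal sketch of the partitioning idea, under the assumption that the Gaussian is fitted by simple moments and the histogram is split at the bin of largest absolute deviation; the authors' exact fitting and partitioning procedure may differ.

```python
import numpy as np

def partition_histogram(intensities, bins=256):
    """Split an intensity histogram at the maximum deviation from a
    Gaussian fit -- an illustrative reading of the partitioning step,
    not the authors' exact procedure."""
    hist, edges = np.histogram(intensities, bins=bins, density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    # moment-based Gaussian fit to the whole intensity distribution
    mu, sigma = intensities.mean(), intensities.std()
    gauss = np.exp(-0.5 * ((centers - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    # partition point: the bin where the fit deviates most from the data
    return centers[np.argmax(np.abs(hist - gauss))]
```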
Abstract:
In this paper we study germs of polynomials formed by the product of semi-weighted homogeneous polynomials of the same type, which we call semi-weighted homogeneous arrangements. It is shown how the Lê numbers of such polynomials are computed using only their weights and degree of homogeneity. A key point of the main theorem is to find the number called the polar ratio of this polynomial class. An important consequence is the description of the Euler characteristic of the Milnor fibre of such arrangements depending only on their weights and degree of homogeneity. The constancy of the Lê numbers in families formed by such arrangements is shown, with the deformed terms having weighted degree greater than the weighted degree of the initial germ. Moreover, using the results of Massey applied to families of function germs, we obtain the constancy of the homology of the Milnor fibre in this family of semi-weighted homogeneous arrangements.
Abstract:
In this article, we propose a new Bayesian flexible cure rate survival model, which generalises the stochastic model of Klebanov et al. [Klebanov LB, Rachev ST and Yakovlev AY. A stochastic model of radiation carcinogenesis: latent time distributions and their properties. Math Biosci 1993; 113: 51-75] and has much in common with the destructive model formulated by Rodrigues et al. [Rodrigues J, de Castro M, Balakrishnan N and Cancho VG. Destructive weighted Poisson cure rate models. Technical Report, Universidade Federal de São Carlos, São Carlos-SP, Brazil, 2009 (accepted in Lifetime Data Analysis)]. In our approach, the accumulated number of lesions or altered cells follows a compound weighted Poisson distribution. This model is more flexible than the promotion time cure model in terms of dispersion. Moreover, it possesses an interesting and realistic interpretation of the biological mechanism of the occurrence of the event of interest, as it includes a destructive process of tumour cells after an initial treatment, or the capacity of an individual exposed to irradiation to repair altered cells that result in cancer induction. In other words, what is recorded is only the damaged portion of the original number of altered cells that was not eliminated by the treatment or repaired by the individual's repair system. Markov chain Monte Carlo (MCMC) methods are then used to develop Bayesian inference for the proposed model. We also discuss model selection and present an illustration with a cutaneous melanoma data set analysed by Rodrigues et al. [Rodrigues J, de Castro M, Balakrishnan N and Cancho VG. Destructive weighted Poisson cure rate models. Technical Report, Universidade Federal de São Carlos, São Carlos-SP, Brazil, 2009 (accepted in Lifetime Data Analysis)].
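The destructive mechanism has a simple generative reading that is easy to check by simulation; the sketch below uses a plain Poisson as a stand-in for the paper's compound weighted Poisson law, so the constants and the closed-form check are assumptions of this simplified setting.

```python
import numpy as np

rng = np.random.default_rng(0)

def cure_fraction(theta, p, n_sim=100_000):
    """Monte Carlo estimate of the cured proportion in a destructive
    Poisson cure model: M initiated cells, each surviving treatment
    with probability p; the subject is cured when no cell survives.
    A plain Poisson stands in for the paper's compound weighted Poisson."""
    m = rng.poisson(theta, size=n_sim)   # initial number of altered cells
    d = rng.binomial(m, p)               # cells left after the destructive process
    return np.mean(d == 0)

# For Poisson thinning the exact cure fraction is exp(-theta * p):
print(cure_fraction(2.0, 0.3), np.exp(-2.0 * 0.3))
```

The recorded quantity `d` is exactly "the damaged portion of the original number of altered cells" the abstract refers to, and the cured subpopulation corresponds to `d == 0`.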
Abstract:
We study the action of a weighted Fourier–Laplace transform on the functions in the reproducing kernel Hilbert space (RKHS) associated with a positive definite kernel on the sphere. After defining a notion of smoothness implied by the transform, we show that smoothness of the kernel implies the same smoothness for the generating elements (spherical harmonics) in the Mercer expansion of the kernel. We prove a reproducing property for the weighted Fourier–Laplace transform of the functions in the RKHS and embed the RKHS into spaces of smooth functions. Some relevant properties of the embedding are considered, including compactness and boundedness. The approach taken in the paper includes two important notions of differentiability characterized by weighted Fourier–Laplace transforms: fractional derivatives and Laplace–Beltrami derivatives.
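The abstract does not reproduce the kernel expansion it refers to, but for continuous positive definite kernels on the sphere the Mercer/Schoenberg expansion in spherical harmonics takes the standard form below; the coefficient notation is illustrative, and the weighted Fourier–Laplace transform studied in the paper acts on expansions of this type.

```latex
% Standard spherical-harmonic (Mercer) expansion of a continuous positive
% definite kernel K on the sphere S^d; Y_{k,m} are spherical harmonics and
% N(k,d) is the dimension of the degree-k harmonic space. Notation is
% illustrative, not taken from the paper.
K(x, y) = \sum_{k \ge 0} a_k \sum_{m=1}^{N(k,d)} Y_{k,m}(x)\, Y_{k,m}(y),
\qquad a_k \ge 0, \qquad \sum_{k \ge 0} a_k\, N(k, d) < \infty .
```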
Abstract:
One of the five components of the triskel architecture, a NoSQL database that attempts to address the semantic web's Big Data problem, namely the large number of resource identifiers that would be needed given the growing number of websites: specifically, the engine that manages the execution of patterns based on triples and on RDF technology. It receives the query request from the interpreter and analyses the patterns involved in the query in search of exploitable dependencies between them, so that the query can be executed faster; it then resolves the individual patterns against the storage layer, a TripleStore, and returns the result of the request as a table.
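Since the abstract only names the engine's responsibilities, the following is a hypothetical sketch of the dependency idea: triple patterns that share variables are ordered so that each resolved pattern narrows the bindings of the next. All names and the greedy ordering heuristic are assumptions, not triskel's actual code.

```python
def is_var(term):
    return isinstance(term, str) and term.startswith("?")

def match(pattern, store, bindings):
    """Yield extended bindings for one (s, p, o) pattern over a list of triples."""
    for triple in store:
        new = dict(bindings)
        for pat, val in zip(pattern, triple):
            if is_var(pat):
                if new.setdefault(pat, val) != val:
                    break        # variable already bound to a different value
            elif pat != val:
                break            # constant does not match
        else:
            yield new

def query(patterns, store):
    """Resolve patterns in an order that exploits shared variables."""
    # greedy order: start with the pattern with fewest variables, then
    # prefer patterns sharing variables with those already resolved
    remaining = sorted(patterns, key=lambda p: sum(map(is_var, p)))
    ordered, seen = [], set()
    while remaining:
        nxt = next((p for p in remaining if seen & {t for t in p if is_var(t)}),
                   remaining[0])
        remaining.remove(nxt)
        ordered.append(nxt)
        seen |= {t for t in nxt if is_var(t)}
    results = [{}]               # table of variable bindings, grown per pattern
    for pat in ordered:
        results = [b for r in results for b in match(pat, store, r)]
    return results

store = [("alice", "knows", "bob"), ("bob", "age", "33")]
print(query([("?x", "knows", "?y"), ("?y", "age", "?a")], store))
```

Because the second pattern reuses `?y`, it is evaluated against already-restricted bindings rather than the whole store, which is the kind of exploitable dependency the description above alludes to.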