839 results for Polynomial Classifier
Stabilized Petrov-Galerkin methods for the convection-diffusion-reaction and the Helmholtz equations
Abstract:
We present two new stabilized high-resolution numerical methods for the convection–diffusion–reaction (CDR) and the Helmholtz equations, respectively. The work begins with an a priori analysis of some consistency recovery procedures for stabilization methods belonging to the Petrov–Galerkin framework. It was found that the use of some standard practices (e.g. M-matrix theory) for the design of essentially non-oscillatory numerical methods is not feasible when consistency recovery methods are employed. Hence, with respect to convective stabilization, such recovery methods are not preferred. Next, we present the design of a high-resolution Petrov–Galerkin (HRPG) method for the 1D CDR problem. The problem is studied from a fresh point of view, including practical implications for the formulation of the maximum principle, M-matrix theory, monotonicity and total variation diminishing (TVD) finite volume schemes. The current method builds on earlier methods that may be viewed as an upwinding operator plus a discontinuity-capturing operator. Finally, some remarks are made on the extension of the HRPG method to multiple dimensions. Next, we present a new numerical scheme for the Helmholtz equation resulting in quasi-exact solutions. The focus is on the approximation of the solution to the Helmholtz equation in the interior of the domain using compact stencils. Piecewise linear/bilinear polynomial interpolation is considered on a structured mesh/grid. The only a priori requirement is a mesh/grid resolution of at least eight elements per wavelength. No stabilization parameters are involved in the definition of the scheme. The scheme consists of taking the average of the equation stencils obtained by the standard Galerkin finite element method and the classical finite difference method. Dispersion analyses in 1D and 2D illustrate the quasi-exact properties of this scheme. Finally, some remarks are made on the extension of the scheme to unstructured meshes by designing a method within the Petrov–Galerkin framework.
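To make the averaging idea concrete, here is a minimal 1D sketch (an illustration, not the thesis code) comparing the dispersion error of the Galerkin FEM stencil, the central finite-difference stencil, and their average for u'' + k²u = 0 on a uniform grid with linear elements. The averaged stencil below is one reading of "taking the average of the equation stencils"; the grid spacing h is absorbed into the product kh.

```python
# Dispersion check for a 3-point stencil A*u_{j-1} + B*u_j + A*u_{j+1} = 0:
# a plane wave u_j = exp(i*kh_num*j) gives cos(kh_num) = -B / (2A).
import numpy as np

def numerical_wavenumber(kh, scheme):
    k = kh  # take h = 1, so the product k*h equals kh
    if scheme == "fem":    # Galerkin FEM with consistent mass
        A, B = -1.0 - k**2 / 6.0, 2.0 - 2.0 * k**2 / 3.0
    elif scheme == "fdm":  # classical central finite differences (lumped mass)
        A, B = -1.0, 2.0 - k**2
    else:                  # average of the two stencils
        A, B = -1.0 - k**2 / 12.0, 2.0 - 5.0 * k**2 / 6.0
    return np.arccos(np.clip(-B / (2.0 * A), -1.0, 1.0))

for kh in [2 * np.pi / 8, 2 * np.pi / 10, 2 * np.pi / 20]:  # >= 8 elements per wavelength
    errs = [abs(numerical_wavenumber(kh, s) - kh) / kh for s in ("fem", "fdm", "avg")]
    print(f"kh={kh:.3f}  rel. dispersion error  FEM={errs[0]:.2e}  FDM={errs[1]:.2e}  AVG={errs[2]:.2e}")
```

In this 1D check the averaged stencil's phase error at eight elements per wavelength is well below that of either parent scheme, consistent with the quasi-exact behaviour claimed above.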
Abstract:
We extend the basic concepts of Street's formal theory of monads from the setting of 2-categories to that of double categories. In particular, we introduce the double category Mnd(C) of monads in a double category C and define what it means for a double category to admit the construction of free monads. Our main theorem shows that, under some mild conditions, a double category that is a framed bicategory admits the construction of free monads if its horizontal 2-category does. We apply this result to obtain double adjunctions which extend the adjunction between graphs and categories and the adjunction between polynomial endofunctors and polynomial monads.
Abstract:
Given a rational map on a complex variety, Bellon and Viallet defined the algebraic entropy of the map and proved that this value is a birational invariant. An equivalent birational invariant is the asymptotic degree, dynamical degree or complexity, defined by Boukraa and Maillard. This notion is close to the complexity defined by Arnold. Conjecturally, the asymptotic degree satisfies a linear recurrence with integer coefficients. This conjecture was proved in the polynomial case on the complex affine plane by Favre and Jonsson and remains open for the global projective case and for the local case. The study of the valuative tree of Favre and Jonsson has proved to be key in solving the conjecture in the polynomial case on the complex affine plane. The fellowship holder has studied the global valuative tree of Favre and Jonsson and has reinterpreted some notions and results from a more geometric point of view. The holder has also studied the proof of the Bellon–Viallet conjecture in the polynomial case on the complex affine plane as a first step towards finding a proof for the local case and the global projective case in future work. The project includes a detailed study of the global valuative tree from a geometric point of view and the first steps of the proof of the Bellon–Viallet conjecture in the polynomial case on the complex affine plane carried out by Favre and Jonsson.
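As a hedged illustration of the degree-growth phenomenon behind the conjecture (not part of the project itself), the following SymPy sketch computes the degrees of the iterates of a simple polynomial plane map; for this Hénon-type example the degree sequence satisfies an integer linear recurrence and the ratios converge to the dynamical degree.

```python
# Degree sequence of the iterates of the polynomial map f(x, y) = (y, y**2 - x),
# used only as a familiar example of degree growth and dynamical degree.
import sympy as sp

x, y = sp.symbols("x y")
f = (y, y**2 - x)

def iterate(n):
    """Return the n-th iterate of f as a pair of expanded polynomials in x, y."""
    gx, gy = x, y
    for _ in range(n):
        gx, gy = (f[0].subs({x: gx, y: gy}, simultaneous=True),
                  f[1].subs({x: gx, y: gy}, simultaneous=True))
    return sp.expand(gx), sp.expand(gy)

degs = []
for n in range(1, 6):
    gx, gy = iterate(n)
    degs.append(max(sp.Poly(gx, x, y).total_degree(), sp.Poly(gy, x, y).total_degree()))

print("deg(f^n):", degs)                                   # 2, 4, 8, 16, 32
print("ratios  :", [degs[i + 1] / degs[i] for i in range(len(degs) - 1)])
# The ratios stabilise at the dynamical (asymptotic) degree, here 2, and the
# sequence satisfies the integer linear recurrence d_{n+1} = 2 d_n.
```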
Abstract:
In this paper the two main drawbacks of the heat balance integral methods are examined. Firstly we investigate the choice of approximating function. For a standard polynomial form it is shown that combining the Heat Balance and Refined Integral methods to determine the power of the highest order term will either lead to the same, or more often, greatly improved accuracy over standard methods. Secondly we examine thermal problems with a time-dependent boundary condition. In doing so we develop a logarithmic approximating function. This new function allows us to model moving peaks in the temperature profile, a feature that previous heat balance methods cannot capture. If the boundary temperature varies so that at some time t > 0 it equals the far-field temperature, then standard methods predict that the temperature is everywhere at this constant value. The new method predicts the correct behaviour. It is also shown that this function provides even more accurate results, when coupled with the new CIM, than the polynomial profile. Analysis primarily focuses on a specified constant boundary temperature and is then extended to constant flux, Newton cooling and time-dependent boundary conditions.
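For readers unfamiliar with the heat balance integral method, the following sketch (an illustration under standard assumptions, not the paper's code) applies the classical polynomial profile u = (1 - x/δ)^n to the constant-boundary-temperature problem u_t = u_xx on a semi-infinite domain and compares it with the exact erfc solution. The paper's point is, among other things, to determine the exponent n by combining the heat balance and refined integral methods rather than fixing it in advance.

```python
# Heat balance integral method (HBIM) for u_t = u_xx, u(0, t) = 1, u -> 0 as x -> inf.
# Integrating the heat equation over [0, delta] with u = (1 - x/delta)**n gives
# delta(t) = sqrt(2 n (n + 1) t).
import numpy as np
from scipy.special import erfc

def hbim_profile(x, t, n):
    delta = np.sqrt(2.0 * n * (n + 1.0) * t)
    # np.minimum keeps the base non-negative before it is masked by np.where.
    return np.where(x < delta, (1.0 - np.minimum(x, delta) / delta) ** n, 0.0)

t = 1.0
x = np.linspace(0.0, 6.0, 61)
exact = erfc(x / (2.0 * np.sqrt(t)))        # exact similarity solution
for n in (2, 3):                            # the paper selects n rather than assuming it
    err = np.max(np.abs(hbim_profile(x, t, float(n)) - exact))
    print(f"n = {n}: max abs error = {err:.4f}")
```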
Abstract:
We give the first systematic study of strong isomorphism reductions, a notion of reduction more appropriate than polynomial-time reduction when, for example, comparing the computational complexity of the isomorphism problem for different classes of structures. We show that the partial ordering of its degrees is quite rich. We analyze its relationship to a further type of reduction between classes of structures based purely on comparing, for every n, the number of nonisomorphic structures of cardinality at most n in both classes. Furthermore, in a more general setting we address the question of the existence of a maximal element in the partial ordering of the degrees.
Abstract:
Assume that the problem Q_0 is not solvable in polynomial time. For theories T containing a sufficiently rich part of true arithmetic, we characterize T ∪ {Con_T} as the minimal extension of T proving for some algorithm that it decides Q_0 as fast as any algorithm B with the property that T proves that B decides Q_0. Here, Con_T claims the consistency of T. Moreover, we characterize problems with an optimal algorithm in terms of arithmetical theories.
Abstract:
In a seminal paper [10], Weitz gave a deterministic fully polynomial approximation scheme for counting exponentially weighted independent sets (which is the same as approximating the partition function of the hard-core model from statistical physics) in graphs of degree at most d, up to the critical activity for the uniqueness of the Gibbs measure on the infinite d-regular tree. More recently, Sly [8] (see also [1]) showed that this is optimal in the sense that if there is an FPRAS for the hard-core partition function on graphs of maximum degree d for activities larger than the critical activity on the infinite d-regular tree then NP = RP. In this paper we extend Weitz's approach to derive a deterministic fully polynomial approximation scheme for the partition function of general two-state anti-ferromagnetic spin systems on graphs of maximum degree d, up to the corresponding critical point on the d-regular tree. The main ingredient of our result is a proof that for two-state anti-ferromagnetic spin systems on the d-regular tree, weak spatial mixing implies strong spatial mixing. This in turn uses a message-decay argument which extends a similar approach proposed recently for the hard-core model by Restrepo et al. [7] to the case of general two-state anti-ferromagnetic spin systems.
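The role of the critical activity on the infinite d-regular tree can be illustrated with the standard hard-core recursion (a sketch of the phenomenon, not the algorithm of the paper): below the uniqueness threshold the recursion has an attracting fixed point (correlation decay), above it the iterates settle into a 2-cycle.

```python
# Hard-core model recursion on the infinite d-regular tree.  R is the ratio
# P(root occupied) / P(root unoccupied) given the ratios at its d-1 children.
def tree_recursion(lam, d, steps=200, r0=1.0):
    r = r0
    history = []
    for _ in range(steps):
        r = lam / (1.0 + r) ** (d - 1)
        history.append(r)
    return history

d = 5
lam_c = (d - 1) ** (d - 1) / (d - 2) ** d   # uniqueness threshold (d-1)^{d-1} / (d-2)^d
for lam in (0.5 * lam_c, 2.0 * lam_c):
    tail = [round(v, 4) for v in tree_recursion(lam, d)[-4:]]
    print(f"lambda = {lam:.3f} (lambda_c = {lam_c:.3f}): last iterates {tail}")
```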
Abstract:
Report for the scientific sojourn at the University of Bern, Switzerland, from March until June 2008. Writer identification consists in determining the writer of a piece of handwriting from a set of known writers. Even though a significant number of compositions contain handwritten text in the music scores, the aim of this work is to use only the music notation to determine the author. Two approaches have been developed for writer identification in old handwritten music scores. The proposed methods extract features from every music line, as well as features from a texture image of music symbols. The music sheet is first preprocessed to obtain a binarized music score without the staff lines. Classification is performed using a k-NN classifier based on Euclidean distance. The proposed method has been tested on a database of old music scores from the 17th to 19th centuries, achieving encouraging identification rates.
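A minimal sketch of the classification stage described above, assuming the line and texture features have already been extracted; the synthetic feature vectors and writer labels below are placeholders, not the report's data.

```python
# k-NN classification of per-line feature vectors under the Euclidean distance.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_writers, lines_per_writer, n_features = 5, 40, 30
# Synthetic stand-in for the real music-line / texture features.
X = np.vstack([rng.normal(loc=w, scale=1.5, size=(lines_per_writer, n_features))
               for w in range(n_writers)])
y = np.repeat(np.arange(n_writers), lines_per_writer)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)
knn = KNeighborsClassifier(n_neighbors=3, metric="euclidean")
knn.fit(X_train, y_train)
print(f"writer identification rate: {knn.score(X_test, y_test):.2f}")
```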
Abstract:
The development of targeted treatment strategies adapted to individual patients requires identification of the different tumor classes according to their biology and prognosis. We focus here on the molecular aspects underlying these differences, in terms of sets of genes that control pathogenesis of the different subtypes of astrocytic glioma. By performing cDNA-array analysis of 53 patient biopsies, comprising low-grade astrocytoma, secondary glioblastoma (respective recurrent high-grade tumors), and newly diagnosed primary glioblastoma, we demonstrate that human gliomas can be differentiated according to their gene expression. We found that low-grade astrocytoma have the most specific and similar expression profiles, whereas primary glioblastoma exhibit much larger variation between tumors. Secondary glioblastoma display features of both other groups. We identified several sets of genes with relatively highly correlated expression within groups that: (a) can be associated with specific biological functions; and (b) effectively differentiate tumor class. One prominent gene cluster discriminating primary versus nonprimary glioblastoma comprises mostly genes involved in angiogenesis, including VEGF and fms-related tyrosine kinase 1, but also IGFBP2, which has not yet been directly linked to angiogenesis. In situ hybridization demonstrating coexpression of IGFBP2 and VEGF in pseudopalisading cells surrounding tumor necrosis provided further evidence for a possible involvement of IGFBP2 in angiogenesis. The separating groups of genes were found by the unsupervised coupled two-way clustering method, and their classification power was validated by a supervised construction of a nearly perfect glioma classifier.
Abstract:
The project focuses on the development of a harvester of news items published on a long list of blogs, which is continuously extended by the developer and by the users, who add their favourite blogs. The application performs a continuous collection of news items, polling each of the blogs registered in the application for new posts. A classifier by language and by topic is applied to each item, and the item is linked to other existing items when they cover the same topic. The application allows the user to choose among the topics offered and the language in which the item was published. For the development of the project, the platform was meant to be as compatible as possible with current technology, using several programming languages to implement each of the algorithms needed by the application as a whole; in order of use these were Php, Matlab, Html, MySql, CSS3, Javascript and XML. It should be noted that the project offers convenience to all those blog readers who so often run into already-read items on the different blogs they follow.
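A hedged illustration of the "relate items about the same topic" step; the project itself was implemented in Php/Matlab, so this Python sketch, its threshold and its example texts are hypothetical.

```python
# Relating an incoming news item to existing ones via TF-IDF cosine similarity.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

existing_items = [
    "New smartphone announced with improved camera and battery",
    "Local football team wins the league after dramatic final",
    "Smartphone maker releases software update fixing battery drain",
]
new_item = "Battery problems reported after the latest smartphone update"

vectorizer = TfidfVectorizer(stop_words="english")
matrix = vectorizer.fit_transform(existing_items + [new_item])
sims = cosine_similarity(matrix[-1], matrix[:-1]).ravel()

threshold = 0.2  # hypothetical cut-off for "same topic"
related = [existing_items[i] for i, s in enumerate(sims) if s > threshold]
print("related items:", related)
```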
Abstract:
BACKGROUND: Shared Decision Making (SDM) is increasingly advocated as a model for medical decision making. However, there is still low use of SDM in clinical practice. High impact factor journals might represent an efficient way for its dissemination. We aimed to identify and characterize publication trends of SDM in 15 high impact medical journals. METHODS: We selected the 15 general and internal medicine journals with the highest impact factor publishing original articles, letters and editorials. We retrieved publications from 1996 to 2011 through the full-text search function on each journal website and abstracted bibliometric data. We included publications of any type containing the phrase "shared decision making" or five other variants in their abstract or full text. These were referred to as SDM publications. A polynomial Poisson regression model with logarithmic link function was used to assess the evolution across the period of the number of SDM publications according to publication characteristics. RESULTS: We identified 1285 SDM publications out of 229,179 publications in 15 journals from 1996 to 2011. The absolute number of SDM publications by journal ranged from 2 to 273 over 16 years. SDM publications increased both in absolute and relative numbers per year, from 46 (0.32% relative to all publications from the 15 journals) in 1996 to 165 (1.17%) in 2011. This growth was exponential (P < 0.01). We found fewer research publications (465, 36.2% of all SDM publications) than non-research publications, which included non-systematic reviews, letters, and editorials. The increase of research publications across time was linear. Full-text search retrieved ten times more SDM publications than a similar PubMed search (1285 vs. 119 respectively). CONCLUSION: This review in full-text showed that SDM publications increased exponentially in major medical journals from 1996 to 2011. This growth might reflect an increased dissemination of the SDM concept to the medical community.
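A sketch of the kind of model described in the methods above (with invented counts, not the study's data): a Poisson regression with a logarithmic link, polynomial terms of the year, and the yearly publication total as an offset, fitted with statsmodels.

```python
# Polynomial Poisson regression of yearly SDM publication counts.
import numpy as np
import statsmodels.api as sm

years = np.arange(1996, 2012)
t = years - years[0]
total = np.full_like(years, 14000)                  # hypothetical yearly journal output
sdm = np.round(40 * np.exp(0.08 * t)).astype(int)   # hypothetical SDM counts, growing

X = sm.add_constant(np.column_stack([t, t**2]))     # linear + quadratic year terms
model = sm.GLM(sdm, X, family=sm.families.Poisson(), offset=np.log(total))
result = model.fit()
print(result.summary().tables[1])
# A significant positive coefficient on t with a negligible t**2 term is consistent
# with exponential growth of the publication rate.
```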
Abstract:
In this paper, we present and apply a semisupervised support vector machine based on cluster kernels for the problem of very high resolution image classification. In the proposed setting, a base kernel working with labeled samples only is deformed by a likelihood kernel encoding similarities between unlabeled examples. The resulting kernel is used to train a standard support vector machine (SVM) classifier. Experiments carried out on very high resolution (VHR) multispectral and hyperspectral images using very few labeled examples show the relevance of the method in the context of urban image classification. Its simplicity and the small number of parameters involved make it versatile and usable by inexperienced users.
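A minimal sketch of the cluster-kernel idea as described above (an interpretation, not the authors' exact formulation): the likelihood kernel below is built from Gaussian-mixture posterior probabilities estimated on all samples, labeled and unlabeled, and the deformed kernel is its elementwise product with an RBF base kernel.

```python
# Semisupervised SVM with a precomputed cluster-deformed kernel.
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.mixture import GaussianMixture
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.svm import SVC

X, y = make_blobs(n_samples=300, centers=3, cluster_std=2.0, random_state=0)
labeled = np.random.default_rng(0).choice(len(X), size=15, replace=False)  # very few labels

# Likelihood kernel: similarity of GMM posterior responsibilities (uses unlabeled data).
gmm = GaussianMixture(n_components=3, random_state=0).fit(X)
P = gmm.predict_proba(X)
K_cluster = P @ P.T

K_base = rbf_kernel(X, gamma=0.1)   # base kernel
K = K_base * K_cluster              # deformed (product) kernel, still positive semidefinite

svm = SVC(kernel="precomputed").fit(K[np.ix_(labeled, labeled)], y[labeled])
pred = svm.predict(K[:, labeled])   # kernel between all samples and the labeled ones
print("accuracy on all samples:", np.mean(pred == y))
```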
Abstract:
We investigate whether dimensionality reduction using a latent generative model is beneficial for the task of weakly supervised scene classification. In detail, we are given a set of labeled images of scenes (for example, coast, forest, city, river, etc.), and our objective is to classify a new image into one of these categories. Our approach consists of first discovering latent "topics" using probabilistic Latent Semantic Analysis (pLSA), a generative model from the statistical text literature here applied to a bag of visual words representation for each image, and subsequently, training a multiway classifier on the topic distribution vector for each image. We compare this approach to that of representing each image by a bag of visual words vector directly and training a multiway classifier on these vectors. To this end, we introduce a novel vocabulary using dense color SIFT descriptors and then investigate the classification performance under changes in the size of the visual vocabulary, the number of latent topics learned, and the type of discriminative classifier used (k-nearest neighbor or SVM). We achieve superior classification performance to recent publications that have used a bag of visual words representation, in all cases, using the authors' own data sets and testing protocols. We also investigate the gain in adding spatial information. We show applications to image retrieval with relevance feedback and to scene classification in videos.
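A hedged sketch of the reduce-then-classify pipeline: NMF with a Kullback-Leibler loss stands in for pLSA, which scikit-learn does not provide, and the visual-word histograms are synthetic, so this is an analogy rather than the paper's implementation.

```python
# Topic-style dimensionality reduction of bag-of-(visual-)words histograms,
# followed by a discriminative classifier on the topic vectors.
import numpy as np
from sklearn.decomposition import NMF
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_images, vocab_size, n_topics, n_classes = 200, 300, 25, 4
class_profiles = rng.dirichlet(np.ones(vocab_size) * 0.1, size=n_classes)
y = rng.integers(0, n_classes, size=n_images)
X = np.vstack([rng.multinomial(150, class_profiles[c]) for c in y])  # word histograms

nmf = NMF(n_components=n_topics, beta_loss="kullback-leibler", solver="mu",
          max_iter=500, random_state=0)
topics = nmf.fit_transform(X)        # topic-distribution-like vectors per image

for name, feats in [("bag of words", X), ("topic vectors", topics)]:
    score = cross_val_score(SVC(kernel="linear"), feats, y, cv=5).mean()
    print(f"{name:>14}: accuracy = {score:.2f}")
```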
Abstract:
Most network operators have considered reducing Label Switched Router (LSR) label spaces (i.e. the number of labels that can be used) as a means of simplifying the management of the underlying Virtual Private Networks (VPNs) and, hence, reducing operational expenditure (OPEX). This letter discusses the problem of reducing the label spaces in Multiprotocol Label Switching (MPLS) networks using label merging, better known as MultiPoint-to-Point (MP2P) connections. Because of their origins in IP, MP2P connections have been considered to have tree shapes with Label Switched Paths (LSPs) as branches. Due to this fact, previous works by many authors affirm that the problem of minimizing the label space using MP2P in MPLS, the Merging Problem, cannot be solved optimally with a polynomial algorithm (NP-complete), since it involves a hard decision problem. However, in this letter the Merging Problem is analyzed from the perspective of MPLS, and it is deduced that tree shapes in MP2P connections are irrelevant. By overriding this tree-shape consideration, it is possible to perform label merging in polynomial time. Based on how MPLS signaling works, this letter proposes an algorithm to compute the minimum number of labels using label merging: the Full Label Merging algorithm. In conclusion, we reclassify the Merging Problem as polynomial-solvable instead of NP-complete. In addition, simulation experiments confirm that, without the tree-branch selection problem, more labels can be saved.
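A toy illustration of why merging by shared remaining path reduces the label count (this is not the Full Label Merging algorithm, and the topology and LSPs below are hypothetical): without merging, each LSP needs its own label on every link it traverses; with MP2P merging, LSPs that share the same remaining path to the egress can share one label on that link.

```python
# Counting labels per link with and without MP2P label merging on a toy set of LSPs.
from collections import defaultdict

# Hypothetical LSPs given as node sequences towards their egress nodes.
lsps = [
    ["A", "B", "C", "E"],
    ["D", "B", "C", "E"],
    ["F", "C", "E"],
    ["A", "B", "C", "G"],
]

no_merge = defaultdict(int)   # link -> label count without merging (one per LSP)
merge = defaultdict(set)      # link -> distinct remaining paths (mergeable labels)
for path in lsps:
    for i in range(len(path) - 1):
        link = (path[i], path[i + 1])
        no_merge[link] += 1
        merge[link].add(tuple(path[i + 1:]))   # same suffix => same outgoing label

print("labels without merging:", sum(no_merge.values()))
print("labels with MP2P merging:", sum(len(s) for s in merge.values()))
```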