54 results for Algebraic approaches
at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
Black-box optimization problems (BBOP) are defined as those optimization problems in which the objective function does not have an algebraic expression, but is instead the output of a system (usually a computer program). This paper focuses on BBOPs that arise in the field of insurance, and more specifically in reinsurance problems. In this area, the complexity of the models and assumptions used to define the reinsurance rules and conditions produces hard black-box optimization problems, which must be solved in order to obtain the optimal output of the reinsurance. Traditional optimization approaches cannot be applied to BBOPs, so new computational paradigms must be used to solve these problems. In this paper we show the performance of two evolutionary-based techniques (Evolutionary Programming and Particle Swarm Optimization). We provide an analysis of three BBOPs in reinsurance, where the evolutionary-based approaches exhibit excellent behaviour, finding the optimal solution within a fraction of the computational cost used by inspection or enumeration methods.
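For illustration of how such population-based methods treat the objective purely as a black box, the following minimal Particle Swarm Optimization sketch in Python queries an arbitrary callable without ever inspecting an algebraic expression. It is a sketch only: the parameter values, bounds and toy objective are assumptions, not taken from the paper, and in a reinsurance setting the callable would instead wrap the simulation that evaluates a candidate reinsurance configuration.

import numpy as np

def pso(objective, dim, n_particles=30, iters=200, bounds=(-10.0, 10.0),
        w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimizer for a black-box objective (minimization)."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, size=(n_particles, dim))       # particle positions
    v = np.zeros_like(x)                                    # particle velocities
    pbest = x.copy()                                        # personal best positions
    pbest_val = np.array([objective(p) for p in x])         # personal best values
    g = pbest[np.argmin(pbest_val)].copy()                  # global best position
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        vals = np.array([objective(p) for p in x])          # black-box evaluations
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        g = pbest[np.argmin(pbest_val)].copy()
    return g, pbest_val.min()

# Toy usage: a hypothetical black box standing in for a reinsurance simulation.
best_x, best_val = pso(lambda z: float(np.sum(z ** 2)), dim=5)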
Abstract:
We consider linear stochastic differential-algebraic equations with constant coefficients and additive white noise. Due to the nature of this class of equations, the solution must be defined as a generalised process (in the sense of Dawson and Fernique). We provide sufficient conditions for the law of the variables of the solution process to be absolutely continuous with respect to Lebesgue measure.
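For orientation, one standard way to write a linear stochastic differential-algebraic equation with constant coefficients and additive white noise (a generic form recalled here for illustration, not necessarily the paper's exact notation) is

A \dot{X}(t) + B X(t) = \xi(t),

where A and B are constant square matrices, A is singular (which is what makes the system differential-algebraic rather than an ordinary SDE), and \xi is white noise; since white noise is not an ordinary function, the solution X has to be interpreted as a generalised process, as stated above.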
Abstract:
We review recent likelihood-based approaches to modeling demand for medical care. A semi-nonparametric model along the lines of Cameron and Johansson's Poisson polynomial model, but using a negative binomial baseline model, is introduced. We apply these models, as well as semiparametric Poisson, hurdle semiparametric Poisson, and finite mixtures of negative binomial models, to six measures of health care usage taken from the Medical Expenditure Panel Survey. We conclude that most of the models lead to statistically similar results, both in terms of information criteria and conditional and unconditional prediction. This suggests that applied researchers may not need to be overly concerned with the choice of which of these models they use to analyze data on health care demand.
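As a hedged illustration of the kind of series-expansion model referred to (the paper's exact specification may differ), a semi-nonparametric count density in the spirit of Cameron and Johansson reweights a baseline probability mass function g by a squared polynomial and renormalizes:

f(y) = \frac{ g(y) \left( \sum_{k=0}^{K} a_k y^k \right)^{2} }{ \sum_{m=0}^{\infty} g(m) \left( \sum_{k=0}^{K} a_k m^k \right)^{2} }, \qquad y = 0, 1, 2, \ldots

In the model introduced above, g would be a negative binomial baseline rather than the Poisson baseline of Cameron and Johansson.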
Abstract:
The aim of this paper is to unify the points of view of three recent and independent papers (Ventura 1997, Margolis, Sapir and Weil 2001, and Kapovich and Miasnikov 2002), where similar modern versions of a 1951 theorem of Takahasi were given. We develop a theory of algebraic extensions for free groups, highlighting the analogies and differences with respect to the corresponding classical field-theoretic notions, and we discuss in detail the notion of algebraic closure. We apply that theory to the study and the computation of certain algebraic properties of subgroups (e.g. being malnormal, pure, inert or compressed, or being closed in certain profinite topologies) and the corresponding closure operators. We also analyze the closure of a subgroup under the addition of solutions of certain sets of equations.
Abstract:
"See the abstract at the beginning of the document in the attached file."
Abstract:
This work analyzes the "Tratado y libro de arte mayor o álgebra" that forms part of manuscript 2294 of the Biblioteca de la Universitat de Salamanca, dated 1590. Its author is Diego Pérez de Mesa. The aim of this study is to provide new elements that help us understand the status of algebra in the Iberian Peninsula during a century that was key to its development. The manuscript is first described, and then its contributions to mathematics are discussed, showing some original features of this algebra compared with other algebras of the sixteenth-century Iberian Peninsula.
Abstract:
The first main result of the paper is a criterion for a partially commutative group G to be a domain. It allows us to reduce the study of algebraic sets over G to the study of irreducible algebraic sets, and reduce the elementary theory of G (of a coordinate group over G) to the elementary theories of the direct factors of G (to the elementary theory of coordinate groups of irreducible algebraic sets). Then we establish normal forms for quantifier-free formulas over a non-abelian directly indecomposable partially commutative group H. Analogously to the case of free groups, we introduce the notion of a generalised equation and prove that the positive theory of H has quantifier elimination and that arbitrary first-order formulas lift from H to H * F, where F is a free group of finite rank. As a consequence, the positive theory of an arbitrary partially commutative group is decidable.
Abstract:
Given an algebraic curve in the complex affine plane, we describe how to determine all planar polynomial vector fields which leave this curve invariant. If all (finite) singular points of the curve are nondegenerate, we give an explicit expression for these vector fields. In the general setting we provide an algorithmic approach, and as an alternative we discuss sigma processes.
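For context, the computation rests on the classical algebraic invariance condition (recalled here in generic notation; the paper's own notation may differ): a curve f(x,y) = 0 is invariant for the polynomial vector field X = P(x,y) \partial_x + Q(x,y) \partial_y exactly when the derivative of f along the flow is a polynomial multiple of f,

P \frac{\partial f}{\partial x} + Q \frac{\partial f}{\partial y} = K f,

for some polynomial cofactor K. Determining the vector fields that leave the curve invariant amounts to solving this relation for P, Q and K.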
Abstract:
We present a solution to the problem of defining a counterpart in Algebraic Set Theory of the construction of internal sheaves in Topos Theory. Our approach is general in that we consider sheaves as determined by Lawvere-Tierney coverages, rather than by Grothendieck coverages, and assume only a weakening of the axioms for small maps originally introduced by Joyal and Moerdijk, thus subsuming the existing topos-theoretic results.
Abstract:
This work investigates applying introspective reasoning to improve the performance of Case-Based Reasoning (CBR) systems, in both a reactive and a proactive fashion, by guiding learning to improve how a CBR system applies its cases and by identifying possible future system deficiencies. First we present our reactive approach, a new introspective reasoning model which enables CBR systems to autonomously learn to improve multiple facets of their reasoning processes in response to poor-quality solutions. We illustrate our model's benefits with experimental results from tests in an industrial design application. Then, for our proactive approach, we introduce a novel method for identifying regions in a case-base where the system gives low-confidence solutions to possible future problems. Experiments are provided for the Zoology and Robo-Soccer domains, and we argue how the identified low-confidence regions help us to analyze the case-bases of a given CBR system.
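As a purely illustrative sketch (not the authors' method), one simple way to flag low-confidence regions of a case-base is to score each stored case by how well its nearest neighbours agree with its own solution. The function name, distance measure and threshold below are hypothetical.

import numpy as np

def low_confidence_cases(features, solutions, k=5, agreement_threshold=0.6):
    """Flag cases whose k nearest neighbours mostly disagree with their solution.

    features:  (n, d) array of case descriptions
    solutions: length-n array of solution labels
    Returns the indices of cases lying in low-confidence regions of the case-base.
    """
    features = np.asarray(features, dtype=float)
    solutions = np.asarray(solutions)
    dists = np.linalg.norm(features[:, None, :] - features[None, :, :], axis=-1)
    np.fill_diagonal(dists, np.inf)             # a case is not its own neighbour
    flagged = []
    for i in range(len(features)):
        neighbours = np.argsort(dists[i])[:k]   # k closest cases
        agreement = np.mean(solutions[neighbours] == solutions[i])
        if agreement < agreement_threshold:     # neighbours mostly disagree
            flagged.append(i)
    return flagged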
Abstract:
Report for the scientific sojourn at the Swiss Federal Institute of Technology Zurich, Switzerland, between September and December 2007. In order to make robots useful assistants in our everyday life, the ability to learn and recognize objects is of essential importance. However, object recognition in real scenes is one of the most challenging problems in computer vision, as it is necessary to deal with many difficulties. Furthermore, in mobile robotics a new challenge is added to the list: computational complexity. In a dynamic world, information about the objects in the scene can become obsolete before it is ready to be used if the detection algorithm is not fast enough. Two recent object recognition techniques have achieved notable results: the constellation approach proposed by Lowe and the bag-of-words approach proposed by Nistér and Stewénius. The Lowe constellation approach is the one currently used in the robot localization part of the COGNIRON project. This report is divided into two main sections. The first section briefly reviews the currently used object recognition system, the Lowe approach, and brings to light the drawbacks found for object recognition in the context of indoor mobile robot navigation. Additionally, the proposed improvements to the algorithm are described. In the second section the alternative bag-of-words method is reviewed, as well as several experiments conducted to evaluate its performance with our own object databases. Furthermore, some modifications to the original algorithm to make it suitable for object detection in unsegmented images are proposed.
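To make the bag-of-words idea concrete, the following hedged Python sketch builds a visual vocabulary and turns one image's local descriptors into a histogram. It uses flat k-means from scikit-learn as a stand-in for the vocabulary tree of Nistér and Stewénius, and assumes local descriptors (e.g. SIFT vectors) have already been extracted; it is not the report's implementation.

import numpy as np
from sklearn.cluster import KMeans

def build_vocabulary(descriptor_sets, n_words=200, seed=0):
    """Cluster all local descriptors from the training images into visual words."""
    all_desc = np.vstack(descriptor_sets)
    return KMeans(n_clusters=n_words, random_state=seed, n_init=10).fit(all_desc)

def bag_of_words(descriptors, vocabulary):
    """Represent one image as a normalized histogram of visual-word occurrences."""
    words = vocabulary.predict(descriptors)
    hist = np.bincount(words, minlength=vocabulary.n_clusters).astype(float)
    return hist / hist.sum()

Recognizing an unknown object then reduces to comparing its histogram against the histograms of a database of known objects, for example with a dot product or a chi-squared distance.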
Abstract:
Research project carried out during a stay at the School of Mathematics and Statistics of the University of Plymouth, United Kingdom, between April and July 2007. This investigation is still open, and the report I present constitutes an account of the research we are currently carrying out. In this note we study the isochronous centers of analytic Hamiltonian systems, paying special attention to the polynomial case. We focus on the so-called quadratic-like Hamiltonian systems. Several properties of the isochronous centers of this type of system were given in [A. Cima, F. Mañosas and J. Villadelprat, Isochronicity for several classes of Hamiltonian systems, J. Differential Equations 157 (1999) 373-413]. That article was mainly focused on the case in which A, B and C are analytic functions. Our aim with the study we are carrying out is to investigate the case in which these functions are polynomials. In this note we formulate a concrete conjecture about the algebraic properties forced by the isochronicity of the center, and we prove some partial results.
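For context, and to our best reading of the cited reference (the precise definition should be checked there), the quadratic-like Hamiltonian systems are those whose Hamiltonian is quadratic in one of the variables,

H(x, y) = A(x) + B(x)\, y + C(x)\, y^{2},

which is presumably where the functions A, B and C mentioned above come from; the conjecture then concerns the algebraic constraints that isochronicity of the center imposes on A, B and C when they are polynomials.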
Abstract:
We consider linear optimization over a nonempty convex semi-algebraic feasible region F. Semidefinite programming is an example. If F is compact, then for almost every linear objective there is a unique optimal solution, lying on a unique "active" manifold, around which F is "partly smooth", and the second-order sufficient conditions hold. Perturbing the objective results in smooth variation of the optimal solution. The active manifold consists, locally, of these perturbed optimal solutions; it is independent of the representation of F, and is eventually identified by a variety of iterative algorithms such as proximal and projected gradient schemes. These results extend to unbounded sets F.
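To fix ideas on the semidefinite programming example (generic notation, not taken from the paper), a standard-form SDP is exactly linear optimization over a convex semi-algebraic set cut out by a linear matrix inequality:

\min_{x \in \mathbb{R}^{n}} \; c^{\top} x \quad \text{subject to} \quad A_0 + x_1 A_1 + \cdots + x_n A_n \succeq 0,

where the A_i are symmetric matrices. The feasible region F is convex, and it is semi-algebraic because positive semidefiniteness can be expressed through polynomial inequalities in x (nonnegativity of principal minors).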
Abstract:
OER development is becoming more sophisticated as instructors and course specialists become more familiar with the environment. Most OER development approaches for online courses have been developed from those that were appropriate in the face-to-face context. However, the OER online environment opens up new possibilities for learning as well as holding particular limitations. This paper presents some approaches that OER implementers should bear in mind when initiating and supporting OER course development projects.
1. Beg, borrow, or steal courseware. Don't reinvent the wheel.
2. Take what exists and build the course around it.
3. Mix and match. Assemble. Don't create.
4. Avoid the "not invented here" syndrome.
5. Know the content - garbage in and garbage out.
6. Establish deadlines. Work to deadlines, but don't be unrealistic.
7. Estimate your costs and then double them. Double them again.
8. Be realistic in scheduling and scoping.
9. The project plan must be flexible. Be prepared for major shifts.
10. Build flexibly for reuse and repurposing - generalizability reduces costs.
11. Provide different routes to learning.
12. Build to international standards.
There are necessary features in every OER, including an introduction, a schedule, etc., but it is most important to keep the course as simple as possible. Extreme Programming (XP) methodology can be adapted from software engineering to aid in the course development process.
Abstract:
"compositions" is a new R package for the analysis of compositional and positive data. It contains four classes corresponding to the four different types of compositional and positive geometry (including the Aitchison geometry). It provides means for computation, plotting and high-level multivariate statistical analysis in all four geometries. These geometries are treated in a fully analogous way, based on the principle of working in coordinates and on the object-oriented programming paradigm of R. In this way, the called functions automatically select the most appropriate type of analysis as a function of the geometry. The graphical capabilities include ternary diagrams and tetrahedrons, various compositional plots (boxplots, barplots, pie charts) and extensive graphical tools for principal components. Afterwards, proportion lines, straight lines and ellipses in all geometries can be added to plots. The package is accompanied by a hands-on introduction, documentation for every function, demos of the graphical capabilities and plenty of usage examples. It allows direct and parallel computation in all four vector spaces and provides the beginner with a copy-and-paste style of data analysis, while letting advanced users keep the functionality and customizability they demand of R, as well as all necessary tools to add their own analysis routines. A complete example is included in the appendix.
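As one concrete instance of the working-in-coordinates principle for the Aitchison geometry (the standard definition, recalled here for illustration rather than quoted from the package documentation), the centered log-ratio (clr) coordinates of a composition x = (x_1, ..., x_D) are

\operatorname{clr}(x) = \left( \ln \frac{x_1}{g(x)}, \ldots, \ln \frac{x_D}{g(x)} \right), \qquad g(x) = \left( \prod_{i=1}^{D} x_i \right)^{1/D},

so that ordinary Euclidean operations on the clr coordinates correspond to Aitchison-geometry operations on the simplex of compositions.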