961 results for Set covering theory
Abstract:
In the last 50 years there have been approximately 40 events with the characteristics of a financial crisis. The most severe was the crisis of 1929, when financial markets plummeted and US gross domestic product declined by more than 30 percent. A few years ago a new crisis developed in the United States, whose consequences and effects spread almost instantly to the rest of the world. This new economic and financial crisis has increased the interest and motivation of the academic community, professors and researchers, to understand its causes and effects and to learn from it. This is one of the main reasons for the compilation of this book, which began with a meeting of a group of IAFI researchers from the University of Barcelona, where researchers from Mexico and Spain explained the causes and consequences of the crisis of 2007. We believe this set of chapters on methodologies, applications and theories conveniently explains the characteristics and events of past and future financial crises. The book consists of three main sections: the first, "State of the Art and current situation"; the second, "Econometric applications to estimate crisis time periods"; and the third, "Solutions to diminish the effects of the crisis". The first section, comprising two chapters, surveys the current point of view of the research literature on financial crises. Chapter 1 describes and analyzes the models that have historically been used to explain financial crises and proposes alternative methodologies such as Fuzzy Cognitive Maps. Chapter 2 explains the characteristics and details of the 2007 crisis from the US perspective, compares it with the crisis of 1929, and presents some of its effects in Mexico and Latin America. The second section presents two econometric applications for estimating possible crisis periods. Chapter 3 studies three Latin American countries, Argentina, Brazil and Peru, around the 1994 crisis and estimates multifractal characteristics to identify financial and economic distress. Chapter 4 examines the crises in Argentina (2001), Mexico (1994) and the recent one in the United States (2007) and their effects in other countries, through a financial time-series methodology based on the stock market. The last section presents alternatives for mitigating the effects of a crisis. Chapter 5 explains the effects on financial stability of financial system regulation and of certain globalization standards. Chapter 6 studies the benefits of investor activism as a way to protect personal and national wealth against financial crisis risks.
Abstract:
There is a recent trend to describe physical phenomena without the use of infinitesimals or infinities, which has been accomplished by replacing differential calculus with finite difference theory. Discrete function theory was first introduced in 1941. This theory is concerned with the study of functions defined on a discrete set of points in the complex plane, and it was extensively developed for functions defined on a Gaussian lattice. In 1972 a very suitable lattice, H = {q^m x_0 + i q^n y_0 : x_0 > 0, y_0 > 0, 0 < q < 1, m, n ∈ Z}, was found and discrete analytic function theory was developed on it. Very recently some work has been done in discrete monodiffric function theory for functions defined on H. The theory of pseudoanalytic functions is a generalisation of the theory of analytic functions: when the generator becomes the identity, i.e., (1, i), the theory of pseudoanalytic functions reduces to the theory of analytic functions. Though the theory of pseudoanalytic functions plays an important role in analysis, no discrete counterpart is available in the literature. This thesis is an attempt in that direction: a discrete pseudoanalytic theory is derived for functions defined on H.
Abstract:
We extend the relativistic mean field theory model of Sugahara and Toki by adding new couplings suggested by modern effective field theories. An improved set of parameters is developed with the goal of testing the ability of models based on effective field theory to describe the properties of finite nuclei and, at the same time, to be consistent with the trends of Dirac-Brueckner-Hartree-Fock calculations at densities away from the saturation region. We compare our calculations with other relativistic nuclear force parameter sets for various nuclear phenomena.
Abstract:
This paper addresses the prediction of learning disabilities (LD) in school-age children using rough set theory (RST), with an emphasis on its application to data mining. In rough sets, data analysis starts from a data table called an information system, which contains data about the objects of interest characterized in terms of attributes; here the attributes are properties associated with learning disabilities. By finding the relationships between these attributes, redundant attributes can be eliminated and core attributes determined. Rule mining is then performed using the LEM1 algorithm, and the prediction of LD is carried out with Rosetta, a rough set toolkit for data analysis. The results of this study are compared with the output of a similar study we conducted using a Support Vector Machine (SVM) with the Sequential Minimal Optimisation (SMO) algorithm. We find that, using the concepts of reduct and global covering, learning disabilities in children can be predicted easily and accurately.
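As a toy illustration of the reduct and core concepts mentioned above (not the authors' Rosetta/LEM1 pipeline; the attribute names and table below are hypothetical), the following sketch finds the minimal attribute subsets of an information system that preserve the decision classification:

```python
from itertools import combinations

# Toy information system: each row maps condition attributes to values,
# plus a decision attribute 'ld' (hypothetical learning-disability label).
rows = [
    {"reading": "poor", "memory": "low",  "speech": "ok",   "ld": "yes"},
    {"reading": "poor", "memory": "low",  "speech": "poor", "ld": "yes"},
    {"reading": "good", "memory": "high", "speech": "ok",   "ld": "no"},
    {"reading": "good", "memory": "low",  "speech": "ok",   "ld": "no"},
]
conditions = ["reading", "memory", "speech"]

def partition(attrs):
    """Group objects that are indiscernible on the given attributes."""
    blocks = {}
    for i, r in enumerate(rows):
        blocks.setdefault(tuple(r[a] for a in attrs), []).append(i)
    return list(blocks.values())

def preserves_decision(attrs):
    """True if every indiscernibility block is pure w.r.t. the decision."""
    return all(len({rows[i]["ld"] for i in block}) == 1
               for block in partition(attrs))

# A reduct is a minimal attribute subset that classifies as well as all of them.
reducts = [set(c) for n in range(1, len(conditions) + 1)
           for c in combinations(conditions, n) if preserves_decision(c)]
minimal = [r for r in reducts if not any(s < r for s in reducts)]
print("reducts:", minimal)  # [{'reading'}] for this toy table
```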
Abstract:
The object of the research presented here is Vessiot's theory of partial differential equations: for a given differential equation one constructs a distribution that is both tangent to the differential equation and contained within the contact distribution of the jet bundle. Within it, one then seeks n-dimensional subdistributions transversal to the base manifold, the integral distributions. These consist of integral elements, which in turn are adapted so that they form a subdistribution closed under the Lie bracket; such a subdistribution is called a flat Vessiot connection. Solutions of the differential equation may be regarded as integral manifolds of these distributions. In the first part of the thesis, I give a survey of the present state of the formal theory of partial differential equations: one regards differential equations as fibred submanifolds of a suitable jet bundle and considers formal integrability, and the stronger notion of involutivity, for analyzing their solvability. An arbitrary system may (locally) be represented in reduced Cartan normal form, which leads to a natural description of its geometric symbol. The Vessiot distribution can then be split into the direct sum of the symbol and a horizontal complement (which is not unique). The n-dimensional subdistributions which close under the Lie bracket and are transversal to the base manifold are the sought-for tangential approximations of the solutions of the differential equation, and their existence can be shown by analyzing the structure equations; this places Vessiot's theory on a rigorous foundation. Furthermore, the relation between Vessiot's approach and the crucial notions of the formal theory (such as formal integrability and involutivity of differential equations) is clarified, and the possible obstructions to involution of a differential equation are deduced explicitly. In the second part of the thesis it is shown that Vessiot's step-by-step construction of the wanted distributions succeeds if, and only if, the given system is involutive. First an existence theorem for integral distributions is proven, then an existence theorem for flat Vessiot connections. The differential-geometric structure of the underlying systems is analyzed and simplified compared with other approaches, in particular the structure equations considered in the proofs of the existence theorems: here they form a set of linear equations together with an involutive system of differential equations. The definition of integral elements given here links Vessiot theory with the dual Cartan-Kähler theory of exterior systems. The analysis of the structure equations not only yields theoretical insight but also produces an algorithm for explicitly deriving the coefficients of the vector fields that span the integral distributions, making an implementation in the computer algebra system MuPAD possible.
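In the notation common in the formal theory (our summary, not a quotation from the thesis), the Vessiot distribution of an equation R_q in the jet bundle J_q π is the part of the contact distribution tangent to the equation, and it splits as described above:

```latex
\mathcal{V}[\mathcal{R}_q] \;=\; T\mathcal{R}_q \,\cap\, \mathcal{C}_q\big|_{\mathcal{R}_q},
\qquad
\mathcal{V}[\mathcal{R}_q] \;=\; \mathcal{N}_q \oplus \mathcal{H},
```

where N_q is the geometric symbol and H one of the (non-unique) horizontal complements.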
Abstract:
In many real-world contexts individuals find themselves in situations where they must decide between behaviours that serve a collective purpose and behaviours that satisfy their private interests while ignoring the collective. In some cases the underlying social dilemma (Dawes, 1980) is solved and we observe collective action (Olson, 1965); in others social mobilisation is unsuccessful. The central topic of social dilemma research is the identification and understanding of the mechanisms that yield the observed cooperation and thereby resolve the social dilemma. The purpose of this thesis is to contribute to this research field for the case of public good dilemmas. To this end, existing work relevant to this problem domain is reviewed and a set of mandatory requirements is derived which guides the theory and method development of the thesis. In particular, the thesis focusses on the dynamic processes of social mobilisation which can foster or inhibit collective action. The basic understanding is that the success or failure of the required process of social mobilisation is determined by the heterogeneous individual preferences of the members of the providing group, the social structure in which the acting individuals are embedded, and the embedding of the individuals in economic, political, biophysical, or other external contexts. To account for these aspects and for the dynamics involved, the methodical approach of the thesis is computer simulation, in particular agent-based modelling and simulation of social systems. Particularly conducive are agent models that ground the simulation of human behaviour in suitable psychological theories of action. The thesis develops the action theory HAPPenInGS (Heterogeneous Agents Providing Public Goods) and demonstrates its embedding into different agent-based simulations. The thesis substantiates the particular added value of this methodical approach: starting out from a theory of individual behaviour, the emergence of collective patterns of behaviour becomes observable in simulations. In addition, the underlying collective dynamics may be scrutinised and assessed by scenario analysis. The results of such experiments reveal insights into processes of social mobilisation that go beyond classical empirical approaches and, in particular, yield policy recommendations on promising intervention measures.
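To make threshold-driven social mobilisation concrete, here is a minimal agent-based sketch of a public-good dilemma with heterogeneous conditional cooperators (a generic Granovetter-style threshold rule chosen purely for illustration, not the HAPPenInGS action theory itself):

```python
import random

random.seed(1)
N, ROUNDS = 50, 20

# Heterogeneous agents: each contributes to the public good if the share
# of contributors observed in the previous round meets its own threshold.
thresholds = sorted(random.random() for _ in range(N))
cooperating = [random.random() < 0.5 for _ in range(N)]

for t in range(ROUNDS):
    frac = sum(cooperating) / N
    cooperating = [frac >= th for th in thresholds]
    if t % 5 == 0 or t == ROUNDS - 1:
        print(f"round {t:2d}: {frac:.0%} cooperating")
```

Depending on the threshold distribution, mobilisation either cascades toward full provision or collapses, which is exactly the kind of collective dynamics that scenario analysis can probe.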
Abstract:
We show that optimizing a quantum gate for an open quantum system requires the time evolution of only three states, irrespective of the dimension of the Hilbert space. This represents a significant reduction in computational resources compared to the complete basis of Liouville space that is commonly believed necessary for this task. The reduction is based on two observations: the target is not a general dynamical map but a unitary operation, and the time evolution of two properly chosen states is sufficient to distinguish any two unitaries. We illustrate gate optimization employing a reduced set of states for a controlled phase gate with trapped atoms as qubit carriers and an iSWAP gate with superconducting qubits.
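A small numerical check of the second observation is easy to set up (the two test states below, a non-degenerate density matrix and the projector onto the uniform superposition, are a common choice assumed here for illustration, not necessarily the paper's exact states):

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4  # e.g. two qubits

def random_unitary(d):
    """Haar-style random unitary via QR decomposition."""
    z = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    q, r = np.linalg.qr(z)
    return q * (np.diag(r) / np.abs(np.diag(r)))

# Test state 1: a non-degenerate density matrix (distinct eigenvalues).
w = np.arange(1, d + 1, dtype=float)
rho1 = np.diag(w / w.sum())
# Test state 2: projector onto the uniform superposition.
psi = np.ones(d) / np.sqrt(d)
rho2 = np.outer(psi, psi)

def agree(U, V):
    """True if U and V act identically on both test states."""
    return all(np.allclose(U @ r @ U.conj().T, V @ r @ V.conj().T)
               for r in (rho1, rho2))

U = random_unitary(d)
print(agree(U, np.exp(0.7j) * U))   # True: equal up to a global phase
print(agree(U, random_unitary(d)))  # False (almost surely)
```

Agreement on rho1 forces V†U to be diagonal; agreement on rho2 then forces all diagonal phases to coincide, so the two unitaries are equal up to a global phase.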
Abstract:
In order to estimate the motion of an object, the visual system needs to combine multiple local measurements, each of which carries some degree of ambiguity. We present a model of motion perception in which measurements from different image regions are combined according to a Bayesian estimator: the estimated motion maximizes the posterior probability under a prior favoring slow and smooth velocities. Reviewing a large number of previously published phenomena, we find that the Bayesian estimator predicts a wide range of psychophysical results. This suggests that the seemingly complex set of illusions arises from a single computational strategy that is optimal under reasonable assumptions.
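As a sketch of the flavor of such an estimator (a textbook Gaussian simplification with a slowness prior only, not the paper's full slow-and-smooth model), the MAP velocity from noisy local gradient constraints has a closed form:

```python
import numpy as np

def map_velocity(grads, dIdt, sigma2=1.0, lam=0.1):
    """MAP image velocity under gradient-constraint likelihood
    (grad_i . v + dIdt_i ~ N(0, sigma2)) and a Gaussian prior
    favoring slow motion, v ~ N(0, I / lam)."""
    G = np.asarray(grads, dtype=float)   # (n, 2) spatial gradients
    b = np.asarray(dIdt, dtype=float)    # (n,) temporal derivatives
    A = G.T @ G / sigma2 + lam * np.eye(2)
    return np.linalg.solve(A, -G.T @ b / sigma2)

# A single ambiguous measurement (aperture problem): the prior picks
# the slowest velocity consistent with the constraint line, slightly
# shrunk toward zero.
print(map_velocity([[1.0, 0.0]], [-0.5]))  # ~[0.45, 0.0]
```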
Abstract:
This paper describes a new reliable method, based on modal interval analysis (MIA) and set inversion (SI) techniques, for characterizing the solution sets defined by quantified constraint satisfaction problems (QCSPs) over continuous domains. The presented methodology, called quantified set inversion (QSI), can be applied to a wide range of engineering problems involving uncertain nonlinear models. Finally, an application to parameter identification is presented.
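For intuition, here is a minimal sketch of plain (unquantified) set inversion by interval bisection; QSI itself additionally handles quantified parameters via modal intervals, which this toy omits. It inverts f(x, y) = x² + y² over a box for the target interval [1, 2]:

```python
def f_range(box):
    """Interval image of f(x, y) = x^2 + y^2 over box = ((x1,x2),(y1,y2))."""
    (x1, x2), (y1, y2) = box
    sq = lambda a, b: (0.0 if a <= 0 <= b else min(a*a, b*b), max(a*a, b*b))
    (lx, ux), (ly, uy) = sq(x1, x2), sq(y1, y2)
    return lx + ly, ux + uy

def sivia(box, target=(1.0, 2.0), eps=0.25):
    """Classify boxes as inside / boundary of f^-1(target); discard outside."""
    lo, hi = f_range(box)
    if lo >= target[0] and hi <= target[1]:
        return [("inside", box)]
    if hi < target[0] or lo > target[1]:
        return []                          # provably outside: discard
    (x1, x2), (y1, y2) = box
    if max(x2 - x1, y2 - y1) < eps:
        return [("boundary", box)]
    if x2 - x1 >= y2 - y1:                 # bisect the widest side
        xm = (x1 + x2) / 2
        return (sivia(((x1, xm), (y1, y2)), target, eps) +
                sivia(((xm, x2), (y1, y2)), target, eps))
    ym = (y1 + y2) / 2
    return (sivia(((x1, x2), (y1, ym)), target, eps) +
            sivia(((x1, x2), (ym, y2)), target, eps))

boxes = sivia(((-2.0, 2.0), (-2.0, 2.0)))
print(sum(1 for kind, _ in boxes if kind == "inside"), "inner boxes")
```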
Abstract:
Slides, handouts and links covering relevant topics including how to create an annotated bibliography, exemplar bibliography, reference managers etc. The coursework specification with assessment criteria is available as a single item from http://www.edshare.soton.ac.uk/6044/
Abstract:
The repeated failure to comply with the regulations and policies governing response times in Colombia's mining contracting process, currently managed by the recently created National Mining Agency (Agencia Nacional de Minería, ANM), has meant that the administration of mining resources is not carried out under the principles of efficiency, effectiveness, economy and speed. These evident weaknesses cause backlogs in the resolution of applications, the freezing of areas available for contracting, cost overruns and delays beyond the response times established by current regulations; as a consequence they create uncertainty among mining investors and losses in the collection of surface fees (canon superficiario), among other effects. The objective of this research is to analyze Colombia's mining titling process through the continuous improvement philosophy developed in the Theory of Constraints (TOC), in order to identify the bottlenecks that prevent the process from flowing adequately and to propose improvement alternatives whose implementation would exploit the constraints and subordinate the system to them.
Abstract:
Our purpose is to provide a set-theoretical framework for clustering fuzzy relational data, based essentially on the cardinality of the fuzzy subsets that represent objects and of their complements, without applying any crisp property. From this perspective we define a family of fuzzy similarity indexes, which includes a set of fuzzy indexes introduced by Tolias et al., and we analyze under which conditions it defines a fuzzy proximity relation. Following an original idea due to S. Miyamoto, we evaluate the similarity between objects and features by means of the same mathematical procedure. Combining these concepts and methods, we establish an algorithm for clustering fuzzy relational data. Finally, we present an example to illustrate the whole process.
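A small sketch of one cardinality-based fuzzy similarity index of the kind the abstract describes (the min/max ratio below is a generic choice, not necessarily a member of the family actually defined in the paper):

```python
import numpy as np

def fuzzy_similarity(a, b):
    """Cardinality-based index |A ∩ B| / |A ∪ B| with fuzzy cardinality
    |A| = sum of membership degrees (a generic illustrative choice)."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    union = np.maximum(a, b).sum()
    return np.minimum(a, b).sum() / union if union else 1.0

# Objects represented as fuzzy subsets over four features.
x = [0.9, 0.2, 0.7, 0.0]
y = [0.8, 0.1, 0.6, 0.1]
print(round(fuzzy_similarity(x, y), 3))  # 0.789: high for similar objects
```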
Abstract:
Møller-Plesset (MP2) and Becke-3-Lee-Yang-Parr (B3LYP) calculations have been used to compare the geometrical parameters, hydrogen-bonding properties, vibrational frequencies and relative energies of several X- and X+ hydrogen peroxide complexes. The geometries and interaction energies were corrected for the basis set superposition error (BSSE) in all the complexes (1-5) using the full counterpoise method, which yielded small BSSE values for the 6-311+G(3df,2p) basis set used. The calculated interaction energies range from medium to strong hydrogen-bonding systems (1-3) to strong electrostatic interactions (4 and 5). The molecular interactions have been characterized using the atoms-in-molecules theory (AIM) and by analysis of the vibrational frequencies. The minima on the BSSE-counterpoise-corrected potential-energy surface (PES) have been determined as described by S. Simón, M. Duran, and J. J. Dannenberg, and the results are compared with those from the uncorrected PES.
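For reference, the full counterpoise method referred to here is the standard Boys-Bernardi correction, in which each monomer energy is evaluated in the complete dimer basis (with ghost functions on the absent partner; notation ours):

```latex
\Delta E_{\mathrm{int}}^{\mathrm{CP}}
  \;=\; E_{AB}^{\{AB\}} \,-\, E_{A}^{\{AB\}} \,-\, E_{B}^{\{AB\}},
```

where the superscript denotes the basis set in which each subsystem energy is computed.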
Abstract:
A comparison of the local effects of the basis set superposition error (BSSE) on the electron densities and energy components of three representative H-bonded complexes was carried out. The electron densities were obtained with Hartree-Fock and density functional theory versions of the chemical Hamiltonian approach (CHA) methodology. It was shown that the effects of the BSSE were common to all the complexes studied. The electron density difference maps and the chemical energy component analysis (CECA) confirmed that the local effects of the BSSE differed when diffuse functions were present in the calculations.
Abstract:
Quantum molecular similarity (QMS) techniques are used to assess the response of the electron density of various small molecules to the application of a static, uniform electric field. Likewise, QMS is used to analyze the changes in electron density generated by the process of floating a basis set. The results show an interrelation between the floating process, the optimum geometry, and the presence of an external field. Cases involving the Le Chatelier principle are discussed, and the changes in bond critical point properties, self-similarity values and density differences are examined.
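For context, quantum molecular similarity between two molecules is typically quantified by overlap-type density integrals, with the self-similarity Z_AA entering the normalized Carbó index (standard definitions, not necessarily the exact variants used in this study):

```latex
Z_{AB} \;=\; \int \rho_A(\mathbf{r})\,\rho_B(\mathbf{r})\,\mathrm{d}\mathbf{r},
\qquad
C_{AB} \;=\; \frac{Z_{AB}}{\sqrt{Z_{AA}\,Z_{BB}}}.
```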