964 results for 010103 Category Theory, K Theory, Homological Algebra
Abstract:
In the last 50 years, there have been approximately 40 events with the characteristics of a financial crisis. The most severe was the crisis of 1929, when financial markets plummeted and US gross domestic product declined by more than 30 percent. A few years ago, a new crisis developed in the United States and almost instantly produced consequences and effects in the rest of the world. This new economic and financial crisis has increased the interest and motivation of the academic community, professors and researchers, to understand its causes and effects and to learn from it. This is one of the main reasons for the compilation of this book, which began with a meeting of a group of IAFI researchers at the University of Barcelona, where researchers from Mexico and Spain explained the causes and consequences of the 2007 crisis. For that reason, we believe this set of chapters on methodologies, applications and theories conveniently explains the characteristics and events of past and future financial crises.
The book consists of three main sections: the first, "State of the Art and current situation"; the second, "Econometric applications to estimate crisis time periods"; and the third, "Solutions to diminish the effects of the crisis". The first section, comprising two chapters, surveys the current point of view of the research literature on financial crises. Chapter 1 describes and analyzes the models that have historically been used to explain financial crises and proposes alternative methodologies such as Fuzzy Cognitive Maps. Chapter 2 explains the characteristics and details of the 2007 crisis from the US perspective, compares it to the 1929 crisis, and presents some of its effects in Mexico and Latin America.
The second section presents two econometric applications to estimate possible crisis periods. Chapter 3 studies three Latin American countries, Argentina, Brazil and Peru, in the 1994 crisis and estimates multifractal characteristics to identify financial and economic distress. Chapter 4 examines the crises in Argentina (2001), Mexico (1994) and, recently, the United States (2007), and their effects on other countries through a time-series methodology applied to stock-market data.
The last section shows alternatives for mitigating the effects of a crisis. Chapter 5 explains the effects on financial stability of financial-system regulation and some globalization standards. Chapter 6 studies the benefits of investor activism as a way to protect personal and national wealth against financial crisis risks.
Abstract:
We develop an abstract extrapolation theory for the real interpolation method that covers and improves the most recent versions of the celebrated theorems of Yano and Zygmund. As a consequence of our method, we give new endpoint estimates for the Sobolev embedding theorem on an arbitrary domain Omega.
Abstract:
We present the results of polar-angle-resolved x-ray photoemission spectroscopy on Al(111)/O and cluster calculations of the O(1s) binding energy (BE) for various model situations. In the experimental data two O(1s) peaks are observed, separated by 1.3 eV. The angular behavior (depth resolution) could indicate that the lower-BE peak is associated with an O atom under the surface, and the higher-BE peak with an O atom above the surface. Equally, it could indicate oxygen islands on the surface where the perimeter atoms have a higher O(1s) BE than the interior atoms. The cluster calculations show that the former interpretation cannot be correct, since an O atom adsorbed below the surface has a higher calculated O(1s) BE than one above. Cluster calculations simulating oxygen islands are, however, consistent with the experimental data.
Abstract:
The performance of density-functional theory in solving the exact, nonrelativistic, many-electron problem for magnetic systems has been explored in a new implementation imposing space and spin symmetry constraints, as in ab initio wave function theory. Calculations on selected systems representative of organic diradicals, molecular magnets and antiferromagnetic solids carried out with and without these constraints lead to contradictory results, which provide numerical illustration of this usually overlooked problem. It is concluded that the present exchange-correlation functionals provide reasonable numerical results, although for the wrong physical reasons, thus evidencing the need for a continued search for more accurate expressions.
Abstract:
The electronic and magnetic structures of the LaMnO3 compound have been studied by means of periodic calculations within the framework of spin-polarized hybrid density-functional theory. In order to quantify the role of approximations to electronic exchange and correlation, three different hybrid functionals have been used, which mix nonlocal Fock and local Dirac-Slater exchange. Periodic Hartree-Fock results are also reported for comparative purposes. The A-antiferromagnetic ground state is properly predicted by all methods including Hartree-Fock exchange. In general, the different hybrid methods provide a rather accurate description of the band gap and of the two magnetic coupling constants, strongly suggesting that the corresponding description of the electronic structure is also accurate. An important conclusion emerging from this study is that the nature of the occupied states near the Fermi level is intermediate between the Hartree-Fock and local density approximation descriptions, with a comparable participation of both Mn and O states.
Abstract:
A hybrid theory which combines the full nonlocal "exact" exchange interaction with the local spin-density approximation of density-functional theory is shown to lead to marked improvement in the description of antiferromagnetically coupled systems. Semiquantitative agreement with experiment is found for the magnitude of the coupling constant in La2CuO4, KNiF3, and K2NiF4. The magnitude of the unpaired spin population on the metal site is in excellent agreement with experiment for La2CuO4.
Abstract:
Variable queen mating frequencies provide a unique opportunity to study the resolution of worker-queen conflict over sex ratio in social Hymenoptera, because the conflict is maximal in colonies headed by a singly mated queen and is weak or nonexistent in colonies headed by a multiply mated queen. In the wood ant Formica exsecta, workers in colonies with a singly mated queen, but not those in colonies with a multiply mated queen, altered the sex ratio of queen-laid eggs by eliminating males to preferentially raise queens. By this conditional response to queen mating frequency, workers enhance their inclusive fitness.
Abstract:
From a theoretical perspective, an extension to the Full Range Leadership Theory (FRLT) seems needed. In this paper, we explain why instrumental leadership, a class of leadership that includes leader behaviors focusing on task and strategic aspects that are neither values- nor exchange-oriented, can fulfill this extension. Instrumental leadership is composed of four factors: environmental monitoring, strategy formulation and implementation, path-goal facilitation, and outcome monitoring; these aspects of leadership are currently not included in any of the FRLT's nine leadership scales (as measured by the MLQ, the Multifactor Leadership Questionnaire). We present results from two empirical studies using very large samples from a wide array of countries (N > 3,000) to examine the factorial, discriminant and criterion-related validity of the instrumental leadership scales. We find support for a four-factor instrumental leadership model, which explains incremental variance in leader outcomes over and above transactional and transformational leadership.
Abstract:
In the 1920s, Ronald Fisher developed the theory behind the p value, and Jerzy Neyman and Egon Pearson developed the theory of hypothesis testing. These distinct theories have provided researchers with important quantitative tools to confirm or refute their hypotheses. The p value is the probability of obtaining an effect equal to or more extreme than the one observed, presuming the null hypothesis of no effect is true; it gives researchers a measure of the strength of evidence against the null hypothesis. As commonly used, investigators select a threshold p value below which they will reject the null hypothesis. The theory of hypothesis testing allows researchers to reject a null hypothesis in favor of an alternative hypothesis of some effect. As commonly used, investigators choose Type I error (rejecting the null hypothesis when it is true) and Type II error (accepting the null hypothesis when it is false) levels and determine some critical region. If the test statistic falls into that critical region, the null hypothesis is rejected in favor of the alternative hypothesis. Despite similarities between the two, the p value and the theory of hypothesis testing are different theories that are often misunderstood and confused, leading researchers to improper conclusions. Perhaps the most common misconception is to consider the p value as the probability that the null hypothesis is true, rather than the probability of obtaining the difference observed, or one more extreme, given that the null is true. Another concern is the risk that an important proportion of statistically significant results are falsely significant. Researchers should have a minimum understanding of these two theories so that they are better able to plan, conduct, interpret, and report scientific experiments.
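The definition of the p value in this abstract can be made concrete with a small simulation. The sketch below is illustrative only and not taken from the paper: the data and the permutation-test helper are hypothetical, and a permutation test is just one simple way to compute a p value without distributional assumptions.

```python
import random

def permutation_p_value(sample_a, sample_b, n_perm=10_000, seed=0):
    """Two-sided permutation test: estimate the probability, under the
    null hypothesis of no difference between groups, of observing a mean
    difference at least as extreme as the one actually observed."""
    rng = random.Random(seed)
    mean = lambda xs: sum(xs) / len(xs)
    observed = abs(mean(sample_a) - mean(sample_b))
    pooled = list(sample_a) + list(sample_b)
    n_a = len(sample_a)
    extreme = 0
    for _ in range(n_perm):
        # Under the null, group labels are exchangeable: reshuffle them.
        rng.shuffle(pooled)
        diff = abs(mean(pooled[:n_a]) - mean(pooled[n_a:]))
        if diff >= observed:
            extreme += 1
    return extreme / n_perm

# Hypothetical data: two small groups with clearly different means.
a = [5.1, 5.3, 4.9, 5.4, 5.2]
b = [4.2, 4.4, 4.1, 4.5, 4.3]
p = permutation_p_value(a, b)
# Fisher's reading: p measures the strength of evidence against the null.
# Neyman-Pearson's reading: with a pre-chosen alpha (say 0.05), reject the
# null when the test statistic falls in the critical region, i.e. p < alpha.
```

Note that the simulation returns the probability of data at least as extreme as those observed given the null, not the probability that the null is true, which is exactly the misconception the abstract warns about.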