5 results for D63 - Equity, Justice, Inequality, and Other Normative Criteria and Measurement
at Universitätsbibliothek Kassel, Universität Kassel, Germany
Abstract:
The traditional task of a central bank is to preserve price stability and, in doing so, not to impair the real economy more than necessary. To meet this challenge, it is of great relevance whether inflation is driven only by inflation expectations and the current output gap or whether it is, in addition, influenced by past inflation. In the former case, as described by the New Keynesian Phillips curve, the central bank can immediately and simultaneously achieve price stability and equilibrium output, the so-called ‘divine coincidence’ (Blanchard and Galí 2007). In the latter case, the achievement of price stability is costly in terms of output and must be pursued over several periods. It is likewise important to distinguish this latter case, which describes ‘intrinsic’ inflation persistence, from that of ‘extrinsic’ inflation persistence, where the sluggishness of inflation is not a ‘structural’ feature of the economy but merely ‘inherited’ from the sluggishness of the other driving forces, inflation expectations and output. ‘Extrinsic’ inflation persistence is usually considered the less challenging case, as policy-makers are expected to fight the persistence in the driving forces, and especially to reduce the stickiness of inflation expectations through a credible monetary policy, in order to re-establish the ‘divine coincidence’. This dissertation aims to contribute to the vast literature and ongoing discussion on inflation persistence: Chapter 1 describes the policy consequences of inflation persistence and summarizes the empirical and theoretical literature. Chapter 2 compares two models of staggered price setting, one with a fixed two-period duration and the other with a stochastic duration of prices. I show that in an economy with a central bank optimizing under a timeless perspective, the model with two-period alternating price setting leads (for most parameter values) to more persistent inflation than the model with stochastic price duration.
This result amends earlier work by Kiley (2002), who found that the model with stochastic price duration generates more persistent inflation in response to an exogenous monetary shock. Chapter 3 extends the two-period alternating price-setting model to the case of 3- and 4-period price durations. This results in a more complex Phillips curve with a negative impact of past inflation on current inflation. As simulations show, this multi-period Phillips curve generates too low a degree of autocorrelation and turning points of inflation that occur too early, and it is outperformed by a simple Hybrid Phillips curve. Chapter 4 starts from the critique of Driscoll and Holden (2003) of the relative real-wage model of Fuhrer and Moore (1995). Taking seriously the critique that Fuhrer and Moore’s model, if their arguments are followed literally, collapses to a much simpler one without intrinsic inflation persistence, I extend the model by a term for inequality aversion. This model extension is not only in line with experimental evidence but results in a Hybrid Phillips curve with inflation persistence that is observationally equivalent to the one presented by Fuhrer and Moore (1995). In chapter 5, I present a model designed to study the relationship between fairness attitudes and time preference (impatience). In the model, two individuals take decisions in two subsequent periods. In period 1, both individuals are endowed with resources and are able to donate a share of their resources to the other individual. In period 2, the two individuals may join in a common production after bargaining over the split of its output. The size of the production output depends on the relative share of resources at the end of period 1, as the human capital of the two individuals, which is built by means of their resources, cannot be fully substituted for each other.
Therefore, it might be rational for a well-endowed individual in period 1 to act in a seemingly ‘fair’ manner and to donate resources to its poorer counterpart. This decision also depends on the individuals’ impatience, which is induced by the small but positive probability that production is not possible in period 2. As a general result, the individuals in the model economy are more likely to behave in a ‘fair’ manner, i.e., to donate resources to the other individual, the lower their own impatience and the higher the productivity of the other individual. Since the (seemingly) ‘fair’ behavior is modelled as an endogenous outcome and is related to time preference, the presented framework might help to further integrate behavioral economics and macroeconomics.
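For orientation, the two benchmark Phillips-curve specifications contrasted in this abstract can be written in their standard textbook form. The notation below is the conventional one from this literature (π for inflation, x for the output gap), not necessarily the dissertation's own; intrinsic persistence corresponds to a positive weight γ_b on lagged inflation.

```latex
% Purely forward-looking New Keynesian Phillips curve
% ('divine coincidence' case; no intrinsic persistence):
\pi_t = \beta\, E_t \pi_{t+1} + \kappa\, x_t

% Hybrid Phillips curve with a lagged-inflation term
% (\gamma_b > 0 generates intrinsic inflation persistence):
\pi_t = \gamma_f\, E_t \pi_{t+1} + \gamma_b\, \pi_{t-1} + \kappa\, x_t
```

In the first specification, disinflation is costless once expectations adjust; in the second, the backward-looking term forces the central bank to trade off output losses against inflation over several periods.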
Abstract:
Almost all Latin American countries are still marked by extreme forms of social inequality – and to an extent, this seems to be the case regardless of national differences in the economic development model or the strength of democracy and the welfare state. Recent research highlights the fact that the heterogeneous labour markets in the region are a key source of inequality. At the same time, there is a strengthening of ‘exclusive’ social policy, which is located at the fault lines of the labour market and constantly (re-)produces market-mediated disparities. In the last three decades, this type of social policy has even enjoyed democratic legitimacy. These dynamics challenge many of the assumptions guiding social policy and democratic theory, which often attempt to account for the specificities of the region by highlighting the purported flaws of certain policies. We suggest taking a different perspective: social policy in Latin America should not be understood as a deficient or flawed type of social policy, but as a very successful relation of political domination. ‘Relational social analysis’ locates social policy in the ‘tension zone’ constituted by the requirements of economic reproduction, demands for democratic legitimacy and the relative autonomy of the state. From this vantage point, we will make the relation of domination in question accessible to empirical research. It seems particularly useful for this purpose to examine the recent shifts in the Latin American labour markets, which have undergone numerous reforms. We will examine which mechanisms, institutions and constellations of actors block or activate the potentials for redistribution inherent in such processes of political reform. This will enable us to explore the socio-political field of forces that has been perpetuating the social inequalities in Latin America for generations.
Abstract:
During recent years, quantum information processing and the study of N-qubit quantum systems have attracted a lot of interest, both in theory and experiment. Apart from the promise of performing efficient quantum information protocols, such as quantum key distribution, teleportation or quantum computation, these investigations have also revealed a great deal of difficulties which still need to be resolved in practice. Quantum information protocols rely on the application of unitary and non-unitary quantum operations that act on a given set of quantum mechanical two-state systems (qubits) to form (entangled) states, in which the information is encoded. The overall system of qubits is often referred to as a quantum register. Today, entanglement in a quantum register is recognized as the key resource for many protocols of quantum computation and quantum information theory. However, despite the successful demonstration of several protocols, such as teleportation or quantum key distribution, there are still many open questions about how entanglement affects the efficiency of quantum algorithms or how it can be protected against noisy environments. To facilitate the simulation of such N-qubit quantum systems and the analysis of their entanglement properties, we have developed the Feynman program. The program package provides all the tools necessary to define and work with quantum registers, quantum gates and quantum operations. Using an interactive and easily extensible design within the framework of the computer algebra system Maple, the Feynman program is a powerful toolbox not only for teaching the basic and more advanced concepts of quantum information but also for studying their physical realization in the future. To this end, the Feynman program implements a selection of algebraic separability criteria for bipartite and multipartite mixed states as well as the most frequently used entanglement measures from the literature.
Additionally, the program supports work with quantum operations and their associated (Jamiolkowski) dual states. Based on the implementation of several popular decoherence models, we provide tools especially for the quantitative analysis of quantum operations. As an application of the developed tools, we further present two case studies in which entanglement in two atomic processes is investigated. In particular, we have studied the change of the electron-ion spin entanglement in atomic photoionization and the photon-photon polarization entanglement in the two-photon decay of hydrogen. The results show that both processes are, in principle, suitable for the creation and control of entanglement. Apart from process-specific parameters like initial atom polarization, it is mainly the process geometry which offers a simple and effective instrument to adjust the final-state entanglement. Finally, for the case of the two-photon decay of hydrogenlike systems, we study the difference between nonlocal quantum correlations, as given by the violation of the Bell inequality, and the concurrence as a genuine entanglement measure.
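To illustrate the last point, the concurrence of a two-qubit state (the Wootters measure referred to above) reduces to a few lines of linear algebra. The following is a minimal stand-alone sketch in Python/NumPy, not the Feynman program's Maple API; the state names are illustrative only:

```python
import numpy as np

def concurrence(rho):
    """Wootters concurrence of a two-qubit density matrix rho (4x4).

    C(rho) = max(0, l1 - l2 - l3 - l4), where l_i are the square roots of
    the eigenvalues of rho * rho_tilde in decreasing order, and
    rho_tilde = (sigma_y (x) sigma_y) rho* (sigma_y (x) sigma_y).
    """
    sy = np.array([[0, -1j], [1j, 0]])
    Y = np.kron(sy, sy)
    rho_tilde = Y @ rho.conj() @ Y          # the spin-flipped state
    evals = np.linalg.eigvals(rho @ rho_tilde)
    lam = np.sort(np.sqrt(np.abs(evals.real)))[::-1]
    return max(0.0, lam[0] - lam[1] - lam[2] - lam[3])

# Maximally entangled Bell state |Phi+> = (|00> + |11>)/sqrt(2): concurrence 1.
phi = np.array([1, 0, 0, 1]) / np.sqrt(2)
rho_bell = np.outer(phi, phi.conj())

# Separable product state |00>: concurrence 0.
e00 = np.zeros(4)
e00[0] = 1.0
rho_prod = np.outer(e00, e00)
```

Unlike the Bell-inequality violation, the concurrence is nonzero for every entangled two-qubit state, which is precisely the distinction the two-photon case study exploits.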
Abstract:
Supervision is the counselling of professional practice. The central concern of this dissertation is to examine the development of the world of work in modernity, as well as its current transformations, in order to determine their significance for supervision. First, the history and further development of supervision, with its fields of tension, are outlined. The main part deals with the emergence and establishment of the capitalist mode of production. It sets out what a multi-layered process was set in motion for the individual and for society in order to install the modern way of doing business, and what support from other spheres (religion, politics, etc.) was needed for this. Furthermore, it is shown which conflicts were present from the very beginning. The main focus is on the premises of the modern world of work and the possible incompatibility of its concerns: profit and justice, competition and cooperation, free markets and state regulation, equality and specialization, individuality and collectivity. The conflicts and entanglements that result from this for working people, and which thus became topics for supervision, are worked out. With regard to supervision, it is shown which problems are immanent to this mode of economic activity and can hardly be resolved through counselling. Work, as a central category of economics, occupies a significant position. The interweaving of work and the economy, as well as the developmental trajectory of work, are sketched: from Fordism to globalization. The consequences of the development of productive forces and of the new production concepts for current forms and conditions of work are described and discussed. Work is critically reflected upon in its function for the capitalist mode of production, with its adverse effects on working people.
The ambivalent relationship between the human being and the economy is regarded as the most significant field of tension for supervision, combined with the challenge of not tailoring supervision into a counselling technology for adapting people to, and making them functional for, an ever more rapidly changing world of work.
Abstract:
This work focuses on the analysis of the influence of the environment on the relative biological effectiveness (RBE) of carbon ions at the molecular level. Due to the high relevance of RBE for medical applications, such as tumor therapy, and for radiation protection in space, DNA damage has been investigated in order to understand the biological efficiency of heavy ion radiation. The contribution of this study to radiobiology research consists in the analysis of plasmid DNA damage induced by carbon ion radiation in biochemical buffer environments, as well as in the calculation of the RBE of carbon ions on the DNA level by means of scanning force microscopy (SFM). In order to study the DNA damage, besides the common electrophoresis method, a new approach has been developed using SFM. The latter method allows direct visualisation and measurement of individual DNA fragments with an accuracy of several nanometres. In addition, a comparison of the results obtained by the SFM and agarose gel electrophoresis methods has been performed in the present study. Sparsely ionising radiation, such as X-rays, and densely ionising radiation, such as carbon ions, have been used to irradiate plasmid DNA in trishydroxymethylaminomethane (Tris buffer) and 4-(2-hydroxyethyl)-1-piperazineethanesulfonic acid (HEPES buffer) environments. These buffer environments exhibit different scavenging capacities for the hydroxyl radical (HO•), which is produced by the ionisation of water and plays the major role in the indirect DNA damage processes. Fragment distributions have been measured by SFM over a large length range and, as expected, a significantly higher degree of DNA damage was observed with increasing dose. A higher number of double-strand breaks (DSBs) was also observed after irradiation with carbon ions compared to X-ray irradiation. The results obtained from the SFM measurements show that both types of radiation induce multiple fragmentation of the plasmid DNA in the dose range from D = 250 Gy to D = 1500 Gy.
Using Tris environments at two different concentrations, a decrease of the relative biological effectiveness with rising Tris concentration was observed. This demonstrates the radioprotective behavior of the Tris buffer solution. In contrast, a lower scavenging capacity for all other free radicals and ions produced by the ionisation of water was registered in the case of the HEPES buffer compared to the Tris solution. This is reflected in the higher RBE values deduced from SFM and gel electrophoresis measurements after irradiation of the plasmid DNA in a 20 mM HEPES environment compared to a 92 mM Tris solution. These results show that HEPES and Tris environments play a major role in preventing the indirect DNA damage induced by ionising radiation and in the relative biological effectiveness of heavy ion radiation. In general, the RBE calculated from the SFM measurements yields higher values compared to the gel electrophoresis data, for plasmids irradiated in all environments. Using the large set of data obtained from the SFM measurements, it was possible to calculate the survival rate over a larger range, from 88% to 98%, while for the gel electrophoresis measurements the survival rates could be calculated only for values between 96% and 99%. While the gel electrophoresis measurements provide information only about the percentage of plasmid DNA that suffered a single DSB, SFM can count the small plasmid fragments produced by multiple DSBs induced in a single plasmid. Consequently, SFM generates more detailed information regarding the number of induced DSBs compared to gel electrophoresis, and therefore the RBE can be calculated with greater accuracy. Thus, SFM has been proven to be a more precise method to characterize at the molecular level the DNA damage induced by ionising radiation.
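As an illustrative sketch of the quantity being compared above: RBE is defined as the ratio of the reference-radiation dose (X-rays) to the ion dose that produces the same biological effect. Assuming, for illustration only, a simple exponential survival model S(D) = exp(-αD) for plasmid integrity (a common simplification; the dissertation's actual analysis rests on SFM fragment statistics, and all numbers below are hypothetical, not measured values from this work):

```python
import math

def alpha_from_survival(dose_gy, survival):
    """Slope alpha of the assumed model S(D) = exp(-alpha * D),
    fitted from a single (dose, survival-fraction) point."""
    return -math.log(survival) / dose_gy

def rbe(alpha_ion, alpha_x):
    """RBE = D_X / D_ion at equal survival. Under the exponential
    model this ratio equals alpha_ion / alpha_x and is independent
    of the survival level chosen for the comparison."""
    return alpha_ion / alpha_x

# Hypothetical example: at 1000 Gy, 94% of plasmids remain intact
# after X-rays but only 90% after carbon ions.
alpha_x = alpha_from_survival(1000.0, 0.94)
alpha_ion = alpha_from_survival(1000.0, 0.90)
```

Because SFM resolves multiple DSBs per plasmid, it yields survival fractions over a wider range (88-98% here versus 96-99% for gel electrophoresis), which constrains the fitted slopes, and hence the RBE, more tightly.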