68 results for Armington Assumption


Relevance: 10.00%

Abstract:

Nowadays any analysis of the Russian economy is incomplete without taking the phenomenon of oligarchy into account. Russian oligarchs appeared after the fall of the Soviet Union and are wealthy businessmen who control a large share of natural-resource enterprises and wield considerable political influence. Oligarchs’ shares in some natural-resource industries reach as much as 70-80%. Their role in the Russian economy is undoubtedly large, yet very little economic analysis has been devoted to it. The aim of this work is to examine Russian oligarchy at the micro and macro levels, its role in Russia’s transition, and the possible positive and negative outcomes of this phenomenon. For this purpose the work presents two theoretical models. The first part of the thesis examines the role of oligarchs at the micro level, concentrating on the question of whether oligarchs can be more productive owners than other types of owners. To answer this question, the part presents a model based on the article “Are oligarchs productive? Theory and evidence” by Y. Gorodnichenko and Y. Grygorenko, followed by an empirical test based on the works of S. Guriev and A. Rachinsky. The model predicts that oligarchs invest more in the productivity of their enterprises and earn higher returns on capital, and are therefore more productive owners. In the empirical test, oligarchs were found to outperform other types of owners; however, it remains open whether the productivity gains offset the losses in tax revenue. The second part of the work concentrates on the role of oligarchy at the macro level. More precisely, it examines the assumption that the depression after the 1998 crisis in Russia was caused by the oligarchs’ behavior. This part presents a theoretical model based on the article “A macroeconomic model of Russian transition: The role of oligarchic property rights” by S. Braguinsky and R. Myerson, in which a special type of property rights is introduced. After the 1998 crisis, oligarchs started to invest all their resources abroad to protect themselves from political risks, which resulted in a long depression phase. The macroeconomic model shows that better protection of property rights (lower political risk) and/or higher outside investment could reduce the depression. In light of this result, government policy can steer the oligarchs’ behavior in a direction more beneficial for the Russian economy and make the transition faster.
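The key mechanism summarized above, that higher political risk pushes oligarchs to move capital abroad and thereby prolongs the depression, can be illustrated with a deliberately simple calculation. The sketch below is not the Braguinsky-Myerson model; the returns, the risk levels and the all-or-nothing investment rule are assumptions made purely for illustration.

# Toy illustration (not the Braguinsky-Myerson model): an oligarch compares the
# expected return of investing at home, where capital may be expropriated with
# probability `risk`, against a safe foreign return. All figures are assumed.
def invests_at_home(r_domestic=0.15, r_foreign=0.05, risk=0.30):
    """Return True if the expected domestic return (principal lost on
    expropriation) beats the safe foreign return."""
    expected_domestic = (1 - risk) * r_domestic - risk * 1.0
    return expected_domestic > r_foreign

for risk in (0.05, 0.10, 0.30):
    print(f"political risk {risk:.0%}: invests at home = {invests_at_home(risk=risk)}")

# Lower political risk (better property-rights protection) keeps capital at home,
# consistent with the model's prediction that this would shorten the depression.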

Relevance: 10.00%

Abstract:

The study presents a theory of utility models based on aspiration levels, as well as an application of this theory to the planning of timber flow economics. The first part of the study derives the utility-theoretic basis for the application of aspiration levels. Two basic models are dealt with: the additive and the multiplicative. Applied here solely to partial utility functions, aspiration and reservation levels are interpreted as defining piecewise linear functions. The standpoint of the decision-maker's choices is emphasized by the use of indifference curves. The second part of the study introduces a model for the management of timber flows. The model is based on the assumption that the decision-maker is willing to specify a shape of income flow that differs from that of the capital-theoretic optimum. The utility model comprises four aspiration-based compound utility functions. The theory and the flow model are tested numerically by computations covering three forest holdings. The results show that the additive model is sensitive even to slight changes in relative importances and aspiration levels. This applies particularly to nearly linear production possibility boundaries of monetary variables. The multiplicative model, on the other hand, is stable because it generates strictly convex indifference curves. Due to a higher marginal rate of substitution, the multiplicative model implies a stronger dependence on forest management than the additive function. For income trajectory optimization, a method utilizing an income trajectory index is more efficient than one based on the use of aspiration levels per management period. Smooth trajectories can be attained by squaring the deviations of the feasible trajectories from the desired one.
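As a concrete reading of the additive and multiplicative forms, the sketch below implements a piecewise linear partial utility anchored at a reservation level (utility 0) and an aspiration level (utility 1) and combines several criteria either as a weighted sum or as a weighted product. This is a minimal illustration rather than the thesis model; the clipping outside the reservation-aspiration range, the weighted-product (Cobb-Douglas style) aggregation and all numbers are assumptions.

# Minimal sketch of aspiration-based utility (assumptions noted above).
import numpy as np

def partial_utility(x, reservation, aspiration):
    """Piecewise linear partial utility: 0 at the reservation level,
    1 at the aspiration level, clipped outside that range (assumed)."""
    return float(np.clip((x - reservation) / (aspiration - reservation), 0.0, 1.0))

def additive_utility(values, reservations, aspirations, weights):
    u = [partial_utility(x, r, a) for x, r, a in zip(values, reservations, aspirations)]
    return sum(w * ui for w, ui in zip(weights, u))

def multiplicative_utility(values, reservations, aspirations, weights):
    u = [partial_utility(x, r, a) for x, r, a in zip(values, reservations, aspirations)]
    prod = 1.0
    for w, ui in zip(weights, u):
        prod *= ui ** w          # weights as exponents (Cobb-Douglas style, assumed)
    return prod

# Hypothetical criteria: periodic income (EUR) and standing timber volume (m3).
values       = [42_000, 180]
reservations = [30_000, 150]
aspirations  = [50_000, 220]
weights      = [0.6, 0.4]

print(additive_utility(values, reservations, aspirations, weights))        # ~0.53
print(multiplicative_utility(values, reservations, aspirations, weights))  # ~0.52

With weights as exponents, the multiplicative form produces strictly convex indifference curves over the partial utilities, which is consistent with the stability reported for the multiplicative model above.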

Relevance: 10.00%

Abstract:

An extensive electricity transmission network facilitates electricity trading between Finland, Sweden, Norway and Denmark. Currently most of the area's power generation is traded at NordPool, where trading volumes have steadily increased since the early 1990s, when the exchange was founded. The Nordic electricity market is expected to follow the current trend and integrate further with the other European electricity markets. Hydro power accounts for roughly half of the supply in the Nordic electricity market, and most of the hydro is generated in Norway. The dominant role of hydro power distinguishes the Nordic electricity market from most other marketplaces. Production of hydro power varies mainly with hydro reservoirs and the demand for electricity. Hydro reservoirs are affected by water inflows that differ from year to year, and they explain the behaviour of the Nordic electricity market to a remarkable degree. Kauppi and Liski (2008), among others, have therefore developed a model that analyzes the behaviour of the market using hydro reservoirs as explanatory factors. Their model produces, for example, the welfare loss due to socially suboptimal hydro reservoir usage, the socially optimal electricity price, hydro reservoir storage and thermal reservoir storage; these are referred to as outcomes. However, the model does not describe the real market condition but rather an ideal situation: the market is controlled by one agent, i.e. one agent controls all the power generation reserves, which is referred to as the socially optimal strategy. The article by Kauppi and Liski (2008) also includes an assumption in which an individual agent holds a certain fraction of market power, e.g. 20% or 30%; in order to maintain the focus of this thesis, that part of their paper is omitted. The goal of this thesis is twofold. Firstly, we extend the results of the socially optimal strategy to the years 2006-08, as the earlier study ends in 2005. The second objective is to improve on the methods of the previous study. The thesis produces several outcomes (SPOT price, welfare loss, etc.) under socially optimal actions. The welfare loss is interesting as it describes the inefficiency of the market. The SPOT price is an important output for market participants, as it often affects end users' electricity bills. Another task is to modify and try to improve the model by using more accurate input data, e.g. by considering the effect of emission trading rights on the input data. After the modifications to the model, new welfare losses are calculated and compared with the corresponding results before the modifications. The hydro reservoir has the highest explanatory significance in the model, followed by thermal power. In the Nordic markets, thermal power reserves are mostly nuclear power and other thermal sources (coal, natural gas, oil, peat). It can be argued that hydro and thermal reservoirs determine electricity supply. Roughly speaking, the model takes into account electricity demand and supply and several parameters related to them (water inflow, oil price, etc.), finally yielding the socially optimal outcomes. The author of this thesis is not aware of any similar model having been tested before. There have been some other studies close to the Kauppi and Liski (2008) model, but those have a somewhat different focus; for example, a specific feature of the model is its focus on long-run capacity usage, which differs from previous studies of short-run market power. The closest study to the model concerns California's wholesale electricity markets but uses a different methodology. The work is structured as follows.
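To make the notion of a socially optimal strategy concrete, the sketch below sets up a toy social planner who allocates hydro releases over a few periods so as to minimise total thermal generation cost; the marginal thermal cost then plays the role of a socially optimal (SPOT) price. This is only an illustration under assumed numbers and an assumed quadratic thermal cost function, not the Kauppi and Liski (2008) model.

# Toy social planner for hydro scheduling (all numbers and the quadratic thermal
# cost are assumptions; this is not the Kauppi-Liski model).
import numpy as np
from scipy.optimize import minimize

T = 4
demand = np.array([10.0, 12.0, 14.0, 11.0])   # demand per period (assumed)
inflow = np.array([6.0, 3.0, 2.0, 5.0])       # water inflow per period (assumed)
s0, s_max = 5.0, 15.0                         # initial and maximum reservoir storage
a = 0.5                                       # thermal cost C(q) = a * q**2 (assumed)

def thermal_cost(h):
    q = demand - h                            # thermal output fills residual demand
    return np.sum(a * q**2)

def storage(h):
    # Reservoir level at the end of each period: s_t = s0 + cumsum(inflow - release).
    return s0 + np.cumsum(inflow - h)

constraints = [
    {"type": "ineq", "fun": lambda h: storage(h)},          # reservoir never below 0
    {"type": "ineq", "fun": lambda h: s_max - storage(h)},  # reservoir never above s_max
]
bounds = [(0.0, float(d)) for d in demand]    # hydro release between 0 and demand

res = minimize(thermal_cost, x0=np.full(T, 4.0), bounds=bounds, constraints=constraints)
h_opt = res.x
q_opt = demand - h_opt
print("socially optimal hydro releases:", np.round(h_opt, 2))
print("implied 'SPOT' prices (marginal thermal cost 2*a*q):", np.round(2 * a * q_opt, 2))

# The welfare loss of any alternative release schedule can be measured as the
# increase in total thermal cost relative to this optimum.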

Relevance: 10.00%

Abstract:

Modern sample surveys started to spread after statisticians at the U.S. Bureau of the Census had developed a sampling design for the Current Population Survey (CPS) in the 1940s. Another significant factor was that digital computers became available to statisticians. In the early 1950s, the theory was documented in textbooks on survey sampling. This thesis is about the development of statistical inference for sample surveys. The idea of statistical inference was first enunciated by the French scientist P. S. Laplace. In 1781, he published a plan for a partial investigation in which he determined the sample size needed to reach the desired accuracy in estimation. The plan was based on Laplace's Principle of Inverse Probability and on his derivation of the Central Limit Theorem, published in a memoir in 1774 that is one of the origins of statistical inference. Laplace's inference model was based on Bernoulli trials and binomial probabilities. He assumed that populations were changing constantly, which was depicted by assuming a priori distributions for the parameters. Laplace's inference model dominated statistical thinking for a century. Sample selection in Laplace's investigations was purposive. At the 1894 meeting of the International Statistical Institute, the Norwegian Anders Kiaer presented the idea of the Representative Method for drawing samples. Its idea, which still prevails, was that the sample should be a miniature of the population. The virtues of random sampling were known, but practical problems of sample selection and data collection hindered its use. Arthur Bowley realized the potential of Kiaer's method and, at the beginning of the 20th century, carried out several surveys in the UK. He also developed a theory of statistical inference for finite populations, based on Laplace's inference model. R. A. Fisher's contributions in the 1920s constitute a watershed in statistical science: he revolutionized the theory of statistics. In addition, he introduced a new statistical inference model which is still the prevailing paradigm. Its essential ideas are that samples are drawn repeatedly from the same population and that population parameters are constants. Fisher's theory did not include a priori probabilities. Jerzy Neyman adopted Fisher's inference model and applied it to finite populations, with the difference that Neyman's inference model makes no assumptions about the distributions of the study variables. Applying Fisher's fiducial argument, he developed the theory of confidence intervals. Neyman's last contribution to survey sampling presented a theory of double sampling, which gave statisticians at the U.S. Census Bureau the central idea for developing the complex survey design of the CPS. An important criterion was to have a method whose data collection costs were acceptable and which provided approximately equal interviewer workloads as well as sufficient accuracy in estimation.
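As a small numerical companion to the ideas above, the sketch below computes a Neyman-style confidence interval for a proportion under simple random sampling without replacement (with the finite population correction) and a Laplace-style sample size for a desired accuracy. The survey numbers are hypothetical and the normal approximation is an assumption.

# Illustrative sketch: finite-population proportion estimation and sample-size
# calculation (hypothetical numbers; normal approximation assumed).
import math

def proportion_ci(successes, n, N, z=1.96):
    """Approximate 95% confidence interval for a finite-population proportion
    under simple random sampling, using the finite population correction."""
    p = successes / n
    se = math.sqrt(p * (1 - p) / n * (1 - n / N))
    return p - z * se, p + z * se

def required_sample_size(margin, p=0.5, z=1.96):
    """Sample size needed to reach a given margin of error (normal
    approximation, conservative p = 0.5), in the spirit of Laplace's
    accuracy requirement."""
    return math.ceil(z**2 * p * (1 - p) / margin**2)

# Hypothetical survey: 412 'successes' out of n = 1,000 drawn from N = 50,000.
print(proportion_ci(412, 1000, 50_000))
# Sample size needed for a +/- 2 percentage point margin of error.
print(required_sample_size(0.02))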

Relevance: 10.00%

Abstract:

The Master’s thesis examines whether and how decolonial cosmopolitanism is empirically traceable in the attitudes and practices of Costa Rican activists working in transnational advocacy organizations. Decolonial cosmopolitanism is defined as a form of cosmopolitanism from below that aims to propose ways of imagining, and putting into practice, a truly globe-encompassing civic community based not on relations of domination but on horizontal dialogue. The concept was developed by, and shares its basic presumptions with, the theory on coloniality that the modernity/coloniality/decoloniality research group is putting forward. The thesis analyzes whether and how the workings of coloniality, the underlying ontological assumption of decolonial cosmopolitanism, broadly subsumable under the three logics of race, capitalism, and knowledge, are traceable in intermediate postcolonial transnational advocacy in Costa Rica. The method chosen to approach these questions is content analysis, applied to qualitative semi-structured in-depth interviews with Costa Rican activists working in advocacy organizations with transnational ties. Costa Rica was chosen because, while unquestionably a Latin American postcolonial country and thus within the geo-political context in which the concept was developed, it introduces a complex setting of socio-cultural and political factors that puts the explanatory potential of the concept to the test. The research group applies the term ‘coloniality’ to describe how the social, political, economic, and epistemic relations developed during the colonization of the Americas order global relations and still sustain Western domination today through what is called the logic of coloniality. It also takes these processes as a point of departure for imagining how counter-hegemonic contestations can be achieved through the linking of local struggles to a global community based on pluriversality. The issues chosen as the most relevant expressions of the logic of coloniality in the context of Costa Rican transnational advocacy, and thus empirically scrutinized, are national identity as a ‘white’, exceptional nation with gender equality (racism); the neoliberalization of advocacy in the Global South (capitalism); and finally Eurocentrism, but also transnational civil society networks as a first step in decolonizing civic activism (epistemic domination). The findings of this thesis show that the various ways in which activists adopt practices and outlooks stemming from the center in order to empower themselves and their constituencies, and the ways their particular geo-political position affects their work, cannot be reduced to one single logic of coloniality. Nonetheless, the aspects of race, gender, capitalism and epistemic hegemony undeniably affect activist cosmopolitan attitudes and transnational practices. While the premises on which the concept of decolonial cosmopolitanism is based suffer from some analytical drawbacks, its importance lies in its ability to take as a point of departure the concrete spaces in which situated social relations develop. It thus allows the increasing interconnectedness between different levels of social and political organizing to be seen as contributing to cosmopolitan visions that combine local situatedness with global community as a normative horizon, visions that have influenced not only academic debate but also political projects.

Relevance: 10.00%

Abstract:

From monologues to dialogue. A discussion about changing the fragmented character of the debate concerning schools into one of negotiation, in the spirit of social constructionism. The starting point of the study is the assumption that the parties interested in schools, such as teachers, students, and public servants within school administration or politics, construct the idea of the school in disparate ways. It looks as if the representatives of the various interested parties perceive the school in distinctive ways or with particular emphases. Additionally, there are not many discussion forums where these different interested parties have an equal right to speak and be heard. A lack of dialogue seems to characterize the debate about schools. At the centre of the study are negotiations concerning schools and the conditions that promote changing the fragmented character of this school debate into a more promising and collectively responsible process of negotiation. The aims of the study are to find both an empirical and a theoretical basis for more equal ways to negotiate about schools, and to increase cultural self-reflection. Social constructionism plays a key role in aspiring to meet these research aims. The research questions are (1) How do the informants of the study construct the idea of school in their texts? and (2) What kind of prospects does social constructionism bring to negotiations about school? The research informants construct the idea of school in their texts in several ways. To sum up: school is constructed as a place for learning, a place for building the future, a place where ethical education is lived out, a place for social education and Bildung, and a place where students' well-being is ensured. The previously presented assumption that the interested parties construct the idea of a school in disparate ways or with various emphases thus finds support in the informants' texts. Based on this, a condition can be put forward: different perspectives should have an equal opportunity to be heard in negotiations about school. It would also be helpful if the different perspectives could be documented and/or visualized in some way. This ensures that different constructions of school are within reach of all the participants. Additionally, by making the process of negotiation transparent, this documentation becomes an important medium for self-reflection. On the one hand, it visualizes the complexity of the school; on the other hand, it protects the school and education from serving as the spokesman of any single truth presented as objective or universal. Social constructionism seems to offer a stable theoretical basis for changing the fragmented character of the school debate into one of negotiation. More equal and collectively responsible school negotiation presumes that certain aspects or conditions drawn from postmodernism and social constructionism have been considered. The study presents six conditions that can be seen as means for changing the fragmented character of the school debate into one of more equal negotiation. Keywords: social constructionism, Kenneth J. Gergen, school negotiation, education policy, dialogue.

Relevance: 10.00%

Abstract:

This thesis is concerned with the area of vector-valued Harmonic Analysis, where the central theme is to determine how results from classical Harmonic Analysis generalize to functions with values in an infinite-dimensional Banach space. The work consists of three articles and an introduction. The first article studies the Rademacher maximal function, originally defined by T. Hytönen, A. McIntosh and P. Portal in 2008 in order to prove a vector-valued version of Carleson's embedding theorem. The boundedness of the corresponding maximal operator on Lebesgue-Bochner spaces defines the RMF-property of the range space. It is shown that the RMF-property is equivalent to a weak-type inequality which does not depend, for instance, on the integrability exponent, hence providing more flexibility for the RMF-property. The second article, written in collaboration with T. Hytönen, studies a vector-valued Carleson embedding theorem with respect to filtrations. An earlier proof of the dyadic version assumed that the range space satisfies a certain geometric type condition, which this article shows to be necessary as well. The third article deals with vector-valued generalizations of tent spaces, originally defined by R. R. Coifman, Y. Meyer and E. M. Stein in the 1980s, and concerns especially those related to square functions. A natural assumption on the range space is then the UMD-property. The main result is an atomic decomposition for tent spaces with integrability exponent one. In order to suit the stochastic integrals appearing in the vector-valued formulation, the proof is based on a geometric lemma for cones and differs essentially from the classical proof. Vector-valued tent spaces have also found applications in functional calculi for bisectorial operators. In the introduction these three themes come together in the study of paraproduct operators for vector-valued functions. The Rademacher maximal function and Carleson's embedding theorem were already applied by Hytönen, McIntosh and Portal to prove boundedness of the dyadic paraproduct operator on Lebesgue-Bochner spaces, assuming that the range space satisfies both the UMD- and RMF-properties. Whether UMD implies RMF is thus an interesting question. Tent spaces, on the other hand, provide a method to study continuous-time paraproduct operators, although the RMF-property is not yet understood in the framework of tent spaces.
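For orientation, a schematic LaTeX rendering of the Rademacher maximal function is given below. It follows the standard formulation usually attributed to Hytönen, McIntosh and Portal; the normalisation and the exact class of admissible cubes are assumptions to be checked against the original articles.

% Schematic definition; normalisation assumed.
% For a locally integrable f : \mathbb{R}^n \to X and dyadic cubes Q,
M_R f(x) \;=\; \mathcal{R}\bigl(\{\langle f\rangle_Q : Q \ni x\}\bigr),
\qquad
\langle f\rangle_Q \;=\; \frac{1}{|Q|}\int_Q f(y)\,\mathrm{d}y,

where each average \langle f\rangle_Q \in X is identified with the rank-one operator \lambda \mapsto \lambda\,\langle f\rangle_Q from the scalar field to X, and \mathcal{R}(\cdot) denotes the Rademacher bound (R-bound) of this operator family. The range space X has the RMF-property when M_R defines a bounded operator from L^p(\mathbb{R}^n;X) to L^p(\mathbb{R}^n).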

Relevance: 10.00%

Abstract:

This paper describes the cost-benefit analysis of digital long-term preservation (LTP) that was carried out in the context of the Finnish National Digital Library Project (NDL) in 2010. The analysis was based on the assumption that as many as 200 archives, libraries, and museums would share an LTP system. The term ‘system’ shall be understood as encompassing not only information technology, but also human resources, organizational structures, policies and funding mechanisms. The cost analysis shows that an LTP system would incur, over the first 12 years, cumulative costs of €42 million, i.e. an average of €3.5 million per annum. Human resources and investments in information technology are the major cost factors. After the initial stages, the analysis predicts annual costs of circa €4 million. The analysis compared scenarios with and without a shared LTP system. The results indicate that a shared system would have remarkable benefits. At the development and implementation stages, a shared system shows an advantage of €30 million over the alternative scenario consisting of five independent LTP solutions. During the later stages, the advantage is estimated at €10 million per annum. The cumulative cost benefit over the first 12 years would amount to circa €100 million.
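The headline figures can be reproduced with simple arithmetic; the sketch below is a back-of-the-envelope check in which the split between development/implementation years and later operational years is an assumption made only to illustrate how the cumulative figure of roughly €100 million arises.

# Back-of-the-envelope check of the headline figures; the stage lengths are
# assumptions for illustration, not values taken from the analysis.
YEARS = 12
cumulative_cost_meur = 42.0
print(cumulative_cost_meur / YEARS)                 # 3.5 MEUR average per annum

one_off_advantage_meur = 30.0                       # development and implementation stages
annual_advantage_meur = 10.0                        # later stages, per annum
assumed_later_years = 7                             # assumption: 30 + 7 * 10 = 100
print(one_off_advantage_meur + assumed_later_years * annual_advantage_meur)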