851 results for Employment (Economic theory)
Abstract:
The policy reform literature is primarily concerned with the construction of reforms that yield welfare gains. By contrast, this paper’s contribution is to develop a theoretical concept for which the focus is upon the sizes of welfare gains accruing from policy reforms rather than upon their signs. In undertaking this task, and by focusing on tariff reforms, we introduce the concept of a steepest ascent policy reform, which is a locally optimal reform in the sense that it achieves the highest marginal gain in utility of any feasible local reform. We argue that this reform presents itself as a natural benchmark for the evaluation of the welfare effectiveness of other popular tariff reforms such as the proportional tariff reduction and the concertina rules, since it provides the maximal welfare gain of all possible local reforms. We derive properties of the steepest ascent tariff reform, construct an index to measure the relative welfare effectiveness of any given tariff reform, determine conditions under which proportional and concertina reforms are locally optimal and provide illustrative examples.
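To fix ideas, a hedged sketch of the notion in generic notation (not necessarily the authors' own): write welfare as a function W(t) of the tariff vector t; the steepest ascent local reform is the unit-norm direction aligned with the welfare gradient, and any other local reform dt can be scored by its projection onto that direction.

```latex
% Illustrative sketch only; W, t and the normalization are assumptions.
\[
  dt^{\mathrm{SA}} \;=\; \frac{\nabla_t W(t)}{\lVert \nabla_t W(t)\rVert},
  \qquad
  e(dt) \;=\; \frac{\nabla_t W(t)\cdot dt}{\lVert \nabla_t W(t)\rVert\,\lVert dt\rVert},
\]
% so the steepest ascent reform attains e = 1, and any other local reform
% (e.g. a proportional cut dt = -\alpha t) scores e(dt) \le 1.
```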
Abstract:
We examine how a multinational's choice to centralize or decentralize its decision structure is affected by country tax differentials. Within a simple model that emphasizes the multiple conflicting roles of transfer prices in multinational enterprises (MNEs)—here, as a strategic precommitment device and a tax manipulation instrument—we show that centralization is more profitable when tax differentials are large. When tax differentials are small, decentralization can be implemented in two different ways, each providing the highest profits over a particular range of the tax differential. Hence, the paper emphasizes the organizational flexibility that MNEs have in pursuing tax optimization.
Abstract:
Animal Spirits is a multi-channel video portrait of key personalities involved in the Global Financial Crisis. The four-screen installation displays these twelve decapitated apostles of free-market economic theory in a tableau of droning pontification. Trapped in a purgatorial loop, they endlessly spout vague and obfuscating explanations and defenses of their ideologies and (in)actions. The work takes a creatively quotidian approach to understanding the language of economics and the financial services industry. Through its endless loop of sound, image, and spoken text, the installation examines some of the ideas, narratives and power dynamics that foster and reward hubris and greed.
Abstract:
This study is divided into two parts: a methodological part and a part which focuses on the saving of households. In the 1950s both the concepts and the household surveys themselves went through rapid change. The development of national accounts was motivated by Keynesian theory, and the 1940s and 1950s were an important period for their development. Before this, saving was understood as cash money or money deposited in bank accounts, but the changes of this era led to the establishment of the modern saving concept. Separately from the development of national accounts, household surveys were established. Household surveys have been conducted in Finland since the beginning of the 20th century. At that time surveys were conducted in order to observe the working-class living standard and, as a result, they were based on the tradition of welfare studies. Another motivation for undertaking the studies was to estimate weights for the consumer price index. A final reason underpinning the government's interest in this data was whether there were any grounds for the working class to become radicalised and adopt revolutionary ideas. As the need for economic analysis increased and the data requirements underlying the political decision-making process expanded, the two traditions, and thus the two data sources, started to integrate. In the 1950s the household surveys were compiled separately from the national accounts and were virtually unaffected by economic theory. The 1966 survey was the first study clearly motivated by national accounts and saving analysis. It also covered the whole population rather than being limited to part of it. It is essential to note that the integration of these two traditions is still continuing. It recently took a big step forward when the Stiglitz, Sen and Fitoussi Committee Report was introduced and the criticism of the current measure of welfare was taken seriously. The Stiglitz report emphasises that the measurement of welfare should focus on households and that both the macro and the micro perspective should be included in the analysis. In this study the national accounts framework is applied to the household survey data from the years 1950-51, 1955-56 and 1959-60. The first two surveys cover the working population of towns and market towns, and the last survey covers the population of rural areas. The analysis is performed at three levels: the macroeconomic level, the meso level, i.e. the level of different types of households, and the micro level, i.e. the level of individual households. As a result it analyses how different households saved and consumed and how this changed during the 1950s.
Abstract:
Ever since its initial introduction some fifty years ago, the rational expectations paradigm has dominated the way economic theory handles uncertainty. The main assertion made by John F. Muth (1961), seen by many as the father of the paradigm, is that expectations of rational economic agents should essentially be equal to the predictions of relevant economic theory, since rational agents should use information available to them in an optimal way. This assumption often has important consequences for the results and interpretations of the models where it is applied. Although the rational expectations assumption can be applied to virtually any economic theory, the focus in this thesis is on macroeconomic theories of consumption, especially the Rational Expectations–Permanent Income Hypothesis proposed by Robert E. Hall in 1978. The much-debated theory suggests that, assuming that agents have rational expectations about their future income, consumption decisions should follow a random walk, and the best forecast of the future consumption level is the current consumption level. Changes in consumption are thus unforecastable. This thesis constructs an empirical test for the Rational Expectations–Permanent Income Hypothesis using Finnish Consumer Survey data as well as various Finnish macroeconomic data. The data sample covers the years 1995–2010. Consumer survey data may be interpreted to directly represent household expectations, which makes it an interesting tool for this particular test. The variable to be predicted is the growth of total household consumption expenditure. The main empirical result is that the Consumer Confidence Index (CCI), a balance figure computed from the most important consumer survey responses, does have statistically significant predictive power over the change in total consumption expenditure. The history of consumption expenditure growth itself, however, fails to predict its own future values. This indicates that the CCI contains some information that the history of consumption decisions does not, and that the consumption decisions are not optimal in the theoretical context. However, when conditioned on various macroeconomic variables, the CCI loses its predictive ability. This finding suggests that the index is merely a (partial) summary of macroeconomic information, and does not contain any significant private information on consumption intentions of households not directly deducible from the objective economic variables. In conclusion, the Rational Expectations–Permanent Income Hypothesis is strongly rejected by the empirical results in this thesis. This result is in accordance with most earlier studies conducted on the topic.
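A hedged sketch of the kind of predictive regression such a test rests on is given below; the series names and the synthetic data are illustrative assumptions, not the thesis's Finnish Consumer Survey or national-accounts data.

```python
# Sketch of a RE-PIH predictability test: regress consumption growth on lagged
# CCI and lagged consumption growth. Under the RE-PIH neither regressor should
# be significant. Data here are synthetic and purely illustrative.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 64                                              # e.g. quarterly obs, 1995-2010
cci = rng.normal(0, 10, n)                          # Consumer Confidence Index (balance figure)
dc = 0.02 * np.roll(cci, 1) + rng.normal(0, 1, n)   # consumption growth, per cent
df = pd.DataFrame({"dc": dc, "cci": cci}).iloc[1:]  # drop the wrapped first observation

X = sm.add_constant(pd.DataFrame({
    "cci_lag1": df["cci"].shift(1),
    "dc_lag1": df["dc"].shift(1),
}))
model = sm.OLS(df["dc"], X, missing="drop").fit(cov_type="HAC", cov_kwds={"maxlags": 4})
print(model.summary())
```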
Abstract:
Traffic congestion is one of the biggest challenges for cities globally. Car traffic and traffic jams cause major problems, and congestion is predicted to worsen in the future. The greenhouse effect has become a severe global environmental threat. At the same time, from the point of view of companies and other economic actors, time and money are lost because of traffic congestion. This work studies possible traffic payment systems for the Helsinki Metropolitan Area, introducing three optional models and concentrating on the point of view of the economic actors. A central part of the work is a questionnaire survey conducted among companies located in the Helsinki area, for which more than 1000 responses were obtained. The study examines the respondents' attitudes towards the area's current traffic system, its development and urban congestion pricing, and the answers are analyzed according to the size, industry and location of the companies. The economic aspect is studied through the economic theory of industrial location and by emphasizing the importance of smoothly running traffic for the business community. Chapter three presents detailed information about traffic congestion: how today's car-centered society was formed, what congestion concretely means for economic life, and how traffic congestion can be limited. It is examined theoretically how urban traffic payment systems work, using examples from London and Stockholm, where successful traffic payment schemes exist. The literature review analyzes urban development, increasing car traffic and the Helsinki Metropolitan Area from a structural point of view. The fourth chapter introduces a case study, which concentrates on the Helsinki Metropolitan Area's different structures, the congestion situation in Helsinki and the introduction of the traffic payment system study. The region is currently experiencing a phase in which big changes are taking place in traffic planning: the traffic systems are being unified to cover the whole region in the future, and new responses to the growing traffic congestion problems are needed. Chapter five concentrates on the questionnaire and theme interviews and introduces the research findings. The respondents' overall opinion of the traffic payments is quite sceptical. Some regional differences were found, and taxi, bus, cargo and transit enterprises in particular held the most negative opinions. Economic actors were worried especially because traffic congestion harms business travel and employees' commuting. According to the respondents, the best of the traffic payment models was the ring model, in which the payment points would be situated inside Ring Road III. Both the company representatives and other key decision makers see public transportation as a good and powerful tool to decrease traffic congestion. The only question that remains is where to find investors willing to invest in public transportation if business representatives do not believe in pricing traffic through, for example, traffic payment systems.
Abstract:
Competition is an immensely important area of study in economic theory, business and strategy. It is known to be vital in meeting consumers' growing expectations, stimulating growth in the size of the market, pushing innovation, reducing cost and consequently generating better value for end users, among other things. That said, it is important to recognize that supply chains, as we know them, have changed the way companies deal with each other, in both confrontational and conciliatory terms. With the rise of global markets and outsourcing destinations, increased technological development in transportation, communication and telecommunications has meant that geographical barriers of distance with regard to competition are a thing of the past in an increasingly flat world. Even though the dominant articulation of competition within the management and business literature rests mostly within economic competition theory, this thesis draws attention to the implicit shift in the recognition of other forms of competition in today's business environment, especially those involving supply chain structures. There is thus broad agreement in the business arena that competition between companies is set to take place along their supply chains. Hence, management's attention has been focused on how supply chains could become more aggressive, making each firm in the chain more efficient. However, there is much disagreement on the mechanism through which such competition, pitching supply chain against supply chain, will take place. The purpose of this thesis, therefore, is to develop and conceptualize the notion of supply chain vs. supply chain competition within the discipline of supply chain management. The thesis proposes that competition between supply chains may be carried forward via the use of competition theories that emphasize interaction and dimensionality, with chains encountering friction from a number of sources in their search for critical resources and services. The thesis demonstrates how supply chain vs. supply chain competition may be carried out theoretically, using generated data for illustration, and practically, using logistics centers as a way to provide a link between theory and the corresponding practice of this evolving competition mode. The thesis concludes that supply chain vs. supply chain competition, whatever conceptualization is taken, is complex, novel and can very easily be distorted and abused. It therefore calls for the joint development of regulatory measures by practitioners and policymakers alike to guide this developing mode of competition.
Abstract:
The paper is a general presentation of the analysis of the economic circulation developed by B. Lonergan who, starting from philosophical presuppositions different from those implicit in the dominant economic theory, developed a systemic approach to economic problems that reveals certain shortcomings in current measurements. Indeed, the variables of interest are modified when the fundamental categories of analysis change. The first section sets out the realist gnoseological and epistemological assumptions developed by this author. These foundations make it possible to recognize more clearly the weak bases of current approaches. After a brief enumeration of some criticisms of the prevailing paradigm made by various authors, the general lines of the scheme proposed by Lonergan are presented.
Abstract:
The central theme of the article is the importance of the Church's intervention in socio-economic matters. The author starts from two questions: why the Social Doctrine of the Church was established at the end of the 19th century and not earlier, and what reasons led to the break between the Social Doctrine of the Church and the dominant economic theory. To answer these questions, Pasinetti goes back to the beginnings of Christianity and carries out a historical analysis of the development of economic theory up to the proclamation of the encyclical Rerum Novarum in 1891. The author explains that this doctrinal corpus emerged as the result of three historical events: the Industrial Revolution, the impact of Karl Marx's work, and the failure to formulate an economic theory capable of solving the problems of a new world. The Social Doctrine of the Church, then, is called to overcome these difficulties, since it possesses the tools necessary to do so.
Abstract:
Revised 2008-08. Published as an article in: Journal of Public Economic Theory (2008), 10(4), 563-594.
Abstract:
This study is concerned with the measurement of total factor productivity in the marine fishing industries in general and in the Pacific coast trawl fishery in particular. The study is divided into two parts. Part I contains suitable empirical and introductory theoretical material for the examination of productivity in the Pacific coast trawl fleet. It is self-contained, and contains the basic formulae, empirical results, and discussion. Because the economic theory of index numbers and productivity is constantly evolving and is widely scattered throughout the economics literature, Part II draws together the theoretical literature into one place to allow ready access for readers interested in more details. The major methodological focus of the study is upon the type of economic index number that is most appropriate for use by economists with the National Marine Fisheries Service. This study recommends that the following types of economic index numbers be used: chain rather than fixed base; bilateral rather than multilateral; and one of the class of superlative indices, such as the Tornqvist or Fisher Ideal. (PDF file contains 40 pages.)
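For reference, the two superlative formulas named in the abstract are standard in the index-number literature; the notation below (prices p, quantities q, revenue shares s) is generic rather than taken from the study.

```latex
% Tornqvist quantity index, chained between periods t-1 and t:
\[
  \ln \frac{Q^{T}_{t}}{Q^{T}_{t-1}}
  = \sum_{i} \tfrac{1}{2}\bigl(s_{i,t-1} + s_{i,t}\bigr)\,
    \ln \frac{q_{i,t}}{q_{i,t-1}},
  \qquad
  s_{i,t} = \frac{p_{i,t}\, q_{i,t}}{\sum_{j} p_{j,t}\, q_{j,t}} .
\]
% Fisher ideal quantity index: the geometric mean of the Laspeyres and Paasche indices,
\[
  Q^{F}_{t}
  = \sqrt{\frac{\sum_i p_{i,t-1}\, q_{i,t}}{\sum_i p_{i,t-1}\, q_{i,t-1}}
          \cdot
          \frac{\sum_i p_{i,t}\, q_{i,t}}{\sum_i p_{i,t}\, q_{i,t-1}}}\, .
\]
```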
Abstract:
Government procurement of a new good or service is a process that usually includes basic research, development, and production. Empirical evidence indicates that investments in research and development (R and D) before production are significant in many defense procurements. Thus, optimal procurement policy should be not only to select the most efficient producer, but also to induce the contractors to design the best product and to develop the best technology. It is difficult to apply the current economic theory of optimal procurement and contracting, which has emphasized production but ignored R and D, to many cases of procurement.
In this thesis, I provide basic models of both R and D and production in the procurement process, where a number of firms invest in private R and D and compete for a government contract. R and D is modeled as a stochastic cost-reduction process. The government is considered both as a profit maximizer and as a procurement cost minimizer. In comparison to the literature, the following results derived from my models are significant. First, R and D matters in procurement contracting: when offering the optimal contract, the government will be better off if it correctly takes into account costly private R and D investment. Second, competition matters: the optimal contract and the total equilibrium R and D expenditures vary with the number of firms, and the government usually does not prefer infinite competition among firms; instead, it prefers free entry of firms. Third, under an R and D technology with constant marginal returns to scale, it is socially optimal to have only one firm conduct all of the R and D and production. Fourth, in an independent private values environment with risk-neutral firms, an informed government should select one of four standard auction procedures with an appropriate announced reserve price, acting as if it did not have any private information.
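A minimal numerical sketch of this kind of environment appears below: symmetric firms sink an R and D outlay, draw production costs whose mean falls with that outlay, and the contract is awarded through a second-price auction with a reserve price. All functional forms and parameter values are illustrative assumptions, not the thesis's model.

```python
# Illustrative R&D-then-auction procurement sketch; everything here is assumed.
import numpy as np

rng = np.random.default_rng(1)

def simulate(n_firms=4, rd_invest=1.0, reserve=9.0, n_draws=10_000):
    """Average payment by the government and ex ante firm payoff when each of
    n_firms sinks rd_invest in cost-reducing R&D and the contract is awarded
    by a second-price auction with a reserve price."""
    base_cost = 10.0
    # Expected production cost falls with R&D (diminishing returns), plus noise.
    mean_cost = base_cost - 2.0 * np.sqrt(rd_invest)
    costs = rng.normal(mean_cost, 1.0, size=(n_draws, n_firms)).clip(min=0.0)

    sorted_costs = np.sort(costs, axis=1)
    lowest, second = sorted_costs[:, 0], sorted_costs[:, 1]
    awarded = lowest <= reserve                 # no award if even the best cost exceeds the reserve
    price_paid = np.minimum(second, reserve)    # second-price rule capped by the reserve

    expected_payment = np.where(awarded, price_paid, np.nan)
    margin = np.where(awarded, price_paid - lowest, 0.0)
    # Symmetric firms: each wins with probability ~1/n and always sinks rd_invest.
    firm_payoff = margin.mean() / n_firms - rd_invest
    return np.nanmean(expected_payment), firm_payoff

for k in (2, 4, 8):   # how the number of competing firms shifts outcomes
    pay, pi = simulate(n_firms=k)
    print(f"firms={k}: expected payment={pay:.2f}, ex ante firm payoff={pi:.2f}")
```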
Abstract:
Health is a very important aspect of any person's life, so when a contingency occurs that impairs the health of an individual or group of people, the different alternatives for combating the illness must be assessed strictly and in detail, because the patients' quality of life will vary depending on the alternative chosen. Health-related quality of life (HRQoL) is understood as the value assigned to the duration of life, modified by the social opportunity, perception, functional status and impairment caused by an illness, accident, treatment or policy (Sacristán et al, 1995). To determine the numerical value assigned to HRQoL for a given intervention, we must draw on the economic theory applied to the health-economic evaluation of new interventions. Among the methods of health economic evaluation, the cost-utility method uses as its utility measure quality-adjusted life years (QALYs), which consist, on the one hand, of taking into account the quality of life given a medical intervention and, on the other hand, the years the patient is expected to live after the intervention. To determine quality of life, techniques such as the Standard Gamble, the Time Trade-Off and the Rating Scale are used. These techniques provide a numerical value between 0 and 1, where 0 is the worst state and 1 is perfect health. When interviewing a patient about utility in terms of health, there may be risk or uncertainty in the question posed. In that case, the Standard Gamble is applied in order to determine the numerical value of the utility, or quality of life, of the patient under a given treatment. To obtain this value, the patient is presented with two scenarios: first, a health state with a probability of dying and a probability of surviving, and second, a state of certainty. Utility is determined by varying the probability of dying until reaching the probability at which the individual is indifferent between the risky state and the certain state. Similarly, we have the Time Trade-Off, which is easier to apply than the Standard Gamble since it values, on a pair of axes, the value of health and the time to be spent in that situation under a health treatment, so that the value corresponding to quality of life is reached by varying the time until the individual is indifferent between the two alternatives. Finally, if what is expected from the patient is a ranked list of health states preferred under a treatment, we use the Rating Scale, which consists of a 10-centimetre horizontal line with scores from 0 to 100. The interviewee places the list of health states in order of preference on the scale, which is then normalized to an interval between 0 and 1. Quality-adjusted life years are obtained by multiplying the quality-of-life value by the estimated number of years the patient will live. However, none of these methodologies considers the age factor, making it necessary to include this variable. Moreover, patients may answer subjectively, a situation that requires the opinion of an expert to determine the patient's level of disability. In this way, the concept of disability-adjusted life years (DALYs) is introduced, such that the utility parameter of the QALYs is the complement of the disability parameter of the DALYs, Q^i = 1 - D^i, even though the latter incorporates age-weighting parameters that are not contemplated in QALYs. Furthermore, under the assumption Q = 1 - D, we can determine the individual's quality of life before treatment. Once the QALYs gained have been obtained, we proceed to their monetary valuation. To do so, we start from the assumption that the health intervention allows the individual to return to the work he or she had been doing, so we value the probable wages over a period equal to the QALYs gained, bearing in mind the limitation involved in applying this approach. Finally, we analyze the benefits derived from the treatment (the probable wage bill) using the GRF-95 table (female population) and the GRM-95 table (male population).
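A hedged numerical sketch of the QALY logic summarized above follows; the elicited utility weights, disability weight, wage, discount rate and life expectancy are invented inputs, not values from the study or from the GRF-95 / GRM-95 tables.

```python
# Sketch of QALY elicitation, QALYs gained, and a wage-based valuation.
def qaly_standard_gamble(p_indifference: float) -> float:
    """Standard Gamble: the utility of the chronic state equals the success
    probability p at which the patient is indifferent between the gamble
    (perfect health with prob p, death with prob 1-p) and the certain state."""
    return p_indifference

def qaly_time_tradeoff(x_years: float, t_years: float) -> float:
    """Time Trade-Off: utility = x / t, where the patient is indifferent
    between x years in full health and t years in the impaired state."""
    return x_years / t_years

def qalys_gained(q_after: float, q_before: float, life_years: float) -> float:
    return (q_after - q_before) * life_years

def wage_valuation(qalys: float, annual_wage: float, discount: float = 0.03) -> float:
    """Value the QALYs gained as a discounted stream of probable wages over an
    equivalent number of years (the human-capital style shortcut described)."""
    whole_years = int(qalys)
    tail = qalys - whole_years
    pv = sum(annual_wage / (1 + discount) ** t for t in range(1, whole_years + 1))
    pv += tail * annual_wage / (1 + discount) ** (whole_years + 1)
    return pv

q_before = 1 - 0.45                    # assumed disability weight D = 0.45, so Q = 1 - D
q_after = qaly_standard_gamble(0.85)   # assumed indifference probability of 0.85
gained = qalys_gained(q_after, q_before, life_years=12.0)
print(f"QALYs gained: {gained:.2f}, wage valuation: {wage_valuation(gained, 30_000):.0f}")
```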
Abstract:
Over the last three decades, Brazil has recorded more than one million homicide deaths, thus reaching the sad position of having the 18th highest rate of violent deaths in the world (GENEVA DECLARATION ON ARMED VIOLENCE AND DEVELOPMENT, 2011). To address this problem, various private and public efforts have been made, the Estatuto do Desarmamento being one of the most prominent. However, although more than ten years have passed since this legislation was enacted, the economic literature on crime is still not unanimous about the effects of firearms on violent crime. In order to analyze these effects, this dissertation investigates the different approaches of the economic theory of crime and develops a theoretical model capable of supporting the empirical analysis. This analysis, in turn, evaluates the relationships between firearms and firearm homicides in Brazil and in the State of Rio Grande do Sul by means of panel Vector Autoregressions. From the results obtained, it is concluded that the effects between guns and homicides vary according to local heterogeneities and cannot be extrapolated.
Analysis of the sugar-ethanol market and of the price and income elasticities of demand for hydrous ethanol
Abstract:
This study aims to examine the fuel ethanol market and estimate the price and income elasticities of demand for hydrous ethanol in Brazil over the period from January 2003 to September 2012. The econometric method used to analyze the data and obtain the estimates of the parameters of the demand equations was the Johansen methodology. Based on demand theory, the explanatory variables initially included were the price of the good, the price of the substitute good, and income; in a second step the vehicle fleet was also included, thus estimating the VAR/VEC model. The results were significant and in line with economic theory, leading us to conclude that the demand for ethanol is quite elastic with respect to the price of ethanol and the price of gasoline. Before developing the model, we analyze the sugarcane market, the ethanol market, and the market for the commodity that competes with ethanol, sugar, highlighting the concern with the environment and the importance of ethanol as renewable energy.
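A hedged sketch of the Johansen/VECM workflow described above is given below; the synthetic series, lag choice and assumed cointegration rank are illustrative assumptions, not the study's monthly Brazilian data or its estimated specification.

```python
# Johansen cointegration test followed by VECM estimation on synthetic data.
import numpy as np
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import coint_johansen, VECM

rng = np.random.default_rng(0)
n = 117                                   # roughly Jan 2003 - Sep 2012, monthly
trend = np.cumsum(rng.normal(0, 1, n))    # shared stochastic trend
data = pd.DataFrame({
    "ln_qty_ethanol": trend + rng.normal(0, 0.3, n),   # hydrous ethanol demand (log)
    "ln_p_ethanol":  -0.5 * trend + rng.normal(0, 0.3, n),
    "ln_p_gasoline":  0.4 * trend + rng.normal(0, 0.3, n),
    "ln_income":      0.2 * trend + rng.normal(0, 0.3, n),
})

# Johansen cointegration test (constant term, 2 lagged differences).
jres = coint_johansen(data, det_order=0, k_ar_diff=2)
print("trace statistics:", jres.lr1)
print("5% critical values:", jres.cvt[:, 1])

# Estimate the VECM with an assumed cointegration rank of 1; long-run
# elasticities are read off the normalized cointegrating vector in beta.
vecm_res = VECM(data, k_ar_diff=2, coint_rank=1, deterministic="co").fit()
print("cointegrating vector (beta):")
print(vecm_res.beta)
```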