975 results for Three parameters


Relevance: 70.00%

Abstract:

Introduction

In my thesis I argue that economic policy is about both economics and politics. Consequently, analysing and understanding economic policy ideally has at least two parts. The economics part is centred on the expected impact of a specific policy on the real economy, in terms of both efficiency and equity; its insights indicate the direction in which the fine-tuning of economic policies should go. Such fine-tuning, however, will most likely be subject to political constraints. That is why, in the politics part, a much better understanding can be gained by taking into account how the incentives of politicians and special interest groups, as well as the role played by different institutional features, affect the formation of economic policies.

The first part and chapter of my thesis concentrates on the efficiency-related impact of economic policies: how does corporate income taxation in general, and corporate income tax progressivity in particular, affect the creation of new firms? Reduced progressivity and flat-rate taxes are in vogue: by 2009, 22 countries were operating flat-rate income tax systems, as were 7 US states and 14 Swiss cantons (for corporate income only). Tax reform proposals in the spirit of the "flat tax" model typically aim to reduce three parameters: the average tax burden, the progressivity of the tax schedule, and the complexity of the tax code. In joint work, Marius Brülhart and I explore the implications of changes in these three parameters for entrepreneurial activity, measured by counts of firm births in a panel of Swiss municipalities. Our results show that lower average tax rates and reduced complexity of the tax code promote firm births. Controlling for these effects, reduced progressivity inhibits firm births. Our reading of these results is that tax progressivity has an insurance effect that facilitates entrepreneurial risk taking. The positive effects of lower tax levels and reduced complexity are estimated to be significantly stronger than the negative effect of reduced progressivity. To the extent that firm births reflect desirable entrepreneurial dynamism, it is not the flattening of tax schedules that is key to successful tax reform, but the lowering of average tax burdens and the simplification of tax codes. Flatness per se is of secondary importance and even appears to be detrimental to firm births.

The second part of my thesis, corresponding to the second and third chapters, concentrates on how economic policies are formed. By the nature of the analysis, these two chapters draw on a broader literature than the first. Both economists and political scientists have done extensive research on how economic policies are formed, and researchers in both disciplines have recognised the importance of special interest groups trying to influence policy-making through various channels. In general, economists base their analysis on a formal, microeconomically founded approach while abstracting from institutional details; political scientists' frameworks are generally richer in institutional features but lack the theoretical rigour of economists' approaches. I start from the economist's point of view, but I borrow as much as possible from the findings of political science to gain a better understanding of how economic policies are formed in reality.

In the second chapter, I take a theoretical approach and focus on the institutional policy framework to explore how interactions between different political institutions affect the outcome of trade policy in the presence of special interest groups' lobbying. Standard political economy theory treats the government as a single institutional actor that sets tariffs by trading off social welfare against contributions from special interest groups seeking industry-specific protection from imports. These models, however, lack important institutional features of reality. In my model, I therefore split the government into a legislative and an executive branch, both of which can be lobbied by special interest groups. Furthermore, the legislature has the option to delegate its trade policy authority to the executive, and I allow the executive to compensate the legislature in exchange for delegation. Despite ample anecdotal evidence, bargaining over the delegation of trade policy authority has not yet been formally modelled in the literature. I show that delegation affects policy formation in that it leads to lower equilibrium tariffs than in a standard model without delegation. I also show that delegation only takes place if the lobby is not strong enough to prevent it. Furthermore, the option to delegate increases the bargaining power of the legislature at the expense of the lobbies. These findings can therefore shed light on why the U.S. Congress often delegates trade policy to the executive.

In the final chapter of my thesis, my coauthor, Antonio Fidalgo, and I take a narrower approach and focus on the individual politician's level of policy-making to explore how connections to private firms and networks within parliament affect individual politicians' decision-making. Theories in the spirit of the model in the second chapter show how campaign contributions from lobbies to politicians can influence economic policies, and an abundant empirical literature analyses ties between firms and politicians based on campaign contributions. However, the evidence on the impact of campaign contributions is mixed at best. In our paper, we analyse an alternative channel of influence: personal connections between politicians and firms through board membership. We identify a direct effect of board membership on individual politicians' voting behaviour and an indirect leverage effect when politicians with board connections influence non-connected peers. We assess the importance of these two effects using a vote in the Swiss parliament on a government bailout of the national airline, Swissair, in 2001, which serves as a natural experiment. We find that both the direct effect of connections to firms and the indirect leverage effect had a strong and positive impact on the probability that a politician supported the government bailout.
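
As an illustration of the empirical setup in the first chapter, the following is a minimal sketch, not the authors' code: a Poisson count regression of municipal firm births on the three tax parameters. All variable names and numeric values are hypothetical placeholders, with coefficient signs chosen to match the reported findings.

```python
# A minimal sketch (not the authors' code): a Poisson count regression of
# firm births on the three tax parameters named in the abstract.
# Variable names (tax_level, progressivity, complexity) are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500  # hypothetical municipality-year observations
df = pd.DataFrame({
    "tax_level": rng.uniform(0.1, 0.3, n),       # average tax burden
    "progressivity": rng.uniform(0.0, 0.2, n),   # slope of the tax schedule
    "complexity": rng.integers(1, 10, n),        # e.g. length of the tax code
})
# Simulated outcome with signs as reported: level and complexity depress
# firm births, progressivity (insurance effect) raises them.
lam = np.exp(2 - 4 * df["tax_level"] + 2 * df["progressivity"]
             - 0.05 * df["complexity"])
df["firm_births"] = rng.poisson(lam)

X = sm.add_constant(df[["tax_level", "progressivity", "complexity"]])
model = sm.GLM(df["firm_births"], X, family=sm.families.Poisson()).fit()
print(model.summary())
```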

Relevance: 70.00%

Abstract:

For data obtained from horizontal soil column experiments, the determination of soil-water transport characteristics and functions would be aided by a single-form equation capable of objectively describing water content θ vs. time t at a given position x_f. Our study was conducted to evaluate two such possible equations, one having the form of the Weibull frequency distribution and the other being called a bipower form. Each equation contained three parameters and was fitted by nonlinear least squares to the experimental data from three separate columns of a single soil. Across the θ range containing the measured data points obtained by gamma-ray attenuation, the two equations were in close agreement. The resulting family of θ(x_f, t) transients, as obtained from either equation, enabled the evaluation of the exponent n in the t^n dependence of the positional advance of a given θ. Not only was n found to be < 0.5 at low θ values, but it also increased with θ and tended toward 0.5 as θ approached its sated (near-saturated) value. Some quantitative uncertainty in n(θ) does arise from the reduced number of data points available at the higher water contents. Without claiming non-Boltzmann behavior (n < 0.5) as necessarily representative of all soils, we nonetheless consider n(θ) to be worthy of further study to evaluate its significance and implications.
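
As a concrete illustration of the fitting procedure, the sketch below assumes a Weibull-type form θ(t) = θ_s(1 − exp(−(t/b)^c)); the abstract does not give the exact equation or the data, so both are placeholders.

```python
# A minimal sketch (assumed functional form, not the authors' exact equation):
# fitting a three-parameter Weibull-type curve theta(t) at a fixed position
# x_f by nonlinear least squares, as the abstract describes.
import numpy as np
from scipy.optimize import curve_fit

def weibull_form(t, theta_s, b, c):
    """Water content vs. time; theta_s, b, c are the three fitted parameters."""
    return theta_s * (1.0 - np.exp(-(t / b) ** c))

# Hypothetical gamma-ray attenuation data: times (h) and water contents.
t_data = np.array([0.5, 1, 2, 4, 8, 16, 32], dtype=float)
theta_data = np.array([0.08, 0.12, 0.18, 0.25, 0.31, 0.35, 0.37])

popt, _ = curve_fit(weibull_form, t_data, theta_data, p0=(0.4, 5.0, 1.0))
print("theta_s, b, c =", popt)

# With a family of fitted theta(x_f, t) transients, the exponent n in the
# x_f ~ t**n advance of a given theta can be read off a log-log slope.
```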

Relevance: 70.00%

Abstract:

For human beings, the origin of life has always been an interesting and mysterious matter, particularly how life arose from inorganic matter through natural processes. Polymerization is always involved in such processes. In this paper we build what we refer to as ideal and physical models to simulate spontaneous polymerization based on certain physical principles. As the modeling confirms, without any input of external energy, small and simple inorganic molecules formed bigger and more complicated molecules, which are necessary ingredients of all living organisms. In our simulations, we used actual ranges of parameters according to their experimentally observed values, and the results agreed well with the nature of polymerization. After sorting through all the models that were built, we arrived at a final model that, it is hoped, can simply and efficiently describe spontaneous polymerization using only three parameters: the dipole moment, the distance between molecules, and the temperature.
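
One plausible way to see how exactly these three parameters can govern an interaction model, though not necessarily the paper's own formulation, is the thermally averaged Keesom dipole-dipole energy, which depends only on the dipole moments, the intermolecular distance, and the temperature:

```python
# A minimal illustration (not the paper's model): the thermally averaged
# Keesom dipole-dipole energy depends on exactly the three parameters the
# abstract names: dipole moment, intermolecular distance, and temperature.
import math

K_B = 1.380649e-23        # Boltzmann constant, J/K
EPS0 = 8.8541878128e-12   # vacuum permittivity, F/m

def keesom_energy(mu1, mu2, r, T):
    """Average interaction energy (J) of two freely rotating dipoles.
    mu1, mu2 in C*m, separation r in m, temperature T in K."""
    return -(2.0 / 3.0) * (mu1 * mu2) ** 2 / (
        (4 * math.pi * EPS0) ** 2 * K_B * T * r ** 6)

# Example: two water-like dipoles (1.85 D each) at 0.4 nm and 300 K.
DEBYE = 3.33564e-30  # C*m per debye
u = keesom_energy(1.85 * DEBYE, 1.85 * DEBYE, 0.4e-9, 300.0)
print(f"Keesom energy: {u:.3e} J ({u / (K_B * 300):.2f} kT)")
```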

Relevance: 70.00%

Abstract:

Human identification from a skull is a critical process in legal and forensic medicine, especially when no other means are available. Traditional clay-based methods attempt to generate the human face in order to identify the corresponding person. However, these reconstructions lack objectivity and consistency, since they depend on the practitioner. Current computerized techniques are based on facial models, which introduce undesired facial features when the final reconstruction is built. This paper presents an objective 3D craniofacial reconstruction technique, implemented in a graphic application, that does not use any facial template. The only information required by the software tool is the 3D image of the target skull and three parameters: the age, gender and Body Mass Index (BMI) of the individual. Complexity is minimized, since the application database consists only of the anthropological information provided by soft tissue depth values at a set of points on the skull.
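
A minimal sketch of the reconstruction idea as we read it: offset each skull landmark along its outward normal by the soft tissue depth looked up for the given parameters. The depth table, landmark names and coordinates below are hypothetical, not the application's database.

```python
# A minimal sketch (hypothetical data and names, not the paper's database):
# skin surface points are estimated by offsetting skull landmarks along
# their outward normals by tabulated soft tissue depths.
import numpy as np

# Hypothetical depth table (mm): (gender, BMI class) -> depth per landmark;
# a real table would also be indexed by age group.
DEPTHS = {
    ("F", "normal"): {"glabella": 5.2, "nasion": 6.1, "pogonion": 9.8},
    ("M", "normal"): {"glabella": 5.6, "nasion": 6.5, "pogonion": 10.9},
}

def place_skin_points(landmarks, normals, gender, bmi_class):
    """Offset each landmark (mm coordinates) along its unit normal by the
    tabulated soft tissue depth, giving estimated skin surface points."""
    table = DEPTHS[(gender, bmi_class)]
    return {name: np.asarray(landmarks[name], float)
                  + table[name] * np.asarray(normals[name], float)
            for name in table}

skull = {"glabella": (0, 92, 60), "nasion": (0, 90, 52), "pogonion": (0, 98, -45)}
normals = {"glabella": (0, 1, 0), "nasion": (0, 1, 0), "pogonion": (0, 1, 0)}
print(place_skin_points(skull, normals, "F", "normal"))
```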

Relevance: 60.00%

Abstract:

The solubility of caffeine in supercritical CO2 was studied by assessing the effects of pressure and temperature on the extraction of green coffee oil (GCO). The Peng-Robinson [1] equation of state was used to correlate the solubility of caffeine with a thermodynamic model, and two mixing rules were evaluated: the classical van der Waals mixing rule with two adjustable parameters (PR-VDW), and a density-dependent rule proposed by Mohamed and Holder [2] with two (PR-MH, both parameters adjusted in the attractive term) and three (PR-MH3, two parameters adjusted in the attractive term and one in the repulsive term) adjustable parameters. The best results were obtained with the three-parameter mixing rule of Mohamed and Holder [2].
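
For reference, a minimal sketch of the classical van der Waals mixing rule with two adjustable binary parameters (the PR-VDW variant named above); all numerical values are hypothetical placeholders, not the study's fitted parameters.

```python
# A minimal sketch (assumed notation, hypothetical numbers) of the classical
# van der Waals mixing rule with two adjustable binary parameters k_ij, l_ij.
import numpy as np

def vdw_mixing(x, a, b, kij, lij):
    """Mixture attractive (a_mix) and covolume (b_mix) parameters.
    x: mole fractions; a, b: pure-component PR parameters;
    kij, lij: symmetric binary interaction matrices (the adjusted values)."""
    x, a, b = np.asarray(x), np.asarray(a), np.asarray(b)
    a_ij = np.sqrt(np.outer(a, a)) * (1.0 - kij)           # a_ij = sqrt(ai*aj)(1-kij)
    b_ij = 0.5 * (b[:, None] + b[None, :]) * (1.0 - lij)   # b_ij = (bi+bj)/2 (1-lij)
    return x @ a_ij @ x, x @ b_ij @ x

# Example: dilute caffeine (2) in CO2 (1); a in Pa*m^6/mol^2, b in m^3/mol,
# all values illustrative only.
kij = np.array([[0.0, 0.12], [0.12, 0.0]])
lij = np.array([[0.0, -0.05], [-0.05, 0.0]])
print(vdw_mixing([0.999, 0.001], a=[0.40, 3.0], b=[2.7e-5, 1.4e-4],
                 kij=kij, lij=lij))
```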

Relevance: 60.00%

Abstract:

Small-angle X-ray scattering (SAXS) images of normal breast tissue and of benign and malignant breast tumour tissues, fixed in formalin, were measured over the momentum transfer range 0.063 nm^-1 ≤ q ≤ 2.720 nm^-1, where q = 4π sin(θ/2)/λ. Four intrinsic parameters were extracted from the scattering profiles (the SAXS images reduced to 1D), and from combinations of these parameters another three parameters were created. All parameters, intrinsic and derived, were subjected to discriminant analysis, and it was verified that parameters such as the area of diffuse scatter over the momentum transfer range 0.50 ≤ q ≤ 0.56 nm^-1, the ratio between the areas of the fifth-order axial and third-order lateral peaks, and the third-order axial spacing provide the most significant information for diagnosis (p < 0.001). Combining these three parameters made it possible to classify human breast tissue as normal, benign lesion or malignant lesion with a sensitivity of 83% and a specificity of 100%.
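
As an illustration of the classification step, here is a minimal sketch of linear discriminant analysis on three such features, using synthetic data rather than the study's measurements:

```python
# A minimal sketch (synthetic data, not the study's measurements): linear
# discriminant analysis on three scattering-profile features to classify
# tissue as normal, benign or malignant, mirroring the abstract's approach.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(1)
classes = ["normal", "benign", "malignant"]
# Three hypothetical features per sample (diffuse-scatter area, peak-area
# ratio, axial spacing), drawn around invented class-specific means.
means = {"normal": (1.0, 0.5, 6.5), "benign": (1.4, 0.8, 6.3),
         "malignant": (2.1, 1.3, 6.0)}
X = np.vstack([rng.normal(means[c], 0.15, size=(30, 3)) for c in classes])
y = np.repeat(classes, 30)

lda = LinearDiscriminantAnalysis().fit(X, y)
print("training accuracy:", lda.score(X, y))
```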

Relevance: 60.00%

Abstract:

The procedure of on-line process control by attributes, known as Taguchi's on-line process control, consists of inspecting the m-th item (a single item) for every m items produced and deciding, at each inspection, whether the fraction of conforming items has been reduced or not. If the inspected item is nonconforming, production is stopped for adjustment. As the inspection system can be subject to diagnosis errors, we develop a probabilistic model that classifies the examined item repeatedly until either a 'conforming' or b 'non-conforming' classifications are observed; whichever event occurs first determines the final classification of the item. Properties of an ergodic Markov chain were used to obtain an expression for the average cost of the control system, which can be optimized over three parameters: the sampling interval of the inspections (m), the number of repeated conforming classifications (a), and the number of repeated non-conforming classifications (b). The optimum design is compared with two alternative approaches. The first is a simple preventive policy: the production system is adjusted after every n items produced, and no inspection is performed. The second classifies the examined item a fixed number of times, r, and considers it conforming if the majority of the classification results are conforming. Results indicate that the current proposal performs better than the fixed-repetition majority-vote procedure. On the other hand, depending on the degree of errors and the costs, the preventive policy can on average be more economical than the alternatives that require inspection. A numerical example illustrates the proposed procedure.
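
A minimal sketch of the repeat-until-a-or-b classification rule as we read it: the probability that an item is finally declared conforming, given a per-classification probability p of a 'conforming' result, computed by dynamic programming over the running counts.

```python
# A minimal sketch (our own illustration, not the paper's cost model):
# probability that an item is finally classified as conforming under the
# repeat-until-a-or-b rule, given the per-classification probability p of a
# "conforming" result.
from functools import lru_cache

def prob_declared_conforming(p: float, a: int, b: int) -> float:
    """P(a 'conforming' results accumulate before b 'non-conforming' do)."""
    @lru_cache(maxsize=None)
    def f(c: int, nc: int) -> float:
        if c == a:
            return 1.0
        if nc == b:
            return 0.0
        return p * f(c + 1, nc) + (1 - p) * f(c, nc + 1)
    return f(0, 0)

# Example: a conforming item is seen correctly 90% of the time, rule (a=2, b=3).
print(prob_declared_conforming(0.9, a=2, b=3))  # ~0.9963
```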

Relevance: 60.00%

Abstract:

BACKGROUND AND PURPOSE: Bacterial lipopolysaccharide (LPS) induces fever through two parallel pathways: one prostaglandin (PG)-dependent, the other PG-independent and involving endothelin-1 (ET-1). For a better understanding of the mechanisms by which dipyrone exerts antipyresis, we investigated its effects on fever and on the changes in PGE2 content in plasma, CSF and hypothalamus induced by either LPS or ET-1. EXPERIMENTAL APPROACH: Rats were given (i.p.) dipyrone (120 mg·kg^-1) or indomethacin (2 mg·kg^-1) 30 min before injection of LPS (5 µg·kg^-1, i.v.) or ET-1 (1 pmol, i.c.v.). Rectal temperature was measured by tele-thermometry. PGE2 levels were determined in the plasma, CSF and hypothalamus by ELISA. KEY RESULTS: LPS or ET-1 induced fever and increased CSF and hypothalamic PGE2 levels. Two hours after LPS, indomethacin reduced CSF and hypothalamic PGE2 but did not inhibit fever, while at 3 h it reduced all three parameters. Three hours after ET-1, indomethacin inhibited the increase in CSF and hypothalamic PGE2 levels but did not affect fever. Dipyrone abolished both the fever and the increased CSF PGE2 levels induced by LPS or ET-1 but did not affect the increased hypothalamic PGE2 levels. Dipyrone also reduced the LPS-induced increase in venous plasma PGE2 concentration. CONCLUSIONS AND IMPLICATIONS: These findings confirm that PGE2 does not play a relevant role in ET-1-induced fever. They also demonstrate for the first time that the antipyretic effect of dipyrone is not mechanistically linked to the inhibition of hypothalamic PGE2 synthesis.

Relevance: 60.00%

Abstract:

The geographical distribution of a taxon is limited by ecological and historical factors. Many human activities have modified vegetation cover, leading to habitat fragmentation and loss. This has driven local populations of many species to extinction, altering their geographical distributions. Among them are the two species of the genus Brachyteles (the muriquis), primates endemic to one of the biomes most affected by these processes, the Atlantic Forest. The International Union for Conservation of Nature (IUCN) is an organization that seeks to conserve biodiversity. Among other criteria, it uses knowledge of species' restricted geographical distributions to classify them into threat categories in the so-called red lists. To do so, it uses spatial parameters whose values indicate the extinction risk of a given taxon in relation to its geographical distribution. These parameters are often calculated subjectively, so it is important to seek methods that make the classifications more objective, precise and replicable. In this context, the present work tested different methods of calculating three parameters related to the geographical distribution of B. hypoxanthus and B. arachnoides. Both are endangered species with well-known occurrence localities that have been deeply affected by the degradation of the Atlantic Forest, so they can be considered good models for these analyses. A database of current occurrence localities of the two species was compiled. Using Geographic Information System (GIS) approaches, the Extent of Occurrence (EOO) was estimated via the Minimum Convex Polygon and the α-hull, and the Area of Occupancy (AOO) and Subpopulations were estimated via grid, circular buffer and α-hull methods, at different spatial scales. The results of these calculations were compared to identify the most suitable approaches and scales for extinction risk assessment. The results indicate that the locality lists and distribution maps made available by the IUCN need updating. They also suggest that the α-hull is an advantageous approach for EOO, and that the buffer method is more suitable for the AOO and Subpopulation parameters when smaller spatial scales are used. The GeoCAT tool was also applied to both species; because it performs instantaneous EOO and AOO analyses and its results are similar to those of the other analyses, it serves as a preliminary assessment of extinction risk based on the geographical distribution criterion.
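
As an illustration of the two simplest parameters, here is a minimal sketch (hypothetical coordinates, not the muriqui records) computing EOO as the area of the minimum convex polygon and AOO as the count of occupied 2 x 2 km grid cells:

```python
# A minimal sketch (hypothetical coordinates): EOO as the area of the minimum
# convex polygon over occurrence points, and AOO from occupied 2x2 km grid
# cells, two of the IUCN-style measures compared in the abstract.
from shapely.geometry import MultiPoint

# Hypothetical projected occurrence records (metres, equal-area CRS).
points = [(0, 0), (35_000, 5_000), (20_000, 42_000), (60_000, 30_000)]

eoo_km2 = MultiPoint(points).convex_hull.area / 1e6
cell = 2_000  # IUCN default 2 km grid cell
aoo_km2 = len({(int(x // cell), int(y // cell)) for x, y in points}) \
          * (cell / 1000) ** 2
print(f"EOO ~ {eoo_km2:.0f} km2, AOO ~ {aoo_km2:.0f} km2")
```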

Relevance: 60.00%

Abstract:

Master's degree in Diagnostic and Interventional Cardiovascular Technology. Area of specialization: Cardiovascular Ultrasonography.

Relevance: 60.00%

Abstract:

This dissertation examines how the safety factor trends when certain parameters used in the calculation are varied: the internal friction angle of the soil, the inclination of the backfill behind the wall, and the angle that the soil mass makes when it behaves as an integral part of the wall in a limit state. To achieve these objectives, the work was divided into two phases. The first phase examined the trend of the safety factor under variation of two parameters: the internal friction angle of the soil, and the inclination of the backfill behind the wall, varied from 5° up to the value of the internal friction angle. The second phase analysed the trend of the safety factor under variation of three parameters: the internal friction angle of the soil, from 20° to 45°; the inclination of the backfill behind the wall, from 10° up to the value of the internal friction angle; and the angle that the soil mass makes when it behaves as an integral part of the wall in a limit state. For both phases the calculations were carried out using both the Rankine theory and the Mohr-Coulomb theory, and in some cases it was necessary to combine the two.
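
For reference, here is a minimal sketch of the standard Rankine active earth pressure coefficient for a cohesionless backfill inclined at β, which is one textbook way to see how the first two varied parameters interact; it is our illustration, not the dissertation's calculation.

```python
# A minimal sketch (standard textbook formula, our illustration): Rankine's
# active earth pressure coefficient for a backfill inclined at beta behind
# the wall, as a function of the soil's internal friction angle phi.
import math

def rankine_ka(phi_deg: float, beta_deg: float) -> float:
    """Active coefficient K_a; requires beta <= phi."""
    phi, beta = math.radians(phi_deg), math.radians(beta_deg)
    root = math.sqrt(math.cos(beta) ** 2 - math.cos(phi) ** 2)
    return math.cos(beta) * (math.cos(beta) - root) / (math.cos(beta) + root)

# Sweep the two parameters varied in the first phase of the dissertation.
for phi in (20, 30, 45):
    for beta in (5, 10, phi):
        print(f"phi={phi:2d}  beta={beta:2d}  Ka={rankine_ka(phi, beta):.3f}")
```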

Relevance: 60.00%

Abstract:

Final Master's project for the degree of Master in Electronics and Telecommunications Engineering.

Relevance: 60.00%

Abstract:

Master's degree in Electrical and Computer Engineering - Autonomous Systems branch.

Relevance: 60.00%

Abstract:

Dissertation for the degree of Master in Civil Engineering, specialization in Buildings.

Relevance: 60.00%

Abstract:

The diversification of cork products demands constant adaptation of the production processes used to transform this material. The main aim of this work is the conception and investigation of a new manufacturing process adapted to a very particular type of product: pure cork agglomerates. These use no binder of any kind; only the resins contained in the cork itself bond the granules together. An analysis of current manufacturing processes shows that complex geometries in pure cork agglomerate can only be produced by machining. Machining generates waste and is preceded by a series of steps, notably prior agglomeration into a simpler form (a rectangular block or a cylinder) and cutting into a preform. Producing a complex part without machining is the primary objective of this work. To that end a mould was built, with all the necessary auxiliary mechanisms, along the lines of what is already done in other industries, namely glass and plastics. The mould was subjected to a series of trials to determine the filling and moulding parameters, arriving at a formulation that guarantees attractive characteristics in the final part. The agglutination of the cork granulate proved to be strongly dependent on three parameters: the applied temperature, the moisture content of the granulate, and the degree of compaction applied. Industrialization of the process also proved feasible, requiring only some improvements. This type of process is therefore attractive for large production runs, achieving high reproducibility of the parts with a single process and no need for additional finishing steps.