32 results for Welland Canal (Ont.) -- Rates and tolls


Relevance:

100.00%

Publisher:

Abstract:

Perinatal care of pregnant women at high risk for preterm delivery and of preterm infants born at the limit of viability (22-26 completed weeks of gestation) requires a multidisciplinary approach by an experienced perinatal team. Limited precision in the determination of both gestational age and foetal weight, as well as biological variability, may significantly affect the course of action chosen in individual cases. The decisions that must be taken with the pregnant woman and on behalf of the preterm infant in this context are complex and have far-reaching consequences. When counselling pregnant women and their partners, neonatologists and obstetricians should provide them with comprehensive information in a sensitive and supportive way to build a basis of trust. The decisions are developed in a continuing dialogue between all parties involved (physicians, midwives, nursing staff and parents) with the principal aim of finding solutions that are in the infant's and the pregnant woman's best interest. Knowledge of current gestational age-specific mortality and morbidity rates and how they are modified by prenatally known prognostic factors (estimated foetal weight, sex, exposure or nonexposure to antenatal corticosteroids, single or multiple births), as well as the application of accepted ethical principles, forms the basis for responsible decision-making. Communication between all parties involved plays a central role. The members of the interdisciplinary working group suggest that the care of preterm infants with a gestational age between 22 0/7 and 23 6/7 weeks should generally be limited to palliative care. Obstetric interventions for foetal indications such as Caesarean section delivery are usually not indicated.
In selected cases, for example, after 23 weeks of pregnancy have been completed and several of the above-mentioned prenatally known prognostic factors are favourable, or well-informed parents insist on the initiation of life-sustaining therapies, active obstetric interventions for foetal indications and provisional intensive care of the neonate may be reasonable. In preterm infants with a gestational age between 24 0/7 and 24 6/7 weeks, it can be difficult to determine whether the burden of obstetric interventions and neonatal intensive care is justified given the limited chances of success of such therapy. In such cases, the individual constellation of prenatally known factors which impact on prognosis can be helpful in the decision-making process with the parents. In preterm infants with a gestational age between 25 0/7 and 25 6/7 weeks, foetal surveillance, obstetric interventions for foetal indications and neonatal intensive care measures are generally indicated. However, if several prenatally known prognostic factors are unfavourable and the parents agree, primary non-intervention and neonatal palliative care can be considered. All pregnant women with threatening preterm delivery or premature rupture of membranes at the limit of viability must be transferred to a perinatal centre with a level III neonatal intensive care unit no later than 23 0/7 weeks of gestation, unless emergency delivery is indicated. An experienced neonatology team should be involved in all deliveries that take place after 23 0/7 weeks of gestation to help decide, together with the parents, whether the initiation of intensive care measures appears appropriate or whether preference should be given to palliative care (i.e., primary non-intervention). In doubtful situations, it can be reasonable to initiate intensive care and to admit the preterm infant to a neonatal intensive care unit (i.e., provisional intensive care).
The infant's clinical evolution and additional discussions with the parents will help to clarify whether the life-sustaining therapies should be continued or withdrawn. Life support is continued as long as there is reasonable hope for survival and the infant's burden of intensive care is acceptable. If, on the other hand, the health care team and the parents have to recognise that in the light of a very poor prognosis the burden of the currently used therapies has become disproportionate, intensive care measures are no longer justified and other aspects of care (e.g., relief of pain and suffering) are the new priorities (i.e., redirection of care). If a decision is made to withhold or withdraw life-sustaining therapies, the health care team should focus on comfort care for the dying infant and support for the parents.

Abstract:

Breast cancer is one of the most common cancers, affecting one in eight women during their lives. Survival rates have increased steadily thanks to early diagnosis with mammography screening and more efficient treatment strategies. Post-operative radiation therapy is a standard of care in the management of breast cancer and has been shown to efficiently reduce both the local recurrence rate and breast cancer mortality. Radiation therapy is, however, associated with some late effects for long-term survivors. Radiation-induced secondary cancer is a relatively rare but severe late effect of radiation therapy. Currently, radiotherapy plans are essentially optimized to maximize tumor control and minimize late deterministic effects (tissue reactions), which are mainly associated with high doses (≫ 1 Gy). With improved cure rates and new radiation therapy technologies, it is also important to evaluate and minimize secondary cancer risks for different treatment techniques. This is a particularly challenging task due to the large uncertainties in the dose-response relationship.

In contrast with late deterministic effects, secondary cancers may be associated with much lower doses, and therefore out-of-field doses (also called peripheral doses), which are typically below 1 Gy, need to be determined accurately. Out-of-field doses result from patient scatter and head scatter from the treatment unit. These doses are particularly challenging to compute, and we characterized them by Monte Carlo (MC) calculation. A detailed MC model of the Siemens Primus linear accelerator was thoroughly validated with measurements. We investigated the accuracy of such a model for retrospective dosimetry in epidemiological studies on secondary cancers. Considering that patients in such large studies could be treated on a variety of machines, we assessed the uncertainty in reconstructed peripheral dose due to the variability of peripheral dose among various linac geometries.
For large open fields (> 10 × 10 cm²), the uncertainty would be less than 50%, but for small fields and wedged fields the uncertainty in reconstructed dose could rise up to a factor of 10. It was concluded that such a model could be used only for conventional treatments using large open fields.

The MC model of the Siemens Primus linac was then used to compare out-of-field doses for different treatment techniques in a female whole-body CT-based phantom. Current techniques such as conformal wedge-based radiotherapy and hybrid IMRT were investigated and compared to older two-dimensional radiotherapy techniques. MC doses were also compared to those of a commercial Treatment Planning System (TPS). While the TPS is routinely used to determine the dose to the contralateral breast and the ipsilateral lung, which are mostly out of the treatment fields, we have shown that these doses may be highly inaccurate depending on the treatment technique investigated. MC shows that hybrid IMRT is dosimetrically similar to three-dimensional wedge-based radiotherapy within the field, but offers substantially reduced doses to out-of-field healthy organs.

Finally, many different approaches to risk estimation extracted from the literature were applied to the calculated MC dose distribution. Absolute risks varied substantially, as did the ratio of risk between two treatment techniques, reflecting the large uncertainties involved with current risk models. Despite all these uncertainties, the hybrid IMRT investigated resulted in systematically lower cancer risks than any of the other treatment techniques. More epidemiological studies with accurate dosimetry are required in the future to construct robust risk models. In the meantime, any treatment strategy that reduces out-of-field doses to healthy organs should be investigated. Electron radiotherapy might offer interesting possibilities in this regard.
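The shape of the risk estimation described above can be caricatured by a linear-no-threshold (LNT) calculation: excess risk as a sum of organ dose times a risk coefficient. All doses and coefficients below are hypothetical illustrations, not the thesis's actual models or values.

```python
# Toy linear-no-threshold (LNT) sketch: excess secondary-cancer risk as a
# sum of organ dose x risk coefficient. All numbers are hypothetical.

# Hypothetical mean out-of-field organ doses (Gy) for one treatment plan
organ_dose = {"contralateral breast": 0.8, "ipsilateral lung": 4.0, "thyroid": 0.15}

# Hypothetical excess-absolute-risk coefficients (cases per 10,000
# person-years per Gy); real models are age-, sex- and organ-specific
ear_per_gy = {"contralateral breast": 9.0, "ipsilateral lung": 7.5, "thyroid": 1.2}

def excess_risk(doses, coefficients):
    """Total excess absolute risk under a simple LNT assumption."""
    return sum(doses[organ] * coefficients[organ] for organ in doses)

total = excess_risk(organ_dose, ear_per_gy)
print(f"Total excess risk: {total:.1f} cases per 10,000 person-years")
```

Because everything is linear in dose, the ratio of risks between two techniques reduces to the ratio of coefficient-weighted dose sums; the large spread between published risk models enters entirely through the coefficients.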

Abstract:

Introduction

In my thesis I argue that economic policy is all about economics and politics. Consequently, analysing and understanding economic policy ideally has at least two parts. The economics part is centered on the expected impact of a specific policy on the real economy, in terms of both efficiency and equity. The insights of this part indicate the direction in which the fine-tuning of economic policies should go. However, the fine-tuning of economic policies will most likely be subject to political constraints. That is why, in the politics part, a much better understanding can be gained by taking into account how the incentives of politicians and special interest groups, as well as the role played by different institutional features, affect the formation of economic policies. The first part and chapter of my thesis concentrates on the efficiency-related impact of economic policies: how does corporate income taxation in general, and corporate income tax progressivity in particular, affect the creation of new firms? Reduced progressivity and flat-rate taxes are in vogue. By 2009, 22 countries were operating flat-rate income tax systems, as were 7 US states and 14 Swiss cantons (for corporate income only). Tax reform proposals in the spirit of the "flat tax" model typically aim to reduce three parameters: the average tax burden, the progressivity of the tax schedule, and the complexity of the tax code. In joint work, Marius Brülhart and I explore the implications of changes in these three parameters for entrepreneurial activity, measured by counts of firm births in a panel of Swiss municipalities. Our results show that lower average tax rates and reduced complexity of the tax code promote firm births. Controlling for these effects, reduced progressivity inhibits firm births. Our reading of these results is that tax progressivity has an insurance effect that facilitates entrepreneurial risk taking.
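The insurance effect just described can be illustrated with a toy calculation. All incomes and tax schedules below are hypothetical and chosen only so that the two schedules raise the same expected revenue; this is not the chapter's empirical model.

```python
import math

# Toy illustration of the "insurance effect" of tax progressivity on risky
# entrepreneurial income. All incomes and tax parameters are hypothetical;
# the two schedules are calibrated to raise the same expected revenue.

p_good = 0.5
income_good, income_bad = 160_000, 40_000  # risky entrepreneurial outcomes

def flat_tax(y, rate=0.25):
    return y * rate

def progressive_tax(y, low=0.10, high=0.60, bracket=100_000):
    return low * min(y, bracket) + high * max(y - bracket, 0)

for name, tax in [("flat", flat_tax), ("progressive", progressive_tax)]:
    expected_tax = p_good * tax(income_good) + (1 - p_good) * tax(income_bad)
    # expected utility of after-tax income for a risk-averse (log) agent
    eu = (p_good * math.log(income_good - tax(income_good))
          + (1 - p_good) * math.log(income_bad - tax(income_bad)))
    print(f"{name}: expected tax = {expected_tax:,.0f}, expected log-utility = {eu:.4f}")
```

Both schedules take 25,000 in expectation, but the progressive one taxes more in the good state and less in the bad state, compressing after-tax income risk; a risk-averse agent therefore gets higher expected utility, which is the insurance channel that can encourage entrepreneurial entry.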
The positive effects of lower tax levels and reduced complexity are estimated to be significantly stronger than the negative effect of reduced progressivity. To the extent that firm births reflect desirable entrepreneurial dynamism, it is not the flattening of tax schedules that is key to successful tax reforms, but the lowering of average tax burdens and the simplification of tax codes. Flatness per se is of secondary importance and even appears to be detrimental to firm births. The second part of my thesis, which corresponds to the second and third chapters, concentrates on how economic policies are formed. By the nature of the analysis, these two chapters draw on a broader literature than the first chapter. Both economists and political scientists have done extensive research on how economic policies are formed. Researchers in both disciplines have recognised the importance of special interest groups trying to influence policy-making through various channels. In general, economists base their analysis on a formal and microeconomically founded approach, while abstracting from institutional details. In contrast, political scientists' frameworks are generally richer in institutional features but lack the theoretical rigour of economists' approaches. I start from the economist's point of view. However, I try to borrow as much as possible from the findings of political science to gain a better understanding of how economic policies are formed in reality. In the second chapter, I take a theoretical approach and focus on the institutional policy framework to explore how interactions between different political institutions affect the outcome of trade policy in the presence of lobbying by special interest groups. Standard political economy theory treats the government as a single institutional actor which sets tariffs by trading off social welfare against contributions from special interest groups seeking industry-specific protection from imports.
However, these models lack important (institutional) features of reality. That is why, in my model, I split the government into a legislative and an executive branch, both of which can be lobbied by special interest groups. Furthermore, the legislature has the option to delegate its trade policy authority to the executive. I allow the executive to compensate the legislature in exchange for delegation. Despite ample anecdotal evidence, bargaining over the delegation of trade policy authority has not yet been formally modelled in the literature. I show that delegation has an impact on policy formation in that it leads to lower equilibrium tariffs compared to a standard model without delegation. I also show that delegation will only take place if the lobby is not strong enough to prevent it. Furthermore, the option to delegate increases the bargaining power of the legislature at the expense of the lobbies. Therefore, the findings of this model can shed light on why the U.S. Congress often delegates trade policy authority to the executive. In the final chapter of my thesis, my coauthor, Antonio Fidalgo, and I take a narrower approach and focus on the individual politician's level of policy-making to explore how connections to private firms and networks within parliament affect individual politicians' decision-making. Theories in the spirit of the model of the second chapter show how campaign contributions from lobbies to politicians can influence economic policies. There exists an abundant empirical literature that analyses ties between firms and politicians based on campaign contributions. However, the evidence on the impact of campaign contributions is mixed, at best. In our paper, we analyse an alternative channel of influence in the shape of personal connections between politicians and firms through board membership.
We identify a direct effect of board membership on individual politicians' voting behaviour and an indirect leverage effect when politicians with board connections influence non-connected peers. We assess the importance of these two effects using a vote in the Swiss parliament on a government bailout of the national airline, Swissair, in 2001, which serves as a natural experiment. We find that both the direct effect of connections to firms and the indirect leverage effect had a strong and positive impact on the probability that a politician supported the government bailout.

Abstract:

Indirect calorimetry based on respiratory exchange measurement has been successfully used since the beginning of the century to obtain an estimate of heat production (energy expenditure) in human subjects and animals. The errors inherent in this classical technique can stem from various sources: 1) the model of calculation and its assumptions, 2) the calorimetric factors used, 3) technical factors and 4) human factors. The physiological and biochemical factors influencing the interpretation of calorimetric data include a change in the size of the bicarbonate and urea pools and the accumulation or loss (via breath, urine or sweat) of intermediary metabolites (gluconeogenesis, ketogenesis). More recently, respiratory gas exchange data have been used to estimate substrate utilization rates in various physiological and metabolic situations (fasting, post-prandial state, etc.). It should be recalled that indirect calorimetry provides an index of overall substrate disappearance rates. These are incorrectly assumed to be equivalent to substrate "oxidation" rates. Unfortunately, there is no adequate gold standard to validate whole-body substrate "oxidation" rates; this contrasts with the "validation" of heat production by indirect calorimetry through the use of direct calorimetry under strict thermal equilibrium conditions. Tracer techniques using stable (or radioactive) isotopes represent an independent way of assessing substrate utilization rates. When carbohydrate metabolism is measured with both techniques, indirect calorimetry generally provides glucose "oxidation" rates consistent with those from isotopic tracers, but only when certain metabolic processes (such as gluconeogenesis and lipogenesis) are minimal and/or when the respiratory quotients are not at the extremes of the physiological range.
However, it is believed that the tracer techniques underestimate true glucose "oxidation" rates because they fail to account for glycogenolysis in the tissues storing glucose, since this glucose escapes the systemic circulation. A major advantage of isotopic techniques is that they are able to estimate (given certain assumptions) various metabolic processes (such as gluconeogenesis) in a noninvasive way. Furthermore, when, in addition to the three macronutrients, a fourth substrate is administered (such as ethanol), isotopic quantification of substrate "oxidation" allows one to eliminate the inherent assumptions made by indirect calorimetry. In conclusion, isotopic tracer techniques and indirect calorimetry should be considered complementary techniques, in particular since the tracer techniques require the measurement of carbon dioxide production obtained by indirect calorimetry. However, it should be kept in mind that the assessment of substrate oxidation by indirect calorimetry may involve large errors, in particular over short periods of time. By indirect calorimetry, energy expenditure (heat production) is calculated with substantially less error than substrate oxidation rates.
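As a concrete example of how heat production is derived from respiratory gas exchange, the classic abbreviated Weir equation with a urinary-nitrogen protein correction can be sketched as follows. The coefficients are the commonly cited Weir values, not necessarily the exact calorimetric factors used by this paper; the example subject values are hypothetical.

```python
# Minimal sketch of the abbreviated Weir equation for estimating energy
# expenditure from respiratory gas exchange, with an optional protein
# correction via urinary nitrogen. Coefficients are the commonly cited
# Weir values; the example inputs below are hypothetical.

def weir_energy_expenditure(vo2_l_min, vco2_l_min, urinary_n_g_day=0.0):
    """Energy expenditure (kcal/day) from O2 consumption and CO2
    production in L/min (STPD), minus a protein correction."""
    return (3.941 * vo2_l_min + 1.106 * vco2_l_min) * 1440 - 2.17 * urinary_n_g_day

# Example: resting adult with VO2 = 0.25 L/min, VCO2 = 0.20 L/min
# (respiratory quotient 0.8) excreting 12 g urinary nitrogen per day
ee = weir_energy_expenditure(0.25, 0.20, urinary_n_g_day=12)
print(f"Estimated energy expenditure: {ee:.0f} kcal/day")
```

Note how insensitive the result is to the protein term (about 26 kcal/day here), which is one reason energy expenditure is far more robust to calorimetric assumptions than the substrate "oxidation" rates discussed above.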

Abstract:

Objective. The existence of two vaccines (seasonal and pandemic) created the potential for confusion and misinformation among consumers during the 2009-2010 vaccination season. We measured the frequency and nature of influenza vaccination communication between healthcare providers and adults for both seasonal and 2009 influenza A(H1N1) vaccination and quantified its association with uptake of the two vaccines.

Methods. We analyzed data from 4040 U.S. adult members of a nationally representative online panel surveyed between March 4 and March 24, 2010. We estimated prevalence rates and adjusted associations between vaccine uptake and vaccination-related communication between patients and healthcare providers using bivariate probit models.

Results. 64.1% (95% CI: 61.5%-66.6%) of adults did not receive any provider-issued influenza vaccination recommendation. Adults who received a provider-issued vaccination recommendation were 14.1 (95% CI: -2.4 to 30.6) to 32.1 (95% CI: 24.3-39.8) percentage points more likely to be vaccinated for influenza than adults without a provider recommendation, after adjusting for other characteristics associated with vaccination.

Conclusions. Influenza vaccination communication between healthcare providers and adults was relatively uncommon during the 2009-2010 pandemic. Increased communication could significantly enhance influenza vaccination rates.
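As a rough plausibility check on the headline prevalence, a simple unweighted Wald interval for 64.1% of 4040 respondents can be computed as below. The published interval (61.5%-66.6%) is wider, presumably because the survey analysis accounts for sampling weights and design effects, which this naive calculation ignores.

```python
import math

# Unweighted Wald 95% confidence interval for a proportion: a naive check
# on the reported share of adults with no provider recommendation. It
# ignores survey weights, so it is narrower than the published interval.

def wald_ci(p_hat, n, z=1.96):
    se = math.sqrt(p_hat * (1 - p_hat) / n)
    return p_hat - z * se, p_hat + z * se

lo, hi = wald_ci(0.641, 4040)
print(f"Unweighted 95% CI: {100 * lo:.1f}% to {100 * hi:.1f}%")
```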

Abstract:

Conjugative transfer of the integrative and conjugative element ICEclc in the bacterium Pseudomonas knackmussii is the consequence of a bistable decision taken in some 3% of cells in a population during stationary phase. Here we study the possible control exerted by the stationary-phase sigma factor RpoS on the bistability decision. The gene for RpoS in P. knackmussii B13 was characterized, and a loss-of-function mutant was produced and complemented. We found that, in the absence of RpoS, ICEclc transfer rates and activation of two key ICEclc promoters (P(int) and P(inR)) decrease significantly in cells during stationary phase. Microarray and gene reporter analyses indicated that the most direct effect of RpoS is on P(inR), whereas one of the gene products from the P(inR)-controlled operon (InrR) transmits activation to P(int) and other ICEclc core genes. Addition of a second rpoS copy under the control of its native promoter resulted in an increase of the proportion of cells expressing the P(int) and P(inR) promoters to 18%. Strains in which rpoS was replaced by an rpoS-mcherry fusion showed high mCherry fluorescence in the individual cells that had activated P(int) and P(inR), whereas a double-copy rpoS-mcherry-containing strain displayed twice as much mCherry fluorescence. This suggested that high RpoS levels are a prerequisite for an individual cell to activate P(inR) and thus ICEclc transfer. Double promoter-reporter fusions confirmed that expression of P(inR) is dominated by extrinsic noise, such as would result from cell-to-cell variability in RpoS. In contrast, expression from P(int) is dominated by intrinsic noise, indicating that it is specific to the ICEclc transmission cascade. Our results demonstrate how stochastic noise levels of global transcription factors can be transduced to a precise signaling cascade in a subpopulation of cells, leading to ICE activation.
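The intrinsic/extrinsic distinction invoked above can be sketched with the standard dual-reporter decomposition: two identical reporters in the same cell disagree because of intrinsic noise, while a shared per-cell factor (such as a variable RpoS level) makes them covary, which registers as extrinsic noise. The data below are synthetic and purely illustrative, not the study's measurements.

```python
import random

# Sketch of the standard dual-reporter noise decomposition: intrinsic noise
# from the within-cell mismatch of two identical reporters, extrinsic noise
# from their covariation across cells. Synthetic data, for illustration only.

def noise_decomposition(r1, r2):
    n = len(r1)
    m1 = sum(r1) / n
    m2 = sum(r2) / n
    intrinsic = sum((a - b) ** 2 for a, b in zip(r1, r2)) / (2 * n * m1 * m2)
    extrinsic = (sum(a * b for a, b in zip(r1, r2)) / n - m1 * m2) / (m1 * m2)
    return intrinsic, extrinsic

# Synthetic "extrinsic-dominated" population: a shared per-cell factor
# (e.g. a variable global regulator) drives both reporters together, with
# only small independent reporter-specific fluctuations.
random.seed(1)
cells = [random.lognormvariate(0, 0.5) for _ in range(5000)]
r1 = [c * random.gauss(1, 0.05) for c in cells]
r2 = [c * random.gauss(1, 0.05) for c in cells]

eta_int, eta_ext = noise_decomposition(r1, r2)
print(f"intrinsic noise^2 = {eta_int:.4f}, extrinsic noise^2 = {eta_ext:.4f}")
```

In this construction the extrinsic component dominates by roughly two orders of magnitude, which is the signature the dual-reporter assay looks for when expression tracks a fluctuating global factor.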

Abstract:

In this work we present numerical simulations of continuous-flow left ventricular assist device (LVAD) implantation, with the aim of comparing differences in flow rates and pressure patterns depending on the location of the anastomosis and the rotational speed of the device. Although the descending aorta anastomosis approach is less invasive, since it does not require a sternotomy and a cardiopulmonary bypass, its benefits are still controversial. Moreover, the device's rotational speed should be correctly chosen to avoid anomalous flow rates and pressure distributions in specific locations of the cardiovascular tree. With the aim of assessing the differences between these two approaches and device rotational speeds in terms of flow rate and pressure waveforms, we set up numerical simulations of a network of one-dimensional models, in which we account for the presence of an outflow cannula anastomosed to different locations of the aorta. We then use the resulting network to compare the results of the two different cannulations for several stages of heart failure and different rotational speeds of the device. The inflow boundary data for the heart and the cannulas are obtained from a lumped-parameter model of the entire circulatory system with an assist device, which is validated with clinical data. The results show that ascending and descending aorta cannulations lead to similar waveforms and mean flow rates in all the considered cases. Moreover, regardless of the anastomosis region, the rotational speed of the device has an important impact on wave profiles; this effect is more pronounced at high rotational speeds.
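The lumped-parameter idea behind the boundary model can be sketched with the simplest possible example: a two-element Windkessel driven by pulsatile ventricular ejection plus a constant assist-device flow. All parameter values below are illustrative placeholders, not the paper's model or data.

```python
import math

# Minimal lumped-parameter (0D) sketch in the spirit of the paper's boundary
# model: a two-element Windkessel (compliance C, resistance R) driven by a
# pulsatile ventricular ejection plus constant LVAD flow. All values are
# illustrative, not the paper's.

R = 1.0    # peripheral resistance (mmHg*s/mL)
C = 1.5    # arterial compliance (mL/mmHg)
HR = 75    # heart rate (beats/min)

def inflow(t, q_heart=40.0, q_lvad=75.0):
    """Half-sine ejection over 35% of the cycle plus constant LVAD flow (mL/s)."""
    phase = (t * HR / 60.0) % 1.0
    pulsatile = q_heart * math.sin(math.pi * phase / 0.35) if phase < 0.35 else 0.0
    return pulsatile + q_lvad

# explicit Euler integration of C*dP/dt = Q_in - P/R
dt, P, trace = 1e-3, 80.0, []
for step in range(int(10.0 / dt)):   # simulate 10 s so transients decay
    t = step * dt
    P += dt * (inflow(t) - P / R) / C
    if t >= 8.0:
        trace.append(P)              # keep the last ~2 s

mean_p = sum(trace) / len(trace)
pulse_p = max(trace) - min(trace)
print(f"mean pressure = {mean_p:.1f} mmHg, pulse pressure = {pulse_p:.1f} mmHg")
```

Raising `q_lvad` relative to `q_heart` flattens the waveform, a toy version of the reduced arterial pulsatility seen at high device rotational speeds.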

Abstract:

INTRODUCTION: The EORTC 22922/10925 trial investigated the potential survival benefit and toxicity of elective irradiation of the internal mammary and medial supraclavicular (IM-MS) nodes. Accrual was completed in January 2004 and first results are expected in 2012. We present the toxicity reported up to year 3 after treatment. PATIENTS AND METHODS: At each visit, toxicity was reported but severity was not graded routinely. Toxicity rates and performance status (PS) changes at three years were compared by χ² tests and logistic regression models in the 3,866 of the 4,004 patients eligible for the trial who received the allocated treatment. RESULTS: Only lung toxicity (fibrosis, dyspnoea, pneumonitis, or any lung toxicity: 4.3% vs. 1.3%; p < 0.0001), but not cardiac toxicity (0.3% vs. 0.4%; p = 0.55), increased significantly with IM-MS treatment. No significant worsening of the PS was observed (p = 0.79), suggesting that treatment-related toxicity does not impair patients' daily activities. CONCLUSIONS: IM-MS irradiation seems well tolerated and does not significantly impair WHO PS at three years. A follow-up period of at least 10 years is needed to determine whether cardiac toxicity is increased after radiotherapy.
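The χ² comparison behind a contrast like "4.3% vs. 1.3% lung toxicity" can be reproduced with the 2×2 Pearson statistic. The arm sizes below are hypothetical (the 3,866 treated patients are assumed to split into two equal arms for illustration); the published p-value comes from the trial's own data.

```python
# Sketch of the 2x2 Pearson chi-square test behind a rate comparison such
# as "4.3% vs. 1.3%". Arm sizes are hypothetical (assumed ~equal arms of a
# 3,866-patient trial); event counts are back-calculated from the rates.

def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic for the 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

events_immns, n_immns = 83, 1933   # ~4.3% lung toxicity with IM-MS (hypothetical counts)
events_ctrl, n_ctrl = 25, 1933     # ~1.3% without IM-MS

stat = chi2_2x2(events_immns, n_immns - events_immns,
                events_ctrl, n_ctrl - events_ctrl)
print(f"chi-square = {stat:.1f} (df = 1; values above 3.84 mean p < 0.05)")
```

With these assumed counts the statistic is around 32, far beyond the 3.84 critical value at one degree of freedom, consistent with the reported p < 0.0001.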

Abstract:

BACKGROUND: Screening tests for subclinical cardiovascular disease, such as markers of atherosclerosis, are increasingly used in clinical prevention to identify individuals at high cardiovascular risk. Being aware of these test results might also enhance patients' motivation to change unhealthy behaviors, but the effectiveness of such a screening strategy has been poorly studied. METHODS: The CAROtid plaque Screening trial on Smoking cessation (CAROSS) is a randomized controlled trial in 530 regular smokers aged 40-70 years to test the hypothesis that carotid plaque screening will influence smokers' behavior, with an increased rate of smoking cessation (primary outcome) and improved control of other cardiovascular risk factors (secondary outcomes) after 1-year follow-up. All smokers will receive brief advice for smoking cessation, and will subsequently be randomly assigned to either the intervention group (with plaque screening) or the control group (without plaque screening). Carotid ultrasound will be conducted with a standard protocol. Smokers with at least one carotid plaque will receive pictures of their own plaques with a structured explanation of the general significance of plaques. To ensure equal contact conditions, smokers not undergoing ultrasound and those without plaques will receive a relevant explanation of the risks associated with tobacco smoking. Study outcomes will be compared between smokers randomized to plaque screening and smokers not submitted to plaque screening. SUMMARY: This will be the first trial to assess the impact of carotid plaque screening on 1-year smoking cessation rates and on levels of control of other cardiovascular risk factors.

Abstract:

Boundaries for delta, representing a "quantitatively significant" or "substantively impressive" distinction, have not been established, analogous to the boundary for alpha, usually set at 0.05, for the stochastic or probabilistic component of "statistical significance". To determine what boundaries are being used for the "quantitative" decisions, we reviewed pertinent articles in three general medical journals. For each contrast of two means, contrast of two rates, or correlation coefficient, we noted the investigators' decisions about stochastic significance, stated in P values or confidence intervals, and about quantitative significance, indicated by interpretive comments. The boundaries between impressive and unimpressive distinctions were best formed by a ratio of the larger to the smaller mean of at least 1.2 in 546 comparisons of two means; by a standardized increment of at least 0.28 and an odds ratio of at least 2.2 in 392 comparisons of two rates; and by an r value of at least 0.32 in 154 correlation coefficients. Additional boundaries were also identified for "substantially" and "highly" significant quantitative distinctions. Although the proposed boundaries should be kept flexible, indexes and boundaries for decisions about "quantitative significance" are particularly useful when a value of delta must be chosen for calculating sample size before the research is done, and when the "statistical significance" of completed research is appraised for its quantitative as well as its stochastic components.
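The empirically observed boundaries can be encoded in a small helper, which makes the decision rule explicit. The function is purely illustrative, using the thresholds reported above (mean ratio ~1.2, standardized increment ~0.28, odds ratio ~2.2, correlation ~0.32).

```python
# Illustrative helper encoding the review's empirically observed boundaries
# for a "quantitatively significant" distinction. The thresholds are those
# reported in the abstract; the helper itself is not from the paper.

BOUNDARIES = {
    "mean_ratio": 1.2,       # larger mean / smaller mean
    "std_increment": 0.28,   # standardized increment between two rates
    "odds_ratio": 2.2,
    "correlation": 0.32,     # r value
}

def quantitatively_impressive(index, value):
    """True if `value` meets the review's boundary for the given index."""
    return value >= BOUNDARIES[index]

print(quantitatively_impressive("mean_ratio", 150 / 110))  # ratio 1.36 -> True
print(quantitatively_impressive("correlation", 0.25))      # below 0.32 -> False
```

Such a rule is the quantitative analogue of checking p < alpha: the stochastic test asks whether a distinction is reliable, while these boundaries ask whether it is big enough to matter.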

Abstract:

Summary

The field of public finance focuses on the spending and taxing activities of governments and their influence on the allocation of resources and the distribution of income. In three parts, this work covers different topics related to public finance that are currently widely discussed in the media and in politics. The first two parts deal with issues in social security, which is in general one of the biggest spending shares of governments. The third part looks at the main income source of governments by analyzing the perceived value of tax competition. Part one deals with the current problem of increased early retirement, focusing on Switzerland as a special case. Early retirement is predominantly considered to be the result of incentives set by social security and the tax system. But the Swiss example demonstrates that the incidence of early retirement has dramatically increased even in the absence of institutional changes. We argue that the wealth effect also plays an important role in the retirement decision for middle- and high-income earners. An actuarially fair but mandatory funded system with a relatively high replacement rate may thus contribute to a low labor market participation rate of elderly workers. We provide evidence using a unique dataset on individual retirement decisions in Swiss pension funds, allowing us to control fully for pension scheme details. Our findings suggest that affordability is a key determinant in retirement decisions. The higher the accumulated pension capital, the earlier men, and to a smaller extent women, tend to leave the workforce. The fact that early retirement has become much more prevalent in the last 15 years is a further indicator of the importance of a wealth effect, as the maturing of the Swiss mandatory funded pension system over that period has led to an increase in the effective replacement rates for middle- and high-income earners. Part two covers the theoretical side of social security.
Theories analyzing optimal social security benefits provide important qualitative results, mainly by using one general type of economy. Economies are, however, very diverse in numerous respects, one of the most important being the wealth level. This can lead to significant quantitative differences in benefits, which imply differences in replacement rates and levels of labor supply. We focus on several aspects related to this fact. In a within-cohort social security model, we introduce disability insurance with an imperfect screening mechanism. We then vary the wealth level of the model economy and analyze how the optimal social security benefit structure, or equivalently the optimal replacement rates, changes with the wealth level of the economy, and whether the introduction of disability insurance into a social security system is preferable for all economies. Second, the screening mechanism of disability insurance and the threshold level at which people are defined as disabled can differ. For economies with different wealth levels, we determine for different thresholds the screening level that maximizes social welfare. Finally, part three turns to the income of governments, adding an element to the controversy over tax competition versus tax harmonization. Inter-jurisdictional tax competition can generate at least two potential benefits or costs: on a public level, tax competition may result in lower or higher efficiency in the production of public services. But there is also a more private benefit in the form of an option for individuals to move to a community with a lower tax rate in the future. To explore the value citizens attach to tax competition, we analyze a unique popular vote on a complete tax harmonization between communities in the third largest Swiss canton, Vaud. Although a majority of voters would seemingly have benefited from replacing the current tax rates with a revenue-neutral average tax rate, the proposal was rejected by a large margin.
Our estimates suggest that the combined perceived benefit from tax competition is in the range of 10%.
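The vote analyzed in part three hinges on a revenue-neutral average tax rate. A minimal sketch of that computation, with invented community data (not the canton of Vaud's actual figures), showing how a majority of the tax base can sit in above-average-rate communities and thus seemingly stand to gain from harmonization:

```python
# Revenue-neutral uniform rate across communities and the share of the tax
# base that would pay less under harmonization. All figures are invented.

communities = {
    # name: (tax rate, aggregate tax base)
    "A": (0.65, 400.0),
    "B": (0.75, 300.0),
    "C": (0.85, 300.0),
}

total_revenue = sum(rate * base for rate, base in communities.values())
total_base = sum(base for _, base in communities.values())
neutral_rate = total_revenue / total_base  # revenue-neutral average rate

# Taxpayers in communities currently taxed above the average rate would
# pay less after harmonization.
winning_base = sum(base for rate, base in communities.values() if rate > neutral_rate)
winning_share = winning_base / total_base
print(f"revenue-neutral rate: {neutral_rate:.3f}, share gaining: {winning_share:.0%}")
```

In this toy example 60% of the tax base would gain, yet (as in the actual vote) a rejection would suggest voters attach a positive value to keeping tax competition.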

Relevance:

100.00%

Publisher:

Abstract:

Coronary heart disease is a leading cause of death for both sexes in developed countries. Controversy has arisen about the health benefits and risks of coronary surgery and, more recently, of coronary angioplasty. As a clinical prerequisite to these interventions, coronary arteriography can be considered an indicator of the invasive services offered to coronary heart disease patients. We collected data on the characteristics of all patients who underwent coronary arteriography during 1984 in Switzerland. A total of 4921 coronary arteriographies were performed on 4359 patients; this corresponds to 77 procedures and 68 patients per 100,000 residents. Rates for men are 4.2 times those for women, and the highest utilization rates for both sexes are observed in the group aged 40-64. Large variations characterize cantonal and regional coronary arteriography rates. Similarly, the distribution of centers practising this procedure is not uniform. These observations are placed in the context of the general practice of coronary angiography, the changes expected from the expansion of bypass surgery and angioplasty, and coronary heart disease data.
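As a quick plausibility check of the reported utilization rates, assuming a Swiss resident population of roughly 6.4 million in 1984 (an approximation, not a figure taken from the study):

```python
# Plausibility check of the utilization rates, assuming a resident
# population of about 6.4 million (an approximation for 1984 Switzerland).
procedures = 4921
patients = 4359
population = 6_400_000  # assumed, not taken from the study

def per_100k(count):
    """Rate per 100,000 residents."""
    return count / population * 100_000

print(round(per_100k(procedures)))  # → 77
print(round(per_100k(patients)))    # → 68
```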

Relevance:

100.00%

Publisher:

Abstract:

Early Cretaceous life and the environment were strongly influenced by the accelerated breakup of Pangaea, which was associated with the formation of a multitude of rift basins, intensified spreading, and important volcanic activity on land and in the sea. These processes likely interacted with greenhouse conditions, and the Early Cretaceous climate oscillated between "normal" greenhouse, predominantly arid conditions and intensified greenhouse, predominantly humid conditions. Arid conditions were important during the latest Jurassic and early Berriasian, the late Barremian, and partly also during the late Aptian. Humid conditions were particularly intense and widespread during shorter episodes of environmental change (EECs): the Valanginian Weissert, the latest Hauterivian Faraoni, the latest Barremian-earliest Aptian Taxy, the early Aptian Selli, the early late Aptian Fallot and the late Aptian-early Albian Paquier episodes. Arid conditions were associated with evaporation, low biogeochemical weathering rates, low nutrient fluxes, and partly stratified oceans, leading to oxygen depletion and enhanced preservation of laminated, organic-rich mud (LOM). Humid conditions enabled elevated biogeochemical weathering rates and nutrient fluxes, important runoff and the buildup of freshwater lids in proximal basins, intensified oceanic and atmospheric circulation, widespread upwelling and phosphogenesis, important primary productivity, and enhanced preservation of LOM in expanded oxygen-minimum zones. The transition from arid to humid climates may have been associated with a net transfer of water to the continents owing to the infilling of dried-out groundwater reservoirs in internally drained inland basins. This resulted in a shorter-term sea-level fall, which was followed by sea-level rise. These sea-level changes and the influx of freshwater into the ocean may have influenced oxygen-isotope signatures.
Climate change preceding and during the Early Cretaceous EECs may have been rapid, but in general, the EECs had a "pre"-history, during which the stage was set for environmental change. Negative feedback on the climate through increased marine LOM preservation was unlikely, because of the low overall organic-carbon accumulation rates during these episodes. Life and climate co-evolved during the Early Cretaceous. Arid conditions may have affected continental life, such as across the Tithonian/Berriasian boundary. Humid conditions and the corresponding tendency to develop dys- to anaerobic conditions in deeper ocean waters led to phases of accelerated extinction in oceans, but may have led to more luxuriant vegetation cover on continents, such as during the Valanginian, to the benefit of herbivores. During Early Cretaceous EECs, reef systems and carbonate platforms in general were particularly vulnerable. They were the first to disappear and the last to recover, often only after several million years.

Relevance:

100.00%

Publisher:

Abstract:

Functional divergence between homologous proteins is expected to affect amino acid sequences in two main ways, which can be considered as proxies of biochemical divergence: a "covarion-like" pattern of correlated changes in evolutionary rates, and switches in conserved residues ("conserved but different"). Although these patterns have been used in case studies, a large-scale analysis is needed to estimate their frequency and distribution. We use a phylogenomic framework of animal genes to answer three questions: 1) What is the prevalence of such patterns? 2) Can we link such patterns at the amino acid level with selection inferred at the codon level? 3) Are patterns different between paralogs and orthologs? We find that covarion-like patterns are more frequently detected than "conserved but different" patterns, but that only the latter are correlated with signal for positive selection. Finally, there is no obvious difference in patterns between orthologs and paralogs.
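A "conserved but different" site can be illustrated with a toy alignment: a column that is invariant within each group of homologs but fixed for different residues between the groups. The sequences below are invented for illustration:

```python
# Toy illustration of a "conserved but different" site: a column invariant
# within each homolog group but fixed for different residues between groups.
# The alignments here are invented, not real protein sequences.

group1 = ["MKALT", "MKALT", "MKALS"]
group2 = ["MKGLT", "MKGLT", "MKGLT"]

def conserved_but_different(seqs1, seqs2):
    """Return column indices conserved within each group but differing between them."""
    hits = []
    for i, (c1, c2) in enumerate(zip(zip(*seqs1), zip(*seqs2))):
        if len(set(c1)) == 1 and len(set(c2)) == 1 and c1[0] != c2[0]:
            hits.append(i)
    return hits

print(conserved_but_different(group1, group2))  # → [2]
```

Column 2 (A in group 1, G in group 2) is the only hit; column 4 is variable within group 1 and therefore does not qualify.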

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND: Prevention of cardiovascular disease (CVD) at the individual level should rely on the assessment of absolute risk using population-specific risk tables. OBJECTIVE: To compare the predictive accuracy of the original and the calibrated SCORE functions regarding 10-year cardiovascular risk in Switzerland. DESIGN: Cross-sectional, population-based study (5773 participants aged 35-74 years). METHODS: The SCORE equation for low-risk countries was calibrated based on the Swiss CVD mortality rates and on the CVD risk factor levels from the study sample. The predicted number of CVD deaths after a 10-year period was computed from the original and the calibrated equations and from the observed cardiovascular mortality for 2003. RESULTS: According to the original and calibrated functions, 16.3% and 15.8% of men and 8.2% and 8.9% of women, respectively, had a 10-year CVD risk ≥5%. The concordance correlation coefficient between the two functions was 0.951 for men and 0.948 for women, both P<0.001. Both risk functions adequately predicted the 10-year cumulative number of CVD deaths: in men, 71 (original) and 74 (calibrated) deaths versus 73 deaths based on the observed CVD mortality rates; in women, 44 (original), 45 (calibrated) and 45 (CVD mortality rates), respectively. Compared with the original function, the calibrated function classified more women and fewer men as high-risk. Moreover, the calibrated function gave better risk estimates among participants aged over 65 years. CONCLUSION: The original SCORE function adequately predicts CVD death in Switzerland, particularly for individuals aged less than 65 years. The calibrated function provides more reliable estimates for older individuals.
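The calibration step described under METHODS adapts a risk function to national mortality. A deliberately simplified sketch of that idea (not the actual SCORE recalibration, which operates on Weibull baseline hazards): rescale the baseline 10-year survival so that the mean predicted risk matches the observed CVD death rate. All numbers below are invented:

```python
# Simplified recalibration: choose the baseline 10-year survival s0 so that
# the mean predicted risk, mean(1 - s0**exp(lp)), matches observed mortality.
# Linear predictors and the observed rate are invented for illustration.
import math

linear_predictors = [0.2, 0.5, 0.9, 1.4, 2.1]  # hypothetical beta*x values
observed_risk = 0.05                            # observed 10-year CVD death rate

def mean_risk(s0):
    """Mean predicted 10-year risk for a given baseline survival s0."""
    return sum(1 - s0 ** math.exp(lp) for lp in linear_predictors) / len(linear_predictors)

# mean_risk decreases as s0 rises, so solve by bisection.
lo, hi = 1e-6, 1 - 1e-6
for _ in range(60):
    mid = (lo + hi) / 2
    if mean_risk(mid) > observed_risk:
        lo = mid  # predicted risk too high: raise baseline survival
    else:
        hi = mid
print(round(mean_risk(mid), 4))  # → 0.05
```

Keeping the individual linear predictors fixed while refitting only the baseline is what lets such a calibration change who crosses a fixed threshold such as ≥5% without re-estimating the underlying coefficients.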