870 results for "power of sale"


Relevance: 90.00%

Abstract:

Introduction: In my thesis I argue that economic policy is all about economics and politics. Consequently, analysing and understanding economic policy ideally has at least two parts. The economics part centers on the expected impact of a specific policy on the real economy, in terms of both efficiency and equity. The insights of this part indicate the direction in which the fine-tuning of economic policies should go. However, the fine-tuning of economic policies will most likely be subject to political constraints. That is why, in the politics part, a much better understanding can be gained by taking into account how the incentives of politicians and special interest groups, as well as the role played by different institutional features, affect the formation of economic policies.

The first part and chapter of my thesis concentrates on the efficiency-related impact of economic policies: how does corporate income taxation in general, and corporate income tax progressivity in particular, affect the creation of new firms? Reduced progressivity and flat-rate taxes are in vogue. By 2009, 22 countries were operating flat-rate income tax systems, as were 7 US states and 14 Swiss cantons (for corporate income only). Tax reform proposals in the spirit of the "flat tax" model typically aim to reduce three parameters: the average tax burden, the progressivity of the tax schedule, and the complexity of the tax code. In joint work, Marius Brülhart and I explore the implications of changes in these three parameters for entrepreneurial activity, measured by counts of firm births in a panel of Swiss municipalities. Our results show that lower average tax rates and reduced complexity of the tax code promote firm births. Controlling for these effects, reduced progressivity inhibits firm births. Our reading of these results is that tax progressivity has an insurance effect that facilitates entrepreneurial risk taking. The positive effects of lower tax levels and reduced complexity are estimated to be significantly stronger than the negative effect of reduced progressivity. To the extent that firm births reflect desirable entrepreneurial dynamism, it is not the flattening of tax schedules that is key to successful tax reforms, but the lowering of average tax burdens and the simplification of tax codes. Flatness per se is of secondary importance and even appears to be detrimental to firm births.

The second part of my thesis, which corresponds to the second and third chapters, concentrates on how economic policies are formed. By the nature of the analysis, these two chapters draw on a broader literature than the first chapter. Both economists and political scientists have done extensive research on how economic policies are formed, and researchers in both disciplines have recognised the importance of special interest groups trying to influence policy-making through various channels. In general, economists base their analysis on a formal, microeconomically founded approach, while abstracting from institutional details. In contrast, political scientists' frameworks are generally richer in institutional features but lack the theoretical rigour of economists' approaches. I start from the economist's point of view, but I borrow as much as possible from the findings of political science to gain a better understanding of how economic policies are formed in reality.
In the second chapter, I take a theoretical approach and focus on the institutional policy framework to explore how interactions between different political institutions affect the outcome of trade policy in the presence of lobbying by special interest groups. Standard political economy theory treats the government as a single institutional actor which sets tariffs by trading off social welfare against contributions from special interest groups seeking industry-specific protection from imports. However, these models lack important (institutional) features of reality. That is why, in my model, I split the government into a legislative and an executive branch, both of which can be lobbied by special interest groups. Furthermore, the legislature has the option to delegate its trade policy authority to the executive, and I allow the executive to compensate the legislature in exchange for delegation. Despite ample anecdotal evidence, bargaining over the delegation of trade policy authority has not yet been formally modelled in the literature. I show that delegation has an impact on policy formation in that it leads to lower equilibrium tariffs compared to a standard model without delegation. I also show that delegation will only take place if the lobby is not strong enough to prevent it. Furthermore, the option to delegate increases the bargaining power of the legislature at the expense of the lobbies. The findings of this model can therefore shed light on why the U.S. Congress often delegates trade policy authority to the executive.

In the final chapter of my thesis, my coauthor, Antonio Fidalgo, and I take a narrower approach and focus on policy-making at the level of the individual politician, exploring how connections to private firms and networks within parliament affect individual politicians' decision-making. Theories in the spirit of the model of the second chapter show how campaign contributions from lobbies to politicians can influence economic policies, and an abundant empirical literature analyses ties between firms and politicians based on campaign contributions. However, the evidence on the impact of campaign contributions is mixed, at best. In our paper, we analyse an alternative channel of influence in the shape of personal connections between politicians and firms through board membership. We identify a direct effect of board membership on individual politicians' voting behaviour and an indirect leverage effect when politicians with board connections influence non-connected peers. We assess the importance of these two effects using a vote in the Swiss parliament on a government bailout of the national airline, Swissair, in 2001, which serves as a natural experiment. We find that both the direct effect of connections to firms and the indirect leverage effect had a strong and positive impact on the probability that a politician supported the government bailout.

Relevance: 90.00%

Abstract:

The dynamical analysis of large biological regulatory networks requires the development of scalable methods for mathematical modeling. Following the approach initially introduced by Thomas, we formalize the interactions between the components of a network in terms of discrete variables, functions, and parameters. Model simulations result in directed graphs, called state transition graphs. We are particularly interested in reachability properties and asymptotic behaviors, which correspond to terminal strongly connected components (or "attractors") in the state transition graph. A well-known problem is the exponential increase of the size of state transition graphs with the number of network components, in particular when using the biologically realistic asynchronous updating assumption. To address this problem, we have developed several complementary methods enabling the analysis of the behavior of large and complex logical models: (i) the definition of transition priority classes to simplify the dynamics; (ii) a model reduction method preserving essential dynamical properties; and (iii) a novel algorithm to compact state transition graphs and directly generate compressed representations, emphasizing relevant transient and asymptotic dynamical properties. The power of an approach combining these different methods is demonstrated by applying them to a recent multilevel logical model of the network controlling the CD4+ T helper cell response to antigen presentation and to a dozen cytokines. This model accounts for the differentiation of canonical Th1 and Th2 lymphocytes, as well as of inflammatory Th17 and regulatory T cells, along with many hybrid subtypes. All these methods have been implemented in the software GINsim, which enables the definition, analysis, and simulation of logical regulatory graphs.
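As an illustration of the state-transition-graph construction under asynchronous updating, here is a minimal sketch on a toy three-component Boolean network. The network and its rules are hypothetical, and this is not GINsim's implementation; it only shows why each state gets one successor per component that can change, and how stable states are detected.

```python
from itertools import product

# Toy three-component Boolean network (hypothetical; not the CD4+ T cell model).
# Each rule maps the current state vector to the target value of one component.
rules = {
    0: lambda s: s[2],                 # x0 is activated by x2
    1: lambda s: s[0] and not s[2],    # x1 requires x0 and is repressed by x2
    2: lambda s: not s[1],             # x2 is repressed by x1
}

def async_successors(state):
    """Asynchronous updating: every component whose target value differs
    from its current value yields one successor where only it changes."""
    for i, rule in rules.items():
        target = int(rule(state))
        if target != state[i]:
            succ = list(state)
            succ[i] = target
            yield tuple(succ)

# Build the full state transition graph over all 2^3 states.
stg = {s: list(async_successors(s)) for s in product((0, 1), repeat=3)}

# Stable states (attractors of size one) have no outgoing transition;
# larger attractors correspond to terminal strongly connected components.
print([s for s, succs in stg.items() if not succs])
```

Every state has one outgoing edge per updatable component, which is precisely why asynchronous state transition graphs grow so quickly with the number of network components.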

Relevance: 90.00%

Abstract:

AIM: The clinical relevance of sentinel lymph node (SLN) analysis was evaluated prospectively and compared with other known risk factors of relapse in early-stage melanoma. METHODS: Surgery was guided by lymphoscintigraphy, blue dye, and gamma probe detection. SLNs were analysed by haematoxylin-eosin (HE) histochemistry and multimarker immunohistochemistry (IHC). Disease-free survival (DFS) was evaluated with Kaplan-Meier plots according to different parameters and with Cox analyses. RESULTS: From 210 patients, a total of 381 SLNs were excised. Lymphoscintigraphy identified all excised SLNs, with only 2 false-positive lymphatic lakes. Fifty patients (24%) had tumour-positive SLNs. With a mean follow-up of 31.3 months, 29 tumour recurrences were observed: 19 (38%) among the 50 SLN-positive and 10 (6%) among the 160 SLN-negative patients. Strong predictive factors for early relapse (p < 0.0005) were SLN positivity and a high Breslow index. CONCLUSION: SLN tumour positivity is an independent high-risk factor for early relapse, with a higher power of discrimination than the Breslow index.
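As an illustration of the survival analysis described (Kaplan-Meier DFS curves stratified by SLN status, plus a Cox model over the two risk factors), here is a minimal sketch using the Python lifelines library; the data frame and all its values are hypothetical stand-ins, not the study data.

```python
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

# Hypothetical data: one row per patient, follow-up in months, relapse
# indicator, and the two predictors highlighted in the abstract.
df = pd.DataFrame({
    "months":       [31, 12, 45, 8, 60, 24, 18, 36],
    "relapse":      [0, 1, 0, 1, 0, 1, 1, 0],
    "sln_positive": [0, 1, 0, 0, 1, 1, 0, 1],
    "breslow_mm":   [0.8, 3.2, 1.1, 1.0, 2.8, 4.0, 2.5, 0.9],
})

# Kaplan-Meier disease-free survival, stratified by SLN status.
km = KaplanMeierFitter()
for status, grp in df.groupby("sln_positive"):
    km.fit(grp["months"], event_observed=grp["relapse"],
           label=f"SLN-positive = {status}")
    print(km.survival_function_.tail(1))

# Cox proportional hazards model with both risk factors.
cox = CoxPHFitter()
cox.fit(df, duration_col="months", event_col="relapse")
cox.print_summary()
```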

Relevance: 90.00%

Abstract:

This article addresses the normative dilemma located within the application of 'securitization' as a method of understanding the social construction of threats and security policies. Securitization as a theoretical and practical undertaking is being increasingly used by scholars and practitioners. This article aims to offer those wishing to engage with securitization an alternative application of the theory, one that is sensitive to, and self-reflective of, the possible normative consequences of its employment. It argues that discussing and analyzing securitization processes has normative implications, understood here as the negative securitization of a referent. The negative securitization of a referent is asserted to be carried out through the unchallenged analysis of securitization processes which have emerged through relations of exclusion and power. The article then offers a critical understanding and application of securitization studies as a way of overcoming the identified normative dilemma. First, it examines how the Copenhagen School's formulation of securitization theory gives rise to a normative dilemma, which is situated in the performative and symbolic power of security as a political invocation and theoretical concept. Second, it evaluates previous attempts to overcome the normative dilemma of securitization studies, outlining the obstacles that each individual proposal faces. Third, it argues that the normative dilemma of applying securitization can be avoided, first, by deconstructing the institutional power of security actors and dominant security subjectivities and, second, by addressing countering or alternative approaches to security and incorporating different security subjectivities. Examples of the securitization of international terrorism and immigration are prominent throughout.

Relevance: 90.00%

Abstract:

In this study, we compared a selective stop task (transition from bimanual in-phase tapping to unimanual index-finger tapping) with a non-selective stop task (stopping bimanual in-phase tapping altogether) and with a switching task (transition from in-phase to anti-phase bimanual tapping). The aim was twofold: (1) to identify the electro-cortical correlates of selective and non-selective inhibition processes, and (2) to investigate which type of inhibition, selective or not, is required when switching between two bimanual motor patterns. The results revealed that all tasks led to enhanced activation (alpha power) of the left sensorimotor and posterior regions, which seems to reflect an overall effort to stop the preferred bimanual in-phase tendency. Each task implied specific functional connectivity reorganizations (beta coherence) between cerebral motor areas, probably reflecting engagement in a new unimanual or bimanual movement.
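For readers unfamiliar with the two EEG measures named here, this is a minimal sketch of how band-limited alpha power and beta-band coherence can be computed with SciPy; the channels, sampling rate, and data are hypothetical stand-ins, not the study's pipeline.

```python
import numpy as np
from scipy.signal import welch, coherence

fs = 256                                  # hypothetical sampling rate (Hz)
rng = np.random.default_rng(0)
c3 = rng.standard_normal(10 * fs)         # stand-ins for two EEG channels
c4 = rng.standard_normal(10 * fs)

def band_power(x, fs, lo, hi):
    """Mean Welch power spectral density within a frequency band."""
    f, pxx = welch(x, fs=fs, nperseg=2 * fs)
    return pxx[(f >= lo) & (f <= hi)].mean()

alpha = band_power(c3, fs, 8, 12)          # alpha band, 8-12 Hz

# Magnitude-squared coherence between channels in the beta band (13-30 Hz).
f, cxy = coherence(c3, c4, fs=fs, nperseg=2 * fs)
beta_coh = cxy[(f >= 13) & (f <= 30)].mean()
print(alpha, beta_coh)
```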

Relevance: 90.00%

Abstract:

Plant membrane compartments and trafficking pathways are highly complex and are often distinct from those of animals and fungi. Progress has been made in defining trafficking in plants using transient expression systems. However, many processes require a precise understanding of plant membrane trafficking in a developmental context and in diverse, specialized cell types. These include defense responses to pathogens, regulation of transporter accumulation in plant nutrition, and polar auxin transport in development. In all of these cases a central role is played by the endosomal membrane system, which is, however, the most divergent and ill-defined aspect of plant cell compartmentation. We have designed a new vector series and have generated a large number of stably transformed plants expressing membrane protein fusions to spectrally distinct fluorescent tags. We selected lines with distinct subcellular localization patterns and stable, non-toxic expression. We demonstrate the power of this multicolor 'Wave' marker set for rapid, combinatorial analysis of plant cell membrane compartments, both in live imaging and in immunoelectron microscopy. Among other findings, our systematic co-localization analysis revealed that a class of plant Rab1 homologs has a much broader localization than previously assumed, extending to trans-Golgi/endosomal compartments. Constructs that can be transformed into any genetic background or species, as well as seeds from transgenic Arabidopsis plants, will be freely available and will promote rapid progress in diverse areas of plant cell biology.

Relevance: 90.00%

Abstract:

Meta-analysis of prospective studies shows that quantitative ultrasound of the heel using validated devices predicts the risk of different types of fracture, with similar performance across devices and in elderly men and women. These predictions are independent of the risk estimates from hip DXA measures.

Introduction: Clinical utilisation of heel quantitative ultrasound (QUS) depends on its power to predict clinical fractures. This is particularly important in settings that have no access to DXA-derived bone density measurements. We aimed to assess the predictive power of heel QUS for fractures using a meta-analysis approach.

Methods: We conducted an inverse-variance random-effects meta-analysis of prospective studies with heel QUS measures at baseline and fracture outcomes in their follow-up. Relative risks (RR) per standard deviation (SD) of different QUS parameters (broadband ultrasound attenuation [BUA], speed of sound [SOS], stiffness index [SI], and quantitative ultrasound index [QUI]) for various fracture outcomes (hip, vertebral, any clinical, any osteoporotic, and major osteoporotic fractures) were reported based on the study questions.

Results: Twenty-one studies including 55,164 women and 13,742 men were included in the meta-analysis, with a total follow-up of 279,124 person-years. All four QUS parameters were associated with the risk of different fractures. For instance, the RR of hip fracture per 1 SD decrease was 1.69 (95% CI 1.43-2.00) for BUA, 1.96 (95% CI 1.64-2.34) for SOS, 2.26 (95% CI 1.71-2.99) for SI, and 1.99 (95% CI 1.49-2.67) for QUI. There was marked heterogeneity among studies on hip and any clinical fractures, but no evidence of publication bias amongst them. Validated devices from different manufacturers predicted fracture risks with similar performance (meta-regression p values > 0.05 for difference of devices). QUS measures predicted fracture with similar performance in men and women. Meta-analysis of studies with QUS measures adjusted for hip BMD showed a significant and independent association with fracture risk (RR/SD for BUA = 1.34 [95% CI 1.22-1.49]).

Conclusions: This study confirms that heel QUS, using validated devices, predicts the risk of different fracture outcomes in elderly men and women. Further research is needed for more widespread utilisation of heel QUS in clinical settings across the world.
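As an illustration of inverse-variance random-effects pooling on the log-RR scale, here is a minimal sketch using the DerSimonian-Laird estimator (a common choice; the abstract does not name the estimator used), with hypothetical study values.

```python
import numpy as np

# Hypothetical per-study relative risks per SD decrease in BUA, with 95% CIs.
rr    = np.array([1.5, 1.8, 1.6, 2.1])
ci_lo = np.array([1.2, 1.3, 1.1, 1.5])
ci_hi = np.array([1.9, 2.5, 2.3, 2.9])

# Pool on the log scale; back out standard errors from the CI width.
y  = np.log(rr)
se = (np.log(ci_hi) - np.log(ci_lo)) / (2 * 1.96)
w  = 1 / se**2                                   # inverse-variance weights

# DerSimonian-Laird between-study variance tau^2.
y_fe = np.sum(w * y) / np.sum(w)
q    = np.sum(w * (y - y_fe) ** 2)
k    = len(y)
tau2 = max(0.0, (q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

# Random-effects pooled RR with its 95% CI.
w_re  = 1 / (se**2 + tau2)
y_re  = np.sum(w_re * y) / np.sum(w_re)
se_re = np.sqrt(1 / np.sum(w_re))
print(np.exp([y_re, y_re - 1.96 * se_re, y_re + 1.96 * se_re]))
```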

Relevance: 90.00%

Abstract:

In this paper the two main drawbacks of heat balance integral methods are examined. First, we investigate the choice of approximating function. For a standard polynomial form it is shown that combining the Heat Balance and Refined Integral methods to determine the power of the highest-order term will lead either to the same accuracy as standard methods or, more often, to greatly improved accuracy. Second, we examine thermal problems with a time-dependent boundary condition. In doing so we develop a logarithmic approximating function. This new function allows us to model moving peaks in the temperature profile, a feature that previous heat balance methods cannot capture. If the boundary temperature varies so that at some time t > 0 it equals the far-field temperature, then standard methods predict that the temperature is everywhere at this constant value; the new method predicts the correct behaviour. It is also shown that this function provides even more accurate results, when coupled with the new combined integral method (CIM), than the polynomial profile. The analysis focuses primarily on a specified constant boundary temperature and is then extended to constant-flux, Newton cooling, and time-dependent boundary conditions.
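To make the first point concrete, here is a sketch of how combining the two integral methods can fix the exponent, on the classical half-space problem with constant boundary temperature (a simplified setting, not the paper's full analysis). With the polynomial profile T ≈ (1 - x/δ(t))^n over a penetration depth δ(t), and T_t = αT_xx with T(0,t) = 1 and T(δ,t) = T_x(δ,t) = 0:

```latex
\begin{align*}
\text{HBM (single integration):}\quad
  \frac{d}{dt}\int_0^{\delta} T\,dx = -\alpha\,T_x(0,t)
  \;&\Rightarrow\; \delta^2 = 2\,\alpha\,n(n+1)\,t,\\
\text{RIM (double integration):}\quad
  \frac{d}{dt}\int_0^{\delta} x\,T\,dx = \alpha\,T(0,t)
  \;&\Rightarrow\; \delta^2 = \alpha\,(n+1)(n+2)\,t.
\end{align*}
% Demanding both at once fixes the exponent:
%   2n(n+1) = (n+1)(n+2)  =>  n = 2  for this boundary condition.
```

Requiring both integral relations to hold simultaneously determines n instead of postulating it, which is the sense in which the combination fixes "the power of the highest order term"; other boundary conditions yield other exponents.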

Relevance: 90.00%

Abstract:

Errors in the inferred multiple sequence alignment may lead to false prediction of positive selection. Recently, methods for detecting unreliable alignment regions were developed and shown to accurately identify incorrectly aligned regions. While removing unreliable alignment regions is expected to increase the accuracy of positive selection inference, such filtering may also significantly decrease the power of the test: positively selected regions are fast-evolving, and these are often precisely the regions that are difficult to align. Here, we used realistic simulations that mimic the sequence evolution of HIV-1 genes to test the hypothesis that the performance of positive selection inference using codon models can be improved by removing unreliable alignment regions. Our study shows that the benefit of removing unreliable regions exceeds the loss of power due to the removal of some true positively selected sites.

Relevance: 90.00%

Abstract:

(Book summary) She kills and destroys. She causes illness and disaster. The wild goddess evokes fear and terror. People worship her with blood sacrifices and alcohol in order to appease her rage, but also in order to participate in her power, for she is at once a force of destruction and a force of regeneration, of life, and of sexuality. Her creative violence reflects the ambivalent power of nature. The idea of frightening goddesses is preserved in regionally different forms throughout South Asia. The Institute for the Science of Religions, University of Berne, and the Museum of Anthropology of the University of Zurich coordinated a symposium on wild goddesses in India and Nepal. The papers and reports on ongoing research presented at this symposium are published in this volume.

Relevance: 90.00%

Abstract:

Given the urgent need for a new paradigm in wireless digital transmission allowing for higher bit rates, lower latency, and tighter delay constraints, it has been proposed to investigate the fundamental building blocks that, at the circuit/device level, will drive the change towards a more efficient network architecture with higher capacity, higher bandwidth, and a more satisfactory end-user experience. At the core of each transceiver there are inherently analog devices capable of providing the carrier signal: the oscillators. It is strongly believed that many limitations in today's communication protocols could be relieved by permitting radio transmission at high carrier frequencies and by allowing some degree of reconfigurability. This led us to study distributed oscillator architectures which work in the microwave range and possess wideband tuning capability. As microwave oscillators are essentially nonlinear devices, a full nonlinear analysis, synthesis, and optimization had to be considered for their implementation. Consequently, the most widely used nonlinear numerical techniques in commercial EDA software are reviewed. An application of these techniques is shown for a system of three coupled oscillators (a "triple-push" oscillator), in which the stability of the various oscillating modes is studied. Provided that a certain phase distribution is maintained among the oscillating elements, this topology permits a rise in the output power of the third harmonic; nevertheless, due to circuit symmetry, "unwanted" oscillating modes coexist with the intended one. Starting with the necessary background on distributed amplification and distributed oscillator theory, the design of a four-stage reverse-mode distributed voltage-controlled oscillator (DVCO) using lumped elements is presented. All the design steps are reported, and for the first time a method for an optimized design with reduced variations in the output power is presented. Ongoing work is devoted to modelling a wideband DVCO and to implementing a frequency divider.
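The role of the phase distribution can be made explicit with the standard triple-push calculation (a sketch assuming three identical oscillators locked at relative phases 0, 2π/3, and 4π/3; this is the textbook argument, not a derivation specific to this work). Summing the n-th harmonic of the three outputs:

```latex
\[
\sum_{k=0}^{2} e^{\,jn\left(\omega t + \frac{2\pi k}{3}\right)}
  = e^{\,jn\omega t}\sum_{k=0}^{2} e^{\,j\,2\pi nk/3}
  = \begin{cases}
      3\,e^{\,jn\omega t}, & n \equiv 0 \pmod{3},\\[2pt]
      0, & \text{otherwise.}
    \end{cases}
\]
```

The fundamental and second harmonic thus cancel at the common output node while the third harmonic adds coherently, which is why maintaining this phase distribution raises the third-harmonic output power, and why modes with other phase distributions are "unwanted".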

Relevance: 90.00%

Abstract:

Little is known about the transmission and drug resistance of tuberculosis (TB) in Bauru, State of São Paulo. The objective of this study was to evaluate risk factors for the transmission of Mycobacterium tuberculosis strains in this area. Strains were collected from patients seen at outpatient services in the region; susceptibility to the main first-line antibiotics was determined and fingerprinting performed. A total of 57 strains were submitted to susceptibility testing: 23 (42.6%) were resistant to at least one drug, while 3 (13%) were resistant to both rifampicin and isoniazid. Resistant strains had been isolated from patients who had not (n = 13) or had (n = 9) previously received anti-TB treatment, demonstrating a worryingly high level of primary resistance in the context of the study. All strains were submitted to IS6110 restriction fragment length polymorphism (IS6110-RFLP) typing and double repetitive element PCR (DRE-PCR). Using IS6110-RFLP, 26.3% of the strains were clustered, and one cluster of 3 patients included 2 HIV-infected individuals who had been hospitalized together for 16 days; clustering of strains from hospital patients was, however, not higher than that of patients seen at health posts. According to DRE-PCR, 55.3% belonged to a cluster, confirming the greater discriminatory power of IS6110-RFLP compared to DRE-PCR, which should therefore be used only as a screening procedure. No clinical, epidemiological, or microbiological characteristics were associated with clustering, so risk factors for the transmission of TB could not be defined in the present study.
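The abstract compares the two typing methods by their discriminatory power; one standard way to quantify this, though not named in the abstract, is the Hunter-Gaston discriminatory index. A minimal sketch with hypothetical type assignments:

```python
from collections import Counter

def hunter_gaston(type_labels):
    """Hunter-Gaston discriminatory index: probability that two strains
    sampled at random (without replacement) receive different types."""
    n = len(type_labels)
    counts = Counter(type_labels).values()
    return 1 - sum(c * (c - 1) for c in counts) / (n * (n - 1))

# Hypothetical typing of the same 10 strains by two methods: the more
# discriminatory method splits the strains into more, smaller types.
rflp = ["A", "B", "C", "D", "E", "E", "F", "G", "H", "I"]
dre  = ["a", "a", "a", "b", "b", "b", "c", "c", "d", "d"]
print(hunter_gaston(rflp))   # closer to 1: higher discrimination
print(hunter_gaston(dre))    # lower: more strains share a type
```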

Relevance: 90.00%

Abstract:

Usually, the differentiation of inks on questioned documents is carried out by optical methods and thin-layer chromatography (TLC). Spectrometric methods have also been proposed in the forensic literature for the analysis of dyes. Among these techniques, laser desorption/ionization mass spectrometry (LDI-MS) has demonstrated great versatility thanks to its sensitivity to blue ballpoint ink dyes and minimal sample destruction. Previous research concentrated mostly on LDI-MS in positive mode and showed that this analytical tool offers higher discrimination power than high-performance TLC (HPTLC) for the differentiation of blue ballpoint inks. Although LDI-MS in negative mode has already been applied in numerous forensic domains, such as the study of works of art, automotive paints, or rollerball pens, its potential for the discrimination of ballpoint pens had never been studied. The aim of the present paper is therefore to evaluate this potential for blue ballpoint inks. After optimization of the method, ink entries from 33 blue ballpoint pens were analyzed directly on paper in both positive and negative modes by LDI-MS. Several cationic and anionic ink components were identified, and pens were classified and compared according to their formulations. Results show that the additional information provided by anionic dyes and pigments significantly increases the discrimination power of the positive mode. In fact, it was demonstrated that the classifications obtained by the two modes were, to some extent, complementary (i.e., inks with specific cationic dyes did not necessarily contain the same anionic components).

Relevance: 90.00%

Abstract:

Samples containing highly unbalanced DNA mixtures from two individuals commonly occur, both in forensic mixed stains and in peripheral-blood DNA microchimerism induced by pregnancy or following organ transplant. Because of PCR amplification bias, the genetic identification of DNA that contributes trace amounts to a mixed sample represents a tremendous challenge. This means that standard genetic markers, namely microsatellites, also referred to as short tandem repeats (STRs), and single-nucleotide polymorphisms (SNPs), have limited power in addressing common questions of forensic and medical genetics. To address this issue, we developed a molecular marker, named DIP-STR, that relies on pairing deletion-insertion polymorphisms (DIPs) with STRs. This novel analytical approach allows for the unambiguous genotyping of a minor component in the presence of a major component, where DIP-STR genotypes of the minor contributor were successfully recovered at ratios up to 1:1,000. The compound nature of this marker generates a high level of polymorphism that is suitable for identity testing. Here, we demonstrate the power of the DIP-STR approach on an initial set of nine markers surveyed in a Swiss population. Finally, we discuss the limitations and potential applications of our new system, including preliminary tests on clinical samples and estimates of their performance on simulated DNA mixtures.
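A small sketch of why a compound marker is more polymorphic than its parts: under Hardy-Weinberg assumptions, the single-locus random match probability falls as alleles are subdivided. The allele frequencies below are hypothetical, and the independence of DIP and STR alleles is assumed purely for illustration (in reality the two are tightly linked, which is the point of the marker, and haplotype frequencies must be estimated directly from population data).

```python
from itertools import combinations_with_replacement

def match_probability(freqs):
    """Single-locus random match probability under Hardy-Weinberg:
    the chance that two unrelated individuals share a genotype."""
    pm = 0.0
    for a, b in combinations_with_replacement(range(len(freqs)), 2):
        g = freqs[a] ** 2 if a == b else 2 * freqs[a] * freqs[b]
        pm += g ** 2
    return pm

# Hypothetical allele frequencies.
dip  = [0.6, 0.4]                         # a two-allele DIP
strs = [0.30, 0.25, 0.20, 0.15, 0.10]     # a five-allele STR

# Compound DIP-STR alleles, treating the two parts as independent
# for illustration only.
dip_str = [p * q for p in dip for q in strs]

print(match_probability(strs))     # STR alone
print(match_probability(dip_str))  # compound marker: smaller, i.e. rarer genotypes
```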

Relevance: 90.00%

Abstract:

Laser desorption/ionisation mass spectrometry (LDI-MS) has been shown to be an excellent analytical method for the forensic analysis of inks on a questioned document. The ink can be analysed directly on its substrate (paper), which makes the analysis fast, since sample preparation is kept to a minimum and, more importantly, damage to the document is minimised. LDI-MS has also previously been reported to provide a high power of discrimination in the statistical comparison of ink samples and has the potential to be introduced as part of routine ink analysis. This paper examines the methodology further and statistically evaluates the reproducibility of black gel pen ink LDI-MS spectra and the influence of paper, by comparing spectra of three different black gel pen inks on three different paper substrates. Although generally minimal, the influences of sample homogeneity and paper type were found to be sample-dependent. This should be taken into account to avoid the risk of false differentiation of black gel pen ink samples. Other statistical approaches, such as principal component analysis (PCA), proved to be a good alternative to correlation coefficients for the comparison of whole mass spectra.
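As an illustration of the two comparison approaches mentioned (correlation coefficients versus PCA on whole spectra), here is a minimal sketch on synthetic stand-in spectra; it is not the paper's protocol, and the ink labels and data are hypothetical.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)

# Synthetic stand-ins for baseline-corrected spectra: three inks, four
# replicate spectra each, binned onto 500 common m/z values.
base = rng.random((3, 500))
spectra = np.vstack([b + 0.05 * rng.standard_normal((4, 500)) for b in base])
labels = np.repeat(["ink1", "ink2", "ink3"], 4)

# Approach 1: pairwise Pearson correlation between whole spectra.
corr = np.corrcoef(spectra)
print(corr[0, 1], corr[0, 4])     # same-ink pair vs. different-ink pair

# Approach 2: PCA scores; replicates of one ink should group together.
scores = PCA(n_components=2).fit_transform(spectra)
for lab, (pc1, pc2) in zip(labels, scores):
    print(lab, round(pc1, 2), round(pc2, 2))
```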