873 results for exploit
Abstract:
This paper proposes an argument that explains incumbency advantage without resorting to the collective irresponsibility of legislatures. For that purpose, we exploit the informational value of incumbency: incumbency gives voters information about governing politicians that is not available for challengers. Because there are many reasons for high reelection rates other than incumbency status, we propose a measure of incumbency advantage that improves on the use of pure reelection success. We also study the relationship between incumbency advantage and ideological and selection biases. An important implication of our analysis is that the literature linking incumbency and legislature irresponsibility most likely overestimates the latter.
Abstract:
We study relative performance evaluation in executive compensation when executives have private information about their ability. We assume that the joint distribution of an individual firm's profit and market movements depends on the ability of the executive that runs the firm. In the equilibrium of the executive labor market, compensation schemes exploit this fact to sort executives of different abilities. This implies that executive compensation is increasing in own performance, but may also be increasing in industry performance: a sharp departure from standard relative performance evaluation. This result provides an explanation for the scarcity of relative performance considerations in executive compensation documented by the empirical literature.
Abstract:
As the prevalence of smoking has decreased to below 20%, health practitioners' interest has shifted towards the prevalence of obesity, and reducing it is one of the major health challenges in decades to come. In this paper we study the impact that the final product of the anti-smoking campaign, that is, smokers quitting the habit, had on average weight in the population. To these ends, we use data from the Behavioral Risk Factor Surveillance System, a large series of independent representative cross-sectional surveys. We construct a synthetic panel that allows us to control for unobserved heterogeneity, and we exploit the exogenous changes in taxes and regulations to instrument the endogenous decision to give up the habit of smoking. Our estimates are very close to estimates issued in the 90s by the US Department of Health, and indicate that a 10% decrease in the incidence of smoking leads to an average weight increase of 2.2 to 3 pounds, depending on the choice of specification. In addition, we find evidence that the effect overshoots in the short run, although a significant part remains even after two years. However, when we split the sample between men and women, we only find a significant effect for men. Finally, the implicit elasticity of quitting smoking to the probability of becoming obese is calculated at 0.58. This implies that the net benefit from reducing the incidence of smoking by 1% is positive even though the cost to society is $0.6 billion.
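The instrumental-variables strategy described in this abstract (instrumenting the endogenous quitting decision with exogenous tax changes) can be sketched in a few lines. Everything below is simulated and hypothetical: variable names, coefficients, and the assumed true effect of 2.5 are illustrative inventions, not the paper's data or specification.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20_000

# Simulated illustration (all names and numbers hypothetical): taxes shift
# quitting but not weight directly; u is unobserved heterogeneity that
# confounds the naive comparison.
tax = rng.normal(size=n)
u = rng.normal(size=n)
quit_smoking = (0.8 * tax + u + rng.normal(size=n) > 0).astype(float)
weight = 2.5 * quit_smoking + 3.0 * u + rng.normal(size=n)  # true effect: 2.5

def slope_ols(y, x):
    """Slope from an OLS regression of y on a constant and x."""
    X = np.column_stack([np.ones_like(x), x])
    return np.linalg.lstsq(X, y, rcond=None)[0][1]

def slope_2sls(y, x, z):
    """Two-stage least squares with a single instrument z for regressor x."""
    Z = np.column_stack([np.ones_like(z), z])
    x_hat = Z @ np.linalg.lstsq(Z, x, rcond=None)[0]  # first stage: fit x on z
    return slope_ols(y, x_hat)                        # second stage: y on fitted x

ols = slope_ols(weight, quit_smoking)
iv = slope_2sls(weight, quit_smoking, tax)
print(f"OLS (confounded): {ols:.2f}   2SLS: {iv:.2f}   true effect: 2.50")
```

The OLS slope picks up the confounding through `u`, while the 2SLS slope recovers something close to the assumed true effect.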
Abstract:
A class of composite estimators of small area quantities that exploit spatial (distance-related) similarity is derived. It is based on a distribution-free model for the areas, but the estimators are aimed to have optimal design-based properties. Composition is also applied to estimate some of the global parameters on which the small area estimators depend. It is shown that the commonly adopted assumption of random effects is not necessary for exploiting the similarity of the districts (borrowing strength across the districts). The methods are applied in the estimation of the mean household sizes and the proportions of single-member households in the counties (comarcas) of Catalonia. The simplest version of the estimators is more efficient than the established alternatives, even though the extent of spatial similarity is quite modest.
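The composition idea, weighting each area's noisy direct estimate against a synthetic estimate borrowed from similar areas, can be sketched generically. The numbers below are invented and the weighting rule is a standard shrinkage form for illustration, not the paper's estimator.

```python
import numpy as np

# Hypothetical direct survey estimates for five small areas (e.g. mean
# household size), with their sampling variances; small samples mean
# large variances for some areas.
direct = np.array([3.1, 2.4, 2.9, 3.6, 2.2])
var_direct = np.array([0.30, 0.50, 0.10, 0.40, 0.25])

# A synthetic estimate pooled across similar (e.g. neighbouring) areas.
synthetic = direct.mean()

# Composite estimator: each area's estimate is pulled towards the
# synthetic one, more strongly when its direct estimate is noisier.
between_var = direct.var(ddof=1)            # crude between-area spread
w = between_var / (between_var + var_direct)
composite = w * direct + (1 - w) * synthetic

print("weights:  ", np.round(w, 2))
print("composite:", np.round(composite, 2))
```

Each composite value lies between the area's direct estimate and the synthetic estimate, which is the "borrowing strength" effect in miniature.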
Abstract:
Labor market regulations have often been blamed for high and persistent unemployment in Europe, but evidence on their impact remains mixed. More recently, attention has turned to the impact of product market regulations on employment growth. This paper analyzes how labor and product market regulations interact to affect turnover and employment. We present a matching model which illustrates how barriers to entry in the product market mitigate the impact of labor market deregulation. We then use the Italian Social Security employer-employee panel to study the interaction between barriers to entry and dismissal costs. We exploit the fact that costs for unjust dismissals in Italy increased for firms below 15 employees relative to bigger firms after 1990. We find that the increase in dismissal costs after 1990 decreased accessions and separations in small relative to big firms, especially for women. Moreover, consistent with our model, we find evidence that the increase in dismissal costs had smaller effects on turnover for women in sectors faced with strict product market regulations.
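The identification strategy here is a difference-in-differences comparison of small versus large firms before and after the 1990 reform. A minimal 2x2 sketch on simulated data (all magnitudes, including the assumed -0.02 effect on separation rates, are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 10_000

# Hypothetical firm-level observations around the 1990 Italian reform:
# dismissal costs rose for firms below 15 employees relative to larger ones.
small = rng.uniform(size=n) < 0.5     # firm below the 15-employee cutoff
post = rng.uniform(size=n) < 0.5      # observed after 1990
# Assume the separation rate falls by 2 points in small firms after 1990.
sep_rate = 0.20 - 0.02 * (small & post) + rng.normal(0, 0.05, size=n)

def did(y, treat, post):
    """Classic 2x2 difference-in-differences on group means."""
    return ((y[treat & post].mean() - y[treat & ~post].mean())
            - (y[~treat & post].mean() - y[~treat & ~post].mean()))

effect = did(sep_rate, small, post)
print(f"DiD estimate of the reform effect: {effect:.4f}  (assumed true: -0.02)")
```

The double difference removes both the permanent small/large gap and the common post-1990 shift, leaving only the treatment effect.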
Abstract:
This paper reports an analysis of the evolution of equity in access to health care in Spain over the period 1987-2001, a time span covering the development of the modern Spanish National Health System. Our measures of access are the probabilities of visiting a doctor, using emergency services and being hospitalised. For these three measures we obtain indices of horizontal inequity from microeconometric models of utilization that exploit the individual information in the Spanish National Health Surveys of 1987 and 2001. We find that by 2001 the system had improved in the sense that differences in income no longer led to differences in access given the same level of need. However, holding private health insurance does lead to differences in access given the same level of need, and its contribution to inequity has increased over time, both because insurance is more concentrated among the rich and because the elasticity of utilization for the three services has increased too.
Abstract:
When the behaviour of a specific hypothesis test statistic is studied by a Monte Carlo experiment, the usual way to describe its quality is by giving the empirical level of the test. As an alternative to this procedure, we use the empirical distribution of the obtained p-values and exploit its information both graphically and numerically.
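The idea is easy to demonstrate: under the null hypothesis p-values should be Uniform(0, 1), so the whole empirical distribution is informative, not just the rejection rate at one threshold. A small sketch (test choice, sample sizes, and repetition count are arbitrary illustrations):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Monte Carlo: p-values of a two-sided one-sample t-test when H0 is true.
n_rep, n_obs = 2000, 30
pvals = np.array([
    stats.ttest_1samp(rng.normal(size=n_obs), 0.0).pvalue
    for _ in range(n_rep)
])

# The usual single summary: empirical level at alpha = 0.05.
level = np.mean(pvals < 0.05)

# The fuller picture: compare the empirical CDF of the p-values to the
# Uniform(0, 1) CDF across the whole unit interval.
grid = np.linspace(0.05, 0.95, 10)
ecdf = np.array([np.mean(pvals <= g) for g in grid])

print(f"empirical level at 5%: {level:.3f}")
print("ECDF minus Uniform CDF at deciles:", np.round(ecdf - grid, 3))
```

A well-behaved test leaves all the ECDF deviations near zero; systematic departures reveal over- or under-rejection at every level at once.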
Abstract:
We present a model of conglomeration motivated by technology synergies and strategic reductions in variable costs in the face of competitive pressures. The resulting firm integration is neither horizontal nor vertical but rather congeneric integration of firms in related industries. We endogenize the industrial conglomeration structure and examine the effects of competition between conglomerates, and between a conglomerate and independent firms. We show that there is an equilibrium synergy trap in which conglomerates are formed to exploit economies of scope, but resulting profits are lower than under the status quo. We also show that strategic firm integration can occur even in the presence of diseconomies of scope. The model helps to explain features of recent mergers and acquisitions experience.
Abstract:
OBJECTIVE. The purpose of this study was to improve the blood-pool signal-to-noise ratio (SNR) and blood-myocardium contrast-to-noise ratio (CNR) of slow-infusion 3-T whole-heart coronary MR angiography (MRA). SUBJECTS AND METHODS. In 2D sensitivity encoding (SENSE), the number of acquired k-space lines is reduced, allowing less radiofrequency excitation per cardiac cycle and a longer TR. The former can be exploited for signal enhancement with a higher radiofrequency excitation angle, and the latter leads to noise reduction due to lower data-sampling bandwidth. Both effects contribute to SNR gain in coronary MRA when spatial and temporal resolution and acquisition time remain identical. Numeric simulation was performed to select the optimal 2D SENSE pulse sequence parameters and predict the SNR gain. Eleven patients underwent conventional unenhanced and the proposed 2D SENSE contrast-enhanced coronary MRA acquisition. Blood-pool SNR, blood-myocardium CNR, visible vessel length, vessel sharpness, and number of side branches were evaluated. RESULTS. Consistent with the numeric simulation, using 2D SENSE in contrast-enhanced coronary MRA resulted in significant improvement in aortic blood-pool SNR (unenhanced vs contrast-enhanced, 37.5 +/- 14.7 vs 121.3 +/- 44.0; p < 0.05) and CNR (14.4 +/- 6.9 vs 101.5 +/- 40.8; p < 0.05) in the patient sample. A longer length of left anterior descending coronary artery was visualized, but vessel sharpness, coronary artery coverage, and image quality score were not improved with the proposed approach. CONCLUSION. In combination with contrast administration, 2D SENSE was found effective in improving SNR and CNR in 3-T whole-heart coronary MRA. Further investigation of cardiac motion compensation is necessary to exploit the SNR and CNR advantages and to achieve submillimeter spatial resolution.
Abstract:
Ultra-high-throughput sequencing (UHTS) techniques are evolving rapidly and may soon become an affordable and routine tool for sequencing plant DNA, even in smaller plant biology labs. Here we review recent insights into intraspecific genome variation gained from UHTS, which offers a glimpse of the rather unexpected levels of structural variability among Arabidopsis thaliana accessions. The challenges that will need to be addressed to efficiently assemble and exploit this information are also discussed.
Abstract:
Using comprehensive administrative data on France's single largest financial aid program, this paper provides new evidence on the impact of large-scale need-based grant programs on the college enrollment decisions, persistence and graduation rates of low-income students. We exploit sharp discontinuities in the grant eligibility formula to identify the impact of aid on student outcomes at different levels of study. We find that eligibility for an annual cash allowance of 1,500 euros increases college enrollment rates by up to 5 percentage points. Moreover, we show that need-based grants have positive effects on student persistence and degree completion.
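A sharp regression discontinuity design like the one described above compares outcomes just on either side of the eligibility cutoff. A simulated sketch (the "need score", cutoff, bandwidth, and the assumed 5-point jump are all hypothetical, not the paper's formula or data):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 40_000

# Hypothetical means-tested grant with a sharp rule: eligible iff the
# centred need score is >= 0. Numbers are illustrative only.
score = rng.uniform(-1, 1, size=n)
eligible = (score >= 0).astype(float)
# Enrollment probability jumps by 5 points at the cutoff and also
# trends smoothly in the score.
enroll = (rng.uniform(size=n) <
          0.50 + 0.05 * eligible + 0.10 * score).astype(float)

def rd_estimate(y, x, bandwidth=0.25):
    """Sharp RD: separate linear fits on each side of the cutoff,
    compared at the cutoff itself."""
    def intercept(mask):
        X = np.column_stack([np.ones(mask.sum()), x[mask]])
        return np.linalg.lstsq(X, y[mask], rcond=None)[0][0]
    right = intercept((x >= 0) & (x < bandwidth))
    left = intercept((x < 0) & (x > -bandwidth))
    return right - left

effect = rd_estimate(enroll, score)
print(f"estimated jump at the cutoff: {effect:.3f}  (assumed true: 0.05)")
```

Fitting separate lines within a narrow bandwidth keeps the smooth trend in the score from contaminating the estimated jump.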
Abstract:
Under the influence of intelligence-led policing models, crime analysis methods have seen important developments in recent years. Applications have been proposed in several fields of forensic science to exploit and manage various types of material evidence in a systematic and more efficient way. However, nothing has been suggested so far in the field of false identity documents. This study seeks to fill this gap by proposing a simple and general method for profiling false identity documents which aims to establish links based on their visual forensic characteristics. A sample of more than 200 false identity documents, including French stolen blank passports, counterfeited driving licenses from Iraq and falsified Bulgarian driving licenses, was gathered from nine Swiss police departments and integrated into an ad hoc developed database called ProfID. Links detected automatically and systematically through this database were exploited and analyzed to produce strategic and tactical intelligence useful to the fight against identity document fraud. The profiling and intelligence process established for these three types of false identity documents has confirmed its efficiency, with more than 30% of documents being linked. Identity document fraud appears to be a structured and interregional form of criminality, against which material and forensic links detected between false identity documents might serve as a tool for investigation.
Abstract:
BACKGROUND: Finding genes that are differentially expressed between conditions is an integral part of understanding the molecular basis of phenotypic variation. In the past decades, DNA microarrays have been used extensively to quantify the abundance of mRNA corresponding to different genes, and more recently high-throughput sequencing of cDNA (RNA-seq) has emerged as a powerful competitor. As the cost of sequencing decreases, it is conceivable that the use of RNA-seq for differential expression analysis will increase rapidly. To exploit the possibilities and address the challenges posed by this relatively new type of data, a number of software packages have been developed especially for differential expression analysis of RNA-seq data. RESULTS: We conducted an extensive comparison of eleven methods for differential expression analysis of RNA-seq data. All methods are freely available within the R framework and take as input a matrix of counts, i.e. the number of reads mapping to each genomic feature of interest in each of a number of samples. We evaluate the methods based on both simulated data and real RNA-seq data. CONCLUSIONS: Very small sample sizes, which are still common in RNA-seq experiments, pose problems for all evaluated methods and any results obtained under such conditions should be interpreted with caution. For larger sample sizes, the methods combining a variance-stabilizing transformation with the 'limma' method for differential expression analysis perform well under many different conditions, as does the nonparametric SAMseq method.
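The shared input format, a genes-by-samples matrix of read counts, and the general shape of a transform-then-test pipeline can be sketched as follows. This is NOT limma or SAMseq (the compared methods live in R); it is a deliberately minimal Python stand-in on simulated counts, with all sizes and fold changes invented.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Simulated count matrix: 200 genes, two conditions with 5 samples each.
n_genes, n_per = 200, 5
base = rng.gamma(shape=2.0, scale=50.0, size=n_genes)  # mean expression levels
fold = np.ones(n_genes)
fold[:20] = 4.0                                        # first 20 genes truly DE
counts_a = rng.poisson(base, size=(n_per, n_genes))
counts_b = rng.poisson(base * fold, size=(n_per, n_genes))

# Minimal pipeline: a crude variance-stabilising transform (log2 of
# counts plus a pseudocount), then a per-gene two-sample t-test.
log_a = np.log2(counts_a + 1.0)
log_b = np.log2(counts_b + 1.0)
pvals = stats.ttest_ind(log_a, log_b, axis=0).pvalue

# Take the 20 smallest p-values and see how many true DE genes they hit.
called = np.argsort(pvals)[:20]
hits = np.sum(called < 20)
print(f"true DE genes among the top 20 calls: {hits}/20")
```

Real pipelines add library-size normalisation, dispersion modelling, and multiple-testing correction; the point here is only the count-matrix input and the transform-then-test structure.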
Abstract:
Because viral replication depends on the vigour of its host, many viruses have evolved incentives of fitness to pay their keep. When the viral host is a human pathogen, these fitness factors can surface as virulence: creating a Russian doll of pathogenesis where pathogens within pathogens complicate the disease process. Microbial viruses can even be independently immunogenic, as we recently reported for leishmania-virus. Thus, the incidence of this 'hyperpathogenism' is becoming an important clinical consideration and by appreciating the microbial-virus as a backseat driver of human disease, we could exploit its presence as a diagnostic biomarker and molecular target for therapeutic intervention. Here we discuss the prevalence of clinically relevant hyperpathogenism as well as the environmental sanctuaries that breed it.
Abstract:
BACKGROUND: The criteria for choosing relevant cell lines among a vast panel of available intestinal-derived lines exhibiting a wide range of functional properties are still ill-defined. The objective of this study was, therefore, to establish objective criteria for choosing relevant cell lines to assess their appropriateness as tumor models as well as for drug absorption studies. RESULTS: We made use of publicly available expression signatures and cell-based functional assays to delineate differences between various intestinal colon carcinoma cell lines and normal intestinal epithelium. We have compared a panel of intestinal cell lines with patient-derived normal and tumor epithelium and classified them according to traits relating to oncogenic pathway activity, epithelial-mesenchymal transition (EMT) and stemness, migratory properties, proliferative activity, transporter expression profiles and chemosensitivity. For example, SW480 cells represent an EMT-high, migratory phenotype and scored highest in terms of signatures associated with worse overall survival and higher risk of recurrence based on patient-derived databases. On the other hand, differentiated HT29 and T84 cells showed gene expression patterns closest to tumor bulk-derived cells. Regarding drug absorption, we confirmed that differentiated Caco-2 cells are the model of choice for active uptake studies in the small intestine. Regarding chemosensitivity, we were unable to confirm a recently proposed association of chemoresistance with EMT traits. However, a novel signature was identified through mining of NCI60 GI50 values that allowed us to rank the panel of intestinal cell lines according to their drug responsiveness to commonly used chemotherapeutics. CONCLUSIONS: This study presents a straightforward strategy to exploit publicly available gene expression data to guide the choice of cell-based models. While this approach does not overcome the major limitations of such models, introducing a rank order of selected features may allow selecting model cell lines that are more adapted and pertinent to the addressed biological question.