988 results for regression discontinuity design


Relevance: 20.00%

Abstract:

In the presence of cost uncertainty, limited liability introduces the possibility of default in procurement, with its associated bankruptcy costs. When financial soundness is not perfectly observable, we show that incentive compatibility implies that financially less sound contractors are selected with higher probability in any feasible mechanism. Informational rents are associated with unsound financial situations. By selecting the financially weakest contractor, stronger price competition (auctions) may not only increase the probability of default but also expected rents. Thus, weak conditions are sufficient for auctions to be suboptimal. In particular, we show that pooling firms with higher assets may reduce the cost of procurement even when default is costless for the sponsor.

Relevance: 20.00%

Abstract:

There are two ways of creating incentives for interacting agents to behave in a desired way. One is by providing appropriate payoff incentives, which is the subject of mechanism design. The other is by choosing the information that agents observe, which we refer to as information design. We consider a model of symmetric information where a designer chooses and announces the information structure about a payoff-relevant state. The interacting agents observe the signal realizations and take actions which affect the welfare of both the designer and the agents. We characterize the general finite approach to deriving the optimal information structure for the designer, the one that maximizes the designer's ex ante expected utility subject to the agents playing a Bayes Nash equilibrium. We then apply the general approach to a symmetric two-state, two-agent, two-action environment in a parameterized underlying game and fully characterize the optimal information structure: it is never strictly optimal for the designer to use conditionally independent private signals; the optimal information structure may be a public signal or may consist of correlated private signals. Finally, we examine how changes in the underlying game affect the designer's maximum payoff. This exercise provides a joint mechanism/information design perspective.

Relevance: 20.00%

Abstract:

Lean meat percentage (LMP) is an important carcass quality parameter. The aim of this work is to obtain a calibration equation for Computed Tomography (CT) scans with the Partial Least Squares (PLS) regression technique in order to predict the LMP of the carcass and of the different cuts, and to study and compare two different methodologies for selecting the variables to be included in the prediction equation (Variable Importance for Projection, VIP, and stepwise). The error of prediction with cross-validation (RMSEPCV) of the LMP obtained with PLS and VIP-based selection was 0.82%, and for stepwise selection it was 0.83%. The prediction of the LMP scanning only the ham had an RMSEPCV of 0.97%, and if the ham and the loin were scanned the RMSEPCV was 0.90%. Results indicate that for CT data both VIP and stepwise selection are good methods. Moreover, scanning only the ham allowed us to obtain a good prediction of the LMP of the whole carcass.
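
A minimal sketch of the calibration workflow described above: fit a PLS model to CT-derived variables, keep variables with a VIP score above 1 (a common rule of thumb, not a threshold taken from the paper), and report a cross-validated RMSEP. The data, dimensions and thresholds below are invented placeholders, not the study's.

import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

def vip_scores(pls):
    """Variable Importance for Projection of a fitted PLSRegression model."""
    t = pls.x_scores_        # (n_samples, n_components) latent scores
    w = pls.x_weights_       # (n_features, n_components) weights
    q = pls.y_loadings_      # (n_targets, n_components) y-loadings
    p, a = w.shape
    # Variance of y explained by each component
    ssy = np.array([(t[:, k] ** 2).sum() * (q[0, k] ** 2) for k in range(a)])
    w_norm = w / np.linalg.norm(w, axis=0)
    return np.sqrt(p * (w_norm ** 2 @ ssy) / ssy.sum())

# Hypothetical data: 120 carcasses, 60 CT density variables, dissected LMP as target
rng = np.random.default_rng(0)
X = rng.normal(size=(120, 60))
y = X[:, :5].sum(axis=1) + rng.normal(scale=0.8, size=120)

pls_full = PLSRegression(n_components=5).fit(X, y)
keep = vip_scores(pls_full) > 1.0                      # VIP > 1 selection rule
pls_sel = PLSRegression(n_components=min(3, int(keep.sum())))
y_cv = cross_val_predict(pls_sel, X[:, keep], y, cv=10)
rmsepcv = float(np.sqrt(np.mean((y - y_cv.ravel()) ** 2)))
print(f"{keep.sum()} variables kept (VIP > 1); RMSEPCV = {rmsepcv:.2f}")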

Relevance: 20.00%

Abstract:

An active, solvent-free solid sampler was developed for the collection of 1,6-hexamethylene diisocyanate (HDI) aerosol and prepolymers. The sampler was made of a filter impregnated with 1-(2-methoxyphenyl)piperazine contained in a filter holder. Interferences with HDI were observed when a set of cellulose acetate filters and a polystyrene filter holder were used; a glass fiber filter and a polypropylene filter cassette gave better results. The applicability of the sampling and analytical procedure was validated with a test chamber constructed for the dynamic generation of HDI aerosol and prepolymers in commercial two-component spray paints (Desmodur® N75) used in car refinishing. The particle size distribution, temporal stability, and spatial uniformity of the simulated aerosol were established in order to test the sampler. The monitoring of aerosol concentrations was conducted with the solid sampler paired with the reference impinger technique (impinger flasks contained 10 mL of 0.5 mg/mL 1-(2-methoxyphenyl)piperazine in toluene) under a controlled atmosphere in the test chamber. Analyses of derivatized HDI and prepolymers were carried out by high-performance liquid chromatography with ultraviolet detection. The correlation between the solvent-free and the impinger techniques was fairly good (Y = 0.979X - 0.161; R = 0.978) when the tests were conducted in the range of 0.1 to 10 times the threshold limit value (TLV) for HDI monomer and up to 60 µg/m³ (3 U.K. TLVs) for total -N=C=O groups.
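
The method comparison quoted above reduces to an ordinary least-squares line of sampler readings (Y) against reference impinger readings (X). The short sketch below shows how such a comparison line and correlation coefficient would be computed; the paired concentrations are simulated placeholders, not the study's measurements.

import numpy as np

rng = np.random.default_rng(1)
# Invented paired measurements, µg/m³: reference impinger (X) and solid sampler (Y)
impinger = rng.uniform(5.0, 200.0, size=30)
sampler = 0.98 * impinger - 0.2 + rng.normal(scale=3.0, size=30)

slope, intercept = np.polyfit(impinger, sampler, 1)   # least-squares comparison line
r = np.corrcoef(impinger, sampler)[0, 1]              # correlation coefficient
print(f"Y = {slope:.3f}X {intercept:+.3f}, R = {r:.3f}")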

Relevance: 20.00%

Abstract:

The comparison of consecutively manufactured tools and firearms has provided much, but not all, of the basis for the profession of firearm and toolmark examination. The authors accept the fundamental soundness of this approach but appeal to the experimental community to close two minor gaps in the experimental procedure. We suggest that "blinding" and attention to the appropriateness of other experimental conditions would consolidate the foundations of our profession. We do not suggest that previous work is unsound.

Relevance: 20.00%

Abstract:

This paper explores the effects of two main sources of innovation, intramural and external R&D, on the productivity level in a sample of 3,267 Catalonian firms. The data set used is based on the official innovation survey of Catalonia, which was part of the Spanish sample of CIS4, covering the years 2002-2004. We compare empirical results obtained by applying the usual OLS and quantile regression techniques in both manufacturing and services industries. In the quantile regressions, the results suggest different patterns for the two innovation sources as we move across conditional quantiles. The elasticity of intramural R&D activities on productivity decreases as we move up to higher productivity levels in both the manufacturing and services sectors, while the effects of external R&D rise in high-technology industries but are more ambiguous in low-technology and knowledge-intensive services.
JEL codes: O300, C100, O140
Keywords: Innovation sources, R&D, Productivity, Quantile regression
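
A hedged sketch of the OLS versus quantile-regression comparison described above, on simulated firm-level data rather than the CIS4 sample; variable names and coefficients are invented for illustration.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 3267  # same sample size as the survey, but the data are entirely simulated
df = pd.DataFrame({
    "log_intramural_rd": rng.normal(2.0, 1.0, n),
    "log_external_rd": rng.normal(1.0, 1.0, n),
})
df["log_productivity"] = (3.0
                          + 0.15 * df["log_intramural_rd"]
                          + 0.05 * df["log_external_rd"]
                          + rng.normal(0.0, 0.5, n))

# Conditional-mean benchmark
ols = smf.ols("log_productivity ~ log_intramural_rd + log_external_rd", data=df).fit()
print("OLS:", ols.params.round(3).to_dict())

# Elasticities along the conditional productivity distribution
for q in (0.10, 0.25, 0.50, 0.75, 0.90):
    qr = smf.quantreg("log_productivity ~ log_intramural_rd + log_external_rd", data=df).fit(q=q)
    print(f"quantile {q:.2f}:", qr.params.round(3).to_dict())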

Relevance: 20.00%

Abstract:

Objectives: Skin can be partially regenerated after full-thickness defects by collagen matrices. In this study, we identified the main limitations of induced regeneration, aiming to improve the design of dermal matrices. Methods: Mice each received a single 1 cm², full-thickness skin wound on the dorsum, which was grafted with collagen-GAG matrices or left ungrafted. The healing modulation induced by the collagen-GAG matrices was compared to spontaneous healing and to custom-designed, bioactive, poly-N-acetyl-glucosamine (NAG) matrices. Wound staging was based on macroscopic, histological and immunohistochemical analysis on days 3, 7, 10 and 21 post wounding. Results: Cell density was higher in spontaneously granulating wounds than in grafted wounds. While grafted wounds exhibited increased levels of cell proliferation on days 7 and 10, vascularity was dramatically reduced. NAG scaffolds accelerated both angiogenesis and wound re-epithelialization. Conclusions: Since slow integration and revascularization severely limit the engraftment of clinically used dermal scaffolds, the design of dermal matrices using bioactive materials represents the next step in skin regeneration.

Relevance: 20.00%

Abstract:

ABSTRACT: BACKGROUND: There is no recommendation to screen ferritin levels in blood donors, even though several studies have noted the high prevalence of iron deficiency after blood donation, particularly among menstruating females. Furthermore, some clinical trials have shown that non-anaemic women with unexplained fatigue may benefit from iron supplementation. Our objective is to determine the clinical effect of iron supplementation on fatigue in female blood donors without anaemia but with a mean serum ferritin ≤ 30 ng/ml. METHODS/DESIGN: In a double-blind randomised controlled trial, we will measure the blood count and ferritin level of women under 50 years of age who donate blood to the University Hospital of Lausanne Blood Transfusion Department, at the time of donation and after one week. One hundred and forty donors with a ferritin level ≤ 30 ng/ml and a haemoglobin level ≥ 120 g/l (non-anaemic) one week after the donation will be included in the study and randomised. A one-month course of oral ferrous sulphate (80 mg/day of elemental iron) will be introduced vs. placebo. Self-reported fatigue will be measured using a visual analogue scale. Secondary outcomes are: a fatigue score (Fatigue Severity Scale), maximal aerobic power (Chester Step Test), quality of life (SF-12), and mood disorders (Prime-MD). Haemoglobin and ferritin concentrations will be monitored before and after the intervention. DISCUSSION: Iron deficiency is a potential problem for all blood donors, especially menstruating women. To our knowledge, no other intervention study has yet evaluated the impact of iron supplementation on subjective symptoms after a blood donation. TRIAL REGISTRATION: NCT00689793.
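
As an illustration of the allocation step only (not the trial's actual procedure), a permuted-block 1:1 randomisation of 140 eligible donors to iron or placebo could look like the following sketch; the block size and seed are arbitrary assumptions.

import numpy as np

def block_randomise(n_participants, block_size=4, seed=2008):
    """Permuted-block 1:1 allocation to 'iron' or 'placebo' (illustrative only)."""
    rng = np.random.default_rng(seed)
    arms = []
    while len(arms) < n_participants:
        block = ["iron"] * (block_size // 2) + ["placebo"] * (block_size // 2)
        rng.shuffle(block)          # shuffle each block to keep arms balanced
        arms.extend(block)
    return arms[:n_participants]

allocation = block_randomise(140)
print(allocation[:8], "| iron:", allocation.count("iron"), "placebo:", allocation.count("placebo"))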

Relevance: 20.00%

Abstract:

SUMMARY: Eukaryotic DNA interacts with nuclear proteins through non-covalent ionic interactions. Proteins can recognize specific nucleotide sequences based on steric interactions with the DNA, and these specific protein-DNA interactions are the basis for many nuclear processes, e.g. gene transcription, chromosomal replication, and recombination. A new technology termed ChIP-Seq has recently been developed for the analysis of protein-DNA interactions on a whole-genome scale; it is based on immunoprecipitation of chromatin followed by a high-throughput DNA sequencing procedure. ChIP-Seq is a novel technique with great potential to replace older techniques for the mapping of protein-DNA interactions. In this thesis, we bring some new insights into ChIP-Seq data analysis. First, we point out some common and so far unrecognized artifacts of the method. The sequence tag distribution in the genome does not follow a uniform distribution, and we have found extreme hot-spots of tag accumulation over specific loci in the human and mouse genomes. These artifactual sequence tag accumulations create false peaks in every ChIP-Seq dataset, and we propose different filtering methods to reduce the number of false positives. Next, we propose random sampling as a powerful analytical tool in ChIP-Seq data analysis that could be used to infer biological knowledge from massive ChIP-Seq datasets. We created an unbiased random sampling algorithm and used this methodology to reveal some of the important biological properties of Nuclear Factor I (NFI) DNA-binding proteins. Finally, by analyzing the ChIP-Seq data in detail, we revealed that Nuclear Factor I transcription factors mainly act as activators of transcription, and that they are associated with specific chromatin modifications that are markers of open chromatin. We speculate that NFI factors only interact with the DNA wrapped around the nucleosome. We also found multiple loci that indicate possible chromatin barrier activity of NFI proteins, which could suggest the use of NFI binding sequences as chromatin insulators in biotechnology applications.
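
Two of the analysis ideas summarized above, flagging bins with extreme tag accumulation as putative artifacts and unbiased random sampling of tags, can be sketched as follows. This is a simplified illustration with toy data and arbitrary thresholds, not the thesis's actual pipeline.

import numpy as np

def bin_tags(positions, bin_size=1000):
    """Count sequence tags per fixed-size genomic bin (single chromosome)."""
    return np.bincount(positions // bin_size)

def flag_hotspots(counts, z=10.0):
    """Flag bins whose tag count is an extreme outlier (putative artifactual hot-spots)."""
    nonzero = counts[counts > 0]
    return np.where(counts > nonzero.mean() + z * nonzero.std())[0]

def downsample_tags(positions, n, seed=0):
    """Unbiased random sample of n tags, e.g. to depth-match two datasets."""
    rng = np.random.default_rng(seed)
    return rng.choice(positions, size=n, replace=False)

# Toy data: mostly uniform tag positions plus one artificial pile-up near 2.5 Mb
rng = np.random.default_rng(3)
positions = np.concatenate([
    rng.integers(0, 5_000_000, size=100_000),
    np.full(5_000, 2_500_123),
])
counts = bin_tags(positions)
print("flagged hot-spot bins:", flag_hotspots(counts))
print("down-sampled tags:", downsample_tags(positions, 10_000).size)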

Relevance: 20.00%

Abstract:

AIM OF THE STUDY: To analyse the course of upper limb edema in patients with an arteriovenous fistula used for dialysis and to analyse the available therapeutic options. STUDY DESIGN: Retrospective study of patients with this type of edema who were treated in our institution from 1992 to 1996. PATIENTS AND METHODS: Seven consecutive patients with an arteriovenous fistula treated for edema of the upper extremity were reviewed. The fistula was created at the elbow in 6 patients and at the forearm in 1. The edema appeared immediately after the operation in 4 patients and after a delay in 3 patients. Stenosis (3 patients) or occlusion (2 patients) of the subclavian vein was documented in the 5 patients who were investigated by angiography. RESULTS: The edema regressed spontaneously in 4 patients, because collaterals developed in 3 patients and the fistula thrombosed in 1 patient. Surgical intervention allowed regression of the edema in the other 3 patients: the excessive output of the fistula was reduced in 2 patients and an axillojugular bypass was performed in 1 patient. The fistula remained effective in 6 patients. Another fistula was created on the contralateral arm in 1 patient. CONCLUSION: Non-operative management is recommended in patients who develop edema immediately after creation of the fistula, because spontaneous regression is likely. Measures aimed at reducing the output of the fistula or enhancing the venous capacity of the arm are required when the edema appears at a later stage. The fistula can be saved in the majority of cases.

Relevance: 20.00%

Abstract:

The future of antimalarial chemotherapy is particularly alarming in view of the spread of parasite cross-resistance to drugs that are not even structurally related. Only the availability of new pharmacological models will make it possible to select molecules with novel mechanisms of action, thus delaying resistance and allowing the development of new chemotherapeutic strategies. We reached this objective in mice. Our approach hinges on fundamental and applied research begun in 1980 to investigate the phospholipid (PL) metabolism of intraerythrocytic Plasmodium. This metabolism is abundant, specific and indispensable for the production of Plasmodium membranes. Any drug that interferes with this metabolism blocks parasitic development. The most effective interference yet found involves blockage of the choline transporter, which supplies Plasmodium with choline for the synthesis of phosphatidylcholine, its major PL; this is a limiting step in the pathway. The drug sensitivity threshold is much lower for the parasite, which is more dependent on this metabolism than host cells. The compounds show in vitro activity against P. falciparum at 1 to 10 nM. They show very low toxicity against a lymphoblastoid cell line, demonstrating a total absence of correlation between growth inhibition of parasites and of lymphoblastoid cells. They show antimalarial activity in vivo, in the P. berghei or P. chabaudi mouse systems, at doses 20- to 100-fold lower than their acute toxicity limit. The bioavailability of a radiolabeled form of the product seemed to be advantageous (slow blood clearance and no significant concentration in tissues). Lastly, the compounds are inexpensive to produce, stable and water-soluble.

Relevance: 20.00%

Abstract:

When actuaries face the problem of pricing an insurance contract that contains different types of coverage, such as a motor insurance or homeowner's insurance policy, they usually assume that the types of claim are independent. However, this assumption may not be realistic: several studies have shown that there is a positive correlation between types of claim. Here we introduce different regression models in order to relax the independence assumption, including zero-inflated models to account for the excess of zeros and overdispersion. Multivariate Poisson models have been largely ignored to date, mainly because of their computational difficulties. Bayesian inference based on MCMC helps to solve this problem (and also lets us derive, for several quantities of interest, posterior summaries to account for uncertainty). Finally, these models are applied to an automobile insurance claims database with three different types of claims. We analyse the consequences for pure and loaded premiums when the independence assumption is relaxed by using different multivariate Poisson regression models and their zero-inflated versions.
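
As a simplified illustration of one ingredient above, the sketch below fits a zero-inflated Poisson regression to simulated claim counts for a single claim type with statsmodels; it does not reproduce the Bayesian multivariate Poisson MCMC estimation used in the paper, and all covariates and coefficients are invented.

import numpy as np
import statsmodels.api as sm
from statsmodels.discrete.count_model import ZeroInflatedPoisson

rng = np.random.default_rng(7)
n = 5000
age = rng.uniform(18, 75, n)            # invented rating factors
urban = rng.integers(0, 2, n)
X = sm.add_constant(np.column_stack([age / 10.0, urban]))

# Simulated claim counts with excess zeros: a share of policies never claim at all
lam = np.exp(-2.0 + 0.05 * (age / 10.0) + 0.3 * urban)
never_claims = rng.random(n) < 0.4
y = np.where(never_claims, 0, rng.poisson(lam))

# Zero-inflated Poisson regression: constant-only inflation part, covariates in the count part
zip_model = ZeroInflatedPoisson(y, X, exog_infl=np.ones((n, 1)))
zip_fit = zip_model.fit(disp=0)
print(zip_fit.params)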

Relevance: 20.00%

Abstract:

The goal of this paper is to reexamine the optimal design and efficiency of loyalty rewards in markets for final consumption goods. While the literature has emphasized the role of loyalty rewards as endogenous switching costs (which distort the efficient allocation of consumers), in this paper I analyze the ability of alternative designs to foster consumer participation and increase total surplus. First, the efficiency of loyalty rewards depends on their specific design. A commitment to the price of repeat purchases can involve substantial efficiency gains by reducing price-cost margins. However, discount policies imply higher future regular prices and are likely to reduce total surplus. Second, firms may prefer to set up inefficient rewards (discounts), especially in those circumstances where a commitment to the price of repeat purchases triggers Coasian dynamics.