17 results for Coefficient of Loss Aversion
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
Shifts in pollination syndromes involve coordinated changes in multiple floral traits. This raises the question of how plants can cope with rapid changes in pollinator availability through the slow process of accumulating mutations in multiple genes. Here we study the transition from bee to hawkmoth pollination in the genus Petunia. Interspecific crosses followed by single-locus introgressions were used to recreate putative intermediate evolutionary stages in the evolution of moth pollination. The effect of the loss/gain of petal color was asymmetric: it had no influence on the established pollinator but enhanced visitation by the new pollinator. Therefore, shifts in pollination syndromes may proceed through intermediate stages of reduced specialization and consequently enhanced reproductive assurance. The loss of petal color in moth-pollinated Petunia involves null mutations in a single regulatory gene, An2. Such simple genetic changes may be sufficiently rapid and frequent to ensure survival during pollinator failure.
Abstract:
OBJECTIVE To examine the impact of different definitions of loss to follow-up (LTFU) on estimates of program outcomes in cohort studies of patients on antiretroviral therapy (ART). STUDY DESIGN AND SETTING We examined the impact of different definitions of LTFU using data from the International Epidemiological Databases to Evaluate AIDS-Southern Africa. The reference approach, Definition A, was compared with five alternative scenarios that differed in eligibility for analysis and the date assigned to the LTFU outcome. Kaplan-Meier estimates of LTFU were calculated up to 2 years after starting ART. RESULTS Estimated cumulative LTFU was 14% at 12 months and 22% at 24 months using the reference approach. The alternative scenarios yielded different proportions LTFU, with 12-month estimates varying by up to 39% relative to Definition A. Differences were largest when the date assigned to the LTFU outcome was 6 months after the date of last contact and when the site-specific definition of LTFU was used. CONCLUSION Variation in the definitions of LTFU within cohort analyses can have an appreciable impact on estimated proportions of LTFU over 2 years of follow-up. A standardized definition of LTFU is needed to measure program effectiveness accurately and to ensure comparability between programs.
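To make the comparison concrete, here is a minimal Python sketch, with hypothetical patient data and a made-up event-date offset, of how Kaplan-Meier estimates of cumulative LTFU at 12 and 24 months shift when the LTFU event is dated at the last contact versus 6 months after it (this is not the IeDEA-SA analysis code):

import pandas as pd
from lifelines import KaplanMeierFitter

# Hypothetical patient-level data: months to last contact and an LTFU flag.
df = pd.DataFrame({
    "months_to_last_contact": [3, 10, 24, 18, 7, 24, 15, 24],
    "ltfu":                   [1,  1,  0,  1, 1,  0,  1,  0],   # 0 = retained/censored
})

def cumulative_ltfu(event_offset_months):
    """KM-based cumulative LTFU when the event is dated `event_offset_months`
    after the last contact (administratively capped at 24 months)."""
    t = (df["months_to_last_contact"] + event_offset_months * df["ltfu"]).clip(upper=24)
    kmf = KaplanMeierFitter()
    kmf.fit(t, event_observed=df["ltfu"])
    return 1 - kmf.predict([12, 24])          # cumulative incidence of LTFU

print(cumulative_ltfu(0))   # event dated at the last contact
print(cumulative_ltfu(6))   # event dated 6 months after the last contact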
Abstract:
In several studies of antiretroviral treatment (ART) programs for persons with human immunodeficiency virus infection, investigators have reported a higher rate of loss to follow-up (LTFU) among patients initiating ART in recent years than among patients who initiated ART during earlier time periods. This finding is frequently interpreted as reflecting deterioration of patient retention in the face of increasing patient loads. However, in this paper we demonstrate by simulation that transient gaps in follow-up can lead to bias when standard survival analysis techniques are applied. We created a simulated cohort of patients with different dates of ART initiation. Rates of ART interruption, ART resumption, and mortality were assumed to remain constant over time, but when we applied a standard definition of LTFU, the simulated probability of being classified LTFU at a particular ART duration was substantially higher in recently enrolled cohorts. This suggests that much of the apparent trend towards increased LTFU may be attributed to bias caused by transient interruptions in care. Alternative statistical techniques need to be used when analyzing predictors of LTFU, for example "prospective" definitions of LTFU in place of "retrospective" definitions. Similar considerations may apply when analyzing predictors of LTFU from treatment programs for other chronic diseases.
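The mechanism can be reproduced in a few lines. The following Python sketch uses assumed, constant monthly interruption and return rates (illustrative numbers only; mortality is omitted for brevity). Under a retrospective definition ("no visit in the 6 months before database closure"), the fraction classified LTFU within the first 12 months on ART comes out much higher for recently enrolled patients, even though behaviour never changes:

import numpy as np

rng = np.random.default_rng(1)
CLOSURE = 60         # database closure, in months after program start
GAP_RATE = 1 / 18    # assumed monthly probability of interrupting care
RETURN_RATE = 1 / 8  # assumed monthly probability of returning from a gap

def ltfu_by_12_months(enrol_month, n=20_000):
    """Fraction classified LTFU within the first 12 months on ART when a
    retrospective LTFU definition is applied at database closure."""
    in_care = np.ones(n, dtype=bool)
    last_visit = np.full(n, enrol_month)
    for month in range(enrol_month + 1, CLOSURE):
        interrupt = in_care & (rng.random(n) < GAP_RATE)
        resume = ~in_care & (rng.random(n) < RETURN_RATE)
        in_care = (in_care & ~interrupt) | resume
        last_visit[in_care] = month
    ltfu = (CLOSURE - last_visit) > 6               # no visit in the final 6 months
    return np.mean(ltfu & (last_visit - enrol_month <= 12))

for enrol in (0, 24, 48):                           # early, middle and recent cohorts
    print(f"cohort enrolled in month {enrol:2d}: LTFU by 12 months = {ltfu_by_12_months(enrol):.1%}")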
Abstract:
BACKGROUND Antiretroviral therapy (ART) initiation is now recommended irrespective of CD4 count. However, data on the relationship between CD4 count at ART initiation and loss to follow-up (LTFU) are limited and conflicting. METHODS We conducted a cohort analysis including all adults initiating ART (2008-2012) at three public sector sites in South Africa. LTFU was defined as no visit in the 6 months before database closure. The Kaplan-Meier estimator and Cox proportional hazards models were used to examine the relationship between CD4 count at ART initiation and 24-month LTFU. Final models were adjusted for demographics, year of ART initiation and programme expansion, and corrected for unascertained mortality. RESULTS Among 17,038 patients, the median CD4 count at initiation increased from 119 cells/µL (IQR 54-180) in 2008 to 257 cells/µL (IQR 175-318) in 2012. In unadjusted models, observed LTFU was associated with both CD4 counts <100 cells/µL and CD4 counts ≥300 cells/µL. After adjustment, patients with CD4 counts ≥300 cells/µL were 1.35 (95% CI 1.12 to 1.63) times as likely to be LTFU after 24 months as those with a CD4 count of 150-199 cells/µL. This increased risk for patients with CD4 counts ≥300 cells/µL was largest in the first 3 months on treatment. Correction for unascertained deaths attenuated the association between CD4 counts <100 cells/µL and LTFU, while the association between CD4 counts ≥300 cells/µL and LTFU persisted. CONCLUSIONS Patients initiating ART at higher CD4 counts may be at increased risk of LTFU. With programmes initiating patients at higher CD4 counts, models of ART delivery need to be reoriented to support long-term retention.
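For readers unfamiliar with the modelling step, a minimal Python sketch of a Cox proportional hazards model for 24-month LTFU is given below; the data frame, column names and covariates are hypothetical placeholders, not the study's variables:

import pandas as pd
from lifelines import CoxPHFitter

# Each row: months of follow-up (capped at 24), LTFU indicator, and covariates.
df = pd.DataFrame({
    "months":     [24, 5, 24, 2, 18, 24, 11, 24, 7, 24],
    "ltfu":       [0, 1, 0, 1, 1, 0, 1, 0, 1, 1],
    "cd4_ge_300": [0, 1, 0, 1, 1, 0, 0, 0, 1, 1],   # 1 if baseline CD4 >= 300 cells/µL
    "age":        [34, 28, 45, 31, 39, 52, 27, 41, 36, 30],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="ltfu")
cph.print_summary()   # hazard ratio for cd4_ge_300 relative to the reference group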
Abstract:
BACKGROUND One aspect of a multidimensional approach to understanding asthma as a complex dynamic disease is to study how lung function varies with time. Variability measures of lung function have been shown to predict response to β2-agonist treatment. An investigation was conducted to determine whether the mean, coefficient of variation (CV) or autocorrelation, a measure of short-term memory, of peak expiratory flow (PEF) could predict loss of asthma control following withdrawal of regular inhaled corticosteroid (ICS) treatment, using data from a previous study. METHODS 87 adult patients with mild to moderate asthma who had been taking ICS at a constant dose for at least 6 months were monitored for 2-4 weeks. ICS was then withdrawn and monitoring continued until loss of control occurred as per predefined criteria. Twice-daily PEF was recorded during monitoring. Associations between loss of control and the mean, CV and autocorrelation of morning PEF within 2 weeks pre- and post-ICS withdrawal were assessed using Cox regression analysis. Predictive utility was assessed using receiver operating characteristic analysis. RESULTS 53 of 87 patients had sufficient PEF data over the required analysis period. The mean (389 vs 370 l/min, p<0.0001) and CV (4.5% vs 5.6%, p=0.007), but not the autocorrelation, of PEF changed significantly from pre-withdrawal to post-withdrawal in subjects who subsequently lost control, and were unaltered in those who did not. These changes were related to time to loss of control. CV was the most consistent predictor, with sensitivity and specificity similar to those of exhaled nitric oxide. CONCLUSION A simple, easy-to-obtain variability measure of daily lung function such as the CV may predict loss of asthma control within the first 2 weeks of ICS withdrawal.
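The three candidate predictors are straightforward to compute. The Python sketch below derives the mean, CV and lag-1 autocorrelation from a two-week morning PEF series and evaluates the CV as a classifier with an ROC AUC; the PEF values and group sizes are simulated for illustration, not taken from the study:

import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

def pef_features(pef):
    """Mean, coefficient of variation and lag-1 autocorrelation of a PEF series."""
    pef = np.asarray(pef, dtype=float)
    mean = pef.mean()
    cv = pef.std(ddof=1) / mean
    ac1 = np.corrcoef(pef[:-1], pef[1:])[0, 1]
    return mean, cv, ac1

# Simulated morning PEF (l/min), 14 days per patient, for two hypothetical groups.
stable = [rng.normal(390, 15, 14) for _ in range(20)]   # did not lose control
lost   = [rng.normal(370, 25, 14) for _ in range(20)]   # subsequently lost control

cv_values = [pef_features(p)[1] for p in stable + lost]
labels = [0] * len(stable) + [1] * len(lost)
print("AUC for CV as a predictor of loss of control:", roc_auc_score(labels, cv_values))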
Abstract:
Background Loss to follow-up (LTFU) is common in antiretroviral therapy (ART) programmes. Mortality is a competing risk (CR) for LTFU; however, it is often overlooked in cohort analyses. We examined how the CR of death affected LTFU estimates in Zambia and Switzerland. Methods and Findings HIV-infected patients aged ≥18 years who started ART between 2004 and 2008 in observational cohorts in Zambia and Switzerland were included. We compared standard Kaplan-Meier curves with CR cumulative incidence. We calculated hazard ratios for LTFU across CD4 cell count strata using cause-specific Cox models or Fine and Gray subdistribution models, adjusting for age, gender, body mass index and clinical stage. 89,339 patients from Zambia and 1,860 patients from Switzerland were included. 12,237 patients (13.7%) in Zambia and 129 patients (6.9%) in Switzerland were LTFU, and 8,498 (9.5%) and 29 patients (1.6%), respectively, died. In Zambia, the probability of LTFU was overestimated in Kaplan-Meier curves: estimates at 3.5 years were 29.3% for patients starting ART with CD4 counts <100 cells/µL and 15.4% among patients starting with ≥350 cells/µL. The corresponding estimates from CR cumulative incidence were 22.9% and 13.6%. Little difference was found between naïve and CR analyses in Switzerland, since only a few patients died. The results from the Cox and Fine and Gray models were similar: in Zambia the risk of loss to follow-up and death increased with decreasing CD4 counts at the start of ART, whereas in Switzerland there was a trend in the opposite direction, with patients with higher CD4 cell counts more likely to be lost to follow-up. Conclusions In ART programmes in low-income settings the competing risk of death can substantially bias standard analyses of LTFU. The CD4 cell count and other prognostic factors may be differentially associated with LTFU in low-income and high-income settings.
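The size of the bias is easy to see on toy data. The Python sketch below contrasts the naive 1 - Kaplan-Meier estimate of LTFU, which treats death as censoring, with a competing-risk (Aalen-Johansen type) cumulative incidence; the event times are invented for illustration, not drawn from either cohort:

import numpy as np

# Event codes: 0 = censored, 1 = LTFU, 2 = died. Times are years on ART.
time  = np.array([0.5, 1.0, 1.2, 1.5, 2.0, 2.5, 3.0, 3.5, 3.5, 3.5])
event = np.array([1,   2,   1,   2,   1,   0,   2,   0,   0,   0])

def cumulative_incidence(time, event, cause, horizon):
    """Nonparametric cumulative incidence of `cause`, accounting for competing events."""
    order = np.argsort(time)
    time, event = time[order], event[order]
    at_risk, surv, cif = len(time), 1.0, 0.0   # surv = all-cause event-free survival just before t
    for t, e in zip(time, event):
        if t > horizon:
            break
        if e == cause:
            cif += surv * (1 / at_risk)
        if e != 0:
            surv *= 1 - 1 / at_risk
        at_risk -= 1
    return cif

def one_minus_km(time, event, cause, horizon):
    """Naive 1 - Kaplan-Meier for `cause`, treating all other events as censoring."""
    order = np.argsort(time)
    time, event = time[order], event[order]
    at_risk, surv = len(time), 1.0
    for t, e in zip(time, event):
        if t > horizon:
            break
        if e == cause:
            surv *= 1 - 1 / at_risk
        at_risk -= 1
    return 1 - surv

print("1 - KM estimate of LTFU at 3.5 years:      ", round(one_minus_km(time, event, 1, 3.5), 3))
print("Competing-risk cumulative incidence at 3.5:", round(cumulative_incidence(time, event, 1, 3.5), 3))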
Abstract:
PURPOSE: This study was conducted to elucidate the impact of loss of heterozygosity (LOH) for chromosomes 1p36 and 19q13 on the overall survival of patients with diffusely infiltrating WHO grade 2 gliomas treated without chemotherapy. PATIENTS AND METHODS: We assessed the LOH status of tumors from patients harboring WHO grade 2 gliomas diagnosed between 1991 and 2000. Patients were either followed after initial biopsy or treated by surgery and/or radiation therapy (RT). Overall survival, time to malignant transformation, and progression-free survival were last updated as of March 2005. RESULTS: Of a total of 79 patients, LOH 1p36 and LOH 19q13 could be assessed in 67 and 66 patients, respectively. The median follow-up after diagnosis was 6 years. Loss of either 1p or 19q, and in particular codeletion at both loci, was found to have a positive impact on overall survival (log-rank P < .01), progression-free survival, and survival without malignant transformation (P < .05). Tumor volume (P < .0001), neurologic deficits at diagnosis (P < .01), involvement of more than one lobe (P < .01), and absence of an oligodendroglial component (P < .05) were also predictors of shorter overall survival. The extent of surgery was similar in patients with or without LOH 1p and/or 19q; RT was more frequently resorted to for patients without than for patients with LOH 1p/19q (30% v 60%). CONCLUSION: The presence of LOH on either 1p36 or 19q13, and in particular codeletion of both loci, is a strong, nontreatment-related prognostic factor for overall survival in patients with diffusely infiltrating WHO grade 2 gliomas.
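As an illustration of the kind of survival comparison reported here, a minimal Python sketch of a log-rank test between patients with and without combined 1p/19q loss follows; the survival times and event indicators are hypothetical, not the study data:

from lifelines.statistics import logrank_test

# Years of overall survival and death indicators (1 = died, 0 = alive at last follow-up).
t_codeleted     = [8.2, 10.5, 6.1, 12.0, 9.4, 11.3, 7.8]
e_codeleted     = [0,   0,    1,   0,    1,   0,    0]
t_no_codeletion = [3.1, 4.6, 2.8, 5.9, 4.0, 6.5, 3.7]
e_no_codeletion = [1,   1,   1,   0,   1,   1,   1]

result = logrank_test(t_codeleted, t_no_codeletion,
                      event_observed_A=e_codeleted,
                      event_observed_B=e_no_codeletion)
print("log-rank p-value:", result.p_value)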
Abstract:
The age distribution and the incidence of loss of heterozygosity (LOH) of 1p and 19q were analyzed in 85 WHO grade II and III oligodendroglial tumors. The peak of tumor manifestation was in the age group of 35 to 55 years. There was no association between age at diagnosis and LOH incidence. We conclude that the prognostic effect of age on survival is not mediated by LOH 1p/19q.
Abstract:
BACKGROUND: Microarray genome analysis is realising its promise for improving the detection of genetic abnormalities in individuals with mental retardation and congenital abnormality. Copy number variations (CNVs) are now readily detectable using a variety of platforms, and a major challenge is the distinction of pathogenic CNVs from ubiquitous, benign polymorphic CNVs. The aim of this study was to investigate replacement of time-consuming, locus-specific testing for specific microdeletion and microduplication syndromes with microarray analysis, which theoretically should detect all known syndromes with CNV aetiologies as well as new ones. METHODS: Genome-wide copy number analysis was performed on 117 patients using Affymetrix 250K microarrays. RESULTS: 434 CNVs (195 losses and 239 gains) were found, including 18 pathogenic CNVs and 9 identified as "potentially pathogenic". Almost all pathogenic CNVs were larger than 500 kb, significantly larger than the median size of all CNVs detected. Segmental regions of loss of heterozygosity larger than 5 Mb were found in 5 patients. CONCLUSIONS: Genome microarray analysis has improved diagnostic success in this group of patients. Several examples of recently discovered "new syndromes" were found, suggesting they are more common than previously suspected and collectively are likely to be a major cause of mental retardation. The findings have several implications for clinical practice. The study revealed the potential to make genetic diagnoses that were not evident in the clinical presentation, with implications for pretest counselling and the consent process. The importance of contributing novel CNVs to high-quality databases for genotype-phenotype analysis, and of reviewing guidelines for the selection of individuals for microarray analysis, is emphasised.
Abstract:
OBJECTIVES: The present research examined motivational differences across adulthood that might contribute to age-related differences in the willingness to engage in collective action. Two experiments addressed the role of gain and loss orientation for age-related differences in the willingness to engage in collective action across adulthood. METHOD: In Experiment 1, N = 169 adults (20-85 years) were confronted with a hypothetical scenario that involved either an impending increase or decrease of health insurance costs for their respective age group. In Experiment 2, N = 231 adults (18-83 years) were asked to list an advantage or disadvantage they perceived in being a member of their age group. Subsequently, participants indicated their willingness to engage in collective action on behalf of their age group. RESULTS: Both experiments suggest that, with increasing age, people are more willing to engage in collective action when they are confronted with the prospect of loss or a disadvantage. DISCUSSION: The findings highlight the role of motivational processes for involvement in collective action across adulthood. With increasing age, (anticipated) loss or perceived disadvantages become more important for the willingness to participate in collective action.
Abstract:
We estimate the momentum diffusion coefficient, κ, of a heavy quark within a pure SU(3) plasma at a temperature of about 1.5 T_c. Large-scale Monte Carlo simulations on a series of lattices extending up to 192³ × 48 permit us to carry out a continuum extrapolation of the so-called color-electric imaginary-time correlator. The extrapolated correlator is analyzed with the help of theoretically motivated models for the corresponding spectral function. Evidence for a nonzero transport coefficient is found and, incorporating systematic uncertainties reflecting model assumptions, we obtain κ = (1.8-3.4) T³. This implies that the drag coefficient, characterizing the time scale at which heavy quarks adjust to hydrodynamic flow, is 1/η_D = (1.8-3.4) (T_c/T)² (M/1.5 GeV) fm/c, where M is the heavy quark kinetic mass. The results apply to bottom and, with somewhat larger systematic uncertainties, to charm quarks.
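For context, the link between the momentum diffusion coefficient and the drag coefficient quoted above comes from the standard heavy-quark Langevin description; the relations below are textbook background, sketched in LaTeX, and are not quoted from this abstract:

% Heavy-quark Langevin dynamics: the noise strength kappa and the drag eta_D
% are tied together by a fluctuation-dissipation relation.
\begin{align}
  \frac{\mathrm{d}p_i}{\mathrm{d}t} &= -\eta_D\, p_i + \xi_i(t),
  \qquad
  \langle \xi_i(t)\, \xi_j(t') \rangle = \kappa\, \delta_{ij}\, \delta(t - t'), \\
  \eta_D &= \frac{\kappa}{2 M T}
  \qquad \Longrightarrow \qquad
  \eta_D^{-1} = \frac{2 M T}{\kappa}.
\end{align}

With κ proportional to T³, the relaxation time 1/η_D scales as M/T², which is the (T_c/T)² (M/1.5 GeV) form quoted in the abstract.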
Abstract:
Five test runs were performed to assess possible bias when performing the loss on ignition (LOI) method to estimate the organic matter and carbonate content of lake sediments. An accurate and stable weight loss was achieved after 2 h of burning pure CaCO3 at 950 °C, whereas LOI of pure graphite at 530 °C showed a direct relation to sample size and exposure time, with only 40-70% of the possible weight loss reached after 2 h of exposure and smaller samples losing weight faster than larger ones. Experiments with a standardised lake sediment revealed a strong initial weight loss at 550 °C, but samples continued to lose weight at a slow rate at exposures of up to 64 h, which was likely the effect of loss of volatile salts, structural water of clay minerals or metal oxides, or of inorganic carbon after the initial burning of organic matter. A further test run revealed that at 550 °C samples in the centre of the furnace lost more weight than marginal samples. At 950 °C this pattern was still apparent, but the differences became negligible. Again, LOI was dependent on sample size. An analytical LOI quality control experiment involving ten different laboratories was carried out, using each laboratory's own LOI procedure as well as a standardised LOI procedure to analyse three different sediments. The range of LOI values between laboratories measured at 550 °C was generally larger when each laboratory used its own method than when using the standard method. This was similar for 950 °C, although the range of values tended to be smaller. The within-laboratory range of LOI measurements for a given sediment was generally small. Comparisons of the results of the individual and the standardised methods suggest that there is a laboratory-specific pattern in the results, probably due to differences in laboratory equipment and/or handling that could not be eliminated by standardising the LOI procedure. Factors such as sample size, exposure time, position of samples in the furnace and the laboratory measuring affected the LOI results, with LOI at 550 °C being more susceptible to these factors than LOI at 950 °C. We therefore recommend that analysts be consistent in the LOI method used with respect to ignition temperatures, exposure times and sample size, and that they include information on these three parameters when referring to the method.
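For reference, the two-step calculation that the temperatures above refer to is conventionally carried out as in the short Python sketch below; the formulas follow the common LOI convention and the weights are hypothetical, so the paper's own equations may differ in detail:

# Conventional two-step LOI calculation (a common convention, not quoted from this
# abstract): weight loss at 550 °C approximates organic matter, and the further
# loss at 950 °C reflects CO2 driven off from carbonates.
def loi(dw105, dw550, dw950):
    """Return (LOI550, LOI950) as percentages of the 105 °C dry weight."""
    loi550 = 100.0 * (dw105 - dw550) / dw105   # organic matter proxy
    loi950 = 100.0 * (dw550 - dw950) / dw105   # CO2 lost from carbonates
    return loi550, loi950

# Example with hypothetical crucible-corrected dry weights in grams.
loi550, loi950 = loi(dw105=1.000, dw550=0.820, dw950=0.760)
print(f"LOI550 = {loi550:.1f}%  LOI950 = {loi950:.1f}%")
print(f"approx. carbonate (CO3) = {1.36 * loi950:.1f}%")   # 60/44 scaling, assumed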
Abstract:
Bargaining is the building block of many economic interactions, ranging from bilateral to multilateral encounters and from situations in which the actors are individuals to negotiations between firms or countries. In all these settings, economists have been intrigued for a long time by the fact that some projects, trades or agreements are not realized even though they are mutually beneficial. On the one hand, this has been explained by incomplete information. A firm may not be willing to offer a wage that is acceptable to a qualified worker, because it knows that there are also unqualified workers and cannot distinguish between the two types. This phenomenon is known as adverse selection. On the other hand, it has been argued that even with complete information, the presence of externalities may impede efficient outcomes. To see this, consider the example of climate change. If a subset of countries agrees to curb emissions, non-participant regions benefit from the signatories' efforts without incurring costs. These free-riding opportunities give rise to incentives to strategically improve one's bargaining power that work against the formation of a global agreement. This thesis is concerned with extending our understanding of both factors, adverse selection and externalities. The findings are based on empirical evidence from original laboratory experiments as well as game-theoretic modeling. On a very general note, it is demonstrated that the institutions through which agents interact matter to a large extent. Insights are provided about which institutions we should expect to perform better than others, at least in terms of aggregate welfare. Chapters 1 and 2 focus on the problem of adverse selection. Effective operation of markets and other institutions often depends on good information transmission properties. In terms of the example introduced above, a firm is only willing to offer high wages if it receives enough positive signals about the worker's quality during the application and wage bargaining process. In Chapter 1, it will be shown that repeated interaction coupled with time costs facilitates information transmission. By making the wage bargaining process costly for the worker, the firm is able to obtain more accurate information about the worker's type. The cost could be pure time cost from delaying agreement or cost of effort arising from a multi-step interviewing process. In Chapter 2, I abstract from time cost and show that communication can play a similar role. The simple fact that a worker claims to be of high quality may be informative. In Chapter 3, the focus is on a different source of inefficiency. Agents strive for bargaining power and thus may be motivated by incentives that are at odds with the socially efficient outcome. I have already mentioned the example of climate change. Other examples are coalitions within committees that are formed to secure voting power to block outcomes, or groups that commit to different technological standards although a single standard would be optimal (e.g. the format war between HD DVD and Blu-ray). It will be shown that such inefficiencies are directly linked to the presence of externalities and a certain degree of irreversibility in actions. I now discuss the three articles in more detail. In Chapter 1, Olivier Bochet and I study a simple bilateral bargaining institution that eliminates trade failures arising from incomplete information. In this setting, a buyer makes offers to a seller in order to acquire a good.
Whenever an offer is rejected by the seller, the buyer may submit a further offer. Bargaining is costly, because both parties suffer a (small) time cost after any rejection. The difficulties arise because the good can be of low or high quality, and the quality of the good is only known to the seller. Indeed, without the possibility to make repeated offers, it is too risky for the buyer to offer prices that allow for trade of high-quality goods. When allowing for repeated offers, however, at equilibrium both types of goods trade with probability one. We provide an experimental test of these predictions. Buyers gather information about sellers using specific price offers, and rates of trade are high, much in line with the model's qualitative predictions. We also observe a persistent over-delay before trade occurs, and this reduces efficiency substantially. Possible channels for over-delay are identified in the form of two behavioral assumptions missing from the standard model, loss aversion (buyers) and haggling (sellers), which reconcile the data with the theoretical predictions. Chapter 2 also studies adverse selection, but interaction between buyers and sellers now takes place within a market rather than in isolated pairs. Remarkably, in a market it suffices to let agents communicate in a very simple manner to mitigate trade failures. The key insight is that better-informed agents (sellers) are willing to truthfully reveal their private information, because by doing so they are able to reduce search frictions and attract more buyers. Behavior observed in the experimental sessions closely follows the theoretical predictions. As a consequence, costless and non-binding communication (cheap talk) significantly raises rates of trade and welfare. Previous experiments have documented that cheap talk alleviates inefficiencies due to asymmetric information. These findings are explained by pro-social preferences and lie aversion. I use appropriate control treatments to show that such considerations play only a minor role in our market. Instead, the experiment highlights the ability to organize markets as a new channel through which communication can facilitate trade in the presence of private information. In Chapter 3, I theoretically explore coalition formation via multilateral bargaining under complete information. The environment studied is extremely rich in the sense that the model allows for all kinds of externalities. This is achieved by using so-called partition functions, which pin down a coalitional worth for each possible coalition in each possible coalition structure. It is found that although binding agreements can be written, efficiency is not guaranteed, because the negotiation process is inherently non-cooperative. The prospects of cooperation are shown to crucially depend on i) the degree to which players can renegotiate and gradually build up agreements and ii) the absence of a certain type of externalities that can loosely be described as incentives to free ride. Moreover, the willingness to concede bargaining power is identified as a novel reason for gradualism. Another key contribution of the study is that it identifies a strong connection between the Core, one of the most important concepts in cooperative game theory, and the set of environments for which efficiency is attained even without renegotiation.