799 results for optimal and suboptimal quality RNA


Relevance:

100.00%

Publisher:

Abstract:

Bargaining is the building block of many economic interactions, ranging from bilateral to multilateral encounters and from situations in which the actors are individuals to negotiations between firms or countries. In all these settings, economists have long been intrigued by the fact that some projects, trades or agreements are not realized even though they are mutually beneficial. On the one hand, this has been explained by incomplete information. A firm may not be willing to offer a wage that is acceptable to a qualified worker, because it knows that there are also unqualified workers and cannot distinguish between the two types. This phenomenon is known as adverse selection. On the other hand, it has been argued that even with complete information, the presence of externalities may impede efficient outcomes. To see this, consider the example of climate change. If a subset of countries agrees to curb emissions, non-participant regions benefit from the signatories' efforts without incurring costs. These free-riding opportunities give rise to incentives to strategically improve one's bargaining power that work against the formation of a global agreement. This thesis is concerned with extending our understanding of both factors, adverse selection and externalities. The findings are based on empirical evidence from original laboratory experiments as well as game-theoretic modeling. On a general note, it is demonstrated that the institutions through which agents interact matter to a large extent. Insights are provided about which institutions we should expect to perform better than others, at least in terms of aggregate welfare. Chapters 1 and 2 focus on the problem of adverse selection. Effective operation of markets and other institutions often depends on good information transmission properties. 
In terms of the example introduced above, a firm is only willing to offer high wages if it receives enough positive signals about the worker's quality during the application and wage bargaining process. In Chapter 1, it is shown that repeated interaction coupled with time costs facilitates information transmission. By making the wage bargaining process costly for the worker, the firm is able to obtain more accurate information about the worker's type. The cost could be a pure time cost from delaying agreement or a cost of effort arising from a multi-step interviewing process. In Chapter 2, I abstract from time costs and show that communication can play a similar role. The simple fact that a worker claims to be of high quality may be informative. In Chapter 3, the focus is on a different source of inefficiency. Agents strive for bargaining power and thus may be motivated by incentives that are at odds with the socially efficient outcome. I have already mentioned the example of climate change. Other examples are coalitions within committees that are formed to secure voting power to block outcomes, or groups that commit to different technological standards although a single standard would be optimal (e.g. the format war between HD DVD and Blu-ray). It will be shown that such inefficiencies are directly linked to the presence of externalities and a certain degree of irreversibility in actions. I now discuss the three articles in more detail. In Chapter 1, Olivier Bochet and I study a simple bilateral bargaining institution that eliminates trade failures arising from incomplete information. In this setting, a buyer makes offers to a seller in order to acquire a good. Whenever an offer is rejected by the seller, the buyer may submit a further offer. Bargaining is costly because both parties suffer a (small) time cost after any rejection. The difficulty arises because the good can be of low or high quality, and the quality of the good is known only to the seller. 
Indeed, without the possibility of making repeated offers, it is too risky for the buyer to offer prices that allow for trade of high-quality goods. When repeated offers are allowed, however, in equilibrium both types of goods trade with probability one. We provide an experimental test of these predictions. Buyers gather information about sellers using specific price offers, and rates of trade are high, in line with the model's qualitative predictions. We also observe a persistent over-delay before trade occurs, and this reduces efficiency substantially. Possible channels for over-delay are identified in the form of two behavioral assumptions missing from the standard model, loss aversion (buyers) and haggling (sellers), which reconcile the data with the theoretical predictions. Chapter 2 also studies adverse selection, but interaction between buyers and sellers now takes place within a market rather than in isolated pairs. Remarkably, in a market it suffices to let agents communicate in a very simple manner to mitigate trade failures. The key insight is that better informed agents (sellers) are willing to truthfully reveal their private information, because by doing so they are able to reduce search frictions and attract more buyers. Behavior observed in the experimental sessions closely follows the theoretical predictions. As a consequence, costless and non-binding communication (cheap talk) significantly raises rates of trade and welfare. Previous experiments have documented that cheap talk alleviates inefficiencies due to asymmetric information; those findings are explained by pro-social preferences and lie aversion. I use appropriate control treatments to show that such considerations play only a minor role in our market. Instead, the experiment highlights the ability to organize markets as a new channel through which communication can facilitate trade in the presence of private information. 
In Chapter 3, I theoretically explore coalition formation via multilateral bargaining under complete information. The environment studied is extremely rich in the sense that the model allows for all kinds of externalities. This is achieved by using so-called partition functions, which pin down a coalitional worth for each possible coalition in each possible coalition structure. It is found that although binding agreements can be written, efficiency is not guaranteed, because the negotiation process is inherently non-cooperative. The prospects of cooperation are shown to crucially depend on i) the degree to which players can renegotiate and gradually build up agreements and ii) the absence of a certain type of externalities that can loosely be described as incentives to free ride. Moreover, the willingness to concede bargaining power is identified as a novel reason for gradualism. Another key contribution of the study is that it identifies a strong connection between the Core, one of the most important concepts in cooperative game theory, and the set of environments for which efficiency is attained even without renegotiation.
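The connection to the Core mentioned above can be made concrete with a minimal membership check for a transferable-utility game. The three-player game and the payoff numbers below are hypothetical illustrations, not taken from the thesis:

```python
from itertools import combinations

def is_in_core(v, allocation, players):
    """Check Core membership of a payoff allocation in a TU game.

    v maps frozenset coalitions to their worth; allocation maps each
    player to a payoff. The allocation is in the Core if it is efficient
    (it distributes exactly the grand coalition's worth) and no coalition
    could secure more for its members on its own.
    """
    grand = frozenset(players)
    if abs(sum(allocation.values()) - v[grand]) > 1e-9:
        return False                      # not efficient
    for size in range(1, len(players)):
        for coalition in combinations(players, size):
            if sum(allocation[p] for p in coalition) < v[frozenset(coalition)] - 1e-9:
                return False              # this coalition would deviate
    return True

# A hypothetical symmetric three-player game:
v = {frozenset(s): w for s, w in
     [('', 0), ('A', 0), ('B', 0), ('C', 0),
      ('AB', 4), ('AC', 4), ('BC', 4), ('ABC', 9)]}
print(is_in_core(v, {'A': 3, 'B': 3, 'C': 3}, 'ABC'))  # True
print(is_in_core(v, {'A': 6, 'B': 2, 'C': 1}, 'ABC'))  # False: B and C together get 3 < 4
```

When the Core is non-empty, as here, there exists a division of the surplus that no coalition wants to block, which is the kind of environment in which the chapter finds efficiency attainable even without renegotiation.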

Relevance:

100.00%

Publisher:

Abstract:

It is well established that the balance of costimulatory and inhibitory signals during interactions with dendritic cells (DCs) determines T cell transition from a naïve to an activated or tolerant/anergic status. Although many of these molecular interactions are well reproduced in reductionist in vitro assays, the highly dynamic motility of naïve T cells in lymphoid tissue acts as an additional lever to fine-tune their activation threshold. T cell detachment from DCs providing suboptimal stimulation allows them to search for DCs with higher levels of stimulatory signals, while storing a transient memory of short encounters. In turn, adhesion of weakly reactive T cells to DCs presenting low-affinity peptides on major histocompatibility complex molecules is prevented by lipid mediators. Finally, controlled recruitment of CD8(+) T cells to cognate DC-CD4(+) T cell clusters shapes memory T cell formation and the quality of the immune response. Dynamic physiological lymphocyte motility therefore constitutes a mechanism to mitigate low-avidity T cell activation and to improve the search for "optimal" DCs, while contributing to peripheral tolerance induction in the absence of inflammation.

Relevance:

100.00%

Publisher:

Abstract:

Anion exchange membranes (AEMs) offer a potential method for determining the plant-available N status of soils; however, their use with turfgrass has not been researched extensively. The main objective of this experiment was to determine the relationship between soil nitrate desorbed from AEMs and the growth response and quality of turfgrass managed as a residential lawn. Two field experiments were conducted with a bluegrass-ryegrass-fescue mixture receiving four rates of N fertilizer (0, 98, 196, and 392 kg N ha^-1 yr^-1) with clippings returned or removed. The soils at the two sites were a Paxton fine sandy loam (coarse-loamy, mixed, active, mesic Oxyaquic Dystrudepts) and a variant of a Hinckley gravelly sandy loam (sandy-skeletal, mixed, mesic Typic Udorthents). Anion exchange membranes were inserted into plots and exchanged weekly during the growing seasons of 1998 and 1999. Nitrate-N was desorbed from the AEMs and quantified. As N fertilization rates increased, desorbed NO3-N increased. The relationship of desorbed NO3-N from AEMs to clipping yield and turfgrass quality was characterized using quadratic response plateau (QRP) and Cate-Nelson (C-N) models. Critical levels of desorbed NO3-N ranged from 0.86 to 8.0 µg cm^-2 d^-1 for relative dry matter yield (DMY) and from 2.3 to 12 µg cm^-2 d^-1 for turfgrass quality, depending upon experimental treatment. Anion exchange membranes show promise for indicating the critical levels of desorbed soil NO3-N necessary to achieve maximum turfgrass quality and yield without overapplication of N.
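The quadratic response plateau model used for the critical levels can be sketched as follows; the parameter values are hypothetical and chosen only to show how the critical level arises as the joint between the quadratic segment and the plateau:

```python
def qrp(x, a, b, c):
    """Quadratic response plateau: response rises as a + b*x + c*x**2
    (with c < 0) until the joint x0 = -b / (2*c), then stays flat at the
    plateau value reached there."""
    x0 = -b / (2 * c)
    if x < x0:
        return a + b * x + c * x * x
    return a + b * x0 + c * x0 * x0

# Hypothetical parameters: relative yield plateaus at a critical
# desorbed NO3-N rate of x0 = -15 / (2 * -1.0) = 7.5 (units as above):
a, b, c = 40.0, 15.0, -1.0
print(qrp(5.0, a, b, c))   # 90.0, still on the rising quadratic
print(qrp(7.5, a, b, c))   # 96.25, the plateau value at the joint
print(qrp(10.0, a, b, c))  # 96.25, unchanged beyond the critical level
```

In practice the parameters are fit to field data by nonlinear least squares, and the fitted joint x0 is reported as the critical level.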

Relevance:

100.00%

Publisher:

Abstract:

Background. Insufficient and poor-quality sleep among adolescents affects not only cognitive functioning but the overall health of the individual. Existing research suggests that adolescents from varying ethnic groups exhibit differing sleep patterns. However, little research focuses on sleep patterns and associated factors (e.g. tobacco use, mental health indicators) among Hispanic youth.

Methods. The study population (n=2,536) included students in grades 9-12 who attended one of three public high schools along the Texas-Mexico border in 2003. This was a cross-sectional study using secondary data collected via a web-based, confidential, self-administered survey. Separate logistic regression models were estimated to identify factors associated with reduced (<9 hours/night) and poor-quality sleep on average during weeknights.

Results. Of the participants, 49.5% reported reduced sleep while 12.8% reported poor-quality sleep. Factors significantly (p<0.05) associated with poor-quality sleep were: often feeling stressed or anxious (OR=5.49), being born in Mexico (OR=0.65), using a computer/playing video games 15+ hours per week (OR=2.29), working (OR=1.37), being a current smoker (OR=2.16), and being a current alcohol user (OR=1.64). Factors significantly associated with reduced quantity of sleep were: often feeling stressed or anxious (OR=2.74), often having headaches/stomachaches (OR=1.77), being a current marijuana user (OR=1.70), being a current methamphetamine user (OR=4.92), and being a current alcohol user (OR=1.27).

Discussion. Previous research suggests that several factors can influence sleep quality and quantity in adolescents. This paper discusses these factors (e.g. work, smoking, alcohol) found to be associated with poor sleep quality and reduced sleep quantity in this Hispanic adolescent population. 
A reduced quantity of sleep (81.20% of participants) and a poor quality of sleep (12.80% of participants) were also found in high school students from South Texas.
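As a reading aid for the odds ratios reported above, the sketch below converts odds to probabilities; the baseline odds of 0.10 are a hypothetical value, not a figure from the study:

```python
def prob_from_odds(odds):
    """Convert odds to a probability: p = odds / (1 + odds)."""
    return odds / (1 + odds)

# With hypothetical baseline odds of poor-quality sleep of 0.10, the
# reported OR of 5.49 for often feeling stressed or anxious multiplies
# the odds, moving the probability from about 9% to about 35%:
print(round(prob_from_odds(0.10), 3))         # 0.091
print(round(prob_from_odds(0.10 * 5.49), 3))  # 0.354
```

This is why an OR well above 1, such as 5.49, signals a substantial increase in risk even though it is not itself a probability ratio.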

Relevance:

100.00%

Publisher:

Abstract:

During the healthcare reform debate in the United States in 2009/2010, many health policy experts expressed concern that expanding coverage would increase waiting times for patients to obtain care, and many argued that delays in obtaining care would in turn compromise the quality of healthcare in the United States. Using data from The Commonwealth Fund 2010 International Health Policy Survey in Eleven Countries, this study explored the relationship between wait times and quality of care, employing a wait time scale and several quality of care indicators present in the dataset. The impact of wait times on quality was assessed; increased wait time was expected to reduce quality of care. However, this study found that wait times correlated with better health outcomes for some measures and had no association with others. Since this is a pilot study and statistical significance was not achieved for any of the correlations, further research is needed to confirm and deepen the findings. If future studies confirm this finding, however, an emphasis on reducing wait times at the expense of other health-system-level performance variables may be inappropriate.

Relevance:

100.00%

Publisher:

Abstract:

My dissertation focuses on two aspects of RNA sequencing technology. The first is methodology for modeling the overdispersion inherent in RNA-seq data for differential expression analysis; this aspect is addressed in three sections. The second is the application of RNA-seq data to identify the CpG island methylator phenotype (CIMP) by integrating datasets of mRNA expression level and DNA methylation status. Section 1: The cost of DNA sequencing has fallen dramatically in the past decade. Consequently, genomic research increasingly depends on sequencing technology. However, it remains unclear how sequencing capacity influences the accuracy of mRNA expression measurement. We observe that accuracy improves with increasing sequencing depth. To model the overdispersion, we use the beta-binomial distribution with a new parameter capturing the dependency between overdispersion and sequencing depth. Our modified beta-binomial model performs better than the binomial or the pure beta-binomial model, with a lower false discovery rate. Section 2: Although a number of methods have been proposed to accurately analyze differential RNA expression at the gene level, modeling at the base-pair level is also required. Here, we find that the overdispersion rate decreases as the sequencing depth increases at the base-pair level. We also propose four models and compare them with each other; as expected, our beta-binomial model with a dynamic overdispersion rate proves superior. Section 3: We investigate biases in RNA-seq by exploring the measurement of an external control, spike-in RNA. This study is based on two datasets with spike-in controls obtained from a recent study. We observe a previously undiscovered bias in the measurement of the spike-in transcripts that arises from the influence of the sample transcripts in RNA-seq. We also find that this influence is related to the local sequence of the random hexamer used in priming. 
We suggest a model of this inequality between samples to correct this type of bias. Section 4: The expression of a gene can be turned off when its promoter is highly methylated. Several studies have reported a clear threshold effect in gene silencing mediated by DNA methylation. It is reasonable to assume the thresholds are specific to each gene. It is also intriguing to investigate genes that are largely controlled by DNA methylation; we call these "L-shaped" genes. We develop a method to determine the DNA methylation threshold and identify a new CIMP of breast cancer (BRCA). In conclusion, we provide a detailed understanding of the relationship between the overdispersion rate and sequencing depth. We also reveal a new bias in RNA-seq and provide a detailed understanding of the relationship between this bias and the local sequence. Finally, we develop a powerful method to dichotomize methylation status and consequently identify a new CIMP of breast cancer with a distinct classification of molecular characteristics and clinical features.
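The overdispersion that motivates the beta-binomial model can be illustrated numerically. The sketch below compares binomial and beta-binomial count variances at the same mean proportion; the parameter values are hypothetical, and the dissertation's depth-dependent parameterization is not reproduced here:

```python
def binomial_var(n, p):
    """Variance of a binomial read count: n * p * (1 - p)."""
    return n * p * (1 - p)

def beta_binomial_var(n, alpha, beta):
    """Variance of a beta-binomial read count: the binomial variance
    inflated by 1 + (n - 1) * rho, where rho = 1 / (alpha + beta + 1)
    is the overdispersion parameter."""
    p = alpha / (alpha + beta)
    rho = 1.0 / (alpha + beta + 1.0)
    return n * p * (1 - p) * (1 + (n - 1) * rho)

# At the same mean proportion p = 0.2, counts over n = 100 reads are
# far more dispersed under the beta-binomial model:
print(round(binomial_var(100, 0.2), 2))            # 16.0
print(round(beta_binomial_var(100, 2.0, 8.0), 2))  # 160.0
```

A model that ignores this extra variance understates the noise in read counts and inflates the false discovery rate, which is the failure mode the modified beta-binomial model is designed to avoid.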

Relevance:

100.00%

Publisher:

Abstract:

The objectives of this dissertation were to evaluate the health outcomes, quality improvement measures, and long-term cost-effectiveness and impact on diabetes-related microvascular and macrovascular complications of a community health worker-led, culturally tailored diabetes education and management intervention provided to uninsured Mexican Americans in an urban faith-based clinic. A prospective, randomized controlled repeated-measures design was employed to compare the intervention effects between: (1) an intervention group (n=90) that participated in the Community Diabetes Education (CoDE) program along with usual medical care; and (2) a wait-listed comparison group (n=90) that received only usual medical care. Changes in hemoglobin A1c (HbA1c) and secondary outcomes (lipid status, blood pressure, and body mass index) were assessed using linear mixed models and an intention-to-treat approach. The CoDE group experienced a greater reduction in HbA1c (-1.6%, p<.001) than the control group (-.9%, p<.001) over the 12-month study period. After adjusting for the group-by-time interaction, antidiabetic medication use at baseline, changes made to the antidiabetic regimen over the study period, duration of diabetes, and baseline HbA1c, a statistically significant intervention effect on HbA1c (-.7%, p=.02) was observed for CoDE participants. Process and outcome quality measures were evaluated using multiple mixed-effects logistic regression models. Assessment of quality indicators revealed that the CoDE intervention group was significantly more likely to have received a dilated retinal examination than the control group, and 53% achieved an HbA1c below 7% compared with 38% of control group subjects. Long-term cost-effectiveness and impact on diabetes-related health outcomes were estimated through simulation modeling using the rigorously validated Archimedes Model. 
Over a 20-year time horizon, CoDE participants were forecast to have less proliferative diabetic retinopathy, fewer foot ulcers, and fewer foot amputations than control group subjects who received usual medical care. An incremental cost-effectiveness ratio of $355 per quality-adjusted life-year gained was estimated for CoDE intervention participants over the same time period. The results from the three areas of program evaluation (impact on short-term health outcomes, quantification of improvement in quality of diabetes care, and projection of long-term cost-effectiveness and impact on diabetes-related health outcomes) provide evidence that a community health worker can be a valuable resource for reducing diabetes disparities among uninsured Mexican Americans. This evidence supports the formal integration of community health workers as members of the diabetes care team.
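The incremental cost-effectiveness ratio cited above follows a standard formula. The per-patient cost and QALY inputs below are hypothetical and chosen only so that the ratio matches the reported $355 per QALY:

```python
def icer(cost_new, cost_old, qaly_new, qaly_old):
    """Incremental cost-effectiveness ratio: additional cost per
    additional quality-adjusted life-year gained by the intervention."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Hypothetical 20-year per-patient figures (not from the dissertation):
# the intervention costs $355 more and yields one extra QALY.
print(icer(12355.0, 12000.0, 9.0, 8.0))  # 355.0
```

Ratios this low fall far below common willingness-to-pay thresholds, which is what makes the community health worker program look highly cost-effective.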

Relevance:

100.00%

Publisher:

Abstract:

Background: For most cytotoxic and biologic anti-cancer agents, the response rate of the drug is commonly assumed to be non-decreasing with increasing dose. However, an increasing dose does not always result in an appreciable increase in the response rate. This may especially be true at high doses of a biologic agent. Therefore, in a phase II trial the investigators may be interested in testing the anti-tumor activity of a drug at more than one (often two) doses, instead of only at the maximum tolerated dose (MTD). This way, when the lower dose appears equally effective, that dose can be recommended for further confirmatory testing in a phase III trial in light of potential long-term toxicity and cost considerations. A common approach to designing such a phase II trial has been to use an independent (e.g., Simon's two-stage) design at each dose, ignoring the prior knowledge about the ordering of the response probabilities at the different doses. However, failure to account for this ordering constraint in estimating the response probabilities may result in an inefficient design. In this dissertation, we developed extensions of Simon's optimal and minimax two-stage designs, including both frequentist and Bayesian methods, for two doses under the assumption of ordered response rates between doses.

Methods: Optimal and minimax two-stage designs are proposed for phase II clinical trials in settings where the true response rates at two dose levels are ordered. We borrow strength between doses using isotonic regression and control the joint and/or marginal error probabilities. Bayesian two-stage designs are also proposed under a stochastic ordering constraint.

Results: Compared to Simon's designs, when controlling the power and type I error at the same levels, the proposed frequentist and Bayesian designs reduce the maximum and expected sample sizes. Most of the proposed designs also increase the probability of early termination when the true response rates are poor. 
Conclusion: The proposed frequentist and Bayesian designs are superior to Simon's designs in terms of operating characteristics (expected sample size and probability of early termination when the response rates are poor). Thus, the proposed designs lead to more cost-efficient and ethical trials, and may consequently improve and expedite the drug discovery process. The proposed designs may be extended to designs of multiple-group trials and drug-combination trials.
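For reference, the operating characteristics these comparisons rest on, the probability of early termination and the expected sample size, can be computed directly for Simon's classic single-dose two-stage design. The sketch below uses the published optimal design for p0 = 0.10 vs. p1 = 0.30 with alpha = beta = 0.10 (r1/n1 = 1/10, r/n = 5/29):

```python
from math import comb

def binom_pmf(k, n, p):
    """Binomial probability of exactly k responses in n patients."""
    return comb(n, k) * p ** k * (1 - p) ** (n - k)

def simon_oc(r1, n1, r, n, p):
    """Operating characteristics of Simon's two-stage design (r1/n1, r/n):
    stop for futility after stage 1 if at most r1 of n1 patients respond;
    otherwise enroll n - n1 more and declare the drug promising only if
    the total number of responses exceeds r."""
    pet = sum(binom_pmf(k, n1, p) for k in range(r1 + 1))  # early termination
    accept = sum(
        binom_pmf(k1, n1, p) *
        sum(binom_pmf(k2, n - n1, p)
            for k2 in range(max(0, r - k1 + 1), n - n1 + 1))
        for k1 in range(r1 + 1, n1 + 1))
    en = n1 + (1 - pet) * (n - n1)                         # expected sample size
    return pet, en, accept

# Under the null rate p0 = 0.10 this design stops early about 74% of the
# time and enrolls about 15 patients on average; by construction the
# probability of wrongly declaring the drug promising stays below 0.10:
pet0, en0, alpha = simon_oc(1, 10, 5, 29, 0.10)
print(round(pet0, 3), round(en0, 1))  # 0.736 15.0
```

The two-dose extensions proposed in the dissertation are evaluated on exactly these quantities, with the ordering constraint used to shrink them further.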

Relevance:

100.00%

Publisher:

Abstract:

We use a panel dataset from Bangladesh to examine the relationship between fertility and the adoption of electricity with the latter instrumented by infrastructure development and the quality of service delivery. We find that the adoption of electricity reduces fertility, and this impact is more pronounced when the household already has two or more children. This observation can be explained by a simple household model of time use, in which adoption of electricity affects only the optimal number of children but not necessarily current fertility behavior if the optimal number has not yet been reached.

Relevance:

100.00%

Publisher:

Abstract:

The high number of import rejections of food commodities suggests that producers in exporting countries are not complying with established standards. To understand why, we explore the behavior of producers and consumers in developing countries. First, we examine the successful transformation of production practices adopted by shrimp producers in Thailand. Underpinning this dramatic change in practices, we observe an important role played by the public sector in providing the means to visualize chemical residues and to control processes upstream in the supply chain via a registration system and a traceability system called Movement Document. Furthermore, very active information sharing by the private sector contributes to the dissemination of useful technical and market information among producers. We also examine the knowledge and perceptions of consumers with respect to food safety in Vietnam. We find that consumers in Hanoi and Ho Chi Minh City behave differently toward the third-party certification VietGAP, probably owing to differences in the history of market mechanisms between the two cities.

Relevance:

100.00%

Publisher:

Abstract:

The paper proposes a model for estimating perceived video quality in IPTV, taking as input both video coding and network Quality of Service parameters. The model includes fitting parameters that depend mainly on the information content of the video sequences, and a method is proposed to derive them from the Spatial and Temporal Information content of the sequences. The model may be used for near real-time monitoring of IPTV video quality.

Relevance:

100.00%

Publisher:

Abstract:

Soybean meal (SBM) is the main protein source in livestock feeds. The United States (USA), Brazil (BRA), and Argentina (ARG) are the major SBM-exporting countries. The nutritive value of SBM varies because genetics, environment, farming conditions, and processing of the beans strongly influence the content and availability of major nutrients. The present research was conducted to determine the influence of origin (USA, BRA, and ARG) on the nutritive value and protein quality of SBM.

Relevance:

100.00%

Publisher:

Abstract:

This paper proposes a way to quantify the emissions of mercury (Hg) and CO2 associated with the manufacture and operation of compact fluorescent lamps with integrated ballasts (CFLi), as well as the economic cost of using them under different operating cycles. The main purpose of this paper is to find simple criteria for minimizing the polluting emissions under consideration and the economic cost of CFLi. A lifetime model is proposed that describes the emissions and costs as a function of the degradation from turning a CFLi on and from its continuous operation. An idealized CFLi model is defined that combines characteristics stated by different manufacturers; in addition, two CFLi models representing poor-quality products are analyzed. It was found that the emissions and costs per unit time of operation of a CFLi depend linearly on the number of times per unit time it is turned on and on the time of continuous operation. The optimal conditions (lowest emissions and costs) depend on the place of manufacture, the place of operation, and the quality of the lamp/ballast components. Finally, it was also found that for each lamp there are off intervals during which emissions of pollutants and costs are identical regardless of how often the lamp is turned on or how long it remains on: up to 5 minutes for CO2 emissions, up to 7 minutes for cost, and up to 43 minutes for Hg emissions. It is therefore advisable not to turn a CFLi back on sooner than 43 minutes after it was last turned off.
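The linear structure of the model can be sketched as follows; the switch-on penalty and running rate are hypothetical values chosen only to reproduce the 5-minute CO2 threshold reported above:

```python
def emissions_rate(switch_penalty, running_rate, on_minutes, off_minutes):
    """Average emissions per minute over one on/off cycle: each switch-on
    adds a fixed penalty (extra lamp wear amortized as emissions), while
    running adds emissions at a constant per-minute rate."""
    cycle = on_minutes + off_minutes
    return (switch_penalty + running_rate * on_minutes) / cycle

def indifference_off_minutes(switch_penalty, running_rate):
    """Off interval at which switching off saves exactly as much as the
    switch-on penalty costs; for shorter intervals, leaving the lamp on
    emits no more than cycling it."""
    return switch_penalty / running_rate

# Hypothetical numbers: a switch-on penalty equivalent to 5 minutes of
# operation yields a 5-minute indifference interval, matching the
# paper's reported CO2 threshold:
print(indifference_off_minutes(5.0, 1.0))             # 5.0
print(round(emissions_rate(5.0, 1.0, 55.0, 5.0), 2))  # 1.0
```

With a larger switch-on penalty, as the paper finds for Hg, the indifference interval stretches accordingly, which is where the 43-minute figure comes from.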

Relevance:

100.00%

Publisher:

Abstract:

In this paper we present TRHIOS: a Trust and Reputation system for HIerarchical and quality-Oriented Societies. We focus our work on hierarchical medical organizations. The model estimates the reputation of an individual, R_TRHIOS, taking into account information from three trust dimensions: the hierarchy of the system, the source of the information, and the quality of the results. Beyond the concrete reputation value, it is important to know how reliable that value is; we therefore calculate the reliability of the assessed reputation for each of the three dimensions and, by aggregating them, the reliability of an individual's overall reputation. The modular approach followed in the definition of the different types of reputation gives the system a high degree of flexibility, allowing the model to be adapted to the peculiarities of each society.
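The abstract does not give the aggregation formula, so the sketch below is only a plausible reading of the reliability-weighted scheme it describes; the dimension scores and reliabilities are hypothetical:

```python
def aggregate_reputation(values, reliabilities):
    """Reliability-weighted aggregate of per-dimension reputations.
    values / reliabilities hold one entry per trust dimension (hierarchy,
    source of information, quality of results), each in [0, 1]. Returns
    the aggregate reputation and an overall reliability for it."""
    total = sum(reliabilities)
    if total == 0:
        return 0.0, 0.0
    reputation = sum(v * r for v, r in zip(values, reliabilities)) / total
    reliability = total / len(reliabilities)   # mean dimension reliability
    return reputation, reliability

# Hypothetical dimension scores for one individual:
rep, rel = aggregate_reputation([0.9, 0.6, 0.8], [0.5, 1.0, 0.75])
print(round(rep, 3), round(rel, 3))  # 0.733 0.75
```

Weighting by reliability means an unreliable dimension (here the hierarchy score of 0.9 with reliability 0.5) contributes less to the aggregate, which matches the paper's emphasis on knowing how trustworthy each reputation value is.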

Relevance:

100.00%

Publisher:

Abstract:

Modeling is an essential tool for the development of atmospheric emission abatement measures and air quality plans. Most often these plans concern urban environments with high emission density and population exposure. However, air quality modeling in urban areas is a rather challenging task. As environmental standards become more stringent (e.g. European Directive 2008/50/EC), more reliable and sophisticated modeling tools are needed to simulate measures and plans that may effectively tackle air quality exceedances, common in large urban areas across Europe, particularly for NO2. This also implies that emission inventories must satisfy a number of conditions, such as consistency across the spatial scales involved in the analysis, consistency with the emission inventories used for regulatory purposes, and versatility to match the requirements of different air quality and emission projection models. This study reports the modeling activities carried out in Madrid (Spain), highlighting the development and preparation of the atmospheric emission inventory as an illustrative example of the combination of models and data needed to develop a consistent air quality plan at the urban level. These activities included a series of source apportionment studies to quantify the contributions of international, national, regional and local sources, in order to understand to what extent local authorities can enforce meaningful abatement measures. Source apportionment studies were also conducted to quantify the contributions of different sectors and to understand the maximum feasible air quality improvement achievable by reducing emissions from those sectors, thus targeting emission reduction policies at the most relevant activities. Finally, an emission scenario reflecting the effect of such policies was developed and the associated air quality was modeled.