988 results for Job loss


Relevance:

20.00%

Publisher:

Abstract:

During mitotic cell cycles, DNA is exposed to many types of endogenous and exogenous damaging agents that can cause double-strand breaks (DSBs). In S. cerevisiae, DSBs are primarily repaired by mitotic recombination, which can result in loss of heterozygosity (LOH). Genetic recombination occurs in both meiosis and mitosis. While the genome-wide distribution of meiotic recombination events has been studied intensively, mitotic recombination events were not mapped in an unbiased manner across the genome until recently. Methods for selecting mitotic crossovers and mapping their positions have recently been developed in our lab. Our current approach uses a diploid yeast strain that is heterozygous for about 55,000 SNPs and employs SNP microarrays to map LOH events throughout the genome. These methods allow us to examine selected crossovers and unselected mitotic recombination events (crossover, noncrossover, and BIR) at about 1 kb resolution across the genome. Using this method, we generated maps of spontaneous and UV-induced LOH events. In this study, we explore machine learning and variable selection techniques to build a predictive model of where LOH events occur in the genome.

From random positions in the yeast genome, we simulated control tracts that resemble the LOH tracts in tract length and in location relative to single-nucleotide polymorphism (SNP) positions. We then extracted roughly 1,100 features, such as base composition, histone modifications, and the presence of tandem repeats, and trained classifiers to distinguish control tracts from LOH tracts. We identified several features with good predictive value. We also found that, with the current repertoire of features, prediction is generally better for spontaneous LOH events than for UV-induced LOH events.
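The control-tract simulation and feature extraction described above can be sketched as follows. The toy genome, tract coordinates, and the single GC-content feature are invented for illustration; the actual study extracted roughly 1,100 features and trained proper classifiers on them.

```python
import random

random.seed(0)

# Hypothetical toy genome and LOH tracts (start, length); the real study
# used a yeast genome with ~55,000 heterozygous SNP positions.
genome = "".join(random.choice("ACGT") for _ in range(10_000))
loh_tracts = [(1200, 400), (4500, 250), (7800, 600)]

def sample_controls(tracts, genome_len, rng):
    """Sample control tracts with the same lengths at random positions."""
    return [(rng.randrange(0, genome_len - length), length)
            for _, length in tracts]

def gc_content(seq):
    """Fraction of G/C bases: one example feature out of the ~1,100 used."""
    return (seq.count("G") + seq.count("C")) / len(seq)

controls = sample_controls(loh_tracts, len(genome), random)

# Label 1 = LOH tract, 0 = control; a real model would feed many such
# features into a classifier rather than a single column.
dataset = [(gc_content(genome[s:s + l]), 1) for s, l in loh_tracts] + \
          [(gc_content(genome[s:s + l]), 0) for s, l in controls]

for feature, label in dataset:
    print(f"label={label}  gc={feature:.3f}")
```

The key design point is that control tracts are matched to the real tracts in length, so that any feature separating the two classes reflects genomic context rather than tract size.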


BACKGROUND: The obesity epidemic has spread to young adults, with significant public health implications later in adulthood. Intervention in early adulthood may be an effective public health strategy for reducing the long-term health impact of the epidemic. Few weight loss trials have been conducted in young adults, and it is unclear which weight loss strategies are beneficial in this population. PURPOSE: To describe the design and rationale of the NHLBI-sponsored Cell Phone Intervention for You (CITY) study, a single-center, randomized, three-arm trial that compares the impact on weight loss of 1) a behavioral intervention delivered almost entirely via cell phone technology (Cell Phone group); and 2) a behavioral intervention delivered mainly through monthly personal coaching calls enhanced by self-monitoring via cell phone (Personal Coaching group), each compared with 3) a usual-care, advice-only control condition. METHODS: A total of 365 community-dwelling overweight/obese adults aged 18-35 years were randomized to receive one of these three interventions for 24 months in a parallel-group design. Study personnel assessing outcomes were blinded to group assignment. The primary outcome is weight change at 24 months. We hypothesize that each active intervention will produce more weight loss than the usual-care condition. Study completion is anticipated in 2014. CONCLUSIONS: If effective, implementation of the CITY interventions could mitigate the alarming rates of obesity in young adults through promotion of weight loss. ClinicalTrials.gov: NCT01092364.


Obesity has spread to all segments of the U.S. population. Young adults, aged 18-35 years, are rarely represented in clinical weight loss trials. We conducted a qualitative study to identify factors that may facilitate recruitment of young adults into a weight loss intervention trial. Participants were 33 adults aged 18-35 years with BMI ≥25 kg/m². Six group discussions were conducted using the nominal group technique. Health, social image, and "self" factors such as emotions, self-esteem, and confidence were reported as reasons to pursue weight loss. Physical activity, dietary intake, social support, medical intervention, and taking control (e.g. being motivated) were perceived as the best weight loss strategies. Incentives, positive outcomes, education, convenience, and social support were endorsed as reasons young adults would consider participating in a weight loss study. Incentives, advertisement, emphasizing benefits, and convenience were endorsed as ways to recruit young adults. These results informed the marketing and advertising for the Cell Phone Intervention for You (CITY) study, including message framing and advertising avenues. Implications for recruitment methods are discussed.


Obesity, currently an epidemic, is a difficult disease to combat because it is marked by both a change in body weight and an underlying dysregulation in metabolism, making consistent weight loss challenging. We sought to elucidate this metabolic dysregulation resulting from diet-induced obesity (DIO) that persists through subsequent weight loss. We hypothesized that weight gain imparts a change in “metabolic set point” persisting through subsequent weight loss and that this modification may involve a persistent change in hepatic AMP-activated protein kinase (AMPK), a key energy-sensing enzyme in the body. To test these hypotheses, we tracked metabolic perturbations through this period, measuring changes in hepatic AMPK. To further understand the role of AMPK we used AICAR, an AMPK activator, following DIO. Our findings established a more dynamic metabolic model of DIO and subsequent weight loss. We observed hepatic AMPK elevation following weight loss, but AICAR administration without similar dieting was unsuccessful in improving metabolic dysregulation. Our findings provide an approach to modeling DIO and subsequent dieting that can be built upon in future studies and hopefully contribute to more effective long-term treatments of obesity.


BACKGROUND: The wide range of complex photic systems observed in birds exemplifies one of their key evolutionary adaptations, a well-developed visual system. However, genomic approaches have yet to be used to disentangle the evolutionary mechanisms that govern the evolution of avian visual systems. RESULTS: We performed comparative genomic analyses across 48 avian genomes that span extant bird phylogenetic diversity to assess evolutionary changes in 17 representatives of the opsin gene family and five plumage coloration genes. Our analyses suggest modern birds have maintained a repertoire of up to 15 opsins. Synteny analyses indicate that the PARA and PARIE pineal opsins were lost, probably in conjunction with the degeneration of the parietal organ. Eleven of the 15 avian opsins evolved in a non-neutral pattern, confirming the adaptive importance of vision in birds. The visual cone opsins sw1, sw2 and lw evolved under negative selection, while the dim-light RH1 photopigment diversified. The evolutionary patterns of sw1 and of violet/ultraviolet sensitivity in birds suggest that avian ancestors had violet-sensitive vision. Additionally, we demonstrate an adaptive association between the RH2 opsin and the MC1R plumage color gene, suggesting that plumage coloration has been photically mediated. At the intra-avian level we observed some unique adaptive patterns. For example, the barn owl showed early signs of pseudogenization in RH2, perhaps in response to nocturnal behavior, and penguins had amino acid deletions in RH2 sites responsible for the red shift and retinal binding. These patterns in the barn owl and penguins were convergent with adaptive strategies in nocturnal and aquatic mammals, respectively.
CONCLUSIONS: We conclude that birds have evolved diverse opsin adaptations through gene loss, adaptive selection and coevolution with plumage coloration, and that differentiated selective patterns at the species level suggest that novel photic pressures influence the evolutionary patterns of more recent lineages.


BACKGROUND: Illicit cigarettes comprise more than 11% of tobacco consumption and 17% of consumption in low- and middle-income countries. Illicit cigarettes, defined as those that evade taxes, lower consumer prices, threaten national tobacco control efforts, and reduce excise tax collection. METHODS: This paper measures the magnitude of illicit cigarette consumption within Indonesia using two methods: the discrepancy between legal cigarette sales and domestic consumption estimated from surveys, and the discrepancy between imports recorded by Indonesia and exports recorded by its trade partners. Smuggling plays a minor role in the availability of illicit cigarettes because Indonesians predominantly consume kreteks, which are manufactured primarily in Indonesia. RESULTS: Over the period from 1995 to 2013, illicit cigarettes first emerged in 2004. Assuming no respondent under-reporting, illicit consumption makes up 17% of the domestic market in 2004, 9% in 2007, 11% in 2011, and 8% in 2013. Discrepancies in the trade data indicate that Indonesia was a recipient of smuggled cigarettes in each year between 1995 and 2012. The value of this illicit trade ranges from less than $1 million to nearly $50 million annually. Singapore, China, and Vietnam together accounted for nearly two-thirds of the trade discrepancies over the period. Tax losses due to illicit consumption amount to between Rp 4.1 trillion and Rp 9.3 trillion, or 4% to 13% of tobacco excise revenue, in 2011 and 2013. CONCLUSIONS: Given the predominance of kretek consumption in Indonesia and Indonesia's status as the predominant producer of kreteks, illicit domestic production is likely the most important source of illicit cigarettes, and initiatives targeted at combating this illicit production promise the greatest potential impact.
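The first measurement method, the gap between survey-estimated consumption and tax-paid legal sales, reduces to simple arithmetic. The volumes below are hypothetical round numbers chosen for illustration, not the study's actual Indonesian figures.

```python
def illicit_share(survey_consumption, legal_sales):
    """Gap between survey-estimated consumption and tax-paid sales,
    expressed as a share of the domestic market; clamped at zero when
    recorded sales exceed the survey estimate."""
    gap = survey_consumption - legal_sales
    return max(gap, 0) / survey_consumption

# Hypothetical volumes in billions of sticks (illustration only).
share = illicit_share(survey_consumption=300.0, legal_sales=249.0)
print(f"{share:.0%}")  # prints "17%"
```

Note that this estimate is sensitive to respondent under-reporting in the survey: if smokers under-report, measured consumption shrinks and the computed illicit share falls, which is why the paper states its results under an explicit no-under-reporting assumption.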


The normal immune response of A/J mice against arsonate coupled to hemocyanin is characterized by a major recurrent cross-reactive idiotype (Id), the CRIA. This Id is encoded by a single gene segment combination: VHidcr11-DFL16.1e-JH2 for the H chain and Vkidcr-Jk1 for the L chain. In this report, we show that lethal irradiation of A/J mice followed by reconstitution with autologous or syngeneic lymphoid cells results in loss of major CRIA Id expression in the response to arsonate. Different protocols were used to repopulate the irradiated mice. First, lethally irradiated A/J mice were reconstituted by the transfer of syngeneic bone marrow cells. Second, A/J mice were lethally irradiated while their hind limbs were partially shielded. Third, lethally irradiated A/J mice received a transfer of syngeneic spleen cells. All three groups of mice produced high titers of antiarsonate antibodies completely devoid of CRIA DH-JH-related idiotope expression. Moreover, a lack of affinity maturation is observed in the secondary antiarsonate response of all irradiated and reconstituted mice. Transfer of syngeneic peritoneal cells or of primed T cells into irradiated and reconstituted A/J mice does not significantly restore either the recurrent CRIA expression or the affinity maturation of the antiarsonate response. Our data suggest that the choice of this Id is not solely dictated by the Igh locus.


The paper presents an improved version of the greedy open shop approximation algorithm with pre-ordering of jobs. It is shown that the algorithm compares favorably with the greedy algorithm with no pre-ordering by reducing either its absolute or relative error. In the case of three machines, the new algorithm creates a schedule with the makespan that is at most 3/2 times the optimal value.
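A greedy open shop schedule of the kind discussed above can be sketched as follows. This is a minimal simulation in which jobs are pre-ordered by decreasing total processing time; the specific tie-breaking rule is our own simplification, not the paper's exact algorithm.

```python
def greedy_open_shop(p):
    """Greedy open shop schedule for processing-time matrix p[job][machine].
    An idle machine always starts a waiting operation of some job that is
    not currently being processed elsewhere, preferring jobs earlier in a
    pre-order by decreasing total processing time. Returns the makespan."""
    n, m = len(p), len(p[0])
    order = sorted(range(n), key=lambda j: -sum(p[j]))
    remaining = {j: set(range(m)) for j in range(n)}   # machines each job still needs
    machine_free = [0] * m    # time each machine next becomes idle
    job_free = [0] * n        # time each job next becomes free
    while any(remaining[j] for j in range(n)):
        # pick the (machine, job) pair with the earliest feasible start
        best = None
        for k in range(m):
            for rank, j in enumerate(order):
                if k in remaining[j]:
                    start = max(machine_free[k], job_free[j])
                    key = (start, rank)
                    if best is None or key < best[0]:
                        best = (key, j, k)
        (start, _), j, k = best
        finish = start + p[j][k]
        machine_free[k] = job_free[j] = finish
        remaining[j].discard(k)
    return max(machine_free)

print(greedy_open_shop([[1, 2], [2, 1]]))  # makespan 3 (optimal here)
```

Since no machine is ever left idle while it has a feasible waiting operation, the resulting makespan never exceeds the largest machine workload plus the total processing time of a single job, which is the kind of bound the greedy analysis sharpens.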


We consider two “minimum” NP-hard job shop scheduling problems to minimize the makespan. In one of the problems, every job has to be processed on at most two out of three available machines. In the other problem there are two machines, and a job may visit one of the machines twice. For each problem, we define a class of heuristic schedules in which certain subsets of operations are kept as blocks on the corresponding machines. We show that for each problem the value of the makespan of the best schedule in that class cannot be less than 3/2 times the optimal value, and present algorithms that guarantee a worst-case ratio of 3/2.


We present here a decoupling technique to tackle the entanglement of the nonlinear boundary condition and the movement of the char/virgin front for a thermal pyrolysis model for charring materials. Standard numerical techniques to solve moving front problems — often referred to as Stefan problems — encounter difficulties when dealing with nonlinear boundaries. While special integral methods have been developed to solve this problem, they suffer from several limitations which the technique described here overcomes. The newly developed technique is compared with the exact analytical solutions for some simple ideal situations which demonstrate that the numerical method is capable of producing accurate numerical solutions. The pyrolysis model is also used to simulate the mass loss process from a white pine sample exposed to a constant radiative flux in a nitrogen atmosphere. Comparison with experimental results demonstrates that the predictions of mass loss rates and temperature profile within the solid material are in good agreement with the experiment.
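The decoupling idea above can be illustrated on a one-phase toy problem: each time step first advances the temperature field with the front frozen, then moves the front using the Stefan condition. This sketch uses unit material constants, a linear boundary condition, and a simple one-sided gradient at the front; it is our own simplification, not the paper's full pyrolysis model.

```python
# Toy one-phase Stefan-type problem on a slab 0 <= x <= 1: temperature
# held at T=1 at x=0, T=0 at and beyond the moving front x = s(t), and
# front speed set by the gradient at the front (ds/dt = -dT/dx).

def stefan_step(T, s, dx, dt):
    """One decoupled step: diffuse T on the fixed grid with the front
    frozen, then advance the front position s from the new gradient."""
    n = len(T)
    front = min(int(s / dx), n - 1)
    new = T[:]
    for i in range(1, front):                       # explicit heat equation
        new[i] = T[i] + dt / dx**2 * (T[i+1] - 2*T[i] + T[i-1])
    for i in range(front, n):
        new[i] = 0.0                                # ahead of the front: virgin material
    grad = (0.0 - new[front - 1]) / dx              # one-sided gradient at the front
    return new, s - dt * grad                       # Stefan condition

dx, dt = 0.05, 0.001        # dt/dx**2 = 0.4 < 0.5, explicit stability limit
T = [1.0] + [0.0] * 19      # hot boundary at x = 0
s = 2 * dx                  # initial front position
for _ in range(200):
    T, s = stefan_step(T, s, dx, dt)
print(f"front position after 200 steps: {s:.3f}")
```

The point of the decoupling is visible in the structure of `stefan_step`: the nonlinear coupling between the field and the front is broken into two linear sub-steps per time step, which is what lets standard finite-difference machinery handle the moving boundary.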


The paper considers the job shop scheduling problem to minimize the makespan. It is assumed that each job consists of at most two operations, one of which is to be processed on one of m⩾2 machines, while the other operation must be performed on a single bottleneck machine, the same for all jobs. For this strongly NP-hard problem we present two heuristics with improved worst-case performance. One of them guarantees a worst-case performance ratio of 3/2. The other algorithm creates a schedule with the makespan that exceeds the largest machine workload by at most the length of the largest operation.


In this paper, we study a problem of scheduling and batching on two machines in a flow-shop and open-shop environment. Each machine processes operations in batches, and the processing time of a batch is the sum of the processing times of the operations in that batch. A setup time, which depends only on the machine, is required before a batch is processed on a machine, and all jobs in a batch remain at the machine until the entire batch is processed. The aim is to make batching and sequencing decisions, which specify a partition of the jobs into batches on each machine, and a processing order of the batches on each machine, respectively, so that the makespan is minimized. The flow-shop problem is shown to be strongly NP-hard. We demonstrate that there is an optimal solution with the same batches on the two machines; we refer to these as consistent batches. A heuristic is developed that selects the best schedule among several with one, two, or three consistent batches, and is shown to have a worst-case performance ratio of 4/3. For the open-shop, we show that the problem is NP-hard in the ordinary sense. By proving the existence of an optimal solution with one, two or three consistent batches, a close relationship is established with the problem of scheduling two or three identical parallel machines to minimize the makespan. This allows a pseudo-polynomial algorithm to be derived, and various heuristic methods to be suggested.
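For a fixed partition into consistent batches, the two-machine flow-shop makespan under sum-batching follows a simple recurrence: machine 1 processes batches back to back, and machine 2 starts a batch only after both its own previous batch and that batch's machine 1 completion. A minimal sketch with hypothetical job data (the heuristic in the paper searches over one, two, or three consistent batches):

```python
def flowshop_batch_makespan(batches, setup1, setup2, p1, p2):
    """Makespan for a sequence of consistent batches on a two-machine
    flow shop with sum-batching: each batch needs a machine setup plus
    the sum of its operations' times, and machine 2 may start a batch
    only after machine 1 has finished it."""
    t1 = t2 = 0
    for batch in batches:
        t1 += setup1 + sum(p1[j] for j in batch)               # machine 1 finish
        t2 = max(t2, t1) + setup2 + sum(p2[j] for j in batch)  # machine 2 finish
    return t2

# Hypothetical data: three jobs, setup time 1 on each machine.
p1, p2 = [2, 3, 4], [1, 2, 2]
print(flowshop_batch_makespan([[0, 1, 2]], 1, 1, p1, p2))      # one batch: 16
print(flowshop_batch_makespan([[0], [1], [2]], 1, 1, p1, p2))  # three batches: 15
```

Even on this tiny instance the batching decision matters: splitting into more batches pays extra setups but lets machine 2 start earlier, which here shortens the makespan from 16 to 15.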


We consider the problem of scheduling independent jobs on two machines in an open shop, a job shop and a flow shop environment. Both machines are batching machines, which means that several operations can be combined into a batch and processed simultaneously on a machine. The batch processing time is the maximum processing time of operations in the batch, and all operations in a batch complete at the same time. Such a situation may occur, for instance, during the final testing stage of circuit board manufacturing, where burn-in operations are performed in ovens. We consider cases in which there is no restriction on the size of a batch on a machine, and in which a machine can process only a bounded number of operations in one batch. For most of the possible combinations of restrictions, we establish the complexity status of the problem.
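Under the max-batching (burn-in) rule above, the time a single batching machine spends on a sequence of batches is the sum of each batch's longest operation. A minimal illustration with hypothetical operation times:

```python
def batch_makespan(batches):
    """Total time to process batches sequentially on one batching machine
    under max-batching: each batch takes as long as its longest operation,
    and all operations in a batch complete together."""
    return sum(max(batch) for batch in batches)

ops = [3, 7, 2, 5, 4, 6]
# Unbounded batch size: one big batch costs only the single maximum.
print(batch_makespan([ops]))                   # prints 7
# Bounded batches (size <= 3): each batch contributes its own maximum.
print(batch_makespan([[3, 7, 2], [5, 4, 6]]))  # prints 13
```

This is why the unbounded case is easy on a single machine (put everything in one batch), while the bounded case turns into a grouping problem: the partition into batches, not their order, determines the load.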


In order to find a link between results obtained from a laboratory erosion tester and tests carried out on a pneumatic conveyor, a comparison has been made between the weight loss from bends on an industrial-scale pneumatic conveyor and the erosion rates obtained in a small centrifugal erosion tester for the same materials. Identical test conditions were applied in both experiments so that comparable results could be obtained. The erosion rate of mild steel, commonly used as the wall material of conveyor pipes and pipe bends, was determined individually on both test rigs. A relationship between the weight loss from the bends and the erosion rate determined with the tester has been developed. A discussion of the results and their applicability to the prediction of wear in pneumatic conveyors concludes the paper.


In this paper, we provide a unified approach to solving preemptive scheduling problems with uniform parallel machines and controllable processing times. We demonstrate that a single criterion problem of minimizing total compression cost subject to the constraint that all due dates should be met can be formulated in terms of maximizing a linear function over a generalized polymatroid. This justifies applicability of the greedy approach and allows us to develop fast algorithms for solving the problem with arbitrary release and due dates as well as its special case with zero release dates and a common due date. For the bicriteria counterpart of the latter problem we develop an efficient algorithm that constructs the trade-off curve for minimizing the compression cost and the makespan.