983 results for Penalized likelihood
Abstract:
The CD209 gene family that encodes C-type lectins in primates includes CD209 (DC-SIGN), CD209L (L-SIGN) and CD209L2. Understanding the evolution of these genes can help reconstruct the duplication events that generated this family, clarify the process leading to the repeated neck region, and identify protein domains under selective pressure. We compiled sequences from 14 primates representing 40 million years of evolution and from three non-primate mammal species. Phylogenetic analyses used Bayesian inference, and nucleotide substitution patterns were assessed by codon-based maximum likelihood. The analyses suggest that the CD209 genes emerged from a first duplication event in the common ancestor of anthropoids, yielding CD209L2 and an ancestral CD209 gene, which, in turn, duplicated in the common Old World primate ancestor, giving rise to CD209L and CD209. K(A)/K(S) values averaged over the entire tree were 0.43 (CD209), 0.52 (CD209L) and 0.35 (CD209L2), consistent with overall signatures of purifying selection. We also assessed the Toll-like receptor (TLR) gene family, which shares with the CD209 genes a common profile of evolutionary constraint. The general feature of purifying selection of CD209 genes, despite an apparent redundancy (gene absence and gene loss), may reflect the need to faithfully recognize a multiplicity of pathogen motifs, commensals and a number of self-antigens.
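The K(A)/K(S) statistic above can be illustrated with a simplified counting sketch. This is not the codon-based maximum likelihood model the study used (e.g. PAML/CodeML); it only shows how the ratio of per-site substitution rates is formed, with invented counts.

```python
# Illustrative K_A/K_S (dN/dS) computation from substitution counts.
# Simplified counting approach with hypothetical numbers, not the
# codon-based ML model used in the study.

def ka_ks(nonsyn_subs, syn_subs, nonsyn_sites, syn_sites):
    """Ratio of nonsynonymous to synonymous substitution rates.

    K_A = nonsynonymous substitutions per nonsynonymous site
    K_S = synonymous substitutions per synonymous site
    Values < 1 suggest purifying selection, > 1 positive selection.
    """
    ka = nonsyn_subs / nonsyn_sites
    ks = syn_subs / syn_sites
    return ka / ks

# Hypothetical counts for a pair of aligned coding sequences:
print(round(ka_ks(nonsyn_subs=12, syn_subs=14,
                  nonsyn_sites=700, syn_sites=350), 2))
```

A ratio well below 1, as reported for all three CD209 genes, indicates that nonsynonymous changes are being removed faster than synonymous ones, i.e. purifying selection.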
Abstract:
Intensity-modulated radiotherapy (IMRT) is a conformal radiotherapy technique that produces concave and irregular target volume dose distributions. IMRT has the potential to reduce the volume of healthy tissue irradiated to a high dose, but often at the price of an increased volume of normal tissue irradiated to a low dose. The clinical benefits of IMRT are expected to be most pronounced at body sites where sensitive normal tissues surround, or are located next to, a target with a complex 3D shape. The irradiation doses needed for tumor control are often markedly higher than the tolerance of radiation-sensitive structures such as the spinal cord, the optic nerves, the eyes, or the salivary glands in the treatment of head and neck cancer. Parotid gland salivary flow is markedly reduced following a cumulative dose of 30–50 Gy given with conventional fractionation, and xerostomia may be prevented in most patients using a conformal parotid-sparing radiotherapy technique. However, in cohort studies where IMRT was compared with conventional and conformal radiotherapy techniques in the treatment of laryngeal or oropharyngeal carcinoma, the dosimetric advantage of IMRT translated into a reduction of late salivary toxicity with no apparent adverse impact on tumor control. IMRT might reduce the radiation dose to the major salivary glands and the risk of permanent xerostomia without compromising the likelihood of cure. Alternatively, IMRT might allow target dose escalation at a given level of normal tissue damage. We describe here the clinical results on postirradiation salivary gland function in head and neck cancer patients treated with IMRT, and the technical aspects of the IMRT techniques applied. The results suggest that major salivary gland function can be maintained with IMRT without a need to compromise the clinical target volume dose or the locoregional control.
Abstract:
BACKGROUND AND STUDY AIMS: The current gold standard in Barrett's esophagus monitoring consists of four-quadrant biopsies every 1-2 cm in accordance with the Seattle protocol. Adding brush cytology processed by digital image cytometry (DICM) may further increase the detection of patients with Barrett's esophagus who are at risk of neoplasia. The aim of the present study was to assess the additional diagnostic value and accuracy of DICM when added to the standard histological analysis in a cross-sectional multicenter study of patients with Barrett's esophagus in Switzerland. METHODS: One hundred sixty-four patients with Barrett's esophagus underwent 239 endoscopies with biopsy and brush cytology. DICM was carried out on 239 cytology specimens. Measures of the test accuracy of DICM (relative risk, sensitivity, specificity, likelihood ratios) were obtained by dichotomizing the histopathology results (high-grade dysplasia or adenocarcinoma vs. all others) and DICM results (aneuploidy/intermediate pattern vs. diploidy). RESULTS: DICM revealed diploidy in 83% of 239 endoscopies, an intermediate pattern in 8.8%, and aneuploidy in 8.4%. An intermediate DICM result carried a relative risk (RR) of 12, and aneuploidy an RR of 27, for high-grade dysplasia/adenocarcinoma. Adding DICM to the standard biopsy protocol, a pathological cytometry result (aneuploid or intermediate) was found in 25 of 239 endoscopies (11%; 18 patients) with low-risk histology (no high-grade dysplasia or adenocarcinoma). During follow-up of 14 of these 18 patients, histological deterioration was seen in 3 (21%). CONCLUSION: DICM from brush cytology may add important information to a standard biopsy protocol by identifying a subgroup of Barrett's esophagus patients with high-risk cellular abnormalities.
Abstract:
GeneID is a program for predicting genes in anonymous genomic sequences, designed with a hierarchical structure. In the first step, splice sites and start and stop codons are predicted and scored along the sequence using position weight matrices (PWMs). In the second step, exons are built from the sites. Exons are scored as the sum of the scores of the defining sites, plus the log-likelihood ratio of a Markov model for coding DNA. In the last step, the gene structure is assembled from the set of predicted exons, maximizing the sum of the scores of the assembled exons. In this paper we describe the derivation of the PWMs for sites and the Markov model of coding DNA in Drosophila melanogaster. We also compare other models of coding DNA with the Markov model. Finally, we present and discuss the results obtained when GeneID is used to predict genes in the Adh region. These results show that the accuracy of GeneID predictions is currently comparable to that of other existing tools, but that GeneID is likely to be more efficient in terms of speed and memory usage.
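The exon-scoring step described above can be sketched in a few lines. This is a minimal illustration only: the order-1 transition probabilities are invented, whereas GeneID trains higher-order Markov models on real coding and non-coding Drosophila sequence.

```python
import math

# Minimal sketch of GeneID's exon score: site scores plus the
# log-likelihood ratio of a coding-DNA Markov model vs. background.
# Transition probabilities below are invented for illustration.
CODING = {('A', 'T'): 0.4, ('T', 'G'): 0.5, ('G', 'C'): 0.6, ('C', 'A'): 0.3}
BACKGROUND_P = 0.25  # uniform background transition probability

def coding_llr(seq):
    """Log-likelihood ratio of coding vs. background for a DNA string."""
    score = 0.0
    for a, b in zip(seq, seq[1:]):
        p_coding = CODING.get((a, b), BACKGROUND_P)
        score += math.log(p_coding / BACKGROUND_P)
    return score

def exon_score(site_scores, seq):
    """Exon score = sum of defining-site scores + coding-model LLR."""
    return sum(site_scores) + coding_llr(seq)

# Hypothetical splice-site scores plus a short candidate exon sequence:
print(round(exon_score([1.2, 0.8], "ATGCA"), 3))
```

In the full program, a dynamic-programming assembly step then selects the chain of non-overlapping exons maximizing the total score.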
Abstract:
Prevention programs in adolescence are particularly effective if they target homogeneous risk groups of adolescents who share a combination of particular needs and problems. The present work aims to identify and classify risky single-occasion drinking (RSOD) adolescents according to their motivation to engage in drinking. An easy-to-use coding procedure was developed. It was validated by means of cluster analyses and structural equation modeling based on two randomly selected subsamples of a nationally representative sample of 2,449 12- to 18-year-old RSOD students in Switzerland. Results revealed that the coding procedure classified RSOD adolescents as either enhancement drinkers or coping drinkers. The high concordance (Sample A: kappa = .88, Sample B: kappa = .90) with the results of the cluster analyses demonstrated the convergent validity of the coding classification. The finding that enhancement drinkers in both subsamples went out more frequently in the evenings, had more satisfactory social relationships, had a higher proportion of drinking peers, and were less likely to drink at home than coping drinkers demonstrates the concurrent validity of the classification. In conclusion, the coding procedure appears to be a valid, reliable, and easy-to-use tool that can help better adapt prevention activities to adolescent risky drinking motives.
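The concordance statistic reported above (Cohen's kappa) corrects raw agreement for agreement expected by chance. A self-contained sketch, with invented labels rather than the study's classifications:

```python
from collections import Counter

# Cohen's kappa: chance-corrected agreement between two classifications,
# as used to compare the coding procedure with the cluster analysis.
# The label lists below are invented for illustration.

def cohens_kappa(labels_a, labels_b):
    n = len(labels_a)
    observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    freq_a = Counter(labels_a)
    freq_b = Counter(labels_b)
    # Chance agreement from the two raters' marginal class frequencies:
    expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n ** 2
    return (observed - expected) / (1 - expected)

a = ["enh", "enh", "cope", "enh", "cope", "cope", "enh", "enh"]
b = ["enh", "enh", "cope", "enh", "cope", "enh", "enh", "enh"]
print(round(cohens_kappa(a, b), 2))
```

Values near .88–.90, as in the study, indicate near-perfect agreement between the two classification methods.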
Abstract:
CodeML (part of the PAML package) implements a maximum likelihood-based approach to detect positive selection on a specific branch of a given phylogenetic tree. While CodeML is widely used, it is very compute-intensive. We present SlimCodeML, an optimized version of CodeML for the branch-site model. Our performance analysis shows that SlimCodeML substantially outperforms CodeML (up to 9.38 times faster), especially for large-scale genomic analyses.
Abstract:
Confronting a recently mated female with a strange male can induce a pregnancy block ('Bruce effect'). The physiology of this effect is well studied, but its functional significance is still not fully understood. The 'anticipated infanticide hypothesis' suggests that the pregnancy block serves to avoid the cost of embryogenesis and giving birth to offspring that are likely to be killed by a new territory holder. Some 'compatible-genes sexual selection hypotheses' suggest that the likelihood of a pregnancy block is also dependent on the female's perception of the stud's and the stimulus male's genetic quality. We used two inbred strains of mice (C57BL/6 and BALB/c) to test all possible combinations of female strain, stud strain, and stimulus strain under experimental conditions (N(total) = 241 mated females). As predicted from previous studies, we found increased rates of pregnancy blocks if stud and stimulus strains differed, and we found evidence for hybrid vigour in offspring of between-strain mating. Despite the observed heterosis, pregnancies of within-strain matings were not more likely to be blocked than pregnancies of between-strain matings. A power analysis revealed that if we missed an existing effect (type-II error), the effect must be very small. If a female gave birth, the number and weight of newborns were not significantly influenced by the stimulus males. In conclusion, we found no support for the 'compatible-genes sexual selection hypotheses'.
Abstract:
Background: One of the main goals of cancer genetics is to identify the causative elements at the molecular level leading to cancer. Results: We have conducted an analysis of a set of genes known to be involved in cancer in order to unveil the unique features that can assist in the identification of new candidate cancer genes. Conclusion: We have detected key patterns in this group of genes in terms of the molecular function or the biological process in which they are involved, as well as sequence properties. Based on these features, we have developed an accurate Bayesian classification model with which human genes have been scored for their likelihood of involvement in cancer.
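Scoring genes by their likelihood of cancer involvement can be sketched with a naive Bayes classifier over binary features, in the spirit of the Bayesian model described above. The prior and feature likelihoods below are invented placeholders, not values from the study.

```python
import math

# Naive Bayes sketch: posterior probability that a gene is a cancer
# gene, given binary features. Prior and likelihoods are invented.
PRIOR_CANCER = 0.05
# (P(feature present | cancer gene), P(feature present | other gene)):
LIKELIHOODS = {
    "long_protein": (0.7, 0.3),
    "many_domains": (0.6, 0.2),
}

def cancer_posterior(features):
    log_odds = math.log(PRIOR_CANCER / (1 - PRIOR_CANCER))
    for name, present in features.items():
        p_cancer, p_other = LIKELIHOODS[name]
        if not present:
            p_cancer, p_other = 1 - p_cancer, 1 - p_other
        log_odds += math.log(p_cancer / p_other)
    odds = math.exp(log_odds)
    return odds / (1 + odds)

print(round(cancer_posterior({"long_protein": True, "many_domains": True}), 3))
```

Each observed feature shifts the log-odds by its likelihood ratio, so a gene matching several cancer-gene patterns accumulates a high score.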
Abstract:
Background: Choosing an adequate measurement instrument depends on the proposed use of the instrument, the concept to be measured, the measurement properties (e.g. internal consistency, reproducibility, content and construct validity, responsiveness, and interpretability), the requirements, the burden for subjects, and costs of the available instruments. As far as measurement properties are concerned, there are no sufficiently specific standards for the evaluation of measurement properties of instruments to measure health status, and also no explicit criteria for what constitutes good measurement properties. In this paper we describe the protocol for the COSMIN study, the objective of which is to develop a checklist that contains COnsensus-based Standards for the selection of health Measurement INstruments, including explicit criteria for satisfying these standards. We will focus on evaluative health related patient-reported outcomes (HR-PROs), i.e. patient-reported health measurement instruments used in a longitudinal design as an outcome measure, excluding health care related PROs, such as satisfaction with care or adherence. The COSMIN standards will be made available in the form of an easily applicable checklist.Method: An international Delphi study will be performed to reach consensus on which and how measurement properties should be assessed, and on criteria for good measurement properties. Two sources of input will be used for the Delphi study: (1) a systematic review of properties, standards and criteria of measurement properties found in systematic reviews of measurement instruments, and (2) an additional literature search of methodological articles presenting a comprehensive checklist of standards and criteria. The Delphi study will consist of four (written) Delphi rounds, with approximately 30 expert panel members with different backgrounds in clinical medicine, biostatistics, psychology, and epidemiology. 
The final checklist will subsequently be field-tested by assessing the inter-rater reproducibility of the checklist. Discussion: Since the study will mainly be anonymous, problems that are commonly encountered in face-to-face group meetings, such as the dominance of certain persons in the communication process, will be avoided. By performing a Delphi study and involving many experts, the likelihood that the checklist will have sufficient credibility to be accepted and implemented will increase.
Abstract:
We focus on full-rate, fast-decodable space–time block codes (STBCs) for 2×2 and 4×2 multiple-input multiple-output (MIMO) transmission. We first derive conditions and design criteria for reduced-complexity maximum-likelihood (ML) decodable 2×2 STBCs, and we apply them to two families of codes that were recently discovered. Next, we derive a novel reduced-complexity 4×2 STBC, and show that it outperforms all previously known codes with certain constellations.
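The ML decoding whose complexity these codes reduce can be illustrated with a toy exhaustive detector. The sketch below uses plain 2×2 spatial multiplexing with QPSK and a fixed invented channel, not the actual STBCs of the paper, which spread symbols over space and time and exploit code structure to shrink the candidate search.

```python
import itertools

# Exhaustive ML detection for a toy 2x2 spatial-multiplexing system
# with QPSK. Fast-decodable STBCs reduce exactly this kind of search.
# Channel and symbols are illustrative, not from the paper.
QPSK = [1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]

def ml_detect(y, H):
    """Return the symbol pair minimizing ||y - H s||^2 over all candidates."""
    best, best_metric = None, float("inf")
    for s in itertools.product(QPSK, repeat=2):
        r0 = y[0] - (H[0][0] * s[0] + H[0][1] * s[1])
        r1 = y[1] - (H[1][0] * s[0] + H[1][1] * s[1])
        metric = abs(r0) ** 2 + abs(r1) ** 2
        if metric < best_metric:
            best, best_metric = s, metric
    return best

H = [[0.9, 0.2], [0.1, 1.1]]   # fixed illustrative channel matrix
tx = (1 + 1j, -1 - 1j)         # transmitted symbol pair
y = [H[0][0] * tx[0] + H[0][1] * tx[1],
     H[1][0] * tx[0] + H[1][1] * tx[1]]
# In the noiseless case the ML search recovers the transmitted pair:
print(ml_detect(y, H) == tx)
```

The brute-force search visits all |constellation|^2 candidates; reduced-complexity ML decoders exploit orthogonality in the code to decode subsets of symbols independently.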
Abstract:
We design powerful low-density parity-check (LDPC) codes with iterative decoding for the block-fading channel. We first study the case of maximum-likelihood decoding, and show that the design criterion is rather straightforward. Since optimal constructions for maximum-likelihood decoding do not perform well under iterative decoding, we introduce a new family of full-diversity LDPC codes that exhibit near-outage-limit performance under iterative decoding for all block-lengths. This family competes favorably with multiplexed parallel turbo codes for nonergodic channels.
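The maximum-likelihood decoding baseline mentioned above can be made concrete on a tiny code. The sketch below does exhaustive ML decoding of the (7,4) Hamming code under a hard-decision channel model; it is only an illustration of the principle that iterative decoding approximates, not the paper's LDPC construction.

```python
import itertools

# Exhaustive ML decoding of a small binary linear code (hard-decision
# BSC model): pick the codeword at minimum Hamming distance from the
# received word. Parity-check matrix of the (7,4) Hamming code:
H = [
    [1, 0, 1, 0, 1, 0, 1],
    [0, 1, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
]

def is_codeword(word):
    return all(sum(h * b for h, b in zip(row, word)) % 2 == 0 for row in H)

CODEWORDS = [w for w in itertools.product([0, 1], repeat=7) if is_codeword(w)]

def ml_decode(received):
    """Codeword at minimum Hamming distance from the received word."""
    return min(CODEWORDS, key=lambda c: sum(a != b for a, b in zip(c, received)))

# The all-zero codeword with one bit flipped is corrected back:
print(ml_decode((1, 0, 0, 0, 0, 0, 0)))
```

Enumerating all 2^k codewords is infeasible for practical block lengths, which is why iterative (belief-propagation) decoding is used for LDPC codes and why code families must be designed for it specifically.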
Abstract:
This project explores the user costs and benefits of winter road closures. Severe winter weather makes travel unsafe and dramatically increases crash rates. When conditions become unsafe due to winter weather, road closures should allow users to avoid crash costs and eliminate the costs associated with rescuing stranded motorists. Therefore, the benefits of road closures are the avoided safety costs. The costs of road closures are the delays imposed on motorists and motor carriers who would have made the trip had the road not been closed. This project investigated the costs and benefits of road closures and found that evaluating them is not as simple as it appears. To better understand the costs and benefits of road closures, the project reviews the literature, conducts interviews with shippers and motor carriers, and conducts case studies of road closures to determine what actually occurred on roadways during closures. The project also estimates a statistical model that relates weather severity to crash rates. The statistical model is primarily intended to illustrate that measurable and predictable weather conditions can be quantitatively related to the safety performance of a roadway. In the future, weather conditions such as snowfall intensity, visibility, etc., can be used to produce objective measures of the safety performance of a roadway rather than relying on subjective evaluations by field staff. The review of the literature and the interviews clearly illustrate that not all delays (increased travel time) are valued the same. Expected delays (routine delays) are valued at the generalized costs (value of the driver's time, fuel, insurance, wear and tear on the vehicle, etc.), but unexpected delays are valued much higher because they result in the interruption of synchronous activities at the trip's destination.
To reduce the costs of delays resulting from road closures, public agencies should communicate as early as possible the likelihood of a road closure.
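The trade-off described above can be sketched as a back-of-the-envelope calculation: benefits are avoided crash costs, costs are delay valued per vehicle-hour, and unexpected delay carries a higher effective value than expected delay. All numbers below are hypothetical placeholders, not the project's estimates.

```python
# Back-of-the-envelope closure cost-benefit sketch. All inputs are
# invented placeholders, not the project's estimates.

def closure_net_benefit(avoided_crash_cost, vehicles, delay_hours,
                        value_per_hour, unexpected_multiplier):
    # Unexpected delay disrupts synchronized activities at the
    # destination, so it is valued above the generalized cost of time.
    delay_cost = vehicles * delay_hours * value_per_hour * unexpected_multiplier
    return avoided_crash_cost - delay_cost

# Unannounced closure: delay is unexpected, valued (say) 3x base rate.
surprise = closure_net_benefit(250_000, vehicles=800, delay_hours=4,
                               value_per_hour=25, unexpected_multiplier=3.0)
# Announced closure: travelers adjust plans, delay valued at base rate.
announced = closure_net_benefit(250_000, vehicles=800, delay_hours=4,
                                value_per_hour=25, unexpected_multiplier=1.0)
print(surprise, announced)
```

The gap between the two scenarios is the value of early communication: the same closure imposes far lower user costs when travelers can plan around it.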
Abstract:
This paper describes a maximum likelihood method using historical weather data to estimate a parametric model of daily precipitation and maximum and minimum air temperatures. Parameter estimates are reported for Brookings, SD, and Boone, IA, to illustrate the procedure. The use of this parametric model to generate stochastic time series of daily weather is then summarized. A soil temperature model is described that determines daily average, maximum, and minimum soil temperatures based on air temperatures and precipitation, following a lagged process due to soil heat storage and other factors.
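Maximum likelihood estimation of a parametric weather model can be illustrated with the simplest case: fitting a normal distribution to daily temperatures, for which the ML estimates have closed form. The temperatures below are invented, and the actual model in the paper is richer (precipitation occurrence, seasonal structure, and cross-correlated variables).

```python
import math

# ML fit of a normal distribution to daily maximum temperatures.
# Data are invented; the paper's parametric model is more elaborate.

def normal_mle(xs):
    n = len(xs)
    mu = sum(xs) / n
    var = sum((x - mu) ** 2 for x in xs) / n   # ML uses 1/n, not 1/(n-1)
    return mu, var

def log_likelihood(xs, mu, var):
    n = len(xs)
    return (-0.5 * n * math.log(2 * math.pi * var)
            - sum((x - mu) ** 2 for x in xs) / (2 * var))

temps = [21.0, 24.5, 19.8, 26.1, 23.2, 22.4]
mu, var = normal_mle(temps)
# The MLE maximizes the log-likelihood: nudging mu can only lower it.
assert log_likelihood(temps, mu, var) >= log_likelihood(temps, mu + 0.5, var)
print(round(mu, 2), round(var, 2))
```

Once parameters are estimated from historical records, stochastic time series of daily weather can be generated by sampling from the fitted distributions, as the paper summarizes.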
Abstract:
The Silver Code (SilC) was originally discovered in [1–4] for 2×2 multiple-input multiple-output (MIMO) transmission. It has a non-vanishing minimum determinant of 1/7, slightly lower than that of the Golden code, but is fast-decodable, i.e., it allows reduced-complexity maximum likelihood decoding [5–7]. In this paper, we present a multidimensional trellis-coded modulation scheme for MIMO systems [11] based on set partitioning of the Silver Code, named Silver Space-Time Trellis Coded Modulation (SST-TCM). This lattice set partitioning is designed specifically to increase the minimum determinant. The branches of the outer trellis code are labeled with these partitions. The Viterbi algorithm is applied for trellis decoding, while the branch metrics are computed using a sphere-decoding algorithm. It is shown that the proposed SST-TCM performs very closely to the Golden Space-Time Trellis Coded Modulation (GST-TCM) scheme, yet with much reduced decoding complexity thanks to its fast-decodability.
Abstract:
BACKGROUND: Studies of the association between body mass index (BMI) and health-related quality of life (HRQOL) are often limited because they 1) did not include a broad range of health-risk behaviors as covariates; 2) relied on clinical samples, which might lead to biased results; and 3) did not incorporate underweight individuals. Hence, this study aims to examine associations between BMI (from underweight through obesity) and HRQOL in a population-based sample, while considering multiple health-risk behaviors (low physical activity, risky alcohol consumption, daily cigarette smoking, frequent cannabis use) as well as socio-demographic characteristics. METHODS: A total of 5,387 young Swiss men (mean age = 19.99; standard deviation = 1.24) of a cross-sectional population-based study were included. BMI was calculated (kg/m²) from self-reported height and weight and divided into 'underweight' (<18.5), 'normal weight' (18.5-24.9), 'overweight' (25.0-29.9) and 'obese' (≥30.0). Mental and physical HRQOL was assessed via the SF-12v2. Self-reported information on physical activity, substance use (alcohol, cigarettes, and cannabis) and socio-demographic characteristics was also collected. Logistic regression analyses were conducted to study the associations between BMI categories and below-average mental or physical HRQOL. Substance use variables and socio-demographic variables were used as covariates. RESULTS: Altogether, 76.3% were normal weight, whereas 3.3% were underweight, 16.5% overweight and 3.9% obese. Being overweight or obese was associated with reduced physical HRQOL (adjusted OR [95% CI] = 1.58 [1.18-2.13] and 2.45 [1.57-3.83], respectively), whereas being underweight predicted reduced mental HRQOL (adjusted OR [95% CI] = 1.49 [1.08-2.05]). Surprisingly, obesity decreased the likelihood of experiencing below-average mental HRQOL (adjusted OR [95% CI] = 0.66 [0.46-0.94]).
Besides BMI, expressed as a categorical variable, all health-risk behaviors and socio-demographic variables were associated with reduced physical and/or mental HRQOL. CONCLUSIONS: Deviations from normal weight are, even after controlling for important health-risk behaviors and socio-demographic characteristics, associated with compromised physical or mental HRQOL among young men. Hence, preventive programs should aim to preserve or re-establish normal weight. The self-appraised positive mental well-being of obese men noted here, which possibly reflects a response shift, might complicate such efforts.
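The odds ratios reported above come from logistic regression with covariate adjustment; their unadjusted counterpart is just a ratio of odds from a 2x2 table. The counts below are invented for illustration, not the study's data.

```python
# Unadjusted odds ratio from a 2x2 table of exposure (e.g. obesity)
# vs. outcome (e.g. below-average physical HRQOL). Counts are invented;
# the study's ORs were additionally adjusted via logistic regression.

def odds_ratio(exposed_cases, exposed_noncases,
               unexposed_cases, unexposed_noncases):
    odds_exposed = exposed_cases / exposed_noncases
    odds_unexposed = unexposed_cases / unexposed_noncases
    return odds_exposed / odds_unexposed

print(round(odds_ratio(90, 120, 1200, 3900), 2))
```

An OR above 1 indicates the exposed group has higher odds of the outcome; adjustment for covariates such as substance use can move such estimates substantially, which is why the study reports adjusted ORs.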