960 results for Discrete Data Models


Relevance: 30.00%

Abstract:

Several different models of Trypanosoma cruzi evolution have been proposed. These models suggest that scarce events of genetic exchange occurred during the evolutionary history of this parasite. In addition, the debate has focused on the existence of one or two hybridisation events during the evolution of T. cruzi lineages. Here, we reviewed the literature and analysed available sequence data to clarify the phylogenetic relationships among these different lineages. We observed that TcI, TcIII and TcIV form a monophyletic group and that TcIII and TcIV are not, as previously suggested, TcI-TcII hybrids. In particular, TcI and TcIII are sister groups that diverged around the same time that a widely distributed TcIV split into two clades (TcIVS and TcIVN). In addition, we collected evidence that TcIII received TcIVS kDNA by introgression on several occasions. Different demographic hypotheses (surfing and asymmetrical introgression) may explain the origin and expansion of the TcIII group. Under these hypotheses, genetic exchange should have been relatively frequent between TcIII and TcIVS in the geographic area in which their distributions overlapped. In addition, our results support the hypothesis that two independent hybridisation events gave rise to TcV and TcVI. Consequently, TcIVS kDNA was first transferred to TcIII and later to TcV and TcVI in TcII/TcIII hybridisation events.

Relevance: 30.00%

Abstract:

Summary:
1. Measuring health literacy in Switzerland: a review of six surveys
   1.1 Comparison of questionnaires
   1.2 Measures of health literacy in Switzerland
   1.3 Discussion of Swiss data on HL
   1.4 Description of the six surveys:
       1.4.1 Current health trends and health literacy in the Swiss population (gfs-UNIVOX)
       1.4.2 Nutrition, physical exercise and body weight: opinions and perceptions of the Swiss population (USI)
       1.4.3 Health Literacy in Switzerland (ISPMZ)
       1.4.4 Swiss Health Survey (SHS)
       1.4.5 Survey of Health, Ageing and Retirement in Europe (SHARE)
       1.4.6 Adult Literacy and Life Skills Survey (ALL)
2. Economic costs of low health literacy in Switzerland: a rough calculation
Appendix: Screenshots of the cost model

Relevance: 30.00%

Abstract:

Gene expression data from microarrays are being applied to predict preclinical and clinical endpoints, but the reliability of these predictions has not been established. In the MAQC-II project, 36 independent teams analyzed six microarray data sets to generate predictive models for classifying a sample with respect to one of 13 endpoints indicative of lung or liver toxicity in rodents, or of breast cancer, multiple myeloma or neuroblastoma in humans. In total, >30,000 models were built using many combinations of analytical methods. The teams generated predictive models without knowing the biological meaning of some of the endpoints and, to mimic clinical reality, tested the models on data that had not been used for training. We found that model performance depended largely on the endpoint and team proficiency and that different approaches generated models of similar performance. The conclusions and recommendations from MAQC-II should be useful for regulatory agencies, study committees and independent investigators who evaluate methods for global gene expression analysis.

Relevance: 30.00%

Abstract:

MicroRNAs (miRNAs) constitute an important class of gene regulators. While models have been proposed to explain their appearance and expansion, the validation of these models has been difficult due to the lack of comparative studies. Here, we analyze miRNA evolutionary patterns in two mammals, human and mouse, in relation to the age of miRNA families. In this comparative framework, we confirm some predictions of previously advanced models of miRNA evolution, e.g. that miRNAs arise more frequently de novo than by duplication, or that the number of protein-coding genes targeted by miRNAs decreases with evolutionary time. We also corroborate that miRNAs display an increase in expression level with evolutionary time; however, we show that this relation is largely tissue-dependent and is especially weak in embryonic or nervous tissues. We identify a bias of tag-sequencing techniques in the assessment of breadth of expression, leading us, contrary to predictions, to find more tissue-specific expression of older miRNAs. Together, our results refine the models used so far to depict the evolution of miRNA genes. They underline the role of tissue-specific selective forces in the evolution of miRNAs, as well as the potential co-evolution patterns between miRNAs and the protein-coding genes they target.

Relevance: 30.00%

Abstract:

The objective of the EU funded integrated project "ACuteTox" is to develop a strategy in which general cytotoxicity, together with organ-specific endpoints and biokinetic features, is taken into consideration in the in vitro prediction of oral acute systemic toxicity. With regard to the nervous system, the effects of 23 reference chemicals were tested with approximately 50 endpoints, using a neuronal cell line, primary neuronal cell cultures, brain slices and aggregated brain cell cultures. Comparison of the in vitro neurotoxicity data with general cytotoxicity data generated in a non-neuronal cell line and with in vivo data such as acute human lethal blood concentration revealed that GABA(A) receptor function, acetylcholinesterase activity, cell membrane potential, glucose uptake, total RNA expression and altered gene expression of NF-H, GFAP, MBP, HSP32 and caspase-3 were the best endpoints to use for further testing with 36 additional chemicals. The results of the second analysis showed that no single neuronal endpoint could give a perfect improvement in the in vitro-in vivo correlation, indicating that several specific endpoints need to be analysed and combined with biokinetic data to obtain the best correlation with in vivo acute toxicity.

Relevance: 30.00%

Abstract:

There is an increasing trend in the incidence of cancer worldwide, and it has been accepted that environmental factors account for an important proportion of the global burden. The present paper reports preliminary findings on the influence of the historical exposure to a group of persistent organic pollutants on total cancer risk, at year 9 in the follow-up of a cohort from Southern Spain. A cohort of 368 participants (median age 51 years) was recruited in 2003. Their historical exposure was estimated by analyzing residues of persistent organic pollutants in adipose tissue. Estimation of cancer incidence was based on data from a population-based cancer registry. Statistical analyses were performed using multivariable Cox-regression models. In males, PCB 153 concentrations were positively associated with total cancer risk, with an adjusted hazard ratio (95% confidence interval) of 1.20 (1.01-1.41) for an increment of 100 ng/g lipid. Our preliminary findings suggest a potential relationship between the historical exposure to persistent organic pollutants and the risk of cancer in men. However, these results should be interpreted with caution and require verification during the future follow-up of this cohort.
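The adjusted hazard ratio of 1.20 per 100 ng/g lipid corresponds to a log-hazard coefficient of ln(1.20)/100 per ng/g; a minimal sketch of that conversion, assuming proportional hazards (the 200 ng/g increment is illustrative, not a figure from the study):

```python
import math

def hazard_ratio(beta_per_unit, increment):
    # Hazard ratio implied by a Cox log-hazard coefficient over a given increment
    return math.exp(beta_per_unit * increment)

# The paper reports HR = 1.20 per 100 ng/g lipid of PCB 153, which
# corresponds to a per-ng/g log-hazard coefficient of ln(1.20)/100
beta = math.log(1.20) / 100
print(round(hazard_ratio(beta, 100), 2))  # 1.2
print(round(hazard_ratio(beta, 200), 2))  # illustrative 200 ng/g increment
```

Under the proportional-hazards assumption the ratio scales multiplicatively, so doubling the increment squares the hazard ratio.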

Relevance: 30.00%

Abstract:

This paper investigates the role of learning by private agents and the central bank (two-sided learning) in a New Keynesian framework in which both sides of the economy have asymmetric and imperfect knowledge about the true data generating process. We assume that all agents employ the data that they observe (which may be distinct for different sets of agents) to form beliefs about unknown aspects of the true model of the economy, use their beliefs to decide on actions, and revise these beliefs through a statistical learning algorithm as new information becomes available. We study the short-run dynamics of our model and derive its policy recommendations, particularly with respect to central bank communications. We demonstrate that two-sided learning can generate substantial increases in volatility and persistence, and alter the behavior of the variables in the model in a significant way. Our simulations do not converge to a symmetric rational expectations equilibrium and we highlight one source that invalidates the convergence results of Marcet and Sargent (1989). Finally, we identify a novel aspect of central bank communication in models of learning: communication can be harmful if the central bank's model is substantially mis-specified.
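The abstract does not specify the learning algorithm; a minimal sketch of the general idea, using a constant-gain updating rule in which an agent moves its belief toward the latest observation (the gain and the data are illustrative assumptions):

```python
def update_belief(belief, observation, gain):
    # One step of a constant-gain learning rule: move the belief
    # toward the latest observation by a fixed fraction of the surprise
    return belief + gain * (observation - belief)

# An agent revising a belief about, say, inflation (illustrative numbers)
b = 2.0
for x in [2.5, 2.4, 2.6, 2.5]:
    b = update_belief(b, x, gain=0.1)
print(b)
```

A decreasing gain sequence would instead reproduce recursive least-squares learning, the setting of the Marcet and Sargent (1989) convergence results cited above.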

Relevance: 30.00%

Abstract:

One of the tantalising remaining problems in compositional data analysis lies in how to deal with data sets in which there are components which are essential zeros. By an essential zero we mean a component which is truly zero, not something recorded as zero simply because the experimental design or the measuring instrument has not been sufficiently sensitive to detect a trace of the part. Such essential zeros occur in many compositional situations, such as household budget patterns, time budgets, palaeontological zonation studies and ecological abundance studies. Devices such as nonzero replacement and amalgamation are almost invariably ad hoc and unsuccessful in such situations. From consideration of such examples it seems sensible to build up a model in two stages, the first determining where the zeros will occur and the second how the unit available is distributed among the non-zero parts. In this paper we suggest two such models, an independent binomial conditional logistic normal model and a hierarchical dependent binomial conditional logistic normal model. The compositional data in such modelling consist of an incidence matrix and a conditional compositional matrix. Interesting statistical problems arise, such as the question of estimability of parameters, the nature of the computational process for the estimation of both the incidence and compositional parameters caused by the complexity of the subcompositional structure, the formation of meaningful hypotheses, and the devising of suitable testing methodology within a lattice of such essential zero-compositional hypotheses. The methodology is illustrated by application to both simulated and real compositional data.
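As a toy illustration of the two-stage construction, one can first draw the incidence pattern and then distribute the available unit among the non-zero parts; the sketch below uses an independent-Bernoulli first stage and a logistic-normal second stage (all parameter values are illustrative, and the hierarchical dependent variant is not captured):

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_zero_composition(n, p_present, mu, sigma):
    # Stage 1: independent Bernoulli incidence decides which parts are zero.
    # Stage 2: a logistic-normal draw distributes the unit over the present parts.
    # (Illustrative parameterization, not the paper's fitted models.)
    D = len(p_present)
    samples = np.zeros((n, D))
    for i in range(n):
        present = rng.random(D) < p_present      # stage 1: incidence pattern
        k = int(present.sum())
        if k == 0:
            continue                             # all-zero row stays zero
        z = rng.normal(mu, sigma, size=k)        # stage 2: log-scale values
        w = np.exp(z)
        samples[i, present] = w / w.sum()        # close to a unit-sum composition
    return samples

comp = simulate_zero_composition(5, [0.9, 0.9, 0.5, 0.2], 0.0, 1.0)
print(np.round(comp, 3))
```

Each simulated row is either entirely zero or sums to one over its non-zero parts, which is exactly the incidence-plus-conditional-composition structure described above.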

Relevance: 30.00%

Abstract:

The first discussion of compositional data analysis is attributable to Karl Pearson, in 1897. However, notwithstanding the recent developments on the algebraic structure of the simplex, more than twenty years after Aitchison's idea of log-transformations of closed data, the scientific literature is again full of statistical treatments of this type of data using traditional methodologies. This is particularly true in environmental geochemistry where, besides the problem of closure, the spatial structure (dependence) of the data has to be considered. In this work we propose the use of log-contrast values, obtained by a simplicial principal component analysis, as indicators of given environmental conditions. The investigation of the log-contrast frequency distributions allows pointing out the statistical laws able to generate the values and to govern their variability. The changes, if compared, for example, with the mean values of the random variables assumed as models, or other reference parameters, allow defining monitors to be used to assess the extent of possible environmental contamination. A case study on running and ground waters from the Chiavenna Valley (Northern Italy), using Na+, K+, Ca2+, Mg2+, HCO3-, SO42- and Cl- concentrations, is illustrated.

Relevance: 30.00%

Abstract:

With advances in the effectiveness of treatment and disease management, the contribution of chronic comorbid diseases (comorbidities) found within the Charlson comorbidity index to mortality is likely to have changed since development of the index in 1984. The authors reevaluated the Charlson index and reassigned weights to each condition by identifying and following patients to observe mortality within 1 year after hospital discharge. They applied the updated index and weights to hospital discharge data from 6 countries and tested for their ability to predict in-hospital mortality. Compared with the original Charlson weights, weights generated from the Calgary, Alberta, Canada, data (2004) were 0 for 5 comorbidities, decreased for 3 comorbidities, increased for 4 comorbidities, and did not change for 5 comorbidities. The C statistics for discriminating in-hospital mortality between the new score generated from the 12 comorbidities and the Charlson score were 0.825 (new) and 0.808 (old) in Australian data (2008), 0.828 and 0.825 in Canadian data (2008), 0.878 and 0.882 in French data (2004), 0.727 and 0.723 in Japanese data (2008), 0.831 and 0.836 in New Zealand data (2008), and 0.869 and 0.876 in Swiss data (2008). The updated index of 12 comorbidities showed good-to-excellent discrimination in predicting in-hospital mortality in data from 6 countries and may be more appropriate for use with more recent administrative data.
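The C statistic used above to compare the two scores is the probability that a randomly chosen patient who died in hospital received a higher score than a randomly chosen patient who did not, with ties counting half; a sketch (the scores and outcomes below are hypothetical, not from the study):

```python
from itertools import product

def c_statistic(scores, outcomes):
    # Proportion of case/non-case pairs in which the case has the
    # higher score; ties count as half a concordant pair
    cases = [s for s, y in zip(scores, outcomes) if y == 1]
    noncases = [s for s, y in zip(scores, outcomes) if y == 0]
    concordant = sum(1.0 if c > n else 0.5 if c == n else 0.0
                     for c, n in product(cases, noncases))
    return concordant / (len(cases) * len(noncases))

# Hypothetical comorbidity scores and in-hospital deaths (illustrative only)
scores = [0, 2, 3, 5, 6, 9]
outcomes = [0, 0, 1, 0, 1, 1]
print(c_statistic(scores, outcomes))
```

A value of 0.5 means the score discriminates no better than chance; the 0.72-0.88 range reported above indicates good-to-excellent discrimination.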

Relevance: 30.00%

Abstract:

Customer satisfaction and retention are key issues for organizations in today's competitive market place. As such, much research and revenue has been invested in developing accurate ways of assessing consumer satisfaction at both the macro (national) and micro (organizational) level, facilitating comparisons in performance both within and between industries. Since the instigation of the national customer satisfaction indices (CSI), partial least squares (PLS) has been used to estimate the CSI models in preference to structural equation models (SEM) because PLS does not rely on strict assumptions about the data. However, this choice was based upon some misconceptions about the use of SEMs and does not take into consideration more recent advances in SEM, including estimation methods that are robust to non-normality and missing data. In this paper, both SEM and PLS approaches were compared by evaluating perceptions of the Isle of Man Post Office's products and customer service using a CSI format. The new robust SEM procedures were found to be advantageous over PLS. Product quality was found to be the only driver of customer satisfaction, while image and satisfaction were the only predictors of loyalty, thus arguing for the specificity of postal services.

Relevance: 30.00%

Abstract:

High-resolution tomographic imaging of the shallow subsurface is becoming increasingly important for a wide range of environmental, hydrological and engineering applications. Because of their superior resolution power, their sensitivity to pertinent petrophysical parameters, and their far reaching complementarities, both seismic and georadar crosshole imaging are of particular importance. To date, corresponding approaches have largely relied on asymptotic, ray-based approaches, which only account for a very small part of the observed wavefields, inherently suffer from a limited resolution, and in complex environments may prove to be inadequate. These problems can potentially be alleviated through waveform inversion. We have developed an acoustic waveform inversion approach for crosshole seismic data whose kernel is based on a finite-difference time-domain (FDTD) solution of the 2-D acoustic wave equations. This algorithm is tested on and applied to synthetic data from seismic velocity models of increasing complexity and realism and the results are compared to those obtained using state-of-the-art ray-based traveltime tomography. Regardless of the heterogeneity of the underlying models, the waveform inversion approach has the potential to reliably resolve both the geometry and the acoustic properties of features of the size of less than half a dominant wavelength. Our results do, however, also indicate that, within their inherent resolution limits, ray-based approaches provide an effective and efficient means to obtain satisfactory tomographic reconstructions of the seismic velocity structure in the presence of mild to moderate heterogeneity and in the absence of strong scattering. Conversely, the excess effort of waveform inversion provides the greatest benefits for the most heterogeneous, and arguably most realistic, environments where multiple scattering effects tend to be prevalent and ray-based methods lose most of their effectiveness.
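The FDTD kernel referred to above solves the 2-D acoustic wave equations; a minimal 1-D analogue of the second-order explicit update, assuming a homogeneous velocity model with rigid boundaries (grid size, time step and source are illustrative, not the study's settings):

```python
import numpy as np

# Minimal 1-D acoustic FDTD sketch (the paper's kernel is 2-D; all
# numbers below are illustrative assumptions, not the study's settings)
nx, nt = 200, 400
dx, dt = 1.0, 2e-4
c = np.full(nx, 2000.0)          # homogeneous velocity model, m/s
p_prev = np.zeros(nx)            # pressure at t - dt
p_curr = np.zeros(nx)            # pressure at t
p_curr[nx // 2] = 1.0            # impulsive source in the middle of the grid

r2 = (c * dt / dx) ** 2          # squared Courant number; stable while <= 1
for _ in range(nt):
    lap = np.zeros(nx)
    lap[1:-1] = p_curr[2:] - 2 * p_curr[1:-1] + p_curr[:-2]
    p_next = 2 * p_curr - p_prev + r2 * lap   # second-order update in time
    p_prev, p_curr = p_curr, p_next

print(float(np.max(np.abs(p_curr))))  # field stays bounded for a stable step
```

With c*dt/dx = 0.4 the scheme is well inside the stability limit; waveform inversion then iteratively adjusts the velocity model so that such simulated wavefields match the recorded crosshole data.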

Relevance: 30.00%

Abstract:

Summary: Discrete data arise in various research fields, typically when the observations are count data. I propose a robust and efficient parametric procedure for the estimation of discrete distributions. The estimation is done in two phases. First, a very robust, but possibly inefficient, estimate of the model parameters is computed and used to identify outliers. Then the outliers are either removed from the sample or given low weights, and a weighted maximum likelihood estimate (WML) is computed. The weights are determined via an adaptive process such that if the data follow the model, then asymptotically no observation is downweighted. I prove that the final estimator inherits the breakdown point of the initial one, and that its influence function at the model is the same as the influence function of the maximum likelihood estimator, which strongly suggests that it is asymptotically fully efficient. The initial estimator is a minimum disparity estimator (MDE). MDEs can be shown to have full asymptotic efficiency, and some MDEs have very high breakdown points and very low bias under contamination. Several initial estimators are considered, and the performance of the WMLs based on each of them is studied. The result is that in a great variety of situations the WML substantially improves on the initial estimator, both in terms of finite-sample mean square error and in terms of bias under contamination. Moreover, the performance of the WML is rather stable under a change of the MDE, even when the MDEs have very different behaviours. Two examples of application of the WML to real data are considered. In both of them, the necessity for a robust estimator is clear: the maximum likelihood estimator is badly corrupted by the presence of a few outliers. This procedure is particularly natural in the discrete distribution setting, but could be extended to the continuous case, for which a possible procedure is sketched.
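A stripped-down sketch of the two-phase scheme for a Poisson model, with a crude median-based initial fit standing in for the minimum disparity estimator and a hard probability cutoff standing in for the adaptive weighting (the cutoff and the data are illustrative):

```python
import math
from statistics import median

def poisson_pmf(k, lam):
    # Probability mass of observing count k under a Poisson(lam) model
    return math.exp(-lam) * lam ** k / math.factorial(k)

def weighted_poisson_mle(data, cutoff=1e-4):
    # Phase 1: a crude robust (median-based) initial fit flags outliers.
    # Phase 2: a weighted MLE is computed on the down-weighted sample.
    # (Stand-in for the paper's MDE and adaptive weighting scheme.)
    lam0 = median(data)                    # robust, inefficient initial estimate
    weights = [0.0 if poisson_pmf(k, lam0) < cutoff else 1.0 for k in data]
    # The weighted Poisson MLE is the weighted mean of the observations
    return sum(w * k for w, k in zip(weights, data)) / sum(weights)

sample = [2, 3, 1, 4, 2, 3, 2, 50]         # one gross outlier (50)
print(weighted_poisson_mle(sample))
```

The plain MLE (the sample mean, about 8.4 here) is dragged far from the bulk of the data by the single outlier, while the two-phase estimate stays close to the uncontaminated mean.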

Relevance: 30.00%

Abstract:

Purpose: To evaluate whether the correlation between in vitro bond strength data and estimated clinical retention rates of cervical restorations after two years depends on pooled data obtained from multicenter studies or single-test data. Materials and Methods: Pooled mean data for six dentin adhesive systems (Adper Prompt L-Pop, Clearfil SE, OptiBond FL, Prime & Bond NT, Single Bond, and Scotchbond Multipurpose) and four laboratory methods (macroshear, microshear, macrotensile and microtensile bond strength test) (Scherrer et al., 2010) were correlated to estimated pooled two-year retention rates of Class V restorations using the same adhesive systems. For bond strength data from a single test institute, the literature search in SCOPUS revealed one study that tested all six adhesive systems (microtensile) and two that tested five of the six systems (microtensile, macroshear). The correlation was determined with a database designed to perform a meta-analysis on the clinical performance of cervical restorations (Heintze et al., 2010). The clinical data were pooled and adjusted in a linear mixed model, taking the study effect, dentin preparation, type of isolation and bevelling of enamel into account. A regression analysis was carried out to evaluate the correlation between clinical and laboratory findings. Results: The results of the regression analysis for the pooled data revealed that only the macrotensile (adjusted R2 = 0.86) and microtensile tests (adjusted R2 = 0.64), but not the shear and the microshear tests, correlated well with the clinical findings. As regards the data from a single test institute, the correlation was not statistically significant. Conclusion: Macrotensile and microtensile bond strength tests showed an adequate correlation with the retention rate of cervical restorations after two years.
Bond strength tests should be carried out by different operators and/or research institutes to determine the reliability and technique sensitivity of the material under investigation.
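The adjusted R2 values quoted above penalise the raw R2 for the number of predictors relative to the number of observations; a sketch of the formula (the raw R2 of 0.888 and the n = 6, p = 1 configuration are illustrative assumptions, not figures from the study):

```python
def adjusted_r2(r2, n, p):
    # Penalises raw R^2 for model size: n observations, p predictors
    return 1 - (1 - r2) * (n - 1) / (n - p - 1)

# Hypothetical: a raw R^2 of 0.888 from regressing clinical retention
# on bond strength over six adhesive systems with one predictor
print(round(adjusted_r2(0.888, 6, 1), 2))  # 0.86
```

With only six adhesive systems per regression, the adjustment matters: the same raw fit would report a noticeably lower adjusted R2 than R2 itself.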

Relevance: 30.00%

Abstract:

Abstract Significance: Schizophrenia (SZ) and bipolar disorder (BD) are classified as two distinct diseases. However, accumulating evidence shows that both disorders share genetic, pathological, and epidemiological characteristics. Based on genetic and functional findings, redox dysregulation due to an imbalance between pro-oxidants and antioxidant defense mechanisms has been proposed as a risk factor contributing to their pathophysiology. Recent Advances: Altered antioxidant systems and signs of increased oxidative stress are observed in peripheral tissues and brains of SZ and BD patients, including abnormal prefrontal levels of glutathione (GSH), the major cellular redox regulator and antioxidant. Here we review experimental data from rodent models demonstrating that permanent as well as transient GSH deficit results in behavioral, morphological, electrophysiological, and neurochemical alterations analogous to pathologies observed in patients. Mice with GSH deficit display increased stress reactivity, altered social behavior, impaired prepulse inhibition, and exaggerated locomotor responses to psychostimulant injection. These behavioral changes are accompanied by N-methyl-D-aspartate receptor hypofunction, elevated glutamate levels, impairment of parvalbumin GABA interneurons, abnormal neuronal synchronization, altered dopamine neurotransmission, and deficient myelination. Critical Issues: Treatment with the GSH precursor and antioxidant N-acetylcysteine normalizes some of those deficits in mice, but also improves SZ and BD symptoms when given as adjunct to antipsychotic medication. Future Directions: These data demonstrate the usefulness of GSH-deficient rodent models to identify the mechanisms by which a redox imbalance could contribute to the development of SZ and BD pathophysiologies, and to develop novel therapeutic approaches based on antioxidant and redox regulator compounds. Antioxid. Redox Signal. 18, 1428-1443.