907 results for Model selection criteria


Relevance:

30.00%

Publisher:

Abstract:

Self-selection into treatment and self-selection into the sample are major concerns of VAA research and must be controlled for if the aim is to deduce causal effects of VAA use from observational data. This paper focuses on the methodological aspects of VAA research and outlines omnipresent endogeneity issues, partly imposed by unobserved factors that affect both whether individuals choose to use VAAs and their electoral behavior. We promote the use of Heckman selection models and apply various versions of the model to data from the Swiss electorate and smartvote users to see to what extent selection biases interfere with the estimated effects of interest.

Relevance:

30.00%

Publisher:

Abstract:

Models of codon evolution have attracted particular interest because of their unique ability to detect selection forces and their high fit when applied to sequence evolution. We describe here a novel approach for modeling codon evolution, based on the Kronecker product of matrices. The 61 × 61 codon substitution rate matrix is created using the Kronecker product of three 4 × 4 nucleotide substitution matrices, the equilibrium frequencies of codons, and the selection rate parameter. The entries of the nucleotide substitution matrices and the selection rate are treated as parameters of the model and optimized by maximum likelihood. Our fully mechanistic model allows the instantaneous substitution matrix between codons to be fully estimated with only 19 parameters instead of 3,721, by exploiting the biological interdependence existing between positions within codons. We illustrate the properties of our model using computer simulations and assess its relevance by comparing the AICc measures of our model and other models of codon evolution on simulations and a large range of empirical data sets. We show that our model fits most biological data better than current codon models. Furthermore, the parameters of our model can be interpreted in a similar way to the exchangeability rates found in empirical codon models.
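The position-wise construction can be illustrated as follows. This is a sketch under assumptions, not the authors' exact parameterization: three random 4 × 4 matrices are combined as a Kronecker sum (Kronecker products with identities), which gives nonzero rates only to single-nucleotide changes, and uniform codon frequencies stand in for estimated ones:

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(1)

def nuc_matrix(rng):
    # random symmetric 4x4 nucleotide exchangeability matrix, zero diagonal
    m = rng.random((4, 4))
    m = (m + m.T) / 2
    np.fill_diagonal(m, 0.0)
    return m

A, B, C = nuc_matrix(rng), nuc_matrix(rng), nuc_matrix(rng)  # one per codon position
I = np.eye(4)

# Kronecker-sum construction: each term changes exactly one codon position,
# so only single-nucleotide substitutions receive a nonzero rate.
Q64 = (np.kron(np.kron(A, I), I)
       + np.kron(np.kron(I, B), I)
       + np.kron(np.kron(I, I), C))

# Drop the three stop codons to obtain the 61 x 61 sense-codon matrix.
bases = "TCAG"
codons = ["".join(c) for c in product(bases, repeat=3)]
keep = [i for i, c in enumerate(codons) if c not in ("TAA", "TAG", "TGA")]
Q61 = Q64[np.ix_(keep, keep)]

# Weight columns by (here uniform) codon equilibrium frequencies and set the
# diagonal so that each row sums to zero, as for any rate matrix.
freqs = np.full(61, 1 / 61)
Q61 = Q61 * freqs[None, :]
np.fill_diagonal(Q61, -Q61.sum(axis=1))
```

The 19-parameter count in the abstract comes from the three nucleotide matrices plus the selection rate; here the matrices are random placeholders rather than maximum-likelihood estimates.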

Relevance:

30.00%

Publisher:

Abstract:

Recombination arrest between X and Y chromosomes, driven by sexually antagonistic genes, is expected to induce their progressive differentiation. However, in contrast to birds and mammals (which display the predicted pattern), most cold-blooded vertebrates have homomorphic sex chromosomes. Two main hypotheses have been proposed to account for this, namely high turnover rates of sex-determining systems and occasional XY recombination. Using individual-based simulations, we formalize the evolution of XY recombination (here mediated by sex reversal; the "fountain-of-youth" model) under the contrasting forces of sexually antagonistic selection and deleterious mutations. The shift between the domains of elimination and accumulation occurs at much lower selection coefficients for the Y than for the X. In the absence of dosage compensation, mildly deleterious mutations accumulating on the Y depress male fitness, thereby providing incentives for XY recombination. Under our settings, this occurs via "demasculinization" of the Y, allowing recombination in XY (sex-reversed) females. As we also show, this generates a conflict with the X, which coevolves to oppose sex reversal. The resulting rare events of XY sex reversal are enough to purge the Y from its load of deleterious mutations. Our results support the "fountain of youth" as a plausible mechanism to account for the maintenance of sex-chromosome homomorphy.

Relevance:

30.00%

Publisher:

Abstract:

1. We investigated experimentally predation by the flatworm Dugesia lugubris on the snail Physa acuta in relation to predator body length and to prey morphology [shell length (SL) and aperture width (AW)]. 2. SL and AW correlate strongly in the field, but display significant and independent variance among populations. In the laboratory, predation by Dugesia resulted in large and significant selection differentials on both SL and AW. Analysis of partial effects suggests that selection on AW was indirect, mediated through its strong correlation with SL. 3. The probability P(ij) for a snail of size category i (SL) to be preyed upon by a flatworm of size category j was fitted with a Poisson probability distribution, the mean of which increased linearly with predator size (j). Despite the low number of parameters, the fit was excellent (r2 = 0.96). We offer brief biological interpretations of this relationship with reference to optimal foraging theory. 4. The largest size class of Dugesia (>2 cm) did not prey on snails larger than 7 mm shell length. This size threshold might offer Physa a refuge against flatworm predation and thereby allow coexistence in the field. 5. Our results are further discussed with respect to previous field and laboratory observations on P. acuta life-history patterns, in particular its phenotypic variance in adult body size.
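The fit in point 3 — a Poisson distribution for prey-size categories whose mean is linear in predator size — can be sketched by maximum likelihood on hypothetical data (the coefficients and counts below are invented for illustration):

```python
import numpy as np
from scipy.stats import poisson
from scipy.optimize import minimize

rng = np.random.default_rng(2)

# Hypothetical data: size class of snails eaten by flatworms of size class j,
# drawn from a Poisson whose mean is linear in predator size, as in the fit.
a_true, b_true = 0.5, 1.2
pred_size = np.repeat([1, 2, 3, 4], 200)          # predator size class j
prey_size = rng.poisson(a_true + b_true * pred_size)

def neg_log_lik(params):
    a, b = params
    lam = a + b * pred_size
    if np.any(lam <= 0):
        return np.inf                              # keep the Poisson mean positive
    return -poisson.logpmf(prey_size, lam).sum()

fit = minimize(neg_log_lik, x0=[1.0, 1.0], method="Nelder-Mead")
a_hat, b_hat = fit.x
print(a_hat, b_hat)
```

Only two parameters (intercept and slope of the mean) are estimated, which mirrors the abstract's point that the model is parsimonious yet fits well.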

Relevance:

30.00%

Publisher:

Abstract:

CodeML (part of the PAML package) implements a maximum likelihood-based approach to detect positive selection on a specific branch of a given phylogenetic tree. While CodeML is widely used, it is very compute-intensive. We present SlimCodeML, an optimized version of CodeML for the branch-site model. Our performance analysis shows that SlimCodeML substantially outperforms CodeML (up to 9.38 times faster), especially for large-scale genomic analyses.

Relevance:

30.00%

Publisher:

Abstract:

Background: Choosing an adequate measurement instrument depends on the proposed use of the instrument, the concept to be measured, the measurement properties (e.g. internal consistency, reproducibility, content and construct validity, responsiveness, and interpretability), the requirements, the burden for subjects, and the costs of the available instruments. As far as measurement properties are concerned, there are no sufficiently specific standards for the evaluation of measurement properties of instruments to measure health status, and no explicit criteria for what constitutes good measurement properties. In this paper we describe the protocol for the COSMIN study, the objective of which is to develop a checklist containing COnsensus-based Standards for the selection of health Measurement INstruments, including explicit criteria for satisfying these standards. We will focus on evaluative health-related patient-reported outcomes (HR-PROs), i.e. patient-reported health measurement instruments used in a longitudinal design as an outcome measure, excluding health-care-related PROs, such as satisfaction with care or adherence. The COSMIN standards will be made available in the form of an easily applicable checklist.
Method: An international Delphi study will be performed to reach consensus on which measurement properties should be assessed and how, and on criteria for good measurement properties. Two sources of input will be used for the Delphi study: (1) a systematic review of properties, standards and criteria of measurement properties found in systematic reviews of measurement instruments, and (2) an additional literature search for methodological articles presenting a comprehensive checklist of standards and criteria. The Delphi study will consist of four (written) Delphi rounds, with approximately 30 expert panel members with different backgrounds in clinical medicine, biostatistics, psychology, and epidemiology. The final checklist will subsequently be field-tested by assessing the inter-rater reproducibility of the checklist.
Discussion: Since the study will mainly be anonymous, problems commonly encountered in face-to-face group meetings, such as the dominance of certain persons in the communication process, will be avoided. By performing a Delphi study and involving many experts, the likelihood that the checklist will have sufficient credibility to be accepted and implemented will increase.

Relevance:

30.00%

Publisher:

Abstract:

The development of susceptibility maps for debris flows is of primary importance due to population pressure in hazardous zones. However, hazard assessment by process-based modelling at a regional scale is difficult due to the complex nature of the phenomenon, the variability of local controlling factors, and the uncertainty in modelling parameters. A regional assessment must therefore rely on a simplified approach that is not highly parameter-dependent and that can provide zonation with minimum data requirements. A distributed empirical model has thus been developed for regional susceptibility assessments using essentially a digital elevation model (DEM). The model is called Flow-R, for Flow path assessment of gravitational hazards at a Regional scale (available free of charge at www.flow-r.org), and has been successfully applied to case studies in various countries with variable data quality. It provides a substantial basis for a preliminary susceptibility assessment at a regional scale. The model has also proved relevant for assessing other natural hazards such as rockfall, snow avalanches and floods. The model allows for automatic source-area delineation, given user criteria, and for the assessment of the propagation extent based on various spreading algorithms and simple frictional laws. We developed a new spreading algorithm, an improved version of Holmgren's direction algorithm, that is less sensitive to small variations of the DEM and that avoids over-channelization, and so produces more realistic extents. The choice of datasets and algorithms is left to the user, which makes the model adaptable to various applications and levels of dataset availability. Among the possible datasets, the DEM is the only one that is strictly needed for both the source-area delineation and the propagation assessment; its quality is of major importance for the accuracy of the results. We consider a 10 m DEM resolution a good compromise between processing time and quality of results. However, valuable results have still been obtained from lower-quality DEMs with 25 m resolution.
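Holmgren's original multiple-flow-direction algorithm, which Flow-R modifies, distributes flow to lower neighbours in proportion to tan(β)^x. A minimal sketch on a toy DEM (the exponent, cell size and grid are illustrative; Flow-R's improved version differs in details not reproduced here):

```python
import numpy as np

def holmgren_weights(dem, r, c, x=4.0, cell=10.0):
    """Fraction of flow from cell (r, c) to each lower 8-neighbour
    (Holmgren 1994): p_i = tan(beta_i)^x / sum_j tan(beta_j)^x."""
    tans = {}
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr == 0 and dc == 0:
                continue
            rr, cc = r + dr, c + dc
            if not (0 <= rr < dem.shape[0] and 0 <= cc < dem.shape[1]):
                continue
            drop = dem[r, c] - dem[rr, cc]
            if drop > 0:                            # downslope neighbours only
                dist = cell * (2 ** 0.5 if dr and dc else 1.0)
                tans[(rr, cc)] = drop / dist        # tan of the slope angle
    total = sum(t ** x for t in tans.values())
    return {k: t ** x / total for k, t in tans.items()}

dem = np.array([[10.0, 9.0, 8.0],
                [ 9.0, 8.0, 6.0],
                [ 8.0, 7.0, 5.0]])
w = holmgren_weights(dem, 1, 1)
print(w)  # weights over the three downslope neighbours, summing to 1
```

A large exponent `x` concentrates flow on the steepest direction (approaching D8 behaviour), while `x` near 1 spreads it broadly; the sensitivity to small DEM variations that Flow-R addresses comes precisely from this weighting.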

Relevance:

30.00%

Publisher:

Abstract:

To estimate the prevalence of metabolically healthy obesity (MHO) according to different definitions. A population-based sample of 2803 women and 2557 men participated in the study. Metabolic abnormalities were defined using six sets of criteria, which included different combinations of the following: waist; blood pressure; total, high-density lipoprotein or low-density lipoprotein cholesterol; triglycerides; fasting glucose; homeostasis model assessment; high-sensitivity C-reactive protein; and personal history of cardiovascular, respiratory or metabolic diseases. For each set, the prevalence of MHO was assessed for body mass index (BMI), waist or percent body fat. Among obese (BMI ≥30 kg/m²) participants, the prevalence of MHO ranged between 3.3 and 32.1% in men and between 11.4 and 43.3% in women according to the criteria used. Using abdominal obesity, the prevalence of MHO ranged between 5.7 and 36.7% (men) and 12.2 and 57.5% (women). Using percent body fat led to a prevalence of MHO ranging between 6.4 and 43.1% (men) and 12.0 and 55.5% (women). MHO participants had lower odds of presenting a family history of type 2 diabetes. After multivariate adjustment, the odds of presenting with MHO decreased with increasing age, whereas no relationship was found with gender, alcohol consumption or tobacco smoking using most sets of criteria. Physical activity was positively related, whereas increased waist was negatively related, with BMI-defined MHO. MHO prevalence varies considerably according to the criteria used, underscoring the need for a standard definition of this metabolic entity. Physical activity increases the likelihood of presenting with MHO, and MHO is associated with a lower prevalence of family history of type 2 diabetes.

Relevance:

30.00%

Publisher:

Abstract:

We propose an adverse selection framework in which the financial sector has a dual role. It amplifies or dampens exogenous shocks and also generates endogenous fluctuations. We fully characterize constrained optimal contracts in a setting in which entrepreneurs need to borrow and are privately informed about the quality of their projects. Our characterization is novel in analyzing pooling and separating allocations in a context of multi-dimensional screening: specifically, the amounts of investment undertaken and of entrepreneurial net worth are used to screen projects. We then embed these results in a dynamic competitive economy. First, we show how endogenous regime switches in financial contracts may generate fluctuations in an economy that exhibits no dynamics under full information. Unlike previous models of endogenous cycles, our result does not rely on entrepreneurial net worth being counter-cyclical or inconsequential for determining investment. Secondly, the model shows the different implications of adverse selection as opposed to pure moral hazard. In particular, and contrary to standard results in the macroeconomic literature, the financial system may dampen exogenous shocks in the presence of adverse selection.

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVE: To develop a provisional definition for the evaluation of response to therapy in juvenile dermatomyositis (DM) based on the Paediatric Rheumatology International Trials Organisation juvenile DM core set of variables. METHODS: Thirty-seven experienced pediatric rheumatologists from 27 countries achieved consensus on 128 difficult patient profiles as clinically improved or not improved using a stepwise approach (rating of patient profiles, statistical analysis, definition selection). Using the physicians' consensus ratings as the "gold standard measure," chi-square, sensitivity, specificity, false-positive and false-negative rates, area under the receiver operating characteristic curve, and kappa agreement for candidate definitions of improvement were calculated. Definitions with kappa values >0.8 were multiplied by the face validity score to select the top definitions. RESULTS: The top definition of improvement was at least 20% improvement from baseline in 3 of 6 core set variables with no more than 1 of the remaining worsening by more than 30%, which cannot be muscle strength. The second-highest scoring definition was at least 20% improvement from baseline in 3 of 6 core set variables with no more than 2 of the remaining worsening by more than 25%, which cannot be muscle strength (definition P1 selected by the International Myositis Assessment and Clinical Studies group). The third is similar to the second with the maximum amount of worsening set to 30%. This indicates convergent validity of the process. CONCLUSION: We propose a provisional data-driven definition of improvement that reflects well the consensus rating of experienced clinicians, which incorporates clinically meaningful change in core set variables in a composite end point for the evaluation of global response to therapy in juvenile DM.
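The top definition of improvement can be expressed as a simple predicate. This is a sketch under assumptions: percent changes are signed so that positive means improvement, and the position of muscle strength in the variable list is passed explicitly (the real core set variables have different directions of improvement):

```python
def meets_top_definition(pct_change, strength_index=0):
    """Top definition of improvement: at least 20% improvement from baseline
    in 3 of 6 core set variables, with no more than 1 of the remaining
    variables worsening by more than 30% - and that one may not be muscle
    strength. pct_change holds 6 signed percent changes, positive = improvement
    (an assumed convention); strength_index locates muscle strength."""
    assert len(pct_change) == 6
    improved = [i for i, c in enumerate(pct_change) if c >= 20]
    worsened = [i for i, c in enumerate(pct_change) if c < -30]
    if len(improved) < 3 or len(worsened) > 1:
        return False
    return strength_index not in worsened

print(meets_top_definition([25, 30, 20, -10, 0, 5]))                    # True
print(meets_top_definition([25, 30, 20, -35, 0, 5], strength_index=3))  # False
```

The second- and third-ranked definitions from the abstract differ only in the allowed number of worsening variables and the worsening threshold, so the same predicate covers them with two extra parameters.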

Relevance:

30.00%

Publisher:

Abstract:

When individuals in a population can acquire traits through learning, each individual may express a certain number of distinct cultural traits. These traits may have been either invented by the individual himself or acquired from others in the population. Here, we develop a game-theoretic model for the accumulation of cultural traits through individual and social learning. We explore how the rates of innovation, decay, and transmission of cultural traits affect the evolutionarily stable (ES) levels of individual and social learning and the number of cultural traits expressed by an individual when cultural dynamics are at a steady state. We explore the evolution of these phenotypes in both panmictic and structured population settings. Our results suggest that in panmictic populations, the ES level of learning and the number of traits tend to be independent of the social transmission rate of cultural traits and are mainly affected by the innovation and decay rates. By contrast, in structured populations, where interactions occur between relatives, the ES level of learning and the number of traits per individual can be increased (relative to the panmictic case) and may then markedly depend on the transmission rate of cultural traits. This suggests that kin selection may be one additional solution to Rogers's paradox of nonadaptive culture.

Relevance:

30.00%

Publisher:

Abstract:

In this paper we study the relevance of multiple kernel learning (MKL) for the automatic selection of time series inputs. Recently, MKL has gained great attention in the machine learning community due to its flexibility in modelling complex patterns and performing feature selection. In general, MKL constructs the kernel as a weighted linear combination of basis kernels, exploiting different sources of information. SimpleMKL, an efficient algorithm that wraps a Support Vector Regression model to optimize the MKL weights, is used for the analysis. In this sense, MKL performs feature selection by discarding inputs/kernels with low or null weights. The proposed approach is tested with simulated linear and nonlinear time series (AutoRegressive, Henon and Lorenz series).
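The weighted-sum kernel construction at the heart of MKL can be sketched as follows. The weights are fixed by hand here purely for illustration — SimpleMKL would learn them jointly with the SVR, driving the weights of irrelevant lags to zero:

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.metrics.pairwise import rbf_kernel

rng = np.random.default_rng(3)

# Toy autoregressive series; candidate inputs are lags 1..5, but only the
# first two lags actually drive the process.
n = 300
y = np.zeros(n)
for t in range(2, n):
    y[t] = 0.7 * y[t - 1] - 0.2 * y[t - 2] + rng.normal(scale=0.5)

lags = 5
X = np.column_stack([y[lags - k:n - k] for k in range(1, lags + 1)])
target = y[lags:]

# One basis kernel per candidate input; MKL forms K = sum_k d_k * K_k.
basis = [rbf_kernel(X[:, [k]], X[:, [k]], gamma=1.0) for k in range(lags)]
d = np.array([0.5, 0.3, 0.1, 0.05, 0.05])   # fixed here; SimpleMKL learns these
K = sum(dk * Bk for dk, Bk in zip(d, basis))

model = SVR(kernel="precomputed", C=10.0, epsilon=0.01).fit(K, target)
pred = model.predict(K)
```

Because each basis kernel sees only one lag, a near-zero learned weight d_k amounts to discarding that input — which is exactly the feature-selection mechanism the abstract describes.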

Relevance:

30.00%

Publisher:

Abstract:

Aim: We asked whether myocardial flow reserve (MFR) from Rb-82 cardiac PET improves the selection of patients eligible for invasive coronary angiography (ICA). Material and Methods: We enrolled 26 consecutive patients with suspected or known coronary artery disease who underwent dynamic Rb-82 PET/CT and ICA within 60 days; 4 patients who underwent revascularization or had any cardiovascular event between PET and ICA were excluded. Myocardial blood flow at rest (rMBF), at stress with adenosine (sMBF), and myocardial flow reserve (MFR=sMBF/rMBF) were estimated using the 1-compartment Lortie model (FlowQuant) for each coronary artery territory. Stenosis severity was assessed using computer-based automated edge detection (QCA). MFR was divided into 3 groups: G1: MFR<1.5, G2: 1.5≤MFR<2 and G3: MFR≥2. Stenosis severity was graded as non-significant (<50% or FFR ≥0.8), intermediate (50%≤stenosis<70%) and severe (≥70%). The correlation between MFR and percentage of stenosis was assessed using a non-parametric Spearman test. Results: In G1 (44 vessels), 17 vessels (39%) had a severe stenosis, 11 (25%) an intermediate one, and 16 (36%) no significant stenosis. In G2 (13 vessels), 2 vessels (15%) presented a severe stenosis, 7 (54%) an intermediate one, and 4 (31%) no significant stenosis. In G3 (9 vessels), no vessel presented a severe stenosis, 1 (11%) an intermediate one, and 8 (89%) no significant stenosis. Of note, among 11 patients with 3-vessel low MFR<1.5 (G1), 9/11 (82%) had at least one severe stenosis and 2/11 (18%) had at least one intermediate stenosis. There was a significant inverse correlation between stenosis severity and MFR among all 66 territories analyzed (rho=-0.38, p=0.002). Conclusion: Patients with MFR>2 could avoid ICA. Low MFR (G1, G2) on a vessel-based analysis seems to be a poor predictor of severe stenosis. Patients with 3-vessel low MFR would benefit from ICA, as they are likely to present a significant stenosis in at least one vessel.
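The MFR computation, grouping, and Spearman correlation can be sketched on simulated per-vessel data (the flow values and the stenosis relationship below are invented for illustration; only the group cut-offs and the test follow the abstract):

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(4)

# Hypothetical per-vessel data: rest and stress myocardial blood flow.
rmbf = rng.uniform(0.6, 1.2, size=66)                 # rest MBF (ml/min/g)
smbf = rmbf * rng.uniform(1.0, 3.5, size=66)          # stress MBF
mfr = smbf / rmbf                                     # flow reserve

# Groups from the abstract: G1 MFR<1.5, G2 1.5<=MFR<2, G3 MFR>=2.
group = np.digitize(mfr, [1.5, 2.0]) + 1              # 1, 2 or 3

# Invented stenosis severities that loosely decrease with MFR.
stenosis = np.clip(100 - 25 * mfr + rng.normal(0, 10, size=66), 0, 100)

rho, p = spearmanr(mfr, stenosis)
print(rho, p)  # an inverse correlation, as in the abstract's rho = -0.38
```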

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: The reverse transcription quantitative real-time polymerase chain reaction (RT-qPCR) is a widely used, highly sensitive laboratory technique to rapidly and easily detect, identify and quantify gene expression. Reliable RT-qPCR data necessitate accurate normalization with validated control genes (reference genes) whose expression is constant in all studied conditions, and this stability has to be demonstrated. We performed a literature search for studies using quantitative or semi-quantitative PCR in the rat spared nerve injury (SNI) model of neuropathic pain to verify whether any reference genes had previously been validated. We then analyzed the stability over time of 7 reference genes commonly used in the nervous system - specifically in the spinal cord dorsal horn and the dorsal root ganglion (DRG). These were: Actin beta (Actb), Glyceraldehyde-3-phosphate dehydrogenase (GAPDH), ribosomal proteins 18S (18S), L13a (RPL13a) and L29 (RPL29), hypoxanthine phosphoribosyltransferase 1 (HPRT1) and hydroxymethylbilane synthase (HMBS). We compared the candidate genes and established a stability ranking using the geNorm algorithm. Finally, we assessed the number of reference genes necessary for accurate normalization in this neuropathic pain model. RESULTS: We found GAPDH, HMBS, Actb, HPRT1 and 18S cited as reference genes in the literature on studies using the SNI model. Only HPRT1 and 18S had previously been demonstrated as stable in RT-qPCR arrays. All the genes tested in this study using the geNorm algorithm presented gene stability values (M-values) acceptable enough to qualify as potential reference genes in both DRG and spinal cord. Using the coefficient of variation, 18S failed the 50% cut-off with a value of 61% in the DRG. The two most stable genes in the dorsal horn were RPL29 and RPL13a; in the DRG they were HPRT1 and Actb. Using a 0.15 cut-off for pairwise variations, we found that any pair of stable reference genes was sufficient for the normalization process. CONCLUSIONS: In the rat SNI model, we validated and ranked Actb, RPL29, RPL13a, HMBS, GAPDH, HPRT1 and 18S as good reference genes in the spinal cord. In the DRG, 18S did not fulfill the stability criteria. The combination of any two stable reference genes was sufficient to provide an accurate normalization.
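The geNorm stability value M used for the ranking can be sketched as follows — M for a gene is the mean, over all other candidates, of the standard deviation across samples of the pairwise log2 expression ratio (the expression data below are simulated, not from the study):

```python
import numpy as np

def genorm_m(expr):
    """geNorm stability value M per candidate reference gene.
    expr: (n_samples, n_genes) relative expression quantities.
    M_j is the mean, over the other genes k, of the standard deviation
    across samples of log2(expr_j / expr_k); lower M = more stable."""
    n_genes = expr.shape[1]
    logs = np.log2(expr)
    M = np.empty(n_genes)
    for j in range(n_genes):
        pairwise_sd = [np.std(logs[:, j] - logs[:, k], ddof=1)
                       for k in range(n_genes) if k != j]
        M[j] = np.mean(pairwise_sd)
    return M

rng = np.random.default_rng(5)
stable = 2.0 ** rng.normal(0.0, 0.1, size=(12, 3))    # three stable genes
noisy = 2.0 ** rng.normal(0.0, 1.0, size=(12, 1))     # one unstable gene
M = genorm_m(np.hstack([stable, noisy]))
print(M)  # the noisy last gene should receive the highest (worst) M
```

geNorm then iteratively removes the gene with the highest M and recomputes, which yields the stability ranking reported in the abstract.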

Relevance:

30.00%

Publisher:

Abstract:

PURPOSE: Spine surgery rates are increasing worldwide. Treatment failures are often attributed to poor patient selection and inappropriate treatment, but for many spinal disorders there is little consensus on the precise indications for surgery. With an aging population, more patients with lumbar degenerative spondylolisthesis (LDS) will present for surgery. The aim of this study was to develop criteria for the appropriateness of surgery in symptomatic LDS. METHODS: A systematic review was carried out to summarize the current level of evidence for the treatment of LDS. Clinical scenarios were generated comprising combinations of signs and symptoms in LDS and other relevant variables. Based on the systematic review and their own clinical experience, twelve multidisciplinary international experts rated each scenario on a 9-point scale (1 highly inappropriate, 9 highly appropriate) with respect to performing decompression only, fusion, or instrumented fusion. Surgery for each theoretical scenario was classified as appropriate, inappropriate, or uncertain based on the median ratings and the disagreement in the ratings. RESULTS: 744 hypothetical scenarios were generated; overall, surgery (of some type) was rated appropriate in 27%, uncertain in 41% and inappropriate in 31%. Frank panel disagreement was low (7% of scenarios). Face validity was shown by the logical relationship between each variable's subcategories and the appropriateness ratings: e.g., no/mild disability had a mean appropriateness rating of 2.3 ± 1.5, whereas the rating for moderate disability was 5.0 ± 1.6 and for severe disability, 6.6 ± 1.6. Similarly, the average rating for no/minimal neurological abnormality was 2.3 ± 1.5, increasing to 4.3 ± 2.4 for moderate and 5.9 ± 1.7 for severe abnormality. The three variables most likely (p < 0.0001) to be components of scenarios rated "appropriate" were: severe disability, no yellow flags, and severe neurological deficit. CONCLUSION: This is the first study to report criteria for determining candidacy for surgery in LDS developed by a multidisciplinary international panel using a validated method (RAM). The panel ratings followed a logical clinical rationale, indicating good face validity. The work refines the clinical classification and phenotype of degenerative spondylolisthesis. The predictive validity of the criteria should be evaluated prospectively to examine whether patients treated "appropriately" have better clinical outcomes.
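The RAM-style classification of a scenario from panel ratings can be sketched as follows; the disagreement rule used here (at least a third of ratings in 1-3 and at least a third in 7-9) is one common operationalisation, assumed rather than taken from the paper:

```python
import statistics

def classify_scenario(ratings):
    """Classify one scenario from panel ratings on the 1-9 appropriateness
    scale. Appropriate: median 7-9 without disagreement; inappropriate:
    median 1-3 without disagreement; uncertain otherwise. Disagreement rule
    (an assumed, common operationalisation): at least a third of ratings
    fall in 1-3 AND at least a third fall in 7-9."""
    n = len(ratings)
    med = statistics.median(ratings)
    low = sum(r <= 3 for r in ratings)
    high = sum(r >= 7 for r in ratings)
    disagreement = low >= n / 3 and high >= n / 3
    if disagreement or 3 < med < 7:
        return "uncertain"
    return "appropriate" if med >= 7 else "inappropriate"

print(classify_scenario([8, 8, 7, 9, 7, 8, 9, 7, 8, 8, 7, 9]))  # appropriate
print(classify_scenario([2, 2, 1, 3, 2, 9, 8, 9, 2, 8, 9, 2]))  # uncertain
```

Applied over all 744 scenarios, a function like this yields the three-way split (appropriate / uncertain / inappropriate) reported in the results.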