911 results for Null Hypothesis
Abstract:
Background: Obese adults are prone to developing metabolic and cardiovascular diseases. Furthermore, overweight expectant mothers give birth to large babies who also have an increased likelihood of developing metabolic and cardiovascular diseases. Fundamental advances in understanding the pathophysiology of obesity are therefore critical to the development of anti-obesity therapies, not only for this generation but also for future ones. Skeletal muscle plays a major role in fat metabolism, and much work has focused on promoting this activity in order to control the development of obesity. Research has evaluated myostatin inhibition as a strategy to prevent the development of obesity and concluded in some cases that it offers a protective mechanism against a high-fat diet. Results: We hypothesised that myostatin inhibition should protect not only the mother but also her developing foetus from the detrimental effects of a high-fat diet. Unexpectedly, we found that muscle development was attenuated in the foetuses of myostatin null mice raised on a high-fat diet. We therefore re-examined the effect of the high-fat diet on adults and found that myostatin null mice were more susceptible to diet-induced obesity through a mechanism involving impaired inter-organ fat utilization. Conclusions: Loss of myostatin alters fatty acid uptake and oxidation in skeletal muscle and liver. We show that the abnormally high metabolic activity of fat in myostatin null mice is decreased by a high-fat diet, resulting in excessive adipose deposition and lipotoxicity. Collectively, our genetic loss-of-function studies offer an explanation for the lean phenotype displayed by a host of animals lacking myostatin signalling. Keywords: Muscle, Obesity, High-fat diet, Metabolism, Myostatin
Abstract:
The contraction of a species’ distribution range, which results from the extirpation of local populations, generally precedes its extinction. Therefore, understanding the drivers of range contraction is important for conservation and management. Although many processes can potentially lead to local extirpation and range contraction, three main null models have been proposed: demographic, contagion, and refuge. The first two models postulate that the probability of local extirpation for a given area depends on its relative position within the range, but they generate distinct spatial predictions because they assume either a ubiquitous (demographic) or a clinal (contagion) distribution of threats. The third model (refuge) postulates that extirpations are determined by the intensity of human impacts, leading to heterogeneous spatial predictions potentially compatible with those made by the other two null models. A few previous studies have explored the generality of some of these null models, but we present here the first comprehensive evaluation of all three. Using descriptive indices and regression analyses, we contrast the predictions made by each of the null models against empirical spatial data describing range contraction in 386 terrestrial vertebrates (mammals, birds, amphibians, and reptiles) distributed across the world. Observed contraction patterns do not consistently conform to the predictions of any of the three models, suggesting that these may not be adequate null models for evaluating range contraction dynamics among terrestrial vertebrates. Instead, our results support alternative null models that account for both relative position and intensity of human impacts. These new models provide a better multifactorial baseline for describing range contraction patterns in vertebrates. This general baseline can be used to explore how additional factors influence contraction, and ultimately extinction, for particular areas or species, as well as to predict future changes in light of current and new threats.
Abstract:
Background: The differential susceptibility hypothesis suggests that certain genetic variants moderate the effects of both negative and positive environments on mental health and may therefore be important predictors of response to psychological treatments. Nevertheless, the identification of such variants has so far been limited to preselected candidate genes. In this study we extended the differential susceptibility hypothesis from a candidate-gene to a genome-wide approach to test whether a polygenic score of environmental sensitivity predicted response to Cognitive Behavioural Therapy (CBT) in children with anxiety disorders. Methods: We identified variants associated with environmental sensitivity using a novel method in which within-pair variability in emotional problems in 1,026 monozygotic (MZ) twin pairs was examined as a function of the pairs’ genotype. We created a polygenic score of environmental sensitivity based on the whole-genome findings and tested the score as a moderator of parenting on emotional problems in 1,406 children and of response to individual, group and brief parent-led CBT in 973 children with anxiety disorders. Results: The polygenic score significantly moderated the effects of parenting on emotional problems and the effects of treatment. Individuals with a high score responded significantly better to individual CBT than to group CBT or brief parent-led CBT (remission rates: 70.9%, 55.5% and 41.6%, respectively). Conclusions: Pending successful replication, our results should be considered exploratory. Nevertheless, if replicated, they suggest that individuals with the greatest environmental sensitivity may be more likely to develop emotional problems in adverse environments, but also to benefit more from the most intensive types of treatment.
Abstract:
The thermodynamic properties of dark energy fluids described by an equation of state parameter omega = p/rho are rediscussed in the context of FRW-type geometries. Contrary to previous claims, it is argued here that the phantom regime omega < -1 is not physically possible, since both the temperature and the entropy of every physical fluid must always be positive definite. This means that one cannot appeal to negative temperatures in order to save the phantom dark energy hypothesis, as has recently been done in the literature. Such a result remains true as long as the chemical potential is zero. However, if the phantom fluid is endowed with a non-null chemical potential, the phantom field hypothesis becomes thermodynamically consistent; that is, there are macroscopic equilibrium states with T > 0 and S > 0 in the course of the Universe's expansion. (C) 2008 Elsevier B.V. All rights reserved.
Abstract:
The matrix-tolerance hypothesis suggests that the species most abundant in the inter-habitat matrix should be the least vulnerable to habitat fragmentation. This model was tested with leaf-litter frogs in the Atlantic Forest, where the fragmentation process is older and more severe than in the Amazon, where the model was first developed. Frog abundance data from the agricultural matrix, forest fragments and continuous forest localities were used. We found the expected negative correlation between the abundance of frogs in the matrix and their vulnerability to fragmentation; however, results varied with fragment size and species traits. Smaller fragments exhibited a stronger matrix-vulnerability correlation than intermediate fragments, while no significant relation was observed for large fragments. Moreover, some species that avoid the matrix were not sensitive to a decrease in patch size, and the opposite was also true, indicating significant departures from what the model predicts. Most of the species that use the matrix were forest species with aquatic larval development, but those species do not necessarily respond to fragmentation or fragment size, and thus they weaken the expected relationship most strongly. Therefore, the main relationship predicted by the matrix-tolerance hypothesis was observed in the Atlantic Forest; however, we note that this prediction can be substantially affected by fragment size and by species traits. We propose that the matrix-tolerance model should be broadened into a more effective model that includes other patch characteristics, particularly fragment size, and individual species traits (e.g., reproductive mode and habitat preference).
Abstract:
Neuropeptides are produced from larger precursors by limited proteolysis, first by endopeptidases and then by carboxypeptidases. Major endopeptidases required for these cleavages include prohormone convertase (PC) 1/3 and PC2. In this study, quantitative peptidomics analysis was used to characterize the specific role PC1/3 plays in this process. Peptides isolated from the hypothalamus, amygdala, and striatum of PC1/3 null mice were compared with those from heterozygous and wild-type mice. Extracts were labeled with stable isotopic tags and fractionated by HPLC, after which relative peptide levels were determined using tandem mass spectrometry. In total, 92 peptides were found, of which 35 were known neuropeptides or related peptides derived from 15 distinct secretory pathway proteins: 7B2, chromogranin A and B, cocaine- and amphetamine-regulated transcript, procholecystokinin, proenkephalin, promelanin-concentrating hormone, proneurotensin, propituitary adenylate cyclase-activating peptide, proSAAS, prosomatostatin, provasoactive intestinal peptide, provasopressin, secretogranin III, and VGF. Among the peptides derived from these proteins, approximately one-third were decreased in the PC1/3 null mice relative to wild-type mice, approximately one-third showed no change, and approximately one-third were increased in the PC1/3 null mice. Cleavage sites were analyzed in peptides that showed no change or that decreased in PC1/3 null mice, and these results were compared with peptides that showed no change or decreased in previous peptidomic studies with PC2 null mice. Analysis of these sites showed that while PC1/3 and PC2 have overlapping substrate preferences, there are particular cleavage site residues that distinguish the peptides preferred by each PC.
Abstract:
The multivariate skew-t distribution (J Multivar Anal 79:93-113, 2001; J R Stat Soc, Ser B 65:367-389, 2003; Statistics 37:359-363, 2003) includes the Student t, skew-Cauchy and Cauchy distributions as special cases, and the normal and skew-normal ones as limiting cases. In this paper, we explore the use of Markov chain Monte Carlo (MCMC) methods to develop a Bayesian analysis of repeated-measures, pretest/post-test data under a multivariate null intercept measurement error model (J Biopharm Stat 13(4):763-771, 2003), where the random errors and the unobserved value of the covariate (latent variable) follow a Student t and a skew-t distribution, respectively. The results and methods are numerically illustrated with an example from the field of dentistry.
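The special cases listed in this abstract follow from the standard stochastic representation of the skew-t law: a skew-normal variate scaled by the square root of an independent chi-square over its degrees of freedom. As a minimal, hypothetical sketch (illustrating only the distribution, not the paper's measurement error model; all function names are ours), draws can be generated with nothing beyond the Python standard library:

```python
import math
import random

def rskewnormal(alpha, rng=random):
    # Standard skew-normal SN(alpha) via the stochastic representation
    # Z = delta*|U0| + sqrt(1 - delta^2)*U1, with U0, U1 ~ N(0, 1) i.i.d.
    delta = alpha / math.sqrt(1.0 + alpha ** 2)
    u0, u1 = rng.gauss(0, 1), rng.gauss(0, 1)
    return delta * abs(u0) + math.sqrt(1.0 - delta ** 2) * u1

def rskewt(alpha, nu, rng=random):
    # Skew-t draw: skew-normal divided by sqrt(W/nu), W ~ chi-square(nu).
    # alpha = 0 recovers Student t; alpha = 0, nu = 1 recovers Cauchy;
    # nu -> infinity recovers the skew-normal limiting case.
    w = rng.gammavariate(nu / 2.0, 2.0)  # chi-square(nu) as Gamma(nu/2, scale 2)
    return rskewnormal(alpha, rng) / math.sqrt(w / nu)
```

In Bayesian MCMC analyses of such models, this same scale-mixture representation typically underpins a data-augmentation step in which the mixing variable W is treated as latent.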
Abstract:
Considering the Wald, score, and likelihood ratio asymptotic test statistics, we analyze a multivariate null intercept errors-in-variables regression model in which the explanatory and the response variables are subject to measurement errors, and a possible structure of dependency between measurements taken on the same individual is incorporated, representing a longitudinal structure. This model was proposed by Aoki et al. (2003b) and analyzed under the Bayesian approach. In this article, adopting the classical approach, we analyze the asymptotic test statistics and present a simulation study comparing the behavior of the three test statistics for different sample sizes, parameter values and nominal levels of the test. Closed-form expressions for the score function and the Fisher information matrix are also presented. We consider two real numerical illustrations: the odontological data set from Hadgu and Koch (1999), and a quality control data set.
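The relationship among the three asymptotic statistics is easiest to see in a toy setting. The following hypothetical sketch (a one-parameter binomial test of H0: p = p0, not the errors-in-variables model of the paper) computes all three from the same data:

```python
import math

def binomial_tests(x, n, p0):
    """Wald, score and likelihood ratio statistics for H0: p = p0,
    given x successes in n Bernoulli trials (0 < x < n assumed)."""
    p_hat = x / n
    # Wald: standardize with the variance evaluated at the MLE p_hat
    wald = (p_hat - p0) / math.sqrt(p_hat * (1 - p_hat) / n)
    # Score: standardize with the variance evaluated at the null value p0
    score = (p_hat - p0) / math.sqrt(p0 * (1 - p0) / n)
    # Likelihood ratio: twice the log-likelihood gap between MLE and null
    def loglik(p):
        return x * math.log(p) + (n - x) * math.log(1 - p)
    lr = 2.0 * (loglik(p_hat) - loglik(p0))
    return wald, score, lr
```

Squaring the Wald and score statistics makes all three asymptotically chi-square with one degree of freedom under H0; simulation studies like the one in this paper quantify how much the three disagree in finite samples.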
Abstract:
The skew-normal distribution is a class of distributions that includes the normal distribution as a special case. In this paper, we explore the use of Markov chain Monte Carlo (MCMC) methods to develop a Bayesian analysis in a multivariate, null intercept, measurement error model [R. Aoki, H. Bolfarine, J.A. Achcar, and D. Leao Pinto Jr, Bayesian analysis of a multivariate null intercept errors-in-variables regression model, J. Biopharm. Stat. 13(4) (2003b), pp. 763-771] where the unobserved value of the covariate (latent variable) follows a skew-normal distribution. The results and methods are applied to a real dental clinical trial presented in [A. Hadgu and G. Koch, Application of generalized estimating equations to a dental randomized clinical trial, J. Biopharm. Stat. 9 (1999), pp. 161-178].
Abstract:
In this article, we discuss inferential aspects of measurement error regression models with null intercepts when the unknown quantity x (latent variable) follows a skew-normal distribution. We first examine the maximum-likelihood approach to estimation via the EM algorithm, exploring statistical properties of the model considered. Then the marginal likelihood, the score function and the observed information matrix of the observed quantities are presented, allowing direct implementation of inference. To discuss diagnostic techniques for this type of model, we derive the appropriate matrices for assessing local influence on the parameter estimates under different perturbation schemes. The results and methods developed in this paper are illustrated using part of a real data set from Hadgu and Koch [1999, Application of generalized estimating equations to a dental randomized clinical trial. Journal of Biopharmaceutical Statistics, 9, 161-178].
Abstract:
In many epidemiological studies it is common to resort to regression models relating the incidence of a disease to its risk factors. The main goal of this paper is to consider inference for such models with error-prone observations and measurement error variances that change across observations. We suppose that the observations follow a bivariate normal distribution and that the measurement errors are normally distributed. Aggregate data allow the estimation of the error variances. Maximum likelihood estimates are computed numerically via the EM algorithm. Consistent estimation of the asymptotic variance of the maximum likelihood estimators is also discussed. Test statistics are proposed for testing hypotheses of interest. Further, we implement a simple graphical device that enables an assessment of the model's goodness of fit. Results of simulations concerning the properties of the test statistics are reported. The approach is illustrated with data from the WHO MONICA Project on cardiovascular disease. Copyright (C) 2008 John Wiley & Sons, Ltd.
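The paper's bivariate regression model is more elaborate, but the role EM plays when measurement error variances differ across observations can be sketched in a minimal hypothetical setting: a latent mean observed through known, observation-specific error variances (the model, names and data below are illustrative assumptions, not the paper's notation):

```python
def em_random_effects(y, sigma2, iters=200):
    """EM for y_i = x_i + e_i, with latent x_i ~ N(mu, tau2) and
    e_i ~ N(0, sigma2_i), where the sigma2_i are known and may all
    differ (heteroscedastic errors). Returns ML estimates (mu, tau2)."""
    n = len(y)
    mu, tau2 = sum(y) / n, 1.0
    for _ in range(iters):
        # E-step: posterior variance and mean of each latent x_i,
        # shrinking noisy observations toward mu by their precision.
        v = [1.0 / (1.0 / tau2 + 1.0 / s2) for s2 in sigma2]
        m = [vi * (mu / tau2 + yi / s2) for vi, yi, s2 in zip(v, y, sigma2)]
        # M-step: maximize the expected complete-data log-likelihood
        mu = sum(m) / n
        tau2 = sum(vi + (mi - mu) ** 2 for vi, mi in zip(v, m)) / n
        tau2 = max(tau2, 1e-8)  # numerical floor when the ML sits on tau2 = 0
    return mu, tau2
```

Each E-step pulls an observation toward the current mean in proportion to its known error variance, so precisely measured points dominate; the M-step then re-estimates the latent mean and variance from these posterior moments.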
A robust Bayesian approach to null intercept measurement error model with application to dental data
Abstract:
Measurement error models often arise in epidemiological and clinical research. Usually, in this setup it is assumed that the latent variable has a normal distribution; however, the normality assumption may not always be correct. The skew-normal/independent distributions form a class of asymmetric thick-tailed distributions that includes the skew-normal distribution as a special case. In this paper, we explore the use of skew-normal/independent distributions as a robust alternative in the null intercept measurement error model under a Bayesian paradigm. We assume that the random errors and the unobserved value of the covariate (latent variable) jointly follow a skew-normal/independent distribution, providing an appealing robust alternative to the routine use of the symmetric normal distribution in this type of model. Specific distributions examined include univariate and multivariate versions of the skew-normal, skew-t, skew-slash and skew contaminated normal distributions. The methods developed are illustrated using a real data set from a dental clinical trial. (C) 2008 Elsevier B.V. All rights reserved.