13 results for Transitive Inferences

in Collection Of Biostatistics Research Archive


Relevance: 20.00%

Abstract:

This contribution investigates the evolution of diet in the Pan–Homo and hominin clades. It does this by focusing on 12 variables (nine dental and three mandibular) for which data are available about extant chimpanzees, modern humans and most extinct hominins. Previous analyses of this type have approached the interpretation of dental and gnathic function by focusing on the identification of the foods consumed (e.g. fruits, leaves, etc.) rather than on the physical properties (e.g. hardness, toughness, etc.) of those foods, and they have not specifically addressed the role that the physical properties of foods play in determining dental adaptations. We take the available evidence for the 12 variables and set out what the expression of each of those variables is in extant chimpanzees, the earliest hominins, archaic hominins, megadont archaic hominins, and an inclusive grouping made up of transitional hominins and pre-modern Homo. We then present hypotheses about what the states of these variables would be in the last common ancestor of the Pan–Homo clade and in the stem hominin. We review the physical properties of food and suggest how these properties can be used to investigate the functional morphology of the dentition. We show which aspects of anterior tooth morphology are critical for food preparation (e.g. peeling fruit) prior to its ingestion, which features of the postcanine dentition (e.g. overall and relative size of the crowns) are related to the reduction in the particle size of food, and how information about the macrostructure (e.g. enamel thickness) and microstructure (e.g. extent and location of enamel prism decussation) of the enamel cap might be used to make predictions about the types of foods consumed by extinct hominins. Specifically, we show how thick enamel can protect against the generation and propagation of cracks in the enamel that begin at the enamel–dentine junction and move towards the outer enamel surface.

Relevance: 20.00%

Abstract:

Various inference procedures for linear regression models with censored failure times have been studied extensively. Recent developments in efficient algorithms to implement these procedures have enhanced the practical use of such models in survival analysis. In this article, we present robust inferences for certain covariate effects on the failure time in the presence of "nuisance" confounders under a semiparametric, partial linear regression setting. Specifically, the estimation procedures for the regression coefficients of interest are derived from a working linear model and remain valid even when the function of the confounders in the model is not correctly specified. The new proposals are illustrated with two examples, and their validity for cases with practical sample sizes is demonstrated via a simulation study.
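
The robustness claim rests on the partialling-out idea behind partial linear models: the coefficient of interest stays consistent if the covariate of interest is residualized against the confounders, even when the working model for the confounder function is misspecified. The sketch below illustrates this on simulated, uncensored data (censoring is omitted purely for brevity); the data-generating law and all settings are invented and do not reproduce the paper's procedure.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
W = rng.uniform(-2, 2, n)                    # "nuisance" confounder
X = W**2 + rng.normal(size=n)                # covariate of interest
logT = 1.5 * X + W**2 + rng.normal(size=n)   # log failure time, beta = 1.5

# Naive working model that is linear in W (misspecified: truth is W^2).
Z = np.column_stack([np.ones(n), X, W])
beta_naive = np.linalg.lstsq(Z, logT, rcond=None)[0][1]

# Partialling out: residualize X and logT on a crude nonparametric
# estimate of their conditional means given W (moving-window average).
def smooth(w, y, h=0.3):
    return np.array([y[np.abs(w - wi) < h].mean() for wi in w])

rX = X - smooth(W, X)
rY = logT - smooth(W, logT)
beta_pl = (rX @ rY) / (rX @ rX)

print(f"naive linear-in-W estimate: {beta_naive:.2f}")
print(f"partialling-out estimate:   {beta_pl:.2f}  (truth: 1.50)")
```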

Relevance: 10.00%

Abstract:

The diet of early human ancestors has received renewed theoretical interest since the discovery of elevated δ13C values in the enamel of Australopithecus africanus and Paranthropus robustus. As a result, the hominin diet is hypothesized to have included C4 grass or the tissues of animals that themselves consumed C4 grass. On mechanical grounds, such a diet is incompatible with the dental morphology and dental microwear of early hominins. Most inferences, particularly for Paranthropus, favor a diet of hard or mechanically resistant foods. This discrepancy has invigorated the longstanding hypothesis that hominins consumed plant underground storage organs (USOs). Plant USOs are attractive candidate foods because many bulbous grasses and cormous sedges use C4 photosynthesis. Yet mechanical data for USOs, or for any putative hominin food, are scarce. To fill this empirical void, we measured the mechanical properties of USOs from 98 plant species from across sub-Saharan Africa. We found that rhizomes were the most resistant to deformation and fracture, followed by tubers, corms, and bulbs. An important result of this study is that corms exhibited low toughness values (mean = 265.0 J m^-2) and relatively high Young's modulus values (mean = 4.9 MPa). This combination of properties fits many descriptions of the hominin diet as consisting of hard-brittle objects. When compared to corms, bulbs are tougher (mean = 325.0 J m^-2) and less stiff (mean = 2.5 MPa). Again, this combination of traits resembles dietary inferences, especially for Australopithecus, which is predicted to have consumed soft-tough foods. Lastly, we observed the roasting behavior of Hadza hunter-gatherers and measured the effects of roasting on the toughness of undomesticated tubers. Our results support assumptions that roasting lessens the work of mastication and, by inference, the cost of digestion. Together these findings provide the first mechanical basis for discussing the adaptive advantages of roasting tubers and the plausibility of USOs in the diet of early hominins.
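
For readers unfamiliar with the two reported quantities, the sketch below shows how toughness (work to create new fracture surface, in J m^-2) and Young's modulus (initial slope of a stress-strain curve) are typically computed from test traces. The traces and specimen dimensions here are fabricated for illustration and are not the study's measurements.

```python
import numpy as np

# Toughness: work to drive a cut through the tissue, per unit area of
# new fracture surface, from a hypothetical force-displacement trace.
disp = np.linspace(0, 0.004, 200)               # displacement (m)
force = 1.0 * (1 - np.exp(-disp / 0.001))       # force (N), made-up trace
work = np.trapz(force, disp)                    # energy expended (J)
toughness = work / (5e-3 * 2e-3)                # divide by cut area (m^2)

# Young's modulus: initial slope of a compression stress-strain curve.
strain = np.linspace(0, 0.10, 50)
stress = 4.9e6 * strain - 8e6 * strain**2       # Pa, made-up softening curve
E = np.polyfit(strain[:10], stress[:10], 1)[0]  # initial slope (Pa)

print(f"toughness ~ {toughness:.0f} J/m^2, E ~ {E / 1e6:.2f} MPa")
```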

Relevance: 10.00%

Abstract:

Professor Sir David R. Cox (DRC) is widely acknowledged as among the most important scientists of the second half of the twentieth century. He inherited the mantle of statistical science from Pearson and Fisher, advanced their ideas, and translated statistical theory into practice so as to forever change the application of statistics in many fields, but especially biology and medicine. The logistic and proportional hazards models he substantially developed are arguably among the most influential biostatistical methods in current practice. This paper looks forward over the period from DRC's 80th to 90th birthdays to speculate about the future of biostatistics, drawing lessons from DRC's contributions along the way. We consider "Cox's model" (CM) of biostatistics, an approach to statistical science that: formulates scientific questions or quantities in terms of parameters gamma in probability models f(y; gamma) that represent, in a parsimonious fashion, the underlying scientific mechanisms (Cox, 1997); partitions the parameters gamma = (theta, eta) into a subset of interest theta and other "nuisance parameters" eta necessary to complete the probability distribution (Cox and Hinkley, 1974); develops methods of inference about the scientific quantities that depend as little as possible upon the nuisance parameters (Barndorff-Nielsen and Cox, 1989); and thinks critically about the appropriate conditional distribution on which to base inferences. We briefly review exciting biomedical and public health challenges that are capable of driving statistical developments in the next decade. We discuss the statistical models and model-based inferences central to the CM approach, contrasting them with computationally intensive strategies for prediction and inference advocated by Breiman and others (e.g. Breiman, 2001) and with more traditional design-based methods of inference (Fisher, 1935). We discuss the hierarchical (multi-level) model as an example of the future challenges and opportunities for model-based inference. We then consider the role of conditional inference, a second key element of the CM. Recent examples from genetics are used to illustrate these ideas. Finally, the paper examines causal inference and statistical computing, two other topics we believe will be central to biostatistics research and practice in the coming decade. Throughout the paper, we attempt to indicate how DRC's work and the "Cox Model" have set a standard of excellence to which all can aspire in the future.
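
The partial likelihood of the proportional hazards model is perhaps the canonical example of the CM principle that inference about theta should depend as little as possible on nuisance parameters: the entire baseline hazard, an infinite-dimensional nuisance, cancels out of every term. A minimal sketch on simulated data, assuming no censoring, no ties, and a single binary covariate:

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(1)
n = 500
x = rng.binomial(1, 0.5, n)                        # single binary covariate
theta_true = 0.7                                   # log hazard ratio
t = rng.exponential(1.0 / np.exp(theta_true * x))  # uncensored event times

xs = x[np.argsort(t)]                              # covariates in event order

def neg_log_partial_lik(theta):
    # Risk set at the i-th ordered event time is {i, i+1, ..., n-1};
    # the baseline hazard drops out of every ratio.
    ex = np.exp(theta * xs)
    risk = np.cumsum(ex[::-1])[::-1]
    return -np.sum(theta * xs - np.log(risk))

theta_hat = minimize_scalar(neg_log_partial_lik).x
print(f"partial-likelihood estimate: {theta_hat:.2f} (truth {theta_true})")
```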

Relevance: 10.00%

Abstract:

Genomic alterations have been linked to the development and progression of cancer. The technique of Comparative Genomic Hybridization (CGH) yields data consisting of fluorescence intensity ratios of test and reference DNA samples. The intensity ratios provide information about DNA copy number. Practical issues such as the contamination of tumor cells in tissue specimens and normalization errors necessitate the use of statistics for learning about genomic alterations from array-CGH data. As increasing amounts of array-CGH data become available, there is a growing need for automated algorithms for characterizing genomic profiles. Specifically, there is a need for algorithms that can identify gains and losses in copy number based on statistical considerations, rather than merely detect trends in the data. We adopt a Bayesian approach, relying on the hidden Markov model to account for the inherent dependence in the intensity ratios. Posterior inferences are made about gains and losses in copy number. Localized amplifications (associated with oncogene mutations) and deletions (associated with mutations of tumor suppressors) are identified using posterior probabilities. Global trends such as extended regions of altered copy number are detected. Since the posterior distribution is analytically intractable, we implement a Metropolis-within-Gibbs algorithm for efficient simulation-based inference. Publicly available data on pancreatic adenocarcinoma, glioblastoma multiforme and breast cancer are analyzed, and comparisons are made with some widely used algorithms to illustrate the reliability and success of the technique.
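
As a rough illustration of the computational strategy, the sketch below runs a Metropolis-within-Gibbs sampler for a toy three-state (loss/neutral/gain) HMM on simulated log-ratios: single-site Gibbs updates for the hidden states and random-walk Metropolis updates for the state means under a flat prior. The transition matrix, noise level, and all tuning constants are invented for the example and are far simpler than the paper's model.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic profile: a deleted segment, a neutral stretch, then a gain.
truth = np.array([0] * 30 + [1] * 50 + [2] * 20)   # 0=loss, 1=neutral, 2=gain
true_mu = np.array([-0.5, 0.0, 0.4])               # log2-ratio state means
y = true_mu[truth] + rng.normal(0, 0.15, truth.size)

K, n, sigma = 3, y.size, 0.15
A = np.full((K, K), 0.02) + np.eye(K) * 0.96       # sticky transition matrix
mu = np.array([-0.3, 0.0, 0.3])                    # initial state means
s = np.ones(n, dtype=int)                          # initial hidden states

def loglik(i, k, m):
    return -0.5 * ((y[i] - m[k]) / sigma) ** 2

for sweep in range(500):
    # Gibbs step: resample each hidden state given its neighbours and y_i.
    for i in range(n):
        logp = np.array([loglik(i, k, mu) for k in range(K)])
        if i > 0:
            logp += np.log(A[s[i - 1], :])
        if i < n - 1:
            logp += np.log(A[:, s[i + 1]])
        p = np.exp(logp - logp.max())
        s[i] = rng.choice(K, p=p / p.sum())
    # Metropolis step: random-walk proposal for each state mean (flat prior).
    for k in range(K):
        prop = mu.copy()
        prop[k] += rng.normal(0, 0.02)
        idx = np.where(s == k)[0]
        logr = np.sum(loglik(idx, k, prop) - loglik(idx, k, mu))
        if np.log(rng.uniform()) < logr:
            mu = prop

print("estimated state means:", np.round(mu, 2))
print(f"states recovered at {(s == truth).mean():.0%} of probes")
```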

Relevance: 10.00%

Abstract:

Whilst estimation of the marginal (total) causal effect of a point exposure on an outcome is arguably the most common objective of experimental and observational studies in the health and social sciences, in recent years investigators have also become increasingly interested in mediation analysis. Specifically, upon establishing a non-null total effect of the exposure, investigators routinely wish to make inferences about the direct (indirect) pathway of the effect of the exposure not through (through) a mediator variable that occurs subsequent to the exposure and prior to the outcome. Although powerful semiparametric methodologies that produce doubly robust and highly efficient estimates of the marginal total causal effect have been developed for the analysis of observational studies, similar methods for mediation analysis are currently lacking. Thus, this paper develops a general semiparametric framework for obtaining inferences about so-called marginal natural direct and indirect causal effects, while appropriately accounting for a large number of pre-exposure confounding factors for the exposure and the mediator variables. Our analytic framework is particularly appealing because it gives new insights on issues of efficiency and robustness in the context of mediation analysis. In particular, we propose new multiply robust locally efficient estimators of the marginal natural indirect and direct causal effects, and develop a novel doubly robust sensitivity analysis framework for the assumption of ignorability of the mediator variable.
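
To fix ideas about the target quantities, the sketch below computes plug-in estimates of the natural direct and indirect effects via the mediation formula with simple linear working models. This is a parametric g-computation stand-in chosen for brevity, not the multiply robust estimator proposed in the paper; the data-generating law is invented.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 5000
C = rng.normal(size=n)                              # pre-exposure confounder
A = rng.binomial(1, 1 / (1 + np.exp(-0.5 * C)))     # binary exposure
M = 1.0 * A + 0.5 * C + rng.normal(size=n)          # mediator
Y = 0.7 * A + 0.8 * M + 0.5 * C + rng.normal(size=n)

def ols(cols, y):
    Z = np.column_stack([np.ones(len(y))] + cols)
    return np.linalg.lstsq(Z, y, rcond=None)[0]

aM = ols([A, C], M)        # working model for E[M | A, C]
aY = ols([A, M, C], Y)     # working model for E[Y | A, M, C]

def mean_cf(a, a_prime):
    """Plug-in E[Y(a, M(a'))], averaging over the empirical law of C."""
    m = aM[0] + aM[1] * a_prime + aM[2] * C
    return np.mean(aY[0] + aY[1] * a + aY[2] * m + aY[3] * C)

NDE = mean_cf(1, 0) - mean_cf(0, 0)    # natural direct effect
NIE = mean_cf(1, 1) - mean_cf(1, 0)    # natural indirect effect
print(f"NDE ~ {NDE:.2f} (truth 0.7), NIE ~ {NIE:.2f} (truth 0.8)")
```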

Relevance: 10.00%

Abstract:

In recent years, researchers in the health and social sciences have become increasingly interested in mediation analysis. Specifically, upon establishing a non-null total effect of an exposure, investigators routinely wish to make inferences about the direct (indirect) pathway of the effect of the exposure not through (through) a mediator variable that occurs subsequent to the exposure and prior to the outcome. Natural direct and indirect effects are of particular interest because they generally combine to produce the total effect of the exposure and therefore provide insight into the mechanism by which it operates to produce the outcome. A semiparametric theory has recently been proposed to make inferences about marginal mean natural direct and indirect effects in observational studies (Tchetgen Tchetgen and Shpitser, 2011), which delivers multiply robust locally efficient estimators of the marginal direct and indirect effects, and thus generalizes previous results for total effects to the mediation setting. In this paper we extend the new theory to handle a setting in which a parametric model for the natural direct (indirect) effect within levels of pre-exposure variables is specified and the model for the observed data likelihood is otherwise unrestricted. We show that estimation is generally not feasible in this model because of the curse of dimensionality associated with the required estimation of auxiliary conditional densities or expectations given high-dimensional covariates. We thus consider multiply robust estimation and propose a more general model that assumes only that some, but not necessarily all, of several working models hold.
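
One of the working-model routes that multiply robust estimators combine is weighting: the counterfactual mean E[Y(1, M(0))] can be estimated by reweighting exposed subjects with an exposure-model factor and a mediator density ratio. A hedged sketch under an invented data-generating law, with the working models taken as known for brevity (in practice they would be estimated):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(4)
n = 20000
C = rng.normal(size=n)
A = rng.binomial(1, 1 / (1 + np.exp(-0.5 * C)))
M = A + 0.5 * C + rng.normal(size=n)
Y = 0.7 * A + 0.8 * M + 0.5 * C + rng.normal(size=n)

# Working models, taken as known here for brevity.
pA = 1 / (1 + np.exp(-0.5 * C))                 # P(A = 1 | C)
f_M0 = norm.pdf(M, loc=0.5 * C, scale=1.0)      # f(M | A = 0, C)
f_M1 = norm.pdf(M, loc=1 + 0.5 * C, scale=1.0)  # f(M | A = 1, C)

# Weight exposed subjects so the mediator follows its A = 0 law.
w = (A / pA) * (f_M0 / f_M1)
psi = np.mean(w * Y)       # estimates E[Y(1, M(0))]
# Under this law, E[Y(1, M(0))] = 0.7 + 0.8 * E[M(0)] = 0.7.
print(f"IPW estimate of E[Y(1, M(0))]: {psi:.2f} (truth 0.7)")
```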

Relevance: 10.00%

Abstract:

Suppose that, having established a marginal total effect of a point exposure on a time-to-event outcome, an investigator wishes to decompose this effect into its direct and indirect pathways, also known as natural direct and indirect effects, mediated by a variable known to occur after the exposure and prior to the outcome. This paper proposes a theory of estimation of natural direct and indirect effects in two important semiparametric models for a failure time outcome. The underlying survival model for the marginal total effect, and thus for the direct and indirect effects, can be either a marginal structural Cox proportional hazards model or a marginal structural additive hazards model. The proposed theory delivers new estimators for mediation analysis in each of these models, with appealing robustness properties. Specifically, in order to guarantee ignorability with respect to the exposure and mediator variables, the approach, which is multiply robust, allows the investigator to use several flexible working models to adjust for confounding by a large number of pre-exposure variables. Multiple robustness is appealing because consistency requires only that a subset of the working models be correct; furthermore, the analyst need not know which subset of working models is in fact correct to report valid inferences. Finally, a novel semiparametric sensitivity analysis technique is developed for each of these models to assess the impact on inference of a violation of the assumption of ignorability of the mediator.
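
As a rough illustration of the marginal structural Cox route, the sketch below builds mediator density-ratio weights and fits a weighted Cox regression of the failure time on the exposure alone, using the lifelines package. This is a crude weighting-based stand-in for a direct-effect contrast, not the paper's multiply robust estimators; the simulated data and working models are invented and taken as known.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from scipy.stats import norm

rng = np.random.default_rng(5)
n = 4000
C = rng.normal(size=n)
A = rng.binomial(1, 1 / (1 + np.exp(-0.5 * C)))
M = A + 0.5 * C + rng.normal(size=n)
T = rng.exponential(1 / np.exp(0.5 * A + 0.4 * M + 0.3 * C))  # event times
E = np.ones(n, dtype=int)                     # no censoring, for brevity

pA = 1 / (1 + np.exp(-0.5 * C))               # exposure model, known here
f_M0 = norm.pdf(M, loc=0.5 * C, scale=1)      # f(M | A = 0, C)
f_MA = norm.pdf(M, loc=A + 0.5 * C, scale=1)  # f(M | A, C)
# The IPTW factor removes confounding; the density ratio sets the
# mediator to its A = 0 law, so the coefficient of A targets a
# direct-effect contrast on the hazard scale.
w = (A / pA + (1 - A) / (1 - pA)) * f_M0 / f_MA

df = pd.DataFrame({"T": T, "E": E, "A": A})
cph = CoxPHFitter()
cph.fit(df.assign(w=w), duration_col="T", event_col="E",
        weights_col="w", robust=True)
print(cph.params_)    # log hazard ratio for A under the weighted model
```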

Relevance: 10.00%

Abstract:

Geostatistics involves the fitting of spatially continuous models to spatially discrete data (Chilès and Delfiner, 1999). Preferential sampling arises when the process that determines the data locations and the process being modelled are stochastically dependent. Conventional geostatistical methods assume, if only implicitly, that sampling is non-preferential. However, these methods are often used in situations where sampling is likely to be preferential. For example, in mineral exploration, samples may be concentrated in areas thought likely to yield high-grade ore. We give a general expression for the likelihood function of preferentially sampled geostatistical data and describe how this can be evaluated approximately using Monte Carlo methods. We present a model for preferential sampling, and demonstrate through simulated examples that ignoring preferential sampling can lead to seriously misleading inferences. We describe an application of the model to a set of bio-monitoring data from Galicia, northern Spain, in which making allowance for preferential sampling materially changes the inferences.
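
The core warning is easy to reproduce in simulation: if sample locations depend on the underlying field, naive summaries of the sampled values are biased. A small one-dimensional sketch, with all settings invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(6)
grid = np.linspace(0, 1, 400)
# One-dimensional Gaussian process with exponential covariance,
# simulated via a Cholesky factor of the covariance matrix.
cov = np.exp(-np.abs(grid[:, None] - grid[None, :]) / 0.1)
L = np.linalg.cholesky(cov + 1e-8 * np.eye(grid.size))
S = L @ rng.normal(size=grid.size)

# Preferential design: sample where the field is high (beta > 0),
# e.g. drilling where high-grade ore is expected.
beta = 2.0
p = np.exp(beta * S)
idx = rng.choice(grid.size, size=50, replace=False, p=p / p.sum())
Y = S[idx] + rng.normal(0, 0.1, idx.size)   # noisy measurements

print(f"true mean of the field:         {S.mean():+.3f}")
print(f"naive preferential-sample mean: {Y.mean():+.3f}")
```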

Relevance: 10.00%

Abstract:

Permutation tests are useful for drawing inferences from imaging data because of their flexibility and their ability to capture features of the brain that are difficult to capture parametrically. However, most implementations of permutation tests ignore important confounding covariates. To employ covariate control in a nonparametric setting, we have developed a Markov chain Monte Carlo (MCMC) algorithm for conditional permutation testing using propensity scores. We present the first use of this methodology for imaging data. Our MCMC algorithm is an extension of algorithms developed to approximate exact conditional probabilities in contingency tables, logit, and log-linear models. An application of our nonparametric method to remove potential bias due to observed covariates is presented.
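
A common finite approximation to conditional permutation testing is sketched below on simulated data: estimate propensity scores, then permute treatment labels only within propensity-score strata, so the permutation distribution respects the observed covariates. The paper's MCMC algorithm performs exact conditioning; this quintile-stratified shuffle is only a rough stand-in, and all data and settings are invented.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)
n = 600
X = rng.normal(size=(n, 2))                 # confounding covariates
A = rng.binomial(1, 1 / (1 + np.exp(-X.sum(axis=1))))   # "treatment"
Y = 0.4 * A + X @ np.array([0.8, 0.8]) + rng.normal(size=n)

# Propensity scores and quintile strata.
ps = LogisticRegression().fit(X, A).predict_proba(X)[:, 1]
strata = np.digitize(ps, np.quantile(ps, [0.2, 0.4, 0.6, 0.8]))

obs = Y[A == 1].mean() - Y[A == 0].mean()
null = np.empty(2000)
for b in range(2000):
    Ab = A.copy()
    for g in np.unique(strata):             # shuffle labels within stratum
        m = strata == g
        Ab[m] = rng.permutation(A[m])
    null[b] = Y[Ab == 1].mean() - Y[Ab == 0].mean()

pval = np.mean(np.abs(null) >= np.abs(obs))
print(f"observed difference {obs:.3f}, conditional permutation p = {pval:.3f}")
```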