829 results for Social hypothesis testing
Abstract:
To estimate causal relationships, time series econometricians must be aware of spurious correlation, a problem first mentioned by Yule (1926). To deal with this problem, one can work either with differenced series or with multivariate models: VAR (VEC or VECM) models. These models usually include at least one cointegration relation. Although the Bayesian literature on VAR/VEC is quite advanced, Bauwens et al. (1999) highlighted that "the topic of selecting the cointegrating rank has not yet given very useful and convincing results". The present article applies the Full Bayesian Significance Test (FBST), especially designed to deal with sharp hypotheses, to cointegration rank selection tests in VECM time series models. It shows the FBST implementation using both simulated data sets and data sets available in the literature. As an illustration, standard noninformative priors are used.
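The FBST evaluates a sharp (zero-measure) hypothesis through the posterior probability of its tangential set. A minimal sketch of the e-value computation for a toy sharp hypothesis (H0: theta = 0 for a normal mean under a flat prior), not the VECM cointegration-rank test of the article, might look as follows:

```python
# Minimal FBST sketch, assuming a toy model: H0: theta = 0 for a normal mean
# with posterior N(xbar, s^2/n) under a flat (noninformative) prior.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
x = rng.normal(loc=0.3, scale=1.0, size=50)            # hypothetical data
n, xbar, s = len(x), x.mean(), x.std(ddof=1)
posterior = stats.norm(loc=xbar, scale=s / np.sqrt(n))

# The supremum of the posterior density over the null set {theta = 0} is the
# density at 0; the tangential set T collects every theta with higher density.
f0 = posterior.pdf(0.0)
draws = posterior.rvs(size=100_000, random_state=rng)
ev = 1.0 - np.mean(posterior.pdf(draws) > f0)           # e-value = 1 - P(T | data)
print(f"FBST e-value for H0: theta = 0 -> {ev:.3f}")
```

Small e-values count as evidence against the sharp hypothesis; in the article the same construction is applied to the rank restrictions of the VECM posterior.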
Abstract:
In this paper, we present approximate distributions for the ratio of the cumulative wavelet periodograms, considering stationary and non-stationary time series generated from independent Gaussian processes. We also adapt an existing procedure to use this statistic and its approximate distribution to test whether two regularly or irregularly spaced time series are realizations of the same generating process. Simulation studies show good size and power properties for the test statistic. An application with financial microdata illustrates the usefulness of the test. We conclude by advocating the use of these approximate distributions instead of those obtained through randomizations, mainly in the case of irregular time series.
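As a rough illustration of the kind of quantity involved, the sketch below computes cumulative wavelet periodograms (cumulative squared detail coefficients of a discrete wavelet transform) for two series and a crude ratio-based discrepancy. The statistic's exact form and its approximate reference distribution from the paper are not reproduced; this is only a hypothetical stand-in using PyWavelets.

```python
# Hypothetical sketch only: cumulative wavelet periodograms via a Haar DWT and a
# crude ratio-based discrepancy; not the paper's statistic or reference distribution.
import numpy as np
import pywt

def cumulative_wavelet_periodogram(x, wavelet="haar"):
    coeffs = pywt.wavedec(np.asarray(x, dtype=float), wavelet)
    detail = np.concatenate(coeffs[1:])        # detail coefficients across levels
    return np.cumsum(detail ** 2)              # cumulative wavelet periodogram

def ratio_discrepancy(x, y):
    cx, cy = cumulative_wavelet_periodogram(x), cumulative_wavelet_periodogram(y)
    m = min(len(cx), len(cy))
    ratio = cx[:m] / cy[:m]
    # Deviation of the running ratio from the overall energy ratio.
    return np.max(np.abs(ratio - cx[m - 1] / cy[m - 1]))

rng = np.random.default_rng(0)
x, y = rng.normal(size=512), rng.normal(size=512)   # same generating process
print(ratio_discrepancy(x, y))
```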
Abstract:
In this paper the authors show that techniques employed in the prediction of chaotic time series can also be applied to the detection of outliers. A definition of outlier is provided and a theorem on hypothesis testing is also proved.
Abstract:
Gallup (this issue) believes that our recent review on the function of yawning (Guggisberg et al., 2010) is unbalanced and that it ignores evidence for his thermoregulation hypothesis. Here we address these criticisms and show them to be untenable. While we never claimed that the social hypothesis of yawning has "definite experimental support", we emphasize the importance of experimental evidence for specific effects of yawns when considering why we yawn. The only specific effect of yawning that could be demonstrated so far is its contagiousness in humans, some non-human primates, and possibly dogs, whereas all studies investigating physiological consequences of yawns were unable to observe specific yawn-induced effects in the individual of any species. The argument that from an evolutionary perspective, yawns must have a "primitive" physiological function arises from imprecise reasoning.
Abstract:
Model-based calibration of steady-state engine operation is commonly performed with highly parameterized empirical models that are accurate but not very robust, particularly when predicting highly nonlinear responses such as diesel smoke emissions. To address this problem, and to boost the accuracy of more robust non-parametric methods to the same level, GT-Power was used to transform the empirical model input space into multiple input spaces that simplified the input-output relationship and improved the accuracy and robustness of smoke predictions made by three commonly used empirical modeling methods: Multivariate Regression, Neural Networks and the k-Nearest Neighbor method. The availability of multiple input spaces allowed the development of two committee techniques: a "Simple Committee" technique that used averaged predictions from a set of 10 pre-selected input spaces chosen on the basis of the training data, and a "Minimum Variance Committee" technique where the input spaces for each prediction were chosen on the basis of disagreement between the three modeling methods. This latter technique equalized the performance of the three modeling methods. The successively increasing improvements resulting from the use of a single best transformed input space (Best Combination Technique), the Simple Committee Technique and the Minimum Variance Committee Technique were verified with hypothesis testing. The transformed input spaces were also shown to improve outlier detection and to improve k-Nearest Neighbor performance when predicting dynamic emissions with steady-state training data. An unexpected finding was that the benefits of input space transformation were unaffected by changes in the hardware or the calibration of the underlying GT-Power model.
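The two committee ideas can be illustrated with a short, hypothetical sketch (generic scikit-learn stand-ins for the three modeling methods, not the GT-Power pipeline): the Simple Committee averages predictions over a fixed set of candidate input spaces, while the Minimum Variance Committee keeps, for each prediction point, the input space on which the three methods disagree least.

```python
# Hypothetical sketch of the two committee techniques; models and input spaces
# here are generic stand-ins, not the paper's GT-Power-derived transformations.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.neighbors import KNeighborsRegressor

def committee_predictions(input_spaces_train, y_train, input_spaces_test):
    """Predictions of the three methods for every candidate input space."""
    preds = []   # final shape: (n_spaces, 3 methods, n_test)
    for Xtr, Xte in zip(input_spaces_train, input_spaces_test):
        models = [LinearRegression(),
                  MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0),
                  KNeighborsRegressor(n_neighbors=3)]
        preds.append([m.fit(Xtr, y_train).predict(Xte) for m in models])
    return np.array(preds)

def simple_committee(preds):
    # Average over the pre-selected input spaces and the three methods.
    return preds.mean(axis=(0, 1))

def min_variance_committee(preds):
    # For each test point, pick the input space where the three methods agree most,
    # then average the three methods on that space.
    disagreement = preds.var(axis=1)             # (n_spaces, n_test)
    best_space = disagreement.argmin(axis=0)     # per-point choice of input space
    n_test = preds.shape[2]
    return preds[best_space, :, np.arange(n_test)].mean(axis=1)
```

Both committees return one prediction per test point; in the paper the candidate input spaces come from the GT-Power transformations described above.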
Abstract:
Estimation of breastmilk infectivity in HIV-1 infected mothers is difficult because transmission can occur while the fetus is in utero, during delivery, or through breastfeeding. Because transmission can only be detected through periodic testing, however, it may be impossible to determine the actual mode of transmission in any individual child. In this paper we develop a model to estimate breastmilk infectivity as well as the probabilities of in-utero and intrapartum transmission. In addition, the model allows separate estimation of early and late breastmilk infectivity and of individual variation in maternal infectivity. Methods for hypothesis testing of binary risk factors and a method for assessing goodness of fit are also described. Data from a randomized trial of breastfeeding versus formula feeding among HIV-1 infected mothers in Nairobi, Kenya are used to illustrate the methods.
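A heavily simplified, hypothetical sketch of the kind of likelihood such a model builds on is shown below: a single combined perinatal transmission probability plus a constant breastmilk hazard, fit to the infection status observed at each child's test time. The actual model separates in-utero from intrapartum transmission, allows early/late infectivity and maternal heterogeneity, and handles repeated interval-censored testing, none of which is reproduced here.

```python
# Hypothetical simplified sketch, not the authors' model: perinatal transmission
# with probability p_peri, plus breastmilk exposure acting as a constant hazard lam.
import numpy as np
from scipy.optimize import minimize

def p_infected_by(t, p_peri, lam):
    # 1 - P(escape perinatal transmission) * P(escape breastmilk exposure up to time t)
    return 1.0 - (1.0 - p_peri) * np.exp(-lam * t)

def neg_log_lik(params, t_test, infected):
    p_peri, lam = params
    p = np.clip(p_infected_by(t_test, p_peri, lam), 1e-12, 1 - 1e-12)
    return -np.sum(infected * np.log(p) + (1 - infected) * np.log(1 - p))

# Hypothetical data: months of breastfeeding before a single HIV test, and test result.
t_test = np.array([1, 3, 6, 6, 12, 12, 18, 24], dtype=float)
infected = np.array([1, 0, 0, 1, 0, 1, 0, 1])
fit = minimize(neg_log_lik, x0=[0.05, 0.02], args=(t_test, infected),
               bounds=[(1e-6, 0.9), (1e-6, 1.0)])
print("estimated (p_peri, lam):", fit.x)
```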
Abstract:
Bioequivalence trials are abbreviated clinical trials in which a generic drug or new formulation is evaluated to determine whether it is "equivalent" to a corresponding previously approved brand-name drug or formulation. In this manuscript, we survey the process of testing bioequivalence and advocate the likelihood paradigm for representing the resulting data as evidence. We emphasize the unique conflicts between hypothesis testing and confidence intervals in this area - which we believe are indicative of systemic defects in the frequentist approach - that the likelihood paradigm avoids. We suggest the direct use of profile likelihoods for evaluating bioequivalence and examine the main properties of profile likelihoods and estimated likelihoods under simulation. This simulation study shows that profile likelihoods are a reasonable alternative to the (unknown) true likelihood for a range of parameters commensurate with bioequivalence research. Our study also shows that the standard methods in the current practice of bioequivalence trials offer only weak evidence from the evidential point of view.
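As a hedged illustration of what a profile likelihood looks like in this setting, the sketch below uses hypothetical data, a two-sample normal model on log(AUC) with the common variance profiled out, and the usual 0.80-1.25 equivalence window; it is not the manuscript's own analysis.

```python
# Hypothetical sketch: profile likelihood of delta = mu_T - mu_R for log(AUC),
# two independent normal samples with a common variance profiled out analytically.
import numpy as np

rng = np.random.default_rng(2)
log_auc_test = rng.normal(4.00, 0.20, size=24)    # hypothetical test formulation
log_auc_ref  = rng.normal(4.05, 0.20, size=24)    # hypothetical reference formulation
n_t, n_r = len(log_auc_test), len(log_auc_ref)

def profile_loglik(delta):
    # For fixed delta, the MLE of mu_R pools both samples after shifting the test
    # sample by delta; sigma^2 is then profiled out in closed form.
    mu_r = (log_auc_ref.sum() + (log_auc_test - delta).sum()) / (n_t + n_r)
    rss = ((log_auc_ref - mu_r) ** 2).sum() + ((log_auc_test - delta - mu_r) ** 2).sum()
    n = n_t + n_r
    return -0.5 * n * np.log(rss / n)             # up to an additive constant

deltas = np.linspace(-0.4, 0.4, 401)
rel_lik = np.exp([profile_loglik(d) for d in deltas])
rel_lik /= rel_lik.max()                          # relative profile likelihood

# How much relative likelihood falls inside the common 0.80-1.25 equivalence window?
inside = (deltas >= np.log(0.8)) & (deltas <= np.log(1.25))
print("min relative likelihood inside window:", rel_lik[inside].min())
```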
Abstract:
Constructing a 3D surface model from sparse-point data is a nontrivial task. Here, we report an accurate and robust approach for reconstructing a surface model of the proximal femur from sparse-point data and a dense-point distribution model (DPDM). The problem is formulated as a three-stage optimal estimation process. The first stage, affine registration, iteratively estimates a scale and a rigid transformation between the mean surface model of the DPDM and the sparse input points. The estimation results of the first stage are used to establish point correspondences for the second stage, statistical instantiation, which stably instantiates a surface model from the DPDM using a statistical approach. This surface model is then fed to the third stage, kernel-based deformation, which further refines the surface model. Handling outliers is achieved by consistently employing the least trimmed squares (LTS) approach with a roughly estimated outlier rate in all three stages. If an optimal value of the outlier rate is preferred, we propose a hypothesis testing procedure to estimate it automatically. We present here our validations using four experiments: (1) a leave-one-out experiment, (2) an experiment evaluating the present approach for handling pathology, (3) an experiment evaluating the present approach for handling outliers, and (4) an experiment reconstructing surface models of seven dry cadaver femurs using clinically relevant data without noise and with noise added. Our validation results demonstrate the robust performance of the present approach in handling outliers, pathology, and noise. An average 95-percentile error of 1.7-2.3 mm was found when the present approach was used to reconstruct surface models of the cadaver femurs from sparse-point data with noise added.
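The LTS idea used in all three stages can be sketched generically: at every iteration only the fraction of correspondences with the smallest residuals drives the update. The toy example below (a plain rigid 2-D point registration with a fixed trimming rate, hypothetical and not the DPDM pipeline) shows the pattern.

```python
# Hypothetical sketch of least trimmed squares (LTS) inside an iterative fit:
# a rigid 2-D registration where only the (1 - outlier_rate) fraction of points
# with the smallest residuals is used at each iteration.
import numpy as np

def lts_rigid_registration(src, dst, outlier_rate=0.2, n_iter=20):
    keep = int(round((1.0 - outlier_rate) * len(src)))
    R, t = np.eye(2), np.zeros(2)
    for _ in range(n_iter):
        resid = np.linalg.norm(src @ R.T + t - dst, axis=1)
        idx = np.argsort(resid)[:keep]                   # trimmed subset
        # Procrustes solution on the kept correspondences (reflection check omitted).
        s_c = src[idx] - src[idx].mean(axis=0)
        d_c = dst[idx] - dst[idx].mean(axis=0)
        U, _, Vt = np.linalg.svd(s_c.T @ d_c)
        R = (U @ Vt).T
        t = dst[idx].mean(axis=0) - src[idx].mean(axis=0) @ R.T
    return R, t

# Hypothetical usage: points rotated by 30 degrees, with a few gross outliers.
rng = np.random.default_rng(4)
src = rng.normal(size=(60, 2))
theta = np.deg2rad(30.0)
R_true = np.array([[np.cos(theta), -np.sin(theta)], [np.sin(theta), np.cos(theta)]])
dst = src @ R_true.T + np.array([1.0, -0.5])
dst[:10] += rng.normal(scale=5.0, size=(10, 2))          # contaminated correspondences
print(lts_rigid_registration(src, dst))
```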
Abstract:
The aim of this contribution is to analyse the use of empirical tests in German-language sport psychology. The results of comparable analyses, for example in psychology, show that differences exist between the requirements of test concepts and empirical reality, differences that have not yet been described and evaluated for sport psychology. The 1994-2007 volumes of the Zeitschrift für Sportpsychologie (formerly psychologie und sport) were examined for whether research questions were formulated, which type of sample was chosen, which test concept was used, which significance level was applied, and whether statistical problems were discussed. 83 articles were categorized according to these aspects by two independent raters. The main finding is that sport psychology research predominantly applies a mixture of Fisher's significance testing and Neyman-Pearson hypothesis testing, the so-called "hybrid model" or "null ritual". Reporting of statistical power is rarely observed. A temporal analysis of the articles shows that the use of effect sizes, in particular, has increased in recent years. Finally, approaches for improving and standardizing the application of empirical tests are proposed and discussed.
Abstract:
Drought perturbation driven by the El Niño Southern Oscillation (ENSO) is a principal stochastic variable determining the dynamics of lowland rain forest in S.E. Asia. Mortality, recruitment and stem growth rates at Danum in Sabah (Malaysian Borneo) were recorded in two 4-ha plots (trees ≥ 10 cm gbh) for two periods, 1986–1996 and 1996–2001. Mortality and growth were also recorded in a sample of subplots for small trees (10 to <50 cm gbh) in two sub-periods, 1996–1999 and 1999–2001. Dynamics variables were employed to build indices of drought response for each of the 34 most abundant plot-level species (22 at the subplot level), these being interval-weighted percentage changes between periods and sub-periods. A significant yet complex effect of the strong 1997/1998 drought at the forest community level was shown by randomization procedures followed by multiple hypothesis testing. Despite a general resistance of the forest to drought, large and significant differences in short-term responses were apparent for several species. Using a diagrammatic form of stability analysis, different species showed immediate or lagged effects, high or low degrees of resilience or even oscillatory dynamics. In the context of the local topographic gradient, species’ responses define the newly termed perturbation response niche. The largest responses, particularly for recruitment and growth, were among the small trees, many of which are members of understorey taxa. The results bring with them a novel approach to understanding community dynamics: the kaleidoscopic complexity of idiosyncratic responses to stochastic perturbations suggests that plurality, rather than neutrality, of responses may be essential to understanding these tropical forests. The basis to the various responses lies with the mechanisms of tree-soil water relations which are physiologically predictable: the timing and intensity of the next drought, however, is not. To date, environmental stochasticity has been insufficiently incorporated into models of tropical forest dynamics, a step that might considerably improve the reality of theories about these globally important ecosystems.
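The statistical backbone mentioned above (randomization procedures followed by multiple hypothesis testing across many species) can be sketched generically; the code below uses hypothetical per-species response indices, a permutation test, and a Holm correction, and is not the plot-level analysis of the paper.

```python
# Hypothetical sketch: a permutation test per species on a drought-response index,
# followed by a Holm correction across species; not the paper's exact procedure.
import numpy as np
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(3)

def permutation_pvalue(before, after, n_perm=5000):
    obs = after.mean() - before.mean()
    pooled = np.concatenate([before, after])
    n = len(before)
    count = 0
    for _ in range(n_perm):
        perm = rng.permutation(pooled)
        if abs(perm[n:].mean() - perm[:n].mean()) >= abs(obs):
            count += 1
    return (count + 1) / (n_perm + 1)

# Hypothetical per-species growth indices before and after the 1997/1998 drought.
pvals = [permutation_pvalue(rng.normal(0.0, 1.0, 30), rng.normal(0.4, 1.0, 30))
         for _ in range(34)]
reject, p_adj, _, _ = multipletests(pvals, alpha=0.05, method="holm")
print(int(reject.sum()), "of", len(pvals), "species flagged after Holm correction")
```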
Abstract:
OBJECTIVE: In young, first-episode, never-treated schizophrenics compared with controls, (a) generally shorter durations of EEG microstates were reported (Koukkou et al., Brain Topogr 6 (1994) 251; Kinoshita et al., Psychiatry Res Neuroimaging 83 (1998) 58), and (b) specifically, shorter duration of a particular class of microstates (Koenig et al., Eur Arch Psychiatry Clin Neurosci 249 (1999) 205). We now examined whether older, chronic schizophrenic patients with positive symptomatology also show these characteristics. METHODS: Multichannel resting EEG (62.2 s/subject) from two subject groups, 14 patients (36.1+/-10.2 years old) and 13 controls (35.1+/-8.2 years old), all males, was analyzed into microstates using a global approach for microstate analysis that clustered the microstates into 4 classes (Koenig et al., 1999). RESULTS: (a) Hypothesis testing of general microstate shortening supported a trend (P=0.064). (b) A two-way repeated measures ANOVA (two subject groups x 4 microstate classes) showed a significant group effect for microstate duration. Post hoc tests revealed that a microstate class with brain electric field orientation from left central to right central-posterior had significantly shorter microstates in patients than in controls (68.5 vs. 76.1 ms, P=0.034). CONCLUSIONS: The results were in line with the results from young, never-treated, productive patients, thus suggesting that in schizophrenic information processing, one class of mental operations might intermittently cause deviant mental constructs because of premature termination of processing.
Abstract:
Despite major advances in the study of glioma, the quantitative links between intra-tumor molecular/cellular properties, clinically observable properties such as morphology, and critical tumor behaviors such as growth and invasiveness remain unclear, hampering more effective coupling of tumor physical characteristics with implications for prognosis and therapy. Although molecular biology, histopathology, and radiological imaging are employed in this endeavor, studies are severely challenged by the multitude of different physical scales involved in tumor growth, i.e., from molecular nanoscale to cell microscale and finally to tissue centimeter scale. Consequently, it is often difficult to determine the underlying dynamics across dimensions. New techniques are needed to tackle these issues. Here, we address this multi-scalar problem by employing a novel predictive three-dimensional mathematical and computational model based on first-principle equations (conservation laws of physics) that describe mathematically the diffusion of cell substrates and other processes determining tumor mass growth and invasion. The model uses conserved variables to represent known determinants of glioma behavior, e.g., cell density and oxygen concentration, as well as biological functional relationships and parameters linking phenomena at different scales whose specific forms and values are hypothesized and calculated based on in vitro and in vivo experiments and from histopathology of tissue specimens from human gliomas. This model enables correlation of glioma morphology to tumor growth by quantifying interdependence of tumor mass on the microenvironment (e.g., hypoxia, tissue disruption) and on the cellular phenotypes (e.g., mitosis and apoptosis rates, cell adhesion strength). Once functional relationships between variables and associated parameter values have been informed, e.g., from histopathology or intra-operative analysis, this model can be used for disease diagnosis/prognosis, hypothesis testing, and to guide surgery and therapy. In particular, this tool identifies and quantifies the effects of vascularization and other cell-scale glioma morphological characteristics as predictors of tumor-scale growth and invasion.
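The abstract refers to first-principle conservation-law equations for cell density and diffusing substrates. A representative (hypothetical) pair of such reaction-diffusion equations, with mitosis and apoptosis rates driven by the local substrate level, might read as follows; the paper's actual terms, couplings, and parameters may differ:

```latex
% Hypothetical, representative form only; not the paper's exact model.
\frac{\partial n}{\partial t}
    = \nabla \cdot \big( D_n \nabla n \big)
      + \big[ \lambda_m(\sigma) - \lambda_a(\sigma) \big]\, n ,
\qquad
\frac{\partial \sigma}{\partial t}
    = \nabla \cdot \big( D_\sigma \nabla \sigma \big)
      - \lambda_u\, n\, \sigma + S_\sigma
```

Here n denotes tumor cell density, \sigma the oxygen/substrate concentration, D_n and D_\sigma diffusivities, \lambda_m and \lambda_a substrate-dependent mitosis and apoptosis rates, \lambda_u a cellular uptake rate, and S_\sigma a vascular source term.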
Abstract:
A workshop providing an introduction to Bayesian data analysis and hypothesis testing using R, JAGS and the BayesFactor package.