89 results for Statistical hypothesis testing

at Université de Lausanne, Switzerland


Relevance: 100.00%

Abstract:

Many people regard the concept of hypothesis testing as fundamental to inferential statistics. Various schools of thought, in particular frequentist and Bayesian, have promoted radically different solutions for deciding between competing hypotheses. Comprehensive philosophical comparisons of their advantages and drawbacks are widely available and continue to fuel extensive debate in the literature. More recently, a controversial discussion was initiated by the editorial decision of a scientific journal [1] to refuse any submitted paper containing null hypothesis testing procedures. Since the large majority of papers published in forensic journals evaluate statistical evidence based on so-called p-values, it is of interest to bring the discussion of this journal's decision to the forensic science community. This paper aims to provide forensic science researchers with a primer on the main concepts and their implications for making informed methodological choices.

Relevance: 100.00%

Abstract:

In the 1920s, Ronald Fisher developed the theory behind the p value, and Jerzy Neyman and Egon Pearson developed the theory of hypothesis testing. These distinct theories have provided researchers with important quantitative tools to confirm or refute their hypotheses. The p value is the probability of obtaining an effect equal to or more extreme than the one observed, assuming that the null hypothesis of no effect is true; it gives researchers a measure of the strength of evidence against the null hypothesis. As commonly used, investigators select a threshold p value below which they reject the null hypothesis. The theory of hypothesis testing allows researchers to reject a null hypothesis in favor of an alternative hypothesis of some effect. As commonly used, investigators choose Type I error (rejecting the null hypothesis when it is true) and Type II error (accepting the null hypothesis when it is false) levels and determine a critical region. If the test statistic falls into that critical region, the null hypothesis is rejected in favor of the alternative hypothesis. Despite similarities between the two, the p value and the theory of hypothesis testing are different theories that are often misunderstood and confused, leading researchers to improper conclusions. Perhaps the most common misconception is to interpret the p value as the probability that the null hypothesis is true, rather than the probability of obtaining the difference observed, or one more extreme, given that the null is true. Another concern is the risk that a substantial proportion of statistically significant results are falsely significant. Researchers should have a minimum understanding of these two theories so that they are better able to plan, conduct, interpret, and report scientific experiments.
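The two frameworks described above can be made concrete in a few lines of code. The sketch below (an illustration, not taken from the paper) assumes a two-sided z-test with a known population standard deviation: the Fisherian p-value comes from the standard normal tail, and the Neyman-Pearson decision reduces to checking whether the statistic falls in the critical region implied by a chosen Type I error level alpha.

```python
import math

def z_test_p_value(sample_mean, mu0, sigma, n):
    """Two-sided p-value for a z-test of H0: mean == mu0,
    assuming the population standard deviation sigma is known."""
    z = (sample_mean - mu0) / (sigma / math.sqrt(n))
    # P(|Z| >= |z|) under the standard normal null distribution
    return math.erfc(abs(z) / math.sqrt(2))

def reject_h0(sample_mean, mu0, sigma, n, alpha=0.05):
    """Neyman-Pearson style decision: reject H0 when the statistic
    falls in the critical region defined by the level alpha."""
    return z_test_p_value(sample_mean, mu0, sigma, n) < alpha
```

With alpha = 0.05 the critical region corresponds to |z| >= 1.96, so a sample mean of 0.5 from 25 observations with sigma = 1 (z = 2.5) is rejected, while a mean of 0.1 (z = 0.5) is not.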

Relevance: 90.00%

Abstract:

Forest fire sequences can be modelled as a stochastic point process in which events are characterized by their spatial locations and occurrence times. Cluster analysis permits detection of the space-time distribution pattern of forest fires. These analyses help fire managers identify risk areas, implement preventive measures and devise strategies for an efficient distribution of firefighting resources. This paper aims to identify hot spots in forest fire sequences by means of the space-time scan statistics permutation model (STSSP) and a geographical information system (GIS) for visualizing data and results. The scan statistical methodology uses a scanning window that moves across space and time, detecting local excesses of events in specific areas over a certain period of time. Finally, the statistical significance of each cluster is evaluated through Monte Carlo hypothesis testing. The case study comprises the forest fires registered by the Forest Service in Canton Ticino (Switzerland) from 1969 to 2008. This dataset consists of geo-referenced single events, including the locations of the ignition points and additional information. The data were aggregated into three sub-periods (reflecting important preventive legal dispositions) and two main ignition causes (lightning and anthropogenic). Results revealed that forest fire events in Ticino are mainly clustered in the southern region, where most of the population is settled. Our analysis uncovered local hot spots arising from extemporaneous arson activity. For the naturally caused (lightning) fires, two clusters were detected in the northern mountainous area.
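The Monte Carlo step of the scan methodology admits a simple generic sketch. The function below is illustrative only (the actual STSSP permutes event timestamps and rescans the study region for each replicate): it ranks the observed cluster statistic among statistics recomputed on randomly permuted data, with the usual +1 correction so the p-value is never exactly zero.

```python
import random

def monte_carlo_p_value(observed_stat, null_stat, n_sim=999, seed=1):
    """Monte Carlo p-value: rank the observed cluster statistic among
    statistics recomputed on label-shuffled replicates of the data.
    `null_stat` is a callable returning one statistic per replicate,
    given a random number generator."""
    rng = random.Random(seed)
    exceed = sum(1 for _ in range(n_sim) if null_stat(rng) >= observed_stat)
    # +1 in numerator and denominator: valid p-value, never exactly zero
    return (exceed + 1) / (n_sim + 1)
```

With the default 999 replicates, an observed statistic larger than every replicate yields the smallest attainable p-value, 1/1000 = 0.001.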

Relevance: 80.00%

Abstract:

BACKGROUND: The majority of Haemosporida species infect birds or reptiles, but many important genera, including Plasmodium, infect mammals. Dipteran vectors shared by avian, reptilian and mammalian Haemosporida suggest multiple invasions of Mammalia during haemosporidian evolution; yet phylogenetic analyses have detected only a single invasion event. Until now, several important mammal-infecting genera have been absent from these analyses. This study focuses on the evolutionary origin of Polychromophilus, a unique malaria genus that only infects bats (Microchiroptera) and is transmitted by bat flies (Nycteribiidae). METHODS: Two species of Polychromophilus were obtained from wild bats caught in Switzerland. These were molecularly characterized using four genes (asl, clpc, coI, cytb) from the three different genomes (nucleus, apicoplast, mitochondrion). These data were then combined with data from 60 taxa of Haemosporida available in GenBank. Bayesian inference, maximum likelihood and a range of rooting methods were used to test specific hypotheses concerning the phylogenetic relationships between Polychromophilus and the other haemosporidian genera. RESULTS: The Polychromophilus melanipherus and Polychromophilus murinus samples show genetically distinct patterns and group according to species. The Bayesian tree topology suggests that the monophyletic clade of Polychromophilus falls within the avian/saurian clade of Plasmodium, and directed hypothesis testing confirms the Plasmodium origin. CONCLUSION: The ancestor of Polychromophilus was most likely a bird- or reptile-infecting Plasmodium before it switched to bats. The invasion of mammals as hosts has therefore not been a unique event in the evolutionary history of Haemosporida, despite the suspected costs of adapting to a new host. It was, moreover, accompanied by a switch in dipteran host.

Relevance: 80.00%

Abstract:

Emotion communication research focuses strongly on the face and voice as expressive modalities, leaving the rest of the body relatively understudied. Contrary to the early assumption that body movement only indicates emotional intensity, recent studies show that body movement and posture also convey emotion-specific information. However, a deeper understanding of the underlying mechanisms is hampered by a lack of production studies informed by a theoretical framework. In this research we adopted the Body Action and Posture (BAP) coding system to examine the types and patterns of body movement employed by 10 professional actors to portray a set of 12 emotions. We investigated to what extent these expression patterns support explicit or implicit predictions from basic emotion theory, bi-dimensional theory, and componential appraisal theory. The overall results showed partial support for the different theoretical approaches. They revealed that several patterns of body movement systematically occur in portrayals of specific emotions, allowing emotion differentiation. While a few emotions were prototypically encoded by one particular pattern, most emotions were variably expressed by multiple patterns, many of which can be explained as reflecting functional components of emotion such as modes of appraisal and action readiness. It is concluded that further work in this largely underdeveloped area should be guided by an appropriate theoretical framework to allow more systematic experimental designs and clear hypothesis testing.

Relevance: 80.00%

Abstract:

BACKGROUND: Hallux valgus is one of the most common forefoot problems in females. Studies have looked at gait alterations due to hallux valgus deformity by assessing temporal, kinematic or plantar pressure parameters individually. The present study, however, aims to assess all of these parameters at once and to isolate the most clinically relevant gait parameters for moderate to severe hallux valgus deformity, with the intent of improving post-operative patient prognosis and rehabilitation. METHODS: The study included 26 feet with moderate to severe hallux valgus deformity and 30 feet with no sign of hallux valgus in female participants. Initially, weight-bearing radiographs and foot and ankle clinical scores were assessed. Gait assessment was then performed using pressure insoles (PEDAR®) and inertial sensors (Physilog®), and the two groups were compared using a non-parametric statistical hypothesis test (Wilcoxon rank sum, P<0.05). Furthermore, forward stepwise regression was used to reduce the number of gait parameters to the most clinically relevant, and the correlation of these parameters with the clinical score was assessed. FINDINGS: Overall, the results showed clear deterioration of several gait parameters in the hallux valgus group compared to controls, and nine gait parameters (effect sizes between 1.03 and 1.76) were successfully isolated that best describe the altered gait in hallux valgus deformity (r² = 0.71) and showed good correlation with clinical scores. INTERPRETATION: Our results, and the nine listed parameters, could serve as a benchmark for characterization of hallux valgus and objective evaluation of treatment efficacy.
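The group comparison above relies on the Wilcoxon rank-sum test. A compact pure-Python sketch using the large-sample normal approximation might look like the following (average ranks are used for ties, but the variance is not tie-corrected; the study's exact implementation is not specified):

```python
import math
from itertools import chain

def rank_sum_test(x, y):
    """Wilcoxon rank-sum test via the normal approximation.
    Returns the rank sum W of sample x and a two-sided p-value."""
    pooled = sorted(chain(x, y))
    # assign average ranks to tied values
    ranks = {}
    i = 0
    while i < len(pooled):
        j = i
        while j < len(pooled) and pooled[j] == pooled[i]:
            j += 1
        ranks[pooled[i]] = (i + 1 + j) / 2  # mean of ranks i+1 .. j
        i = j
    w = sum(ranks[v] for v in x)
    n1, n2 = len(x), len(y)
    mean = n1 * (n1 + n2 + 1) / 2
    sd = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    z = (w - mean) / sd
    return w, math.erfc(abs(z) / math.sqrt(2))
```

For the fully separated samples [1, 2, 3] vs. [4, 5, 6] this yields W = 6 and p ≈ 0.05, the smallest p-value attainable at these sample sizes under this approximation.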

Relevance: 30.00%

Abstract:

The application of DNA-based markers to discriminating among alternate salmon runs has evolved alongside ongoing genomic developments, increasingly enabling resolution of which genetic markers associate with important life-history differences. Accurate and efficient identification of the most likely origin of salmon encountered during ocean fisheries, or at salvage from freshwater diversion and monitoring facilities, has far-reaching consequences for improving measures for management, restoration and conservation. Near-real-time provision of high-resolution identity information enables prompt response to changes in encounter rates. We therefore continue to develop new tools to provide the greatest statistical power for run identification. As a proof of concept for genetic identification improvements, we conducted simulation and blind tests on 623 known-origin Chinook salmon (Oncorhynchus tshawytscha) to compare and contrast the accuracy of different population sampling baselines and microsatellite loci panels. This test included 35 microsatellite loci (1266 alleles), some known to be associated with specific coding regions of functional significance, such as the circadian rhythm cryptochrome genes, and others not known to be associated with any functional importance. Identification of the fall run with unprecedented accuracy was demonstrated. Overall, the top-performing panel and baseline (HMSC21) were predicted to have a success rate of 98%, but the blind-test success rate was 84%. Findings regarding bias are discussed to target the primary areas for further research and resolution.
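Likelihood-based assignment of an individual to a baseline population is the core of such genetic stock identification. The sketch below is deliberately naive (it assumes Hardy-Weinberg proportions and independent loci, and handles alleles unseen in a baseline with an arbitrary small frequency floor; production panels treat these issues far more carefully): it assigns a multilocus genotype to the baseline population with the highest log-likelihood.

```python
import math

def assign_population(genotype, baselines):
    """Naive genetic stock identification: pick the baseline population
    maximizing the log-likelihood of a multilocus genotype, assuming
    Hardy-Weinberg proportions and independent loci.
    `genotype`: list of (allele1, allele2) pairs, one per locus.
    `baselines`: {population: [ {allele: frequency} per locus ]}."""
    best, best_ll = None, -math.inf
    for pop, freqs in baselines.items():
        ll = 0.0
        for (a, b), f in zip(genotype, freqs):
            pa, pb = f.get(a, 1e-6), f.get(b, 1e-6)  # floor for unseen alleles
            # heterozygote probability 2*pa*pb, homozygote pa^2
            ll += math.log(2 * pa * pb) if a != b else math.log(pa * pa)
        if ll > best_ll:
            best, best_ll = pop, ll
    return best
```

A homozygote for an allele common in one baseline and rare in another is assigned accordingly; with many loci the log-likelihood gap, and hence the assignment confidence, grows.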

Relevance: 30.00%

Abstract:

The interpretability and power of genome-wide association studies can be increased by imputing unobserved genotypes using a reference panel of individuals genotyped at higher marker density. For many markers, genotypes cannot be imputed with complete certainty, and this uncertainty needs to be taken into account when testing for association with a given phenotype. In this paper, we compare currently available methods for testing association between uncertain genotypes and quantitative traits. We show that some previously described methods offer poor control of the false-positive rate (FPR), and that satisfactory performance of these methods is obtained only by using ad hoc filtering rules or a harsh transformation of the trait under study. We propose new methods that are based on exact maximum likelihood estimation and use a mixture model to accommodate non-normal trait distributions when necessary. The new methods adequately control the FPR and also have equal or better power compared to all previously described methods. We provide a fast software implementation of all the methods studied here; our new method requires less than one computer-day for a typical genome-wide scan with 2.5 million single-nucleotide polymorphisms and 5000 individuals.
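The simplest of the strategies being compared replaces each uncertain genotype by its expected allele dosage and tests that dosage for association with the trait. The sketch below illustrates only that idea, not the authors' maximum-likelihood mixture method, and the normal approximation z = r·sqrt(n-1) is a crude large-sample shortcut:

```python
import math

def dosage_association(probs, trait):
    """Test association between a quantitative trait and an imputed SNP,
    replacing each uncertain genotype by its expected dosage
    E[g] = P(het) + 2*P(hom-alt). Returns (r, two-sided p) from a crude
    normal approximation to the correlation test.
    `probs`: list of (p_hom_ref, p_het, p_hom_alt) triples."""
    dosage = [p1 + 2 * p2 for (_, p1, p2) in probs]
    n = len(dosage)
    mx, my = sum(dosage) / n, sum(trait) / n
    sxy = sum((d - mx) * (t - my) for d, t in zip(dosage, trait))
    sxx = sum((d - mx) ** 2 for d in dosage)
    syy = sum((t - my) ** 2 for t in trait)
    r = sxy / math.sqrt(sxx * syy)
    z = r * math.sqrt(n - 1)  # large-sample null approximation r ~ N(0, 1/sqrt(n))
    return r, math.erfc(abs(z) / math.sqrt(2))
```

With certain genotypes the dosages reduce to 0, 1 and 2, and the test coincides with an ordinary correlation test; the uncertainty only shrinks the dosages toward their expectations.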

Relevance: 30.00%

Abstract:

Due to their performance-enhancing properties, the use of anabolic steroids (e.g. testosterone, nandrolone) is banned in elite sports. Therefore, doping control laboratories accredited by the World Anti-Doping Agency (WADA) screen urine for these prohibited substances, among others. It is particularly challenging to detect misuse of naturally occurring anabolic steroids such as testosterone (T), a popular ergogenic agent in sport and society. To screen for misuse of these compounds, drug testing laboratories monitor the urinary concentrations of endogenous steroid metabolites and their ratios, which constitute the steroid profile, and compare them with reference ranges to detect unnaturally high values. However, interpretation of the steroid profile is difficult due to large inter-individual variance, various confounding factors, and the different marketed endogenous steroids that influence the profile in different ways. A support vector machine (SVM) algorithm was developed to statistically evaluate urinary steroid profiles composed of an extended range of steroid profile metabolites. This model makes the interpretation of the analytical data in the quest for deviating steroid profiles feasible and shows its versatility towards different kinds of misused endogenous steroids. The SVM model outperforms the current biomarkers with respect to detection sensitivity and accuracy, particularly when coupled to individual data as stored in the Athlete Biological Passport.
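An SVM separates the two classes (normal vs. deviating profiles) with a maximum-margin hyperplane. As an illustration only (the laboratory's model, features and kernel are not described here), a linear SVM can be trained in a few lines with Pegasos-style stochastic sub-gradient descent on the regularized hinge loss:

```python
import random

def train_linear_svm(X, y, lam=0.01, epochs=200, seed=0):
    """Pegasos-style sub-gradient training of a linear SVM
    (hinge loss + L2 regularization). X: list of feature vectors,
    y: labels in {-1, +1}. Returns the weight vector w.
    Illustrative only: a real steroid-profile screen would use a tuned,
    possibly kernelized SVM over many metabolite concentrations/ratios."""
    rng = random.Random(seed)
    w = [0.0] * len(X[0])
    t = 0
    for _ in range(epochs):
        for i in rng.sample(range(len(X)), len(X)):
            t += 1
            eta = 1.0 / (lam * t)  # decreasing step size
            margin = y[i] * sum(wj * xj for wj, xj in zip(w, X[i]))
            w = [(1 - eta * lam) * wj for wj in w]  # regularization shrinkage
            if margin < 1:  # hinge loss is active: push toward the example
                w = [wj + eta * y[i] * xj for wj, xj in zip(w, X[i])]
    return w

def predict(w, x):
    """Classify by the sign of the decision function w . x."""
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) >= 0 else -1
```

On linearly separable toy data the trained weight vector recovers the separating direction; in practice the regularization constant and number of epochs would be tuned by cross-validation.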

Relevance: 30.00%

Abstract:

We study an adaptive statistical approach to analyzing brain networks represented by matrices of interregional connectivity (connectomes). Our approach sits at a middle level between a global analysis and an analysis of single connections, by considering subnetworks of the global brain network. These subnetworks represent either the inter-connectivity between two brain anatomical regions or the intra-connectivity within one such region. An appropriate summary statistic, characterizing a meaningful feature of the subnetwork, is evaluated. Based on this summary statistic, a statistical test is performed to derive the corresponding p-value. Reformulating the problem in this way reduces the number of statistical tests in an orderly fashion based on our understanding of the problem. For the global testing problem, the p-values are corrected to control the rate of false discoveries. Finally, the procedure is followed by a local investigation within the significant subnetworks. We contrast this strategy, in terms of power, with one based on the individual measures. We show that the strategy has great potential, particularly where the subnetworks are well defined and the summary statistics properly chosen. As an application example, we compare structural brain connection matrices of two groups of subjects with 22q11.2 deletion syndrome, distinguished by their IQ scores.
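Controlling the rate of false discoveries over the subnetwork p-values can be done with the Benjamini-Hochberg step-up procedure (the abstract does not name the specific correction used; BH is shown here as a standard choice):

```python
def benjamini_hochberg(p_values, q=0.05):
    """Benjamini-Hochberg step-up procedure: return the indices of the
    hypotheses (e.g. subnetworks) rejected while controlling the false
    discovery rate at level q."""
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])
    k = 0
    # find the largest rank whose sorted p-value is below its BH threshold
    for rank, i in enumerate(order, start=1):
        if p_values[i] <= q * rank / m:
            k = rank
    # reject the k hypotheses with the smallest p-values
    return sorted(order[:k])
```

For p-values [0.001, 0.02, 0.04, 0.2, 0.9] at q = 0.05 the thresholds are 0.01, 0.02, 0.03, 0.04, 0.05, so only the first two hypotheses are rejected; note 0.04 survives a per-test 0.05 cutoff but not the BH correction.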

Relevance: 30.00%

Abstract:

1. Harsh environmental conditions experienced during development can reduce the performance of the same individuals in adulthood. However, the 'predictive adaptive response' hypothesis postulates that if individuals adapt their phenotype during development to the environments where they are likely to live in the future, individuals exposed to harsh conditions in early life perform better when encountering the same harsh conditions in adulthood than those never exposed to these conditions before. 2. Using the common vole (Microtus arvalis) as the study organism, we tested how exposure to flea parasitism during the juvenile stage affects the physiology (haematocrit, resistance to oxidative stress, resting metabolism, spleen mass, and testosterone), morphology (body mass, testis mass) and motor performance (open field activity and swimming speed) of the same individuals when infested with fleas in adulthood. According to the 'predictive adaptive response' hypothesis, we predicted that voles parasitized as adults would perform better if they had already been parasitized with fleas as juveniles. 3. We found that voles exposed to fleas in adulthood had a higher metabolic rate if already exposed to fleas as juveniles, compared to voles free of fleas as juveniles and voles free of fleas in adulthood. Independently of juvenile parasitism, adult parasitism impaired adult haematocrit and motor performance. Independently of adult parasitism, juvenile parasitism slowed crawling speed in adult female voles. 4. Our results suggest that juvenile parasitism has long-term effects that do not protect against the detrimental effects of adult parasitism. On the contrary, experiencing parasitism in early life incurs additional costs upon adult parasitism, measured as higher energy expenditure, rather than inducing an adaptive shift in the developmental trajectory. 5. Hence, our study provides experimental evidence for long-term costs of parasitism. We found no support for a predictive adaptive response in this host-parasite system.

Relevance: 30.00%

Abstract:

Background: Bacteria form biofilms on the surface of orthopaedic devices, causing persistent infections. Monitoring biofilm formation on bone grafts and bone substitutes is challenging due to heterogeneous surface characteristics. We analyzed various bone grafts and bone substitutes regarding their propensity for in-vitro biofilm formation by S. aureus and S. epidermidis. Methods: Beta-tricalcium phosphate (b-TCP, ChronOs™), processed human spongiosa (Tutoplast™) and PMMA (Palacos™) were investigated. PE was added as a growth control. S. aureus (ATCC 29213) and S. epidermidis RP62A (ATCC 35984) were used as test strains. Test materials were incubated with 10^5 cfu/ml. After 24 h, test materials were removed and washed, followed by a standardised sonication protocol. The resulting sonication fluid was plated, and bacterial counts were enumerated and expressed as cfu/sample. Sonicated samples were transferred to a microcalorimeter (TA Instruments) and heat flow was monitored over a 24 h period with a precision of 0.0001°C and a sensitivity of 200 μW. Experiments were performed in triplicate to calculate the mean ± SD. One-way ANOVA was used for statistical analysis. Results: Bacterial counts (log10 cfu/sample) were highest on b-TCP (S. aureus 7.67 ± 0.17; S. epidermidis 8.14 ± 0.05), while bacterial density (log10 cfu/surface) was highest on PMMA (S. aureus 6.12 ± 0.2; S. epidermidis 7.65 ± 0.13). Detection time for S. aureus biofilms was shorter for the porous materials (b-TCP and Tutoplast™, p <0.001) than for the smooth materials (PMMA and PE), with no differences between b-TCP and Tutoplast™ (p >0.05) or PMMA and PE (p >0.05). In contrast, for S. epidermidis biofilms the detection time differed (p <0.001) between all materials except Tutoplast™ and PE (p >0.05). Conclusion: Our results demonstrate biofilm formation by both strains on all tested materials. Microcalorimetry was able to quantitatively detect the amount of biofilm. Further studies are needed to determine whether calorimetry is also a suitable tool for monitoring approaches to prevent and treat infections associated with bone grafts and bone substitutes.
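The one-way ANOVA used above reduces to comparing between-group and within-group variability. A minimal sketch of the F-statistic computation (with k-1 and n-k degrees of freedom; the p-value would then come from the F distribution) is:

```python
def one_way_anova_f(groups):
    """F statistic for a one-way ANOVA across k groups:
    between-group mean square over within-group mean square."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    # variability of group means around the grand mean
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    # variability of observations around their own group mean
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g)
                    for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))
```

For the toy groups [[1, 2, 3], [4, 5, 6]] this gives F = 13.5 on (1, 4) degrees of freedom: the group means differ far more than the within-group scatter would suggest by chance.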

Relevance: 30.00%

Abstract:

BACKGROUND: As part of EUROCAT's surveillance of congenital anomalies in Europe, a statistical monitoring system has been developed to detect recent clusters or long-term (10-year) time trends. The purpose of this article is to describe the system for the identification and investigation of 10-year time trends, conceived as a "screening" tool ultimately leading to the identification of trends which may be due to changing teratogenic factors. METHODS: The EUROCAT database consists of all cases of congenital anomalies including livebirths, fetal deaths from 20 weeks gestational age, and terminations of pregnancy for fetal anomaly. Monitoring of 10-year trends is performed for each registry for each of 96 non-independent EUROCAT congenital anomaly subgroups, while a Pan-Europe analysis combines data from all registries. The monitoring results are reviewed, prioritized according to a prioritization strategy, and communicated to registries for investigation. Twenty-one registries covering over 4 million births from 1999 to 2008 were included in monitoring in 2010. CONCLUSIONS: Significant increasing trends were detected for abdominal wall anomalies, gastroschisis, hypospadias, Trisomy 18 and renal dysplasia in the Pan-Europe analysis, while 68 increasing trends were identified in individual registries. A decreasing trend was detected in over one-third of anomaly subgroups in the Pan-Europe analysis, and in 16.9% of individual registry tests. Preliminary registry investigations indicated that many trends are due to changes in data quality, ascertainment, screening, or diagnostic methods. Some trends are inevitably chance phenomena related to multiple testing, while others seem to represent real and continuing change needing further investigation and response by regional/national public health authorities.

Relevance: 30.00%

Abstract:

Using Monte Carlo simulations and reanalyzing the data of a validation study of the AEIM emotional intelligence test, we demonstrated that an atheoretical approach and the use of weak statistical procedures can result in biased validity estimates. These procedures included stepwise regression (and, more generally, the failure to include important theoretical controls), extreme-scores analysis, and ignoring heteroscedasticity as well as measurement error. The authors of the AEIM test responded by offering more complete information about their analyses, allowing us to further examine the perils of ignoring theory and correct statistical procedures. In this paper we show, with extended analyses, that the AEIM test is invalid.