989 results for Statistical methodologies
Abstract:
Base rate neglect on the mammography problem can be overcome by explicitly presenting a causal basis for the typically vague false-positive statistic. One account of this causal facilitation effect is that people make probabilistic judgements over intuitive causal models parameterized with the evidence in the problem. Poorly defined or difficult-to-map evidence interferes with this process, leading to errors in statistical reasoning. To assess whether the construction of parameterized causal representations is an intuitive or deliberative process, in Experiment 1 we combined a secondary load paradigm with manipulations of the presence or absence of an alternative cause in typical statistical reasoning problems. We found limited effects of a secondary load, no evidence that information about an alternative cause improves statistical reasoning, but some evidence that it reduces base rate neglect errors. In Experiments 2 and 3 where we did not impose a load, we observed causal facilitation effects. The amount of Bayesian responding in the causal conditions was impervious to the presence of a load (Experiment 1) and to the precise statistical information that was presented (Experiment 3). However, we found less Bayesian responding in the causal condition than previously reported. We conclude with a discussion of the implications of our findings and the suggestion that there may be population effects in the accuracy of statistical reasoning.
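The mammography problem referenced above can be worked through directly with Bayes' theorem. A minimal sketch, using the illustrative parameter values common in this literature (1% base rate, 80% hit rate, 9.6% false-positive rate); these numbers are assumptions for illustration, not values taken from the abstract:

```python
# Hedged sketch of the mammography problem: the parameter values below
# are the classic illustrative ones, assumed here for demonstration.

def posterior(base_rate, hit_rate, false_positive_rate):
    """P(cancer | positive test) via Bayes' theorem."""
    p_positive = (hit_rate * base_rate
                  + false_positive_rate * (1 - base_rate))
    return hit_rate * base_rate / p_positive

p = posterior(base_rate=0.01, hit_rate=0.80, false_positive_rate=0.096)
print(f"P(cancer | positive) = {p:.3f}")  # ≈ 0.078
```

The normatively correct answer (about 8%) is far below the estimates most participants give, which is the base rate neglect these experiments target; the causal manipulations discussed above reframe the false-positive statistic so it can be mapped onto an alternative cause.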
Abstract:
People often struggle when making Bayesian probabilistic estimates on the basis of competing sources of statistical evidence. Recently, Krynski and Tenenbaum (Journal of Experimental Psychology: General, 136, 430–450, 2007) proposed that a causal Bayesian framework accounts for people’s errors in Bayesian reasoning and showed that, by clarifying the causal relations among the pieces of evidence, judgments on a classic statistical reasoning problem could be significantly improved. We aimed to understand whose statistical reasoning is facilitated by the causal structure intervention. In Experiment 1, although we observed causal facilitation effects overall, the effect was confined to participants high in numeracy. We did not find an overall facilitation effect in Experiment 2 but did replicate the earlier interaction between numerical ability and the presence or absence of causal content. This effect held when we controlled for general cognitive ability and thinking disposition. Our results suggest that clarifying causal structure facilitates Bayesian judgments, but only for participants with sufficient understanding of basic concepts in probability and statistics.
Abstract:
A key tracer of the elusive progenitor systems of Type Ia supernovae (SNe Ia) is the detection of narrow blueshifted time-varying Na I D absorption lines, interpreted as evidence of circumstellar material surrounding the progenitor system. The origin of this material is controversial, but the simplest explanation is that it results from previous mass-loss in a system containing a white dwarf and a non-degenerate companion star. We present new single-epoch intermediate-resolution spectra of 17 low-redshift SNe Ia taken with XShooter on the European Southern Observatory Very Large Telescope. Combining this sample with events from the literature, we confirm an excess (∼20 per cent) of SNe Ia displaying blueshifted narrow Na I D absorption features compared to redshifted Na I D features. The host galaxies of SNe Ia displaying blueshifted absorption profiles are skewed towards later-type galaxies compared to SNe Ia that show no Na I D absorption, and SNe Ia displaying blueshifted narrow Na I D absorption features have broader light curves. The Na I D absorption is stronger in SNe Ia displaying blueshifted features than in those without them, and the strength of the blueshifted Na I D is correlated with the B − V colour of the SN at maximum light. This strongly suggests the absorbing material is local to the SN. In the context of the progenitor systems of SNe Ia, we discuss the significance of these findings and other recent observational evidence on the nature of SN Ia progenitors. We present a summary that suggests that there are at least two distinct populations of normal, cosmologically useful SNe Ia.
Abstract:
The thriving and well-established field of Law and Society (also referred to as Sociolegal Studies) has diverse methodological influences; it draws on social-scientific and arts-based methods. The approach of scholars researching and teaching in the field often crosses disciplinary borders, but, broadly speaking, Law and Society scholarship goes behind formalism to investigate how and why law operates, or does not operate as intended, in society. By exploring law’s connections with broader social and political forces—both domestic and international—scholars gain valuable perspectives on ideology, culture, identity, and social life. Law and Society scholarship considers both the law in contexts, as well as contexts in law.
Law and Society flourishes today, perhaps as never before. Academic thinkers work on the mundane and the local as well as the global, making major advances in the ways in which we think about both law and society. Especially over the last four decades, scholarly output has burgeoned rapidly, and this new title from Routledge’s acclaimed Critical Concepts in Law series answers the need for an authoritative reference collection to help users make sense of the daunting quantity of serious research and thinking.
Edited by the leading scholars in the field, Law and Society brings together in four volumes the vital classic and contemporary contributions. Volume I is dedicated to historical antecedents and precursors. The second volume covers methodologies and crucial themes. The third volume assembles key works on legal processes and professional groups, while the final volume of the collection focuses on substantive areas. Together, the volumes provide a one-stop ‘mini library’ enabling all interested researchers, teachers, and students to explore the origins of this thriving sub-discipline, and to gain a thorough understanding of where it is today.
Abstract:
Three experiments examined children’s and adults’ abilities to use statistical and temporal information to distinguish between common cause and causal chain structures. In Experiment 1, participants were provided with conditional probability information and/or temporal information and asked to infer the causal structure of a three-variable mechanical system that operated probabilistically. Participants of all ages preferentially relied on the temporal pattern of events in their inferences, even if this conflicted with statistical information. In Experiments 2 and 3, participants observed a series of interventions on the system, which in these experiments operated deterministically. In Experiment 2, participants found it easier to use temporal pattern information than statistical information provided as a result of interventions. In Experiment 3, in which no temporal pattern information was provided, children from 6-7 years, but not younger children, were able to use intervention information to make causal chain judgments, although they had difficulty when the structure was a common cause. The findings suggest that participants, and children in particular, may find it more difficult to use statistical information than temporal pattern information because of its demands on information processing resources. However, there may also be an inherent preference for temporal information.
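The key distinction in the abstract, that interventions can discriminate structures which passive observation cannot, can be sketched with a toy deterministic simulation. The binary variables and mechanism below are hypothetical illustrations, not the mechanical system used in the experiments:

```python
import random

# Hedged toy sketch: a chain A -> B -> C versus a common cause
# B <- A -> C. Observed passively, both structures generate identical
# data here (B and C both copy A); intervening on B separates them.

def sample(structure, do_b=None):
    """One trial of a deterministic three-variable system."""
    a = random.random() < 0.5
    b = a if do_b is None else do_b          # do_b forces B (intervention)
    c = b if structure == "chain" else a     # chain: C copies B; common cause: C copies A
    return a, b, c

random.seed(0)
n = 10_000
# Under the intervention do(B = True), C tracks B only in the chain.
chain_c = sum(sample("chain", do_b=True)[2] for _ in range(n)) / n
common_c = sum(sample("common", do_b=True)[2] for _ in range(n)) / n
print(chain_c, common_c)  # chain ≈ 1.0, common cause ≈ 0.5
```

This mirrors the role intervention information plays in Experiments 2 and 3: the effect of forcing B on C is diagnostic of the structure even when the observational statistics are not.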
Abstract:
This paper presents a new statistical signal reception model for shadowed body-centric communications channels. In this model, the potential clustering of multipath components is considered alongside the presence of elective dominant signal components. As typically occurs in body-centric communications channels, the dominant or line-of-sight (LOS) components are shadowed by body matter situated in the path trajectory. This situation may be further exacerbated due to physiological and biomechanical movements of the body. In the proposed model, the resultant dominant component which is formed by the phasor addition of these leading contributions is assumed to follow a lognormal distribution. A wide range of measured and simulated shadowed body-centric channels considering on-body, off-body and body-to-body communications are used to validate the model. During the course of the validation experiments, it was found that, even for environments devoid of multipath or specular reflections generated by the local surroundings, a noticeable resultant dominant component can still exist in body-centric channels where the user's body shadows the direct LOS signal path between the transmitter and the receiver.
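The model's central assumption, a resultant dominant component whose amplitude is lognormally shadowed plus diffuse scattered multipath, can be sketched numerically. This is a generic illustration with assumed parameter values, not the paper's fitted model:

```python
import numpy as np

# Hedged sketch of the core idea: the envelope is the magnitude of a
# lognormally shadowed dominant phasor plus complex Gaussian diffuse
# multipath. All parameter values below are illustrative assumptions.

rng = np.random.default_rng(1)
n = 100_000

dominant_amp = rng.lognormal(mean=0.0, sigma=0.5, size=n)   # body shadowing
phase = rng.uniform(0.0, 2.0 * np.pi, size=n)
scatter = rng.normal(0, 0.3, n) + 1j * rng.normal(0, 0.3, n)  # diffuse multipath

envelope = np.abs(dominant_amp * np.exp(1j * phase) + scatter)
print(envelope.mean())
```

Fitting such a model to measured data would then amount to estimating the lognormal shadowing parameters and the diffuse power from the empirical envelope distribution.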
Abstract:
Refined vegetable oils are widely used in the food industry as ingredients or components in many processed food products in the form of oil blends. To date, the generic term 'vegetable oil' has been used in the labelling of food containing oil blends. With the introduction of the new EU Regulation for Food Information (1169/2011), due to take effect in 2014, the oil species used must be clearly identified on the package, and there is a need to develop fit-for-purpose methodology for industry and regulators alike to verify the oil species present in a product. The available methodologies that may be employed to authenticate the botanical origin of a vegetable oil admixture were reviewed and evaluated. The majority of the sources, however, described techniques applied to crude vegetable oils such as olive oil, owing to the lack of studies focused on refined vegetable oils. DNA-based typing methods and stable isotope procedures were found unsuitable for this particular purpose due to several issues. Only a small number of specific chromatographic and spectroscopic fingerprinting methods, in either targeted or untargeted mode, were found to be applicable in potentially providing a solution to this complex authenticity problem. Applied as a single method in isolation, these techniques would give only limited information on the oil's identity, as signals obtained for various oil types may well overlap. Therefore, more complex and combined approaches are likely to be needed to identify the oil species present in oil blends, employing a stepwise approach in combination with advanced chemometrics. Options to provide such a methodology are outlined in the current study.
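A common first step in the untargeted chemometric approaches mentioned above is projecting spectral fingerprints onto principal components, where different oil species should separate into clusters. A minimal sketch on synthetic "spectra" (the class structure, dimensions, and data below are invented purely for illustration):

```python
import numpy as np

# Hedged illustration on synthetic data, not real oil spectra: PCA via
# SVD separates two hypothetical oil species whose mean "spectra" differ.

rng = np.random.default_rng(42)
wavelengths = 200

# Two hypothetical species: same noise level, different mean spectra.
oil_a = rng.normal(0.0, 0.1, (30, wavelengths)) + np.linspace(0, 1, wavelengths)
oil_b = rng.normal(0.0, 0.1, (30, wavelengths)) + np.linspace(1, 0, wavelengths)

X = np.vstack([oil_a, oil_b])
X_centered = X - X.mean(axis=0)

# PCA scores from the SVD of the centred data matrix.
_, _, vt = np.linalg.svd(X_centered, full_matrices=False)
scores = X_centered @ vt[:2].T

print(scores[:30, 0].mean(), scores[30:, 0].mean())  # opposite signs
```

In a stepwise authentication workflow, such score plots would be combined with targeted markers, since, as noted above, no single fingerprint is likely to resolve a refined oil blend on its own.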
Abstract:
Background: Previous research demonstrates various associations between depression, cardiovascular disease (CVD) incidence and mortality, possibly as a result of the different methodologies used to measure depression and analyse relationships. This analysis investigated the association between depression, CVD incidence (CVDI) and mortality from CVD (MCVD), smoking related conditions (MSRC), and all causes (MALL), in a sample data set, where depression was measured using items from a validated questionnaire and using items derived from the factor analysis of a larger questionnaire, and analyses were conducted based on continuous data and grouped data.
Methods: Data from the PRIME Study (N=9798 men) on depression and 10-year CVD incidence and mortality were analysed using Cox proportional hazards models.
Results: Using continuous data, both measures of depression resulted in the emergence of positive associations between depression and mortality (MCVD, MSRC, MALL). Using grouped data, however, associations between a validated measure of depression and MCVD, and between a measure of depression derived from factor analysis and all measures of mortality were lost.
Limitations: Low levels of depression, low numbers of individuals with high depression and low numbers of outcome events may limit these analyses, but levels are usual for the population studied.
Conclusions: These data demonstrate a possible association between depression and mortality but detecting this association is dependent on the measurement used and method of analysis. Different findings based on methodology present clear problems for the elucidation and determination of relationships. The differences here argue for the use of validated scales where possible and suggest against over-reduction via factor analysis and grouping.
Abstract:
This paper presents the results of a measurement campaign aimed at characterizing and modeling the indoor radio channel between two hypothetical cellular handsets. The device-to-device channel measurements were made at 868 MHz and investigated a number of different everyday scenarios such as the devices being held at the user's heads, placed in a pocket and one of the devices placed on a desktop. The recently proposed shadowed k-μ fading model was used to characterize these channels and was shown to provide a good description of the measured data. It was also evident from the experiments, that the device-to-device communications channel is susceptible to shadowing caused by the human body.
Abstract:
In recent years, the adoption of smart devices carried or worn by people has transformed how society interacts. This trend has also been observed in the advancement of vehicular networks. Here, developments in wireless technologies for vehicle-to-vehicle (V2V) and vehicle-to-roadside (V2R) communications are leading to a new generation of vehicular networks. A natural extension of both types of networks will be their eventual wireless integration. Both people and vehicles will undoubtedly form integral parts of future mobile networks of people and things. Central to this will be the person-to-vehicle (P2V) communications channel. As the P2V channel will be subject to different signal propagation characteristics than either type of communication system considered in isolation, it is imperative that the characteristics of the wireless channel are first fully understood. To the best of the author's knowledge, this is a topic which has not yet been addressed in the open literature. In this paper we present our most recent research on the statistical characterization of the 5.8 GHz person-to-vehicle channel in an urban environment.
Abstract:
In this paper we investigate the received signal characteristics of on-body communications channels at 2.45 GHz. The hypothetical body area network configuration considered a transmitter node situated on the person’s left waist and receiving nodes positioned on the head, knee and wrist of the person’s right side. The on-body channel measurements were performed in both anechoic and reverberant environments while the person was moving. It was found that the recently proposed shadowed κ‒μ fading model provided an excellent fit to the measured data.
Abstract:
In this paper we investigate the first- and second-order characteristics of the received signal at the output of hypothetical selection, equal gain and maximal ratio combiners which utilize spatially separated antennas at the base station. Considering a range of human body movements, we model the small-scale fading characteristics of the signal using diversity-specific analytical equations which take into account the number of available signal branches at the receiver. It is shown that these equations provide an excellent fit to the measured channel data. Furthermore, for many hypothetical diversity receiver configurations, the Nakagami-m parameter was found to be close to 1.
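The Nakagami-m finding above can be illustrated with the standard moment-based estimator m = E[r²]² / Var(r²), applied to a simulated Rayleigh envelope (Rayleigh fading is the m = 1 special case). This is a generic sketch on simulated data, not the authors' measurements:

```python
import numpy as np

# Hedged sketch: a moment-based Nakagami-m estimate on a simulated
# Rayleigh envelope should recover m close to 1, mirroring the finding
# reported for many diversity receiver configurations.

rng = np.random.default_rng(3)
n = 50_000

# Rayleigh envelope: magnitude of a zero-mean complex Gaussian signal.
envelope = np.hypot(rng.normal(0, 1, n), rng.normal(0, 1, n))

# Inverse-normalized-variance estimator: m = E[r^2]^2 / Var(r^2).
r2 = envelope ** 2
m_hat = r2.mean() ** 2 / r2.var()
print(m_hat)  # ≈ 1 for Rayleigh fading
```

For measured diversity-combined channel data one would apply the same estimator per branch configuration; values near 1 indicate Rayleigh-like fading with no strong residual dominant path.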
Abstract:
Objectives
A P-value <0.05 is one metric used to evaluate the results of a randomized controlled trial (RCT). We wondered how often statistically significant results in RCTs might be lost with small changes in the number of outcome events.
Study Design and Setting
A review of RCTs in high-impact medical journals that reported a statistically significant result for at least one dichotomous or time-to-event outcome in the abstract. In the group with the smallest number of events, we changed the status of patients without an event to an event until the P-value exceeded 0.05. We labeled this number the Fragility Index; smaller numbers indicated a more fragile result.
Results
The 399 eligible trials had a median sample size of 682 patients (range: 15-112,604) and a median of 112 events (range: 8-5,142); 53% reported a P-value <0.01. The median Fragility Index was 8 (range: 0-109); 25% had a Fragility Index of 3 or less. In 53% of trials, the Fragility Index was less than the number of patients lost to follow-up.
Conclusion
The statistically significant results of many RCTs hinge on small numbers of events. The Fragility Index complements the P-value and helps identify less robust results.
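The procedure described under Study Design can be sketched directly with Fisher's exact test. The counts below are invented for illustration, and this sketch assumes arm A is the arm with fewer events; the published analysis may recompute significance however the original trial did:

```python
from scipy.stats import fisher_exact

# Hedged sketch of the Fragility Index procedure: convert non-events to
# events in the arm with fewer events until the two-sided Fisher exact
# P-value reaches or exceeds 0.05. Counts below are illustrative only.

def fragility_index(events_a, n_a, events_b, n_b, alpha=0.05):
    """Number of event-status changes needed to lose significance."""
    flips = 0
    while True:
        table = [[events_a, n_a - events_a],
                 [events_b, n_b - events_b]]
        _, p = fisher_exact(table)
        if p >= alpha:
            return flips
        events_a += 1   # change one non-event to an event in arm A
        flips += 1

print(fragility_index(events_a=1, n_a=100, events_b=11, n_b=100))
```

A small return value means the trial's significance hinges on the status of only a handful of patients, which is exactly the fragility the abstract quantifies.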