5 results for Statistical inference
at University of Queensland eSpace - Australia
Abstract:
Many long-lived marine species exhibit life history traits that make them more vulnerable to overexploitation. Accurate population trend analysis is essential for development and assessment of management plans for these species. However, because many of these species disperse over large geographic areas, have life stages inaccessible to human surveyors, and/or undergo complex developmental migrations, data on trends in abundance are often available for only one stage of the population, usually breeding adults. The green turtle (Chelonia mydas) is one of these long-lived species for which population trends are based almost exclusively on either numbers of females that emerge to nest or numbers of nests deposited each year on geographically restricted beaches. In this study, we generated estimates of annual abundance for juvenile green turtles at two foraging grounds in the Bahamas based on long-term capture-mark-recapture (CMR) studies at Union Creek (24 years) and Conception Creek (13 years), using a two-stage approach. First, we estimated recapture probabilities from CMR data using Cormack-Jolly-Seber models in the software program MARK; second, we estimated annual abundance of green turtles at both study sites using the recapture probabilities in a Horvitz-Thompson type estimation procedure. Green turtle abundance did not change significantly in Conception Creek, but, in Union Creek, green turtle abundance had successive phases of significant increase, significant decrease, and stability. These changes in abundance resulted from changes in immigration, not survival or emigration. The trends in abundance on the foraging grounds did not conform to the significantly increasing trend for the major nesting population at Tortuguero, Costa Rica. This disparity highlights the challenges of assessing population-wide trends of green turtles and other long-lived species.
The best approach for monitoring population trends may be a combination of (1) extensive surveys to provide data for large-scale trends in relative population abundance, and (2) intensive surveys, using CMR techniques, to estimate absolute abundance and evaluate the demographic processes driving the trends.
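The second stage of the approach described above, converting annual capture counts into abundance estimates via recapture probabilities, can be sketched as follows. This is a minimal illustrative sketch of a Horvitz-Thompson type estimator, not the authors' actual analysis; the function name and all numbers are invented for the example, and the recapture probabilities would in practice come from the Cormack-Jolly-Seber models fitted in MARK.

```python
def horvitz_thompson_abundance(captures, recapture_probs):
    """Horvitz-Thompson type annual abundance estimates.

    N_hat_t = n_t / p_t, where n_t is the number of animals captured
    in year t and p_t is the estimated recapture probability for that
    year (hypothetical inputs; illustrative only).
    """
    return [n / p for n, p in zip(captures, recapture_probs)]


# Made-up example: 60 captures at p = 0.4, then 75 captures at p = 0.5
n_hat = horvitz_thompson_abundance([60, 75], [0.4, 0.5])
# n_hat -> [150.0, 150.0]
```

The key point of the estimator is that a change in raw capture counts can reflect either a change in true abundance or a change in recapture probability; dividing by p_t separates the two.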
Abstract:
Vector error-correction models (VECMs) have become increasingly important in their application to financial markets. Standard full-order VECMs assume non-zero entries in all their coefficient matrices. However, applications of VECMs to financial market data have revealed that zero entries are often a necessary part of efficient modelling. In such cases, the use of full-order VECMs may lead to incorrect inferences. Specifically, if indirect causality or Granger non-causality exists among the variables, the use of over-parameterised full-order VECMs may weaken the power of statistical inference. In this paper, it is argued that the zero-non-zero (ZNZ) patterned VECM is a more straightforward and effective means of testing for both indirect causality and Granger non-causality. For a ZNZ patterned VECM framework for time series integrated of order two, we provide a new algorithm to select cointegrating and loading vectors that can contain zero entries. Two case studies are used to demonstrate the usefulness of the algorithm in tests of purchasing power parity and a three-variable system involving the stock market.
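To make the ZNZ idea concrete, the sketch below writes down a two-variable VECM in which particular coefficients are constrained to zero. All matrices and values are hypothetical (not taken from the paper); the point is only to show how zero entries in the loading matrix and short-run coefficient matrix encode error-correction and Granger (non-)causality restrictions.

```python
import numpy as np

# Illustrative ZNZ patterned VECM (noise term omitted):
#   Delta y_t = alpha @ (beta.T @ y_{t-1}) + Gamma @ Delta y_{t-1}
alpha = np.array([[-0.3],
                  [ 0.0]])   # zero loading: variable 2 does not error-correct
beta = np.array([[ 1.0],
                 [-1.0]])    # cointegrating vector y1 - y2
Gamma = np.array([[0.2, 0.0],  # zero entry: lagged Delta y2 absent from
                  [0.1, 0.4]]) # the y1 equation (Granger non-causality)


def vecm_step(y_lag1, dy_lag1):
    """One-step change implied by the patterned VECM (deterministic part)."""
    return alpha @ (beta.T @ y_lag1) + Gamma @ dy_lag1


dy = vecm_step(np.array([[1.0], [0.5]]), np.array([[0.1], [0.2]]))
```

In a full-order VECM every entry of alpha and Gamma would be freely estimated; the ZNZ pattern instead fixes the structurally irrelevant entries at zero, which is what sharpens the causality tests the abstract refers to.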
Abstract:
Statistics is known to be an art as well as a science. The training of mathematical physicists predisposes them towards hypothesising plausible Bayesian priors. Tony Bracken and I were of that mind [1], but in our discussions we also recognised the Bayesian will-o'-the-wisp illustrated below.
Abstract:
The compelling quality of the Global Change simulation study (Altemeyer, 2003), in which high RWA (right-wing authoritarianism)/high SDO (social dominance orientation) individuals produced poor outcomes for the planet, rests on the inference that the link between high RWA/SDO scores and disaster in the simulation can be generalized to real environmental and social situations. However, we argue that studies of the Person × Situation interaction are biased to overestimate the role of individual variability. When variables are operationalized, strongly normative items are excluded because they are skewed and kurtotic. This occurs both in the measurement of predictor constructs, such as RWA, and in outcome constructs, such as prejudice and war. Analyses based on normal linear statistics highlight personality variables such as RWA, which produce variance, and overlook the role of norms, which produce invariance. Where both normative and personality forces are operating, as in intergroup contexts, the linear analysis generates statistics for the sample that disproportionately reflect the behavior of the deviant, antinormative minority and direct attention away from the baseline, normative position. The implications of these findings for the link between high RWA and disaster are discussed.