925 results for "Data uncertainty"


Relevance:

30.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

30.00%

Publisher:

Abstract:

In this paper we present the results of a coherent narrow-band search for continuous gravitational-wave signals from the Crab and Vela pulsars, conducted on Virgo VSR4 data. To take into account a possible small mismatch between the gravitational-wave frequency and twice the star's rotation frequency, as inferred from measurements of the electromagnetic pulse rate, a range of 0.02 Hz around twice the rotational frequency has been searched for both pulsars. No evidence for a signal has been found, and 95% confidence-level upper limits have been computed assuming both that the polarization parameters are completely unknown and that they are known with some uncertainty, as derived from x-ray observations of the pulsar wind tori. For Vela the upper limits are comparable to the spin-down limit, computed by assuming that all the observed spin-down is due to the emission of gravitational waves. For the Crab the upper limits are about a factor of 2 below the spin-down limit and represent a significant improvement over past analyses. This is the first time the spin-down limit has been significantly surpassed in a narrow-band search.
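
For context, the spin-down limit quoted above follows from a standard energy-balance formula, h_sd = (1/d) * sqrt(5 G I |f_dot| / (2 c^3 f)). The sketch below is not the collaboration's pipeline; it simply evaluates that formula with approximate published pulsar parameters, taking the conventional fiducial moment of inertia.

```python
import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8        # speed of light, m/s
KPC = 3.086e19     # metres per kiloparsec
I_FID = 1e38       # fiducial neutron-star moment of inertia, kg m^2

def spindown_limit(f_rot, f_dot, dist_kpc, inertia=I_FID):
    """Strain upper limit implied by attributing all of the observed
    rotational energy loss to gravitational-wave emission."""
    d = dist_kpc * KPC
    return math.sqrt(2.5 * G * inertia * abs(f_dot) / (C**3 * f_rot)) / d

# Approximate rotation parameters (illustrative values):
# Crab: f ~ 29.7 Hz, f_dot ~ -3.7e-10 Hz/s, d ~ 2 kpc
# Vela: f ~ 11.2 Hz, f_dot ~ -1.6e-11 Hz/s, d ~ 0.29 kpc
print("Crab h_sd ~ %.1e" % spindown_limit(29.7, -3.7e-10, 2.0))
print("Vela h_sd ~ %.1e" % spindown_limit(11.2, -1.6e-11, 0.29))
```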

Relevance:

30.00%

Publisher:

Abstract:

The recent likely extinction of the baiji (Chinese river dolphin [Lipotes vexillifer]) (Turvey et al. 2007) makes the vaquita (Gulf of California porpoise [Phocoena sinus]) the most endangered cetacean. The vaquita has the smallest range of any porpoise, dolphin, or whale and, like the baiji, has long been threatened primarily by accidental deaths in fishing gear (bycatch) (Rojas-Bracho et al. 2006). Despite repeated recommendations from scientific bodies and conservation organizations, no effective actions have been taken to remove nets from the vaquita’s environment. Here, we address three questions that are important to vaquita conservation: (1) How many vaquitas remain? (2) How much time is left to find a solution to the bycatch problem? and (3) Are further abundance surveys or bycatch estimates needed to justify the immediate removal of all entangling nets from the range of the vaquita? Our answers are, in short: (1) there are about 150 vaquitas left, (2) there are at most 2 years within which to find a solution, and (3) further abundance surveys or bycatch estimates are not needed. The answers to the first two questions make clear that action is needed now, whereas the answer to the last question removes the excuse of uncertainty as a delay tactic. Herein we explain our reasoning.

Relevance:

30.00%

Publisher:

Abstract:

Analyses of ecological data should account for the uncertainty in the process(es) that generated the data. However, accounting for these uncertainties is a difficult task, since ecology is known for its complexity. Measurement and/or process errors are often the only sources of uncertainty modeled when addressing complex ecological problems, yet analyses should also account for uncertainty in sampling design, in model specification, in parameters governing the specified model, and in initial and boundary conditions. Only then can we be confident in the scientific inferences and forecasts made from an analysis. Probability and statistics provide a framework that accounts for multiple sources of uncertainty. Given the complexities of ecological studies, the hierarchical statistical model is an invaluable tool. This approach is not new in ecology, and there are many examples (both Bayesian and non-Bayesian) in the literature illustrating the benefits of this approach. In this article, we provide a baseline for concepts, notation, and methods, from which discussion on hierarchical statistical modeling in ecology can proceed. We have also planted some seeds for discussion and tried to show where the practical difficulties lie. Our thesis is that hierarchical statistical modeling is a powerful way of approaching ecological analysis in the presence of inevitable but quantifiable uncertainties, even if practical issues sometimes require pragmatic compromises.
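
To make the hierarchy concrete, the toy sketch below simulates the canonical three-stage factorization, [data | process, parameters] [process | parameters] [parameters], for a hypothetical ecological survey; all names and numbers are illustrative, not drawn from the article.

```python
import numpy as np

rng = np.random.default_rng(0)
n_sites = 50

# Parameter model: values governing the specified model (hypothetical).
mu_log_abundance, sigma_process, sigma_obs = 2.0, 0.5, 0.3

# Process model: latent log-abundance varies among sites.
true_log_n = rng.normal(mu_log_abundance, sigma_process, n_sites)

# Data model: observations of the latent process with measurement error.
obs_log_n = true_log_n + rng.normal(0.0, sigma_obs, n_sites)

# Because the joint model factors hierarchically, inference on the process
# accounts for both error sources. Moment check: variances add.
print("observed var:", obs_log_n.var(ddof=1))
print("process + obs var:", sigma_process**2 + sigma_obs**2)
```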

Relevance:

30.00%

Publisher:

Abstract:

Categorical data cannot be interpolated directly because they are outcomes of discrete random variables. Thus, the types of a categorical variable are transformed into indicator functions that can be handled by interpolation methods, and the interpolated indicator values are then backtransformed to the original categories. However, aspects such as the variability and uncertainty of interpolated values of categorical data have never been considered. In this paper we show that the interpolation variance can be used to map an uncertainty zone around boundaries between types of categorical variables. Moreover, it is shown that the interpolation variance is a component of the total variance of the categorical variables, as measured by the coefficient of unalikeability.
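
A minimal sketch of the indicator workflow described above, using inverse-distance weighting as a stand-in interpolator (the paper's geostatistical estimator is not reproduced here); the interpolation variance and the coefficient of unalikeability are computed at the target location.

```python
import numpy as np

def idw_indicators(xy_known, categories, xy_target, power=2.0):
    """Transform categories into indicators, interpolate each indicator
    (inverse-distance weights here), then backtransform by argmax."""
    cats = np.unique(categories)
    ind = (categories[:, None] == cats[None, :]).astype(float)  # n x K
    d = np.linalg.norm(xy_known - xy_target, axis=1)
    w = 1.0 / np.maximum(d, 1e-12) ** power
    w /= w.sum()
    p = w @ ind                       # interpolated indicators sum to 1
    # Interpolation variance: weighted squared deviation of the data
    # indicators from the interpolated value at the target.
    interp_var = (w @ (ind - p) ** 2).sum()
    # Coefficient of unalikeability of the interpolated composition.
    unalikeability = 1.0 - (p ** 2).sum()
    return cats[p.argmax()], p, interp_var, unalikeability

xy = np.array([[0., 0.], [1., 0.], [0., 1.], [1., 1.]])
z = np.array(["sand", "sand", "clay", "clay"])
print(idw_indicators(xy, z, np.array([0.5, 0.45])))
```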

Relevance:

30.00%

Publisher:

Abstract:

The continental margin of southeast Brazil is elevated. Onshore Tertiary basins and Late Cretaceous/Paleogene intrusions are good evidence for post-breakup tectono-magmatic activity. To constrain the impact of post-rift reactivation on the geological history of the area, we carried out a new thermochronological study. Apatite fission-track ages range from 60.7 ± 1.9 Ma to 129.3 ± 4.3 Ma, mean track lengths from 11.41 ± 0.23 µm to 14.31 ± 0.24 µm, and a subset of the (U-Th)/He ages ranges from 45.1 ± 1.5 Ma to 122.4 ± 2.5 Ma. Results of inverse thermal history modeling generally support the conclusions from an earlier study for a Late Cretaceous phase of cooling. Around the onshore Taubaté Basin, for a limited number of samples, the first detectable period of cooling occurred during the Early Tertiary. The inferred thermal histories for many samples also imply subsequent reheating followed by Neogene cooling. Given the uncertainty of the inversion results, we performed deterministic forward modeling to assess the range of possibilities for this Tertiary part of the thermal history. The evidence for reheating seems to be robust around the Taubaté Basin, but elsewhere the data cannot discriminate between this and a less complex thermal history. However, forward modeling results and geological information support the conclusion that the whole area underwent cooling during the Neogene. The synchronicity of the cooling phases with Andean tectonics and with those in NE Brazil leads us to assume a plate-wide compressional stress that reactivated inherited structures. The present-day topographic relief of the margin reflects a contribution from post-breakup reactivation and uplift.

Relevance:

30.00%

Publisher:

Abstract:

This work focuses on saltwater intrusion in coastal aquifers, and in particular on the construction of conceptual schemes to evaluate the associated risk. Saltwater intrusion depends on different natural and anthropic factors, both exhibiting strongly aleatory behaviour, that should be considered for optimal management of the territory and its water resources. Given the uncertainty in the problem parameters, the risk associated with salinization needs to be cast in a probabilistic framework. On the basis of a widely adopted sharp-interface formulation, key hydrogeological parameters are modeled as random variables, and global sensitivity analysis is used to determine their influence on the position of the saltwater interface. The analyses presented in this work rely on an efficient model-reduction technique based on Polynomial Chaos Expansion, which retains an accurate description of the model response without a great computational burden. When the assumptions of classical analytical models are not respected, as occurs in many applications to real case studies, including the area analyzed in the present work, one can adopt data-driven techniques based on the analysis of the data characterizing the system under study. A model can then be defined on the basis of connections between the system state variables, with only a limited number of assumptions about the "physical" behaviour of the system.
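
As a toy illustration of casting sharp-interface saltwater intrusion in a probabilistic framework, the sketch below uses the classical Ghyben-Herzberg relation with hypothetical input distributions and a crude binning estimate of first-order (main-effect) sensitivity, rather than the Polynomial Chaos Expansion used in the work itself.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 20_000

# Random hydrogeological inputs (hypothetical ranges, illustration only).
h = rng.uniform(0.5, 2.0, N)        # freshwater head above sea level, m
rho_f = rng.normal(1000.0, 1.0, N)  # freshwater density, kg/m^3
rho_s = rng.normal(1025.0, 1.5, N)  # saltwater density, kg/m^3

# Classical Ghyben-Herzberg sharp-interface depth below sea level.
z = rho_f / (rho_s - rho_f) * h

def main_effect(x, y, bins=40):
    """Crude first-order sensitivity: fraction of output variance
    explained by conditioning on one input, estimated by binning."""
    edges = np.quantile(x, np.linspace(0, 1, bins + 1))
    idx = np.clip(np.searchsorted(edges, x) - 1, 0, bins - 1)
    cond_means = np.array([y[idx == b].mean() for b in range(bins)])
    return cond_means.var() / y.var()

for name, x in [("head h", h), ("rho_f", rho_f), ("rho_s", rho_s)]:
    print(name, "S1 ~ %.2f" % main_effect(x, z))
```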

Relevance:

30.00%

Publisher:

Abstract:

Chlamydia trachomatis is the most common bacterial sexually transmitted infection (STI) in many developed countries. The highest prevalence rates are found among young adults who have frequent partner change rates. Three published individual-based models have incorporated a detailed description of age-specific sexual behaviour in order to quantify the transmission of C. trachomatis in the population and to assess the impact of screening interventions. Owing to varying assumptions about sexual partnership formation and dissolution and the great uncertainty about critical parameters, such models show conflicting results about the impact of preventive interventions. Here, we perform a detailed evaluation of these models by comparing the partnership formation and dissolution dynamics with data from Natsal 2000, a population-based probability sample survey of sexual attitudes and lifestyles in Britain. The data also allow us to describe the dispersion of C. trachomatis infections as a function of sexual behaviour, using the Gini coefficient. We suggest that the Gini coefficient is a useful measure for calibrating infectious disease models that include risk structure and highlight the need to estimate this measure for other STIs.
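
The Gini coefficient mentioned above is straightforward to compute from per-individual infection counts; a minimal sketch with hypothetical data follows.

```python
import numpy as np

def gini(x):
    """Gini coefficient of a non-negative array (0 = perfectly even,
    values near 1 = highly concentrated)."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    ranks = np.arange(1, n + 1)
    return (2.0 * (ranks * x).sum()) / (n * x.sum()) - (n + 1.0) / n

# Hypothetical cumulative infections per individual: most people have none,
# a small high-activity group accounts for most infections (high Gini).
infections = np.concatenate([np.zeros(900), np.ones(80), np.full(20, 3)])
print("Gini = %.2f" % gini(infections))
```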

Relevance:

30.00%

Publisher:

Abstract:

Backcalculation is the primary method used to reconstruct past human immunodeficiency virus (HIV) infection rates, to estimate current prevalence of HIV infection, and to project future incidence of acquired immunodeficiency syndrome (AIDS). The method is very sensitive to uncertainty about the incubation period. We estimate incubation distributions from three sets of cohort data and find that the estimates for the cohorts are substantially different. Backcalculations employing the different estimates produce equally good fits to reported AIDS counts but quite different estimates of cumulative infections. These results suggest that the incubation distribution is likely to differ for different populations and that the differences are large enough to have a big impact on the resulting estimates of HIV infection rates. This seriously limits the usefulness of backcalculation for populations (such as intravenous drug users, heterosexuals, and women) that lack precise information on incubation times.
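
The forward model underlying backcalculation is a convolution of the infection curve with the incubation distribution, which is why the results are so sensitive to the latter. A toy sketch, with hypothetical Weibull incubation parameters and an invented epidemic curve:

```python
import numpy as np
from scipy.stats import weibull_min

quarters = np.arange(40)  # 10 years of quarterly data
# Hypothetical infection curve (a smooth epidemic pulse).
infections = 500 * np.exp(-0.5 * ((quarters - 15) / 6.0) ** 2)

def expected_aids(infections, shape, scale):
    """Expected AIDS diagnoses = convolution of the infection curve
    with the incubation-time distribution (discretized by quarter)."""
    n = infections.size
    f = np.diff(weibull_min.cdf(np.arange(n + 1), shape, scale=scale))
    return np.convolve(infections, f)[:n]

# Two incubation assumptions (hypothetical parameters): the implied AIDS
# counts differ, so inverting observed counts inherits this uncertainty.
a1 = expected_aids(infections, shape=2.2, scale=40)
a2 = expected_aids(infections, shape=1.5, scale=32)
print(np.round(a1[-8:], 1))
print(np.round(a2[-8:], 1))
```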

Relevance:

30.00%

Publisher:

Abstract:

We analyze three sets of doubly-censored cohort data on incubation times, estimating incubation distributions using semi-parametric methods and assessing the comparability of the estimates. Weibull models appear to be inappropriate for at least one of the cohorts, and the estimates for the different cohorts are substantially different. We use these estimates as inputs for backcalculation, using a nonparametric method based on maximum penalized likelihood. The different incubation distributions all produce fits to the reported AIDS counts that are as good as the fit from a nonstationary incubation distribution that models treatment effects, but the estimated infection curves are very different. We also develop a method for estimating nonstationarity as part of the backcalculation procedure and find that such estimates also depend very heavily on the assumed incubation distribution. We conclude that incubation distributions are so uncertain that meaningful error bounds are difficult to place on backcalculated estimates and that backcalculation may be too unreliable to be used without being supplemented by other sources of information on HIV prevalence and incidence.
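
One building block of such analyses is a likelihood for incubation times that are only known to within an interval. Below is a minimal interval-censored Weibull fit on simulated data; it is not the authors' semi-parametric estimator, just an illustration of the censored-likelihood idea.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

# Hypothetical censored incubation times: each subject's incubation
# period is only known to lie between `left` and `right` (years).
true_times = weibull_min.rvs(2.0, scale=10.0, size=200, random_state=3)
left = np.floor(true_times)
right = left + 1.0

def neg_loglik(params):
    """Interval-censored Weibull log-likelihood: each subject
    contributes log[F(right) - F(left)]."""
    shape, scale = np.exp(params)  # optimize on the log scale
    p = (weibull_min.cdf(right, shape, scale=scale)
         - weibull_min.cdf(left, shape, scale=scale))
    return -np.log(np.maximum(p, 1e-300)).sum()

res = minimize(neg_loglik, x0=np.log([1.0, 8.0]), method="Nelder-Mead")
print("fitted shape, scale:", np.round(np.exp(res.x), 2))
```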

Relevance:

30.00%

Publisher:

Abstract:

Genome-wide association studies (GWAS) are used to discover genes underlying complex, heritable disorders for which less powerful study designs have failed in the past. The number of GWAS has skyrocketed recently, with findings reported in top journals and the mainstream media. Microarrays are the genotype-calling technology of choice in GWAS, as they permit exploration of more than a million single nucleotide polymorphisms (SNPs) simultaneously. The starting point for the statistical analyses used by GWAS to determine association between loci and disease is the set of genotype calls (AA, AB, or BB). However, the raw data, microarray probe intensities, are heavily processed before arriving at these calls. Various sophisticated statistical procedures have been proposed for transforming raw data into genotype calls. We find that variability in microarray output quality across different SNPs, different arrays, and different sample batches has a substantial influence on the accuracy of genotype calls made by existing algorithms. By failing to account for these sources of variability, GWAS run the risk of adversely affecting the quality of reported findings. In this paper we present solutions based on a multi-level mixed model. A software implementation of the method described in this paper is available as free and open-source code in the crlmm R/BioConductor package.
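
The crlmm method itself is implemented in R; purely as an illustration of the underlying idea, that genotype calls amount to clustering transformed probe intensities and that SNP-, array-, and batch-level effects shift the clusters, here is a toy Gaussian-mixture caller on simulated log-ratio intensities.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(4)

# Hypothetical log-ratio probe intensities for one SNP across arrays:
# three clusters correspond to the genotypes BB, AB, and AA.
m = np.concatenate([
    rng.normal(-2.0, 0.30, 300),   # BB
    rng.normal(0.0, 0.35, 400),    # AB
    rng.normal(2.0, 0.30, 300),    # AA
]).reshape(-1, 1)

gmm = GaussianMixture(n_components=3, random_state=0).fit(m)
rank = np.argsort(np.argsort(gmm.means_.ravel()))  # component -> rank
calls = np.array(["BB", "AB", "AA"])[rank[gmm.predict(m)]]

# Posterior membership probabilities quantify call uncertainty; shifts
# in cluster means across batches are what a multi-level mixed model
# would absorb before calling.
print(calls[:5], gmm.predict_proba(m[:3]).round(2))
```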

Relevance:

30.00%

Publisher:

Abstract:

The past 1500 years provide a valuable opportunity to study the response of the climate system to external forcings. However, the integration of paleoclimate proxies with climate modeling is critical to improving the understanding of climate dynamics. In this paper, a climate system model and proxy records are therefore used to study the role of natural and anthropogenic forcings in driving the global climate. The inverse and forward approaches to paleoclimate data–model comparison are applied, and sources of uncertainty are identified and discussed. In the first of two case studies, the climate model simulations are compared with multiproxy temperature reconstructions. Robust solar and volcanic signals are detected in Southern Hemisphere temperatures, with a possible volcanic signal detected in the Northern Hemisphere. The anthropogenic signal dominates during the industrial period. It is also found that seasonal and geographical biases may cause multiproxy reconstructions to overestimate the magnitude of the long-term preindustrial cooling trend. In the second case study, the model simulations are compared with a coral δ18O record from the central Pacific Ocean. It is found that greenhouse gases, solar irradiance, and volcanic eruptions all influence the mean state of the central Pacific, but there is no evidence that natural or anthropogenic forcings have any systematic impact on El Niño–Southern Oscillation. The proxy–climate relationship is found to change over time, challenging the assumption of stationarity that underlies the interpretation of paleoclimate proxies. These case studies demonstrate the value of paleoclimate data–model comparison but also highlight the limitations of current techniques and demonstrate the need to develop alternative approaches.
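
In the forward approach mentioned above, model output is first mapped into proxy units. A common first-order pseudoproxy model for coral δ18O is a linear response to SST plus non-climatic noise; the sketch below uses an invented SST series and an assumed slope, purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)
years = np.arange(1500, 2000)

# Hypothetical simulated SST anomalies for a central Pacific grid box (deg C).
sst_anom = 0.3 * np.sin(2 * np.pi * years / 60.0) + rng.normal(0, 0.2, years.size)

# Forward (pseudoproxy) model: coral d18O responds roughly linearly to SST,
# plus non-climatic noise. The slope is a hypothetical illustrative value.
slope_per_degC = -0.2   # per mil per deg C (assumed)
pseudo_d18o = slope_per_degC * sst_anom + rng.normal(0, 0.05, years.size)

# Data-model comparison then correlates the pseudoproxy with the real record;
# a time-varying correlation would indicate a nonstationary proxy-climate link.
print("corr(pseudoproxy, SST) = %.2f" % np.corrcoef(pseudo_d18o, sst_anom)[0, 1])
```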

Relevance:

30.00%

Publisher:

Abstract:

This study compared four alternative approaches (Taylor, Fieller, percentile bootstrap, and bias-corrected bootstrap methods) to estimating confidence intervals (CIs) around a cost-effectiveness (CE) ratio. The study consisted of two components: (1) a Monte Carlo simulation was conducted to identify characteristics of hypothetical cost-effectiveness data sets that might lead one CI estimation technique to outperform another, and these results were matched to the characteristics of (2) an extant data set derived from the National AIDS Demonstration Research (NADR) project. The four methods were used to calculate CIs for this data set, and the results were then compared. The main performance criterion in the simulation study was the percentage of times the estimated CIs contained the "true" CE ratio. A secondary criterion was the average width of the confidence intervals. For the bootstrap methods, bias was also estimated.

Simulation results for the Taylor and Fieller methods indicated that the CIs estimated using the Taylor series method contained the true CE ratio more often than did those obtained using the Fieller method, but the opposite was true when the correlation was positive and the coefficient of variation (CV) of effectiveness was high for each value of the CV of costs. Similarly, the CIs obtained by applying the Taylor series method to the NADR data set were wider than those obtained using the Fieller method for positive correlation values and for values for which the CV of effectiveness was not equal to 30% for each value of the CV of costs.

The general trend for the bootstrap methods was that the percentage of times the true CE ratio was contained in the CIs was higher for the percentile method at higher values of the CV of effectiveness, given the correlation between average costs and effects and the CV of effectiveness. The results for the data set indicated that the bias-corrected CIs were wider than the percentile-method CIs, in accordance with the prediction derived from the simulation experiment.

Generally, the bootstrap methods are more favorable for the parameter specifications investigated in this study. However, the Taylor method is preferred for a low CV of effect, and the percentile method is more favorable for a higher CV of effect.
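
For reference, the percentile bootstrap CI for a CE ratio is simple to implement: resample patients with replacement, recompute the ratio of mean incremental cost to mean incremental effect, and take empirical quantiles. A sketch with hypothetical data (not the NADR data):

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical per-patient incremental costs and effects (illustrative only).
d_cost = rng.normal(1200.0, 400.0, 250)
d_effect = rng.normal(0.05, 0.03, 250)

def percentile_ci(cost, effect, n_boot=5000, alpha=0.05, rng=rng):
    """Percentile-bootstrap CI for mean(cost) / mean(effect):
    resample patients, recompute the ratio, take quantiles."""
    n = cost.size
    ratios = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, n, n)
        ratios[b] = cost[idx].mean() / effect[idx].mean()
    return np.quantile(ratios, [alpha / 2, 1 - alpha / 2])

lo, hi = percentile_ci(d_cost, d_effect)
print("CE ratio: %.0f, 95%% CI (%.0f, %.0f)"
      % (d_cost.mean() / d_effect.mean(), lo, hi))
```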

Relevance:

30.00%

Publisher:

Abstract:

The radar reflectivity of an ice-sheet bed is a primary measurement for discriminating between thawed and frozen beds. Uncertainty in englacial radar attenuation and its spatial variation introduces corresponding uncertainty in estimates of basal reflectivity. Radar attenuation is proportional to ice conductivity, which depends on the concentrations of acid and sea-salt chloride and the temperature of the ice. We synthesize published conductivity measurements to specify an ice-conductivity model and find that some of the dielectric properties of ice at radar frequencies are not yet well constrained. Using depth profiles of ice-core chemistry and borehole temperature and an average of the experimental values for the dielectric properties, we calculate an attenuation rate profile for Siple Dome, West Antarctica. The depth-averaged modeled attenuation rate at Siple Dome (20.0 ± 5.7 dB km⁻¹) is somewhat lower than the value derived from radar profiles (25.3 ± 1.1 dB km⁻¹). Pending more experimental data on the dielectric properties of ice, we can match the modeled and radar-derived attenuation rates by an adjustment to the value for the pure ice conductivity that is within the range of reported values. Alternatively, using the pure ice dielectric properties derived from the most extensive single data set, the modeled depth-averaged attenuation rate is 24.0 ± 2.2 dB km⁻¹. This work shows how to calculate englacial radar attenuation using ice chemistry and temperature data and establishes a basis for mapping spatial variations in radar attenuation across an ice sheet.
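
A sketch of the kind of calculation described, in which conductivity is built from pure-ice, acid, and chloride terms with Arrhenius temperature dependence and then converted to a one-way attenuation rate. The conversion factor follows from the low-loss approximation; the conductivity coefficients below are placeholders, since (as the abstract notes) the experimental values are not yet well constrained.

```python
import numpy as np

K_B = 8.617e-5   # Boltzmann constant, eV/K
T_REF = 251.0    # reference temperature, K

# Placeholder conductivity-model coefficients (assumed values, to be
# replaced by experimental data from the literature).
SIGMA_PURE = 7.0          # pure-ice conductivity at T_REF, microS/m
MU_H, MU_CL = 3.2, 0.43   # molar conductivities, S/m per mol/L
E_PURE, E_H, E_CL = 0.55, 0.20, 0.19   # activation energies, eV

def conductivity(T, acid_molar, chloride_molar):
    """Ice conductivity (microS/m): pure-ice term plus acid and sea-salt
    chloride terms, each with Arrhenius temperature dependence."""
    arr = lambda E: np.exp((E / K_B) * (1.0 / T_REF - 1.0 / T))
    return (SIGMA_PURE * arr(E_PURE)
            + 1e6 * MU_H * acid_molar * arr(E_H)
            + 1e6 * MU_CL * chloride_molar * arr(E_CL))

def attenuation_rate(T, acid_molar, chloride_molar):
    """One-way attenuation rate in dB/km; in the low-loss limit the rate
    is ~0.92 dB/km per microS/m for ice (real permittivity ~3.2)."""
    return 0.92 * conductivity(T, acid_molar, chloride_molar)

# Example: a depth level at -20 C with 1.3 and 0.2 micromolar impurities.
print("%.1f dB/km" % attenuation_rate(253.15, 1.3e-6, 0.2e-6))
```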

Relevance:

30.00%

Publisher:

Abstract:

Stepwise uncertainty reduction (SUR) strategies aim at constructing a sequence of points for evaluating a function f in such a way that the residual uncertainty about a quantity of interest progressively decreases to zero. Using such strategies in the framework of Gaussian process modeling has been shown to be efficient for estimating the volume of excursion of f above a fixed threshold. However, SUR strategies remain cumbersome to use in practice because of their high computational complexity and the fact that they deliver a single point at each iteration. In this article we introduce several multipoint sampling criteria, allowing the selection of batches of points at which f can be evaluated in parallel. Such criteria are of particular interest when f is costly to evaluate and several CPUs are simultaneously available. We also manage to drastically reduce the computational cost of these strategies through the use of closed-form formulas. We illustrate their performance in various numerical experiments, including a nuclear safety test case. Basic notions about kriging, auxiliary problems, complexity calculations, R code, and data are available online as supplementary materials.
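
As a toy illustration of the excursion-set idea (not the authors' multipoint SUR criteria), the sketch below fits a Gaussian process, computes the probability that f exceeds the threshold at each candidate point, and greedily picks a spaced batch of points where that classification is most uncertain.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

f = lambda x: np.sin(3 * x) + 0.5 * x   # toy stand-in for a costly function
threshold = 0.8
rng = np.random.default_rng(7)

X = rng.uniform(0, 4, 6).reshape(-1, 1)  # initial design
y = f(X).ravel()
gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0)).fit(X, y)

grid = np.linspace(0, 4, 400).reshape(-1, 1)
mu, sd = gp.predict(grid, return_std=True)

# Probability that f exceeds the threshold at each grid point; residual
# uncertainty about the excursion volume is driven by points with p near 0.5.
p = norm.cdf((mu - threshold) / np.maximum(sd, 1e-12))
uncertainty = p * (1 - p)

# Greedy stand-in for a multipoint criterion: pick a batch of q points with
# the largest classification uncertainty, spaced out to avoid duplication.
q, batch = 3, []
u = uncertainty.copy()
for _ in range(q):
    i = int(u.argmax())
    batch.append(grid[i, 0])
    u[np.abs(grid[:, 0] - grid[i, 0]) < 0.3] = 0.0   # crude spacing rule
print("next batch to evaluate in parallel:", np.round(batch, 2))
```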