892 results for estimating conditional probabilities
Abstract:
Failure to detect a species in an area where it is present is a major source of error in biological surveys. We assessed whether single-visit biological monitoring surveys of highly dynamic freshwater ecosystems can be optimized by framing them a priori within a particular period of time. Alternatively, we searched for the optimal number of visits and the times at which they should be conducted. We developed single-species occupancy models to estimate the monthly probability of detection of pond-breeding amphibians during a four-year monitoring program. Our results revealed that detection probability was species-specific and varied among sampling visits within a breeding season and also among breeding seasons. Optimizing biological surveys with minimal survey effort (a single visit) is therefore not feasible, as it proves impossible to select a priori an adequate sampling period that remains robust across years. Alternatively, a two-survey combination at the beginning of the sampling season yielded optimal results and constituted an acceptable compromise between sampling efficacy and survey effort. Our study provides evidence of the variability and uncertainty that likely affect the efficacy of monitoring surveys, highlighting the need for repeated sampling in both ecological studies and conservation management.
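The trade-off between survey effort and detection can be sketched with the standard occupancy-model identity: the probability of detecting a present species at least once in k visits is 1 − (1 − p)^k. A minimal illustration; the per-visit probability below is hypothetical, not a value from the study:

```python
def detection_reliability(p, k):
    """Probability of detecting a species at least once in k visits,
    given presence and a constant per-visit detection probability p."""
    return 1.0 - (1.0 - p) ** k

# Hypothetical per-visit detection probability (not from the study)
p_visit = 0.6

one_visit = detection_reliability(p_visit, 1)   # 0.6
two_visits = detection_reliability(p_visit, 2)  # 1 - 0.4**2 = 0.84
# A second visit raises the chance of detecting a present species from
# 60% to 84%; the study's point is that p itself shifts across visits
# and years, so no single a priori visit date stays reliable.
```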
Abstract:
BACKGROUND: The strength of the association between intensive care unit (ICU)-acquired nosocomial infections (NIs) and mortality might differ according to the methodological approach taken. OBJECTIVE: To assess the association between ICU-acquired NIs and mortality using the concept of population-attributable fraction (PAF) for patient deaths caused by ICU-acquired NIs in a large cohort of critically ill patients. SETTING: Eleven ICUs of a French university hospital. DESIGN: We analyzed surveillance data on ICU-acquired NIs collected prospectively during the period from 1995 through 2003. The primary outcome was mortality from ICU-acquired NI stratified by site of infection. A matched-pair, case-control study was performed. Each patient who died before ICU discharge was defined as a case patient, and each patient who survived to ICU discharge was defined as a control patient. The PAF was calculated after adjustment for confounders by use of conditional logistic regression analysis. RESULTS: Among 8,068 ICU patients, a total of 1,725 deceased patients were successfully matched with 1,725 control patients. The adjusted PAF due to ICU-acquired NI for patients who died before ICU discharge was 14.6% (95% confidence interval [CI], 14.4%-14.8%). Stratified by the type of infection, the PAF was 6.1% (95% CI, 5.7%-6.5%) for pulmonary infection, 3.2% (95% CI, 2.8%-3.5%) for central venous catheter infection, 1.7% (95% CI, 0.9%-2.5%) for bloodstream infection, and 0.0% (95% CI, -0.4% to 0.4%) for urinary tract infection. CONCLUSIONS: ICU-acquired NI had an important effect on mortality. However, the statistical association between ICU-acquired NI and mortality tended to be less pronounced in findings based on the PAF than in study findings based on estimates of relative risk. Therefore, the choice of methods does matter when the burden of NI needs to be assessed.
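For a case-control design such as this one, the PAF is commonly computed from the exposure prevalence among cases and the adjusted odds ratio via Miettinen's case-based formula, PAF = p_c(OR − 1)/OR. A minimal sketch with hypothetical inputs, not the study's data:

```python
def attributable_fraction(p_exposed_cases, odds_ratio):
    """Miettinen's case-based formula: PAF = p_c * (OR - 1) / OR,
    where p_c is the exposure prevalence among cases (here, the share
    of deceased patients with an ICU-acquired NI) and OR is the
    confounder-adjusted odds ratio standing in for the relative risk."""
    return p_exposed_cases * (odds_ratio - 1.0) / odds_ratio

# Hypothetical inputs chosen for illustration only
paf = attributable_fraction(0.20, 3.7)   # ~0.146
```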
Abstract:
This paper describes a methodology to estimate the coefficients, test specification hypotheses, and conduct policy exercises in multi-country VAR models with cross-unit interdependencies, unit-specific dynamics, and time variation in the coefficients. The framework of analysis is Bayesian: a prior flexibly reduces the dimensionality of the model and puts structure on the time variations; MCMC methods are used to obtain posterior distributions; and marginal likelihoods are used to check the fit of various specifications. Impulse responses and conditional forecasts are obtained from the output of the MCMC routine. The transmission of certain shocks across countries is analyzed.
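Abstracting from the Bayesian, time-varying setup, the mechanics of cross-country shock transmission can be illustrated with a constant-coefficient VAR(1), where the impulse response at horizon h is A^h applied to the initial shock. A toy sketch with made-up coefficients:

```python
def impulse_responses(A, shock, horizon):
    """Impulse responses of a VAR(1) y_t = A y_{t-1} + e_t to a one-off
    shock e_0: the response at horizon h is A^h applied to the shock."""
    n = len(A)
    resp = [list(shock)]
    for _ in range(horizon):
        prev = resp[-1]
        resp.append([sum(A[i][j] * prev[j] for j in range(n))
                     for i in range(n)])
    return resp

# Two "countries"; the off-diagonal terms are the cross-unit links
A = [[0.5, 0.2],
     [0.1, 0.4]]
irf = impulse_responses(A, [1.0, 0.0], 3)
# irf[1] == [0.5, 0.1]: the unit-1 shock spills over into unit 2
```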
Abstract:
Aim: The imperfect detection of species may lead to erroneous conclusions about species-environment relationships. Accuracy in species detection usually requires temporal replication at sampling sites, a time-consuming and costly monitoring scheme. Here, we applied a lower-cost alternative based on a double-sampling approach to incorporate the reliability of species detection into regression-based species distribution modelling.
Location: Doñana National Park (south-western Spain).
Methods: Using species-specific monthly detection probabilities, we estimated the detection reliability as the probability of having detected the species given the species-specific survey time. Such reliability estimates were used to account explicitly for data uncertainty by weighting each absence. We illustrated how this novel framework can be used to evaluate four competing hypotheses as to what constitutes primary environmental control of amphibian distribution: breeding habitat, aestivating habitat, spatial distribution of surrounding habitats and/or major ecosystems zonation. The study was conducted on six pond-breeding amphibian species during a 4-year period.
Results: Non-detections should not be considered equivalent to real absences, as their reliability varied considerably. The occurrence of Hyla meridionalis and Triturus pygmaeus was related to a particular major ecosystem of the study area, where suitable habitat for these species seemed to be widely available. Characteristics of the breeding habitat (area and hydroperiod) were of high importance for the occurrence of Pelobates cultripes and Pleurodeles waltl. Terrestrial characteristics were the most important predictors of the occurrence of Discoglossus galganoi and Lissotriton boscai, along with the spatial distribution of breeding habitats for the last species.
Main conclusions: We did not find a single best supported hypothesis valid for all species, which stresses the importance of multiscale and multifactor approaches.
More importantly, this study shows that estimating the reliability of non-detection records, an exercise that had been previously seen as a naïve goal in species distribution modelling, is feasible and could be promoted in future studies, at least in comparable systems.
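A minimal sketch of the weighting idea: a non-detection record is down-weighted by the probability that a present species would have been detected over the survey time actually spent, here assuming a constant monthly detection probability (the numbers are hypothetical):

```python
def absence_weight(p_month, n_surveys):
    """Reliability of a non-detection record: the probability that a
    present species would have been detected at least once over the
    species-specific survey time, assuming a constant monthly
    detection probability p_month."""
    return 1.0 - (1.0 - p_month) ** n_surveys

# A non-detection after 3 monthly surveys at p_month = 0.4 enters the
# distribution model as an absence with weight 1 - 0.6**3 = 0.784,
# rather than as a certain absence.
w = absence_weight(0.4, 3)
```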
Abstract:
A statewide study was performed to develop regional regression equations for estimating selected annual exceedance-probability statistics for ungaged stream sites in Iowa. The study area comprises streamgages located within Iowa and 50 miles beyond the State's borders. Annual exceedance-probability estimates were computed for 518 streamgages by using the expected moments algorithm to fit a Pearson Type III distribution to the logarithms of annual peak discharges for each streamgage using annual peak-discharge data through 2010. The estimation of the selected statistics included a Bayesian weighted least-squares/generalized least-squares regression analysis to update regional skew coefficients for the 518 streamgages. Low-outlier and historic information were incorporated into the annual exceedance-probability analyses, and a generalized Grubbs-Beck test was used to detect multiple potentially influential low flows. Also, geographic information system software was used to measure 59 selected basin characteristics for each streamgage. Regional regression analysis, using generalized least-squares regression, was used to develop a set of equations for each flood region in Iowa for estimating discharges for ungaged stream sites with 50-, 20-, 10-, 4-, 2-, 1-, 0.5-, and 0.2-percent annual exceedance probabilities, which are equivalent to annual flood-frequency recurrence intervals of 2, 5, 10, 25, 50, 100, 200, and 500 years, respectively. A total of 394 streamgages were included in the development of regional regression equations for three flood regions (regions 1, 2, and 3) that were defined for Iowa based on landform regions and soil regions. Average standard errors of prediction range from 31.8 to 45.2 percent for flood region 1, 19.4 to 46.8 percent for flood region 2, and 26.5 to 43.1 percent for flood region 3.
The pseudo coefficients of determination for the generalized least-squares equations range from 90.8 to 96.2 percent for flood region 1, 91.5 to 97.9 percent for flood region 2, and 92.4 to 96.0 percent for flood region 3. The regression equations are applicable only to stream sites in Iowa with flows not significantly affected by regulation, diversion, channelization, backwater, or urbanization and with basin characteristics within the range of those used to develop the equations. These regression equations will be implemented within the U.S. Geological Survey StreamStats Web-based geographic information system tool. StreamStats allows users to click on any ungaged site on a river and compute estimates of the eight selected statistics; in addition, 90-percent prediction intervals and the measured basin characteristics for the ungaged sites are also provided by the Web-based tool. StreamStats also allows users to click on any streamgage in Iowa and obtain the estimates of the same eight statistics computed for that streamgage.
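The frequency-analysis step can be sketched in simplified form: fit a Pearson Type III distribution to the log10 annual peaks by the method of moments (a plain stand-in for the expected moments algorithm used in the study) and evaluate quantiles with the Wilson-Hilferty frequency-factor approximation. The peak data and the hardcoded normal quantiles below are illustrative assumptions:

```python
import math
import statistics

def frequency_factor(z, skew):
    """Wilson-Hilferty approximation to the Pearson Type III frequency
    factor K for standard-normal quantile z and skew coefficient g:
    K = (2/g) * ((1 + g*z/6 - g^2/36)^3 - 1)."""
    if abs(skew) < 1e-9:
        return z
    term = 1.0 + skew * z / 6.0 - (skew * skew) / 36.0
    return (2.0 / skew) * (term ** 3 - 1.0)

def lp3_quantile(peaks, exceed_prob, z_table):
    """Discharge with the given annual exceedance probability from a
    log-Pearson Type III fit by the method of moments on log10 peaks."""
    logs = [math.log10(q) for q in peaks]
    m = statistics.mean(logs)
    s = statistics.stdev(logs)
    n = len(logs)
    g = (n * sum((x - m) ** 3 for x in logs)) / ((n - 1) * (n - 2) * s ** 3)
    return 10.0 ** (m + frequency_factor(z_table[exceed_prob], g) * s)

# Standard-normal quantiles for selected exceedance probabilities
Z = {0.5: 0.0, 0.01: 2.3263}

# Hypothetical annual peak discharges (cubic feet per second)
peaks = [3200, 4100, 2800, 5600, 3900, 4700, 3100, 6200, 2500, 4300]
q100 = lp3_quantile(peaks, 0.01, Z)   # the "100-year" flood estimate
```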
Abstract:
We introduce a new class of bivariate distributions of Marshall-Olkin type, the bivariate Erlang distribution. The Laplace transform, the moments, and the conditional densities are obtained. Potential applications in life insurance and finance are considered. Maximum likelihood estimators of the parameters are computed via the Expectation-Maximization algorithm. The research project is then devoted to the study of multivariate risk processes, which are useful for studying ruin problems of insurance companies with dependent classes of business. We apply results from the theory of piecewise deterministic Markov processes to obtain the exponential martingales needed to establish computable upper bounds for the ruin probability, whose exact expressions are intractable.
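The Marshall-Olkin construction underlying this class can be illustrated by simulation: each component fails at the earlier of its own shock and a common shock, which induces positive dependence and a positive probability of simultaneous failure. A sketch using the exponential (rather than Erlang) case, with arbitrary rates:

```python
import random

def marshall_olkin_pair(l1, l2, l12, rng):
    """One draw from the Marshall-Olkin bivariate exponential: each
    component fails at the first of its own shock (rates l1, l2) or a
    common shock (rate l12)."""
    e1 = rng.expovariate(l1)
    e2 = rng.expovariate(l2)
    e12 = rng.expovariate(l12)
    return min(e1, e12), min(e2, e12)

rng = random.Random(42)
pairs = [marshall_olkin_pair(1.0, 2.0, 0.5, rng) for _ in range(10000)]
# The common shock makes P(X == Y) > 0 -- the singular component that
# characterizes Marshall-Olkin-type distributions. Marginally, X is
# exponential with rate l1 + l12 (mean 1/1.5 here).
```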
Abstract:
The crisis that broke out in the US mortgage market in 2008 and spread throughout the entire financial system exposed the degree of interconnection that currently exists among financial institutions and their links with the productive sector, making clear the need to identify and characterize the systemic risk inherent in the system, so that regulators can pursue stability both for individual institutions and for the system as a whole. This paper shows, through a model that combines the informative power of networks with a spatial autoregressive (panel-type) model, the importance of adding to the micro-prudential approach (proposed in Basel II) a variable that captures the effect of being connected to other institutions, thereby carrying out a macro-prudential analysis (proposed in Basel III).
Abstract:
We investigate the effect of education Conditional Cash Transfer programs (CCTs) on teenage pregnancy. Our main concern is with how the size and sign of the effect may depend on the design of the program. Using a simple model, we show that an education CCT that conditions renewal on school performance reduces teenage pregnancy; the program can increase teenage pregnancy if it does not condition on school performance. Then, using an original database, we estimate the causal impact on teenage pregnancy of two education CCTs implemented in Bogotá (Subsidio Educativo, SE, and Familias en Acción, FA); the two programs differ particularly in whether school success is a condition for renewal. We show that SE has a negative average effect on teenage pregnancy while FA has a null average effect. We also find that SE has either null or negative effects for adolescents across age and grade groups, while FA has positive, null, or negative effects for adolescents in different age and grade groups. Since SE conditions renewal on school success and FA does not, we argue that the empirical results are consistent with the predictions of our model and that conditioning renewal of the subsidy on school success crucially determines the effect of the subsidy on teenage pregnancy.
Abstract:
The paper concerns the design and analysis of serial dilution assays to estimate the infectivity of a sample of tissue when it is assumed that the sample contains a finite number of indivisible infectious units such that a subsample will be infectious if it contains one or more of these units. The aim of the study is to estimate the number of infectious units in the original sample. The standard approach to the analysis of data from such a study is based on the assumption of independence of aliquots both at the same dilution level and at different dilution levels, so that the numbers of infectious units in the aliquots follow independent Poisson distributions. An alternative approach is based on calculation of the expected value of the total number of samples tested that are not infectious. We derive the likelihood for the data on the basis of the discrete number of infectious units, enabling calculation of the maximum likelihood estimate and likelihood-based confidence intervals. We use the exact probabilities that are obtained to compare the maximum likelihood estimate with those given by the other methods in terms of bias and standard error and to compare the coverage of the confidence intervals. We show that the methods have very similar properties and conclude that for practical use the method that is based on the Poisson assumption is to be recommended, since it can be implemented by using standard statistical software. Finally we consider the design of serial dilution assays, concluding that it is important that neither the dilution factor nor the number of samples that remain untested should be too large.
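Under the Poisson assumption, an aliquot containing a fraction f of the original sample is infectious with probability 1 − e^(−λf), where λ is the number of infectious units in the sample; the MLE can then be found by a one-dimensional search over λ. A sketch with hypothetical assay counts, not data from the paper:

```python
import math

def loglik(lam, data):
    """Poisson-assumption log-likelihood for serial dilution data.
    data: list of (fraction_of_sample_per_aliquot, n_tested, n_infectious);
    each aliquot is infectious with probability 1 - exp(-lam * f)."""
    ll = 0.0
    for f, n, y in data:
        p = 1.0 - math.exp(-lam * f)
        if y > 0:
            ll += y * math.log(p)
        ll += (n - y) * (-lam * f)   # log(1 - p) = -lam * f
    return ll

def mle(data, lo=0.01, hi=1000.0, iters=200):
    """Golden-section search for the MLE of lam; the log-likelihood is
    concave in lam, so the search converges to the global maximum."""
    phi = (math.sqrt(5.0) - 1.0) / 2.0
    a, b = lo, hi
    for _ in range(iters):
        c, d = b - phi * (b - a), a + phi * (b - a)
        if loglik(c, data) < loglik(d, data):
            a = c
        else:
            b = d
    return 0.5 * (a + b)

# Hypothetical assay: 8 aliquots per level, 10-fold serial dilutions
data = [(0.1, 8, 8), (0.01, 8, 5), (0.001, 8, 1)]
lam_hat = mle(data)
```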
Abstract:
Internal risk management models of the kind popularized by J. P. Morgan are now used widely by the world’s most sophisticated financial institutions as a means of measuring risk. Using the returns on three of the most popular futures contracts on the London International Financial Futures Exchange, in this paper we investigate the possibility of using multivariate generalized autoregressive conditional heteroscedasticity (GARCH) models for the calculation of minimum capital risk requirements (MCRRs). We propose a method for the estimation of the value at risk of a portfolio based on a multivariate GARCH model. We find that the consideration of the correlation between the contracts can lead to more accurate, and therefore more appropriate, MCRRs compared with the values obtained from a univariate approach to the problem.
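The portfolio-level calculation can be sketched as parametric VaR from a conditional covariance forecast, such as the one-step-ahead matrix a multivariate GARCH model would supply (the covariance numbers here are hypothetical):

```python
import math

def portfolio_var(weights, cov, z=1.645, value=1.0):
    """Parametric VaR: z * sqrt(w' Sigma w) * value, where Sigma is a
    conditional covariance matrix of returns, e.g. the one-step-ahead
    forecast from a multivariate GARCH model."""
    n = len(weights)
    var_p = sum(weights[i] * cov[i][j] * weights[j]
                for i in range(n) for j in range(n))
    return z * math.sqrt(var_p) * value

# Hypothetical one-day covariance forecast for two futures contracts
cov = [[0.0004, 0.0001],
       [0.0001, 0.0009]]
w = [0.5, 0.5]

multivariate = portfolio_var(w, cov)                 # joint calculation
univariate = (portfolio_var([0.5], [[0.0004]])
              + portfolio_var([0.5], [[0.0009]]))    # ignores correlation
# With correlation below 1 the joint requirement is smaller, which is
# the sense in which accounting for correlation between contracts
# yields more appropriate MCRRs.
```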
Abstract:
A crucial aspect of evidential reasoning in crime investigation involves comparing the support that evidence provides for alternative hypotheses. Recent work in forensic statistics has shown how Bayesian Networks (BNs) can be employed for this purpose. However, the specification of BNs requires conditional probability tables describing the uncertain processes under evaluation. When these processes are poorly understood, it is necessary to rely on subjective probabilities provided by experts. Accurate probabilities of this type are normally hard to acquire from experts. Recent work in qualitative reasoning has developed methods to perform probabilistic reasoning using coarser representations. However, the latter types of approaches are too imprecise to compare the likelihood of alternative hypotheses. This paper examines this shortcoming of the qualitative approaches when applied to the aforementioned problem, and identifies and integrates techniques to refine them.
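At the core of the BN comparison is Bayes' rule applied to expert-supplied conditional probability tables. A deliberately tiny two-node sketch, with hypothetical CPT entries invented for illustration:

```python
def posterior(prior_h, p_e_given_h, p_e_given_not_h):
    """Bayes' rule on a two-node network H -> E; the two conditional
    probabilities of E are exactly the CPT entries an expert supplies."""
    num = prior_h * p_e_given_h
    return num / (num + (1.0 - prior_h) * p_e_given_not_h)

# Hypothetical CPT entries: H = "suspect is the source of the trace",
# E = "forensic match observed"
p = posterior(prior_h=0.01, p_e_given_h=0.95, p_e_given_not_h=0.001)
lr = 0.95 / 0.001
# The likelihood ratio (~950) measures how strongly the evidence
# favours H over its alternative; the posterior also depends on the
# prior, which is where imprecise subjective inputs bite.
```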
Abstract:
Estimating the parameters of the instantaneous spot interest rate process is of crucial importance for pricing fixed-income derivative securities. This paper presents an estimation procedure for the parameters of the Gaussian interest rate model for pricing fixed-income derivatives based on the term structure of volatility. We estimate the term structure of volatility for US Treasury rates for the period 1983-1995, based on a history of yield curves. We estimate both conditional and first-difference term structures of volatility and subsequently estimate the implied parameters of the Gaussian model with non-linear least squares estimation. Results for bond options illustrate the effects of differing parameters on pricing.
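One common Gaussian (Vasicek-type) specification implies a zero-coupon yield volatility of σ(1 − e^(−κT))/(κT), whose parameters can be backed out from an observed volatility term structure by nonlinear least squares. A sketch using a crude grid search and synthetic data; this specific functional form is an assumption for illustration, not necessarily the paper's:

```python
import math

def model_vol(T, sigma, kappa):
    """Zero-coupon yield volatility under a one-factor Gaussian
    (Vasicek-type) model: sigma * (1 - exp(-kappa*T)) / (kappa*T)."""
    return sigma * (1.0 - math.exp(-kappa * T)) / (kappa * T)

def fit(maturities, vols):
    """Crude nonlinear least squares: grid search over (sigma, kappa)."""
    best = None
    for i in range(1, 201):          # sigma in (0, 0.05]
        sigma = i * 0.00025
        for j in range(1, 201):      # kappa in (0, 1.0]
            kappa = j * 0.005
            sse = sum((model_vol(T, sigma, kappa) - v) ** 2
                      for T, v in zip(maturities, vols))
            if best is None or sse < best[0]:
                best = (sse, sigma, kappa)
    return best[1], best[2]

# Synthetic "observed" volatility term structure at chosen maturities
Ts = [0.5, 1, 2, 5, 10]
vols = [model_vol(T, 0.015, 0.25) for T in Ts]
sigma_hat, kappa_hat = fit(Ts, vols)   # recovers 0.015 and 0.25
```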
Abstract:
The goal of this paper is to present a comprehensive empirical analysis of the return and conditional variance of four Brazilian financial series using models of the ARCH class. Selected models are then compared regarding forecasting accuracy and goodness-of-fit statistics. To aid understanding of the empirical results, a self-contained theoretical discussion of ARCH models is also presented in a way that is useful for the applied researcher. Empirical results show that although all series exhibit ARCH effects and are leptokurtic relative to the Normal, the return on the US$ clearly displays regime switching and no asymmetry in the variance, the return on COCOA has no asymmetry, while the returns on the CBOND and TELEBRAS show clear signs of asymmetry favoring the leverage effect. Regarding forecasting, the best model overall was the EGARCH(1,1) in its Gaussian version. Regarding goodness-of-fit statistics, the SWARCH model did well, followed closely by the Student-t GARCH(1,1).
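The workhorse of this model class is the GARCH(1,1) conditional-variance recursion h_t = ω + α r²_{t−1} + β h_{t−1}. A minimal sketch with made-up returns and parameters:

```python
def garch_filter(returns, omega, alpha, beta):
    """Conditional variance path of a GARCH(1,1):
    h_t = omega + alpha * r_{t-1}^2 + beta * h_{t-1},
    initialized at the unconditional variance omega / (1 - alpha - beta)."""
    h = [omega / (1.0 - alpha - beta)]
    for r in returns[:-1]:
        h.append(omega + alpha * r * r + beta * h[-1])
    return h

def forecast(last_r, last_h, omega, alpha, beta):
    """One-step-ahead conditional variance forecast."""
    return omega + alpha * last_r * last_r + beta * last_h

# Hypothetical daily returns and parameters (illustration only)
rets = [0.01, -0.03, 0.02, 0.005, -0.015]
h = garch_filter(rets, omega=1e-5, alpha=0.1, beta=0.85)
h_next = forecast(rets[-1], h[-1], 1e-5, 0.1, 0.85)
```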
Abstract:
This paper proposes a two-step procedure to back out the conditional alpha of a given stock using high-frequency data. We first estimate the realized factor loadings of the stocks, and then retrieve their conditional alphas by estimating the conditional expectation of their risk-adjusted returns. We start with the underlying continuous-time stochastic process that governs the dynamics of every stock price and then derive the conditions under which we may consistently estimate the daily factor loadings and the resulting conditional alphas. We also contribute empirically to the conditional CAPM literature by examining the main drivers of the conditional alphas of the S&P 100 index constituents from January 2001 to December 2008. In addition, to confirm whether these conditional alphas indeed relate to pricing errors, we assess the performance of both cross-sectional and time-series momentum strategies based on the conditional alpha estimates. The findings are very promising in that these strategies not only perform well in both absolute and relative terms, but also exhibit virtually no systematic exposure to the usual risk factors (namely, market, size, value, and momentum portfolios).
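The first step can be sketched with the usual realized-beta estimator, the ratio of realized covariance to realized market variance over a day's intraday returns; the conditional alpha is then the average risk-adjusted return. The returns below are hypothetical, and this single-factor version is a simplification of the paper's multi-factor setup:

```python
def realized_beta(stock, market):
    """Realized factor loading from intraday returns:
    beta = sum(r_s * r_m) / sum(r_m ** 2)."""
    num = sum(s * m for s, m in zip(stock, market))
    den = sum(m * m for m in market)
    return num / den

def conditional_alpha(stock, market, beta):
    """Average risk-adjusted return once the realized loading is removed."""
    n = len(stock)
    return sum(s - beta * m for s, m in zip(stock, market)) / n

# Hypothetical intraday returns for a stock and the market factor
mkt = [0.001, -0.002, 0.0015, 0.0005, -0.001]
stk = [0.0013, -0.0022, 0.0020, 0.0004, -0.0008]
b = realized_beta(stk, mkt)
a = conditional_alpha(stk, mkt, b)
```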
Abstract:
The estimation of labor supply elasticities has been an important issue in the economic literature. Yet previous work has estimated only conditional mean labor supply functions. The objective of this paper is to obtain more information on labor supply by estimating the conditional quantile labor supply function. We use a sample of prime-age urban male employees in Brazil. Two-stage estimators are used, as the net wage and virtual income are found to be endogenous to the model. Contrary to previous works using conditional mean estimators, it is found that labor supply elasticities vary significantly and asymmetrically across hours of work. While the income and wage elasticities at the standard work week are zero, for those working longer hours the elasticities are negative.
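The principle behind conditional quantile estimation is that the τ-quantile minimizes the check (pinball) loss; adding covariates turns this into quantile regression. A covariate-free sketch with made-up weekly hours:

```python
def pinball_loss(q, ys, tau):
    """Check (pinball) loss of a candidate quantile q at level tau:
    tau * (y - q) for y above q, (1 - tau) * (q - y) for y below."""
    return sum(tau * (y - q) if y >= q else (1.0 - tau) * (q - y)
               for y in ys)

def sample_quantile(ys, tau):
    """The tau-quantile minimizes the pinball loss; over a finite
    sample the minimum is attained at one of the observations."""
    return min(ys, key=lambda q: pinball_loss(q, ys, tau))

# Hypothetical weekly hours of work
hours = [20, 35, 40, 40, 44, 48, 55, 60]
median = sample_quantile(hours, 0.5)   # the standard work week
q90 = sample_quantile(hours, 0.9)      # the long-hours tail
# Quantile regression replaces the candidate constant q with a linear
# function of covariates (here, net wage and virtual income) and
# minimizes the same loss, so elasticities can differ across quantiles.
```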