888 results for Paternal uncertainty


Relevance: 20.00%

Abstract:

In this essay, we explore an issue of moral uncertainty: what we are permitted to do when we are unsure about which moral principles are correct. We develop a novel approach to this issue that incorporates important insights from previous work on moral uncertainty, while avoiding some of the difficulties that beset existing alternative approaches. Our approach is based on evaluating and choosing between option sets rather than particular conduct options. We show how our approach is particularly well-suited to address this issue of moral uncertainty with respect to agents that have credence in moral theories that are not fully consequentialist.

Relevance: 20.00%

Abstract:

Attending to stimuli that share perceptual similarity to learned threats is an adaptive strategy. However, prolonged threat generalization to cues signalling safety is considered a core feature of pathological anxiety. One potential factor that may sustain over-generalization is sensitivity to future threat uncertainty. To assess the extent to which Intolerance of Uncertainty (IU) predicts threat generalization, we recorded skin conductance in 54 healthy participants during an associative learning paradigm, where threat and safety cues varied in perceptual similarity. Lower IU was associated with stronger discrimination between threat and safety cues during acquisition and extinction. Higher IU, however, was associated with generalized responding to threat and safety cues during acquisition, and delayed discrimination between threat and safety cues during extinction. These results were specific to IU, over and above other measures of anxious disposition. These findings highlight: (1) a critical role of uncertainty-based mechanisms in threat generalization, and (2) IU as a potential risk factor for anxiety disorder development.

Relevance: 20.00%

Abstract:

This study tests predictions of the hypothesis of evolution of paternal care via sexual selection by using the Neotropical harvestman Pseudopucrolia sp. as the model organism. Females use natural cavities in roadside banks as nesting sites, which are defended by males against other males. Females leave the nests after oviposition, and all postzygotic parental care is accomplished by males, which protect the eggs and nymphs from predators. We provided artificial mud nests to individuals in the laboratory and conducted observations on the reproduction of the species. Male reproductive success was directly related to nest ownership time: the longer a male held a nest, the higher his chances of obtaining copulations. All males that succeeded in mating and obtaining one clutch eventually mated with additional females that added eggs to the clutch. Thus, desirable males were not limited to monogamy by paternal care. Experimental manipulations demonstrated that guarding males were more attractive to females than were nonguarding males and also that males guarded unrelated eggs. Finally, we found that females and nonguarding males spent more time foraging than guarding males. We use our data to contrast hypotheses on the origin and maintenance of paternal care and to provide a critical assessment of the hypothesis of the evolution of paternal care via sexual selection. (C) 2009 The Association for the Study of Animal Behaviour. Published by Elsevier Ltd. All rights reserved.

Relevance: 20.00%

Abstract:

We provide necessary and sufficient conditions for states to have an arbitrarily small uncertainty product of the azimuthal angle phi and its canonically conjugate angular momentum L_z. We illustrate our results with analytical examples.
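
The claim can be probed numerically. The sketch below (our own illustration, not the paper's construction) computes the uncertainty product for a state on the circle expanded in angular-momentum eigenstates; a single L_z eigenstate already has a vanishing product, which is why the naive Heisenberg bound fails for periodic variables.

```python
import numpy as np

# Illustrative computation (not the paper's construction): the uncertainty
# product Delta(phi) * Delta(L_z) (hbar = 1) for a state on the circle,
# expanded in angular-momentum eigenstates e^{i m phi} with amplitudes c_m.
M = np.arange(-50, 51)                     # angular-momentum quantum numbers

def uncertainty_product(c):
    """Return Delta(phi) * Delta(L_z) for amplitudes c_m over e^{i m phi}."""
    c = np.asarray(c, dtype=complex)
    c = c / np.linalg.norm(c)
    # L_z statistics follow directly from the |c_m|^2 distribution.
    p_m = np.abs(c) ** 2
    var_L = np.sum(p_m * M**2) - np.sum(p_m * M) ** 2
    # phi statistics from |psi(phi)|^2 on a fine grid over [0, 2*pi).
    phi = np.linspace(0.0, 2 * np.pi, 4096, endpoint=False)
    dphi = phi[1] - phi[0]
    psi = (c @ np.exp(1j * np.outer(M, phi))) / np.sqrt(2 * np.pi)
    rho = np.abs(psi) ** 2
    rho = rho / (np.sum(rho) * dphi)
    mean_phi = np.sum(rho * phi) * dphi
    var_phi = np.sum(rho * phi**2) * dphi - mean_phi**2
    return float(np.sqrt(var_phi * var_L))

# A single L_z eigenstate has Delta(L_z) = 0, so the product is exactly zero:
# the periodicity of phi invalidates the naive Heisenberg argument.
eigenstate = (M == 3).astype(float)
print(uncertainty_product(eigenstate))     # 0.0
```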

Relevance: 20.00%

Abstract:

We show that single and multislit experiments involving matter waves may be constructed to assess dispersively generated correlations between the position and momentum of a single free particle. These correlations give rise to position-dependent phases which develop dynamically as a result of dispersion and may play an important role in the interference patterns. To the extent that initial transverse coherence is preserved throughout the proposed diffraction setup, such interference patterns are noticeably different from those of a classical dispersion-free wave. (c) 2007 Published by Elsevier B.V.
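
The dispersive build-up of the correlation is easy to reproduce numerically. The sketch below (our own illustration, not the paper's slit setup) evolves a free Gaussian wave packet with the exact momentum-space propagator and tracks the symmetric covariance Cov(x, p) = <{x,p}>/2 - <x><p>, which starts at zero and grows as sigma_p^2 * t / m.

```python
import numpy as np

# Free Gaussian wave packet: dispersion alone generates an x-p correlation.
hbar = m = 1.0
N, L = 2048, 80.0
dx = L / N
x = (np.arange(N) - N // 2) * dx
k = 2 * np.pi * np.fft.fftfreq(N, d=dx)

sigma0 = 1.0
psi0 = (2 * np.pi * sigma0**2) ** -0.25 * np.exp(-x**2 / (4 * sigma0**2))

def evolve(psi, t):
    """Free evolution via the exact momentum-space propagator."""
    return np.fft.ifft(np.exp(-1j * hbar * k**2 * t / (2 * m)) * np.fft.fft(psi))

def xp_covariance(psi):
    """Symmetric covariance <{x,p}>/2 - <x><p> of the state."""
    psi = psi / np.sqrt(np.sum(np.abs(psi)**2) * dx)
    p_psi = np.fft.ifft(hbar * k * np.fft.fft(psi))    # momentum operator
    mean_x = np.sum(np.abs(psi)**2 * x) * dx
    mean_p = np.real(np.sum(np.conj(psi) * p_psi) * dx)
    sym_xp = np.real(np.sum(np.conj(psi) * x * p_psi) * dx)
    return sym_xp - mean_x * mean_p

t = 2.0
cov_t = xp_covariance(evolve(psi0, t))
# Analytic value: sigma_p^2 * t / m = (hbar / (2 * sigma0))^2 * t / m = 0.5
```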

Relevance: 20.00%

Abstract:

Random effect models have been widely applied in many fields of research. However, models with uncertain design matrices for the random effects have received little attention. In some applications with this problem, an expectation method has been used for simplicity, but this method discards the extra information carried by the uncertainty in the design matrix. A closed-form solution to this problem is generally difficult to attain. We therefore propose a two-step algorithm for estimating the parameters, especially the variance components in the model. The implementation is based on Monte Carlo approximation and a Newton-Raphson-based EM algorithm. As an example, a simulated genetics dataset was analyzed. The results showed that the proportion of the total variance explained by the random effects was accurately estimated, whereas it was highly underestimated by the expectation method. By introducing heuristic search and optimization methods, the algorithm can possibly be developed to infer the 'model-based' best design matrix and the corresponding best estimates.
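
The difference between the two objectives can be made concrete in a toy version of the problem (our own sketch, not the paper's full algorithm): for y = Z u + e with u ~ N(0, s2u I) and a set of candidate designs with known probabilities, the expectation method plugs E[Z] into the Gaussian likelihood, whereas the correct marginal likelihood averages the likelihood itself over the designs (here by exact mixture, in general by Monte Carlo).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: 8 candidate binary design matrices, equally probable.
n, q = 40, 4
s2u, s2e = 2.0, 1.0
candidates = [rng.integers(0, 2, size=(n, q)).astype(float) for _ in range(8)]
probs = np.full(8, 1 / 8)                  # prior over candidate designs
y = candidates[0] @ rng.normal(0, np.sqrt(s2u), q) + rng.normal(0, np.sqrt(s2e), n)

def loglik(y, Z, s2u, s2e):
    """Gaussian marginal log-likelihood of y for a given design Z."""
    V = s2u * Z @ Z.T + s2e * np.eye(len(y))
    _, logdet = np.linalg.slogdet(V)
    return -0.5 * (logdet + y @ np.linalg.solve(V, y) + len(y) * np.log(2 * np.pi))

# Expectation method: plug the averaged design E[Z] into the likelihood.
ll_expect = loglik(y, sum(p * Z for p, Z in zip(probs, candidates)), s2u, s2e)

# Mixture (Monte Carlo in general): average the *likelihood* over designs.
ll_mixture = np.log(sum(p * np.exp(loglik(y, Z, s2u, s2e))
                        for p, Z in zip(probs, candidates)))
# The two objectives generally differ; maximizing the wrong one is what
# biases the variance-component estimates under the expectation method.
```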

Relevance: 20.00%

Abstract:

This paper presents techniques of likelihood prediction for generalized linear mixed models. Methods of likelihood prediction are explained through a series of examples, from a classical one to more complicated ones. The examples show, in simple cases, that likelihood prediction (LP) coincides with already known best frequentist practice, such as the best linear unbiased predictor. The paper outlines a way to deal with covariate uncertainty while producing predictive inference. Using a Poisson error-in-variable generalized linear model, it is shown that in complicated cases LP produces better results than already known methods.
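
In its simplest profile form, likelihood prediction can be sketched for a plain Poisson sample (a much simpler setting than the paper's GLMM examples; data are hypothetical): the predictive likelihood of a future count y_f is proportional to sup_lambda p(y | lambda) * p(y_f | lambda), with the supremum attained at the pooled MLE.

```python
import numpy as np
from math import lgamma

y = np.array([3, 5, 4, 6, 2, 4])           # hypothetical counts, mean 4

def log_pois(k, lam):
    """Poisson log-pmf."""
    return k * np.log(lam) - lam - lgamma(k + 1)

def profile_predictive(y, y_f):
    """Profile predictive likelihood: sup over lambda of the joint likelihood."""
    lam_hat = (y.sum() + y_f) / (len(y) + 1)   # joint MLE pools y_f with y
    return np.exp(log_pois(y_f, lam_hat) + sum(log_pois(k, lam_hat) for k in y))

grid = np.arange(0, 25)
Lp = np.array([profile_predictive(y, k) for k in grid])
Lp = Lp / Lp.sum()                          # normalize to a predictive pmf
# The resulting pmf concentrates near the sample mean, as expected.
```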

Relevance: 20.00%

Abstract:

This study presents an approach to combining the uncertainties of hydrological model outputs predicted by a number of machine learning models. The machine learning based uncertainty prediction approach is very useful for estimating a hydrological model's uncertainty in a particular hydro-meteorological situation in real-time applications [1]. In this approach, hydrological model realizations from Monte Carlo simulations are used to build different machine learning uncertainty models that predict the uncertainty (quantiles of the pdf) of a deterministic hydrological model output. The uncertainty models are trained using antecedent precipitation and streamflows as inputs. The trained models are then employed to predict the model output uncertainty specific to the new input data. We used three machine learning models, namely artificial neural networks, model trees, and locally weighted regression, to predict output uncertainties. These three models produce similar verification results, which can be improved by merging their outputs dynamically. We propose an approach that forms a committee of the three models to combine their outputs. The approach is applied to estimate the uncertainty of streamflow simulations from a conceptual hydrological model in the Brue catchment in the UK and the Bagmati catchment in Nepal. The verification results show that the merged output is better than any individual model output. [1] D. L. Shrestha, N. Kayastha, D. P. Solomatine, and R. Price. Encapsulation of parametric uncertainty statistics by various predictive machine learning models: MLUE method, Journal of Hydroinformatics, in press, 2013.
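
The committee step can be sketched in isolation (data and member models below are hypothetical stand-ins for the trained uncertainty models, not the paper's ANN/model-tree/LWR setup): each member predicts the same uncertainty quantity from antecedent inputs, and the committee output is a convex combination weighted by inverse verification error.

```python
import numpy as np

rng = np.random.default_rng(1)

X = rng.uniform(0.0, 1.0, (200, 2))        # e.g. antecedent rain and flow
target = X[:, 0] + 0.5 * X[:, 1]           # stand-in for a pdf quantile

members = [
    lambda X: X @ np.array([1.1, 0.4]),    # three imperfect member models
    lambda X: X @ np.array([0.9, 0.6]),
    lambda X: 0.7 * X.sum(axis=1),
]
mse = np.array([np.mean((m(X) - target) ** 2) for m in members])
w = (1.0 / mse) / np.sum(1.0 / mse)        # inverse-error committee weights

committee = sum(wi * m(X) for wi, m in zip(w, members))
# By convexity of squared error, the committee's MSE never exceeds the
# weighted average of the members' MSEs.
```

A dynamic variant would recompute the weights over a moving verification window so the committee adapts to the current hydro-meteorological situation.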

Relevance: 20.00%

Abstract:

A procedure for characterizing the global uncertainty of a rainfall-runoff simulation model based on grey numbers is presented. With the grey numbers technique the uncertainty is characterized by an interval; once the parameters of the rainfall-runoff model have been properly defined as grey numbers, grey mathematics and functions make it possible to obtain simulated discharges in the form of grey numbers whose envelope defines a band representing the vagueness/uncertainty associated with the simulated variable. The grey numbers representing the model parameters are estimated in such a way that the band obtained from the envelope of simulated grey discharges includes an assigned percentage of observed discharge values and is at the same time as narrow as possible. The approach is applied to a real case study, highlighting that a rigorous application of the procedure for direct simulation through the rainfall-runoff model with grey parameters involves long computational times. However, these times can be significantly reduced using a simplified computing procedure with minimal approximations in the quantification of the grey numbers representing the simulated discharges. Relying on this simplified procedure, the conceptual rainfall-runoff grey model is calibrated, and the uncertainty bands obtained downstream of both the calibration and the validation processes are compared with those obtained using a well-established approach for characterizing uncertainty, such as GLUE. The results of the comparison show that the proposed approach may represent a valid tool for characterizing the global uncertainty associated with the output of a rainfall-runoff simulation model.
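
A minimal sketch of the idea, assuming a toy linear-reservoir model (our own example, not the paper's conceptual model): the recession parameter is a grey number [k_lo, k_hi], and the grey discharge band is approximated here by the envelope of the two endpoint runs. A rigorous grey propagation would optimize over the whole interval at each step, which is exactly the expensive computation the paper's simplified procedure avoids.

```python
import numpy as np

# Toy linear reservoir: Q_t = k * S_t, S_{t+1} = S_t + P_t - Q_t.
P = np.array([0.0, 5.0, 12.0, 3.0, 0.0, 0.0, 0.0])   # hypothetical rainfall

def simulate(k, S0=10.0):
    """Discharge series for a crisp value of the recession parameter k."""
    S, Q = S0, []
    for p in P:
        q = k * S
        Q.append(q)
        S = S + p - q
    return np.array(Q)

k_lo, k_hi = 0.2, 0.35                     # grey parameter bounds
runs = np.vstack([simulate(k_lo), simulate(k_hi)])
band_lo, band_hi = runs.min(axis=0), runs.max(axis=0)
crisp = simulate(0.5 * (k_lo + k_hi))      # mid-value ("whitened") run
# Calibration would widen or narrow [k_lo, k_hi] so the band covers a target
# fraction of observed discharges while staying as narrow as possible.
```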

Relevance: 20.00%

Abstract:

Using vector autoregressive (VAR) models and Monte Carlo simulation methods, we investigate the potential gains for forecasting accuracy and estimation uncertainty of two commonly used restrictions arising from economic relationships. The first reduces the parameter space by imposing long-term restrictions on the behavior of economic variables, as discussed by the literature on cointegration; the second reduces the parameter space by imposing short-term restrictions, as discussed by the literature on serial-correlation common features (SCCF). Our simulations cover three important issues in model building, estimation, and forecasting. First, we examine the performance of standard and modified information criteria in choosing the lag length for cointegrated VARs with SCCF restrictions. Second, we compare the forecasting accuracy of fitted VARs when only cointegration restrictions are imposed and when cointegration and SCCF restrictions are jointly imposed. Third, we propose a new estimation algorithm in which short- and long-term restrictions interact to estimate the cointegrating and cofeature spaces, respectively. We have three basic results. First, ignoring SCCF restrictions has a high cost in terms of model selection, because standard information criteria too frequently choose inconsistent models with too small a lag length; criteria selecting lag and rank simultaneously have a superior performance in this case. Second, this translates into a superior forecasting performance of the restricted VECM over the VECM, with important improvements in forecasting accuracy reaching more than 100% in extreme cases. Third, the new algorithm proposed here fares very well in terms of parameter estimation, even when we consider the estimation of long-term parameters, opening up the discussion of joint estimation of short- and long-term parameters in VAR models.
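
The lag-selection step alone can be sketched for an unrestricted VAR (our own illustration, not the cointegrated/SCCF estimator of the paper): simulate a bivariate VAR(1), fit VAR(p) by OLS for several p, and compare a standard BIC.

```python
import numpy as np

rng = np.random.default_rng(7)

# Simulate a stable bivariate VAR(1) with hypothetical dynamics.
T, k = 400, 2
A = np.array([[0.5, 0.1],
              [0.2, 0.4]])
y = np.zeros((T, k))
for t in range(1, T):
    y[t] = A @ y[t - 1] + rng.normal(0.0, 1.0, k)

def bic_var(y, p):
    """BIC of an OLS-fitted VAR(p): log|Sigma_u| + log(T) * k^2 * p / T."""
    T_all, k = y.shape
    Y = y[p:]
    X = np.hstack([y[p - i:T_all - i] for i in range(1, p + 1)])
    B, *_ = np.linalg.lstsq(X, Y, rcond=None)
    U = Y - X @ B
    Sigma = U.T @ U / len(Y)
    _, logdet = np.linalg.slogdet(Sigma)
    return logdet + np.log(len(Y)) * (k * k * p) / len(Y)

bic = {p: bic_var(y, p) for p in range(1, 5)}
p_hat = min(bic, key=bic.get)              # BIC should favor a short lag here
```

The paper's point is that in the cointegrated-plus-SCCF setting this kind of standard criterion misbehaves, which is why criteria selecting lag and rank jointly do better.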

Relevance: 20.00%

Abstract:

Lucas (1987) showed the surprising result that the welfare cost of business cycles is quite small. Using standard assumptions on preferences and a fully-fledged econometric model, we computed the welfare costs of macroeconomic uncertainty for the post-WWII era using the multivariate Beveridge-Nelson decomposition for trends and cycles, which considers not only business-cycle uncertainty but also uncertainty from the stochastic trend in consumption. The post-WWII period is relatively quiet, with the welfare costs of uncertainty being about 0.9% of per-capita consumption. Although changing the decomposition method substantially changed initial results, the welfare cost of uncertainty is qualitatively small in the post-WWII era - about $175.00 a year per capita in the U.S. We also computed the marginal welfare cost of macroeconomic uncertainty using the same technique. It is about twice as large as the welfare cost - $350.00 a year per capita.
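
A back-of-the-envelope companion to the Lucas-style calculation (the parameter values below are illustrative, not the paper's estimates, which come from a full econometric model including trend uncertainty): with CRRA utility and lognormal consumption deviations, the welfare cost of uncertainty is approximately lambda = 0.5 * gamma * sigma2.

```python
# gamma: relative risk aversion; sigma2: variance of log-consumption
# deviations. Both values are hypothetical, chosen so the formula lands on
# the same order of magnitude as the 0.9% figure in the abstract.
gamma = 2.0
sigma2 = 0.009
lam = 0.5 * gamma * sigma2
print(f"welfare cost ~ {lam:.3%} of per-capita consumption")   # 0.900%
```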

Relevance: 20.00%

Abstract:

Consider the demand for a good whose consumption must be chosen prior to the resolution of uncertainty regarding income. How do changes in the distribution of income affect the demand for this good? In this paper we show that normality is sufficient to guarantee that consumption increases if the Radon-Nikodym derivative of the new distribution with respect to the old is non-decreasing in the whole domain. However, if only first-order stochastic dominance is assumed, more structure must be imposed on preferences to guarantee the validity of the result. Finally, a converse of the first result also obtains: if the change in measure is characterized by a non-decreasing Radon-Nikodym derivative, consumption of such a good will always increase if and only if the good is normal.
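
The first result can be checked on a toy discrete economy (our own example): a new income distribution whose Radon-Nikodym derivative with respect to the old is non-decreasing, together with a normal good whose demand is increasing in income, yields higher expected consumption.

```python
import numpy as np

w = np.array([1.0, 2.0, 3.0, 4.0])         # income levels
f = np.array([0.4, 0.3, 0.2, 0.1])         # old income distribution
g = np.array([0.1, 0.2, 0.3, 0.4])         # new income distribution
ratio = g / f                              # 0.25, 0.67, 1.5, 4.0: non-decreasing
c = np.sqrt(w)                             # normal good: c(w) increasing in w

E_old, E_new = f @ c, g @ c
# E_new > E_old, as the result predicts for a normal good under a
# non-decreasing Radon-Nikodym derivative.
```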

Relevance: 20.00%

Abstract:

In this paper we apply the theory of decision making with expected utility and non-additive priors to the choice of an optimal portfolio. This theory describes the behavior of a rational agent who is averse to pure 'uncertainty' (as well as, possibly, to 'risk'). We study the agent's optimal allocation of wealth between a safe and an uncertain asset. We show that there is a range of prices at which the agent neither buys nor sells short the uncertain asset. In contrast, the standard theory of expected utility predicts that there is exactly one such price. We also provide a definition of an increase in uncertainty aversion and show that it causes the range of prices to increase.
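
The no-trade interval can be sketched with maxmin expected value over a probability interval, used here as a tractable stand-in for the non-additive prior (our own simplification): the uncertain asset pays 1 with probability p and 0 otherwise, and the agent only knows p lies in [p_lo, p_hi].

```python
def action(price, p_lo, p_hi):
    """Maxmin-optimal stance toward one unit of the uncertain asset."""
    worst_case_buy = p_lo - price          # buying is worst when p = p_lo
    worst_case_short = price - p_hi        # shorting is worst when p = p_hi
    if worst_case_buy > 0:
        return "buy"
    if worst_case_short > 0:
        return "sell short"
    return "no trade"                      # any price in [p_lo, p_hi]

# Under expected utility (p_lo == p_hi) the no-trade set is a single price;
# widening [p_lo, p_hi] -- more uncertainty aversion -- widens the interval.
print(action(0.30, 0.4, 0.6), action(0.50, 0.4, 0.6), action(0.70, 0.4, 0.6))
# buy no trade sell short
```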

Relevance: 20.00%

Abstract:

With standard assumptions on preferences and a fully-fledged econometric model, we computed the welfare costs of macroeconomic uncertainty for the post-war U.S. using the Beveridge-Nelson decomposition. Welfare costs are about 0.9% of per-capita consumption ($175.00) and marginal welfare costs are about twice as large.