951 results for Probabilities.
Abstract:
Risk taking is central to human activity. Consequently, it lies at the focal point of behavioral sciences such as neuroscience, economics, and finance. Many influential models from these sciences assume that financial risk preferences form a stable trait. Is this assumption justified and, if not, what causes the appetite for risk to fluctuate? We have previously found that traders experience a sustained increase in the stress hormone cortisol when the amount of uncertainty, in the form of market volatility, increases. Here we ask whether these elevated cortisol levels shift risk preferences. Using a double-blind, placebo-controlled, cross-over protocol we raised cortisol levels in volunteers over eight days to the same extent previously observed in traders. We then tested for the utility and probability weighting functions underlying their risk taking, and found that participants became more risk averse. We also observed that the weighting of probabilities became more distorted among men relative to women. These results suggest that risk preferences are highly dynamic. Specifically, the stress response calibrates risk taking to our circumstances, reducing it in times of prolonged uncertainty, such as a financial crisis. Physiology-induced shifts in risk preferences may thus be an under-appreciated cause of market instability.
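The probability weighting functions tested in such protocols are typically parameterized; as an illustration only (the abstract does not name the functional form used), here is the one-parameter Tversky-Kahneman weighting function, where greater distortion of probabilities corresponds to the parameter moving further from 1:

```python
import math

def tk_weight(p: float, gamma: float) -> float:
    """One-parameter Tversky-Kahneman probability weighting function.
    gamma = 1 gives undistorted (linear) weighting; gamma < 1 gives
    the typical inverse-S distortion, overweighting small
    probabilities and underweighting large ones."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

# A rare event is overweighted under distortion (gamma < 1)
distorted = tk_weight(0.05, 0.6)
linear = tk_weight(0.05, 1.0)   # undistorted benchmark
```

A fitted gamma per participant lets one compare distortion across treatment arms, which is the kind of comparison the abstract reports between men and women.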
Abstract:
In this paper we introduce a formalization of Logical Imaging applied to IR in terms of Quantum Theory, through the use of an analogy between states of a quantum system and terms in text documents. Our formalization relies upon the Schrödinger Picture, creating an analogy between the dynamics of a physical system and the kinematics of probabilities generated by Logical Imaging. By using Quantum Theory, it is possible to model contextual information more precisely, in a seamless and principled fashion, within the Logical Imaging process. While further work is needed to empirically validate this, the foundations for doing so are provided.
Abstract:
In this paper we present a truncated differential analysis of reduced-round LBlock by computing the differential distribution of every nibble of the state. The LLR statistical test is used as a tool to apply the distinguishing and key-recovery attacks. To build the distinguisher, all possible differences are traced through the cipher and the truncated differential probability distribution is determined for every output nibble. We concatenate additional rounds to the beginning and end of the truncated differential distribution to apply the key-recovery attack. By exploiting properties of the key schedule, we obtain a large overlap of key bits used in the beginning and final rounds. This allows us to significantly increase the differential probabilities and hence reduce the attack complexity. We validate the analysis by implementing the attack on LBlock reduced to 12 rounds. Finally, we apply single-key and related-key attacks on 18- and 21-round LBlock, respectively.
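The LLR test mentioned above scores observed output differences against the truncated-differential distribution versus the uniform (random-permutation) hypothesis. A minimal sketch with toy nibble distributions, not LBlock's actual tables:

```python
import math

def llr_statistic(samples, p_cipher, p_uniform):
    """Log-likelihood ratio for distinguishing: positive values favour
    the cipher's truncated-differential distribution over the uniform
    (random permutation) hypothesis."""
    return sum(math.log(p_cipher[s] / p_uniform[s]) for s in samples)

# Toy nibble distributions: difference 0x0 is twice as likely as uniform,
# the remaining probability mass is spread evenly over the other values.
p_uniform = {x: 1 / 16 for x in range(16)}
p_cipher = {x: (2 / 16 if x == 0 else (14 / 16) / 15) for x in range(16)}

favours_cipher = llr_statistic([0] * 8, p_cipher, p_uniform)
```

In an attack, the statistic is compared against a threshold chosen for the desired success probability; larger biases in the truncated-differential distribution reduce the number of samples needed.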
Abstract:
Business processes are an important instrument for understanding and improving how companies provide goods and services to customers. Therefore, many companies have documented their business processes well, often as Event-driven Process Chains (EPCs). Unfortunately, in many cases the resulting EPCs are rather complex, so that the overall process logic is hidden in low-level process details. This paper proposes abstraction mechanisms for process models that aim to reduce their complexity while keeping the overall process structure. We assume that functions are marked with efforts and splits are marked with probabilities. This information is used to separate important process parts from less important ones. Real-world process models are used to validate the approach.
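One natural use of these annotations during abstraction is to replace a split and its branches by a single aggregated function whose effort is the probability-weighted effort of the branches. This aggregation rule is an assumption for illustration, not necessarily the paper's exact mechanism:

```python
def aggregated_effort(branches):
    """Effort of an abstracted exclusive split: the expected effort
    over its branches, each given as a (probability, effort) pair."""
    assert abs(sum(p for p, _ in branches) - 1.0) < 1e-9
    return sum(p * e for p, e in branches)

# Two branches: 70% of cases take 10 units of effort, 30% take 40.
effort = aggregated_effort([(0.7, 10.0), (0.3, 40.0)])
```

The abstracted model then carries one function with this expected effort, so overall process cost estimates survive the simplification.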
Abstract:
This paper discusses a model of the civil aviation regulation framework and shows how the current assessment of reliability and risk for piloted aircraft has limited applicability for Unmanned Aircraft Systems (UAS) with high levels of autonomous decision making. Then, a new framework for risk management of robust autonomy is proposed, which arises from combining quantified measures of risk with normative decision making. The term Robust Autonomy describes the ability of an autonomous system to either continue or abort its operation whilst not breaching a minimum level of acceptable safety in the presence of anomalous conditions. The decision making associated with risk management requires quantifying probabilities associated with the measures of risk and also consequences of outcomes related to the behaviour of autonomy. The probabilities are computed from an assessment under both nominal and anomalous scenarios described by faults, which can be associated with the aircraft's actuators, sensors, communication link, changes in dynamics, and the presence of other aircraft in the operational space. The consequences of outcomes are characterised by a loss function which rewards the certification decision.
Abstract:
As the level of autonomy in Unmanned Aircraft Systems (UAS) increases, there is an imperative need for developing methods to assess robust autonomy. This paper focuses on the computations that lead to a set of measures of robust autonomy. These measures are the probabilities that selected performance indices related to the mission requirements and airframe capabilities remain within regions of acceptable performance.
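Such a measure can be estimated numerically: simulate the performance index under a given scenario many times and count how often it stays inside the region of acceptable performance. A hedged sketch with a hypothetical scalar index (the papers' actual indices and scenario models are not specified here):

```python
import random

def prob_acceptable(simulate_index, region, n_trials=10_000, seed=0):
    """Monte Carlo estimate of P(performance index lies in the
    acceptable region): run the scenario model n_trials times and
    count how often the index falls inside (lo, hi)."""
    rng = random.Random(seed)
    lo, hi = region
    hits = sum(lo <= simulate_index(rng) <= hi for _ in range(n_trials))
    return hits / n_trials

# Hypothetical index: a tracking error under a nominal scenario,
# modelled here as a standard normal draw.
nominal = lambda rng: rng.gauss(0.0, 1.0)
p = prob_acceptable(nominal, region=(-2.0, 2.0))
```

Repeating the estimate under anomalous scenarios (actuator faults, link loss, and so on) gives the set of probabilities the framework feeds into the risk-management decision.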
Abstract:
Introduction Total scatter factor (or output factor) in megavoltage photon dosimetry is a measure of relative dose relating a certain field size to a reference field size. The use of solid phantoms is well established for output factor measurements; however, to date these phantoms have not been tested with small fields. In this work, we evaluate the water equivalency of a number of solid phantoms for small field output factor measurements using the EGSnrc Monte Carlo code. Methods The following small square field sizes were simulated using BEAMnrc: 5, 6, 7, 8, 10 and 30 mm. Each simulated phantom geometry was created in DOSXYZnrc and consisted of a silicon diode (of length and width 1.5 mm and depth 0.5 mm) submersed in the phantom at a depth of 5 g/cm2. The source-to-detector distance was 100 cm for all simulations. The dose was scored in a single voxel at the location of the diode. Interaction probabilities and radiation transport parameters for each material were created using custom PEGS4 files. Results A comparison of the resultant output factors in the solid phantoms with the same factors in a water phantom is shown in Fig. 1. The statistical uncertainty in each point was less than or equal to 0.4%. The results in Fig. 1 show that the density of the phantoms affected the output factor results, with higher density materials (such as PMMA) resulting in higher output factors. It was also calculated that scaling the depth for equivalent path length had a negligible effect on the output factor results at these field sizes. Discussion and conclusions Electron stopping power and photon mass energy absorption change minimally with small field size [1]. Also, it can be seen from Fig. 1 that the difference from water decreases with increasing field size. Therefore, the most likely cause for the observed discrepancies in output factors is differing electron disequilibrium as a function of phantom density.
When measuring small field output factors in a solid phantom, it is important that the density is very close to that of water.
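For reference, the output factor itself is a simple dose ratio; a minimal sketch, with the statistical Monte Carlo uncertainties of the two doses combined in quadrature and illustrative numbers only:

```python
import math

def output_factor(dose_field, dose_ref, sigma_field, sigma_ref):
    """Total scatter factor: dose for a given field size divided by
    dose for the reference field. The relative uncertainties of the
    two doses are combined in quadrature for the ratio."""
    factor = dose_field / dose_ref
    rel_unc = math.hypot(sigma_field / dose_field, sigma_ref / dose_ref)
    return factor, factor * rel_unc

# Illustrative scored doses (arbitrary units) for a small field vs. reference.
of, unc = output_factor(0.85, 1.00, 0.003, 0.004)
```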
Abstract:
Numerous initiatives have been employed around the world in order to address rising greenhouse gas (GHG) emissions originating from the transport sector. These measures include: travel demand management (congestion‐charging), increased fuel taxes, alternative fuel subsidies and low‐emission vehicle (LEV) rebates. Incentivizing the purchase of LEVs has been one of the more prevalent approaches in attempting to tackle this global issue. LEVs, whilst having the advantage of lower emissions and, in some cases, more efficient fuel consumption, also bring the downsides of increased purchase cost, reduced convenience of vehicle fuelling, and operational uncertainty. To stimulate demand in the face of these challenges, various incentive‐based policies, such as toll exemptions, have been used by national and local governments to encourage the purchase of these types of vehicles. In order to address rising GHG emissions in Stockholm, and in line with the Swedish Government’s ambition to operate a fossil free fleet by 2030, a number of policies were implemented targeting the transport sector. Foremost amongst these was the combination of a congestion charge – initiated to discourage emissions‐intensive travel – and an exemption from this charge for some LEVs, established to encourage a transition towards a ‘green’ vehicle fleet. Although both policies shared the aim of reducing GHG emissions, the exemption for LEVs carried the risk of diminishing the effectiveness of the congestion charging scheme. As the number of vehicle owners choosing to transition to an eligible LEV increased, the congestion‐reduction effectiveness of the charging scheme weakened. In fact, policy makers quickly recognized this potential issue and consequently phased out the LEV exemption less than 18 months after its introduction (1). 
Several studies have investigated the demand for LEVs through stated‐preference (SP) surveys across multiple countries, including: Denmark (2), Germany (3, 4), UK (5), Canada (6), USA (7, 8) and Australia (9). Although each of these studies differed in approach, all involved SP surveys where differing characteristics between various types of vehicles, including LEVs, were presented to respondents and these respondents in turn made hypothetical decisions about which vehicle they would be most likely to purchase. Although these studies revealed a number of interesting findings with regard to the potential demand for LEVs, they relied on SP data. In contrast, this paper employs an approach where LEV choice is modelled by taking a retrospective view and by using revealed preference (RP) data. By examining the revealed preferences of vehicle owners in Stockholm, this study overcomes one of the principal limitations of SP data, namely that stated preferences may not in fact reflect individuals' actual choices, such as when cost, time, and inconvenience factors are real rather than hypothetical. This paper's RP approach involves modelling the characteristics of individuals who purchased new LEVs, whilst estimating the effect of the congestion charging exemption upon choice probabilities and subsequent aggregate demand. The paper contributes to the current literature by examining the effectiveness of a toll exemption under revealed preference conditions, and by assessing the total effect of the policy based on key indicators for policy makers, including: vehicle owner home location, commuting patterns, number of children, age, gender and income.
Extended Abstract Submission for Kuhmo Nectar Conference 2014
The two main research questions motivating this study were: Which individuals chose to purchase a new LEV in Stockholm in 2008? And how did the congestion charging exemption affect the aggregate demand for new LEVs in Stockholm in 2008?
In order to answer these research questions the analysis was split into two stages. Firstly, a multinomial logit (MNL) model was used to identify which demographic characteristics were most significantly related to the purchase of an LEV over a conventional vehicle. The three most significant variables were found to be: intra‐cordon residency (positive); commuting across the cordon (positive); and distance of residence from the cordon (negative). In order to estimate the effect of the exemption policy on vehicle purchase choice, the model included variables to control for geographic differences in preferences, based on the location of the vehicle owners’ homes and workplaces in relation to the congestion‐charging cordon boundary. These variables included one indicator representing commutes across the cordon and another indicator representing intra‐cordon residency. The effect of the exemption policy on the probability of purchasing LEVs was estimated in the second stage of the analysis by focusing on the groups of vehicle owners that were most likely to have been affected by the policy i.e. those commuting across the cordon boundary (in both directions). Given the inclusion of the indicator variable representing commutes across the cordon, it is assumed that the estimated coefficient of this variable captures the effect of the exemption policy on the utility of choosing to purchase an exempt LEV for these two groups of vehicle owners. The intra‐cordon residency indicator variable also controls for differences between the two groups, based upon direction of travel across the cordon boundary. A counter‐hypothesis to this assumption is that the coefficient of the variable representing commuting across the cordon boundary instead only captures geo‐demographic differences that lead to variations in LEV ownership across the different groups of vehicle owners in relation to the cordon boundary. 
In order to address this counter‐hypothesis, an additional analysis was performed on data from a city with a similar geodemographic pattern to Stockholm, Gothenburg ‐ Sweden’s second largest city. The results of this analysis provided evidence to support the argument that the coefficient of the variable representing commutes across the cordon was capturing the effect of the exemption policy. Based upon this framework, the predicted vehicle type shares were calculated using the estimated coefficients of the MNL model and compared with predicted vehicle type shares from a simulated scenario where the exemption policy was inactive. This simulated scenario was constructed by setting the coefficient for the variable representing commutes across the cordon boundary to zero for all observations to remove the utility benefit of the exemption policy. Overall, the procedure of this second stage of the analysis led to results showing that the exemption had a substantial effect upon the probability of purchasing and aggregate demand for exempt LEVs in Stockholm during 2008. By making use of unique evidence of revealed preferences of LEV owners, this study identifies the common characteristics of new LEV owners and estimates the effect of Stockholm's congestion charging exemption upon the demand for new LEVs during 2008. It was found that the variables that had the greatest effect upon the choice of purchasing an exempt LEV included intra‐cordon residency (positive), distance of home from the cordon (negative), and commuting across the cordon (positive). It was also determined that owners under the age of 30 years preferred non‐exempt LEVs (low CO2 LEVs), whilst those over the age of 30 years preferred electric vehicles. In terms of electric vehicles, it was apparent that those individuals living within the city had the highest propensity towards purchasing this vehicle type. 
A negative relationship between choosing an electric vehicle and the distance of an individual's residence from the cordon was also evident. Overall, the congestion charging exemption was found to have increased the share of exempt LEVs in Stockholm by 1.9%, with, as expected, a much stronger effect on those commuting across the boundary: those living inside the cordon had a 13.1% increase, and those owners living outside the cordon had a 5.0% increase. This increase in demand corresponded to an additional 538 (+/‐ 93; 95% C.I.) new exempt LEVs purchased in Stockholm during 2008 (out of a total of 5 427; 9.9%). Policy makers can take note that an incentive‐based policy can increase the demand for LEVs and appears to be an appropriate approach to adopt when attempting to reduce transport emissions through encouraging a transition towards a 'green' vehicle fleet.
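The two-stage procedure described above can be sketched with a toy multinomial logit: choice probabilities are a softmax of utilities, and the no-exemption counterfactual is obtained by setting the cross-cordon commuting coefficient to zero. All coefficients below are invented for illustration, not the study's estimates:

```python
import math

def mnl_probs(utilities):
    """Multinomial logit choice probabilities: softmax of utilities."""
    exps = [math.exp(u) for u in utilities]
    total = sum(exps)
    return [e / total for e in exps]

def lev_share(beta_commute, commutes_across_cordon):
    """Probability that one vehicle owner chooses the exempt LEV over
    a conventional vehicle. Setting beta_commute to 0 simulates the
    counterfactual scenario without the exemption policy."""
    u_conventional = 0.0                  # reference alternative
    u_lev = -1.5 + beta_commute * commutes_across_cordon  # toy baseline
    return mnl_probs([u_conventional, u_lev])[1]

with_policy = lev_share(beta_commute=0.8, commutes_across_cordon=1)
no_policy = lev_share(beta_commute=0.0, commutes_across_cordon=1)
```

Averaging these individual probabilities over the sample with and without the policy coefficient gives the predicted vehicle type shares that the study compares.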
Abstract:
This paper considers two problems that frequently arise in dynamic discrete choice problems but have not received much attention with regard to simulation methods. The first problem is how to construct unbiased simulators of probabilities conditional on past history. The second is how to simulate a discrete transition probability model when the underlying dependent variable is really continuous. Both methods work well relative to reasonable alternatives in the application discussed. However, in both cases, for this application, simpler methods also provide reasonably good results.
Abstract:
Potential conflicts exist between biodiversity conservation and climate-change mitigation as trade-offs in multiple-use land management. This study aims to evaluate public preferences for biodiversity conservation and climate-change mitigation policy, taking into account respondents' uncertainty about their choices. We conducted a choice experiment using land-use scenarios in the rural Kushiro watershed in northern Japan. The results showed that the public strongly wish to avoid the extinction of endangered species in preference to climate-change mitigation in the form of carbon sequestration by increasing the area of managed forest. Knowledge of the site and the respondents' awareness of the personal benefits associated with supporting and regulating services had a positive effect on their preference for conservation plans. Thus, decision-makers should be careful about how they provide ecological information for informed choices concerning ecosystem service trade-offs. Suggesting targets with explicit indicators will affect public preferences, as well as the willingness of the public to pay for such measures. Furthermore, the elicited-choice probabilities approach is useful for revealing the distribution of relative preferences for incomplete scenarios, thus verifying the effectiveness of indicators introduced in the experiment.
Abstract:
Analytically or computationally intractable likelihood functions can arise in complex statistical inferential problems making them inaccessible to standard Bayesian inferential methods. Approximate Bayesian computation (ABC) methods address such inferential problems by replacing direct likelihood evaluations with repeated sampling from the model. ABC methods have been predominantly applied to parameter estimation problems and less to model choice problems due to the added difficulty of handling multiple model spaces. The ABC algorithm proposed here addresses model choice problems by extending Fearnhead and Prangle (2012, Journal of the Royal Statistical Society, Series B 74, 1–28) where the posterior mean of the model parameters estimated through regression formed the summary statistics used in the discrepancy measure. An additional stepwise multinomial logistic regression is performed on the model indicator variable in the regression step and the estimated model probabilities are incorporated into the set of summary statistics for model choice purposes. A reversible jump Markov chain Monte Carlo step is also included in the algorithm to increase model diversity for thorough exploration of the model space. This algorithm was applied to a validating example to demonstrate the robustness of the algorithm across a wide range of true model probabilities. Its subsequent use in three pathogen transmission examples of varying complexity illustrates the utility of the algorithm in inferring preference of particular transmission models for the pathogens.
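For orientation, the core accept/reject loop underlying any ABC model-choice algorithm (without the regression-adjusted summaries or the reversible-jump step that the paper adds) looks like this; the two Gaussian simulators are a toy example:

```python
import random

def abc_model_choice(observed, simulators, prior_weights,
                     n=20_000, eps=0.5, seed=1):
    """Plain ABC rejection for model choice: draw a model index from
    its prior, simulate a summary statistic, and accept when it lies
    within eps of the observed summary. Accepted frequencies estimate
    the posterior model probabilities."""
    rng = random.Random(seed)
    counts = [0] * len(simulators)
    for _ in range(n):
        m = rng.choices(range(len(simulators)), weights=prior_weights)[0]
        if abs(simulators[m](rng) - observed) < eps:
            counts[m] += 1
    total = sum(counts) or 1
    return [c / total for c in counts]

# Toy example: two Gaussian models differing only in mean; the
# observed summary (2.0) matches the second model.
sims = [lambda r: r.gauss(0.0, 1.0), lambda r: r.gauss(2.0, 1.0)]
post = abc_model_choice(2.0, sims, [0.5, 0.5])
```

The paper's contribution replaces the raw summary with regression-estimated posterior means and model probabilities, which makes this loop far more efficient when likelihoods are intractable.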
Abstract:
A key concept in many Information Retrieval (IR) tasks, e.g. document indexing, query language modelling, aspect and diversity retrieval, is the relevance measurement of topics, i.e. to what extent an information object (e.g. a document or a query) is about the topics. This paper investigates the interference of relevance measurement of a topic caused by another topic. For example, consider that two user groups are required to judge whether a topic q is relevant to a document d, and q is presented together with another topic (referred to as a companion topic). If different companion topics are used for different groups, interestingly, different relevance probabilities of q given d can be reached. In this paper, we present empirical results showing that the relevance of a topic to a document is greatly affected by the companion topic's relevance to the same document, and the extent of the impact differs with respect to different companion topics. We further analyse the phenomenon from classical and quantum-like interference perspectives, and connect the phenomenon to non-reality and contextuality in quantum mechanics. We demonstrate that a quantum-like model fits the empirical data and could potentially be used to predict relevance when interference exists.
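In the quantum-like account, the companion topic contributes an interference term to the classical law of total probability; a sketch of the standard two-path form (symbols assumed for illustration, not the paper's fitted model):

```python
import math

def interfered_probability(p1, p2, theta):
    """Quantum-like probability: the classical sum of the two path
    probabilities plus an interference term 2*sqrt(p1*p2)*cos(theta).
    theta = pi/2 zeroes the interference and recovers the classical
    law of total probability."""
    return p1 + p2 + 2 * math.sqrt(p1 * p2) * math.cos(theta)

classical = interfered_probability(0.2, 0.3, math.pi / 2)   # 0.5
constructive = interfered_probability(0.2, 0.3, 0.0)        # boosted
destructive = interfered_probability(0.2, 0.3, math.pi)     # suppressed
```

Fitting theta to judgment data is what lets such a model account for companion-topic effects that a purely classical mixture cannot.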
Abstract:
Emotion and cognition are known to interact during human decision processes. In this study we focus on a specific kind of cognition, namely metacognition. Our experiment induces a negative emotion, worry, during a perceptual task. In a numerosity task, subjects have to make a two-alternative forced choice and then reveal their confidence in this decision. We measure metacognition in terms of discrimination and calibration abilities. Our results show that metacognition, but not choice, is affected by the level of worry anticipated before the decision. Under worry, individuals tend to have better metacognition in terms of the two measures. Furthermore, the formation of confidence is better explained when the level of worry is taken into account in the model. This study shows the importance of an emotional component in the formation and the quality of subjective probabilities.
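Calibration in this sense is commonly summarised by the Brier score, the mean squared gap between stated confidence and actual correctness; this is an illustrative measure, not necessarily the exact one the authors use:

```python
def brier_score(confidences, outcomes):
    """Mean squared gap between stated confidence of being correct
    (in [0, 1]) and the actual outcome (1 = correct, 0 = incorrect).
    Lower scores indicate better calibration."""
    assert len(confidences) == len(outcomes)
    return sum((c - o) ** 2
               for c, o in zip(confidences, outcomes)) / len(outcomes)

# Three trials: confident and correct, overconfident and wrong,
# confident and correct.
score = brier_score([0.9, 0.6, 0.8], [1, 0, 1])
```

Comparing such scores between worry and control conditions is one way to quantify the metacognitive difference the abstract reports.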
Abstract:
Bayesian networks (BNs) are graphical probabilistic models used for reasoning under uncertainty. These models are becoming increasingly popular in a range of fields including ecology, computational biology, medical diagnosis, and forensics. In most of these cases, the BNs are quantified using information from experts, or from user opinions. An interest therefore lies in the way in which multiple opinions can be represented and used in a BN. This paper proposes the use of a measurement error model to combine opinions for use in the quantification of a BN. The multiple opinions are treated as a realisation of measurement error, and the model uses the posterior probabilities ascribed to each node in the BN, which are computed from the prior information given by each expert. The proposed model addresses the issues associated with current methods of combining opinions, such as the absence of a coherent probability model, the failure to maintain the conditional independence structure of the BN, and the provision of only a point estimate for the consensus. The proposed model is applied to an existing Bayesian network and performs well when compared to existing methods of combining opinions.
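The simplest of the existing combination methods the paper improves upon is a linear opinion pool, which yields only the point estimate criticised above; a sketch of that baseline with hypothetical expert probabilities:

```python
def linear_opinion_pool(expert_probs, weights=None):
    """Weighted average of expert probabilities for one BN node state.
    Equal weights by default. Note it returns only a point estimate,
    one of the shortcomings a measurement error model addresses."""
    if weights is None:
        weights = [1 / len(expert_probs)] * len(expert_probs)
    assert abs(sum(weights) - 1.0) < 1e-9
    return sum(w * p for w, p in zip(weights, expert_probs))

# Three experts' probabilities that a node is in a given state.
pooled = linear_opinion_pool([0.7, 0.8, 0.6])
```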
Abstract:
Background Up-to-date evidence on levels and trends for age-sex-specific all-cause and cause-specific mortality is essential for the formation of global, regional, and national health policies. In the Global Burden of Disease Study 2013 (GBD 2013) we estimated yearly deaths for 188 countries between 1990 and 2013. We used the results to assess whether there is epidemiological convergence across countries. Methods We estimated age-sex-specific all-cause mortality using the GBD 2010 methods with some refinements to improve accuracy, applied to an updated database of vital registration, survey, and census data. We generally estimated cause of death as in the GBD 2010. Key improvements included the addition of more recent vital registration data for 72 countries, an updated verbal autopsy literature review, two new and detailed data systems for China, and more detail for Mexico, UK, Turkey, and Russia. We improved statistical models for garbage code redistribution. We used six different modelling strategies across the 240 causes; cause of death ensemble modelling (CODEm) was the dominant strategy for causes with sufficient information. Trends for Alzheimer's disease and other dementias were informed by meta-regression of prevalence studies. For pathogen-specific causes of diarrhoea and lower respiratory infections we used a counterfactual approach. We computed two measures of convergence (inequality) across countries: the average relative difference across all pairs of countries (Gini coefficient) and the average absolute difference across countries. To summarise broad findings, we used multiple decrement life-tables to decompose probabilities of death from birth to exact age 15 years, from exact age 15 years to exact age 50 years, and from exact age 50 years to exact age 75 years, and life expectancy at birth into major causes. For all quantities reported, we computed 95% uncertainty intervals (UIs).
We constrained cause-specific fractions within each age-sex-country-year group to sum to all-cause mortality based on draws from the uncertainty distributions. Findings Global life expectancy for both sexes increased from 65·3 years (UI 65·0–65·6) in 1990, to 71·5 years (UI 71·0–71·9) in 2013, while the number of deaths increased from 47·5 million (UI 46·8–48·2) to 54·9 million (UI 53·6–56·3) over the same interval. Global progress masked variation by age and sex: for children, average absolute differences between countries decreased but relative differences increased. For women aged 25–39 years and older than 75 years and for men aged 20–49 years and 65 years and older, both absolute and relative differences increased. Decomposition of global and regional life expectancy showed the prominent role of reductions in age-standardised death rates for cardiovascular diseases and cancers in high-income regions, and reductions in child deaths from diarrhoea, lower respiratory infections, and neonatal causes in low-income regions. HIV/AIDS reduced life expectancy in southern sub-Saharan Africa. For most communicable causes of death both numbers of deaths and age-standardised death rates fell whereas for most non-communicable causes, demographic shifts have increased numbers of deaths but decreased age-standardised death rates. Global deaths from injury increased by 10·7%, from 4·3 million deaths in 1990 to 4·8 million in 2013; but age-standardised rates declined over the same period by 21%. For some causes of more than 100 000 deaths per year in 2013, age-standardised death rates increased between 1990 and 2013, including HIV/AIDS, pancreatic cancer, atrial fibrillation and flutter, drug use disorders, diabetes, chronic kidney disease, and sickle-cell anaemias. Diarrhoeal diseases, lower respiratory infections, neonatal causes, and malaria are still in the top five causes of death in children younger than 5 years. 
The most important pathogens are rotavirus for diarrhoea and pneumococcus for lower respiratory infections. Country-specific probabilities of death over the three phases of life varied substantially between and within regions. Interpretation For most countries, the general pattern of reductions in age-sex-specific mortality has been associated with a progressive shift towards a larger share of the remaining deaths being caused by non-communicable diseases and injuries. Assessing epidemiological convergence across countries depends on whether an absolute or relative measure of inequality is used. Nevertheless, age-standardised death rates for seven substantial causes are increasing, suggesting the potential for reversals in some countries. Important gaps exist in the empirical data for cause of death estimates for some countries; for example, no national data for India are available for the past decade.
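The probabilities of death over a phase of life used in the decomposition follow from age-specific mortality rates via a standard life-table conversion; a minimal sketch under a constant-hazard-within-year assumption, with an invented flat rate rather than GBD estimates:

```python
import math

def prob_death_between(rates_by_age, start, end):
    """Probability of dying between exact ages `start` and `end`,
    given single-year mortality rates m(x). With a constant hazard
    inside each year, survival over the interval is exp(-sum of
    rates), so the probability of death is its complement."""
    total_hazard = sum(rates_by_age[age] for age in range(start, end))
    return 1.0 - math.exp(-total_hazard)

# Invented flat rate of 0.01 deaths per person-year from age 15 to 50.
rates = {age: 0.01 for age in range(15, 50)}
q_15_50 = prob_death_between(rates, 15, 50)
```

Applying the same conversion to cause-deleted rates is what lets a multiple decrement life-table attribute each interval probability to major causes.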