995 results for Statistical decision


Relevance:

30.00%

Publisher:

Abstract:

There have been many models developed by scientists to assist decision-makers in making socio-economic and environmental decisions. It is now recognised that there is a shift in the dominant paradigm to making decisions with stakeholders, rather than making decisions for stakeholders. Our paper investigates two case studies where group model building has been undertaken for maintaining biodiversity in Australia. The first case study focuses on the preservation and management of green spaces and biodiversity in metropolitan Melbourne under the umbrella of the Melbourne 2030 planning strategy. A geographical information system is used to collate a number of spatial datasets encompassing a range of cultural and natural asset data layers, including existing open spaces, waterways, threatened fauna and flora, ecological vegetation covers, registered cultural heritage sites, and existing land parcel zoning. Group model building is incorporated into the study by eliciting weightings and ratings of importance for each dataset from urban planners to formulate different urban green system scenarios. The second case study focuses on modelling ecoregions from spatial datasets for the state of Queensland. The modelling combines collaborative expert knowledge and a vast amount of environmental data to build biogeographical classifications of regions. An information elicitation process is used to capture expert knowledge of ecoregions as geographical descriptions, and to transform this into prior probability distributions that characterise regions in terms of environmental variables. This prior information is combined with measured data on the environmental variables within a Bayesian modelling technique to produce the final classified regions. We describe how linked views between descriptive information, mapping and statistical plots are used to decide upon representative regions that satisfy a number of criteria for biodiversity and conservation. This paper discusses the advantages and problems encountered when undertaking group model building. Future research will extend the group model building approach to include interested individuals and community groups.
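
The weighting-and-rating step in the first case study amounts to a weighted-overlay (multi-criteria) calculation across spatial layers. Below is a minimal sketch of that kind of scoring, assuming hypothetical layer names, ratings, and elicited weights; none of the values are taken from the Melbourne study.

```python
import numpy as np

# Hypothetical 3x3 rasters of suitability ratings (0-10) for three of the data
# layers the study lists; real layers would come from the GIS at full resolution.
layers = {
    "open_space":       np.array([[8, 6, 2], [7, 5, 3], [9, 4, 1]]),
    "threatened_fauna": np.array([[3, 9, 4], [2, 8, 5], [1, 7, 6]]),
    "waterways":        np.array([[5, 5, 9], [4, 6, 8], [3, 7, 7]]),
}

# Importance weights elicited from planners (assumed values), summing to 1.
weights = {"open_space": 0.5, "threatened_fauna": 0.3, "waterways": 0.2}

# Weighted-sum overlay: per-cell score for one candidate green-system scenario.
scenario_score = sum(w * layers[name] for name, w in weights.items())
print(scenario_score)
```

Re-running the overlay with a different elicited weight set is what would produce an alternative scenario for comparison.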

Relevance:

30.00%

Publisher:

Abstract:

Background - The literature is not unanimous about the effects of Peer Review (PR) within the context of constructivist learning. Due to the predominant focus on using PR as an assessment tool, rather than as a constructivist learning activity, and because most studies implicitly assume that the benefits of PR are limited to the reviewee, little is known about the effects upon students who are required to review their peers. Much of the theoretical debate in the literature is focused on explaining how and why constructivist learning is beneficial. At the same time, these discussions are marked by an underlying presupposition of a causal relationship between reviewing and deep learning. Objectives - The purpose of the study is to investigate whether the writing of PR feedback causes students to benefit in terms of: perceived utility of statistics, actual use of statistics, better understanding of statistical concepts and associated methods, changed attitudes towards market risks, and outcomes of decisions that were made. Methods - We conducted a randomized experiment, assigning students randomly to receive PR or non-PR treatments, and used two cohorts with different time spans. The paper discusses the experimental design and all the software components that we used to support the learning process: Reproducible Computing technology which allows students to reproduce or re-use statistical results from peers, Collaborative PR, and an AI-enhanced Stock Market Engine. Results - The results establish that the writing of PR feedback messages causes students to experience benefits in terms of Behavior, Non-Rote Learning, and Attitudes, provided the sequence of PR activities is maintained for a sufficiently long period.
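
As a rough illustration of the random assignment described in the Methods, the snippet below splits a cohort roster into PR and non-PR treatment groups; the roster, group labels, and seed are illustrative and not the authors' actual procedure.

```python
import random

def assign_treatments(students, seed=42):
    """Randomly assign half of a cohort to peer review (PR) and half to non-PR."""
    rng = random.Random(seed)
    shuffled = students[:]
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return {"PR": shuffled[:half], "non-PR": shuffled[half:]}

cohort_1 = [f"student_{i:02d}" for i in range(1, 41)]   # hypothetical roster
print(assign_treatments(cohort_1))
```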

Relevance:

30.00%

Publisher:

Abstract:

Effective clinical decision making depends upon identifying possible outcomes for a patient, selecting relevant cues, and processing the cues to arrive at accurate judgements of each outcome's probability of occurrence. These activities can be considered as classification tasks. This paper describes a new model of psychological classification that explains how people use cues to determine class or outcome likelihoods. It proposes that clinicians respond to conditional probabilities of outcomes given cues and that these probabilities compete with each other for influence on classification. The model explains why people appear to respond to base rates inappropriately, thereby overestimating the occurrence of rare categories, and a clinical example is provided for predicting suicide risk. The model provides an effective representation of expert clinical judgements, and its psychological validity enables it to generate explanations in a form that is comprehensible to clinicians. It is a strong candidate for incorporation within a decision support system for mental-health risk assessment, where it can link with statistical and pattern recognition tools applied to a database of patients. The symbiotic combination of empirical evidence and clinical expertise can provide an important web-based resource for risk assessment, including multi-disciplinary education and training. © 2002 Informa UK Ltd. All rights reserved.
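
The abstract does not give the model's equations, but its central claim, that judges respond to P(outcome | cue) rather than to base rates, can be pictured with a toy calculation. The case records, cue names, and the simple averaging "competition" rule below are assumptions for illustration only, not the published model.

```python
from collections import defaultdict

# Hypothetical case records: (cues present, outcome)
cases = [
    ({"hopelessness", "prior_attempt"}, "high_risk"),
    ({"hopelessness"}, "low_risk"),
    ({"prior_attempt"}, "high_risk"),
    ({"hopelessness"}, "low_risk"),
    ({"insomnia"}, "low_risk"),
    ({"insomnia", "hopelessness"}, "low_risk"),
]

def conditional_probs(cases):
    """Estimate P(outcome | cue) for every cue from the case records."""
    cue_counts = defaultdict(int)
    joint = defaultdict(int)
    for cues, outcome in cases:
        for cue in cues:
            cue_counts[cue] += 1
            joint[(cue, outcome)] += 1
    return {key: joint[key] / cue_counts[key[0]] for key in joint}

def judged_likelihood(cues, probs, outcome):
    """Toy competition rule: average P(outcome | cue) over the cues present.
    Base rates never enter, which is why rare outcomes get overestimated."""
    vals = [probs.get((cue, outcome), 0.0) for cue in cues]
    return sum(vals) / len(vals) if vals else 0.0

probs = conditional_probs(cases)
print(judged_likelihood({"hopelessness", "prior_attempt"}, probs, "high_risk"))
```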

Relevance:

30.00%

Publisher:

Abstract:

* The work is supported by RFBR, grant 04-01-00858-a

Relevance:

30.00%

Publisher:

Abstract:

The first essay developed a respondent model of Bayesian updating for a double-bound dichotomous choice (DB-DC) contingent valuation methodology. I demonstrated by way of data simulations that current DB-DC identifications of true willingness-to-pay (WTP) may often fail given this respondent Bayesian updating context. Further simulations demonstrated that a simple extension of current DB-DC identifications derived explicitly from the Bayesian updating behavioral model can correct for much of the WTP bias. Additional results provided caution against viewing respondents as acting strategically toward the second bid. Finally, an empirical application confirmed the simulation outcomes. The second essay applied a hedonic property value model to a unique water quality (WQ) dataset for a year-round, urban, and coastal housing market in South Florida, and found evidence that various WQ measures affect waterfront housing prices in this setting. However, the results indicated that this relationship is not consistent across any of the six particular WQ variables used, and is furthermore dependent upon the specific descriptive statistic employed to represent the WQ measure in the empirical analysis. These results continue to underscore the need to better understand both the WQ measure and its statistical form that homebuyers use in making their purchase decision. The third essay addressed a limitation of existing hurricane evacuation models by developing a dynamic model of hurricane evacuation behavior. A household's evacuation decision was framed as an optimal stopping problem in which, in every potential evacuation time period prior to actual hurricane landfall, the household's optimal choice is either to evacuate or to wait one more time period for a revised hurricane forecast. A hypothetical two-period model of evacuation and a realistic multi-period model of evacuation that incorporates actual forecast and evacuation cost data for my designated Gulf of Mexico region were developed for the dynamic analysis. Results from the multi-period model were calibrated with existing evacuation timing data from a number of hurricanes. Given the calibrated dynamic framework, a number of policy questions that plausibly affect the timing of household evacuations were analyzed, and a deeper understanding of existing empirical outcomes regarding the timing of the evacuation decision was achieved.
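
The third essay's optimal-stopping framing can be sketched in a two-period toy calculation: evacuate now at a known cost, or wait for a revised forecast and then choose between a costlier late evacuation and the expected loss from staying. All costs and forecast probabilities below are made up for illustration and are not the calibrated Gulf of Mexico data.

```python
# Two-period evacuation decision as an optimal stopping problem (illustrative numbers).
evac_cost_now = 800.0     # cost of evacuating in period 1
evac_cost_late = 1000.0   # evacuating in period 2 is costlier (congestion, short notice)
damage_if_hit = 50000.0   # expected loss if the household stays and the hurricane hits

def period2_cost(p_hit_revised):
    """After the revised forecast, evacuate only if cheaper than the expected damage."""
    return min(evac_cost_late, p_hit_revised * damage_if_hit)

# Assume the revised forecast resolves to "high" or "low" landfall risk with equal chance.
revised_forecasts = [(0.5, 0.60), (0.5, 0.01)]   # (probability, revised p_hit)

expected_cost_of_waiting = sum(p * period2_cost(ph) for p, ph in revised_forecasts)
decision = "evacuate now" if evac_cost_now <= expected_cost_of_waiting else "wait"
print(expected_cost_of_waiting, decision)
```

With these numbers the option value of the revised forecast makes waiting optimal, which is the kind of mechanism a dynamic model can use to explain delayed evacuations.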

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Guidance for appropriate utilisation of transthoracic echocardiograms (TTEs) can be incorporated into ordering prompts, potentially affecting the number of requests. METHODS: We incorporated data from the 2011 Appropriate Use Criteria for Echocardiography, the 2010 National Institute for Clinical Excellence Guideline on Chronic Heart Failure, and the American College of Cardiology Choosing Wisely list on TTE use for dyspnoea, oedema and valvular disease into electronic ordering systems at Durham Veterans Affairs Medical Center. Our primary outcome was TTE orders per month. Secondary outcomes included rates of outpatient TTE ordering per 100 visits and the frequency of brain natriuretic peptide (BNP) ordering prior to TTE. Outcomes were measured for 20 months before and 12 months after the intervention. RESULTS: The number of TTEs ordered did not decrease (338±32 TTEs/month prior vs 320±33 afterwards, p=0.12). Rates of outpatient TTE ordering decreased minimally post intervention (2.28 per 100 primary care/cardiology visits prior vs 1.99 afterwards, p<0.01). Effects on TTE ordering and ordering rate significantly interacted with time from intervention (p<0.02 for both), as the small initial effects waned after 6 months. The percentage of TTE orders with a preceding BNP increased (36.5% prior vs 42.2% after for inpatients, p=0.01; 10.8% prior vs 14.5% after for outpatients, p<0.01). CONCLUSIONS: Ordering prompts for TTEs produced a small initial reduction in TTE ordering and increased BNP measurement at a single institution, but the effect on TTEs ordered was likely insignificant from a utilisation standpoint and decayed over time.
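
The reported interaction of the prompt's effect with time from intervention is the kind of pattern a segmented (interrupted time-series) regression would test; the sketch below, on simulated monthly counts, is one plausible way to do so and is not necessarily the analysis the authors performed.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

# Simulated monthly TTE order counts: 20 pre-intervention months, 12 post.
months = np.arange(32)
post = (months >= 20).astype(int)
months_since = np.where(post == 1, months - 20, 0)

# Build in a small initial drop that wanes over time (illustrative effect sizes).
orders = 338 - 25 * post + 3.0 * months_since + rng.normal(0, 15, size=32)

df = pd.DataFrame({"orders": orders, "months": months,
                   "post": post, "months_since": months_since})

# 'post' captures the level change at the intervention; a positive 'months_since'
# coefficient indicates the initial effect decays as time from intervention grows.
fit = smf.ols("orders ~ months + post + months_since", data=df).fit()
print(fit.params)
```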

Relevance:

30.00%

Publisher:

Abstract:

Advances in the three related areas of state-space modeling, sequential Bayesian learning, and decision analysis are addressed, together with the statistical challenges of scalability and associated dynamic sparsity. The key theme that ties the three areas together is Bayesian model emulation: solving challenging analysis/computational problems using creative model emulators. This idea defines theoretical and applied advances in non-linear, non-Gaussian state-space modeling, dynamic sparsity, decision analysis and statistical computation, across linked contexts of multivariate time series and dynamic network studies. Examples and applications in financial time series and portfolio analysis, macroeconomics and internet studies from computational advertising demonstrate the utility of the core methodological innovations.

Chapter 1 summarizes the three areas/problems and the key idea of emulation in those areas. Chapter 2 discusses the sequential analysis of latent threshold models using emulating models that allow for analytical filtering to enhance the efficiency of posterior sampling. Chapter 3 examines the emulator (or synthetic) model in decision analysis, which is equivalent to the loss function in the original minimization problem, and shows its performance in the context of sequential portfolio optimization. Chapter 4 describes a method for modeling streaming count data observed on a large network that relies on emulating the whole, dependent network model with independent, conjugate sub-models customized to each flow. Chapter 5 reviews these advances and offers concluding remarks.
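
Chapter 4's strategy of emulating a dependent network model with independent, conjugate sub-models per flow can be pictured with a discount-based gamma-Poisson filter for a single count series. The counts and discount factor below are illustrative assumptions, not the dissertation's specification.

```python
# Sequential gamma-Poisson updating for one network flow (illustrative sketch).
counts = [12, 15, 9, 22, 30, 28, 14, 11]   # hypothetical counts per time interval

a, b = 1.0, 0.1     # Gamma(a, b) prior on the flow's Poisson rate
discount = 0.95     # discount factor: inflates uncertainty between time steps

for t, y in enumerate(counts):
    # Evolve: discount the prior so older information carries less weight.
    a, b = discount * a, discount * b
    # Update: the Gamma prior is conjugate to the Poisson likelihood.
    a, b = a + y, b + 1.0
    print(f"t={t}: posterior mean rate = {a / b:.2f}")
```

Because each flow's filter is conjugate and independent of the others, the updates are cheap and parallelizable, in line with the scalability theme above.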

Relevance:

30.00%

Publisher:

Abstract:

Thesis (Master's)--University of Washington, 2016-08

Relevance:

30.00%

Publisher:

Abstract:

Traffic demand increases are pushing aging ground transportation infrastructures to their theoretical capacity. The result of this demand is traffic bottlenecks that are a major cause of delay on urban freeways. In addition, the queues associated with those bottlenecks increase the probability of a crash while adversely affecting environmental measures such as emissions and fuel consumption. With limited resources available for network expansion, traffic professionals have developed active traffic management systems (ATMS) in an attempt to mitigate the negative consequences of traffic bottlenecks. Among these ATMS strategies, variable speed limits (VSL) and ramp metering (RM) have been gaining international interest for their potential to improve safety, mobility, and environmental measures at freeway bottlenecks. Though previous studies have shown the tremendous potential of VSL control and of VSL paired with ramp metering (VSLRM) control, little guidance has been developed to assist decision makers in the planning phase of a congestion mitigation project that is considering VSL or VSLRM control. To address this need, this study has developed a comprehensive decision/deployment support tool for the application of VSL and VSLRM control in recurrently congested environments. The decision tool will assist practitioners in deciding on the most appropriate control strategy for a candidate site, which candidate sites have the most potential to benefit from the suggested control strategy, and how to most effectively design the field deployment of the suggested control strategy at each implementation site. To do so, the tool comprises three key modules: (1) a Decision Module, (2) a Benefits Module, and (3) a Deployment Guidelines Module. Each module uses commonly known traffic flow and geometric parameters as inputs to statistical models and empirically based procedures to provide guidance on the application of VSL and VSLRM at each candidate site. These models and procedures were developed from the outputs of simulated experiments, calibrated with field data. To demonstrate the application of the tool, a list of real-world candidate sites was selected from the Maryland State Highway Administration Mobility Report. Field data from each candidate site were input into the tool to illustrate the step-by-step process required for efficient planning of VSL or VSLRM control. The output of the tool includes the suggested control system at each site, a ranking of the sites based on the expected benefit-to-cost ratio, and guidelines on how to deploy the VSL signs, ramp meters, and detectors at the deployment site(s). This research has the potential to assist traffic engineers in the planning of VSL and VSLRM control, thus enhancing the procedure for allocating limited resources for mobility and safety improvements on highways plagued by recurrent congestion.
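
To give a sense of the tool's planning flow (suggest a control strategy per site, then rank sites by expected benefit-to-cost ratio), the sketch below uses placeholder decision rules, benefit formulas, and site records; none of these stand in for the tool's fitted statistical models or the Maryland sites.

```python
# Hypothetical candidate-site records: (name, peak volume/capacity ratio, meterable ramp)
sites = [
    ("Site A", 1.15, True),
    ("Site B", 1.05, False),
    ("Site C", 1.30, True),
]

def suggest_control(v_over_c, ramp_feasible):
    """Placeholder decision rule: pair VSL with ramp metering only where a ramp
    can be metered and demand pressure is high; otherwise suggest VSL alone."""
    return "VSLRM" if ramp_feasible and v_over_c > 1.1 else "VSL"

def benefit_cost(v_over_c, control):
    """Placeholder benefit model: benefits grow with congestion; VSLRM costs more."""
    benefit = 100.0 * (v_over_c - 1.0)
    cost = 12.0 if control == "VSLRM" else 8.0
    return benefit / cost

ranked = sorted(sites, reverse=True,
                key=lambda s: benefit_cost(s[1], suggest_control(s[1], s[2])))
for name, v_over_c, ramp in ranked:
    control = suggest_control(v_over_c, ramp)
    print(f"{name}: {control}, B/C = {benefit_cost(v_over_c, control):.2f}")
```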

Relevance:

30.00%

Publisher:

Abstract:

In a microscopic setting, humans behave in rich and unexpected ways. In a macroscopic setting, however, distinctive patterns of group behavior emerge, leading statistical physicists to search for an underlying mechanism. The aim of this dissertation is to analyze the macroscopic patterns of competing ideas in order to discern the mechanics of how group opinions form at the microscopic level. First, we explore the competition of answers in online Q&A (question and answer) boards. We find that a simple individual-level model can capture important features of user behavior, especially as the number of answers to a question grows. Our model further suggests that the wisdom of crowds may be constrained by information overload, in which users are unable to thoroughly evaluate each answer and therefore tend to use heuristics to pick what they believe is the best answer. Next, we explore models of opinion spread among voters to explain observed universal statistical patterns such as rescaled vote distributions and logarithmic vote correlations. We introduce a simple model that can explain both properties, as well as why it takes so long for large groups to reach consensus. An important feature of the model that facilitates agreement with data is that individuals become more stubborn (unwilling to change their opinion) over time. Finally, we explore potential underlying mechanisms for opinion formation in juries by comparing data to various types of models. We find that different null hypotheses in which jurors do not interact when reaching a decision are in strong disagreement with the data, in contrast to a simple interaction model. These findings provide conceptual and mechanistic support for previous work that has found mutual influence can play a large role in group decisions. In addition, by matching our models to data, we are able to infer the time scales over which individuals change their opinions in different jury contexts. We find that these values increase as a function of the trial time, suggesting that jurors and judicial panels exhibit a kind of stubbornness similar to what we include in our model of voting behavior.
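
A minimal realization of the voting model's key ingredient, agents whose willingness to switch decays the longer they have held their current opinion, might look like the following. The complete-mixing interaction, the specific decay function, and the parameters are illustrative choices rather than the dissertation's exact model.

```python
import numpy as np

rng = np.random.default_rng(1)

n_agents, n_steps = 200, 5000
opinions = rng.integers(0, 2, size=n_agents)   # two competing opinions
held_for = np.zeros(n_agents)                  # time each agent has held its opinion

for _ in range(n_steps):
    i, j = rng.integers(0, n_agents, size=2)   # random pair (complete mixing)
    if opinions[i] != opinions[j]:
        # Probability of adopting the other's opinion decays with holding time:
        # the longer an opinion has been held, the more "stubborn" the agent.
        if rng.random() < 1.0 / (1.0 + held_for[i]):
            opinions[i] = opinions[j]
            held_for[i] = 0
    held_for += 1

print("final share holding opinion 1:", opinions.mean())
```

Making the switching probability decay with holding time slows consensus relative to a standard voter model, which is the qualitative effect described above.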

Relevance:

30.00%

Publisher:

Abstract:

This dissertation proposes statistical methods to formulate, estimate and apply complex transportation models. Two main problems are part of the analyses conducted and presented in this dissertation. The first method solves an econometric problem and is concerned with the joint estimation of models that contain both discrete and continuous decision variables. The use of ordered models along with a regression is proposed, and their effectiveness is evaluated with respect to unordered models. Procedures to calculate and optimize the log-likelihood functions of both discrete-continuous approaches are derived, and the difficulties associated with the estimation of unordered models are explained. Numerical approximation methods based on the Genz algorithm are implemented in order to solve the multidimensional integral associated with the unordered modeling structure. The problems arising from the lack of smoothness of the probit model around the maximum of the log-likelihood function, which makes the optimization and the calculation of standard deviations very difficult, are carefully analyzed. A methodology to perform out-of-sample validation in the context of a joint model is proposed. Comprehensive numerical experiments have been conducted on both simulated and real data. In particular, the discrete-continuous models are estimated and applied to vehicle ownership and use models on data extracted from the 2009 National Household Travel Survey. The second part of this work offers a comprehensive statistical analysis of the free-flow speed distribution; the method is applied to data collected on a sample of roads in Italy. A linear mixed model that includes speed quantiles in its predictors is estimated. Results show that there is no road effect in the analysis of free-flow speeds, which is particularly important for model transferability. A very general framework to predict random effects with few observations and incomplete access to model covariates is formulated and applied to predict the distribution of free-flow speed quantiles. The speed distribution of most road sections is successfully predicted; jack-knife estimates are calculated and used to explain why some sections are poorly predicted. Overall, this work contributes to the literature in transportation modeling by proposing econometric model formulations for discrete-continuous variables, more efficient methods for the calculation of multivariate normal probabilities, and random effects models for free-flow speed estimation that take the survey design into account. All methods are rigorously validated on both real and simulated data.
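
The multidimensional integral behind the unordered (probit-type) models is a multivariate normal probability. As a hedged illustration of the kind of numerical approximation involved (not the dissertation's own implementation), SciPy's multivariate normal CDF, whose underlying routine is based on Genz's numerical integration, can be checked against a crude Monte Carlo estimate:

```python
import numpy as np
from scipy.stats import multivariate_normal

# Trivariate normal with correlated components, as arises from correlated utilities
# in a multinomial probit; the mean, covariance, and bounds are illustrative.
mean = np.zeros(3)
cov = np.array([[1.0, 0.5, 0.3],
                [0.5, 1.0, 0.4],
                [0.3, 0.4, 1.0]])
upper = np.array([0.2, 0.5, 1.0])

dist = multivariate_normal(mean=mean, cov=cov)

# Numerical approximation of P(X1 <= 0.2, X2 <= 0.5, X3 <= 1.0).
p_numeric = dist.cdf(upper)

# Crude Monte Carlo check of the same probability.
draws = dist.rvs(size=200_000, random_state=0)
p_mc = np.mean(np.all(draws <= upper, axis=1))

print(p_numeric, p_mc)
```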

Relevance:

30.00%

Publisher:

Abstract:

As climate change continues to impact socio-ecological systems, tools that assist conservation managers to understand vulnerability and target adaptations are essential. Quantitative assessments of vulnerability are rare because available frameworks are complex and lack guidance for dealing with data limitations and integrating across scales and disciplines. This paper describes a semi-quantitative method for assessing vulnerability to climate change that integrates socio-ecological factors to address management objectives and support decision-making. The method applies a framework first adopted by the Intergovernmental Panel on Climate Change and uses a structured 10-step process. The scores for each framework element are normalized and multiplied to produce a vulnerability score, and the assessed components are then ranked from high to low vulnerability. Sensitivity analyses determine which indicators most influence the analysis and the resultant decision-making process, so that data quality for these indicators can be reviewed to increase robustness. Prioritisation of components for conservation considers other economic, social and cultural values alongside the vulnerability rankings, targeting actions that reduce vulnerability to climate change by decreasing exposure or sensitivity and/or increasing adaptive capacity. This framework provides practical decision support and has been applied to marine ecosystems and fisheries, with two case applications provided as examples: (1) food security in Pacific Island nations under climate-driven fish declines, and (2) fisheries in the Gulf of Carpentaria, northern Australia. The step-wise process outlined here is broadly applicable and can be undertaken with minimal resources using existing data, thereby having great potential to inform adaptive natural resource management in diverse locations.
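
A toy version of the scoring arithmetic described above (normalize each framework element, combine multiplicatively, rank) is sketched below. The component names, raw scores, and the choice to invert adaptive capacity so that higher capacity lowers vulnerability are illustrative assumptions rather than the paper's exact formulation.

```python
# Illustrative raw scores (0-10) for exposure (E), sensitivity (S), adaptive capacity (AC).
components = {
    "Component A": {"E": 8, "S": 7, "AC": 3},
    "Component B": {"E": 5, "S": 4, "AC": 8},
    "Component C": {"E": 6, "S": 8, "AC": 5},
}

def normalise(x, lo=0.0, hi=10.0):
    return (x - lo) / (hi - lo)

def vulnerability(scores):
    e, s, ac = (normalise(scores[k]) for k in ("E", "S", "AC"))
    # Assumed combination: exposure x sensitivity x (1 - adaptive capacity),
    # so that higher adaptive capacity reduces the vulnerability score.
    return e * s * (1.0 - ac)

ranked = sorted(components, key=lambda c: vulnerability(components[c]), reverse=True)
for name in ranked:
    print(f"{name}: vulnerability = {vulnerability(components[name]):.2f}")
```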

Relevance:

30.00%

Publisher:

Abstract:

This research analyzes human behavior in complex situations and explains how decision makers act in ambiguous situations. Its objective is to study the sunk cost effect and the completion percentage effect of an investment project in a decision-making process. The work uses a "retrospective rationality" approach to explain irrational behaviors such as the sunk cost effect, the completion percentage effect of an investment project, and irrational escalation, since decision makers are repeatedly affected by decisions on past irreversible investments. Three sunk cost levels and three completion percentage levels of an investment project are evaluated, in addition to three neutral situations in a business environment and a personal decision situation. Graduate students in three Portuguese management schools responded to the questionnaires. Model results show that the value of the resources invested is crucial for understanding the rational behavior of the students who participated in this research. These results provide statistical evidence that information on sunk costs and on the completion percentage of an investment project determines human behavior under irrational escalation in ambiguous situations. As a consequence, decision makers have room to interpret their decisions; since the scenarios do not allow a unique definition of rational choice, it is not correct to judge as irrational those decision makers who decide to continue investing in ambiguous situations.

Keywords: Human Behavior, Sunk Cost Effect, Completion Percentage Effect of an Investment Project, Irrational Escalation, Ambiguous Situations.