641 results for decision theory

at Queensland University of Technology - ePrints Archive


Relevance: 100.00%

Abstract:

The primary goal of a phase I trial is to find the maximally tolerated dose (MTD) of a treatment. The MTD is usually defined in terms of a tolerable probability, q*, of toxicity. Our objective is to find the highest dose with toxicity risk that does not exceed q*, a criterion that is often desired in designing phase I trials. This criterion differs from that of finding the dose with toxicity risk closest to q*, which is used in methods such as the continual reassessment method. We use the theory of decision processes to find optimal sequential designs that maximize the expected number of patients within the trial allocated to the highest dose with toxicity not exceeding q*, among the doses under consideration. The proposed method is very general in the sense that criteria other than the one considered here can be optimized and that optimal dose assignment can be defined in terms of patients within or outside the trial. It includes the continual reassessment method as an important special case. A numerical study indicates that the strategy compares favourably with other phase I designs.
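
To make the contrast between the two criteria concrete, here is a minimal Python sketch. It is illustrative only: the toxicity estimates, the target q*, and the function names are hypothetical, and the paper's actual designs are derived as optimal sequential decision processes rather than by this one-shot rule.

    import numpy as np

    def highest_safe_dose(tox_estimates, q_star):
        """Highest dose whose estimated toxicity risk does not exceed q_star
        (the criterion optimised above); None if every dose is too toxic."""
        safe = [i for i, p in enumerate(tox_estimates) if p <= q_star]
        return max(safe) if safe else None

    def crm_style_dose(tox_estimates, q_star):
        """For contrast: the continual-reassessment-style criterion, i.e. the
        dose with estimated toxicity risk closest to q_star."""
        return int(np.argmin([abs(p - q_star) for p in tox_estimates]))

    estimates = [0.10, 0.20, 0.28, 0.40]  # hypothetical posterior toxicity means
    print(highest_safe_dose(estimates, q_star=0.25))  # -> 1
    print(crm_style_dose(estimates, q_star=0.25))     # -> 2, which exceeds q*

On the same estimates the two rules can select different doses, which is exactly the distinction the abstract draws.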

Relevance: 70.00%

Abstract:

Reliable infrastructure assets have a significant impact on quality of life and provide a stable foundation for economic growth and competitiveness. Decisions about the way assets are managed are of utmost importance in achieving this. Timely renewal of infrastructure assets supports reliability and maximum utilisation of infrastructure and enables business and community to grow and prosper. This research initially examined a framework for asset management decisions and then focused in depth on asset renewal optimisation and renewal engineering optimisation. The study had four primary objectives.

The first was to develop a new Asset Management Decision Framework (AMDF) for identifying and classifying asset management decisions. The AMDF was developed by applying multi-criteria decision theory, classical management theory and life cycle management. The AMDF is an original and innovative contribution to asset management in that:

· it is the first framework to provide guidance for developing asset management decision criteria based on fundamental business objectives;
· it is the first framework to provide a decision context identification and analysis process for asset management decisions; and
· it is the only comprehensive listing of asset management decision types developed from first principles.

The second objective was to develop a novel multi-attribute Asset Renewal Decision Model (ARDM) that takes account of financial, customer service, health and safety, environmental and socio-economic objectives. The unique feature of this ARDM is that it is the only model to optimise the timing of asset renewal with respect to fundamental business objectives.

The third objective was to develop a novel Renewal Engineering Decision Model (REDM) that uses multiple criteria to determine the optimal timing of renewal engineering. The unique features of this model are that:

· it is a novel extension of existing real options valuation models in that it uses overall utility rather than the present value of cash flows to model engineering value; and
· it is the only REDM that optimises the timing of renewal engineering with respect to fundamental business objectives.

The final objective was to develop and validate an Asset Renewal Engineering Philosophy (AREP) consisting of three principles of asset renewal engineering. The principles were validated using a novel application of real options theory. The AREP is the only renewal engineering philosophy in existence. The original contributions of this research are expected to enrich the body of knowledge in asset management by addressing the need for an asset management decision framework, asset renewal and renewal engineering optimisation based on fundamental business objectives, and a novel renewal engineering philosophy.
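
As an illustration of the kind of multi-attribute timing choice the ARDM formalises, the following Python sketch scores candidate renewal years against weighted objectives; the attribute names, weights, scores, and years are hypothetical stand-ins, not the thesis's actual model.

    WEIGHTS = {"financial": 0.4, "service": 0.25, "safety": 0.2, "environment": 0.15}

    def overall_utility(scores):
        """Weighted additive utility over attribute scores in [0, 1]."""
        return sum(WEIGHTS[a] * scores[a] for a in WEIGHTS)

    def best_renewal_year(candidates):
        """Pick the renewal year whose attribute scores maximise overall utility."""
        return max(candidates, key=lambda year: overall_utility(candidates[year]))

    candidates = {  # illustrative scores for renewing the asset in each year
        2026: {"financial": 0.55, "service": 0.70, "safety": 0.80, "environment": 0.60},
        2028: {"financial": 0.70, "service": 0.60, "safety": 0.65, "environment": 0.65},
        2030: {"financial": 0.80, "service": 0.45, "safety": 0.40, "environment": 0.70},
    }
    print(best_renewal_year(candidates))  # -> 2028 under these weights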

Relevance: 70.00%

Abstract:

Much of our understanding of human thinking is based on probabilistic models. This innovative book by Jerome R. Busemeyer and Peter D. Bruza argues that, actually, the underlying mathematical structures from quantum theory provide a much better account of human thinking than traditional models. They introduce the foundations for modelling probabilistic-dynamic systems using two aspects of quantum theory. The first, "contextuality", is a way to understand interference effects found with inferences and decisions under conditions of uncertainty. The second, "entanglement", allows cognitive phenomena to be modelled in non-reductionist ways. Employing these principles drawn from quantum theory allows us to view human cognition and decision in a totally new light...

Relevance: 70.00%

Abstract:

Systematic studies that evaluate the quality of decision-making processes are relatively rare. Drawing on the literature on decision quality, this research develops a framework to assess the quality of decision-making processes for resolving boundary conflicts in the Philippines. The evaluation framework breaks the decision-making process down into three components (the decision procedure, the decision method, and the decision unit) and is applied to two ex-post cases (one resolved and one unresolved) and one ex-ante case. The evaluation results from the resolved and unresolved cases show that the choice of decision method plays a minor role in resolving boundary conflicts, whereas the choice of decision procedure is more influential; in the end, a decision unit can choose a simple method to resolve the conflict. The ex-ante case is a follow-up intended to resolve the unresolved case through a changed decision-making process in which the associated decision unit plans to apply the spatial multi-criteria evaluation (SMCE) tool as a decision method. The evaluation results from the ex-ante case confirm that SMCE has the potential to enhance decision quality because: a) it performs well as a decision method in this changed process; and b) the weaknesses associated with the decision unit and the decision procedure of the unresolved case were found to be eliminated in this process.
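
For readers unfamiliar with SMCE, its core operation is a weighted overlay of standardised criterion maps. The following Python sketch is a minimal illustration under assumed layer names and weights; it is not the actual tool, which also supports standardisation, prioritisation, and sensitivity analysis.

    import numpy as np

    # Hypothetical criterion maps, standardised to [0, 1] per cell.
    accessibility = np.array([[0.2, 0.8], [0.5, 0.9]])
    population    = np.array([[0.7, 0.4], [0.6, 0.3]])
    terrain       = np.array([[0.9, 0.5], [0.4, 0.8]])

    weights = {"accessibility": 0.5, "population": 0.3, "terrain": 0.2}
    composite = (weights["accessibility"] * accessibility
                 + weights["population"] * population
                 + weights["terrain"] * terrain)
    print(composite)  # higher cells score better against the combined criteria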

Relevance: 70.00%

Abstract:

Sophisticated models of human social behaviour are fast becoming highly desirable in an increasingly complex and interrelated world. Here, we propose that rather than taking established theories from the physical sciences and naively mapping them into the social world, the advanced concepts and theories of social psychology should be taken as a starting point, and used to develop a new modelling methodology. In order to illustrate how such an approach might be carried out, we attempt to model the low-elaboration attitude changes of a society of agents in an evolving social context. We propose a geometric model of an agent in context, where individual agent attitudes are seen to self-organise to form ideologies, which then serve to guide further agent-based attitude changes. A computational implementation of the model is shown to exhibit a number of interesting phenomena, including a tendency for a measure of the entropy in the system to decrease, and a potential for externally guiding a population of agents towards a new desired ideology.
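
A toy version of this dynamic can be written in a few lines. In the Python sketch below, the update rule, the consensus measure, and all parameters are illustrative stand-ins, not the paper's model: agent attitudes are unit vectors that drift toward an emergent mean direction, and a dispersion measure standing in for entropy falls as they self-organise.

    import numpy as np

    rng = np.random.default_rng(0)
    n_agents, dim, rate = 200, 2, 0.05
    attitudes = rng.normal(size=(n_agents, dim))
    attitudes /= np.linalg.norm(attitudes, axis=1, keepdims=True)

    def dispersion(a):
        """Entropy-like proxy: 1 - length of the mean attitude (0 = consensus)."""
        return 1.0 - np.linalg.norm(a.mean(axis=0))

    print(dispersion(attitudes))  # high: attitudes start disordered
    for step in range(100):
        ideology = attitudes.mean(axis=0)
        ideology /= np.linalg.norm(ideology)          # emergent shared direction
        attitudes += rate * (ideology - attitudes)    # low-elaboration pull
        attitudes /= np.linalg.norm(attitudes, axis=1, keepdims=True)
    print(dispersion(attitudes))  # low: attitudes have self-organised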

Relevance: 70.00%

Abstract:

The article focuses on how the information seeker makes decisions about relevance. It employs a novel decision theory based on quantum probabilities. This direction derives from mounting research within cognitive science showing that decision theories based on quantum probabilities are superior to standard probability models in modelling human judgements [2, 1]. By quantum probabilities, we mean that the decision event space is modelled as a vector space rather than the usual Boolean algebra of sets. In this way, incompatible perspectives around a decision can be modelled, leading to an interference term which modifies the law of total probability. The interference term is crucial in modifying the probability judgements made by current probabilistic systems so that they align better with human judgement. The goal of this article is thus to model the information seeker as a decision maker. For this purpose, signal detection models will be sketched which are in principle applicable in a wide variety of information seeking scenarios.
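
The modification to the law of total probability can be stated explicitly. In a standard two-path form (shown here for illustration, with generic events rather than the article's notation), a relevance judgement R reached via incompatible perspectives A and B obeys

    P(R) = P(A)P(R|A) + P(B)P(R|B) + 2\sqrt{P(A)P(R|A)\,P(B)P(R|B)}\,\cos\theta

where the phase \theta encodes the incompatibility of the two perspectives; \cos\theta = 0 recovers the classical law, while a non-zero interference term shifts P(R) above or below what classical total probability allows.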

Relevance: 70.00%

Abstract:

The quality of environmental decisions is gauged according to the management objectives of a conservation project. Management objectives are generally about maximising some quantifiable measure of system benefit, for instance population growth rate. They can also be defined in terms of learning about the system in question; in such cases, actions are chosen that maximise knowledge gain, for instance in experimental management sites. Learning about a system can also take place during practical management. The adaptive management framework (Walters 1986) formally acknowledges this fact by evaluating learning in terms of how it will improve management of the system, and therefore future system benefit; this is taken into account when ranking actions using stochastic dynamic programming (SDP). However, the benefits of any management action lie on a spectrum from pure system benefit, when there is nothing to be learned about the system, to pure knowledge gain. The current adaptive management framework does not permit management objectives to evaluate actions over the full range of this spectrum. By evaluating knowledge gain in units distinct from future system benefit, this whole spectrum of management objectives can be unlocked. This paper outlines six decision-making policies that differ across the spectrum from pure system benefit through to pure learning. The extensions to adaptive management presented here allow the relative importance of learning compared to system benefit to be specified in management objectives. Such an extension means practitioners can be more specific in constructing conservation project objectives and can create policies for experimental management sites in the same framework as practical management sites.
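
One natural way to write such a spectrum of objectives, given as an illustration in LaTeX notation rather than the paper's exact formulation, is as a convex combination

    V_\lambda(a) = (1 - \lambda)\,\mathbb{E}[\text{system benefit} \mid a] + \lambda\,\mathbb{E}[\text{knowledge gain} \mid a], \qquad \lambda \in [0, 1]

where \lambda = 0 recovers management purely for system benefit, \lambda = 1 values pure learning (as at an experimental site), and intermediate values mix the two; the key point of the abstract is that knowledge gain is measured in its own units rather than folded into future system benefit.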

Relevance: 70.00%

Abstract:

Decision-making for conservation is conducted within the margins of limited funding. Furthermore, to allocate these scarce resources we make assumptions about the relationship between management impact and expenditure. The structure of these relationships, however, is rarely known with certainty. We present a summary of work investigating the impact of model uncertainty on robust decision-making in conservation and how this is affected by available conservation funding. We show that achieving robustness in conservation decisions can require a triage approach, and emphasize the need for managers to consider triage not as surrendering but as rational decision-making to ensure species persistence in light of the urgency of conservation problems, uncertainty, and the poor state of conservation funding. We illustrate this theory with a specific application to the allocation of funding to reduce poaching impact on the Sumatran tiger Panthera tigris sumatrae in Kerinci Seblat National Park, Indonesia.

To conserve our environment, conservation managers must make decisions in the face of substantial uncertainty. Further, they must deal with the fact that limited budgets and temporal constraints have led to a lack of knowledge about the systems we are trying to preserve and about the benefits of the actions available to us (Balmford & Cowling 2006). Given this paucity of decision-informing data, there is a considerable need to assess the impact of uncertainty on the benefit of management options (Regan et al. 2005). Although models of management impact can improve decision-making (e.g. Tenhumberg et al. 2004), they typically rely on assumptions around which there is substantial uncertainty. Ignoring this 'model uncertainty' can lead to inferior decision-making (Regan et al. 2005) and, potentially, the loss of the species we are trying to protect. Current methods used in ecology allow model uncertainty to be incorporated into the model selection process (Burnham & Anderson 2002; Link & Barker 2006), but do not enable decision-makers to assess how this uncertainty would change a decision. This is the basis of information-gap decision theory (info-gap): finding strategies most robust to model uncertainty (Ben-Haim 2006). Info-gap has permitted conservation biology to make the leap from recognizing uncertainty to explicitly incorporating severe uncertainty into decision-making. In this paper we summarise McDonald-Madden et al. (2008a), who use an info-gap framework to address the impact of uncertainty in the functional representations of biological systems on conservation decision-making. Furthermore, we highlight the importance of two key elements limiting conservation decision-making - funding and knowledge - and how they interact to influence the best management strategy for a threatened species.
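
The central object of info-gap analysis is the robustness function; in its standard form (Ben-Haim 2006), written here with generic symbols rather than the paper's notation, the robustness of a strategy q given a critical reward R_c is

    \hat{\alpha}(q, R_c) = \max\{\alpha \geq 0 : \min_{u \in U(\alpha, \tilde{u})} R(q, u) \geq R_c\}

i.e. the greatest horizon of uncertainty \alpha such that every model u in the nested uncertainty set U(\alpha, \tilde{u}) around the best estimate \tilde{u} still delivers reward at least R_c; the most robust strategy is the one maximising \hat{\alpha}.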

Relevance: 70.00%

Abstract:

A flexible and simple Bayesian decision-theoretic design for dose-finding trials is proposed in this paper. To reduce the computational burden, we adopt a working model with conjugate priors, which is flexible enough to fit all monotonic dose-toxicity curves and produces analytic posterior distributions. We also discuss how to use a proper utility function to reflect the interest of the trial. Patients are allocated based not only on the utility function but also on the chosen dose selection rule. The most popular dose selection rule is the one-step-look-ahead (OSLA), which selects the best-so-far dose. A more complicated rule, such as the two-step-look-ahead, is theoretically more efficient than the OSLA only when the required distributional assumptions are met, which is often not the case in practice. We carried out extensive simulation studies to evaluate these two dose selection rules and found that OSLA was often more efficient than the two-step-look-ahead under the proposed Bayesian structure. Moreover, our simulation results show that the proposed Bayesian method's performance is superior to that of several popular Bayesian methods and that the negative impact of prior misspecification can be managed in the design stage.
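
The flavour of a conjugate working model with an OSLA-style rule is easy to convey in code. The Python sketch below assumes an independent Beta posterior per dose and a simple distance-to-target utility; these are hypothetical simplifications, not the paper's working model or utility function.

    import numpy as np

    def osla_next_dose(events, patients, target=0.25, prior=(1.0, 1.0)):
        """One-step rule: pick the dose maximising current expected utility,
        where utility penalises distance of posterior mean toxicity from target."""
        utilities = []
        for tox, n in zip(events, patients):
            a, b = prior[0] + tox, prior[1] + n - tox   # conjugate Beta update
            post_mean = a / (a + b)                     # analytic posterior mean
            utilities.append(-abs(post_mean - target))
        return int(np.argmax(utilities))

    # Hypothetical trial state: toxicities observed / patients treated per dose.
    print(osla_next_dose(events=[0, 1, 3], patients=[3, 6, 6]))  # -> dose index 1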

Relevance: 70.00%

Abstract:

Stallard (1998, Biometrics 54, 279-294) recently used Bayesian decision theory for sample-size determination in phase II trials. His design maximizes the expected financial gains in the development of a new treatment. However, it results in a very high probability (0.65) of recommending an ineffective treatment for phase III testing. On the other hand, the expected gain using his design is more than 10 times that of a design that tightly controls the false positive error (Thall and Simon, 1994, Biometrics 50, 337-349). Stallard's design maximizes the expected gain per phase II trial, but it does not maximize the rate of gain or the total gain over a fixed length of time, because the rate of gain depends on the proportion of treatments forwarded to phase III study. We suggest maximizing the rate of gain, and the resulting optimal one-stage design is twice as efficient as Stallard's one-stage design. Furthermore, the new design has a probability of only 0.12 of passing an ineffective treatment to phase III study.
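
One way to see why the two objectives differ, offered as an illustrative renewal-reward reading rather than the paper's notation, is

    \text{rate of gain} = \frac{\mathbb{E}[\text{gain per phase II trial}]}{\mathbb{E}[\text{time per trial cycle}]}, \qquad \mathbb{E}[\text{time}] = t_{\mathrm{II}} + p_{\mathrm{fwd}}\, t_{\mathrm{III}}

so a design that forwards a large proportion p_fwd of treatments ties up development time in phase III; forwarding ineffective treatments inflates the denominator without adding gain, which is why maximizing gain per trial and maximizing the rate of gain can select different designs.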

Relevance: 60.00%

Abstract:

An information filtering (IF) system monitors an incoming document stream to find the documents that match the information needs specified by user profiles. Learning to use the user profiles effectively is one of the most challenging tasks in developing an IF system. With the document selection criteria better defined on the basis of users' needs, filtering large streams of information can be more efficient and effective.

To learn user profiles, term-based approaches have been widely used in the IF community because of their simplicity and directness, and they are relatively well established. However, these approaches have problems when dealing with polysemy and synonymy, which often lead to information overload. Recently, pattern-based approaches (or Pattern Taxonomy Models (PTM) [160]) have been proposed for IF by the data mining community. These approaches are better at capturing semantic information and have shown encouraging results for improving the effectiveness of IF systems. On the other hand, pattern discovery from large data streams is not computationally efficient, and these approaches must deal with low-frequency patterns. The measures used by the data mining techniques (for example, "support" and "confidence") to learn the profile have turned out to be unsuitable for filtering, and can lead to a mismatch problem.

This thesis combines rough set-based (term-based) reasoning and pattern mining in a unified framework for information filtering to overcome the aforementioned problems. The system consists of two stages: a topic filtering stage and a pattern mining stage. The topic filtering stage is intended to minimise information overload by filtering out the most likely irrelevant information based on the user profiles; a novel user-profile learning method and a theoretical model for threshold setting have been developed using rough set decision theory. The second stage (pattern mining) aims at solving the information mismatch problem and is precision-oriented: a new document-ranking function has been derived by exploiting the patterns in the pattern taxonomy, so that the most likely relevant documents are assigned higher scores. Because relatively few documents remain after the first stage, the computational cost is markedly reduced; at the same time, pattern discovery yields more accurate results. The overall performance of the system is improved significantly.

The new two-stage information filtering model has been evaluated by extensive experiments based on well-known IR benchmarking processes, using the latest version of the Reuters dataset, the Reuters Corpus Volume 1 (RCV1). The performance of the new two-stage model was compared with both term-based and data mining-based IF models. The results demonstrate that the proposed information filtering system significantly outperforms other IF systems, such as the traditional Rocchio IF model, state-of-the-art term-based models including BM25 and Support Vector Machines (SVM), and the Pattern Taxonomy Model (PTM).
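
The shape of the two-stage pipeline can be sketched in a few lines of Python; the thresholds, profile weights, pattern set, and scoring functions below are hypothetical stand-ins for the thesis's rough-set threshold model and pattern-taxonomy ranking.

    PATTERNS = {("data", "mining"): 2.0, ("decision", "theory"): 1.5}

    def topic_score(doc_terms, profile):
        """Stage 1: cheap term-based score against the user profile."""
        return sum(profile.get(t, 0.0) for t in doc_terms)

    def pattern_score(doc_terms, patterns=PATTERNS):
        """Stage 2: credit each weighted pattern fully contained in the document."""
        terms = set(doc_terms)
        return sum(w for pat, w in patterns.items() if set(pat) <= terms)

    def two_stage_filter(docs, profile, threshold):
        survivors = [d for d in docs if topic_score(d, profile) >= threshold]
        return sorted(survivors, key=pattern_score, reverse=True)

    profile = {"data": 1.0, "mining": 1.0, "decision": 0.5}
    docs = [["data", "mining", "patterns"], ["cooking", "recipes"],
            ["decision", "theory", "data"]]
    print(two_stage_filter(docs, profile, threshold=1.0))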
