974 results for Linear decision rules
Abstract:
We consider multistage stochastic linear optimization problems combining joint dynamic probabilistic constraints with hard constraints. We develop a method for projecting decision rules onto hard constraints of wait-and-see type. We establish the relation between the original (infinite-dimensional) problem and approximating problems working with projections from different subclasses of decision policies. Considering the subclass of linear decision rules and a generalized linear model for the underlying stochastic process with noises that are Gaussian or truncated Gaussian, we show that the value and gradient of the objective and constraint functions of the approximating problems can be computed analytically.
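To make the construction concrete: a linear decision rule restricts the stage-t policy to an affine function of the noise history, x_t(xi) = x_t^0 + sum_{s<=t} X_{t,s} xi_s, and the wait-and-see projection maps the candidate decision onto the hard feasible set once the noises are observed. The sketch below is a minimal illustration of that idea, with hypothetical names and simple box constraints standing in for the paper's general hard constraints:

```python
import numpy as np

def ldr_decision(x0, X, xi_hist):
    """Evaluate a linear decision rule: an affine function of the noise history."""
    return x0 + X @ xi_hist

def project_box(x, lo, hi):
    """Wait-and-see step: Euclidean projection onto hard box constraints."""
    return np.clip(x, lo, hi)

# Stage-t decision of dimension 2 driven by a 3-dimensional noise history.
rng = np.random.default_rng(0)
x0 = np.array([1.0, -0.5])      # intercept of the rule
X = rng.normal(size=(2, 3))     # sensitivities to past noises
xi = rng.normal(size=3)         # realized Gaussian noises
print(project_box(ldr_decision(x0, X, xi), lo=0.0, hi=2.0))
```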
Abstract:
We consider the task of constructing smooth and stable decision rules in logical recognition models. Logical regularities of classes are defined as conjunctions of one-place predicates that test whether feature values fall into intervals of the real axis; the conjunctions are true on special non-extendable subsets of the reference objects of a class and are optimal in this sense. The standard approach to constructing linear decision rules from given sets of logical regularities is to implement voting schemes, with weighting coefficients that are either chosen heuristically or obtained as solutions of a complex optimization task. We propose modifications of linear decision rules that search for maximal class estimates of the reference objects and approximate the logical regularities by smooth sigmoid functions.
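The smoothing device described here can be pictured directly: each crisp interval predicate a <= x_j <= b is replaced by a product of two sigmoids, so a logical regularity (a conjunction of such predicates) becomes a differentiable function of the feature vector. A minimal sketch of this substitution (names and the steepness parameter are illustrative):

```python
import numpy as np

def sigmoid(z, beta=10.0):
    """Smooth step function; beta controls how sharply it approximates 0/1."""
    return 1.0 / (1.0 + np.exp(-beta * z))

def smooth_interval(x, a, b, beta=10.0):
    """Differentiable stand-in for the crisp predicate a <= x <= b."""
    return sigmoid(x - a, beta) * sigmoid(b - x, beta)

def smooth_regularity(x, intervals, beta=10.0):
    """Conjunction of interval predicates, approximated by a product of sigmoids."""
    return np.prod([smooth_interval(x[j], a, b, beta)
                    for j, (a, b) in intervals.items()])

# Regularity "x0 in [0, 2] AND x2 in [-1, 1]" evaluated on a 4-feature object.
obj = np.array([1.2, 5.0, 0.3, -2.0])
print(smooth_regularity(obj, {0: (0.0, 2.0), 2: (-1.0, 1.0)}))  # close to 1
```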
Abstract:
The capacity to distinguish colony members from strangers is a key component of social life. In social insects, this extends to the brood and involves discrimination of queen eggs. Chemical substances communicate colony affiliation for both adults and brood; thus, in theory, all colony members should be able to recognize fellow nestmates. In this study, we investigated the ability of Dinoponera quadriceps workers to discriminate nestmate from non-nestmate eggs based on cuticular hydrocarbon composition, and analyzed whether the cuticular hydrocarbons present on the eggs provide discrimination cues. The results show that egg recognition in D. quadriceps is related to both the age and the functional role of workers: brood care workers were able to distinguish nestmate from non-nestmate eggs, while callow and forager workers were unable to do so.
Abstract:
Feature selection plays an important role in knowledge discovery and data mining. In traditional rough set theory, feature selection via reducts, the minimal discerning sets of attributes, is an important area. Nevertheless, the original definition of a reduct is restrictive, so previous research proposed taking into account not only the horizontal reduction of information by feature selection, but also a vertical reduction that considers suitable subsets of the original set of objects. Following that work, we propose a new approach that generates bireducts using a multi-objective genetic algorithm. Although genetic algorithms have been used to calculate reducts in earlier works, we found no work in which they were adopted to calculate bireducts. Compared to prior work in this area, the proposed method involves less randomness in generating bireducts. The genetic algorithm estimates the quality of each bireduct through the values of two objective functions as evolution progresses, yielding a set of bireducts with optimized values of both objectives. Different fitness evaluation methods and genetic operators, such as crossover and mutation, were applied and the resulting prediction accuracies compared. Five datasets were used to test the proposed method and two datasets were used for a comparison study. Statistical analysis using the one-way ANOVA test was performed to determine whether the differences between results were significant. The experiments showed that the proposed method reduces the number of bireducts needed to achieve good prediction accuracy. The influence of different genetic operators and fitness evaluation strategies on prediction accuracy was also analyzed. The prediction accuracies of the proposed method are comparable with the best results in the machine learning literature, and in some cases exceed them.
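For readers unfamiliar with bireducts, the two GA objectives can be made concrete under the common definition of a bireduct as a pair (B, X): an attribute subset B that discerns every pair of objects in the object subset X carrying different decisions. The sketch below scores such a pair by the two competing objectives, few attributes versus many covered objects; it is a toy decision table, and the paper's actual encoding and operators may differ:

```python
from itertools import combinations

def is_bireduct_consistent(data, labels, attrs, objs):
    """Check that the attribute subset discerns every pair of covered
    objects that carry different decision labels."""
    for i, j in combinations(objs, 2):
        if labels[i] != labels[j]:
            if all(data[i][a] == data[j][a] for a in attrs):
                return False
    return True

def objectives(data, labels, attrs, objs):
    """Two GA objectives: few attributes (minimize), many covered objects (maximize)."""
    if not is_bireduct_consistent(data, labels, attrs, objs):
        return None  # infeasible individual
    return (len(attrs), len(objs))

# Toy decision table: 4 objects, 3 attributes, binary decision.
data = [(1, 0, 2), (1, 1, 2), (0, 0, 1), (0, 1, 1)]
labels = [0, 0, 1, 1]
print(objectives(data, labels, attrs={0}, objs={0, 1, 2, 3}))  # (1, 4)
```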
Abstract:
The paper focuses on the organization of institutions designed to resolve disputes between two parties, when some information is not verifiable and decision makers may have vested preferences. It shows that the choice of how much discretional power to grant to the decision maker and who provides the information are intrinsically related. Direct involvement of the interested parties in the supply of information enhances monitoring over the decision maker, although at the cost of higher manipulation. Thus, it is desirable when the decision maker is granted high discretion. On the contrary, when the decision maker has limited discretional power, information provision is better assigned to an agent with no direct stake. The analysis helps to rationalize some organizational arrangements that are commonly observed in the context of judicial and antitrust decision-making.
Abstract:
We analyze the contractual design problem of a principal who delegates decision-making and information provision. The principal faces two tasks: he has to decide the level of discretion to be granted to the decision-maker and to establish who is in charge of supplying the information. We show that these two choices are intrinsically related. When the decision-maker is granted high discretion, information provision is optimally delegated to the parties directly affected by the decision. Conversely, when the decision-maker enjoys little discretion, it is more desirable to rely on a third impartial agent. The paper helps rationalize some organizational arrangements that are commonly observed in the context of judicial and antitrust decision-making.
Abstract:
Background: Although CD4 cell count monitoring is used to decide when to start antiretroviral therapy in patients with HIV-1 infection, there are no evidence-based recommendations regarding its optimal frequency. It is common practice to monitor every 3 to 6 months, often coupled with viral load monitoring. We developed rules to guide the frequency of CD4 cell count monitoring in HIV infection before starting antiretroviral therapy, which we validated retrospectively in patients from the Swiss HIV Cohort Study. Methodology/Principal Findings: We built two prediction rules ("Snap-shot rule" for a single sample and "Track-shot rule" for multiple determinations) based on a systematic review of published longitudinal analyses of CD4 cell count trajectories. We applied the rules in 2608 untreated patients to classify their 18 061 CD4 counts as either justifiable or superfluous, according to their prior ≥5% or <5% chance of meeting predetermined thresholds for starting treatment. The percentage of measurements that both rules falsely deemed superfluous never exceeded 5%. Superfluous CD4 determinations represented 4%, 11%, and 39% of all actual determinations for treatment thresholds of 500, 350, and 200×10⁶/L, respectively. The Track-shot rule was only marginally superior to the Snap-shot rule. Both rules lose usefulness as CD4 counts approach the treatment threshold. Conclusions/Significance: Frequent CD4 count monitoring of patients with CD4 counts well above the threshold for initiating therapy is unlikely to identify patients who require therapy. It appears sufficient to measure the CD4 cell count 1 year after a count >650 for a threshold of 200, >900 for 350, or >1150 for 500×10⁶/L, respectively. When CD4 counts fall below these limits, increased monitoring frequency becomes advisable. These rules offer guidance for efficient CD4 monitoring, particularly in resource-limited settings.
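The Snap-shot rule reduces to a simple lookup: given the treatment threshold in use, a single CD4 count above the matching safety limit makes re-testing within one year superfluous. A minimal sketch of that logic using only the limits quoted above (function and variable names are illustrative):

```python
# Safety limits from the abstract: a count above the limit implies a <5% chance
# of crossing the matching treatment threshold within one year (×10^6 cells/L).
SNAP_SHOT_LIMITS = {200: 650, 350: 900, 500: 1150}

def next_cd4_due_in_one_year(cd4_count, treatment_threshold):
    """Snap-shot rule sketch: True if re-testing can safely wait one year."""
    return cd4_count > SNAP_SHOT_LIMITS[treatment_threshold]

print(next_cd4_due_in_one_year(720, 200))  # True: well above the 650 limit
print(next_cd4_due_in_one_year(720, 350))  # False: below 900, monitor sooner
```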
Abstract:
Venous thromboembolism (VTE) is a potentially lethal clinical condition that is suspected in patients with common clinical complaints in many and varied clinical care settings. Once VTE is diagnosed, the optimal therapeutic management (thrombolysis, IVC filters, type and duration of anticoagulants) and the ideal management setting (outpatient, critical care) are also controversial. Clinical prediction tools, including clinical decision rules and D-Dimer, have been developed, and some validated, to assist clinical decision making along the diagnostic and therapeutic management paths for VTE. Despite these developments, practice variation is high and many controversies remain in the use of these clinical prediction tools. In this narrative review, we highlight challenges and controversies in VTE diagnostic and therapeutic management with a focus on clinical decision rules and D-Dimer.
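As one concrete example of the kind of clinical decision rule discussed here, the sketch below encodes the widely published Wells criteria for pulmonary embolism with the dichotomized cutoff; the point values are as commonly cited, and the code is an illustration rather than the review's own tool:

```python
# Wells criteria for pulmonary embolism (point values as commonly published).
WELLS_PE_POINTS = {
    "clinical_signs_of_dvt": 3.0,
    "pe_most_likely_diagnosis": 3.0,
    "heart_rate_over_100": 1.5,
    "immobilization_or_recent_surgery": 1.5,
    "previous_dvt_or_pe": 1.5,
    "hemoptysis": 1.0,
    "active_malignancy": 1.0,
}

def wells_pe_score(findings):
    """Sum the points for the findings present."""
    return sum(WELLS_PE_POINTS[f] for f in findings)

def pe_unlikely(findings):
    """Dichotomized cutoff: a score <= 4 is 'PE unlikely'; combined with a
    negative D-Dimer, this is commonly used to forgo imaging."""
    return wells_pe_score(findings) <= 4.0

print(pe_unlikely({"heart_rate_over_100", "hemoptysis"}))  # True (score 2.5)
```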
Abstract:
Includes bibliographical references (p. 38).
Abstract:
This dissertation is a discourse on the capital market and its interactive framework of acquisition and issuance of financial assets that drive the economy from both sides: investors/lenders and issuers/users of capital assets. My work consists of four essays in financial economics that offer a spectrum of revisions to this significant area of study. The first essay delineates the capital market over the past half century and the major developments that pertain to the investor's opportunity set and the corporation's capital-raising availability set. This chapter should have merit on two counts: (i) a comprehensive account of capital markets and return-generating assets and (ii) a backdrop against which I present my findings in Chapters 2 through 4. In Chapter 2, I rework the Markowitz-Roy-Tobin structure of the efficient frontier and the Separation Theorem. Starting with a 2-asset portfolio and extending the paradigm to an n-asset portfolio, I derive the optimal choice of assets for an investor under constrained utility maximization. In this chapter, I analyze the selection- and revision-theoretic construct and bring out optimum choices; the effect of a change in the investor's perceived risk or return on portfolio composition is ascertained. Chapter 3 looks into corporations that issue market securities. The question of how a corporation decides which kinds of securities to issue in the marketplace to raise funds brings out the classic value-invariance proposition of Modigliani and Miller and fills a gap that existed in the literature for almost half a century. I question the general validity of the classic results of Modigliani and Miller and modify the existing literature on the celebrated value-invariance proposition. Chapter 4 takes the Modigliani-Miller regime to its correct prescription in the presence of corporate and personal taxes, showing that Modigliani-Miller's age-old proposition needs corrections and extensions, which I derive. Overall, my dissertation brings these corrections and extensions to the existing literature, showing that capital markets are in an ever-changing state of necessary revision.
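For the two-asset case on which Chapter 2 builds, the central closed form is standard: portfolio variance is sigma_p^2 = w^2 sigma_1^2 + (1-w)^2 sigma_2^2 + 2w(1-w) sigma_12, minimized at w* = (sigma_2^2 - sigma_12) / (sigma_1^2 + sigma_2^2 - 2 sigma_12). A quick numerical check of that formula (the numbers are illustrative, not the dissertation's):

```python
def min_variance_weight(var1, var2, cov12):
    """Weight on asset 1 that minimizes two-asset portfolio variance."""
    return (var2 - cov12) / (var1 + var2 - 2.0 * cov12)

def portfolio_variance(w, var1, var2, cov12):
    """Variance of the portfolio holding weight w in asset 1."""
    return w**2 * var1 + (1 - w)**2 * var2 + 2 * w * (1 - w) * cov12

w = min_variance_weight(var1=0.04, var2=0.09, cov12=0.006)
print(round(w, 4), round(portfolio_variance(w, 0.04, 0.09, 0.006), 5))
```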
Abstract:
Research on judgment and decision making presents a confusing picture of human abilities. For example, much research has emphasized the dysfunctional aspects of judgmental heuristics, and yet other findings suggest that these can be highly effective. A further line of research has modeled judgment as resulting from "as if" linear models. This paper illuminates the distinctions between these approaches by providing a common analytical framework based on the central theoretical premise that understanding human performance requires specifying how characteristics of the decision rules people use interact with the demands of the tasks they face. Our work synthesizes the analytical tools of lens model research with novel methodology developed to specify the effectiveness of heuristics in different environments and allows direct comparisons between the different approaches. We illustrate with both theoretical analyses and simulations. We further link our results to the empirical literature by a meta-analysis of lens model studies and estimate both human and heuristic performance in the same tasks. Our results highlight the trade-off between linear models and heuristics. Whereas the former are cognitively demanding, the latter are simple to use. However, heuristics require knowledge, and thus maps, of when and which heuristic to employ.
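The linear-model-versus-heuristic trade-off quantified here can be reproduced in miniature: fit an "as if" linear model to binary cues and compare it with a single-cue rule in the spirit of take-the-best that consults only the most valid cue. The following toy simulation (not the paper's method or data) illustrates the comparison:

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 200, 4
cues = rng.integers(0, 2, size=(n, k))          # binary cue profiles
weights = np.array([0.5, 0.3, 0.15, 0.05])      # true environment weights
criterion = cues @ weights + rng.normal(0, 0.1, n)

# "As if" linear model: ordinary least squares on all cues.
design = np.c_[np.ones(n), cues]
beta, *_ = np.linalg.lstsq(design, criterion, rcond=None)
linear_pred = design @ beta

# Single-cue heuristic: predict from the most valid cue alone.
validities = np.corrcoef(cues.T, criterion)[-1, :k]
heuristic_pred = cues[:, np.argmax(np.abs(validities))].astype(float)

for name, pred in [("linear", linear_pred), ("heuristic", heuristic_pred)]:
    print(name, np.corrcoef(pred, criterion)[0, 1].round(3))
```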