976 results for Logical Decision Function
Abstract:
Economics is a social science and, as such, focuses on people and on the decisions they make, whether in an individual context or in group situations. It studies human choices in the face of needs to be fulfilled and a limited amount of resources with which to fulfill them. For a long time, the normative and positive views of human behavior converged, in that the ideal and predicted decisions of agents in economic models were entangled in one single concept. That is, it was assumed that the best that could be done in each situation was exactly the choice that would prevail. Or, at least, that the facts that economics needed to explain could be understood in the light of models in which individual agents act as if they were able to make ideal decisions. In recent decades, however, the complexity of the environment in which economic decisions are made and the limits on the ability of agents to deal with it have been recognized and incorporated into models of decision making, in what came to be known as the bounded rationality paradigm. This shift was triggered by the incapacity of the unbounded rationality paradigm to explain observed phenomena and behavior. This thesis contributes to the literature in three different ways. Chapter 1 is a survey on bounded rationality, which gathers and organizes the contributions to the field since Simon (1955) first recognized the necessity of accounting for the limits on human rationality. The focus of the survey is on theoretical work rather than on the experimental literature, which presents evidence of actual behavior that differs from what classic rationality predicts. The general framework is as follows. Given a set of exogenous variables, the economic agent needs to choose an element from the choice set available to him, in order to optimize the expected value of an objective function (assuming his preferences are representable by such a function).
If this problem is too complex for the agent to deal with, one or more of its elements is simplified. Each bounded rationality theory is categorized according to the most relevant element it simplifies. Chapter 2 proposes a novel theory of bounded rationality. Much in the same fashion as Conlisk (1980) and Gabaix (2014), we assume that thinking is costly, in the sense that agents have to pay a cost for performing mental operations. In our model, if they choose not to think, this cost is avoided, but they are left with a single alternative, labeled the default choice. We exemplify the idea with a very simple model of consumer choice and identify the concept of isofin curves, i.e., sets of default choices which generate the same utility net of thinking cost. Then, we apply the idea to a linear symmetric Cournot duopoly, in which the default choice can be interpreted as the most natural quantity to be produced in the market. We find that, as the thinking cost increases, the number of firms thinking in equilibrium decreases. More interestingly, for intermediate levels of thinking cost, there exists an equilibrium in which one of the firms chooses the default quantity and the other best responds to it, generating asymmetric choices in a symmetric model. Our model is able to explain well-known regularities identified in the Cournot experimental literature, such as the adoption of different strategies by players (Huck et al., 1999), the intertemporal rigidity of choices (Bosch-Domènech & Vriend, 2003) and the dispersion of quantities in the context of difficult decision making (Bosch-Domènech & Vriend, 2003). Chapter 3 applies a model of bounded rationality in a game-theoretic setting to the well-known turnout paradox: in large elections, pivotal probabilities vanish very quickly and no one should vote, in sharp contrast with the observed high levels of turnout.
Inspired by the concept of rhizomatic thinking, introduced by Bravo-Furtado & Côrte-Real (2009a), we assume that each person is self-delusional in the sense that, when making a decision, she believes that a fraction of the people who support the same party decide alike, even if no communication is established between them. This kind of belief simplifies the decision of the agent, as it reduces the number of players he believes to be playing against; it is thus a bounded rationality approach. Studying a two-party first-past-the-post election with a continuum of self-delusional agents, we show that the turnout rate is positive in all the possible equilibria, and that it can be as high as 100%. The game displays multiple equilibria, at least one of which entails a victory of the bigger party. The smaller one may also win, provided its relative size is not too small; more self-delusional voters in the minority party decrease this threshold size. Our model is able to explain some empirical facts, such as the possibility that a close election leads to low turnout (Geys, 2006), a lower margin of victory when turnout is higher (Geys, 2006) and high turnout rates favoring the minority (Bernhagen & Marsh, 1997).
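The vanishing pivotal probabilities behind the turnout paradox can be illustrated with a minimal calculation. This is a plain binomial tie model, not the chapter's rhizomatic framework: a single vote is decisive only when the other voters split exactly evenly, an event whose probability shrinks rapidly with electorate size.

```python
from math import exp, lgamma, log

def pivotal_probability(n_others, p=0.5):
    """P(the other 2n voters split exactly n-n) in a binomial voting model,
    i.e., the chance one extra vote is decisive. Computed in log space via
    lgamma so that large electorates do not overflow floating point."""
    n = n_others
    log_comb = lgamma(2 * n + 1) - 2 * lgamma(n + 1)   # log C(2n, n)
    return exp(log_comb + n * log(p) + n * log(1 - p))
```

With 10 other voters the probability of a decisive vote is about 0.25; with 100,000 it is already below one percent, which is why, under classical rationality, turnout should collapse in large elections.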
Abstract:
What genotype should the scientist specify for conducting a database search to try to find the source of a low-template-DNA (lt-DNA) trace? When the scientist answers this question, he or she makes a decision. Here, we approach this decision problem from a normative point of view by defining a decision-theoretic framework for answering this question for one locus. This framework combines the probability distribution describing the uncertainty over the trace's donor's possible genotypes with a loss function describing the scientist's preferences concerning false exclusions and false inclusions that may result from the database search. According to this approach, the scientist should choose the genotype designation that minimizes the expected loss. To illustrate the results produced by this approach, we apply it to two hypothetical cases: (1) the case of observing one peak for allele xi on a single electropherogram, and (2) the case of observing one peak for allele xi on one replicate, and a pair of peaks for alleles xi and xj, i ≠ j, on a second replicate. Given that the probabilities of allele drop-out are defined as functions of the observed peak heights, the threshold values marking the turning points when the scientist should switch from one designation to another are derived in terms of the observed peak heights. For each case, sensitivity analyses show the impact of the model's parameters on these threshold values. The results support the conclusion that the procedure should not focus on a single threshold value for making this decision for all alleles, all loci and in all laboratories.
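The minimum-expected-loss rule described above can be sketched for a single locus. The genotype labels, posterior probabilities and loss values below are illustrative placeholders, not figures from the paper:

```python
# Hedged sketch of the expected-loss rule for one locus: combine a posterior
# over the trace donor's genotypes with a loss for each (designation, truth)
# pair, and report the designation minimizing posterior expected loss.

def expected_loss(designation, posterior, loss):
    """Posterior expected loss of specifying `designation` for the search."""
    return sum(posterior[truth] * loss[(designation, truth)] for truth in posterior)

def optimal_designation(candidates, posterior, loss):
    """Candidate designation with minimal posterior expected loss."""
    return min(candidates, key=lambda d: expected_loss(d, posterior, loss))

# Illustrative lt-DNA situation: a true heterozygote xi/xj may drop out and
# look homozygous; false exclusions are penalized more than false inclusions.
posterior = {"xi/xi": 0.7, "xi/xj": 0.3}
loss = {("xi/xi", "xi/xi"): 0.0, ("xi/xi", "xi/xj"): 10.0,   # false exclusion: costly
        ("xi/xj", "xi/xj"): 0.0, ("xi/xj", "xi/xi"): 1.0}    # false inclusion: mild
best = optimal_designation(["xi/xi", "xi/xj"], posterior, loss)
```

Here the heterozygous designation is optimal even though the homozygote is the more probable genotype, because of the asymmetric loss; this is the mechanism behind the paper's conclusion that no single fixed threshold fits all alleles and loci.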
Abstract:
The Notch1 gene has an important role in mammalian cell-fate decision and tumorigenesis. Upstream control mechanisms for transcription of this gene are still poorly understood. In a chemical genetics screen for small molecule activators of Notch signalling, we identified epidermal growth factor receptor (EGFR) as a key negative regulator of Notch1 gene expression in primary human keratinocytes, intact epidermis and skin squamous cell carcinomas (SCCs). The underlying mechanism for negative control of the Notch1 gene in human cells, as well as in a mouse model of EGFR-dependent skin carcinogenesis, involves transcriptional suppression of p53 by the EGFR effector c-Jun. Suppression of Notch signalling in cancer cells counteracts the differentiation-inducing effects of EGFR inhibitors while, at the same time, synergizing with these compounds in induction of apoptosis. Thus, our data reveal a key role of EGFR signalling in the negative regulation of Notch1 gene transcription, of potential relevance for combinatory approaches for cancer therapy.
Abstract:
BACKGROUND: Shared Decision Making (SDM) is increasingly advocated as a model for medical decision making. However, there is still low use of SDM in clinical practice. High impact factor journals might represent an efficient way for its dissemination. We aimed to identify and characterize publication trends of SDM in 15 high impact medical journals. METHODS: We selected the 15 general and internal medicine journals with the highest impact factor publishing original articles, letters and editorials. We retrieved publications from 1996 to 2011 through the full-text search function on each journal website and abstracted bibliometric data. We included publications of any type containing the phrase "shared decision making" or five other variants in their abstract or full text. These were referred to as SDM publications. A polynomial Poisson regression model with logarithmic link function was used to assess the evolution across the period of the number of SDM publications according to publication characteristics. RESULTS: We identified 1285 SDM publications out of 229,179 publications in 15 journals from 1996 to 2011. The absolute number of SDM publications by journal ranged from 2 to 273 over 16 years. SDM publications increased both in absolute and relative numbers per year, from 46 (0.32% relative to all publications from the 15 journals) in 1996 to 165 (1.17%) in 2011. This growth was exponential (P < 0.01). We found fewer research publications (465, 36.2% of all SDM publications) than non-research publications, which included non-systematic reviews, letters, and editorials. The increase of research publications across time was linear. Full-text search retrieved ten times more SDM publications than a similar PubMed search (1285 vs. 119 respectively). CONCLUSION: This review in full-text showed that SDM publications increased exponentially in major medical journals from 1996 to 2011. 
This growth might reflect an increased dissemination of the SDM concept to the medical community.
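The reported exponential growth can be checked against the abstract's own endpoints: under a log-link (Poisson) model the expected count is exp(b0 + b1·t), so 46 publications in 1996 and 165 in 2011 imply an average annual growth rate of roughly 9%:

```python
from math import exp, log

# Back-of-the-envelope check using only figures quoted in the abstract:
# E[count in year t] = exp(b0 + b1 * t) implies
# b1 = (log(count_2011) - log(count_1996)) / (2011 - 1996).
b1 = (log(165) - log(46)) / (2011 - 1996)
annual_growth = exp(b1) - 1  # roughly 0.09, i.e. ~9% growth per year
```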
Abstract:
In patients undergoing non-cardiac surgery, cardiac events are the most common cause of perioperative morbidity and mortality. It is often difficult to choose adequate cardiologic examinations before surgery. This paper, inspired by the guidelines of the European and American societies of cardiology (ESC, AHA, ACC), discusses the place of standard ECG, echocardiography, treadmill or bicycle ergometer and pharmacological stress testing in preoperative evaluations. The role of coronary angiography and prophylactic revascularization will also be discussed. Finally, we provide a decision tree which will be helpful to both general practitioners and specialists.
Abstract:
This paper applies probability and decision theory in the graphical interface of an influence diagram to study the formal requirements of rationality which justify the individualization of a person found through a database search. The decision-theoretic part of the analysis studies the parameters that a rational decision maker would use to individualize the selected person. The modeling part (in the form of an influence diagram) clarifies the relationships between this decision and the ingredients that make up the database search problem, i.e., the results of the database search and the different pairs of propositions describing whether an individual is at the source of the crime stain. These analyses evaluate the desirability associated with the decision of 'individualizing' (and 'not individualizing'). They point out that this decision is a function of (i) the probability that the individual in question is, in fact, at the source of the crime stain (i.e., the state of nature), and (ii) the decision maker's preferences among the possible consequences of the decision (i.e., the decision maker's loss function). We discuss the relevance and argumentative implications of these insights with respect to recent comments in specialized literature, which suggest points of view that are opposed to the results of our study.
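The two-action comparison at the heart of this analysis reduces to a probability threshold when correct decisions carry zero loss. The sketch below assumes such a loss structure with illustrative values; the symmetric treatment of losses is a simplification, not the paper's full influence-diagram model:

```python
# Individualize when expected loss of doing so is lower than of refraining:
# (1 - p) * loss_false_ind < p * loss_missed_ind, which rearranges to a
# threshold on p, the probability the individual is the source of the stain.

def individualization_threshold(loss_false_ind, loss_missed_ind):
    """Source probability above which 'individualize' minimizes expected loss."""
    return loss_false_ind / (loss_false_ind + loss_missed_ind)

def decide(p_source, loss_false_ind=100.0, loss_missed_ind=1.0):
    thr = individualization_threshold(loss_false_ind, loss_missed_ind)
    return "individualize" if p_source > thr else "do not individualize"
```

With a false individualization weighted 100 times worse than a missed one, the threshold is 100/101 ≈ 0.99, showing concretely how the decision depends on both the state-of-nature probability and the decision maker's loss function.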
Abstract:
Single amino acid substitution is the type of protein alteration most related to human diseases. Current studies seek primarily to distinguish neutral mutations from harmful ones. Very few methods offer an explanation of the final prediction result in terms of the probable structural or functional effect on the protein. In this study, we describe the use of three novel parameters to identify experimentally-verified critical residues of the TP53 protein (p53). The first two parameters make use of a surface clustering method to calculate the protein surface area of highly conserved regions or regions with high nonlocal atomic interaction energy (ANOLEA) score. These parameters help identify important functional regions on the surface of a protein. The last parameter involves the use of a new method for pseudobinding free-energy estimation to specifically probe the importance of residue side-chains to the stability of protein fold. A decision tree was designed to optimally combine these three parameters. The result was compared to the functional data stored in the International Agency for Research on Cancer (IARC) TP53 mutation database. The final prediction achieved a prediction accuracy of 70% and a Matthews correlation coefficient of 0.45. It also showed a high specificity of 91.8%. Mutations in the 85 correctly identified important residues represented 81.7% of the total mutations recorded in the database. In addition, the method was able to correctly assign a probable functional or structural role to the residues. Such information could be critical for the interpretation and prediction of the effect of missense mutations, as it not only provided the fundamental explanation of the observed effect, but also helped design the most appropriate laboratory experiment to verify the prediction results.
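The shape of such a three-parameter decision tree can be sketched as follows. The rule ordering and all threshold values are hypothetical placeholders, not the ones optimized against the IARC TP53 database:

```python
# Hypothetical sketch of a decision tree combining three residue-level
# parameters: conserved surface area, high-ANOLEA-score surface area, and a
# pseudo binding free-energy change for the side-chain. Thresholds are
# illustrative only.

def classify_residue(conserved_area, energy_area, ddg):
    """Label a residue as critical (with a probable role) or non-critical."""
    if ddg > 2.0:                 # side-chain important to fold stability
        return "critical (structural)"
    if conserved_area > 50.0:     # part of a large conserved surface patch
        return "critical (functional)"
    if energy_area > 40.0:        # part of a high-interaction-energy patch
        return "critical (functional)"
    return "non-critical"
```

Note how the tree not only flags a residue but also assigns a probable structural or functional role, which is the interpretive advantage the study emphasizes.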
Abstract:
At a time when disciplined inference and decision making under uncertainty represent common aims for participants in legal proceedings, the scientific community is remarkably heterogeneous in its attitudes as to how these goals ought to be achieved. Probability and decision theory exert a considerable influence, and, we think, rightly so, but they go against a mainstream of thinking that does not embrace, or is not aware of, the 'normative' character of this body of theory. It is normative, in the sense understood in this article, in that it prescribes particular properties, typically (logical) coherence, to which reasoning and decision making ought to conform. Disregarding these properties can result in diverging views which are occasionally used as an argument against the theory, or as a pretext for not following it. Typical examples are objections according to which people, both in everyday life and at various levels of the judicial process, find the theory difficult to understand and to apply. A further objection is that the theory does not reflect how people actually behave. This article aims to point out in what sense these examples misinterpret the analytical framework in its normative perspective. Through examples borrowed mostly from forensic science contexts, it is argued that so-called intuitive scientific attitudes are particularly liable to such misconceptions. These attitudes are contrasted with a statement of the actual liberties and constraints of probability and decision theory and the view according to which this theory is normative.
Abstract:
Reinsurance is one of the tools that an insurer can use to mitigate underwriting risk and thereby control its solvency. In this paper, we focus on proportional reinsurance arrangements and examine several optimization and decision problems of the insurer with respect to the reinsurance strategy. To this end, we use as decision tools not only the probability of ruin but also the random variable 'deficit at ruin, if ruin occurs'. The discounted penalty function (Gerber & Shiu, 1998) is employed to calculate, as particular cases, the probability of ruin and the moments and distribution function of the deficit at ruin, if ruin occurs.
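The effect of proportional reinsurance on the probability of ruin can be illustrated with the textbook Cramér-Lundberg model with exponential claims, a special case in which the ruin probability has a closed form. This is a standard illustration with illustrative parameter values, not the paper's Gerber-Shiu machinery:

```python
from math import exp

def ruin_probability(u, lam, mu, c):
    """Cramér-Lundberg ruin probability with Poisson(lam) claim arrivals,
    exponential claims of mean mu, premium rate c and initial surplus u:
    psi(u) = (lam*mu/c) * exp(-(1/mu - lam/c) * u), valid when c > lam*mu."""
    assert c > lam * mu, "net profit condition violated: ruin is certain"
    return (lam * mu / c) * exp(-(1.0 / mu - lam / c) * u)

def retained_ruin_probability(u, lam, mu, c, b, theta_re):
    """Proportional reinsurance sketch: the insurer retains a share b of each
    claim (retained claims are exponential with mean b*mu) and cedes the
    rest for a reinsurance premium with safety loading theta_re."""
    c_retained = c - (1 + theta_re) * (1 - b) * lam * mu
    return ruin_probability(u, lam, b * mu, c_retained)
```

For example, with lam = mu = 1, c = 1.3, retention b = 0.5 and reinsurance loading 0.4, the retained ruin probability at surplus u = 5 falls below the unreinsured one, even though the reinsurer's loading exceeds the insurer's.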
Abstract:
Since the first implantation of an endograft in 1991, endovascular aneurysm repair (EVAR) has rapidly gained recognition. Historical trials showed lower early mortality rates, but these results were not maintained beyond 4 years. Despite newer-generation devices, higher rates of reintervention are associated with EVAR during follow-up. Therefore, the best therapeutic decision relies on many parameters that the physician has to take into consideration. Patients' preferences and characteristics are important, especially age and life expectancy, besides health status. Aneurysmal anatomical conditions remain probably the most predictive factor and should be carefully evaluated to offer the best treatment. Unfavorable anatomy has been observed to be associated with more complications, especially endoleak, leading to more reinterventions and a higher risk of late mortality. Nevertheless, technological advances have made surgeons move beyond the set barriers. Thus, more endografts are implanted outside the instructions for use, despite excellent results after open repair, especially in low-risk patients. When debating AAA repair, some other crucial points should be analysed. It has been shown that strict surveillance is mandatory after EVAR to offer durable results and prevent late rupture. Such a program is associated with additional costs and with an increased risk of radiation. Moreover, a risk of loss of renal function exists when repetitive imaging and secondary procedures are required. The aim of this article is to review the data associated with abdominal aortic aneurysm and its treatment, in order to establish selection criteria for deciding between open and endovascular repair.
Abstract:
Our objective was to determine the test and treatment thresholds for common acute primary care conditions. We presented 200 clinicians with a series of web-based clinical vignettes, describing patients with possible influenza, acute coronary syndrome (ACS), pneumonia, deep vein thrombosis (DVT) and urinary tract infection (UTI). We randomly varied the probability of disease and asked whether the clinician wanted to rule out disease, order tests or rule in disease. By randomly varying the probability, we obtained clinical decisions across a broad range of disease probabilities that we used to create threshold curves. For influenza, the test (4.5% vs 32%, p<0.001) and treatment (55% vs 68%, p=0.11) thresholds were lower for US compared with Swiss physicians. US physicians had somewhat higher test (3.8% vs 0.7%, p=0.107) and treatment (76% vs 58%, p=0.005) thresholds for ACS than Swiss physicians. For both groups, the range between test and treatment thresholds was greater for ACS than for influenza (which is sensible, given the consequences of incorrect diagnosis). For pneumonia, US physicians had a trend towards higher test thresholds and lower treatment thresholds (48% vs 64%, p=0.076) than Swiss physicians. The DVT and UTI scenarios did not provide easily interpretable data, perhaps due to poor wording of the vignettes. We have developed a novel approach for determining decision thresholds. We found important differences in thresholds for US and Swiss physicians that may be a function of differences in healthcare systems. Our results can also guide development of clinical decision rules and guidelines.
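The empirical thresholds above have a classical theoretical counterpart: the Pauker-Kassirer test and test-treatment thresholds, computed from the test's operating characteristics and the benefit/harm of treatment. The sketch below uses that standard textbook form (ignoring the cost and risk of the test itself), not the vignette-based method of the study:

```python
def thresholds(sens, spec, benefit, harm):
    """Classical (Pauker-Kassirer-style) decision thresholds, ignoring the
    cost/risk of the test itself. Below the test threshold: rule out without
    testing; above the treatment threshold: treat without testing; between
    the two: order the test.
      benefit: net benefit of treating a diseased patient
      harm:    net harm of treating a non-diseased patient"""
    test_thr = (1 - spec) * harm / ((1 - spec) * harm + sens * benefit)
    treat_thr = spec * harm / (spec * harm + (1 - sens) * benefit)
    return test_thr, treat_thr
```

Two sanity checks: a perfect test (sens = spec = 1) widens the testing range to (0, 1), and a large benefit-to-harm ratio, as with ACS, pulls both thresholds down, consistent with the wide ACS testing range observed above.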
Abstract:
The target of this thesis was to determine whether the decision to outsource part of Filtronic LK's warehouse function has been profitable. A further aim was to describe the current logistics processes between the third-party logistics provider (TPLP) and the company, and to identify targets for developing these processes. The decision to outsource part of the logistical functions proved profitable during the first business year. A partnership always involves business risks, and high asset-specific investments increase that risk. On the other hand, investing in the partnership increases mutual trust and commitment between the parties. By developing the partnership, risks and opportunistic behaviour can be decreased. Potential was also observed in the management of material and data flows between the logistics service provider and the company. An analysis of inventory efficiency highlighted the need to decrease the capital invested in inventories. Recommendations for managing the outsourced logistical functions were established, such as improving the partnership, process development, performance measurement and invoice checking.
Abstract:
This paper aims at clarifying the nature of Frege's system of logic, as presented in the first volume of the Grundgesetze. We undertake a rational reconstruction of this system by distinguishing its propositional and predicate fragments. This allows us to emphasise the differences and similarities between this system and a modern system of classical second-order logic.
Abstract:
This research aimed to compare two female broiler breeder ages during the incubation period, with respect to management, using the Analytic Hierarchy Process (AHP) method. This method makes it possible to analyze a multicriteria problem and assists decision making. The study was carried out at a commercial hatchery located in São Paulo, Brazil. Two broiler breeder ages (42 and 56 weeks) were compared with respect to production rate. The same production index data were used for both ages and were submitted to multicriteria decision analysis using the AHP method. The results indicate that 42-week-old broiler breeders presented better performance than 56-week-old ones, and that the setter phase (incubation) is more critical than the hatcher phase. The AHP method was efficient for this analysis and can serve as a methodological basis for future studies to improve the hatchability of broiler eggs.
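The core AHP computation can be sketched as follows: derive priority weights from a pairwise comparison matrix via its principal eigenvector and check Saaty's consistency ratio. The 3x3 comparison matrix below is illustrative, not data from the study:

```python
import numpy as np

# Saaty's random consistency indices for matrices of size n (standard values).
RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}

def ahp_weights(A):
    """Priority weights (principal eigenvector, normalized to sum to 1)
    and consistency ratio CR = CI / RI, where CI = (lam_max - n)/(n - 1).
    CR < 0.1 is conventionally taken as acceptably consistent."""
    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)
    w = np.abs(vecs[:, k].real)
    w = w / w.sum()
    n = A.shape[0]
    ci = (vals[k].real - n) / (n - 1)
    cr = ci / RI[n] if RI[n] > 0 else 0.0
    return w, cr

# Illustrative pairwise comparisons of three management criteria on Saaty's
# 1-9 scale (A[i, j] = importance of criterion i relative to criterion j).
A = np.array([[1.0,     3.0,     5.0],
              [1.0 / 3, 1.0,     3.0],
              [1.0 / 5, 1.0 / 3, 1.0]])
w, cr = ahp_weights(A)
```

For this matrix the first criterion receives the largest weight and the consistency ratio falls below the conventional 0.1 cutoff, so the judgments would be accepted.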