963 results for decision strategies
Abstract:
This paper provides a general treatment of the implications of legal uncertainty for welfare. We distinguish legal uncertainty from decision errors: though the former can be influenced by the latter, the latter are neither necessary nor sufficient for the existence of legal uncertainty. We show that an increase in decision errors will always reduce welfare. However, for any given level of decision errors, information structures involving more legal uncertainty can improve welfare. When sanctions on socially harmful actions are set at their optimal level, this holds always, even under complete legal uncertainty. This radically transforms one's perception of the "costs" of legal uncertainty. We also provide general proofs for two results previously established under restrictive assumptions. The first is that Effects-Based enforcement procedures may welfare-dominate Per Se (or object-based) procedures and will always do so when sanctions are optimally set. The second is that optimal sanctions may well be higher under enforcement procedures involving more legal uncertainty.
Abstract:
Three polar types of monetary architecture are identified together with the institutional and market infrastructure required for each type and the kinds of monetary policy feasible in each case: a ‘basic’ architecture where there is little or no financial system as such but an elementary central bank which is able to fix the exchange rate, as a substitute for a proper monetary policy; a ‘modern’ monetary architecture with developed banks, financial markets and central bank where policy choices include types of inflation targeting; and an ‘intermediate’ monetary architecture where less market-based monetary policies involving less discretion are feasible. A range of data is used to locate the various MENA countries with respect to these polar types. Five countries (Iran, Libya, Sudan, Syria and Yemen) are identified as those with the least developed monetary architecture, while Bahrain and Jordan are identified as the group at the other end of the spectrum, reaching beyond the intermediate polar type in some dimensions but not others. The countries in between vary on different dimensions but all lie between basic and intermediate architectures. The key general findings are that the MENA countries are both less differentiated and less ‘developed’ than might have been expected. The paper ends by calling for research on the costs and benefits of different types of monetary architecture.
Abstract:
Background Decisions on limiting life-sustaining treatment for patients in the vegetative state (VS) are emotionally and morally challenging. In Germany, doctors have to discuss, together with the legal surrogate (often a family member), whether the proposed treatment is in accordance with the patient's will. However, it is unknown whether family members of patients in the VS actually base their decisions on the patient's wishes. Objective To examine the role of advance directives, orally expressed wishes, or the presumed will of patients in a VS in family caregivers' decisions on life-sustaining treatment. Methods and sample A qualitative interview study with 14 next of kin of patients in a VS in a long-term care setting was conducted; 13 participants were the patient's legal surrogates. Interviews were analysed using qualitative content analysis. Results The majority of family caregivers said that they were aware of previously expressed wishes of the patient that could be applied to the VS condition, but did not base their decisions primarily on these wishes. They gave three reasons for this: (a) the expectation of clinical improvement, (b) the caregivers' definition of life-sustaining treatments and (c) the moral obligation not to harm the patient. If the patient's wishes were not known or not revealed, the caregivers read a will to live into the patient's survival and non-verbal behaviour. Conclusions Whether or not prior treatment wishes of patients in a VS are respected depends on their applicability, and also on the medical assumptions and moral attitudes of the surrogates. We recommend repeated communication, support for the caregivers and advance care planning.
Abstract:
Bacteria often possess multiple siderophore-based iron uptake systems for scavenging this vital resource from their environment. However, some siderophores seem redundant, because they have limited iron-binding efficiency and are seldom expressed under iron limitation. Here, we investigate the conundrum of why selection does not eliminate this apparent redundancy. We focus on Pseudomonas aeruginosa, a bacterium that can produce two siderophores: the highly efficient but metabolically expensive pyoverdine, and the inefficient but metabolically cheap pyochelin. We found that the bacteria possess molecular mechanisms to phenotypically switch from mainly producing pyoverdine under severe iron limitation to mainly producing pyochelin when iron is only moderately limited. We further show that strains exclusively producing pyochelin grew significantly better than strains exclusively producing pyoverdine under moderate iron limitation, whereas the inverse was seen under severe iron limitation. This suggests that pyochelin is not redundant, but that switching between siderophore strategies might be beneficial to trade off efficiencies versus costs of siderophores. Indeed, simulations parameterized from our data confirmed that strains retaining the capacity to switch between siderophores significantly outcompeted strains defective for one or the other siderophore under fluctuating iron availabilities. Finally, we discuss how siderophore switching can be viewed as a form of collective decision-making, whereby a coordinated shift in behaviour at the group level emerges as a result of positive and negative feedback loops operating among individuals at the local scale.
Abstract:
The significant development of immunosuppressive drug therapies within the past 20 years has had a major impact on the outcome of clinical solid organ transplantation, mainly by decreasing the incidence of acute rejection episodes and improving short-term patient and graft survival. However, long-term results remain relatively disappointing because of chronic allograft dysfunction and patient morbidity or mortality, which is often related to the adverse effects of immunosuppressive treatment. Thus, the induction of specific immunological tolerance of the recipient towards the allograft remains an important objective in transplantation. In this article, we first briefly describe the mechanisms of allograft rejection and immune tolerance. We then review in detail current tolerogenic strategies that could promote central or peripheral tolerance, highlighting the promises as well as the remaining challenges in clinical transplantation. The induction of haematopoietic mixed chimerism could be an approach to induce robust central tolerance, and we describe recent encouraging reports of end-stage kidney disease patients, without concomitant malignancy, who have undergone combined bone marrow and kidney transplantation. We discuss current studies suggesting that, while promoting peripheral transplantation tolerance in preclinical models, induction protocols based on lymphocyte depletion (polyclonal antithymocyte globulins, alemtuzumab) or co-stimulatory blockade (belatacept) should, at the current stage, be considered more as drug-minimization rather than tolerance-inducing strategies. Thus, a better understanding of the mechanisms that promote peripheral tolerance has led to newer approaches and the investigation of individualized donor-specific cellular therapies based on manipulated recipient regulatory T cells.
Abstract:
Continuing developments in science and technology mean that the amount of information forensic scientists are able to provide for criminal investigations is ever increasing. The commensurate increase in complexity creates difficulties for scientists and lawyers with regard to evaluation and interpretation, notably with respect to issues of inference and decision. Probability theory, implemented through graphical methods, and specifically Bayesian networks, provides powerful methods to deal with this complexity. Extensions of these methods to elements of decision theory provide further support and assistance to the judicial system. Bayesian Networks for Probabilistic Inference and Decision Analysis in Forensic Science provides a unique and comprehensive introduction to the use of Bayesian decision networks for the evaluation and interpretation of scientific findings in forensic science, and for the support of decision-makers in their scientific and legal tasks. The book:
- Includes self-contained introductions to probability and decision theory.
- Develops the characteristics of Bayesian networks, object-oriented Bayesian networks and their extension to decision models.
- Features implementation of the methodology with reference to commercial and academically available software.
- Presents standard networks and their extensions that can be easily implemented and that can assist in the reader's own analysis of real cases.
- Provides a technique for structuring problems and organizing data based on methods and principles of scientific reasoning.
- Contains a method for the construction of coherent and defensible arguments for the analysis and evaluation of scientific findings and for decisions based on them.
- Is written in a lucid style, suitable for forensic scientists and lawyers with minimal mathematical background.
- Includes a foreword by Ian Evett.
The clear and accessible style of this second edition makes this book ideal for all forensic scientists, applied statisticians and graduate students wishing to evaluate forensic findings from the perspective of probability and decision analysis. It will also appeal to lawyers and other scientists and professionals interested in the evaluation and interpretation of forensic findings, including decision making based on scientific information.
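The probabilistic updating that such networks carry out reduces, in the simplest case, to the odds form of Bayes' theorem. A minimal sketch in Python; the prior and likelihood ratio below are hypothetical, chosen only for illustration:

```python
def posterior_probability(prior, likelihood_ratio):
    """Update the probability of a proposition (e.g. 'the suspect is the
    source of the trace') given a likelihood ratio for the forensic
    findings, using the odds form of Bayes' rule."""
    prior_odds = prior / (1.0 - prior)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1.0 + posterior_odds)

# hypothetical case: prior probability 1%, and findings that are
# 1000 times more likely if the proposition is true than if it is false
p = posterior_probability(0.01, 1000.0)  # ~0.91
```

A Bayesian network generalizes this single update to many interdependent propositions and items of evidence, but each edge of the network performs conditioning of exactly this kind.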
Abstract:
BACKGROUND: Preoperative central neurologic deficits in the context of acute type A dissection are a complex comorbidity and are difficult to handle. The aim of this study was to analyze this subgroup of patients by comparing them with neurologically asymptomatic patients with type A dissection. The results may help the surgeon in preoperative risk assessment and thereby aid the decision-making process. METHODS: We reviewed the data of patients admitted for acute type A dissection from 1999 to 2010. Associated risk factors, time from admission to surgery, extension of the dissection, localization of central nervous ischemic lesions, and the influence of perioperative brain-protective strategies were analyzed in a comparison of patients with and without preoperative neurologic deficits. RESULTS: Forty-seven (24.5%) of a total of 192 patients had new-onset central neurologic symptoms prior to surgery. Concomitant myocardial infarction (OR 4.9, 95% CI 1.6-15.3, P = 0.006), renal failure (OR 5.9, 95% CI 1.1-32.8, P = 0.04), dissected carotid arteries (OR 9.2, 95% CI 2.4-34.7, P = 0.001), and late admission to surgery at >6 hours after symptom onset (OR 2.7, 95% CI 1.1-6.8, P = 0.04) were observed more frequently in neurologically deficient patients. These patients had a higher 30-day in-hospital mortality on univariate analysis (P = 0.01) and a higher rate of new postoperative neurologic deficits (OR 9.2, 95% CI 2.4-34.7, P = 0.02). Survivors with neurologic deficits had a hospital stay equal to that of other patients, and 67% of them had improved symptoms. CONCLUSIONS: The predominance of neurologic symptoms at admission may be responsible for an initial misdiagnosis. Concurrent central nervous system ischemia and myocardial infarction explain the higher mortality rate and the more extensive "character" of the disease.
Neurologically deficient patients are at higher risk of developing new postoperative neurologic symptoms, but prognosis for the neurologic evolution of survivors is generally favorable.
Abstract:
Regular physical activity is among the most effective interventions to prevent or delay functional decline and disability, even in older persons. Despite relatively strong scientific evidence supporting these benefits, the majority of older persons remain mostly sedentary. For these persons, concerns about injury or fear of negative consequences for their chronic diseases are among the most powerful barriers to participation in regular physical activity. Promotion of physical activity among older persons has therefore become one of the five main themes of the health promotion project "Via", which aims at promoting good practice in prevention and health promotion directed toward older adults in Switzerland. This paper summarizes the main recommendations arising from this national project, which is supported by the Swiss Health Promotion Foundation.
Abstract:
This guide introduces Data Envelopment Analysis (DEA), a performance measurement technique, in such a way as to be appropriate for decision makers with little or no background in economics and operational research. The use of mathematics is kept to a minimum. The guide therefore adopts a strongly practical approach, allowing decision makers to conduct their own efficiency analysis and to interpret the results easily. DEA helps decision makers in the following ways:
- By calculating an efficiency score, it indicates whether a firm is efficient or has capacity for improvement.
- By setting target values for inputs and outputs, it calculates by how much input must be decreased or output increased for the firm to become efficient.
- By identifying the nature of returns to scale, it indicates whether a firm has to decrease or increase its scale (or size) in order to minimize the average cost.
- By identifying a set of benchmarks, it specifies which other firms' processes should be analysed in order for a firm to improve its own practices.
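Under the hood, a DEA efficiency score is obtained by solving one linear program per decision-making unit. A minimal sketch of the standard input-oriented CCR model, assuming NumPy and SciPy are available; function and variable names are illustrative, not from the guide:

```python
import numpy as np
from scipy.optimize import linprog

def dea_efficiency(inputs, outputs, unit):
    """Input-oriented CCR efficiency score for one decision-making unit.
    inputs: (n_units, n_inputs) array; outputs: (n_units, n_outputs) array.
    Returns theta in (0, 1]; theta == 1 means the unit is efficient."""
    X = np.asarray(inputs, dtype=float)
    Y = np.asarray(outputs, dtype=float)
    n_units = X.shape[0]
    # decision variables: [theta, lambda_1, ..., lambda_n]
    c = np.zeros(1 + n_units)
    c[0] = 1.0  # minimize theta
    # input constraints:  sum_j lambda_j * x_j <= theta * x_unit
    A_in = np.hstack([-X[unit][:, None], X.T])
    # output constraints: sum_j lambda_j * y_j >= y_unit (negated for <= form)
    A_out = np.hstack([np.zeros((Y.shape[1], 1)), -Y.T])
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([np.zeros(X.shape[1]), -Y[unit]]),
                  bounds=[(None, None)] + [(0, None)] * n_units)
    return res.fun
```

For example, with two firms producing the same output from 2 and 4 units of input, the second firm scores 0.5: it could become efficient by halving its input, which is exactly the kind of target value described above.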
Abstract:
Malaria and other arthropod-borne diseases remain a serious public health problem affecting the lives and health of certain social groups when the two basic control strategies fail, owing to (1) the lack of effective chemoprophylaxis/chemotherapy or the rapid development of drug resistance in the infectious agents, and (2) the ineffectiveness of pesticides or the development of resistance to them in the arthropod vectors. These situations enhance the need to design and implement other alternatives for sustainable health programmes. The application of epidemiological methods is essential not only for analyzing the data relevant to understanding the biological characteristics of the infectious agents, their reservoirs and vectors, and the methods for their control, but also for assessing human behaviour, the environmental, social and economic factors involved in disease transmission, and the capacity of health systems to implement interventions, covering both changes in human behaviour and environmental management, so as to guarantee prevention and control of malaria and other arthropod-borne diseases with efficiency, efficacy and equity. This paper discusses the evolution of malaria and arthropod-borne disease programmes in the American Region and the perspectives for their integration into health promotion programmes, and emphasizes the need to establish a solid basis for the decision-making process when selecting intervention strategies to remove the risk factors that determine the probability of falling ill or dying from these diseases. The implications of general planning and the policies to be adopted in an area should be analyzed in the light of programme feasibility at the local level, in the multisectoral context of specific social groups, and taking into consideration the principles of stratification and equity.
Abstract:
Detection and discrimination of visuospatial input involve at least extracting, selecting and encoding relevant information, and decision-making processes that allow a response to be selected. These two operations are altered, respectively, by attentional mechanisms that change discrimination capacities and by beliefs concerning the likelihood of uncertain events. Information processing is tuned by the attentional level, which acts like a filter on perception, while decision-making processes are weighted by the subjective probability of risk. In addition, it has been shown that anxiety can affect the detection of unexpected events through modification of the level of arousal. The purpose of this study is therefore to examine whether and how decision-making and brain dynamics are affected by anxiety. To investigate these questions, the performance of women with either a high (12) or a low (12) STAI-T score (State-Trait Anxiety Inventory; Spielberger, 1983) was examined in a visuospatial decision-making task in which subjects had to recognize a target visual pattern among non-target patterns. The target pattern was a schematic image of furniture arranged so as to give the impression of a living room. Non-target patterns were created by either compressing or dilating the distances between objects. Target and non-target patterns were always presented in the same configuration. Preliminary behavioral results show no group difference in reaction time. In addition, visuospatial abilities were analyzed through signal detection theory for quantifying perceptual decisions in the presence of uncertainty (Green and Swets, 1966). This theory treats detection of a stimulus as a decision-making process determined by the nature of the stimulus and by cognitive factors. Surprisingly, no difference was observed in the d' index (corresponding to the distance between the means of the distributions) or the c index (which corresponds to the likelihood ratio).
Comparison of event-related potentials (ERPs) reveals that brain dynamics nevertheless differ according to anxiety: there are differences in component latencies, in particular a delay in anxious subjects over posterior electrode sites. However, these differences are compensated during later components by shorter latencies in anxious subjects compared with non-anxious ones. These inverted effects seem to indicate that the absence of a difference in reaction time relies on a compensation of attentional level that tunes cortical activation in anxious subjects, who must nevertheless work hard to maintain performance.
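The two signal detection indexes mentioned above are computed directly from hit and false-alarm rates. A minimal sketch, assuming SciPy is available; the example rates are hypothetical:

```python
from scipy.stats import norm

def sdt_indexes(hit_rate, fa_rate):
    """Sensitivity (d') and criterion (c) from hit and false-alarm rates,
    following signal detection theory (Green and Swets, 1966)."""
    z_hit = norm.ppf(hit_rate)   # z-transform of the hit rate
    z_fa = norm.ppf(fa_rate)     # z-transform of the false-alarm rate
    d_prime = z_hit - z_fa       # distance between the two distribution means
    criterion = -0.5 * (z_hit + z_fa)  # response bias
    return d_prime, criterion

# hypothetical observer: 84.13% hits, 15.87% false alarms -> d' ~ 2, c ~ 0
d_prime, criterion = sdt_indexes(0.8413, 0.1587)
```

Because d' and c separate perceptual sensitivity from response bias, a null result on both indexes, as reported here, argues that anxiety affected neither discrimination capacity nor decision criterion at the behavioral level.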
Abstract:
What genotype should the scientist specify for conducting a database search to try to find the source of a low-template-DNA (lt-DNA) trace? When the scientist answers this question, he or she makes a decision. Here, we approach this decision problem from a normative point of view by defining a decision-theoretic framework for answering this question for one locus. This framework combines the probability distribution describing the uncertainty over the trace's donor's possible genotypes with a loss function describing the scientist's preferences concerning false exclusions and false inclusions that may result from the database search. According to this approach, the scientist should choose the genotype designation that minimizes the expected loss. To illustrate the results produced by this approach, we apply it to two hypothetical cases: (1) the case of observing one peak for allele xi on a single electropherogram, and (2) the case of observing one peak for allele xi on one replicate, and a pair of peaks for alleles xi and xj, i ≠ j, on a second replicate. Given that the probabilities of allele drop-out are defined as functions of the observed peak heights, the threshold values marking the turning points when the scientist should switch from one designation to another are derived in terms of the observed peak heights. For each case, sensitivity analyses show the impact of the model's parameters on these threshold values. The results support the conclusion that the procedure should not focus on a single threshold value for making this decision for all alleles, all loci and in all laboratories.
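The decision rule described above, choosing the designation that minimizes expected loss, can be sketched generically. In the toy example below, the posterior probabilities, the loss values and the wildcard designation "xi/F" are all hypothetical, not taken from the paper:

```python
def minimum_expected_loss(posterior, loss):
    """Return the genotype designation with the smallest expected loss.
    posterior: maps candidate genotype -> probability (sums to 1);
    loss: maps (designation, true genotype) -> loss value."""
    designations = sorted({d for d, _ in loss})
    def expected_loss(d):
        return sum(p * loss[(d, g)] for g, p in posterior.items())
    return min(designations, key=expected_loss)

# one peak observed for allele xi; drop-out may have hidden a second allele xj
posterior = {"xi/xi": 0.7, "xi/xj": 0.3}
loss = {
    ("xi/xi", "xi/xi"): 0.0, ("xi/xi", "xi/xj"): 10.0,  # false exclusion is costly
    ("xi/F", "xi/xi"): 1.0,  ("xi/F", "xi/xj"): 1.0,    # wildcard: less informative
}
choice = minimum_expected_loss(posterior, loss)  # "xi/F"
```

Sweeping the posterior probability of the homozygous genotype shows the switch point at p = 0.9 for these losses (where 10(1 - p) falls below 1), analogous to the peak-height thresholds the paper derives.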
Abstract:
In this review the authors analyze the effector and regulatory mechanisms in the immune response to schistosomiasis. Two animal models, mouse and rat, were used to study these mechanisms. The mouse, a totally permissive host like the human, shows prominent T-cell control in the acquisition of resistance. Other mechanisms described in humans, such as antibody-dependent cell-mediated cytotoxicity (ADCC) involving eosinophils and IgG antibodies, are observed in rats. In this animal, high production of specific IgE antibody and blood and tissue eosinophilia are also observed. Using the rat model and schistosomula as target, some features of ADCC have emerged: the cellular populations involved are bone marrow-derived inflammatory cells (mononuclear phagocytes, eosinophils and platelets) interacting with IgE through IgE Fc receptors. Immunization has been attempted using the recombinant protein Sm28/GST. Protection has been observed in rodents, with a significant decrease in parasite fecundity and egg viability affecting the number, size and volume of liver egg granulomas. The association of praziquantel with Sm28/GST immunization increases resistance to infection and decreases egg viability. The authors suggest the possibility of establishing a future vaccine against Schistosoma mansoni.
Abstract:
BACKGROUND: According to recent guidelines, patients with coronary artery disease (CAD) should undergo revascularization if significant myocardial ischemia is present. Both cardiovascular magnetic resonance (CMR) and fractional flow reserve (FFR) allow for reliable ischemia assessment, and in combination with the anatomical information provided by invasive coronary angiography (CXA), such a work-up sets the basis for the decision whether or not to revascularize. The cost-effectiveness ratios of these two strategies are compared. METHODS: Strategy 1) CMR to assess ischemia followed by CXA in ischemia-positive patients (CMR + CXA); Strategy 2) CXA followed by FFR in angiographically positive stenoses (CXA + FFR). The costs, evaluated from the third-party payer perspective in Switzerland, Germany, the United Kingdom (UK), and the United States (US), included public prices of the different outpatient procedures and costs induced by procedural complications and by diagnostic errors. The effectiveness criterion was the correct identification of hemodynamically significant coronary lesion(s) (i.e., significant CAD) complemented by full anatomical information. Test performances were derived from the published literature. Cost-effectiveness ratios for both strategies were compared for hypothetical cohorts with different pretest likelihoods of significant CAD. RESULTS: CMR + CXA and CXA + FFR were equally cost-effective at a pretest likelihood of CAD of 62% in Switzerland, 65% in Germany, 83% in the UK, and 82% in the US, with costs of CHF 5,794, EUR 1,517, £2,680, and $2,179 per patient correctly diagnosed. Below these thresholds, CMR + CXA showed lower costs per patient correctly diagnosed than CXA + FFR. CONCLUSIONS: The CMR + CXA strategy is more cost-effective than CXA + FFR below a CAD prevalence of 62%, 65%, 83%, and 82% for the Swiss, German, UK, and US health care systems, respectively. These findings may help to optimize resource utilization in the diagnosis of CAD.
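The shape of such a comparison can be sketched with a toy model in which a follow-up procedure is performed only after a positive first test. All costs and test characteristics below are hypothetical, chosen only to reproduce the qualitative pattern reported (one strategy cheaper per correct diagnosis at low pretest likelihood, the other at high):

```python
import numpy as np

def cost_per_correct(p, first_cost, followup_cost, sensitivity, specificity):
    """Expected cost per correctly classified patient at pretest likelihood p,
    when a follow-up procedure is done only after a positive first test."""
    p_positive = p * sensitivity + (1 - p) * (1 - specificity)
    expected_cost = first_cost + p_positive * followup_cost
    p_correct = p * sensitivity + (1 - p) * specificity
    return expected_cost / p_correct

# hypothetical parameters, not the study's figures
p = np.linspace(0.05, 0.95, 181)
strategy_a = cost_per_correct(p, 700.0, 2000.0, 0.89, 0.87)   # cheap first test
strategy_b = cost_per_correct(p, 1500.0, 300.0, 0.95, 0.90)   # expensive first test
crossover = p[np.argmin(np.abs(strategy_a - strategy_b))]
```

The crossover prevalence plays the same role as the 62% to 83% thresholds in the study: below it the cheap-first-test strategy wins, above it the expensive but more accurate work-up does.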