962 results for Shared-decision
Abstract:
This paper focuses on an efficient user-level method for the deployment of application-specific extensions, using commodity operating systems and hardware. A sandboxing technique is described that supports multiple extensions within a shared virtual address space. Applications can register sandboxed code with the system, so that it may be executed in the context of any process. Such code may be used to implement generic routines and handlers for a class of applications, or system service extensions that complement the functionality of the core kernel. Using our approach, application-specific extensions can be written like conventional user-level code, utilizing libraries and system calls, with the advantage that they may be executed without the traditional costs of scheduling and context-switching between process-level protection domains. No special hardware support such as segmentation or tagged translation look-aside buffers (TLBs) is required. Instead, our "user-level sandboxing" mechanism requires only page-based virtual memory support, given that sandboxed extensions are either written by a trusted source or are guaranteed to be memory-safe (e.g., using type-safe languages). Using a fast method of upcalls, we show how our mechanism provides significant performance improvements over traditional methods of invoking user-level services. As an application of our approach, we have implemented a user-level network subsystem that avoids data copying via the kernel and, in many cases, yields far greater network throughput than kernel-level approaches.
Abstract:
How does the brain make decisions? Speed and accuracy of perceptual decisions covary with certainty in the input, and correlate with the rate of evidence accumulation in parietal and frontal cortical "decision neurons." A biophysically realistic model of interactions within and between Retina/LGN and cortical areas V1, MT, MST, and LIP, gated by basal ganglia, simulates dynamic properties of decision-making in response to ambiguous visual motion stimuli used by Newsome, Shadlen, and colleagues in their neurophysiological experiments. The model clarifies how brain circuits that solve the aperture problem interact with a recurrent competitive network with self-normalizing choice properties to carry out probabilistic decisions in real time. Some scientists claim that perception and decision-making can be described using Bayesian inference or related general statistical ideas that estimate the optimal interpretation of the stimulus given priors and likelihoods. However, such concepts do not propose the neocortical mechanisms that enable perception and decision-making. The present model explains behavioral and neurophysiological decision-making data without an appeal to Bayesian concepts and, unlike other existing models of these data, generates perceptual representations and choice dynamics in response to the experimental visual stimuli. Quantitative model simulations include the time course of LIP neuronal dynamics, as well as behavioral accuracy and reaction time properties, during both correct and error trials at different levels of input ambiguity in both fixed duration and reaction time tasks. Model MT/MST interactions compute the global direction of random dot motion stimuli, while model LIP computes the stochastic perceptual decision that leads to a saccadic eye movement.
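The covariation of speed and accuracy with input certainty described above can be illustrated with a generic accumulator-race sketch. This is not the biophysical model the abstract describes; the threshold, noise level, and "coherence" drift are arbitrary illustrative values:

```python
import random

def simulate_decision(coherence, threshold=30.0, noise=1.0, seed=None):
    """Toy two-alternative race: each time step adds noisy evidence to two
    accumulators, one of which receives the signal drift (coherence); the
    first to reach threshold determines the choice and the reaction time.
    A generic illustration only, not the model described in the abstract."""
    rng = random.Random(seed)
    a, b = 0.0, 0.0          # evidence for the signalled vs. alternative choice
    t = 0
    while a < threshold and b < threshold:
        a += coherence + rng.gauss(0, noise)
        b += rng.gauss(0, noise)
        t += 1
    return ("correct" if a >= threshold else "error"), t
```

Across many simulated trials, higher coherence (more input certainty) yields both higher accuracy and shorter reaction times, mirroring the covariation the abstract refers to.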
Abstract:
How do brain mechanisms carry out the motion integration and segmentation processes that compute unambiguous global motion percepts from ambiguous local motion signals? Consider, for example, a deer running at variable speeds behind forest cover. The forest cover is an occluder that creates apertures through which fragments of the deer's motion signals are intermittently experienced. The brain coherently groups these fragments into a trackable percept of the deer along its trajectory. Form and motion processes are needed to accomplish this using feedforward and feedback interactions both within and across cortical processing streams. All the cortical areas V1, V2, MT, and MST are involved in these interactions. Figure-ground processes in the form stream through V2, such as the separation of occluding boundaries of the forest cover from the boundaries of the deer, select the motion signals that determine global object motion percepts in the motion stream through MT. Sparse, but unambiguous, feature tracking signals are amplified before they propagate across position and are integrated with far more numerous ambiguous motion signals. Figure-ground and integration processes together determine the global percept. A neural model predicts the processing stages that embody these form and motion interactions. Model concepts and data are summarized about motion grouping across apertures in response to a wide variety of displays, and probabilistic decision making in parietal cortex in response to random dot displays.
Abstract:
Political drivers such as the Kyoto protocol, the EU Energy Performance of Buildings Directive and the Energy End-use and Services Directive have been implemented in response to an identified need for a reduction in human-related CO2 emissions. Buildings account for a significant portion of global CO2 emissions, approximately 25-30%, and it is widely acknowledged by industry and research organisations that they operate inefficiently. In parallel, unsatisfactory indoor environmental conditions have proven to negatively impact occupant productivity. Legislative drivers and client education are seen as the key motivating factors for an improvement in the holistic environmental and energy performance of a building. A symbiotic relationship exists between building indoor environmental conditions and building energy consumption. However, traditional Building Management Systems and Energy Management Systems treat these separately. Conventional performance analysis compares building energy consumption with a previously recorded value or with the consumption of a similar building and does not recognise the fact that all buildings are unique. Therefore, what is required is a new framework which incorporates performance comparison against a theoretical building-specific ideal benchmark. Traditionally, Energy Managers, who work at the operational level of organisations with respect to building performance, do not have access to ideal performance benchmark information and as a result cannot optimally operate buildings. This thesis systematically defines Holistic Environmental and Energy Management and specifies the Scenario Modelling Technique which in turn uses an ideal performance benchmark. The holistic technique uses quantified expressions of building performance and by doing so enables the profiled Energy Manager to visualise his actions and the downstream consequences of his actions in the context of overall building operation.
The Ideal Building Framework facilitates the use of this technique by acting as a Building Life Cycle (BLC) data repository through which ideal building performance benchmarks are systematically structured and stored in parallel with actual performance data. The Ideal Building Framework utilises transformed data in the form of the Ideal Set of Performance Objectives and Metrics which are capable of defining the performance of any building at any stage of the BLC. It is proposed that the union of Scenario Models for an individual building would result in a building-specific Combination of Performance Metrics which would in turn be stored in the BLC data repository. The Ideal Data Set underpins the Ideal Set of Performance Objectives and Metrics and is the set of measurements required to monitor the performance of the Ideal Building. A Model View describes the unique building-specific data relevant to a particular project stakeholder. The energy management data and information exchange requirements that underlie a Model View implementation are detailed and incorporate traditional and proposed energy management. This thesis also specifies the Model View Methodology which complements the Ideal Building Framework. The developed Model View and Rule Set methodology process utilises stakeholder-specific rule sets to define stakeholder-pertinent environmental and energy performance data. This generic process further enables each stakeholder to define the resolution of data desired, for example basic, intermediate or detailed. The Model View methodology is applicable for all project stakeholders, each requiring its own customised rule set. Two rule sets are defined in detail, the Energy Manager rule set and the LEED Accreditor rule set. This particular measurement generation process, accompanied by a defined View, would filter and expedite data access for all stakeholders involved in building performance.
Information presentation is critical for effective use of the data provided by the Ideal Building Framework and the Energy Management View definition. The specifications for a customised Information Delivery Tool account for the established profile of Energy Managers and best practice user interface design. Components of the developed tool could also be used by Facility Managers working at the tactical and strategic levels of organisations. Informed decision making is made possible through specified decision assistance processes which incorporate the Scenario Modelling and Benchmarking techniques, the Ideal Building Framework, the Energy Manager Model View, the Information Delivery Tool and the established profile of Energy Managers. The Model View and Rule Set Methodology is effectively demonstrated on an appropriate mixed use existing ‘green’ building, the Environmental Research Institute at University College Cork, using the Energy Management and LEED rule sets. Informed Decision Making is also demonstrated using a prototype scenario for the demonstration building.
Abstract:
Aim: Diabetes is an important barometer of health system performance. This chronic condition is a source of significant morbidity, premature mortality and a major contributor to health care costs. There is an increasing focus internationally, and more recently nationally, on system, practice and professional-level initiatives to promote the quality of care. The aim of this thesis was to investigate the ‘quality chasm’ around the organisation and delivery of diabetes care in general practice, to explore GPs’ attitudes to engaging in quality improvement activities and to examine efforts to improve the quality of diabetes care in Ireland from practice to policy. Methods: Quantitative and qualitative methods were used. As part of a mixed methods sequential design, a postal survey of 600 GPs was conducted to assess the organisation of care. This was followed by an in-depth qualitative study using semi-structured interviews with a purposive sample of 31 GPs from urban and rural areas. The qualitative methodology was also used to examine GPs’ attitudes to engaging in quality improvement. Data were analysed using a Framework approach. A second observational study was used to assess the quality of care in 63 practices with a special interest in diabetes. Data on 3010 adults with Type 2 diabetes from 3 primary care initiatives were analysed and the results were benchmarked against national guidelines and standards of care in the UK. The final study was an instrumental case study of policy formulation. Semi-structured interviews were conducted with 15 members of the Expert Advisory Group (EAG) for Diabetes. Thematic analysis was applied to the data using 3 theories of the policy process as analytical tools. Results: The survey response rate was 44% (n=262).
Results suggested care delivery was largely unstructured; 45% of GPs had a diabetes register (n=157), 53% reported using guidelines (n=140), 30% had a formal call/recall system (n=78) and 24% had none of these organisational features (n=62). Only 10% of GPs had a formal shared protocol with the local hospital specialist diabetes team (n=26). The lack of coordination between settings was identified as a major barrier to providing optimal care, leading to waiting times, overburdened hospitals and avoidable duplication. The lack of remuneration for chronic disease management had a ripple effect, also creating costs for patients and apathy among GPs. There was also a sense of inertia around quality improvement activities, particularly at a national level. This attitude was strongly influenced by previous experiences of change in the health system. In contrast, GPs spoke positively about change at a local level, which was facilitated by a practice ethos, leadership and a special interest in diabetes. The second quantitative study found that practices with a special interest in diabetes achieved a standard of care comparable to the UK in terms of the recording of clinical processes of care and the achievement of clinical targets; 35% of patients reached the HbA1c target of <6.5% compared to 26% in England and Wales. With regard to diabetes policy formulation, the evolving process of action and inaction was best described by the Multiple Streams Theory. Within the EAG, the formulation of recommendations was facilitated by overarching agreement on the “obvious” priorities, while the details of proposals were influenced by personal preferences and local capacity. In contrast, the national decision-making process was protracted and ambiguous. The lack of impetus from senior management coupled with the lack of power conferred on the EAG impeded progress. Conclusions: The findings highlight the inconsistency of diabetes care in Ireland.
The main barriers to optimal diabetes management centre on the organisation and coordination of care at the systems level, with consequences for practice, providers and patients. Quality improvement initiatives need to stimulate a sense of ownership and interest among frontline service providers to address the local sense of inertia towards national change. To date, quality improvement in diabetes care has been largely dependent on the “special interest” of professionals. The challenge for the Irish health system is to embed this activity as part of routine practice, professional responsibility and the underlying health care culture.
Abstract:
This PhD thesis investigates the potential use of science communication models to engage a broader swathe of actors in decision making in relation to scientific and technological innovation, in order to address possible democratic deficits in science and technology policy-making. A four-pronged research approach has been employed to examine different representations of the public(s) and different modes of engagement. The first case study investigates whether patient groups could represent an alternative needs-driven approach to biomedical and health sciences R & D. This is followed by enquiry into the potential for Science Shops to represent a bottom-up approach to promote research and development of local relevance. The barriers and opportunities for the involvement of scientific researchers in science communication are next investigated via a national survey which is comparable to a similar survey conducted in the UK. The final case study investigates to what extent opposition or support regarding nanotechnology (as an emerging technology) is reflected amongst the YouTube user community, and the findings are considered in the context of how support or opposition to new or emerging technologies can be addressed using conflict resolution based approaches to manage potential conflict trajectories. The research indicates that the majority of communication exercises of relevance to science policy and planning take the form of a one-way flow of information with little or no facility for public feedback. This thesis proposes that a more bottom-up approach to research and technology would help broaden acceptability and accountability for decisions made relating to new or existing technological trajectories. This approach could be better integrated with, and complementary to, the activities of government, institutions (e.g. universities) and research funding agencies, and help ensure that public needs and issues are better addressed directly by the research community.
Such approaches could also facilitate empowerment of societal stakeholders regarding scientific literacy and agenda-setting. One-way information relays could be adapted to facilitate feedback from representative groups, e.g. non-governmental organisations or civil society organisations (such as patient groups), in order to enhance the functioning and socio-economic relevance of knowledge-based societies to the betterment of human livelihoods.
Abstract:
In many real-world situations, we make decisions in the presence of multiple, often conflicting and non-commensurate objectives. The process of optimizing systematically and simultaneously over a set of objective functions is known as multi-objective optimization. In multi-objective optimization, we have a (possibly exponentially large) set of decisions and each decision has a set of alternatives. Each alternative depends on the state of the world, and is evaluated with respect to a number of criteria. In this thesis, we consider decision making problems in two scenarios. In the first scenario, the current state of the world, under which the decisions are to be made, is known in advance. In the second scenario, the current state of the world is unknown at the time of making decisions. For decision making under certainty, we consider the framework of multi-objective constraint optimization and focus on extending the algorithms to solve these models to the case where there are additional trade-offs. We focus especially on branch-and-bound algorithms that use a mini-buckets algorithm for generating the upper bound at each node of the search tree (in the context of maximizing values of objectives). Since the size of the guiding upper bound sets can become very large during the search, we introduce efficient methods for reducing these sets, yet still maintaining the upper bound property. We define a formalism for imprecise trade-offs, which allows the decision maker, during the elicitation stage, to specify a preference for one multi-objective utility vector over another, and use such preferences to infer other preferences. The induced preference relation is then used to eliminate the dominated utility vectors during the computation. For testing the dominance between multi-objective utility vectors, we present three different approaches.
The first is based on a linear programming approach; the second uses a distance-based algorithm (which uses a measure of the distance between a point and a convex cone); the third approach makes use of a matrix multiplication, which results in much faster dominance checks with respect to the preference relation induced by the trade-offs. Furthermore, we show that our trade-offs approach, which is based on a preference inference technique, can also be given an alternative semantics based on the well-known Multi-Attribute Utility Theory. Our comprehensive experimental results on common multi-objective constraint optimization benchmarks demonstrate that the proposed enhancements allow the algorithms to scale up to much larger problems than before. For decision making problems under uncertainty, we describe multi-objective influence diagrams, based on a set of p objectives, where utility values are vectors in R^p, and are typically only partially ordered. These can be solved by a variable elimination algorithm, leading to a set of maximal values of expected utility. If the Pareto ordering is used this set can often be prohibitively large. We consider approximate representations of the Pareto set based on ϵ-coverings, allowing much larger problems to be solved. In addition, we define a method for incorporating user trade-offs, which also greatly improves the efficiency.
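The Pareto ordering and ϵ-coverings mentioned above can be sketched in a few lines. This is a minimal illustration under the maximising convention used in the text (and, for the ϵ-dominance test, assuming non-negative utilities), not the thesis's algorithms:

```python
def dominates(u, v):
    """u Pareto-dominates v (maximising): at least as good in every
    objective, strictly better in at least one."""
    return all(a >= b for a, b in zip(u, v)) and any(a > b for a, b in zip(u, v))

def pareto_maximal(vectors):
    """Filter a set of utility vectors down to its undominated (maximal) elements."""
    return [u for u in vectors if not any(dominates(v, u) for v in vectors)]

def eps_dominates(u, v, eps):
    """u ϵ-dominates v when inflating u by (1 + ϵ) covers v componentwise
    (non-negative utilities assumed); an ϵ-covering retains only vectors
    not ϵ-dominated by an already-retained vector."""
    return all((1 + eps) * a >= b for a, b in zip(u, v))
```

For example, `pareto_maximal([(1, 2), (2, 1), (0, 0), (2, 2)])` keeps only `(2, 2)`, while ϵ-dominance lets one vector stand in for near-by vectors, shrinking the maximal set at the cost of bounded approximation error.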
Abstract:
The influence of communication technology on group decision-making has been examined in many studies, but the findings are inconsistent. Some studies showed a positive effect on decision quality; others have shown that communication technology makes decisions worse. One possible explanation for these different findings could be the use of different Group Decision Support Systems (GDSS) in these studies, with some GDSS fitting the given task better than others and offering different sets of functions. This paper outlines an approach with an information system designed solely to examine the effect of (1) anonymity, (2) voting and (3) blind picking on decision quality, discussion quality and perceived quality of information.
Abstract:
As more diagnostic testing options become available to physicians, it becomes more difficult to combine various types of medical information together in order to optimize the overall diagnosis. To improve diagnostic performance, here we introduce an approach to optimize a decision-fusion technique to combine heterogeneous information, such as from different modalities, feature categories, or institutions. For classifier comparison we used two performance metrics: the area under the receiver operating characteristic (ROC) curve (AUC) and the normalized partial area under the curve (pAUC). This study used four classifiers: linear discriminant analysis (LDA), artificial neural network (ANN), and two variants of our decision-fusion technique, AUC-optimized (DF-A) and pAUC-optimized (DF-P) decision fusion. We applied each of these classifiers with 100-fold cross-validation to two heterogeneous breast cancer data sets: one of mass lesion features and a much more challenging one of microcalcification lesion features. For the calcification data set, DF-A outperformed the other classifiers in terms of AUC (p < 0.02) and achieved AUC=0.85 +/- 0.01. The DF-P surpassed the other classifiers in terms of pAUC (p < 0.01) and reached pAUC=0.38 +/- 0.02. For the mass data set, DF-A outperformed both the ANN and the LDA (p < 0.04) and achieved AUC=0.94 +/- 0.01. Although for this data set there were no statistically significant differences among the classifiers' pAUC values (pAUC=0.57 +/- 0.07 to 0.67 +/- 0.05, p > 0.10), the DF-P did significantly improve specificity versus the LDA at both 98% and 100% sensitivity (p < 0.04). In conclusion, decision fusion directly optimized clinically significant performance measures, such as AUC and pAUC, and sometimes outperformed two well-known machine-learning techniques when applied to two different breast cancer data sets.
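The AUC figure of merit optimized above has a simple empirical interpretation: the probability that a randomly chosen positive case outscores a randomly chosen negative one (the Mann-Whitney statistic). A minimal sketch of that computation, not the study's implementation (the pAUC variant additionally restricts the area to a clinically relevant high-sensitivity or low-false-positive region):

```python
def empirical_auc(pos_scores, neg_scores):
    """Empirical AUC: fraction of (positive, negative) score pairs the
    classifier ranks correctly, counting ties as half a win. Equals the
    area under the empirical ROC curve."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))
```

A perfect ranking gives 1.0 and a pure-chance classifier gives 0.5, which is why values such as the 0.85 and 0.94 reported above indicate strong discrimination.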
Abstract:
Real decision makers exhibit significant shortcomings in the generation of objectives for decisions that they face. Prior research has illustrated the magnitude of this shortcoming but not its causes. In this paper, we identify two distinct impediments to the generation of decision objectives: not thinking broadly enough about the range of relevant objectives, and not thinking deeply enough to articulate every objective within the range that is considered. To test these explanations and explore ways of stimulating a more comprehensive set of objectives, we present three experiments involving a variety of interventions: the provision of sample objectives, organization of objectives by category, and direct challenges to do better, with or without a warning that important objectives are missing. The use of category names and direct challenges with a warning both led to improvements in the quantity of objectives generated without impacting their quality; other interventions yielded less improvement. We conclude by discussing the relevance of our findings to decision analysis and offering prescriptive implications for the elicitation of decision objectives. © 2010 INFORMS.
Abstract:
BACKGROUND: Microsporidia are obligate intracellular, eukaryotic pathogens that infect a wide range of animals from nematodes to humans, and in some cases, protists. The preponderance of evidence as to the origin of the microsporidia reveals a close relationship with the fungi, either within the kingdom or as a sister group to it. Recent phylogenetic studies and gene order analysis suggest that microsporidia share a particularly close evolutionary relationship with the zygomycetes. METHODOLOGY/PRINCIPAL FINDINGS: Here we expanded this analysis and also examined a putative sex-locus for variability between microsporidian populations. Whole genome inspection reveals a unique syntenic gene pair (RPS9-RPL21) present in the vast majority of fungi and the microsporidians but not in other eukaryotic lineages. Two other unique gene fusions (glutamyl-prolyl tRNA synthetase and ubiquitin-ribosomal subunit S30) that are present in metazoans, choanoflagellates, and filasterean opisthokonts are unfused in the fungi and microsporidians. One locus previously found to be conserved in many microsporidian genomes is similar to the sex locus of zygomycetes in gene order and architecture. Both sex-related and sex loci harbor TPT, HMG, and RNA helicase genes forming a syntenic gene cluster. We sequenced and analyzed the sex-related locus in 11 different Encephalitozoon cuniculi isolates and the sibling species E. intestinalis (3 isolates) and E. hellem (1 isolate). There was no evidence for an idiomorphic sex-related locus in this Encephalitozoon species sample. According to sequence-based phylogenetic analyses, the TPT and RNA helicase genes flanking the HMG genes are paralogous rather than orthologous between zygomycetes and microsporidians. 
CONCLUSION/SIGNIFICANCE: The unique genomic hallmarks shared between microsporidia and fungi are independent of sequence-based phylogenetic comparisons, further help to define the borders of the fungal kingdom, and support the classification of microsporidia as unusual derived fungi. In addition, the sex/sex-related loci appear to have been subject to frequent gene conversion and translocations in microsporidia and zygomycetes.
Abstract:
Reproduction exacts a cost in resources that organisms are then unable to utilize to deal with a multitude of environmental stressors. In the nematode C. elegans, development of the germline shortens the lifespan of the animal and increases its susceptibility to microbial pathogens. Prior studies have demonstrated germline-deficient nematodes to have increased resistance to Gram-negative bacteria. We show that germline-deficient strains display increased resistance across a broad range of pathogens, including Gram-positive and Gram-negative bacteria and the fungal pathogen Cryptococcus neoformans. Furthermore, we show that the FOXO transcription factor DAF-16, which regulates longevity and immunity in C. elegans, appears to be crucial for maintaining longevity in both wild-type and germline-deficient backgrounds. Our studies indicate that the germline-deficient mutants glp-1 and glp-4 respond to pathogen infection using common and different mechanisms that involve the activation of DAF-16.
Abstract:
In this study, we explored how adolescents in rural Kenya apply religious coping in sexual decision-making in the context of high rates of poverty and Human Immunodeficiency Virus (HIV). Semi-structured interviews were conducted with 34 adolescents. One-third (13) reported religious coping related to economic stress, HIV, or sexual decision-making; the majority (29) reported religious coping with these or other stressors. Adolescents reported praying for God to partner with them to engage in positive behaviors, praying for strength to resist unwanted behaviors, and passive strategies characterized by waiting for God to provide resources or protection from HIV. Adolescents in Sub-Saharan Africa may benefit from HIV prevention interventions that integrate and build upon their use of religious coping.
Abstract:
People often do not realize they are being influenced by an incidental emotional state. As a result, decisions based on a fleeting incidental emotion can become the basis for future decisions and hence outlive the original cause for the behavior (i.e., the emotion itself). Using a sequence of ultimatum and dictator games, we provide empirical evidence for the enduring impact of transient emotions on economic decision making. Behavioral consistency and false consensus are presented as potential underlying processes. © 2009 Elsevier Inc. All rights reserved.
Abstract:
Externalizing behavior problems of 124 adolescents were assessed across Grades 7-11. In Grade 9, participants were also assessed across social-cognitive domains after imagining themselves as the object of provocations portrayed in six videotaped vignettes. Participants responded to vignette-based questions representing multiple processes of the response decision step of social information processing. Phase 1 of our investigation supported a two-factor model of the response evaluation process of response decision (response valuation and outcome expectancy). Phase 2 showed significant relations between the set of these response decision processes, as well as response selection, measured in Grade 9 and (a) externalizing behavior in Grade 9 and (b) externalizing behavior in Grades 10-11, even after controlling externalizing behavior in Grades 7-8. These findings suggest that on-line behavioral judgments about aggression play a crucial role in the maintenance and growth of aggressive response tendencies in adolescence.