19 results for problem-solving-methods
Abstract:
The new computing paradigm known as cognitive computing attempts to imitate the human capabilities of learning, problem solving, and considering things in context. To do so, an application (a cognitive system) must learn from its environment, for example by interacting with various interfaces. These interfaces can run the gamut from sensors to humans to databases. Accessing data through such interfaces allows the system to carry out cognitive tasks that support humans in decision-making or problem-solving processes. Cognitive systems can be integrated into various domains (e.g., medicine or insurance). In a city, for example, a cognitive system can collect data, learn from various data sources, and then attempt to connect these sources to provide real-time optimization of subsystems within the city (e.g., the transportation system). In this study, we provide a methodology for integrating a cognitive system that allows data to be verbalized, making the causalities and hypotheses generated by the cognitive system more understandable to humans. We abstract a city subsystem, the passenger flow of a taxi company, by applying fuzzy cognitive maps (FCMs). FCMs are a mathematical tool for modeling complex systems as directed graphs with concepts (e.g., policies, events, and/or domains) as nodes and causalities as edges. As a verbalization technique, we introduce the restriction-centered theory of reasoning (RCT). RCT addresses the imprecision inherent in language by introducing restrictions. With this combined design of FCMs and RCT, our approach can handle large data sets from complex systems and make the output understandable to humans.
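The abstract does not give the study's concrete FCM formulation; as a reference point, here is a minimal sketch of a generic fuzzy cognitive map iteration in Python. The concept names, edge weights, initial activations, and sigmoid squashing function below are illustrative assumptions, not values taken from the study.

import numpy as np

# Minimal generic FCM sketch: concepts are nodes, signed weights are causal edges.
# Concept names, weights, and activations are assumed for illustration only.
concepts = ["taxi_demand", "traffic_congestion", "passenger_wait_time"]

# W[i, j]: assumed causal influence of concept i on concept j (directed edge).
W = np.array([
    [0.0, 0.6, 0.7],   # demand increases congestion and wait time
    [0.0, 0.0, 0.8],   # congestion increases wait time
    [0.0, 0.0, 0.0],   # wait time has no outgoing edges in this toy map
])

def sigmoid(x, lam=1.0):
    """Common FCM squashing function mapping activations into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-lam * x))

def step(state, W):
    """One standard FCM update: A(t+1) = f(A(t) + A(t) @ W)."""
    return sigmoid(state + state @ W)

state = np.array([0.8, 0.2, 0.1])   # assumed initial concept activations
for _ in range(20):                  # iterate until the map settles
    state = step(state, W)
print(dict(zip(concepts, state.round(3))))

The verbalization step described in the abstract (translating such causal structures into restrictions via RCT) is not covered by this sketch.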
Abstract:
Because land degradation is intrinsically complex and involves decisions by many agencies and individuals, land degradation mapping should be used as a learning tool through which managers, experts, and stakeholders can re-examine their views within a wider semantic context. In this paper, we introduce an analytical framework for mapping land degradation, developed within the World Overview of Conservation Approaches and Technologies (WOCAT) programs, which aims to produce thematic maps that serve as a useful tool and provide effective information on land degradation and conservation status. This methodology thus provides an important background for decision-making to launch rehabilitation/remediation actions in high-priority intervention areas. As land degradation mapping is a problem-solving task that aims to provide clear information, this study entails the implementation of the WOCAT mapping tool, which integrates a set of indicators to appraise the severity of land degradation across a representative watershed. The work therefore focuses on the most relevant indicators for measuring the impacts of different degradation processes in the El Mkhachbiya catchment, situated in northwestern Tunisia, and on the actions taken to address them, based on an analysis of the operating modes and degradation issues in different land use systems. The study aims to provide a database for the surveillance and monitoring of land degradation, supporting stakeholders in making appropriate choices and in judging guidelines and suitable recommendations to remedy the situation and promote sustainable development. The approach is illustrated through a case study of an urban watershed in northwestern Tunisia. Results showed that the main land degradation drivers in the study area were natural processes exacerbated by human activities. The output of this analytical framework thus enabled better communication of land degradation issues and concerns in a way that is relevant to policymakers.
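The abstract does not specify how the WOCAT indicators are combined into a severity appraisal; as a purely illustrative sketch, one could imagine a weighted aggregation of normalized indicator scores into qualitative classes. The indicator names, weights, and class thresholds below are hypothetical and are not taken from the WOCAT tool or from this study.

# Hypothetical sketch: aggregate normalized land degradation indicators into a
# severity class. Indicator names, weights, and thresholds are assumed for
# illustration; they are not from the WOCAT mapping tool or the study.
INDICATOR_WEIGHTS = {
    "soil_erosion": 0.40,
    "vegetation_loss": 0.35,
    "salinization": 0.25,
}

def severity_class(scores):
    """Map indicator scores in [0, 1] to a qualitative severity class."""
    index = sum(INDICATOR_WEIGHTS[k] * scores[k] for k in INDICATOR_WEIGHTS)
    if index < 0.33:
        return "light"
    if index < 0.66:
        return "moderate"
    return "severe"

print(severity_class({"soil_erosion": 0.7, "vegetation_loss": 0.5, "salinization": 0.2}))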
Abstract:
Polymorbid patients, diverse diagnostic and therapeutic options, more complex hospital structures, financial incentives, benchmarking, as well as perceptional and societal changes put pressure on medical doctors, specifically when medical errors surface. This is particularly true in the emergency department setting, where patients face delayed or erroneous initial diagnostic or therapeutic measures and costly hospital stays due to sub-optimal triage. A "biomarker" is any laboratory tool with the potential to better detect and characterise diseases, to simplify complex clinical algorithms, and to improve clinical problem solving in routine care. Biomarkers must be embedded in clinical algorithms to complement, not replace, basic medical skills. Unselected ordering of laboratory tests and shortcomings in test performance and interpretation contribute to diagnostic errors. Test results may be ambiguous, with false-positive or false-negative findings that generate unnecessary harm and costs. Laboratory tests should only be ordered if the results have clinical consequences. In studies, we must move beyond observational reporting and the meta-analysis of diagnostic accuracies for biomarkers. Instead, specific cut-off ranges should be proposed and intervention studies conducted to prove outcome-relevant impacts on patient care. The focus of this review is to exemplify the appropriate use of selected laboratory tests in the emergency setting for which randomised controlled intervention studies have proven clinical benefit. Herein, we focus on initial patient triage and the allocation of treatment opportunities in patients with cardiorespiratory diseases in the emergency department. The following biomarkers will be discussed: proadrenomedullin for prognostic triage assessment and site-of-care decisions, cardiac troponin for acute myocardial infarction, natriuretic peptides for acute heart failure, D-dimers for venous thromboembolism, C-reactive protein as a marker of inflammation, and procalcitonin for antibiotic stewardship in infections of the respiratory tract and sepsis. For these markers we provide an overview of their pathophysiology, the historical evolution of the evidence, and their strengths and limitations for rational implementation into clinical algorithms. We critically discuss results from key intervention trials that led to their use in clinical routine, as well as potential future indications. The rationale for the use of all these biomarkers is to tackle, first, diagnostic ambiguity and the defensive medicine that follows from it; second, delayed and sub-optimal therapeutic decisions; and third, prognostic uncertainty with misguided triage and site-of-care decisions, all of which contribute to the waste of our limited health care resources. A multifaceted approach to a more targeted management of medical patients from emergency admission to discharge, including biomarkers, will translate into better resource use, shorter hospital stays, reduced overall costs, improved patient satisfaction, and better outcomes in terms of mortality and re-hospitalisation. Hopefully, the concepts outlined in this review will help readers to improve their diagnostic skills and become more parsimonious requesters of laboratory tests.
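The review itself is the authority on specific cut-off ranges; as a hedged illustration of how a biomarker is "embedded in a clinical algorithm", here is a sketch of a threshold-tiered rule in the spirit of procalcitonin-guided antibiotic stewardship. The cut-off values follow ranges commonly cited in the literature but are shown only as an example of the pattern, not as this review's recommendations.

# Illustrative sketch of a threshold-tiered biomarker rule, in the spirit of
# procalcitonin-guided antibiotic stewardship. Cut-offs follow commonly cited
# ranges in the literature, are shown only to illustrate how a biomarker sits
# inside a clinical algorithm, and never replace clinical judgement.
def antibiotic_guidance(procalcitonin_ug_per_l):
    if procalcitonin_ug_per_l < 0.1:
        return "antibiotics strongly discouraged"
    if procalcitonin_ug_per_l < 0.25:
        return "antibiotics discouraged"
    if procalcitonin_ug_per_l < 0.5:
        return "antibiotics encouraged"
    return "antibiotics strongly encouraged"

print(antibiotic_guidance(0.3))  # -> "antibiotics encouraged"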
Abstract:
This article examines the decision-making process leading to the new constitutional articles on education in Switzerland. It analyzes how actors from both state levels (Confederation and cantons) were able to reach consensus in a process that was prone to a "joint-decision trap". To that end, we formulate hypotheses about which factors may be conducive to a "problem-solving" style of policy-making in a compulsory negotiation system. Rich empirical material from various sources supports our theoretical arguments: we show that shared beliefs and a common frame of reference, the procedural separation between constitutional and distributional issues, neutral brokers, and informal structures were all beneficial to the success of the reform project.