977 results for Expert System Shell
Abstract:
Human tremor can be defined as a somewhat rhythmic and quick movement of one or more body parts. In some people, it is a symptom of a neurological disorder. From the mathematical point of view, human tremor can be defined as a weighted contribution of different sinusoidal signals which cause oscillations of some parts of the body. This sinusoidal signal is repeated over time, but its amplitude and frequency change slowly. This is why amplitude and frequency are considered important factors in tremor characterization, and thus in its diagnosis. In this paper, a tool for the pre-diagnosis of human tremor is presented. This tool uses a low-cost device (<$40) and allows the main factors of human tremor to be computed accurately. Real cases have been tested using the algorithms developed in this investigation. The patients suffered from different tremor severities, and the amplitude and frequency components were computed using a series of tests. These additional measures will help experts make better diagnoses, allowing them to focus on specific stages of a test or to get an overview of the tests as a whole. From the experiments, we found that not all tests are valid for every patient when giving a diagnosis. Guided by years of experience, the expert will decide which test or set of tests is the most appropriate for a patient.
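The abstract names amplitude and frequency as the main factors computed by the tool but gives no implementation detail. As an illustration only (not the paper's algorithm), the following Python sketch estimates the dominant tremor frequency and its amplitude from a raw accelerometer trace using an FFT; the function name, sampling rate and synthetic signal are assumptions.

import numpy as np

def dominant_tremor_component(signal, sample_rate_hz):
    # Estimate the dominant frequency (Hz) and its amplitude from a 1-D
    # accelerometer trace. A real tool would presumably also filter and
    # window the signal; those steps are not described in the abstract.
    samples = np.asarray(signal, dtype=float)
    samples = samples - samples.mean()              # remove the DC offset
    spectrum = np.fft.rfft(samples)
    freqs = np.fft.rfftfreq(samples.size, d=1.0 / sample_rate_hz)
    peak = np.argmax(np.abs(spectrum))              # index of the strongest component
    amplitude = 2.0 * np.abs(spectrum[peak]) / samples.size
    return freqs[peak], amplitude

# Synthetic example: a 5 Hz tremor of amplitude 0.8 sampled at 100 Hz.
t = np.arange(0, 10, 0.01)
trace = 0.8 * np.sin(2 * np.pi * 5 * t) + 0.05 * np.random.randn(t.size)
print(dominant_tremor_component(trace, 100))        # approximately (5.0, 0.8)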
Abstract:
Femicide, defined as the killing of females by males because they are female, is becoming recognized worldwide as an important ongoing manifestation of gender inequality. Despite its widespread prevalence, only a few countries have specific registries on this issue. This study aims to assemble expert opinion regarding the strategies that might feasibly be employed to promote, develop and implement an integrated and differentiated femicide data collection system in Europe at both the national and international levels. Concept mapping methodology was followed, involving 28 experts from 16 countries in generating strategies and sorting and rating them with respect to relevance and feasibility. The experts involved were all members of the EU-Cost-Action on femicide, a scientific network of experts on femicide and violence against women across Europe. As a result, a conceptual map emerged, consisting of 69 strategies organized in 10 clusters, which fit into two domains: “Political action” and “Technical steps”. There was consensus among participants regarding the high relevance of strategies to institutionalize national databases and raise public awareness through different stakeholders, while strategies to promote media involvement were identified as the most feasible. Differences in perceived priorities according to the level of the human development index of the experts’ countries were also observed.
Abstract:
Marine invertebrates with an open circulatory system establish a low and constant oxygen partial pressure (Po2) around their tissues. We hypothesized that, as a first step towards maintaining low haemolymph and tissue oxygenation, the Po2 in molluscan mantle cavity water should be lowered relative to normoxic (21 kPa) seawater Po2, but kept high enough to meet the energetic requirements of a given species. We recorded Po2 in the mantle cavity water of five molluscan species with different lifestyles: two pectinids (Aequipecten opercularis, Pecten maximus), two mud clams (Arctica islandica, Mya arenaria), and a limpet (Patella vulgata). All species maintain mantle cavity water oxygenation below normoxic Po2. Average mantle cavity water Po2 correlates positively with standard metabolic rate (SMR): it is highest in scallops and lowest in mud clams. Scallops show a typical Po2 frequency distribution, with peaks between 3 and 10 kPa, whereas mud clams and limpets maintain mantle water Po2 mostly <5 kPa. Only A. islandica and P. vulgata display distinguishable temporal patterns in their Po2 time series. Adjustment of mantle cavity Po2 to lower-than-ambient levels through controlled pumping prevents high oxygen gradients between bivalve tissues and the surrounding fluid, limiting oxygen flux across the body surface. The patterns of Po2 in mantle cavity water correspond to molluscan ecotypes.
Abstract:
"Shell Development Company."
Abstract:
Vol. 3 of his "New and accurate system of natural history."
Abstract:
Vol. III of his "New and accurate system of natural history."
Abstract:
Assessments for assigning the conservation status of threatened species that are based purely on subjective judgements become problematic because assessments can be influenced by hidden assumptions, personal biases and perceptions of risk, making the assessment process difficult to repeat. This can result in inconsistent assessments and misclassifications, which can lead to a lack of confidence in species assessments. It is almost impossible to understand an expert's logic or visualise the underlying reasoning behind the many hidden assumptions used throughout the assessment process. In this paper, we formalise the decision-making process of experts by capturing their logical ordering of information, their assumptions and reasoning, and transferring them into a set of decision rules. We illustrate this through the process used to evaluate the conservation status of species under the NatureServe system (Master, 1991). NatureServe status assessments have been used for over two decades to set conservation priorities for threatened species throughout North America. We develop a conditional point-scoring method to reflect the current subjective process. In two test comparisons, 77% of species' assessments using the explicit NatureServe method matched the qualitative assessments done subjectively by NatureServe staff. Of those that differed, no rank varied by more than one rank level between the two methods. In general, the explicit NatureServe method tended to be more precautionary than the subjective assessments. The rank differences that emerged from the comparisons may be due, at least in part, to the flexibility of the qualitative system, which allows different factors to be weighted on a species-by-species basis according to expert judgement. The method outlined in this study is the first documented attempt to explicitly define a transparent process for weighting and combining factors under the NatureServe system. The process of eliciting expert knowledge identifies how information is combined and highlights any inconsistent logic that may not be obvious in subjective decisions. The method provides a repeatable, transparent, and explicit benchmark for feedback, further development, and improvement. (C) 2004 Elsevier SAS. All rights reserved.
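As an illustration of what a conditional point-scoring rule set can look like, a minimal Python sketch follows; the factor names, weights and rank thresholds are hypothetical, not the calibrated NatureServe values.

# Hypothetical factors, weights and thresholds; the paper's calibrated values are not given here.
FACTOR_WEIGHTS = {"range_extent": 0.4, "population_size": 0.4, "threat_severity": 0.2}
RANK_THRESHOLDS = [(1.5, "S1"), (2.5, "S2"), (3.5, "S3"), (4.5, "S4")]

def assign_rank(factor_scores):
    # factor_scores maps factor name -> score on a 1 (worst) to 5 (best) scale;
    # missing factors are skipped and the remaining weights renormalised,
    # mirroring the conditional nature of the scoring.
    available = {f: s for f, s in factor_scores.items() if s is not None}
    total_weight = sum(FACTOR_WEIGHTS[f] for f in available)
    weighted = sum(FACTOR_WEIGHTS[f] * s for f, s in available.items()) / total_weight
    for threshold, rank in RANK_THRESHOLDS:
        if weighted < threshold:
            return rank
    return "S5"

print(assign_rank({"range_extent": 2, "population_size": 1, "threat_severity": 3}))  # "S2"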
Abstract:
Power systems are large-scale nonlinear systems with high complexity. Various optimization techniques and expert systems have been used in power system planning. However, there are always some factors that cannot be quantified, modeled, or even expressed by expert systems. Moreover, such planning problems are often large-scale optimization problems. Although computational algorithms that are capable of handling high-dimensional problems can be used, the computational costs are still very high. To address these problems, this paper investigates the efficiency and effectiveness of combining mathematical algorithms with human intelligence. It has been found that humans can join the decision-making process through cognitive feedback. Based on cognitive feedback and the genetic algorithm, a new algorithm called the cognitive genetic algorithm is presented. This algorithm can clarify and extract human cognition. As an important application of the cognitive genetic algorithm, a practical decision method for power distribution system planning is proposed. Using this decision method, optimal results that satisfy human expertise can be obtained and the limitations of human experts can be minimized at the same time.
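The abstract does not detail the cognitive genetic algorithm, so the sketch below only illustrates the general idea of biasing a genetic algorithm's selection with an expert-feedback term; the bit-string encoding, operators and feedback interface are all assumptions, not the paper's method.

import random

def cognitive_ga(fitness, expert_feedback, bits=16, pop_size=20, generations=50):
    # Genetic algorithm whose selection score blends a numeric objective with
    # an expert correction in [-1, 1], standing in for the cognitive-feedback
    # step the paper describes (interface assumed, not taken from the paper).
    pop = [[random.randint(0, 1) for _ in range(bits)] for _ in range(pop_size)]

    def score(ind):
        return fitness(ind) + expert_feedback(ind)

    for _ in range(generations):
        pop.sort(key=score, reverse=True)
        parents = pop[: pop_size // 2]               # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, bits)
            child = a[:cut] + b[cut:]                # one-point crossover
            i = random.randrange(bits)
            child[i] ^= random.random() < 0.05       # occasional bit-flip mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=score)

# Toy run: maximise the number of ones while the "expert" mildly prefers a zero in position 0.
print(cognitive_ga(sum, lambda ind: -0.5 * ind[0]))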
Abstract:
Aim: To present an evidence-based framework to improve the quality of occupational therapy expert opinions on work capacity for litigation, compensation and insurance purposes. Methods: Grounded theory methodology was used to collect and analyse data from a sample of 31 participants, comprising 19 occupational therapists, 6 medical specialists and 6 lawyers. A focused semistructured interview was completed with each participant. In addition, 20 participants verified the key findings. Results: The framework is contextualised within a medicolegal system requiring increasing expertise. The framework consists of (i) broad professional development strategies and principles, and (ii) specific strategies and principles for improving opinions through reporting and assessment practices. Conclusions: The synthesis of the participants' recommendations provides systematic guidelines for improving occupational therapy expert opinion on work capacity.
Abstract:
The purpose of this research is to propose a procurement system that works across disciplines and shares retrieved information with the relevant parties, so as to achieve better co-ordination between the supply and demand sides. This paper demonstrates how to analyze the data with an agent-based procurement system (APS) to re-engineer and improve the existing procurement process. The intelligent agents take responsibility for searching for potential suppliers, negotiating with the short-listed suppliers and evaluating the performance of suppliers against the selection criteria with a mathematical model. Manufacturing firms and trading companies spend more than half of their sales dollar on the purchase of raw materials and components. Efficient data collection with high accuracy is one of the key success factors for quality procurement, that is, purchasing the right material at the right quality from the right suppliers. In general, enterprises spend a significant amount of resources on data collection and storage, but too little on facilitating data analysis and sharing. To validate the feasibility of the approach, a case study on a manufacturing small and medium-sized enterprise (SME) has been conducted. The APS supports data and information analysis techniques to facilitate decision making, such that the agent can improve negotiation and supplier evaluation efficiency by saving time and cost.
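The abstract mentions evaluating supplier performance against selection criteria with a mathematical model but does not specify it; the minimal weighted-scoring sketch below is one plausible form, with criteria, weights and scores chosen purely for illustration.

# Hypothetical selection criteria and weights; the APS model itself is not given in the abstract.
CRITERIA_WEIGHTS = {"price": 0.4, "quality": 0.3, "delivery_time": 0.2, "service": 0.1}

def rank_suppliers(suppliers):
    # suppliers maps supplier name -> {criterion: score in [0, 10]};
    # returns the suppliers sorted best-first by weighted score.
    def weighted_score(scores):
        return sum(CRITERIA_WEIGHTS[c] * scores[c] for c in CRITERIA_WEIGHTS)
    return sorted(suppliers.items(), key=lambda kv: weighted_score(kv[1]), reverse=True)

quotes = {
    "Supplier A": {"price": 8, "quality": 6, "delivery_time": 7, "service": 9},
    "Supplier B": {"price": 6, "quality": 9, "delivery_time": 8, "service": 7},
}
print([name for name, _ in rank_suppliers(quotes)])   # ['Supplier B', 'Supplier A']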
Abstract:
In order to survive in an increasingly customer-oriented marketplace, continuous quality improvement marks the fastest-growing quality organization's success. In recent years, attention has been focused on intelligent systems, which have shown great promise in supporting quality control. However, only a small number of the currently used systems are reported to be operating effectively, because they are designed to maintain a quality level within the specified process rather than to focus on cooperation within the production workflow. This paper proposes an intelligent system with a newly designed algorithm and a universal process data exchange standard to overcome the challenges of demanding customers who seek high-quality and low-cost products. The intelligent quality management system is equipped with a "distributed process mining" feature to provide employees at all levels with the ability to understand the relationships between processes, especially when any aspect of a process is going to degrade or fail. An example using generalized fuzzy association rules is applied in the manufacturing sector to demonstrate how the proposed iterative process mining algorithm finds the relationships between distributed process parameters and the presence of quality problems.
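Since the abstract names generalized fuzzy association rules but does not show the algorithm, the sketch below illustrates only plain (crisp) support/confidence rule mining over process event records; the fuzzification of numeric process parameters and the paper's iterative, distributed variant are not shown, and all event names are invented.

from itertools import combinations

def association_rules(transactions, min_support=0.3, min_confidence=0.7):
    # Mine one-antecedent rules A -> B from sets of process events using
    # plain support and confidence; a generalized fuzzy variant would first
    # map numeric process parameters to fuzzy labels.
    n = len(transactions)
    items = {i for t in transactions for i in t}

    def support(itemset):
        return sum(itemset <= t for t in transactions) / n

    rules = []
    for a, b in combinations(items, 2):
        for ante, cons in ((a, b), (b, a)):
            sup = support({ante, cons})
            if sup >= min_support and sup / support({ante}) >= min_confidence:
                rules.append((ante, cons, sup))
    return rules

events = [
    {"temp_high", "defect"}, {"temp_high", "defect"}, {"temp_high", "defect"},
    {"pressure_low"}, {"pressure_low", "defect"},
]
print(association_rules(events))   # links "temp_high" and "defect" in both directions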
Abstract:
When constructing and using environmental models, it is typical that many of the inputs to the models will not be known perfectly. In some cases it will be possible to make observations, or occasionally to use physics-based uncertainty propagation, to ascertain the uncertainty on these inputs. However, such observations are often either unavailable or not even possible, and another approach to characterising the uncertainty on the inputs must be sought. Even when observations are available, if the analysis is being carried out within a Bayesian framework then prior distributions will have to be specified. One option for gathering, or at least estimating, this information is to employ expert elicitation. Expert elicitation is well studied within statistics and psychology and involves assessing the beliefs of a group of experts about an uncertain quantity (for example, an input or parameter within a model), typically in terms of obtaining a probability distribution. One of the challenges in expert elicitation is to minimise the biases that might enter into the judgements made by the individual experts, and then to come to a consensus decision within the group of experts. Effort is made in the elicitation exercise to prevent biases clouding the judgements through well-devised questioning schemes. It is also important that, when reaching a consensus, the experts are exposed to the knowledge of the others in the group. Within the FP7 UncertWeb project (http://www.uncertweb.org/), there is a requirement to build a Web-based tool for expert elicitation. In this paper, we discuss some of the issues of building a Web-based elicitation system, covering both the technological aspects and the statistical and scientific issues. In particular, we demonstrate two tools: a Web-based system for the elicitation of continuous random variables and a system designed to elicit uncertainty about categorical random variables in the setting of landcover classification uncertainty. The first of these examples is a generic tool developed to elicit uncertainty about univariate continuous random variables. It is designed to be used within an application context and extends the existing SHELF method, adding a web interface and access to metadata. The tool is developed so that it can be readily integrated with environmental models exposed as web services. The second example was developed for the TREES-3 initiative, which monitors tropical landcover change through ground-truthing at confluence points. It allows experts to validate the accuracy of automated landcover classifications using site-specific imagery and local knowledge. Experts may provide uncertainty information at various levels: from a general rating of their confidence in a site validation to a numerical ranking of the possible landcover types within a segment. A key challenge in the web-based setting is the design of the user interface and the method of interaction between the problem owner and the problem experts. We show the workflow of the elicitation tool, and show how we can represent the final elicited distributions and confusion matrices using UncertML, ready for integration into uncertainty-enabled workflows. We also show how the metadata associated with the elicitation exercise is captured and can be referenced from the elicited result, providing crucial lineage information and thus traceability in the decision-making process.
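One generic step behind such an elicitation tool is turning an expert's elicited quantiles into a parametric distribution. The sketch below fits a normal distribution to an elicited median and quartiles by least-squares quantile matching; this is a common textbook approach, not the UncertWeb/SHELF implementation, and the function name and example values are assumptions.

import numpy as np
from scipy import optimize, stats

def fit_normal_to_quartiles(q25, q50, q75):
    # Fit a normal distribution to elicited quartiles by matching quantiles.
    # The actual tool may support other distribution families and pool
    # judgements from several experts.
    probs = np.array([0.25, 0.5, 0.75])
    elicited = np.array([q25, q50, q75], dtype=float)

    def residuals(params):
        mu, sigma = params
        return stats.norm.ppf(probs, loc=mu, scale=sigma) - elicited

    result = optimize.least_squares(residuals, x0=[q50, q75 - q25],
                                    bounds=([-np.inf, 1e-9], np.inf))
    return result.x   # (mu, sigma)

# Expert believes the quantity is around 10, with quartiles at 8 and 13.
mu, sigma = fit_normal_to_quartiles(8, 10, 13)
print(mu, sigma)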