965 results for Natural language processing (Computer science)
Abstract:
PHWAT is a new model that couples a geochemical reaction model (PHREEQC-2) with a density-dependent groundwater flow and solute transport model (SEAWAT) using the split-operator approach. PHWAT was developed to simulate multi-component reactive transport in variable density groundwater flow. Fluid density in PHWAT depends not only on the concentration of a single species, as in SEAWAT, but also on the concentrations of other dissolved chemicals that can be subject to reactive processes. Simulation results of PHWAT and PHREEQC-2 were compared in their predictions of effluent concentration from a column experiment. Both models produced identical results, showing that PHWAT has correctly coupled the sub-packages. PHWAT was then applied to the simulation of a tank experiment in which seawater intrusion was accompanied by cation exchange. The density dependence of the intrusion and the snow-plough effect in the breakthrough curves were reflected in the model simulations, which were in good agreement with the measured breakthrough data. Comparison simulations that, in turn, excluded density effects and reactions allowed us to quantify the marked effect of ignoring these processes. Next, we explored numerical issues involved in the practical application of PHWAT using the example of a dense plume flowing into a tank containing fresh water. It was shown that PHWAT could model physically unstable flow and that numerical instabilities were suppressed. Physical instability developed in the model in accordance with the increase of the modified Rayleigh number for density-dependent flow, in agreement with previous research. (c) 2004 Elsevier Ltd. All rights reserved.
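A minimal sketch of the split-operator idea the abstract describes: each time step first transports every species, then applies the geochemical reactions, then recomputes fluid density from all dissolved species rather than a single one. The decay reaction, coefficients, and grid below are invented placeholders for the SEAWAT and PHREEQC-2 components, not PHWAT's actual interface.

```python
import numpy as np

def density(row):
    # Hypothetical linearized density model: rho depends on ALL dissolved
    # species (the PHWAT extension), not on a single species as in SEAWAT.
    rho0, coeffs = 1000.0, np.array([0.7, 0.5, 0.3])  # invented coefficients
    return rho0 + row @ coeffs

def transport_step(conc, v, dx, dt):
    # First-order upwind advection of each species; placeholder for SEAWAT.
    out = conc.copy()
    out[1:] -= v * dt / dx * (conc[1:] - conc[:-1])
    return out

def reaction_step(conc, dt, k=0.1):
    # Placeholder for the PHREEQC-2 call: species 0 decays into species 1.
    out = conc.copy()
    out[:, 0] -= k * dt * conc[:, 0]
    out[:, 1] += k * dt * conc[:, 0]
    return out

conc = np.zeros((50, 3))            # 50 cells, 3 species in a 1D column
conc[0] = [1.0, 0.0, 0.2]           # fixed-concentration inlet cell
for step in range(100):             # split-operator loop
    conc = transport_step(conc, v=1e-3, dx=0.01, dt=1.0)
    conc[0] = [1.0, 0.0, 0.2]       # re-impose the inlet boundary
    conc = reaction_step(conc, dt=1.0)
    rho = np.apply_along_axis(density, 1, conc)  # feeds back into the flow solve
```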
Abstract:
The refinement calculus provides a framework for the stepwise development of imperative programs from specifications. In this paper we study a refinement calculus for deriving logic programs. Dealing with logic programs rather than imperative programs has the dual advantages that, due to the expressive power of logic programs, the final program is closer to the original specification, and each refinement step can achieve more. Together these reduce the overall number of derivation steps. We present a logic programming language extended with specification constructs (including general predicates, assertions, and types and invariants) to form a wide-spectrum language. General predicates allow non-executable properties to be included in specifications. Assertions, types and invariants make assumptions about the intended inputs of a procedure explicit, and can be used during refinement to optimize the constructed logic program. We provide a semantics for the extended logic programming language and derive a set of refinement laws. Finally we apply these to an example derivation.
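The contrast between a general (possibly non-executable) specification construct and its refinement can be illustrated outside logic programming as well. Below is a hedged Python analogy, not the paper's wide-spectrum language: the specification predicate is executable only by brute force, and an explicit assertion about the intended inputs licenses the efficient refined program.

```python
from itertools import permutations

def is_sorted(seq):
    return all(a <= b for a, b in zip(seq, seq[1:]))

# "Specification": sorting described declaratively, standing in for a
# general predicate that need not be efficiently executable.
def sort_spec(xs):
    return next(p for p in permutations(xs) if is_sorted(p))

# "Refinement": an efficient program. The assertion makes the assumption
# about intended inputs explicit, as assertions do in the wide-spectrum
# language, and licenses the optimized implementation.
def sort_impl(xs):
    assert all(isinstance(x, int) for x in xs), "assumed input type"
    return tuple(sorted(xs))

assert sort_impl([3, 1, 2]) == sort_spec([3, 1, 2])  # refinement preserves the spec
```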
Abstract:
In order to analyse the effect of modelling assumptions in a formal, rigorous way, a syntax of modelling assumptions has been defined. The syntax of modelling assumptions enables us to represent modelling assumptions as transformations acting on the set of model equations. The notion of syntactical correctness and semantical consistency of sets of modelling assumptions is defined and methods for checking them are described. It is shown through a simple example how different modelling assumptions act on the model equations, and their effect on the differential index of the resulting model is also indicated.
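One way to picture an assumption acting as a transformation on the model equations, sketched with sympy; the model equation and the assumption are invented examples, not the paper's formalism:

```python
import sympy as sp

t, k, F = sp.symbols('t k F')
x = sp.Function('x')

# A toy model equation: a first-order balance dx/dt = F - k*x.
model = [sp.Eq(x(t).diff(t), F - k * x(t))]

# A modelling assumption represented as a transformation on the equation
# set: "steady state" replaces every time derivative with zero.
def steady_state(eqs):
    return [eq.subs(x(t).diff(t), 0) for eq in eqs]

reduced = steady_state(model)        # [Eq(0, F - k*x(t))]
print(sp.solve(reduced[0], x(t)))    # [F/k]
```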
Abstract:
Recent advances in computer technology have made it possible to create virtual plants by simulating the details of structural development of individual plants. Software has been developed that processes plant models expressed in a special purpose mini-language based on the Lindenmayer system formalism. These models can be extended from their architectural basis to capture plant physiology by integrating them with crop models, which estimate biomass production as a consequence of environmental inputs. Through this process, virtual plants will gain the ability to react to broad environmental conditions, while crop models will gain a visualisation component. This integration requires the resolution of the fundamentally different time scales underlying the approaches. Architectural models are usually based on physiological time; each time step encompasses the same amount of development in the plant, without regard to the passage of real time. In contrast, physiological models are based in real time; the amount of development in a time step is dependent on environmental conditions during the period. This paper provides a background on the plant modelling language, then describes how widely-used concepts of thermal time can be implemented to resolve these time scale differences. The process is illustrated using a case study. (C) 1997 Elsevier Science Ltd.
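A hedged sketch of the time-scale resolution the abstract describes: thermal time accumulated from daily temperatures decides how many physiological steps (L-system rewrites) occur per real day. The rewrite rule, base temperature, and step size are illustrative values only.

```python
# One L-system rewrite = one physiological time step.
RULES = {"A": "I[A]A"}  # apex produces an internode, a branch, and a new apex

def rewrite(axiom):
    return "".join(RULES.get(sym, sym) for sym in axiom)

def daily_gdd(t_max, t_min, t_base=10.0):
    # Thermal time accrued in one real day (growing degree days).
    return max(0.0, (t_max + t_min) / 2.0 - t_base)

STEP_GDD = 15.0          # hypothetical thermal time per physiological step
plant, pool = "A", 0.0
for t_max, t_min in [(28, 14), (18, 8), (31, 17), (25, 12)]:  # daily weather
    pool += daily_gdd(t_max, t_min)
    while pool >= STEP_GDD:   # a real day may trigger zero, one, or more steps
        plant = rewrite(plant)
        pool -= STEP_GDD
print(plant)
```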
Abstract:
In this work, we take advantage of association rule mining to support two types of medical systems: the Content-based Image Retrieval (CBIR) systems and the Computer-Aided Diagnosis (CAD) systems. For content-based retrieval, association rules are employed to reduce the dimensionality of the feature vectors that represent the images and to improve the precision of the similarity queries. We refer to the association rule-based method to improve CBIR systems proposed here as Feature selection through Association Rules (FAR). To improve CAD systems, we propose the Image Diagnosis Enhancement through Association rules (IDEA) method. Association rules are employed to suggest a second opinion to the radiologist or a preliminary diagnosis of a new image. A second opinion automatically obtained can either accelerate the process of diagnosing or strengthen a hypothesis, increasing the probability that a prescribed treatment will be successful. Two new algorithms are proposed to support the IDEA method: one to pre-process low-level features and one to propose a preliminary diagnosis based on association rules. We performed several experiments to validate the proposed methods. The results indicate that association rules can be successfully applied to improve CBIR and CAD systems, empowering the arsenal of techniques to support medical image analysis in medical systems. (C) 2009 Elsevier B.V. All rights reserved.
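The idea behind rule-based feature selection can be miniaturized as follows. This is a toy stand-in, not the published FAR or IDEA algorithms: binarized features that form high-support, high-confidence rules pointing at a diagnosis class are the ones kept.

```python
import numpy as np

def select_by_rules(X_bin, y, min_support=0.1, min_conf=0.7):
    """Keep features forming a confident rule 'feature=1 -> class=c'.
    A toy stand-in for rule-based feature selection, not the FAR method."""
    n, kept = len(y), []
    for j in range(X_bin.shape[1]):
        mask = X_bin[:, j] == 1
        if mask.sum() / n < min_support:
            continue
        # Confidence of the best rule with this feature as antecedent.
        best_conf = max((y[mask] == c).mean() for c in np.unique(y))
        if best_conf >= min_conf:
            kept.append(j)
    return kept

X_bin = np.array([[1, 0, 1], [1, 0, 0], [0, 1, 1], [0, 1, 0]])
y = np.array([0, 0, 1, 1])
print(select_by_rules(X_bin, y))  # [0, 1]: these features predict the class
```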
Abstract:
In this paper, we propose a method based on association rule-mining to enhance the diagnosis of medical images (mammograms). It combines low-level features automatically extracted from images and high-level knowledge from specialists to search for patterns. Our method analyzes medical images and automatically generates suggestions of diagnoses employing mining of association rules. The suggestions of diagnosis are used to accelerate the image analysis performed by specialists as well as to provide them with an alternative to consider. The proposed method uses two new algorithms, PreSAGe and HiCARe. The PreSAGe algorithm combines, in a single step, feature selection and discretization, and reduces the mining complexity. Experiments performed on PreSAGe show that this algorithm is highly suitable to perform feature selection and discretization in medical images. HiCARe is a new associative classifier. The HiCARe algorithm has an important property that makes it unique: it assigns multiple keywords per image to suggest a diagnosis with high values of accuracy. Our method was applied to real datasets, and the results show high sensitivity (up to 95%) and accuracy (up to 92%), allowing us to claim that the use of association rules is a powerful means to assist in the diagnosing task.
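The abstract does not spell out PreSAGe, so the following is only a plausible miniature of a combined discretization-plus-selection step: bin each continuous feature and keep it when some bin is dominated by one diagnosis class. All thresholds and data are invented.

```python
import numpy as np

def discretize_and_select(X, y, bins=4, purity=0.8):
    """Toy single-pass discretization + feature selection, a hypothetical
    stand-in for the combined step PreSAGe performs."""
    kept, codes = [], []
    for j in range(X.shape[1]):
        edges = np.histogram_bin_edges(X[:, j], bins)[1:-1]  # interior cuts
        binned = np.digitize(X[:, j], edges)
        # Keep the feature if some bin is dominated by one diagnosis class.
        for b in np.unique(binned):
            _, counts = np.unique(y[binned == b], return_counts=True)
            if counts.max() / counts.sum() >= purity:
                kept.append(j)
                codes.append(binned)
                break
    return kept, codes

X = np.array([[0.1, 5.0], [0.2, 5.1], [0.9, 5.0], [0.8, 5.1]])
y = np.array(["benign", "benign", "malign", "malign"])
print(discretize_and_select(X, y)[0])  # [0]: the noisy feature 1 is dropped
```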
Abstract:
Uncontrolled systems $\dot{x} \in Ax$, where $A$ is a non-empty compact set of matrices, and controlled systems $\dot{x} \in Ax + Bu$ are considered. Higher-order systems $0 \in Px - Du$, where $P$ and $D$ are sets of differential polynomials, are also studied. It is shown that, under natural conditions commonly occurring in robust control theory, with some mild additional restrictions, asymptotic stability of differential inclusions is guaranteed. The main results are variants of small-gain theorems and the principal technique used is the Krasnosel'skii-Pokrovskii principle of absence of bounded solutions.
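For orientation, a trajectory solves such an inclusion when its derivative is generated pointwise by some admissible matrix; this is the standard reading of the notation, stated here rather than quoted from the paper:

```latex
\dot{x}(t) \in A\,x(t)
\;\Longleftrightarrow\;
\dot{x}(t) = M(t)\,x(t) \ \text{for a.e. } t,
\quad M(t) \in A \ \text{measurable,}
```

and asymptotic stability demands $x(t) \to 0$ along every such measurable selection $M(\cdot)$.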
Abstract:
In the design of lattice domes, design engineers need expertise in areas such as configuration processing, nonlinear analysis, and optimization. These are extensive numerical, iterative, and time-consuming processes that are prone to error without an integrated design tool. This article presents the application of a knowledge-based system in solving lattice-dome design problems. An operational prototype knowledge-based system, LADOME, has been developed by employing the combined knowledge representation approach, which uses rules, procedural methods, and an object-oriented blackboard concept. The system's objective is to assist engineers in lattice-dome design by integrating all design tasks into a single computer-aided environment with implementation of the knowledge-based system approach. For system verification, results from design examples are presented.
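The blackboard concept mentioned above is easy to caricature: independent knowledge sources watch a shared store and each contributes what it can until the design stabilizes. The sources and values below are invented for illustration, not LADOME's actual rules.

```python
blackboard = {"span_m": 30.0}

def configure(bb):   # configuration processing
    if "geometry" not in bb:
        bb["geometry"] = f"lamella dome, {bb['span_m']:.0f} m span"

def analyze(bb):     # stand-in for nonlinear analysis
    if "geometry" in bb and "max_stress" not in bb:
        bb["max_stress"] = 120.0  # MPa, invented result

def optimize(bb):    # member sizing from analysis results
    if "max_stress" in bb and "member" not in bb:
        bb["member"] = "CHS 114x6" if bb["max_stress"] < 150 else "CHS 139x8"

# Control loop: keep firing knowledge sources until no new facts appear.
sources = [configure, analyze, optimize]
changed = True
while changed:
    before = dict(blackboard)
    for source in sources:
        source(blackboard)
    changed = blackboard != before
print(blackboard)
```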
Abstract:
In this work, we present a systematic approach to the representation of modelling assumptions. Modelling assumptions form the fundamental basis for the mathematical description of a process system. These assumptions can be translated into either additional mathematical relationships or constraints between model variables, equations, balance volumes or parameters. In order to analyse the effect of modelling assumptions in a formal, rigorous way, a syntax of modelling assumptions has been defined. The smallest indivisible syntactical element, the so-called assumption atom, has been identified as a triplet. With this syntax a modelling assumption can be described as an elementary assumption, i.e. an assumption consisting of only an assumption atom, or a composite assumption consisting of a conjunction of elementary assumptions. The above syntax of modelling assumptions enables us to represent modelling assumptions as transformations acting on the set of model equations. The notion of syntactical correctness and semantical consistency of sets of modelling assumptions is defined and necessary conditions for checking them are given. These transformations can be used in several ways and their implications can be analysed by formal methods. The modelling assumptions define model hierarchies, that is, a series of model families, each belonging to a particular equivalence class. These model equivalence classes can be related to primal assumptions regarding the definition of mass, energy and momentum balance volumes and to secondary and tertiary assumptions regarding the presence or absence and the form of mechanisms within the system. Within equivalence classes there are many model members, these being related by algebraic model transformations for the particular model. We show how these model hierarchies are driven by the underlying assumption structure and indicate some implications on system dynamics and complexity issues. (C) 2001 Elsevier Science Ltd. All rights reserved.
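Read programmatically, the syntax above suggests a data structure like the following. The triplet's field names are one plausible reading only (the abstract does not fix them), and the consistency check shows merely the flavour of a necessary condition.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class AssumptionAtom:
    # The assumption atom is a triplet; the fields (element, keyword, value)
    # are a hypothetical reading, e.g. ("dT/dt", "equals", "0").
    element: str
    keyword: str
    value: str

# A composite assumption is a conjunction of elementary assumptions.
steady_isothermal = frozenset({
    AssumptionAtom("dT/dt", "equals", "0"),
    AssumptionAtom("dM/dt", "equals", "0"),
})

def consistent(composite):
    # Necessary condition: no two atoms fix the same quantity differently.
    seen = {}
    for atom in composite:
        key = (atom.element, atom.keyword)
        if seen.setdefault(key, atom.value) != atom.value:
            return False
    return True

print(consistent(steady_isothermal))  # True
```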
Abstract:
This paper presents a method of formally specifying, refining and verifying concurrent systems which uses the object-oriented state-based specification language Object-Z together with the process algebra CSP. Object-Z provides a convenient way of modelling complex data structures needed to define the component processes of such systems, and CSP enables the concise specification of process interactions. The basis of the integration is a semantics of Object-Z classes identical to that of CSP processes. This allows classes specified in Object-Z to be used directly within the CSP part of the specification. In addition to specification, we also discuss refinement and verification in this model. The common semantic basis enables a unified method of refinement to be used, based upon CSP refinement. To enable state-based techniques to be used for the Object-Z components of a specification, we develop state-based refinement relations which are sound and complete with respect to CSP refinement. In addition, a verification method for static and dynamic properties is presented. The method allows us to verify properties of the CSP system specification in terms of its component Object-Z classes by using the laws of the CSP operators together with the logic for Object-Z.
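The "class as process" view can be caricatured in Python: a class's traces are the sequences of method calls its state permits, and bounded trace inclusion stands in for CSP trace refinement. This is an analogy for orientation, not Object-Z or CSP semantics.

```python
class Counter:
    """A state-based 'class as process': its events are method names."""
    def __init__(self): self.n = 0
    def events(self): return ["inc"] + (["dec"] if self.n > 0 else [])
    def do(self, e): self.n += 1 if e == "inc" else -1

class IncOnly(Counter):
    def events(self): return ["inc"]   # a more deterministic implementation

def traces(make, depth):
    """All event traces up to a bounded depth, replaying from a fresh state."""
    out, frontier = {()}, [()]
    while frontier:
        trace = frontier.pop()
        if len(trace) == depth:
            continue
        state = make()
        for e in trace:
            state.do(e)
        for e in state.events():
            out.add(trace + (e,))
            frontier.append(trace + (e,))
    return out

# Bounded check of trace refinement: every Impl trace is a Spec trace.
print(traces(IncOnly, 3) <= traces(Counter, 3))  # True
```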
Abstract:
This paper reviews current research and contemporary theories of subcortical participation in the motor control of speech production and language processing. As a necessary precursor to the discussion of the functional roles of the basal ganglia and thalamus, the neuroanatomy of the basal ganglial-thalamocortical circuitry is described. Contemporary models of hypokinetic and hyperkinetic movement disorders based on recent neuroanatomical descriptions of the multi-segmented circuits that characterise basal ganglion anatomy are described. Reported effects of surgically induced lesions in the globus pallidus and thalamus on speech production are reviewed. In addition, contemporary models proposed to explain the possible contribution of various subcortical structures to language processing are described and discussed in the context of evidence gained from observation of the effects of circumscribed surgically induced lesions in the basal ganglia and thalamus on language function. The potential of studies based on examination of the speech/language outcomes of patients undergoing pallidotomy and thalamotomy to further inform the debate relating to the role of subcortical structures in speech motor control and language processing is highlighted. Copyright (C) 2001 S. Karger AG, Basel.
Abstract:
Formulations of fuzzy integral equations in terms of the Aumann integral do not reflect the behavior of corresponding crisp models. Consequently, they are ill-adapted to describe physical phenomena, even when vagueness and uncertainty are present. A similar situation for fuzzy ODEs has been obviated by interpretation in terms of families of differential inclusions. The paper extends this formalism to fuzzy integral equations and shows that the resulting solution sets and attainability sets are fuzzy and far better descriptions of uncertain models involving integral equations. The investigation is restricted to Volterra type equations with mildly restrictive conditions, but the methods are capable of extensive generalization to other types and more general assumptions. The results are illustrated by integral equations relating to control models with fuzzy uncertainties.
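Schematically, and restricted to the Volterra setting the paper treats, the inclusion-based reading replaces a single crisp equation with a family of integral inclusions indexed by the membership level. The display below is a standard formulation given for orientation, not quoted from the paper:

```latex
x(t) = x_0 + \int_0^t k\bigl(t, s, x(s)\bigr)\,ds
\quad\leadsto\quad
x_\alpha(t) \in x_0 + \int_0^t \bigl[k\bigl(t, s, x_\alpha(s)\bigr)\bigr]_\alpha\,ds,
\qquad \alpha \in [0,1],
```

with the fuzzy solution and attainability sets rebuilt from the $\alpha$-cuts of these solution families.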
Abstract:
At the core of the analysis task in the development process is information systems requirements modelling. Modelling of requirements has been occurring for many years and the techniques used have progressed from flowcharting through data flow diagrams and entity-relationship diagrams to object-oriented schemas today. Unfortunately, researchers have been able to offer practitioners little theoretical guidance on which techniques to use and when. In an attempt to address this situation, Wand and Weber have developed a series of models based on the ontological theory of Mario Bunge: the Bunge-Wand-Weber (BWW) models. Two particular criticisms of the models have persisted, however: the understandability of the constructs in the BWW models and the difficulty of applying the models to a modelling technique. This paper addresses these issues by presenting a meta model of the BWW constructs using a meta language that is familiar to many IS professionals, more specific than plain English text, but easier to understand than the set-theoretic language of the original BWW models. Such a meta model also facilitates the application of the BWW theory to other modelling techniques that have similar meta models defined. Moreover, this approach supports the identification of patterns of constructs that might be common across meta models for modelling techniques. Such findings are useful in extending and refining the BWW theory. (C) 2002 Elsevier Science Ltd. All rights reserved.
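To make the notion of a meta model concrete, here is a toy rendering of a few commonly cited BWW constructs (things, properties, states, transformations) as data structures. The attribute choices are illustrative only and are not the meta model the paper defines.

```python
from dataclasses import dataclass, field

@dataclass
class Property:
    name: str

@dataclass
class Thing:
    name: str
    properties: list[Property] = field(default_factory=list)

@dataclass
class State:
    thing: Thing
    values: dict[str, object]   # one value per property of the thing

@dataclass
class Transformation:
    thing: Thing
    before: State
    after: State

invoice = Thing("Invoice", [Property("amount"), Property("status")])
payment = Transformation(
    invoice,
    State(invoice, {"amount": 100, "status": "open"}),
    State(invoice, {"amount": 100, "status": "paid"}),
)
```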
Abstract:
Management are keen to maximize the life span of an information system because of the high cost, organizational disruption, and risk of failure associated with the re-development or replacement of an information system. This research investigates the effects that various factors have on an information system's life span by understanding how the factors affect an information system's stability. The research builds on a previously developed two-stage model of information system change whereby an information system is either in a stable state of evolution in which the information system's functionality is evolving, or in a state of revolution, in which the information system is being replaced because it is not providing the functionality expected by its users. A case study surveyed a number of systems within one organization. The aim was to test whether a relationship existed between the base value of the volatility index (a measure of the stability of an information system) and certain system characteristics. Data relating to some 3000 user change requests covering 40 systems over a 10-year period were obtained. The following factors were hypothesized to have significant associations with the base value of the volatility index: language level (generation of language of construction), system size, system age, and the timing of changes applied to a system. Significant associations were found in the hypothesized directions except that the timing of user changes was not associated with any change in the value of the volatility index. Copyright (C) 2002 John Wiley & Sons, Ltd.
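The abstract does not reproduce the volatility index's definition, so the sketch below shows only the shape of the association tests described: a rank correlation between per-system base volatility values and a system characteristic such as size. All numbers are invented.

```python
from scipy.stats import spearmanr

# Hypothetical per-system data: base volatility index vs. system size.
# The paper's volatility index definition is not reproduced here.
base_volatility = [0.12, 0.30, 0.25, 0.08, 0.40, 0.22]
system_size_kloc = [40, 120, 90, 25, 200, 75]

rho, p_value = spearmanr(base_volatility, system_size_kloc)
print(f"Spearman rho={rho:.2f}, p={p_value:.3f}")  # one association test
```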