888 results for nonlocal theories and models
Abstract:
Similar to classic Signal Detection Theory (SDT), the recent optimal Binary Signal Detection Theory (BSDT) and the Neural Network Assembly Memory Model (NNAMM) based on it can successfully reproduce Receiver Operating Characteristic (ROC) curves, although the BSDT/NNAMM parameters (cue intensity and neuron threshold) and the classic SDT parameters (perception distance and response bias) are essentially different. In the present work, the BSDT/NNAMM optimal likelihood and posterior probabilities are analyzed analytically and used to generate ROCs and modified (posterior) mROCs, as well as the optimal overall likelihood and posterior. It is shown that, for the description of basic discrimination experiments in psychophysics within the BSDT, a ‘neural space’ can be introduced in which sensory stimuli are represented as neural codes and decision processes are defined; that the BSDT’s isobias curves can simultaneously be interpreted as universal psychometric functions satisfying the Neyman-Pearson objective; that the just noticeable difference (jnd) can be defined and interpreted as an atom of experience; and that near-neutral bias values are observers’ natural choice. The uniformity (or no-priming) hypothesis, concerning the ‘in-mind’ distribution of false-alarm probabilities during ROC or overall probability estimations, is introduced. The BSDT’s and classic SDT’s sensitivity, bias, and their ROC and decision spaces are compared.
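The classic SDT baseline that the abstract contrasts with BSDT can be sketched in a few lines. This is a minimal illustration of the equal-variance Gaussian model only (sensitivity d′ and criterion c); the BSDT/NNAMM parameters discussed in the paper are different, and the function names here are illustrative:

```python
import math

def normal_cdf(x):
    # Standard normal CDF expressed via the error function.
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def sdt_roc_point(d_prime, criterion):
    # Equal-variance SDT: hit and false-alarm rates for sensitivity d'
    # and response criterion c (criterion measured from the midpoint).
    hit = 1.0 - normal_cdf(criterion - d_prime / 2.0)
    fa = 1.0 - normal_cdf(criterion + d_prime / 2.0)
    return fa, hit

# Sweeping the criterion at fixed sensitivity traces one ROC curve.
roc = [sdt_roc_point(1.0, c / 2.0) for c in range(-6, 7)]
```

For any positive d′ the hit rate stays above the false-alarm rate, so the curve lies above the chance diagonal, as an ROC should.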
Abstract:
Problems of intellectualization of the man-machine interface and methods of self-organization for network control in multi-agent infotelecommunication systems are discussed. An architecture and principles for the construction of network and neural agents for new-generation telecommunication systems are suggested. Methods for adaptive and multi-agent routing of information flows on requests from external agents (users of global telecommunication systems and computer networks) are described.
Abstract:
A Quantified Autoepistemic Logic is axiomatized in a monotonic Modal Quantificational Logic whose modal laws are slightly stronger than S5. This Quantified Autoepistemic Logic obeys all the laws of First Order Logic, and its L predicate obeys the laws of S5 Modal Logic in every fixed-point. It is proven that this logic has a kernel not containing L such that L holds for a sentence if and only if that sentence is in the kernel. This result is important because it shows that L is superfluous, thereby allowing the original equivalence to be simplified by eliminating L from it. It is also shown that the kernel of Quantified Autoepistemic Logic is a generalization of Quantified Reflective Logic, with which it coincides in the propositional case.
Abstract:
The paper considers the use and the information support of the most important mathematical Application Packages (AP), such as Maple, Matlab, Mathcad, Mathematica, Statistica and SPSS, which are the ones most used during Calculus tuition in universities. The main features of the packages and the information support on the producers' sites are outlined, as well as their capacity for work on the Internet, together with educational sites and literature related to them. The most important resources of the TeX system for the preparation of mathematical articles and documents are presented.
Abstract:
The paper considers some possible neuronal mechanisms that do not contradict biological data. They are described in terms of the notion of an elementary sensorium discussed in the authors' previous works. Such mechanisms resolve problems of two large classes: those where identification mechanisms are used, and those where sensory learning mechanisms are applied along with identification.
Abstract:
This work explores and motivates perspectives and research issues related to the application of automated planning technologies in support of innovative web applications. The target for the technology transfer, i.e. the web and, in a broader sense, the new Information Technologies (IT), is one of the fastest-changing, most rapidly evolving and hottest areas of current computer science. Many sub-areas in this field could benefit from Planning and Scheduling (P&S) technologies, and in some cases technology transfer has already started. This paper considers and explores a set of topics, guidelines and objectives for implementing the technology transfer, as well as new challenges, requirements and research issues for planning which emerge from the web and the IT industry. Sample scenarios are depicted to clarify the potential applications and limits of current planning technology. Finally, we point out some new P&S research challenges which must be addressed to meet more advanced applicative goals.
Abstract:
The concept of knowledge is central to solving various problems of data mining and pattern recognition in finite spaces of Boolean or multi-valued attributes. A special form of knowledge representation, called implicative regularities, is proposed for application in two powerful tools of modern logic: inductive inference and deductive inference. The first is used for extracting knowledge from data; the second is applied when the knowledge is used to calculate goal attribute values. A set of efficient algorithms dealing with Boolean functions and finite predicates represented by logical vectors and matrices was developed for this purpose.
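The inductive step described above can be illustrated with a deliberately small sketch: scanning a Boolean data table for pairwise implications "a → b" that hold in every row. This is only an assumed toy instance of extracting implicative regularities, not the paper's actual algorithms, which operate on logical vectors and matrices:

```python
from itertools import combinations

def mine_implications(rows, attrs):
    # Inductive inference, minimal form: keep a rule "a -> b" only if
    # no row violates it (i.e., a is true but b is false).
    found = []
    for a, b in combinations(range(len(attrs)), 2):
        if all((not r[a]) or r[b] for r in rows):
            found.append((attrs[a], attrs[b]))
        if all((not r[b]) or r[a] for r in rows):
            found.append((attrs[b], attrs[a]))
    return found

data = [
    (1, 1, 0),
    (0, 1, 1),
    (0, 0, 1),
]
rules = mine_implications(data, ["x", "y", "z"])  # here only "x -> y" survives
```

The deductive direction would then apply the surviving rules to compute unknown goal-attribute values from known ones.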
Abstract:
* The research work reviewed in this paper has been carried out in the context of the Russian Foundation for Basic Research funded project “Adaptable Intelligent Interfaces Research and Development for Distance Learning Systems” (grant N 02-01-81019). The authors wish to acknowledge the co-operation with the Byelorussian partners of this project.
Abstract:
Certain theoretical and methodological problems of designing real-time dynamical expert systems, which belong to the class of the most complex integrated expert systems, are discussed. Primary attention is given to the problems of designing subsystems for modeling the external environment in the case where the environment is represented by complex engineering systems. A specific approach to designing simulation models for complex engineering systems is proposed and examples of the application of this approach based on the G2 (Gensym Corp.) tool system are described.
Abstract:
* The presented work was discussed at KDS-2003. It has been corrected in compliance with the remarks and requests of the participants.
Abstract:
The article reviews the problems of mutual adaptation of humans and the computer environment. Features of image-intuitive and physical-mathematical modes of perception and thinking are investigated. The problems of choosing means and methods of differential education in the computerized society are considered.
Abstract:
In this paper we consider a technology for developing ontologies of new domains. We discuss the main principles of ontology development, automatic methods for extracting terms from domain texts, and the types of ontology relations.
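A very common baseline for the automatic term-extraction step mentioned above is simple frequency filtering over the domain texts. The sketch below assumes that baseline only; the paper's own extraction methods are not specified here, and all names are illustrative:

```python
from collections import Counter

def extract_candidate_terms(texts, stopwords, min_freq=2):
    # Count non-stopword tokens across the domain texts and keep those
    # frequent enough to be candidate ontology concepts.
    counts = Counter()
    for text in texts:
        for token in text.lower().split():
            word = token.strip(".,;:()")
            if word and word not in stopwords:
                counts[word] += 1
    return [w for w, n in counts.items() if n >= min_freq]

docs = [
    "An ontology relates concepts of the domain.",
    "Domain concepts and relations form the ontology.",
]
terms = extract_candidate_terms(docs, {"an", "the", "of", "and", "form"})
```

Real systems would add linguistic filters (part-of-speech patterns, multi-word terms) on top of such counts before proposing ontology relations between the candidates.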
Abstract:
On the basis of the convolutional (Hamming) version of the recent Neural Network Assembly Memory Model (NNAMM) for an intact two-layer autoassociative Hopfield network, optimal receiver operating characteristics (ROCs) have been derived analytically. A method for explicitly taking into account a priori probabilities of alternative hypotheses on the structure of the information initiating memory-trace retrieval is introduced, together with modified ROCs (mROCs: a posteriori probabilities of correct recall vs. false-alarm probability). The comparison of empirical and calculated ROCs (or mROCs) demonstrates that they coincide quantitatively, and in this way the intensities of cues used in the corresponding experiments may be estimated. It has been found that basic ROC properties, which are among the experimental findings underpinning dual-process models of recognition memory, can be explained within our one-factor NNAMM.
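The role of the a priori probabilities in forming an mROC point can be illustrated with plain Bayes' rule. This is a generic sketch of a posterior probability of correct recall given a "yes" response, not the paper's derivation; the function and parameter names are assumptions:

```python
def mroc_point(prior, hit_rate, fa_rate):
    # Posterior probability that a "yes" response reflects correct recall,
    # given a prior probability that the cue actually contains the trace.
    yes_prob = prior * hit_rate + (1.0 - prior) * fa_rate
    return prior * hit_rate / yes_prob

# With an uninformative prior (0.5) the posterior reduces to hit/(hit + fa).
p = mroc_point(0.5, 0.8, 0.2)
```

Sweeping the prior shows why a priori probabilities must enter explicitly: the same (hit, false-alarm) ROC point maps to very different a posteriori probabilities of correct recall.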
Abstract:
The paper discusses facilities of computer systems for editing scientific and technical texts, which partially automate the functions of a human editor and thus help the writer to improve text quality. Two experimental systems, LINAR and CONUT, developed in the 1990s to control the quality of Russian scientific and technical texts, are briefly described, and general principles for designing more powerful editing systems are pointed out. Features of an editing system now under development are outlined, primarily its underlying linguistic knowledge base and the procedures controlling the text.
Abstract:
In this paper RDPPlan, a model for planning with quantitative resources specified as numerical intervals, is presented. Nearly all existing models of planning with resources require exact values to be specified for updating the resources modified by action execution. In other words, these models cannot deal with more realistic situations in which the resource quantities are not completely known but are bounded by intervals. The RDPPlan model allows domains better tailored to the real world to be managed, where preconditions and effects over quantitative resources can be specified by intervals of values; in addition, mixed logical/quantitative and purely numerical goals can be posed. RDPPlan is based on non-directional search over a planning graph, like DPPlan, from which it derives; it uses propagation rules which have been appropriately extended to the management of resource intervals. The propagation rules extended with resources must preserve invariant properties over the planning graph, which have been proven by the authors and guarantee the correctness of the approach. An implementation of the RDPPlan model is described, with search strategies specifically developed for interval resources.
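The core arithmetic behind interval-valued resources can be sketched briefly. This is a minimal illustration of interval effects and preconditions under standard interval-arithmetic assumptions, not RDPPlan's actual propagation rules; all names and values are hypothetical:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Interval:
    # A resource quantity known only up to bounds [lo, hi].
    lo: float
    hi: float

    def add(self, other):
        # Effect of an action whose consumption/production is itself
        # only bounded: interval addition of the two uncertainty ranges.
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def satisfies(self, required):
        # A precondition "resource within required" holds for certain
        # only if every possible value lies inside the required bounds.
        return required.lo <= self.lo and self.hi <= required.hi

fuel = Interval(4.0, 6.0)
after_move = fuel.add(Interval(-3.0, -2.0))   # action consumes 2..3 units
ok = after_move.satisfies(Interval(0.0, 10.0))
```

Propagating such intervals through a planning graph, rather than exact values, is what lets the planner reason about incompletely known resource quantities.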