934 results for Graphical passwords
Abstract:
When performing data fusion, one often measures where targets were and then wishes to deduce where targets currently are. There has been recent research on the processing of such out-of-sequence data. This research has culminated in the development of a number of algorithms for solving the associated tracking problem. This paper reviews these different approaches in a common Bayesian framework and proposes an architecture that orthogonalises the data association and out-of-sequence problems such that any combination of solutions to these two problems can be used together. The emphasis is not on advocating one approach over another on the basis of computational expense, but rather on understanding the relationships among the algorithms so that any approximations made are explicit. Results for a multi-sensor scenario involving out-of-sequence data association are used to illustrate the utility of this approach in a specific context.
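The abstract does not reproduce the reviewed algorithms; purely as an illustration of the out-of-sequence problem it refers to, the hypothetical sketch below runs a 1D random-walk Kalman filter and handles a late-arriving measurement by re-filtering the stored history in timestamp order — the brute-force baseline that retrodiction-style algorithms aim to avoid. All names and numbers are invented.

```python
# Hypothetical illustration: a 1D Kalman filter that copes with an
# out-of-sequence measurement by reprocessing its stored history in
# timestamp order.  This is the naive baseline, not any of the
# algorithms reviewed in the paper.

def kalman_filter(measurements, q=0.1, r=1.0):
    """Run a 1D random-walk Kalman filter over (timestamp, value) pairs,
    processed in timestamp order; return the final (mean, variance)."""
    x, p, t_prev = 0.0, 1e6, None          # vague prior
    for t, z in sorted(measurements):      # process in time order
        dt = 0.0 if t_prev is None else (t - t_prev)
        p += q * dt                        # predict: random-walk process noise
        k = p / (p + r)                    # Kalman gain
        x += k * (z - x)                   # update with measurement z
        p *= (1.0 - k)
        t_prev = t
    return x, p

history = [(1.0, 0.9), (2.0, 2.1), (4.0, 4.2)]   # in-sequence data
print(kalman_filter(history))

# A measurement for t=3 arrives after t=4 has already been processed;
# the brute-force fix is to insert it and re-filter from scratch.
history.append((3.0, 3.0))
print(kalman_filter(history))
```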
Abstract:
Predictive performance evaluation is a fundamental issue in the design, development, and deployment of classification systems. As predictive performance evaluation is a multidimensional problem, single scalar summaries such as error rate, although quite convenient due to their simplicity, can seldom evaluate all the aspects that a complete and reliable evaluation must consider. For this reason, various graphical performance evaluation methods are increasingly drawing the attention of the machine learning, data mining, and pattern recognition communities. The main advantage of these methods resides in their ability to depict the trade-offs between evaluation aspects in a multidimensional space rather than reducing these aspects to an arbitrarily chosen (and often biased) single scalar measure. Furthermore, to select a suitable graphical method for a given task, it is crucial to identify its strengths and weaknesses. This paper surveys various graphical methods often used for predictive performance evaluation. By presenting these methods within the same framework, we hope this paper may shed some light on deciding which methods are more suitable to use in different situations.
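The abstract does not name individual plots; one widely used example of such a graphical method is the ROC curve, and the short sketch below (illustrative only, plain NumPy, invented toy scores) traces the true-positive/false-positive trade-off across all decision thresholds instead of collapsing it into a single scalar.

```python
import numpy as np

def roc_curve_points(labels, scores):
    """Return (false-positive rate, true-positive rate) pairs obtained by
    sweeping the decision threshold over all observed scores."""
    labels = np.asarray(labels, dtype=bool)
    scores = np.asarray(scores, dtype=float)
    order = np.argsort(-scores)                 # descending score
    labels = labels[order]
    tp = np.cumsum(labels)                      # true positives at each cut
    fp = np.cumsum(~labels)                     # false positives at each cut
    tpr = tp / labels.sum()
    fpr = fp / (~labels).sum()
    return np.concatenate(([0.0], fpr)), np.concatenate(([0.0], tpr))

# Toy scores from a hypothetical classifier.
y_true  = [1, 0, 1, 1, 0, 0, 1, 0]
y_score = [0.9, 0.8, 0.7, 0.6, 0.55, 0.4, 0.3, 0.1]
fpr, tpr = roc_curve_points(y_true, y_score)
print(list(zip(fpr.round(2), tpr.round(2))))    # points of the ROC curve
```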
Abstract:
“Biosim” is a simulation software package for modelling harvesting systems. The system can design a model for any logistic problem by combining several objects, so that the artificial system can show the performance of an individual model. The system also describes the efficiency of a particular model and its suitability for real-life application. Thus, when someone wishes to set up a logistic model such as a harvesting system in real life, he or she can be informed in advance about suitable positioning of the plants and factories, as well as about the minimum number of objects, the total time to complete the task, the total investment required, and the total amount of noise produced by the establishment. It produces an advance overview of the model. However, “Biosim” is quite slow: as it is an object-based system, it takes a long time to reach its decisions. The main task here is to modify the system so that it works faster than before. The main objective of this thesis is therefore to reduce the load of “Biosim” by modifying the original system and to increase its efficiency, so that the whole system is faster than the previous one and performs more efficiently when applied in real life. The concept is to separate the execution part of “Biosim” from its graphical engine and to run this separated portion on a third-generation language platform; C++ was chosen as this external platform. After completing the proposed system, results with different models were observed. The results show that, for any type of plant or field and for any number of trucks, the proposed system is faster than the original system, taking at least 15% less time than the original “Biosim”. The efficiency gain increases with the complexity of the model: the more complex the model, the more efficient the proposed system is compared with the original “Biosim”. Depending on the complexity of the model, the proposed system can be 56.53% faster than the original “Biosim”.
Abstract:
Point pattern matching in Euclidean spaces is one of the fundamental problems in Pattern Recognition, with applications ranging from Computer Vision to Computational Chemistry. Whenever two complex patterns are encoded by two sets of points identifying their key features, their comparison can be seen as a point pattern matching problem. This work proposes a single approach to both exact and inexact point set matching in Euclidean spaces of arbitrary dimension. In the case of exact matching, the approach is guaranteed to find an optimal solution. For inexact matching (when noise is involved), experimental results confirm the validity of the approach. We start by regarding point pattern matching as a weighted graph matching problem. We then formulate the weighted graph matching problem as one of Bayesian inference in a probabilistic graphical model. By exploiting fundamental constraints of patterns embedded in Euclidean spaces, we prove that for exact point set matching a simple graphical model is equivalent to the full model. Exact probabilistic inference in this simple model has polynomial time complexity with respect to the number of elements in the patterns to be matched. This gives rise to a technique that, for exact matching, provably finds a global optimum in polynomial time for any dimensionality of the underlying Euclidean space. Computational experiments comparing this technique with well-known probabilistic relaxation labeling show significant performance improvements for inexact matching. The proposed approach is significantly more robust under augmentation of the sizes of the involved patterns, and in the absence of noise the results are always perfect.
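The paper's polynomial-time inference method is not reproduced in the abstract; as a much cruder illustration of the underlying observation — that in a Euclidean space the pairwise distances of a point set strongly constrain valid correspondences — the sketch below brute-forces an exact match between two small point sets by checking distance consistency. It is exponential and only meant to make the problem statement concrete; the point sets are invented.

```python
from itertools import permutations
import math

def exact_match(P, Q, tol=1e-9):
    """Brute-force search for a correspondence between point sets P and Q
    that preserves all pairwise Euclidean distances (exact matching).
    Returns a tuple of indices into Q, or None.  Exponential -- purely
    illustrative, unlike the polynomial-time method of the paper."""
    def d(a, b):
        return math.dist(a, b)
    for perm in permutations(range(len(Q)), len(P)):
        if all(abs(d(P[i], P[j]) - d(Q[perm[i]], Q[perm[j]])) <= tol
               for i in range(len(P)) for j in range(i + 1, len(P))):
            return perm
    return None

# A triangle and a rigidly moved copy of it (translated by (5, 1)).
P = [(0, 0), (1, 0), (0, 2)]
Q = [(5, 3), (6, 1), (5, 1)]
print(exact_match(P, Q))   # (2, 1, 0): P[0] matches Q[2], P[1] matches Q[1], ...
```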
Abstract:
A constraint satisfaction problem is a classical artificial intelligence paradigm characterized by a set of variables (each with an associated domain of possible values) and a set of constraints that specify relations among subsets of these variables. Solutions are assignments of values to all variables that satisfy all the constraints. Many real-world problems may be modelled by means of constraints. The range of problems that can use this representation is very diverse and embraces areas like resource allocation, scheduling, timetabling and vehicle routing. Constraint programming is a form of declarative programming in the sense that, instead of specifying a sequence of steps to execute, it relies on properties of the solutions to be found, which are explicitly defined by constraints. The idea of constraint programming is to solve problems by stating constraints which must be satisfied by the solutions. Constraint programming is based on specialized constraint solvers that take advantage of the constraints to search for solutions. The success and popularity of complex problem-solving tools can be greatly enhanced by the availability of friendly user interfaces. User interfaces cover two fundamental areas: receiving information from the user and communicating it to the system, and getting information from the system and delivering it to the user. Despite their potential impact, adequate user interfaces are uncommon in constraint programming in general. The main goal of this project is to develop a graphical user interface that allows constraint satisfaction problems to be represented intuitively. The idea is to visually represent the variables of the problem, their domains and the problem constraints, and to enable the user to interact with an adequate constraint solver to process the constraints and compute the solutions. Moreover, the graphical interface should be capable of configuring the solver’s parameters and of presenting solutions in an appealing interactive way. As a proof of concept, the developed application – GraphicalConstraints – focuses on continuous constraint programming, which deals with real-valued variables and numerical constraints (equations and inequalities). RealPaver, a state-of-the-art solver for continuous domains, was used in the application. The graphical interface supports all stages of constraint processing, from the design of the constraint network to the presentation of the resulting feasible-space solutions as 2D or 3D boxes.
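Neither RealPaver nor the GraphicalConstraints interface is shown in the abstract; as a plain illustration of the variables/domains/constraints formulation it describes, the following sketch solves a tiny finite-domain CSP by backtracking. Continuous-domain solvers such as RealPaver instead operate on interval boxes, which this toy does not attempt; the variables and constraints are invented.

```python
def solve_csp(variables, domains, constraints, assignment=None):
    """Tiny backtracking solver.  `constraints` is a list of
    (scope, predicate) pairs; a predicate is checked as soon as all the
    variables in its scope have been assigned."""
    assignment = {} if assignment is None else assignment
    if len(assignment) == len(variables):
        return assignment
    var = next(v for v in variables if v not in assignment)
    for value in domains[var]:
        assignment[var] = value
        if all(pred(*(assignment[v] for v in scope))
               for scope, pred in constraints
               if all(v in assignment for v in scope)):
            result = solve_csp(variables, domains, constraints, assignment)
            if result is not None:
                return result
        del assignment[var]
    return None

# x, y, z in {1..4}, with x < y, y < z and x + z even.
variables = ["x", "y", "z"]
domains = {v: range(1, 5) for v in variables}
constraints = [(("x", "y"), lambda x, y: x < y),
               (("y", "z"), lambda y, z: y < z),
               (("x", "z"), lambda x, z: (x + z) % 2 == 0)]
print(solve_csp(variables, domains, constraints))  # {'x': 1, 'y': 2, 'z': 3}
```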
Abstract:
A Web service-based application is an architectural style in which a collection of Web services communicate with each other to execute processes. With the increasing popularity of Web service-based applications, and since the messages exchanged inside these applications can be complex, tools are needed to simplify the understanding of the interrelationships among Web services. This work presents a graphical representation of Web service-based applications and the mechanisms inserted between Web service requesters and providers to capture the information needed to represent an application. The major contribution of this paper is to discuss and use HTTP and SOAP information to produce a graphical representation, similar to a UML sequence diagram, of Web service-based applications.
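The paper's interception mechanism is not detailed in the abstract; purely as an illustration of the kind of information a captured SOAP message carries for such a diagram, the sketch below extracts the invoked operation name from a SOAP 1.1 envelope with the standard-library XML parser. The envelope, operation and namespace are generic examples, not taken from the paper.

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

# A generic example request; a monitoring proxy placed between requester
# and provider would capture messages like this one from the HTTP traffic.
request = """<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <GetQuote xmlns="http://example.org/stock">
      <symbol>ACME</symbol>
    </GetQuote>
  </soap:Body>
</soap:Envelope>"""

def soap_operation(xml_text):
    """Return the local name of the first child of the SOAP Body, which
    conventionally identifies the invoked operation."""
    body = ET.fromstring(xml_text).find(f"{{{SOAP_NS}}}Body")
    first = next(iter(body))
    return first.tag.split("}")[-1]          # strip the namespace prefix

print(soap_operation(request))               # -> GetQuote
```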
Abstract:
We studied the succession of small mammal species after fire in the cerrado (Neotropical savanna) of Central Brazil. Populations of small mammals were sampled with live-trapping techniques in a series of nine sites of different successional age, ranging from 1 to 26 years after fire. Ten species of small mammals were captured through all the seral stages of succession. Species richness ranged from two to seven species by seral stage. The species were arranged in different groups with respect to abundance along the succession: the first was composed of early successional species that peaked <2 years after fire (Calomys callosus, C. tener, Thalpomys cerradensis, Mus musculus, Thylamys velutinus); the second occurred or peaked 2-3 years after fire (Necromys lasiurus, Gracilinanus sp., Oryzomys scoth). Gracilinanus agilis peaked in the last seral stage. Species richness of small mammals showed an abrupt decrease from an average of four species immediately after fire to two species 5-26 years after the last fire. We propose a simple graphical model to explain the pattern of species richness of small mammals after fire in the cerrado. This model assumes that the occurrence of species of small mammals is determined by habitat selection behavior by each species along a habitat gradient. The habitat gradient is defined as the ratio of cover of herbaceous to woody vegetation. The replacement of species results from a trade-off in habitat requirements for the two habitat variables.
Abstract:
Recently the CP trajectory diagram was introduced to demonstrate the difference between the intrinsic CP-violating effects and those induced by matter in neutrino oscillation. In this Letter we introduce the T trajectory diagram. In these diagrams the probability for a given oscillation process is plotted against the probability for the CP- or the T-conjugate process, which forms an ellipse as the CP- or T-violating phase is varied. Since the CP- and the T-conjugate processes are related by CPT symmetry, even in the presence of matter, these two trajectory diagrams are closely related to each other and form a unified description of neutrino oscillations in matter. (C) 2002 Published by Elsevier B.V.
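The Letter's own formulas are not reproduced in the abstract; the following sketch only records, in generic notation (not the authors'), the standard structure that makes the trajectory an ellipse: for fixed mixing parameters, baseline and matter density, an appearance probability is linear in cos δ and sin δ.

```latex
% Illustrative parametrisation (generic notation, not the Letter's):
\[
  P(\delta) \;=\; A + B\cos\delta + C\sin\delta, \qquad
  P_{\mathrm{T}}(\delta) \;=\; A + B\cos\delta - C\sin\delta ,
\]
% where the T-conjugate probability is obtained with \delta \to -\delta
% for the same matter profile (the CP-conjugate case has analogous but
% different coefficients, since the matter potential changes sign).
% As \delta runs over [0, 2\pi), the point (P, P_{\mathrm{T}}) traces a
% closed curve that is an ellipse -- the trajectory plotted in the diagrams.
```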
Abstract:
The increase in computing power of microcomputers has stimulated the building of direct manipulation interfaces that allow graphical representation of Linear Programming (LP) models. This work discusses the components of such a graphical interface as the basis for a system to assist users in the process of formulating LP problems. In essence, it proposes a methodology that divides the modelling task into three stages: specification of the Data Model, the Conceptual Model and the LP Model. The need for Artificial Intelligence techniques in problem conceptualisation and in supporting the model formulation task is illustrated.
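The interface itself is not shown in the abstract; just to make the end product of the three modelling stages concrete, the sketch below states a toy LP model directly in code and solves it with SciPy's linprog. The product-mix numbers are invented for illustration and have no connection to the paper.

```python
from scipy.optimize import linprog

# Toy product-mix LP (numbers invented for illustration):
#   maximise  3*x1 + 5*x2
#   subject to    x1 + 2*x2 <= 14
#               3*x1 -   x2 >= 0
#                 x1 -   x2 <= 2
#                 x1, x2    >= 0
# linprog minimises, so the objective is negated, and ">=" rows are
# multiplied by -1 to fit the A_ub @ x <= b_ub convention.
c = [-3, -5]
A_ub = [[1, 2],
        [-3, 1],
        [1, -1]]
b_ub = [14, 0, 2]
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(res.x, -res.fun)   # optimal plan and maximised objective value
```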
Abstract:
In this article, we present quasiconformal mappings related to octonionic algebra. Based on the metric definition of quasiconformal mappings and using transformations of the type f(z) = z^n, we compare the graphical and analytic results. © 2009 Pushpa Publishing House.
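The article's constructions are not given in the abstract; for reference, the metric definition of quasiconformality it alludes to can be stated as follows (standard formulation, not the authors' notation).

```latex
% Metric definition of quasiconformality (standard form): for a
% homeomorphism f between domains, set
\[
  H_f(x) \;=\; \limsup_{r \to 0}
  \frac{\max_{|y - x| = r} |f(y) - f(x)|}
       {\min_{|y - x| = r} |f(y) - f(x)|},
\]
% and call f quasiconformal when H_f is bounded on the domain; maps of
% the type f(z) = z^n can then be examined against this bound.
```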
Abstract:
Research on the micro-structural characterization of metal-matrix composites uses X-ray computed tomography to collect information about the interior features of the samples, in order to elucidate their exhibited properties. The tomographic raw data needs several steps of computational processing in order to eliminate noise and interference. Our experience with a program (Tritom) that handles these questions has shown that in some cases the processing steps take a very long time, and that it is not easy for a Materials Science specialist to interact with Tritom in order to define the most adequate parameter values and the proper sequence of the available processing steps. To ease the use of Tritom, a system was built that addresses the aspects described above and is based on the OpenDX visualization system. The OpenDX visualization facilities are a great benefit to Tritom, and its visual programming environment allows a sequence of processing steps to be defined easily, thus meeting the requirement of easy use by non-specialists in Computer Science. The possibility of incorporating external modules into a visual OpenDX program also allows the researchers to tackle the long execution time of some processing steps. The longer processing steps of Tritom have been parallelized on two different types of hardware architecture (message-passing and shared-memory); the corresponding parallel programs can easily be incorporated into a sequence of processing steps defined in an OpenDX program. The benefits of our system are illustrated through an example where the tool is applied in the study of the sensitivity to crushing – and the implications thereof – of the reinforcements used in a functionally graded syntactic metallic foam.
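Neither Tritom's code nor the OpenDX modules are available from the abstract; purely as a generic, hypothetical sketch of the shared-memory parallelization idea, the snippet below distributes an expensive per-slice filtering step of a tomographic volume over worker processes. The filter and array sizes are invented for illustration.

```python
import numpy as np
from multiprocessing import Pool

def denoise_slice(slice_2d):
    """Placeholder for one expensive per-slice processing step:
    a crude 3x3 mean filter implemented with NumPy only."""
    padded = np.pad(slice_2d, 1, mode="edge")
    out = np.zeros_like(slice_2d, dtype=float)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out += padded[1 + dy : padded.shape[0] - 1 + dy,
                          1 + dx : padded.shape[1] - 1 + dx]
    return out / 9.0

if __name__ == "__main__":
    volume = np.random.rand(64, 128, 128)       # synthetic tomographic volume
    with Pool() as pool:                        # one worker process per core
        filtered = np.stack(pool.map(denoise_slice, volume))
    print(filtered.shape)                       # (64, 128, 128)
```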
Abstract:
The photons scattered by the Compton effect can be used to characterize the physical properties of a given sample due to the influence that the electron density exerts on the number of scattered photons. However, scattering measurements involve experimental and physical factors that must be carefully analyzed to predict uncertainty in the detection of Compton photons. This paper presents a method for the optimization of the geometrical parameters of an experimental arrangement for Compton scattering analysis, based on its relations with the energy and incident flux of the X-ray photons. In addition, the tool enables the statistical analysis of the information displayed and includes the coefficient of variation (CV) measurement for a comparative evaluation of the physical parameters of the model established for the simulation. (C) 2012 Elsevier B.V. All rights reserved.
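The optimisation method itself is not reproduced in the abstract; the sketch below only illustrates the coefficient-of-variation figure of merit it mentions, applied to simulated Poisson photon counts, where the CV shrinks roughly as one over the square root of the mean count. The count levels are arbitrary examples, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def coefficient_of_variation(samples):
    """CV = standard deviation / mean, the relative spread of the counts."""
    samples = np.asarray(samples, dtype=float)
    return samples.std(ddof=1) / samples.mean()

# Simulated Compton-photon counts for two hypothetical geometries: one
# detecting ~100 photons per measurement, one detecting ~10000.
weak   = rng.poisson(lam=100,   size=1000)
strong = rng.poisson(lam=10000, size=1000)

print(f"CV (mean ~100):   {coefficient_of_variation(weak):.3f}")    # ~0.10
print(f"CV (mean ~10000): {coefficient_of_variation(strong):.3f}")  # ~0.01
```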