45 resultados para Extensible Pluggable Architecture Hydra Data
Resumo:
There are many situations where input feature vectors are incomplete and methods to tackle the problem have been studied for a long time. A commonly used procedure is to replace each missing value with an imputation. This paper presents a method to perform categorical missing data imputation from numerical and categorical variables. The imputations are based on Simpson’s fuzzy min-max neural networks where the input variables for learning and classification are just numerical. The proposed method extends the input to categorical variables by introducing new fuzzy sets, a new operation and a new architecture. The procedure is tested and compared with others using opinion poll data.
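The abstract does not spell out the categorical extension, but the numerical core it builds on, Simpson's fuzzy min-max hyperbox membership, is well known and can be sketched as follows (function and variable names are ours, not the paper's):

```python
import numpy as np

def ramp(r, gamma):
    # Two-sided ramp threshold: 0 inside the box, rising linearly outside.
    return np.clip(r * gamma, 0.0, 1.0)

def membership(x, v, w, gamma=1.0):
    """Membership of point x in the hyperbox with min point v and max
    point w (after Simpson's fuzzy min-max networks). Returns 1.0 for
    points inside the box, decaying toward 0 with distance outside it."""
    x, v, w = (np.asarray(a, dtype=float) for a in (x, v, w))
    return np.mean(1.0 - ramp(x - w, gamma) - ramp(v - x, gamma))
```

Learning then amounts to expanding or creating hyperboxes per class; the paper's contribution is extending this purely numerical scheme with new fuzzy sets and operations so that categorical inputs can participate as well.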
Resumo:
An approximate analytic model of a shared memory multiprocessor with a Cache Only Memory Architecture (COMA), the bus-based Data Diffusion Machine (DDM), is presented and validated. It describes the timing and interference in the system as a function of the hardware, the protocols, the topology and the workload. Model results have been compared to results from an independent simulator. The comparison shows good model accuracy, especially for non-saturated systems, where the errors in response times and device utilizations are independent of the number of processors and remain below 10% in 90% of the simulations. Therefore, the model can be used as an average performance prediction tool that avoids expensive simulations in the design of systems with many processors.
Resumo:
The Web has witnessed an enormous growth in the amount of semantic information published in recent years. This growth has been stimulated to a large extent by the emergence of Linked Data. Although this brings us a big step closer to the vision of a Semantic Web, it also raises new issues such as the need for dealing with information expressed in different natural languages. Indeed, although the Web of Data can contain any kind of information in any language, it still lacks explicit mechanisms to automatically reconcile such information when it is expressed in different languages. This leads to situations in which data expressed in a certain language is not easily accessible to speakers of other languages. The Web of Data shows the potential for being extended to a truly multilingual web, as vocabularies and data can be published in a language-independent fashion, while associated language-dependent (linguistic) information supporting access across languages can be stored separately. In this sense, the multilingual Web of Data can be realized in our view as a layer of services and resources on top of the existing Linked Data infrastructure adding i) linguistic information for data and vocabularies in different languages, ii) mappings between data with labels in different languages, and iii) services to dynamically access and traverse Linked Data across different languages. In this article we present this vision of a multilingual Web of Data. We discuss the challenges that need to be addressed to make this vision come true, as well as the role that techniques such as ontology localization, ontology mapping, and cross-lingual ontology-based information access and presentation will play in achieving it. Further, we propose an initial architecture and describe a roadmap that can provide a basis for the implementation of this vision.
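The layered design the article proposes, language-independent data with a separate linguistic layer, can be illustrated with a minimal sketch (plain Python dictionaries standing in for a real triple store; all resource names are hypothetical):

```python
# Language-independent structural data about a resource.
data = {
    "ex:Seville": {"rdf:type": "ex:City"},
}

# Separately stored language-dependent labels, keyed by language tag,
# as the proposed linguistic-information layer would hold them.
labels = {
    "ex:Seville": {"en": "Seville", "es": "Sevilla", "fr": "Séville"},
}

def label_for(resource, lang, fallback="en"):
    """Resolve a display label across languages, as a cross-lingual
    access service on top of Linked Data might do."""
    entry = labels.get(resource, {})
    return entry.get(lang) or entry.get(fallback)
```

In RDF itself the same idea is expressed with language-tagged literals (e.g. `"Sevilla"@es` on an `rdfs:label` property); the point of the sketch is only that structure and labels can live in separate layers.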
Resumo:
The history of the construction of the Gothic cathedrals is the history of the search for light. This almost metaphysical statement reflects a reality accepted by all historians both of ancient architecture and of the other arts. Light in the Gothic period has been described under multiple approaches such as its symbolic, chromatic and even mystical character. However, in the study of Gothic light, no references exist to it as a physical reality that is quantifiable, qualifiable and therefore classifiable. This dissertation deals with the concept of Gothic light from a new perspective. With a new analytical method, it shows that Gothic lighting is quantifiable and can be classified regarding quality.
To this end, the lighting of a selection of six sample buildings is analyzed in depth: the cathedrals of Gerona, Toledo, Seville and León, the basilica of Santa María del Mar and the Sainte Chapelle in Paris. "In situ" lighting measurements are collected and compared with lighting data obtained from a sunlight-simulation program applied to three-dimensional models of the original Gothic designs. The comprehensive analysis of the samples, fed into the analytical method described, first allows determining a set of previously undescribed qualities that identify the light of Gothic spaces according to new parameters such as intensity, expressiveness, trajectory, distortion and color. It also describes the determinant factors that modulate each of the qualities and in what proportion each of them does so. Once the qualities and the factors that define them have been established, the dissertation sets the ranges within which the different qualities move, which make up the final classification according to "types of light quality". In addition, this work proposes an abbreviated procedure for approaching the reality of Gothic lighting through mathematical formulae that relate the geometric factors identified and described in the study to the luminous result of the space with regard to the two most important of those qualities: intensity and expressiveness. Thanks to this method and its abbreviated procedure, the classification can be extended to the other Spanish and European Gothic cathedrals and opens the way to new classifications of historic buildings from different eras, starting an exciting road ahead in the recovery of the "original light". This classification and its qualities may in turn be used as tools for understanding a determinant factor when describing any Gothic space.
Its contribution is intended to be a new conditioning factor to keep in mind in the future, helping to understand and respect, in possible interventions on the architectural heritage, what was originally the main driver of the architectural project and which today is undervalued simply through lack of knowledge: its light.
Resumo:
The term "Logic Programming" refers to a variety of computer languages and execution models which are based on the traditional concept of Symbolic Logic. The expressive power of these languages promises to be of great assistance in facing the programming challenges of present and future symbolic processing applications in Artificial Intelligence, Knowledge-based systems, and many other areas of computing. The sequential execution speed of logic programs has been greatly improved since the advent of the first interpreters. However, higher inference speeds are still required in order to meet the demands of applications such as those contemplated for next generation computer systems. The execution of logic programs in parallel is currently considered a promising strategy for attaining such inference speeds. Logic Programming in turn appears to be a suitable programming paradigm for parallel architectures because of the many opportunities for parallel execution present in the implementation of logic programs. This dissertation presents an efficient parallel execution model for logic programs. The model is described from the source language level down to an "Abstract Machine" level suitable for direct implementation on existing parallel systems or for the design of special purpose parallel architectures. Few assumptions are made at the source language level, and therefore the techniques developed and the general Abstract Machine design are applicable to a variety of logic (and also functional) languages. These techniques offer efficient solutions to several areas of parallel Logic Programming implementation previously considered problematic or a source of considerable overhead, such as the detection and handling of variable binding conflicts in AND-Parallelism, the specification of control and management of the execution tree, the treatment of distributed backtracking, and goal scheduling and memory management issues.
A parallel Abstract Machine design is offered, specifying data areas, operation, and a suitable instruction set. This design is based on extending to a parallel environment the techniques introduced by the Warren Abstract Machine, which have already made very fast and space-efficient sequential systems a reality. Therefore, the model herein presented is capable of retaining sequential execution speed similar to that of high performance sequential systems, while extracting additional gains in speed by efficiently implementing parallel execution. These claims are supported by simulations of the Abstract Machine on sample programs.
Resumo:
The conformance of semantic technologies has to be systematically evaluated to measure and verify the real adherence of these technologies to the Semantic Web standards. Current evaluations of semantic technology conformance are not exhaustive enough and do not directly cover user requirements and use scenarios, which raises the need for a simple, extensible and parameterizable method to generate test data for such evaluations. To address this need, this paper presents a keyword-driven approach for generating ontology language conformance test data that can be used to evaluate semantic technologies, details the definition of a test suite for evaluating OWL DL conformance using this approach, and describes the use and extension of this test suite during the evaluation of some tools.
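The general shape of keyword-driven test generation can be sketched as follows. This is an illustrative reconstruction, not the paper's actual keyword inventory: each keyword maps to a template that emits an OWL axiom (here in Turtle-like syntax), and a test case is simply a sequence of keywords with arguments:

```python
# Hypothetical keyword-to-template table for generating conformance test data.
TEMPLATES = {
    "class":    lambda name:     f"ex:{name} rdf:type owl:Class .",
    "subclass": lambda sub, sup: f"ex:{sub} rdfs:subClassOf ex:{sup} .",
    "disjoint": lambda a, b:     f"ex:{a} owl:disjointWith ex:{b} .",
}

def generate(test_case):
    """Expand a keyword-based test case into ontology test data."""
    return "\n".join(TEMPLATES[kw](*args) for kw, *args in test_case)

# A test case is declared as data, so non-programmers can extend the suite.
case = [("class", "Animal"), ("class", "Plant"),
        ("disjoint", "Animal", "Plant")]
```

Extensibility then comes from adding keywords to the table, and parameterization from the arguments each keyword takes.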
Resumo:
This paper proposes an architecture, based on statistical machine translation, for developing the text normalization module of a text-to-speech conversion system. The main target is to generate a language-independent text normalization module, based on data and flexible enough to deal with all situations presented in this task. The proposed architecture is composed of three main modules: a tokenizer module for splitting the text input into a token graph (tokenization), a phrase-based translation module (token translation) and a post-processing module for removing some tokens. This paper presents initial experiments for numbers and abbreviations. The very good results obtained validate the proposed architecture.
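The first stage of the pipeline, classifying raw text into tokens before translation, can be sketched minimally as below. The token classes and patterns are illustrative assumptions, not the paper's actual inventory, and a real tokenizer would emit a graph of alternative segmentations rather than a flat list:

```python
import re

# Ordered list of (class, pattern): first full match wins.
TOKEN_CLASSES = [
    ("NUMBER", re.compile(r"\d+")),
    ("ABBREV", re.compile(r"[A-Za-z]+\.")),
    ("WORD",   re.compile(r"[A-Za-z]+")),
]

def tokenize(text):
    """Split whitespace-separated input into classified tokens; the class
    tells the downstream translation module how to expand each token."""
    tokens = []
    for piece in text.split():
        for cls, pat in TOKEN_CLASSES:
            if pat.fullmatch(piece):
                tokens.append((cls, piece))
                break
        else:
            tokens.append(("OTHER", piece))
    return tokens
```

Tokens classed as NUMBER or ABBREV would then be routed to the phrase-based translation module for expansion into words.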
Resumo:
In this work a novel wake-up architecture for wireless sensor nodes based on ultra-low-power FPGAs is presented. A simple wake-up messaging mechanism for data-gathering applications is proposed. The main goal of this work is to evaluate the use of low-power configurable devices for frame decoding and data control, taking advantage of their speed, flexibility and low power consumption compared with traditional approaches based on ASICs or microcontrollers. A test bed based on infrared communications has been built to validate the messaging mechanism and the processing architecture.
Resumo:
The definition of an agent architecture at the knowledge level places the emphasis on the knowledge role played by the data interchanged between the agent components and makes this data interchange explicit, which eases the reuse of these knowledge structures independently of the implementation. This article defines a generic task model of an agent architecture and refines some of these tasks using inference diagrams. Finally, an operationalisation of this conceptual model using the rule-oriented language Jess is shown.
Resumo:
The use of semantic and Linked Data technologies for Enterprise Application Integration (EAI) has been increasing in recent years. Linked Data and Semantic Web technologies such as the Resource Description Framework (RDF) data model provide several key advantages over the current de facto Web Service and XML based integration approaches. Representing the data in the more versatile RDF model using ontologies avoids complex schema transformations, makes data more accessible through Web standards, and prevents the formation of data silos. These three benefits give Linked Data-based EAI an edge. However, work still has to be done so that these technologies can cope with the particularities of EAI scenarios in terms such as data control, ownership, consistency, and accuracy. The first part of the paper introduces Enterprise Application Integration using Linked Data and the requirements EAI imposes on Linked Data technologies, focusing on one of the problems that arise in this scenario, the coreference problem, and presents a coreference service that supports the use of Linked Data in EAI systems. The proposed solution introduces the use of a context that aggregates a set of related identities and mappings from the identities to different resources that reside in distinct applications and provide different views or aspects of the same entity. A detailed architecture of the Coreference Service is presented, explaining how it can be used to manage the contexts, identities, resources, and applications to which they relate. The paper shows how the proposed service can be used in an EAI scenario through an example involving a dashboard that integrates data from different systems, together with the proposed workflow for registering and resolving identities.
As most enterprise applications are driven by business processes and involve legacy data, the proposed approach can be easily incorporated into enterprise applications.
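The core idea of the context described above, one entity, many application-specific identities, can be sketched as follows (class and method names are ours, and the URIs are hypothetical, not from the paper):

```python
class Context:
    """Aggregates the identities that different applications use for the
    same real-world entity, mapping each application to its resource."""

    def __init__(self, entity):
        self.entity = entity
        self.identities = {}  # application name -> resource URI

    def register(self, application, resource):
        self.identities[application] = resource

    def resolve(self, application):
        """Return the resource that `application` uses for this entity,
        or None if the application has no registered identity."""
        return self.identities.get(application)

# A dashboard integrating CRM and billing data would resolve the same
# customer through one shared context:
ctx = Context("customer-42")
ctx.register("crm", "http://crm.example/people/42")
ctx.register("billing", "http://billing.example/accounts/A42")
```

A coreference service would manage many such contexts and expose registration and resolution over the network; this sketch only shows the data structure at its center.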
Resumo:
Cloud computing and, more particularly, private IaaS is seen as a mature technology with a myriad of solutions to choose from. However, this disparity of solutions and products has instilled in potential adopters the fear of vendor and data lock-in. Several competing and incompatible interfaces and management styles have given even more voice to these fears. On top of this, cloud users might want to work with several solutions at the same time, an integration that is difficult to achieve in practice. In this paper, we propose a management architecture that tackles these problems: it offers a common way of managing several cloud solutions and an interface that can be tailored to the needs of the user. This management architecture is designed in a modular way, using a generic information model. We have validated our approach by implementing the components needed for this architecture to support a sample private IaaS solution: OpenStack.
Resumo:
The electrical power distribution and commercialization scenario is evolving worldwide, and electricity companies, faced with the challenge of new information requirements, are demanding IT solutions to deal with the smart monitoring of power networks. Two main challenges arise from data management and smart monitoring of power networks: real-time data acquisition and big data processing over short time periods. We present a solution in the form of a system architecture that addresses the real-time issues and has the capacity for big data management.
Resumo:
The recent continuous development of Cooperative ITS has resulted in several initiatives which focus on different parts of the Cooperative environment landscape. The FOTsis project focuses on the infrastructure side of the Cooperative environment and will deploy and test 7 services designed to maximise the benefits of integrating the road operator and infrastructure-based information providers into the ITS environment. This integration can take place in any of the stages of data collection, processing and actuation of the services, but can also support and trigger external tasks such as operations of the emergency response entities. This paper describes the current status of the project and focuses on the specification of the architecture supporting the services tested: references, a brief outline of the requirements definition, and the FOTsis architecture proposal, with some conclusions about the architecture tests conducted. The outlook on the project's next steps is given in the last section of the paper.
Resumo:
This paper presents a new method to extract knowledge from existing data sets, that is, to extract symbolic rules using the weights of an Artificial Neural Network. The method has been applied to a neural network with a special architecture named the Enhanced Neural Network (ENN). This architecture improves on the results that have been obtained with the multilayer perceptron (MLP). The relationship among the knowledge stored in the weights, the performance of the network and the newly implemented algorithm to acquire rules from the weights is explained. The method itself provides a model to follow for knowledge acquisition with the ENN.
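The general flavor of weight-based rule extraction can be illustrated with a deliberately simple sketch. This is not the paper's ENN algorithm: here, inputs whose absolute weight exceeds a threshold become antecedents of a symbolic rule for the corresponding output, with the weight's sign choosing the linguistic value:

```python
import numpy as np

def extract_rules(weights, feature_names, class_names, threshold=0.5):
    """Turn a weight matrix (one row per output class) into IF-THEN
    rules, keeping only strongly weighted inputs as antecedents."""
    rules = []
    for c, row in enumerate(np.atleast_2d(weights)):
        antecedents = [
            f"{feature_names[i]} is {'HIGH' if w > 0 else 'LOW'}"
            for i, w in enumerate(row) if abs(w) > threshold
        ]
        if antecedents:
            rules.append(f"IF {' AND '.join(antecedents)} THEN {class_names[c]}")
    return rules

# Hypothetical trained weights: "height" is pruned as irrelevant (|w| small).
W = np.array([[1.2, -0.8, 0.1]])
rules = extract_rules(W, ["age", "income", "height"], ["buyer"])
```

The paper's actual algorithm also exploits the ENN's particular architecture and relates the extracted rules to network performance; the sketch only conveys the idea of reading symbolic structure off the weights.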
Resumo:
Among the main features expected of the Smart City, one should be an improved energy management system, in order to benefit from a healthier relation with the environment, minimize energy expenses, and offer dynamic market opportunities. A Smart Grid seems like a very suitable infrastructure for this objective, as it guarantees a two-way information flow that will provide the means for energy management enhancement. However, to obtain all the required information, another entity must manage all the devices required to gather the data. What is more, this entity must consider the lifespan of the devices within the Smart Grid (when they are turned on and off, or when new appliances are added) along with the services that the devices are able to provide. This paper puts forward SMArc, short for semantic middleware architecture, as a middleware proposal for the Smart Grid, so as to process the collected data and use it to insulate applications from the complexity of the metering facilities and guarantee that any change at these lower levels will be taken into account in future actions of the system.