11 results for Systems integration
at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
Toni Prieto, IT technician at the UPC Library and Documentation Service (SBD), in his presentation 'Experiencias de interoperabilidad entre CRIS y repositorios en Catalunya', described the integration of the UPCommons repository with DRAC (Descriptor de la Recerca i l'Activitat Acadèmica), the UPC's CRIS. The result of this integration is a combined CRIS/IR deposit scheme in two phases, submission and review, in which metadata are entered in DRAC (then transferred, validated and enriched where appropriate) and the associated full-text file is deposited in UPCommons. The integration of GIR (Gestió Integral de la Recerca, based on Universitas XXI Investigación) with the O2 repository at the UOC works in a similar way, allowing the handle identifier of an item in O2 to be assigned to a reference in GIR. Both systems, DRAC at the UPC and GIR at the UOC, are integrated with the CVN project for generating standardised CVs. Further CRIS/IR integration experiences currently under way at the Universitat de Barcelona and the Universitat Pompeu Fabra were also mentioned, and the significant impact of the systems-integration strategy on the rate at which content is added to UPCommons was shown.
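The two-phase CRIS/IR flow described above can be sketched in a few lines. This is a minimal illustration only; all class, field and function names are hypothetical, not the actual DRAC/UPCommons or GIR/O2 APIs.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class CrisRecord:
    """A reference in the CRIS (e.g. DRAC or GIR)."""
    record_id: str
    metadata: dict
    handle: Optional[str] = None  # filled in once the repository item exists

@dataclass
class RepositoryItem:
    """An item in the institutional repository (e.g. UPCommons or O2)."""
    metadata: dict
    fulltext: bytes
    handle: str

def deposit(record: CrisRecord, fulltext: bytes, new_handle: str) -> RepositoryItem:
    """Phase 1: metadata entered in the CRIS is transferred to the repository.
    Phase 2: it is validated and enriched, and the full text is attached."""
    metadata = dict(record.metadata)
    metadata.setdefault("dc.type", "article")  # illustrative enrichment step
    item = RepositoryItem(metadata=metadata, fulltext=fulltext, handle=new_handle)
    # The repository handle is written back to the CRIS reference,
    # mirroring the GIR/O2 handle assignment at the UOC.
    record.handle = item.handle
    return item

record = CrisRecord("drac-001", {"dc.title": "Sample paper"})
item = deposit(record, b"%PDF-1.4 ...", "2117/12345")
```

After the deposit, the same handle identifies the item on both sides, which is what keeps the CRIS reference and the repository item in sync.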
Abstract:
The work consisted of research into project-management methodologies for information-systems integration projects, with the aim of synthesising a specific methodology aligned with the project-management model defined in the PMBOK and applying it to the definition of a systems-integration project for a factory.
Abstract:
Research project carried out during a stay at the Auditing and Integration of Management Systems Research Laboratory of the University of Alberta, Canada, from May to September 2007. This centre conducts theoretical and applied basic research on quality assurance, and more specifically on the standardisation and integration of management systems. First, the data from the descriptive empirical study carried out in Catalonia during 2005 were analysed, focusing on the management standards most widely used by Catalan companies, including ISO 9001, ISO 14001 and OHSAS 18001 as well as the new supporting standards of the ISO 10000 series. Second, building on this prior analysis, the design of a flexible methodology for the integration of management systems based on international standards was begun.
Abstract:
In this paper, we consider ATM networks in which the virtual path (VP) concept is implemented. How to multiplex two or more diverse traffic classes while providing different quality of service (QoS) requirements is a complicated open problem. Two distinct options are available: integration and segregation. In an integration approach, all the traffic from different connections is multiplexed onto one VP. This implies that the most restrictive QoS requirements must be applied to all services; link utilization is therefore decreased, because unnecessarily stringent QoS is provided to all connections. With the segregation approach, the problem can be much simplified if the different types of traffic are separated by assigning each a VP with dedicated resources (buffers and links); resources may then not be efficiently utilized, because no sharing of bandwidth can take place across VPs. The probability that the bandwidth required by the accepted connections exceeds the capacity of the link is evaluated as the probability of congestion (PC). Since the PC can be expressed as the cell loss probability (CLP), we simply carry out bandwidth allocation using the PC. We first focus on the influence of some parameters (CLP, bit rate and burstiness) on the capacity required by a VP supporting a single traffic class, using the new convolution approach. Numerical results are presented both to compare the required capacity and to determine the conditions under which each approach is preferable.
Abstract:
This paper presents the use of a mobile robot platform as an innovative educational tool to promote and integrate different curriculum knowledge. It reports the experience acquired in a summer course named "Applied Mobile Robotics". The main aim of the course is to integrate different subjects, such as electronics, programming, architecture, perception systems, communications, control and trajectory planning, by using the educational open mobile robot platform PRIM. The summer course is addressed to a wide range of student profiles; however, it is of special interest to students of electrical and computer engineering around their final academic year. The course consists of theoretical and laboratory sessions on the following topics: design and programming of electronic devices, modelling and control systems, trajectory planning and control, and computer vision systems. The keys to achieving a renewed path of progress in robotics are thus the integration of several fields of knowledge, such as computing, communications, and control sciences, in order to perform higher-level reasoning and use decision tools with a strong theoretical base.
Abstract:
Expert supervision systems are software applications specially designed to automate process monitoring. The goal is to reduce the dependency on human operators to assure the correct operation of a process, including faulty situations. Constructing this kind of application involves an important design and development task in order to represent and manipulate process data and behaviour at different degrees of abstraction, interfacing with data acquisition systems connected to the process. This is an open problem that becomes more complex with the number of variables, parameters and relations needed to account for the complexity of the process. Multiple specialised modules, each tuned to solve a simpler task and operating under co-ordination, provide a solution. A modular architecture based on concepts of software agents, taking advantage of the integration of diverse knowledge-based techniques, is proposed for this purpose. The components (software agents, communication mechanisms and perception/action mechanisms) are based on ICa (Intelligent Control architecture), a software middleware supporting the build-up of applications with software agent features.
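The modular, agent-based idea above can be sketched minimally: specialised agents each perceive part of the process and report a verdict, and a co-ordinator aggregates them. The names and thresholds are illustrative only; this is not the ICa middleware API.

```python
class Agent:
    """A specialised module: perceives some process variables, reports a verdict."""
    def __init__(self, name, variables, check):
        self.name, self.variables, self.check = name, variables, check

    def perceive_and_assess(self, process_data):
        # Perception: read only the variables this agent is tuned for.
        values = {v: process_data[v] for v in self.variables}
        return self.name, self.check(values)

class Coordinator:
    """Runs every agent and aggregates their assessments into one report."""
    def __init__(self, agents):
        self.agents = agents

    def supervise(self, process_data):
        verdicts = dict(a.perceive_and_assess(process_data) for a in self.agents)
        verdicts["fault_detected"] = any(v != "ok" for v in verdicts.values())
        return verdicts

# Two simple agents, each tuned to one variable of a hypothetical process.
temp_agent = Agent("temperature", ["T"], lambda v: "ok" if v["T"] < 90 else "overheat")
press_agent = Agent("pressure", ["P"], lambda v: "ok" if v["P"] < 5.0 else "overpressure")
supervisor = Coordinator([temp_agent, press_agent])
result = supervisor.supervise({"T": 95, "P": 3.2})
```

Each agent stays simple because it only sees its own slice of the process; the complexity of the whole process is handled by composition under the co-ordinator.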
Abstract:
Background: To enhance our understanding of complex biological systems like diseases we need to put all of the available data into context and use this to detect relations, patterns and rules which allow predictive hypotheses to be defined. Life science has become a data-rich science, with information about the behaviour of millions of entities like genes, chemical compounds, diseases, cell types and organs, which are organised in many different databases and/or spread throughout the literature. Existing knowledge such as genotype-phenotype relations or signal transduction pathways must be semantically integrated and dynamically organised into structured networks that are connected with clinical and experimental data. Different approaches to this challenge exist but so far none has proven entirely satisfactory. Results: To address this challenge we previously developed a generic knowledge management framework, BioXM™, which allows the dynamic, graphic generation of domain-specific knowledge representation models based on specific objects and their relations, supporting annotations and ontologies. Here we demonstrate the utility of BioXM for knowledge management in systems biology as part of the EU FP6 BioBridge project on translational approaches to chronic diseases. From clinical and experimental data, text-mining results and public databases we generate a chronic obstructive pulmonary disease (COPD) knowledge base and demonstrate its use by mining specific molecular networks together with integrated clinical and experimental data. Conclusions: We generate the first semantically integrated COPD-specific public knowledge base and find that, for the integration of clinical and experimental data with pre-existing knowledge, the configuration-based set-up enabled by BioXM reduced implementation time and effort for the knowledge base compared to similar systems implemented as classical software development projects.
The knowledge base enables the retrieval of sub-networks including protein-protein interaction, pathway, gene-disease and gene-compound data, which are used for subsequent data analysis, modelling and simulation. Pre-structured queries and reports enhance usability; establishing their use in everyday clinical settings requires further simplification with a browser-based interface, which is currently under development.
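The kind of sub-network retrieval described above can be sketched as a query over a heterogeneous graph with typed edges (protein-protein interaction, gene-disease, gene-compound). The entities and relations below are toy examples for illustration, not BioXM data or its query interface.

```python
import networkx as nx

# A small heterogeneous graph: nodes are biological entities,
# edges carry a `relation` type.
g = nx.Graph()
g.add_edge("TNF", "IL6", relation="protein-protein")
g.add_edge("TNF", "COPD", relation="gene-disease")
g.add_edge("SERPINA1", "COPD", relation="gene-disease")
g.add_edge("TNF", "infliximab", relation="gene-compound")
g.add_edge("EGFR", "gefitinib", relation="gene-compound")  # unrelated to COPD

def disease_subnetwork(graph, disease, radius=2):
    """All entities within `radius` hops of the disease node, with their edges."""
    nodes = nx.ego_graph(graph, disease, radius=radius).nodes
    return graph.subgraph(nodes)

sub = disease_subnetwork(g, "COPD")
```

The retrieved sub-network keeps the edge types, so downstream analysis can still distinguish, say, a gene-compound link from a protein-protein interaction.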
Abstract:
The increasing volume of data describing human disease processes and the growing complexity of understanding, managing, and sharing such data present a huge challenge for clinicians and medical researchers. This paper presents the @neurIST system, which provides an infrastructure for biomedical research while aiding clinical care, by bringing together heterogeneous data and complex processing and computing services. Although @neurIST targets the investigation and treatment of cerebral aneurysms, the system's architecture is generic enough that it could be adapted to the treatment of other diseases. Innovations in @neurIST include confining the patient data pertaining to aneurysms inside a single environment that offers clinicians the tools to analyze and interpret patient data and make use of knowledge-based guidance in planning their treatment. Medical researchers gain access to a critical mass of aneurysm-related data due to the system's ability to federate distributed information sources. A semantically mediated grid infrastructure ensures that both clinicians and researchers are able to seamlessly access and work on data that is distributed across multiple sites in a secure way, in addition to providing computing resources on demand for performing computationally intensive simulations for treatment planning and research.
Abstract:
The local thermodynamics of a system with long-range interactions in d dimensions is studied using the mean-field approximation. Long-range interactions are introduced through pair interaction potentials that decay as a power law in the interparticle distance. We compute the local entropy, Helmholtz free energy, and grand potential per particle in the microcanonical, canonical, and grand canonical ensembles, respectively. From the local entropy per particle we obtain the local equation of state of the system by using the condition of local thermodynamic equilibrium. This local equation of state has the form of the ideal gas equation of state, but with the density depending on the potential characterizing long-range interactions. By volume integration of the relation between the different thermodynamic potentials at the local level, we find the corresponding equation satisfied by the potentials at the global level. It is shown that the potential energy enters as a thermodynamic variable that modifies the global thermodynamic potentials. As a result, we find a generalized Gibbs-Duhem equation that relates the potential energy to the temperature, pressure, and chemical potential. For the marginal case where the power of the decaying interaction potential is equal to the dimension of the space, the usual Gibbs-Duhem equation is recovered. As examples of the application of this equation, we consider spatially uniform interaction potentials and the self-gravitating gas. We also point out a close relationship with the thermodynamics of small systems.
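The local equation of state described above can be written schematically. This is a hedged sketch in standard mean-field notation, not the paper's exact expressions; in particular, the Boltzmann form of the density profile is the usual mean-field assumption.

```latex
% Local ideal-gas-like equation of state, with the density shaped by the
% mean-field potential \Phi(\mathbf{r}) generated by pair interactions
% decaying as a power law in the interparticle distance:
P(\mathbf{r}) = k_B T \, n(\mathbf{r}),
\qquad
n(\mathbf{r}) \propto \exp\!\left[-\frac{\Phi(\mathbf{r})}{k_B T}\right].
% In the marginal case, where the power of the decaying potential equals
% the dimension of space, the usual Gibbs-Duhem relation is recovered:
S\,\mathrm{d}T - V\,\mathrm{d}P + N\,\mathrm{d}\mu = 0 .
```

Away from the marginal case, the paper's generalized Gibbs-Duhem equation adds the potential energy as an extra thermodynamic variable alongside $T$, $P$ and $\mu$.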
Abstract:
The main goal of the InterAmbAr research project is to analyze the relationships between landscape systems and human land-use strategies in mountains and littoral plains from a long-term perspective. The study adopts a high-resolution analysis of small-scale study areas located in the Mediterranean region of north-eastern Catalonia, distributed along an altitudinal transect from the high mountains (above 2000 m a.s.l.) to the littoral plain of Empordà (Fig. 1). High-resolution interdisciplinary research has been carried out since 2010, based on the integration of palaeoenvironmental and archaeological data. The micro-scale approach is used to understand human-environment relationships; it allows a better understanding of the local and regional nature of environmental changes and of the synergies between catchment-based systems, hydro-sedimentary regimes, human mobility, land use, human environments, demography, etc.
Abstract:
Nowadays, Wireless Sensor Networks (WSNs) are already a very important source of data about the environment, and are thus key to the creation of Cyber-Physical Systems (CPSs). Given the popularity of P2P middleware as a means to efficiently process information and distribute services, being able to integrate it with WSNs is an interesting proposal. JXTA is a widely used P2P middleware that allows peers to easily exchange information, relying heavily on its main architectural highlight: the capability to organize peers with common interests into peer groups. However, current approaches to integrating WSNs into a JXTA network seldom take advantage of peer groups. For this reason, in this paper we present jxSensor, an integration layer for sensor motes which facilitates the deployment of CPSs under this architecture. The integration takes JXTA's idiosyncrasies into account and proposes novel ideas, such as the Virtual Peer, a group of sensors that acts as a single entity within the peer group context.