942 results for systems integration


Relevance:

30.00%

Publisher:

Abstract:

Urban mobility is one of the main challenges facing urban areas, due to growing populations and traffic congestion and the resulting environmental pressures. The pathway to sustainable urban mobility involves strengthening intermodal mobility. The integrated use of different transport modes is becoming increasingly important, and intermodality has been cited as a way for public transport to compete with private cars. The aim of this dissertation is to define a set of strategies to improve urban mobility in Lisbon and, as a consequence, reduce the environmental impacts of transport. To that end, several intermodal practices across Europe were analysed, and the transport systems of Brussels and Lisbon were studied and compared, with special attention to intermodal systems. For the case study, data were gathered in the field in both cities, by using and observing the different transport modes, and two surveys of the cities' users were carried out. The study concluded that Brussels and Lisbon present significant differences: in Brussels the measures to promote intermodality are evident, while in Lisbon much remains to be done. It also made clear that Lisbon's public transport must be improved towards a more intermodal passenger transport system, through the integration of different transport modes and better information and ticketing systems. Points requiring development include: waiting areas at interchanges; integration of bicycles into public transport; information about connections with other transport modes; and real-time passenger information pre-trip and on-trip, especially on buses and trams. After identifying the best practices in Brussels and the weaknesses in Lisbon, the possibility of applying some of Brussels' practices to Lisbon was evaluated. Brussels proved to be a good example of intermodality, and for that reason some of the recommendations to improve intermodal mobility in Lisbon can follow the practices in place in Brussels.

Relevance:

30.00%

Publisher:

Abstract:

Usually, data warehousing populating processes are data-oriented workflows composed of dozens of granular tasks responsible for integrating data coming from different data sources. Specific subsets of these tasks can be grouped into a collection, together with their relationships, to form higher-level constructs. Increasing task granularity allows for the generalization of processes, simplifying their views and providing methods to carry expertise over to new applications. Well-proven practices can be used to describe general solutions that use basic skeletons configured and instantiated according to a set of specific integration requirements. Patterns can be applied to ETL processes not only to simplify a possible conceptual representation but also to reduce the gap that often exists between the two design perspectives. In this paper, we demonstrate the feasibility and effectiveness of an ETL pattern-based approach using task clustering, analyzing a real-world ETL scenario through the definition of two commonly used clusters of tasks: a data lookup cluster and a data conciliation and integration cluster.
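The two clusters named in the abstract can be sketched as ordinary functions that hide several granular tasks behind a single higher-level construct. This is an illustrative sketch only, not the paper's notation; the names `lookup_cluster` and `conciliation_cluster` are invented for the example:

```python
# Sketch of ETL task clustering: each function bundles granular tasks
# (key extraction, lookup, merge, de-duplication) into one reusable step.

def lookup_cluster(records, reference, key, default=None):
    """Data lookup cluster: extract a key from each record, resolve it
    against a reference table, and merge the result back in."""
    enriched = []
    for rec in records:
        value = reference.get(rec[key], default)          # lookup task
        enriched.append({**rec, "looked_up": value})      # merge task
    return enriched

def conciliation_cluster(*sources):
    """Data conciliation and integration cluster: combine rows from
    several sources, dropping duplicates by a surrogate key."""
    seen, integrated = set(), []
    for source in sources:
        for rec in source:
            if rec["id"] not in seen:
                seen.add(rec["id"])
                integrated.append(rec)
    return integrated

rows = [{"id": 1, "country": "PT"}, {"id": 2, "country": "BE"}]
ref = {"PT": "Portugal", "BE": "Belgium"}
print(lookup_cluster(rows, ref, "country"))
```

Grouping the tasks this way is what lets the same skeleton be configured and instantiated for new integration requirements, as the abstract describes.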

Relevance:

30.00%

Publisher:

Abstract:

This article compiles the main topics addressed in the management systems (MSs) literature concerning MS integration, by performing a systematic literature review. The paper presents the main limitations of non-integrated management systems, the main motivations driving the implementation of an integrated management system (IMS), the major resistances faced, the most common resulting benefits, the suitable guidelines and standards, and the critical success factors. In addition, it addresses issues concerning integration strategies and models, the integration levels or degrees achieved by an IMS, and the audit function in an integrated context. The motivations that drive companies to integrate their management subsystems, the obstacles faced and the benefits collected may have internal or external origins. The publication of standards guiding companies on how to integrate their management subsystems has occurred mainly at the national level. Several models can be used to support companies in their management subsystem integration processes, and either a sequential or an all-in strategy may be adopted. Four audit typologies can be distinguished, and the adoption of any of them should consider, among other factors, resource availability and audit-team know-how.

Relevance:

30.00%

Publisher:

Abstract:

Doctoral thesis - Doctoral Programme in Industrial and Systems Engineering (Programa Doutoral em Engenharia Industrial e Sistemas, PDEIS)

Relevance:

30.00%

Publisher:

Abstract:

Undergraduate medical education is moving from traditional disciplinary basic science courses towards more integrated curricula. Integration models based on organ systems originated in the 1950s, but few longitudinal studies have evaluated their effectiveness. This article outlines the development and implementation of the Organic and Functional Systems (OFS) courses at the University of Minho in Portugal, using evidence collected over 10 years. It describes the organization of content, students' academic performance, the acceptability of the courses, the evaluation of preparedness for subsequent courses, and the retention of basic science knowledge. Students consistently rated the OFS courses highly. Physician tutors in subsequent clinical attachments considered that students were appropriately prepared. Performance in the International Foundations of Medicine examination by a self-selected sample of students revealed similar performance in basic science items after the last OFS course and four years later, at the moment of graduation. In conclusion, the organizational and pedagogical approaches of the OFS courses achieve high acceptability among students and result in positive outcomes in terms of preparedness for subsequent training and long-term retention of basic science knowledge.

Relevance:

30.00%

Publisher:

Abstract:

This is a study of a state-of-the-art implementation of a new computer-integrated testing (CIT) facility within a company that designs and manufactures transport refrigeration systems. The aim was to use state-of-the-art hardware, software and planning procedures in the design and implementation of three CIT systems. Typical CIT system components include data acquisition (DAQ) equipment, application and analysis software, communication devices, computer-based instrumentation and computer technology. It is shown that introducing computer technology into the area of testing can have a major effect on efficiency, flexibility, data accuracy, test quality, data integrity and much more. The findings reaffirm how computer integration continues to benefit any organisation; with more recent advances in computer technology, communication methods and software capabilities, less expensive and more sophisticated test solutions are now possible, allowing more organisations to benefit from the many advantages associated with CIT. Examples of computer-integrated test set-ups and the benefits associated with computer integration are discussed.

Relevance:

30.00%

Publisher:

Abstract:

Research project carried out during a stay at the Auditing and Integration of Management Systems Research Laboratory of the University of Alberta, Canada, from May to September 2007. This centre conducts theoretical and basic applied research on quality assurance, and more specifically on the standardization and integration of management systems. First, the data obtained in the descriptive empirical study carried out in Catalonia during 2005 were analysed, focusing on the management standards most widely used by Catalan companies, including ISO 9001, ISO 14001 and OHSAS 18001, as well as the new supporting standards of the ISO 10000 series. Second, building on this preliminary analysis, the design of a flexible methodology for the integration of management systems based on international standards was initiated.

Relevance:

30.00%

Publisher:

Abstract:

The need for better gene transfer systems with an improved risk/benefit balance for patients remains a major challenge in the clinical translation of gene therapy (GT). We have investigated improving the safety of integrating vectors by combining (i) new short synthetic genetic insulator elements (GIEs) and (ii) the direction of genetic integration to heterochromatin. We designed SIN-insulated retrovectors with two candidate GIEs and identified a specific combination of insulator 2 repeats that translates into the best functional activity, high titers and a boundary effect in both gammaretrovectors (p20) and lentivectors (DCaro4) (see Duros et al., abstract ibid). Since GIEs are believed to shield the transgenic cassette from inhibitory effects and silencing, DCaro4 was further tested with chimeric HIV-1-derived integrases comprising C-terminal chromodomains that target heterochromatin through either histone H3 (ML6 chimera) or methylated CpG islands (ML10). With DCaro4 alone and with both chimeras, homogeneous expression is evident in over 20% of the cells and is sustained over time. With control lentivectors, less than 2% of cells express GFP, comparable to the background obtained with a control double mutant in both the catalytic and LEDGF-binding sites; in addition, a two-fold increase in expression can be induced with histone deacetylase inhibitors. Our approach could significantly reduce integration into open-chromatin-sensitive sites in stem cells at the time of transduction, a feature that might significantly decrease subsequent genotoxicity, according to data from X-SCID patients. Work performed with the support of EC-DG Research within the FP6 Network of Excellence CLINIGENE: LSHB-CT-2006-018933.

Relevance:

30.00%

Publisher:

Abstract:

Recently, the introduction of second-generation sequencing and further advancements in confocal microscopy have enabled system-level studies for the functional characterization of genes. The degree of complexity intrinsic to these approaches requires the development of bioinformatics methodologies and computational models for extracting meaningful biological knowledge from the enormous amount of experimental data that is continuously generated. This PhD thesis presents several novel bioinformatics methods and computational models that address specific questions in plant biology, using the plant Arabidopsis thaliana as a model system. First, a spatio-temporal qualitative analysis of quantitative transcript and protein profiles is applied to show the role of the BREVIS RADIX (BRX) protein in the auxin-cytokinin crosstalk governing root meristem growth. The core of this PhD work is the functional characterization of the interplay between the BRX protein and the plant hormone auxin in the root meristem, using a computational model based on experimental evidence. Hypotheses generated by the model led to the discovery of a differential endocytosis pattern in the root meristem that splits the auxin transcriptional response via the plasma-membrane-to-nucleus partitioning of BRX. This positional information system creates an auxin transcriptional pattern that deviates from the canonical auxin response and is necessary to sustain the expression of a subset of BRX-dependent auxin-responsive genes and drive root meristem growth. In the second part of this PhD thesis, we characterized the genome-wide impact of large-scale deletions in four divergent Arabidopsis natural strains, through the integration of ultra-high-throughput sequencing data with data from genomic hybridizations on tiling arrays. Analysis of the identified deletions revealed that a considerable portion of protein-coding genes is affected, and supported a history of genomic rearrangements shaped by evolution. In the last part of the thesis, we showed that the VIP3 gene in Arabidopsis has an evolutionarily conserved role in the 3' to 5' mRNA degradation machinery, by applying a novel approach to the analysis of mRNA-Seq data from random-primed mRNA. Altogether, this PhD research contains major advances in the study of natural genomic variation in plants and in the application of computational morphodynamics models for the functional characterization of biological pathways essential to the plant.

Plant development is written in the genetic code. To understand how plants are able to adapt to environmental changes, it is essential to study how their genes govern their formation. The more we try to understand how a plant works, the more we realize the complexity of the biological mechanisms involved, to the point that mathematical tools and models become indispensable. In this work, using the model plant Arabidopsis thaliana, we solved specific biological problems through the development and application of concrete computational methods. First, we investigated how the BREVIS RADIX (BRX) gene regulates root development by controlling the response to two hormones, auxin and cytokinin. We employed a statistical analysis of quantitative measurements of transcripts and gene products to demonstrate that BRX plays an antagonizing role in the dialogue between these two hormones; when this molecular dialogue is disturbed, the length of the primary root is dramatically reduced. To understand how BRX responds to auxin, we developed a computational model based on experimental results. Successive simulations led to the discovery of a positional signal that controls the root's response to auxin by regulating the intracellular movement of BRX. In the second part of this thesis, we analysed the whole genome of four natural Arabidopsis strains and found that a large fraction of their genes were missing compared to the reference strain, indicating that the history of genomic modifications driven by evolution determines a differential availability of functional genes in these plants. In the last part of this work, we analysed transcriptome data from plants in which the VIP3 gene was non-functional, which allowed us to discover the dual role of VIP3 in regulating transcription initiation and in transcript degradation, a dual role that had until then only been demonstrated in humans. This doctoral work supports the development and application of computational methodologies as invaluable tools for unravelling the complexity of biological problems in plant research. The integration of plant biology and computer science has become increasingly important for advancing our knowledge of how plants function and develop.
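The deletion analysis described in the second part of the thesis can be illustrated, in heavily simplified form, by flagging long runs of near-zero read coverage against the reference genome; the function name, thresholds and data below are invented for the sketch and are not the thesis pipeline:

```python
# Toy deletion caller: report intervals where read coverage stays at or
# below `max_cov` for at least `min_len` consecutive positions.

def call_deletions(coverage, max_cov=1, min_len=5):
    """Return (start, end) half-open intervals of sustained low coverage."""
    deletions, start = [], None
    for pos, cov in enumerate(coverage):
        if cov <= max_cov:
            if start is None:
                start = pos            # open a candidate interval
        else:
            if start is not None and pos - start >= min_len:
                deletions.append((start, pos))
            start = None               # coverage recovered: close candidate
    if start is not None and len(coverage) - start >= min_len:
        deletions.append((start, len(coverage)))
    return deletions

# invented per-position coverage along a small reference window
cov = [12, 11, 0, 1, 0, 0, 0, 1, 10, 13, 9, 0, 0, 8]
print(call_deletions(cov))
```

A real analysis would additionally integrate the tiling-array hybridization signal mentioned in the abstract to confirm each candidate interval.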

Relevance:

30.00%

Publisher:

Abstract:

The integration of geophysical data into the subsurface characterization problem has been shown in many cases to significantly improve hydrological knowledge by providing information at spatial scales and locations that are unattainable using conventional hydrological measurement techniques. How much benefit geophysical data actually bring to hydrological predictions, however, has received considerably less attention in the literature. Here, we examine the potential hydrological benefits of a recently introduced simulated annealing (SA) conditional stochastic simulation method designed for the assimilation of diverse hydrogeophysical data sets. We consider the specific case of integrating crosshole ground-penetrating radar (GPR) and borehole porosity-log data to characterize the porosity distribution in saturated heterogeneous aquifers; in many cases, porosity is linked to hydraulic conductivity and thus to flow and transport behavior. To perform our evaluation, we first generate a number of synthetic porosity fields exhibiting varying degrees of spatial continuity and structural complexity. Next, we simulate the collection of crosshole GPR data between several boreholes in these fields, and the collection of porosity-log data at the borehole locations. The inverted GPR data, together with the porosity logs, are then used to reconstruct the porosity field using the SA-based method, along with a number of more elementary approaches. Assuming that the grid-cell-scale relationship between porosity and hydraulic conductivity is unique and known, the porosity realizations are then used in groundwater flow and contaminant transport simulations to assess the benefits and limitations of the different approaches.
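As a rough illustration of how SA-based conditional simulation honours borehole data while matching field statistics, the toy 1-D sketch below freezes cells at assumed log locations and anneals the remaining cells toward a target mean porosity. All numbers are invented, and this is not the method evaluated in the paper, which conditions on crosshole GPR data as well:

```python
import math
import random

# Toy SA conditional simulation: cells listed in `logs` are conditioning
# data (borehole porosity logs) and are never perturbed; free cells are
# perturbed and accepted/rejected by the Metropolis rule under a
# geometrically cooling temperature.
random.seed(0)

n = 20
logs = {0: 0.30, 10: 0.25, 19: 0.20}            # conditioning porosity logs
field = [logs.get(i, random.uniform(0.1, 0.4)) for i in range(n)]
target_mean = 0.25                               # stand-in field statistic

def objective(f):
    """Mismatch between the realization and the target statistic."""
    return abs(sum(f) / len(f) - target_mean)

temperature, current = 1.0, objective(field)
for _ in range(5000):
    i = random.randrange(n)
    if i in logs:                                # honour conditioning data
        continue
    old = field[i]
    field[i] = min(0.4, max(0.1, old + random.gauss(0, 0.02)))
    proposed = objective(field)
    if proposed <= current or random.random() < math.exp((current - proposed) / temperature):
        current = proposed                       # accept the perturbation
    else:
        field[i] = old                           # reject it
    temperature *= 0.999                         # cool down

print(round(objective(field), 4))
```

A realistic implementation would use a variogram-based objective and a 2-D or 3-D grid, but the control flow (freeze conditioning cells, perturb, accept or reject, cool) is the same.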

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we consider ATM networks in which the virtual path (VP) concept is implemented. How to multiplex two or more diverse traffic classes while providing different quality-of-service (QOS) requirements is a very complicated open problem. Two distinct options are available: integration and segregation. In an integration approach, all the traffic from different connections is multiplexed onto one VP, which implies that the most restrictive QOS requirements must be applied to all services; link utilization is therefore decreased, because unnecessarily stringent QOS is provided to all connections. With the segregation approach, the problem can be much simplified if different types of traffic are separated by assigning each a VP with dedicated resources (buffers and links); resources may then not be efficiently utilized, because no sharing of bandwidth can take place across VPs. The probability that the bandwidth required by the accepted connections exceeds the capacity of the link is evaluated as the probability of congestion (PC). Since the PC can be expressed as the CLP, we simply carry out bandwidth allocation using the PC. We first focus on the influence of some parameters (CLP, bit rate and burstiness) on the capacity required by a VP supporting a single traffic class, using the new convolution approach. Numerical results are presented both to compare the required capacity and to observe the conditions under which each approach is preferred.
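The convolution approach referred to above can be illustrated with on/off sources: convolving the per-source bandwidth distributions yields the aggregate-demand distribution, and the PC is the tail mass above the link capacity. The sketch below uses invented traffic parameters and is not the paper's exact formulation:

```python
# Convolution approach for the probability of congestion (PC) on a VP
# carrying independent on/off sources.

def convolve(dist, rate, p):
    """Fold one on/off source (peak `rate`, activity probability `p`)
    into the aggregate bandwidth distribution {bandwidth: probability}."""
    out = {}
    for bw, prob in dist.items():
        out[bw] = out.get(bw, 0.0) + prob * (1 - p)            # source off
        out[bw + rate] = out.get(bw + rate, 0.0) + prob * p    # source on
    return out

def prob_congestion(sources, capacity):
    """PC = probability that aggregate demand exceeds the VP capacity."""
    dist = {0: 1.0}
    for rate, p in sources:
        dist = convolve(dist, rate, p)
    return sum(prob for bw, prob in dist.items() if bw > capacity)

# ten bursty 2-Mbit/s sources, each active 30% of the time, on a 10-Mbit/s VP
pc = prob_congestion([(2, 0.3)] * 10, 10)
print(round(pc, 4))
```

Comparing the PC obtained this way against the target CLP is what drives the bandwidth allocation: capacity is increased until the PC drops below the required loss probability.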

Relevance:

30.00%

Publisher:

Abstract:

This paper presents the use of a mobile robot platform as an innovative educational tool to promote and integrate different curriculum knowledge. It presents the experience acquired within a summer course named "Applied Mobile Robotics". The main aim of the course is to integrate different subjects, such as electronics, programming, architecture, perception systems, communications, control and trajectory planning, by using the educational open mobile robot platform PRIM. The summer course is addressed to a wide range of student profiles, but is of special interest to students of electrical and computer engineering around their final academic year. The course consists of theoretical and laboratory sessions on the following topics: design and programming of electronic devices; modelling and control systems; trajectory planning and control; and computer vision systems. The keys to achieving a renewed path of progress in robotics are therefore the integration of several fields of knowledge, such as computing, communications and control sciences, in order to perform higher-level reasoning and to use decision tools with a strong theoretical base.

Relevance:

30.00%

Publisher:

Abstract:

Expert supervision systems are software applications specially designed to automate process monitoring, with the goal of reducing the dependency on human operators to assure the correct operation of a process, including faulty situations. Building this kind of application involves an important design and development task: representing and manipulating process data and behaviour at different degrees of abstraction, and interfacing with the data acquisition systems connected to the process. This is an open problem that becomes more complex with the number of variables, parameters and relations needed to account for the complexity of the process. A solution is provided by multiple specialised modules, each tuned to solve a simpler task, operating under co-ordination. A modular architecture based on the concept of software agents, taking advantage of the integration of diverse knowledge-based techniques, is proposed for this purpose. The components (software agents, communication mechanisms and perception/action mechanisms) are based on ICa (Intelligent Control architecture), a software middleware supporting the construction of applications with software-agent features.
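A minimal sketch of the modular idea, with specialised agents running under a coordinator; this is illustrative Python, not the ICa middleware API, and all class and variable names are hypothetical:

```python
# Specialised monitoring agents, each tuned to one simple task, fanned
# out from a coordinator that routes process samples to all of them.

class ThresholdAgent:
    """Agent specialised in range checking for one process variable."""
    def __init__(self, variable, low, high):
        self.variable, self.low, self.high = variable, low, high

    def perceive(self, sample):
        value = sample.get(self.variable)
        if value is not None and not (self.low <= value <= self.high):
            return f"{self.variable} out of range: {value}"
        return None                      # no alarm for this sample

class Coordinator:
    """Co-ordination layer: delivers each sample to every agent and
    collects the alarms they raise."""
    def __init__(self, agents):
        self.agents = agents

    def supervise(self, sample):
        alarms = (agent.perceive(sample) for agent in self.agents)
        return [alarm for alarm in alarms if alarm]

plant = Coordinator([ThresholdAgent("temp", 20, 80),
                     ThresholdAgent("pressure", 1.0, 3.5)])
print(plant.supervise({"temp": 95, "pressure": 2.2}))
```

In a full supervision system each agent would encapsulate its own knowledge-based technique (rules, models, trend analysis), but the co-ordination pattern stays the same.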

Relevance:

30.00%

Publisher:

Abstract:

Colorectal cancer is a heterogeneous disease that manifests through diverse clinical scenarios. For many years, our knowledge about the variability of colorectal tumors was limited to histopathological analysis, from which generic classifications associated with different clinical expectations were derived. Currently, however, we are beginning to understand that beneath the intense pathological and clinical variability of these tumors lies strong genetic and biological heterogeneity. Thus, with the increasing information available on inter-tumor and intra-tumor heterogeneity, the classical pathological approach is being displaced in favor of novel molecular classifications. In the present article, we summarize the most relevant proposals for molecular classifications obtained from the analysis of colorectal tumors using powerful high-throughput techniques and devices. We also discuss the role that cancer systems biology may play in the integration and interpretation of the large amount of data generated, and the challenges to be addressed in the future development of precision oncology. In addition, we review the current state of implementation of these novel tools in the pathology laboratory and in clinical practice.

Relevance:

30.00%

Publisher:

Abstract:

Background: To enhance our understanding of complex biological systems such as diseases, we need to put all of the available data into context and use it to detect relations, patterns and rules that allow predictive hypotheses to be defined. Life science has become a data-rich science, with information about the behaviour of millions of entities such as genes, chemical compounds, diseases, cell types and organs, organised in many different databases and/or spread throughout the literature. Existing knowledge, such as genotype-phenotype relations or signal transduction pathways, must be semantically integrated and dynamically organised into structured networks that are connected with clinical and experimental data. Different approaches to this challenge exist, but so far none has proven entirely satisfactory. Results: To address this challenge we previously developed a generic knowledge management framework, BioXM™, which allows the dynamic, graphical generation of domain-specific knowledge representation models based on specific objects and their relations, supporting annotations and ontologies. Here we demonstrate the utility of BioXM for knowledge management in systems biology as part of the EU FP6 BioBridge project on translational approaches to chronic diseases. From clinical and experimental data, text-mining results and public databases, we generate a chronic obstructive pulmonary disease (COPD) knowledge base and demonstrate its use by mining specific molecular networks together with integrated clinical and experimental data. Conclusions: We generate the first semantically integrated COPD-specific public knowledge base, and find that for the integration of clinical and experimental data with pre-existing knowledge, the configuration-based set-up enabled by BioXM reduced the implementation time and effort for the knowledge base compared with similar systems implemented as classical software development projects.
The knowledge base enables the retrieval of sub-networks, including protein-protein interaction, pathway, gene-disease and gene-compound data, which are used for subsequent data analysis, modelling and simulation. Pre-structured queries and reports enhance usability; establishing their use in everyday clinical settings requires further simplification with a browser-based interface, which is currently under development.
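The kind of sub-network retrieval described in the conclusions can be sketched as a bounded traversal over a typed edge list; the toy knowledge base and function below are illustrative only, not the BioXM query interface:

```python
from collections import deque

# Toy knowledge base: typed relations of the kinds named in the abstract
# (gene-disease, protein-protein, gene-compound, pathway membership).
edges = [
    ("COPD", "gene-disease", "SERPINA1"),
    ("SERPINA1", "protein-protein", "ELANE"),
    ("COPD", "gene-compound", "theophylline"),
    ("ELANE", "pathway", "neutrophil degranulation"),
    ("asthma", "gene-disease", "IL13"),        # unrelated branch
]

def subnetwork(start, max_depth):
    """Return every edge whose endpoints both lie within `max_depth`
    hops of `start`, found by breadth-first traversal."""
    adjacency = {}
    for a, _, b in edges:
        adjacency.setdefault(a, []).append(b)
        adjacency.setdefault(b, []).append(a)
    seen, frontier = {start}, deque([(start, 0)])
    while frontier:
        node, depth = frontier.popleft()
        if depth == max_depth:
            continue
        for neighbour in adjacency.get(node, []):
            if neighbour not in seen:
                seen.add(neighbour)
                frontier.append((neighbour, depth + 1))
    return [(a, rel, b) for a, rel, b in edges if a in seen and b in seen]

print(subnetwork("COPD", 2))
```

The extracted sub-network (here, everything within two hops of the COPD node) is the kind of object that would then feed the downstream analysis, modelling and simulation steps the abstract mentions.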