894 results for DATA INTEGRATION
Abstract:
Part 14: Interoperability and Integration
Abstract:
Background: Statistical analysis of DNA microarray data provides a valuable diagnostic tool for the investigation of genetic components of diseases. To take advantage of the multitude of available data sets and analysis methods, it is desirable to combine different algorithms as well as data from different studies. Applying ensemble learning, consensus clustering and cross-study normalization methods for this purpose in an almost fully automated process, and linking different analysis modules together under a single interface, would simplify many microarray analysis tasks. Results: We present ArrayMining.net, a web application for microarray analysis that provides easy access to a wide choice of feature selection, clustering, prediction, gene set analysis and cross-study normalization methods. In contrast to other microarray-related web tools, multiple algorithms and data sets for an analysis task can be combined using ensemble feature selection, ensemble prediction, consensus clustering and cross-platform data integration. By interlinking different analysis tools in a modular fashion, new exploratory routes become available, e.g. ensemble sample classification using features obtained from a gene set analysis and data from multiple studies. The analysis is further simplified by automatic parameter selection mechanisms and linkage to web tools and databases for functional annotation and literature mining. Conclusion: ArrayMining.net is a free web application for microarray analysis combining a broad choice of algorithms based on ensemble and consensus methods, using automatic parameter selection and integration with annotation databases.
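The ensemble feature selection described here can be pictured as rank aggregation across several selectors. Below is a minimal sketch of that idea in Python; the choice of scikit-learn selectors and the simple rank-averaging scheme are illustrative assumptions, not ArrayMining.net's actual implementation.

```python
# Minimal sketch: ensemble feature selection by rank aggregation.
# The selectors and the averaging scheme are illustrative assumptions.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import f_classif, mutual_info_classif

# Stand-in for a microarray matrix: samples x genes.
X, y = make_classification(n_samples=80, n_features=200, random_state=0)

# Score every feature with three different selectors.
scores = [
    f_classif(X, y)[0],                         # ANOVA F-statistic
    mutual_info_classif(X, y, random_state=0),  # mutual information
    RandomForestClassifier(random_state=0).fit(X, y).feature_importances_,
]

# Convert scores to ranks (0 = best) and average across selectors,
# so that no single algorithm dominates the final gene list.
ranks = [np.argsort(np.argsort(-s)) for s in scores]
consensus = np.mean(ranks, axis=0)
print("Top 20 features by consensus rank:", np.argsort(consensus)[:20])
```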
Abstract:
This paper presents a harmonised framework of sediment quality assessment and dredging material characterisation for estuaries and port zones of the North and South Atlantic. This framework, based on the weight-of-evidence approach, provides a structure and a process for conducting sediment/dredging material assessment that leads to a decision. The main structure consists of step 1 (examination of available data); step 2 (chemical characterisation and toxicity assessment); decision 1 (any chemical level higher than reference values? are sediments toxic?); step 3 (assessment of benthic community structure); step 4 (integration of the results); decision 2 (are sediments toxic or benthic community impaired?); step 5 (construction of the decision matrix) and decision 3 (is there environmental risk?). The sequence of assessments may be interrupted when the information obtained is judged to be sufficient for a correct characterisation of the risk posed by the sediments/dredging material. This framework introduces novel features compared with other sediment/dredging material risk assessment frameworks: data integration through multivariate analysis allows the identification of which samples are toxic and/or related to impaired benthic communities; it also discriminates the chemicals responsible for negative biological effects; and the framework dispenses with the use of a reference area. We demonstrate the successful application of this framework in different port and estuarine zones of the North (Gulf of Cadiz) and South Atlantic (Santos and Paranagua Estuarine Systems).
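The tiered sequence of steps and decisions lends itself to a schematic encoding. The sketch below walks the three decision points under simple boolean inputs; the function name, inputs and returned recommendations are hypothetical placeholders, not the published decision matrix.

```python
# Schematic sketch of the tiered weight-of-evidence sequence described
# above; inputs and recommendations are hypothetical placeholders.
def assess_sediment(chem_exceeds_reference: bool,
                    is_toxic: bool,
                    benthos_impaired: bool) -> str:
    # Step 2 / Decision 1: chemical characterisation and toxicity.
    if not chem_exceeds_reference and not is_toxic:
        return "no environmental risk: no further assessment required"
    # Steps 3-4 / Decision 2: add benthic community structure evidence.
    if not is_toxic and not benthos_impaired:
        return "chemical exceedance without biological effect: monitor"
    # Step 5 / Decision 3: integrated decision matrix.
    return "environmental risk: manage the sediments/dredging material"

print(assess_sediment(chem_exceeds_reference=True,
                      is_toxic=True,
                      benthos_impaired=False))
```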
Abstract:
Sediment quality from the Paranagua Estuarine System (PES), a highly important port and ecological zone, was evaluated by assessing three lines of evidence: (1) sediment physical-chemical characteristics; (2) sediment toxicity (elutriates, sediment-water interface, and whole sediment); and (3) benthic community structure. Results revealed a gradient of increasing degradation of sediments (i.e. higher concentrations of trace metals, higher toxicity, and impoverishment of benthic community structure) towards the inner PES. Data integration by principal component analysis (PCA) showed a positive correlation between some contaminants (mainly As, Cr, Ni, and Pb) and toxicity in samples from stations located in the upper estuary and from one station located away from contamination sources. Benthic community structure seems to be affected both by pollution and by the naturally fine-grained character of the sediments, which reinforces the importance of a weight-of-evidence approach to evaluate sediments of the PES.
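As a concrete picture of this kind of data integration, the sketch below runs a PCA over a standardized station-by-evidence matrix; the random example matrix and variable names are illustrative stand-ins, not the PES data set.

```python
# Minimal sketch of integrating chemistry and toxicity lines of
# evidence with PCA. The example matrix is synthetic, not PES data.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Rows = sampling stations; columns = lines of evidence
# (e.g. As, Cr, Ni, Pb concentrations and a toxicity endpoint).
rng = np.random.default_rng(0)
evidence = rng.normal(size=(12, 5))

# Standardize so concentrations and toxicity sit on comparable scales,
# then project the stations onto the leading components.
scaled = StandardScaler().fit_transform(evidence)
scores = PCA(n_components=2).fit_transform(scaled)

# Stations that load together with both metals and toxicity on PC1
# are the candidates for contamination-driven effects.
print(scores)
```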
Abstract:
Master's dissertation, Accounting, Faculdade de Economia, Universidade do Algarve, 2016
Abstract:
Thesis (Ph.D., Education) -- Queen's University, 2016
Abstract:
This thesis concerns the analysis of the socio-economic transformation of communities in Bronze Age southwestern Cyprus. Through the adoption of a dialectical perspective of analysis, individuals and environment are considered part of the same unity: they are cooperating agents in shaping society and culture. The Bronze Age is a period of intense transformation in the organization of local communities, marked by continuous renegotiation of socio-economic roles and interactions. The archaeological record from this portion of the island allows one to go beyond a surface-level account of the complex and articulated transition from the EBA-MBA agro-pastoral and self-sufficient communities to the LBA centralized and trade-oriented urban centres. Through a shifting of analytical scales, the emerging picture suggests major transformations in the individual-community-territory dialectical relations. A profound change in the material conditions of social life, as well as in the superstructural realm, was entailed in particular by the dissolution of the relation to the earth, due to the emergence of new forms of land exploitation/ownership and to the shift of the settlement pattern into previously unknown areas. One of the key points of this thesis is the methodological challenge of working with legacy survey data, as I re-analysed a diverse archaeological legacy resulting from more than fifty years of survey projects, rescue and research-oriented excavations, as well as casual discoveries. Source critique and data evaluation are essential requirements in an integrative and cross-disciplinary regional perspective, in the comprehensive processing of heterogeneous archaeological and environmental datasets. Through the estimation of data precision and certainty, I developed an effective but simple method to critically evaluate existing datasets and to inter-correlate them without losing their original complexity. This method for data integration can be applied to similar datasets belonging to other regions and other periods, as it originates from the evaluation of larger methodological and theoretical issues that are not limited to my spatial and temporal focus. As I argue in this thesis, diverse archaeological legacies can be efficiently re-analysed through an integrative and regional methodology. The adoption of a regional scale of analysis can provide an excellent perspective on the complexity of transformations in ancient societies, thus creating a fundamental bridge between local stories and grand landscape narratives.
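A toy sketch of how precision and certainty estimates might be turned into comparable weights for legacy records is given below; the scoring formula and record fields are hypothetical stand-ins for the thesis's actual criteria.

```python
# Toy sketch of grading legacy survey records by spatial precision and
# interpretive certainty before integrating them. The scoring scheme
# is a hypothetical stand-in for the thesis's actual criteria.
from dataclasses import dataclass

@dataclass
class LegacyRecord:
    site: str
    precision_m: float  # spatial precision of the find spot, in metres
    certainty: float    # interpretive certainty, 0 (guess) to 1 (excavated)

def reliability(rec: LegacyRecord) -> float:
    # Penalize coarse locations, reward well-documented identifications.
    spatial = 1.0 / (1.0 + rec.precision_m / 100.0)
    return spatial * rec.certainty

records = [
    LegacyRecord("survey surface find", 500.0, 0.4),
    LegacyRecord("rescue excavation", 10.0, 0.9),
]
for rec in sorted(records, key=reliability, reverse=True):
    print(f"{rec.site}: weight {reliability(rec):.2f}")
```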
Abstract:
Database replication is a mechanism to copy, synchronize and integrate data between databases distributed over a computer network. Data replication is an important tool in several situations, such as the creation of backup systems, load balancing between nodes, distribution of information across multiple locations, and integration of heterogeneous systems. Replication enables a reduction in network traffic, because data remains available locally, including in the event of a temporary network failure. This thesis is based on the work carried out to develop a generic application for database replication, to be made available as open source software. The application that was built allows data integration between various systems, with particular focus on the integration of heterogeneous data, the fragmentation of data, replication in cascade, data format changes between replicas, master/slave and multi-master synchronization, and adaptability to a range of scenarios.
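As a concrete illustration of one replication pass, the sketch below copies rows changed since a watermark from a master SQLite database to a replica. It is a toy under invented table and column names, not the open source application the thesis presents.

```python
# Minimal sketch of one master -> slave replication pass using SQLite
# and a last-modified watermark. Table and column names are invented;
# both databases are assumed to have an `items(id, payload, updated_at)`
# table with `id` as primary key.
import sqlite3

def replicate(master: sqlite3.Connection,
              slave: sqlite3.Connection,
              since: float) -> float:
    # Pull only the rows changed since the last pass.
    rows = master.execute(
        "SELECT id, payload, updated_at FROM items WHERE updated_at > ?",
        (since,),
    ).fetchall()
    # Upsert the changed rows on the replica; last writer wins.
    slave.executemany(
        "INSERT INTO items (id, payload, updated_at) VALUES (?, ?, ?) "
        "ON CONFLICT(id) DO UPDATE SET payload=excluded.payload, "
        "updated_at=excluded.updated_at",
        rows,
    )
    slave.commit()
    # Advance the watermark for the next replication pass.
    return max((r[2] for r in rows), default=since)
```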
Abstract:
An investigation of the construction data management needs of the Florida Department of Transportation (FDOT) with regard to XML standards, including the development of a data dictionary and data mapping. The review of existing XML schemas indicated the need to develop FDOT-specific XML schemas, which were created for all FDOT construction data management processes. Additionally, data entry, approval and data retrieval applications were developed for payroll compliance reporting and pile quantity payment development.
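Schema validation of incoming submissions is the core of such a workflow. The sketch below uses Python's lxml to check a payroll file against an XSD; the schema and file names are hypothetical placeholders, and lxml itself is an illustrative choice rather than FDOT's actual toolchain.

```python
# Minimal sketch of validating a payroll-compliance submission against
# an XML schema. The schema and file names are hypothetical.
from lxml import etree

schema = etree.XMLSchema(etree.parse("payroll_compliance.xsd"))
doc = etree.parse("weekly_payroll.xml")

if schema.validate(doc):
    print("submission accepted")
else:
    # Report each validation failure with its line number.
    for error in schema.error_log:
        print(f"line {error.line}: {error.message}")
```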
Abstract:
Discovering the means to prevent and cure schizophrenia is a vision that motivates many scientists. But in order to achieve this goal, we need to understand its neurobiological basis. The emergent metadiscipline of cognitive neuroscience fields an impressive array of tools that can be marshaled towards achieving this goal, including powerful new methods of imaging the brain (both structural and functional) as well as assessments of perceptual and cognitive capacities based on psychophysical procedures, experimental tasks and models developed by cognitive science. We believe that the integration of data from this array of tools offers the greatest potential for advancing understanding of the neural basis not only of normal cognition but also of the cognitive impairments that are fundamental to schizophrenia. Since sufficient expertise in the application of these tools and methods rarely resides in a single individual, or even a single laboratory, collaboration is a key element in this endeavor. Here, we review some of the products of our integrative efforts in collaboration with our colleagues on the East Coast of Australia and the Pacific Rim. This research focuses on the neural basis of executive function deficits and impairments in early auditory processing in patients, using various combinations of performance indices (from perceptual and cognitive paradigms), ERPs, fMRI and sMRI. In each case, integrating two or more sources of information provides more than any one source alone by revealing new insights into structure-function relationships. Furthermore, the addition of other imaging methodologies (such as DTI) and approaches (such as computational models of cognition) offers new horizons in human brain imaging research and in understanding human behavior.
Abstract:
The 3D Water Chemistry Atlas is an intuitive, open source, Web-based system that enables three-dimensional (3D) sub-surface visualization of groundwater monitoring data, overlaid on the local geological model (formation and aquifer strata). This paper first describes the results of evaluating existing virtual globe technologies, which led to the decision to use the Cesium open source WebGL virtual globe and map engine as the underlying platform. It then describes the backend database and the search, filtering, browse and analysis tools that were developed to enable users to interactively explore the groundwater monitoring data and interpret it spatially and temporally relative to the local geological formations and aquifers via the Cesium interface. The result is an integrated 3D visualization system that enables environmental managers and regulators to assess groundwater conditions, identify inconsistencies in the data, manage impacts and risks, and make more informed decisions about coal seam gas extraction, waste water extraction and water reuse.
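A backend of this kind typically exposes filtered monitoring records in a form the globe client can render. Below is a minimal sketch of such an endpoint returning GeoJSON point features, with negative heights for sub-surface display; Flask, the route and the record fields are assumptions, not the Atlas's actual API.

```python
# Minimal sketch of a query endpoint serving groundwater observations
# as GeoJSON for a 3D globe client. Route and fields are hypothetical.
from flask import Flask, jsonify, request

app = Flask(__name__)

# Stand-in for the monitoring-bore database.
OBSERVATIONS = [
    {"bore": "B1", "lon": 150.1, "lat": -27.5, "depth_m": 120.0,
     "analyte": "chloride", "value_mg_l": 310.0},
]

@app.route("/api/observations")
def observations():
    analyte = request.args.get("analyte")
    features = [
        {
            "type": "Feature",
            # A negative height places the sample below the terrain
            # surface, which is what a sub-surface 3D view needs.
            "geometry": {"type": "Point",
                         "coordinates": [o["lon"], o["lat"], -o["depth_m"]]},
            "properties": {k: o[k] for k in ("bore", "analyte", "value_mg_l")},
        }
        for o in OBSERVATIONS
        if analyte is None or o["analyte"] == analyte
    ]
    return jsonify({"type": "FeatureCollection", "features": features})
```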
Abstract:
Digital technology offers enormous benefits (economic, quality of design and efficiency in use) if adopted to implement integrated ways of representing the physical world in a digital form. When applied across the full extent of the built and natural world, it is referred to as the Digital Built Environment (DBE) and encompasses a wide range of approaches and technology initiatives, all aimed at the same end goal: the development of a virtual world that sufficiently mirrors the real world to form the basis for the smart cities of the present and future, enable efficient infrastructure design and programmed maintenance, and create a new foundation for economic growth and social well-being through evidence-based analysis. The creation of a National Data Policy for the DBE will facilitate the creation of additional high technology industries in Australia; provide Governments, industries and citizens with greater knowledge of the environments they occupy and plan; and offer citizen-driven innovations for the future. Australia has slipped behind other nations in the adoption and execution of Building Information Modelling (BIM) and the principal concern is that the gap is widening. Data-driven innovation added $67 billion to the Australian economy in 2013. Strong open data policy equates to $16 billion in new value. Australian Government initiatives such as the Digital Earth-inspired “National Map” offer a platform and pathway to embrace the concept of a “BIM Globe”, while also leveraging unprecedented growth in open source / open data collaboration. Australia must address the challenges by learning from international experiences, most notably the UK and NZ, and mandate the use of BIM across Government, extending the Framework for Spatial Data Foundation to include the Built Environment as a theme and engaging collaboration through a “BIM globe” metaphor. This proposed DBE strategy will modernise Australian urban planning and the construction industry. It will change the way we develop our cities by fundamentally altering the dynamics and behaviours of the supply chains and unlocking new and more efficient ways of collaborating at all stages of the project life-cycle. There are currently two major modelling approaches that contribute to the challenge of delivering the DBE. Though these collectively encompass many (often competing) approaches or proprietary software systems, all can be categorised as either a spatial modelling approach, where the focus is generally on representing the elements that make up the world within their geographic context, or a construction modelling approach, where the focus is on models that support the life-cycle management of the built environment. These two approaches have tended to evolve independently, addressing two broad industry sectors: one concerned with understanding and managing global and regional aspects of the world that we inhabit, including disciplines concerned with climate, earth sciences, land ownership, urban and regional planning and infrastructure management; the other concerned with the planning, design, construction and operation of built facilities, including architectural and engineering design, product manufacturing, construction, facility management and related disciplines (a process/technology commonly known as Building Information Modelling, BIM).
The spatial industries have a strong voice in the development of public policy in Australia, while the construction sector, which in 2014 accounted for around 8.5% of Australia's GDP, has no single voice and, because of its diversity, is struggling to adapt to and take advantage of the opportunities presented by these digital technologies. The experience in the UK over the past few years has demonstrated that government leadership is very effective in stimulating industry adoption of digital technologies by, on the one hand, mandating the use of BIM on public procurement projects while at the same time providing comparatively modest funding to address the common issues that confront the industry in adopting that way of working across the supply chain. The reported result has been savings of £840m in construction costs in 2013/14, according to UK Cabinet Office figures. There is worldwide recognition of the value of bringing these two modelling technologies together. Australia has the expertise to exercise leadership in this work, but it requires a commitment by government to recognise the importance of BIM as a companion methodology to the spatial technologies, so that these two disciplinary domains can cooperate in the development of data policies and information exchange standards to smooth out common workflows. buildingSMART Australasia, SIBA and their academic partners have initiated this dialogue in Australia and wish to work collaboratively, with government support and leadership, to explore the opportunities open to us as we develop an Australasian Digital Built Environment. As part of that programme, we must develop and implement a strategy to accelerate the adoption of BIM processes across the Australian construction sector while at the same time developing an integrated approach, in concert with the spatial sector, that will position Australia at the forefront of international best practice in this area. Australia and New Zealand cannot afford to be on the back foot as we face the challenges of rapid urbanisation and change in the global environment. Although we can identify some exemplary initiatives in this area, particularly in New Zealand in response to the need for more resilient urban development in the face of earthquake threats, there is still much that needs to be done. We are well situated in the Asian region to take a lead in this challenge, but we are at imminent risk of losing the initiative if we do not take action now. Strategic collaboration between Governments, Industry and Academia will create new jobs and wealth, with the potential, for example, to save around 20% on the delivery costs of new built assets, based on recent UK estimates.
Abstract:
Self-tracking, the process of recording one's own behaviours, thoughts and feelings, is a popular approach to enhancing self-knowledge. While dedicated self-tracking apps and devices support data collection, previous research highlights that the integration of data constitutes a barrier for users. In this study we investigated how members of the Quantified Self movement, early adopters of self-tracking tools, overcome these barriers. We conducted a qualitative analysis of 51 videos of Quantified Self presentations to explore intentions for collecting data, methods for integrating and representing data, and how intentions and methods shaped reflection. The findings highlight two different intentions, striving for self-improvement and curiosity in personal data, which shaped how much effort these users invested in integrating their data. Furthermore, we identified three methods for representing data (binary, structured and abstract) which influenced reflection. Binary representations supported reflection-in-action, whereas structured and abstract representations supported iterative processes of data collection, integration and reflection. For people tracking out of curiosity, this iterative engagement with personal data often became an end in itself, rather than a means to achieve a goal. We discuss how these findings contribute to our current understanding of self-tracking amongst Quantified Self members and beyond, and we conclude with directions for future work to support self-trackers with their aspirations.
Abstract:
Harmful algal blooms (HABs) are a significant and potentially expanding problem around the world. Resource management and public health protection require sufficient information to reduce the impacts of HABs through response strategies, warnings and advisories. To be effective, these programs are best served by integrating improved detection methods with both evolving monitoring systems and new communications capabilities. Data sets are typically collected from a variety of sources; these fall into several types: point data, such as water samples; transects, such as from shipboard continuous sampling; and synoptic data, such as from satellite imagery. Generating a field of the HAB distribution requires all of these sampling approaches, which means the data sets need to be interpreted and analyzed together to create the HAB field. The HAB field is also a necessary input into models that forecast blooms. Several systems have developed strategies that demonstrate these approaches, ranging from data sets collected at key sites, such as swimming beaches, to automated collection systems, to the integration of interpreted satellite data. Improved data collection, particularly in speed and cost, will be one of the advances of the next few years. Methods to improve creation of the HAB field from the variety of data types will be necessary for routine nowcasting and forecasting of HABs.
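Creating the HAB field from mixed point, transect and synoptic observations amounts to interpolating all of them onto a common grid. The sketch below does this with SciPy's griddata over synthetic data; the interpolation method is an illustrative choice, not one prescribed by the text.

```python
# Minimal sketch of merging point samples and transect measurements
# into a single gridded HAB field by interpolation. Data are synthetic
# and the linear interpolant is an illustrative choice.
import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(0)

# Point data (water samples) and transect data (shipboard continuous
# sampling), each as (lon, lat) positions with a cell-count value.
points = rng.uniform(0, 1, size=(20, 2))
transect = np.column_stack([np.linspace(0, 1, 50), np.full(50, 0.5)])
obs_xy = np.vstack([points, transect])
obs_val = rng.lognormal(mean=8, sigma=1, size=len(obs_xy))

# Interpolate all observations onto a common grid: the "HAB field"
# that nowcasting and forecasting models take as input.
grid_x, grid_y = np.mgrid[0:1:100j, 0:1:100j]
field = griddata(obs_xy, obs_val, (grid_x, grid_y), method="linear")
print("grid cells filled:", np.count_nonzero(~np.isnan(field)))
```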
Abstract:
In the face of dramatic declines in groundfish populations and a lack of sufficient stock assessment information, a need has arisen for new methods of assessing groundfish populations. We describe the integration of seafloor transect data gathered by a manned submersible with high-resolution sonar imagery to produce a habitat-based stock assessment system for groundfish. The data sets used in this study were collected from Heceta Bank, Oregon, and were derived from 42 submersible dives (1988–90) and a multibeam sonar survey (1998). The submersible habitat survey investigated seafloor topography and groundfish abundance along 30-minute transects over six predetermined stations and found a statistical relationship between habitat variability and groundfish distribution and abundance. These transects were analyzed in a geographic information system (GIS) by using dynamic segmentation to display changes in habitat along the transects. We used the submersible data to extrapolate fish abundance within uniform habitat patches over broad areas of the bank by means of a habitat classification based on the sonar imagery. After a navigation correction was applied to the submersible-based habitat segments, they showed a good correspondence with major backscatter and topographic boundaries on the imagery. We extrapolated the extent of uniform habitats in the vicinity of the dive stations and calculated a preliminary stock assessment for several species of demersal fish. Such a habitat-based approach will allow researchers to characterize marine communities over large areas of the seafloor.
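The extrapolation step reduces to scaling transect densities by the mapped area of each habitat class. The sketch below shows that arithmetic with invented example values; the habitat classes, densities and areas are illustrative, not the Heceta Bank results.

```python
# Minimal sketch of habitat-based abundance extrapolation: transect
# densities per habitat class scaled by the mapped area of each class.
# All values are invented examples, not the Heceta Bank estimates.
densities = {            # fish per square km, from submersible transects
    "rock ridge": 5200.0,
    "boulder field": 3100.0,
    "mud flat": 450.0,
}
mapped_area_km2 = {      # patch areas classified from the sonar imagery
    "rock ridge": 14.0,
    "boulder field": 22.0,
    "mud flat": 90.0,
}

# Abundance estimate = sum over habitats of density x mapped area.
estimate = sum(densities[h] * mapped_area_km2[h] for h in densities)
print(f"Extrapolated abundance over the bank: {estimate:,.0f} fish")
```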