926 results for Genomic data integration


Relevance:

80.00%

Publisher:

Abstract:

This thesis concerns the analysis of the socio-economic transformation of communities in Bronze Age southwestern Cyprus. Through the adoption of a dialectical perspective of analysis, individuals and environment are considered part of the same unity: they are cooperating agents in shaping society and culture. The Bronze Age is a period of intense transformation in the organization of local communities, marked by a continuous renegotiation of socio-economic roles and interactions. The archaeological record from this portion of the island allows one to go beyond the investigation of the complex and multifaceted transition from the EBA-MBA agro-pastoral, self-sufficient communities to the LBA centralized, trade-oriented urban centres. Through a shifting of analytical scales, the emerging picture suggests major transformations in the individual-community-territory dialectical relations. A profound change in the material conditions of social life, as well as in the superstructural realm, was entailed in particular by the dissolution of the relation to the earth, owing to the emergence of new forms of land exploitation and ownership and to the shift of the settlement pattern towards previously unoccupied areas. One of the key points of this thesis is the methodological challenge of working with legacy survey data, as I re-analysed a diverse archaeological legacy resulting from more than fifty years of survey projects, rescue and research-oriented excavations, as well as casual discoveries. Source critique and data evaluation are essential requirements of an integrative and cross-disciplinary regional perspective and of the comprehensive processing of heterogeneous archaeological and environmental datasets. Through the estimation of data precision and certainty, I developed an effective yet simple method to critically evaluate existing datasets and to inter-correlate them without losing their original complexity. This method for data integration can be applied to similar datasets from other regions and periods, as it originates from the evaluation of broader methodological and theoretical issues that are not limited to my spatial and temporal focus. As I argue in this thesis, diverse archaeological legacies can be efficiently re-analysed through an integrative, regional methodology. The adoption of a regional scale of analysis can provide an excellent perspective on the complexity of transformations in ancient societies, thus creating a fundamental bridge between local stories and grand landscape narratives.
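The abstract gives no implementation details for the evaluation method; purely as an illustration of the general idea (attaching precision and certainty scores to heterogeneous legacy records and using them as weights when correlating datasets), here is a minimal sketch in which all field names, score scales, and scoring rules are invented:

```python
# Hypothetical sketch: weighting heterogeneous legacy survey records by an
# estimated precision (spatial accuracy) and certainty (reliability of the
# interpretation), so records from different projects can be compared
# without discarding their original attributes.
from dataclasses import dataclass

@dataclass
class SurveyRecord:
    site_id: str
    period: str              # e.g. "EBA", "MBA", "LBA"
    location_error_m: float  # reported or estimated positional error
    source_type: str         # "excavation", "survey", "casual find"

# Assumed certainty values; real ones would come from source critique.
CERTAINTY_BY_SOURCE = {"excavation": 1.0, "survey": 0.7, "casual find": 0.4}

def precision_score(rec: SurveyRecord) -> float:
    """Map positional error to a 0-1 precision score (smaller error, higher score)."""
    return 1.0 / (1.0 + rec.location_error_m / 100.0)

def integration_weight(rec: SurveyRecord) -> float:
    """Combined weight used when inter-correlating records across datasets."""
    return precision_score(rec) * CERTAINTY_BY_SOURCE.get(rec.source_type, 0.5)

records = [
    SurveyRecord("PAL-01", "LBA", 50.0, "excavation"),
    SurveyRecord("SUR-17", "MBA", 500.0, "survey"),
]
for r in records:
    print(r.site_id, round(integration_weight(r), 2))
```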

Relevance:

80.00%

Publisher:

Abstract:

Database replication aims at copying data between databases distributed over a computer network. Data replication is important in several situations, from making backup copies of information, to load balancing, to distributing information across several locations, to integrating heterogeneous systems. Replication also reduces network traffic, since data become available locally, which in turn allows access when the network is unavailable. This dissertation is based on work that consisted of developing a generic application for database replication, to be made available as open source software. The application developed enables data integration between several systems, with a focus on the integration of heterogeneous data, on data fragmentation, and on adaptability to a variety of situations. ABSTRACT: Data replication is a mechanism to synchronize and integrate data between distributed databases over a computer network. Data replication is an important tool in several situations, such as the creation of backup systems, load balancing between various nodes, distribution of information between various locations, and integration of heterogeneous systems. Replication enables a reduction in network traffic, because data remain available locally, even in the event of a temporary network failure. This thesis is based on the work carried out to develop an application for database replication, to be made accessible as open source software. The application that was built allows for data integration between various systems, with particular focus on the integration of heterogeneous data, the fragmentation of data, replication in cascade, data format changes between replicas, and master/slave and multi-master synchronization.
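The abstract stays at the architectural level; as a purely illustrative sketch (not the application described in the thesis), master/slave replication can be reduced to logging committed changes on the source database and replaying them on a replica. The change-log table, helper functions, and SQLite backend below are assumptions made only for this example:

```python
# Minimal master/slave replication sketch using sqlite3 (illustrative only;
# the thesis application is generic and also covers heterogeneous systems,
# fragmentation, cascading replication, and multi-master synchronization).
import sqlite3

master = sqlite3.connect(":memory:")
replica = sqlite3.connect(":memory:")
for db in (master, replica):
    db.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, name TEXT)")

# A simple change log on the master records every write to be propagated.
master.execute("CREATE TABLE change_log (seq INTEGER PRIMARY KEY AUTOINCREMENT, sql TEXT)")

def write_master(sql: str, params=()):
    master.execute(sql, params)
    # Store a fully rendered statement for simplicity (a real tool would
    # ship row images or parameterized statements instead).
    master.execute("INSERT INTO change_log (sql) VALUES (?)",
                   (sql.replace("?", "'{}'").format(*params),))
    master.commit()

def replicate(last_applied: int) -> int:
    """Apply all master changes recorded after `last_applied` to the replica."""
    rows = master.execute(
        "SELECT seq, sql FROM change_log WHERE seq > ? ORDER BY seq", (last_applied,)
    ).fetchall()
    for seq, sql in rows:
        replica.execute(sql)
        last_applied = seq
    replica.commit()
    return last_applied

write_master("INSERT INTO items (id, name) VALUES (?, ?)", (1, "alpha"))
cursor = replicate(0)
print(replica.execute("SELECT * FROM items").fetchall())  # [(1, 'alpha')]
```

A cascading setup, as mentioned in the English abstract, would simply treat a replica as the change source for further downstream replicas.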

Relevance:

80.00%

Publisher:

Abstract:

The aim of the present study was to develop a statistical approach to define the best cut-off for copy number alteration (CNA) calling from genomic data provided by high-throughput experiments, able to predict a specific clinical end-point (early relapse, within 18 months) in the context of Multiple Myeloma (MM). 743 newly diagnosed MM patients with SNP array-derived genomic data and clinical data were included in the study. CNAs were called both by a conventional (classic, CL) and by an outcome-oriented (OO) method, and the Progression Free Survival (PFS) hazard ratios of CNAs called by the two approaches were compared. The OO approach successfully identified patients at higher risk of relapse, and the univariate survival analysis showed stronger prognostic effects for OO-defined high-risk alterations than for those defined by the CL approach, statistically significant for 12 CNAs. Overall, 155/743 patients relapsed within 18 months from the start of therapy. A small number of OO-defined CNAs were significantly recurrent in early-relapsed patients (ER-CNAs): amp1q, amp2p, del2p, del12p, del17p, and del19p. Two groups of patients were identified, carrying or not carrying ≥1 ER-CNA (249 vs. 494, respectively), the first with significantly shorter PFS and overall survival (OS) (PFS HR 2.15, p<0.0001; OS HR 2.37, p<0.0001). The risk of relapse defined by the presence of ≥1 ER-CNA was independent of those conferred both by R-ISS stage 3 (HR=1.51; p=0.01) and by a low-quality (< stable disease) clinical response (HR=2.59, p=0.004). Notably, the type of induction therapy was not informative, suggesting that early relapse is strongly related to the patients' baseline genomic architecture. In conclusion, the OO approach employed allowed CNA-specific dynamic clonality cut-offs to be defined, improving the accuracy of CNA calls in identifying MM patients with the highest probability of early relapse. Being outcome-dependent, the OO approach is dynamic and might be adjusted according to the selected outcome variable of interest.
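The abstract does not spell out the cut-off search itself; purely as an illustration of what an outcome-oriented cut-off could look like, the sketch below scans candidate clonality cut-offs for a single CNA and keeps the one giving the strongest separation in progression-free survival. The `lifelines` package, the synthetic data, and the cut-off grid are all assumptions for the example:

```python
# Illustrative outcome-oriented cut-off search for one CNA (not the exact
# procedure used in the thesis). Requires: numpy, pandas, lifelines.
import numpy as np
import pandas as pd
from lifelines.statistics import logrank_test

# Hypothetical per-patient data: fraction of cells carrying the CNA
# (clonality), PFS time in months, and relapse event indicator.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "clonality": rng.uniform(0, 1, 300),
    "pfs_months": rng.exponential(30, 300),
    "relapsed": rng.integers(0, 2, 300),
})

best_cutoff, best_stat = None, -np.inf
for cutoff in np.arange(0.1, 0.9, 0.05):           # candidate clonality cut-offs
    carrier = df["clonality"] >= cutoff
    if carrier.sum() < 20 or (~carrier).sum() < 20:
        continue                                    # skip very unbalanced splits
    res = logrank_test(df.loc[carrier, "pfs_months"], df.loc[~carrier, "pfs_months"],
                       event_observed_A=df.loc[carrier, "relapsed"],
                       event_observed_B=df.loc[~carrier, "relapsed"])
    if res.test_statistic > best_stat:
        best_cutoff, best_stat = cutoff, res.test_statistic

print(f"outcome-oriented cutoff: {best_cutoff:.2f} (logrank statistic {best_stat:.1f})")
```

In practice such a data-driven search needs validation on independent data to avoid overfitting, which is one reason to benchmark the OO calls against a conventional calling approach.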

Relevance:

50.00%

Publisher:

Abstract:

Background: High-density tiling arrays and new sequencing technologies are generating rapidly increasing volumes of transcriptome and protein-DNA interaction data. Visualization and exploration of these data are critical to understanding the regulatory logic encoded in the genome by which the cell dynamically affects its physiology and interacts with its environment. Results: The Gaggle Genome Browser is a cross-platform desktop program for interactively visualizing high-throughput data in the context of the genome. Important features include dynamic panning and zooming, keyword search, and open interoperability through the Gaggle framework. Users may bookmark locations on the genome with descriptive annotations and share these bookmarks with other users. The program handles large sets of user-generated data using an in-process database and leverages the facilities of SQL and the R environment for importing and manipulating data. A key aspect of the Gaggle Genome Browser is interoperability. By connecting to the Gaggle framework, the genome browser joins a suite of interconnected bioinformatics tools for analysis and visualization with connectivity to major public repositories of sequences, interactions and pathways. To this flexible environment for exploring and combining data, the Gaggle Genome Browser adds the ability to visualize diverse types of data in relation to their coordinates on the genome. Conclusions: Genomic coordinates function as a common key by which disparate biological data types can be related to one another. In the Gaggle Genome Browser, heterogeneous data are joined by their location on the genome to create information-rich visualizations yielding insight into genome organization, transcription and its regulation and, ultimately, a better understanding of the mechanisms that enable the cell to dynamically respond to its environment.
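To make the conclusion concrete (genomic coordinates as the common key joining heterogeneous data), here is a hedged sketch of an interval-overlap join between two hypothetical tracks using pandas; it is not the browser's own code, which relies on an in-process SQL database:

```python
# Illustrative join of heterogeneous tracks by genome coordinates.
import pandas as pd

# Hypothetical tracks: tiling-array transcription signal and ChIP peaks.
transcripts = pd.DataFrame({
    "chrom": ["chr1", "chr1", "chr2"],
    "start": [100, 900, 400],
    "end":   [600, 1500, 800],
    "signal": [2.3, 5.1, 1.2],
})
peaks = pd.DataFrame({
    "chrom": ["chr1", "chr2"],
    "start": [550, 4000],
    "end":   [700, 4200],
    "tf": ["TFB", "TFB"],
})

# Overlap join: same chromosome and intersecting intervals.
merged = transcripts.merge(peaks, on="chrom", suffixes=("_tx", "_peak"))
overlap = merged[(merged["start_tx"] < merged["end_peak"]) &
                 (merged["start_peak"] < merged["end_tx"])]
print(overlap[["chrom", "start_tx", "end_tx", "signal", "tf"]])
```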

Relevance:

50.00%

Publisher:

Abstract:

The integration of the Human Immunodeficiency Virus (HIV) genetic information into the host genome is fundamental for its replication and long-term persistence in the host. Isolating and characterizing the integration sites can be useful for obtaining data such as identifying the specific genomic location of integration or understanding the forces dictating HIV integration site selection. The methods outlined in this article describe a highly efficient and precise technique for identifying HIV integration sites in the host genome on a small scale using molecular cloning techniques and standard sequencing or on a massive scale using 454 pyrosequencing.
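The protocol itself is experimental, but its downstream computational step (recovering the host sequence adjacent to the provirus so it can be mapped to the reference genome) can be sketched; the LTR end string, read sequences, and length threshold below are placeholders, not the actual sequences used in the article:

```python
# Illustrative post-sequencing step: extract the host-genome portion of each
# read downstream of the LTR end; these fragments would then be aligned to
# the reference genome to obtain integration coordinates.
LTR_END = "AGTCAGTGTGGAAAATCTCTAGCA"   # placeholder, not the real HIV 3' LTR end

def host_junction_fragment(read: str, min_host_len: int = 20):
    """Return the host sequence following the LTR end, or None if absent/too short."""
    pos = read.find(LTR_END)
    if pos == -1:
        return None
    host = read[pos + len(LTR_END):]
    return host if len(host) >= min_host_len else None

reads = [
    "TTGC" + LTR_END + "ACCTGAGGATTTCACATGGCTTAAGGTC",  # informative junction read
    "GGGGACCTGAGGATTTCACAT",                            # no LTR junction present
]
fragments = [f for f in (host_junction_fragment(r) for r in reads) if f]
print(fragments)  # host fragments to be mapped (e.g. with BLAT/BLAST) to the genome
```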

Relevance:

50.00%

Publisher:

Abstract:

Estrogen is a ligand for the estrogen receptor (ER), which, on binding 17beta-estradiol, functions as a ligand-activated transcription factor and regulates the transcription of target genes. This is the slow genomic mode of action. However, rapid non-genomic actions of estrogen also exist at the cell membrane. Using a novel two-pulse paradigm, in which the first pulse rapidly initiates non-genomic actions using a membrane-limited estrogen conjugate (E-BSA) while the second pulse promotes genomic transcription from a consensus estrogen response element (ERE), we have demonstrated that rapid actions of estrogen potentiate the slower transcriptional response from an ERE-reporter in neuroblastoma cells. Since rapid actions of estrogen activate kinases, we used selective inhibitors in the two-pulse paradigm to determine the intracellular signaling cascades important in such potentiation. Inhibition of protein kinase A (PKA), PKC, mitogen-activated protein kinase (MAPK) or phosphatidylinositol 3-OH kinase (PI-3K) in the first pulse decreases potentiation of transcription. In addition, our data with both dominant negative and constitutive mutants of Galpha subunits show that Galpha(q) initiates the rapid signaling cascade at the membrane in SK-N-BE(2)C neuroblastoma cells. We discuss two models of multiple kinase activation at the membrane. Pulses of estrogen induce lordosis behavior in female rats. Infusion of E-BSA into the ventromedial hypothalamus followed by 17beta-estradiol in the second pulse could induce lordosis behavior, demonstrating the applicability of this paradigm in vivo. A model in which non-genomic actions of estrogen couple to genomic actions unites both aspects of hormone action.

Relevance:

40.00%

Publisher:

Abstract:

Dissertation submitted in partial fulfillment of the requirements for the Degree of Master of Science in Geospatial Technologies.

Relevance:

40.00%

Publisher:

Abstract:

Magdeburg, University, Faculty of Computer Science, Dissertation, 2014

Relevance:

40.00%

Publisher:

Abstract:

An essential step of the retroviral life cycle is the stable insertion of a copy of the viral DNA genome into the host cell genome, and lentiviruses are no exception. This integration step, catalyzed by the virus-encoded integrase, ensures long-term expression of the viral genes, thus allowing productive viral replication and also rendering retroviral vectors attractive for the field of gene therapy. At the same time, this ability to integrate into the host genome raises safety concerns regarding the use of retroviral-based gene therapy vectors, due to the genomic locations of integration sites. The availability of the human genome sequence made possible the analysis of integration site preferences, which turned out to be nonrandom and retrovirus-specific: all lentiviruses studied so far favor integration in active transcription units, while other retroviruses have a different integration site distribution. Several mechanisms have been proposed that may influence integration targeting, including (i) chromatin accessibility, (ii) cell cycle effects, and (iii) tethering proteins. Recent data provide evidence that integration site selection can occur via a tethering mechanism, through the recruitment of the lentiviral integrase by the cellular LEDGF/p75 protein, both proteins being the two major players in lentiviral integration targeting.

Relevance:

40.00%

Publisher:

Abstract:

It has been demonstrated in earlier studies that patients with a cochlear implant have increased abilities for audio-visual integration, because the crude information transmitted by the cochlear implant requires the persistent use of complementary speech information from the visual channel. The brain network underlying these abilities needs to be clarified. We used an independent component analysis (ICA) of H2(15)O activation positron emission tomography data to explore occipito-temporal brain activity in post-lingually deaf patients with unilaterally implanted cochlear implants, at several months post-implantation (T1) and shortly after implantation (T0), and in normal-hearing controls. In the between-group analysis, patients at T1 had greater blood flow in the left middle temporal cortex compared with T0 and with normal-hearing controls. In the within-group analysis, patients at T0 had a task-related ICA component in the visual cortex, while patients at T1 had one task-related ICA component in the left middle temporal cortex and another in the visual cortex. The time courses of the temporal and visual activities during the positron emission tomography examination at T1 were highly correlated, meaning that synchronized integrative activity occurred. The greater involvement of the visual cortex and its close coupling with the temporal cortex at T1 confirm the importance of audio-visual integration in more experienced cochlear implant subjects at the cortical level.
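As a toy illustration of the analytical idea only (decomposing imaging data into independent components and then correlating component time courses), not the authors' PET pipeline; the data dimensions, synthetic sources, and task regressor are invented:

```python
# Toy sketch: ICA decomposition of an imaging data matrix followed by
# correlation of the recovered component time courses with a task regressor.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(1)
n_scans, n_voxels = 12, 5000                 # hypothetical scans x voxels matrix
task = np.sin(np.linspace(0, 3 * np.pi, n_scans))        # task-related time course
maps = rng.normal(size=(2, n_voxels))                      # two spatial sources
X = np.outer(task, maps[0]) + np.outer(rng.normal(size=n_scans), maps[1])
X += 0.1 * rng.normal(size=X.shape)                        # measurement noise

ica = FastICA(n_components=2, random_state=0)
spatial_components = ica.fit_transform(X.T)  # spatial ICA: components over voxels
time_courses = ica.mixing_                   # (n_scans, n_components) time courses

# Correlate each component time course with the task regressor.
print(np.corrcoef(time_courses.T, task)[-1, :-1])
```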

Relevance:

40.00%

Publisher:

Abstract:

The increasing volume of data describing human disease processes and the growing complexity of understanding, managing, and sharing such data present a huge challenge for clinicians and medical researchers. This paper presents the @neurIST system, which provides an infrastructure for biomedical research while aiding clinical care, by bringing together heterogeneous data and complex processing and computing services. Although @neurIST targets the investigation and treatment of cerebral aneurysms, the system's architecture is generic enough that it could be adapted to the treatment of other diseases. Innovations in @neurIST include confining the patient data pertaining to aneurysms inside a single environment that offers clinicians the tools to analyze and interpret patient data and make use of knowledge-based guidance in planning their treatment. Medical researchers gain access to a critical mass of aneurysm-related data due to the system's ability to federate distributed information sources. A semantically mediated grid infrastructure ensures that both clinicians and researchers are able to seamlessly access and work on data that is distributed across multiple sites in a secure way, in addition to providing computing resources on demand for performing computationally intensive simulations for treatment planning and research.
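The paper describes the architecture at a high level; as a schematic sketch only, federation can be pictured as fanning a query out to per-site adapters that map local schemas onto a shared vocabulary before the results are merged. All class names, fields, and unit conversions below are invented for the example and are not the @neurIST interfaces:

```python
# Schematic sketch of federating distributed, heterogeneous patient data
# sources behind a shared vocabulary (invented interfaces).
from typing import Protocol

class SiteAdapter(Protocol):
    def query(self, criteria: dict) -> list[dict]: ...

class HospitalA:
    def query(self, criteria: dict) -> list[dict]:
        # Local schema uses 'aneurysm_size_mm'; map it to the shared vocabulary.
        rows = [{"patient": "A-001", "aneurysm_size_mm": 7.2}]
        return [{"patient_id": r["patient"], "size_mm": r["aneurysm_size_mm"]} for r in rows]

class HospitalB:
    def query(self, criteria: dict) -> list[dict]:
        # Local schema stores sizes in centimetres; convert on the way out.
        rows = [{"pid": "B-042", "size_cm": 0.55}]
        return [{"patient_id": r["pid"], "size_mm": r["size_cm"] * 10} for r in rows]

def federated_query(sites: list[SiteAdapter], criteria: dict) -> list[dict]:
    """Fan the query out to every site and merge the semantically aligned records."""
    results: list[dict] = []
    for site in sites:
        results.extend(site.query(criteria))
    return results

print(federated_query([HospitalA(), HospitalB()], {"diagnosis": "cerebral aneurysm"}))
```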

Relevance:

40.00%

Publisher:

Abstract:

Accurate characterization of the spatial distribution of hydrological properties in heterogeneous aquifers at a range of scales is a key prerequisite for reliable modeling of subsurface contaminant transport, and is essential for designing effective and cost-efficient groundwater management and remediation strategies. To this end, high-resolution geophysical methods have shown significant potential to bridge a critical gap in subsurface resolution and coverage between traditional hydrological measurement techniques such as borehole log/core analyses and tracer or pumping tests. An important and still largely unresolved issue, however, is how to best quantitatively integrate geophysical data into a characterization study in order to estimate the spatial distribution of one or more pertinent hydrological parameters, thus improving hydrological predictions. Recognizing the importance of this issue, the aim of the research presented in this thesis was to first develop a strategy for the assimilation of several types of hydrogeophysical data having varying degrees of resolution, subsurface coverage, and sensitivity to the hydrologic parameter of interest. In this regard, a novel simulated annealing (SA)-based conditional simulation approach was developed and then tested in its ability to generate realizations of porosity given crosshole ground-penetrating radar (GPR) and neutron porosity log data. This was done successfully for both synthetic and field data sets. A subsequent issue that needed to be addressed involved assessing the potential benefits and implications of the resulting porosity realizations in terms of groundwater flow and contaminant transport. This was investigated synthetically, assuming first that the relationship between porosity and hydraulic conductivity was well-defined. Then, the relationship was itself investigated in the context of a calibration procedure using hypothetical tracer test data. Essentially, the relationship best predicting the observed tracer test measurements was determined given the geophysically derived porosity structure. Both of these investigations showed that the SA-based approach, in general, allows much more reliable hydrological predictions than other more elementary techniques considered. Further, the developed calibration procedure was seen to be very effective, even at the scale of tomographic resolution, for predictions of transport. This also held true at locations within the aquifer where only geophysical data were available. This is significant because the acquisition of hydrological tracer test measurements is clearly more complicated and expensive than the acquisition of geophysical measurements. Although the above methodologies were tested using porosity logs and GPR data, the findings are expected to remain valid for a large number of pertinent combinations of geophysical and borehole log data of comparable resolution and sensitivity to the hydrological target parameter. Moreover, the obtained results allow us to have confidence for future developments in integration methodologies for geophysical and hydrological data to improve the 3-D estimation of hydrological properties.
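The thesis method is not reproduced here; as a minimal sketch of the general idea behind simulated-annealing conditional simulation (perturb the field, accept perturbations that reduce a mismatch objective with a temperature-controlled tolerance, and never touch the conditioning data), using a toy 1-D porosity profile and an invented objective in place of the variogram- and geophysics-based misfit a real application would use:

```python
# Toy simulated-annealing conditional simulation (illustrative only; the
# thesis conditions porosity realizations on crosshole GPR and neutron logs).
import numpy as np

rng = np.random.default_rng(2)
n = 200
target_mean, target_std = 0.25, 0.05          # toy target statistics
conditioning = {10: 0.30, 120: 0.20}          # "borehole" cells that must be honored

field = rng.normal(target_mean, target_std, n)
for idx, val in conditioning.items():
    field[idx] = val

def objective(f: np.ndarray) -> float:
    # Toy mismatch: deviation of mean/std from targets (a real objective would
    # match a variogram and the geophysical data).
    return (f.mean() - target_mean) ** 2 + (f.std() - target_std) ** 2

temperature = 1e-3
current = objective(field)
free_cells = [i for i in range(n) if i not in conditioning]
for step in range(20000):
    i = rng.choice(free_cells)                # never perturb conditioning cells
    old = field[i]
    field[i] = rng.normal(target_mean, target_std)
    new = objective(field)
    if new < current or rng.random() < np.exp((current - new) / temperature):
        current = new                         # accept the perturbation
    else:
        field[i] = old                        # reject and restore
    temperature *= 0.9997                     # cooling schedule

print(f"final objective: {current:.2e}; conditioning honored: "
      f"{all(abs(field[i] - v) < 1e-12 for i, v in conditioning.items())}")
```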