975 results for literature-data integration


Relevance: 30.00%

Abstract:

Platelets in the circulation are triggered by vascular damage to activate, aggregate and form a thrombus that prevents excessive blood loss. Platelet activation is stringently regulated by intracellular signalling cascades, which, when activated inappropriately, lead to myocardial infarction and stroke. Strategies to address platelet dysfunction have included proteomics approaches, which have led to the discovery of a number of novel regulatory proteins of potential therapeutic value. Global analysis of platelet proteomes may enhance the outcome of these studies by arranging this information in a contextual manner that recapitulates established signalling complexes and predicts novel regulatory processes. Platelet signalling networks have already begun to be exploited with interrogation of protein datasets using in silico methodologies that locate functionally feasible protein clusters for subsequent biochemical validation. Characterization of these biological systems through analysis of the spatial and temporal organization of component proteins is developing alongside advances in the proteomics field. This focused review highlights advances in platelet proteomics data mining approaches that complement the emerging systems biology field. We also highlight nucleated cell types as key examples that can inform platelet research. Therapeutic translation of these modern approaches to understanding platelet regulatory mechanisms will enable the development of novel anti-thrombotic strategies.
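As a hedged illustration of the in silico interrogation described here, the sketch below clusters a toy protein-interaction network with modularity-based community detection; the interaction list and the choice of algorithm are assumptions for illustration, not data or methods from the review.

```python
# Minimal sketch of modularity-based clustering of a protein-interaction
# network. The interaction list is illustrative, not data from the review.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Hypothetical platelet protein-protein interactions
interactions = [
    ("GPVI", "FcRgamma"), ("FcRgamma", "Syk"), ("Syk", "LAT"),
    ("LAT", "PLCg2"), ("PLCg2", "PKC"),
    ("ITGA2B", "ITGB3"), ("ITGB3", "talin"), ("talin", "kindlin"),
]
G = nx.Graph(interactions)

# Community detection locates densely connected protein clusters,
# i.e. candidates for subsequent biochemical validation.
for i, cluster in enumerate(greedy_modularity_communities(G)):
    print(f"cluster {i}: {sorted(cluster)}")
```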

Relevance: 30.00%

Abstract:

Glycogen synthase kinase 3 (GSK3, of which there are two isoforms, GSK3α and GSK3β) was originally characterized in the context of regulation of glycogen metabolism, though it is now known to regulate many other cellular processes. Phosphorylation of GSK3α(Ser21) and GSK3β(Ser9) inhibits their activity. In the heart, emphasis has been placed particularly on GSK3β rather than GSK3α. Importantly, catalytically active GSK3 generally restrains gene expression and, in the heart, has been implicated in anti-hypertrophic signalling. Inhibition of GSK3 results in changes in the activities of transcription and translation factors in the heart and promotes hypertrophic responses, and it is generally assumed that signal transduction from hypertrophic stimuli to GSK3 passes primarily through protein kinase B/Akt (PKB/Akt). However, recent data suggest that the situation is far more complex. We review evidence pertaining to the role of GSK3 in the myocardium and discuss effects of genetic manipulation of GSK3 activity in vivo. We also discuss the signalling pathways potentially regulating GSK3 activity and propose that, depending on the stimulus, phosphorylation of GSK3 is independent of PKB/Akt. Potential GSK3 substrates studied in relation to myocardial hypertrophy include nuclear factors of activated T cells, β-catenin, GATA4, myocardin, CREB, and eukaryotic initiation factor 2Bε. These and other transcription factor substrates putatively important in the heart are considered. We discuss whether cardiac pathologies could be treated by therapeutic intervention at the GSK3 level but conclude that any intervention would be premature without greater understanding of the precise role of GSK3 in cardiac processes.

Relevance: 30.00%

Abstract:

Progress in functional neuroimaging of the brain increasingly relies on the integration of data from complementary imaging modalities in order to improve spatiotemporal resolution and interpretability. However, the usefulness of merely statistical combinations is limited, since neural signal sources differ between modalities and are related non-trivially. We demonstrate here that a mean field model of brain activity can simultaneously predict EEG and fMRI BOLD with proper signal generation and expression. Simulations are shown using a realistic head model based on structural MRI, which includes both dense short-range background connectivity and long-range specific connectivity between brain regions. The distribution of modeled neural masses is comparable to the spatial resolution of fMRI BOLD, and the temporal resolution of the modeled dynamics, importantly including activity conduction, matches the fastest known EEG phenomena. The creation of a cortical mean field model with anatomically sound geometry, extensive connectivity, and proper signal expression is an important first step towards the model-based integration of multimodal neuroimages.
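A toy sketch of the central idea, one neural source model expressing both measurement modalities, might look as follows; the oscillator, noise drive and gamma-shaped hemodynamic kernel are illustrative assumptions, not the model used in the paper.

```python
# Toy sketch (not the authors' model): a single noise-driven neural mass
# whose activity is read out directly as an EEG-like signal and, after
# convolution with a gamma-shaped hemodynamic response, as a BOLD-like
# signal. All parameter values are illustrative assumptions.
import numpy as np

dt, T = 1e-3, 20.0                       # 1 ms step, 20 s of activity
t = np.arange(0.0, T, dt)
rng = np.random.default_rng(0)

# Damped oscillator around 10 Hz (alpha-band-like neural mass dynamics)
omega, damping = 2 * np.pi * 10.0, 5.0
x = v = 0.0
activity = np.empty_like(t)
for i in range(t.size):
    a = -2 * damping * v - omega**2 * x + 5e3 * rng.standard_normal()
    v += a * dt
    x += v * dt
    activity[i] = x

eeg = activity                           # fast, direct readout

# Slow hemodynamic filtering of the same underlying source activity
hrf = (t / 1.0) ** 3 * np.exp(-t / 1.0)  # simple gamma kernel, ~seconds scale
bold = np.convolve(np.abs(activity), hrf)[: t.size] * dt
```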

Relevance: 30.00%

Abstract:

The currently available model-based global data sets of atmospheric circulation are a by-product of the daily requirement of producing initial conditions for numerical weather prediction (NWP) models. These data sets have been quite useful for studying fundamental dynamical and physical processes, and for describing the nature of the general circulation of the atmosphere. However, due to limitations in the early data assimilation systems and inconsistencies caused by numerous model changes, the available model-based global data sets may not be suitable for studying global climate change. A comprehensive analysis of global observations based on a four-dimensional data assimilation system with a realistic physical model should be undertaken to integrate space and in situ observations to produce internally consistent, homogeneous, multivariate data sets for the earth's climate system. The concept is equally applicable for producing data sets for the atmosphere, the oceans, and the biosphere, and such data sets will be quite useful for studying global climate change.
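The analysis step at the core of such a data assimilation system can be sketched with a minimal optimal-interpolation update; all states, operators and covariances below are illustrative assumptions.

```python
# Minimal sketch of the statistical analysis step at the heart of data
# assimilation: a background model state is corrected toward observations,
# weighted by error covariances. All matrices are small illustrative
# assumptions, not taken from any operational system.
import numpy as np

x_b = np.array([280.0, 285.0, 290.0])        # background state (e.g. T in K)
B = np.diag([4.0, 4.0, 4.0])                 # background error covariance
H = np.array([[1.0, 0.0, 0.0],               # observation operator: only
              [0.0, 0.0, 1.0]])              # elements 0 and 2 are observed
y = np.array([282.0, 288.0])                 # observations
R = np.diag([1.0, 1.0])                      # observation error covariance

# Kalman gain K = B H^T (H B H^T + R)^-1, analysis x_a = x_b + K (y - H x_b)
K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
x_a = x_b + K @ (y - H @ x_b)
print(x_a)                                   # analysis drawn toward the obs
```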

Relevance: 30.00%

Abstract:

Brain activity can be measured non-invasively with functional imaging techniques. Each pixel in such an image represents a neural mass of about 10^5 to 10^7 neurons. Mean field models (MFMs) approximate their activity by averaging out neural variability while retaining salient underlying features, like neurotransmitter kinetics. However, MFMs incorporating the regional variability, realistic geometry and connectivity of cortex have so far appeared intractable. This lack of biological realism has led to a focus on gross temporal features of the EEG. We address these impediments and showcase a "proof of principle" forward prediction of co-registered EEG/fMRI for a full-size human cortex in a realistic head model with anatomical connectivity (see figure 1). MFMs usually assume homogeneous neural masses, isotropic long-range connectivity and simplistic signal expression to allow rapid computation with partial differential equations. But these approximations are insufficient, in particular for the high spatial resolution obtained with fMRI, since different cortical areas vary in their architectonic and dynamical properties, have complex connectivity, and can contribute non-trivially to the measured signal. Our code instead supports local variation of model parameters and freely chosen connectivity for many thousands of triangulation nodes spanning a cortical surface extracted from structural MRI. This allows the introduction of realistic anatomical and physiological parameters for cortical areas and their connectivity, including both intra- and inter-area connections. Proper cortical folding and conduction through a realistic head model are then added to obtain accurate signal expression for comparison to experimental data. To showcase the synergy of these computational developments, we predict EEG and fMRI BOLD responses simultaneously by adding an established model for neurovascular coupling and convolving "Balloon-Windkessel" hemodynamics. We also incorporate regional connectivity extracted from the CoCoMac database [1]. Importantly, these extensions can be easily adapted according to future insights and data. Furthermore, while our own simulation is based on one specific MFM [2], the computational framework is general and can be applied to models favored by the user. Finally, we provide a brief outlook on improving the integration of multi-modal imaging data through iterative fits of a single underlying MFM in this realistic simulation framework.
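The neurovascular coupling step mentioned above can be sketched with the standard Balloon-Windkessel equations; the parameter values below are commonly quoted defaults, used here as assumptions rather than the paper's settings.

```python
# Sketch of the standard Balloon-Windkessel hemodynamic model mapping
# neural activity z(t) to a BOLD signal. Parameters are commonly quoted
# defaults, used here as assumptions.
import numpy as np

def balloon_windkessel(z, dt=1e-2, kappa=0.65, gamma=0.41,
                       tau=0.98, alpha=0.32, rho=0.34, V0=0.02):
    s, f, v, q = 0.0, 1.0, 1.0, 1.0
    bold = np.empty_like(z)
    for i, zi in enumerate(z):
        s += dt * (zi - kappa * s - gamma * (f - 1.0))   # vasodilatory signal
        f += dt * s                                      # blood inflow
        v += dt * (f - v ** (1.0 / alpha)) / tau         # blood volume
        E = 1.0 - (1.0 - rho) ** (1.0 / f)               # O2 extraction fraction
        q += dt * (f * E / rho - v ** (1.0 / alpha) * q / v) / tau
        k1, k2, k3 = 7.0 * rho, 2.0, 2.0 * rho - 0.2
        bold[i] = V0 * (k1 * (1 - q) + k2 * (1 - q / v) + k3 * (1 - v))
    return bold

# Example: a brief burst of activity yields a delayed BOLD response
t = np.arange(0, 30, 1e-2)
z = ((t > 1) & (t < 3)).astype(float)
print(balloon_windkessel(z).max())
```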

Relevance: 30.00%

Abstract:

Background: Expression microarrays are increasingly used to obtain large-scale transcriptomic information on a wide range of biological samples. Nevertheless, there is still much debate on the best ways to process data, to design experiments and to analyse the output. Furthermore, many of the more sophisticated mathematical approaches to data analysis in the literature remain inaccessible to much of the biological research community. In this study we examine ways of extracting and analysing a large data set obtained using the Agilent long oligonucleotide transcriptomics platform, applied to a set of human macrophage and dendritic cell samples. Results: We describe and validate a series of data extraction, transformation and normalisation steps which are implemented via a new R function. Analysis of replicate normalised reference data demonstrates that intra-array variability is small (only around 2% of the mean log signal), while inter-array variability from replicate array measurements has a standard deviation (SD) of around 0.5 log2 units (6% of mean). The common practice of working with ratios of Cy5/Cy3 signal offers little further improvement in terms of reducing error. Comparison to expression data obtained using Arabidopsis samples demonstrates that the large number of genes in each sample showing a low level of transcription reflects the real complexity of the cellular transcriptome. Multidimensional scaling is used to show that the processed data identify an underlying structure which reflects some of the key biological variables defining the data set. This structure is robust, allowing reliable comparison of samples collected over a number of years by a variety of operators. Conclusions: This study outlines a robust and easily implemented pipeline for extracting, transforming, normalising and visualising transcriptomic array data from the Agilent expression platform. The analysis is used to obtain quantitative estimates of the SD arising from experimental (non-biological) intra- and inter-array variability, and a lower threshold for determining whether an individual gene is expressed. The study provides a reliable basis for further, more extensive studies of the systems biology of eukaryotic cells.
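A minimal Python sketch of the pipeline's overall shape (the study's own pipeline is an R function, which is not reproduced here):

```python
# Minimal sketch: log2-transform raw intensities, median-centre each
# array, then use multidimensional scaling to look for structure across
# samples. The random data are illustrative stand-ins for Agilent
# intensities, not the study's data.
import numpy as np
from sklearn.manifold import MDS

rng = np.random.default_rng(1)
raw = rng.lognormal(mean=6.0, sigma=1.0, size=(8, 2000))  # 8 arrays x 2000 genes

logged = np.log2(raw)
normalised = logged - np.median(logged, axis=1, keepdims=True)  # per-array centring

# Spread of replicate measurements across arrays (cf. ~0.5 log2 units SD)
print("inter-array SD:", normalised.std(axis=0).mean())

# MDS on the arrays reveals any underlying sample structure
coords = MDS(n_components=2, random_state=0).fit_transform(normalised)
print(coords)
```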

Relevance: 30.00%

Abstract:

Aerosol indirect effects continue to constitute one of the most important uncertainties for anthropogenic climate perturbations. Within the international AEROCOM initiative, the representation of aerosol-cloud-radiation interactions in ten different general circulation models (GCMs) is evaluated using three satellite datasets. The focus is on stratiform liquid water clouds, since most GCMs do not include ice nucleation effects, and none of the models explicitly parameterises aerosol effects on convective clouds. We compute statistical relationships between aerosol optical depth (τa) and various cloud and radiation quantities in a manner that is consistent between the models and the satellite data. It is found that the model-simulated influence of aerosols on cloud droplet number concentration (Nd) compares relatively well to the satellite data, at least over the ocean. The relationship between τa and liquid water path is simulated much too strongly by the models. This suggests that the implementation of the second aerosol indirect effect, mainly in terms of an autoconversion parameterisation, has to be revisited in the GCMs. A positive relationship between total cloud fraction (fcld) and τa as found in the satellite data is simulated by the majority of the models, albeit less strongly than in the satellite data in most of them. In a discussion of the hypotheses proposed in the literature to explain the satellite-derived strong fcld–τa relationship, our results indicate that none can be identified as a unique explanation. Relationships similar to the ones found in satellite data between τa and cloud top temperature or outgoing long-wave radiation (OLR) are simulated by only a few GCMs. The GCMs that simulate a negative OLR–τa relationship show a strong positive correlation between τa and fcld. The short-wave total aerosol radiative forcing as simulated by the GCMs is strongly influenced by the simulated anthropogenic fraction of τa, and by parameterisation assumptions such as a lower bound on Nd. Nevertheless, the strengths of the statistical relationships are good predictors for the aerosol forcings in the models. An estimate of the total short-wave aerosol forcing, inferred from the combination of these predictors for the modelled forcings with the satellite-derived statistical relationships, yields a global annual mean value of −1.5 ± 0.5 W m−2. In an alternative approach, the radiative flux perturbation due to anthropogenic aerosols can be broken down into a component over the cloud-free portion of the globe (approximately the aerosol direct effect) and a component over the cloudy portion of the globe (approximately the aerosol indirect effect). An estimate obtained by scaling these simulated clear- and cloudy-sky forcings with estimates of anthropogenic τa and satellite-retrieved Nd–τa regression slopes, respectively, yields a global annual mean aerosol direct effect estimate of −0.4 ± 0.2 W m−2 and a cloudy-sky (aerosol indirect effect) estimate of −0.7 ± 0.5 W m−2, with a total estimate of −1.2 ± 0.4 W m−2.
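One of the statistical relationships above, the slope of ln(Nd) against ln(τa), can be sketched as follows; the synthetic data and the 0.4 exponent are illustrative assumptions, not results from the study.

```python
# Sketch of a standard first-indirect-effect metric: the log-log
# regression slope of droplet number Nd against aerosol optical depth
# tau_a. Synthetic data stand in for gridded model or satellite fields.
import numpy as np

rng = np.random.default_rng(2)
tau_a = rng.lognormal(mean=-2.0, sigma=0.5, size=5000)        # aerosol optical depth
n_d = 80.0 * tau_a**0.4 * rng.lognormal(0.0, 0.3, size=5000)  # droplets per cm^3

slope, _ = np.polyfit(np.log(tau_a), np.log(n_d), 1)
print(f"d ln(Nd) / d ln(tau_a) = {slope:.2f}")                # recovers ~0.4
```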

Relevance: 30.00%

Abstract:

The strategic integration of the human resource (HR) function is regarded as crucial in the literature on (strategic) human resource management ((S)HRM). Evidence on the contextual or structural influences on this integration is, however, limited. The structural implications of unionism are particularly intriguing given the evolution of study of the employment relationship. Pluralism is typically seen as antithetical to SHRM, and unions as an impediment to the strategic integration of HR functions, but there are also suggestions in the literature that unionism might facilitate this integration. This paper deploys large-scale international survey evidence to examine the organization-level influence of unionism on the strategic integration of HR, allowing for other established and plausible influences. The analysis reveals that, exceptionally, where the organization-level role of unions is particularly contested, unionism does impede the strategic integration of HR. Most remarkable, however, is that unionism predominantly facilitates this integration.

Relevance: 30.00%

Abstract:

Smart healthcare is a complex domain for systems integration, owing to the human and technical factors and heterogeneous data sources involved. As part of a smart city, it is an area in which clinical functions require smart multi-system collaboration for effective communication among departments, and radiology is one of the areas that relies most heavily on intelligent information integration and communication. It therefore faces many integration and interoperability challenges, such as information collision, heterogeneous data sources, policy obstacles, and procedure mismanagement. The purpose of this study is to analyse the data, semantic, and pragmatic interoperability of systems integration in a radiology department, and to develop a pragmatic interoperability framework for guiding the integration. We selected an ongoing project at a local hospital for our case study. The project aims to achieve data sharing and interoperability among the Radiology Information System (RIS), Electronic Patient Record (EPR), and Picture Archiving and Communication System (PACS). Qualitative data collection and analysis methods were used. The data sources consisted of documentation, including publications and internal working papers, one year of non-participant observation, and 37 interviews with radiologists, clinicians, directors of IT services, referring clinicians, radiographers, receptionists and a secretary. We identified four primary phases of the data analysis process for the case study: requirements and barriers identification, integration approach, interoperability measurements, and knowledge foundations. Each phase is discussed and supported by qualitative data. Through the analysis we also develop a pragmatic interoperability framework that summarises the empirical findings and proposes recommendations for guiding integration in the radiology context.

Relevance: 30.00%

Abstract:

With the prospect of exascale computing, computational methods requiring only local data become especially attractive. Consequently, the typical domain decomposition of atmospheric models means horizontally-explicit vertically-implicit (HEVI) time-stepping schemes warrant further attention. In this analysis, Runge-Kutta implicit-explicit schemes from the literature are analysed for their stability and accuracy using a von Neumann stability analysis of two linear systems. Attention is paid to the numerical phase to indicate the behaviour of phase and group velocities. Where the analysis is tractable, analytically derived expressions are considered. For more complicated cases, amplification factors have been numerically generated and the associated amplitudes and phase diagnosed. Analysis of a system describing acoustic waves has necessitated attributing the three resultant eigenvalues to the three physical modes of the system. To do so, a series of algorithms has been devised to track the eigenvalues across the frequency space. The result enables analysis of whether the schemes exactly preserve the non-divergent mode; and whether there is evidence of spurious reversal in the direction of group velocities or asymmetry in the damping for the pair of acoustic modes. Frequency ranges that span next-generation high-resolution weather models to coarse-resolution climate models are considered; and a comparison is made of errors accumulated from multiple stability-constrained shorter time-steps from the HEVI scheme with a single integration from a fully implicit scheme over the same time interval. Two schemes, “Trap2(2,3,2)” and “UJ3(1,3,2)”, both already used in atmospheric models, are identified as offering consistently good stability and representation of phase across all the analyses. Furthermore, according to a simple measure of computational cost, “Trap2(2,3,2)” is the least expensive.
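The flavour of the analysis can be sketched on a split oscillation equation; for brevity the sketch below uses first-order IMEX Euler rather than the paper's multi-stage schemes, so the scheme itself is an assumption, but the amplitude and phase diagnostics are computed the same way.

```python
# Minimal von Neumann-style sketch for du/dt = i*(wE + wI)*u, with the wE
# term treated explicitly and the wI term implicitly (IMEX Euler, not the
# paper's schemes). Stability and numerical phase follow from the
# amplification factor A with u^{n+1} = A * u^n.
import numpy as np

def amplification_factor(wE_dt, wI_dt):
    # IMEX Euler: (u1 - u0)/dt = i*wE*u0 + i*wI*u1
    return (1.0 + 1j * wE_dt) / (1.0 - 1j * wI_dt)

wE_dt = np.linspace(0.0, 2.0, 201)[:, None]    # explicit frequency * dt
wI_dt = np.linspace(0.0, 10.0, 201)[None, :]   # implicit (acoustic) frequency * dt
A = amplification_factor(wE_dt, wI_dt)

amplitude = np.abs(A)                          # stable where |A| <= 1
phase_error = np.angle(A) - (wE_dt + wI_dt)    # numerical minus exact phase
print("max |A|:", amplitude.max())
```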

Relevance: 30.00%

Abstract:

Purpose – This paper aims to provide a synthetic review of the empirical literature on the multinational enterprise (MNE), subsidiaries and performance.

Design/methodology/approach – The paper examines: the theoretical and conceptual foundation of multinationality (M) and performance (P) measures; the impact of MNE strategic investment motives on performance; the influence of contextual external and internal environment factors on performance; the strategy of optimizing the MNE's value chain activities by cooperating with external partners in an asymmetric network; the key drivers of enhanced shareholder value and the implications for performance; and the need to access primary data provided by firms and managers themselves when analyzing the internal functioning of the MNE and its subsidiaries.

Findings – The overall message from this literature review is that empirical research should be designed on the basis of relevant theoretical and conceptual foundations of the performance construct.

Originality/value – The paper provides a systematic and synthetic review of the theoretical and empirical literature.

Relevance: 30.00%

Abstract:

The susceptibility of a catchment to flooding is affected by its soil moisture prior to an extreme rainfall event. While soil moisture is routinely observed by satellite instruments, results from previous work on the assimilation of remotely sensed soil moisture into hydrologic models have been mixed. This may have been due in part to the low spatial resolution of the observations used. In this study, the remote sensing aspects of a project attempting to improve flow predictions from a distributed hydrologic model by assimilating soil moisture measurements are described. Advanced Synthetic Aperture Radar (ASAR) Wide Swath data were used to measure soil moisture as, unlike low resolution microwave data, they have sufficient resolution to allow soil moisture variations due to local topography to be detected, which may help to take into account the spatial heterogeneity of hydrological processes. Surface soil moisture content (SSMC) was measured over the catchments of the Severn and Avon rivers in the South West UK. To reduce the influence of vegetation, measurements were made only over homogeneous pixels of improved grassland determined from a land cover map. Radar backscatter was corrected for terrain variations and normalized to a common incidence angle. SSMC was calculated using change detection. To search for evidence of a topographic signal, the mean SSMC from improved grassland pixels on low slopes near rivers was compared to that on higher slopes. When the mean SSMC on low slopes was 30–90%, the higher slopes were slightly drier than the low slopes. The effect was reversed for lower SSMC values. It was also more pronounced during a drying event. These findings contribute to the scant information in the literature on the use of high resolution SAR soil moisture measurement to improve hydrologic models.
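The change-detection step can be sketched as below, with the dry and wet reference backscatter values as illustrative assumptions.

```python
# Sketch of change detection: terrain-corrected, angle-normalised
# backscatter (dB) for a pixel is scaled between its dry and wet reference
# values to give a relative surface soil moisture content (SSMC).
import numpy as np

def ssmc_change_detection(sigma0_db, sigma0_dry_db, sigma0_wet_db):
    """Relative SSMC (%) from normalised backscatter via change detection."""
    ssmc = 100.0 * (sigma0_db - sigma0_dry_db) / (sigma0_wet_db - sigma0_dry_db)
    return np.clip(ssmc, 0.0, 100.0)

# Example improved-grassland pixel: dry reference -14 dB, wet reference -8 dB
print(ssmc_change_detection(np.array([-12.5, -9.0]), -14.0, -8.0))  # [25. 83.3]
```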

Relevance: 30.00%

Abstract:

The past years have seen enormous advances in sequencing and array-based technologies, producing supplementary or alternative views of the genome stored in various formats and databases. Their sheer volume and differing scope pose a challenge to jointly visualizing and integrating diverse data types. We present AmalgamScope, a new interactive software tool that assists scientists with the annotation of the human genome, and in particular with the integration of annotation files from multiple data types, using gene identifiers and genomic coordinates. Supported platforms include next-generation sequencing and microarray technologies. The features of AmalgamScope range from the annotation of diverse data types across the human genome to integration of the data based on the annotation information and visualization of the merged files within chromosomal regions or the whole genome. Additionally, users can define custom transcriptome library files for any species and use the tool's remote-server file-exchange options.
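AmalgamScope itself is not reproduced here; the sketch below shows the underlying integration idea, joining annotation tables from different platforms on shared gene identifiers, with column names and values as illustrative assumptions.

```python
# Minimal sketch of cross-platform annotation integration by gene ID,
# carrying genomic coordinates along. Columns and values are illustrative.
import pandas as pd

ngs = pd.DataFrame({
    "gene_id": ["BRCA1", "TP53", "EGFR"],
    "chrom":   ["chr17", "chr17", "chr7"],
    "start":   [43044295, 7668402, 55019017],   # GRCh38 coordinates
    "variant_count": [12, 7, 21],
})
array = pd.DataFrame({
    "gene_id": ["BRCA1", "TP53", "MYC"],
    "log2_expression": [2.1, -0.7, 1.4],
})

# Outer join keeps genes annotated by either platform
merged = ngs.merge(array, on="gene_id", how="outer")
print(merged)
```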

Relevance: 30.00%

Abstract:

The mineralogy of airborne dust affects the impact of dust particles on direct and indirect radiative forcing, on atmospheric chemistry and on biogeochemical cycling. It is determined partly by the mineralogy of the dust-source regions and partly by size-dependent fractionation during erosion and transport. Here we present a data set that characterizes the clay and silt-sized fractions of global soil units in terms of the abundance of 12 minerals that are important for dust–climate interactions: quartz, feldspars, illite, smectite, kaolinite, chlorite, vermiculite, mica, calcite, gypsum, hematite and goethite. The basic mineralogical information is derived from the literature, and is then expanded following explicit rules, in order to characterize as many soil units as possible. We present three alternative realizations of the mineralogical maps, taking the uncertainties in the mineralogical data into account. We examine the implications of the new database for calculations of the single scattering albedo of airborne dust and thus for dust radiative forcing.
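As a crude, hedged sketch of how such a map feeds single scattering albedo calculations, one can volume-weight per-mineral albedos; the fractions and albedo values below are placeholders, not values from the data set.

```python
# Crude sketch: approximate the single scattering albedo (SSA) of a dust
# mixture as a volume-weighted average of per-mineral values. The
# fractions and SSAs are illustrative placeholders, not the paper's
# values; a proper treatment would use Mie theory with each mineral's
# refractive index.
import numpy as np

minerals    = ["quartz", "illite", "kaolinite", "calcite", "hematite", "goethite"]
volume_frac = np.array([0.30, 0.25, 0.20, 0.15, 0.05, 0.05])   # sums to 1
ssa_550nm   = np.array([0.99, 0.97, 0.98, 0.99, 0.75, 0.80])   # assumed values

print(f"mixture SSA ~ {np.dot(volume_frac, ssa_550nm):.3f}")
# Even small fractions of the absorbing iron oxides (hematite, goethite)
# noticeably lower the mixture albedo, hence their importance in the map.
```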

Relevance: 30.00%

Abstract:

This paper reviews the literature concerning the practice of using Online Analytical Processing (OLAP) systems to recall information stored by Online Transactional Processing (OLTP) systems. The review provides a basis for discussing the need for information recalled through OLAP systems to maintain the contexts of transactions present in the data captured by the respective OLTP system. The paper observes an industry trend in which OLTP systems process information into data that are then stored in databases without the business rules that were used to process that information. This necessitates a practice whereby sets of business rules are used to extract, cleanse, transform and load data from disparate OLTP systems into OLAP databases to support the requirements for complex reporting and analytics, and these sets of business rules are usually not the same as the business rules used to capture the data in the particular OLTP systems. The paper argues that differences between the business rules used to interpret the same data sets risk gaps in semantics between information captured by OLTP systems and information recalled through OLAP systems. Literature concerning the modelling of business transaction information as facts with context was reviewed to identify design trends that contribute to the design quality of OLTP and OLAP systems. The paper then argues that the quality of OLTP and OLAP systems design depends critically on the capture of facts with associated context, the encoding of facts with contexts into data with business rules, the storage and sourcing of data with business rules, the decoding of data with business rules back into facts with context, and the recall of facts with associated contexts. The paper proposes UBIRQ, a design model to aid the co-design of data-with-business-rules storage for OLTP and OLAP purposes. The proposed design model provides the opportunity for the implementation and use of multi-purpose databases and business-rules stores for OLTP and OLAP systems. Such implementations would enable OLTP systems to record and store data together with the executions of business rules, allowing both OLTP and OLAP systems to query data with the business rules used to capture them, thereby ensuring that information recalled via OLAP systems preserves the contexts of transactions as per the data captured by the respective OLTP system.
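A minimal sketch, assuming one possible reading of the UBIRQ idea, of facts stored bound to their capture context and business rules:

```python
# Minimal sketch: facts are stored bound to their capture context and to
# the business rules used to encode them, so OLAP recall can decode data
# with the same rules OLTP used at capture. Class and field names are
# illustrative assumptions, not UBIRQ's actual specification.
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class BusinessRule:
    rule_id: str
    description: str              # human-readable statement of the rule
    expression: str               # e.g. "net = gross * (1 - discount)"

@dataclass
class FactWithContext:
    fact: dict                    # the transaction data itself
    context: dict                 # who/where/when it was captured
    rules: list = field(default_factory=list)

sale = FactWithContext(
    fact={"gross": 100.0, "discount": 0.10, "net": 90.0},
    context={"store": "S-042", "captured_at": datetime(2024, 5, 1, 9, 30)},
    rules=[BusinessRule("R1", "Net amount after discount",
                        "net = gross * (1 - discount)")],
)
# An OLAP-side consumer recalls both the data and the rule that produced
# it, preserving the transaction's original semantics.
print(sale.rules[0].expression)
```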