71 results for farm accountancy data network
Abstract:
The human body is composed of a huge number of cells acting together in a concerted manner. The current understanding is that proteins perform most of the activities necessary to keep a cell alive. The DNA, on the other hand, stores the information on how to produce the different proteins in the genome. Regulating gene transcription is thus a first important step that can affect the life of a cell, modify its functions and shape its responses to the environment. Regulation is a complex operation that involves specialized proteins, the transcription factors (TFs). TFs can bind to DNA and activate the processes leading to the expression of genes into new proteins. Errors in this process may lead to diseases. In particular, some transcription factors have been associated with a lethal pathological state, commonly known as cancer, characterized by uncontrolled cellular proliferation, invasiveness of healthy tissues and abnormal responses to stimuli. Understanding cancer-related regulatory programs is a difficult task, often involving several TFs interacting together and influencing each other's activity. This thesis presents new computational methodologies to study gene regulation, together with applications of our methods to the understanding of cancer-related regulatory programs. Understanding transcriptional regulation is a major challenge. We address this difficult question by combining computational approaches with large collections of heterogeneous experimental data. In detail, we design signal processing tools to recover transcription factor binding sites on the DNA from genome-wide surveys such as chromatin immunoprecipitation assays on tiling arrays (ChIP-chip). We then use the localization of TF binding to explain the expression levels of regulated genes. In this way we identify a regulatory synergy between two TFs, the oncogene C-MYC and SP1.
C-MYC and SP1 bind preferentially at promoters, and when SP1 binds next to C-MYC on the DNA, the nearby gene is strongly expressed. The association between the two TFs at promoters is reflected in the conservation of their binding sites across mammals and in the permissive underlying chromatin states; it represents an important control mechanism involved in cellular proliferation, and thereby in cancer. Secondly, we identify the characteristics of the target genes of the TF estrogen receptor alpha (hERa) and study the influence of hERa in regulating transcription. Upon estrogen signaling, hERa binds to DNA to regulate the transcription of its targets in concert with its co-factors. To overcome the scarcity of experimental data about the binding sites of other TFs that may interact with hERa, we conduct an in silico analysis of the sequences underlying the ChIP sites using the position weight matrices (PWMs) of the hERa partner TFs FOXA1 and SP1. We combine ChIP-chip and ChIP-paired-end-diTag (ChIP-pet) data on hERa binding to DNA with this sequence information to explain gene expression levels in a large collection of cancer tissue samples and in studies of the response of cells to estrogen. We confirm that hERa binding sites are distributed throughout the genome. However, we distinguish between binding sites near promoters and binding sites along the transcripts. The first group shows weak binding of hERa and a high occurrence of SP1 motifs, in particular near estrogen-responsive genes. The second group shows strong binding of hERa and a significant correlation between the number of binding sites along a gene and the strength of gene induction in the presence of estrogen. Some binding sites of the second group also show the presence of FOXA1, but the role of this TF still needs to be investigated. Different mechanisms have been proposed to explain hERa-mediated induction of gene expression.
Our work supports the model of hERa activating gene expression from distal binding sites by interacting with promoter-bound TFs, such as SP1. hERa has been associated with survival rates of breast cancer patients, though explanatory models are still incomplete; this result is therefore important for better understanding how hERa can control gene expression. Thirdly, we address the difficult question of regulatory network inference. We tackle this problem by analyzing time series of biological measurements such as quantifications of mRNA levels or protein concentrations. Our approach uses well-established penalized linear regression models in which we impose sparseness on the connectivity of the regulatory network. We extend this method by enforcing the coherence of the regulatory dependencies: a TF must behave coherently as an activator or a repressor on all its targets. This requirement is implemented as constraints on the signs of the regression coefficients in the penalized linear regression model. Our approach is better at reconstructing meaningful biological networks than previous methods based on penalized regression. The method was tested on the DREAM2 challenge of reconstructing a five-gene/TF regulatory network, obtaining the best performance in the "undirected signed excitatory" category. Thus, these bioinformatics methods, which are reliable, interpretable and fast enough to cover large biological datasets, have enabled us to better understand gene regulation in humans.
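The sign-coherence idea above can be sketched in a few lines. The following toy example (all names, data and the solver choice are illustrative, not the thesis code) solves an L1-penalized least-squares regression by proximal gradient descent while constraining each coefficient to a prescribed activator (+1) or repressor (-1) sign:

```python
import numpy as np

def sign_constrained_lasso(X, y, signs, lam=0.1, n_iter=2000):
    """Proximal-gradient sketch of an L1-penalized regression in which each
    coefficient is constrained to a prescribed sign (+1 = activator,
    -1 = repressor), mimicking the coherence constraint described above."""
    n, p = X.shape
    step = 1.0 / np.linalg.norm(X, 2) ** 2   # 1 / Lipschitz constant of the gradient
    b = np.zeros(p)
    for _ in range(n_iter):
        grad = X.T @ (X @ b - y)
        z = b - step * grad
        # soft-threshold (L1 penalty), then keep only coefficients whose
        # sign matches the prescribed activator/repressor role
        z = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)
        b = np.where(np.sign(z) == signs, z, 0.0)
    return b

# Toy data: three candidate TFs, of which the third is irrelevant.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = X @ np.array([2.0, -1.5, 0.0]) + 0.01 * rng.normal(size=100)
b = sign_constrained_lasso(X, y, signs=np.array([1, -1, 1]))
```

In a full network-inference setting, one such regression would be solved per target gene, with the sign vector shared across targets so that each TF acts coherently.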
Proteomic data from human cell cultures refine mechanisms of chaperone-mediated protein homeostasis.
Abstract:
In the crowded environment of human cells, the folding of nascent polypeptides and the refolding of stress-unfolded proteins is error prone. Accumulation of cytotoxic misfolded and aggregated species may cause cell death, tissue loss, degenerative conformational diseases, and aging. Nevertheless, young cells effectively express a network of molecular chaperones and folding enzymes, termed here "the chaperome," which can prevent the formation of potentially harmful misfolded protein conformers and use the energy of adenosine triphosphate (ATP) to rehabilitate already formed toxic aggregates into native functional proteins. In an attempt to extend knowledge of chaperome mechanisms in cellular proteostasis, we performed a meta-analysis of the human chaperome using high-throughput proteomic data from 11 immortalized human cell lines. Chaperome polypeptides accounted for about 10% of the total protein mass of human cells, half of which were Hsp90s and Hsp70s. Knowledge of the cellular concentrations and ratios among chaperome polypeptides provided a novel basis to understand the mechanisms by which the Hsp60, Hsp70, Hsp90, and small heat shock proteins (HSPs), in collaboration with cochaperones and folding enzymes, assist de novo protein folding, import polypeptides into organelles, unfold stress-destabilized toxic conformers, and control the conformational activity of native proteins in the crowded environment of the cell. The proteomic data also provided a means to distinguish between stable components of chaperone core machineries and dynamic regulatory cochaperones.
Abstract:
Background: This study analyzed prognostic factors and treatment outcomes of primary thyroid lymphoma. Patients and Methods: Data were retrospectively collected for 87 patients (53 stage I and 34 stage II) with a median age of 65 years. Fifty-two patients were treated with a single modality (31 with chemotherapy alone and 21 with radiotherapy alone) and 35 with combined modality treatment. Median follow-up was 51 months. Results: Sixty patients had aggressive lymphoma and 27 had indolent lymphoma. The 5- and 10-year overall survival (OS) rates were 74% and 71%, respectively, and the corresponding disease-free survival (DFS) rates were 68% and 64%. Univariate analysis revealed that age, tumor size, stage, lymph node involvement, B symptoms, and treatment modality were prognostic factors for OS, DFS, and local control (LC). Patients with thyroiditis had significantly better LC rates. In multivariate analysis, OS was influenced by age, B symptoms, lymph node involvement, and tumor size, whereas DFS and LC were influenced by B symptoms and tumor size. Compared with single modality treatment, patients treated with combined modality had better 5-year OS, DFS, and LC. Conclusions: Combined modality treatment leads to an excellent prognosis for patients with aggressive lymphoma but does not improve OS and LC in patients with indolent lymphoma.
Abstract:
The paper presents the Multiple Kernel Learning (MKL) approach as a modelling and data-exploration tool and applies it to the problem of wind speed mapping. Support Vector Regression (SVR) is used to predict spatial variations of the mean wind speed from terrain features (slopes, terrain curvature, directional derivatives) generated at different spatial scales. Multiple Kernel Learning is applied to learn kernels for individual features and thematic feature subsets, both in the context of feature selection and of optimal parameter determination. An empirical study on real-life data confirms the usefulness of MKL as a tool that enhances the interpretability of data-driven models.
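The core MKL idea, one kernel per feature subset combined into a single learned kernel, can be sketched as follows. This is a toy illustration, not the paper's algorithm: the data are synthetic, kernel ridge regression stands in for SVR, and the per-kernel validation-error heuristic stands in for a proper MKL weight optimization.

```python
import numpy as np

def rbf_kernel(A, B, gamma=10.0):
    """Gaussian RBF kernel matrix between row sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def krr(K_train, y, lam=1e-3):
    """Kernel ridge regression: return the dual coefficients."""
    return np.linalg.solve(K_train + lam * np.eye(len(y)), y)

# Toy data: two "terrain features"; only the first one drives the target.
rng = np.random.default_rng(1)
X_tr, X_va = rng.uniform(size=(80, 2)), rng.uniform(size=(40, 2))
y_tr, y_va = np.sin(3 * X_tr[:, 0]), np.sin(3 * X_va[:, 0])

# One kernel per feature subset, as in the MKL setting described above.
subsets = [[0], [1]]
K_tr = [rbf_kernel(X_tr[:, s], X_tr[:, s]) for s in subsets]
K_va = [rbf_kernel(X_va[:, s], X_tr[:, s]) for s in subsets]

# Crude stand-in for MKL weight learning: weight each kernel by the
# inverse of its single-kernel validation error, then renormalise.
errs = [np.mean((K_va[i] @ krr(K_tr[i], y_tr) - y_va) ** 2) for i in range(2)]
w = np.array([1.0 / e for e in errs])
w /= w.sum()

# Convex combination of kernels, then a final fit and evaluation.
K_comb_tr = sum(wi * Ki for wi, Ki in zip(w, K_tr))
K_comb_va = sum(wi * Ki for wi, Ki in zip(w, K_va))
mse = np.mean((K_comb_va @ krr(K_comb_tr, y_tr) - y_va) ** 2)
```

The learned weights play the interpretability role highlighted in the abstract: the informative feature subset receives a much larger weight than the irrelevant one.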
Abstract:
Although cross-sectional diffusion tensor imaging (DTI) studies have revealed significant white matter changes in mild cognitive impairment (MCI), the utility of this technique in predicting further cognitive decline is debated. Thirty-five healthy controls (HC) and 67 MCI subjects with DTI baseline data were neuropsychologically assessed at one year. Among them, 40 were stable (sMCI; 9 single domain amnestic, 7 single domain frontal, 24 multiple domain) and 27 were progressive (pMCI; 7 single domain amnestic, 4 single domain frontal, 16 multiple domain). Fractional anisotropy (FA) and longitudinal, radial, and mean diffusivity were measured using Tract-Based Spatial Statistics. Statistics included group comparisons and individual classification of MCI cases using support vector machines (SVM). FA was significantly higher in HC compared to MCI in a distributed network including the ventral part of the corpus callosum and right temporal and frontal pathways. There were no significant group-level differences between sMCI and pMCI or between MCI subtypes after correction for multiple comparisons. However, SVM analysis allowed for individual classification with accuracies up to 91.4% (HC versus MCI) and 98.4% (sMCI versus pMCI). When considering the MCI subgroups separately, the minimum SVM classification accuracy for stable versus progressive cognitive decline was 97.5%, in the multiple domain MCI group. SVM analysis of DTI data provided highly accurate individual classification of stable versus progressive MCI regardless of MCI subtype, indicating that this method may become an easily applicable tool for early individual detection of MCI subjects evolving to dementia.
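The individual-classification step rests on a standard SVM. As a minimal, generic sketch (made-up two-class "imaging feature" data, a linear SVM trained by Pegasos-style stochastic subgradient descent, not the study's actual pipeline or kernel):

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, n_epochs=50, seed=0):
    """Pegasos-style stochastic subgradient descent for a linear
    soft-margin SVM with labels in {-1, +1}."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    w, b, t = np.zeros(p), 0.0, 0
    for _ in range(n_epochs):
        for i in rng.permutation(n):
            t += 1
            eta = 1.0 / (lam * t)
            if y[i] * (X[i] @ w + b) < 1:      # margin violated: push
                w = (1 - eta * lam) * w + eta * y[i] * X[i]
                b += eta * y[i]
            else:                               # margin satisfied: only shrink
                w = (1 - eta * lam) * w
    return w, b

# Two synthetic groups (e.g. "stable" vs "progressive"), 4 features each.
rng = np.random.default_rng(42)
X = np.vstack([rng.normal(+1.0, 0.7, size=(100, 4)),
               rng.normal(-1.0, 0.7, size=(100, 4))])
y = np.array([1] * 100 + [-1] * 100)
w, b = train_linear_svm(X, y)
acc = np.mean(np.sign(X @ w + b) == y)
```

In practice the reported accuracies would be estimated by cross-validation on held-out subjects, not on the training set as in this toy demonstration.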
Abstract:
A firm's competitive advantage can arise from internal resources as well as from an interfirm network. This dissertation investigates the competitive advantage of a firm involved in an innovation network by integrating strategic management theory and social network theory. It develops theory and provides empirical evidence illustrating how a networked firm enables network value and appropriates this value in an optimal way according to its strategic purpose. The four inter-related essays in this dissertation provide a framework that sheds light on the extraction of value from an innovation network by managing and designing the network in a proactive manner. The first essay reviews research in social network theory and knowledge transfer management, and identifies the crucial factors of innovation network configuration for a firm's learning performance or innovation output. The findings suggest that network structure, network relationships, and network position all affect a firm's performance. Although the previous literature reports disagreement about the impact of dense versus sparse structures, as well as strong versus weak ties, case evidence from Chinese software companies reveals that dense and strong connections with partners are positively associated with firms' performance. The second essay is a theoretical essay that illustrates the limitations of social network theory in explaining the source of network value and offers a new theoretical model that applies the resource-based view to network environments. It suggests that network configurations, such as network structure, network relationships and network position, can be considered important network resources. In addition, this essay introduces the concept of network capability, and suggests that four types of network capabilities play an important role in unlocking the potential value of network resources and determining the distribution of network rents between partners.
This essay also highlights the contingent effects of network capability on a firm's innovation output, and explains how the different impacts of network capability depend on a firm's strategic choices. This new theoretical model has been pre-tested with a case study of the Chinese software industry, which enhances the internal validity of the theory. The third essay addresses the questions of what impact network capability has on firm innovation performance and what the antecedent factors of network capability are. This essay employs a structural equation modelling methodology using a sample of 211 Chinese high-tech firms. It develops a measurement of network capability and reveals that networked firms deal with cooperation with, and coordination between, partners on different levels according to their levels of network capability. The empirical results also suggest that IT maturity, openness of culture, the management system involved, and experience with network activities are antecedents of network capabilities. Furthermore, a two-group analysis of the role of international partner(s) shows that when there is a gap in culture and norms with foreign partners, a firm must mobilize more resources and effort to improve its performance with respect to its innovation network. The fourth essay addresses the way in which network capabilities influence firm innovation performance. Using hierarchical multiple regression with data from Chinese high-tech firms, the findings suggest that knowledge transfer has a significant partial mediating effect on the relationship between network capabilities and innovation performance. The findings also reveal that the impact of network capabilities varies with the environment and the strategic decision the firm has made: exploration or exploitation.
Network constructing capability has a greater positive impact on, and contributes more to, innovation performance than does network operating capability in an exploration network. Network operating capability is more important than network constructing capability for innovative firms in an exploitation network. These findings therefore highlight that a firm can proactively shape its innovation network for greater benefit, but when it does so, it should adjust its focus and efforts in accordance with its innovation purposes or strategic orientation.
Abstract:
As a thorough integration of probability and graph theory, Bayesian networks currently enjoy widespread interest as a means for studying factors that affect the coherent evaluation of scientific evidence in forensic science. Paper I of this series intends to contribute to the discussion of Bayesian networks as a framework that is helpful for both illustrating and implementing statistical procedures commonly employed for the study of uncertainties (e.g. the estimation of unknown quantities). While the respective statistical procedures are widely described in the literature, the primary aim of this paper is to offer an essentially non-technical introduction on how interested readers may use these analytical approaches - with the help of Bayesian networks - to process their own forensic science data. Attention is mainly drawn to the structure and underlying rationale of a series of basic, context-independent network fragments that users may incorporate as building blocks when constructing larger inference models. As an example of how this may be done, the proposed concepts will be used in a second paper (Part II) to specify graphical probability networks whose purpose is to assist forensic scientists in the evaluation of scientific evidence encountered in the context of forensic document examination (i.e. results of the analysis of black toners present on printed or copied documents).
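The simplest such network fragment is a two-node hypothesis-to-evidence model. The sketch below (all probabilities are invented for illustration and carry no forensic meaning) evaluates it by exact enumeration, returning the posterior over the hypothesis and the likelihood ratio commonly reported in evidence evaluation:

```python
# Minimal two-node Bayesian network fragment: hypothesis H -> evidence E.
# Priors and conditional probabilities are purely illustrative numbers.
p_h = {'same source': 0.5, 'different source': 0.5}          # prior P(H)
p_e_given_h = {('match', 'same source'): 0.95,               # P(E | H)
               ('match', 'different source'): 0.01}

def posterior(evidence='match'):
    """P(H | E) by enumeration: multiply prior by likelihood, normalise."""
    joint = {h: p_h[h] * p_e_given_h[(evidence, h)] for h in p_h}
    z = sum(joint.values())
    return {h: v / z for h, v in joint.items()}

def likelihood_ratio(evidence='match'):
    """LR = P(E | same source) / P(E | different source)."""
    return (p_e_given_h[(evidence, 'same source')]
            / p_e_given_h[(evidence, 'different source')])

post = posterior('match')
lr = likelihood_ratio('match')
```

Larger inference models are built by chaining such fragments; exact enumeration stays tractable only for small networks, after which dedicated inference algorithms are needed.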
Abstract:
Plants such as Arabidopsis thaliana respond to foliar shade and to neighbors that may become competitors for light resources by elongation growth, securing access to unfiltered sunlight. The challenges faced during this shade avoidance response (SAR) differ under a light-absorbing canopy and during neighbor detection, where light remains abundant. In both situations, elongation growth depends on auxin and on transcription factors of the phytochrome interacting factor (PIF) class. Using a computational modeling approach to study the SAR regulatory network, we identify and experimentally validate a previously unidentified role for LONG HYPOCOTYL IN FAR RED 1, a negative regulator of the PIFs. Moreover, we find that during neighbor detection, growth is promoted primarily by the production of auxin. In contrast, in true shade, the system operates with less auxin but with an increased sensitivity to the hormonal signal. Our data suggest that this latter signal is less robust, which may reflect a cost-to-robustness tradeoff, a system trait long recognized by engineers and forming the basis of information theory.
Abstract:
Introduction: The field of connectomic research is growing rapidly as a result of methodological advances in structural neuroimaging on many spatial scales. In particular, progress in diffusion MRI data acquisition and processing has made macroscopic structural connectivity maps available in vivo through connectome mapping pipelines (Hagmann et al, 2008), producing so-called connectomes (Hagmann 2005, Sporns et al, 2005). These exhibit both spatial and topological information that constrain functional imaging studies and are relevant to their interpretation. The need has grown for a special-purpose software tool that supports investigations of such connectome data by both clinical researchers and neuroscientists. Methods: We developed the ConnectomeViewer, a powerful, extensible software tool for visualization and analysis in connectomic research. It uses the newly defined, container-like Connectome File Format, specifying networks (GraphML), surfaces (Gifti), volumes (Nifti), track data (TrackVis) and metadata. Using Python as the programming language allows it to be cross-platform and gives it access to a multitude of scientific libraries. Results: Thanks to a flexible plugin architecture, functionality can easily be enhanced for specific purposes. The following features are already implemented:
* Ready use of libraries, e.g. for complex network analysis (NetworkX) and data plotting (Matplotlib). More brain connectivity measures will be implemented in a future release (Rubinov et al, 2009).
* 3D view of networks with node positioning based on the corresponding ROI surface patch; other layouts are possible.
* Picking functionality to select nodes and edges, retrieve more node information (ConnectomeWiki), and toggle surface representations.
* Interactive thresholding and modality selection of edge properties using filters.
* Storage of arbitrary metadata for networks, thereby allowing e.g. group-based analysis or meta-analysis.
* A Python shell for scripting.
Application data is exposed and can be modified or used for further post-processing. Visualization pipelines using filters and modules can be composed with Mayavi (Ramachandran et al, 2008). An interface to TrackVis allows visualization of track data; selected nodes are converted to ROIs for fiber filtering. The Connectome Mapping Pipeline (Hagmann et al, 2008) processed 20 healthy subjects into an average connectome dataset. The figures show the ConnectomeViewer user interface using this dataset; connections shown are those that occur in all 20 subjects. The dataset is freely available from the homepage (connectomeviewer.org). Conclusions: The ConnectomeViewer is a cross-platform, open-source software tool that provides extensive visualization and analysis capabilities for connectomic research. It has a modular architecture, integrates the relevant datatypes and is completely scriptable. Visit www.connectomics.org to get involved as a user or developer.
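The interactive edge-property thresholding described above boils down to filtering a weighted graph. A minimal dependency-free sketch (a made-up four-node toy connectome, not ConnectomeViewer's API, which builds on NetworkX):

```python
# Hypothetical toy connectome: edges carry a weight (e.g. fiber density).
edges = [('A', 'B', 0.8), ('B', 'C', 0.3), ('A', 'C', 0.6), ('C', 'D', 0.1)]

def threshold(edges, cutoff):
    """Keep only edges whose weight meets the cutoff, mirroring an
    interactive edge-property filter."""
    return [(u, v, w) for u, v, w in edges if w >= cutoff]

def degrees(edges):
    """Node degree in the (filtered) graph, a basic connectivity measure."""
    deg = {}
    for u, v, _ in edges:
        deg[u] = deg.get(u, 0) + 1
        deg[v] = deg.get(v, 0) + 1
    return deg

kept = threshold(edges, 0.5)
deg = degrees(kept)
```

In the real tool the same operation is applied to subject-averaged connectomes, and richer graph measures are delegated to NetworkX.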
Abstract:
Automatic environmental monitoring networks, supported by wireless communication technologies, nowadays provide large and ever-increasing volumes of data. The use of this information in natural hazard research is an important issue. Particularly useful for risk assessment and decision making are spatial maps of hazard-related parameters produced from point observations and available auxiliary information. The purpose of this article is to present and explore appropriate tools to process large amounts of available data and produce predictions at fine spatial scales. These are the algorithms of machine learning, which are aimed at non-parametric, robust modelling of non-linear dependencies from empirical data. The computational efficiency of the data-driven methods allows prediction maps to be produced in real time, which makes them superior to physical models for operational use in risk assessment and mitigation. This situation is encountered in particular in the spatial prediction of climatic variables (topo-climatic mapping). In the complex topographies of mountainous regions, meteorological processes are highly influenced by the relief. The article shows how these relations, possibly regionalized and non-linear, can be modelled from data using information from digital elevation models. The particular illustration of the developed methodology concerns the mapping of temperatures (including situations of Föhn and temperature inversion) given measurements taken from the Swiss meteorological monitoring network. The range of methods used in the study includes data-driven feature selection, support vector algorithms and artificial neural networks.
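The DEM-derived inputs mentioned above (slope, curvature, directional derivatives) are typically computed by finite differences on the elevation grid. A minimal sketch of the slope feature on a toy grid (illustrative only, not the Swiss DEM data):

```python
import numpy as np

def slope_magnitude(dem, cell=1.0):
    """Central-difference slope of a digital elevation model grid:
    a minimal example of a terrain feature fed to data-driven models."""
    gy, gx = np.gradient(dem, cell)      # gradients along rows and columns
    return np.sqrt(gx ** 2 + gy ** 2)

# An inclined plane rising one unit per cell in x has slope 1 everywhere.
plane = np.outer(np.ones(5), np.arange(5.0))
s = slope_magnitude(plane)
```

Higher-order features such as curvature follow the same pattern by differentiating the gradient fields again, optionally at several smoothing scales.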
Abstract:
Objectives: To compare the clinical characteristics, species distribution and antifungal susceptibility of Candida bloodstream isolates (BSI) in breakthrough (BTC) vs. non-breakthrough candidemia (NBTC), and to study the effect of prolonged vs. short fluconazole (F) exposure in BTC. Methods: Candida BSI were prospectively collected during 2004-2006 from 27 hospitals (seven university, 20 affiliated) of the FUNGINOS network. Susceptibility to F, voriconazole (V) and caspofungin (C) was tested in the FUNGINOS mycology reference laboratory by the microtitre broth dilution method with the Sensititre YeastOne test panel. Clinical data were collected using standardized CRFs. BTC was defined as candidemia occurring during antifungal treatment/prophylaxis of at least three days' duration. Susceptibility of BSI was defined according to the 2010/2011 CLSI clinical breakpoints. Results: Out of 567 candidemia episodes, 550 Candida BSI were available. Of these, 43 (7.6%) were from BTC (37/43, 86% were isolated after F exposure). 38 BTC (88.4%) and 315 NBTC (55.6%) occurred in university hospitals (P < 0.001). The majority of patients developing BTC were immunocompromised: higher proportions of haematological malignancies (62.8% in BTC vs. 47.1% in NBTC, P < 0.001), neutropenia (37.2% vs. 11.8%, P < 0.001), acute GvHD (14% vs. 0.2%, P < 0.001), immunosuppressive drugs (74.4% vs. 7.8%, P < 0.001), and mucositis (32.6% vs. 2.3%, P < 0.001) were observed. Other differences between BTC and NBTC were higher proportions of patients with central venous catheters in the 2 weeks preceding candidemia (95.3% vs. 83.4%, P = 0.047) and receiving total parenteral nutrition (62.8% vs. 35.9%, P < 0.001), but a lower proportion of patients treated with gastric proton pump inhibitors (23.3% vs. 72.1%, P < 0.001). Overall mortality of BTC and NBTC was not different (34.9% vs. 31.7%, P = 0.73), while a trend to higher attributable mortality in BTC was found (13.9% vs. 6.9%, P = 0.12).
Species identification showed a majority of C. albicans in both groups (51.2% in BTC vs. 62.9% in NBTC, P = 0.26), followed by C. glabrata (18.6% vs. 18.5%), C. tropicalis (2.3% vs. 6.3%) and C. parapsilosis (7.0% vs. 4.7%). Significantly more C. krusei were detected in BTC than in NBTC (11.6% vs. 1.6%, P = 0.002). The geometric mean MICs for F, V and C did not differ significantly between BTC and NBTC isolates. However, in BTC there was a significant association between the duration of F exposure and the Candida spp.: >10 days of F was associated with a significant shift from susceptible Candida spp. (C. albicans, C. parapsilosis, C. tropicalis, C. famata) to non-susceptible species (C. glabrata, C. krusei, C. norvegensis). Among 21 BTC episodes occurring after ≤10 days of F, 19% of the isolates were non-susceptible, in contrast to 68.7% in 16 BTC episodes occurring after >10 days of F (P = 0.003). Conclusions: Breakthrough candidemia occurred more often in immunocompromised hosts. Fluconazole administered for >10 days was associated with a shift to non-susceptible Candida spp. The length of fluconazole exposure should be taken into consideration in the choice of empirical antifungal treatment.
Abstract:
Background: Mantle cell lymphoma (MCL) is a rare subtype (3-9%) of non-Hodgkin lymphoma (NHL) with a relatively poor prognosis (5-year survival < 40%). Although consolidation of first remission with autologous stem cell transplantation (ASCT) is regarded as the "gold standard", less than half of the patients can undergo this intensive treatment, owing to advanced age and co-morbidities. Standard-dose non-myeloablative radioimmunotherapy (RIT) seems to be a very efficient approach for the treatment of certain NHL. However, there are almost no data available on the efficacy and safety of RIT in MCL. Methods and Patients: In the RIT-Network, a web-based international registry collecting real observational data from RIT-treated patients, 115 MCL patients treated with ibritumomab tiuxetan were recorded. Most of the patients were elderly males with advanced-stage disease: median age 63 (range 31-78); males 70.4%; stage III/IV 92%. RIT (i.e. application of ibritumomab tiuxetan) was part of first-line therapy in 48 pts. (43%). A further 38 pts. (33%) received ibritumomab tiuxetan after two previous chemotherapy regimens, and 33 pts. (24%) after completing 3-8 lines. In 75 cases RIT was applied as consolidation of a chemotherapy-induced response; the remaining patients received ibritumomab tiuxetan because of relapsed/refractory disease. At present, follow-up data are available for 74 MCL patients. Results: After RIT the patients achieved a high response rate: CR 60.8%, PR 25.7%, and SD 2.7%. Only 10.8% of the patients progressed. For survival analysis, many data had to be censored since documentation had not yet been completed. The projected 3-year overall survival (OAS, Fig. 1) after radioimmunotherapy was 72% for pts. subjected to RIT consolidation versus 29% for those treated for relapsed/refractory disease (p=0.03). RIT was feasible for almost all patients; only 3 procedure-related deaths were reported in the whole group.
The main adverse event was hematological toxicity (grade III/IV cytopenias), with median times to recovery of Hb, WBC and Plt of 45, 40 and 38 days, respectively. Conclusion: Standard-dose non-myeloablative RIT is a feasible and safe treatment modality, even for elderly MCL pts. Consolidation radioimmunotherapy with ibritumomab tiuxetan may prolong the survival of patients who achieved a clinical response after chemotherapy. This consolidation approach should therefore be considered as a treatment strategy for those who are not eligible for ASCT. RIT also has a potential role as palliation therapy in relapsing/resistant cases.
Abstract:
The paper deals with the development and application of a methodology for the automatic mapping of pollution/contamination data. The General Regression Neural Network (GRNN) is considered in detail and is proposed as an efficient tool to solve this problem. The automatic tuning of isotropic and anisotropic GRNN models using a cross-validation procedure is presented. Results are compared with the k-nearest-neighbours interpolation algorithm using an independent validation data set. The quality of the mapping is controlled by analysis of the raw data and of the residuals using variography. Maps of the probability of exceeding a given decision level and "thick" isoline visualization of the uncertainties are presented as examples of decision-oriented mapping. A real case study is based on the mapping of radioactively contaminated territories.
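A GRNN is equivalent to Nadaraya-Watson kernel regression, and its only free parameter (the kernel bandwidth) can be tuned by cross-validation, as the abstract describes. The sketch below is an illustrative isotropic version on a one-dimensional toy dataset, not the paper's implementation or data:

```python
import numpy as np

def grnn_predict(X_train, y_train, X_query, sigma):
    """GRNN = Nadaraya-Watson kernel regression: the prediction is the
    Gaussian-weighted average of the training targets (isotropic kernel)."""
    d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    return (w @ y_train) / w.sum(axis=1)

def tune_sigma(X, y, sigmas):
    """Leave-one-out cross-validation of the bandwidth, mirroring the
    automatic tuning described above."""
    best, best_err = None, np.inf
    for s in sigmas:
        err = 0.0
        for i in range(len(y)):
            mask = np.arange(len(y)) != i
            pred_i = grnn_predict(X[mask], y[mask], X[i:i + 1], s)[0]
            err += (pred_i - y[i]) ** 2
        if err < best_err:
            best, best_err = s, err
    return best

# Toy 1-D "contamination" profile: smooth function sampled on a grid.
X = np.linspace(0.0, 1.0, 21)[:, None]
y = X[:, 0] ** 2
best = tune_sigma(X, y, [0.05, 1.0])   # the small bandwidth wins here
pred = grnn_predict(X, y, np.array([[0.5]]), best)[0]
```

The anisotropic variant in the paper generalizes this by giving each coordinate its own bandwidth, which the same cross-validation loop can tune.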
Abstract:
The integration of biological data of various types and the development of adapted bioinformatics tools represent critical objectives for enabling research at the systems level. The European Network of Excellence ENFIN is engaged in developing an adapted infrastructure to connect databases and platforms, to enable both the generation of new bioinformatics tools and the experimental validation of computational predictions. With the aim of bridging the gap between standard wet laboratories and bioinformatics, the ENFIN Network runs integrative research projects to bring the latest computational techniques to bear directly on questions of systems biology in the wet laboratory environment. The Network maintains close internal collaboration between experimental and computational research, enabling a permanent cycle of experimental validation and improvement of computational prediction methods. The computational work includes the development of a database infrastructure (EnCORE), bioinformatics analysis methods and a novel platform for protein function analysis, FuncNet.