999 results for omics technology
Abstract:
Biobanks represent key resources for clinico-genomic research and are needed to pave the way to personalised medicine. To achieve this goal, it is crucial that scientists can securely access and share high-quality biomaterial and related data. Therefore, there is a growing interest in integrating biobanks into larger biomedical information and communication technology (ICT) infrastructures. The European project p-medicine is currently building an innovative ICT infrastructure to meet this need. This platform provides tools and services for conducting research and clinical trials in personalised medicine. In this paper, we describe one of its main components, the biobank access framework p-BioSPRE (p-medicine Biospecimen Search and Project Request Engine). This generic framework enables and simplifies access to existing biobanks, allows institutions to offer their own biomaterial collections to research communities, and supports management of biobank specimens and related clinical data through the ObTiMA Trial Biomaterial Manager. p-BioSPRE takes into consideration all relevant ethical and legal standards, e.g., safeguarding donors’ personal rights and enabling biobanks to keep control over the donated material and related data. The framework thus enables secure sharing of biomaterial within open and closed research communities, while flexibly integrating related clinical and omics data. Although the development of the framework is mainly driven by user scenarios from the cancer domain, specifically acute lymphoblastic leukaemia and Wilms tumour, it can be extended to further disease entities.
Abstract:
Circulating low density lipoproteins (LDL) are thought to play a crucial role in the onset and development of atherosclerosis, though the detailed molecular mechanisms responsible for their biological effects remain controversial. The complexity of biomolecules (lipids, glycans and protein) and structural features (isoforms and chemical modifications) found in LDL particles hampers a complete understanding of the mechanism underlying its atherogenicity. For this reason, screening LDL for features that discriminate a particular pathology, in search of biomarkers, is of high importance. Three major biomolecule classes (lipids, protein and glycans) in LDL particles were screened using mass spectrometry coupled to liquid chromatography. Dual-polarity screening resulted in good lipidome coverage, identifying over 300 lipid species from 12 lipid sub-classes. Multivariate analysis was used to investigate potential discriminators in the individual lipid sub-classes for different study groups (age, gender, pathology). Additionally, the high protein sequence coverage of ApoB-100 routinely achieved (≥70%) assisted in the search for protein modifications correlating with aging and pathology. The large size and complexity of the datasets required the use of chemometric methods (Partial Least Squares-Discriminant Analysis, PLS-DA) for their analysis and for the identification of ions that discriminate between study groups. The peptide profile from enzymatically digested ApoB-100 can be correlated with the high structural complexity of lipids associated with ApoB-100 using exploratory data analysis. In addition, using targeted scanning modes, glycosylation sites within neutral and acidic sugar residues in ApoB-100 are also being explored.
Together or individually, knowledge of the profiles and modifications of the major biomolecules in LDL particles will contribute towards an in-depth understanding, will help to map the structural features that contribute to the atherogenicity of LDL, and may allow identification of reliable, pathology-specific biomarkers. This research was supported by a Marie Curie Intra-European Fellowship within the 7th European Community Framework Program (IEF 255076). Work of A. Rudnitskaya was supported by Portuguese Science and Technology Foundation, through the European Social Fund (ESF) and "Programa Operacional Potencial Humano - POPH".
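The PLS-DA approach mentioned in the abstract above can be illustrated with a minimal sketch. This is not the authors' pipeline: it is a one-algorithm NIPALS PLS1 implementation applied to synthetic "lipid intensity" data, where two hypothetical study groups differ in a handful of features, and class membership is coded as -1/+1.

```python
import numpy as np

def pls_da_fit(X, y, n_components=2):
    """Minimal NIPALS PLS1 fit for two-class discrimination.
    X: (n_samples, n_features) data matrix; y: labels coded -1/+1."""
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    W, P, q = [], [], []
    for _ in range(n_components):
        w = Xc.T @ yc
        w /= np.linalg.norm(w)          # weight vector
        t = Xc @ w                      # latent scores
        p = Xc.T @ t / (t @ t)          # X loadings
        b = (yc @ t) / (t @ t)          # y regression coefficient
        Xc = Xc - np.outer(t, p)        # deflate X
        yc = yc - b * t                 # deflate y
        W.append(w); P.append(p); q.append(b)
    return np.array(W).T, np.array(P).T, np.array(q)

def pls_da_predict(X, X_train, y_train, W, P, q):
    """Project new samples through the fitted components and classify
    by the sign of the predicted response."""
    Xc = X - X_train.mean(axis=0)
    y_hat = np.full(len(X), y_train.mean())
    for k in range(W.shape[1]):
        t = Xc @ W[:, k]
        y_hat += q[k] * t
        Xc = Xc - np.outer(t, P[:, k])
    return np.sign(y_hat)
```

In practice the discriminating ions would be read off the weight vectors W, whose largest-magnitude entries indicate the features driving group separation.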
Abstract:
This project aims at deepening the understanding of the molecular basis of the phenotypic heterogeneity of prion diseases. Prion diseases represent the first and clearest example of “protein misfolding diseases”, i.e., neurodegenerative diseases caused by the accumulation of misfolded proteins in the central nervous system. In the field of protein misfolding diseases, the term “strain” describes the heterogeneity observed within the same disease in the clinical and pathologic progression, biochemical features of the aggregated protein, conformational memory and pattern of lesions. In this work, the two most common strains of Creutzfeldt-Jakob Disease (CJD), named MM1 and VV2, were analyzed. This thesis investigates the strain paradigm with the production of new multi-omic data, on which appropriate computational analyses combining bioinformatics, data science and statistical approaches were performed. In this work, genomic and transcriptomic profiling allowed an improved characterization of the molecular features of the two most common strains of CJD, identifying multiple possible genetic contributors to the disease and finding several shared impaired pathways between the VV2 strain and Parkinson's disease. On the epigenomic level, the tridimensional chromatin folding in peripheral immune cells of CJD patients at onset and of healthy controls was investigated with Hi-C. As the first application of this advanced technology to prion diseases, and one of the first in neurobiology in general, this work found a significant and diffuse loss of genomic interactions in immune cells of CJD patients at disease onset, particularly in the PRNP locus, suggesting a possible impairment of chromatin conformation in the disease. The results of this project represent a novelty in the state of the art in this field, both from a biomedical and a technological point of view.
Abstract:
There are several tools in the literature that support innovation in organizations. Some of the most cited are the so-called technology roadmapping methods, also known as TRM. However, these methods are designed primarily for organizations that adopt the market pull strategy of technology-product integration. Organizations that adopt the technology push integration strategy are neglected in the literature. Furthermore, with the advent of open innovation, it is possible to note the need to consider the adoption of partnerships in the innovation process. Thus, this study proposes a method of technology roadmapping, identified as method for technology push (MTP), applicable to organizations that adopt the technology push integration strategy, such as SMEs and independent research centers in an open-innovation environment. The method was developed through action-research and was assessed from two analytical standpoints: externally, via a specific literature review on its theoretical contributions, and internally, through the analysis of potential users' perceptions of the feasibility of applying MTP. The results indicate both the unique character of the method and its perceived implementation feasibility. Future research is suggested in order to validate the method in different types of organizations. (C) 2011 Elsevier Ltd. All rights reserved.
Abstract:
Many authors point out that the front-end of new product development (NPD) is a critical success factor in the NPD process and that numerous companies face difficulties in carrying it out appropriately. Therefore, it is important to develop new theories and proposals that support the effective implementation of this earliest phase of NPD. This paper presents a new method to support the development of front-end activities based on integrating technology roadmapping (TRM) and project portfolio management (PPM). This new method, called the ITP Method, was implemented at a small Brazilian high-tech company in the nanotechnology industry to explore the integration proposal. The case study demonstrated that the ITP Method provides a systematic procedure for the fuzzy front-end and integrates innovation perspectives into a single roadmap, which allows for a better alignment of business efforts and communication of product innovation goals. Furthermore, the results indicated that the method may also improve quality, functional integration and strategy alignment. (C) 2010 Elsevier Inc. All rights reserved.
Abstract:
This paper presents a proposal for a Quality Management System for a generic GNSS Surveying Company as an alternative for management and service quality improvements. As a result of the increased demand for GNSS measurements, a large number of new or restructured companies were established to operate in that market. Considering that GNSS surveying is a new process, some changes must be performed in order to adapt the old surveying techniques and old-fashioned management to the new reality. This requires a new management model that must be based on a well-described procedure sequence aiming at Total Quality Management for the company. The proposed Quality Management System was based on the requirements of the Quality System ISO 9000:2000, applied to the whole company, focusing on the productive process of GNSS surveying work.
Abstract:
Considering the increasing popularity of network-based control systems and the huge adoption of IP networks (such as the Internet), this paper studies the influence of network quality of service (QoS) parameters on quality-of-control parameters. An example of a control loop is implemented using two LonWorks networks (CEA-709.1) interconnected by an emulated IP network, in which important QoS parameters such as delay and delay jitter can be completely controlled. Mathematical definitions are provided according to the literature, and the results of the network-based control loop experiment are presented and discussed.
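The delay and jitter definitions referenced in the abstract above are commonly formalised as the mean one-way packet delay and a smoothed estimate of transit-time variation. The sketch below assumes the RFC 3550-style interarrival jitter estimator (gain 1/16); the paper's exact definitions may differ.

```python
def mean_delay(send_ts, recv_ts):
    """Mean one-way delay from paired send/receive timestamps (seconds)."""
    return sum(r - s for s, r in zip(send_ts, recv_ts)) / len(send_ts)

def interarrival_jitter(send_ts, recv_ts):
    """Smoothed delay-jitter estimate in the style of RFC 3550:
    J += (|D| - J) / 16, where D is the change in transit time
    between consecutive packets."""
    j = 0.0
    prev_transit = recv_ts[0] - send_ts[0]
    for s, r in zip(send_ts[1:], recv_ts[1:]):
        transit = r - s
        d = abs(transit - prev_transit)   # transit-time variation
        j += (d - j) / 16.0               # exponential smoothing
        prev_transit = transit
    return j
```

A constant network delay yields zero jitter, while any variation in transit time drives the estimate above zero, which is the property that matters for the control loop.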
Abstract:
This work presents a case study on technology assessment for power quality improvement devices. A system compatibility test protocol for power quality mitigation devices was developed in order to evaluate the functionality of three-phase voltage restoration devices. In order to validate this test protocol, the micro-DVR, a reduced power development platform for DVR (dynamic voltage restorer) devices, was tested and the results are discussed based on voltage disturbances standards. (C) 2011 Elsevier B.V. All rights reserved.
Abstract:
A green ceramic tape micro-heat exchanger was developed using Low Temperature Co-fired Ceramics (LTCC) technology. The device was designed using Computer-Aided Design (CAD) software, and simulations were made using a Computational Fluid Dynamics package (COMSOL Multiphysics) to evaluate the homogeneity of fluid distribution in the microchannels. Four geometries were proposed and simulated in two and three dimensions to show that geometric details directly affect the distribution of velocity in the micro-heat exchanger channels. The simulation results were quite useful for the design of the microfluidic device. The micro-heat exchanger was then constructed using the LTCC technology and is composed of five thermal exchange plates in cross-flow arrangement and two connecting plates, with all plates stacked to form a device with external dimensions of 26 × 26 × 6 mm³.
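The flow-distribution question studied in COMSOL in the abstract above can be estimated to first order with a hydraulic-resistance analogy. The sketch below is not from the paper: it assumes the approximate Hagen-Poiseuille resistance of a shallow rectangular microchannel and splits the total flow among parallel channels that share the same pressure drop.

```python
def channel_resistance(length, width, height, mu=1.0e-3):
    """Approximate hydraulic resistance (Pa*s/m^3) of a shallow
    rectangular microchannel (valid for height < width);
    mu is the dynamic viscosity (default: water, ~1 mPa*s)."""
    aspect = height / width
    return 12.0 * mu * length / (width * height**3 * (1.0 - 0.63 * aspect))

def flow_distribution(resistances, total_flow):
    """Split a total volumetric flow among parallel channels sharing
    one pressure drop: Q_i is proportional to 1/R_i."""
    conductances = [1.0 / r for r in resistances]
    g_total = sum(conductances)
    return [total_flow * g / g_total for g in conductances]
```

Identical channels receive equal shares, while a channel made shallower by a geometric detail picks up markedly less flow (resistance scales with 1/height³), which is the first-order effect the 2D/3D simulations quantify.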
Abstract:
The application of airborne laser scanning (ALS) technologies in forest inventories has shown great potential to improve the efficiency of forest planning activities. Precise estimates, fast assessment and relatively low complexity can explain the good results in terms of efficiency. The evolution of GPS and inertial measurement technologies, as well as the observed lower assessment costs when these technologies are applied to large-scale studies, can explain the increasing dissemination of ALS technologies. The observed good quality of results can be expressed by estimates of volumes and basal area with estimated error below the level of 8.4%, depending on the size of the sampled area, the quantity of laser pulses per square meter and the number of control plots. This paper analyzes the potential of an ALS assessment to produce certain forest inventory statistics in plantations of cloned Eucalyptus spp with precision equal or superior to conventional methods. The statistics of interest in this case were: volume, basal area, mean height and dominant trees mean height. The ALS flight for data assessment covered two strips of approximately 2 × 20 km, in which clouds of points were sampled in circular plots with a radius of 13 m. Plots were sampled in different parts of the strips to cover different stand ages. From the clouds of points generated by the ALS assessment, the following statistics were calculated: overall height mean, standard error, five percentiles (the heights below which 10%, 30%, 50%, 70% and 90% of the ALS points above ground level fall), and the density of points above ground level in each percentile. The ALS statistics were used in regression models to estimate mean diameter, mean height, mean height of dominant trees, basal area and volume. Conventional forest inventory sample plots provided real data.
For volume, an exploratory assessment involving different combinations of ALS statistics allowed for the definition of the most promising relationships and fitting tests based on well-known forest biometric models. The models based on ALS statistics that produced the best results involved: the 30% percentile to estimate mean diameter (R² = 0.88 and MQE% = 0.0004); the 10% and 90% percentiles to estimate mean height (R² = 0.94 and MQE% = 0.0003); the 90% percentile to estimate dominant height (R² = 0.96 and MQE% = 0.0003); the 10% percentile and mean height of ALS points to estimate basal area (R² = 0.92 and MQE% = 0.0016); and, to estimate volume, age and the 30% and 90% percentiles (R² = 0.95 and MQE% = 0.002). Among the tested forest biometric models, the best fits were provided by the modified Schumacher using age and the 90% percentile, the modified Clutter using age, mean height of ALS points and the 70% percentile, and the modified Buckman using age, mean height of ALS points and the 10% percentile.
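The two computational steps described in the abstract above, deriving height percentiles from an ALS point cloud and regressing stand variables on them, can be sketched as follows. The specific model form shown (ln V as a linear function of 1/age and the log of the 90% percentile) is an assumed illustration of a modified Schumacher fit, not the authors' exact equation.

```python
import numpy as np

def als_height_percentiles(heights, percentiles=(10, 30, 50, 70, 90)):
    """Height percentiles of above-ground ALS returns for one plot."""
    h = np.asarray(heights, dtype=float)
    h = h[h > 0]                       # keep returns above ground level
    return {p: np.percentile(h, p) for p in percentiles}

def fit_modified_schumacher(age, p90, volume):
    """Ordinary least-squares fit of an assumed modified Schumacher
    form: ln(V) = b0 + b1/age + b2*ln(P90). Returns (b0, b1, b2)."""
    age = np.asarray(age, dtype=float)
    A = np.column_stack([np.ones_like(age), 1.0 / age, np.log(p90)])
    coef, *_ = np.linalg.lstsq(A, np.log(volume), rcond=None)
    return coef
```

With plot-level percentiles as predictors, model quality would then be judged by R² and relative mean squared error, as reported in the abstract.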
Abstract:
Turtle excluder devices (TEDs) are being trialed on a voluntary basis in many Australian prawn (shrimp) trawl fisheries to reduce sea turtle captures. Analysis of TED introductions into shrimp trawl fisheries of the United States provided major insights into why conflicts occurred between shrimpers, conservationists, and government agencies. A conflict over the introduction and subsequent regulation of TEDs occurred because the problem and the solution were perceived differently by the various stakeholders. Attempts to negotiate and mediate the conflict broke down, resulting in litigation against the U.S. government by conservationists and shrimpers. Litigation was not an efficient resolution to the sea turtle-TED-trawl conflict, but it appears that litigation was the only remaining path of resolution once the issue became polarized. We review two major Australian trawl fisheries to identify any significant differences in circumstances that may affect TED acceptance. Australian trawl fisheries are structured differently, and good communication occurs between industry and researchers. TEDs are being introduced as mature technology. Furthermore, bycatch issues are of increasing concern to all stakeholders. These factors, combined with insights derived from previous conflicts concerning TEDs in the United States, increase the possibility that TEDs will be introduced to Australian fishers with better acceptance.