77 results for Data Acquisition Methods.
Abstract:
BACKGROUND: The annotation of protein post-translational modifications (PTMs) is an important task of UniProtKB curators and, with continuing improvements in experimental methodology, an ever greater number of articles are being published on this topic. To help curators cope with this growing body of information we have developed a system which extracts information from the scientific literature for the most frequently annotated PTMs in UniProtKB. RESULTS: The procedure uses a pattern-matching and rule-based approach to extract sentences with information on the type and site of modification. A ranked list of protein candidates for the modification is also provided. For PTM extraction, precision varies from 57% to 94%, and recall from 75% to 95%, according to the type of modification. The procedure was used to track new publications on PTMs and to recover potential supporting evidence for phosphorylation sites annotated based on the results of large scale proteomics experiments. CONCLUSIONS: The information retrieval and extraction method we have developed in this study forms the basis of a simple tool for the manual curation of protein post-translational modifications in UniProtKB/Swiss-Prot. Our work demonstrates that even simple text-mining tools can be effectively adapted for database curation tasks, provided that a thorough understanding of the working process and requirements is first obtained. This system can be accessed at http://eagl.unige.ch/PTM/.
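As a rough illustration of the pattern-matching approach described, a rule for extracting modification type and site from a sentence might be sketched as follows. The pattern and example sentence are a hypothetical simplification; the actual rules of the UniProtKB system are more elaborate.

```python
import re

# Hypothetical, simplified pattern: modification keyword followed by a
# residue-position site (e.g. "phosphorylation of Ser-15").
PTM_PATTERN = re.compile(
    r"(?P<ptm>phosphorylation|acetylation|methylation|ubiquitination)"
    r"\s+(?:of|at|on)\s+"
    r"(?P<site>(?:Ser|Thr|Tyr|Lys|Arg)-?\d+)",
    re.IGNORECASE,
)

def extract_ptms(sentence):
    """Return (modification type, site) pairs found in a sentence."""
    return [(m.group("ptm").lower(), m.group("site"))
            for m in PTM_PATTERN.finditer(sentence)]

hits = extract_ptms("We observed phosphorylation of Ser-15 and acetylation at Lys-382.")
print(hits)  # [('phosphorylation', 'Ser-15'), ('acetylation', 'Lys-382')]
```

A real system would additionally rank candidate proteins for each extracted site, as the abstract describes.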
Abstract:
Land plants have had the reputation of being problematic for DNA barcoding for two general reasons: (i) the standard DNA regions used in algae, animals and fungi have exceedingly low levels of variability and (ii) the typically used land plant plastid phylogenetic markers (e.g. rbcL, trnL-F, etc.) appear to have too little variation. However, no one has assessed how well current phylogenetic resources might work in the context of identification (versus phylogeny reconstruction). In this paper, we make such an assessment, particularly with two of the markers commonly sequenced in land plant phylogenetic studies, plastid rbcL and internal transcribed spacers of the large subunits of nuclear ribosomal DNA (ITS), and find that both of these DNA regions perform well even though the data currently available in GenBank/EBI were not produced to be used as barcodes and BLAST searches are not an ideal tool for this purpose. These results bode well for the use of even more variable regions of plastid DNA (such as, for example, psbA-trnH) as barcodes, once they have been widely sequenced. In the short term, efforts to bring land plant barcoding up to the standards being used now in other organisms should make swift progress. There are two categories of DNA barcode users: scientists in fields other than taxonomy, and taxonomists. For the former, the use of mitochondrial and plastid DNA, the two most easily assessed genomes, is at least in the short term a useful tool that permits them to get on with their studies, which depend on knowing roughly which species or species groups they are dealing with, but these same DNA regions have important drawbacks for use in taxonomic studies (i.e. studies designed to elucidate species limits). For these purposes, DNA markers from uniparentally (usually maternally) inherited genomes can only provide half of the story required to improve taxonomic standards being used in DNA barcoding.
In the long term, we will need to develop more sophisticated barcoding tools, which would be multiple, low-copy nuclear markers with sufficient genetic variability and PCR-reliability; these would permit the detection of hybrids and permit researchers to identify the 'genetic gaps' that are useful in assessing species limits.
Abstract:
BACKGROUND: PCR has the potential to detect and precisely quantify specific DNA sequences, but it is not yet often used as a fully quantitative method. A number of data collection and processing strategies have been described for the implementation of quantitative PCR. However, they can be experimentally cumbersome, their relative performances have not been evaluated systematically, and they often remain poorly validated statistically and/or experimentally. In this study, we evaluated the performance of known methods, and compared them with newly developed data processing strategies in terms of resolution, precision and robustness. RESULTS: Our results indicate that simple methods that do not rely on the estimation of the efficiency of the PCR amplification may provide reproducible and sensitive data, but that they do not quantify DNA with precision. Other evaluated methods based on sigmoidal or exponential curve fitting were generally of both poor resolution and precision. A statistical analysis of the parameters that influence efficiency indicated that it depends mostly on the selected amplicon and to a lesser extent on the particular biological sample analyzed. Thus, we devised various strategies based on individual or averaged efficiency values, which were used to assess the regulated expression of several genes in response to a growth factor. CONCLUSION: Overall, qPCR data analysis methods differ significantly in their performance, and this analysis identifies methods that provide DNA quantification estimates of high precision, robustness and reliability. These methods allow reliable estimations of relative expression ratio of two-fold or higher, and our analysis provides an estimation of the number of biological samples that have to be analyzed to achieve a given precision.
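As an illustration of the efficiency-based strategies evaluated here, one widely used efficiency-corrected model (Pfaffl's relative expression ratio) can be sketched as follows. The efficiency and Ct values are invented for the example; this is not the new data processing strategy the study itself devises.

```python
def relative_expression(e_target, e_ref, dct_target, dct_ref):
    """Efficiency-corrected relative expression ratio (Pfaffl model).

    e_*   : amplification efficiency per cycle (2.0 = perfect doubling)
    dct_* : Ct(control) - Ct(treated) for the target and reference genes
    """
    return (e_target ** dct_target) / (e_ref ** dct_ref)

# Illustrative values: a target amplifying with efficiency 1.95 whose Ct
# drops by 3 cycles after treatment, normalised to a reference gene with
# efficiency 2.0 and unchanged Ct.
ratio = relative_expression(1.95, 2.0, 3.0, 0.0)
print(round(ratio, 2))  # 7.41
```

Using the measured per-amplicon efficiency rather than assuming perfect doubling is exactly where, per the abstract, much of the precision is gained or lost.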
Abstract:
For the development and evaluation of cardiac magnetic resonance (MR) imaging sequences and methodologies, the availability of a periodically moving phantom to model respiratory and cardiac motion would be of substantial benefit. Given the specific physical boundary conditions in an MR environment, the choice of materials and power source of such phantoms is heavily restricted. Sophisticated commercial solutions are available; however, they are often relatively costly and user-specific modifications may not easily be implemented. We therefore sought to construct a low-cost MR-compatible motion phantom that could be easily reproduced and had design flexibility. A commercially available K'NEX construction set (Hyper Space Training Tower, K'NEX Industries, Inc., Hatfield, PA) was used to construct a periodically moving phantom head. The phantom head performs a translation with a superimposed rotation, driven by a motor over a 2-m rigid rod. To synchronize the MR data acquisition with phantom motion (without introducing radiofrequency-related image artifacts), a fiberoptic control unit generates periodic trigger pulses synchronized to the phantom motion. Total material costs of the phantom are less than US$ 200, and a total of 80 man-hours were required to design and construct the original phantom. With schematics of the present solution, the phantom reproduction may be achieved in approximately 15 man-hours. The presented MR-compatible periodically moving phantom can easily be reproduced, and user-specific modifications may be implemented. Such an approach allows a detailed investigation of motion-related phenomena in MR images.
Abstract:
BACKGROUND: Several European HIV observational databases have, over the last decade, accumulated a substantial number of resistance test results and developed large sample repositories. There is a need to link these efforts together. We here describe the development of a novel tool that binds these databases together in a distributed fashion, in which control of the data remains with the cohorts rather than in a classic data merger. METHODS: As proof-of-concept we entered two basic queries into the tool: available resistance tests and available samples. We asked for patients still alive after 1998-01-01 and between 180 and 195 cm in height, and how many samples or resistance tests would be available for these patients. The queries were uploaded with the tool to a central web server, from which each participating cohort downloaded the queries with the tool and ran them against their database. The numbers gathered were then submitted back to the server, where we could accumulate the number of available samples and resistance tests. RESULTS: We obtained the following results from the cohorts (available samples/resistance tests): EuResist: not available/11,194; EuroSIDA: 20,716/1,992; ICONA: 3,751/500; Rega: 302/302; SHCS: 53,783/1,485. In total, 78,552 samples and 15,473 resistance tests were available among these five cohorts.
Once these data items have been identified, it is trivial to generate lists of relevant samples that would be useful for ultra-deep sequencing in addition to the already available resistance tests. Soon the tool will include small analysis packages that allow each cohort to pull a report on their cohort profile and also survey emerging resistance trends in their own cohort. CONCLUSIONS: We plan to provide this tool to all cohorts within the Collaborative HIV and Anti-HIV Drug Resistance Network (CHAIN) and will provide it free of charge to others for any non-commercial use. The potential of this tool is to ease collaborations, for example in projects requiring data to speed up the identification of novel resistance mutations by increasing the number of observations across multiple cohorts, instead of waiting for single cohorts or studies to reach the critical number needed to address such issues.
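The central aggregation step of such a distributed query can be sketched as follows; each cohort runs the query locally and submits only aggregate counts, so patient-level data never leave the cohort. The structure is a hypothetical simplification of the tool, using the per-cohort figures reported in the abstract (samples/resistance tests).

```python
# Per-cohort counts returned for the proof-of-concept query;
# None marks a count a cohort did not report.
cohort_results = {
    "EuResist": {"samples": None,   "resistance_tests": 11_194},
    "EuroSIDA": {"samples": 20_716, "resistance_tests": 1_992},
    "ICONA":    {"samples": 3_751,  "resistance_tests": 500},
    "Rega":     {"samples": 302,    "resistance_tests": 302},
    "SHCS":     {"samples": 53_783, "resistance_tests": 1_485},
}

def aggregate(results, key):
    """Sum one count across cohorts, skipping cohorts that did not report it."""
    return sum(r[key] for r in results.values() if r[key] is not None)

print(aggregate(cohort_results, "samples"))           # 78552
print(aggregate(cohort_results, "resistance_tests"))  # 15473
```

The sums reproduce the totals stated in the abstract, which is a useful consistency check on the reported per-cohort figures.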
Abstract:
It is well established that cancer cells can recruit CD11b(+) myeloid cells to promote tumor angiogenesis and tumor growth. Increasing interest has emerged on the identification of subpopulations of tumor-infiltrating CD11b(+) myeloid cells using flow cytometry techniques. In the literature, however, discrepancies exist on the phenotype of these cells (Coffelt et al., Am J Pathol 2010;176:1564-1576). Since flow cytometry analysis requires particular precautions for accurate sample preparation and reliable data acquisition, analysis, and interpretation, some discrepancies might be due to technical reasons rather than biological grounds. We used the syngeneic orthotopic 4T1 mammary tumor model in immunocompetent BALB/c mice to analyze and compare the phenotype of CD11b(+) myeloid cells isolated from peripheral blood and from tumors, using six-color flow cytometry. We report here that nonspecific antibody binding through Fc receptors and the presence of dead cells and cell doublets in tumor-derived samples combine to generate artifacts in the phenotype of tumor-infiltrating CD11b(+) subpopulations. We show that the heterogeneity of tumor-infiltrating CD11b(+) subpopulations analyzed without particular precautions was greatly reduced upon Fc block treatment and the exclusion of dead cells and cell doublets. Phenotyping of tumor-infiltrating CD11b(+) cells was particularly sensitive to these parameters compared to circulating CD11b(+) cells. Taken together, our results identify Fc block treatment and the exclusion of dead cells and cell doublets as simple but crucial steps for the proper analysis of tumor-infiltrating CD11b(+) cell populations.
Abstract:
BACKGROUND: Used in conjunction with biological surveillance, behavioural surveillance provides data allowing for a more precise definition of HIV/STI prevention strategies. In 2008, mapping of behavioural surveillance in EU/EFTA countries was performed on behalf of the European Centre for Disease Prevention and Control. METHOD: Nine questionnaires were sent to all 31 Member States and EEA/EFTA countries requesting data on the overall behavioural and second generation surveillance system and on surveillance in the general population, youth, men having sex with men (MSM), injecting drug users (IDU), sex workers (SW), migrants, people living with HIV/AIDS (PLWHA), and sexually transmitted infection (STI) clinics patients. Requested data included information on system organisation (e.g. sustainability, funding, institutionalisation), topics covered in surveys and main indicators. RESULTS: Twenty-eight of the 31 countries contacted supplied data. Sixteen countries reported an established behavioural surveillance system, and 13 a second generation surveillance system (combination of biological surveillance of HIV/AIDS and STI with behavioural surveillance). There were wide differences as regards the year of survey initiation, number of populations surveyed, data collection methods used, organisation of surveillance and coordination with biological surveillance. The populations most regularly surveyed are the general population, youth, MSM and IDU. SW, patients of STI clinics and PLWHA are surveyed less regularly and in only a small number of countries, and few countries have undertaken behavioural surveys among migrant or ethnic minority populations. In many cases, the identification of populations with risk behaviour and the selection of populations to be included in a behavioural surveillance system have not been formally conducted, or are incomplete. Topics most frequently covered are similar across countries, although many different indicators are used.
In most countries, sustainability of surveillance systems is not assured. CONCLUSION: Although many European countries have established behavioural surveillance systems, there is little harmonisation as regards the methods and indicators adopted. The main challenge now faced is to build and maintain organised and functional behavioural and second generation surveillance systems across Europe, to increase collaboration, to promote robust, sustainable and cost-effective data collection methods, and to harmonise indicators.
Abstract:
The LHCb experiment is being built at the future LHC accelerator at CERN. It is a forward single-arm spectrometer dedicated to precision measurements of CP violation and rare decays in the b-quark sector. It is presently finishing its R&D and final design stage; construction has already started for the magnet and calorimeters. In the Standard Model, CP violation arises via the complex phase of the 3 x 3 CKM (Cabibbo-Kobayashi-Maskawa) quark mixing matrix. The LHCb experiment will test the unitarity of this matrix by measuring, in several theoretically unrelated ways, all angles and sides of the so-called "unitarity triangle". This will make it possible to over-constrain the model and, hopefully, to exhibit inconsistencies that would be a signal of physics beyond the Standard Model. Vertex reconstruction is a fundamental requirement for the LHCb experiment. Displaced secondary vertices are a distinctive feature of b-hadron decays, and this signature is used in the LHCb topological trigger. The Vertex Locator (VeLo) has to provide precise measurements of track coordinates close to the interaction region. These are used to reconstruct production and decay vertices of beauty hadrons and to provide accurate measurements of their decay lifetimes. The Vertex Locator electronics is an essential part of the data acquisition system and must conform to the overall LHCb electronics specification. The design of the electronics must maximise the signal-to-noise ratio in order to achieve the best track reconstruction performance in the detector. The electronics was designed in parallel with the silicon detector development and went through several prototyping phases, which are described in this thesis.
Abstract:
Changes in human lives are studied in psychology, sociology, and adjacent fields as outcomes of developmental processes, institutional regulations and policies, culturally and normatively structured life courses, or empirical accounts. However, such studies have used a wide range of complementary, but often divergent, concepts. This review has two aims. First, we report on the structure that has emerged from scientific life course research by focusing on abstracts from longitudinal and life course studies beginning with the year 2000. Second, we provide a sense of the disciplinary diversity of the field and assess the value of the concept of 'vulnerability' as a heuristic tool for studying human lives. Applying correspondence analysis to 10,632 scientific abstracts, we find a disciplinary divide between psychology and sociology, and observe indications of both similarities and differences between studies, driven at least partly by the data and methods employed. We also find that vulnerability takes a central position in this scientific field, which leads us to suggest several reasons to see value in pursuing theory development for longitudinal and life course studies in this direction.
Abstract:
PURPOSE: To test the hypothesis that both coronary anatomy and ventricular function can be assessed simultaneously using a single four-dimensional (4D) acquisition. METHODS: A free-running 4D whole-heart self-navigated acquisition incorporating a golden angle radial trajectory was implemented and tested in vivo in nine healthy adult human subjects. Coronary magnetic resonance angiography (MRA) datasets with retrospective selection of acquisition window width and position were extracted and quantitatively compared with baseline self-navigated electrocardiography (ECG) -triggered coronary MRA. From the 4D datasets, the left-ventricular end-systolic and end-diastolic volumes (ESV and EDV) and ejection fraction (EF) were computed and compared with values obtained from conventional 2D cine images. RESULTS: The 4D datasets enabled dynamic assessment of the whole heart with an isotropic spatial resolution of 1.15 mm. Coronary artery image quality was very similar to that of the ECG-triggered baseline scan despite some SNR penalty. A good agreement between 4D and 2D cine imaging was found for EDV, ESV, and EF. CONCLUSION: The hypothesis that both coronary anatomy and ventricular function can be assessed simultaneously in vivo was confirmed. Retrospective and flexible acquisition window selection makes it possible to visualize each coronary segment at its individual time point of quiescence. Magn Reson Med, 2014. © 2014 Wiley Periodicals, Inc.
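The ejection fraction compared between the 4D and 2D cine datasets is derived directly from the two ventricular volumes. A minimal sketch, with illustrative volumes rather than values from the study:

```python
def ejection_fraction(edv_ml, esv_ml):
    """Left-ventricular ejection fraction (%) from end-diastolic and
    end-systolic volumes, EF = (EDV - ESV) / EDV * 100."""
    return 100.0 * (edv_ml - esv_ml) / edv_ml

# Illustrative volumes (mL), not from the study:
print(round(ejection_fraction(140.0, 56.0), 1))  # 60.0
```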
Abstract:
PURPOSE OF REVIEW: Only 5% of Alzheimer's disease cases are explained by genetic mutations, whereas the remaining 95% are sporadic. The pathophysiological mechanisms underlying sporadic Alzheimer's disease are not well understood, suggesting a complex multifactorial cause. This review summarizes the recent findings on research aiming to show how biomarkers can be used for revealing the underlying mechanisms of preclinical stage Alzheimer's disease and help in their diagnosis. RECENT FINDINGS: Successful, publicly accessible repositories provide longitudinal brain images together with clinical, genetic and proteomic information on Alzheimer's disease. Combined with increasingly sophisticated data analysis methods, these repositories offer a great opportunity for the discovery of new biomarkers. Innovative studies have validated theoretical models of disease progression, demonstrating the sequential ordering of well-established biomarkers. Novel observations shed light on the interaction between biomarkers to confirm that disease progression is related to multiple pathological factors. A typical example is the tau-associated neuronal toxicity that can be additionally potentiated by amyloid β peptides. Adding further complexity, studies report specific impact of common genetic variants that can be traced from childhood through middle age up to the symptomatic onset of Alzheimer's disease. SUMMARY: The discovery of efficient therapies to prevent the disease or modify the progression of disease requires a more thorough understanding of the underlying biological processes. Neuroimaging, genetic and proteomic biomarkers for Alzheimer's disease are critically discussed and proposed to be included in clinical descriptions and diagnostic guidelines.
Abstract:
Imaging in neuroscience, clinical research and pharmaceutical trials often employs the 3D magnetisation-prepared rapid gradient-echo (MPRAGE) sequence to obtain structural T1-weighted images with high spatial resolution of the human brain. Typical research and clinical routine MPRAGE protocols with ~1 mm isotropic resolution require data acquisition time in the range of 5-10 min and often use only a moderate two-fold acceleration factor for parallel imaging. Recent advances in MRI hardware and acquisition methodology promise improved leverage of the MR signal and more benign artefact properties, in particular when employing increased acceleration factors in clinical routine and research. In this study, we examined four variants of a four-fold-accelerated MPRAGE protocol (2D-GRAPPA, CAIPIRINHA, CAIPIRINHA elliptical, and segmented MPRAGE) and compared clinical readings, basic image quality metrics (SNR, CNR), and automated brain tissue segmentation for morphological assessments of brain structures. The results were benchmarked against a widely-used two-fold-accelerated 3T ADNI MPRAGE protocol that served as reference in this study. Twenty-two healthy subjects (age 20-44 yrs) were imaged with all MPRAGE variants in a single session. An experienced reader rated all images as being of clinically useful image quality. CAIPIRINHA MPRAGE scans were perceived on average to be of identical value for reading as the reference ADNI-2 protocol. SNR and CNR measurements exhibited the theoretically expected performance at the four-fold acceleration. The results of this study demonstrate that the four-fold accelerated protocols introduce systematic biases in the segmentation results of some brain structures compared to the reference ADNI-2 protocol. Furthermore, results suggest that the increased noise levels in the accelerated protocols play an important role in introducing these biases, at least under the present study conditions.
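The basic image quality metrics compared above can be sketched with one common ROI-based definition: signal (or signal difference) over the standard deviation of background noise. This is a hypothetical simplification with invented intensity samples, not the study's measurement protocol (background-noise statistics under parallel imaging need more care in practice).

```python
import statistics

def snr(signal_roi, noise_roi):
    """Signal-to-noise ratio: mean ROI intensity over noise standard
    deviation (one common ROI-based definition)."""
    return statistics.mean(signal_roi) / statistics.stdev(noise_roi)

def cnr(roi_a, roi_b, noise_roi):
    """Contrast-to-noise ratio between two tissue ROIs."""
    return abs(statistics.mean(roi_a) - statistics.mean(roi_b)) / statistics.stdev(noise_roi)

# Illustrative intensity samples (grey matter, white matter, background):
gm = [520, 510, 530, 500]
wm = [700, 690, 710, 705]
bg = [10, 14, 8, 12]
print(round(snr(gm, bg), 1))      # 199.5
print(round(cnr(gm, wm, bg), 1))  # 72.1
```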
Abstract:
PURPOSE: The purpose of our study was to assess whether a model combining clinical factors, MR imaging features, and genomics would better predict overall survival of patients with glioblastoma (GBM) than either individual data type. METHODS: The study was conducted leveraging The Cancer Genome Atlas (TCGA) effort supported by the National Institutes of Health. Six neuroradiologists reviewed MRI images from The Cancer Imaging Archive (http://cancerimagingarchive.net) of 102 GBM patients using the VASARI scoring system. The patients' clinical and genetic data were obtained from the TCGA website (http://www.cancergenome.nih.gov/). Patient outcome was measured in terms of overall survival time. The association between different categories of biomarkers and survival was evaluated using Cox analysis. RESULTS: The features that were significantly associated with survival were: (1) clinical factors: chemotherapy; (2) imaging: proportion of tumor contrast enhancement on MRI; and (3) genomics: HRAS copy number variation. The combination of these three biomarkers resulted in an incremental increase in the strength of prediction of survival, with the model that included clinical, imaging, and genetic variables having the highest predictive accuracy (area under the curve 0.679±0.068, Akaike's information criterion 566.7, P<0.001). CONCLUSION: A combination of clinical factors, imaging features, and HRAS copy number variation best predicts survival of patients with GBM.
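Predictive accuracy of survival models, as reported above, is often summarised by a concordance measure such as Harrell's C-index: the fraction of usable patient pairs in which the patient with shorter survival has the higher predicted risk. A minimal pure-Python sketch with invented data and simplified handling of ties and censoring; this is not the study's actual analysis code.

```python
from itertools import combinations

def concordance_index(times, events, risk_scores):
    """Harrell's C-index. A pair is usable only if the patient with the
    shorter time had an observed event (events: 1=event, 0=censored).
    Tied times are skipped for simplicity."""
    concordant = tied = usable = 0
    for i, j in combinations(range(len(times)), 2):
        if times[j] < times[i]:
            i, j = j, i  # make i the patient with the shorter time
        if times[i] == times[j] or not events[i]:
            continue  # tied times, or censored before the other's time
        usable += 1
        if risk_scores[i] > risk_scores[j]:
            concordant += 1
        elif risk_scores[i] == risk_scores[j]:
            tied += 1
    return (concordant + 0.5 * tied) / usable

# Illustrative data (months, event observed, model risk score):
times = [5, 12, 20, 30]
events = [1, 1, 0, 1]
risk = [0.9, 0.7, 0.4, 0.2]
print(concordance_index(times, events, risk))  # 1.0
```

Here risk ordering exactly reverses survival ordering, so all usable pairs are concordant; a value of 0.5 would indicate no better than chance.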
Abstract:
Introduction: CD103 is a specific integrin present on some CD4+ lymphocytes of the mucosal immune system. It has been hypothesized that most CD4+ lymphocytes in pulmonary sarcoidosis do not originate from mucosal sites but from redistribution from the peripheral blood, and therefore do not bear the CD103 integrin. Several studies have suggested that a low CD103+ percentage among bronchoalveolar lavage (BAL) CD4+ lymphocytes discriminates between sarcoidosis and other causes of lymphocytic alveolitis, but contradictory data exist. Methods: We reviewed 1151 consecutive patients with BAL lymphocytosis >10% and flow cytometry performed between 2006 and 2014. 944 cases were excluded due to poor BAL quality (n=97), unavailable clinical data (n=760), or unclear diagnosis (n=87). The remaining 207 patients were grouped into 9 diagnostic categories. To assess the discriminative value of the CD103+CD4+/CD4+ ratio to distinguish sarcoidosis from the other entities, areas under the ROC curve (AUC) were determined. Results: Sarcoidosis patients (n=53) had a lower CD103+CD4+/CD4+ ratio than the other diagnostic categories. AUC was 62% for sarcoidosis compared to all other patients and 69% for sarcoidosis compared to other interstitial lung diseases. When combining CD103+CD4+/CD4+ and CD4+/CD8+ ratios, AUC increased to 76% and 78% respectively. When applying published cut-offs from 4 previous studies to our population, AUC varied between 54% and 73%. Conclusions: The CD103+CD4+/CD4+ ratio does not accurately discriminate between sarcoidosis and other causes of lymphocytic alveolitis, neither alone nor in combination with CD4+/CD8+ ratio, and is not a relevant marker for the diagnosis of sarcoidosis.
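The AUC figures reported above have a direct empirical interpretation via the Mann-Whitney statistic: the probability that a randomly chosen case scores higher than a randomly chosen control (ties counting one half). A minimal sketch with made-up ratio values, not data from the study:

```python
def auc_from_groups(scores_pos, scores_neg):
    """Empirical ROC AUC as the Mann-Whitney probability that a positive
    case outscores a negative one (ties count 0.5)."""
    wins = sum(
        1.0 if p > n else 0.5 if p == n else 0.0
        for p in scores_pos
        for n in scores_neg
    )
    return wins / (len(scores_pos) * len(scores_neg))

# Illustrative CD4+/CD8+ ratios (sarcoidosis vs. other diagnoses);
# values invented for the example:
sarcoid = [4.0, 5.5, 3.2, 6.1]
others = [1.0, 2.5, 3.2, 0.8]
print(auc_from_groups(sarcoid, others))  # 0.96875
```

An AUC near 0.5, like the 54% reported for some published cut-offs, means the marker barely outperforms chance.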