96 results for Application specific instruction-set processor


Relevance:

30.00%

Publisher:

Abstract:

Occupational exposure assessment is an important stage in the management of chemical exposures. Few direct measurements are carried out in workplaces, and exposures are often estimated based on expert judgement. There is therefore a major need for simple, transparent tools to help occupational health specialists define exposure levels. The aim of the present research is to develop and improve modelling tools to predict exposure levels. In a first step, a survey was carried out among occupational hygienists in Switzerland to define their expectations of modelling tools (types of results, models and potential observable parameters). It was found that exposure models are rarely used in practice in Switzerland and that exposures are mainly estimated from the expert's past experience. Moreover, chemical emissions and their dispersion near the source were considered key parameters. Experimental and modelling studies were then performed in specific cases to test the flexibility and drawbacks of existing tools. In particular, models were applied to assess occupational exposure to carbon monoxide in different situations, and the results were compared with exposure levels reported in the literature for similar situations. Further, exposure to waterproofing sprays was studied as part of an epidemiological study on a Swiss cohort. In this case, laboratory investigations were undertaken to characterize the emission rate of waterproofing oversprays, and a classical two-zone model was used to assess aerosol dispersion in the near and far field during spraying. Experiments were also carried out to better understand the emission and dispersion processes of tracer compounds, focusing on the characterization of near-field exposure. An experimental set-up was developed to perform simultaneous measurements at several points of an exposure chamber with direct-reading instruments. It was found that, from a statistical point of view, the compartmental theory makes sense, but that the attribution to a given compartment could not be done on the basis of simple geometric considerations. In a further step, the experimental data were complemented by observations made in about 100 different workplaces, including exposure measurements and observation of predefined determinants. The data obtained were used to improve an existing two-compartment exposure model: a tool was developed to include specific determinants in the choice of the compartment, thus largely improving the reliability of the predictions. All these investigations helped to improve our understanding of modelling tools and to identify their limitations. The integration of more accessible determinants, in accordance with experts' needs, should encourage model application in field practice. Moreover, by increasing the quality of modelling tools, this research will not only encourage their systematic use, but may also improve the conditions in which expert judgements take place, and therefore the protection of workers' health.
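
For readers unfamiliar with the two-zone (near-field/far-field) model mentioned above, the sketch below shows the classical mass balance in minimal form. All parameter values (emission rate G, inter-zone airflow beta, ventilation rate Q, zone volumes) are illustrative placeholders, not values from this study.

```python
# Minimal sketch of a classical two-zone (near-field/far-field) exposure
# model. All numbers are illustrative placeholders, not study values.
import numpy as np

G = 5.0       # emission rate, mg/min (assumed)
beta = 5.0    # near-field/far-field air exchange, m^3/min (assumed)
Q = 20.0      # room ventilation rate, m^3/min (assumed)
V_nf = 1.0    # near-field volume, m^3 (assumed)
V_ff = 100.0  # far-field volume, m^3 (assumed)

dt, t_end = 0.01, 60.0                 # minutes
n = int(t_end / dt)
c_nf = np.zeros(n)                     # near-field concentration, mg/m^3
c_ff = np.zeros(n)                     # far-field concentration, mg/m^3

for i in range(1, n):
    # mass balance on each well-mixed zone (explicit Euler step)
    dc_nf = (G + beta * c_ff[i-1] - beta * c_nf[i-1]) / V_nf
    dc_ff = (beta * c_nf[i-1] - beta * c_ff[i-1] - Q * c_ff[i-1]) / V_ff
    c_nf[i] = c_nf[i-1] + dc_nf * dt
    c_ff[i] = c_ff[i-1] + dc_ff * dt

print(f"simulated steady state: NF ~ {c_nf[-1]:.2f}, FF ~ {c_ff[-1]:.2f} mg/m^3")
# Analytical steady state for comparison: C_ff = G/Q, C_nf = G/Q + G/beta
print(f"analytic: FF = {G/Q:.2f}, NF = {G/Q + G/beta:.2f}")
```

The steady-state gap between the two zones, G/beta, is what makes the near-field concentration exceed the room average, which is why emission rate and dispersion around the source were flagged as key parameters in the survey.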

Relevance:

30.00%

Publisher:

Abstract:

Background: Experimental data have suggested that adoptive transfer of CD4+CD25+Foxp3+ regulatory T cells (Tregs), capable of controlling immune responses to specific auto- or alloantigens, could be used as a therapeutic strategy to promote specific tolerance in T-cell mediated diseases and in organ transplantation (Tx). However, before advocating the application of immunotherapy with Tregs in Tx, we need to improve our understanding of their in vivo homeostasis, trafficking pattern and effector function in response to alloantigens. Methods: Donor-antigen specific murine Tregs were generated and characterized in vitro following our described protocols. Using an adoptive transfer and skin allotransplantation model, we analyzed the in vivo expansion and homing of fluorescent-labeled effector T cells (Teff) and Tregs at different time-points after Tx, using flow-cytometry as well as fluorescence microscopy techniques. Results: Tregs expressed CD62L, CCR7 and CD103, allowing their homing into lymphoid and non-lymphoid tissues (gut, skin) after intravenous injection. While hyporesponsive to TCR stimulation in vitro, transferred Tregs survived, migrated to secondary lymphoid organs and preferentially expanded within the allograft-draining lymph nodes. Furthermore, Foxp3+ cells could be detected inside the allograft as early as day 3-5 after Tx. At a much later time-point (day 60 after Tx), graft-infiltrating Foxp3+ cells were also detectable in tolerant recipients. When transferred alone, CD4+CD25- Teff cells expanded within secondary lymphoid organs and infiltrated the allograft by day 3-5 after Tx. The co-transfer of Tregs limited the expansion of alloreactive Teff cells as well as their recruitment into the allograft. The promotion of graft survival observed in the presence of Tregs was in part mediated by the inhibition of the production of effector cytokines by CD4+CD25- T cells. Conclusion: Taken together, our results suggest that the suppression of allograft rejection and the induction of Tx tolerance are in part dependent on the alloantigen-driven homing and expansion of Tregs. Thus, the appropriate localization of Tregs may be critical for their suppressive function in vivo.

Relevance:

30.00%

Publisher:

Abstract:

Background: Intranasal administration of high amounts of allergen was shown to induce tolerance and to reverse the allergic phenotype. However, the mechanisms of tolerance induction via the mucosal route are still unclear. Objectives: To characterize the therapeutic effects of intranasal application of ovalbumin (OVA) in a mouse model of bronchial inflammation, as well as the cellular and molecular mechanisms leading to protection upon re-exposure to allergen. Methods: After induction of bronchial inflammation, mice were treated intranasally with OVA and re-exposed to OVA aerosols 10 days later. Bronchoalveolar lavage fluid (BALF), T cell proliferation and cytokine secretion were examined. The respective roles of CD4(+)CD25(+) and CD4(+)CD25(-) T cells in the induction of tolerance were analysed. Results: Intranasal treatment with OVA drastically reduced inflammatory cell recruitment into BALF and bronchial hyperresponsiveness upon re-exposure to allergen. Both OVA-specific T cell proliferation and T(h)1 and T(h)2 cytokine production from lung and bronchial lymph nodes were inhibited. Transfer of CD4(+)CD25(-) T cells, which strongly expressed membrane-bound transforming growth factor beta (mTGF-beta), from tolerized mice protected asthmatic recipient mice from subsequent aerosol challenges. The presence of CD4(+)CD25(+)(Foxp3(+)) T cells during the process of tolerization was indispensable for CD4(+)CD25(-) T cells to acquire regulatory properties. Whereas the presence of IL-10 appeared dispensable in this model, the suppression of CD4(+)CD25(-)mTGF-beta(+) T cells in transfer experiments significantly impaired the down-regulation of airway inflammation. Conclusion: Nasal application of OVA in established asthma led to the induction of CD4(+)CD25(-)mTGF-beta(+) T cells with regulatory properties, able to confer protection upon allergen re-exposure.

Relevance:

30.00%

Publisher:

Abstract:

The PFAPA syndrome is characterized by periodic fever associated with pharyngitis, cervical adenitis and/or aphthous stomatitis, and belongs to the auto-inflammatory diseases. Diagnostic criteria are based on clinical features and the exclusion of other periodic fever syndromes. An analysis of a large cohort of patients has shown weaknesses in these criteria, and there is a lack of international consensus. An International Conference was held in Morges in November 2008 to propose a new set of classification criteria based on a consensus among experts in the field. We aimed to verify the applicability of this new set of classification criteria. 80 patients diagnosed with PFAPA syndrome from 3 pediatric rheumatology centers (Genoa, Lausanne and Geneva) were included in the study. A detailed description of the clinical and laboratory features was obtained. The new classification criteria and the current diagnostic criteria were applied to the patients. Only 43/80 patients (53.8%) fulfilled all criteria of the new classification. 31 patients were excluded because they did not meet one of the 7 diagnostic criteria, 8 because of 2 criteria, and one because of 3 criteria. When we applied the current criteria to the same patients, 11/80 patients (13.7%) needed to be excluded. 8/80 patients (10%) were excluded from both sets. Exclusion was related to only some of the criteria. Number of patients for each unfulfilled criterion (new classification criteria/current diagnostic criteria): age (1/6), symptoms between episodes (2/2), delayed growth (3/3), main symptoms (21/0), periodicity, length of fever, interval between episodes, and length of disease (19/0). Applying some of the new criteria was not easy, as they were very restrictive and required precise information from the patients. Our work has shown that the new set of classification criteria can be applied to patients suspected of having PFAPA syndrome, but it appears more restrictive than the current diagnostic criteria. Further validation work is needed for this new set of classification criteria to determine whether they allow a good discrimination between PFAPA patients and patients with other recurrent fever syndromes.

Relevance:

30.00%

Publisher:

Abstract:

Failure to detect a species in an area where it is present is a major source of error in biological surveys. We assessed whether it is possible to optimize single-visit biological monitoring surveys of highly dynamic freshwater ecosystems by framing them a priori within a particular period of time. Alternatively, we also searched for the optimal number of visits and when they should be conducted. We developed single-species occupancy models to estimate the monthly probability of detection of pond-breeding amphibians during a four-year monitoring program. Our results revealed that detection probability was species-specific and changed among sampling visits within a breeding season and also among breeding seasons. Therefore, optimizing biological surveys with minimal survey effort (a single visit) is not feasible, as it proves impossible to select a priori a sampling period that remains adequate across years. Alternatively, a two-survey combination at the beginning of the sampling season yielded optimal results and constituted an acceptable compromise between sampling efficacy and survey effort. Our study provides evidence of the variability and uncertainty that likely affect the efficacy of monitoring surveys, highlighting the need for repeated sampling in both ecological studies and conservation management.
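
As an illustration of the survey-design trade-off described above, the sketch below computes the cumulative detection probability p* = 1 - prod(1 - p_i) for single visits and for two-visit combinations. The monthly detection probabilities are invented placeholders, not estimates from the monitoring program.

```python
# The chance of detecting a species at an occupied site over k visits is
# p* = 1 - prod(1 - p_i). The monthly probabilities below are made-up
# placeholder values, not estimates from the study.
import numpy as np
from itertools import combinations

months = ["Feb", "Mar", "Apr", "May"]
p = {"Feb": 0.6, "Mar": 0.5, "Apr": 0.3, "May": 0.15}  # assumed values

def cumulative_detection(visits):
    """P(detected at least once | occupied) for a given set of visits."""
    miss = np.prod([1 - p[m] for m in visits])
    return 1 - miss

# Compare every single visit against every two-visit combination.
for k in (1, 2):
    for combo in combinations(months, k):
        print(f"{k} visit(s) {combo}: p* = {cumulative_detection(combo):.2f}")
```

With numbers like these, any two early-season visits outperform the best single visit, mirroring the abstract's conclusion that a two-survey combination at the start of the season is an acceptable compromise.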

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Synthetic contiguous overlapping peptides (COPs) may represent an alternative to allergen extracts or recombinant allergens for allergen-specific immunotherapy. In combination, COPs encompass the entire allergen sequence, providing all potential T cell epitopes while avoiding the conformational IgE epitopes of the native allergen. METHODS: Individual COPs were derived from the sequence of Bet v 1, the major allergen of birch pollen, and its known crystal structure, and were designed to avoid IgE binding. Three sets of COPs were tested in vitro in competition ELISA and basophil degranulation assays. Their in vivo reactivity was determined by intraperitoneal challenge in rBet v 1-sensitized mice as well as by skin prick tests in volunteers with allergic rhinoconjunctivitis to birch pollen. RESULTS: The combination of three COPs, named AllerT, selected for undetectable IgE binding in competition assays and for the absence of basophil activation in vitro, was unable to induce anaphylaxis in sensitized mice, in contrast to rBet v 1. In addition, no positive reactivity to AllerT was observed in skin prick tests in human volunteers allergic to birch pollen. In contrast, a second set of COPs, AllerT4-T5, displayed some residual IgE binding in competition ELISA and a weak subliminal reactivity in skin prick testing. CONCLUSIONS: The hypoallergenicity of contiguous overlapping peptides was confirmed by low, if any, IgE binding activity in vitro, by the absence of basophil activation, and by the absence of in vivo induction of allergic reactions in mice and humans. TRIAL REGISTRATION: ClinicalTrials.gov NCT01719133.

Relevance:

30.00%

Publisher:

Abstract:

Network analysis naturally relies on graph theory and, more particularly, on the use of node and edge metrics to identify the salient properties in graphs. When building visual maps of networks, these metrics are turned into useful visual cues or are used interactively to filter out parts of a graph while querying it, for instance. Over the years, analysts from different application domains have designed metrics to serve specific needs. Network science is an inherently cross-disciplinary field, which leads to the publication of metrics with similar goals; different names and descriptions of their analytics often mask the similarity between two metrics that originated in different fields. Here, we study a set of graph metrics and compare their relative values and behaviors in an effort to survey their potential contributions to the spatial analysis of networks.
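
As a minimal illustration of this kind of metric comparison, the sketch below computes several standard node-centrality metrics on a test graph and measures their rank agreement. The graph, the metric list, and the use of Spearman correlation are arbitrary demonstration choices, not the study's own protocol.

```python
# Compute several node-centrality metrics on one graph and measure how
# strongly their rankings agree; similar rankings hint at metrics that
# were designed independently but capture overlapping properties.
import networkx as nx
from scipy.stats import spearmanr

G = nx.les_miserables_graph()  # any connected test graph works

metrics = {
    "degree": nx.degree_centrality(G),
    "betweenness": nx.betweenness_centrality(G),
    "closeness": nx.closeness_centrality(G),
    "eigenvector": nx.eigenvector_centrality(G, max_iter=1000),
}

nodes = list(G)
names = list(metrics)
for i, a in enumerate(names):
    for b in names[i + 1:]:
        rho, _ = spearmanr([metrics[a][n] for n in nodes],
                           [metrics[b][n] for n in nodes])
        print(f"{a:>11} vs {b:<11} Spearman rho = {rho:.2f}")
```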

Relevance:

30.00%

Publisher:

Abstract:

PURPOSE: Patients diagnosed with a specific neoplasm tend to have a subsequent excess risk of the same neoplasm. The age incidence of a second neoplasm at the same site is approximately constant with age, and consequently the relative risk is greater at younger age. It is unclear whether such a line of reasoning can be extended from a specific neoplasm to the incidence of all neoplasms in subjects diagnosed with a defined neoplasm. METHODS: We considered the age-specific incidence of all non-hormone-related epithelial neoplasms after a first primary colorectal cancer (n = 9542) in the Vaud Cancer Registry data set. RESULTS: In subjects with a previous colorectal cancer, the incidence rate of all other epithelial non-hormone-related cancers was stable around 800 per 100,000 between age 30 and 60 years, and rose only about twofold to reach 1685 at age 70 to 79 years and 1826 per 100,000 at age 80 years or older. After excluding synchronous cancers, the rise was only about 1.5-fold, that is, from about 700 to 1000. In the general population, the incidence rate of all epithelial non-hormone-related cancers was 29 per 100,000 at age 30 to 39 years, and rose 30-fold to 883 per 100,000 at age 70 to 79 years. Excluding colorectal cancers, the rise of all non-hormone-related cancers was from 360 per 100,000 at age 40 to 49 years to 940 at age 70 to 79 years after colorectal cancer, and from 90 to 636 per 100,000 in the general population (i.e., 2.6- vs. 7.1-fold). CONCLUSIONS: The rise of incidence with age of all epithelial non-hormone-related second cancers after colorectal cancer is much smaller than in the general population. This can possibly be related to the occurrence of a single mutational event in a population of susceptible individuals, although alternative models are plausible within the complexity of the process of carcinogenesis.
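
The fold-rises quoted in the results can be checked directly from the reported incidence rates; a quick sketch:

```python
# Verify the 2.6- vs. 7.1-fold rises quoted in the abstract, using the
# incidence rates (per 100,000) given there for non-hormone-related
# cancers excluding colorectal.
after_crc   = {"40-49": 360, "70-79": 940}   # after colorectal cancer
general_pop = {"40-49": 90,  "70-79": 636}   # general population

print(f"after CRC: {after_crc['70-79'] / after_crc['40-49']:.1f}-fold")    # ~2.6
print(f"general:   {general_pop['70-79'] / general_pop['40-49']:.1f}-fold") # ~7.1
```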

Relevance:

30.00%

Publisher:

Abstract:

INTRODUCTION: PFAPA syndrome is characterized by periodic fever associated with pharyngitis, cervical adenitis and/or aphthous stomatitis, and belongs to the auto-inflammatory diseases. Diagnostic criteria are based on clinical features and the exclusion of other periodic fever syndromes. An analysis of a large cohort of patients has shown weaknesses in these criteria, and there is a lack of international consensus. An International Conference was held in Morges in November 2008 to propose a new set of classification criteria based on a consensus among experts in the field. OBJECTIVE: We aimed to verify the applicability of the new set of classification criteria. PATIENTS & METHODS: 80 patients diagnosed with PFAPA syndrome from 3 pediatric rheumatology centers (Genoa, Lausanne and Geneva) were included in the study. A detailed description of the clinical and laboratory features was obtained. The new classification criteria and the current diagnostic criteria were applied to the patients. RESULTS: Only 40/80 patients (50%) fulfilled all criteria of the new classification. 31 patients were excluded because they did not meet one of the 7 diagnostic criteria, 7 because of 2 criteria, and one because of 3 criteria. When we applied the current criteria to the same patients, 11/80 patients (13.7%) needed to be excluded. 8/80 patients (10%) were excluded from both sets. Exclusion was related to only some of the criteria. Number of patients for each unfulfilled criterion (new classification criteria/current diagnostic criteria): age (1/6), symptoms between episodes (2/2), delayed growth (4/1), main symptoms (21/0), periodicity, length of fever, interval between episodes, and length of disease (20/0). Applying some of the new criteria was not easy, as they were very restrictive and required precise information from the patients. CONCLUSION: Our work has shown that the new set of classification criteria can be applied to patients suspected of having PFAPA syndrome, but it appears more restrictive than the current diagnostic criteria. Further validation work is needed to determine whether this new set of classification criteria allows a good discrimination between PFAPA patients and patients with other recurrent fever syndromes.

Relevance:

30.00%

Publisher:

Abstract:

Neutrality tests in quantitative genetics provide a statistical framework for the detection of selection on polygenic traits in wild populations. However, the existing method based on comparisons of divergence at neutral markers and quantitative traits (Qst-Fst) suffers from several limitations that hinder a clear interpretation of the results with typical empirical designs. In this article, we propose a multivariate extension of this neutrality test based on empirical estimates of the among-populations (D) and within-populations (G) covariance matrices by MANOVA. A simple pattern is expected under neutrality: D = [2 Fst / (1 - Fst)] G, so that neutrality implies both proportionality of the two matrices and a specific value of the proportionality coefficient. This pattern is tested using Flury's framework for matrix comparison [common principal-component (CPC) analysis], a well-known tool in G matrix evolution studies. We show the importance of using a Bartlett adjustment of the test for the small sample sizes typically found in empirical studies. We propose a dual test: (i) that the proportionality coefficient is not different from its neutral expectation, 2 Fst / (1 - Fst); and (ii) that the MANOVA estimates of the mean square matrices between and among populations are proportional. These two tests combined provide a more stringent test for neutrality than the classic Qst-Fst comparison and avoid several statistical problems. Extensive simulations of realistic empirical designs suggest that these tests correctly detect the expected pattern under neutrality and have enough power to efficiently detect mild to strong selection (homogeneous, heterogeneous, or mixed) when it is occurring on a set of traits. This method also provides a rigorous and quantitative framework for disentangling the effects of different selection regimes and of drift on the evolution of the G matrix. We discuss practical requirements for the proper application of our test in empirical studies and potential extensions.
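
A minimal numerical sketch of the neutral expectation follows: under neutrality D = [2 Fst / (1 - Fst)] G, so both the proportionality of D and G and the value of the coefficient can be checked. The matrices and the Fst value below are invented for illustration; the actual test uses MANOVA mean-square estimates within a Bartlett-adjusted CPC framework.

```python
# Under neutrality the among-population covariance matrix D should equal
# rho * G with rho = 2*Fst/(1 - Fst). Matrices and Fst are invented here;
# the real test works on MANOVA estimates with a Bartlett adjustment.
import numpy as np

Fst = 0.10
rho_neutral = 2 * Fst / (1 - Fst)          # expected proportionality coefficient

G = np.array([[1.0, 0.3], [0.3, 0.5]])     # within-population (G) matrix, assumed
D = rho_neutral * G                         # a perfectly neutral D, for the demo

# Estimate of the proportionality coefficient assuming D ~ rho * G:
# if D = rho*G exactly, then trace(D G^-1) / p recovers rho.
rho_hat = np.trace(D @ np.linalg.inv(G)) / G.shape[0]
print(f"expected rho = {rho_neutral:.3f}, estimated rho = {rho_hat:.3f}")

# Departure from proportionality: residual after removing rho_hat * G.
residual = D - rho_hat * G
print("max |residual| =", np.abs(residual).max())  # ~0 under neutrality
```

Selection would show up either as a residual that is not proportional to G (shape change) or as an estimated coefficient departing from 2 Fst / (1 - Fst), which is why the dual test is more stringent than a scalar Qst-Fst comparison.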

Relevance:

30.00%

Publisher:

Abstract:

Digitalization empowers the Internet by allowing several virtual representations of reality, including that of identity. We leave an increasingly digital footprint in cyberspace, and this situation puts our identity at high risk. Privacy is a right and a fundamental social value that could play a key role as a medium to secure digital identities. Identity functionality is increasingly delivered as sets of services, rather than monolithic applications. An identity layer in which identity and privacy management services are loosely coupled, publicly hosted and available to on-demand calls is therefore a more realistic and acceptable situation. Identity and privacy should be interoperable and distributed through the adoption of service-orientation and implementation based on open standards (technical interoperability). The objective of this project is to provide a way to implement interoperable user-centric digital identity-related privacy that responds to the distributed nature of federated identity systems. It is recognized that technical initiatives, emerging standards and protocols are not enough to guarantee a resolution of the concerns surrounding the multi-faceted and complex issue of identity and privacy. For this reason they should be apprehended within a global perspective, through an integrated and multidisciplinary approach. This approach dictates that privacy law, policies, regulations and technologies are to be crafted together from the start, rather than attached to digital identity after the fact. Thus, we draw Digital Identity-Related Privacy (DigIdeRP) requirements from global, domestic and business-specific privacy policies. The requirements take the shape of business interoperability. We suggest a layered implementation framework (the DigIdeRP framework), in accordance with the model-driven architecture (MDA) approach, that would help an organization's security team turn business interoperability into technical interoperability in the form of a set of services that could accommodate a service-oriented architecture (SOA): a privacy-as-a-set-of-services (PaaSS) system. The DigIdeRP framework will serve as a basis for a vital understanding between business management and technical managers on digital identity-related privacy initiatives. The layered DigIdeRP framework presents five practical layers as an ordered sequence forming the basis of a DigIdeRP project roadmap; in practice, however, there is an iterative process to ensure that each layer effectively supports and enforces the requirements of the adjacent ones. Each layer is composed of a set of blocks, which determine a roadmap that a security team could follow to successfully implement PaaSS. Several blocks' descriptions are based on the OMG SoaML modeling language and BPMN process descriptions. We identified, designed and implemented seven services that form PaaSS and described their consumption. The PaaSS (Java EE project), WSDL, and XSD codes are given and explained.

Relevance:

30.00%

Publisher:

Abstract:

X-ray microtomography has become a new tool in the earth sciences for obtaining non-destructive 3D image data from geological objects in which variations in mineralogy, chemical composition and/or porosity create sufficient x-ray density contrasts. We present here first, preliminary results of an application to the external and internal morphology of Permian to Recent Larger Foraminifera. We use a SkyScan-1072 high-resolution desktop micro-CT system. The system has a conical x-ray source with a spot size of about 5 µm that runs at 20-100 kV and 0-250 µA, resulting in a maximal resolution of 5 µm. X-ray transmission images are captured by a scintillator coupled via fibre optics to a 1024x1024 pixel 12-bit CCD. The object is placed between the x-ray source and the scintillator on a stub that rotates 360° around its vertical axis in steps as small as 0.24°. Sample size is limited to 2 cm because of the x-ray absorption of geological material. The transmission images are back-projected using a Feldkamp algorithm into a vertical stack of up to 1000 1Kx1K images that represent horizontal cuts of the object. This calculation takes 2 to several hours on a dual-processor 2.4 GHz PC. The stack of images (.bmp) can be visualized with any 3D-imaging software and used to produce cuts of Larger Foraminifera. Among other applications, the 3D-imaging software furnished by SkyScan can produce 3D models by defining a threshold density value to distinguish "solid" from "void". Several models with variable threshold values and colors can be imbricated, rotated and cut together. The best results were obtained with microfossils devoid of chamber-filling cements (Permian, Eocene, Recent). However, even slight differences in cement mineralogy/composition can result in surprisingly good x-ray density contrasts. X-ray microtomography may develop into a powerful tool for larger microfossils with a complex internal structure, because it is non-destructive, requires no preparation of the specimens, and produces a true 3D image data set. We will use these data sets in the future to produce cuts in any direction and compare them with arbitrary cuts of complex microfossils in thin sections. Many groups of benthic and planktonic foraminifera may become more easily determinable in thin section in this way.
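
The threshold step described above ("solid" vs. "void") reduces to a density cutoff applied to the reconstructed voxel stack. A minimal sketch on a synthetic volume follows; the volume and the threshold values are arbitrary illustrations, not data from the scans.

```python
# Separate "solid" from "void" in a reconstructed tomographic stack by
# applying a density cutoff. A small synthetic volume stands in for the
# real stack of ~1000 1Kx1K slices; thresholds are arbitrary.
import numpy as np

rng = np.random.default_rng(0)
volume = rng.random((64, 64, 64))           # stand-in for reconstructed slices

threshold = 0.5                              # x-ray density cutoff (assumed)
solid = volume >= threshold                  # boolean "solid" mask

porosity = 1.0 - solid.mean()
print(f"voxels classified solid: {solid.sum()}, porosity ~ {porosity:.2f}")

# Different cutoffs yield different 3D models, which is why several models
# built with variable threshold values can be imbricated and compared.
for t in (0.3, 0.5, 0.7):
    print(f"threshold {t}: solid fraction = {(volume >= t).mean():.2f}")
```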

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: We sought to improve upon previously published statistical modeling strategies for binary classification of dyslipidemia for general population screening purposes based on the waist-to-hip circumference ratio and body mass index anthropometric measurements. METHODS: Study subjects were participants in WHO-MONICA population-based surveys conducted in two Swiss regions. Outcome variables were based on the total serum cholesterol to high density lipoprotein cholesterol ratio. The other potential predictor variables were gender, age, current cigarette smoking, and hypertension. The models investigated were: (i) linear regression; (ii) logistic classification; (iii) regression trees; (iv) classification trees (iii and iv are collectively known as "CART"). Binary classification performance of the region-specific models was externally validated by classifying the subjects from the other region. RESULTS: Waist-to-hip circumference ratio and body mass index remained modest predictors of dyslipidemia. Correct classification rates for all models were 60-80%, with marked gender differences. Gender-specific models provided only small gains in classification. The external validations provided assurance about the stability of the models. CONCLUSIONS: There were no striking differences between either the algebraic (i, ii) vs. non-algebraic (iii, iv), or the regression (i, iii) vs. classification (ii, iv) modeling approaches. Anticipated advantages of the CART vs. simple additive linear and logistic models were less than expected in this particular application with a relatively small set of predictor variables. CART models may be more useful when considering main effects and interactions between larger sets of predictor variables.
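
As a hedged sketch of the four modeling strategies compared here, the code below fits each of them on synthetic stand-in data and reports in-sample correct-classification rates. The real study used WHO-MONICA survey data with external validation across two Swiss regions; every variable definition and threshold below is an assumption for illustration only.

```python
# Sketch of the four strategies: (i) linear regression and (iii) regression
# tree model the continuous cholesterol ratio and are then thresholded;
# (ii) logistic and (iv) classification tree classify directly. Data are
# synthetic placeholders, not WHO-MONICA values.
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression
from sklearn.tree import DecisionTreeRegressor, DecisionTreeClassifier

rng = np.random.default_rng(1)
n = 1000
# predictors: waist-to-hip ratio, BMI, gender, age, smoking, hypertension
X = np.column_stack([
    rng.normal(0.9, 0.08, n),      # waist-to-hip circumference ratio
    rng.normal(25, 4, n),          # body mass index
    rng.integers(0, 2, n),         # gender
    rng.normal(50, 12, n),         # age
    rng.integers(0, 2, n),         # current cigarette smoking
    rng.integers(0, 2, n),         # hypertension
])
chol_ratio = 3 + 2 * (X[:, 0] - 0.9) + 0.05 * X[:, 1] + rng.normal(0, 1, n)
dyslipidemia = (chol_ratio > 4.5).astype(int)   # binary outcome (assumed cutoff)

models = {
    "linear regression":   (LinearRegression(), chol_ratio),
    "regression tree":     (DecisionTreeRegressor(max_depth=4), chol_ratio),
    "logistic":            (LogisticRegression(max_iter=1000), dyslipidemia),
    "classification tree": (DecisionTreeClassifier(max_depth=4), dyslipidemia),
}
for name, (model, y) in models.items():
    model.fit(X, y)
    pred = model.predict(X)
    cls = (pred > 4.5).astype(int) if y is chol_ratio else pred
    print(f"{name:>19}: correct = {(cls == dyslipidemia).mean():.2f}")
```

Externally validating, as the study did, would mean fitting on one region's subjects and scoring the other's rather than reporting in-sample rates as this sketch does.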

Relevance:

30.00%

Publisher:

Abstract:

Expert curation and complete collection of mutations in genes that affect human health are essential for proper genetic healthcare and research. Expert curation is provided by the curators of gene-specific mutation databases, or locus-specific databases (LSDBs). While there are over 700 such databases, they vary in their content, completeness, time available for curation, and the expertise of the curator. Curation and LSDBs have been discussed and written about, and protocols have been provided, for over 10 years, but there have been no formal recommendations for the ideal form of these entities. This work initiates a discussion on this topic to assist future efforts in human genetics. Further discussion is welcome.

Relevance:

30.00%

Publisher:

Abstract:

The specificity of recognition of pMHC complexes by T lymphocytes is determined by the V regions of the TCR alpha- and beta-chains. Recent experimental evidence has suggested that Ag-specific TCR repertoires may exhibit a more V alpha- than V beta-restricted usage. Whether V alpha usage is narrowed during immune responses to Ag or whether, on the contrary, restricted V alpha usage is already defined at the early stages of TCR repertoire selection has remained unexplored. Here, we analyzed the V and CDR3 TCR regions of single circulating naive T cells specifically detected ex vivo and isolated with HLA-A2/melan-A peptide multimers. Similarly to what was previously observed for melan-A-specific Ag-experienced T cells, we found a relatively wide V beta usage but a preferential V alpha 2.1 usage. Restricted V alpha 2.1 usage was also found among single CD8(+) A2/melan-A multimer(+) thymocytes, indicating that V alpha-restricted selection takes place in the thymus. V alpha 2.1 usage, however, was independent of the functional avidity of Ag recognition. Thus, interaction of the pMHC complex with selected V alpha-chains contributes to setting the broad Ag specificity, as underlined by the preferential binding of A2/melan-A multimers to V alpha 2.1-bearing TCRs, whereas functional outcomes result from the sum of these with other interactions between the pMHC complex and the TCR.