908 results for Keys to Database Searching


Relevance: 30.00%
Publisher:
Abstract:

Learning object repositories are a basic piece of virtual learning environments used for content management. Nevertheless, learning objects have special characteristics that make traditional solutions for content management ineffective. In particular, browsing and searching for learning objects cannot be based on the typical authoritative meta-data used for describing content, such as author, title or publication date, among others. We propose to build a social layer on top of a learning object repository, providing final users with additional services for describing, rating and curating learning objects from a teaching perspective. All these interactions among users, services and resources can be captured and further analyzed, so both browsing and searching can be personalized according to user profile and the educational context, helping users to find the most valuable resources for their learning process. In this paper we propose to use reputation schemes and collaborative filtering techniques for improving the user interface of a DSpace-based learning object repository.
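The collaborative-filtering idea above can be sketched with a toy user-item rating matrix; the user names, ratings, and the similarity-weighted prediction below are illustrative assumptions, not the paper's actual implementation.

```python
from math import sqrt

# Hypothetical ratings (1-5) of learning objects, collected via the social layer.
ratings = {
    "ana":  {"lo1": 5, "lo2": 3, "lo3": 4},
    "bert": {"lo1": 4, "lo2": 2, "lo4": 5},
    "cari": {"lo2": 5, "lo3": 2, "lo4": 4},
}

def cosine(u, v):
    """Cosine similarity between two users over their co-rated items."""
    common = set(u) & set(v)
    if not common:
        return 0.0
    num = sum(u[i] * v[i] for i in common)
    den = sqrt(sum(u[i] ** 2 for i in common)) * sqrt(sum(v[i] ** 2 for i in common))
    return num / den if den else 0.0

def predict(user, item):
    """Similarity-weighted average of other users' ratings for `item`."""
    num = den = 0.0
    for other, r in ratings.items():
        if other == user or item not in r:
            continue
        s = cosine(ratings[user], r)
        num += s * r[item]
        den += abs(s)
    return num / den if den else 0.0

# Re-rank a hypothetical search-result list for user "ana": use her own
# rating when available, otherwise the collaborative prediction.
results = ["lo4", "lo2", "lo1"]
ranked = sorted(results,
                key=lambda lo: ratings["ana"].get(lo) or predict("ana", lo),
                reverse=True)
```

A reputation scheme would additionally weight each neighbour's contribution by that user's standing in the community, which slots in as an extra factor on `s` in `predict`.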

Relevance: 30.00%
Publisher:
Abstract:

BACKGROUND: Several European HIV observational databases have, over the last decade, accumulated a substantial number of resistance test results and developed large sample repositories. There is a need to link these efforts together. We describe the development of a novel tool that binds these databases together in a distributed fashion, in which control and data remain with the cohorts rather than in a classic data merger. METHODS: As proof of concept we entered two basic queries into the tool: available resistance tests and available samples. We asked for patients still alive after 1998-01-01 and between 180 and 195 cm in height, and how many samples or resistance tests would be available for these patients. The queries were uploaded with the tool to a central web server, from which each participating cohort downloaded the queries with the tool and ran them against their database. The numbers gathered were then submitted back to the server, where the numbers of available samples and resistance tests were accumulated. RESULTS: We obtained the following results from the cohorts (available samples/resistance tests): EuResist: not available/11,194; EuroSIDA: 20,716/1,992; ICONA: 3,751/500; Rega: 302/302; SHCS: 53,783/1,485. In total, 78,552 samples and 15,473 resistance tests were available among these five cohorts. Once these data items have been identified, it is trivial to generate lists of relevant samples that would be useful for ultra-deep sequencing in addition to the already available resistance tests. Soon the tool will include small analysis packages that allow each cohort to pull a report on their cohort profile and also survey emerging resistance trends in their own cohort. CONCLUSIONS: We plan to provide this tool to all cohorts within the Collaborative HIV and Anti-HIV Drug Resistance Network (CHAIN) and will provide the tool free of charge to others for any non-commercial use. The potential of this tool is to ease collaborations, for example in projects requiring data to speed up identification of novel resistance mutations by increasing the number of observations across multiple cohorts, instead of waiting for single cohorts or studies to reach the critical number needed to address such issues.
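The distributed-query design described above can be sketched as follows: each cohort evaluates the filter locally, and only aggregate counts ever leave the site. All patient records and cohort names below are invented.

```python
# Hypothetical local patient tables for two cohorts; in the real tool the
# raw data never leaves each site - only aggregate counts are uploaded.
cohorts = {
    "cohortA": [{"alive_after_1998": True,  "height_cm": 183, "samples": 3, "tests": 1},
                {"alive_after_1998": True,  "height_cm": 172, "samples": 2, "tests": 0}],
    "cohortB": [{"alive_after_1998": False, "height_cm": 190, "samples": 5, "tests": 2},
                {"alive_after_1998": True,  "height_cm": 188, "samples": 4, "tests": 3}],
}

def run_local_query(patients):
    """Executed inside each cohort: apply the filter locally, return only counts."""
    matching = [p for p in patients
                if p["alive_after_1998"] and 180 <= p["height_cm"] <= 195]
    return {"samples": sum(p["samples"] for p in matching),
            "tests":   sum(p["tests"] for p in matching)}

# The central server only ever sees the per-cohort aggregates.
replies = {name: run_local_query(pts) for name, pts in cohorts.items()}
total_samples = sum(r["samples"] for r in replies.values())
total_tests = sum(r["tests"] for r in replies.values())
```

The privacy property falls out of the structure: `run_local_query` is the only function that touches patient-level rows, and it returns nothing but sums.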

Relevance: 30.00%
Publisher:
Abstract:

Sensor networks have many applications in monitoring and controlling environmental properties such as sound, acceleration, vibration and temperature. Due to limited resources in computation capability, memory and energy, they are vulnerable to many kinds of attacks. The ZigBee specification, based on the 802.15.4 standard, defines a set of layers specifically suited to sensor networks. These layers support secure messaging using symmetric cryptography. This paper presents two different ways of grabbing the cryptographic key in ZigBee: remote attack and physical attack. It also surveys and categorizes some additional attacks which can be performed on ZigBee networks: eavesdropping, spoofing, replay and DoS attacks at different layers. From this analysis, it is shown that some vulnerabilities still exist in the security schema of ZigBee technology.
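ZigBee protects frames with a message integrity code computed under a shared symmetric network key (AES-CCM* in the specification). Python's standard library has no AES, so this sketch substitutes HMAC-SHA256 as a stand-in MIC purely to illustrate the consequence the paper analyzes: whoever extracts the key can both read and forge frames that pass verification.

```python
import hashlib
import hmac
import os

# Stand-in for ZigBee's shared symmetric network key (16 bytes, as in ZigBee).
network_key = os.urandom(16)

def tag_frame(key, frame):
    """Append a 4-byte message integrity code (MIC) computed over the frame."""
    return frame + hmac.new(key, frame, hashlib.sha256).digest()[:4]

def accept_frame(key, tagged):
    """Recompute the MIC and compare in constant time."""
    frame, mic = tagged[:-4], tagged[-4:]
    expected = hmac.new(key, frame, hashlib.sha256).digest()[:4]
    return hmac.compare_digest(mic, expected)

tagged = tag_frame(network_key, b"toggle-lamp")
ok = accept_frame(network_key, tagged)                       # legitimate frame
tampered = tagged[:-1] + bytes([tagged[-1] ^ 1])             # flipped MIC bit
rejected = not accept_frame(network_key, tampered)
```

Note the asymmetry: without the key, tampering is detected; with the key (obtained via the remote or physical attacks surveyed above), an attacker can call `tag_frame` themselves, so spoofing and replay defenses collapse.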

Relevance: 30.00%
Publisher:
Abstract:

Purpose: The aim of this review was to systematically evaluate and compare the frequency of veneer chipping and core fracture of zirconia fixed dental prostheses (FDPs) and porcelain-fused-to-metal (PFM) FDPs and to determine possible influencing factors. Materials and Methods: The SCOPUS database and International Association for Dental Research abstracts were searched for clinical studies involving zirconia and PFM FDPs. Furthermore, studies that were integrated into systematic reviews on PFM FDPs were also evaluated. The principal investigators of clinical studies on zirconia FDPs were contacted to provide additional information. Based on the available information for each FDP, a data file was constructed. Veneer chipping was divided into three grades (grade 1 = polishing, grade 2 = repair, grade 3 = replacement). To assess the frequency of veneer chipping and possible influencing factors, a piecewise exponential model was used to adjust for a study effect. Results: None of the studies on PFM FDPs (reviews and additional searching) sufficiently satisfied the criteria of this review to be included. Thirteen clinical studies on zirconia FDPs and two studies that investigated both zirconia and PFM FDPs were identified. These studies involved 664 zirconia and 134 PFM FDPs at baseline. Follow-up data were available for 595 zirconia and 127 PFM FDPs. The mean observation period was approximately 3 years for both groups. The frequency of core fracture was less than 1% in the zirconia group and 0% in the PFM group. When all studies were included, 142 veneer chippings were recorded for zirconia FDPs (24%) and 43 for PFM FDPs (34%). However, the studies differed extensively with regard to veneer chipping of zirconia: 85% of all chippings occurred in 4 studies, which included 43% of all zirconia FDPs. If only studies that evaluated both types of core materials were included, the frequency of chipping was 54% for the zirconia-supported FDPs and 34% for PFM FDPs. When adjusting the survival rate for the study effect, the difference between zirconia and PFM FDPs was statistically significant for all grades of chipping (P = .001), as well as for chipping grade 3 (P = .02). If all grades of veneer chipping were taken into account, the survival rate of PFM FDPs was 97%, while the survival rate of the zirconia FDPs was 90% after 3 years for a typical study. For both PFM and zirconia FDPs, the frequency of grade 1 and 2 veneer chippings was considerably higher than grade 3. Veneer chipping was significantly less frequent with pressed materials than with hand-layered materials, both for zirconia and PFM FDPs (P = .04). Conclusions: Since the frequency of veneer chipping was significantly higher for zirconia FDPs than PFM FDPs, and as refined processing procedures have started to yield better results in the laboratory, new clinical studies with these new procedures must confirm whether the frequency of veneer chipping can be reduced to the level of PFM. Int J Prosthodont 2010;23:493-502
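The piecewise exponential model mentioned above treats the chipping hazard as constant within follow-up intervals. A minimal sketch of how interval hazards combine into a survival rate, with made-up yearly hazards and ignoring the random study effect the authors adjusted for:

```python
from math import exp

def piecewise_exp_survival(hazards):
    """Survival after consecutive intervals, each given as
    (hazard per FDP-year, interval length in years):
    S = exp(-sum(lambda_i * dt_i))."""
    return exp(-sum(lam * dt for lam, dt in hazards))

# Invented yearly chipping hazards for three years of follow-up.
s3 = piecewise_exp_survival([(0.02, 1), (0.035, 1), (0.05, 1)])
```

The piecewise form lets the hazard rise with time in function (as veneer fatigue accumulates) while keeping each interval analytically simple.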

Relevance: 30.00%
Publisher:
Abstract:

A variety of cellular proteins have the ability to recognize DNA lesions induced by the anti-cancer drug cisplatin, with diverse consequences for their repair and for the therapeutic effectiveness of this drug. We report a novel gene involved in the cell response to cisplatin in vertebrates. The RDM1 gene (for RAD52 Motif 1) was identified while searching databases for sequences showing similarities to RAD52, a protein involved in homologous recombination and DNA double-strand break repair. Ablation of RDM1 in the chicken B cell line DT40 led to a more than 3-fold increase in sensitivity to cisplatin. However, RDM1-/- cells were not hypersensitive to DNA damage caused by ionizing radiation, UV irradiation, or the alkylating agent methyl methanesulfonate. The RDM1 protein displays a nucleic acid binding domain of the RNA recognition motif (RRM) type. Using gel-shift assays and electron microscopy, we show that purified, recombinant chicken RDM1 protein interacts with single-stranded DNA as well as double-stranded DNA, on which it assembles filament-like structures. Notably, RDM1 recognizes DNA distortions induced by cisplatin-DNA adducts in vitro. Finally, human RDM1 transcripts are abundant in the testis, suggesting a possible role during spermatogenesis.
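Searching databases for sequences similar to a motif, as done here to find RDM1 from RAD52, is a local-alignment problem. A minimal Smith-Waterman scoring sketch with toy sequences and arbitrary scoring parameters, not the actual search the authors ran:

```python
def smith_waterman(a, b, match=2, mismatch=-1, gap=-2):
    """Best local-alignment score between two sequences (Smith-Waterman).

    Cells are clipped at 0 so an alignment can start anywhere."""
    rows = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    best = 0
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            diag = rows[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            rows[i][j] = max(0, diag, rows[i - 1][j] + gap, rows[i][j - 1] + gap)
            best = max(best, rows[i][j])
    return best

# Rank hypothetical candidate proteins by similarity to a query motif.
motif = "RADMQTIF"
candidates = {"p1": "KKRADMQTIFLL", "p2": "GGGGGGGG"}
hits = sorted(candidates, key=lambda k: smith_waterman(motif, candidates[k]),
              reverse=True)
```

Real motif searches (e.g. BLAST) add substitution matrices and statistical significance, but the dynamic-programming core is the same.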

Relevance: 30.00%
Publisher:
Abstract:

This master's thesis examines the implementation of real-time activity-based costing in the information system of a Finnish SME that manufactures laser chips. In addition, the effects of activity-based costing on operational activities and on activity-based management are examined. The literature part of the thesis discusses, based on literature sources, activity-based costing theories, costing methods, and the technologies used in the technical implementation. In the implementation part, a web-based activity-based costing system was designed and built to support the case company's cost accounting and financial administration. The tool was integrated into the company's ERP and manufacturing execution systems. Compared with the data collection of traditional activity-based costing models, in the case company the inputs to the costing system arrive in real time as part of a larger information system integration. The thesis seeks to establish the relationship between the requirements of activity-based costing and database systems. The company can use the costing system, for example, for product pricing and cost accounting by viewing product-related costs from different perspectives. Conclusions can be drawn from precise cost information, and the data produced by the system can be used to determine whether developing a particular project, customer relationship or product is economically viable.
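Activity-based costing allocates pooled overhead to products via cost-driver rates. A minimal sketch with invented cost pools, driver volumes and product consumption; the thesis's actual system was a web tool fed in real time from the ERP/MES, not this standalone calculation:

```python
# Hypothetical activity cost pools (euros) and driver volumes for one period.
pools = {"soldering": 12000.0, "testing": 8000.0, "packaging": 4000.0}
driver_volumes = {"soldering": 600, "testing": 400, "packaging": 800}

# Cost-driver rate for each activity: pool cost / total driver units.
rates = {a: pools[a] / driver_volumes[a] for a in pools}

def product_cost(consumption):
    """Cost assigned to a product from the driver units it consumed per activity."""
    return sum(rates[a] * units for a, units in consumption.items())

# One hypothetical laser-chip batch consuming 3/2/1 driver units.
chip_a = product_cost({"soldering": 3, "testing": 2, "packaging": 1})
```

In the real-time setting described above, `driver_volumes` and `consumption` would be updated continuously from production events rather than entered once per period.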

Relevance: 30.00%
Publisher:
Abstract:

Background and Aims: Eosinophilic esophagitis (EoE) has been reported with increasing frequency over the last two decades. However, it is still unknown whether this reflects a true increase in incidence or merely increased awareness among gastroenterologists. We therefore evaluated the incidence and cumulative prevalence of EoE in Olten county over the last 20 years. Methods: Olten county is an area of approximately 91,000 inhabitants without pronounced demographic changes in the last two decades. EoE evaluation is based on two gastroenterology centers and one pathology center. No public programs for increased EoE awareness were implemented in this region. All EoE patients diagnosed from 1989 to 2009 were entered prospectively into the Olten county database. Results: Forty-six patients (76% males, mean age 41±16 years) were diagnosed with EoE from 1989 to 2009. Ninety-four percent presented with dysphagia. Concomitant allergies were found in 70% of the patients. The number of upper endoscopies per year was stable during the entire observation period. An average annual incidence rate of 2/100,000 was found (range 0-8), with a marked increase in the period from 2001 to 2009. A current cumulative EoE prevalence of 43/100,000 inhabitants was calculated. The mean diagnostic delay (time from first symptoms to diagnosis) was 4.3 years from 1989 to 1998 and 4.8 years from 1999 to 2009. Conclusions: Over the last 20 years, a significant increase in EoE incidence was found in a stable indicator region of Switzerland. The constant rate of upper endoscopies, the constant diagnostic delay, and the lack of EoE awareness programs in Olten county indicate a true increase in EoE incidence.

Relevance: 30.00%
Publisher:
Abstract:

BACKGROUND: Classical disease phenotypes are mainly based on descriptions of symptoms and the hypothesis that a given pattern of symptoms provides a diagnosis. With refined technologies there is growing evidence that disease expression in patients is much more diverse, and subtypes need to be defined to allow better targeted treatment. One of the aims of the Mechanisms of the Development of Allergy project (MeDALL, FP7) is to re-define the classical phenotypes of IgE-associated allergic diseases from birth to adolescence, by consensus among experts using a systematic review of the literature, and to identify possible gaps in research for new disease markers. This paper describes the methods to be used for the systematic review of the classical IgE-associated phenotypes, applicable in general to other systematic reviews that also address phenotype definitions based on evidence. METHODS/DESIGN: Eligible papers were identified by PubMed search (complete database through April 2011). This search yielded 12,043 citations. The review includes intervention studies (randomized and clinical controlled trials) and observational studies (cohort studies including birth cohorts, case-control studies) as well as case series. Systematic and non-systematic reviews, guidelines, position papers and editorials are not excluded but dealt with separately. Two independent reviewers conducted consecutive title and abstract filtering scans in parallel. For publications where the title and abstract fulfilled the inclusion criteria, the full text was assessed. In the final step, two independent reviewers abstracted data using a pre-designed data extraction form, with disagreements resolved by discussion among investigators. DISCUSSION: The systematic review protocol described here makes it possible to generate broad, multi-phenotype reviews and consensus phenotype definitions. The in-depth analysis of the existing literature on the classification of IgE-associated allergic diseases through such a systematic review will (1) provide relevant information on the current epidemiologic definitions of allergic diseases, (2) address heterogeneity and interrelationships and (3) identify gaps in knowledge.
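The dual-reviewer screening step described above can be sketched as a reconciliation of two decision lists; the PMIDs and include/exclude decisions below are invented for illustration:

```python
# Hypothetical screening decisions (True = include) from two independent reviewers.
reviewer1 = {"pmid1": True, "pmid2": False, "pmid3": True, "pmid4": False}
reviewer2 = {"pmid1": True, "pmid2": True,  "pmid3": True, "pmid4": False}

agreed_include = [p for p in reviewer1 if reviewer1[p] and reviewer2[p]]
agreed_exclude = [p for p in reviewer1 if not reviewer1[p] and not reviewer2[p]]
# Disagreements are resolved by discussion among investigators, as in the protocol.
conflicts = [p for p in reviewer1 if reviewer1[p] != reviewer2[p]]

# Simple percent agreement as a screening-quality indicator.
agreement = (len(agreed_include) + len(agreed_exclude)) / len(reviewer1)
```

In practice a chance-corrected statistic such as Cohen's kappa would usually accompany the raw agreement figure.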

Relevance: 30.00%
Publisher:
Abstract:

AIM: To report a case series of five patients diagnosed with choroidal schwannoma at the Liverpool Ocular Oncology Centre. METHODS: Patients with choroidal schwannoma were identified by searching the computerised database of the Liverpool Ocular Oncology Centre. RESULTS: The patients (3 males, 2 females) ranged in age from 15 years to 45 years. Three tumours were treated by enucleation, trans-scleral local resection, and combined bevacizumab and photodynamic therapy, respectively. Two were observed after confirmation of the diagnosis by biopsy. CONCLUSIONS: Choroidal schwannoma has a variety of clinical manifestations. Associated features include hard exudates, retinal feeder vessels and serous retinal detachment. Biopsy with immunohistochemistry is required for diagnosis. Tumours not amenable to resection may respond to photodynamic therapy.

Relevance: 30.00%
Publisher:
Abstract:

BACKGROUND: Developing and updating high-quality guidelines requires substantial time and resources. To reduce duplication of effort and enhance efficiency, we developed a process for guideline adaptation and assessed initial perceptions of its feasibility and usefulness. METHODS: Based on preliminary developments and empirical studies, a series of meetings with guideline experts was organised to define a process for guideline adaptation (ADAPTE) and to develop a manual and a toolkit made available on a website (http://www.adapte.org). Potential users, guideline developers and implementers, were invited to register and to complete a questionnaire evaluating their perceptions of the proposed process. RESULTS: The ADAPTE process consists of three phases (set-up, adaptation, finalisation), 9 modules and 24 steps. The adaptation phase involves identifying specific clinical questions; searching for, retrieving and assessing available guidelines; and preparing the draft adapted guideline. Among 330 registered individuals (from 46 countries), 144 completed the questionnaire. A majority found the ADAPTE process clear (78%), comprehensive (69%) and feasible (60%), and the manual useful (79%). However, 21% found the ADAPTE process complex, and 44% feared that they would not find appropriate, high-quality source guidelines. DISCUSSION: A comprehensive framework for guideline adaptation has been developed to meet the challenges of timely guideline development and implementation. The ADAPTE process generated considerable interest among guideline developers and implementers. The majority perceived the ADAPTE process to be feasible, useful and likely to improve methodological rigour and guideline quality. However, some de novo development may be needed if no high-quality guideline exists for a given topic.

Relevance: 30.00%
Publisher:
Abstract:

CD8+ cytolytic T lymphocytes (CTL) are the main effector cells of the adaptive immune system against infection and tumors. The recent identification of molecularly defined human tumor antigens recognized by autologous CTL has opened new opportunities for the development of antigen-specific cancer vaccines. Despite extensive work, however, the number of CTL-defined tumor antigens that are suitable targets for the vaccination of cancer patients is still limited, mainly because of the laborious and time-consuming procedures currently used for their identification. The use of combinatorial peptide libraries in positional scanning format (positional scanning synthetic combinatorial libraries, PS-SCL) has recently been proposed as an alternative approach for the identification of these epitopes. To validate this approach, we analyzed in detail the recognition of PS-SCL by tumor-reactive CTL clones specific for multiple well-defined tumor-associated antigens (TAA) as well as by tumor-reactive CTL clones of unknown specificity. These analyses revealed that, for all the TAA-specific clones studied, most of the amino acids composing the native antigenic peptide sequences could be identified through the use of PS-SCL. Based on the data obtained from the screening of PS-SCL, we could design peptide analogs of increased antigenicity as well as cross-reactive analog peptides containing multiple amino acid substitutions. In addition, the results of PS-SCL screening combined with a recently developed biometric data analysis (PS-SCL-based biometric database analysis) allowed the identification of the native peptides in public protein databases among the 30 most active sequences, and this was the case for all the TAA studied. More importantly, the screening of PS-SCL with a tumor-reactive CTL clone of unknown specificity resulted in the identification of the actual epitope. Overall, these data encourage the use of PS-SCL not only for the identification and optimization of tumor-associated CTL epitopes, but also for the analysis of degeneracy in T-cell receptor (TCR) recognition of tumor antigens.

[Lay summary] CD8+ cytolytic T cells are white blood cells chiefly responsible for fighting infections and tumors. Immunologists have long sought to identify molecules expressed and presented at the surface of tumors that can be recognized by CD8+ cytolytic T cells, which can then kill those tumors specifically. Such molecules form the basis for the development of cancer vaccines, since they could be injected into patients to induce an anti-tumor response. At present, very few molecules capable of stimulating the immune system against tumors are known, because the techniques developed to date for their identification are complex and slow. A new technique based on the use of peptide libraries has recently been proposed for identifying this type of molecule. These libraries represent all possible combinations of the building blocks of the molecules sought. The first part of this study validated the technique using CD8+ cytolytic T cells able to kill tumor cells by recognizing a known molecule present at their surface. We showed that the libraries allow most of the building blocks of the recognized molecule to be identified. The second part of this study consisted of searching protein databases for potentially active molecules, using a computer program that ranks molecules according to their predicted biological activity. Among thousands of molecules in the databases, those recognized by our CD8+ cytolytic T cells ranked among the most active. More interestingly, combining these two techniques allowed us to identify the molecule recognized by a population of CD8+ cytolytic T cells with anti-tumor activity whose specificity was unknown. Our results encourage the use of such libraries to find and optimize molecules specifically recognized by tumor-killing CD8+ cytolytic T cells.
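The PS-SCL-based biometric database analysis scores candidate peptides position by position and ranks database sequences by their best-scoring window. A toy sketch with invented per-position activity scores and invented protein sequences, not the actual scoring matrices used in the study:

```python
# Hypothetical per-position amino-acid activity scores derived from a PS-SCL
# screen of a 3-mer epitope: score[pos][aa] = stimulatory activity.
score = [
    {"Y": 3.0, "F": 2.5, "A": 0.1},
    {"L": 2.8, "M": 2.0, "A": 0.2},
    {"D": 2.2, "E": 1.9, "A": 0.1},
]

def peptide_score(pep):
    """Additive PS-SCL score: sum of per-position contributions
    (unlisted residues contribute 0)."""
    return sum(score[i].get(aa, 0.0) for i, aa in enumerate(pep))

# Scan every 3-mer window of hypothetical database proteins; keep the best window.
proteins = {"protA": "GGYLDGG", "protB": "AAAAAA"}
best = {name: max(peptide_score(seq[i:i + 3]) for i in range(len(seq) - 2))
        for name, seq in proteins.items()}
ranking = sorted(best, key=best.get, reverse=True)
```

The additivity assumption (each position contributes independently) is exactly what positional scanning measures, and is also the approach's main simplification.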

Relevance: 30.00%
Publisher:
Abstract:

Les reconstructions palinspastiques fournissent le cadre idéal à de nombreuses études géologiques, géographiques, océanographique ou climatiques. En tant qu?historiens de la terre, les "reconstructeurs" essayent d?en déchiffrer le passé. Depuis qu?ils savent que les continents bougent, les géologues essayent de retracer leur évolution à travers les âges. Si l?idée originale de Wegener était révolutionnaire au début du siècle passé, nous savons depuis le début des années « soixante » que les continents ne "dérivent" pas sans but au milieu des océans mais sont inclus dans un sur-ensemble associant croûte « continentale » et « océanique »: les plaques tectoniques. Malheureusement, pour des raisons historiques aussi bien que techniques, cette idée ne reçoit toujours pas l'écho suffisant parmi la communauté des reconstructeurs. Néanmoins, nous sommes intimement convaincus qu?en appliquant certaines méthodes et certains principes il est possible d?échapper à l?approche "Wégenerienne" traditionnelle pour enfin tendre vers la tectonique des plaques. Le but principal du présent travail est d?exposer, avec tous les détails nécessaires, nos outils et méthodes. Partant des données paléomagnétiques et paléogéographiques classiquement utilisées pour les reconstructions, nous avons développé une nouvelle méthodologie replaçant les plaques tectoniques et leur cinématique au coeur du problème. En utilisant des assemblages continentaux (aussi appelés "assemblées clés") comme des points d?ancrage répartis sur toute la durée de notre étude (allant de l?Eocène jusqu?au Cambrien), nous développons des scénarios géodynamiques permettant de passer de l?une à l?autre en allant du passé vers le présent. Entre deux étapes, les plaques lithosphériques sont peu à peu reconstruites en additionnant/ supprimant les matériels océaniques (symbolisés par des isochrones synthétiques) aux continents. Excepté lors des collisions, les plaques sont bougées comme des entités propres et rigides. 
A travers les âges, les seuls éléments évoluant sont les limites de plaques. Elles sont préservées aux cours du temps et suivent une évolution géodynamique consistante tout en formant toujours un réseau interconnecté à travers l?espace. Cette approche appelée "limites de plaques dynamiques" intègre de multiples facteurs parmi lesquels la flottabilité des plaques, les taux d'accrétions aux rides, les courbes de subsidence, les données stratigraphiques et paléobiogéographiques aussi bien que les évènements tectoniques et magmatiques majeurs. Cette méthode offre ainsi un bon contrôle sur la cinématique des plaques et fournit de sévères contraintes au modèle. Cette approche "multi-source" nécessite une organisation et une gestion des données efficaces. Avant le début de cette étude, les masses de données nécessaires était devenues un obstacle difficilement surmontable. Les SIG (Systèmes d?Information Géographiques) et les géo-databases sont des outils informatiques spécialement dédiés à la gestion, au stockage et à l?analyse des données spatialement référencées et de leurs attributs. Grâce au développement dans ArcGIS de la base de données PaleoDyn nous avons pu convertir cette masse de données discontinues en informations géodynamiques précieuses et facilement accessibles pour la création des reconstructions. Dans le même temps, grâce à des outils spécialement développés, nous avons, tout à la fois, facilité le travail de reconstruction (tâches automatisées) et amélioré le modèle en développant fortement le contrôle cinématique par la création de modèles de vitesses des plaques. Sur la base des 340 terranes nouvellement définis, nous avons ainsi développé un set de 35 reconstructions auxquelles est toujours associé un modèle de vitesse. Grâce à cet ensemble de données unique, nous pouvons maintenant aborder des problématiques majeurs de la géologie moderne telles que l?étude des variations du niveau marin et des changements climatiques. 
Nous avons commencé par aborder un autre problème majeur (et non définitivement élucidé!) de la tectonique moderne: les mécanismes contrôlant les mouvements des plaques. Nous avons pu observer que, tout au long de l?histoire de la terre, les pôles de rotation des plaques (décrivant les mouvements des plaques à la surface de la terre) tendent à se répartir le long d'une bande allant du Pacifique Nord au Nord de l'Amérique du Sud, l'Atlantique Central, l'Afrique du Nord, l'Asie Centrale jusqu'au Japon. Fondamentalement, cette répartition signifie que les plaques ont tendance à fuir ce plan médian. En l'absence d'un biais méthodologique que nous n'aurions pas identifié, nous avons interprété ce phénomène comme reflétant l'influence séculaire de la Lune sur le mouvement des plaques. La Lune sur le mouvement des plaques. Le domaine océanique est la clé de voute de notre modèle. Nous avons attaché un intérêt tout particulier à le reconstruire avec beaucoup de détails. Dans ce modèle, la croûte océanique est préservée d?une reconstruction à l?autre. Le matériel crustal y est symbolisé sous la forme d?isochrones synthétiques dont nous connaissons les âges. Nous avons également reconstruit les marges (actives ou passives), les rides médio-océaniques et les subductions intra-océaniques. En utilisant ce set de données très détaillé, nous avons pu développer des modèles bathymétriques 3-D unique offrant une précision bien supérieure aux précédents.<br/><br/>Palinspastic reconstructions offer an ideal framework for geological, geographical, oceanographic and climatology studies. As historians of the Earth, "reconstructers" try to decipher the past. Since they know that continents are moving, geologists a trying to retrieve the continents distributions through ages. 
If Wegener?s view of continent motions was revolutionary at the beginning of the 20th century, we know, since the Early 1960?s that continents are not drifting without goal in the oceanic realm but are included in a larger set including, all at once, the oceanic and the continental crust: the tectonic plates. Unfortunately, mainly due to technical and historical issues, this idea seems not to receive a sufficient echo among our particularly concerned community. However, we are intimately convinced that, by applying specific methods and principles we can escape the traditional "Wegenerian" point of view to, at last, reach real plate tectonics. This is the main aim of this study to defend this point of view by exposing, with all necessary details, our methods and tools. Starting with the paleomagnetic and paleogeographic data classically used in reconstruction studies, we developed a modern methodology placing the plates and their kinematics at the centre of the issue. Using assemblies of continents (referred as "key assemblies") as anchors distributed all along the scope of our study (ranging from Eocene time to Cambrian time) we develop geodynamic scenarios leading from one to the next, from the past to the present. In between, lithospheric plates are progressively reconstructed by adding/removing oceanic material (symbolized by synthetic isochrones) to major continents. Except during collisions, plates are moved as single rigid entities. The only evolving elements are the plate boundaries which are preserved and follow a consistent geodynamical evolution through time and form an interconnected network through space. This "dynamic plate boundaries" approach integrates plate buoyancy factors, oceans spreading rates, subsidence patterns, stratigraphic and paleobiogeographic data, as well as major tectonic and magmatic events. It offers a good control on plate kinematics and provides severe constraints for the model. 
This multi-sources approach requires an efficient data management. Prior to this study, the critical mass of necessary data became a sorely surmountable obstacle. GIS and geodatabases are modern informatics tools of specifically devoted to store, analyze and manage data and associated attributes spatially referenced on the Earth. By developing the PaleoDyn database in ArcGIS software we converted the mass of scattered data offered by the geological records into valuable geodynamical information easily accessible for reconstructions creation. In the same time, by programming specific tools we, all at once, facilitated the reconstruction work (tasks automation) and enhanced the model (by highly increasing the kinematic control of plate motions thanks to plate velocity models). Based on the 340 terranes properly defined, we developed a revised set of 35 reconstructions associated to their own velocity models. Using this unique dataset we are now able to tackle major issues of the geology (such as the global sea-level variations and climate changes). We started by studying one of the major unsolved issues of the modern plate tectonics: the driving mechanism of plate motions. We observed that, all along the Earth?s history, plates rotation poles (describing plate motions across the Earth?s surface) tend to follow a slight linear distribution along a band going from the Northern Pacific through Northern South-America, Central Atlantic, Northern Africa, Central Asia up to Japan. Basically, it sighifies that plates tend to escape this median plan. In the absence of a non-identified methodological bias, we interpreted it as the potential secular influence ot the Moon on plate motions. The oceanic realms are the cornerstone of our model and we attached a particular interest to reconstruct them with many details. In this model, the oceanic crust is preserved from one reconstruction to the next. 
The crustal material is symbolized by synthetic isochrons whose ages are known. We also reconstruct the margins (active or passive), ridges and intra-oceanic subduction zones. Using this detailed oceanic dataset, we developed unique 3-D bathymetric models offering better precision than all previously existing ones.
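The rigid-plate motions described in this abstract are, in plate kinematics, finite rotations about Euler poles. As a minimal illustration of that core operation (the function and the coordinate values are our own, not taken from the PaleoDyn model), a point on the sphere can be moved by a finite rotation using Rodrigues' formula:

```python
import math

def to_xyz(lat, lon):
    """Convert geographic coordinates (degrees) to a unit vector."""
    lat, lon = math.radians(lat), math.radians(lon)
    return (math.cos(lat) * math.cos(lon),
            math.cos(lat) * math.sin(lon),
            math.sin(lat))

def rotate_about_pole(plat, plon, pole_lat, pole_lon, angle_deg):
    """Apply a finite rotation about an Euler pole (Rodrigues' formula)."""
    v, k = to_xyz(plat, plon), to_xyz(pole_lat, pole_lon)
    a = math.radians(angle_deg)
    dot = sum(vi * ki for vi, ki in zip(v, k))
    # cross product k x v
    cross = (k[1] * v[2] - k[2] * v[1],
             k[2] * v[0] - k[0] * v[2],
             k[0] * v[1] - k[1] * v[0])
    # v' = v cos(a) + (k x v) sin(a) + k (k . v)(1 - cos(a))
    r = tuple(v[i] * math.cos(a) + cross[i] * math.sin(a)
              + k[i] * dot * (1 - math.cos(a)) for i in range(3))
    return math.degrees(math.asin(r[2])), math.degrees(math.atan2(r[1], r[0]))
```

Rotating a point on the equator at (0, 0) by 90 degrees about the geographic north pole moves it to (0, 90), which is a convenient sanity check for the sign conventions.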

Relevância:

30.00%

Publicador:

Resumo:

OBJECTIVE: To evaluate the variability of bond strength test results of adhesive systems (AS) and to correlate the results with clinical parameters of clinical studies investigating cervical restorations. MATERIALS AND METHODS: Regarding the clinical studies, the internal database which had previously been used for a meta-analysis on cervical restorations was updated with clinical studies published between 2008 and 2012 by searching the PubMed and SCOPUS databases. PubMed and the International Association for Dental Research abstracts online were searched for laboratory studies on microtensile, macrotensile and macroshear bond strength tests. The inclusion criteria were (1) dentin, (2) testing of at least four adhesive systems, (3) the same diameter of composite and (4) 24 h of water storage prior to testing. The clinical outcome variables were retention loss, marginal discoloration, detectable margins, and a clinical index comprising the three parameters by weighting them. Linear mixed models which included a random study effect were calculated for both the laboratory and the clinical studies. The variability was assessed by calculating a ratio of variances, dividing the variance among the estimated bonding effects obtained in the linear mixed models by the sum of all variance components estimated in these models. RESULTS: Thirty-two laboratory studies fulfilled the inclusion criteria, comprising 183 experiments. Of those, 86 used the microtensile test evaluating 22 adhesive systems (AS). Twenty-seven used the macrotensile test with 17 AS, and 70 used the macroshear test with 24 AS. For 28 AS, results from clinical studies were available. Microtensile and macrotensile results were moderately correlated (Spearman rho=0.66, p=0.007), as were microtensile and macroshear (Spearman rho=0.51, p=0.03), but not macroshear and macrotensile (Spearman rho=0.34, p=0.22). The effect of the adhesive system was significant for microtensile and macroshear (p<0.001) but not for macrotensile. 
The effect of the adhesive system could explain 36% of the variability of the microtensile test, 27% of the macrotensile and 33% of the macroshear test. For the clinical trials, about 49% of the variability of retained restorations could be explained by the adhesive system. With respect to the correlation between bond strength tests and clinical parameters, only a moderate correlation between micro- and macrotensile test results and marginal discoloration was demonstrated; no correlation between these tests and retention loss or marginal integrity was shown. The correlation improved when more studies were included compared with assessing only one study. SIGNIFICANCE: The high variability of bond strength test results highlights the need to establish individual acceptance levels for a given test institute. The weak correlation of bond strength test results with clinical parameters leads to the conclusion that one should not rely solely on bond strength tests to predict the clinical performance of an adhesive system, but should also conduct other laboratory tests, such as tests of the marginal adaptation of fillings in extracted teeth and of the retention loss of restorations in non-retentive cavities after artificial aging.
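The "ratio of variances" used in this abstract divides the variance among the estimated adhesive effects by the sum of all variance components. A minimal sketch of that decomposition, using hypothetical bond-strength measurements (MPa) rather than the study's data, and simple group means in place of a full mixed model:

```python
from statistics import mean, pvariance

# hypothetical bond-strength results per adhesive system (MPa)
data = {
    "AS1": [18.2, 20.1, 19.5],
    "AS2": [25.0, 27.3, 26.1],
    "AS3": [14.8, 16.0, 15.1],
}

# variance among the estimated adhesive effects (between-group component)
between = pvariance([mean(vals) for vals in data.values()])

# pooled residual variance within adhesives (within-group component)
within = mean(pvariance(vals) for vals in data.values())

# share of total variability explained by the adhesive system
ratio = between / (between + within)
```

With these numbers the adhesive explains most of the variability (ratio close to 1); the paper's reported shares of 27-49% correspond to much noisier real data, where study- and method-level components absorb the rest.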

Relevância:

30.00%

Publicador:

Resumo:

Analyzing the type and frequency of patient-specific mutations that give rise to Duchenne muscular dystrophy (DMD) is an invaluable tool for diagnostics, basic scientific research, trial planning, and improved clinical care. Locus-specific databases allow for the collection, organization, storage, and analysis of genetic variants of disease. Here, we describe the development and analysis of the TREAT-NMD DMD Global database (http://umd.be/TREAT_DMD/). We analyzed genetic data for 7,149 DMD mutations held within the database. A total of 5,682 large mutations were observed (80% of total mutations), of which 4,894 (86%) were deletions (1 exon or larger) and 784 (14%) were duplications (1 exon or larger). There were 1,445 small mutations (smaller than 1 exon, 20% of all mutations), of which 358 (25%) were small deletions and 132 (9%) small insertions and 199 (14%) affected the splice sites. Point mutations totalled 756 (52% of small mutations) with 726 (50%) nonsense mutations and 30 (2%) missense mutations. Finally, 22 (0.3%) mid-intronic mutations were observed. In addition, mutations were identified within the database that would potentially benefit from novel genetic therapies for DMD including stop codon read-through therapies (10% of total mutations) and exon skipping therapy (80% of deletions and 55% of total mutations).
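The kind of tally reported in this abstract (counts of mutation classes and their shares of the total) is straightforward to reproduce. A sketch using the published category counts as a stand-in for the per-patient records (the label strings are ours; the underlying database holds far richer variant descriptions):

```python
from collections import Counter

# stand-in records built from the published category counts
mutations = (
    ["large deletion"] * 4894 + ["large duplication"] * 784 +
    ["small deletion"] * 358 + ["small insertion"] * 132 +
    ["splice site"] * 199 + ["nonsense"] * 726 +
    ["missense"] * 30 + ["mid-intronic"] * 22
)

counts = Counter(mutations)
total = sum(counts.values())
# percentage share of each mutation class, one decimal place
shares = {m: round(100 * n / total, 1) for m, n in counts.items()}
```

Note the category counts sum to slightly less than the 7,149 mutations reported, since the published subtotals are themselves rounded; the shares nevertheless match the abstract's percentages to within rounding.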

Relevância:

30.00%

Publicador:

Resumo:

In this thesis, the author approaches the problem of automated text classification, which is one of the basic tasks for building an Intelligent Internet Search Agent. The work discusses various approaches to solving sub-problems of automated text classification, such as feature extraction and machine learning on text sources. The author also describes her own multiword approach to feature extraction and presents the results of testing this approach using a classifier based on linear discriminant analysis, and a classifier combining unsupervised learning for etalon (prototype) extraction with supervised learning using the common backpropagation algorithm for a multilayer perceptron.
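The two ideas named in this abstract — multiword features and etalon (class-prototype) extraction — can be sketched together in a few lines. This is an illustrative toy, not the thesis's actual method: word bigrams stand in for the multiword features, and a summed-count prototype per class stands in for the extracted etalons, with cosine similarity deciding the label.

```python
import math
from collections import Counter

def multiword_features(text, n=2):
    """Extract word n-grams ("multiword" features) from a text."""
    words = text.lower().split()
    return Counter(" ".join(words[i:i + n]) for i in range(len(words) - n + 1))

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[k] * b[k] for k in a if k in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def train_etalons(labelled_docs):
    """One etalon (class prototype) per label: summed n-gram counts."""
    etalons = {}
    for label, text in labelled_docs:
        etalons.setdefault(label, Counter()).update(multiword_features(text))
    return etalons

def classify(text, etalons):
    """Assign the label whose etalon is most similar to the text."""
    feats = multiword_features(text)
    return max(etalons, key=lambda lbl: cosine(feats, etalons[lbl]))
```

In the thesis the etalons are obtained by unsupervised learning and the final classifier is a backpropagation-trained multilayer perceptron; the nearest-prototype rule above merely shows where such etalons would sit in the pipeline.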