792 results for BIAS-ENHANCED NUCLEATION
Abstract:
The effectiveness of lipid-lowering medication depends critically on patients' compliance and on the efficacy of the prescribed drug. The primary objective of this multicentre study was to compare the efficacy of rosuvastatin, with or without access to compliance initiatives, in bringing patients to the Joint European Task Force's (1998) recommended low-density lipoprotein cholesterol (LDL-C) goal (<3.0 mmol/L) at week 24. Secondary objectives were comparisons of the number and percentage of patients achieving the European goals (1998, 2003) for LDL-C and other lipid parameters. Patients with primary hypercholesterolaemia and a 10-year coronary heart disease risk of >20% received open-label rosuvastatin treatment for 24 weeks with or without access to compliance enhancement tools. The initial daily dosage of 10 mg could be doubled at week 12. Compliance tools included: a) a starter pack for subjects containing a videotape, an educational leaflet, a passport/goal diary and details of the helpline and/or website; b) regular personalised letters to provide message reinforcement; and c) a toll-free helpline and a website. The majority of patients (67%) achieved the 1998 European goal for LDL-C at week 24; 31% required an increase in rosuvastatin dosage to 20 mg at week 12. Compliance enhancement tools did not increase the number of patients achieving either the 1998 or the 2003 European target for plasma lipids. Rosuvastatin was well tolerated during this study, with a safety profile comparable to other drugs of the same class. Sixty-three patients in the 10 mg group and 58 in the 10 mg Plus group discontinued treatment. The main reasons for discontinuation were adverse events (39 patients in the 10 mg group; 35 patients in the 10 mg Plus group) and loss to follow-up (13 patients in the 10 mg group; 9 patients in the 10 mg Plus group). The two most frequently reported adverse events were myalgia (34 patients, 3%) and back pain (23 patients, 2%). The overall rate of temporary or permanent study discontinuation due to adverse events was 9% (n = 101) in patients receiving 10 mg rosuvastatin and 3% (n = 9) in patients titrated up to 20 mg rosuvastatin. Rosuvastatin was effective in lowering LDL-C values to the 1998 European target at week 24 in patients with hypercholesterolaemia. However, compliance enhancement tools did not increase the number of patients achieving any of the European targets for plasma lipids.
Abstract:
BACKGROUND: Health professionals and policymakers aspire to make healthcare decisions based on all of the relevant research evidence. This can rarely be achieved, however, because a considerable proportion of research findings are never published, particularly in the case of 'negative' results - a phenomenon widely recognized as publication bias. Different methods of detecting, quantifying and adjusting for publication bias in meta-analyses have been described in the literature, such as graphical approaches and formal statistical tests to detect publication bias, and statistical approaches to modify effect sizes to adjust a pooled estimate when the presence of publication bias is suspected. An up-to-date systematic review of the existing methods is lacking. METHODS/DESIGN: The objectives of this systematic review are as follows: (1) to systematically review methodological articles that focus on non-publication of studies and to describe methods of detecting and/or quantifying and/or adjusting for publication bias in meta-analyses; (2) to appraise the strengths and weaknesses of these methods, the resources they require, and the conditions under which they can be used, based on the findings of included studies. We will systematically search Web of Science, Medline, and the Cochrane Library for methodological articles that describe at least one method of detecting and/or quantifying and/or adjusting for publication bias in meta-analyses. A dedicated data extraction form will be developed and pilot-tested. Working in teams of two, we will independently extract relevant information from each eligible article. As this will be a qualitative systematic review, data reporting will take the form of a descriptive summary. DISCUSSION: Results are expected to be publicly available in mid-2013. This systematic review, together with the results of other systematic reviews of the OPEN project (To Overcome Failure to Publish Negative Findings), will serve as a basis for the development of future policies and guidelines regarding the assessment and handling of publication bias in meta-analyses.
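For a concrete sense of the formal statistical tests such a review would catalogue, the sketch below implements one classic example, Egger's regression test for funnel-plot asymmetry, in Python. It is a minimal illustration with invented effect sizes, not part of the OPEN protocol itself.

```python
import numpy as np
from scipy import stats

def eggers_test(effect_sizes, standard_errors):
    """Egger's regression test for funnel-plot asymmetry.

    Regresses each study's standard normal deviate (effect / SE)
    on its precision (1 / SE); an intercept far from zero suggests
    small-study effects consistent with publication bias.
    """
    effect_sizes = np.asarray(effect_sizes, dtype=float)
    standard_errors = np.asarray(standard_errors, dtype=float)
    snd = effect_sizes / standard_errors   # standard normal deviates
    precision = 1.0 / standard_errors
    # OLS of SND on precision: the slope estimates the pooled effect,
    # while the intercept captures funnel-plot asymmetry.
    result = stats.linregress(precision, snd)
    return result.intercept, result.intercept_stderr

# Hypothetical meta-analysis data (log odds ratios and their SEs).
effects = [0.41, 0.35, 0.62, 0.55, 0.78, 0.90]
ses = [0.10, 0.15, 0.20, 0.25, 0.30, 0.35]
intercept, se = eggers_test(effects, ses)
print(f"Egger intercept = {intercept:.3f} (SE {se:.3f}), t = {intercept / se:.2f}")
```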
Abstract:
Laparoscopy is one of the cornerstones of the surgical revolution and has transformed outcome and recovery for various surgical procedures. Although these changes were widely accepted for basic interventions, such as appendectomies and cholecystectomies, laparoscopy is still challenged in many respects for more advanced operations. Despite this debate, there is overwhelming acceptance in the surgical community that laparoscopy has transformed recovery after several abdominal procedures. The importance of improved peri-operative patient management and its influence on outcome started to attract attention 20 years ago and is now increasingly recognized, as shown by the growing volume of data on this topic. The enhanced recovery after surgery (ERAS) concept incorporates simple measures of general management and requires multidisciplinary collaboration from hospital staff as well as from the patient and relatives. Several studies have demonstrated a significant decrease in postoperative complication rates, length of hospital stay and overall cost. The key elements of success are fluid restriction, a functioning epidural and preoperative carbohydrate intake. With the expansion of laparoscopic techniques, ERAS programmes increasingly include laparoscopic patients, especially in colorectal surgery. However, the precise impact of laparoscopy on ERAS is still not clearly defined. Increasing evidence suggests that laparoscopy itself is an additional ERAS item that should be considered routine where feasible in order to obtain the best surgical outcomes.
Abstract:
A particular property of the matched desired-impulse-response receiver is introduced in this paper, namely the fact that full exploitation of the available diversity is obtained with multiple beamformers when the channel is dispersive in both space and time. This property makes the receiver especially suitable for mobile and underwater communications. The new structure provides better performance than conventional and weighted VRAKE receivers, and a diversity gain without the need for additional radio-frequency equipment. The baseband hardware needed for this new receiver may be obtained through reconfiguration of the RAKE architectures available at the base station. The proposed receiver is tested through simulations assuming the UTRA frequency-division-duplexing mode.
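As a rough illustration of the RAKE principle underlying these architectures, the following NumPy sketch performs maximal-ratio combining across resolvable multipath fingers. The spreading code, finger delays and channel gains are invented, and the per-finger beamforming of the proposed receiver is reduced to a single complex weight per finger; this is a didactic sketch, not the paper's receiver.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical received signal: one symbol spread over 8 chips,
# arriving over 3 resolvable multipath fingers with known delays/gains.
spreading = rng.choice([-1.0, 1.0], size=8)   # spreading code
symbol = 1.0                                  # transmitted symbol
delays = [0, 2, 5]                            # finger delays (chips)
gains = np.array([0.9 + 0.2j, 0.5 - 0.4j, 0.3 + 0.1j])

rx = np.zeros(16, dtype=complex)
for d, g in zip(delays, gains):
    rx[d:d + 8] += g * symbol * spreading     # delayed, scaled replicas
rx += 0.1 * (rng.standard_normal(16) + 1j * rng.standard_normal(16))

# RAKE: despread each finger, then maximal-ratio combine with the
# conjugate channel estimates so the finger SNRs add coherently.
finger_outputs = np.array(
    [np.vdot(spreading.astype(complex), rx[d:d + 8]) for d in delays]
)
decision = np.sum(np.conj(gains) * finger_outputs).real
print("combined decision statistic:", decision)
```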
Abstract:
BACKGROUND: Selective publication of studies, commonly called publication bias, is widely recognized. Over the years, a new nomenclature has developed for other types of bias related to the non-publication or distorted dissemination of research findings. However, several of these distinct biases are often still summarized under the term 'publication bias'. METHODS/DESIGN: As part of the OPEN Project (To Overcome failure to Publish nEgative fiNdings) we will conduct a systematic review with the following objectives: (1) to systematically review highly cited articles that focus on non-publication of studies and to present the various definitions of biases related to the dissemination of research findings contained in the articles identified; (2) to develop and discuss a new framework for the nomenclature of the various forms of distortion in the dissemination process that leads to the public availability of research findings, within an international group of experts in the context of the OPEN Project. We will systematically search Web of Knowledge for highly cited articles that provide a definition of biases related to the dissemination of research findings. A specifically designed data extraction form will be developed and pilot-tested. Working in teams of two, we will independently extract relevant information from each eligible article. For the development of the new framework, we will construct an initial table listing the different levels of, and hazards en route to, making research findings public. An international group of experts will iteratively review the table and reflect on its content until no new insights emerge and consensus has been reached. DISCUSSION: Results are expected to be publicly available in mid-2013. This systematic review, together with the results of other systematic reviews of the OPEN project, will serve as a basis for the development of future policies and guidelines regarding the assessment and prevention of publication bias.
Abstract:
OBJECTIVE: The aim of this study was to examine the differences between those who gave informed consent to a study on substance use and those who did not, and to analyze whether these differences changed with varying nonconsent rates. METHOD: Cross-sectional questionnaire data on demographics, alcohol, smoking, and cannabis use were obtained for 6,099 French- and 5,720 German-speaking 20-year-old Swiss men. Enrollment took place over 11 months for the Cohort Study on Substance Use Risk Factors (C-SURF). Consenters and nonconsenters were asked to complete a short questionnaire. Because 94% responded, data were available for nearly the entire population. Weekly differences in consent rates were analyzed. Regressions examined the associations of substance use with consent giving and with consent rates, as well as the interaction between the two. RESULTS: Nonconsenters showed heavier substance use patterns, although they were more often alcohol abstainers; the differences were small, not always significant, and did not decrease as consent rates increased. CONCLUSIONS: Substance use is currently a topic of only minor sensitivity among young men, resulting in small differences between nonconsenters and consenters. As consent rates increase, the additional individuals enrolled are similar to those observed at lower consent rates. Estimates from analytical studies looking at associations of substance use with other variables will not differ at reasonable consent rates of 50%-80%. Descriptive prevalence studies may be biased, but only at very low consent rates.
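To illustrate the kind of regression described above, here is a minimal statsmodels sketch that models consent as a function of substance use, the weekly consent rate, and their interaction. All variable names and data are simulated placeholders, not C-SURF data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated stand-in for the short questionnaire: a binary substance
# use indicator, the consent rate in the week of enrolment, and the
# consent outcome. All values are invented for demonstration.
rng = np.random.default_rng(1)
n = 2000
df = pd.DataFrame({
    "risky_drinking": rng.integers(0, 2, n),
    "consent_rate": rng.uniform(0.5, 0.95, n),
})
logit = -0.2 + 0.15 * df["risky_drinking"] + 1.5 * df["consent_rate"]
df["consented"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

# Logistic regression of consent on substance use, the weekly consent
# rate, and their interaction, mirroring the analysis described above.
model = smf.logit("consented ~ risky_drinking * consent_rate", data=df).fit(disp=0)
print(model.summary().tables[1])
```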
Abstract:
The priming agent β-aminobutyric acid (BABA) is known to enhance Arabidopsis resistance to the bacterial pathogen Pseudomonas syringae pv. tomato (Pst) DC3000 by potentiating salicylic acid (SA) defence signalling, notably PR1 expression. The molecular mechanisms underlying this phenomenon remain unknown. A genome-wide microarray analysis of BABA priming during Pst DC3000 infection revealed direct and primed up-regulation of genes that are responsive to SA, the SA analogue benzothiadiazole and pathogens. In addition, BABA was found to inhibit the Arabidopsis response to the bacterial effector coronatine (COR). COR is known to promote bacterial virulence by inducing the jasmonic acid (JA) response to antagonize SA signalling activation. BABA specifically repressed the JA response induced by COR without affecting other plant JA responses. This repression was largely SA-independent, suggesting that it is not caused by negative cross-talk between SA and JA signalling cascades. Treatment with relatively high concentrations of purified COR counteracted BABA inhibition. Under these conditions, BABA failed to protect Arabidopsis against Pst DC3000. BABA did not induce priming and resistance in plants inoculated with a COR-deficient strain of Pst DC3000 or in the COR-insensitive mutant coi1-16. In addition, BABA blocked the COR-dependent re-opening of stomata during Pst DC3000 infection. Our data suggest that BABA primes for enhanced resistance to Pst DC3000 by interfering with the bacterial suppression of Arabidopsis SA-dependent defences. This study also suggests the existence of a signalling node that distinguishes COR from other JA responses.
Abstract:
BACKGROUND: Takayasu arteritis (TA) is a rare form of chronic inflammatory granulomatous arteritis of the aorta and its major branches. Late gadolinium enhancement (LGE) with magnetic resonance imaging (MRI) has demonstrated its value for the detection of vessel wall alterations in TA. The aim of this study was to assess LGE of the coronary artery wall in patients with TA compared to patients with stable coronary artery disease (CAD). METHODS: We enrolled 9 patients (8 female, average age 46±13 years) with proven TA. The CAD group comprised 9 patients (8 male, average age 65±10 years). Studies were performed on a commercial 3T whole-body MR imaging system (Achieva; Philips, Best, The Netherlands) using a 3D inversion-prepared, navigator-gated spoiled gradient-echo sequence, which was repeated 34-45 minutes after low-dose gadolinium administration. RESULTS: No coronary vessel wall enhancement was observed prior to contrast in either group. Post contrast, coronary LGE on inversion-recovery (IR) scans was detected in 28 of 50 segments (56%) seen on T2-Prep scans in TA patients and in 25 of 57 segments (44%) in CAD patients. Quantitative assessment of the post-contrast contrast-to-noise ratio (CNR) of the coronary artery vessel wall revealed no significant difference between the two groups (CNR 6.0±2.4 in TA and 7.3±2.5 in CAD; p = 0.474). CONCLUSION: Our findings suggest that LGE of the coronary artery wall is common in patients with TA and is similarly pronounced as in CAD patients. The observed coronary LGE appears rather unspecific, and the differentiation between coronary vessel wall fibrosis and inflammation remains unresolved.
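As a hedged illustration of the quantitative CNR assessment mentioned above, the snippet below computes a contrast-to-noise ratio under one common definition (mean wall signal minus mean blood signal, divided by the standard deviation of background noise). CNR definitions vary between studies, and the region-of-interest values here are invented.

```python
import numpy as np

def contrast_to_noise_ratio(wall_roi, blood_roi, noise_roi):
    """CNR between vessel wall and blood pool under one common
    definition: (mean wall signal - mean blood signal) / SD of
    background noise. This is an illustrative choice, not
    necessarily the exact definition used in the study above."""
    return (np.mean(wall_roi) - np.mean(blood_roi)) / np.std(noise_roi)

# Invented region-of-interest intensities from a post-contrast IR scan.
wall = np.array([310.0, 295.0, 322.0, 301.0])
blood = np.array([180.0, 172.0, 190.0, 185.0])
noise = np.array([18.0, 22.0, 15.0, 25.0, 20.0])
print(f"CNR = {contrast_to_noise_ratio(wall, blood, noise):.1f}")
```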
Abstract:
Monocarboxylate transporters (MCTs) are essential for the use of lactate, an energy substrate known to be overproduced in the brain during an ischemic episode. The expression of MCT1 and MCT2 was investigated at 48 h of reperfusion after focal ischemia induced by unilateral extradural compression in Wistar rats. Increased MCT1 mRNA expression was detected in the injured cortex and hippocampus of compressed animals compared to sham controls. In the contralateral, uncompressed hemisphere, increases in the MCT1 mRNA level in the cortex and the MCT2 mRNA level in the hippocampus were noted. Interestingly, strong MCT1 and MCT2 protein expression was found in peri-lesional macrophages/microglia and in an isolectin B4+/S100beta+ cell population in the corpus callosum. In vitro, MCT1 and MCT2 protein expression was observed in the N11 microglial cell line, and MCT1 expression in these cells was enhanced by tumor necrosis factor-alpha (TNF-alpha). This modulation of MCT expression in microglia suggests that these transporters may help sustain microglial functions during recovery from focal brain ischemia. Overall, our study indicates that changes in MCT expression around, and also away from, the ischemic area, at both the mRNA and protein levels, are part of the metabolic adaptations taking place in the brain after ischemia.
Abstract:
In mice, vaccination with high peptide doses generates higher frequencies of specific CD8+ T cells, but with lower avidity, compared to vaccination with lower peptide doses. To investigate the impact of peptide dose on CD8+ T cell responses in humans, melanoma patients were vaccinated with 0.1 or 0.5 mg Melan-A/MART-1 peptide, mixed with CpG 7909 and Incomplete Freund's adjuvant. Neither the kinetics nor the amplitude of the Melan-A-specific CD8+ T cell responses differed between the two vaccination groups. CD8+ T cell differentiation and cytokine production ex vivo were also similar in the two groups. Interestingly, after low peptide dose vaccination, Melan-A-specific CD8+ T cells showed enhanced degranulation upon peptide stimulation, as assessed by CD107a upregulation and perforin release ex vivo. Accordingly, CD8+ T cell clones derived from low peptide dose-vaccinated patients showed significantly increased degranulation and stronger cytotoxicity. In parallel, Melan-A-specific CD8+ T cells and clones from low peptide dose-vaccinated patients expressed lower CD8 levels, despite similar or even stronger binding to tetramers. Furthermore, CD8+ T cell clones from low peptide dose-vaccinated patients bound CD8 binding-deficient tetramers more efficiently, suggesting that they may express higher-affinity TCRs. We conclude that low peptide dose vaccination generated CD8+ T cell responses with stronger cytotoxicity and lower CD8 dependence.
Abstract:
AIM: Alpha1-adrenergic receptors (alpha1-ARs) are classified into three subtypes: alpha1A-AR, alpha1B-AR, and alpha1D-AR. Triple disruption of the alpha1A-AR, alpha1B-AR, and alpha1D-AR genes results in hypotension and abolishes the contractile response of the thoracic aorta to noradrenalin. In the present study, we characterized vascular contractility in response to other vasoconstrictors, such as potassium, prostaglandin F2alpha (PGF(2alpha)) and 5-hydroxytryptamine (5-HT), in alpha1A-AR, alpha1B-AR, and alpha1D-AR triple knockout (alpha1-AR triple KO) mice. MAIN METHODS: The contractile responses to stimulation with vasoconstrictors were studied using isolated thoracic aorta. KEY FINDINGS: The phasic and tonic contractions induced by a high concentration of potassium (20 mM) were enhanced in the isolated thoracic aorta of alpha1-AR triple KO mice compared with that of wild-type (WT) mice. In addition, vascular responses to PGF(2alpha) and 5-HT were also enhanced in the isolated thoracic aorta of alpha1-AR triple KO mice compared with WT mice. Consistent with these in vitro findings, in vivo pressor responses to PGF(2alpha) were enhanced in alpha1-AR triple KO mice. Real-time reverse transcription-polymerase chain reaction and western blot analyses indicated that expression of the 5-hydroxytryptamine 2A (5-HT(2A)) receptor was up-regulated in the thoracic aorta of alpha1-AR triple KO mice, whereas that of the prostaglandin F2alpha receptor (FP) was unchanged. SIGNIFICANCE: These results suggest that loss of alpha1-ARs can enhance vascular responsiveness to other vasoconstrictors, and may imply that alpha1-ARs and their downstream signaling regulate vascular responsiveness to other stimuli such as depolarization, 5-HT and PGF(2alpha).
Abstract:
MOTIVATION: Comparative analyses of gene expression data from different species have become an important component of the study of molecular evolution. Methods are therefore needed to estimate evolutionary distances between expression profiles, as well as a neutral reference with which to estimate selective pressure. Divergence between expression profiles of homologous genes is often calculated with Pearson's or Euclidean distance, and neutral divergence is usually inferred from randomized data. Despite being widely used, neither of these two steps has been well studied. Here, we analyze these methods formally and on real data, highlight their limitations and propose improvements. RESULTS: It has been demonstrated that Pearson's distance, in contrast to Euclidean distance, leads to underestimation of the expression similarity between homologous genes with a conserved uniform pattern of expression. Here, we first extend this study to genes with conserved but specific patterns of expression. Surprisingly, we find that both Pearson's and Euclidean distances, used as measures of expression similarity between genes, depend on the expression specificity of those genes. We also show that the Euclidean distance depends strongly on data normalization. Next, we show that the randomization procedure widely used to estimate the rate of neutral evolution is biased when broadly expressed genes are abundant in the data. To overcome this problem, we propose a novel randomization procedure that is unbiased with respect to the expression profiles present in the dataset. Applying our method to mouse and human gene expression data suggests significant gene expression conservation between these species. CONTACT: marc.robinson-rechavi@unil.ch; sven.bergmann@unil.ch SUPPLEMENTARY INFORMATION: Supplementary data are available at Bioinformatics online.
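The sketch below illustrates, under stated assumptions, the two ingredients discussed above: Pearson's and Euclidean distances between expression profiles, and a profile-preserving randomization as a neutral reference. The pairing permutation shown is one plausible reading of a procedure "unbiased with respect to expression profiles", not necessarily the authors' exact method, and the expression matrices are simulated.

```python
import numpy as np

def pearson_distance(x, y):
    """1 - Pearson correlation between two expression profiles."""
    return 1.0 - np.corrcoef(x, y)[0, 1]

def euclidean_distance(x, y):
    """Euclidean distance; note this depends on data normalization."""
    return float(np.linalg.norm(np.asarray(x) - np.asarray(y)))

def neutral_reference(expr_a, expr_b, rng):
    """Profile-preserving randomization: pair each gene in species A
    with a randomly chosen gene in species B, keeping every profile
    intact. A sketch of the unbiased-randomization idea, not the
    authors' exact procedure."""
    perm = rng.permutation(len(expr_b))
    return [pearson_distance(a, expr_b[j]) for a, j in zip(expr_a, perm)]

# Invented expression matrices: rows are orthologous genes,
# columns are matched tissues in the two species.
rng = np.random.default_rng(0)
expr_mouse = rng.gamma(2.0, 2.0, size=(100, 6))
expr_human = expr_mouse + rng.normal(0.0, 0.5, size=(100, 6))

observed = [pearson_distance(a, b) for a, b in zip(expr_mouse, expr_human)]
neutral = neutral_reference(expr_mouse, expr_human, rng)
print(f"mean observed distance: {np.mean(observed):.3f}")
print(f"mean neutral distance:  {np.mean(neutral):.3f}")
```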
Abstract:
Palinspastic reconstructions provide an ideal framework for geological, geographical, oceanographic and climate studies. As historians of the Earth, "reconstructers" try to decipher its past. Ever since learning that the continents move, geologists have been trying to retrace their evolution through the ages. If Wegener's view of continental motion was revolutionary at the beginning of the 20th century, we have known since the early 1960s that continents do not "drift" aimlessly across the oceanic realm but belong to a larger ensemble combining continental and oceanic crust: the tectonic plates. Unfortunately, for historical as well as technical reasons, this idea has still not received sufficient echo within the reconstruction community. Nevertheless, we are firmly convinced that, by applying specific methods and principles, one can escape the traditional "Wegenerian" approach and finally reach true plate tectonics. The main aim of the present work is to defend this point of view by presenting, in all necessary detail, our tools and methods. Starting from the paleomagnetic and paleogeographic data classically used for reconstructions, we developed a new methodology that places the tectonic plates and their kinematics at the heart of the problem. Using continental assemblies (also called "key assemblies") as anchor points distributed over the whole span of our study (from the Eocene back to the Cambrian), we develop geodynamic scenarios leading from one assembly to the next, moving from the past towards the present. Between two stages, the lithospheric plates are progressively reconstructed by adding or removing oceanic material (symbolized by synthetic isochrons) to or from the continents. Except during collisions, plates are moved as single rigid entities. Through the ages, the only evolving elements are the plate boundaries: they are preserved through time, follow a consistent geodynamic evolution, and always form an interconnected network through space. This approach, called "dynamic plate boundaries", integrates multiple factors, among them plate buoyancy, spreading rates at ridges, subsidence curves, stratigraphic and paleobiogeographic data, as well as major tectonic and magmatic events. The method thus offers good control over plate kinematics and places strong constraints on the model. This multi-source approach requires efficient data organization and management. Before this study began, the mass of necessary data had become an almost insurmountable obstacle. GIS (Geographic Information Systems) and geodatabases are informatic tools specifically devoted to the storage, management and analysis of spatially referenced data and their attributes. By developing the PaleoDyn database in ArcGIS, we converted this mass of scattered data into valuable geodynamic information that is easily accessible for the creation of reconstructions.
At the same time, using specially developed tools, we both facilitated the reconstruction work (through task automation) and improved the model by greatly strengthening the kinematic control through the creation of plate velocity models. On the basis of 340 newly defined terranes, we developed a set of 35 reconstructions, each associated with its own velocity model. With this unique dataset we can now address major questions of modern geology, such as the study of sea-level variations and climate change. We began by tackling another major, and still unresolved, problem of modern tectonics: the mechanisms driving plate motions. We observed that, throughout the Earth's history, the rotation poles of the plates (which describe plate motions across the Earth's surface) tend to be distributed along a band running from the North Pacific through northern South America, the Central Atlantic, North Africa and Central Asia to Japan. In essence, this distribution means that the plates tend to flee this median plane. In the absence of any methodological bias that we might have failed to identify, we interpreted this phenomenon as reflecting a secular influence of the Moon on plate motion. The oceanic domain is the keystone of our model, and we took particular care to reconstruct it in great detail. In this model, the oceanic crust is preserved from one reconstruction to the next; crustal material is symbolized by synthetic isochrons whose ages are known. We also reconstructed the margins (active or passive), the mid-ocean ridges and the intra-oceanic subduction zones. Using this very detailed oceanic dataset, we were able to develop unique 3-D bathymetric models offering far greater precision than previous ones.
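To make the notion of a plate velocity model concrete, here is a small Python sketch that evaluates the rigid-plate surface velocity v = omega x r for a site on a plate, given an Euler rotation pole and an angular rate. The pole, rate and site coordinates are placeholders, not values from the reconstruction set described above.

```python
import numpy as np

EARTH_RADIUS_KM = 6371.0

def to_cartesian(lat_deg, lon_deg):
    """Unit vector for a point given in geographic coordinates."""
    lat, lon = np.radians([lat_deg, lon_deg])
    return np.array([np.cos(lat) * np.cos(lon),
                     np.cos(lat) * np.sin(lon),
                     np.sin(lat)])

def plate_velocity_mm_per_yr(pole_lat, pole_lon, omega_deg_per_myr,
                             site_lat, site_lon):
    """Surface speed |omega x r| for a rigid plate rotating about an
    Euler pole. Inputs here are illustrative placeholders."""
    omega = np.radians(omega_deg_per_myr) * to_cartesian(pole_lat, pole_lon)
    r = EARTH_RADIUS_KM * to_cartesian(site_lat, site_lon)
    v_km_per_myr = np.cross(omega, r)
    # 1 km/Myr equals 1 mm/yr, so no further conversion is needed.
    return np.linalg.norm(v_km_per_myr)

# Hypothetical pole at (60N, 30W) rotating 0.25 deg/Myr; site at (10N, 20E).
print(f"{plate_velocity_mm_per_yr(60, -30, 0.25, 10, 20):.1f} mm/yr")
```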