133 results for export operation methods
at Université de Lausanne, Switzerland
Abstract:
GOAL: To evaluate the impact of the Ross operation, recently (1997) introduced in our unit, on the treatment of patients with congenital aortic valve stenosis. METHODS: The period from January 1997 to December 2000 was compared with the previous 5 years (1992-96). Thirty-seven children (<16 yrs) and 49 young adults (16-50 yrs) with congenital aortic valve stenosis underwent one of the following treatments: percutaneous balloon dilatation (PBD), aortic valve commissurotomy, aortic valve replacement or the Ross operation. The Ross operation was performed in 16 patients with a bicuspid stenotic aortic valve, mean age 24.5 yrs (range 9-46 yrs): 7/10 adults with calcifications, 2/10 adults with previous aortic valve commissurotomy, 4/6 children with aortic regurgitation following PBD, and 1/6 children who had had a previous aortic valve replacement with a prosthetic valve and aortic root enlargement. RESULTS: PBD was followed by death in two neonates (fibroelastosis); all other children survived PBD. Although there were no deaths, PBD in adults was recently abandoned owing to unfavourable results. Aortic valve commissurotomy showed good results in children (no deaths). Aortic valve replacement, although associated with good results (no deaths), has recently been abandoned in children in favour of the Ross operation. Over a mean follow-up of 16 months (2-40 months), all patients are asymptomatic following the Ross operation, with no echocardiographic evidence of aortic valve regurgitation in 10/16 patients and with trivial regurgitation in 6/16 patients. CONCLUSIONS: The approach for children and young adults with congenital aortic valve stenosis should now be as follows: (1) PBD is the first choice in neonates and infants; (2) aortic valve commissurotomy is the first choice for children, neonates and infants after failed PBD; (3) the Ross operation is increasingly used in children after failed PBD and in young adults, even with a calcified aortic valve.
Abstract:
OBJECTIVE: Prenatal diagnosis has been shown to decrease pre-operative acidosis and might prevent the occurrence of disturbed developmental outcome. The aim of this study is to evaluate parameters for acidosis and their predictive value for developmental outcome in newborns with congenital heart disease. METHODS: A total of 117 patients requiring surgery for structural heart disease in the first 31 days of life were included. Diagnosis was established either pre- or postnatally. Preoperative values of lactate, pH and base excess were compared with the occurrence of disturbed developmental outcome, i.e. an underperformance of more than 10% on the P90 of a standardized Dutch developmental scale. Patients were divided into groups according to blood levels of acidosis parameters, using receiver operating characteristic curves to determine cut-off values for pH, base excess and lactate. RESULTS: No significant difference in developmental outcome was found using values for pH or base excess as a cut-off level. Preoperative lactate values exceeding 6.1 mmol/l resulted in a significant increase in impaired development compared to infants with a pre-operative lactate lower than 6.1 mmol/l: 40.9% vs 15.1% (p=0.03). CONCLUSIONS: Pre-operative lactate values might have prognostic value for developmental outcome in newborns with congenital heart disease. The limited prognostic value of pH can be explained by the fact that pH can be easily corrected, whereas lactate better reflects the total oxygen debt experienced by these patients.
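The cut-off derivation described above can be illustrated with a short sketch: an ROC curve is computed for the biomarker against the binary outcome and an optimal threshold is chosen, here with Youden's J statistic (one common convention; the abstract does not state which criterion was used). The data frame and column names below are hypothetical, not from the study.

```python
# Minimal sketch of deriving an ROC-based cut-off for a pre-operative
# biomarker (e.g. lactate) against a binary developmental outcome.
# Column names and the example data are hypothetical.
import numpy as np
import pandas as pd
from sklearn.metrics import roc_curve

df = pd.DataFrame({
    "lactate_mmol_l": [2.1, 7.3, 4.0, 9.5, 1.8, 6.4, 3.2, 8.8],
    "impaired_development": [0, 1, 0, 1, 0, 0, 0, 1],
})

fpr, tpr, thresholds = roc_curve(df["impaired_development"],
                                 df["lactate_mmol_l"])

# Youden's J statistic (sensitivity + specificity - 1) is one common
# way to pick a cut-off from an ROC curve.
youden_j = tpr - fpr
best_cutoff = thresholds[np.argmax(youden_j)]
print(f"Suggested cut-off: {best_cutoff:.1f} mmol/l")

# Compare outcome rates above vs. below the cut-off, as in the abstract.
high = df[df["lactate_mmol_l"] > best_cutoff]["impaired_development"].mean()
low = df[df["lactate_mmol_l"] <= best_cutoff]["impaired_development"].mean()
print(f"Impaired development: {high:.1%} vs {low:.1%}")
```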
Abstract:
Background: We have recently shown that the median diagnostic delay to establish Crohn's disease (CD) diagnosis (i.e. the period from first symptom onset to diagnosis) in the Swiss IBD Cohort (SIBDC) was 9 months. Seventy-five percent of all CD patients were diagnosed within 24 months. The clinical impact of a long diagnostic delay on the natural history of CD is unknown. Aim: To compare the frequency and type of CD-related complications in the patient groups with long diagnostic delay (>24 months) vs. the ones diagnosed within 24 months. Methods: Retrospective analysis of data from the SIBDCS, comprising a large sample of CD patients followed in hospitals and private practices across Switzerland. The proportions of the following outcomes were compared between groups of patients diagnosed 1, 2-5, 6-10, 11-15, and ≥ 16 years ago and stratified according to the length of diagnostic delay: bowel stenoses, internal fistulas, perianal fistulas, CD-related surgical interventions, and extraintestinal manifestations. Results: Two hundred CD patients (121 female, mean age 44.9 ± 15.0 years, 38% smokers, 71% ever treated with immunomodulators and 35% with anti-TNF) with long diagnostic delay were compared to 697 CD patients (358 female, mean age 39.1 ± 14.9 years, 33% smokers, 74% ever treated with immunomodulators and 33% with anti-TNF) diagnosed within 24 months. No differences in the outcomes were observed between the two patient groups within year one after CD diagnosis. Among those diagnosed 2-5 years ago, CD patients with long diagnostic delay (n = 45) presented more frequently with internal fistulas (11.1% vs. 3.1%, p = 0.03) and bowel stenoses (28.9% vs. 15.7%, p = 0.05), and they more frequently underwent CD-related operations (15.6% vs. 5.0%, p = 0.02) compared to the patients diagnosed within 24 months (n = 159). Among those diagnosed 6-10 years ago, CD patients with long diagnostic delay (n = 48) presented more frequently with extraintestinal manifestations (60.4% vs. 34.6%, p = 0.001) than those diagnosed within 24 months (n = 182). For the patients diagnosed 11-15 years ago, no differences in outcomes were found between the long diagnostic delay group (n = 106) and the one diagnosed within 24 months (n = 32). Among those diagnosed ≥ 16 years ago, the group with long diagnostic delay (n = 71) more frequently underwent CD-related operations (63.4% vs. 46.5%, p = 0.01) compared to the group diagnosed with CD within 24 months (n = 241). Conclusions: A long diagnostic delay in CD patients is associated with a more complicated disease course and a higher number of CD-related operations in the years following the diagnosis. Our results indicate that efforts should be undertaken to shorten the diagnostic delay in CD patients to reduce the risk of progression towards a complicated disease phenotype.
Abstract:
OBJECTIVES: The impact of diagnostic delay (a period from appearance of first symptoms to diagnosis) on the clinical course of Crohn's disease (CD) is unknown. We examined whether length of diagnostic delay affects disease outcomes. METHODS: Data from the Swiss IBD cohort study were analyzed. Patients were recruited from university centers (68%), regional hospitals (14%), and private practices (18%). The frequencies of occurrence of bowel stenoses, internal fistulas, perianal fistulas, and CD-related surgery (intestinal and perianal) were analyzed. RESULTS: A total of 905 CD patients (53.4% female, median age at diagnosis 26 (20-36) years) were stratified into four groups according to the quartiles of diagnostic delay (0-3, 4-9, 10-24, and ≥25 months, respectively). Median diagnostic delay was 9 (3-24) months. The frequency of immunomodulator and/or antitumor necrosis factor drug use did not differ among the four groups. The length of diagnostic delay was positively correlated with the occurrence of bowel stenosis (odds ratio (OR) 1.76, P=0.011 for delay of ≥25 months) and intestinal surgery (OR 1.76, P=0.014 for delay of 10-24 months and OR 2.03, P=0.003 for delay of ≥25 months). Disease duration was positively associated and non-ileal disease location was negatively associated with bowel stenosis (OR 1.07, P<0.001, and OR 0.41, P=0.005, respectively) and intestinal surgery (OR 1.14, P<0.001, and OR 0.23, P<0.001, respectively). CONCLUSIONS: The length of diagnostic delay is correlated with an increased risk of bowel stenosis and CD-related intestinal surgery. Efforts should be undertaken to shorten the diagnostic delay.
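The analysis pattern reported above, stratifying diagnostic delay into quartiles and estimating odds ratios for an outcome while adjusting for covariates such as disease duration and location, can be sketched as follows. The data, variable names and adjustment set are hypothetical illustrations, not the SIBDCS analysis code.

```python
# Sketch: quartile stratification of diagnostic delay and odds ratios
# from logistic regression. Data and column names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 400
df = pd.DataFrame({
    "delay_months": rng.exponential(scale=12, size=n),
    "disease_duration_yrs": rng.uniform(0, 20, size=n),
    "ileal_disease": rng.integers(0, 2, size=n),
})
# Simulate a stenosis outcome loosely increasing with delay (toy data).
lin_pred = -2 + 0.03 * df["delay_months"] + 0.05 * df["disease_duration_yrs"]
df["stenosis"] = rng.binomial(1, 1 / (1 + np.exp(-lin_pred)))

# Quartiles of diagnostic delay (the study reports 0-3, 4-9, 10-24, >=25 months).
df["delay_q"] = pd.qcut(df["delay_months"], 4, labels=["Q1", "Q2", "Q3", "Q4"])

# Logistic regression: odds of stenosis by delay quartile,
# adjusted for disease duration and disease location.
model = smf.logit("stenosis ~ C(delay_q) + disease_duration_yrs + ileal_disease",
                  data=df).fit(disp=False)
odds_ratios = np.exp(model.params)
print(odds_ratios)
```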
Abstract:
Background and Aims: The impact of diagnostic delay (a period from appearance of first symptoms to diagnosis) on the clinical course of Crohn's disease (CD) is unknown. We examined whether length of diagnostic delay affects disease outcome. Methods: Data from the Swiss IBD cohort study were analyzed. The frequencies of occurrence of bowel stenoses, internal fistulas, perianal fistulas, and CD-related surgery at distinct intervals after CD diagnosis (0 to <2, 2 to <6, and ≥6 years) were compared for groups of patients with different lengths of diagnostic delay. Results: The data from a group of 200 CD patients with long diagnostic delay (>24 months, 76th-100th percentile) were compared to those from a group of 461 patients with a short diagnostic delay (within 9 months, 1st-50th percentile). Treatment regimens did not differ between the two groups. Two years following diagnosis, patients with long diagnostic delay presented more frequently with bowel stenoses (25% vs. 13.1%, p = 0.044), internal fistulas (10% vs. 2%, p = 0.018), perianal fistulas (20% vs. 8.1%, p = 0.023) and more frequently underwent intestinal surgery (15% vs. 5.1%, p = 0.024) than patients with short diagnostic delay. Intestinal surgery was also more frequently performed 6 years after diagnosis in the group with long diagnostic delay (56.2% vs. 42.3%, p = 0.005) when compared to the group with short diagnostic delay. Conclusions: Long diagnostic delay is associated with a worse outcome characterized by the development of increased bowel damage, necessitating operations more frequently in the years following CD diagnosis. Efforts should be undertaken to shorten the diagnostic delay.
Abstract:
The water content dynamics in the upper soil surface during evaporation is a key element in land-atmosphere exchanges. Previous experimental studies have suggested that the soil water content increases at a depth of 5 to 15 cm below the soil surface during evaporation, while the layer in the immediate vicinity of the soil surface is drying. In this study, the dynamics of water content profiles exposed to solar radiative forcing was monitored at a high temporal resolution using dielectric methods, both in the presence and absence of evaporation. A 4-d comparison was made of the moisture content reported in coarse sand in covered and uncovered buckets using a commercial dielectric-based probe (70 MHz ECH2O-5TE, Decagon Devices, Pullman, WA) and the standard 1-GHz time domain reflectometry method. Both sensors reported a positive correlation between temperature and water content at the 5- to 10-cm depth, most pronounced in the morning during heating and in the afternoon during cooling. Such a positive correlation might have a physical origin, induced by evaporation at the surface and redistribution due to liquid water fluxes resulting from the temperature-gradient dynamics within the sand profile at those depths. Our experimental data suggest that the combined effect of surface evaporation and temperature-gradient dynamics should be considered when analyzing experimental soil water profiles. Additional effects related to the frequency of operation and to protocols for temperature compensation of the dielectric sensors may also affect the probes' response during large temperature changes.
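The diurnal temperature-water-content relationship described above can be checked with a simple correlation of the logged sensor time series at a given depth, split into heating and cooling periods. The file name, column names and depth value below are hypothetical placeholders; the study's own processing may differ.

```python
# Sketch: correlation between sensor temperature and reported water
# content at one depth, split by morning heating and afternoon cooling.
# File and column names are hypothetical placeholders.
import pandas as pd

df = pd.read_csv("bucket_sensor_log.csv", parse_dates=["timestamp"])
df = df.set_index("timestamp")

depth = df[df["depth_cm"] == 7.5]  # e.g. a sensor in the 5-10 cm layer

# Overall correlation between temperature and reported water content.
r_all = depth["temperature_c"].corr(depth["water_content_m3m3"])

# Split into morning heating and afternoon cooling windows.
heating = depth.between_time("06:00", "12:00")
cooling = depth.between_time("14:00", "20:00")
r_heat = heating["temperature_c"].corr(heating["water_content_m3m3"])
r_cool = cooling["temperature_c"].corr(cooling["water_content_m3m3"])

print(f"all: r={r_all:.2f}, heating: r={r_heat:.2f}, cooling: r={r_cool:.2f}")
```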
Abstract:
Background: We have recently shown that the median diagnostic delay to establish Crohn's disease (CD) diagnosis in the Swiss IBD Cohort (SIBDC) was 9 months. Seventy-five percent of all CD patients were diagnosed within 24 months. The clinical impact of a long diagnostic delay on the natural history of CD is unknown. Aim: To compare the frequency and type of CD-related complications in the patient groups with long diagnostic delay (>24 months) vs. the ones diagnosed within 24 months. Methods: Retrospective analysis of data from the SIBDCS, comprising a large sample of CD patients followed in hospitals and private practices across Switzerland. Results: Two hundred CD patients (121 female, mean age 44.9 ± 15.0 years, 38% smokers, 71% ever treated with immunomodulators and 35% with anti-TNF) with long diagnostic delay were compared to 697 CD patients (358 female, mean age 39.1 ± 14.9 years, 33% smokers, 74% ever treated with immunomodulators and 33% with anti-TNF) diagnosed within 24 months. No differences in the outcomes were observed between the two patient groups within year one after CD diagnosis. Among those diagnosed 2-5 years ago, CD patients with long diagnostic delay (n = 45) presented more frequently with internal fistulas (11.1% vs. 3.1%, p = 0.03) and bowel stenoses (28.9% vs. 15.7%, p = 0.05), and they more frequently underwent CD-related operations (15.6% vs. 5.0%, p = 0.02) compared to the patients diagnosed within 24 months (n = 159). Among those diagnosed 6-10 years ago, CD patients with long diagnostic delay (n = 48) presented more frequently with extraintestinal manifestations (60.4% vs. 34.6%, p = 0.001) than those diagnosed within 24 months (n = 182). For the patients diagnosed 11-15 years ago, no differences in outcomes were found between the long diagnostic delay group (n = 106) and the one diagnosed within 24 months (n = 32). Among those diagnosed ≥ 16 years ago, the group with long diagnostic delay (n = 71) more frequently underwent CD-related operations (63.4% vs. 46.5%, p = 0.01) compared to the group diagnosed with CD within 24 months (n = 241). Conclusions: A long diagnostic delay in CD patients is associated with a more complicated disease course and a higher number of CD-related operations in the years following the diagnosis. Our results indicate that efforts should be undertaken to shorten the diagnostic delay in CD patients to reduce the risk of progression towards a complicated disease phenotype.
Abstract:
The mouse has emerged as an animal model for many diseases. At IRO, we have used this animal to understand the development of many eye diseases and the treatment of some of them. Precise evaluation of vision is a prerequisite for both of these approaches. In this unit we describe three ways to assess vision: testing the optokinetic response, evaluating the fundus by direct observation, and evaluating it by fluorescent angiography.
Abstract:
Captan and folpet are fungicides widely used in agriculture. They have similar chemical structures, except that folpet has an aromatic ring unlike captan. Their half-lives in blood are very short, given that they are readily broken down to tetrahydrophthalimide (THPI) and phthalimide (PI), respectively. Few authors have measured these biomarkers in plasma or urine, and analysis was conducted either by gas chromatography coupled to mass spectrometry or by liquid chromatography with UV detection. The objective of this study was thus to develop simple, sensitive and specific liquid chromatography-atmospheric pressure chemical ionization-tandem mass spectrometry (LC/APCI-MS/MS) methods to quantify both THPI and PI in human plasma and urine. Briefly, deuterated THPI was added as an internal standard and purification was performed by solid-phase extraction followed by LC/APCI-MS/MS analysis in negative ion mode for both compounds. Validation of the methods was conducted using blank plasma and urine samples spiked at concentrations ranging from 1 to 250 μg/L and 1 to 50 μg/L, respectively, along with samples from volunteers and workers exposed to captan or folpet. The methods showed good linearity (R² > 0.99), recovery (on average 90% for THPI and 75% for PI), intra- and inter-day precision (RSD < 15%) and accuracy (<20%), and stability. The limit of detection was 0.58 μg/L in urine and 1.47 μg/L in plasma for THPI, and 1.14 and 2.17 μg/L, respectively, for PI. The described methods proved to be accurate and suitable for determining the toxicokinetics of both metabolites in human plasma and urine.
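The linearity and limit-of-detection figures reported above come from standard calibration statistics. Below is a minimal sketch with hypothetical spiked-concentration and instrument-response values; the LOD is estimated here as 3.3·σ/slope from the calibration residuals, which is one common (ICH-style) convention and not necessarily the procedure used in the paper.

```python
# Sketch: calibration linearity (R^2) and an LOD estimate for a
# quantification method. Concentrations and responses are hypothetical.
import numpy as np
from scipy import stats

spiked_conc_ugL = np.array([1, 5, 10, 25, 50, 100, 250], dtype=float)
peak_area_ratio = np.array([0.021, 0.098, 0.205, 0.49, 1.02, 2.01, 5.05])

fit = stats.linregress(spiked_conc_ugL, peak_area_ratio)
print(f"slope={fit.slope:.4f}, R^2={fit.rvalue**2:.4f}")

# Residual standard deviation of the calibration curve.
predicted = fit.intercept + fit.slope * spiked_conc_ugL
sigma = np.std(peak_area_ratio - predicted, ddof=2)

# One common convention: LOD = 3.3 * sigma / slope.
lod = 3.3 * sigma / fit.slope
print(f"estimated LOD ~ {lod:.2f} ug/L")
```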
Abstract:
The aim of this retrospective study was to compare the clinical and radiographic results after TKA (PFC, DePuy), performed either by computer-assisted navigation (CAS, Brainlab, Johnson & Johnson) or by conventional means. Material and methods: Between May and December 2006 we reviewed 36 conventional TKAs performed between 2002 and 2003 (group A) and 37 navigated TKAs performed between 2005 and 2006 (group B) by the same experienced surgeon. The mean age in group A was 74 years (range 62-90) and 73 years (range 58-85) in group B, with a similar age distribution. The preoperative mechanical axes in group A ranged from -13° varus to +13° valgus (mean absolute deviation 6.83°, SD 3.86), in group B from -13° to +16° (mean absolute deviation 5.35°, SD 4.29). Patients with a previous tibial osteotomy or revision arthroplasty were excluded from the study. Examination was done by an experienced orthopedic resident independent of the surgeon. All patients had pre- and postoperative long standing radiographs. The IKSS and the WOMAC were used to determine the clinical outcome. Patients' degree of satisfaction was assessed on a visual analogue scale (VAS). Results: 32 of the 37 navigated TKAs (86.5%) showed a postoperative mechanical axis within the limits of 3 degrees of valgus or varus deviation, compared to only 24 (66%) of the 36 standard TKAs. This difference was significant (p = 0.045). The mean absolute deviation from the neutral axis was 3.00° (range -5° to +9°, SD 1.75) in group A in comparison to 1.54° (range -5° to +4°, SD 1.41) in group B, a highly significant difference (p < 0.001). Furthermore, both groups showed a significant postoperative improvement of their mean IKSS values (group A: 89 preoperatively to 169 postoperatively; group B: 88 to 176) without a significant difference between the two groups. Neither the WOMAC nor the patients' degree of satisfaction, as assessed by VAS, showed significant differences. Operation time was significantly longer in group B (mean 119.9 min) than in group A (mean 99.6 min, p < 0.001). Conclusion: Our study showed a consistent, significant improvement of postoperative frontal alignment in TKA performed with computer-assisted navigation (CAS) compared to standard methods, even in the hands of a surgeon well experienced in standard TKA implantation. However, the follow-up time of this study was not long enough to judge differences in clinical outcome. Thus, the relevance of computer navigation for the clinical outcome and survival of TKA remains to be proved in long-term studies to justify the longer operation time. References: 1. Stulberg SD. Clin Orthop Relat Res. 2003;(416):177-84. 2. Chauhan SK. J Bone Joint Surg Br. 2004;86(3):372-7. 3. Bäthis H, et al. Orthopäde. 2006;35(10):1056-65.
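The headline alignment comparison (32/37 navigated vs. 24/36 conventional knees within ±3° of the neutral mechanical axis) can be reproduced with a simple test of proportions. The abstract does not state which test was used, so the sketch below assumes a chi-square test (without continuity correction) and shows Fisher's exact test as a small-sample alternative.

```python
# Sketch: test of proportions for knees within +/-3 degrees of neutral
# mechanical axis (navigated 32/37 vs. conventional 24/36).
from scipy.stats import chi2_contingency, fisher_exact

table = [[32, 37 - 32],   # navigated: within / outside +/-3 degrees
         [24, 36 - 24]]   # conventional: within / outside +/-3 degrees

chi2, p, dof, expected = chi2_contingency(table, correction=False)
print(f"chi-square p = {p:.3f}")   # close to the reported p = 0.045

# Fisher's exact test as a small-sample alternative.
odds_ratio, p_exact = fisher_exact(table)
print(f"Fisher exact p = {p_exact:.3f}")
```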
Abstract:
OBJECTIVES: To evaluate the outcome after Hartmann's procedure (HP) versus primary anastomosis (PA) with diverting ileostomy for perforated left-sided diverticulitis. BACKGROUND: The surgical management of left-sided colonic perforation with purulent or fecal peritonitis remains controversial. PA with ileostomy seems to be superior to HP; however, results in the literature are affected by a significant selection bias. No randomized clinical trial has yet compared the 2 procedures. METHODS: Sixty-two patients with acute left-sided colonic perforation (Hinchey III and IV) from 4 centers were randomized to HP (n = 30) or to PA (with diverting ileostomy, n = 32), with a planned stoma reversal operation after 3 months in both groups. Data were analyzed on an intention-to-treat basis. The primary end point was the overall complication rate. The study was discontinued following an interim analysis that found significant differences in relevant secondary end points as well as a decreasing accrual rate (NCT01233713). RESULTS: Patient demographics were equally distributed in both groups (Hinchey III: 76% vs 75% and Hinchey IV: 24% vs 25%, for HP vs PA, respectively). The overall complication rate for both resection and stoma reversal operations was comparable (80% vs 84%, P = 0.813). Although the outcome after the initial colon resection did not show any significant differences (mortality 13% vs 9% and morbidity 67% vs 75% in HP vs PA), the stoma reversal rate after PA with diverting ileostomy was higher (90% vs 57%, P = 0.005), and serious complications (Grades IIIb-IV: 0% vs 20%, P = 0.046), operating time (73 minutes vs 183 minutes, P < 0.001), hospital stay (6 days vs 9 days, P = 0.016), and in-hospital costs (US $16,717 vs US $24,014) were significantly reduced in the PA group. CONCLUSIONS: This is the first randomized clinical trial favoring PA with diverting ileostomy over HP in patients with perforated diverticulitis.
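A sketch of how such a between-group comparison of proportions can be checked is shown below. The counts are back-calculated from the reported stoma reversal rates and group sizes (roughly 90% of 32 and 57% of 30) purely for illustration; the exact counts and the test used in the trial are not given in the abstract.

```python
# Sketch: comparing stoma reversal rates between the PA and HP arms.
# Counts are approximate, back-calculated from the reported percentages
# (~90% of n=32 and ~57% of n=30); the trial's own analysis may differ.
from scipy.stats import fisher_exact

reversed_pa, not_reversed_pa = 29, 3    # ~90% of 32
reversed_hp, not_reversed_hp = 17, 13   # ~57% of 30

table = [[reversed_pa, not_reversed_pa],
         [reversed_hp, not_reversed_hp]]

odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.3f}")
```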
Abstract:
Methicillin-resistant Staphylococcus aureus (MRSA) emerged in the early 1980s in numerous health care institutions around the world. The main transmission mechanism within hospitals and healthcare facilities is via the hands of health care workers. Resistant to several antibiotics, MRSA is one of the most feared pathogens in the hospital setting, since it is very difficult to eradicate with standard treatments. There is still only a limited number of anti-MRSA antibiotics, and the first cases of resistance to these compounds have already been reported; their frequency is likely to increase in the coming years. Every year, MRSA infections result in major human and financial costs, owing to the high associated mortality and the expenses related to the required care. Measures towards faster detection of resistant bacteria and the establishment of appropriate antibiotic treatment parameters are fundamental. As part of infection prevention, reducing the bacteria present on commonly touched surfaces could also limit the spread and selection of antibiotic-resistant bacteria. During my thesis, projects were developed around MRSA and the investigation of antibiotic resistance using innovative technologies. The thesis was subdivided into three main parts: the use of atomic force microscopy (AFM) for antibiotic resistance detection in part 1, the importance of the bacterial inoculum size in the selection of antibiotic resistance in part 2, and the testing of antimicrobial surfaces created by sputtering copper onto polyester in part 3. In part 1, the AFM was used in two different ways: first for the measurement of the stiffness (elasticity) of bacteria, and second as a nanosensor for antibiotic susceptibility testing. The stiffness of MRSA strains with different susceptibility profiles to vancomycin was investigated using the stiffness tomography mode of the AFM, and the results demonstrated an increased stiffness in the vancomycin-resistant strains that also paralleled an increased thickness of the bacterial cell wall. Parts of the AFM were also used to build a new antibiotic susceptibility testing device. This nanosensor was able to measure vibrations emitted by living bacteria; the vibrations ceased definitively upon exposure to an antibiotic to which the bacteria were susceptible, but resumed after removal of an antibiotic to which they were resistant, allowing antibiotic susceptibility to be assessed within minutes. In part 2, the inoculum effect (IE) of vancomycin, daptomycin and linezolid and its importance in antibiotic resistance selection was investigated in MRSA during a 15-day cycling experiment. The results indicated that a high bacterial inoculum and prolonged antibiotic exposure were two key factors in the in vitro selection of antibiotic resistance in MRSA and should be taken into consideration when choosing the drug treatment. Finally, in part 3, bactericidal textile surfaces were investigated against MRSA. Polyesters coated by 160 seconds of copper sputtering demonstrated high bactericidal activity, reducing the bacterial load by at least 3 log10 after one hour of contact. -- Over the past decades, multidrug-resistant bacteria (MDR) have emerged in hospitals worldwide. Since then, the number of MDR bacteria and the prevalence of healthcare-associated infections (HAI) have continued to grow, and they are associated with increased morbidity and mortality rates as well as high costs.
In addition, the number of resistances to different antibiotic classes has also increased among MDR bacteria, limiting the therapeutic options available when they are linked to infections. Measures aiming at faster detection of resistant bacteria, together with the establishment of adequate antibiotic treatment parameters, are essential when infections are already present. From a prevention perspective, reducing the bacteria present on commonly touched surfaces could also curb the dissemination and evolution of resistant bacteria. During my thesis, various projects involving new technologies and revolving around antibiotic resistance were carried out. New technologies such as atomic force microscopy (AFM) and copper cathodic sputtering were used, and methicillin-resistant Staphylococcus aureus (MRSA) was the main MDR organism studied. Two main lines of research were developed: the first aimed at detecting antibiotic resistance more rapidly with the AFM, and the second at preventing the dissemination of MDR bacteria with surfaces created by copper sputtering. The AFM was first used as a scanning probe microscope to investigate vancomycin resistance in MRSA. The results showed that cell-wall stiffness increased with vancomycin resistance and that this also correlated with an increase in cell-wall thickness, verified by electron microscopy. Parts of an AFM were then used to create a new antibiotic susceptibility testing device, a nanosensor. This nanosensor measures vibrations produced by living bacteria. After addition of an antibiotic, the vibrations cease definitively in bacteria susceptible to that antibiotic; in contrast, for resistant bacteria the vibrations resume once the antibiotic is removed from the medium, thus allowing the susceptibility of a bacterium to an antibiotic to be detected within minutes. Copper sputtering was used to create bactericidal surfaces to prevent the survival of MDR bacteria on inert surfaces. Polyesters finely coated with copper (Cu), known for its bactericidal properties, were produced and tested against MRSA. A method for detecting bacterial viability on these surfaces was developed, and the polyesters obtained after 160 seconds of Cu sputtering demonstrated excellent bactericidal activity, decreasing the bacterial load by at least 3 log10 after one hour of contact. In conclusion, the use of new technologies allowed us to move towards faster methods of detecting antibiotic resistance as well as towards the development of a new type of bactericidal surface, with the aim of improving the diagnosis and management of MDR bacteria.
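The "at least 3 log10" reduction criterion mentioned for the copper-sputtered polyester corresponds to a simple calculation on recovered colony counts. A minimal sketch with hypothetical CFU numbers:

```python
# Sketch: log10 reduction of bacterial load after contact with a
# copper-sputtered surface. CFU counts are hypothetical.
import math

cfu_control = 5.0e6    # CFU recovered from an uncoated control surface
cfu_after_1h = 2.0e3   # CFU recovered after 1 h on the coated polyester

log_reduction = math.log10(cfu_control / cfu_after_1h)
print(f"log10 reduction = {log_reduction:.1f}")  # >= 3 log10 here
```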
Abstract:
OBJECTIVE: We examined the correlation between clinical wear rates of restorative materials and enamel (TRAC Research Foundation, Provo, USA) and the results of six laboratory test methods (ACTA, Alabama (generalized, localized), Ivoclar (vertical, volumetric), Munich, OHSU (abrasion, attrition), Zurich). METHODS: Individual clinical wear data were available from clinical trials that were conducted by TRAC Research Foundation (formerly CRA) together with general practitioners. For each of the n=28 materials (21 composite resins for intra-coronal restorations [20 direct and 1 indirect], 5 resin materials for crowns, 1 amalgam, enamel) a minimum of 30 restorations had been placed in posterior teeth, mainly molars. The recall intervals were up to 5 years, although the majority of materials (n=27) were monitored only for up to 2 years. For the laboratory data, the MEDLINE database and IADR abstracts were searched for wear data on materials that were also clinically tested by TRAC Research Foundation. Only those data for which the same test parameters (e.g. number of cycles, loading force, type of antagonist) had been published were included in the study. A different amount of data was available for each laboratory method: Ivoclar (n=22), Zurich (n=20), Alabama (n=17), OHSU and ACTA (n=12), Munich (n=7). The clinical results were summarized in an index, and a linear mixed model was fitted to the log wear measurements including the following factors: material, time (0.5, 1, 2 and 3 years), tooth (premolar/molar) and gender (male/female) as fixed effects, and patient as a random effect. Relative ranks were created for each material and method; the same was done with the clinical results. RESULTS: The mean age of the subjects was 40 (±12) years. The materials had mostly been applied in molars (81%), and 95% of the intracoronal restorations were Class II restorations. The mean number of individual wear data points per material was 25 (range 14-42). The mean coefficient of variation of the clinical wear data was 53%. The only significant correlation was reached by OHSU (abrasion), with a Spearman r of 0.86 (p=0.001). Zurich, ACTA, Alabama generalized wear and Ivoclar (volume) had correlation coefficients between 0.3 and 0.4. For Zurich, Alabama generalized wear and Munich, the correlation coefficient improved if only composites for direct use were taken into consideration. The combination of different laboratory methods did not significantly improve the correlation. SIGNIFICANCE: The clinical wear of composite resins depends mainly on differences between patients and less on differences between materials. Laboratory methods to test conventional resins for wear are therefore less important, especially since most of them do not reflect clinical wear.
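The two statistical steps described in the Methods, a linear mixed model on log wear with patient as a random effect, and a Spearman rank correlation between clinical and laboratory ranks, can be sketched as follows. The data frame, column names and rank values are hypothetical and only illustrate the model structure reported in the abstract.

```python
# Sketch: (1) linear mixed model on log wear with patient as a random
# effect; (2) Spearman correlation of clinical vs. laboratory ranks.
# Data and column names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy.stats import spearmanr

# Hypothetical long-format clinical wear data.
rng = np.random.default_rng(1)
clin = pd.DataFrame({
    "patient": np.repeat(np.arange(40), 4),
    "material": rng.choice(["A", "B", "C", "D"], size=160),
    "time_yrs": np.tile([0.5, 1, 2, 3], 40),
    "tooth": rng.choice(["premolar", "molar"], size=160),
    "gender": rng.choice(["m", "f"], size=160),
    "wear_um": rng.lognormal(mean=3.0, sigma=0.5, size=160),
})
clin["log_wear"] = np.log(clin["wear_um"])

# Mixed model: material, time, tooth and gender as fixed effects,
# patient as a random intercept.
mm = smf.mixedlm("log_wear ~ C(material) + time_yrs + tooth + gender",
                 data=clin, groups=clin["patient"]).fit()
print(mm.summary())

# Relative ranks of materials, clinically vs. one laboratory method.
clinical_rank = [1, 2, 3, 4]   # hypothetical ranks for materials A-D
lab_rank = [1, 3, 2, 4]
rho, p = spearmanr(clinical_rank, lab_rank)
print(f"Spearman r = {rho:.2f}, p = {p:.3f}")
```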