917 results for Histologic lip measurements and analyses


Relevance: 100.00%

Publisher:

Abstract:

Magnetic Resonance Spectroscopy (MRS) is an advanced clinical and research application that provides a specific biochemical and metabolic characterization of tissues through the detection and quantification of key metabolites for diagnosis and disease staging. The Associazione Italiana di Fisica Medica (AIFM) has promoted the activity of the "Interconfronto di spettroscopia in RM" working group. The purpose of the study is to compare and analyze results obtained by performing MRS on scanners from different manufacturers in order to compile a robust protocol for spectroscopic examinations in clinical routine. This thesis contributes to the project using the GE Signa HDxt 1.5 T scanner at Pavilion no. 11 of the S. Orsola-Malpighi hospital in Bologna. The spectral analyses were performed with the jMRUI package, which includes a wide range of preprocessing and quantification algorithms for signal analysis in the time domain. After quality assurance on the scanner with standard and innovative methods, spectra both with and without suppression of the water peak were acquired on the GE test phantom. The ratios of the metabolite amplitudes over Creatine computed by the workstation software, which operates in the frequency domain, agree well with those computed by jMRUI, suggesting that quantification in either domain leads to consistent results. The characterization of an in-house phantom provided by the working group achieved its goal of assessing the solution content and the metabolite concentrations with good accuracy. The soundness of the experimental procedure and data analysis is demonstrated by the correct estimation of the T2 of water, the observed biexponential relaxation curve of Creatine, and the correct TE value at which modulation by J-coupling inverts the Lactate doublet in the spectrum.
This work demonstrates that it is possible to perform measurements and establish protocols for data analysis, based on the physical principles of NMR, that provide robust values for the spectral parameters of clinical use.
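The biexponential relaxation mentioned for Creatine can be illustrated with a small fitting sketch. The echo times, amplitudes, and the two T2 components below are invented for illustration; they are not values from the thesis.

```python
import numpy as np
from scipy.optimize import curve_fit

def biexponential(te, a_fast, t2_fast, a_slow, t2_slow):
    # Two proton pools decaying with different transverse relaxation times
    return a_fast * np.exp(-te / t2_fast) + a_slow * np.exp(-te / t2_slow)

# Hypothetical echo times (ms) and a synthetic, noise-free decay curve
te = np.linspace(20.0, 400.0, 20)
true_params = (0.7, 60.0, 0.3, 250.0)   # assumed values, illustration only
signal = biexponential(te, *true_params)

# Recover the two T2 components from the decay curve
popt, _ = curve_fit(biexponential, te, signal, p0=(0.5, 40.0, 0.5, 200.0))
```

With noisy in-vivo data the two components separate far less cleanly, which is why multiple, well-spaced TE values are needed in practice.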


A study of hadron production by photons opens unique ways to address a number of fundamental problems in strong interaction physics as well as fundamental questions in Quantum Field Theory. In particular, an understanding of two-photon processes is of crucial importance for constraining the hadronic uncertainties in precision measurements and in searches for new physics. The process of


In recent years, a major step toward substantially higher efficiency has been taken for spin-filter detectors. This is an important prerequisite for spin-resolved measurements with modern electron spectrometers and momentum microscopes. In this doctoral thesis, previous work on the parallel imaging technique was developed further; the technique relies on the fact that an electron-optical image is preserved even after reflection at a crystalline surface, exploiting the conservation of k-parallel in low-energy electron diffraction. Earlier measurements based on specular reflection at a W(001) surface [Kolbe et al., 2011; Tusche et al., 2011] were extended to a much larger parameter range, and with Ir(001) a new system was investigated that offers a much longer lifetime of the cleaned crystal surface in UHV. The scattering-energy and angle-of-incidence "landscape" of the spin sensitivity S and the reflectivity I/I0 of scattered electrons was measured over 13.7-36.7 eV scattering energy and 30°-60° scattering angle. The newly built measurement setup comprises a spin-polarized GaAs electron source and a rotatable delay-line detector for position-resolved detection of the scattered electrons. The results show several regions with high asymmetry and a large figure of merit (FoM), defined as S² · I/I0. These regions open a path toward a substantial improvement of multichannel spin-filter techniques for electron spectroscopy and momentum microscopy. In practical use, the Ir(001) single-crystal surface proved very promising with respect to its longer lifetime in UHV (about one measurement day) combined with a high FoM. The Ir(001) detector was deployed together with a hemispherical analyzer in a femtosecond time-resolved experiment at the free-electron laser FLASH at DESY.
Good working points turned out to be 45° scattering angle at 39 eV scattering energy, with a usable energy width of 5 eV, and 10 eV scattering energy with a narrower profile of < 1 eV but a roughly 10× larger figure of merit. The spin asymmetry reaches values of up to 70%, which markedly reduces the influence of instrumental asymmetries. The resulting measurements and energy-angle landscape show quite good agreement with theory (relativistic layer-KKR SPLEED code [Braun et al., 2013; Feder et al., 2012]).
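The figure of merit quoted above is a simple product, which can be sketched directly. The asymmetry and reflectivity values below are illustrative placeholders, not the measured landscape.

```python
import numpy as np

def figure_of_merit(S, reflectivity):
    # FoM = S^2 * (I/I0): the spin sensitivity enters squared because the
    # statistical error of an asymmetry measurement scales as 1/(S*sqrt(N))
    return S**2 * reflectivity

# Illustrative working points (asymmetries/reflectivities are NOT measured values)
energy_eV = np.array([10.0, 39.0])
S = np.array([0.70, 0.30])
I_rel = np.array([0.010, 0.040])

fom = figure_of_merit(S, I_rel)
best_energy = energy_eV[np.argmax(fom)]
```

The point of the squared sensitivity is visible even in this toy comparison: a high-asymmetry, low-reflectivity working point can still outperform a brighter but less spin-selective one.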


This thesis investigates the structure and composition of the lowermost atmosphere during the PARADE measurement campaign (PArticles and RAdicals: Diel observations of the impact of urban and biogenic Emissions) at the Kleiner Feldberg in Germany in late summer 2011. Measurements of basic meteorological quantities (temperature, humidity, pressure, wind speed and direction) are analyzed together with radiosonde and aircraft-based measurements of trace gases (carbon monoxide, carbon dioxide, ozone) and particle number concentrations. The aim is to use these data to determine the thermodynamic and dynamic properties of the planetary boundary layer and their influence on the chemical composition of the air masses within it. To this end, the radiosonde and aircraft measurements are combined with Lagrangian methods, distinguishing between purely kinematic models (LAGRANTO and FLEXTRA) and so-called particle dispersion models (FLEXPART). For the first time, a version of FLEXPART-COSMO driven by the meteorological analysis fields of the German Weather Service (Deutscher Wetterdienst) was used in this context. Among the established methods for determining boundary layer height from radiosonde measurements, the bulk Richardson number method is used as the reference, since it is well established for both measurements and model analyses. Within a tolerance of 125 m, the derived boundary layer height agrees with at least three other methods in 95% of the cases, confirming its quality. The boundary layer height varies between 0 and 2000 m above ground during the campaign: a high boundary layer is observed after the passage of cold fronts, whereas a low boundary layer occurs under high-pressure influence and the associated subsidence during calm conditions in the warm sector.
A comparison of boundary layer heights from radiosondes and from models (COSMO-DE, COSMO-EU, COSMO-7) shows only small differences of -6 to +12% during the campaign at the Kleiner Feldberg. However, systematic differences between the models (COSMO-7 and COSMO-EU) appear over larger simulation domains. This thesis shows that the soil moisture, which is initialized differently in these two models, leads to different boundary layer heights. The consequences are systematic differences in air mass origin and, in particular, in emission sensitivity. Furthermore, local mixing between the boundary layer and the free troposphere can be identified. It appears in the temporal change of the correlations between CO2 and O3 from the aircraft measurements and is corroborated by comparison with backward trajectories and radiosonde profiles. The entrainment of air masses into the boundary layer influences the chemical composition in the vertical and probably also at the ground. This experimental study confirms the relevance of entrainment from the free troposphere and the usefulness of the correlation method for determining exchange and entrainment processes at this interface.
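The bulk Richardson number method used as the reference above can be sketched in a few lines: the boundary layer top is taken as the first level where the ratio of buoyant suppression to shear production exceeds a critical value (0.25 is a common choice). The sounding values below are invented for illustration.

```python
import numpy as np

G = 9.81  # gravitational acceleration, m s^-2

def bulk_richardson(z, theta_v, u, v, theta_v_sfc):
    # Ri_b between the surface and each level z: buoyancy term over shear term
    shear2 = np.maximum(u**2 + v**2, 1e-6)   # avoid division by zero in calm air
    return G * z * (theta_v - theta_v_sfc) / (theta_v_sfc * shear2)

def blh_bulk_richardson(z, rib, ri_crit=0.25):
    # Boundary layer top: first level where Ri_b exceeds the critical value
    above = np.flatnonzero(rib > ri_crit)
    return z[above[0]] if above.size else np.nan

# Hypothetical radiosonde profile (heights in m, theta_v in K, wind in m/s)
z       = np.array([  50.,  200.,  500.,  900., 1300., 1800.])
theta_v = np.array([300.0, 300.1, 300.2, 300.5, 302.5, 304.0])
u       = np.array([  2.,    4.,    5.,    6.,    6.,    7.])
v       = np.zeros_like(u)

rib = bulk_richardson(z, theta_v, u, v, theta_v_sfc=300.0)
blh = blh_bulk_richardson(z, rib)
```

In practice the result is sensitive to the choice of surface level and critical value, which is one reason cross-checking against several independent methods, as done in the thesis, is worthwhile.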


Stratospheric particles are typically invisible to the naked eye. Nevertheless, they have a significant influence on the Earth's radiation balance and on heterogeneous chemistry in the stratosphere. Continuous, vertically resolved, global data sets are therefore essential for understanding physical and chemical processes in this part of the atmosphere. Starting with the measurements of the second Stratospheric Aerosol Measurement instrument (SAM II) in 1978, a continuous time series of stratospheric aerosol extinction profiles exists, carried on to the present day by instruments such as the second Stratospheric Aerosol and Gas Experiment (SAGE II), SCIAMACHY, OSIRIS, and OMPS.

This thesis presents a newly developed algorithm that uses the so-called "onion-peeling" principle to compute extinction profiles between 12 and 33 km. The algorithm is applied to single-wavelength radiance profiles measured by SCIAMACHY in limb geometry. SCIAMACHY's unique mode of alternating limb and nadir measurements offers the advantage of highly resolved vertical and horizontal measurements with temporal and spatial coincidence. The additional information gained in this way can be used to correct for the effects of horizontal gradients along the instrument's line of sight, which are observed above all shortly after volcanic eruptions and for polar stratospheric clouds. If these gradients are neglected when computing extinction profiles, both the optical thickness and the altitude of volcanic plumes or polar stratospheric clouds may be underestimated.
This thesis presents a procedure that corrects the computed extinction profiles with the help of three-dimensional radiative transfer simulations and horizontally resolved data sets. Comparison studies with results from satellite (SAGE II) and balloon measurements show that extinction profiles of stratospheric particles can be computed with the newly developed algorithm and agree well with existing data sets. Investigations of the 2011 Nabro volcanic eruption and of the occurrence of polar stratospheric clouds in the Southern Hemisphere show that the correction procedure for horizontal gradients clearly improves the computed extinction profiles.
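The geometric core of the onion-peeling principle can be sketched as follows: each limb ray adds exactly one new (tangent) shell, so working downward from the topmost tangent height turns the retrieval into a triangular system. This is a strongly simplified sketch, assuming a spherically homogeneous atmosphere, slant optical depths as input rather than radiances, and no refraction; the thesis algorithm is considerably more involved.

```python
import numpy as np

R_EARTH = 6371.0  # km

def path_lengths(z):
    # z: shell boundaries (km altitude), ascending; ray i is tangent at z[i].
    # L[i, j] = chord length of limb ray i through spherical shell j.
    n = len(z) - 1
    L = np.zeros((n, n))
    for i in range(n):
        r_t = R_EARTH + z[i]
        for j in range(i, n):
            lo = max(R_EARTH + z[j], r_t)
            hi = R_EARTH + z[j + 1]
            L[i, j] = 2.0 * (np.sqrt(hi**2 - r_t**2) - np.sqrt(lo**2 - r_t**2))
    return L

def onion_peel(z, slant_od):
    # Peel from the top down: each ray introduces only one unknown shell
    L = path_lengths(z)
    n = len(slant_od)
    k = np.zeros(n)
    for i in range(n - 1, -1, -1):
        k[i] = (slant_od[i] - L[i, i + 1:] @ k[i + 1:]) / L[i, i]
    return k

# Synthetic round trip with an invented extinction profile (12-33 km)
z = np.array([12., 15., 18., 21., 24., 27., 30., 33.])
k_true = np.array([8e-4, 5e-4, 3e-4, 2e-4, 1e-4, 5e-5, 2e-5])  # km^-1
slant_od = path_lengths(z) @ k_true
k_retrieved = onion_peel(z, slant_od)
```

The sketch also makes the motivation for the gradient correction visible: the long chords through each shell are exactly where horizontal inhomogeneity along the line of sight breaks the spherically symmetric assumption.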


We present a geospatial model to predict the radiofrequency electromagnetic field from fixed site transmitters for use in epidemiological exposure assessment. The proposed model extends an existing model toward the prediction of indoor exposure, that is, at the homes of potential study participants. The model is based on accurate operation parameters of all stationary transmitters of mobile communication base stations, and radio broadcast and television transmitters for an extended urban and suburban region in the Basel area (Switzerland). The model was evaluated by calculating Spearman rank correlations and weighted Cohen's kappa (κ) statistics between the model predictions and measurements obtained at street level, in the homes of volunteers, and in front of the windows of these homes. The correlation coefficients of the numerical predictions with street level measurements were 0.64, with indoor measurements 0.66, and with window measurements 0.67. The κ coefficients were 0.48 (95%-confidence interval: 0.35-0.61) for street level measurements, 0.44 (95%-CI: 0.32-0.57) for indoor measurements, and 0.53 (95%-CI: 0.42-0.65) for window measurements. Although the modeling of shielding effects by walls and roofs requires considerable simplifications of a complex environment, we found a comparable accuracy of the model for indoor and outdoor points.
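Both evaluation statistics used above are straightforward to compute. The sketch below uses SciPy's Spearman correlation and a hand-rolled linearly weighted kappa; the prediction/measurement values and exposure classes are invented for illustration.

```python
import numpy as np
from scipy.stats import spearmanr

def weighted_kappa(a, b, n_cat):
    # Linearly weighted Cohen's kappa for ordinal categories 0..n_cat-1:
    # kappa = 1 - sum(w*observed) / sum(w*expected-by-chance)
    obs = np.zeros((n_cat, n_cat))
    for x, y in zip(a, b):
        obs[x, y] += 1
    obs /= obs.sum()
    exp = np.outer(obs.sum(axis=1), obs.sum(axis=0))
    w = np.abs(np.subtract.outer(np.arange(n_cat), np.arange(n_cat)))
    return 1.0 - (w * obs).sum() / (w * exp).sum()

# Hypothetical predicted vs. measured exposures (V/m), and ordinal classes
pred = np.array([0.05, 0.10, 0.22, 0.31, 0.18, 0.40])
meas = np.array([0.06, 0.12, 0.20, 0.35, 0.25, 0.38])
rho, _ = spearmanr(pred, meas)

pred_cls = np.array([0, 0, 1, 2, 1, 2])
meas_cls = np.array([0, 1, 1, 2, 2, 2])
kappa = weighted_kappa(pred_cls, meas_cls, n_cat=3)
```

Rank correlation and weighted kappa answer slightly different questions: the former tests ordering of the raw values, the latter penalizes disagreement between exposure classes in proportion to how far apart the classes are.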


Software is available, which simulates all basic electrophoretic systems, including moving boundary electrophoresis, zone electrophoresis, ITP, IEF and EKC, and their combinations under almost exactly the same conditions used in the laboratory. These dynamic models are based upon equations derived from the transport concepts such as electromigration, diffusion, electroosmosis and imposed hydrodynamic buffer flow that are applied to user-specified initial distributions of analytes and electrolytes. They are able to predict the evolution of electrolyte systems together with associated properties such as pH and conductivity profiles and are as such the most versatile tool to explore the fundamentals of electrokinetic separations and analyses. In addition to revealing the detailed mechanisms of fundamental phenomena that occur in electrophoretic separations, dynamic simulations are useful for educational purposes. This review includes a list of current high-resolution simulators, information on how a simulation is performed, simulation examples for zone electrophoresis, ITP, IEF and EKC and a comprehensive discussion of the applications and achievements.
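The transport concepts listed above reduce, for a single analyte in a constant field and without electroosmosis, to a one-dimensional electromigration-diffusion equation. The explicit finite-difference sketch below is a toy model under those assumptions (all numeric parameters are invented), not one of the high-resolution simulators reviewed here.

```python
import numpy as np

def simulate_zone(c0, mu, D, E, dx, dt, steps):
    # dc/dt = -mu*E * dc/dx + D * d2c/dx2  (constant field, no EOF);
    # first-order upwind advection (valid for mu*E > 0), periodic ends
    c = c0.copy()
    for _ in range(steps):
        adv = -mu * E * (c - np.roll(c, 1)) / dx
        diff = D * (np.roll(c, -1) - 2.0 * c + np.roll(c, 1)) / dx**2
        c = c + dt * (adv + diff)
    return c

dx = 1e-5                        # 10 um grid over a 4 mm channel
x = np.arange(400) * dx
c0 = np.exp(-((x - 100 * dx) ** 2) / (2.0 * (5 * dx) ** 2))  # injected zone

# Illustrative values: mobility 3e-8 m^2/(V s), D = 1e-9 m^2/s, E = 10 kV/m
c = simulate_zone(c0, mu=3e-8, D=1e-9, E=1e4, dx=dx, dt=0.01, steps=300)
```

After 3 s the zone has migrated about 0.9 mm down the channel while broadening. The real simulators couple many such equations through electroneutrality and a spatially varying field, which is what makes pH and conductivity profiles emerge.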


Although non-organic hearing losses are relatively rare, it is important to identify suspicious findings early to be able to administer specific tests, such as objective measurements and specific counseling. In this retrospective study, we searched for findings that were specific to or typical of non-organic hearing losses. Patient records from a six-year period (2003-2008) from the University ENT Department of Bern, Switzerland, were reviewed. In this period, 40 subjects were diagnosed with a non-organic hearing loss (22 children, ages 7-16, mean 10.6 years; 18 adults, ages 19-57, mean 39.7 years; 25 females and 15 males). Pure tone audiograms in children and adults showed predominantly sensorineural and frequency-independent hearing losses, mostly in the range of 40-60 dB. In all cases, objective measurements (otoacoustic emissions and/or auditory-evoked potentials) indicated normal or substantially better hearing thresholds than those found in pure tone audiometry. In nine subjects (22.5%; 2 children, 7 adults), hearing aids had been fitted before the first presentation at our center. Six children (27%) had a history of middle ear problems with a transient hearing loss and 11 (50%) knew a person with a hearing loss. Two new and hitherto unreported findings emerged from the analysis: a small air-bone gap of 5-20 dB was typical for non-organic hearing losses, and speech audiometry might show considerably poorer results than expected from pure tone audiometry.


We characterized lipid and lipoprotein changes associated with a lopinavir/ritonavir-containing regimen. We enrolled previously antiretroviral-naive patients participating in the Swiss HIV Cohort Study. Fasting blood samples (baseline) were retrieved retrospectively from stored frozen plasma and posttreatment (follow-up) samples were collected prospectively at two separate visits. Lipids and lipoproteins were analyzed at a single reference laboratory. Sixty-five patients had two posttreatment lipid profile measurements and nine had only one. Most of the measured lipids and lipoprotein plasma concentrations increased on lopinavir/ritonavir-based treatment. The percentage of patients with hypertriglyceridemia (TG >150 mg/dl) increased from 28/74 (38%) at baseline to 37/65 (57%) at the second follow-up. We did not find any correlation between lopinavir plasma levels and the concentration of triglycerides. There was weak evidence of an increase in small dense LDL-apoB during the first year of treatment but not beyond 1 year (odds ratio 4.5, 90% CI 0.7 to 29 and 0.9, 90% CI 0.5 to 1.5, respectively). However, 69% of our patients still had undetectable small dense LDL-apoB levels while on treatment. LDL-cholesterol increased by a mean of 17 mg/dl (90% CI -3 to 37) during the first year of treatment, but mean values remained below the cut-off for therapeutic intervention. Despite an increase in the majority of measured lipids and lipoproteins particularly in the first year after initiation, we could not detect an obvious increase of cardiovascular risk resulting from the observed lipid changes.


Bridging the gap between research and policy is of growing importance in international development. The National Centre of Competence in Research (NCCR) North-South has rich experience in collaborating beyond academic boundaries to make their research relevant to various societal actors. This publication is the first to provide an overview of the effectiveness of NCCR North-South researchers’ efforts to interact with policy, practice, and local communities with a view to effecting a change in practices. A systematic assessment of researchers’ interactions with non-academic partners is presented, based on principles of monitoring and evaluation. On this basis, tools for collective learning and widespread adaptation are proposed. The report shows with what types of societal actors NCCR North-South researchers collaborate and analyses examples of how researchers conduct dialogue beyond academic boundaries, leading to specific outcomes. It also explains the frame conditions considered decisive for successful and sustainable policy dialogue and concludes with recommendations about how the NCCR North-South can increase the effectiveness of its research for development. The publication is a valuable source of inspiration for those interested in better understanding how to generate the multiple benefits of making science relevant to society.


Monte Carlo (MC) based dose calculations can compute dose distributions with an accuracy surpassing that of conventional algorithms used in radiotherapy, especially in regions of tissue inhomogeneities and surface discontinuities. The Swiss Monte Carlo Plan (SMCP) is a GUI-based framework for photon MC treatment planning (MCTP) interfaced to the Eclipse treatment planning system (TPS). As for any dose calculation algorithm, the MCTP needs to be commissioned and validated before being used for clinical cases. The aim of this study is the investigation of a 6 MV beam for clinical situations within the framework of the SMCP. In this respect, all parts, i.e., open fields and all clinically available beam modifiers, have to be configured so that the calculated dose distributions match the corresponding measurements. Dose distributions for the 6 MV beam were simulated in a water phantom using a phase space source above the beam modifiers. The VMC++ code was used for the radiation transport through the beam modifiers (jaws, wedges, block and multileaf collimator (MLC)) as well as for the calculation of the dose distributions within the phantom. The voxel size of the dose distributions was 2 mm in all directions. The statistical uncertainty of the calculated dose distributions was below 0.4%. Simulated depth dose curves and dose profiles in terms of Gy/MU for static and dynamic fields were compared with the corresponding measurements using dose difference and γ analysis. For the dose difference criterion of ±1% of D(max) and the distance-to-agreement criterion of ±1 mm, the γ analysis showed excellent agreement between measurements and simulations for all static open and MLC fields. The tuning of the density and the thickness for all hard wedges led to agreement with the corresponding measurements within 1% or 1 mm. Similar results were achieved for the block.
For the validation of the tuned hard wedges, very good agreement between calculated and measured dose distributions was achieved using a 1%/1 mm criterion for the γ analysis. The calculated dose distributions of the enhanced dynamic wedges (10°, 15°, 20°, 25°, 30°, 45° and 60°) met the 1%/1 mm criterion when compared with the measurements for all situations considered. For the IMRT fields, all compared measured dose values agreed with the calculated dose values within a 2% dose difference or within 1 mm distance. The SMCP has been successfully validated for a static and dynamic 6 MV photon beam, thus resulting in accurate dose calculations suitable for applications in clinical cases.
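The γ analysis used throughout this validation combines a dose-difference and a distance-to-agreement criterion into a single index. A minimal 1-D sketch of the global γ index follows; the profile and the 0.5% scaling error are synthetic, and a clinical implementation would add interpolation of the evaluated distribution and a low-dose threshold.

```python
import numpy as np

def gamma_index_1d(x, d_ref, d_eval, dose_tol=0.01, dist_tol=1.0):
    # Global 1-D gamma: for each reference point, the minimum over all
    # evaluated points of the combined dose/distance metric
    d_norm = d_ref.max()                     # global normalization to D_max
    gamma = np.empty_like(d_ref)
    for i in range(len(x)):
        dd = (d_eval - d_ref[i]) / (dose_tol * d_norm)
        dr = (x - x[i]) / dist_tol
        gamma[i] = np.sqrt(dd**2 + dr**2).min()
    return gamma

# Hypothetical depth-dose-like profile (mm, arbitrary dose units)
x = np.linspace(0.0, 50.0, 251)
d_ref = np.exp(-((x - 15.0) ** 2) / 50.0)
d_eval = 1.005 * d_ref                       # 0.5% global scaling error

gamma = gamma_index_1d(x, d_ref, d_eval)     # 1%/1 mm criteria
pass_rate = float((gamma <= 1.0).mean())
```

A point passes when γ ≤ 1, i.e. when it lies inside the ellipsoid defined by the dose and distance tolerances; a uniform 0.5% error therefore passes a 1%/1 mm test everywhere.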


We hypothesized that fluid administration may increase regional splanchnic perfusion after abdominal surgery, even in the absence of a cardiac stroke volume (SV) increase and independent of accompanying endotoxemia. Sixteen anesthetized pigs underwent abdominal surgery with flow probe fitting around splanchnic vessels and carotid arteries. They were randomized to continuous placebo or endotoxin infusion, and when clinical signs of hypovolemia (mean arterial pressure, <60 mmHg; heart rate, >100 beats·min⁻¹; urine production, <0.5 mL·kg⁻¹·h⁻¹; arterial lactate concentration, >2 mmol·L⁻¹) and/or low pulmonary artery occlusion pressure (target 5-8 mmHg) were present, they received repeated boluses of colloids (50 mL) as long as SV increased 10% or greater. Stroke volume and regional blood flows were monitored 2 min before and 30 min after fluid challenges. Of 132 fluid challenges, 45 (34%) resulted in an SV increase of 10% or greater, whereas 82 (62%) resulted in an increase of 10% or greater in one or more of the abdominal flows (P < 0.001). During blood flow redistribution, celiac trunk (19% of all measurements) and hepatic artery flow (15%) most often decreased, whereas portal vein (10%) and carotid artery (7%) flow decreased less frequently (P = 0.015, between regions). In control animals, celiac trunk (30% vs. 9%, P = 0.004) and hepatic artery (25% vs. 11%, P = 0.040) flow decreased more often than in endotoxin-infused pigs. Accordingly, blood flow redistribution is a common phenomenon in the postoperative period and is only marginally influenced by endotoxemia. Fluid management based on SV changes may not be useful for improving regional abdominal perfusion.


The synthesis of a caged RNA phosphoramidite building block containing the oxidatively damaged base 5-hydroxycytidine (5-HOrC) has been accomplished. To determine the effect of this highly mutagenic lesion on complementary base recognition and coding properties, this building block was incorporated into a 12-mer oligoribonucleotide for Tm and CD measurements and a 31-mer template strand for primer extension experiments with HIV-, AMV- and MMLV-reverse transcriptase (RT). In UV-melting experiments, we find an unusual biphasic transition with two distinct Tm values when 5-HOrC is paired against a DNA or RNA complement with the base guanine in the opposing position. The higher Tm closely matches that of a C-G base pair, while the lower is close to that of a C-A mismatch. In single nucleotide extension reactions, we find substantial misincorporation of dAMP and, to a lesser extent, dTMP, with dAMP almost equaling the parent dGMP in the case of HIV-RT. A working hypothesis for the biphasic melting transition does not invoke tautomeric variability of 5-HOrC but rather local structural perturbations of the base pair at low temperature, induced by interactions of the 5-HO group with the phosphate backbone. The properties of this RNA damage are discussed in the context of its putative biological function.


We present results from the international field campaign DAURE (Determination of the sources of atmospheric Aerosols in Urban and Rural Environments in the Western Mediterranean), whose objective was to apportion the sources of fine carbonaceous aerosols. Submicron fine particulate matter (PM1) samples were collected during February-March 2009 and July 2009 at an urban background site in Barcelona (BCN) and at a forested regional background site in Montseny (MSY). We present radiocarbon (14C) analysis for elemental and organic carbon (EC and OC) and source apportionment for these data. We combine the results with those from component analysis of aerosol mass spectrometer (AMS) measurements, and compare them to levoglucosan-based estimates of biomass burning OC, source apportionment of filter data with inorganic composition + EC + OC, submicron bulk potassium (K) concentrations, and gaseous acetonitrile concentrations. At BCN, 87% and 91% of the EC on average, in winter and summer, respectively, had a fossil origin, whereas at MSY these fractions were 66% and 79%. The contribution of fossil sources to organic carbon (OC) at BCN was 40% and 48% in winter and summer, respectively, and 31% and 25% at MSY. The combination of results obtained using the 14C technique, AMS data, and the correlations between fossil OC and fossil EC imply that the fossil OC at Barcelona is ~47% primary, whereas at MSY the fossil OC is mainly secondary (~85%). Day-to-day variation in total carbonaceous aerosol loading and the relative contributions of different sources predominantly depended on the meteorological transport conditions. The estimated biogenic secondary OC at MSY increased by only ~40% compared to the order-of-magnitude increase observed for biogenic volatile organic compounds (VOCs) between winter and summer, which highlights the uncertainties in the estimation of that component. Biomass burning contributions estimated using the 14C technique ranged from similar to slightly higher than those estimated using other techniques, and the different estimates were highly or moderately correlated. Differences can be explained by the contribution of secondary organic matter (not included in the primary biomass burning source estimates), and/or by an overestimation of the biomass burning OC contribution by the 14C technique if the estimated biomass burning EC/OC ratio used for the calculations is too high for this region. Acetonitrile concentrations correlate well with the biomass burning EC determined by 14C. K is a noisy tracer for biomass burning.
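The core of the 14C source apportionment is a two-endmember mixing calculation: fossil carbon contains no 14C, while contemporary carbon sits slightly above a fraction modern of 1 because of the bomb spike. The sketch below assumes a non-fossil reference of 1.05 and an invented sample value; both are illustrative, not campaign numbers.

```python
def fossil_fraction(f_m_sample, f_m_nonfossil=1.05):
    # Fossil carbon: f_M = 0. Contemporary carbon: f_M ~ f_m_nonfossil
    # (assumed reference value; it depends on source type and sampling year)
    return 1.0 - f_m_sample / f_m_nonfossil

# A measured fraction modern near 0.14 for EC maps to a fossil share near 87%,
# the order of the winter Barcelona EC value quoted above (f_M here is invented)
ff = fossil_fraction(0.137)
```

The choice of the non-fossil reference is one of the main systematic uncertainties of the method: overestimating it shifts carbon from the fossil to the contemporary (e.g. biomass burning) category, which is exactly the kind of bias discussed above for the biomass burning EC/OC ratio.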


The aim of our study is to evaluate the performance of surface sealants and conventional polishing after ageing procedures. Eighty circular composite restorations were performed on extracted human molars. After standardised roughening, the restorations were either sealed with one of three surface sealants (Lasting Touch (LT), BisCover LV (BC), G-Coat Plus (GP)) or a dentin adhesive (Heliobond, HB), or were manually polished with silicone polishers (MP) (n = 16). The average roughness (Ra) and colourimetric parameters (CP) (L*a*b*) were evaluated. The specimens underwent an artificial ageing process by thermocycling, staining (coffee) and abrasive (toothbrushing) procedures. After each ageing step, Ra and CP measurements were repeated. A qualitative surface analysis was performed with SEM. The differences between the test groups regarding Ra and CP values were analysed with nonparametric ANOVA (α = 0.05). The lowest Ra values were achieved with HB. BC and GP resulted in Ra values below 0.2 μm (the clinically relevant threshold), whereas LT and MP sometimes led to higher Ra values. LT showed significantly higher discolouration after the first coffee staining, but this normalised to the other groups after toothbrushing. The differences between the measurements and test groups for Ra and CP were statistically significant. However, the final colour difference showed no statistical difference among the five groups. SEM evaluation showed clear alterations after ageing in all coating groups. Surface sealants and dentin adhesives have the potential to reduce surface roughness but tend to debond over time. Surface sealants can therefore only be recommended for polishing provisional restorations.