891 results for PACS: information storage and retrieval
Abstract:
The present study is part of the EU Integrated Project “GEHA – Genetics of Healthy Aging” (Franceschi C et al., Ann N Y Acad Sci. 1100: 21-45, 2007), whose aim is to identify genes involved in healthy aging and longevity, which allow individuals to survive to advanced age in good cognitive and physical function and in the absence of major age-related diseases. Aims: The major aims of this thesis were the following: 1. to outline the recruitment procedure for 90+ Italian siblings performed by the recruiting units of the University of Bologna (UNIBO) and Rome (ISS). The procedures related to the following items necessary to perform the study were described and commented on: identification of the eligible area for recruitment, demographic aspects related to the need to obtain census lists of 90+ siblings, mail and phone contact with 90+ subjects and their families, bioethical aspects of the whole procedure, standardization of the recruitment methodology and set-up of a detailed flow chart to be followed by the European recruitment centres (obtaining the informed consent form, anonymization of data by using a special code, how to perform the interview, how to collect the blood, how to enter data in the GEHA Phenotypic Database hosted at Odense). 2. to provide an overview of the phenotypic characteristics of 90+ Italian siblings recruited by the recruiting units of the University of Bologna (UNIBO) and Rome (ISS). The following items were addressed: socio-demographic characteristics, health status, cognitive assessment, physical conditions (handgrip strength test, chair-stand test, physical ability including ADL, vision and hearing ability, movement ability and doing light housework), life-style information (smoking and drinking habits) and subjective well-being (attitude towards life). 
Moreover, haematological parameters collected in the 90+ sibpairs as optional parameters by the Bologna and Rome recruiting units were used for a more comprehensive evaluation of the results obtained using the above-mentioned phenotypic characteristics reported in the GEHA questionnaire. 3. to assess 90+ Italian siblings as far as their health/functional status is concerned on the basis of three classification methods proposed in previous studies on centenarians, which are based on: • actual functional capabilities (ADL, SMMSE, visual and hearing abilities) (Gondo et al., J Gerontol. 61A (3): 305-310, 2006); • actual functional capabilities and morbidity (ADL, ability to walk, SMMSE, presence of cancer, stroke, renal failure, anaemia, and liver diseases) (Franceschi et al., Aging Clin Exp Res, 12:77-84, 2000); • retrospectively collected data about past history of morbidity and age of disease onset (hypertension, heart disease, diabetes, stroke, cancer, osteoporosis, neurological diseases, chronic obstructive pulmonary disease and ocular diseases) (Evert et al., J Gerontol A Biol Sci Med Sci. 58A (3): 232-237, 2003). First, these available models to define the health status of long-living subjects were applied to the sample and, since the classifications by Gondo and Franceschi are both based on present functional status, they were compared in order to better recognize the healthy aging phenotype and to identify the best group of 90+ subjects out of the entire studied population. 4. to investigate the concordance of health and functional status among 90+ siblings in order to divide sibpairs into three categories: the best (both sibs in good shape), the worst (both sibs in bad shape) and an intermediate group (one sib in good shape and the other in bad shape). 
Moreover, the evaluation aimed to discover which variables are concordant among siblings; concordant variables can be considered familial variables (determined by the environment or by genetics). 5. to perform a survival analysis using mortality data as of 1 January 2009 from the follow-up as the main outcome and selected functional and clinical parameters as explanatory variables. Methods: A total of 765 90+ Italian subjects recruited by the UNIBO (549 90+ siblings, belonging to 258 families) and ISS (216 90+ siblings, belonging to 106 families) recruiting units were included in the analysis. Each subject was interviewed according to a standardized questionnaire, comprising extensively utilized questions that had been validated in previous European studies on elderly subjects and covering demographic information, life style, living conditions, cognitive status (SMMSE), mood, health status and anthropometric measurements. Moreover, subjects were asked to perform some physical tests (Hand Grip Strength test and Chair Standing test), and a sample of about 24 mL of blood was collected and then processed according to a common protocol for the preparation and storage of DNA aliquots. 
Results: The main findings of the analysis are the following: - a standardized protocol to assess the cognitive status, physical performance and health status of European nonagenarian subjects was set up in compliance with ethical requirements, and is available as a reference for other studies in this field; - GEHA families are enriched in long-living members and extreme survival, and represent an appropriate model for the identification of genes involved in healthy aging and longevity; - two simplified sets of criteria to classify 90+ siblings according to their health status were proposed, as operational tools for distinguishing healthy from non-healthy subjects; - cognitive and functional parameters play a major role in categorizing 90+ siblings by health status; - parameters such as education and good physical abilities (ability to walk 500 metres, ability to go up and down stairs, high scores on the hand grip and chair stand tests) are associated with good health status (defined as “cognitive unimpairment and absence of disability”); - male nonagenarians show a more homogeneous phenotype than females, and, though far fewer in number, tend to be healthier than females; - in males good health status is not protective for survival, confirming the male-female health-survival paradox; - survival after age 90 depended mainly on intact cognitive status and absence of functional disabilities; - haemoglobin and creatinine levels are both associated with longevity; - the most concordant items among 90+ siblings are related to functional status, indicating that they contain a familial component. It remains to be investigated to what extent this familial component is determined by genetics, by environment, or by the interaction between genetics, environment and chance. 
Conclusions: In conclusion, this study, in accordance with the main objectives of the whole GEHA project, represents one of the first attempts to identify the biological and non-biological determinants of successful/unsuccessful aging and longevity. Here, the analysis was performed on 90+ siblings recruited in Northern and Central Italy, and it can be used as a reference for other studies in this field on the Italian population. Moreover, it contributed to the definition of “successful” and “unsuccessful” aging, and categorising a very large cohort of our most elderly subjects into “successful” and “unsuccessful” groups provided an unrivalled opportunity to detect some of the basic genetic/molecular mechanisms which underpin good health as opposed to chronic disability. Discoveries concerning the biological determinants of healthy aging offer a real possibility of identifying new markers to be utilized for the identification of subgroups of old European citizens at higher risk of developing age-related diseases and disabilities, and of directing major preventive-medicine strategies for the new epidemic of chronic disease in the 21st century.
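The survival analysis described above (mortality follow-up as the outcome, functional and clinical parameters as explanatory variables) typically starts from a Kaplan-Meier survival estimate. A minimal sketch on invented toy records follows; the function name, subject times and event flags are hypothetical illustrations, not GEHA data:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimator.
    times: follow-up time for each subject
    events: 1 if death was observed, 0 if the subject was censored
    Returns a list of (time, survival probability) steps."""
    data = sorted(zip(times, events))      # sort subjects by follow-up time
    n_at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        # deaths and total exits (deaths + censorings) at this time point
        deaths = sum(e for (tt, e) in data if tt == t)
        exits = sum(1 for (tt, e) in data if tt == t)
        if deaths > 0:
            surv *= 1.0 - deaths / n_at_risk
            curve.append((t, surv))
        n_at_risk -= exits
        i += exits
    return curve

# toy example: 6 hypothetical 90+ subjects, follow-up in months
times  = [12, 20, 20, 35, 40, 50]
events = [1,   1,  0,  1,  0,  0]   # 1 = died during follow-up, 0 = censored
for t, s in kaplan_meier(times, events):
    print(t, round(s, 3))           # prints: 12 0.833 / 20 0.667 / 35 0.444
```

In practice, the effect of explanatory variables such as cognitive status would then be assessed by comparing curves between groups or by fitting a Cox proportional-hazards model.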
Abstract:
The hydrogen production of the green microalga Chlamydomonas reinhardtii was evaluated by means of a detailed physiological and biotechnological study. First, a wide screening of hydrogen productivity was performed on 22 strains of C. reinhardtii, most of which were mutated at the level of the D1 protein. The screening revealed for the first time that mutations of the D1 protein may result in increased hydrogen production. Indeed, production ranged between 0 and more than 500 mL hydrogen per liter of culture (Torzillo, Scoma et al., 2007a), the highest producer (L159I-N230Y) being up to 5 times more productive than the strain cc124 widely adopted in the literature (Torzillo, Scoma, et al., 2007b). The improved productivities of the D1 protein mutants were generally a result of high photosynthetic capabilities counteracted by high respiration rates. Optimization of culture conditions was addressed according to the results of the physiological study of selected strains. In a first step, the photobioreactor (PBR) was provided with a multiple-impeller stirring system designed, developed and tested by us, using the strain cc124. It was found that the impeller system was able to induce regular and turbulent mixing, which led to improved photosynthetic yields by means of light/dark cycles. Moreover, the improved mixing regime sustained higher respiration rates compared to those obtained with the commonly used stir-bar mixing system. In light of the results of the initial screening phase, both these factors are relevant to hydrogen production. Indeed, very high energy conversion efficiencies (light to hydrogen) were obtained with the impeller device, proving that our PBR was a good tool both to improve and to study photosynthetic processes (Giannelli, Scoma et al., 2009). 
In the second part of the optimization, an accurate analysis of all the positive features of the high-performance strain L159I-N230Y pointed out that, with respect to the WT, it has: (1) a larger chlorophyll optical cross-section; (2) a higher electron transfer rate by PSII; (3) a higher respiration rate; (4) a higher efficiency of utilization of the hydrogenase; (5) a higher starch synthesis capability; (6) a higher per-cell D1 protein amount; (7) a higher zeaxanthin synthesis capability (Torzillo, Scoma et al., 2009). This information was combined with that obtained with the impeller mixing device to determine the best culture conditions to optimize productivity with strain L159I-N230Y. The main aim was to sustain as long as possible the direct PSII contribution, which leads to hydrogen production without net CO2 release. Finally, an outstanding maximum rate of 11.1 ± 1.0 mL/L/h was reached and maintained for 21.8 ± 7.7 hours, when the effective photochemical efficiency of PSII (ΔF/F'm) finally dropped to zero. If expressed in terms of chlorophyll (24.0 ± 2.2 µmoles/mg chl/h), these rates of production are 4 times higher than those reported in the literature to date (Scoma et al., 2010a submitted). DCMU addition experiments confirmed the key role played by PSII in sustaining such rates. On the other hand, experiments carried out under similar conditions with the control strain cc124 showed an improved final productivity, but no constant direct PSII contribution. These results show that, aside from fermentation processes, if proper conditions are supplied to selected strains, hydrogen production can be substantially enhanced by means of biophotolysis. A last study on the physiology of the process was carried out with the mutant IL. Although able to express and very efficiently utilize the hydrogenase enzyme, this strain was unable to produce hydrogen when sulfur-deprived. 
However, in a specific set of experiments this goal was finally reached, pointing out that, in addition to (1) a state 1-2 transition of the photosynthetic apparatus, (2) starch storage and (3) establishment of anaerobiosis, a timely transition to hydrogen production is also needed under sulfur deprivation to induce the process before energy reserves are diverted towards other processes necessary for the survival of the cell. This information turned out to be crucial when moving outdoors for hydrogen production in a tubular horizontal 50-liter PBR under sunlight radiation. First attempts with laboratory-grown cultures showed that no hydrogen production under sulfur starvation can be induced unless the culture is first adapted outdoors. With such adaptation, hydrogen production under direct sunlight radiation with C. reinhardtii was achieved for the first time in the literature (Scoma et al., 2010b submitted). Experiments were also made to optimize productivity in outdoor conditions with respect to light dilution within the culture layers. Finally, a brief study of the anaerobic metabolism of C. reinhardtii during hydrogen oxidation was carried out. This study complements the understanding of the complex interplay of pathways that operate concomitantly in this microalga.
Abstract:
Information processing and storage in the brain may be represented by oscillations and cell assemblies. Here we address the question of how individual neurons associate together to assemble neural networks and present spontaneous electrical activity. To this end, we dissected the neonatal brain at three different levels: acute 1-mm thick brain slices, cultured organotypic 350-µm thick brain slices and dissociated neuronal cultures. The spatio-temporal properties of neural activity were investigated using a 60-channel micro-electrode array (MEA), and the cell assemblies were studied using a template-matching method. We find local non-propagating as well as large-scale propagating spontaneous oscillatory activity in acute slices, spontaneous network activity characterized by synchronized burst discharges in organotypic cultured slices, and autonomous bursting behaviour in dissociated neuronal cultures. Furthermore, repetitive spike patterns emerge after one week of dissociated neuronal culture and dramatically increase in number as well as in complexity and occurrence in the second week. Our data indicate that neurons can self-organize, assemble into a neural network, present spontaneous oscillations, and give rise to spatio-temporal activation patterns. The spontaneous oscillations and repetitive spike patterns may serve fundamental functions for information processing and storage in the brain.
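The template-matching idea used above to detect repetitive spike patterns can be sketched as sliding a binary spike-time template over a binned spike train and recording where it recurs. The following is a didactic illustration on an invented spike train, not the study's actual algorithm or data:

```python
def find_template_matches(spikes, template, max_mismatch=0):
    """Slide a binary spike-pattern template over a binned spike train and
    return the start bins where the pattern recurs.
    spikes, template: lists of 0/1 values, one per time bin."""
    hits = []
    for start in range(len(spikes) - len(template) + 1):
        window = spikes[start:start + len(template)]
        # count bins where the window disagrees with the template
        mismatches = sum(w != t for w, t in zip(window, template))
        if mismatches <= max_mismatch:
            hits.append(start)
    return hits

# toy spike train containing the pattern [1, 0, 1, 1] twice
train = [0, 1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0]
print(find_template_matches(train, [1, 0, 1, 1]))  # → [1, 7]
```

Allowing a small `max_mismatch` tolerates jittered or dropped spikes, which is the usual reason real template-matching methods score similarity rather than requiring exact repeats.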
Abstract:
Graphene, the thinnest two-dimensional material possible, is considered a realistic candidate for numerous applications in electronic, energy-storage and energy-conversion devices due to its unique properties, such as high optical transmittance, high conductivity, and excellent chemical and thermal stability. However, the electronic and chemical properties of graphene are highly dependent on its preparation method. Therefore, the development of a novel chemical exfoliation process aiming at high-yield synthesis of high-quality graphene while maintaining good solution processability is of great concern. This thesis focuses on the solution production of high-quality graphene by wet-chemical exfoliation methods and addresses the applications of the chemically exfoliated graphene in organic electronics and energy-storage devices. Platinum is the most commonly used catalyst for fuel cells, but it suffers from sluggish electron-transfer kinetics. On the other hand, heteroatom-doped graphene is known to enhance not only electrical conductivity but also long-term operation stability. In this regard, a simple synthetic method was developed for the preparation of nitrogen-doped graphene (NG). Moreover, iron (Fe) can be incorporated into the synthetic process. As-prepared NG, with and without Fe, shows excellent catalytic activity and stability compared to that of Pt-based catalysts. High electrical conductivity is one of the most important requirements for the application of graphene in electronic devices. Therefore, for the fabrication of electrically conductive graphene films, a novel methane-plasma-assisted reduction of GO was developed. The high electrical conductivity of plasma-reduced GO films revealed an excellent electrochemical performance in terms of high power and energy densities when used as an electrode in micro-supercapacitors. Although GO can be prepared at bulk scale, its high defect density and low electrical conductivity are major drawbacks. 
To overcome the intrinsic limitation of the poor quality of GO and/or reduced GO, a novel protocol was established for the mass production of high-quality graphene by means of electrochemical exfoliation of graphite. The prepared graphene shows high electrical conductivity, low defect density and good solution processability. Furthermore, when used as electrodes in organic field-effect transistors and/or in supercapacitors, the electrochemically exfoliated graphene shows excellent device performance. The low-cost and environmentally friendly production of such high-quality graphene is of great importance for future generations of electronics and energy-storage devices.
Abstract:
OBJECTIVES: Donation after circulatory declaration of death (DCDD) could significantly improve the number of cardiac grafts for transplantation. Graft evaluation is particularly important in the setting of DCDD given that conditions of cardio-circulatory arrest and warm ischaemia differ, leading to variable tissue injury. The aim of this study was to identify, at the time of heart procurement, means to predict contractile recovery following cardioplegic storage and reperfusion using an isolated rat heart model. Identification of reliable approaches to evaluate cardiac grafts is key in the development of protocols for heart transplantation with DCDD. METHODS: Hearts isolated from anaesthetized male Wistar rats (n = 34) were exposed to various perfusion protocols. To simulate DCDD conditions, rats were exsanguinated and maintained at 37°C for 15-25 min (warm ischaemia). Isolated hearts were perfused with modified Krebs-Henseleit buffer for 10 min (unloaded), arrested with cardioplegia, stored for 3 h at 4°C and then reperfused for 120 min (unloaded for 60 min, then loaded for 60 min). Left ventricular (LV) function was assessed using an intraventricular micro-tip pressure catheter. Statistical significance was determined using the non-parametric Spearman rho correlation analysis. RESULTS: After 120 min of reperfusion, recovery of LV work measured as the developed pressure (DP)-heart rate (HR) product ranged from 0 to 15 ± 6.1 mmHg·beats·min⁻¹·10⁻³ following warm ischaemia of 15-25 min. Several haemodynamic parameters measured during early, unloaded perfusion at the time of heart procurement, including HR and the peak systolic pressure-HR product, correlated significantly with contractile recovery after cardioplegic storage and 120 min of reperfusion (P < 0.001). Coronary flow, oxygen consumption and lactate dehydrogenase release also correlated significantly with contractile recovery following cardioplegic storage and 120 min of reperfusion (P < 0.05). 
CONCLUSIONS: Haemodynamic and biochemical parameters measured at the time of organ procurement could serve as predictive indicators of contractile recovery. We believe that evaluation of graft suitability is feasible prior to transplantation with DCDD, and may, consequently, increase donor heart availability.
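The Spearman rho correlation used in the study above is simply the Pearson correlation computed on the ranks of the two series. A minimal sketch follows; the data are invented, illustrative values only, not the study's measurements:

```python
def ranks(values):
    """Rank values from 1 upward, averaging ranks over ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        # extend j over a block of tied values
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average 1-based rank of the tied block
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman_rho(x, y):
    """Spearman rho = Pearson correlation of the rank-transformed data."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

# toy data: heart rate at procurement vs. recovered LV work (hypothetical)
hr      = [250, 280, 310, 330, 360]
lv_work = [2.0, 4.5, 4.0, 9.0, 15.0]
print(round(spearman_rho(hr, lv_work), 3))  # → 0.9
```

Because only ranks enter the computation, the statistic is robust to the skewed, non-normal distributions typical of recovery data, which is why a non-parametric measure was chosen.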
Abstract:
Understanding the canopy cover of an urban environment leads to better estimates of carbon storage and more informed management decisions by urban foresters. The most commonly used method for assessing urban forest cover type extent is ground surveys, which can be both time-consuming and expensive. The analysis of aerial photos is an alternative method that is faster, cheaper, and can cover a larger number of sites, but may be less accurate. The objectives of this paper were (1) to compare three methods of cover type assessment for Los Angeles, CA: hand-delineation of aerial photos in ArcMap, supervised classification of aerial photos in ERDAS Imagine, and ground-collected data using the Urban Forest Effects (UFORE) model protocol; (2) to determine how well remote sensing methods estimate carbon storage as predicted by the UFORE model; and (3) to explore the influence of tree diameter and tree density on carbon storage estimates. Four major cover types (bare ground, fine vegetation, coarse vegetation, and impervious surfaces) were determined from 348 plots (0.039 ha each) randomly stratified according to land use. Hand-delineation was better than supervised classification at predicting ground-based measurements of cover type and UFORE model-predicted carbon storage. Most error in supervised classification resulted from shadow, which was interpreted as unknown cover type. Neither tree diameter nor tree density per plot significantly affected the relationship between carbon storage and canopy cover. The efficiency of remote sensing rather than in situ data collection gives urban forest managers the ability to quickly assess a city and plan accordingly while also preserving their often-limited budget.
Abstract:
Information management and geoinformation systems (GIS) have become indispensable in a large majority of protected areas all over the world. These tools are used for management purposes as well as for research, and in recent years have become even more important for visitor information, education and communication. This study is divided into two parts: the first part provides a general overview of GIS and information management in a selected number of national park organizations. The second part lists and evaluates the needs of evolving large protected areas in Switzerland. The results show a wide use of GIS and information management tools in well-established protected areas. The isolated use of individual GIS tools has increasingly been replaced by integrated geoinformation management. However, interview partners pointed out that human resources for GIS in most parks are limited. The interviews also highlight uneven access to national geodata. The vision of integrated geoinformation management is not yet fully developed in the park projects in Switzerland. Short-term needs, such as software and data availability, dominate the large number of responses collected in an exhaustive questionnaire. Nevertheless, the need for coordinated action has been identified and should be followed up. The park organizations in North America show how effective coordination and cooperation might be organized.
Abstract:
Teaching is a dynamic activity. It can be very effective if its impact is constantly monitored and adjusted to the demands of changing social contexts and the needs of learners. This implies that teachers need to be aware of teaching and learning processes. Moreover, they should constantly question their didactic methods and the learning resources which they provide to their students. They should reflect on whether their actions are suitable, and they should regulate their teaching, e.g., by updating learning materials based on new knowledge about learners, or by motivating learners to engage in further learning activities. In recent years, a rising interest in ‘learning analytics’ has become observable. This interest is motivated by the availability of massive amounts of educational data. Continuously increasing processing power, and a strong motivation for discovering new information in these pools of educational data, are also pushing further developments within the learning analytics research field. Learning analytics could be a method for reflective teaching practice that enables and guides teachers to investigate and evaluate their work in future learning scenarios. However, this potentially positive impact has not yet been sufficiently verified by learning analytics research. Another method that pursues these goals is ‘action research’. Learning analytics promises to initiate action research processes because it facilitates awareness, reflection and regulation of teaching activities analogous to action research. Therefore, this thesis joins both concepts in order to improve the design of learning analytics tools. The central research questions of this thesis are: What are the dimensions of learning analytics in relation to action research which need to be considered when designing a learning analytics tool? How does a learning analytics dashboard impact the teachers of technology-enhanced university lectures regarding ‘awareness’, ‘reflection’ and ‘action’? 
Does it initiate action research? Which are central requirements for a learning analytics tool, which pursues such effects? This project followed design-based research principles, in order to answer these research questions. The main contributions are: a theoretical reference model that connects action research and learning analytics, the conceptualization and implementation of a learning analytics tool, a requirements catalogue for useful and usable learning analytics design based on evaluations, a tested procedure for impact analysis, and guidelines for the introduction of learning analytics into higher education.
Abstract:
Despite the astounding success of the fast fashion retailers, the management practices leading to these results have not been subject to extensive research so far. Given this background, we analyze the impact of information sharing and vertical integration on the performance of 51 German apparel companies. We find that the positive impact of vertical integration is mediated by information sharing, i.e. that the ability to improve the information flow is a key success factor of vertically integrated apparel supply chains. Thus, the success of an expansion strategy based on vertical integration critically depends on effective ways to share logistical information.
Abstract:
The hippocampus receives input from upper levels of the association cortex and is implicated in many mnemonic processes, but the exact mechanisms by which it codes and stores information are an unresolved topic. This work examines the flow of information through the hippocampal formation while attempting to determine the computations that each of the hippocampal subfields performs in learning and memory. The formation, storage, and recall of hippocampal-dependent memories theoretically utilize an autoassociative attractor network that functions by implementing two competitive, yet complementary, processes. Pattern separation, hypothesized to occur in the dentate gyrus (DG), refers to the ability to decrease the similarity among incoming information by producing output patterns that overlap less than the inputs. In contrast, pattern completion, hypothesized to occur in the CA3 region, refers to the ability to reproduce a previously stored output pattern from a partial or degraded input pattern. Prior to addressing the functional role of the DG and CA3 subfields, the spatial firing properties of neurons in the dentate gyrus were examined. The principal cell of the dentate gyrus, the granule cell, has spatially selective place fields; however, the behavioral correlates of another excitatory cell, the mossy cell of the dentate polymorphic layer, are unknown. This report shows that putative mossy cells have spatially selective firing that consists of multiple fields, similar to previously reported properties of granule cells. Other cells recorded from the DG had single place fields. Compared to cells with multiple fields, cells with single fields fired at a lower rate during sleep, were less likely to burst, and were more likely to be recorded simultaneously with a large population of neurons that were active during sleep and silent during behavior. These data suggest that single-field and multiple-field cells constitute at least two distinct cell classes in the DG. 
Based on these characteristics, we propose that putative mossy cells tend to fire in multiple, distinct locations in an environment, whereas putative granule cells tend to fire in single locations, similar to place fields of the CA1 and CA3 regions. Experimental evidence supporting the theories of pattern separation and pattern completion comes from both behavioral and electrophysiological tests. These studies specifically focused on the function of each subregion and made implicit assumptions about how environmental manipulations changed the representations encoded by the hippocampal inputs. However, the cell populations that provided these inputs were in most cases not directly examined. We conducted a series of studies to investigate the neural activity in the entorhinal cortex, dentate gyrus, and CA3 in the same experimental conditions, which allowed a direct comparison between the input and output representations. The results show that the dentate gyrus representation changes between the familiar and cue altered environments more than its input representations, whereas the CA3 representation changes less than its input representations. These findings are consistent with longstanding computational models proposing that (1) CA3 is an associative memory system performing pattern completion in order to recall previous memories from partial inputs, and (2) the dentate gyrus performs pattern separation to help store different memories in ways that reduce interference when the memories are subsequently recalled.
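The autoassociative attractor dynamics invoked above, where CA3 recalls a stored pattern from a partial or degraded cue, can be illustrated with a tiny Hopfield-style network. This is a generic didactic sketch of pattern completion, not the specific computational models cited in the abstract; the pattern and cue are invented:

```python
def train_hopfield(patterns):
    """Hebbian weight matrix for an autoassociative network storing ±1 patterns."""
    n = len(patterns[0])
    w = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:  # no self-connections
                    w[i][j] += p[i] * p[j] / len(patterns)
    return w

def complete(w, cue, steps=5):
    """Synchronously update a degraded cue toward a stored attractor pattern."""
    s = list(cue)
    for _ in range(steps):
        s = [1 if sum(w[i][j] * s[j] for j in range(len(s))) >= 0 else -1
             for i in range(len(s))]
    return s

# store one 8-unit pattern, then recall it from a cue with two flipped units
stored = [1, -1, 1, 1, -1, -1, 1, -1]
w = train_hopfield([stored])
cue = [1, -1, -1, 1, -1, -1, -1, -1]  # degraded input (units 2 and 6 flipped)
print(complete(w, cue) == stored)  # → True
```

Pattern separation is the complementary operation: mapping two similar inputs onto outputs that overlap less than the inputs did, so that completion later retrieves the correct one rather than a blend.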
Abstract:
Digital technologies have profoundly changed not only the ways we create, distribute, access, use and re-use information but also many of the governance structures we had in place. Overall, "older" institutions at all governance levels have grappled with, and often failed to master, the multi-faceted and multi-directional issues of the Internet. Regulatory entrepreneurs have yet to discover and fully mobilize the potential of digital technologies as an influential factor impacting upon the regulability of the environment and as a potential regulatory tool in themselves. At the same time, we have seen a deterioration of some public spaces and lower prioritization of public objectives when strong private commercial interests are at play, most tellingly in the field of copyright. Less tangibly, private ordering has taken hold and captured, through contracts, spaces previously regulated by public law. Code embedded in technology often replaces law. Non-state action has in general proliferated and put serious pressure upon conventional state-centered, command-and-control models. Under the conditions of this "messy" governance, the provision of key public goods, such as freedom of information, has been made difficult or is indeed jeopardized. The grand question is how we can navigate this complex multi-actor, multi-issue space and secure the attainment of fundamental public interest objectives. This is also the question that Ian Brown and Chris Marsden seek to answer with their book, Regulating Code, recently published in the "Information Revolution and Global Politics" series of MIT Press. This book review critically assesses the bold effort by Brown and Marsden.
Abstract:
The paper argues for a distinction between sensory- and conceptual-information storage in the human information-processing system. Conceptual information is characterized as meaningful and symbolic, while sensory information may exist in modality-bound form. Furthermore, it is assumed that sensory information does not contribute to conscious remembering and can be used only in data-driven process repetitions, which can be accompanied by a kind of vague or intuitive feeling. Accordingly, purely top-down and willingly controlled processing, such as free recall, should not have any access to sensory data. Empirical results from different research areas and from two experiments conducted by the authors are presented in this article to support these theoretical distinctions. The experiments were designed to separate a sensory-motor and a conceptual component in memory for two-digit numbers and two-letter items, when parts of the numbers or items were imaged or drawn on a tablet. The results of free recall and recognition are discussed within a theoretical framework which distinguishes sensory and conceptual information in memory.
Resumo:
Objectives: To investigate the surface roughness and microhardness of two recent resin-ceramic materials for computer-aided design/computer-aided manufacturing (CAD/CAM) after polishing with three polishing systems. Surface roughness and microhardness were measured immediately after polishing and after six months of storage including monthly artificial toothbrushing. Methods: Sixty specimens of Lava Ultimate (3M ESPE) and 60 specimens of VITA ENAMIC (VITA Zahnfabrik) were roughened in a standardized manner and polished with one of three polishing systems (n=20/group): Sof-Lex XT discs (SOFLEX; three-step (medium-superfine); 3M ESPE), VITA Polishing Set Clinical (VITA; two-step; VITA Zahnfabrik), or KENDA Unicus (KENDA; one-step; KENDA Dental). Surface roughness (Ra; μm) was measured with a profilometer and microhardness (Vickers; VHN) with a surface hardness indentation device. Ra and VHN were measured immediately after polishing and after six months of storage (tap water, 37°C) including monthly artificial toothbrushing (500 cycles/month, toothpaste RDA ~70). Ra and VHN values were analysed with nonparametric ANOVA followed by Wilcoxon rank sum tests (α=0.05). Results: For Lava Ultimate, Ra (mean [standard deviation] before/after storage) remained the same when polished with SOFLEX (0.18 [0.09]/0.19 [0.10]; p=0.18), increased significantly with VITA (1.10 [0.44]/1.27 [0.39]; p=0.0001), and decreased significantly with KENDA (0.35 [0.07]/0.33 [0.08]; p=0.03). VHN (mean [standard deviation] before/after storage) decreased significantly regardless of polishing system (SOFLEX: 134.1 [5.6]/116.4 [3.6], VITA: 138.2 [10.5]/115.4 [5.9], KENDA: 135.1 [6.2]/116.7 [6.3]; all p<0.0001). For VITA ENAMIC, Ra (mean [standard deviation] before/after storage) increased significantly when polished with SOFLEX (0.37 [0.18]/0.41 [0.14]; p=0.01) and remained the same with VITA (1.32 [0.37]/1.31 [0.40]; p=0.58) and with KENDA (0.81 [0.35]/0.78 [0.32]; p=0.21).
VHN (mean [standard deviation] before/after storage) remained the same regardless of polishing system (SOFLEX: 284.9 [24.6]/282.4 [31.8], VITA: 284.6 [28.5]/276.4 [25.8], KENDA: 292.6 [26.9]/282.9 [24.3]; p=0.42-1.00). Conclusion: Surface roughness and microhardness of Lava Ultimate were more affected by storage and artificial toothbrushing than were those of VITA ENAMIC.
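The before/after comparisons above rest on paired Wilcoxon tests per material/polishing group. As a minimal sketch of that paired analysis, the snippet below computes the Wilcoxon signed-rank statistic (the paired counterpart of the rank sum test named in the abstract) by hand, using standard-library Python only; the per-specimen Ra values are hypothetical, since the abstract reports only group means.

```python
# Sketch of a paired before/after Wilcoxon comparison, as used for the
# Ra measurements in the abstract. The Ra readings (in um) below are
# HYPOTHETICAL stand-ins for one material/polishing group.
ra_before = [0.18, 0.21, 0.15, 0.22, 0.17, 0.19, 0.16, 0.20]
ra_after  = [0.19, 0.22, 0.17, 0.23, 0.18, 0.21, 0.16, 0.22]

# Signed differences, rounded to the measurement precision and with
# zero differences dropped (the standard Wilcoxon convention).
diffs = [round(a - b, 2) for a, b in zip(ra_after, ra_before)]
diffs = [d for d in diffs if d != 0]

# Rank the absolute differences, assigning average ranks to tie groups.
order = sorted(range(len(diffs)), key=lambda i: abs(diffs[i]))
ranks = [0.0] * len(diffs)
i = 0
while i < len(order):
    j = i
    while j + 1 < len(order) and abs(diffs[order[j + 1]]) == abs(diffs[order[i]]):
        j += 1
    avg_rank = (i + j) / 2 + 1  # average 1-based rank for the tie group
    for k in range(i, j + 1):
        ranks[order[k]] = avg_rank
    i = j + 1

# Test statistic: rank sums of positive and negative differences; the
# smaller of the two is compared against the exact null distribution.
w_plus = sum(r for d, r in zip(diffs, ranks) if d > 0)
w_minus = sum(r for d, r in zip(diffs, ranks) if d < 0)
print(f"n = {len(diffs)}, W+ = {w_plus}, W- = {w_minus}")
```

With every difference positive, all rank mass falls on W+, which is the pattern behind the "increased significantly" results reported above; in practice one would obtain the p-value from a statistics package rather than by hand.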
Resumo:
PURPOSE To determine the impact of long-term storage on adhesion between titanium and zirconia using resin cements. MATERIALS AND METHODS Titanium grade 4 blocks were adhesively fixed onto zirconia disks with four resin cements: Panavia F 2.0 (Kuraray Europe), GC G-Cem (GC Europe), RelyX Unicem (3M ESPE), and SmartCem 2 (Dentsply DeguDent). Shear bond strength was determined after storage in a water bath at 37°C for 24 h and for 16, 90, and 150 days, and after 6000 thermal cycles between 5°C and 55°C. Fracture behavior was evaluated using scanning electron microscopy. RESULTS After storage for at least 90 days and after thermocycling, GC G-Cem (16.9 MPa and 15.1 MPa, respectively) and RelyX Unicem (10.8 MPa and 15.7 MPa, respectively) achieved higher shear bond strengths than SmartCem 2 (7.1 MPa and 4.0 MPa, respectively) and Panavia F 2.0 (4.1 MPa and 7.4 MPa, respectively). At day 150, GC G-Cem and RelyX Unicem caused exclusively mixed fractures. SmartCem 2 and Panavia F 2.0 showed adhesive fractures in one-third of the cases; all other fractures were of mixed type. After 24 h (GC G-Cem: 26.0 MPa, RelyX Unicem: 20.5 MPa, SmartCem 2: 16.1 MPa, Panavia F 2.0: 23.6 MPa) and 16 days (GC G-Cem: 12.8 MPa, RelyX Unicem: 14.2 MPa, SmartCem 2: 9.8 MPa, Panavia F 2.0: 14.7 MPa) of storage, shear bond strength was similar among the four cements. CONCLUSION Long-term storage and thermocycling differentially affect the bonding of resin cement between titanium and zirconia.
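The divergence the abstract describes can be made concrete by tabulating its reported group means and computing, for each cement, the fraction of the 24 h baseline bond strength retained after thermocycling. The values below are taken directly from the abstract; this is descriptive arithmetic on the published means only, not a reanalysis of the per-specimen data.

```python
# Mean shear bond strengths (MPa) as reported in the abstract,
# keyed by cement and storage/ageing condition.
shear_mpa = {
    "GC G-Cem":      {"24 h": 26.0, "16 d": 12.8, ">=90 d": 16.9, "thermocycled": 15.1},
    "RelyX Unicem":  {"24 h": 20.5, "16 d": 14.2, ">=90 d": 10.8, "thermocycled": 15.7},
    "SmartCem 2":    {"24 h": 16.1, "16 d": 9.8,  ">=90 d": 7.1,  "thermocycled": 4.0},
    "Panavia F 2.0": {"24 h": 23.6, "16 d": 14.7, ">=90 d": 4.1,  "thermocycled": 7.4},
}

# Fraction of the 24 h baseline retained after 6000 thermal cycles.
retention = {
    cement: values["thermocycled"] / values["24 h"]
    for cement, values in shear_mpa.items()
}
for cement, frac in retention.items():
    print(f"{cement:14s} retained {frac:.0%} of its 24 h bond strength")
```

The arithmetic mirrors the conclusion: all four cements start in a similar range at 24 h, but SmartCem 2 and Panavia F 2.0 retain only roughly a quarter to a third of their baseline strength after thermocycling, while GC G-Cem and RelyX Unicem retain well over half.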