279 results for Tooth transplantation


Relevance:

20.00%

Publisher:

Abstract:

Liver transplantation recipients, like other solid organ transplantation recipients, have an increased risk of dermatologic problems due to their long-term immunosuppression. They benefit from pre- and post-transplantation screening and management by a dermatologist, and dermatologic care should be integrated into the comprehensive, multidisciplinary care of liver transplantation recipients [1,2]. Cutaneous findings include aesthetic alterations, infections, precancerous lesions, and malignancies. The severity of skin alterations ranges from benign, unpleasant changes to life-threatening conditions [3-5]. In addition to skin cancer diagnosis and management, visits with a dermatologist serve to educate patients and improve their sun-protection behavior. Among all solid organ transplantations, liver transplantation requires the least amount of immunosuppression, sometimes even permitting its complete cessation [6]. As a result, patients who have undergone liver transplantation tend to have fewer dermatologic complications compared with other solid organ transplantation recipients [7]. However, due to the large volume of the liver, patients undergoing liver transplantation receive more donor lymphocytes than kidney, heart, or lung transplantation recipients. Because of the immunosuppression, the transplanted lymphocytes can proliferate and, in rare cases, trigger graft-versus-host disease [8,9]. This topic will provide an overview of dermatologic disorders that may be seen following liver transplantation. A detailed discussion of skin cancer following solid organ transplantation and the general management of patients following liver transplantation are discussed separately. (See "Development of malignancy following solid organ transplantation" and "Management of skin cancer in solid organ transplant recipients" and "Long-term management of adult liver transplant recipients".)


Purpose: To assess liver remnant volume regeneration and maintenance, as well as complications, in the long-term follow-up of donors after living donor liver transplantation (LDLT) using CT and MRI. Materials and Methods: 47 donors with a mean age of 33.5 years who donated liver tissue for transplantation and were available for follow-up imaging were included in this retrospective study. Contrast-enhanced CT and MR studies were acquired for routine follow-up. Two observers evaluated pre- and postoperative images regarding anatomy and pathological findings. Volumes were manually measured on contrast-enhanced images in the portal venous phase, and potential postoperative complications were documented. Pre- and postoperative liver volumes were compared to evaluate liver remnant regeneration. Results: 47 preoperative and 89 follow-up studies covered a mean period of 22.4 months (range: 1-84). After right liver lobe (RLL) donation, the mean liver remnant volume was 522.0 ml (± 144.0; 36.1 %; n = 18), after left lateral section (LLS) donation 1,121.7 ml (± 212.8; 79.9 %; n = 24), and after left liver lobe (LLL) donation 1,181.5 ml (± 279.5; 72.0 %; n = 5). Twelve months after donation, the liver remnant volumes were 87.3 % (RLL; ± 11.8; n = 11), 95.0 % (LLS; ± 11.6; n = 18), and 80.1 % (LLL; ± 2.0; n = 2) of the preoperative total liver volume. Rapid initial regeneration and maintenance at 80 % of the preoperative liver volume were observed over the total follow-up period. Minor postoperative complications were found early in 4 patients. No severe or late complications or mortality occurred. Conclusion: Rapid regeneration of liver remnant volumes in all donors and volume maintenance over the long-term follow-up period of up to 84 months without severe or late complications are important observations for assessing the safety of LDLT donors.
Key Points: Liver remnant volumes of LDLT donors rapidly regenerated after donation and volumes were maintained over the long-term follow-up period of up to 84 months without severe or late complications.
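The remnant percentages above follow from simple arithmetic on the measured volumes. A minimal sketch (our own illustration, not the study's analysis code; `remnant_percent` is a hypothetical helper) using the reported right-lobe means:

```python
# Illustrative only: how the reported remnant percentages relate to the
# measured volumes. Input values are the means quoted in the abstract;
# the function and variable names are our own, not the study's.

def remnant_percent(remnant_ml: float, preop_total_ml: float) -> float:
    """Liver remnant volume as a percentage of the preoperative total."""
    return 100.0 * remnant_ml / preop_total_ml

# After right-lobe (RLL) donation, a mean remnant of 522.0 ml at 36.1 %
# implies a mean preoperative total of roughly 522.0 / 0.361 ml.
preop_total = 522.0 / 0.361
print(round(preop_total))                             # ~1446 ml
print(round(remnant_percent(522.0, preop_total), 1))  # 36.1
```

The same arithmetic applied to the 12-month values (e.g. 87.3 % after RLL donation) recovers the regeneration fractions quoted in the abstract.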


The constant shortage of available organs is a major obstacle and limiting factor in heart transplantation; the discrepancy between the number of donors and potential recipients leads to waiting-list mortality of 10-12% per year in Europe and the USA. If adopted for heart transplantation, donation after circulatory determination of death (DCDD) would be expected to improve the availability of organs substantially for both adults and children. With DCDD, however, hearts to be transplanted undergo a period of warm ischaemia before procurement, which is of particular concern because tissue damage occurs rapidly and might be sufficient to preclude transplantation. Nonetheless, the heart is able to withstand limited periods of warm ischaemia, which could provide a window of opportunity for DCDD. Development of clinical approaches specifically for DCDD is critical for the exploitation of these organs, because current practices for donor heart procurement, evaluation, and storage have been optimized for conventional donation after brain death, without consideration of warm ischaemia before organ procurement. Establishment of clinical protocols and ethical and legal frameworks for DCDD of other organs is underway. This Review provides a timely evaluation of the potential for DCDD in heart transplantation.


Progressive interstitial fibrosis and tubular atrophy (IF/TA) is a leading cause of chronic allograft dysfunction. Increased extracellular matrix remodeling regulated by matrix metalloproteases (MMPs) and their inhibitors (TIMPs) has been implicated in the development of IF/TA. The aim of this study was to investigate whether urinary/serum MMPs/TIMPs correlate with subclinical IF/TA detected in surveillance biopsies within the first 6 months post-transplant. We measured eight different MMPs/TIMPs simultaneously in urine and serum samples from patients classified as normal histology (n=15), IF/TA 1 (n=15) and IF/TA 2-3 (n=10). There was no difference in urinary MMPs/TIMPs among the three groups, and only 1/8 serum MMPs/TIMPs (i.e. MMP-1) was significantly elevated in biopsies with IF/TA 2-3 (p=0.01). In addition, urinary/serum MMPs/TIMPs were not different between surveillance biopsies demonstrating an early development of IF/TA (i.e. delta IF/TA≥1 compared to a previous biopsy obtained three months before; n=11) and a stable grade of IF/TA (i.e. delta IF/TA=0; n=20). Next, we investigated whether urinary/serum MMP/TIMP levels are elevated during acute subclinical tubulitis in surveillance biopsies obtained within the first 6 months post-transplant (n=25). Compared to biopsies with normal histology, serum MMPs/TIMPs were not different; however, all urinary MMP/TIMP levels were numerically higher during subclinical tubulitis (MMP-1, MMP-7, TIMP-1 with p≤0.04). We conclude that urinary/serum MMPs/TIMPs correlate poorly with existing or early developing IF/TA in surveillance biopsies obtained within the first 6 months post-transplant. This could be explained by the dynamic process of extracellular matrix remodeling, which seems to be active during acute tubulo-interstitial injury/inflammation, but not in quiescent IF/TA.


BACKGROUND: Risk factors and outcomes of bronchial stricture after lung transplantation are not well defined. An association between acute rejection and development of stricture has been suggested in small case series. We evaluated this relationship using a large national registry. METHODS: All lung transplantations between April 1994 and December 2008 per the United Network for Organ Sharing (UNOS) database were analyzed. Generalized linear models were used to determine the association between early rejection and development of stricture after adjusting for potential confounders. The association of stricture with postoperative lung function and overall survival was also evaluated. RESULTS: Nine thousand three hundred thirty-five patients were included for analysis. The incidence of stricture was 11.5% (1,077/9,335), with no significant change in incidence during the study period (p=0.13). Early rejection was associated with a significantly greater incidence of stricture (adjusted odds ratio [AOR], 1.40; 95% confidence interval [CI], 1.22-1.61; p<0.0001). Male sex, restrictive lung disease, and pretransplantation requirement for hospitalization were also associated with stricture. Those who experienced stricture had a lower postoperative peak percent predicted forced expiratory volume in 1 second (FEV1) (median 74% versus 86% for bilateral transplants only; p<0.0001), shorter unadjusted survival (median 6.09 versus 6.82 years; p<0.001) and an increased risk of death after adjusting for potential confounders (adjusted hazard ratio 1.13; 95% CI, 1.03-1.23; p=0.007). CONCLUSIONS: Early rejection is associated with an increased incidence of stricture. Recipients with stricture demonstrate worse postoperative lung function and survival. Prospective studies may be warranted to further assess causality and the potential for coordinated rejection and stricture surveillance strategies to improve postoperative outcomes.
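The incidence figure above is plain proportion arithmetic, and the reported AOR is an adjusted version of the cross-product odds ratio from a 2×2 table. A minimal sketch (our own illustration; the 2×2 counts below are hypothetical, not registry data):

```python
# Illustrative only: the arithmetic behind the reported incidence, plus a
# crude (unadjusted) odds ratio from a hypothetical 2x2 table. The study's
# AOR additionally adjusts for confounders via regression.

def incidence_percent(events: int, total: int) -> float:
    """Simple incidence as a percentage."""
    return 100.0 * events / total

def crude_odds_ratio(a: int, b: int, c: int, d: int) -> float:
    """Cross-product odds ratio for a 2x2 table:
    a = exposed with outcome, b = exposed without,
    c = unexposed with outcome, d = unexposed without."""
    return (a * d) / (b * c)

# Stricture incidence reported in the abstract: 1,077 of 9,335 recipients.
print(round(incidence_percent(1077, 9335), 1))   # 11.5

# Hypothetical counts, only to show the cross-product formula:
print(round(crude_odds_ratio(20, 80, 15, 85), 2))
```

A crude OR computed this way would differ from the abstract's AOR of 1.40, which comes from a multivariable model.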


BACKGROUND  Polymorphisms in the interferon-λ (IFNL) 3/4 region have been associated with reduced hepatitis C virus clearance. We explored the role of such polymorphisms on the incidence of CMV infection in solid-organ transplant (SOT) recipients. METHODS  Caucasian patients participating in the Swiss Transplant Cohort Study in 2008-2011 were included. A novel functional TT/-G polymorphism (rs368234815) in the CpG region upstream of IFNL3 was investigated. RESULTS  A total of 840 SOT recipients at risk for CMV were included, among whom 373 (44%) received antiviral prophylaxis. The 12-months cumulative incidence of CMV replication and disease were 0.44 and 0.08, respectively. Patient homozygous for the minor rs368234815 allele (-G/-G) tended to have a higher cumulative incidence of CMV replication (SHR=1.30 [95%CI 0.97-1.74], P=0.07) compared to other patients (TT/TT or TT/-G). The association was significant among patients followed by a preemptive approach (SHR=1.46 [1.01-2.12], P=0.047), especially in patients receiving an organ from a seropositive donor (D+, SHR=1.92 [95%CI 1.30-2.85], P=0.001), but not among those who received antiviral prophylaxis (SHR=1.13 [95%CI 0.70-1.83], P=0.6). These associations remained significant in multivariate competing risk regression models. CONCLUSIONS  Polymorphisms in the IFNL3/4 region influence susceptibility to CMV replication in SOT recipients, particularly in patients not receiving antiviral prophylaxis.
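Cumulative incidence in the presence of competing risks (e.g. CMV replication vs. death) is typically estimated nonparametrically with the Aalen-Johansen estimator rather than 1 minus Kaplan-Meier. A minimal sketch of that estimator (our own code, not the study's; assumes distinct event times; cause 0 denotes censoring; data below are synthetic):

```python
# Minimal Aalen-Johansen cumulative incidence estimator for competing
# risks. Our own illustrative sketch; assumes all event times are
# distinct, with cause 0 meaning "censored". Synthetic data only.

def cumulative_incidence(times, causes, cause, horizon):
    """Estimated probability of failing from `cause` by `horizon`."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk = len(times)
    surv = 1.0   # overall event-free survival just before the current time
    cif = 0.0
    for i in order:
        if times[i] > horizon:
            break
        if causes[i] == cause:
            cif += surv / at_risk       # S(t-) * dN_cause(t) / Y(t)
        if causes[i] != 0:              # any event lowers overall survival
            surv *= 1.0 - 1.0 / at_risk
        at_risk -= 1                    # subject leaves the risk set
    return cif

# With no censoring, the estimate reduces to the empirical proportion:
times = [1, 2, 3, 4, 5]
causes = [1, 2, 1, 2, 1]                # cause 1 vs. competing cause 2
print(round(cumulative_incidence(times, causes, cause=1, horizon=10), 3))  # 0.6
```

The subdistribution hazard ratios (SHR) quoted in the abstract come from regression on this competing-risks scale (Fine-Gray type models), not from the estimator itself.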


BACKGROUND  Single nucleotide polymorphisms (SNPs) in immune genes have been associated with susceptibility to invasive mold infection (IMI) among hematopoietic stem cell transplant (HSCT) but not solid organ transplant (SOT) recipients. METHODS  24 SNPs from systematically selected genes were genotyped among 1101 SOT recipients (715 kidney, 190 liver, 102 lung, 79 heart, 15 other) from the Swiss Transplant Cohort Study. Associations between SNPs and the endpoints were assessed by log-rank tests and Cox regression models. Cytokine production upon Aspergillus stimulation was measured by ELISA in PBMCs from healthy volunteers and correlated with the relevant genotypes. RESULTS  Mold colonization (N=45) and proven/probable IMI (N=26) were associated with polymorphisms in interleukin-1 beta (IL1B, rs16944; log-rank test, recessive mode, colonization P=0.001 and IMI P=0.00005), interleukin-1 receptor antagonist (IL1RN, rs419598; P=0.01 and P=0.02) and β-defensin-1 (DEFB1, rs1800972; P=0.001 and P=0.0002, respectively). The associations with IL1B and DEFB1 remained significant in a multivariate regression model (IL1B rs16944 P=0.002; DEFB1 rs1800972 P=0.01). Presence of two copies of the rare allele of rs16944 or rs419598 was associated with reduced Aspergillus-induced IL-1β and TNFα secretion by PBMCs. CONCLUSIONS  Functional polymorphisms in IL1B and DEFB1 influence susceptibility to mold infection in SOT recipients. This observation may contribute to individual risk stratification.


OBJECTIVES To assess the available evidence on the effectiveness of accelerated orthodontic tooth movement through surgical and non-surgical approaches in orthodontic patients. METHODS Randomized controlled trials and controlled clinical trials were identified through electronic and hand searches (last update: March 2014). Orthognathic surgery, distraction osteogenesis, and pharmacological approaches were excluded. Risk of bias was assessed using the Cochrane risk of bias tool. RESULTS Eighteen trials involving 354 participants were included for qualitative and quantitative synthesis. Eight trials reported on low-intensity laser, one on photobiomodulation, one on pulsed electromagnetic fields, seven on corticotomy, and one on interseptal bone reduction. Two studies on corticotomy and two on low-intensity laser, which had low or unclear risk of bias, were mathematically combined using the random effects model. A higher canine retraction rate was evident with corticotomy during the first month of therapy (WMD=0.73; 95% CI: 0.28, 1.19, p<0.01) and with low-intensity laser (WMD=0.42 mm/month; 95% CI: 0.26, 0.57, p<0.001) over a period longer than 3 months. The quality of evidence supporting the interventions is moderate for laser therapy and low for corticotomy. CONCLUSIONS There is some evidence that low-intensity laser therapy and corticotomy are effective, whereas the evidence is weak for interseptal bone reduction and very weak for photobiomodulation and pulsed electromagnetic fields. Overall, the results should be interpreted with caution given the small number, quality, and heterogeneity of the included studies. Further research is required in this field with additional attention to application protocols, adverse effects, and cost-benefit analysis. 
CLINICAL SIGNIFICANCE From the qualitative and quantitative synthesis of the studies, it could be concluded that there is some evidence that low-intensity laser therapy and corticotomy are associated with accelerated orthodontic tooth movement, while further investigation is required before routine application.
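The weighted mean differences (WMD) above come from inverse-variance random-effects pooling. A minimal DerSimonian-Laird sketch (our own illustration, not the review's code; the study effects and variances below are hypothetical):

```python
# Illustrative DerSimonian-Laird random-effects pooling, the kind of
# model used to combine per-study mean differences into a WMD. All
# inputs below are hypothetical; names are our own.

def pool_random_effects(effects, variances):
    """Return the DerSimonian-Laird pooled effect estimate."""
    k = len(effects)
    w = [1.0 / v for v in variances]                       # fixed-effect weights
    sw = sum(w)
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sw
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - (k - 1)) / c)                     # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]         # random-effect weights
    return sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)

# Two hypothetical studies with identical effects pool to that effect:
print(round(pool_random_effects([0.42, 0.42], [0.01, 0.01]), 2))  # 0.42
```

When the studies disagree more than chance allows (Q above k-1), tau² becomes positive and the weights flatten, which is why random-effects intervals are wider than fixed-effect ones.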


OBJECTIVES To evaluate the effect of biannual fluoride varnish applications in preschool children as an adjunct to school-based oral health promotion and supervised tooth brushing with 1000 ppm fluoride toothpaste. METHODS 424 preschool children, 2-5 years of age, from 10 different preschools in Athens were invited to this double-blind randomized controlled trial, and 328 children completed the 2-year programme. All children received oral health education with hygiene instructions twice yearly and attended supervised tooth brushing once daily. The test group was treated with fluoride varnish (0.9% difluorosilane) biannually, while the control group had placebo applications. The primary endpoints were caries prevalence and increment; secondary outcomes were gingival health, mutans streptococci growth and salivary buffer capacity. RESULTS The groups were balanced at baseline, and no significant differences in caries prevalence or increment were displayed between the groups after 1 and 2 years, respectively. Fewer new pre-cavitated enamel lesions developed during the second year of the study, but the decrease did not reach statistical significance (p=0.05). The secondary endpoints were unaffected by the varnish treatments. CONCLUSIONS Under the present conditions, biannual fluoride varnish applications in preschool children did not show significant caries-preventive benefits when provided as an adjunct to school-based supervised tooth brushing with 1000 ppm fluoride toothpaste. CLINICAL SIGNIFICANCE In community-based caries prevention programmes for high-caries-risk preschool children, fluoride varnish may add little to caries prevention when 1000 ppm fluoride toothpaste is used daily.


The purpose of this study was to determine if storage for up to 4 h in human saliva results in a decrease of erosive tooth wear (ETW) and in an increase of surface microhardness (SMH) of enamel samples after an erosive attack with subsequent abrasion. Furthermore, we determined the impact of individual salivary parameters on ETW and SMH. Enamel samples were distributed into five groups: group 1 had neither erosion nor saliva treatment; groups 2-5 were treated with erosion, then group 2 was placed in a humid chamber and groups 3-5 were incubated in saliva for 30 min, 2 h, and 4 h, respectively. After erosion and saliva treatments, all groups were treated with abrasion. Surface microhardness and ETW were measured before and after erosion, incubation in saliva, and abrasion. Surface microhardness and ETW showed significant changes throughout the experiment: SMH decreased and ETW increased in groups 2-5, regardless of the length of incubation in saliva. The results of groups 3-5 (exposed to saliva) were not significantly different from those of group 2 (not exposed to saliva). Exposure of eroded enamel to saliva for up to 4 h was not able to increase SMH or reduce ETW. However, additional experiments with artificial saliva without proteins showed protection from erosive tooth wear. The recommendation to postpone toothbrushing of enamel after an erosive attack should be reconsidered.


Erosive tooth wear in children is a common condition. Besides the anatomical differences between deciduous and permanent teeth, additional histological differences may influence their susceptibility to dissolution. Considering laboratory studies alone, it is not clear whether deciduous teeth are more liable to erosive wear than permanent teeth. However, results from epidemiological studies imply that the primary dentition is less wear resistant than permanent teeth, possibly due to the overlapping of erosion with mechanical forces (such as attrition or abrasion). Although a low severity of tooth wear in children does not have a significant impact on their quality of life, early erosive damage to their permanent teeth may compromise their dentition for their entire lifetime and require extensive restorative procedures. Therefore, early diagnosis of erosive wear and adequate preventive measures are important. Knowledge of the aetiological factors of erosive wear is a prerequisite for preventive strategies. As in adults, extrinsic and intrinsic factors, or a combination of them, are possible causes of erosive tooth wear in children and adolescents. Several factors directly related to erosive tooth wear in children are discussed here, such as socio-economic aspects, gastroesophageal reflux or vomiting, and intake of some medications, as well as behavioural factors such as unusual eating and drinking habits. Additionally, frequent and excessive consumption of erosive foodstuffs and drinks is of importance.