62 results for Tree transplantation methods
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
AIM: We sought to investigate the prevalence of posttraumatic stress disorder, anxiety, and depression in patients and their partners after implantation of a mechanical assist device as a bridge to heart transplantation. METHODS: This was a retrospective assessment of 41 patients (age 46.3 +/- 12.0 years; male-female ratio, 38:3; time since transplantation, 55.3 +/- 34.2 months [range, 7-122 months]) and 27 partners (male-female ratio, 2:25) by standardized instruments (Impact of Event Scale, Hospital Anxiety and Depression Scale) in 2 university heart transplant centers (Vienna, Austria, and Munster, Germany). The duration of the support systems (MicroMed DeBakey-VAD in 17 patients, Novacor in 10, Thoratec in 8, TCI HeartMate in 5, and Berlin Heart Incor in 1 patient) ranged from 28 to 711 (176 +/- 146) days. RESULTS: None of the patients, but 23% of the partners (n = 6), met the criteria for posttraumatic stress disorder (Maercker cutoff >0). The Impact of Event Scale (IES) sum scores differed significantly between the 2 groups (mean +/- SD, 21.2 +/- 15.1 for the patients versus 38.1 +/- 27.8 for the partners; P = .001). Two percent of the patients, but 19% of the partners, showed mild to moderate depression; 4% of patients, but 23% of their partners, reported mild to moderate anxiety. None of the results were significantly influenced by the time since transplantation, patient age, diagnosis, type of assist device, or indication for heart transplantation. CONCLUSIONS: Although the patients were much closer to the life threat, their partners experienced significantly more psychologic distress, even in the long run. Our findings highlight the need for attention to the supporting persons.
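The patient-versus-partner comparison above is a two-sample test on summary statistics. As a minimal sketch (Welch's t statistic computed from the reported IES means, SDs, and group sizes; the abstract does not state which test the authors actually ran, so this is an assumption):

```python
import math

def welch_t(m1, s1, n1, m2, s2, n2):
    """Welch's t statistic and approximate (Welch-Satterthwaite)
    degrees of freedom from two-sample summary statistics."""
    v1, v2 = s1**2 / n1, s2**2 / n2
    t = (m1 - m2) / math.sqrt(v1 + v2)
    df = (v1 + v2)**2 / (v1**2 / (n1 - 1) + v2**2 / (n2 - 1))
    return t, df

# IES sum scores reported above: patients 21.2 +/- 15.1 (n = 41),
# partners 38.1 +/- 27.8 (n = 27)
t, df = welch_t(21.2, 15.1, 41, 38.1, 27.8, 27)
```

With these inputs the statistic is strongly negative (partners score higher), consistent with the reported P = .001.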
Abstract:
BACKGROUND: Activation of the complement system and of polymorphonuclear neutrophilic leukocytes plays a major role in mediating reperfusion injury after lung transplantation. We hypothesized that early interference with complement activation would reduce lung reperfusion injury after transplantation. METHODS: Unilateral left lung autotransplantation was performed in 6 sheep. After hilar stripping, the left lung was flushed with Euro-Collins solution and preserved for 2 hours in situ at 15 degrees C. After reperfusion, the right main bronchus and pulmonary artery were occluded, leaving the animal dependent on the reperfused lung (reperfused group). Animals in the C1-esterase inhibitor group (n = 6) received 200 U/kg body weight of C1-esterase inhibitor as a short infusion, half 10 minutes before and half 10 minutes after reperfusion. Controls (n = 6) underwent hilar preparation only. Pulmonary function was assessed by alveolar-arterial oxygen difference and pulmonary vascular resistance. The release of beta-N-acetylglucosaminidase served as an indicator of polymorphonuclear neutrophilic leukocyte activation, and extravascular lung water as an indicator of pulmonary edema formation. Biopsy specimens were taken from all groups 3 hours after reperfusion for light and electron microscopy. RESULTS: In the reperfused group, alveolar-arterial oxygen difference and pulmonary vascular resistance were significantly elevated after reperfusion, and all animals developed frank alveolar edema. The biochemical marker beta-N-acetylglucosaminidase showed significant leukocyte activation. In the C1-esterase inhibitor group, alveolar-arterial oxygen difference, pulmonary vascular resistance, and the level of polymorphonuclear neutrophilic leukocyte activation were significantly lower. CONCLUSIONS: Treatment with C1-esterase inhibitor reduces reperfusion injury and improves pulmonary function in this experimental model.
Abstract:
BACKGROUND: Osteoporosis has been recognized as an important side effect of long-term and of pulsed steroid application after heart transplantation. METHODS: In June 1989 a prospective clinical trial was started to study bone demineralization by quantitative computed tomographic scan. All patients received vitamin D and calcium. In group I (n = 30), synthetic calcitonin (40 Medical Research Council Standard Units subcutaneously per day) was administered in 14-day cycles, whereas group II patients (n = 31) received a placebo preparation. Repeat trabecular and cortical quantitative computed tomographic scans of the thoracic (T12) and lumbar spine (L1, L2, L3) were obtained within 48 weeks after heart transplantation. RESULTS: Expressed as the means of T12, L1, L2, and L3, trabecular bone density in the calcitonin group decreased significantly from 100+/-24 to 79+/-29 mg/mL within 3 weeks after heart transplantation, followed by a further reduction to 67+/-29 mg/mL after 3 months. The values for cortical bone density in the calcitonin group decreased significantly from 229+/-37 to 202+/-40 mg/mL 3 weeks after heart transplantation. Comparable results were obtained in the placebo group. In both groups bone density remained stable thereafter, and intergroup differences were not statistically significant. CONCLUSIONS: In heart transplant recipients, progressive trabecular bone demineralization is limited to the first 3 postoperative months; thereafter, bone density remains stable. A positive effect of synthetic calcitonin in addition to prophylactic calcium and vitamin D application could not be proved by repeat quantitative computed tomography.
Abstract:
RATIONALE: The use of 6-minute-walk distance (6MWD) as an indicator of exercise capacity to predict postoperative survival in lung transplantation has not previously been well studied. OBJECTIVES: To evaluate the association between 6MWD and postoperative survival following lung transplantation. METHODS: Adult, first-time, lung-only transplantations per the United Network for Organ Sharing database from May 2005 to December 2011 were analyzed. Kaplan-Meier methods and Cox proportional hazards modeling were used to determine the association between preoperative 6MWD and post-transplant survival after adjusting for potential confounders. A receiver operating characteristic curve was used to determine the 6MWD value that provided maximal separation in 1-year mortality. A subanalysis was performed to assess the association between 6MWD and post-transplant survival by disease category. MEASUREMENTS AND MAIN RESULTS: A total of 9,526 patients were included for analysis. The median 6MWD was 787 ft (25th-75th percentiles, 450-1,082 ft). Increasing 6MWD was associated with a significantly lower overall hazard of death (P < 0.001). A continuous increase in walk distance through 1,200-1,400 ft conferred an incremental survival advantage. Although 6MWD strongly correlated with survival, the ability of a single dichotomous value to predict outcomes was limited. All disease categories demonstrated significantly longer survival with increasing 6MWD (P ≤ 0.009) except pulmonary vascular disease (P = 0.74); however, the low volume in this category (n = 312; 3.3%) may limit the ability to detect an association. CONCLUSIONS: 6MWD is significantly associated with post-transplant survival and is best incorporated into transplant evaluations on a continuous basis, given the limited ability of a single, dichotomous value to predict outcomes.
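The receiver-operating-characteristic step above — picking the single walk-distance cutoff that best separates 1-year mortality — can be sketched as a Youden-index search. The cohort below is invented for illustration, not the UNOS data:

```python
def best_cutoff(distances, died):
    """Return the cutoff maximizing Youden's J = sensitivity + specificity - 1,
    treating a shorter walk distance as the 'positive' (higher-risk) test."""
    best_j, best_c = -1.0, None
    for c in sorted(set(distances)):
        tp = sum(1 for d, y in zip(distances, died) if d <= c and y)
        fn = sum(1 for d, y in zip(distances, died) if d > c and y)
        tn = sum(1 for d, y in zip(distances, died) if d > c and not y)
        fp = sum(1 for d, y in zip(distances, died) if d <= c and not y)
        sens = tp / (tp + fn) if tp + fn else 0.0
        spec = tn / (tn + fp) if tn + fp else 0.0
        j = sens + spec - 1
        if j > best_j:
            best_j, best_c = j, c
    return best_c, best_j

# toy cohort: 6MWD (ft) and 1-year death indicator
dist = [200, 350, 500, 650, 800, 950, 1100, 1250]
died = [1,   1,   1,   0,   0,   0,   0,    0]
cutoff, j = best_cutoff(dist, died)
```

The abstract's point is precisely that in real data no such clean cutoff exists, which is why 6MWD is better used as a continuous covariate in the Cox model.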
Abstract:
BACKGROUND/AIMS: Skin tumours, in particular squamous-cell carcinomas (SCC), are the most common malignant conditions developing in transplant recipients. The aim of this study was to investigate the frequency and type of skin cancer in patients receiving immunosuppressive therapy after organ transplantation. METHODS: Multivariate logistic regression analysis was performed on data from 243 renal transplant patients who attended the dermatology outpatient clinic for the first time after transplantation in the period January 2002 to October 2005. RESULTS: We found an increased risk of actinic keratosis (AK) and SCC in renal transplant recipients, with a basal cell carcinoma (BCC)/SCC ratio of 1:7. Older patients had AK (odds ratio [OR] 1.11, 95% confidence interval [CI] 1.06-1.15; p <0.0001) and SCC (OR 1.14, CI 1.07-1.22; p <0.0001) more frequently than younger patients. Men had AK (OR 0.19, CI 0.08-0.45; p = 0.0002) and SCC (OR 0.25, CI 0.07-0.89; p = 0.0332) more frequently than women. The duration of immunosuppressive therapy correlated significantly with the numbers of AKs (OR 1.15, CI 1.08-1.24; p <0.0001) and SCCs (OR 1.16, CI 1.05-1.28; p = 0.0025), and patients with fair skin had more AKs (OR 0.31, CI 0.14-1.24; p <0.0001) and SCCs (OR 0.11, CI 0.02-0.52; p = 0.0054) than darker-skinned patients. We could not identify any specific immunosuppressive drug as a distinct risk factor for AK or non-melanoma skin cancer (NMSC). CONCLUSION: Skin cancers are increased in the renal transplant population. The main risk factors are a fair skin type and a long duration of immunosuppressive therapy. A follow-up programme is necessary for early detection of skin cancer and precancerous conditions, and preventive strategies should include specialist dermatological monitoring and self-examination.
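The odds ratios with 95% confidence intervals above come from logistic regression; for a single binary exposure the same quantities can be read off a 2×2 table with a Wald interval. The counts below are invented for illustration, not taken from the study:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR and Wald 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# hypothetical counts: SCC among fair-skinned vs darker-skinned recipients
or_, lo, hi = odds_ratio_ci(20, 80, 5, 138)
```

A multivariate model, as used in the study, additionally adjusts each OR for the other covariates (age, sex, therapy duration).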
Abstract:
Background and aims Differences in the chemical composition of root compounds and root systems among tree species may affect organic matter (OM) distribution, source and composition in forest soils. The objective of this study was to elucidate the contribution of species-specific cutin and suberin biomarkers, as proxies for shoot- and root-derived organic carbon (OC), to soil OM at different depths and with increasing distance to the stems of four different tree species. Methods The contribution of cutin- and suberin-derived lipids to OM in a Cutanic Alisol was analyzed with increasing soil depth and distance to the stems of Fagus sylvatica L., Picea abies (L.) Karst., Quercus robur L. and Pseudotsuga menziesii (Mirb.) Franco. Cutin and suberin monomers of plants and soils were analyzed by alkaline hydrolysis and subsequent gas chromatography–mass spectrometry. Results The amount and distribution of suberin-derived lipids in soil clearly reflected the specific root system of the different tree species. The amount of cutin-derived lipids decreased strongly with soil depth, indicating that the input of leaf/needle material is restricted to the topsoil. In contrast to the suberin-derived lipids, the spatial pattern of cutin monomer contribution to soil OM did not depend on tree species. Conclusions Our results document the importance of tree species as a main factor controlling the composition and distribution of OM in forest soils. They reveal the impact of tree species on root-derived OM distribution and the necessity to distinguish among different zones when studying soil OM storage in forests.
Abstract:
QUESTIONS UNDER STUDY / PRINCIPLES: Interest groups advocate centre-specific outcome data as a useful tool for patients in choosing a hospital for their treatment and for decision-making by politicians and the insurance industry. Haematopoietic stem cell transplantation (HSCT) requires significant infrastructure and represents a cost-intensive procedure; it therefore qualifies as a prime target for such a policy. METHODS: We made use of the comprehensive database of the Swiss Blood Stem Cells Transplant Group (SBST) to evaluate the potential use of centre-specific mortality rates. Nine institutions reported a total of 4717 HSCT - 1427 allogeneic (30.3%), 3290 autologous (69.7%) - in 3808 patients between 1997 and 2008. Data were analysed for survival- and transplantation-related mortality (TRM) at day 100 and at 5 years. RESULTS: The data showed marked and significant differences between centres in unadjusted analyses. These differences were absent or marginal when the results were adjusted for disease, year of transplant and the EBMT risk score (a score incorporating patient age, disease stage, time interval between diagnosis and transplantation and, for allogeneic transplants, donor type and donor-recipient gender combination) in a multivariable analysis. CONCLUSIONS: These data indicate comparable quality among centres in Switzerland. They show that comparison of crude centre-specific outcome data without adjustment for the patient mix may be misleading. Mandatory data collection and systematic review of all cases within a comprehensive quality management system might, in contrast, serve as a model for ascertaining the quality of other cost-intensive therapies in Switzerland.
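The EBMT risk score described parenthetically above is a simple additive score over the listed factors. A hypothetical sketch of that kind of tally is below; the point bands used here follow the commonly cited Gratwohl formulation and are an assumption for illustration only, so they should be checked against the published definition before any real use:

```python
def ebmt_risk_score(age, stage, interval_months, donor, female_donor_male_recipient):
    """Additive risk score (0-7) in the style of the EBMT/Gratwohl score.
    Point bands here are assumed for illustration, not authoritative.
    stage: 'early' | 'intermediate' | 'advanced'
    donor: 'hla_identical_sibling' or anything else (unrelated/other)."""
    score = 0
    score += 0 if age < 20 else 1 if age < 40 else 2      # patient age band
    score += {'early': 0, 'intermediate': 1, 'advanced': 2}[stage]
    score += 0 if interval_months < 12 else 1             # diagnosis-to-transplant interval
    score += 0 if donor == 'hla_identical_sibling' else 1 # donor type
    score += 1 if female_donor_male_recipient else 0      # gender combination
    return score

score = ebmt_risk_score(45, 'intermediate', 8, 'hla_identical_sibling', False)
```

Adjusting centre comparisons for such a score (plus disease and transplant year) is what collapsed the apparent between-centre differences in the study.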
Abstract:
OBJECTIVES: To determine the prevalence and characteristics of end-stage renal disease (ESRD) [dialysis and renal transplantation (RT)] among European HIV-infected patients. METHODS: Cross-sectional multicenter survey of EuroSIDA clinics during 2008. RESULTS: The prevalence of ESRD was 0.5%. Of 122 patients with ESRD, 96 were on dialysis and 26 had received a RT. Median age was 47 years; 73% were male and 43% were black. Median duration of HIV infection was 11 years. Thirty-three percent had prior AIDS; 91% were receiving antiretrovirals; and 88% had an undetectable viral load. Median CD4 T-cell count was 341 cells per cubic millimetre; 20.5% had hepatitis C coinfection. The most frequent causes of ESRD were HIV-associated nephropathy (46%) and other glomerulonephritides (28%). Hemodialysis (93%) was the most common dialysis modality; 34% of patients were on the RT waiting list. Poor HIV control was the reason for exclusion from the RT waiting list in 22.4% of cases. All RT recipients were alive at the time of the survey. Acute rejection was reported in 8 patients (30%), and a functioning graft was present in 21 (80%). CONCLUSIONS: This is the first multinational cross-sectional study of ESRD among the European HIV population. A low prevalence of ESRD was found. Two-thirds of patients were excluded from RT for non-HIV/AIDS-related pathologies. Most patients had a functioning graft despite a high acute rejection rate.
Abstract:
Renal transplantation has become an established option for renal replacement therapy in many patients with end-stage renal disease. Living donation is a possibility for timely transplantation, but is hampered in 20% of all potential donor-recipient pairs by incompatible blood groups. ABO-incompatible renal transplantation overcomes this hurdle with acceptable allograft survival compared to conventional living-donor renal transplantation. During the last 10 years, the number of patients older than 65 years awaiting renal transplantation has nearly doubled. The decision to transplant these patients, and their medical treatment, is a growing challenge in transplantation. On the other hand, donor age is increasing, with potential negative consequences for the long-term outcome of organ function. Antibody-mediated humoral rejection has lately been identified as an important cause of allograft failure during long-term follow-up of renal transplant patients. New immunological methods to detect donor-specific antibodies, such as solid-phase assays (Luminex®), have increased the knowledge and understanding of humoral rejection processes. This will hopefully lead to modified immunosuppressive strategies to minimize organ failure due to chronic rejection.
Abstract:
PURPOSE: To prospectively evaluate whether intravenous morphine co-medication improves bile duct visualization in dual-energy CT-cholangiography. MATERIALS AND METHODS: Forty potential donors for living-related liver transplantation underwent CT-cholangiography with infusion of a hepatobiliary contrast agent over 40 min. Twenty minutes after the beginning of the contrast agent infusion, either normal saline (n=20 patients; control group [CG]) or morphine sulfate (n=20 patients; morphine group [MG]) was injected. Forty-five minutes after initiation of the contrast agent, a dual-energy CT acquisition of the liver was performed. Applying dual-energy post-processing, pure iodine images were generated. Primary study goals were determination of bile duct diameters and visualization scores (on a scale of 0 to 3: 0, not visualized; 3, excellent visualization). RESULTS: Bile duct visualization scores for second-order and third-order branch ducts were significantly higher in the MG than in the CG (2.9±0.1 versus 2.6±0.2 [P<0.001] and 2.7±0.3 versus 2.1±0.6 [P<0.01], respectively). Bile duct diameters for the common duct and main ducts were significantly larger in the MG than in the CG (5.9±1.3 mm versus 4.9±1.3 mm [P<0.05] and 3.7±1.3 mm versus 2.6±0.5 mm [P<0.01], respectively). CONCLUSION: Intravenous morphine co-medication significantly improved biliary visualization on dual-energy CT-cholangiography in potential donors for living-related liver transplantation.
Abstract:
Biliary cast syndrome (BCS) is the presence of casts within the intrahepatic or extrahepatic biliary system after orthotopic liver transplantation. Our work compares two percutaneous methods for BCS treatment: the mechanical cast-extraction technique (MCE) versus the hydraulic cast-extraction (HCE) technique using a rheolytic system.
Abstract:
OBJECTIVES: Donation after circulatory declaration of death (DCDD) could significantly improve the number of cardiac grafts for transplantation. Graft evaluation is particularly important in the setting of DCDD given that conditions of cardio-circulatory arrest and warm ischaemia differ, leading to variable tissue injury. The aim of this study was to identify, at the time of heart procurement, means to predict contractile recovery following cardioplegic storage and reperfusion using an isolated rat heart model. Identification of reliable approaches to evaluate cardiac grafts is key in the development of protocols for heart transplantation with DCDD. METHODS: Hearts isolated from anaesthetized male Wistar rats (n = 34) were exposed to various perfusion protocols. To simulate DCDD conditions, rats were exsanguinated and maintained at 37°C for 15-25 min (warm ischaemia). Isolated hearts were perfused with modified Krebs-Henseleit buffer for 10 min (unloaded), arrested with cardioplegia, stored for 3 h at 4°C and then reperfused for 120 min (unloaded for 60 min, then loaded for 60 min). Left ventricular (LV) function was assessed using an intraventricular micro-tip pressure catheter. Statistical significance was determined using the non-parametric Spearman rho correlation analysis. RESULTS: After 120 min of reperfusion, recovery of LV work measured as developed pressure (DP)-heart rate (HR) product ranged from 0 to 15 ± 6.1 mmHg beats min(-1) 10(-3) following warm ischaemia of 15-25 min. Several haemodynamic parameters measured during early, unloaded perfusion at the time of heart procurement, including HR and the peak systolic pressure-HR product, correlated significantly with contractile recovery after cardioplegic storage and 120 min of reperfusion (P < 0.001). Coronary flow, oxygen consumption and lactate dehydrogenase release also correlated significantly with contractile recovery following cardioplegic storage and 120 min of reperfusion (P < 0.05). 
CONCLUSIONS: Haemodynamic and biochemical parameters measured at the time of organ procurement could serve as predictive indicators of contractile recovery. We believe that evaluation of graft suitability is feasible prior to transplantation with DCDD, and may, consequently, increase donor heart availability.
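The non-parametric Spearman rho analysis used above reduces to the Pearson correlation of rank vectors. A minimal pure-Python sketch, with ties handled by average ranks; the data are toy values, not the study's measurements:

```python
def average_ranks(xs):
    """Ranks starting at 1, ties receiving the average of their positions."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of 1-based positions i+1..j+1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman_rho(x, y):
    """Spearman's rho = Pearson correlation of the rank vectors."""
    rx, ry = average_ranks(x), average_ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

# toy example: heart rate at procurement vs. recovered LV work
hr = [220, 250, 180, 300, 270]
recovery = [5.0, 8.0, 1.0, 15.0, 10.0]
```

Because rho depends only on ranks, it is robust to the wide spread of recovery values (0 to 15 mmHg beats min(-1) 10(-3)) reported above.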
Abstract:
BACKGROUND: Individual adaptation of the processed patient blood volume (PBV) should reduce the number and/or duration of autologous peripheral blood progenitor cell (PBPC) collections. STUDY DESIGN AND METHODS: The durations of leukapheresis procedures were adapted by means of an interim analysis of harvested CD34+ cells so as to obtain the intended CD34+ yield in as few and as short leukapheresis procedures as possible. Absolute efficiency (AE; CD34+/kg body weight) and relative efficiency (RE; total CD34+ yield of a single apheresis / total number of preapheresis CD34+) were calculated, an RE greater than 1 being taken to indicate intraapheresis recruitment, and yield prediction models for adults were generated. RESULTS: A total of 196 adults required a total of 266 PBPC collections. The median AE was 7.99 x 10(6), and the median RE was 1.76. The prediction model for AE showed a satisfactory predictive value for preapheresis CD34+ only. The prediction model for RE also showed a low predictive value (R2 = 0.36). Twenty-eight children underwent 44 PBPC collections; the median AE was 12.13 x 10(6), and the median RE was 1.62. Major complications comprised bleeding episodes related to central venous catheters (n = 4) and severe thrombocytopenia of less than 10 x 10(9) per L (n = 16). CONCLUSION: A CD34+ interim analysis is a suitable tool for individual adaptation of the duration of leukapheresis. During leukapheresis, substantial recruitment of CD34+ cells was observed, resulting in an RE of greater than 1 in more than 75 percent of patients. The upper limit of processed PBV showing intraapheresis CD34+ recruitment is higher than in a standard large-volume leukapheresis. Therefore, a reduction of the individually needed number of PBPC collections by means of a further escalation of the processed PBV seems possible.
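The two efficiency measures defined above reduce to simple ratios. A minimal sketch with invented numbers (the patient values below are hypothetical, chosen only to land near the reported medians):

```python
def apheresis_efficiency(cd34_yield_total, body_weight_kg, cd34_preapheresis_total):
    """Absolute efficiency (AE): CD34+ cells collected per kg body weight.
    Relative efficiency (RE): total CD34+ yield of the single apheresis
    divided by the patient's total preapheresis CD34+ count.
    RE > 1 is read as intraapheresis recruitment of CD34+ cells."""
    ae = cd34_yield_total / body_weight_kg
    re = cd34_yield_total / cd34_preapheresis_total
    return ae, re

# hypothetical patient: 5.6e8 CD34+ cells collected, 70 kg body weight,
# 3.2e8 CD34+ cells circulating before apheresis
ae, re = apheresis_efficiency(5.6e8, 70.0, 3.2e8)
```

An RE above 1, as here, means more CD34+ cells were harvested than were circulating at the start, which is the recruitment phenomenon the study exploits when extending processed PBV.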