843 results for COLLABORATIVE TRANSPLANT
Abstract:
The toxicity of long-term immunosuppressive therapy has become a major concern in the long-term follow-up of heart transplant recipients. In this respect, the quality of renal function is undoubtedly linked to cyclosporin A (CsA) drug levels. In cardiac transplantation, specific CsA trough levels have historically been maintained between 250 and 350 µg/L in many centers without direct evidence for the necessity of such high levels while using triple-drug immunosuppression. This retrospective analysis compares the incidence of acute and chronic graft rejection as well as overall mortality between groups of patients with high (250 to 350 µg/L) and low (150 to 250 µg/L) specific CsA trough levels. A total of 332 patients who underwent heart transplantation between October 1985 and October 1992 with a minimum follow-up of 30 days were included in this study (46 women and 276 men; aged 44 ± 12 years; mean follow-up, 1,122 ± 777 days). Standard triple-drug immunosuppression included first-year specific CsA target trough levels of 250 to 300 µg/L. Patients were grouped according to their average creatinine level in the first postoperative year (group I, < 130 µmol/L, n = 234; group II, ≥ 130 µmol/L, n = 98). The overall 5-year survival excluding the early 30-day mortality was 92% (group I, 216/232) and 91% (group II, 89/98), with 75% of the mortality due to chronic rejection. The rate of rejection for the entire follow-up period was similar in both groups (first year: group I, 3.2 ± 2.6 rejections/patient/year; group II, 3.6 ± 2.7 rejections/patient/year; p = not significant). (Abstract truncated at 250 words.)
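At its core, the comparison above is a two-group contrast of rejection rates. The sketch below is purely illustrative: it uses synthetic data generated to match the reported group sizes, means, and standard deviations, and a Welch t-test stands in for the (unstated) test the authors used.

```python
# Illustrative two-group comparison of rejection rates (episodes/patient/year).
# Data are synthetic, generated to match the means/SDs quoted in the abstract.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
group_i = rng.normal(loc=3.2, scale=2.6, size=234)   # creatinine < 130 µmol/L
group_ii = rng.normal(loc=3.6, scale=2.7, size=98)   # creatinine >= 130 µmol/L

t, p = stats.ttest_ind(group_i, group_ii, equal_var=False)  # Welch's t-test
print(f"t = {t:.2f}, p = {p:.3f}")  # large p -> consistent with 'not significant'
```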
Abstract:
1. Habitat fragmentation and variation in habitat quality can both affect plant performance, but their effects have rarely been studied in combination. We thus examined plant performance in response to differences in habitat quality for a species subject to habitat fragmentation, the common but declining perennial herb Lychnis flos-cuculi. 2. We reciprocally transplanted plants between 15 fen grasslands in north-east Switzerland and recorded plant performance for 4 years. 3. Variation between the 15 target sites was the most important factor and affected all measures of plant performance in all years. This demonstrates the importance of plastic responses to habitat quality for plant performance. 4. Plants from smaller populations produced fewer rosettes than plants from larger populations in the first year of the replant-transplant experiment. 5. Plant performance decreased with increasing ecological difference between grassland of origin and target grassland, indicating adaptation to ecological conditions. In contrast, plant performance was not influenced by microsatellite distance and hardly at all by geographic distance between grassland of origin and target grassland. 6. Plants originating from larger populations were better able to cope with larger ecological differences between transplantation site and site of origin. 7. Synthesis: In addition to the direct effects of target grasslands, both habitat fragmentation, through reduced population size, and adaptation to habitats of different quality contributed to the performance of L. flos-cuculi. This underlines that habitat fragmentation also affects species that are still common. Moreover, it suggests that restoration projects involving L. flos-cuculi should use plant material from large populations living in habitats similar to the restoration site. Finally, our results raise the question of whether plants in small habitat remnants will be able to cope with future environmental change.
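Point 5 above describes a regression-type relationship: performance declining with ecological distance between origin and target grassland. A minimal sketch with hypothetical data, using ordinary least squares as a stand-in for the study's actual analysis:

```python
# Hypothetical illustration: plant performance regressed on the ecological
# difference between grassland of origin and target grassland.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
eco_distance = rng.uniform(0.0, 1.0, size=200)                  # synthetic distances
performance = 5.0 - 2.0 * eco_distance + rng.normal(0, 1, 200)  # declining trend

fit = sm.OLS(performance, sm.add_constant(eco_distance)).fit()
print(fit.params)  # negative slope: performance falls as ecological distance grows
```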
Abstract:
BACKGROUND: The CD4 cell count at which combination antiretroviral therapy should be started is a central, unresolved issue in the care of HIV-1-infected patients. In the absence of randomised trials, we examined this question in prospective cohort studies. METHODS: We analysed data from 18 cohort studies of patients with HIV. Antiretroviral-naive patients from 15 of these studies were eligible for inclusion if they had started combination antiretroviral therapy (while AIDS-free, with a CD4 cell count less than 550 cells per microL, and with no history of injecting drug use) on or after Jan 1, 1998. We used data from patients followed up in seven of the cohorts in the era before the introduction of combination therapy (1989-95) to estimate distributions of lead times (from the first CD4 cell count measurement in an upper range to the upper threshold of a lower range) and unseen AIDS and death events (occurring before the upper threshold of a lower CD4 cell count range is reached) in the absence of treatment. These estimations were used to impute completed datasets in which lead times and unseen AIDS and death events were added to data for treated patients in deferred therapy groups. We compared the effect of deferred initiation of combination therapy with immediate initiation on rates of AIDS and death, and on death alone, in adjacent CD4 cell count ranges of width 100 cells per microL. FINDINGS: Data were obtained for 21 247 patients who were followed up during the era before the introduction of combination therapy and 24 444 patients who were followed up from the start of treatment. Deferring combination therapy until a CD4 cell count of 251-350 cells per microL was associated with higher rates of AIDS and death than starting therapy in the range 351-450 cells per microL (hazard ratio [HR] 1.28, 95% CI 1.04-1.57). The adverse effect of deferring treatment increased with decreasing CD4 cell count threshold. Deferred initiation of combination therapy was also associated with higher mortality rates, although effects on mortality were less marked than effects on AIDS and death (HR 1.13, 0.80-1.60, for deferred initiation of treatment at CD4 cell count 251-350 cells per microL compared with initiation at 351-450 cells per microL). INTERPRETATION: Our results suggest that 350 cells per microL should be the minimum threshold for initiation of antiretroviral therapy, and should help to guide physicians and patients in deciding when to start treatment.
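The hazard ratios reported above come from survival models fitted to the imputed datasets. As a hedged illustration only, the sketch below fits a Cox proportional hazards model contrasting deferred with immediate initiation on synthetic data using the lifelines library; the lead-time and unseen-event imputation the study describes is omitted, and all column names are hypothetical.

```python
# Sketch: Cox model for deferred vs. immediate ART initiation (synthetic data).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 1000
df = pd.DataFrame({
    "deferred": rng.integers(0, 2, n),  # 1 = deferred to 251-350, 0 = start at 351-450
    "time": rng.exponential(5.0, n),    # years to AIDS/death or censoring
    "event": rng.integers(0, 2, n),     # 1 = AIDS or death observed
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
print(cph.hazard_ratios_)  # cf. the reported HR 1.28 (95% CI 1.04-1.57) for deferral
```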
Abstract:
End-stage renal disease is a major complication after orthotopic liver transplantation (OLT). Vasoconstriction of renal arterial vessels because of calcineurin inhibitor (CNI) treatment plays a pivotal role in the development of renal insufficiency following OLT. Renal resistance can be measured non-invasively by determining the resistance index (RI) of segmental arteries by color-coded duplex ultrasonography, a measure with predictive value for future renal failure. Sixteen OLT patients on long-term CNI therapy were recruited prospectively and randomly assigned either to receive the mTOR inhibitor sirolimus (SRL) or to continue on CNI treatment, and were followed for one year. Serum creatinine (crea) declined after conversion to SRL, whereas it tended to increase in patients remaining on CNI (mean Δcrea SRL: -27, -18, -18, -15 µmol/L; mean Δcrea CNI: 4, 5, 8, 11 µmol/L at 1, 3, 6, 12 months; p = 0.02). RI improved after switching to SRL and was lower on SRL than on CNI (mean ΔRI SRL: -0.04, -0.04, -0.03, -0.03; mean ΔRI CNI: -0.006, 0.004, -0.007, -0.01 after 1, 3, 6, 12 months; p = 0.016). Individual changes of RI correlated significantly with individual changes of crea (r = 0.54, p < 0.001). Conversion from CNI to SRL can ameliorate renal function, accompanied by a reduction of intrarenal RI, after OLT.
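The resistance index used in this study is the standard Doppler resistive index, RI = (peak systolic velocity - end-diastolic velocity) / peak systolic velocity. The example velocities below are hypothetical, chosen only to produce a change of the order reported after conversion to sirolimus.

```python
# Pourcelot resistive index from Doppler velocities (hypothetical values, cm/s).
def resistance_index(psv: float, edv: float) -> float:
    """RI = (peak systolic velocity - end-diastolic velocity) / peak systolic velocity."""
    if psv <= 0:
        raise ValueError("peak systolic velocity must be positive")
    return (psv - edv) / psv

before = resistance_index(psv=60.0, edv=18.0)  # 0.70
after = resistance_index(psv=60.0, edv=20.0)   # ~0.67
print(f"RI before: {before:.2f}, after: {after:.2f}, change: {after - before:+.2f}")
```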
Abstract:
BACKGROUND: The role of human herpesvirus (HHV)-8 in the pathogenesis of multiple myeloma and its pre-malignant state of monoclonal gammopathy is unclear. HHV-8 is transmitted by organ transplantation, representing a unique model with which to investigate primary HHV-8 infection. METHODS: The authors studied the incidence of clonal gammopathy in renal transplant recipients and correlated it with previous and recent HHV-8 infection. RESULTS: Clonal gammopathy was observed in 31 of 162 (19%) HHV-8-seronegative patients, in 5 of 17 (29%) HHV-8-seropositive patients, and in 9 of 24 (38%) HHV-8 seroconverters within 5 years after transplantation. Gammopathy was often transient, and no progression to myeloma was observed. Two patients with persistent gammopathy developed B-cell lymphoma. In a logistic regression model, HHV-8 serostatus of the graft recipient was significantly associated with subsequent development of gammopathy, with a relative risk (RR) of 1.9 and a 95% confidence interval (CI) of 0.5 to 6.4 for an HHV-8-seropositive recipient and an RR of 2.9 and a 95% CI of 1.01 to 8.0 for seroconverters as compared with baseline (HHV-8 seronegative). Other significant variables were cytomegalovirus (CMV) serostatus and the intensity of immunosuppression (RR of 10.4 and 95% CI of 2.6-41.7 for a CMV-negative recipient with a CMV-positive donor vs. a CMV-negative recipient with a CMV-negative donor and RR of 17.6 and 95% CI of 2.0-150.8 if OKT3 was used vs. no use of antilymphocytic substances). CONCLUSIONS: Transplant recipients with HHV-8 infection are more likely to develop clonal gammopathy. However, this risk is much lower than the risk conferred by CMV infection and antilymphocytic therapy, arguing against a major role of HHV-8 infection in the pathogenesis of clonal plasma cell proliferation.
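A minimal sketch of the kind of logistic regression described above, on synthetic data; the predictor names are illustrative, not the authors', and the exponentiated coefficients are odds ratios standing in for the reported relative risks.

```python
# Sketch: logistic regression of clonal gammopathy on serostatus and therapy.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 203  # 162 seronegative + 17 seropositive + 24 seroconverters in the study
df = pd.DataFrame({
    "gammopathy": rng.integers(0, 2, n),
    "hhv8_seropositive": rng.integers(0, 2, n),
    "hhv8_seroconverter": rng.integers(0, 2, n),
    "cmv_mismatch": rng.integers(0, 2, n),  # donor CMV+ / recipient CMV-
    "okt3": rng.integers(0, 2, n),
})

fit = smf.logit(
    "gammopathy ~ hhv8_seropositive + hhv8_seroconverter + cmv_mismatch + okt3",
    data=df,
).fit(disp=False)
print(np.exp(fit.params))  # odds ratios per predictor
```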
Abstract:
OBJECTIVE: Nursing in 'live islands' and routine high-dose intravenous immunoglobulins after allogeneic hematopoietic stem cell transplantation were abandoned by many teams in view of limited evidence and high costs. METHODS: This retrospective single-center study examines the impact of the change from nursing in 'live islands' to care in single rooms (SR), and from high-dose to targeted intravenous immunoglobulins (IVIG), on the mortality and infection rate of adult patients receiving an allogeneic stem cell or bone marrow transplantation, in two steps and three time cohorts (1993-1997, 1997-2000, 2000-2003). RESULTS: Two hundred forty-eight allogeneic hematopoietic stem cell transplantations were performed in 227 patients. Patient characteristics were comparable in the three cohorts for gender, median age, underlying disease, disease stage, prophylaxis for graft-versus-host disease (GvHD), and cytomegalovirus constellation. The incidence of infections (78.4%) and infection rates remained stable (rates per 1,000 days of neutropenia: sepsis 17.61, pneumonia 6.76). Cumulative incidence of GvHD and transplant-related mortality did not change over time. CONCLUSIONS: The change from nursing in 'live islands' to SR and the reduction from high-dose to targeted IVIG did not result in increased infection rates or mortality despite an increase in patient age. These results support the current practice.
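The infection rates above are events per 1,000 days of neutropenia. A trivial sketch of the arithmetic, with hypothetical round counts chosen only to land near the reported figures:

```python
# Rate per 1,000 neutropenia-days = 1000 * events / neutropenia-days.
def rate_per_1000_days(events: int, neutropenia_days: int) -> float:
    return 1000.0 * events / neutropenia_days

print(rate_per_1000_days(events=176, neutropenia_days=10_000))  # 17.6, cf. sepsis 17.61
print(rate_per_1000_days(events=68, neutropenia_days=10_000))   # 6.8, cf. pneumonia 6.76
```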
Abstract:
Projects in the area of architectural design and urban planning typically engage several architects as well as experts from other professions. While the design and review meetings thus often involve a large number of cooperating participants, the actual design is still done by individuals in the time between those meetings, using desktop PCs and CAD applications. A truly collaborative approach to architectural design and urban planning is often limited to early paper-based sketches. In order to overcome these limitations, we designed and realized the ARTHUR system, an Augmented Reality (AR) enhanced round table to support complex design and planning decisions for architects. While AR has been applied to this area before, our approach does not try to replace the use of CAD systems but rather integrates them seamlessly into the collaborative AR environment. The approach is enhanced by intuitive interaction mechanisms that can be easily configured for different application scenarios.
Abstract:
To compare the effects of deflazacort (DEFLA) vs. prednisone (PRED) on bone mineral density (BMD), body composition, and lipids, 24 patients with end-stage renal disease were randomized in a double-blind design and followed for 78 weeks after kidney transplantation. BMD and body composition were assessed using dual-energy x-ray absorptiometry. Seventeen patients completed the study. Glucocorticosteroid doses, cyclosporine levels, rejection episodes, and drop-out rates were similar in both groups. Lumbar BMD decreased more in PRED than in DEFLA (P < 0.05), the difference being particularly marked after 24 weeks (9.1 ± 1.8% vs. 3.0 ± 2.4%, respectively). Hip BMD decreased from baseline in both groups (P < 0.01), without intergroup differences. Whole-body BMD decreased from baseline in PRED (P < 0.001), but not in DEFLA. Lean body mass decreased by approximately 2.5 kg in both groups after 6-12 weeks (P < 0.001), then remained stable. Fat mass increased more (P < 0.01) in PRED than in DEFLA (7.1 ± 1.8 vs. 3.5 ± 1.4 kg). Larger increases in total cholesterol (P < 0.03), low-density lipoprotein cholesterol (P < 0.01), lipoprotein B2 (P < 0.03), and triglycerides (P = 0.054) were observed in PRED than in DEFLA. In conclusion, using DEFLA instead of PRED in kidney transplant patients is associated with decreased loss of total-skeleton and lumbar spine BMD, but does not alter bone loss at the upper femur. DEFLA also helps to prevent fat accumulation and worsening of the lipid profile.
Abstract:
This paper presents our research work in the domain of Collaborative Environments centred on Problem-Based Learning (PBL) and taking advantage of existing Electronic Documents. We first present the modelling and engineering problems that we want to address; then we discuss the technological issues of such research, in particular the use of OpenUSS and of the Enterprise Java Open Source Architecture (EJOSA) to implement such collaborative PBL environments.
Abstract:
To master changing performance demands, autonomous transport vehicles are deployed to make in-house material flow applications more flexible. The so-called cellular transport system consists of a multitude of small-scale transport vehicles which shall be able to form a swarm. To do so, the vehicles need to detect each other, exchange information amongst each other, and sense their environment. By sharing peripherally acquired information with other transport entities, better-informed decisions can be made in terms of navigation and collision avoidance. This paper is a contribution to the collective utilization of sensor data in a swarm of cellular transport vehicles.
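As an illustration of the mutual-detection idea described above, the sketch below has each vehicle test which peers fall within a collision-avoidance radius; all names and thresholds are hypothetical, and a real swarm would exchange this information over radio or local sensing rather than a shared list.

```python
# Hypothetical sketch: neighbor detection in a swarm of transport vehicles.
from dataclasses import dataclass

@dataclass
class Vehicle:
    vid: int
    x: float
    y: float

def nearby_peers(me: Vehicle, swarm: list[Vehicle], radius: float) -> list[int]:
    """Ids of other vehicles within the collision-avoidance radius."""
    return [
        v.vid for v in swarm
        if v.vid != me.vid and (v.x - me.x) ** 2 + (v.y - me.y) ** 2 <= radius ** 2
    ]

swarm = [Vehicle(0, 0.0, 0.0), Vehicle(1, 0.8, 0.2), Vehicle(2, 5.0, 5.0)]
for v in swarm:
    print(v.vid, nearby_peers(v, swarm, radius=1.5))  # vehicles 0 and 1 see each other
```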
Abstract:
OBJECTIVES Zidovudine (ZDV) is recommended for first-line antiretroviral therapy (ART) in resource-limited settings. ZDV may, however, lead to anemia and impaired immunological response. We compared CD4+ cell counts over 5 years between patients starting ART with and without ZDV in southern Africa. DESIGN Cohort study. METHODS Patients aged at least 16 years who started first-line ART in South Africa, Botswana, Zambia, or Lesotho were included. We used linear mixed-effect models to compare CD4+ cell count trajectories between patients on ZDV-containing regimens and patients on other regimens, censoring follow-up at first treatment change. Impaired immunological recovery, defined as a CD4+ cell count below 100 cells/μl at 1 year, was assessed in logistic regression. Analyses were adjusted for baseline CD4+ cell count and hemoglobin level, age, sex, type of regimen, viral load monitoring, and calendar year. RESULTS A total of 72,597 patients starting ART, including 19,758 (27.2%) on ZDV, were analyzed. Patients on ZDV had higher CD4+ cell counts (150 vs. 128 cells/μl) and hemoglobin levels (12.0 vs. 11.0 g/dl) at baseline, and were less likely to be women than those on other regimens. Adjusted differences in CD4+ cell counts between regimens containing and not containing ZDV were -16 cells/μl [95% confidence interval (CI) -18 to -14] at 1 year and -56 cells/μl (95% CI -59 to -52) at 5 years. Impaired immunological recovery was more likely with ZDV compared to other regimens (odds ratio 1.40, 95% CI 1.22-1.61). CONCLUSION In southern Africa, ZDV is associated with inferior immunological recovery compared to other backbones. Replacing ZDV with another nucleoside reverse transcriptase inhibitor could avoid unnecessary switches to second-line ART.
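A hedged sketch of the linear mixed-effects comparison described in METHODS, on synthetic data with a random intercept per patient; all column names are hypothetical, and the full adjustment set (hemoglobin, sex, regimen type, monitoring, calendar year) is omitted for brevity.

```python
# Sketch: mixed-effects model of CD4 trajectories by ZDV exposure (synthetic data).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n_patients, n_visits = 200, 5
df = pd.DataFrame({
    "patient": np.repeat(np.arange(n_patients), n_visits),
    "years": np.tile(np.arange(n_visits), n_patients),
    "zdv": np.repeat(rng.integers(0, 2, n_patients), n_visits),
})
df["cd4"] = (150 + 60 * df["years"] - 10 * df["zdv"] * df["years"]
             + rng.normal(0, 40, len(df)))  # slower CD4 rise on ZDV, as reported

model = smf.mixedlm("cd4 ~ years * zdv", data=df, groups=df["patient"])
print(model.fit().summary())  # the years:zdv term captures the adjusted difference
```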
Abstract:
BACKGROUND Few estimates exist of the life expectancy of HIV-positive adults receiving antiretroviral treatment (ART) in low- and middle-income countries. We aimed to estimate the life expectancy of patients starting ART in South Africa and compare it with that of HIV-negative adults. METHODS AND FINDINGS Data were collected from six South African ART cohorts. Analysis was restricted to 37,740 HIV-positive adults starting ART for the first time. Estimates of mortality were obtained by linking patient records to the national population register. Relative survival models were used to estimate the excess mortality attributable to HIV by age, for different baseline CD4 categories and different durations. Non-HIV mortality was estimated using a South African demographic model. The average life expectancy of men starting ART varied between 27.6 y (95% CI: 25.2-30.2) at age 20 y and 10.1 y (95% CI: 9.3-10.8) at age 60 y, while estimates for women at the same ages were substantially higher, at 36.8 y (95% CI: 34.0-39.7) and 14.4 y (95% CI: 13.3-15.3), respectively. The life expectancy of a 20-y-old woman was 43.1 y (95% CI: 40.1-46.0) if her baseline CD4 count was ≥200 cells/µl, compared to 29.5 y (95% CI: 26.2-33.0) if her baseline CD4 count was <50 cells/µl. Life expectancies of patients with baseline CD4 counts ≥200 cells/µl were between 70% and 86% of those in HIV-negative adults of the same age and sex, and life expectancies were increased by 15%-20% in patients who had survived 2 y after starting ART. However, the analysis was limited by a lack of mortality data at longer durations. CONCLUSIONS South African HIV-positive adults can have a near-normal life expectancy, provided that they start ART before their CD4 count drops below 200 cells/µl. These findings demonstrate that the near-normal life expectancies of HIV-positive individuals receiving ART in high-income countries can apply to low- and middle-income countries as well.
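At root, a life expectancy is a sum of cumulative survival probabilities. The sketch below shows the basic period life-table arithmetic under a constant, hypothetical mortality rate; the study itself used relative survival models plus a demographic model for non-HIV mortality, which this does not attempt to reproduce.

```python
# Life expectancy from age-specific annual death probabilities (life-table sum).
def life_expectancy(annual_mortality: list[float]) -> float:
    """Expected further years of life given a sequence of annual death probabilities."""
    expectancy, survival = 0.0, 1.0
    for q in annual_mortality:
        survival *= 1.0 - q     # probability of still being alive after this year
        expectancy += survival  # each year survived contributes ~1 year of life
    return expectancy

# Hypothetical: constant 3% annual mortality over a 60-year horizon -> ~27 years
print(round(life_expectancy([0.03] * 60), 1))
```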