843 results for COLLABORATIVE TRANSPLANT
Abstract:
The incidence of HIV encephalopathies was determined in an ongoing consecutive autopsy study. Among 345 patients who died from AIDS in Switzerland during 1981-1990, 68 (19%) showed morphological evidence of HIV encephalopathy. Two major histopathological manifestations were observed. Progressive diffuse leukoencephalopathy (PDL) was present in 33 cases and is characterized by a diffuse loss of myelin staining in the deep white matter of the cerebral and cerebellar hemispheres, with scattered multinucleated giant cells but little or no inflammatory reaction. Multinucleated giant cell encephalitis (MGCE) was diagnosed in 32 cases; its hallmarks are accumulations of multinucleated giant cells with a prominent inflammatory reaction and focal necroses. In 3 patients, both types of lesions overlapped. Brain tissue from 27 patients was analyzed for the presence of HIV gag sequences using the polymerase chain reaction (PCR) with primers flanking a 109 base pair segment of the gag gene. Amplification succeeded in all patients with clinical and histopathological evidence of HIV encephalopathy but not in AIDS patients with opportunistic bacterial, parasitic, and/or viral infections. Potential mechanisms by which HIV exerts its adverse effects on the human CNS are discussed.
Abstract:
BACKGROUND After heart transplantation (HTx), the interindividual pharmacokinetic variability of immunosuppressive drugs represents a major therapeutic challenge because of the narrow therapeutic window between over-immunosuppression causing toxicity and under-immunosuppression leading to graft rejection. Although genetic polymorphisms have been shown to influence the pharmacokinetics of immunosuppressants, data in the context of HTx are scarce. We therefore assessed the role of genetic variation in CYP3A4, CYP3A5, POR, NR1I2, and ABCB1, which act jointly in immunosuppressive drug pathways, in tacrolimus (TAC) and ciclosporin (CSA) dose requirements in HTx recipients. METHODS Associations between 7 functional genetic variants and blood dose-adjusted trough (C0) concentrations of TAC and CSA at 1, 3, 6, and 12 months after HTx were evaluated in cohorts of 52 and 45 patients, respectively. RESULTS Compared with CYP3A5 nonexpressors (*3/*3 genotype), CYP3A5 expressors (*1/*3 or *1/*1 genotype) required around 2.2- to 2.6-fold higher daily TAC doses to reach the targeted C0 concentration at all studied time points (P ≤ 0.003). Additionally, POR*28 variant carriers showed higher dose-adjusted TAC C0 concentrations at all time points, with significant differences at 3 (P = 0.025) and 6 months (P = 0.047) after HTx. No significant associations were observed between the genetic variants and the CSA dose requirement. CONCLUSIONS The CYP3A5*3 variant has a major influence on the required TAC dose in HTx recipients, whereas the POR*28 variant may additionally contribute to the observed variability. These results support the importance of genetic markers in TAC dose optimization after HTx.
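As a side note for readers less familiar with the pharmacokinetic terminology, the comparison above rests on the dose-adjusted trough concentration, i.e. C0 divided by the daily dose. The short Python sketch below illustrates that quantity with purely hypothetical numbers; it is not derived from the study's data, and the function and variable names are ours.

def dose_adjusted_c0(trough_ng_per_ml, daily_dose_mg):
    """Dose-adjusted trough concentration in (ng/mL) per (mg/day)."""
    return trough_ng_per_ml / daily_dose_mg

# Hypothetical example: the same target trough reached with different daily doses,
# mirroring the ~2.2- to 2.6-fold higher TAC dose reported for CYP3A5 expressors.
expressor = dose_adjusted_c0(trough_ng_per_ml=8.0, daily_dose_mg=10.0)    # CYP3A5 *1 carrier
nonexpressor = dose_adjusted_c0(trough_ng_per_ml=8.0, daily_dose_mg=4.0)  # CYP3A5 *3/*3
print(f"Dose-adjusted C0: expressor {expressor:.2f}, nonexpressor {nonexpressor:.2f} (ng/mL)/(mg/day)")
print(f"Fold difference in daily dose: {10.0 / 4.0:.1f}x")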
Abstract:
The central assumption in the literature on collaborative networks and policy networks is that political outcomes are affected by a variety of state and nonstate actors. Some of these actors are more powerful than others and can therefore have a considerable effect on decision making. In this article, we seek to provide a structural and institutional explanation for these power differentials in policy networks and to support the explanation with empirical evidence. We use a dyadic measure of influence reputation as a proxy for power, and posit that influence reputation over the political outcome is related to vertical integration into the political system by means of formal decision-making authority, and to horizontal integration by means of being well embedded in the policy network. Hence, we argue that actors are perceived as influential because of two complementary factors: (a) their institutional roles and (b) their structural positions in the policy network. Based on temporal and cross-sectional exponential random graph models, we compare five cases concerning climate, telecommunications, flood prevention, and toxic chemicals politics in Switzerland and Germany. The five networks cover national and local networks at different stages of the policy cycle. The results confirm that institutional and structural drivers have a crucial impact on how an actor is perceived in decision making and implementation and, therefore, on its ability to significantly shape outputs and service delivery.
Abstract:
BACKGROUND High early mortality in patients with HIV-1 starting antiretroviral therapy (ART) in sub-Saharan Africa, compared with Europe and North America, is well documented. Longer-term comparisons between settings have been limited by poor ascertainment of mortality in high-burden African settings. This study aimed to compare mortality up to four years on ART between South Africa, Europe, and North America. METHODS AND FINDINGS Data from four South African cohorts, in which patients lost to follow-up (LTF) could be linked to the national population register to determine vital status, were combined with data from Europe and North America. Cumulative mortality, crude and adjusted (for characteristics at ART initiation) mortality rate ratios (relative to South Africa), and predicted mortality rates were described by region at 0-3, 3-6, 6-12, 12-24, and 24-48 months on ART for the period 2001-2010. Of the adults included (30,467 [South Africa], 29,727 [Europe], and 7,160 [North America]), 20,306 (67%), 9,961 (34%), and 824 (12%) were women. Patients began treatment with markedly more advanced disease in South Africa (median CD4 count 102, 213, and 172 cells/µl in South Africa, Europe, and North America, respectively). High early mortality after starting ART in South Africa occurred mainly in patients starting ART with a CD4 count <50 cells/µl. Cumulative mortality at 4 years was 16.6%, 4.7%, and 15.3% in South Africa, Europe, and North America, respectively. Mortality was initially much lower in Europe and North America than in South Africa, but the differences were reduced or reversed (North America) at longer durations on ART (adjusted rate ratios 0.46, 95% CI 0.37-0.58, and 1.62, 95% CI 1.27-2.05, between 24 and 48 months on ART, comparing Europe and North America to South Africa). While bias due to under-ascertainment of mortality was minimised through death registry linkage, residual bias could still be present owing to differing approaches to and frequency of linkage. CONCLUSIONS After accounting for under-ascertainment of mortality, with increasing duration on ART, the mortality rate on HIV treatment in South Africa declines to levels comparable to or below those described in participating North American cohorts, while the differential with the European cohorts narrows substantially.
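For orientation, the rate ratios quoted above compare deaths per person-year between regions. The snippet below is a generic sketch of how a crude mortality rate ratio and a Wald-type 95% confidence interval are computed from death counts and person-years; the counts used are hypothetical and are not the study's data.

import math

def rate_ratio_ci(deaths_a, py_a, deaths_b, py_b, z=1.96):
    """Crude mortality rate ratio (group A vs. group B) with an approximate 95% CI,
    using SE(log RR) = sqrt(1/deaths_a + 1/deaths_b)."""
    rr = (deaths_a / py_a) / (deaths_b / py_b)
    se = math.sqrt(1 / deaths_a + 1 / deaths_b)
    return rr, math.exp(math.log(rr) - z * se), math.exp(math.log(rr) + z * se)

# Hypothetical counts only:
rr, lo, hi = rate_ratio_ci(deaths_a=120, py_a=50_000, deaths_b=150, py_b=30_000)
print(f"RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")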
Abstract:
This study analysed the outcomes of 563 children with aplastic anaemia (AA) aged 0-12 years reported to the Severe Aplastic Anaemia Working Party database of the European Society for Blood and Marrow Transplantation, according to treatment received. Overall survival (OS) after upfront human leucocyte antigen-matched family donor (MFD) haematopoietic stem cell transplantation (HSCT) versus immunosuppressive treatment (IST) was 91% vs. 87% (P = 0·18). Event-free survival (EFS) after upfront MFD HSCT versus IST was 87% vs. 33% (P = 0·001). Ninety-one of 167 patients (55%) failed front-line IST and underwent rescue HSCT. The OS of this rescue group was 83%, compared with 91% for upfront MFD HSCT patients and 97% for those who did not fail front-line IST (P = 0·017). Rejection was 2% for both MFD HSCT and HSCT after IST failure (P = 0·73). Acute graft-versus-host disease (GVHD) grade II-IV occurred in 8% of MFD grafts vs. 25% of HSCTs after IST failure (P < 0·0001). Chronic GVHD occurred in 6% of MFD HSCTs vs. 20% of HSCTs after IST failure (P < 0·0001). MFD HSCT is an excellent therapy for children with AA. IST has a high failure rate but remains a reasonable first-line choice if MFD HSCT is not available, because its high OS enables access to HSCT, which is a very good rescue option.
Abstract:
Two batches of excretory/secretory (E/S) antigens from second-stage larvae of Toxocara canis maintained in vitro were prepared independently in two different laboratories (Zürich and Basel) and analysed in order to obtain information for future efforts to standardize the enzyme-linked immunosorbent assay (ELISA) used for the serodiagnosis of human toxocariasis. SDS-PAGE and Western blotting revealed at least 10 different antigenic components common to the two antigen preparations. However, distinct qualitative and quantitative differences between the two E/S antigens were observed, one antigen having a more complex composition than the other. Despite these differences, concordant serodiagnoses were obtained for 80% of 25 sera from patients with suspected Toxocara infection tested independently in the two ELISA systems (Basel and Zürich) with the corresponding E/S antigens. The specificity, determined (BS-antigen, BS-ELISA) by testing 46 of 3396 sera from patients with parasitologically proven extra-intestinal helminthic infections, was 93%. Cross-reactions occurred mainly with sera from patients infected with filariae (5 of 13 cases) exhibiting very high extinction values in their homologous ELISA system. The reproducibility (intra- and inter-test variation) of the two ELISA systems using the corresponding E/S antigens ranged from 5% to 15%. The results demonstrate that T. canis E/S antigens may well be applicable for standardization of the ELISA used for the serodiagnosis of human toxocariasis.
Abstract:
The selection of liver transplant candidates with hepatocellular carcinoma (HCC) is currently validated on the basis of the Milan criteria. The use of extended criteria has remained a matter of debate, mainly because of the absence of prospective validation. The present prospective study recruited patients according to the previously proposed Total Tumor Volume (TTV ≤115 cm³)/alpha-fetoprotein (AFP ≤400 ng/ml) score. Patients with AFP >400 ng/ml were excluded, and as such the Milan group was modified to include only patients with AFP <400 ng/ml; these patients were compared with patients beyond Milan but within TTV/AFP. From January 2007 to March 2013, 233 patients with HCC were listed for liver transplantation. Of them, 195 patients were within Milan, and 38 were beyond Milan but within TTV/AFP. The average follow-up from listing was 33.9 ± 24.9 months. The risk of drop-out was higher for patients beyond Milan but within TTV/AFP (16/38, 42.1%) than for patients within Milan (49/195, 25.1%, p = 0.033). In parallel, intent-to-treat survival from listing was lower in the patients beyond Milan (53.8% vs. 71.6% at four years, p < 0.001). After a median waiting time of 8 months, 166 patients were transplanted: 134 within the Milan criteria and 32 beyond Milan but within TTV/AFP. They demonstrated acceptable and similar recurrence rates (4.5% vs. 9.4%, p = 0.138) and post-transplant survival (78.7% vs. 74.6% at four years, p = 0.932). CONCLUSION Based on the present prospective study, HCC liver transplant candidate selection could be expanded to the TTV (≤115 cm³)/AFP (≤400 ng/ml) criteria in centers with a waiting time of at least 8 months. An increased risk of drop-out on the waiting list can be expected, but with equivalent and satisfactory post-transplant survival.
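To make the listing criterion concrete, the sketch below checks the TTV (≤115 cm³)/AFP (≤400 ng/ml) rule. It assumes, as is usual for total tumor volume, that TTV is the sum of spherical volumes computed from each lesion's diameter; the lesion sizes in the example are invented, and the function names are ours, not the study's.

import math

def total_tumor_volume_cm3(diameters_cm):
    """Assumed TTV: sum of spherical volumes (4/3)*pi*r^3 over all lesions."""
    return sum((4.0 / 3.0) * math.pi * (d / 2.0) ** 3 for d in diameters_cm)

def within_ttv_afp(diameters_cm, afp_ng_per_ml, ttv_max=115.0, afp_max=400.0):
    """True if the candidate is within the TTV/AFP listing criteria."""
    return total_tumor_volume_cm3(diameters_cm) <= ttv_max and afp_ng_per_ml <= afp_max

# Hypothetical candidate: two lesions of 5.0 and 3.0 cm, AFP 120 ng/ml.
print(within_ttv_afp([5.0, 3.0], afp_ng_per_ml=120.0))  # True: TTV ~80 cm^3, AFP <= 400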
Abstract:
Prospective cohort studies contribute significantly to answering specific research questions in a defined population. Since 2008, the Swiss Transplant Cohort Study (STCS) has systematically enrolled >95% of all transplant recipients in Switzerland, collecting predefined data at determined time points. Designed as an open cohort, the STCS has included >3900 patients to date, with a median follow-up of 2.96 years (IQR 1.44-4.73). This review highlights some relevant findings in the field of transplant-associated infections gained by the STCS so far. Three key general aspects have crystallized: (i) well-run cohort studies are a powerful tool for conducting genetic studies, which depend crucially on a meticulously described phenotype; (ii) long-term real-life observations add a distinct layer of information that cannot be obtained from randomized studies; and (iii) the systematic collection of data, close interdisciplinary collaboration, and continuous analysis of key outcome data such as infectious diseases endpoints can improve patient care.
Abstract:
BACKGROUND HIV-1 viral load (VL) testing is recommended to monitor antiretroviral therapy (ART) but is not universally available. We examined monitoring of first-line ART and switching to second-line ART in sub-Saharan Africa, 2004-2013. METHODS Adult HIV-1 infected patients starting combination ART in 16 countries were included. Switching was defined as a change from a non-nucleoside reverse-transcriptase inhibitor (NNRTI)-based regimen to a protease inhibitor (PI)-based regimen, with a change of ≥1 NRTI. Virological and immunological failure were defined per World Health Organization criteria. We calculated cumulative probabilities of switching and hazard ratios with 95% confidence intervals (CI) comparing routine VL monitoring, targeted VL monitoring, CD4 cell monitoring, and clinical monitoring, adjusted for programme and individual characteristics. FINDINGS Of 297,825 eligible patients, 10,352 (3·5%) switched during 782,412 person-years of follow-up. Compared with CD4 monitoring, hazard ratios for switching were 3·15 (95% CI 2·92-3·40) for routine VL, 1·21 (1·13-1·30) for targeted VL, and 0·49 (0·43-0·56) for clinical monitoring. Overall, 58·0% of patients with confirmed virological failure and 19·3% of patients with confirmed immunological failure switched within 2 years. Among patients who switched, the percentage with evidence of treatment failure based on a single CD4 or VL measurement ranged from 32·1% with clinical monitoring to 84·3% with targeted VL monitoring. Median CD4 counts at switching were 215 cells/µl under routine VL monitoring but lower under the other monitoring strategies (114-133 cells/µl). INTERPRETATION Overall, few patients switched to second-line ART, and switching occurred late in the absence of routine viral load monitoring. Switching was more common and occurred earlier with targeted or routine viral load testing.
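The hazard ratios above come from survival models of time to switching. As an illustration of the general mechanics only, the sketch below fits a Cox proportional hazards model with the lifelines library on its bundled example dataset (recidivism data), since the study's data are not available here; in the study's setting the duration would be time on ART, the event would be switching, and the covariates would encode the monitoring strategy.

from lifelines import CoxPHFitter
from lifelines.datasets import load_rossi

df = load_rossi()                # example data: 'week' = follow-up time, 'arrest' = event indicator
cph = CoxPHFitter()
cph.fit(df, duration_col="week", event_col="arrest")
cph.print_summary()              # the exp(coef) column gives hazard ratios with 95% CIs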
Abstract:
Indigenous media as a phenomenon cannot be reduced to a reaction to Western hegemony and colonial legacies, but is often rooted in contexts of resistance, empowerment, self-determination, and the reclaiming of symbolic representation. Therefore, I would like to reflect on different cases of indigenous film and participatory video work in an attempt to highlight the multiple dynamics that arise from the desideratum of self-representation and, finally, to locate us as anthropologists in that context.
Abstract:
BACKGROUND Kidney recipients maintaining prolonged allograft survival in the absence of immunosuppressive drugs and without evidence of rejection are thought to be exceptional. The ERA-EDTA-DESCARTES working group, together with Nantes University, launched a Europe-wide survey to identify new patients, describe them, and estimate their frequency for the first time. METHODS Seventeen coordinators distributed a questionnaire to 256 transplant centres in 28 countries in order to report as many 'operationally tolerant' patients (TOL; defined as having a serum creatinine <1.7 mg/dL and proteinuria <1 g/day or g/g creatinine despite at least 1 year without any immunosuppressive drug) and 'almost tolerant' patients (minimally immunosuppressed [MIS] patients receiving low-dose steroids) as possible. We recorded their number and the total number of kidney transplants performed at each centre to calculate their frequency. RESULTS One hundred and forty-seven questionnaires were returned, and we identified 66 TOL (61 with complete data) and 34 MIS patients. Of the 61 TOL patients, 26 had previously been described by the Nantes group and 35 new patients are presented here. Most of them were noncompliant patients. At data collection, 31/35 patients were alive and 22/31 were still TOL. Of the remaining 9/31, 2 had been restarted on immunosuppressive drugs and 7 had rising creatinine, of whom 3 had resumed dialysis. Considering all patients, 10-year death-censored graft survival after immunosuppression weaning reached 85% in TOL patients and 100% in MIS patients. With 218,913 kidney recipients surveyed, the cumulative incidences of operational tolerance and almost tolerance were estimated at 3 and 1.5 per 10,000 kidney recipients, respectively. CONCLUSIONS In kidney transplantation, operational tolerance and almost tolerance are infrequent findings associated with excellent long-term death-censored graft survival.
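The reported frequencies follow directly from the counts given above (66 TOL and 34 MIS patients among 218,913 recipients surveyed); the abstract rounds them to 3 and 1.5 per 10,000. A quick arithmetic check in Python:

recipients = 218_913
tol_patients, mis_patients = 66, 34

def per_10_000(count, total=recipients):
    """Cumulative incidence expressed per 10,000 recipients."""
    return count / total * 10_000

print(f"Operational tolerance: {per_10_000(tol_patients):.2f} per 10,000")  # ~3.01
print(f"Almost tolerance:      {per_10_000(mis_patients):.2f} per 10,000")  # ~1.55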
Abstract:
BACKGROUND Cytomegalovirus (CMV) is associated with an increased risk of cardiac allograft vasculopathy (CAV), the major factor limiting long-term survival after heart transplantation (HTx). The purpose of this study was to evaluate the impact of CMV infection during long-term follow-up after HTx. METHODS This retrospective, single-centre study analysed 226 HTx recipients (mean age 45 ± 13 years, 78% men) who underwent transplantation between January 1988 and December 2000. The incidence of and risk factors for CMV infection during the first year after transplantation were studied. Risk factors for CAV were included in an analysis of CAV-free survival within 10 years post-transplant. The effect of CMV infection on the grade of CAV was also analysed. RESULTS Survival to 10 years post-transplant was higher in patients with no CMV infection (69%) than in patients with CMV disease (55%; p = 0.018) or asymptomatic CMV infection (54%; p = 0.053). CAV-free survival time was longer in patients with no CMV infection (6.7 years; 95% CI, 6.0-7.4) than in those with CMV disease (4.2 years; CI, 3.2-5.2; p < 0.001) or asymptomatic CMV infection (5.4 years; CI, 4.3-6.4; p = 0.013). In univariate analysis, recipient age, donor age, coronary artery disease (CAD), asymptomatic CMV infection, and CMV disease were significantly associated with CAV-free survival. In multivariate regression analysis, CMV disease, asymptomatic CMV infection, CAD, and donor age remained independent predictors of CAV-free survival at 10 years post-transplant. CONCLUSIONS CAV-free survival was significantly reduced in patients with CMV disease and asymptomatic CMV infection compared with patients without CMV infection. These findings highlight the importance of close monitoring of CMV viral load and of appropriate therapeutic strategies for preventing asymptomatic CMV infection.