953 results for End-Stage Renal Disease (ESRD)
Abstract:
Objective: To evaluate the determinants of total plasma homocysteine levels and their relations with nutritional parameters, inflammatory status, and traditional risk factors for cardiovascular disease in renal failure patients on dialysis treatment. Design: The study was conducted on 70 clinically stable patients, 50 of them on hemodialysis (70% men; 55.3 ± 14.5 years) and 20 on peritoneal dialysis (50% men; 62 ± 13.7 years). Patients were analyzed in terms of biochemical parameters (serum lipids, creatinine, homocysteine [Hcy], creatine kinase [CK], folic acid, and vitamin B12), anthropometric data, markers of inflammatory status (tumor necrosis factor-alpha, C-reactive protein, interleukin-6), and an adapted subjective global assessment. Results: The total prevalence of hyperhomocysteinemia (>15 µmol/L) was 85.7%. Plasma folic acid and plasma vitamin B12 were within the normal range. Multiple regression analysis (r² = 0.20) revealed that the determinants of total Hcy were type of dialysis, creatinine, CK, folic acid, and total cholesterol. Hcy was positively correlated with albumin and creatinine and negatively correlated with total cholesterol, high-density lipoprotein cholesterol, folic acid, and vitamin B12. Conclusions: The determinants of total Hcy in the study sample were type of dialysis, creatinine, CK, folic acid, and total cholesterol. Evidently, the small sample size might have had an effect on the statistical analyses, and further studies are needed. However, Hcy in patients on dialysis treatment may not have the same effect as observed in the general population. In this respect, the association between malnutrition and inflammation may be a confounding factor in the determination of the true relationship between Hcy, nutritional status, and cardiovascular risk factors in this group. © 2011 by the National Kidney Foundation, Inc. All rights reserved.
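The multiple-regression result above (reported as r² = 0.20) can be illustrated with a minimal sketch. The data below are entirely synthetic stand-ins for the study variables; the coefficients and noise level are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Hypothetical synthetic data standing in for the five candidate determinants
# (type of dialysis, creatinine, CK, folic acid, total cholesterol) -> total Hcy.
rng = np.random.default_rng(0)
n = 70                                   # same sample size as the study
X = rng.normal(size=(n, 5))
beta_true = np.array([2.0, 1.5, 0.8, -1.2, -0.5])   # invented effect sizes
y = 20 + X @ beta_true + rng.normal(scale=5.0, size=n)  # total Hcy, µmol/L

# Ordinary least squares with an intercept column in the design matrix.
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
y_hat = A @ coef

# Coefficient of determination, the r² quantity quoted in the abstract.
ss_res = np.sum((y - y_hat) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
```

With an intercept included, r² always lands in [0, 1]; a value near 0.20, as in the abstract, means the chosen determinants explain about a fifth of the variance in total Hcy.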
Abstract:
Introduction. Orthotopic liver transplantation (OLT) is the treatment of choice for hepatocellular carcinoma (HCC) in patients with cirrhosis, mainly those with early HCC. Herein we present the clinical characteristics and outcomes of cirrhotic patients with HCC who underwent OLT from cadaveric donors in our institution. Methods. From May 2001 to May 2009, we performed 121 OLT, including 24 patients (19.8%) with cirrhosis and HCC within the Milan criteria. In 4 cases, HCC was an incidental finding in the explants. Results. The patients' average age was 55 ± 10 years, and 82% were men. Fifty percent of patients were Child class B or C. The average Model for End-Stage Liver Disease scores for Child A, B, and C categories were 11, 15, and 18, respectively. The HCC diagnosis was made by 2 dynamic imaging studies in 16 cases; by 1 dynamic imaging study plus alpha-fetoprotein >400 ng/mL in 4; and by histologic confirmation in 4. Twenty patients received locoregional treatment before OLT: 6 percutaneous ethanol injection, 9 transarterial chemoembolization, 1 transarterial embolization, and 4 a combination of these modalities. The median follow-up after OLT was 19.7 months (range, 1-51). Vascular invasion was observed in the explant of 1 patient, who developed HCC recurrence and died 8 months after OLT. Two further patients, without vascular invasion or satellite tumors, displayed tumor recurrences at 7 and 3 months after OLT and died 2 and 1 months after the diagnosis, respectively. The remaining 25 patients have not shown tumor recurrence. Conclusion. In the present evaluation, OLT patients with early HCC and no vascular invasion showed satisfactory results and good disease-free survival. Strictly following the Milan criteria for liver transplantation in patients with HCC greatly reduces, but does not completely eliminate, the chance of tumor recurrence.
Abstract:
More than 30% of patients on peritoneal dialysis show chronic systemic inflammatory activity with high levels of C-reactive protein. The purpose of this cross-sectional study was to investigate the influence of the inflammatory state on clinical and nutritional markers in patients on peritoneal dialysis. Twenty-seven patients were included: mean age was 57.6 ± 19 years, 48% were male, and median time on peritoneal dialysis was 16.0 (8.3; 35.8) months. Clinical, dialytic, laboratory, anthropometric, and bioelectrical impedance data were collected, with the sample stratified by C-reactive protein. In patients, the levels of interleukin-6 and tumor necrosis factor-α were higher, while adiponectin levels were lower than in healthy individuals (p ≤ 0.001), indicating the presence of inflammatory activity in the sample. When compared to patients with C-reactive protein < 1 mg/dL, those with C-reactive protein ≥ 1 mg/dL showed higher body mass index (29.4 ± 6.1 vs. 24.4 ± 4.5 kg/m²; p = 0.009), percent of standard body weight (124.5 ± 25.4 vs. 106.8 ± 17.9%; p = 0.012), and percent body fat as assessed by both anthropometry (31.3 ± 9.9 vs. 23.9 ± 9.1%; p = 0.056) and bioimpedance (38.9 ± 6.3 vs. 26.2 ± 12.6%; p < 0.001). Patients with C-reactive protein ≥ 1 mg/dL also exhibited higher levels of ferritin (701 ± 568 vs. 532 ± 356 ng/mL; p = 0.054) and a lower total lymphocyte count (median 1838 vs. 1638/mm³; p = 0.001). In conclusion, higher body mass index and body fat markers were associated with C-reactive protein ≥ 1 mg/dL, and higher C-reactive protein was associated with impaired immunocompetence, as evidenced by the lower total lymphocyte count. Our findings confirm the relationship between inflammation, body fat, and immunocompetence, which may overlap, potentiating the inflammatory status.
Abstract:
The consumption of excess alcohol in patients with liver iron storage diseases, in particular the iron-overload disease hereditary haemochromatosis (HH), has important clinical consequences. HH, a common genetic disorder amongst people of European descent, results in a slow, progressive accumulation of excess hepatic iron. If left untreated, the condition may lead to fibrosis, cirrhosis and primary hepatocellular carcinoma. The consumption of excess alcohol remains an important cause of hepatic cirrhosis and alcohol consumption itself may lead to altered iron homeostasis. Both alcohol and iron independently have been shown to result in increased oxidative stress causing lipid peroxidation and tissue damage. Therefore, the added effects of both toxins may exacerbate the pathogenesis of disease and impose an increased risk of cirrhosis. This review discusses the concomitant effects of alcohol and iron on the pathogenesis of liver disease. We also discuss the implications of co-existent alcohol and iron in end-stage liver disease.
Abstract:
In a liver transplant (LT) center, treatments with Prometheus were evaluated. The main outcomes considered were 1- and 6-month survival. Methods. During the study period, 74 patients underwent treatment with Prometheus; 64 were enrolled, with a mean age of 51 ± 13 years; 47 were men. A total of 212 treatments were performed (mean, 3.02 per patient). The parameters evaluated were age, sex, and laboratory (liver enzymes, ammonia) and clinical (Model for End-Stage Liver Disease and Child-Turcotte-Pugh score) data. Results. Death occurred in 23 patients (35.9%) during the hospitalization period, 20 patients (31.3%) underwent liver transplantation, and 21 were discharged. LT was performed in 4 patients with acute liver failure (ALF, 23.7%), in 7 patients with acute-on-chronic liver failure (AoCLF, 43.7%), and in 6 patients with liver disease after LT (30%). Seven patients who underwent LT died (35%). In the multivariate analysis, older age (P = .015), higher international normalized ratio (INR) (P = .019), and acute liver failure (P = .039) were independently associated with an adverse 1-month clinical outcome. On the other hand, older age (P = .011) and acute kidney injury (AKI) (P = .031) at presentation were both related to a worse 6-month outcome. For patients with ALF and AoCLF we did not observe the same differences. Conclusions. In this cohort, older age was the most important parameter defining 1- and 6-month survival, although higher INR and the presence of ALF were important for 1-month survival and AKI for 6-month survival. No difference was observed between patients who underwent LT and those who did not.
Abstract:
Liver transplantation is now the standard treatment for end-stage liver disease. Given the shortage of liver donors and the progressively higher number of patients waiting for transplantation, improvements in patient selection and optimization of timing for transplantation are needed. Several solutions have been suggested, including increasing the donor pool; a fair policy for allocation, not permitting variables such as age, gender, and race, or third-party payer status to play any role; and knowledge of the natural history of each liver disease for which transplantation is offered. To observe ethical rules and distributive justice (guarantee to every citizen the same opportunity to get an organ), the "sickest first" policy must be used. Studies have demonstrated that death has no relationship with waiting time, but rather with the severity of liver disease at the time of inclusion. Thus, waiting time is no longer part of the United Network for Organ Sharing distribution criteria. Waiting time only differentiates between equally severely diseased patients. The authors have analyzed the waiting list mortality and 1-year survival for patients of the State of São Paulo, from July 1997 through January 2001. Only the chronological criterion was used. According to "Secretaria de Estado da Saúde de São Paulo" data, among all waiting list deaths, 82.2% occurred within the first year, and 37.6% within the first 3 months following inclusion. The allocation of livers based on waiting time is neither fair nor ethical, impairs distributive justice and human rights, and does not occur in any other part of the world.
Abstract:
The significant development of immunosuppressive drug therapies within the past 20 years has had a major impact on the outcome of clinical solid organ transplantation, mainly by decreasing the incidence of acute rejection episodes and improving short-term patient and graft survival. However, long-term results remain relatively disappointing because of chronic allograft dysfunction and patient morbidity or mortality, which is often related to the adverse effects of immunosuppressive treatment. Thus, the induction of specific immunological tolerance of the recipient towards the allograft remains an important objective in transplantation. In this article, we first briefly describe the mechanisms of allograft rejection and immune tolerance. We then review in detail current tolerogenic strategies that could promote central or peripheral tolerance, highlighting the promises as well as the remaining challenges in clinical transplantation. The induction of haematopoietic mixed chimerism could be an approach to induce robust central tolerance, and we describe recent encouraging reports of end-stage kidney disease patients, without concomitant malignancy, who have undergone combined bone marrow and kidney transplantation. We discuss current studies suggesting that, while promoting peripheral transplantation tolerance in preclinical models, induction protocols based on lymphocyte depletion (polyclonal antithymocyte globulins, alemtuzumab) or co-stimulatory blockade (belatacept) should, at the current stage, be considered more as drug-minimization rather than tolerance-inducing strategies. Thus, a better understanding of the mechanisms that promote peripheral tolerance has led to newer approaches and the investigation of individualized donor-specific cellular therapies based on manipulated recipient regulatory T cells.
Abstract:
Over the past 50 years organ transplantation has become an established worldwide practice, bringing immense benefits to hundreds of thousands of patients. The use of human organs (hereinafter 'organs') for transplantation has steadily increased during the last two decades. Organ transplantation is now the most cost-effective treatment for end-stage renal failure, while for end-stage failure of organs such as the liver, lung and heart it is the only available treatment.
Abstract:
Liver transplantation seems to be an effective option to prolong survival in patients with end-stage liver disease, although it can still be followed by serious complications. Invasive fungal infections (IFI) are associated with high rates of morbidity and mortality. The epidemiology of fungal infections in Brazilian liver transplant recipients is unknown. The aim of this observational, retrospective study was to determine the incidence and epidemiology of fungal infections in all patients who underwent liver transplantation at Albert Einstein Israeli Hospital between 2002 and 2007. A total of 596 liver transplants were performed in 540 patients. Overall, 77 fungal infections occurred in 68 (13%) patients. Among the 77 fungal infections, there were 40 IFI, which occurred in 37 patients (7%). Candida and Aspergillus species were the most common etiologic agents. Candida species accounted for 82% of all fungal infections and 67% of all IFI, while Aspergillus species accounted for 9% of all fungal infections and 17% of all IFI. Non-albicans Candida species were the predominant Candida isolates. Invasive aspergillosis tended to occur earlier in the post-transplant period. These findings can contribute to improving antifungal prophylaxis and therapy practices in Brazilian centres.
Abstract:
Chronic hepatitis B virus (HBV) infection is responsible for up to 30% of cases of liver cirrhosis and up to 53% of cases of hepatocellular carcinoma. Liver transplantation (LT) is the best therapeutic option for patients with end-stage liver failure caused by HBV. The success of transplantation, though, depends on receiving prophylactic treatment against post-transplant viral reactivation. In the absence of prophylaxis, liver transplantation due to chronic hepatitis B (CHB) is associated with high rates of viral recurrence and poor survival. The introduction of treatment with hepatitis B immunoglobulins (HBIG) during the 1990s and later the incorporation of oral antiviral drugs have improved the prognosis of these patients. Thus, LT for CHB is now a universally accepted option, with an estimated 5-year survival of around 85%, vs. the 45% survival seen prior to the introduction of HBIG. The combination of lamivudine plus HBIG has for many years been the most widely used prophylactic regimen. However, with the appearance of new, more potent oral antiviral agents associated with less resistance (e.g., entecavir and tenofovir) for the treatment of CHB, new prophylactic strategies are being designed, either in combination with HBIG or alone as monotherapy. These advances have allowed for more personalized prophylaxis based on the individual risk profile of a given patient. In addition, the small pool of donors has required the use of anti-HBc-positive donors (with the resulting possibility of transmitting HBV from these organs), which has been made possible by suitable prophylactic regimens.
Abstract:
Autoantibodies to apolipoprotein A-1 (anti-ApoA-1 IgG) have pro-atherogenic properties in patients at high cardiovascular risk, but their prevalence in patients with end-stage kidney disease is unknown. The aims of this single-center, cross-sectional study were to assess the prevalence of anti-ApoA-1 antibodies in patients on maintenance hemodialysis (MHD) and to examine their correlation with inflammatory biomarkers related to atherosclerotic plaque vulnerability and with dialysis vintage. To this purpose, anti-ApoA-1 IgG levels and the concentrations of interleukin-6 (IL-6), interleukin-8 (IL-8), monocyte chemoattractant protein-1 (MCP-1), metalloproteinase-9 (MMP-9), tumor necrosis factor-α, and C-reactive protein (CRP) were assessed in the sera of 66 MHD patients (mean age: 68 ± 14 years, 36% women, 32% diabetics). Anti-ApoA-1 IgG positivity (defined as a blood value at or above the 97.5th percentile of the normal distribution as assessed in healthy blood donors) was 20%. Circulating levels of anti-ApoA-1 IgG correlated positively with dialysis vintage, but not with cardiovascular risk factors or previous cardiovascular events; no significant correlations were found between anti-ApoA-1 IgG levels and circulating levels of IL-6, IL-8, MCP-1, MMP-9, CRP, or low-density lipoprotein cholesterol. In multivariable linear regression, adjusted for age and sex, only dialysis vintage remained positively and independently associated with anti-ApoA-1 titers (β = 0.05, 95% CI: 0.006; 0.28, P = 0.049). In conclusion, the prevalence of anti-ApoA-1 IgG is raised in the MHD population and positively associated with dialysis vintage, a major determinant of cardiovascular outcome. Whether anti-ApoA-1 antibodies play a role in the pathophysiology of accelerated atherosclerosis in the MHD population merits further study.
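The positivity definition above (a value at or above the 97.5th percentile of a healthy blood-donor reference distribution) can be sketched numerically. The reference distribution, units, and sample values below are invented for illustration only.

```python
import numpy as np

# Hypothetical healthy-donor reference distribution (arbitrary assay units).
rng = np.random.default_rng(1)
healthy_donors = rng.normal(loc=0.3, scale=0.1, size=500)

# Positivity cutoff: the 97.5th percentile of the reference distribution.
cutoff = np.percentile(healthy_donors, 97.5)

# Illustrative patient sample (66 MHD patients, as in the study design,
# but with made-up values): fraction of patients at or above the cutoff.
patients = rng.normal(loc=0.35, scale=0.15, size=66)
positivity_rate = np.mean(patients >= cutoff)
```

By construction about 2.5% of healthy donors would be classified positive; any excess in the patient group, such as the 20% reported above, reflects a raised antibody burden relative to the reference population.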
Abstract:
BACKGROUND/AIMS: Treatment of chronic HCV infection has become a priority in HIV+ patients, given the faster progression to end-stage liver disease. The primary endpoint of this study was to evaluate and compare the antiviral efficacy of Peginterferon alpha 2a plus ribavirin in HIV-HCV co-infected and HCV mono-infected patients, and to examine whether 6 months of therapy would have the same efficacy in HIV patients with the favourable genotypes 2 and 3 as in mono-infected patients, so as to minimise HCV-therapy-related toxicities. Secondary endpoints were to evaluate predictors of sustained virological response (SVR) and the frequency of side-effects. METHODS: Patients with genotypes 1 and 4 were treated for 48 weeks with Pegasys 180 microg/week plus Copegus 1000-1200 mg/day according to body weight; patients with genotypes 2 and 3 were treated for 24 weeks with Pegasys 180 microg/week plus Copegus 800 mg/day. RESULTS: 132 patients were enrolled in the study: 85 HCV mono-infected (38 with genotypes 1 and 4; 47 with genotypes 2 and 3) and 47 HIV-HCV co-infected (23 with genotypes 1 and 4; 24 with genotypes 2 and 3). In an intention-to-treat analysis, SVR for genotypes 1 and 4 was observed in 58% of HCV mono-infected and in 13% of HIV-HCV co-infected patients (P = 0.001). For genotypes 2 and 3, SVR was observed in 70% of HCV mono-infected and in 67% of HIV-HCV co-infected patients (P = 0.973). Undetectable HCV-RNA at week 4 had a positive predictive value for SVR in mono-infected patients of 0.78 (95% CI: 0.54-0.93) for genotypes 1 and 4 and of 0.81 (95% CI: 0.64-0.92) for genotypes 2 and 3. For co-infected patients with genotypes 2 and 3, the positive predictive value for SVR of undetectable HCV-RNA at week 4 was 0.76 (95% CI: 0.50-0.93). The study was not completed by 22 patients (36%) with genotypes 1 and 4 and by 12 patients (17%) with genotypes 2 and 3. CONCLUSION: Genotypes 2 or 3 predict the likelihood of SVR in HCV mono-infected and in HIV-HCV co-infected patients.
A 6-month treatment with Peginterferon alpha 2a plus ribavirin has the same efficacy in HIV-HCV co-infected patients with genotypes 2 and 3 as in mono-infected patients. HCV-RNA negativity at 4 weeks has a positive predictive value for SVR. Aggressive treatment of adverse effects to avoid dose reduction, consent withdrawal, or drop-out is crucial to increase the rate of SVR, especially when the duration of treatment is 48 weeks. Sixty-one percent of HIV-HCV co-infected patients with genotypes 1 and 4 did not complete the study, compared with 4% of those with genotypes 2 and 3.
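The week-4 positive predictive values quoted above follow from simple count arithmetic: PPV = true positives / (true positives + false positives), where a "positive test" is undetectable HCV-RNA at week 4 and the outcome is SVR. The counts in the sketch below are hypothetical, chosen only to land near the 0.78 figure reported for mono-infected genotype 1/4 patients.

```python
def positive_predictive_value(tp: int, fp: int) -> float:
    """PPV of a test given true-positive and false-positive counts."""
    return tp / (tp + fp)

# Illustrative (hypothetical) counts: of 27 patients with undetectable
# HCV-RNA at week 4, 21 went on to achieve SVR.
ppv = positive_predictive_value(21, 6)   # 21 / 27 ≈ 0.78
```

The confidence intervals reported in the abstract widen as these counts shrink, which is why the genotype 1/4 interval (n smaller) is broader than the genotype 2/3 one.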
Abstract:
Introduction: Streptomycin, like other aminoglycosides, exhibits concentration-dependent bacterial killing but has a narrow therapeutic window. It is primarily eliminated unchanged by the kidneys. Data and dosing information to achieve a safe regimen in patients with chronic renal failure undergoing hemodialysis (HD) are scarce. Although the main adverse reactions are related to prolonged, elevated serum concentrations, the literature recommendation is to administer streptomycin after each HD. Patients and Methods: We report the case of a patient with end-stage renal failure, undergoing HD, who was successfully treated with streptomycin for gentamicin-resistant Enterococcus faecalis bacteremia with prosthetic arteriovenous fistula infection. Streptomycin 7.5 mg/kg was administered intravenously 3 hours before each dialysis (3 times a week) for 6 weeks, in combination with amoxicillin. Streptomycin plasma levels were monitored with repeated blood sampling before, after, and between HD sessions. A 2-compartment model was used to reconstruct the concentration-time profile over days on and off HD. Results: The streptomycin trough plasma concentration was 2.8 mg/L. It peaked at 21.4 mg/L 30 minutes after intravenous administration, decreased to 18.2 mg/L immediately before HD, and dropped to 4.5 mg/L at the end of a 4-hour HD session. The plasma level increased again to 5.7 mg/L 2 hours after the end of HD and was 2.8 mg/L 48 hours later, before the next administration and HD. The pharmacokinetics of streptomycin were best described by a 2-compartment model. The computer simulation fitted the observed concentrations during and between HD sessions fairly well. Redistribution between the 2 compartments after the end of HD reproduced the rebound of plasma concentrations after HD. No significant toxicity was observed during treatment. The outcome of the infection was favorable, and no sign of relapse was observed after a follow-up of 3 months.
Conclusion: Streptomycin administration of 7.5 mg/kg 3 hours before HD sessions in a patient with end-stage renal failure resulted in an effective and safe dosing regimen. Monitoring of plasma levels, together with pharmacokinetic simulation, documented the suitability of this dosing scheme, which should replace current dosage recommendations for streptomycin in HD.
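The post-dialysis rebound described above falls out naturally from a two-compartment model: drug sequestered in the peripheral compartment redistributes into the central (plasma) compartment once the session ends. The sketch below is a minimal Euler-integration illustration with made-up rate constants, not the fitted model from the case report.

```python
import numpy as np

def simulate(k12=0.3, k21=0.2, k_el=0.05, k_hd=0.8,
             dose=1.0, hd_start=3.0, hd_end=7.0, t_end=12.0, dt=0.001):
    """Euler integration of central (c) and peripheral (p) amounts over hours.
    Elimination from the central compartment is k_el normally and k_el + k_hd
    during the dialysis window [hd_start, hd_end). All constants are
    illustrative assumptions, not measured streptomycin parameters."""
    n = int(t_end / dt)
    t = np.linspace(0.0, t_end, n)
    c = np.empty(n)
    p = np.empty(n)
    c[0], p[0] = dose, 0.0                 # IV bolus into the central compartment
    for i in range(1, n):
        on_hd = hd_start <= t[i - 1] < hd_end
        elim = k_el + (k_hd if on_hd else 0.0)
        dc = -(k12 + elim) * c[i - 1] + k21 * p[i - 1]
        dp = k12 * c[i - 1] - k21 * p[i - 1]
        c[i] = c[i - 1] + dt * dc
        p[i] = p[i - 1] + dt * dp
    return t, c

t, c = simulate()
end_hd = np.searchsorted(t, 7.0)           # index at the end of the HD session
rebound = c[end_hd:].max() > c[end_hd]     # plasma level rises again after HD
```

Immediately after the session, k21·p exceeds (k12 + k_el)·c, so the plasma concentration climbs before resuming its slow decline, reproducing the 4.5 → 5.7 mg/L rebound reported in the case.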
Abstract:
Transcatheter aortic valve implantation is a feasible therapeutic option for selected patients with severe aortic stenosis and high or prohibitive risk for standard surgery. Lung transplant recipients are often considered high-risk patients for heart surgery because of their specific transplant-associated characteristics and comorbidities. We report a case of successful transfemoral transcatheter aortic valve replacement in a lung transplant recipient with a symptomatic severe aortic stenosis, severe left ventricular dysfunction, and end-stage renal failure 9 years after bilateral lung transplantation.
Abstract:
Summary: Cell therapy has emerged as a strategy for the treatment of various human diseases. Cells can be transplanted, on the basis of their morphological and functional properties, to repair tissue damage, as exemplified by blood transfusion and bone marrow or pancreatic islet cell transplantation. With the advent of gene therapy, cells have also been used as biological supports for the production of therapeutic molecules that can act either locally or at a distance. This strategy is the basis of ex vivo gene therapy, characterized by the removal of cells from an organism, their genetic modification, and their implantation into the same or another individual in a physiologically suitable location. The nature of the tissue or biological-function damage dictates the type of cells chosen for implantation and the function required of the implanted cells. The general aim of this work was to develop an ex vivo gene therapy approach for the secretion of erythropoietin (Epo) in patients suffering from Epo-responsive anemia, thus extending to humans studies previously performed with mouse cells transplanted into mice and rats. Considering the potential clinical application, allogeneic primary human cells were chosen for practical and safety reasons. In contrast to autologous cells, the use of allogeneic cells allows a cell lineage to be characterized once and then transplanted into many individuals. Furthermore, allogeneic cells avoid the potential risk of zoonosis encountered with xenogeneic cells. Accordingly, the immune reaction against this allogeneic source was prevented by cell macro-encapsulation, which blocks cell-to-cell contact with the host immune system and allows easy retrieval of the implanted device.
The first step consisted of testing the survival of various human primary cells that were encapsulated and implanted for one month in the subcutaneous tissue of immunocompetent and naturally or therapeutically immunosuppressed mice, on the assumption that xenogeneic applications constitute a stringent and representative screen before human transplantation. A fibroblast lineage from the foreskin of a young donor, DARC 3.1 cells, showed the highest mean survival score. We then performed studies to optimize the manufacturing procedures of the encapsulation device for successful engraftment. The development of calcifications on the polyvinyl alcohol (PVA) matrix serving as a scaffold for the cells enclosed in the hollow-fiber devices was observed after one month in vivo. Various parameters, including matrix rinsing solutions, batches of PVA, and cell lineages, were assessed for their respective roles in the development of this phenomenon. We observed that the calcifications could be totally prevented by using ultra-pure sterile water instead of phosphate-buffered saline in the rinsing procedure of the PVA matrix. Moreover, higher lactate dehydrogenase activity of the cells was found to decrease calcium deposition, owing to a more acidic microenvironment inhibiting calcium precipitation. After selection of the appropriate cell lineage and optimization of the encapsulation conditions, a retroviral approach was applied to DARC 3.1 fibroblasts for transduction of the human Epo cDNA. Various modifications of the retroviral vector and the infection conditions were made to obtain clinically relevant levels of human Epo. The insertion of a post-transcriptional regulatory element from the woodchuck hepatitis virus, as well as a Kozak consensus sequence, led to a 7.5-fold increase in transgene expression.
Human Epo production was further optimized by increasing the multiplicity of infection and by selecting high-producer cells, reaching 200 IU hEpo/10^6 cells/day. These modified cells were encapsulated and implanted in vivo under the same conditions as previously described. All the mouse strains showed a sustained increase in their hematocrit, and a high proportion of viable cells was observed after retrieval of the capsules. Finally, with a view to human application, a syngeneic model using encapsulated murine myoblasts transplanted into mice was established to investigate the roles of both the host immune response and the cells' metabolic requirements. Various loading densities, as well as anti-inflammatory and immunosuppressive drugs, were studied. The results showed that an immune process is responsible for cell death in capsules loaded at high cell density. A supporting matrix of PVA was shown to limit the cell density and to prevent early metabolic cell death, thereby preventing the immune reaction. This study has led to the development of encapsulated cells of human origin producing clinically relevant amounts of human Epo. This work also resulted in the optimization of the technical parameters of cell encapsulation, allowing a clinical application to begin in end-stage renal failure patients.