124 results for "Maintenance peritoneal dialysis"
Abstract:
BACKGROUND: Hyperhomocysteinaemia has been identified as an independent cardiovascular risk factor and is found in more than 85% of patients on maintenance haemodialysis. Previous studies have shown that folic acid can lower circulating homocysteine in dialysis patients. We prospectively evaluated the effect of increasing the folic acid dosage from 1 to 6 mg per dialysis on plasma total homocysteine levels in haemodialysis patients with and without a history of occlusive vascular disease (OVD). METHODS: Thirty-nine stable patients on high-flux dialysis were studied. Their mean age was 63 ± 11 years and 17 (43%) had a history of OVD: coronary, cerebral and/or peripheral occlusive disease. For several years prior to the study, the patients had received an oral post-dialysis multivitamin supplement including 1 mg of folic acid per dialysis. After baseline determinations, the folic acid dose was increased from 1 to 6 mg/dialysis for 3 months. RESULTS: After 3 months, plasma homocysteine had decreased significantly by approximately 23%, from 31.1 ± 12.7 to 24.5 ± 9 µmol/L (P = 0.0005), while folic acid concentrations had increased from 6.5 ± 2.5 to 14.4 ± 2.5 µg/L (P < 0.0001). However, the decrease in homocysteine differed markedly between patients with and without OVD. In patients with OVD, homocysteine decreased only marginally, by approximately 2.5% (from 29.0 ± 10.3 to 28.3 ± 8.4 µmol/L, P = 0.74), whereas in patients without OVD there was a significant reduction of approximately 34% (from 32.7 ± 14.4 to 21.6 ± 8.6 µmol/L, P = 0.0008). Plasma homocysteine levels were reduced by > 15% in three patients (18%) in the group with OVD compared with 19 (86%) in the group without OVD (P = 0.001), and by > 30% in none of the patients (0%) in the former group compared with 13 (59%) in the latter (P = 0.001).
CONCLUSIONS: These results indicate that the homocysteine-lowering effect of folic acid administration is less pronounced in haemodialysis patients with occlusive vascular disease than in those without evidence of such disease.
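The group-level reductions above follow directly from the before/after means. A quick illustrative check, using the abstract's group means (which give roughly 21% overall rather than the reported ~23%, presumably because the paper averaged per-patient changes):

```python
# Percent reduction in plasma homocysteine, recomputed from the group means
# quoted in the abstract (micromol/l). Illustrative only: the ~23% overall
# figure in the abstract was presumably averaged per patient, so the
# group-mean calculation below lands slightly lower.

def percent_reduction(before: float, after: float) -> float:
    """Percentage decrease from `before` to `after`."""
    return (before - after) / before * 100.0

overall = percent_reduction(31.1, 24.5)   # whole cohort
no_ovd = percent_reduction(32.7, 21.6)    # patients without OVD (~34%)
with_ovd = percent_reduction(29.0, 28.3)  # patients with OVD (~2.5%)

print(f"{overall:.1f}% {no_ovd:.1f}% {with_ovd:.1f}%")
```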
Abstract:
Crohn's disease (CD), a major form of human inflammatory bowel disease, is characterized by primary immunodeficiencies. The nuclear receptor peroxisome proliferator-activated receptor gamma (PPARgamma) is essential for intestinal homeostasis in response to both dietary- and microbiota-derived signals. Its role in host defense remains unknown, however. We show that PPARgamma functions as an antimicrobial factor by maintaining constitutive epithelial expression of a subset of beta-defensins in the colon, which includes mDefB10 in mice and DEFB1 in humans. Colonic mucosa of Ppargamma mutant animals shows defective killing of several major components of the intestinal microbiota, including Candida albicans, Bacteroides fragilis, Enterococcus faecalis, and Escherichia coli. Neutralization of the colicidal activity using an anti-mDefB10 blocking antibody was effective in a PPARgamma-dependent manner. A functional promoter variant that is required for DEFB1 expression confers strong protection against Crohn's colitis and ileocolitis (odds ratio, 0.559; P = 0.018). Consistently, colonic involvement in CD is specifically linked to reduced expression of DEFB1 independent of inflammation. These findings support the development of PPARgamma-targeting therapeutic and/or nutritional approaches to prevent colonic inflammation by restoring antimicrobial immunity in CD.
Abstract:
BACKGROUND: High-dose chemotherapy with autologous stem-cell transplantation is a standard treatment for young patients with multiple myeloma. Residual disease is almost always present after transplantation and is responsible for relapse. This phase 3, placebo-controlled trial investigated the efficacy of lenalidomide maintenance therapy after transplantation. METHODS: We randomly assigned 614 patients younger than 65 years of age who had nonprogressive disease after first-line transplantation to maintenance treatment with either lenalidomide (10 mg per day for the first 3 months, increased to 15 mg if tolerated) or placebo until relapse. The primary end point was progression-free survival. RESULTS: Lenalidomide maintenance therapy improved median progression-free survival (41 months, vs. 23 months with placebo; hazard ratio, 0.50; P<0.001). This benefit was observed across all patient subgroups, including those based on the β2-microglobulin level, cytogenetic profile, and response after transplantation. With a median follow-up period of 45 months, more than 70% of patients in both groups were alive at 4 years. The rates of grade 3 or 4 peripheral neuropathy were similar in the two groups. The incidence of second primary cancers was 3.1 per 100 patient-years in the lenalidomide group versus 1.2 per 100 patient-years in the placebo group (P=0.002). Median event-free survival (with events that included second primary cancers) was significantly improved with lenalidomide (40 months, vs. 23 months with placebo; P<0.001). CONCLUSIONS: Lenalidomide maintenance after transplantation significantly prolonged progression-free and event-free survival among patients with multiple myeloma. Four years after randomization, overall survival was similar in the two study groups. (Funded by the Programme Hospitalier de Recherche Clinique and others; ClinicalTrials.gov number, NCT00430365.)
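The second-primary-cancer comparison above is expressed per 100 patient-years. As a sketch of that arithmetic (the event counts and person-time below are hypothetical, chosen only to reproduce the reported rates, not the trial's raw data):

```python
# Incidence expressed per 100 patient-years, as in the abstract (3.1 vs 1.2
# for second primary cancers). The counts and follow-up below are invented
# to illustrate the calculation; they are not the trial's data.

def rate_per_100_patient_years(events: int, patient_years: float) -> float:
    return events / patient_years * 100.0

lenalidomide = rate_per_100_patient_years(31, 1000.0)  # ~3.1
placebo = rate_per_100_patient_years(12, 1000.0)       # ~1.2

# The reported rates imply roughly a 2.6-fold higher incidence:
rate_ratio = lenalidomide / placebo
print(f"{lenalidomide:.1f} vs {placebo:.1f}; ratio {rate_ratio:.2f}")
```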
Abstract:
Newer chemotherapeutic protocols as well as high-dose chemotherapy have increased the response rate in myeloma. However, these treatments are not curative. Effective maintenance strategies are now required to prolong the duration of response. We conducted a randomized trial of maintenance treatment with thalidomide and pamidronate. Two months after high-dose therapy, 597 patients younger than age 65 years were randomly assigned to receive no maintenance (arm A), pamidronate (arm B), or pamidronate plus thalidomide (arm C). A complete or very good partial response was achieved by 55% of patients in arm A, 57% in arm B, and 67% in arm C (P = .03). The 3-year postrandomization probability of event-free survival was 36% in arm A, 37% in arm B, and 52% in arm C (P < .009). The 4-year postdiagnosis probability of survival was 77% in arm A, 74% in arm B, and 87% in arm C (P < .04). The proportion of patients who had skeletal events was 24% in arm A, 21% in arm B, and 18% in arm C (P = .4). Thalidomide is an effective maintenance therapy in patients with multiple myeloma. Maintenance treatment with pamidronate does not decrease the incidence of bone events.
Abstract:
Serum-free aggregating brain cell cultures are free-floating three-dimensional primary cell cultures able to reconstitute spontaneously a histotypic brain architecture to reproduce critical steps of brain development and to reach a high level of structural and functional maturity. This culture system offers, therefore, a unique model for neurotoxicity testing both during the development and at advanced cellular differentiation, and the high number of aggregates available combined with the excellent reproducibility of the cultures facilitates routine test procedures. This chapter presents a detailed description of the preparation, maintenance, and use of these cultures for neurotoxicity studies and a comparison of the developmental characteristics between cultures derived from the telencephalon and cultures derived from the whole brain. For culture preparation, mechanically dissociated embryonic brain tissue is used. The initial cell suspension, composed of neural stem cells, neural progenitor cells, immature postmitotic neurons, glioblasts, and microglial cells, is kept in a serum-free, chemically defined medium under continuous gyratory agitation. Spherical aggregates form spontaneously and are maintained in suspension culture for several weeks. Within the aggregates, the cells rearrange and mature, reproducing critical morphogenic events, such as migration, proliferation, differentiation, synaptogenesis, and myelination. For experimentation, replicate cultures are prepared by the randomization of aggregates from several original flasks. The high yield and reproducibility of the cultures enable multiparametric endpoint analyses, including "omics" approaches.
Abstract:
Psychotic patients do not access psychiatric care easily. First, psychotic disorders are difficult to identify among a great number of non-psychotic depressive and anxiety disorders. Second, inpatient stays have shortened and now focus on acute care rather than long-term care. For some psychotic patients, deinstitutionalization means exclusion and marginalization. Intensive case management can answer these needs in collaboration with relatives and professionals in the patient's social network. The results and steps of care of intensive case management as practised in Lausanne are described and illustrated with case vignettes.
Abstract:
BACKGROUND: The objective of this study was to perform a cost-effectiveness analysis comparing intermittent with continuous renal replacement therapy (IRRT versus CRRT) as initial therapy for acute kidney injury (AKI) in the intensive care unit (ICU). METHODS: Assuming some patients would potentially be eligible for either modality, we modeled life years gained, quality-adjusted life years (QALYs) and healthcare costs for a cohort of 1000 IRRT patients and a cohort of 1000 CRRT patients. We used 1-year, 5-year and lifetime horizons. A Markov model with two health states for AKI survivors was designed: dialysis dependence and dialysis independence. We applied Weibull regression to published estimates to fit survival curves for CRRT and IRRT patients and to fit the proportion of dialysis dependence among CRRT and IRRT survivors. We then applied a risk ratio reported in a large retrospective cohort study to the fitted CRRT estimates in order to determine the proportion of dialysis dependence among IRRT survivors. We conducted sensitivity analyses over a range of differences in daily implementation cost between CRRT and IRRT (base case: a CRRT day $632 more expensive than an IRRT day; range $200 to $1000) and a range of risk ratios for dialysis dependence for CRRT as compared with IRRT (0.65 to 0.95; base case: 0.80). RESULTS: Continuous renal replacement therapy was associated with a marginally greater gain in QALYs than IRRT (1.093 versus 1.078). Despite higher upfront costs for CRRT in the ICU ($4046 for CRRT versus $1423 for IRRT on average), the 5-year total cost including the cost of dialysis dependence was lower for CRRT ($37 780 for CRRT versus $39 448 for IRRT on average). The base-case incremental cost-effectiveness analysis showed that CRRT dominated IRRT. This dominance was confirmed by extensive sensitivity analysis.
CONCLUSIONS: Initial CRRT is cost-effective compared with initial IRRT by reducing the rate of long-term dialysis dependence among critically ill AKI survivors.
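The METHODS describe a two-state structure (dialysis dependence versus independence) in which the CRRT dependence proportion is fitted and the IRRT proportion is derived via a risk ratio. A minimal sketch of that logic, with placeholder probabilities and dialysis costs (not the study's fitted Weibull estimates; only the upfront costs are taken from the abstract):

```python
# Two-state sketch of the cost comparison: each AKI survivor is either
# dialysis-dependent or independent. CRRT's dependence proportion is the
# fitted quantity; IRRT's is derived via the base-case risk ratio of 0.80
# (CRRT vs IRRT). All numbers except the upfront costs quoted in the
# abstract are placeholders, not the study's estimates.

def expected_cost(upfront: float, p_dependent: float,
                  annual_dialysis_cost: float, years: int) -> float:
    """Upfront ICU cost plus expected long-term dialysis cost."""
    return upfront + p_dependent * annual_dialysis_cost * years

P_DEP_CRRT = 0.20               # placeholder fitted proportion for CRRT
P_DEP_IRRT = P_DEP_CRRT / 0.80  # risk ratio 0.80 applied to get IRRT

cost_crrt = expected_cost(4046.0, P_DEP_CRRT, 70_000.0, 5)
cost_irrt = expected_cost(1423.0, P_DEP_IRRT, 70_000.0, 5)

# CRRT's higher upfront cost can be outweighed by less long-term dialysis:
print(cost_crrt, cost_irrt, cost_crrt < cost_irrt)
```

With these placeholder inputs the structure reproduces the study's qualitative finding: the modality with the higher upfront cost ends up cheaper over the horizon once the cost of dialysis dependence is included.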
Abstract:
Here we discuss life-history evolution from the perspective of adaptive phenotypic plasticity, with a focus on polyphenisms for somatic maintenance and survival. Polyphenisms are adaptive discrete alternative phenotypes that develop in response to changes in the environment. We suggest that dauer larval diapause and its associated adult phenotypes in the nematode (Caenorhabditis elegans), reproductive dormancy in the fruit fly (Drosophila melanogaster) and other insects, and the worker castes of the honey bee (Apis mellifera) are examples of what may be viewed as the polyphenic regulation of somatic maintenance and survival. In these and other cases, the same genotype can, depending upon its environment, express either of two alternative sets of life-history phenotypes that differ markedly with respect to somatic maintenance, survival ability, and thus life span. This plastic modulation of somatic maintenance and survival has traditionally been underappreciated by researchers working on aging and life history. We review the current evidence for such adaptive life-history switches and their molecular regulation and suggest that they are caused by temporally and/or spatially varying, stressful environments that impose diversifying selection, thereby favoring the evolution of plasticity of somatic maintenance and survival under strong regulatory control. By considering somatic maintenance and survivorship from the perspective of adaptive life-history switches, we may gain novel insights into the mechanisms and evolution of aging.
Abstract:
Summary: Gynodioecy, the joint occurrence of females and hermaphrodites within natural populations, has been a widely studied mating system ever since Darwin (1877). It is an exceptional mating system because continuous selection is necessary to maintain it. Since females reproduce only through ovules, whereas hermaphrodites transmit genes through both ovules and pollen, higher female fitness, in terms of seed output, is required to allow their maintenance. Two non-exclusive mechanisms can account for the maintenance of females. First, as females do not produce pollen, they can reallocate their resources towards higher ovule production. Second, hermaphrodites can self- and cross-fertilize, whereas females are obligate outcrossers. Hermaphrodites should therefore partly suffer from inbreeding depression (i.e. the fitness decline of inbred relative to outbred individuals) and thereby produce less fit progeny than females. This thesis investigated the effects of self- and cross-fertilization of hermaphrodites over two consecutive generations. Inbreeding depression increased across the successive stages of the life cycle (i.e. from seed traits to reproductive traits), with large estimates (up to 0.76). This investigation not only yielded large inbreeding depression estimates but also uncovered mechanisms involved in the maintenance of inbreeding depression: cryptic self-incompatibility (here, a greater in vivo performance of outcross pollen relative to self-pollen), the expression of inbreeding depression especially in late life-cycle stages, and the appearance of females in the progeny of selfed hermaphrodites. The female-biased sex ratio in the progeny of selfed hermaphrodites was a surprising result and could arise either from the sex-determining mechanism (a complex nucleo-cytoplasmic interaction) and/or from inbreeding depression.
Indeed, we obtained not only females and hermaphrodites but also partial male-sterile (PMS) individuals (i.e. individuals with differing numbers of viable stamens). We found that inbred pollen-bearing plants (excluding females) have fewer viable stamens per flower than outbred plants. A positive correlation was detected between inbreeding depression for the number of viable stamens per flower and the difference in sex ratio between inbred and outbred individuals. A positive relationship was also detected between inbreeding depression for pollen viability and inbreeding depression for the number of viable stamens per flower. Each correlation can reflect either pleiotropy (a major gene acting on the two traits considered) or linkage disequilibrium between genes controlling each of the two related traits. If we hypothesize that these correlations are due to a major gene with pleiotropic effects, the positive relationship between inbreeding depression for the number of viable stamens per flower and inbreeding depression for pollen viability indicates that deleterious alleles at a major gene governing pollen production and viability depressed male fitness in inbred plants. The positive relationship between the sex-ratio difference between inbred and outbred individuals and inbreeding depression for the number of viable stamens per flower indicates that (1) the number of viable stamens per flower is, in addition to inbreeding, also affected by the loci coding for sex determination; or (2) the presence of females within the progeny of selfed hermaphrodites is a consequence of large inbreeding depression inhibiting pollen production; or (3) sex is determined here by a combination of loci coding for sex expression and inbreeding depression for male reproductive traits. In conclusion, Silene vulgaris has been shown to be a good model for understanding the evolution of mating systems that promote outbreeding.
Résumé (translated): Gynodioecy is defined as the simultaneous presence of hermaphrodites and females within natural populations of the same species. This mating system has fascinated scientists ever since Darwin, as his writings (1876, 1877) on plant reproductive systems attest. Females transmit their genes only through their ovules, whereas hermaphrodites transmit theirs through both the male pathway (pollen) and the female pathway (ovules). For gynodioecy to be maintained, fitness through the female function must therefore be higher in females than in hermaphrodites. Two mutually non-exclusive mechanisms can explain the maintenance of females in gynodioecious populations. First, females can reallocate the resources not used for pollen production and consequently produce more ovules. Second, females can reproduce only by outcrossing, whereas hermaphrodites can reproduce by both self- and cross-fertilization. Self-fertilization is generally accompanied by a fitness decline of the progeny relative to outcrossed offspring, a phenomenon known as inbreeding depression. This thesis aimed to detect possible inbreeding depression in Silene vulgaris, a gynodioecious species. Hermaphrodites from three alpine valleys were self- and cross-fertilized over two successive generations. Because inbreeding depression can be expressed at any stage of an individual's life, several fitness traits, from the number of seeds per fruit to gamete production, were measured across successive life stages.
Estimates of total inbreeding depression ranged from 0.52 to 0.76 depending on the valley, which suggests that hermaphrodites benefit from limiting self-fertilization and that females should have little difficulty persisting in the valleys studied. Mechanisms that reduce the potential purging of the genetic load, and thus maintain the level of inbreeding depression and consequently gynodioecy, were also identified. Indeed, our results show that inbreeding depression was expressed late in the life cycle, allowing some inbred individuals to transmit their deleterious alleles to the next generation. Moreover, self-pollen tubes grew more slowly in vivo than outcross pollen tubes, so under direct competition ovules should mostly be fertilized by outcross pollen, reducing the chances of purging deleterious alleles. Finally, the appearance of females in the progeny of selfed hermaphrodites also reduces the chances of purging deleterious alleles. We could not determine whether the appearance of females in the progeny of selfed hermaphrodites was due to the genetic determination of sex, or whether the difference in sex ratio between selfed and outcrossed progeny was due to inbreeding depression inhibiting pollen production. We also observed that S. vulgaris comprises not only hermaphrodites and females but also a whole range of intermediate individuals with variable numbers of viable stamens.
We detected positive correlations between (1) the difference in sex ratio (the proportion of pollen-bearing individuals) between inbred and outbred individuals and an estimate of inbreeding depression for the number of viable stamens in pollen-bearing individuals, and (2) inbreeding depression for the number of viable stamens and that estimated for pollen viability. Each correlation indicates either the effect of one (or several) pleiotropic gene(s) or linkage disequilibrium between genes. Assuming these correlations result from pleiotropic effects, the relationship between the number of viable stamens per flower and pollen viability would indicate a negative effect of inbreeding on pollen production and viability, partly due to a major gene. The second correlation would indicate either that the genes responsible for sex determination also act on the expression of the male function, or that sex expression is subject to inbreeding depression, or a combination of both. In light of these results, Silene vulgaris proved to be a good model for understanding the evolution of mating systems towards the separation of the sexes.
Abstract:
Growing evidence suggests that the patient's immune response may play a major role in the long-term efficacy of antibody therapies for follicular lymphoma (FL). Particularly long-lasting recurrence-free survival has been observed after first-line, single-agent rituximab or after radioimmunotherapy (RIT). Rituximab maintenance, furthermore, is highly effective in prolonging recurrence-free survival after chemotherapy. On the other hand, RIT as a single-step treatment has shown a remarkable capacity to induce complete and partial remissions when applied at recurrence, as initial treatment of FL, or for consolidation. These clinical results strongly suggest that RIT combined with rituximab maintenance could stabilize the high percentages of patients with CR and PR induced by RIT. While the precise mechanisms underlying the long-term efficacy of these 2 treatments have not been elucidated, several observations suggest that the patient's T-cell immune response could be decisive. In this review, we discuss the potential role of the patient's immune system under rituximab and RIT and argue that T-cell immunity might be particularly promoted when the 2 antibody treatments are combined in the early therapy of FL.
Abstract:
BACKGROUND: Guidelines for the management of anaemia in patients with chronic kidney disease (CKD) recommend a minimal haemoglobin (Hb) target of 11 g/dL. Recent surveys indicate that this requirement is not met in many patients in Europe. In most studies, Hb is only assessed over a short-term period. The aim of this study was to examine the control of anaemia over a continuous long-term period in Switzerland. METHODS: A prospective multi-centre observational study was conducted in dialysed patients treated with recombinant human epoetin (EPO) beta over a one-year follow-up period, with monthly assessments of anaemia parameters. RESULTS: Three hundred and fifty patients from 27 centres, representing 14% of the dialysis population in Switzerland, were included. Mean Hb was 11.9 ± 1.0 g/dL and remained stable over time. Eighty-five percent of the patients achieved a mean Hb ≥ 11 g/dL. The mean EPO dose was 155 ± 118 IU/kg/week, delivered mostly by the subcutaneous route (64-71%). Mean serum ferritin and transferrin saturation were 435 ± 253 µg/L and 30 ± 11%, respectively. At month 12, adequate iron stores were found in 72.5% of patients, whereas absolute and functional iron deficiencies were observed in only 5.1% and 17.8%, respectively. Multivariate analysis showed that diabetes unexpectedly influenced Hb towards higher levels (12.1 ± 0.9 g/dL; p = 0.02). One-year survival was significantly higher in patients with Hb ≥ 11 g/dL than in those with Hb < 11 g/dL (19.7% vs 7.3%, p = 0.006). CONCLUSION: In comparison with European reference studies, this survey shows remarkable and continuous control of anaemia in Swiss dialysis centres. These results were achieved through moderately high EPO doses, mostly given subcutaneously, and careful iron therapy management.
Abstract:
BACKGROUND: Exposure to particles (PM) induces adverse health effects (cancer, cardiovascular and pulmonary diseases). A key role in these adverse effects seems to be played by oxidative stress, an excess of reactive oxygen species relative to the amount of reducing species (including antioxidants), the first line of defense against reactive oxygen species. The aim of this study was to document the oxidative stress caused by exposure to respirable particles in vivo, and to test whether exposed workers presented changes in their urinary levels of reducing species. METHODS: Bus depot workers (n = 32) exposed to particles and pollutants (respirable PM4, organic and elemental carbon, particulate metal content, polycyclic aromatic hydrocarbons, NOx, O3) were surveyed over two consecutive days. We collected urine samples before and after each shift, and quantified an oxidative stress biomarker (8-hydroxy-2'-deoxyguanosine, 8OHdG), the reducing capacity, and a biomarker of PAH exposure (1-hydroxypyrene). We used a linear mixed model to test for associations between the workers' oxidative stress status and their particle exposure, as well as their urinary level of reducing species. RESULTS: Workers were exposed to low levels of respirable PM4 (range 25-71 μg/m3). However, urinary levels of 8OHdG increased significantly within each shift and between both days for non-smokers. The between-day increase was significantly correlated (p < 0.001) with the concentrations of organic carbon, NOx, and the particulate copper content. The within-shift increase in 8OHdG was highly correlated with an increase in the urinary reducing capacity (Spearman ρ = 0.59, p < 0.0001). CONCLUSIONS: These findings confirm that exposure to components associated with respirable particulate matter causes systemic oxidative stress, as measured by urinary 8OHdG. The strong association observed between urinary 8OHdG and the reducing capacity is suggestive of protective or other mechanisms, including circadian effects. Additional investigations should be performed to understand these observations.
Abstract:
How have changes in communications technology affected the way that misinformation spreads through a population and persists? To what extent do differences in the architecture of social networks affect the spread of misinformation, relative to the rates and rules by which individuals transmit or eliminate different pieces of information (cultural traits)? Here, we use analytical models and individual-based simulations to study how a 'cultural load' of misinformation can be maintained in a population under a balance between social transmission and selective elimination of cultural traits with low intrinsic value. While considerable research has explored how network architecture affects percolation processes, we find that the relative rates at which individuals transmit or eliminate traits can have much more profound impacts on the cultural load than differences in network architecture. In particular, the cultural load is insensitive to correlations between an individual's network degree and rate of elimination when these quantities vary among individuals. Taken together, these results suggest that changes in communications technology may have influenced cultural evolution more strongly through changes in the amount of information flow, rather than the details of who is connected to whom.
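The transmission-versus-elimination balance described above can be sketched with a toy well-mixed model (not the paper's model, which also varies network architecture): naive individuals copy a low-value trait from random contacts at rate beta, and carriers discard it at rate mu, so the load settles near 1 - mu/beta when beta > mu.

```python
import random

# Toy, well-mixed sketch of a "cultural load" maintained by a balance
# between social transmission (rate beta) and selective elimination
# (rate mu). Not the paper's model; these SIS-style dynamics settle
# near an equilibrium load of 1 - mu/beta when beta > mu.

def cultural_load(beta: float, mu: float, n: int = 2000,
                  events: int = 60_000, seed: int = 1) -> float:
    rng = random.Random(seed)
    has_trait = [True] * (n // 2) + [False] * (n - n // 2)
    for _ in range(events):
        i = rng.randrange(n)
        if has_trait[i]:
            if rng.random() < mu:        # selective elimination
                has_trait[i] = False
        else:
            j = rng.randrange(n)         # random social contact
            if has_trait[j] and rng.random() < beta:
                has_trait[i] = True      # social transmission
    return sum(has_trait) / n

# Doubling the elimination rate cuts the equilibrium load sharply
# (from about 0.75 toward about 0.50 for beta = 0.8), illustrating why
# transmission/elimination rates can dominate network details.
print(cultural_load(beta=0.8, mu=0.2), cultural_load(beta=0.8, mu=0.4))
```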