92 results for STABLE PATIENTS

at Université de Lausanne, Switzerland


Relevance: 80.00%

Abstract:

Background/Purpose: The primary treatment goals for gouty arthritis (GA) are rapid relief of pain and inflammation during acute attacks, and long-term hyperuricemia management. A post-hoc analysis of 2 pivotal trials was performed to assess the efficacy and safety of canakinumab (CAN), a fully human monoclonal anti-IL-1β antibody, vs triamcinolone acetonide (TA) in GA patients unable to use NSAIDs and colchicine, and who were on stable urate-lowering therapy (ULT) or unable to use ULT. Methods: In these 12-week, randomized, multicenter, double-blind, double-dummy, active-controlled studies (β-RELIEVED and β-RELIEVED II), patients had to have frequent attacks (≥3 attacks in the previous year) meeting the 1977 preliminary ACR criteria for GA, and were unresponsive, intolerant, or contraindicated to NSAIDs and/or colchicine, and, if on ULT, ULT was stable. Patients were randomized during an acute attack to a single dose of CAN 150 mg s.c. or TA 40 mg i.m. and were re-dosed "on demand" for each new attack. Patients completing the core studies were enrolled into blinded 12-week extension studies to further investigate on-demand use of CAN vs TA for new attacks. The subpopulation selected for this post-hoc analysis was (a) unable to use NSAIDs and colchicine due to contraindication, intolerance or lack of efficacy of these drugs, and (b) currently on ULT, or with contraindication to or previous failure of ULT, as determined by the investigators. The subpopulation comprised 101 patients (51 CAN; 50 TA) out of 454 total. Results: Several co-morbidities, including hypertension (56%), obesity (56%), diabetes (18%), and ischemic heart disease (13%), were reported in 90% of this subpopulation. Pain intensity (VAS 100 mm scale) was comparable between the CAN and TA treatment groups at baseline (least-squares [LS] mean 74.6 and 74.4 mm, respectively). A significantly lower pain score was reported with CAN vs TA at 72 hours post dose (1st co-primary endpoint on baseline flare; LS mean, 23.5 vs 33.6 mm; difference −10.2 mm; 95% CI, −19.9 to −0.4; P=0.0208 [1-sided]). CAN significantly reduced the risk of a first new attack by 61% vs TA (HR 0.39; 95% CI, 0.17-0.91; P=0.0151 [1-sided]) over the first 12 weeks (2nd co-primary endpoint), and by 61% vs TA (HR 0.39; 95% CI, 0.19-0.79; P=0.0047 [1-sided]) over 24 weeks. Serum urate levels increased for CAN vs TA, with the mean change from baseline reaching a maximum of +0.7 ± 2.0 vs +0.1 ± 1.8 mg/dL at 8 weeks, and +0.3 ± 2.0 vs +0.2 ± 1.4 mg/dL at end of study (all patients had a GA attack at baseline). Adverse events (AEs) were reported in 33 (66%) CAN and 24 (47.1%) TA patients. Infections and infestations were the most common AEs, reported in 10 (20%) and 5 (10%) patients treated with CAN and TA, respectively. The incidence of serious AEs was comparable between the CAN (gastritis, gastroenteritis, chronic renal failure) and TA (aortic valve incompetence, cardiomyopathy, aortic stenosis, diarrhoea, nausea, vomiting, bicuspid aortic valve) groups (2 [4.0%] vs 2 [3.9%]). Conclusion: CAN provided superior pain relief and reduced the risk of new attacks in highly comorbid GA patients unable to use NSAIDs and colchicine, and who were currently on stable ULT or unable to use ULT. The safety profile in this post-hoc subpopulation was consistent with the overall β-RELIEVED and β-RELIEVED II population.
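
As a reading aid (not part of the original abstract): the reported 61% figures correspond to one minus the hazard ratio, i.e. the reduction in the hazard of a new attack.

RRR = 1 − HR = 1 − 0.39 = 0.61, i.e. a 61% reduction, for both the 12-week (95% CI 0.17-0.91) and 24-week (95% CI 0.19-0.79) estimates.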

Relevance: 70.00%

Abstract:

OBJECTIVES: The role of beta-blockers in the treatment of hypertension remains controversial, and the data showing a clear benefit in acute coronary syndromes (ACS) were obtained in the thrombolysis era. The goal of this study was to analyze the role of pretreatment with beta-blockers in patients with ACS. METHODS: Using data from the Acute Myocardial Infarction in Switzerland (AMIS Plus) registry, we analyzed outcomes of patients with beta-blocker pretreatment in whom beta-blockers were continued during hospitalization (group A), those without beta-blocker pretreatment but with administration after admission (group B), and those who never received them (group C). Major adverse cardiac events, defined as the composite endpoint of re-infarction and stroke (during hospitalization) and/or in-hospital death, were compared between the groups. RESULTS: A total of 24,709 patients were included in the study (6,234 in group A, 12,344 in group B, 6,131 in group C). Patients in group B were younger than patients in groups A and C (62.5, 67.6 and 68.4 years, respectively). In the multivariate analysis, the odds ratio for major adverse cardiac events was 0.59 (CI 0.47-0.74) for group A and 0.66 (CI 0.55-0.83) for group B, with group C taken as the reference. CONCLUSIONS: Beta-blocker therapy is beneficial in ACS; it should be started in patients who are not pretreated and continued in stable patients who had been on chronic beta-blocker therapy before admission.
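
A minimal sketch of how such an adjusted comparison can be set up, assuming a patient-level table with a three-level group variable and group C as the reference of a logistic model; the synthetic data, column names and covariates are illustrative assumptions, not the AMIS Plus analysis.

# Illustrative sketch: adjusted odds ratios for MACE with group C as reference.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 3000
df = pd.DataFrame({
    "bb_group": rng.choice(["A", "B", "C"], size=n),   # hypothetical group labels
    "age": rng.normal(66, 12, size=n),
})
# Simulated MACE probabilities, lower in groups A and B (illustration only).
base = {"A": 0.06, "B": 0.07, "C": 0.10}
p = df["bb_group"].map(base) + 0.001 * (df["age"] - 66)
df["mace"] = (rng.random(n) < p.clip(0.01, 0.9)).astype(int)

model = smf.logit("mace ~ C(bb_group, Treatment(reference='C')) + age", data=df).fit(disp=0)
print(np.exp(model.params))       # adjusted odds ratios vs group C
print(np.exp(model.conf_int()))   # 95% confidence intervals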

Relevance: 70.00%

Abstract:

Background: In haemodynamically stable patients with acute symptomatic pulmonary embolism (PE), studies have not evaluated the usefulness of combining the measurement of cardiac troponin, transthoracic echocardiography (TTE), and lower extremity complete compression ultrasound (CCUS) for predicting the risk of PE-related death. Methods: The study assessed the ability of three diagnostic tests (cardiac troponin I (cTnI), echocardiography, and CCUS) to prognosticate the primary outcome of PE-related mortality during 30 days of follow-up after a diagnosis of PE by objective testing. Results: Of 591 normotensive patients diagnosed with PE, the primary outcome occurred in 37 patients (6.3%; 95% CI 4.3% to 8.2%). Patients with right ventricular dysfunction (RVD) on TTE and concomitant deep vein thrombosis (DVT) on CCUS had a PE-related mortality of 19.6%, compared with 17.1% for patients with elevated cTnI and concomitant DVT and 15.2% for patients with elevated cTnI and RVD. Any two-test strategy had a higher specificity and positive predictive value than any single test. A combined three-test strategy did not further improve prognostication. In a subgroup analysis of high-risk patients according to the pulmonary embolism severity index (classes IV and V), the positive predictive values of the same two-test strategies for PE-related mortality were 25.0%, 24.4% and 20.7%, respectively. Conclusions: In haemodynamically stable patients with acute symptomatic PE, a combination of echocardiography (or troponin testing) and CCUS improved prognostication compared with any single test for the identification of those at high risk of PE-related death.
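
A small illustration of why requiring two positive tests raises specificity and positive predictive value: the simulated event rates are loosely anchored to the abstract (n = 591, ~6.3% mortality), and everything else is an assumption, not the study data.

# Sketch: specificity and PPV of a "both tests positive" strategy vs a single test.
import numpy as np

rng = np.random.default_rng(0)
n = 591                            # cohort size reported in the abstract
death = rng.random(n) < 0.063      # simulated 30-day PE-related mortality (~6.3%)
rvd = rng.random(n) < 0.35         # simulated RVD on echocardiography
dvt = rng.random(n) < 0.45         # simulated concomitant DVT on CCUS

def specificity_ppv(test, outcome):
    tp = np.sum(test & outcome)
    fp = np.sum(test & ~outcome)
    tn = np.sum(~test & ~outcome)
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp) if (tp + fp) else float("nan")
    return specificity, ppv

print("RVD alone:   spec=%.2f ppv=%.2f" % specificity_ppv(rvd, death))
print("RVD and DVT: spec=%.2f ppv=%.2f" % specificity_ppv(rvd & dvt, death))
# Requiring both tests to be positive shrinks the test-positive group, which
# raises specificity and PPV at the cost of sensitivity, mirroring the abstract.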

Relevance: 70.00%

Abstract:

Objective: Recovery-oriented care for patients with schizophrenia involves consideration of cultural issues, such as religion and spirituality. However, there is evidence that psychiatrists rarely address such topics. This study examined acceptance of a spiritual assessment by patients and clinicians, suggestions for treatment that arose from the assessment, and patient outcomes, in terms of treatment compliance and satisfaction with care (as measured by treatment alliance). Methods: Outpatients with psychosis were randomly assigned to two groups: an intervention group that received traditional treatment and a religious and spiritual assessment (N=40) and a control group that received only traditional treatment (N=38). Eight psychiatrists were trained to administer the assessment to their established and stable patients. After each administration, the psychiatrist attended a supervision session with a psychiatrist and a psychologist of religion. Baseline and three-month data were collected. Results: The spiritual assessment was well accepted by patients. During supervision, psychiatrists reported potential clinical uses for the assessment information for 67% of patients. No between-group differences in medication adherence and satisfaction with care were found at three months, although patients in the intervention group had significantly better appointment attendance during the follow-up period. Their interest in discussing religion and spirituality with their psychiatrists remained high. The process was not as well accepted by psychiatrists. Conclusions: Spiritual assessment can raise important clinical issues in the treatment of patients with chronic schizophrenia. Cultural factors, such as religion and spirituality, should be considered early in clinical training, because many clinicians are not at ease addressing such topics with patients.

Relevance: 70.00%

Abstract:

This part of the EFISG guidelines focuses on non-neutropenic adult patients. Only a few of the numerous recommendations can be summarized in the abstract. Prophylactic use of fluconazole is supported in patients with recent abdominal surgery and recurrent gastrointestinal perforations or anastomotic leakages. Candida isolation from respiratory secretions alone should never prompt treatment. For the targeted initial treatment of candidaemia, echinocandins are strongly recommended, while liposomal amphotericin B and voriconazole are supported with moderate, and fluconazole with marginal, strength. Treatment duration for candidaemia should be a minimum of 14 days after the end of candidaemia, which can be determined by one blood culture per day until negativity. Switching to oral treatment after 10 days of intravenous therapy has been safe in stable patients with susceptible Candida species. In candidaemia, removal of indwelling catheters is strongly recommended. If catheters cannot be removed, lipid-based amphotericin B or echinocandins should be preferred over azoles. Transoesophageal echocardiography and fundoscopy should be performed to detect organ involvement. Native valve endocarditis requires surgery within a week, while in prosthetic valve endocarditis earlier surgery may be beneficial. The antifungal regimen of choice is liposomal amphotericin B ± flucytosine. In ocular candidiasis, liposomal amphotericin B ± flucytosine is recommended when the susceptibility of the isolate is unknown; in susceptible isolates, fluconazole and voriconazole are alternatives. Amphotericin B deoxycholate is not recommended for any indication due to severe side effects.

Relevance: 70.00%

Abstract:

BACKGROUND: Hyperhomocysteinaemia has been identified as an independent cardiovascular risk factor and is found in more than 85% of patients on maintenance haemodialysis. Previous studies have shown that folic acid can lower circulating homocysteine in dialysis patients. We evaluated prospectively the effect of increasing the folic acid dosage from 1 to 6 mg per dialysis on plasma total homocysteine levels of haemodialysis patients with and without a history of occlusive vascular artery disease (OVD). METHODS: Thirty-nine stable patients on high-flux dialysis were studied. Their mean age was 63 ± 11 years and 17 (43%) had a history of OVD, either coronary and/or cerebral and/or peripheral occlusive disease. For several years prior to the study, the patients had received an oral post-dialysis multivitamin supplement including 1 mg of folic acid per dialysis. After baseline determinations, the folic acid dose was increased from 1 to 6 mg/dialysis for 3 months. RESULTS: After 3 months, plasma homocysteine had decreased significantly by approximately 23%, from 31.1 ± 12.7 to 24.5 ± 9 micromol/l (P = 0.0005), while folic acid concentrations had increased from 6.5 ± 2.5 to 14.4 ± 2.5 microg/l (P < 0.0001). However, the decrease of homocysteine was quite different in patients with and in those without OVD. In patients with OVD, homocysteine decreased only marginally, by approximately 2.5% (from 29.0 ± 10.3 to 28.3 ± 8.4 micromol/l, P = 0.74), whereas in patients without OVD there was a significant reduction of approximately 34% (from 32.7 ± 14.4 to 21.6 ± 8.6 micromol/l, P = 0.0008). Plasma homocysteine levels were reduced by > 15% in three patients (18%) in the group with OVD compared with 19 (86%) in the group without OVD (P = 0.001), and by > 30% in none of the patients (0%) in the former group compared with 13 (59%) in the latter (P = 0.001). CONCLUSIONS: These results indicate that folic acid administration appears to be less effective at lowering homocysteine in haemodialysis patients with occlusive vascular disease than in those without evidence of such disease.
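
For clarity, a short sketch of the responder definitions used above (>15% and >30% relative reduction in plasma homocysteine); the per-patient values are hypothetical, not study data.

# Sketch: per-patient relative homocysteine reduction and responder classification.
import numpy as np

baseline = np.array([31.1, 28.4, 35.0, 26.9])   # micromol/l, illustrative values
month3 = np.array([24.5, 27.8, 22.1, 25.0])     # micromol/l after 6 mg folic acid/dialysis

relative_reduction = (baseline - month3) / baseline * 100.0
print(np.round(relative_reduction, 1))          # % reduction per patient

print(f">15% responders: {np.sum(relative_reduction > 15)}/{len(baseline)}")
print(f">30% responders: {np.sum(relative_reduction > 30)}/{len(baseline)}")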

Relevance: 70.00%

Abstract:

Purpose: The accurate estimation of total energy expenditure (TEE) is essential to allow the provision of nutritional requirements in patients treated by maintenance hemodialysis (MHD). The measurement of TEE and resting energy expenditure (REE) by direct or indirect calorimetry and doubly labeled water is complicated, time-consuming and cumbersome in this population. Recently, a new system called the SenseWear® armband (SWA) was developed to assess TEE, physical activity and REE. The device works by measuring body acceleration along two axes, heat production and step counts. REE measured by indirect calorimetry and by SWA are well correlated. The aim of this study was to determine TEE, physical activity and REE in patients on MHD using this new device. Methods and materials: Daily TEE, REE, step count, activity time, intensity of activity and lying time were determined for 7 consecutive days in unselected stable patients on MHD and in sex-, age- and weight-matched healthy controls (HC). Patients with malnutrition, cancer, use of immunosuppressive drugs, hypoalbuminemia <35 g/L and those hospitalized in the last 3 months were excluded. For MHD patients, separate analyses were conducted for dialysis and non-dialysis days. Relevant parameters known to affect REE, such as BMI, albumin, pre-albumin, hemoglobin, Kt/V, CRP, bicarbonate, PTH and TSH, were recorded. Results: Thirty patients on MHD and 30 HC were included. Among MHD patients there were 20 men and 10 women; age was 60.13 ± 14.97 years (mean ± SD), BMI was 25.77 ± 4.73 kg/m² and body weight was 74.65 ± 16.16 kg. There were no significant differences between the two groups. TEE was lower in MHD patients compared to HC (28.79 ± 5.51 versus 32.91 ± 5.75 kcal/kg/day; p <0.01). Activity time was significantly lower in patients on MHD (101.3 ± 12.6 versus 50.7 ± 9.4 min; p = 0.0021). Energy expenditure during the time of activity was significantly lower in MHD patients. MHD patients walked 4543 ± 643 versus 8537 ± 744 steps per day (p <0.0001). Age was negatively correlated with TEE (r = -0.70) and with intensity of activity (r = -0.61) in HC, but not in patients on MHD. TEE showed no difference between dialysis and non-dialysis days (29.92 ± 2.03 versus 28.44 ± 1.90 kcal/kg/day; p = NS), reflecting a lack of difference in activity (number of steps, time of physical activity) and REE. This finding was observed in MHD patients both older and younger than 60 years. However, age stratification appeared to influence TEE regardless of dialysis day (29.92 ± 2.07 kcal/kg/day for patients <60 years old versus 27.41 ± 1.04 kcal/kg/day for those ≥60 years old), although the difference failed to reach statistical significance. Conclusion: Using the SWA, we have shown that stable patients on MHD have a lower TEE than matched HC. On average, a TEE of 28.79 kcal/kg/day, partially affected by age, was measured. This finding supports the clinical impression that it is difficult and probably unnecessary to provide an energy amount of 30-35 kcal/kg/day, as proposed by international guidelines for this population. In addition, we documented for the first time that MHD patients have reduced physical activity compared to HC. Surprisingly, there were no differences in TEE, REE and physical activity parameters between dialysis and non-dialysis days. This observation might be explained by the physical effort patients on MHD make to reach the dialysis centre. Age per se did not influence physical activity in MHD patients, in contrast to HC, reflecting the impact of co-morbidities on physical activity in this group of patients.
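
To make the gap with the guideline target concrete, a back-of-the-envelope conversion using the reported mean body weight; this is arithmetic on figures already given above, not an additional result.

# Sketch: measured TEE versus the 30-35 kcal/kg/day guideline range
# for the mean body weight of the MHD group (74.65 kg).
weight_kg = 74.65
measured_tee = 28.79 * weight_kg      # ~2149 kcal/day measured by SWA
guideline_low = 30.0 * weight_kg      # ~2240 kcal/day
guideline_high = 35.0 * weight_kg     # ~2613 kcal/day
print(f"Measured: {measured_tee:.0f} kcal/day; "
      f"guideline: {guideline_low:.0f}-{guideline_high:.0f} kcal/day")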

Relevance: 60.00%

Abstract:

Background and purpose: Decision making (DM) has been defined as the process through which a person forms preferences, selects and executes actions, and evaluates the outcome related to a selected choice. This ability represents an important factor for adequate behaviour in everyday life. DM impairment in multiple sclerosis (MS) has been previously reported. The purpose of the present study was to assess DM in patients with MS at the earliest clinically detectable time point of the disease. Methods: Patients with definite (n=109) or possible (clinically isolated syndrome, CIS; n=56) MS, a short disease duration (mean 2.3 years) and minor neurological disability (mean EDSS 1.8) were compared to 50 healthy controls aged 18 to 60 years (mean age 32.2) using the Iowa Gambling Task (IGT). Subjects had to select a card from any of 4 decks (A/B [disadvantageous]; C/D [advantageous]). The game consisted of 100 trials, grouped into blocks of 20 cards for data analysis. Skill in DM was assessed by means of a learning index (LI), defined as the difference between the average of the last three block indexes and the average of the first two block indexes: LI = [(BI3 + BI4 + BI5)/3 − (BI1 + BI2)/2]. Non-parametric tests were used for statistical analysis. Results: LI was higher in the control group (0.24, SD 0.44) than in the MS group (0.21, SD 0.38), although without reaching statistical significance (p=0.7). Interesting differences were detected when MS patients were grouped according to phenotype. A trend towards a difference in LI between MS subgroups and controls was observed (p=0.06), which became significant between MS subgroups (p=0.03). CIS patients in whom MS was confirmed by a second relapse after study entry showed impaired IGT performance in comparison to the other CIS patients (p=0.01) and to definite MS patients (p=0.04). In contrast, CIS patients who did not entirely fulfil the McDonald criteria at inclusion and had no relapse during the study showed a normal learning pattern on the IGT. Finally, comparing MS patients who developed relapses after study entry, those who remained clinically stable and controls, we observed impaired performances only in relapsing patients in comparison to stable patients (p=0.008) and controls (p=0.03). Discussion: These results suggest a role for both MS relapse activity and disease heterogeneity (i.e. subclinical severity or activity of MS) in the impairment of decision making.
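
A minimal sketch of the learning index computation described above, assuming the usual IGT net score per 20-card block (advantageous minus disadvantageous choices, here normalised by block size) as the block index BIn; the simulated choices are illustrative, not study data.

# Sketch: IGT learning index LI = mean(BI3, BI4, BI5) - mean(BI1, BI2).
import numpy as np

rng = np.random.default_rng(1)
choices = rng.integers(0, 4, size=100)   # 0/1 = decks A/B (disadvantageous), 2/3 = C/D (advantageous)
blocks = choices.reshape(5, 20)          # 5 blocks of 20 trials

net = np.sum(blocks >= 2, axis=1) - np.sum(blocks < 2, axis=1)
block_index = net / 20.0                 # assumed normalisation per block of 20 cards

learning_index = block_index[2:].mean() - block_index[:2].mean()
print("Block indexes:", block_index, " LI =", round(float(learning_index), 2))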

Relevance: 60.00%

Abstract:

Background: Recent data have suggested that a population of CD4+ CD25high T cells, phenotypically characterized by the expression of CD45RO and CD127, is significantly expanded in stable liver and kidney transplant recipients and represents alloreactive T cells. We analyzed this putative new alloreactive cellular marker in various groups of kidney transplant recipients. Patients and methods: Flow cytometry was used to analyze the expression of CD25, CD45RO and CD127 on peripheral CD4+ T cells. Of 73 kidney recipients, 59 had stable graft function under standard immunosuppressive therapy (IS), 5 had biopsy-proven chronic humoral rejection (CHR), 8 were stable under minimal IS and one was an operationally "tolerant" patient who had discontinued IS for more than 3 years. Sixty-six healthy subjects (HS) were studied as controls. Results: Overall, the alloreactive T cell population was significantly increased in the 73 kidney recipients (mean ± SE: 15.03 ± 1.04% of CD4+ CD25high T cells) compared to HS (5.93 ± 0.39%) (p <0.001). In the 5 patients with CHR, this population was highly expanded (31.33 ± 4.16%), whereas it was comparable to HS in the 8 stable recipients receiving minimal IS (6.12 ± 0.86%), in 4 patients who had been switched to sirolimus (4.21 ± 0.53%), as well as in the single "tolerant" recipient (4.69%). Intermediate levels (15.84 ± 0.93%) were found in the 55 recipients with stable graft function on standard CNI-based IS. Regulatory T cells, defined as CD4+ CD25high FoxP3+ CD127low, were significantly reduced in all recipients except those with minimal or no IS, and this reduction was particularly striking in recipients with CHR. Conclusion: After kidney transplantation, an alloreactive T cell population was significantly expanded, and its size correlated with the clinical status of the recipients. Interestingly, in stable patients with minimal (or no) IS, as well as in patients on sirolimus, alloreactive T cells were comparable to those of healthy controls. Measuring circulating CD4+ CD25high CD45RO+ CD127high T cells may become a useful monitoring tool after transplantation.
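
As a toy illustration of how a figure such as "% of CD4+ CD25high T cells" is obtained from flow cytometry events by boolean gating; the simulated intensities and thresholds are invented and do not reproduce the study's gating strategy.

# Toy sketch: percentage of CD4+ events that are CD25high CD45RO+ CD127high,
# using simple threshold gates on simulated fluorescence intensities.
import numpy as np

rng = np.random.default_rng(2)
n_events = 100_000
cd4 = rng.lognormal(1.0, 1.0, n_events)
cd25 = rng.lognormal(0.5, 1.2, n_events)
cd45ro = rng.lognormal(0.8, 1.0, n_events)
cd127 = rng.lognormal(0.6, 1.1, n_events)

cd4_pos = cd4 > 5.0                       # illustrative gate thresholds
target = cd4_pos & (cd25 > 10.0) & (cd45ro > 4.0) & (cd127 > 4.0)

pct = 100.0 * target.sum() / cd4_pos.sum()
print(f"CD25high CD45RO+ CD127high cells: {pct:.2f}% of CD4+ events")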

Relevance: 60.00%

Abstract:

Introduction: Non-operative management (NOM) of blunt splenic injuries in hemodynamically stable patients is nowadays considered the standard treatment. Material and Methods: The aim was to clarify the criteria used for primary operative management (OM) and planned NOM, and to identify risk factors for failure of NOM. All adult patients with blunt splenic injuries treated from 2000-2008 were reviewed and a logistic regression analysis employed. Results: There were 206 patients (146 men, 70.9%). Mean age was 38.2 ± 19.1 years. The mean Injury Severity Score (ISS) was 30.9 ± 11.6. The American Association for the Surgery of Trauma (AAST) classification of the splenic injury was: grade I, n = 43 (20.9%); grade II, n = 52 (25.2%); grade III, n = 60 (29.1%); grade IV, n = 42 (20.4%); and grade V, n = 9 (4.4%). Forty-seven patients (22.8%) required immediate surgery (OM). More than 5 units of red cell transfusions (odds ratio [OR] 13.72, P < 0.001), a Glasgow Coma Scale < 11 (OR 9.88, P = 0.009) and age ≥ 55 years (OR 3.29, P = 0.038) were associated with primary OM. The remaining 159 patients (77.2%) qualified for a non-surgical approach (NOM), which was successful in 89.9% (143/159). The overall splenic salvage rate amounted to 69.4% (143/206). Multiple logistic regression analysis found age ≥ 40 years to be the only factor significantly and independently related to failure of NOM (OR 13.58, P = 0.001). Conclusion: Advanced age is associated with an increased failure rate of NOM in patients with blunt splenic injuries.
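
As an illustration of the univariable counterpart of such an analysis, an odds ratio for NOM failure by age group can be read off a 2x2 table; the cell counts below are hypothetical (chosen only to sum to the 16 failures and 143 successes reported), and the OR formula is the general one.

# Sketch: odds ratio for failure of NOM (age >= 40 vs < 40) from a 2x2 table.
# Cell counts are hypothetical; OR = (a*d) / (b*c).
from scipy.stats import fisher_exact

#                 failure  success
table = [[10,     40],      # age >= 40 years
         [6,      103]]     # age <  40 years

a, b = table[0]
c, d = table[1]
odds_ratio = (a * d) / (b * c)
_, p_value = fisher_exact(table)
print(f"OR = {odds_ratio:.2f}, Fisher exact p = {p_value:.3f}")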

Relevance: 60.00%

Abstract:

BACKGROUND: During the last decade, the management of blunt hepatic injury has changed considerably. Three options are available: nonoperative management (NOM), transarterial embolization (TAE), and surgery. We aimed to evaluate, in a systematic review, the current practice and outcomes in the management of Grade III to V blunt hepatic injury. METHOD: The MEDLINE database was searched using PubMed to identify English-language citations published after 2000, using the key words blunt, hepatic injury, severe, and grade III to V in different combinations. Liver injury was graded according to the American Association for the Surgery of Trauma classification on computed tomography (CT). The primary outcome analyzed was the success rate on an intention-to-treat basis. Critical appraisal of the literature was performed using the validated National Institute for Health and Care Excellence "Quality Assessment for Case Series" system. RESULTS: Twelve articles were selected for critical appraisal (n = 4,946 patients). The median quality score of the articles was 4 of 8 (range, 2-6). Overall, the median Injury Severity Score (ISS) at admission was 26 (range, 0.6-75). A median of 66% (range, 0-100%) of patients were managed with NOM, with a success rate of 94% (range, 86-100%). TAE was used in only 3% of cases (range, 0-72%), owing to contrast extravasation on CT, with a success rate of 93% (range, 81-100%); however, 9% to 30% of patients required a laparotomy. Thirty-one percent (range, 17-100%) of patients were managed with surgery, owing to hemodynamic instability in most cases, with 12% to 28% requiring secondary TAE to control recurrent hepatic bleeding. Mortality was 5% (range, 0-8%) after NOM and 51% (range, 30-68%) after surgery. CONCLUSION: NOM of Grade III to V blunt hepatic injury is the first treatment option for managing hemodynamically stable patients. TAE and surgery are considered in a highly selective group of patients with contrast extravasation on CT or shock at admission, respectively. Additional standardization of the reports is necessary to allow accurate comparisons of the various management strategies. LEVEL OF EVIDENCE: Systematic review, level IV.
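
For readers who want to reproduce a comparable literature pull, a hedged sketch using Biopython's Entrez E-utilities; the query string is an illustrative guess at how the stated key words could be combined, not the authors' actual search strategy.

# Illustrative sketch only: querying PubMed for blunt hepatic injury reports
# published after 2000, roughly mirroring the key words given in the abstract.
from Bio import Entrez

Entrez.email = "you@example.org"   # required by NCBI; placeholder address

query = ('(blunt AND ("hepatic injury" OR "liver injury") AND '
         '(severe OR "grade III" OR "grade IV" OR "grade V")) '
         'AND ("2000"[PDAT] : "3000"[PDAT]) AND english[lang]')

handle = Entrez.esearch(db="pubmed", term=query, retmax=200)
record = Entrez.read(handle)
handle.close()
print(len(record["IdList"]), "citations found")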

Relevance: 40.00%

Abstract:

Study objectives: Many major drugs are not available in paediatric form. The aim of this study was to develop a stable liquid solution of captopril for oral paediatric use, allowing individualised dosage and easy administration to newborn and young patients. Methods: A specific HPLC-UV method was developed. In a pilot study, a number of formulations described in the literature as affording one-month stability were examined. In the long-term study proper, the formulation that gave the best results was prepared in large batches and its stability monitored for two years at 5°C and room temperature, and for one year at 40°C. Results: Most formulations described in the literature were found wanting in our pilot study. A simple solution of the drug (1 mg/mL) in purified water (European Pharmacopoeia) containing 0.1% disodium edetate (EDTA-Na) as preservative proved chemically and microbiologically stable at 5°C and room temperature for two years. Conclusion: The proposed in-house formulation fulfils stringent criteria of purity and stability and is fully acceptable for oral administration to newborn and young patients.

Relevance: 40.00%

Abstract:

OBJECTIVE: To investigate whether HIV-infected patients on a stable and fully suppressive combination antiretroviral therapy (cART) regimen could safely be monitored less often than the currently recommended interval of every 3 months. DESIGN: Two thousand two hundred and forty patients from the EuroSIDA study who maintained a stable and fully suppressed cART regimen for 1 year were included in the analysis. METHODS: The risk of treatment failure, defined by viral rebound, fall in CD4 cell count, development of a new AIDS-defining illness, serious opportunistic infection or death, during the 12 months following a year on a stable and fully suppressed regimen was assessed. RESULTS: One hundred and thirty-one (6%) patients experienced treatment failure in the 12 months following a year of stable therapy; viral rebound occurred in 99 (4.6%) patients. After 3, 6 and 12 months, patients had a 0.3% [95% confidence interval (CI) 0.1-0.5], 2.2% (95% CI 1.6-2.8) and 6.0% (95% CI 5.0-7.0) risk of treatment failure, respectively. Patients who had spent more than 80% of their time on cART with fully suppressed viraemia prior to baseline had a 38% reduced risk of treatment failure (hazard ratio 0.62; 95% CI 0.42-0.90; P = 0.01). CONCLUSION: Patients who have responded well to cART and are on a well-tolerated and durably fully suppressive cART regimen have a low chance of experiencing treatment failure in the next 3-6 months. Therefore, in this subgroup of otherwise healthy patients, it may be reasonable to extend visit intervals to 6 months, with cost and time savings to both the treating clinics and the patients.
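
A compact sketch of how cumulative failure risks at 3, 6 and 12 months are obtained from time-to-failure data with a Kaplan-Meier estimator; the durations below are simulated (roughly matching the ~6% 12-month failure rate), not EuroSIDA data.

# Sketch: Kaplan-Meier cumulative risk of treatment failure at 3, 6 and 12 months.
import numpy as np

rng = np.random.default_rng(3)
n = 2240
durations = np.minimum(rng.exponential(200, n), 12.0)   # follow-up capped at 12 months
events = durations < 12.0                               # True = treatment failure observed

def km_survival(t, durations, events):
    """Kaplan-Meier survival probability at time t."""
    surv = 1.0
    for time in np.sort(np.unique(durations[events])):
        if time > t:
            break
        at_risk = np.sum(durations >= time)
        failed = np.sum((durations == time) & events)
        surv *= 1.0 - failed / at_risk
    return surv

for month in (3, 6, 12):
    print(f"Cumulative failure risk at {month} months: "
          f"{100 * (1 - km_survival(month, durations, events)):.1f}%")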

Relevance: 30.00%

Abstract:

Purpose: Sirolimus (SRL) has been used to replace calcineurin inhibitors (CNI) for various indications, including CNI-induced toxicity. The aim of this study was to evaluate the efficacy and safety of switching from CNI to SRL in stable renal transplant recipients (RTR) with low-grade proteinuria (<1 g/24 h). Methods and materials: Between 2001 and 2007, 41 patients (20 females, 21 males; mean age 47 ± 13 years) were switched after a median time post-transplantation of 73.5 months (range 0.2-273.2 months). Indications for the switch were CNI nephrotoxicity (39%), thrombotic micro-angiopathy (14.6%), post-transplantation cancer (24.4%), CNI neurotoxicity (7.4%), or others (14.6%). Mean follow-up after the SRL switch was 23.8 ± 16.3 months. Mean SRL dosage and trough levels were 2.4 ± 1.1 mg/day and 8 ± 2.2 µg/l, respectively. Immunosuppressive regimens were SRL + mycophenolate mofetil (MMF) (31.7%), SRL + MMF + prednisone (36.58%), SRL + prednisone (19.51%), SRL + azathioprine (9.75%), or SRL alone (2.43%). Results: Mean creatinine decreased from 164 to 143 μmol/l (p <0.03), mean estimated glomerular filtration rate (eGFR) increased significantly from 50.13 to 55.01 ml/minute (p <0.00001), and mean systolic and diastolic blood pressure decreased from 138 to 132 mm Hg (p <0.03) and from 83 to 78 mm Hg (p <0.01), whereas mean proteinuria increased from 0.21 to 0.63 g/24 h (p <0.001). Mean total cholesterol did not increase significantly (from 5.09 to 5.56 mmol/l; p = 0.06). The main complications after the SRL switch were dermatitis (19.5%), urinary tract infections (24.4%), ankle edema (13.3%), and transient oral ulcers (20%). Acute rejection after the switch occurred in 7.3% of patients (n = 3); two rejections were successfully treated with corticosteroids and one did not respond to treatment (not related to the switch). SRL had to be discontinued in 17% of patients (2 nephrotic syndromes, 2 severe edema, 1 acute rejection, 1 thrombotic micro-angiopathy, and 1 fever). Conclusion: Switching from CNI to SRL in stable RTR was safe and associated with a significant improvement in renal function and blood pressure. Known side effects of SRL led to drug discontinuation in less than 20% of patients and the acute rejection rate was 7.3%. This experience underlines the importance of patient selection before switching to SRL, in particular with regard to pre-switch proteinuria.
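
A brief sketch of how the paired before/after comparisons reported above (e.g. eGFR) are typically tested; the values are hypothetical, and the choice between a paired t-test and a Wilcoxon signed-rank test depends on distributional assumptions not stated in the abstract.

# Sketch: paired comparison of eGFR before and after conversion from CNI to SRL.
# Values are hypothetical, roughly on the scale of the means reported (50 -> 55 ml/min).
import numpy as np
from scipy.stats import ttest_rel, wilcoxon

egfr_before = np.array([48.0, 52.5, 45.1, 55.3, 49.7, 50.2])
egfr_after = np.array([53.2, 56.0, 49.8, 60.1, 54.5, 56.4])

t_stat, p_t = ttest_rel(egfr_after, egfr_before)
w_stat, p_w = wilcoxon(egfr_after, egfr_before)
print(f"Mean change: {np.mean(egfr_after - egfr_before):.2f} ml/min")
print(f"Paired t-test p = {p_t:.4f}; Wilcoxon signed-rank p = {p_w:.4f}")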