999 results for microbiological monitoring
Abstract:
Introduction: Imatinib trough plasma concentrations (Cmin) have been correlated with treatment response in chronic myeloid leukemia (CML) patients. The use of Cmin monitoring for optimizing imatinib dosage (therapeutic drug monitoring [TDM]) is therefore proposed for patients with unsatisfactory response or tolerance ("rescue TDM"). A cycle of "routine TDM" for dosage individualization could also be beneficial in preventing unfavorable events, yet its clinical usefulness has not been evaluated. We aimed to assess prospectively whether a "routine TDM" intervention targeting an imatinib Cmin of 1000 ng/mL (tolerance, 750-1500 ng/mL) could improve efficacy, tolerance, and persistence on treatment compared with "rescue TDM" use only. Patients and Methods: The Swiss Imatinib COncentration Monitoring Evaluation (I-COME) study was a multicenter randomized controlled trial (ISRCTN31181395). Adult patients in chronic or accelerated phase CML receiving imatinib for ≤5 years were eligible. Patients were randomly (1:1) allocated to receive the "routine TDM" intervention or to serve as controls with access only to "rescue TDM". All had 1-year follow-up. The primary endpoint was a combined efficacy-safety outcome (failure- and toxicity-free survival without imatinib discontinuation), analyzed by intention-to-treat. Results: Among 56 recruited CML patients, 55 had their molecular and cytogenetic response measured. 14/27 patients receiving "routine TDM" (52% [33%-71%]) remained event-free versus 16/28 control patients with "rescue TDM" only (57% [39%-75%]; P = 0.69). In the "routine TDM" group, dosage recommendations were adopted entirely in 50% of patients (median Cmin at study end, 895 ng/mL; CV = 33%). These patients had fewer unfavorable events (28% [5%-52%]) than patients not receiving the advised dosage (77% [54%-99%]; P = 0.03; median Cmin at study end, 648 ng/mL; CV = 38%). Conclusion: This first prospective target concentration intervention trial could not formally demonstrate a benefit of "routine TDM" of imatinib, mainly because of the small number of patients and limited prescriber adherence to dosage recommendations. Nevertheless, the patients receiving the advised dosage more often reached target concentrations and met the combined outcome (efficacy, tolerance, and persistence). A cycle of routine TDM could thus be favorable, at least in patients eligible for dosage adjustment. Its usefulness should, however, be confirmed in larger trials.
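How a Cmin-guided dosage recommendation can be derived is illustrated by the minimal Python sketch below. It is not the I-COME study's algorithm: it assumes dose-proportional pharmacokinetics, uses the 1000 ng/mL target and 750-1500 ng/mL tolerance window quoted above, and the list of available daily doses is hypothetical.

    # Hypothetical sketch, not the study's dosing algorithm: proportional
    # adjustment of the imatinib daily dose toward the 1000 ng/mL Cmin target,
    # assuming dose-proportional pharmacokinetics.
    TARGET_CMIN = 1000.0                    # ng/mL
    WINDOW = (750.0, 1500.0)                # ng/mL tolerance interval
    AVAILABLE_DOSES = [300, 400, 600, 800]  # mg/day, illustrative strengths

    def recommend_dose(current_dose_mg, measured_cmin):
        """Return a recommended daily dose rounded to an available strength."""
        if WINDOW[0] <= measured_cmin <= WINDOW[1]:
            return current_dose_mg          # within tolerance: keep the dose
        # Linear (dose-proportional) scaling toward the target concentration.
        ideal = current_dose_mg * TARGET_CMIN / measured_cmin
        return min(AVAILABLE_DOSES, key=lambda d: abs(d - ideal))

    print(recommend_dose(400, 648))   # low Cmin -> 600 mg/day suggested
    print(recommend_dose(400, 895))   # within window -> 400 mg/day kept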
Abstract:
Given that clay-rich landslides may become mobilized, leading to rapid mass movements (earthflows and debris flows), they pose critical problems in risk management worldwide. The most widely proposed mechanism leading to such flow-like movements is the increase in water pore pressure in the sliding mass, generating partial or complete liquefaction. This solid-to-liquid transition results in a dramatic reduction of mechanical rigidity in the liquefied zones, which could be detected by monitoring shear wave velocity variations. With this purpose in mind, the ambient seismic noise correlation technique has been applied to measure the variation in the seismic surface wave velocity in the Pont Bourquin landslide (Swiss Alps). This small but active composite earthslide-earthflow was equipped with continuously recording seismic sensors during spring and summer 2010. An earthslide of a few thousand cubic meters was triggered in mid-August 2010, after a rainy period. This article shows that the seismic velocity of the sliding material, measured from daily noise correlograms, decreased continuously and rapidly for several days prior to the catastrophic event. From a spectral analysis of the velocity decrease, it was possible to determine the location of the change at the base of the sliding layer. These results demonstrate that ambient seismic noise can be used to detect rigidity variations before failure and could potentially be used to predict landslides.
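The core of the monitoring described above is measuring a relative velocity change (dv/v) between a reference noise correlogram and each daily correlogram. The Python sketch below shows the common "stretching" approach under the assumption of a homogeneous velocity change; the paper's exact processing chain is not reproduced here, and the function name and synthetic test are illustrative.

    # Hypothetical sketch of the "stretching" method: a relative velocity
    # change dv/v rescales travel times, so the daily correlogram matches the
    # reference evaluated on a stretched time axis t*(1 + dv/v).
    import numpy as np

    def dv_over_v(reference, daily, dt, trials=np.linspace(-0.05, 0.05, 501)):
        """Grid-search the stretching factor that best maps reference onto daily."""
        t = np.arange(reference.size) * dt
        best_eps, best_cc = 0.0, -np.inf
        for eps in trials:
            stretched = np.interp(t * (1.0 + eps), t, reference)
            cc = np.corrcoef(stretched, daily)[0, 1]
            if cc > best_cc:
                best_eps, best_cc = eps, cc
        return best_eps, best_cc

    # Synthetic check: a 1% velocity drop should be recovered as dv/v ~ -0.01.
    dt = 0.01
    t = np.arange(0, 5, dt)
    ref = np.sin(2 * np.pi * 8 * t) * np.exp(-t)
    day = np.sin(2 * np.pi * 8 * 0.99 * t) * np.exp(-0.99 * t)  # delayed arrivals
    print(dv_over_v(ref, day, dt))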
Abstract:
INTRODUCTION: Continuous EEG (cEEG) is increasingly used to monitor brain function in neuro-ICU patients. However, its value in patients with coma after cardiac arrest (CA), particularly in the setting of therapeutic hypothermia (TH), is only beginning to be elucidated. The aim of this study was to examine whether cEEG performed during TH may predict outcome. METHODS: From April 2009 to April 2010, we prospectively studied 34 consecutive comatose patients treated with TH after CA who were monitored with cEEG, initiated during hypothermia and maintained after rewarming. EEG background reactivity to painful stimulation was tested. We analyzed the association between cEEG findings and neurologic outcome, assessed at 2 months with the Glasgow-Pittsburgh Cerebral Performance Categories (CPC). RESULTS: Continuous EEG recording was started 12 ± 6 hours after CA and lasted 30 ± 11 hours. Nonreactive cEEG background (12 of 15 (80%) among nonsurvivors versus none of 19 (0%) survivors; P < 0.001) and prolonged discontinuous "burst-suppression" activity (11 of 15 (73%) versus none of 19; P < 0.001) were significantly associated with mortality. EEG seizures with absent background reactivity also differed significantly (seven of 15 (47%) versus none of 12 (0%); P = 0.001). In patients with a nonreactive background or seizures/epileptiform discharges on cEEG, no improvement was seen after TH. A nonreactive cEEG background during TH had a positive predictive value of 100% (95% confidence interval (CI), 74 to 100%) and a false-positive rate of 0% (95% CI, 0 to 18%) for mortality. All survivors had cEEG background reactivity, and the majority of them (14 (74%) of 19) had a favorable outcome (CPC 1 or 2). CONCLUSIONS: Continuous EEG monitoring showing a nonreactive or discontinuous background during TH is strongly associated with unfavorable outcome in patients with coma after CA. These data warrant larger studies to confirm the value of continuous EEG monitoring in predicting prognosis after CA and TH.
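The reported interval estimates are consistent with exact (Clopper-Pearson) binomial confidence limits. The Python sketch below reproduces them under the assumption, implied by the 100% PPV, that all 12 patients with a nonreactive background were nonsurvivors and that none of the 19 survivors were falsely flagged; it is illustrative, not the authors' analysis code.

    # Hypothetical sketch: exact Clopper-Pearson confidence limits matching the
    # reported PPV of 100% (74-100%) and false-positive rate of 0% (0-18%).
    from scipy.stats import beta

    def clopper_pearson(x, n, alpha=0.05):
        """Exact two-sided (1 - alpha) confidence interval for a proportion x/n."""
        lower = beta.ppf(alpha / 2, x, n - x + 1) if x > 0 else 0.0
        upper = beta.ppf(1 - alpha / 2, x + 1, n - x) if x < n else 1.0
        return lower, upper

    # PPV: 12 of 12 patients with a nonreactive background died.
    print(clopper_pearson(12, 12))   # ~ (0.74, 1.00)
    # False-positive rate: 0 of 19 survivors had a nonreactive background.
    print(clopper_pearson(0, 19))    # ~ (0.00, 0.18)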
Abstract:
Invasive candidiasis, including candidemia and deep-seated Candida infections, is a severe opportunistic infection with an overall mortality in ICU patients comparable to that of severe sepsis/septic shock. With an incidence ranging from 5 to 10 cases per 1000 ICU admissions, invasive candidiasis represents 5-10% of all ICU-acquired infections. Although a high proportion of critically ill patients is colonised with Candida spp., only 5-40% develop an invasive infection. The occurrence of this complication is difficult to predict and an early diagnosis remains a major challenge. Indeed, blood cultures are positive in a minority of cases and often late in the course of infection. New non-culture-based laboratory techniques may contribute to early diagnosis and management of invasive candidiasis. Recent data suggest that prediction rules based on risk factors, clinical and microbiological parameters or monitoring of Candida colonisation may efficiently identify critically ill patients at high risk of invasive candidiasis who may benefit from preventive or pre-emptive antifungal therapy. In many cancer centres, exposure to azole antifungals has been associated with an epidemiological shift from Candida albicans to non-albicans Candida species with reduced antifungal susceptibility or intrinsic resistance. This trend has not been observed in recent surveys of candidemia in non-immunocompromised ICU patients. Prophylaxis and pre-emptive or empirical antifungal treatment are possible approaches for prevention or early management of invasive candidiasis. However, the selection of high-risk patients remains critical for efficient management aimed at reducing the number needed to treat and thus avoiding unnecessary treatments associated with the emergence of resistance, drug toxicity and costs.
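One concrete example of the colonisation monitoring mentioned above (not detailed in the abstract itself) is the Candida colonisation index of Pittet et al., i.e. the fraction of sampled non-blood body sites growing Candida spp., with values of 0.5 or more commonly used to flag high-risk patients. A minimal, illustrative Python sketch:

    # Illustrative sketch of a Candida colonisation index; the site names are
    # examples only, and >= 0.5 is the commonly cited high-risk cut-off.
    def colonisation_index(culture_results):
        """culture_results maps body site -> True if Candida spp. was isolated."""
        if not culture_results:
            raise ValueError("at least one sampled site is required")
        return sum(bool(v) for v in culture_results.values()) / len(culture_results)

    samples = {"oropharynx": True, "urine": True, "rectum": True,
               "tracheal aspirate": False, "gastric aspirate": True}
    index = colonisation_index(samples)
    print(index, "high risk" if index >= 0.5 else "lower risk")   # 0.8 high risk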
Abstract:
Both N excess and deficiency may affect cotton yield and quality. It would therefore be useful to base N fertilization management on monitoring of the plant's nutritional status. This study investigated the correlations among the following methods for determining the N nutritional status of cotton (Gossypium hirsutum L., var. Latifolia): chlorophyll readings (SPAD-502®, Minolta), specific-ion nitrate meter readings (Nitrate Meter C-141, Horiba-Cardy®), and laboratory analysis (conventional foliar diagnosis). Samples were taken weekly from two weeks before flowering to the fifth week after the first flower. The experiment was conducted on the Fazenda Santa Tereza, Itapeva, State of São Paulo, Brazil. The crop was fertilized with 40 kg ha-1 N at planting and 0, 30, 60, 90, and 120 kg ha-1 of side-dressed N. The range of leaf N contents reported as adequate for samples taken 80-90 days after plant emergence (traditional foliar diagnosis) may be used as a reference from the beginning of flowering, provided the plant is not stressed. Specific-ion nitrate meter readings can be used as an indicator of cotton N status from one week after pinhead square until the third week of flowering; in this case, plants are well nourished when readings exceed 8,000 mg L-1 NO3-. The chlorophyll meter can also be used to estimate the nutritional status of cotton from the third week of flowering onward; in this case, readings should be above 48 in well-nourished plants.
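The two reader thresholds quoted above can be turned into a simple decision rule; the Python sketch below is illustrative only, and the growth-stage encoding (weeks relative to the first flower, with one week after pinhead square taken as roughly two weeks before flowering) is an assumption, not part of the study.

    # Illustrative sketch of the thresholds reported above; the stage encoding
    # and window boundaries are assumptions made for the sake of the example.
    def cotton_n_status(weeks_after_first_flower, nitrate_mg_per_l=None,
                        spad_reading=None):
        """Classify N status from a nitrate meter or SPAD reading by growth stage."""
        # Nitrate meter: from ~1 week after pinhead square to the 3rd week of
        # flowering; well nourished above 8,000 mg/L NO3-.
        if -2 <= weeks_after_first_flower <= 3 and nitrate_mg_per_l is not None:
            return "adequate" if nitrate_mg_per_l > 8000 else "deficient"
        # SPAD chlorophyll meter: from the 3rd week of flowering onward;
        # well nourished above a reading of 48.
        if weeks_after_first_flower >= 3 and spad_reading is not None:
            return "adequate" if spad_reading > 48 else "deficient"
        return "no validated indicator for this stage/measurement"

    print(cotton_n_status(1, nitrate_mg_per_l=9500))   # adequate
    print(cotton_n_status(4, spad_reading=45))         # deficient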
Abstract:
Drug development has improved over recent decades, with refinements in analytical techniques, population pharmacokinetic-pharmacodynamic (PK-PD) modelling and simulation, and new biomarkers of efficacy and tolerability. Yet this progress has not yielded improvements in individualization of treatment and monitoring, owing to various obstacles: monitoring is complex and demanding, many monitoring procedures have been instituted without critical assessment of the underlying evidence and rationale, controlled clinical trials are sparse, monitoring procedures are poorly validated and both drug manufacturers and regulatory authorities take insufficient account of the importance of monitoring. Drug concentration and effect data should be increasingly collected, analyzed, aggregated and disseminated in forms suitable for prescribers, along with efficient monitoring tools and evidence-based recommendations regarding their best use. PK-PD observations should be collected for both novel and established critical drugs and applied to observational data, in order to establish whether monitoring would be suitable. Methods for aggregating PK-PD data in systematic reviews should be devised. Observational and intervention studies to evaluate monitoring procedures are needed. Miniaturized monitoring tests for delivery at the point of care should be developed and harnessed to closed-loop regulated drug delivery systems. Intelligent devices would enable unprecedented precision in the application of critical treatments, i.e. those with life-saving efficacy, narrow therapeutic margins and high interpatient variability. Pharmaceutical companies, regulatory agencies and academic clinical pharmacologists share the responsibility of leading such developments, in order to ensure that patients obtain the greatest benefit and suffer the least harm from their medicines.
Abstract:
The Iowa EHDI High-Risk Monitoring Protocol is based on the Joint Committee on Infant Hearing 2007 position statement. Emphasis is placed on follow-up as deemed appropriate by the primary health care provider and audiologist. The Iowa protocol describes the follow-up process for children with risk factors.
Abstract:
Background: Recent data have suggested that a population of CD4+ CD25high T cells, phenotypically characterized by the expression of CD45RO and CD127, is significantly expanded in stable liver and kidney transplant recipients and represents alloreactive T cells. Induction therapies may have an impact on this alloreactive T cell population. In this study, we prospectively analyzed CD4+ CD25high CD45RO+ CD127high T cells after induction with either thymoglobulin or basiliximab. Patients and methods: A total of 27 kidney transplant recipients were prospectively enrolled; 14 received thymoglobulin induction followed by a 4-day course of steroids with tacrolimus and mycophenolate mofetil ("thymo group"), and 13 received basiliximab induction followed by standard triple immunosuppression (tacrolimus, mycophenolate mofetil and prednisone) ("BSX group"). Phenotypic analysis of CD25, CD45RO and CD127 expression on peripheral CD4+ T cells was performed by flow cytometry at 0, 3 and 6 months after transplantation. Twenty-four healthy subjects (HS) were studied as controls. Results: There were no differences in baseline characteristics between the groups; at 6 months, patient survival (100%), graft survival (100%), serum creatinine (thymo group versus BSX group: 129 versus 125 micromol/l) and acute rejection (2/14 versus 2/13) were not significantly different. Thymoglobulin induction produced prolonged CD4 T cell depletion. Compared with pre-transplantation values, an expansion of the alloreactive T cell population was observed at 3 months in both the thymo (mean: from 6.38% to 14.72%) and BSX (mean: from 8.01% to 18.42%) groups. At 6 months, the alloreactive T cell population remained significantly expanded in the thymo group (16.92 ± 2.87%), whereas it tended to decrease in the BSX group (10.22 ± 1.38%). Conclusion: Overall, our results indicate that expansion of alloreactive T cells occurs rapidly after transplantation in patients receiving either thymoglobulin or BSX induction. Whether differences emerge at later timepoints, or whether different immunosuppressive regimens modify this alloreactive population, remains to be studied.