885 results for Monitoring methods
Abstract:
BACKGROUND Monitoring of HIV viral load in patients on combination antiretroviral therapy (ART) is not generally available in resource-limited settings. We examined the cost-effectiveness of qualitative point-of-care viral load tests (POC-VL) in sub-Saharan Africa. DESIGN Mathematical model based on longitudinal data from the Gugulethu and Khayelitsha township ART programmes in Cape Town, South Africa. METHODS Cohorts of patients on ART monitored by POC-VL, CD4 cell count or clinically were simulated. Scenario A considered only the more accurate detection of treatment failure with POC-VL, and scenario B also considered the effect on HIV transmission. Scenario C further assumed that the risk of virologic failure is halved with POC-VL due to improved adherence. We estimated the change in costs per quality-adjusted life-year gained (incremental cost-effectiveness ratios, ICERs) of POC-VL compared with CD4 and clinical monitoring. RESULTS POC-VL tests with detection limits less than 1000 copies/ml increased costs due to unnecessary switches to second-line ART, without improving survival. Assuming POC-VL unit costs between US$5 and US$20 and detection limits between 1000 and 10,000 copies/ml, the ICER of POC-VL was US$4010-US$9230 compared with clinical monitoring and US$5960-US$25,540 compared with CD4 cell count monitoring. In scenario B, the corresponding ICERs were US$2450-US$5830 and US$2230-US$10,380. In scenario C, the ICER ranged between US$960 and US$2500 compared with clinical monitoring and between cost-saving and US$2460 compared with CD4 monitoring. CONCLUSION The cost-effectiveness of POC-VL for monitoring ART is improved by a higher detection limit, by taking the reduction in new HIV infections into account, and by assuming that failure of first-line ART is reduced through targeted adherence counselling.
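The ICERs quoted above are, by definition, the incremental cost of one strategy over another divided by the incremental QALYs gained. A minimal sketch with hypothetical cost and QALY figures (not the study's data):

```python
def icer(cost_new, cost_ref, qaly_new, qaly_ref):
    """Incremental cost-effectiveness ratio: extra cost per QALY gained."""
    d_qaly = qaly_new - qaly_ref
    if d_qaly <= 0:
        raise ValueError("new strategy gains no QALYs; ICER undefined or dominated")
    return (cost_new - cost_ref) / d_qaly

# Hypothetical: POC-VL monitoring costs US$600 more and gains 0.15 QALYs
# per patient versus clinical monitoring.
print(round(icer(2600.0, 2000.0, 1.90, 1.75)))  # → 4000 (US$ per QALY)
```

A strategy that costs less and gains QALYs is "cost-saving" (dominant), which is why the scenario C range above is reported as "between cost-saving and US$2460" rather than as a single ratio.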
Abstract:
PURPOSE Few data are available on noninvasive MRI-based assessment of renal function during upper urinary tract (UUT) obstruction. In this study, we determined whether functional multiparametric kidney MRI is able to monitor treatment response in acute unilateral UUT obstruction. MATERIALS AND METHODS Between 01/2008 and 01/2010, 18 patients with acute unilateral UUT obstruction due to calculi were prospectively enrolled to undergo kidney MRI with conventional, blood oxygen level-dependent (BOLD) and diffusion-weighted (DW) sequences on emergency admission and after release of obstruction. Functional imaging parameters of the obstructed and contralateral unobstructed kidneys derived from BOLD (apparent spin relaxation rate [R2*]) and DW (total apparent diffusion coefficient [ADCT], pure diffusion coefficient [ADCD] and perfusion fraction [FP]) sequences were assessed during acute UUT obstruction and after its release. RESULTS During acute obstruction, R2* and FP values were lower in the cortex (p=0.020 and p=0.031, respectively) and medulla (p=0.012 and p=0.190, respectively) of the obstructed compared to the contralateral unobstructed kidneys. After release of obstruction, R2* and FP values increased both in the cortex (p=0.016 and p=0.004, respectively) and medulla (p=0.071 and p=0.044, respectively) of the formerly obstructed kidneys to values similar to those found in the contralateral kidneys. ADCT and ADCD values did not significantly differ between obstructed and contralateral unobstructed kidneys during or after obstruction. CONCLUSIONS In our patients with acute unilateral UUT obstruction due to calculi, functional kidney MRI using BOLD and DW sequences allowed for the monitoring of pathophysiologic changes of obstructed kidneys during obstruction and after its release.
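For context, the ADC values derived from DW sequences follow the monoexponential decay S(b) = S0·exp(−b·ADC), so with two b-values the coefficient falls out directly. A sketch with made-up signal intensities (the IVIM parameters ADCD and FP used in the study require a biexponential fit, which is omitted here):

```python
import math

def adc_two_point(s0, sb, b):
    """Monoexponential ADC (mm^2/s) from signals at b=0 and at b (s/mm^2):
    S(b) = S0 * exp(-b * ADC)  =>  ADC = ln(S0 / Sb) / b."""
    return math.log(s0 / sb) / b

# Hypothetical voxel signals: S(b=0) = 1000, S(b=800) = 300
print(adc_two_point(1000.0, 300.0, 800.0))  # roughly 1.5e-3 mm^2/s
```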
Abstract:
The identification of plausible causes for water body status deterioration will be much easier if it can build on available, reliable, extensive and comprehensive biogeochemical monitoring data (preferably aggregated in a database). A plausible identification of such causes is a prerequisite for well-informed decisions on which mitigation or remediation measures to take. In this chapter, first a rationale for an extended monitoring programme is provided; it is then compared to the one required by the Water Framework Directive (WFD). This proposal includes a list of relevant parameters that are needed for an integrated, a priori status assessment. Second, a few sophisticated statistical tools are described that subsequently allow for the estimation of the magnitude of impairment as well as the likely relative importance of different stressors in a multi-stressed environment. The advantages and restrictions of these rather complicated analytical methods are discussed. Finally, the use of Decision Support Systems (DSS) is advocated with regard to the specific WFD implementation requirements.
Abstract:
Detector uniformity is a fundamental performance characteristic of all modern gamma camera systems, and ensuring a stable, uniform detector response is critical for maintaining clinical images that are free of artifact. For these reasons, the assessment of detector uniformity is one of the most common activities associated with a successful clinical quality assurance program in gamma camera imaging. The evaluation of this parameter, however, is often unclear because it is highly dependent upon acquisition conditions, reviewer expertise, and the application of somewhat arbitrary limits that do not characterize the spatial location of the non-uniformities. Furthermore, as the goal of any robust quality control program is the determination of significant deviations from standard or baseline conditions, clinicians and vendors often neglect the temporal nature of detector degradation (1). This thesis describes the development and testing of new methods for monitoring detector uniformity. These techniques provide more quantitative, sensitive, and specific feedback to the reviewer so that he or she may be better equipped to identify performance degradation prior to its manifestation in clinical images. The methods exploit the temporal nature of detector degradation and spatially segment distinct regions-of-non-uniformity using multi-resolution decomposition. These techniques were tested on synthetic phantom data using different degradation functions, as well as on experimentally acquired time series of flood images with induced, progressively worsening defects present within the field-of-view. The sensitivity of conventional, global figures-of-merit for detecting changes in uniformity was evaluated and compared to these new image-space techniques. The image-space algorithms provide a reproducible means of detecting regions-of-non-uniformity before any single flood image has a NEMA uniformity value in excess of 5%.
The sensitivity of these image-space algorithms was found to depend on the size and magnitude of the non-uniformities, as well as on the underlying cause of the non-uniform region. A trend analysis of the conventional figures-of-merit demonstrated their sensitivity to shifts in detector uniformity. The image-space algorithms are computationally efficient. Therefore, the image-space algorithms should be used concomitantly with the trending of the global figures-of-merit in order to provide the reviewer with a richer assessment of gamma camera detector uniformity characteristics.
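For reference, the NEMA integral uniformity figure mentioned above is a simple global figure-of-merit: the max–min count contrast over a (smoothed) flood-image region. A toy sketch (the real NEMA procedure prescribes matrix sizes, smoothing kernels, and UFOV/CFOV regions, all omitted here):

```python
def integral_uniformity(counts):
    """Integral uniformity (%) of a flood region: 100 * (max - min) / (max + min)."""
    flat = [c for row in counts for c in row]
    hi, lo = max(flat), min(flat)
    return 100.0 * (hi - lo) / (hi + lo)

# Toy 3x3 count matrix standing in for a smoothed flood image
flood = [[100, 102, 98],
         [99, 101, 100],
         [103, 97, 100]]
print(integral_uniformity(flood))  # → 3.0
```

Because this single number discards all spatial information, a localized defect can grow for some time before the global value crosses the 5% action limit, which is the gap the image-space methods above are designed to close.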
Abstract:
Background: Ischemia monitoring cannot always be performed by 12-lead ECG. Hence, the individual performance of the ECG leads is crucial. No experimental data on the ECG's specificity for transient ischemia exist. Methods: In 45 patients, a 19-lead ECG was recorded during a 1-minute balloon occlusion of a coronary artery (left anterior descending artery [LAD], right coronary artery [RCA] or left circumflex artery [LCX]). ST-segment shifts and sensitivity/specificity of the leads were measured. Results: During LAD occlusion, V3 showed maximal ST-segment elevation (0.26 mV [IQR 0.16–0.33 mV], p = 0.001) and the best sensitivity/specificity (88% and 80%). During RCA occlusion, III showed maximal ST-segment elevation (0.2 mV [IQR 0.09–0.26 mV], p = 0.004), while aVF had the best sensitivity/specificity (85% and 68%). During LCX occlusion, V6 showed maximal ST-segment elevation (0.04 mV [IQR 0.02–0.14 mV], p = 0.005), with sensitivity/specificity of 31%/92%, which improved to 63%/72% using an optimized cut-off for ischemia. Conclusion: V3, aVF and V6 show the best performance for detecting transient ischemia.
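The per-lead sensitivity/specificity figures above reduce to counting true and false "ischemia" calls when an ST shift exceeds a cutoff. A minimal sketch with invented ST-shift values, not the study's measurements:

```python
def sens_spec(st_shifts, ischemia, cutoff):
    """Sensitivity/specificity of calling ischemia when the ST shift (mV)
    is at or above the cutoff; `ischemia` holds the ground-truth labels."""
    tp = sum(s >= cutoff and i for s, i in zip(st_shifts, ischemia))
    fn = sum(s < cutoff and i for s, i in zip(st_shifts, ischemia))
    tn = sum(s < cutoff and not i for s, i in zip(st_shifts, ischemia))
    fp = sum(s >= cutoff and not i for s, i in zip(st_shifts, ischemia))
    return tp / (tp + fn), tn / (tn + fp)

# Invented example: three occlusion recordings, three baseline recordings
shifts = [0.26, 0.20, 0.04, 0.01, 0.02, 0.15]
truth = [True, True, True, False, False, False]
print(sens_spec(shifts, truth, 0.1))  # sensitivity and specificity, each 2/3 here
```

Raising the cutoff trades sensitivity for specificity, which is why an "optimized cut-off" can rebalance the two numbers for a weakly responding lead such as V6.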
Abstract:
BACKGROUND: Chemotherapies of solid tumors commonly include 5-fluorouracil (5-FU). With standard doses of 5-FU, substantial inter-patient variability has been observed in exposure levels and treatment response. Recently, improved outcomes in colorectal cancer patients due to pharmacokinetically guided 5-FU dosing were reported. We aimed at establishing a rapid and sensitive method for monitoring 5-FU plasma levels in cancer patients in our routine clinical practice. METHODS: Performance of the Saladax My5-FU™ immunoassay was evaluated on the Roche Cobas® Integra 800 analyzer. Subsequently, 5-FU concentrations of 247 clinical plasma samples obtained with this assay were compared to the results obtained by liquid chromatography-tandem mass spectrometry (LC-MS/MS) and other commonly used clinical analyzers (Olympus AU400, Roche Cobas c6000, and Thermo Fisher CDx90). RESULTS: The My5-FU assay was successfully validated on the Cobas Integra 800 analyzer in terms of linearity, precision, accuracy, recovery, interference, sample carryover, and dilution integrity. Method comparison between the Cobas Integra 800 and LC-MS/MS revealed a proportional bias of 7% towards higher values measured with the My5-FU assay. However, when the Cobas Integra 800 was compared to three other clinical analyzers in addition to LC-MS/MS including 50 samples representing the typical clinical range of 5-FU plasma concentrations, only a small proportional bias (≤1.6%) and a constant bias below the limit of detection were observed. CONCLUSIONS: The My5-FU assay demonstrated robust and highly comparable performance on different analyzers. Therefore, the assay is suitable for monitoring 5-FU plasma levels in routine clinical practice and may contribute to improved efficacy and safety of commonly used 5-FU-based chemotherapies.
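A proportional bias like the 7% reported above is conventionally read off a method-comparison regression: the slope of assay results against the reference method. Method-comparison studies typically use Passing–Bablok or Deming regression; plain least squares is shown here only as a sketch, with fabricated concentrations:

```python
def ols_slope_intercept(reference, assay):
    """Least-squares fit assay = slope * reference + intercept.
    (slope - 1) is the proportional bias; the intercept is the constant bias."""
    n = len(reference)
    mx = sum(reference) / n
    my = sum(assay) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(reference, assay))
    sxx = sum((x - mx) ** 2 for x in reference)
    slope = sxy / sxx
    return slope, my - slope * mx

# Fabricated 5-FU concentrations (ng/ml): assay reading consistently 7% high
ref = [100.0, 200.0, 400.0, 800.0]
assay = [107.0, 214.0, 428.0, 856.0]
slope, intercept = ols_slope_intercept(ref, assay)
print(slope, intercept)  # slope ≈ 1.07, i.e. ~7% proportional bias
```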
Abstract:
INTRODUCTION: Voluntary muscle activity, including swallowing, decreases during the night. The association between nocturnal awakenings and swallowing activity is under-researched, with limited information on the frequency of swallows during awake and asleep periods. AIM: The aim of this study was to assess nocturnal swallowing activity and identify a cut-off predicting awake and asleep periods. METHODS: Patients undergoing impedance-pH monitoring as part of GERD work-up were asked to wear a wrist activity detecting device (Actigraph®) at night. Swallowing activity was quantified by analysing impedance changes in the proximal esophagus. Awake and asleep periods were determined using a validated scoring system (Sadeh algorithm). Receiver operating characteristic (ROC) analyses were performed to determine sensitivity, specificity and accuracy of swallowing frequency to identify awake and asleep periods. RESULTS: Data from 76 patients (28 male, 48 female; mean age 56 ± 15 years) were included in the analysis. The ROC analysis found that 0.33 sw/min (i.e. one swallow every 3 min) had the optimal sensitivity (78 %) and specificity (76 %) to differentiate awake from asleep periods. A swallowing frequency of 0.25 sw/min (i.e. one swallow every 4 min) was 93 % sensitive and 57 % specific to identify awake periods. A swallowing frequency of 1 sw/min was 20 % sensitive but 96 % specific in identifying awake periods. CONCLUSIONS: Impedance-pH monitoring detects differences in swallowing activity during awake and asleep periods. Swallowing frequency noticed during ambulatory impedance-pH monitoring can predict the state of consciousness during nocturnal periods.
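The optimal cutoff reported above comes out of a standard ROC sweep: try each candidate swallowing-frequency threshold, compute sensitivity and specificity for "awake", and keep the best trade-off. Youden's J is one common selection criterion; whether the authors used it is an assumption. A sketch on invented per-period frequencies:

```python
def best_cutoff(freqs, awake):
    """Return (cutoff, J, sensitivity, specificity) maximizing Youden's
    J = sens + spec - 1 for the rule 'awake if frequency >= cutoff'."""
    best = None
    for c in sorted(set(freqs)):
        tp = sum(f >= c and a for f, a in zip(freqs, awake))
        fn = sum(f < c and a for f, a in zip(freqs, awake))
        tn = sum(f < c and not a for f, a in zip(freqs, awake))
        fp = sum(f >= c and not a for f, a in zip(freqs, awake))
        sens = tp / (tp + fn) if tp + fn else 0.0
        spec = tn / (tn + fp) if tn + fp else 0.0
        if best is None or sens + spec - 1 > best[1]:
            best = (c, sens + spec - 1, sens, spec)
    return best

# Invented swallows-per-minute values for six nocturnal periods
freqs = [0.10, 0.20, 0.30, 0.40, 0.50, 0.90]
awake = [False, False, False, True, True, True]
print(best_cutoff(freqs, awake))  # → (0.4, 1.0, 1.0, 1.0)
```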
Abstract:
For early diagnosis and therapy of alcohol-related disorders, alcohol biomarkers are highly valuable. Concerning specificity, indirect markers can be influenced by non-ethanol-related factors, whereas direct markers are formed only after ethanol consumption. The sensitivity of the direct markers depends on the cutoffs of the analytical methods and the material used for analysis, and plays an important role in their utilization in different fields of application. Until recently, the biomarker phosphatidylethanol was used to differentiate between social drinking and alcohol abuse. After method optimization, the detection limit could be lowered, and phosphatidylethanol became sensitive enough to detect even the consumption of low amounts of alcohol. This perspective reviews the most common alcohol biomarkers and summarizes new developments for monitoring alcohol consumption habits.
Abstract:
PURPOSE This study assessed whether a cycle of "routine" therapeutic drug monitoring (TDM) for imatinib dosage individualization, targeting an imatinib trough plasma concentration (C min) of 1,000 ng/ml (tolerance: 750-1,500 ng/ml), could improve clinical outcomes in chronic myelogenous leukemia (CML) patients, compared with TDM use only in case of problems ("rescue" TDM). METHODS The Imatinib Concentration Monitoring Evaluation study was a multicenter randomized controlled trial including adult patients in chronic or accelerated phase CML who had been receiving imatinib for less than 5 years. Patients were allocated 1:1 to "routine TDM" or "rescue TDM." The primary endpoint was a combined outcome (failure- and toxicity-free survival with continuation on imatinib) over 1-year follow-up, analyzed by intention-to-treat (ISRCTN31181395). RESULTS Among 56 patients (55 evaluable), 14/27 (52 %) receiving "routine TDM" remained event-free versus 16/28 (57 %) "rescue TDM" controls (P = 0.69). In the "routine TDM" arm, dosage recommendations were correctly adopted in 14 patients (median C min: 895 ng/ml), who had fewer unfavorable events (28 %) than the 13 not receiving the advised dosage (77 %; P = 0.03; median C min: 648 ng/ml). CONCLUSIONS This first target concentration intervention trial could not formally demonstrate a benefit of "routine TDM," owing to the small patient number and surprisingly limited prescriber adherence to the dosage recommendations. Favorable outcomes were, however, found in patients in whom target dosing was actually applied. This study thus provides a first prospective indication that TDM is a useful tool to guide drug dosage and switching decisions. The study design and analysis provide an interesting paradigm for future randomized TDM trials of targeted anticancer agents.
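As an illustration of the dosage-individualization step: the trial's actual recommendation algorithm is not described in the abstract, so simple dose proportionality under an assumed linear pharmacokinetic model is sketched here instead.

```python
def proportional_dose(current_dose_mg, measured_cmin, target_cmin=1000.0):
    """Naive linear-PK adjustment: scale the daily dose so the measured
    trough moves toward the target C_min. This is an illustrative
    assumption, not the trial's recommendation algorithm."""
    return current_dose_mg * target_cmin / measured_cmin

# Hypothetical patient on 400 mg/day with C_min = 648 ng/ml
# (the low-exposure median reported above)
print(round(proportional_dose(400, 648)))  # suggests ~617 mg/day, to be rounded
                                           # to available tablet strengths
```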
Abstract:
OBJECTIVES Many paediatric antiretroviral therapy (ART) programmes in Southern Africa rely on CD4⁺ to monitor ART. We assessed the benefit of replacing CD4⁺ by viral load monitoring. DESIGN A mathematical modelling study. METHODS A simulation model of HIV progression over 5 years in children on ART, parameterized by data from seven South African cohorts. We simulated treatment programmes with 6-monthly CD4⁺ or 6- or 12-monthly viral load monitoring. We compared mortality, second-line ART use, immunological failure and time spent on failing ART. In further analyses, we varied the rate of virological failure, and assumed that the rate is higher with CD4⁺ than with viral load monitoring. RESULTS About 7% of children were predicted to die within 5 years, independent of the monitoring strategy. Compared with CD4⁺ monitoring, 12-monthly viral load monitoring reduced the 5-year risk of immunological failure from 1.6 to 1.0% and the mean time spent on failing ART from 6.6 to 3.6 months; 1% of children with CD4⁺ compared with 12% with viral load monitoring switched to second-line ART. Differences became larger when assuming higher rates of virological failure. When assuming higher virological failure rates with CD4⁺ than with viral load monitoring, up to 4.2% of children with CD4⁺ compared with 1.5% with viral load monitoring experienced immunological failure; the mean time spent on failing ART was 27.3 months with CD4⁺ monitoring and 6.0 months with viral load monitoring. CONCLUSION Viral load monitoring did not affect 5-year mortality, but reduced time on failing ART, improved immunological response and increased switching to second-line ART.
Abstract:
Conservation and monitoring of forest biodiversity require reliable information about forest structure and composition at multiple spatial scales. However, detailed data about forest habitat characteristics across large areas are often incomplete due to difficulties associated with field sampling methods. To overcome this limitation we employed a nationally available light detection and ranging (LiDAR) remote sensing dataset to develop variables describing forest landscape structure across a large environmental gradient in Switzerland. Using a model species indicative of structurally rich mountain forests (hazel grouse Bonasa bonasia), we tested the potential of such variables to predict species occurrence and evaluated the additional benefit of LiDAR data when used in combination with traditional, sample plot-based field variables. We calibrated boosted regression tree (BRT) models for both variable sets separately and in combination, and compared the models' accuracies. While both the field-based and the LiDAR models performed well, combining the two data sources improved the accuracy of the species' habitat model. The variables retained from the two datasets held different types of information: field variables mostly quantified food resources and cover in the field and shrub layers, whereas LiDAR variables characterized the heterogeneity of vegetation structure, which correlated with field variables describing the understory and ground vegetation. When combined with data on forest vegetation composition from field surveys, LiDAR provides valuable complementary information for encompassing species niches more comprehensively. Thus, LiDAR bridges the gap between precise but locally restricted field data and coarse digital land cover information by reliably identifying habitat structure and quality across large areas.
Abstract:
INTRODUCTION Anatomic imaging alone is often inadequate for tuning systemic treatment for individual tumor response. Optically based techniques could potentially contribute to fast and objective response monitoring in personalized cancer therapy. In the present study, we evaluated the feasibility of dual-modality diffuse reflectance spectroscopy-autofluorescence spectroscopy (DRS-AFS) to monitor the effects of systemic treatment in a mouse model for hereditary breast cancer. METHODS Brca1(-/-); p53(-/-) mammary tumors were grown in 36 mice, half of which were treated with a single dose of cisplatin. Changes in the tumor physiology and morphology were measured for a period of 1 week using dual-modality DRS-AFS. Liver and muscle tissues were also measured to distinguish tumor-specific alterations from systemic changes. Model-based analyses were used to derive different optical parameters like the scattering and absorption coefficients, as well as sources of intrinsic fluorescence. Histopathologic analysis was performed for cross-validation with trends in optically based parameters. RESULTS Treated tumors showed a significant decrease in Mie-scattering slope and Mie-to-total scattering fraction and an increase in both fat volume fraction and tissue oxygenation after 2 days of follow-up. Additionally, significant tumor-specific changes in the fluorescence spectra were seen. These longitudinal trends were consistent with changes observed in the histopathologic analysis, such as vital tumor content and formation of fibrosis. CONCLUSIONS This study demonstrates that dual-modality DRS-AFS provides quantitative functional information that corresponds well with the degree of pathologic response. DRS-AFS, in conjunction with other imaging modalities, could be used to optimize systemic cancer treatment on the basis of early individual tumor response.
Abstract:
BACKGROUND The assessment of hemodynamic status is a crucial task in the initial evaluation of trauma patients. However, blood pressure and heart rate are often misleading, as multiple variables may impact these conventional parameters. More reliable methods such as pulmonary artery thermodilution for cardiac output measurement would be necessary, but their applicability in the Emergency Department is questionable due to their invasive nature. Non-invasive cardiac output monitoring devices may be a feasible alternative. METHODS A systematic literature review was conducted. Only studies that explicitly investigated non-invasive hemodynamic monitoring devices in trauma patients were considered. RESULTS A total of 7 studies were identified as suitable and included in this review. These studies, comprising a total of 1,197 trauma patients, evaluated the accuracy of non-invasive hemodynamic monitoring devices by comparing their measurements to pulmonary artery thermodilution, the gold standard for cardiac output measurement. The correlation coefficients r between the two methods ranged from 0.79 to 0.92. Bias and precision analyses ranged from -0.02 +/- 0.78 l/min/m(2) to -0.14 +/- 0.73 l/min/m(2). Additionally, data on practicality, limitations and clinical impact of the devices were collected. CONCLUSION The accuracy of non-invasive cardiac output monitoring devices in trauma patients is broadly satisfactory. As the devices can be applied very early in the shock room or even in the prehospital setting, hemodynamic shock may be recognized much earlier, and therapeutic interventions could be applied more rapidly and more adequately. The devices can be used in the daily routine of a busy ED, as they are non-invasive and easy to master.
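The "bias +/- precision" figures quoted above follow the Bland–Altman convention: the mean and standard deviation of the paired device-minus-thermodilution differences. A sketch on fabricated cardiac-index pairs (l/min/m²), not data from the reviewed studies:

```python
import statistics

def bland_altman(device, reference):
    """Bias (mean difference), precision (SD of differences), and the
    95% limits of agreement between two measurement methods."""
    diffs = [d - r for d, r in zip(device, reference)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, sd, (bias - 1.96 * sd, bias + 1.96 * sd)

# Fabricated cardiac-index pairs: non-invasive device vs thermodilution
device = [3.1, 4.0, 2.8, 3.5]
thermo = [3.2, 3.9, 3.0, 3.6]
bias, sd, loa = bland_altman(device, thermo)
print(bias, sd)  # small negative bias, SD of roughly 0.13
```

Unlike a correlation coefficient, which only measures how well the methods co-vary, the bias and limits of agreement say how far an individual device reading may sit from the thermodilution value, which is the clinically relevant question.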
Abstract:
Purpose: Selective retina therapy (SRT) has shown great promise compared to conventional retinal laser photocoagulation as it avoids collateral damage and selectively targets the retinal pigment epithelium (RPE). Its use, however, is challenging in terms of therapy monitoring and dosage because an immediate tissue reaction is not biomicroscopically discernible. To overcome these limitations, real-time optical coherence tomography (OCT) might be useful to monitor retinal tissue during laser application. We have thus evaluated a proprietary OCT system for its capability of mapping optical changes introduced by SRT in retinal tissue. Methods: Freshly enucleated porcine eyes, covered in DMEM upon collection, were utilized as an ex-vivo model, and a total of 175 scans were analyzed. The results were compared to two time-resolved OCT scans recorded from a patient undergoing SRT treatment (SRT Vario, Medical Laser Center Lübeck). In addition to OCT, fluorescein angiography and fundus photography were performed on the patient, and the OCT scans were subsequently investigated for optical tissue changes linked to laser application. Results: Biomicroscopically invisible SRT lesions were detectable in OCT by changes in the RPE/Bruch's membrane complex both in vivo and in the porcine ex-vivo model. Laser application produced clearly visible optical effects such as hyperreflectivity and tissue distortion in the treated retina. Tissue effects were discernible in time-resolved OCT imaging even when no hyperreflectivity persisted after treatment. The ex-vivo porcine eyes showed similar or identical optical changes, and the effects visible in OCT appeared to correlate with the applied pulse energy, with an additional reflective layer appearing when lesions became visible on indirect ophthalmoscopy.
Conclusions: Our results support the hypothesis that real-time high-resolution OCT may be a promising modality to obtain additional information about the extent of tissue damage caused by SRT treatment. The data show that our ex-vivo porcine model adequately reproduces the effects occurring in vivo, and it can thus be used to further investigate this promising imaging technique.
Abstract:
BACKGROUND Patients requiring anticoagulation often suffer from comorbidities such as hypertension. On the occasion of INR monitoring, general practitioners (GPs) have the opportunity to also control blood pressure (BP). We aimed to evaluate the impact of vitamin K antagonist (VKA) monitoring by GPs on BP control in patients with hypertension. METHODS We cross-sectionally analyzed the database of the Swiss Family Medicine ICPC Research using Electronic Medical Records (FIRE) project, covering 60 general practices in a primary care setting in Switzerland. This database includes 113,335 patients who visited their GP between 2009 and 2013. We identified patients with hypertension based on antihypertensive medication prescribed for ≥6 months. We compared patients on a VKA for ≥3 months with patients without such treatment regarding BP control, adjusting for age, sex, observation period, number of consultations and comorbidity. RESULTS We identified 4,412 patients with hypertension and blood pressure recordings in the FIRE database. Among these, 569 (12.9 %) were on phenprocoumon (a VKA) and 3,843 (87.1 %) had no anticoagulation. Mean systolic and diastolic BP were significantly lower in the VKA group (130.6 ± 14.9 vs 139.8 ± 15.8 and 76.6 ± 7.9 vs 81.3 ± 9.3 mm Hg; p < 0.001 for both). The difference remained after adjusting for possible confounders: systolic and diastolic BP were significantly lower in the VKA group, with mean differences of -8.4 mm Hg (95 % CI -9.8 to -7.0 mm Hg) and -1.5 mm Hg (95 % CI -2.3 to -0.7 mm Hg), respectively (p < 0.001 for both). CONCLUSIONS In a large sample of hypertensive patients in Switzerland, VKA treatment was independently associated with better systolic and diastolic BP control. The observed effect could be due to better compliance with antihypertensive medication in patients treated with a VKA. GPs should be aware of this possible benefit, especially in patients with lower expected compliance and with multimorbidity.