60 results for Monitoring methods


Relevance: 30.00%

Publisher:

Abstract:

RATIONALE: Interstitial lung disease (ILD) in patients with systemic sclerosis (SSc) is associated with increased morbidity and mortality. Gastroesophageal reflux (GER) is considered a contributing factor in the pathogenesis of ILD. OBJECTIVES: To characterize GER (acid and nonacid) in patients with SSc with and without ILD. METHODS: Patients with SSc underwent pulmonary high-resolution computed tomography (HRCT) and 24-hour impedance-pH monitoring off proton pump inhibitor therapy. The presence of pulmonary fibrosis was assessed using validated HRCT scores. Reflux monitoring parameters included number of acid and nonacid reflux episodes, proximal migration of the refluxate, and distal esophageal acid exposure. Unless otherwise specified, data are presented as median (25th-75th percentile). MEASUREMENTS AND MAIN RESULTS: Forty consecutive patients with SSc (35 female; mean age, 53 yr; range, 24-71; 15 patients with diffuse and 25 with limited SSc) were investigated; 18 (45%) patients with SSc had pulmonary fibrosis (HRCT score ≥ 7). Patients with SSc with ILD had higher (P < 0.01) esophageal acid exposure (10.3 [7.5-15] vs. 5.2 [1.5-11]), higher (P < 0.01) numbers of acid (41 [31-58] vs. 19 [10-23]) and nonacid (25 [20-35] vs. 17 [11-19]) reflux episodes, and a higher (P < 0.01) number of reflux episodes reaching the proximal esophagus (42.5 [31-54] vs. 15 [8-22]) compared with patients with SSc with normal HRCT scores. Pulmonary fibrosis scores (HRCT score) correlated well with the number of reflux episodes in the distal (r² = 0.637) and proximal (r² = 0.644) esophagus. CONCLUSIONS: Patients with SSc with ILD have more severe reflux (i.e., more reflux episodes and more reflux reaching the proximal esophagus). Whether the development of ILD in patients with SSc can be prevented by reflux-reducing treatments needs to be investigated.
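The r² values reported above are squared Pearson correlation coefficients. A minimal stdlib-Python sketch of that statistic (the function name and the sample numbers in the test are illustrative, not the study's data):

```python
def r_squared(x, y):
    # Squared Pearson correlation coefficient: the fraction of variance in y
    # accounted for by a linear association with x (as reported in the abstract).
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return (sxy * sxy) / (sxx * syy)
```

An r² around 0.64, as reported for the proximal esophagus, would mean roughly two-thirds of the variance in the fibrosis score is shared with a linear function of the reflux-episode count.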


BACKGROUND: Difference in pulse pressure (dPP) reliably predicts fluid responsiveness in patients. We have developed a respiratory variation (RV) monitoring device (RV monitor), which continuously records both airway pressure and arterial blood pressure (ABP). We compared the RV monitor measurements with manual dPP measurements. METHODS: ABP and airway pressure (PAW) from 24 patients were recorded. Data were fed to the RV monitor to calculate dPP and systolic pressure variation in two different ways: (a) considering both ABP and PAW (RV algorithm) and (b) ABP only (RV(slim) algorithm). Additionally, ABP and PAW were recorded intraoperatively in 10-min intervals for later calculation of dPP by manual assessment. Interobserver variability was determined. Manual dPP assessments were used for comparison with automated measurements. To estimate the importance of the PAW signal, RV(slim) measurements were compared with RV measurements. RESULTS: For the 24 patients, 174 measurements (6-10 per patient) were recorded. Six observers assessed dPP manually in the first 8 patients (10-min intervals, 53 measurements); no interobserver variability occurred using a computer-assisted method. Bland-Altman analysis showed acceptable bias and limits of agreement for the two automated methods compared with the manual method (RV: -0.33% ± 8.72% and RV(slim): -1.74% ± 7.97%). The difference between RV and RV(slim) measurements is small (bias -1.05%, limits of agreement 5.67%). CONCLUSIONS: Measurements of the automated device are comparable with measurements obtained by human observers using a computer-assisted method. The importance of the PAW signal is questionable.
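The Bland-Altman agreement analysis used above reduces to the mean difference (bias) and bias ± 1.96 SD of the differences (limits of agreement). A minimal sketch, assuming paired percentage measurements from two methods (function name and test values are illustrative):

```python
def bland_altman(method_a, method_b):
    # Bias = mean of paired differences; limits of agreement = bias ± 1.96 * SD
    # of the differences (sample SD, n - 1 denominator).
    diffs = [a - b for a, b in zip(method_a, method_b)]
    n = len(diffs)
    bias = sum(diffs) / n
    var = sum((d - bias) ** 2 for d in diffs) / (n - 1)
    sd = var ** 0.5
    return bias, bias - 1.96 * sd, bias + 1.96 * sd
```

With this convention, a result like "-0.33% ± 8.72%" reads as a bias of -0.33% with limits of agreement from about -9.05% to +8.39%.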


Campylobacter, a major zoonotic pathogen, displays seasonality in poultry and in humans. To identify temporal patterns in the prevalence of thermophilic Campylobacter spp. in a voluntary monitoring programme in broiler flocks in Germany and in the reported human incidence, time series methods were used. The data cover the period from May 2004 to June 2007. Using seasonal decomposition, autocorrelation and cross-correlation functions, it could be shown that an annual seasonality is present. However, the peak month differs between sample submission, prevalence in broilers and human incidence. Strikingly, the peak in human campylobacteriosis preceded the peak in broiler prevalence in Lower Saxony rather than occurring after it. Significant cross-correlations were identified between monthly temperature and prevalence in broilers, as well as between human incidence and monthly temperature, rainfall and wind force. The results highlight the necessity to quantify the transmission of Campylobacter from broilers to humans and to include climatic factors in order to gain further insight into the epidemiology of this zoonotic disease.
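The cross-correlation functions mentioned above compare two equally spaced monthly series at different lags. A stdlib-Python sketch of the sample cross-correlation at a given lag (function name and the test series are hypothetical, not the study's data):

```python
def cross_correlation(x, y, lag):
    # Sample cross-correlation of two equally spaced series at a given lag:
    # correlates x[t] with y[t + lag], so a positive lag means y trails x.
    if lag >= 0:
        xs, ys = x[:len(x) - lag], y[lag:]
    else:
        xs, ys = x[-lag:], y[:len(y) + lag]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    num = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    den = (sum((a - mx) ** 2 for a in xs)
           * sum((b - my) ** 2 for b in ys)) ** 0.5
    return num / den
```

Scanning lags on either side of zero is what reveals whether, say, the human-incidence peak leads or trails the broiler-prevalence peak.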


INTRODUCTION Vasospastic brain infarction is a devastating complication of aneurysmal subarachnoid hemorrhage (SAH). Using a probe for invasive monitoring of brain tissue oxygenation or blood flow is highly focal and may miss the site of cerebral vasospasm (CVS). Probe placement is based on the assumption that the spasm will occur either at the dependent vessel territory of the parent artery of the ruptured aneurysm or at the artery exposed to the focal thick blood clot. We investigated the likelihood of a focal monitoring sensor being placed in vasospasm or infarction territory on a hypothetical basis. METHODS From our database we retrospectively selected consecutive SAH patients with angiographically proven (day 7-14) severe CVS (narrowing of vessel lumen >50%). Depending on the aneurysm location we applied a standard protocol of probe placement to detect the most probable site of severe CVS or infarction. We analyzed whether the placement was congruent with existing CVS/infarction. RESULTS We analyzed 100 patients after SAH caused by aneurysms in the following locations: MCA (n = 14), ICA (n = 30), A1CA (n = 4), ACoA or A2CA (n = 33), and VBA (n = 19). Sensor location corresponded with CVS territory in 93% of MCA, 87% of ICA, 76% of ACoA or A2CA, but only 50% of A1CA and 42% of VBA aneurysms. The focal probe was located inside the infarction territory in 95% of ICA, 89% of MCA, 78% of ACoA or A2CA, 50% of A1CA and 23% of VBA aneurysms. CONCLUSION The probability that a single focal probe will be situated in the territory of severe CVS and infarction varies. It seems to be reasonably accurate for MCA and ICA aneurysms, but not for ACA or VBA aneurysms.


OBJECTIVES Mortality in patients starting antiretroviral therapy (ART) is higher in Malawi and Zambia than in South Africa. We examined whether different monitoring of ART (viral load [VL] in South Africa and CD4 count in Malawi and Zambia) could explain this mortality difference. DESIGN Mathematical modelling study based on data from ART programmes. METHODS We used a stochastic simulation model to study the effect of VL monitoring on mortality over 5 years. In baseline scenario A all parameters were identical between strategies except for more timely and complete detection of treatment failure with VL monitoring. Additional scenarios introduced delays in switching to second-line ART (scenario B) or higher virologic failure rates (due to worse adherence) when monitoring was based on CD4 counts only (scenario C). Results are presented as relative risks (RR) with 95% prediction intervals and percent of observed mortality difference explained. RESULTS RRs comparing VL with CD4 cell count monitoring were 0.94 (0.74-1.03) in scenario A, 0.94 (0.77-1.02) with delayed switching (scenario B) and 0.80 (0.44-1.07) when assuming a 3-times higher rate of failure (scenario C). The observed mortality at 3 years was 10.9% in Malawi and Zambia and 8.6% in South Africa (absolute difference 2.3%). The percentage of the mortality difference explained by VL monitoring ranged from 4% (scenario A) to 32% (scenarios B and C combined, assuming a 3-times higher failure rate). Eleven percent was explained by non-HIV related mortality. CONCLUSIONS VL monitoring reduces mortality moderately when assuming improved adherence and decreased failure rates.


BACKGROUND Monitoring of HIV viral load in patients on combination antiretroviral therapy (ART) is not generally available in resource-limited settings. We examined the cost-effectiveness of qualitative point-of-care viral load tests (POC-VL) in sub-Saharan Africa. DESIGN Mathematical model based on longitudinal data from the Gugulethu and Khayelitsha township ART programmes in Cape Town, South Africa. METHODS Cohorts of patients on ART monitored by POC-VL, CD4 cell count or clinically were simulated. Scenario A considered the more accurate detection of treatment failure with POC-VL only, and scenario B also considered the effect on HIV transmission. Scenario C further assumed that the risk of virologic failure is halved with POC-VL due to improved adherence. We estimated the change in costs per quality-adjusted life-year gained (incremental cost-effectiveness ratios, ICERs) of POC-VL compared with CD4 and clinical monitoring. RESULTS POC-VL tests with detection limits below 1,000 copies/ml increased costs due to unnecessary switches to second-line ART, without improving survival. Assuming POC-VL unit costs between US$5 and US$20 and detection limits between 1,000 and 10,000 copies/ml, the ICER of POC-VL was US$4,010-US$9,230 compared with clinical monitoring and US$5,960-US$25,540 compared with CD4 cell count monitoring. In scenario B, the corresponding ICERs were US$2,450-US$5,830 and US$2,230-US$10,380. In scenario C, the ICER ranged between US$960 and US$2,500 compared with clinical monitoring and between cost-saving and US$2,460 compared with CD4 monitoring. CONCLUSION The cost-effectiveness of POC-VL for monitoring ART is improved by a higher detection limit, by taking the reduction in new HIV infections into account, and by assuming that failure of first-line ART is reduced due to targeted adherence counselling.


PURPOSE Little data is available on noninvasive MRI-based assessment of renal function during upper urinary tract (UUT) obstruction. In this study, we determined whether functional multiparametric kidney MRI is able to monitor treatment response in acute unilateral UUT obstruction. MATERIAL AND METHODS Between 01/2008 and 01/2010, 18 patients with acute unilateral UUT obstruction due to calculi were prospectively enrolled to undergo kidney MRI with conventional, blood oxygen level-dependent (BOLD) and diffusion-weighted (DW) sequences on emergency admission and after release of obstruction. Functional imaging parameters of the obstructed and contralateral unobstructed kidneys derived from BOLD (apparent spin relaxation rate [R2*]) and DW (total apparent diffusion coefficient [ADCT], pure diffusion coefficient [ADCD] and perfusion fraction [FP]) sequences were assessed during acute UUT obstruction and after its release. RESULTS During acute obstruction, R2* and FP values were lower in the cortex (p=0.020 and p=0.031, respectively) and medulla (p=0.012 and p=0.190, respectively) of the obstructed compared to the contralateral unobstructed kidneys. After release of obstruction, R2* and FP values increased both in the cortex (p=0.016 and p=0.004, respectively) and medulla (p=0.071 and p=0.044, respectively) of the formerly obstructed kidneys to values similar to those found in the contralateral kidneys. ADCT and ADCD values did not significantly differ between obstructed and contralateral unobstructed kidneys during or after obstruction. CONCLUSIONS In our patients with acute unilateral UUT obstruction due to calculi, functional kidney MRI using BOLD and DW sequences allowed for the monitoring of pathophysiologic changes of obstructed kidneys during obstruction and after its release.


The identification of plausible causes for water body status deterioration will be much easier if it can build on available, reliable, extensive and comprehensive biogeochemical monitoring data (preferably aggregated in a database). A plausible identification of such causes is a prerequisite for well-informed decisions on which mitigation or remediation measures to take. In this chapter, a rationale for an extended monitoring programme is first provided and compared to the one required by the Water Framework Directive (WFD). This proposal includes a list of relevant parameters that are needed for an integrated, a priori status assessment. Second, a few sophisticated statistical tools are described that subsequently allow for the estimation of the magnitude of impairment as well as the likely relative importance of different stressors in a multiply stressed environment. The advantages and restrictions of these rather complicated analytical methods are discussed. Finally, the use of Decision Support Systems (DSS) is advocated with regard to the specific WFD implementation requirements.


Background: Ischemia monitoring cannot always be performed by 12-lead ECG. Hence, the individual performance of the ECG leads is crucial. No experimental data on the ECG's specificity for transient ischemia exist. Methods: In 45 patients a 19-lead ECG was registered during a 1-minute balloon occlusion of a coronary artery (left anterior descending artery [LAD], right coronary artery [RCA] or left circumflex artery [LCX]). ST-segment shifts and sensitivity/specificity of the leads were measured. Results: During LAD occlusion, V3 showed maximal ST-segment elevation (0.26 mV [IQR 0.16–0.33 mV], p = 0.001) and the best sensitivity/specificity (88%/80%). During RCA occlusion, III showed maximal ST-segment elevation (0.2 mV [IQR 0.09–0.26 mV], p = 0.004), while aVF had the best sensitivity/specificity (85%/68%). During LCX occlusion, V6 showed maximal ST-segment elevation (0.04 mV [IQR 0.02–0.14 mV], p = 0.005) with sensitivity/specificity of 31%/92%, which improved to 63%/72% with an optimized cut-off for ischemia. Conclusion: V3, aVF and V6 show the best performance in detecting transient ischemia.
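Per-lead sensitivity and specificity at a given ST-shift cut-off follow directly from the occlusion and control recordings. A sketch of that computation (function name, cut-off and test values are illustrative only, not the study's data):

```python
def sens_spec(st_shifts_ischemia, st_shifts_control, cutoff):
    # Sensitivity: fraction of occlusion recordings with ST shift >= cutoff.
    # Specificity: fraction of control recordings with ST shift < cutoff.
    tp = sum(1 for v in st_shifts_ischemia if v >= cutoff)
    tn = sum(1 for v in st_shifts_control if v < cutoff)
    return tp / len(st_shifts_ischemia), tn / len(st_shifts_control)
```

Raising the cut-off trades sensitivity for specificity, which is why an "optimized" cut-off can lift V6 from 31%/92% to a more balanced 63%/72%.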


BACKGROUND: Chemotherapies of solid tumors commonly include 5-fluorouracil (5-FU). With standard doses of 5-FU, substantial inter-patient variability has been observed in exposure levels and treatment response. Recently, improved outcomes in colorectal cancer patients due to pharmacokinetically guided 5-FU dosing were reported. We aimed at establishing a rapid and sensitive method for monitoring 5-FU plasma levels in cancer patients in our routine clinical practice. METHODS: Performance of the Saladax My5-FU™ immunoassay was evaluated on the Roche Cobas® Integra 800 analyzer. Subsequently, 5-FU concentrations of 247 clinical plasma samples obtained with this assay were compared to the results obtained by liquid chromatography-tandem mass spectrometry (LC-MS/MS) and other commonly used clinical analyzers (Olympus AU400, Roche Cobas c6000, and Thermo Fisher CDx90). RESULTS: The My5-FU assay was successfully validated on the Cobas Integra 800 analyzer in terms of linearity, precision, accuracy, recovery, interference, sample carryover, and dilution integrity. Method comparison between the Cobas Integra 800 and LC-MS/MS revealed a proportional bias of 7% towards higher values measured with the My5-FU assay. However, when the Cobas Integra 800 was compared to three other clinical analyzers in addition to LC-MS/MS including 50 samples representing the typical clinical range of 5-FU plasma concentrations, only a small proportional bias (≤1.6%) and a constant bias below the limit of detection was observed. CONCLUSIONS: The My5-FU assay demonstrated robust and highly comparable performance on different analyzers. Therefore, the assay is suitable for monitoring 5-FU plasma levels in routine clinical practice and may contribute to improved efficacy and safety of commonly used 5-FU-based chemotherapies.
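A proportional bias between two methods can be estimated from the slope of a regression of the test method on the reference method. The abstract does not state which regression was used (method-comparison studies often use Passing-Bablok or Deming regression), so this sketch uses ordinary least squares purely for illustration; names and test values are hypothetical:

```python
def proportional_bias(reference, test):
    # Ordinary least-squares slope of test vs. reference concentrations;
    # (slope - 1) * 100 gives the proportional bias in percent.
    # NOTE: OLS is an illustrative simplification; it ignores measurement
    # error in the reference method, unlike Deming or Passing-Bablok.
    n = len(reference)
    mx = sum(reference) / n
    my = sum(test) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(reference, test))
             / sum((x - mx) ** 2 for x in reference))
    return (slope - 1.0) * 100.0
```

A slope of 1.07 corresponds to the 7% proportional bias towards higher My5-FU values reported above.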


INTRODUCTION: Voluntary muscle activity, including swallowing, decreases during the night. The association between nocturnal awakenings and swallowing activity is under-researched, with limited information on the frequency of swallows during awake and asleep periods. AIM: The aim of this study was to assess nocturnal swallowing activity and identify a cut-off predicting awake and asleep periods. METHODS: Patients undergoing impedance-pH monitoring as part of GERD work-up were asked to wear a wrist activity detecting device (Actigraph(®)) at night. Swallowing activity was quantified by analysing impedance changes in the proximal esophagus. Awake and asleep periods were determined using a validated scoring system (Sadeh algorithm). Receiver operating characteristic (ROC) analyses were performed to determine the sensitivity, specificity and accuracy of swallowing frequency in identifying awake and asleep periods. RESULTS: Data from 76 patients (28 male, 48 female; mean age 56 ± 15 years) were included in the analysis. The ROC analysis found that 0.33 sw/min (i.e. one swallow every 3 min) had the optimal sensitivity (78%) and specificity (76%) to differentiate awake from asleep periods. A swallowing frequency of 0.25 sw/min (i.e. one swallow every 4 min) was 93% sensitive and 57% specific in identifying awake periods. A swallowing frequency of 1 sw/min was 20% sensitive but 96% specific in identifying awake periods. CONCLUSIONS: Impedance-pH monitoring detects differences in swallowing activity during awake and asleep periods. Swallowing frequency noticed during ambulatory impedance-pH monitoring can predict the state of consciousness during nocturnal periods.
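An "optimal" ROC cut-off is commonly the one maximizing Youden's J (sensitivity + specificity - 1); the abstract does not name its criterion, so that choice is an assumption here. An illustrative stdlib-Python sketch with hypothetical per-period swallowing rates:

```python
def best_cutoff(awake_rates, asleep_rates, candidates):
    # Pick the swallowing-frequency cutoff (sw/min) maximizing Youden's J
    # (sensitivity + specificity - 1) for classifying awake periods.
    # Rates >= cutoff are classified as awake.
    best, best_j = None, -1.0
    for c in candidates:
        sens = sum(1 for v in awake_rates if v >= c) / len(awake_rates)
        spec = sum(1 for v in asleep_rates if v < c) / len(asleep_rates)
        j = sens + spec - 1.0
        if j > best_j:
            best, best_j = c, j
    return best, best_j
```

Sweeping candidate cut-offs this way is how a single operating point like 0.33 sw/min (78% sensitive, 76% specific) is selected from the full ROC curve.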


For early diagnosis and therapy of alcohol-related disorders, alcohol biomarkers are highly valuable. Concerning specificity, indirect markers can be influenced by nonethanol-related factors, whereas direct markers are only formed after ethanol consumption. Sensitivity of the direct markers depends on cutoffs of analytical methods, material for analysis and plays an important role for their utilization in different fields of application. Until recently, the biomarker phosphatidylethanol has been used to differentiate between social drinking and alcohol abuse. After method optimization, the detection limit could be lowered and phosphatidylethanol became sensitive enough to even detect the consumption of low amounts of alcohol. This perspective gives a summary of most common alcohol biomarkers and summarizes new developments for monitoring alcohol consumption habits.


PURPOSE This study assessed whether a cycle of "routine" therapeutic drug monitoring (TDM) for imatinib dosage individualization, targeting an imatinib trough plasma concentration (C min) of 1,000 ng/ml (tolerance: 750-1,500 ng/ml), could improve clinical outcomes in chronic myelogenous leukemia (CML) patients, compared with TDM use only in case of problems ("rescue" TDM). METHODS The Imatinib Concentration Monitoring Evaluation study was a multicenter randomized controlled trial including adult patients in chronic or accelerated phase CML who had been receiving imatinib for less than 5 years. Patients were allocated 1:1 to "routine TDM" or "rescue TDM." The primary endpoint was a combined outcome (failure- and toxicity-free survival with continuation on imatinib) over 1-year follow-up, analyzed in intention-to-treat (ISRCTN31181395). RESULTS Among 56 patients (55 evaluable), 14/27 (52%) receiving "routine TDM" remained event-free versus 16/28 (57%) "rescue TDM" controls (P = 0.69). In the "routine TDM" arm, dosage recommendations were correctly adopted in 14 patients (median C min: 895 ng/ml), who had fewer unfavorable events (28%) than the 13 not receiving the advised dosage (77%; P = 0.03; median C min: 648 ng/ml). CONCLUSIONS This first target concentration intervention trial could not formally demonstrate a benefit of "routine TDM," owing to the small patient number and surprisingly limited prescriber adherence to dosage recommendations. Favorable outcomes were, however, found in patients actually elected for target dosing. This study thus provides a first prospective indication that TDM can be a useful tool to guide drug dosage and shift decisions. The study design and analysis provide an interesting paradigm for future randomized TDM trials on targeted anticancer agents.


OBJECTIVES Many paediatric antiretroviral therapy (ART) programmes in Southern Africa rely on CD4⁺ cell counts to monitor ART. We assessed the benefit of replacing CD4⁺ with viral load monitoring. DESIGN A mathematical modelling study. METHODS A simulation model of HIV progression over 5 years in children on ART, parameterized by data from seven South African cohorts. We simulated treatment programmes with 6-monthly CD4⁺ or 6- or 12-monthly viral load monitoring. We compared mortality, second-line ART use, immunological failure and time spent on failing ART. In further analyses, we varied the rate of virological failure, and assumed that the rate is higher with CD4⁺ than with viral load monitoring. RESULTS About 7% of children were predicted to die within 5 years, independent of the monitoring strategy. Compared with CD4⁺ monitoring, 12-monthly viral load monitoring reduced the 5-year risk of immunological failure from 1.6% to 1.0% and the mean time spent on failing ART from 6.6 to 3.6 months; 1% of children with CD4⁺ monitoring compared with 12% with viral load monitoring switched to second-line ART. Differences became larger when assuming higher rates of virological failure. When assuming higher virological failure rates with CD4⁺ than with viral load monitoring, up to 4.2% of children with CD4⁺ compared with 1.5% with viral load monitoring experienced immunological failure; the mean time spent on failing ART was 27.3 months with CD4⁺ monitoring and 6.0 months with viral load monitoring. CONCLUSION Viral load monitoring did not affect 5-year mortality, but reduced time on failing ART, improved immunological response and increased switching to second-line ART.


Conservation and monitoring of forest biodiversity requires reliable information about forest structure and composition at multiple spatial scales. However, detailed data about forest habitat characteristics across large areas are often incomplete due to difficulties associated with field sampling methods. To overcome this limitation, we employed a nationally available light detection and ranging (LiDAR) remote sensing dataset to develop variables describing forest landscape structure across a large environmental gradient in Switzerland. Using a model species indicative of structurally rich mountain forests (hazel grouse Bonasa bonasia), we tested the potential of such variables to predict species occurrence and evaluated the additional benefit of LiDAR data when used in combination with traditional, sample plot-based field variables. We calibrated boosted regression trees (BRT) models for both variable sets separately and in combination, and compared the models’ accuracies. While both field-based and LiDAR models performed well, combining the two data sources improved the accuracy of the species’ habitat model. The variables retained from the two datasets held different types of information: field variables mostly quantified food resources and cover in the field and shrub layer, while LiDAR variables characterized the heterogeneity of vegetation structure, which correlated with field variables describing the understory and ground vegetation. When combined with data on forest vegetation composition from field surveys, LiDAR provides valuable complementary information for encompassing species niches more comprehensively. Thus, LiDAR bridges the gap between precise, locally restricted field data and coarse digital land cover information by reliably identifying habitat structure and quality across large areas.