60 results for Monitoring methods


Relevance: 30.00%

Abstract:

INTRODUCTION Anatomic imaging alone is often inadequate for tuning systemic treatment for individual tumor response. Optically based techniques could potentially contribute to fast and objective response monitoring in personalized cancer therapy. In the present study, we evaluated the feasibility of dual-modality diffuse reflectance spectroscopy-autofluorescence spectroscopy (DRS-AFS) to monitor the effects of systemic treatment in a mouse model for hereditary breast cancer. METHODS Brca1(-/-); p53(-/-) mammary tumors were grown in 36 mice, half of which were treated with a single dose of cisplatin. Changes in the tumor physiology and morphology were measured for a period of 1 week using dual-modality DRS-AFS. Liver and muscle tissues were also measured to distinguish tumor-specific alterations from systemic changes. Model-based analyses were used to derive different optical parameters like the scattering and absorption coefficients, as well as sources of intrinsic fluorescence. Histopathologic analysis was performed for cross-validation with trends in optically based parameters. RESULTS Treated tumors showed a significant decrease in Mie-scattering slope and Mie-to-total scattering fraction and an increase in both fat volume fraction and tissue oxygenation after 2 days of follow-up. Additionally, significant tumor-specific changes in the fluorescence spectra were seen. These longitudinal trends were consistent with changes observed in the histopathologic analysis, such as vital tumor content and formation of fibrosis. CONCLUSIONS This study demonstrates that dual-modality DRS-AFS provides quantitative functional information that corresponds well with the degree of pathologic response. DRS-AFS, in conjunction with other imaging modalities, could be used to optimize systemic cancer treatment on the basis of early individual tumor response.
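
For readers unfamiliar with how such scattering parameters are obtained, the sketch below (not the authors' code) illustrates one decomposition of the reduced scattering spectrum into Mie and Rayleigh contributions that is commonly used in DRS analyses, from which a Mie-scattering slope and a Mie-to-total scattering fraction, as reported above, can be estimated; all variable names, values and the reference wavelength are illustrative assumptions.

    import numpy as np
    from scipy.optimize import curve_fit

    def mu_s_prime(wavelength_nm, a, f_mie, b_mie, lambda0_nm=800.0):
        # mu_s'(lambda) = a * [ f_mie*(lambda/lambda0)^(-b_mie) + (1-f_mie)*(lambda/lambda0)^(-4) ]
        rel = wavelength_nm / lambda0_nm
        return a * (f_mie * rel ** (-b_mie) + (1.0 - f_mie) * rel ** (-4.0))

    wavelengths = np.linspace(500, 900, 60)                              # nm
    measured = mu_s_prime(wavelengths, a=16.0, f_mie=0.8, b_mie=1.1)     # synthetic spectrum, 1/cm
    measured += np.random.normal(0.0, 0.2, wavelengths.size)             # add measurement noise

    (a_hat, f_hat, b_hat), _ = curve_fit(mu_s_prime, wavelengths, measured,
                                         p0=(10.0, 0.5, 1.0), bounds=([0, 0, 0], [np.inf, 1, 4]))
    print(f"Mie-to-total scattering fraction = {f_hat:.2f}, Mie-scattering slope = {b_hat:.2f}")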

Relevance: 30.00%

Abstract:

BACKGROUND The assessment of hemodynamic status is a crucial task in the initial evaluation of trauma patients. However, blood pressure and heart rate are often misleading, as multiple variables may impact these conventional parameters. More reliable methods such as pulmonary artery thermodilution for cardiac output measurement would be necessary, but their applicability in the Emergency Department is questionable due to their invasive nature. Non-invasive cardiac output monitoring devices may be a feasible alternative. METHODS A systematic literature review was conducted. Only studies that explicitly investigated non-invasive hemodynamic monitoring devices in trauma patients were considered. RESULTS A total of 7 studies were identified as suitable and included in this review. These studies, covering a total of 1,197 trauma patients, evaluated the accuracy of non-invasive hemodynamic monitoring devices by comparing their measurements to pulmonary artery thermodilution, the gold standard for cardiac output measurement. The correlation coefficients r between the two methods ranged from 0.79 to 0.92. Bias and precision ranged from -0.02 ± 0.78 L/min/m² to -0.14 ± 0.73 L/min/m². Additionally, data on practicality, limitations and clinical impact of the devices were collected. CONCLUSION The accuracy of non-invasive cardiac output monitoring devices in trauma patients is broadly satisfactory. As the devices can be applied very early in the shock room or even in the prehospital setting, hemodynamic shock may be recognized much earlier and therapeutic interventions initiated more rapidly and more adequately. The devices can be used in the daily routine of a busy ED, as they are non-invasive and easy to master.
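
As a minimal illustration of how such agreement statistics are derived (hypothetical numbers, not data from the review), the bias is the mean of the paired differences between the non-invasive device and thermodilution, and the precision is their standard deviation:

    import numpy as np

    thermodilution = np.array([2.8, 3.1, 2.5, 3.6, 4.0, 2.9, 3.3])   # cardiac index, L/min/m^2
    noninvasive    = np.array([2.7, 3.2, 2.4, 3.5, 3.8, 3.0, 3.2])   # paired device readings

    differences = noninvasive - thermodilution
    bias = differences.mean()                        # systematic offset between methods
    precision = differences.std(ddof=1)              # spread of the differences
    loa = (bias - 1.96 * precision, bias + 1.96 * precision)   # 95% limits of agreement
    r = np.corrcoef(noninvasive, thermodilution)[0, 1]

    print(f"bias = {bias:+.2f}, precision = {precision:.2f} L/min/m^2, "
          f"limits of agreement {loa[0]:+.2f} to {loa[1]:+.2f}, r = {r:.2f}")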

Relevance: 30.00%

Abstract:

Purpose: Selective retina therapy (SRT) has shown great promise compared to conventional retinal laser photocoagulation, as it avoids collateral damage and selectively targets the retinal pigment epithelium (RPE). Its use, however, is challenging in terms of therapy monitoring and dosage because an immediate tissue reaction is not biomicroscopically discernible. To overcome these limitations, real-time optical coherence tomography (OCT) might be useful to monitor retinal tissue during laser application. We have therefore evaluated a proprietary OCT system for its capability of mapping optical changes introduced by SRT in retinal tissue. Methods: Freshly enucleated porcine eyes, covered in DMEM upon collection, were used as an ex-vivo model, and a total of 175 scans were analyzed. The results were compared to two time-resolved OCT scans recorded from a patient undergoing SRT treatment (SRT Vario, Medical Laser Center Lübeck). In addition to OCT, fluorescein angiography and fundus photography were performed on the patient, and OCT scans were subsequently examined for optical tissue changes linked to laser application. Results: Biomicroscopically invisible SRT lesions were detectable in OCT as changes in the RPE/Bruch's membrane complex both in vivo and in the porcine ex-vivo model. Laser application produced clearly visible optical effects such as hyperreflectivity and tissue distortion in the treated retina. Tissue effects were discernible in time-resolved OCT imaging even when no hyperreflectivity persisted after treatment. Data from ex-vivo porcine eyes showed similar, in some cases identical, optical changes, and the effects visible in OCT appeared to correlate with the applied pulse energy, with an additional reflective layer appearing when lesions became visible in indirect ophthalmoscopy. Conclusions: Our results support the hypothesis that real-time high-resolution OCT may be a promising modality to obtain additional information about the extent of tissue damage caused by SRT. The data also show that our ex-vivo porcine model adequately reproduces the effects occurring in vivo and can thus be used to further investigate this promising imaging technique.

Relevance: 30.00%

Abstract:

BACKGROUND Patients requiring anticoagulation often suffer from comorbidities such as hypertension. On the occasion of INR monitoring, general practitioners (GPs) have the opportunity to check blood pressure (BP). We aimed to evaluate the impact of vitamin K antagonist (VKA) monitoring by GPs on BP control in patients with hypertension. METHODS We cross-sectionally analyzed the database of the Swiss Family Medicine ICPC Research using Electronic Medical Records (FIRE) project, which covers 60 general practices in a primary care setting in Switzerland. This database includes 113,335 patients who visited their GP between 2009 and 2013. We identified patients with hypertension based on antihypertensive medication prescribed for ≥6 months. We compared patients on a VKA for ≥3 months with patients without such treatment regarding BP control, adjusting for age, sex, observation period, number of consultations and comorbidity. RESULTS We identified 4,412 patients with hypertension and blood pressure recordings in the FIRE database. Among these, 569 (12.9%) were on phenprocoumon (a VKA) and 3,843 (87.1%) had no anticoagulation. Mean systolic and diastolic BP were significantly lower in the VKA group (130.6 ± 14.9 vs 139.8 ± 15.8 and 76.6 ± 7.9 vs 81.3 ± 9.3 mm Hg; p < 0.001 for both). The difference remained after adjusting for possible confounders: systolic and diastolic BP were significantly lower in the VKA group, with adjusted mean differences of -8.4 mm Hg (95% CI -9.8 to -7.0 mm Hg) and -1.5 mm Hg (95% CI -2.3 to -0.7 mm Hg), respectively (p < 0.001 for both). CONCLUSIONS In a large sample of hypertensive patients in Switzerland, VKA treatment was independently associated with better systolic and diastolic BP control. The observed effect could be due to better compliance with antihypertensive medication in patients treated with a VKA. Clinicians should therefore be aware of this possible benefit, especially in patients with lower expected compliance and with multimorbidity.
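
A hedged sketch of the kind of adjusted comparison described above (toy data and hypothetical column names, not the FIRE dataset): an ordinary least-squares model with BP as the outcome and VKA status as the exposure, where the coefficient on the VKA indicator is the adjusted mean difference in mm Hg.

    import pandas as pd
    import statsmodels.formula.api as smf

    # Toy records for illustration only; the real analysis used the FIRE database.
    df = pd.DataFrame({
        "systolic_bp": [128, 142, 135, 150, 126, 139, 131, 144],
        "vka":         [1,   0,   1,   0,   1,   0,   1,   0],   # 1 = on a VKA for >= 3 months
        "age":         [72,  65,  80,  59,  75,  68,  77,  62],
        "female":      [0,   1,   1,   0,   1,   0,   0,   1],
        "n_consults":  [9,   4,   12,  3,   10,  5,   8,   4],
    })

    model = smf.ols("systolic_bp ~ vka + age + female + n_consults", data=df).fit()
    # Coefficient on 'vka' = adjusted mean difference in systolic BP (mm Hg)
    print(model.params["vka"], model.conf_int().loc["vka"].tolist())

The study additionally adjusted for observation period and comorbidity, which would simply enter such a model as further covariates.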

Relevance: 30.00%

Abstract:

BACKGROUND HIV-1 viral load (VL) testing is recommended to monitor antiretroviral therapy (ART) but is not universally available. We examined monitoring of first-line ART and switching to second-line ART in sub-Saharan Africa, 2004-2013. METHODS Adult HIV-1 infected patients starting combination ART in 16 countries were included. Switching was defined as a change from a non-nucleoside reverse-transcriptase inhibitor (NNRTI)-based regimen to a protease inhibitor (PI)-based regimen, with a change of ≥1 NRTI. Virological and immunological failures were defined per World Health Organization criteria. We calculated cumulative probabilities of switching and hazard ratios with 95% confidence intervals (CI) comparing routine VL monitoring, targeted VL monitoring, CD4 cell monitoring and clinical monitoring, adjusted for programme and individual characteristics. FINDINGS Of 297,825 eligible patients, 10,352 (3.5%) switched during 782,412 person-years of follow-up. Compared with CD4 monitoring, hazard ratios for switching were 3.15 (95% CI 2.92-3.40) for routine VL monitoring, 1.21 (1.13-1.30) for targeted VL monitoring and 0.49 (0.43-0.56) for clinical monitoring. Overall, 58.0% of patients with confirmed virological failure and 19.3% of patients with confirmed immunological failure switched within 2 years. Among patients who switched, the percentage with evidence of treatment failure based on a single CD4 or VL measurement ranged from 32.1% with clinical monitoring to 84.3% with targeted VL monitoring. Median CD4 count at switching was 215 cells/µl under routine VL monitoring but lower under the other monitoring strategies (114-133 cells/µl). INTERPRETATION Overall, few patients switched to second-line ART, and in the absence of routine viral load monitoring switching occurred late. Switching was more common and occurred earlier with targeted or routine viral load testing.
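
To illustrate how hazard ratios of this kind can be obtained (a toy sketch with hypothetical columns and far fewer records than the study, not the actual analysis), a Cox proportional hazards model can be fitted with the monitoring strategy coded as indicator variables against a CD4-monitoring reference:

    import pandas as pd
    from lifelines import CoxPHFitter

    # Rows with all three indicators = 0 are under CD4 cell monitoring (reference group).
    df = pd.DataFrame({
        "followup_years": [0.8, 3.0, 1.5, 2.8, 2.6, 3.5, 1.9, 4.0],
        "switched":       [1,   0,   1,   0,   1,   0,   1,   0],   # switch to second-line ART
        "routine_vl":     [1,   1,   0,   0,   0,   0,   0,   0],
        "targeted_vl":    [0,   0,   1,   1,   0,   0,   0,   0],
        "clinical_only":  [0,   0,   0,   0,   1,   1,   0,   0],
    })

    cph = CoxPHFitter()
    cph.fit(df, duration_col="followup_years", event_col="switched")
    print(cph.hazard_ratios_)   # exp(coef): hazard of switching vs. CD4 monitoring

The study's adjustment for programme and individual characteristics would correspond to adding those variables as further covariates in the same model.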

Relevance: 30.00%

Abstract:

BACKGROUND HIV-1 RNA viral load (VL) testing is recommended to monitor antiretroviral therapy (ART) but is not available in many resource-limited settings. We developed and validated CD4-based risk charts to guide targeted VL testing. METHODS We modeled the probability of virologic failure up to 5 years of ART based on current and baseline CD4 counts, developed decision rules for targeted VL testing of 10%, 20% or 40% of patients in seven cohorts of patients starting ART in South Africa, and plotted cut-offs for VL testing on colour-coded risk charts. We assessed the accuracy of risk chart-guided VL testing to detect virologic failure in validation cohorts from South Africa, Zambia and the Asia-Pacific. FINDINGS 31,450 adult patients were included in the derivation and 25,294 patients in the validation cohorts. Positive predictive values increased with the percentage of patients tested: from 79% (10% tested) to 98% (40% tested) in the South African, from 64% to 93% in the Zambian and from 73% to 96% in the Asia-Pacific cohorts. Corresponding increases in sensitivity were from 35% to 68% in South Africa, from 55% to 82% in Zambia and from 37% to 71% in Asia-Pacific. The area under the receiver operating characteristic curve increased from 0.75 to 0.91 in South Africa, from 0.76 to 0.91 in Zambia and from 0.77 to 0.92 in the Asia-Pacific. INTERPRETATION CD4-based risk charts with optimal cut-offs for targeted VL testing may be useful to monitor ART in settings where VL capacity is limited.
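
As a worked illustration of how the reported positive predictive values and sensitivities follow from a 2x2 table of "flagged for VL testing by the risk chart" versus confirmed virologic failure (the counts below are hypothetical and merely chosen so that a 10%-tested rule roughly reproduces the 79%/35% figures quoted for the South African cohorts):

    def ppv_and_sensitivity(tp, fp, fn, tn):
        ppv = tp / (tp + fp)           # failures among patients selected for VL testing
        sensitivity = tp / (tp + fn)   # selected failures among all failures
        return ppv, sensitivity

    # Hypothetical counts for 2,000 patients, 10% of whom are flagged for targeted VL testing
    ppv, sens = ppv_and_sensitivity(tp=158, fp=42, fn=293, tn=1507)
    print(f"PPV = {ppv:.0%}, sensitivity = {sens:.0%}")   # ~79% and ~35%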

Relevance: 30.00%

Abstract:

The potential effects of climate change on natural hazard risks are widely discussed, but the formulation of strategies for adapting risk management practice to climate change requires knowledge of the related risks to people and economic assets. The main goals of this work were (1) the development of a method for analysing and comparing risks induced by different natural hazard types, (2) the identification of the most relevant natural hazard processes and related damages, (3) the development of an information system for monitoring the temporal development of natural hazard risk and (4) the visualisation of the resulting information for the wider public. A comparative exposure analysis provides the basis for pointing out the hot spots of natural hazard risk in the province of Carinthia, Austria. An analysis of flood risk in all municipalities provides the basis for setting priorities in the planning of flood protection measures. These methods form the basis of a monitoring system that periodically observes the temporal development of natural hazard risks, which makes it possible, first, to identify situations in which natural hazard risks are rising and, second, to distinguish the factors most responsible for the increase. The factors that most strongly influence natural risk can thus be made evident.

Relevance: 30.00%

Abstract:

Purpose: Selective retina therapy (SRT) is a novel treatment for retinal pathologies that solely targets the retinal pigment epithelium (RPE). During SRT, the detection of an immediate tissue reaction is challenging because the tissue effects remain limited to intracellular RPE photodisruption. Time-resolved ultra-high axial resolution optical coherence tomography (OCT) was therefore evaluated for the monitoring of dynamic optical changes at and around the RPE during SRT. Methods: An experimental OCT system with an ultra-high axial resolution of 1.78 µm was combined with an SRT system, and time-resolved OCT M-scans of the target area were recorded from four patients undergoing SRT. The OCT scans were analyzed and the OCT morphology was correlated with findings in fluorescein angiography, fundus photography and cross-sectional OCT. Results: In cases where the irradiation caused RPE damage proven by fluorescein angiography, the lesions were well discernible in time-resolved OCT images but remained invisible in fundus photography and in cross-sectional OCT acquired after treatment. Whenever RPE damage was induced, all applied SRT pulses led to detectable signal changes in the time-resolved OCT images. The extent of the optical signal variation seen in the OCT data appeared to scale with the applied SRT pulse energy. Conclusion: These first clinical results show that successful SRT irradiation induces detectable changes in the OCT M-scan signal while remaining invisible in conventional ophthalmoscopic imaging. Real-time high-resolution OCT is thus a promising modality to monitor and analyze tissue effects introduced by selective retina therapy and may be used to guide SRT in an automatic feedback mode.

Relevance: 30.00%

Abstract:

Due to its extraordinary biodiversity and rapid deforestation, north-eastern Madagascar is a conservation hotspot of global importance. Reducing shifting cultivation is a high priority for policy-makers and conservationists; however, spatially explicit evidence of shifting cultivation is lacking due to the difficulty of mapping it with common remote sensing methods. To overcome this challenge, we adopted a landscape mosaic approach to assess the changes between natural forests, shifting cultivation and permanent cultivation systems at the regional level from 1995 to 2011. Our study confirmed that shifting cultivation is still being used to produce subsistence rice throughout the region, but there is a trend of intensification away from shifting cultivation towards permanent rice production, especially near protected areas. While large continuous forest exists today only in the core zones of protected areas, the agricultural matrix is still dominated by a dense cover of tree crops and smaller forest fragments. We believe that this evidence makes a crucial contribution to the development of interventions to prevent further conversion of forest to agricultural land while improving local land users' well-being.

Relevance: 30.00%

Abstract:

PURPOSE Despite the various existing methods, monitoring of free muscle transfers is still challenging. In the current study we evaluated our clinical monitoring protocol for such flaps, using a recent microcirculation-imaging camera (EasyLDI) as an additional tool for the detection of perfusion incompetence. PATIENTS AND METHODS This study was performed on seven patients with soft tissue defects who underwent reconstruction with a free gracilis muscle flap. Besides the standard monitoring protocol (clinical assessment, temperature strips, and surface Doppler), hourly EasyLDI monitoring was performed for 48 hours. A baseline value (flap raised but still connected to its vascular bundle) and an ischaemia perfusion value (completely resected flap) were measured at the same point. RESULTS The mean patient age, mean baseline value and mean ischaemia perfusion value were 48.00 ± 13.42 years, 49.31 ± 17.33 arbitrary perfusion units (APU) and 9.87 ± 4.22 APU, respectively. The LDI values measured in six of the free muscle transfers were consistent with the hourly standard monitoring protocol, and the normalized LDI values increased significantly over time (P < 0.001, r = 0.412). One of the flaps required a return to theatre 17 hours after the operation, where an unsalvageable flap loss was detected. All normalized LDI values of this flap were below the ischaemia perfusion level, and the trend was significantly decreasing over time (P < 0.001, r = -0.870). CONCLUSION Because it allows early detection of perfusion incompetence, LDI may be recommended as an additional post-operative monitoring device for free muscle flaps, both for early detection of suspected failing flaps and for validation of other methods.
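
The abstract does not state how the LDI readings were normalized; one plausible convention (an assumption for illustration only) is to scale each hourly reading between the per-flap ischaemia perfusion value and the baseline value, as sketched below.

    def normalize_ldi(apu, baseline_apu, ischaemia_apu):
        # Assumed normalization: 0 = ischaemia perfusion level, 1 = baseline perfusion.
        return (apu - ischaemia_apu) / (baseline_apu - ischaemia_apu)

    # Hypothetical flap using the cohort means quoted above (49.31 and 9.87 APU)
    hourly_readings = [38.0, 41.5, 44.2, 47.8]
    normalized = [normalize_ldi(x, baseline_apu=49.31, ischaemia_apu=9.87) for x in hourly_readings]
    print([round(v, 2) for v in normalized])   # values <= 0 would fall below the ischaemia level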

Relevance: 30.00%

Abstract:

BACKGROUND HIV-1 RNA viral load (VL) testing is recommended to monitor antiretroviral therapy (ART) but is not available in many resource-limited settings. We developed and validated CD4-based risk charts to guide targeted VL testing. METHODS We modeled the probability of virologic failure up to 5 years of ART based on current and baseline CD4 counts, developed decision rules for targeted VL testing of 10%, 20%, or 40% of patients in 7 cohorts of patients starting ART in South Africa, and plotted cutoffs for VL testing on colour-coded risk charts. We assessed the accuracy of risk chart-guided VL testing to detect virologic failure in validation cohorts from South Africa, Zambia, and the Asia-Pacific. RESULTS In total, 31,450 adult patients were included in the derivation and 25,294 patients in the validation cohorts. Positive predictive values increased with the percentage of patients tested: from 79% (10% tested) to 98% (40% tested) in the South African cohort, from 64% to 93% in the Zambian cohort, and from 73% to 96% in the Asia-Pacific cohort. Corresponding increases in sensitivity were from 35% to 68% in South Africa, from 55% to 82% in Zambia, and from 37% to 71% in Asia-Pacific. The area under the receiver operating characteristic curve increased from 0.75 to 0.91 in South Africa, from 0.76 to 0.91 in Zambia, and from 0.77 to 0.92 in Asia-Pacific. CONCLUSIONS CD4-based risk charts with optimal cutoffs for targeted VL testing may be useful to monitor ART in settings where VL capacity is limited.

Relevance: 30.00%

Abstract:

BACKGROUND Ongoing CD4 monitoring in patients on antiretroviral therapy (ART) with viral suppression has been questioned. We evaluated the probability of CD4 decline in children with viral suppression and CD4 recovery after 1 year on ART. METHODS We included children from 8 South African cohorts with routine HIV-RNA monitoring if (1) they were "responders" [HIV-RNA < 400 copies/mL and no severe immunosuppression after ≥1 year on ART (time 0)] and (2) they had ≥1 HIV-RNA and CD4 measurement within 15 months of time 0. We determined the probability of CD4 decline to World Health Organization-defined severe immunosuppression over the 3 years after time 0 if viral suppression was maintained. Follow-up was censored at the earliest of the following dates: the day before the first HIV-RNA measurement >400 copies/mL, the day before a >15-month gap in testing, and the date of death, loss to follow-up, transfer out or database closure. RESULTS Among 5,984 children [median age at time 0: 5.8 years (interquartile range: 3.1-9.0)], 270 experienced a CD4 decline to severe immunosuppression within 3 years of time 0, corresponding to a probability of 6.6% (95% CI: 5.8-7.4). A subsequent CD4 measurement within 15 months of the first low measurement was available for 63% of children with CD4 decline, and 86% of these showed CD4 recovery. The probability of CD4 decline was lowest (2.8%) in children aged 2 years or older with no or mild immunosuppression who had been on ART for <18 months at time 0; this group comprised 40% of the children. CONCLUSIONS These findings suggest that it may be safe to stop routine CD4 monitoring in children older than 2 years and rely on virologic monitoring alone.
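
A minimal sketch (hypothetical records, not the cohort data) of how such a probability of CD4 decline can be estimated under the censoring rules described above, using a Kaplan-Meier estimator in which 1 minus the survival function gives the cumulative probability of decline:

    import pandas as pd
    from lifelines import KaplanMeierFitter

    df = pd.DataFrame({
        # years from time 0 to CD4 decline or to censoring (viral rebound, testing gap,
        # death, loss to follow-up, transfer out or database closure)
        "time_years": [0.6, 2.9, 1.4, 3.0, 2.2, 0.9, 3.0, 1.7],
        "declined":   [1,   0,   0,   0,   1,   0,   0,   1],   # 1 = decline observed
    })

    kmf = KaplanMeierFitter()
    kmf.fit(durations=df["time_years"], event_observed=df["declined"])
    prob_decline_3y = 1.0 - kmf.survival_function_at_times(3.0).iloc[0]
    print(f"probability of CD4 decline within 3 years = {prob_decline_3y:.1%}")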

Relevance: 30.00%

Abstract:

AIM Depending on intensity, exercise may induce a strong hormonal and metabolic response, including acid-base imbalances and changes in microcirculation, potentially interfering with the accuracy of continuous glucose monitoring (CGM). The present study aimed at comparing the accuracy of the Dexcom G4 Platinum (DG4P) CGM during continuous moderate and intermittent high-intensity exercise (IHE) in adults with type 1 diabetes (T1DM). METHODS Ten male individuals with well-controlled T1DM (HbA1c 7.0 ± 0.6% [54 ± 6 mmol/mol]) inserted the DG4P sensor 2 days prior to a 90 min cycling session (50% VO2peak) either with (IHE) or without (CONT) a 10 s all-out sprint every 10 min. Venous blood samples for reference glucose measurement were drawn every 10 min and euglycemia (target 7 mmol/l) was maintained using an oral glucose solution. Additionally, lactate and venous blood gas variables were determined. RESULTS Mean reference blood glucose was 7.6 ± 0.2 mmol/l during IHE and 6.7 ± 0.2 mmol/l during CONT (p < 0.001). IHE resulted in significantly higher levels of lactate (7.3 ± 0.5 mmol/l vs. 2.6 ± 0.3 mmol/l, p < 0.001), while pH values were significantly lower in the IHE group (7.27 vs. 7.38, p = 0.001). Mean absolute relative difference (MARD) was 13.3 ± 2.2% for IHE and 13.6 ± 2.8% for CONT, suggesting comparable accuracy (p = 0.90). Using Clarke Error Grid Analysis, 100% of CGM values during both IHE and CONT were in zones A and B (IHE: 77% and 23%; CONT: 78% and 22%). CONCLUSIONS The present study revealed good and comparable accuracy of the DG4P CGM system during intermittent high-intensity and continuous moderate-intensity exercise, despite marked differences in metabolic conditions. This corroborates the clinical robustness of CGM under differing exercise conditions. CLINICAL TRIAL REGISTRATION NUMBER ClinicalTrials.gov NCT02068638.
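
For reference, the MARD figures above are simply the mean of the absolute CGM-versus-reference differences expressed relative to the reference values; a minimal sketch with hypothetical paired readings:

    import numpy as np

    reference_mmol = np.array([7.1, 6.8, 7.4, 7.0, 6.5, 7.6])   # venous reference glucose
    cgm_mmol       = np.array([6.4, 7.3, 6.9, 7.8, 6.1, 7.0])   # simultaneous CGM readings

    mard = np.mean(np.abs(cgm_mmol - reference_mmol) / reference_mmol) * 100.0
    print(f"MARD = {mard:.1f}%")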

Relevance: 30.00%

Abstract:

Introduction: Although it seems plausible that sports performance relies on high-acuity foveal vision, it has been shown empirically that myopic blur (up to +2 diopters) does not harm performance in sport tasks that require foveal information pick-up, such as golf putting (Bulson, Ciuffreda, & Hung, 2008). How myopic blur affects peripheral performance is not yet known. With reduced foveal vision, less attention might be needed for processing visual cues foveally, so peripheral cues might be processed better and performance improve; this hypothesis was tested in the current experiment. Methods: 18 sport science students with self-reported myopia volunteered as participants, all of them regularly wearing contact lenses. Exclusion criteria comprised visual correction other than myopic, correction of astigmatism, and use of contact lenses from outside the Swiss delivery area. For each participant, three pairs of additional contact lenses (besides their regular lenses, used in the "plano" condition) were manufactured with an individual overcorrection to a retinal defocus of +1 to +3 diopters (referred to as the "+1.00 D", "+2.00 D", and "+3.00 D" conditions, respectively). Gaze data were acquired while participants performed a multiple object tracking (MOT) task that required tracking 4 out of 10 moving stimuli. In addition, in 66.7% of all trials, one of the 4 targets suddenly stopped during the motion phase for a period of 0.5 s. Stimuli moved in front of a picture of a sports hall to allow for foveal processing. Due to the directional hypotheses, the level of significance for one-tailed tests on differences was set at α = .05, and a posteriori effect sizes were computed as partial eta squared (ηp²). Results: Due to problems with the gaze-data collection, 3 participants had to be excluded from further analyses. The expectation of a centroid strategy was confirmed, because gaze was closer to the centroid than to the targets (all p < .01). In comparison to the plano baseline, participants more often recalled all 4 targets under defocus conditions, F(1,14) = 26.13, p < .01, ηp² = .65. The three defocus conditions differed significantly, F(2,28) = 2.56, p = .05, ηp² = .16, with higher accuracy as defocus increased and significant contrasts between conditions +1.00 D and +2.00 D (p = .03) and +1.00 D and +3.00 D (p = .03). For stop trials, significant differences were found neither between the plano baseline and the defocus conditions, F(1,14) = .19, p = .67, ηp² = .01, nor between the three defocus conditions, F(2,28) = 1.09, p = .18, ηp² = .07. Participants reacted faster in "4 correct+button" trials under defocus than under plano-baseline conditions, F(1,14) = 10.77, p < .01, ηp² = .44. The defocus conditions differed significantly, F(2,28) = 6.16, p < .01, ηp² = .31, with shorter response times as defocus increased and significant contrasts between +1.00 D and +2.00 D (p = .01) and +1.00 D and +3.00 D (p < .01). Discussion: The results show that gaze behaviour in MOT is not affected to a relevant degree by a visual overcorrection of up to +3 diopters. Hence, it can be assumed that peripheral event detection was indeed investigated in the present study. The overcorrection does not harm the capability to track objects peripherally. Moreover, if an event has to be detected peripherally, neither response accuracy nor response time is negatively affected.
The findings are potentially relevant for all sport situations in which peripheral vision is required, which now calls for applied studies on this topic. References: Bulson, R. C., Ciuffreda, K. J., & Hung, G. K. (2008). The effect of retinal defocus on golf putting. Ophthalmic and Physiological Optics, 28, 334-344.
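
For readers checking the effect sizes, the reported partial eta squared values can be recovered from each F statistic and its degrees of freedom; a brief worked check:

    def partial_eta_squared(f_value, df_effect, df_error):
        # eta_p^2 = (F * df_effect) / (F * df_effect + df_error)
        return (f_value * df_effect) / (f_value * df_effect + df_error)

    print(round(partial_eta_squared(26.13, 1, 14), 2))   # 0.65, the recall-accuracy effect
    print(round(partial_eta_squared(6.16, 2, 28), 2))    # 0.31, the response-time effect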

Relevance: 30.00%

Abstract:

BACKGROUND Oesophageal clearance has scarcely been studied. AIMS Oesophageal clearance in endoscopy-negative heartburn was assessed to detect differences in bolus clearance time among patients sub-grouped according to impedance-pH findings. METHODS In 118 consecutive endoscopy-negative heartburn patients, impedance-pH monitoring was performed off therapy. Acid exposure time, number of refluxes, baseline impedance, post-reflux swallow-induced peristaltic wave index and both automated and manual bolus clearance time were calculated. Patients were sub-grouped into pH/impedance positive (abnormal acid exposure and/or number of refluxes) and pH/impedance negative (normal acid exposure and number of refluxes), the former further subdivided on the basis of abnormal/normal acid exposure time (pH+/-) and abnormal/normal number of refluxes (impedance+/-). RESULTS A poor correlation (r = 0.35) between automated and manual bolus clearance time was found. Manual bolus clearance time progressively decreased from pH+/impedance+ (42.6 s) to pH+/impedance- (27.1 s), pH-/impedance+ (17.8 s) and pH-/impedance- (10.8 s). There was an inverse correlation between manual bolus clearance time and both baseline impedance and the post-reflux swallow-induced peristaltic wave index, and a direct correlation between manual bolus clearance time and acid exposure time. A manual bolus clearance time of 14.8 s had an accuracy of 93% in differentiating pH/impedance-positive from pH/impedance-negative patients. CONCLUSIONS When manually measured, bolus clearance time reflects reflux severity, confirming the pathophysiological relevance of oesophageal clearance in reflux disease.
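
As a small illustration (hypothetical values, not the study data) of how the accuracy of the 14.8 s cut-off is computed: each patient is classified as pH/impedance positive when the manual bolus clearance time exceeds the threshold, and the prediction is compared with the impedance-pH classification.

    import numpy as np

    manual_bct_s = np.array([42.6, 27.1, 9.5, 17.8, 12.3, 13.2, 10.8, 16.2])  # seconds
    ph_imp_pos   = np.array([1,    1,    0,   1,    0,    1,    0,    1])     # reference label

    predicted_pos = (manual_bct_s > 14.8).astype(int)
    accuracy = (predicted_pos == ph_imp_pos).mean()
    print(f"accuracy of the 14.8 s cut-off = {accuracy:.0%}")   # 7 of 8 correct here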