970 results for application monitoring


Relevance: 30.00%

Abstract:

A major concern of electrocatalysis research is to assess the structural and chemical changes that a catalyst may itself undergo in the course of the catalyzed process. These changes can influence not only the activity of the studied catalyst but also its selectivity toward the formation of a certain product. An illustrative example is the electroreduction of carbon dioxide on tin oxide nanoparticles, where under the operating conditions of the electrolysis (that is, at cathodic potentials) the catalyst undergoes structural changes which, in an extreme case, involve its reduction to metallic tin. This results in a decreased Faradaic efficiency (FE) for the production of formate (HCOO⁻), which is otherwise the main product of CO2 reduction on SnOx surfaces. In this study, we utilized potential- and time-dependent in operando Raman spectroscopy to monitor the oxidation state changes of SnO2 that accompany CO2 reduction. Investigations were carried out at different alkaline pH levels, and a strong correlation between the oxidation state of the surface and the FE of HCOO⁻ formation was found. At moderately cathodic potentials, SnO2 exhibits a high FE for the production of formate, while at very negative potentials the oxide is reduced to metallic Sn and the efficiency of formate production is significantly decreased. Interestingly, the highest FE of formate production is measured at potentials where SnO2 is thermodynamically unstable but its reduction is kinetically hindered.
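
For reference, the Faradaic efficiency reported throughout the abstract is defined as FE = zFn/Q; a minimal Python sketch (the numbers are illustrative, not from the study):

```python
# Faradaic efficiency (FE) of formate production: the fraction of the total
# charge passed that went into reducing CO2 to HCOO-. Values are illustrative.

F = 96485.0  # Faraday constant, C/mol

def faradaic_efficiency(n_product_mol: float, z: int, total_charge_c: float) -> float:
    """FE = z * n * F / Q, where z is electrons per product molecule
    (z = 2 for CO2 -> HCOO-), n the moles of product, Q the charge passed."""
    return z * n_product_mol * F / total_charge_c

# Example: 1.5e-5 mol formate detected after passing 4.0 C of cathodic charge
fe = faradaic_efficiency(1.5e-5, 2, 4.0)
print(f"FE(HCOO-) = {fe:.1%}")  # ~72.4%
```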

Relevance: 30.00%

Abstract:

Due to its extraordinary biodiversity and rapid deforestation, north-eastern Madagascar is a conservation hotspot of global importance. Reducing shifting cultivation is a high priority for policy-makers and conservationists; however, spatially explicit evidence of shifting cultivation is lacking because it is difficult to map with common remote sensing methods. To overcome this challenge, we adopted a landscape mosaic approach to assess the changes between natural forests, shifting cultivation and permanent cultivation systems at the regional level from 1995 to 2011. Our study confirmed that shifting cultivation is still being used to produce subsistence rice throughout the region, but there is a trend of intensification away from shifting cultivation towards permanent rice production, especially near protected areas. While large continuous forests exist today only in the core zones of protected areas, the agricultural matrix is still dominated by a dense cover of tree crops and smaller forest fragments. We believe that this evidence makes a crucial contribution to the development of interventions that prevent further conversion of forest to agricultural land while improving local land users' well-being.
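
The landscape mosaic approach labels each location by the land-cover composition of its neighbourhood; a minimal sketch of that idea in Python (the window size, dominance threshold and class labels are assumptions, not the study's parameters):

```python
import numpy as np

# Landscape-mosaic idea: label each cell by the share of forest, shifting
# cultivation and permanent cultivation within a moving window.
# The 60% dominance threshold and the labels are illustrative assumptions.

def mosaic_class(window: np.ndarray) -> str:
    """window holds integer codes: 0=forest, 1=shifting, 2=permanent."""
    shares = np.bincount(window.ravel(), minlength=3) / window.size
    labels = ("forest-dominated", "shifting-dominated", "permanent-dominated")
    dominant = int(np.argmax(shares))
    return labels[dominant] if shares[dominant] >= 0.6 else "mixed mosaic"

window = np.array([[0, 0, 1], [0, 2, 0], [0, 0, 1]])
print(mosaic_class(window))  # forest-dominated (6/9 ≈ 67% forest)
```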

Relevance: 30.00%

Abstract:

PURPOSE Despite the different existing methods, monitoring of free muscle transfers is still challenging. In the current study we evaluated our clinical monitoring protocol for such tissues, using a recent microcirculation-imaging camera (EasyLDI) as an additional tool for the detection of perfusion incompetence. PATIENTS AND METHODS This study was performed on seven patients with soft tissue defects who underwent reconstruction with a free gracilis muscle flap. Besides the standard monitoring protocol (clinical assessment, temperature strips, and surface Doppler), hourly EasyLDI monitoring was performed for 48 hours. A baseline value (flap raised but still connected to its vascular bundle) and an ischaemic perfusion value (flap completely resected) were measured at the same point. RESULTS The mean age of the patients, mean baseline value, and mean ischaemic perfusion value were 48.00 ± 13.42 years, 49.31 ± 17.33 arbitrary perfusion units (APU), and 9.87 ± 4.22 APU, respectively. The LDI values measured in six of the free muscle transfers were consistent with the hourly standard monitoring protocol, and the normalized LDI values increased significantly over time (P < 0.001, r = 0.412). One of the flaps required a return to theatre 17 hours after the operation, where an unsalvageable flap loss was detected. All normalized LDI values of this flap were below the ischaemic perfusion level, and the trend was significantly decreasing over time (P < 0.001, r = -0.870). CONCLUSION Given its capability for early detection of perfusion incompetence, LDI may be recommended as an additional post-operative monitoring device for free muscle flaps, both for early detection of suspected failing flaps and for validation of other methods.
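
The abstract does not spell out how the LDI values were normalized; a common choice, sketched below under that assumption, is to rescale each reading between the flap's ischaemic and baseline perfusion values:

```python
# Normalizing an LDI reading between the flap's ischaemic floor and its
# baseline perfusion (both in arbitrary perfusion units, APU). The linear
# rescaling is an assumption; the study does not state its formula.

def normalize_ldi(reading_apu: float, baseline_apu: float, ischaemia_apu: float) -> float:
    """0.0 ~ fully ischaemic, 1.0 ~ baseline perfusion."""
    return (reading_apu - ischaemia_apu) / (baseline_apu - ischaemia_apu)

# Example with the reported cohort means: baseline 49.31 APU, ischaemia 9.87 APU
print(normalize_ldi(35.0, 49.31, 9.87))  # ~0.64
```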

Relevance: 30.00%

Abstract:

BACKGROUND HIV-1 RNA viral load (VL) testing is recommended to monitor antiretroviral therapy (ART) but is not available in many resource-limited settings. We developed and validated CD4-based risk charts to guide targeted VL testing. METHODS We modeled the probability of virologic failure up to 5 years of ART based on current and baseline CD4 counts, developed decision rules for targeted VL testing of 10%, 20%, or 40% of patients in 7 cohorts of patients starting ART in South Africa, and plotted cutoffs for VL testing on colour-coded risk charts. We assessed the accuracy of risk chart-guided VL testing for detecting virologic failure in validation cohorts from South Africa, Zambia, and the Asia-Pacific. RESULTS In total, 31,450 adult patients were included in the derivation cohorts and 25,294 patients in the validation cohorts. Positive predictive values increased with the percentage of patients tested: from 79% (10% tested) to 98% (40% tested) in the South African cohort, from 64% to 93% in the Zambian cohort, and from 73% to 96% in the Asia-Pacific cohort. Corresponding increases in sensitivity were from 35% to 68% in South Africa, from 55% to 82% in Zambia, and from 37% to 71% in the Asia-Pacific. The area under the receiver operating characteristic curve increased from 0.75 to 0.91 in South Africa, from 0.76 to 0.91 in Zambia, and from 0.77 to 0.92 in the Asia-Pacific. CONCLUSIONS CD4-based risk charts with optimal cutoffs for targeted VL testing may be useful for monitoring ART in settings where VL capacity is limited.
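
The reported trade-off between positive predictive value and sensitivity follows directly from the confusion matrix at each testing fraction; a minimal sketch (the counts are invented, chosen only to reproduce the South African 10%-tested figures):

```python
# Positive predictive value and sensitivity from a confusion matrix, as used
# to evaluate risk-chart-guided targeted VL testing. Counts are illustrative.

def ppv_and_sensitivity(tp: int, fp: int, fn: int) -> tuple[float, float]:
    ppv = tp / (tp + fp)          # among those flagged for VL testing, true failures
    sensitivity = tp / (tp + fn)  # among all failures, those the chart flagged
    return ppv, sensitivity

# Example: a rule flags 1000 patients, 790 of them truly failing; 1470 failures missed
ppv, sens = ppv_and_sensitivity(tp=790, fp=210, fn=1470)
print(f"PPV={ppv:.0%}, sensitivity={sens:.0%}")  # PPV=79%, sensitivity=35%
```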

Relevance: 30.00%

Abstract:

BACKGROUND Ongoing CD4 monitoring in patients on antiretroviral therapy (ART) with viral suppression has been questioned. We evaluated the probability of CD4 decline in children with viral suppression and CD4 recovery after 1 year on ART. METHODS We included children from 8 South African cohorts with routine HIV-RNA monitoring if (1) they were "responders" [HIV-RNA < 400 copies/mL and no severe immunosuppression after ≥1 year on ART (time 0)] and (2) they had ≥1 HIV-RNA and CD4 measurement within 15 months of time 0. We determined the probability of CD4 decline to World Health Organization-defined severe immunosuppression for 3 years after time 0 if viral suppression was maintained. Follow-up was censored at the earliest of the following dates: the day before the first HIV-RNA measurement >400 copies/mL; the day before a >15-month gap in testing; and the date of death, loss to follow-up, transfer out, or database closure. RESULTS Among 5984 children [median age at time 0: 5.8 years (interquartile range: 3.1-9.0)], 270 experienced a single CD4 decline to severe immunosuppression within 3 years of time 0, a probability of 6.6% (95% CI: 5.8-7.4). A subsequent CD4 measurement within 15 months of the first low measurement was available for 63% of the children with CD4 decline, and 86% of these showed CD4 recovery. The probability of CD4 decline was lowest (2.8%) in children aged 2 years or older with no or mild immunosuppression who had been on ART for <18 months at time 0; this group comprised 40% of the children. CONCLUSIONS These findings suggest that it may be safe to stop routine CD4 monitoring in children older than 2 years and rely on virologic monitoring alone.
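
With censored follow-up, the 3-year probability of CD4 decline is a time-to-event quantity; a Kaplan-Meier estimate, sketched below on synthetic data, is one standard way to obtain it (the study's exact estimator is not stated in the abstract):

```python
# Kaplan-Meier estimate of the probability of CD4 decline by 3 years,
# handling censoring at viral rebound, testing gaps, death, etc.
# The data below are synthetic; lifelines is one common tool for this.
from lifelines import KaplanMeierFitter

durations = [0.5, 1.2, 2.0, 2.5, 3.0, 3.0, 1.8, 2.9]  # years from time 0
declined  = [1,   0,   1,   0,   0,   0,   0,   1]     # 1 = decline observed, 0 = censored

kmf = KaplanMeierFitter()
kmf.fit(durations, event_observed=declined)
p_decline_3y = 1 - kmf.survival_function_at_times(3.0).iloc[0]
print(f"P(decline by 3 years) = {p_decline_3y:.1%}")
```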

Relevance: 30.00%

Abstract:

AIM Depending on its intensity, exercise may induce a strong hormonal and metabolic response, including acid-base imbalances and changes in microcirculation, potentially interfering with the accuracy of continuous glucose monitoring (CGM). The present study aimed to compare the accuracy of the Dexcom G4 Platinum (DG4P) CGM during continuous moderate and intermittent high-intensity exercise (IHE) in adults with type 1 diabetes (T1DM). METHODS Ten male individuals with well-controlled T1DM (HbA1c 7.0±0.6% [54±6 mmol/mol]) inserted the DG4P sensor 2 days prior to a 90-min cycling session (50% VO2peak) either with (IHE) or without (CONT) a 10-s all-out sprint every 10 min. Venous blood samples for reference glucose measurement were drawn every 10 min, and euglycemia (target 7 mmol/l) was maintained using an oral glucose solution. Additionally, lactate and venous blood gas variables were determined. RESULTS Mean reference blood glucose was 7.6±0.2 mmol/l during IHE and 6.7±0.2 mmol/l during CONT (p<0.001). IHE resulted in significantly higher lactate levels (7.3±0.5 mmol/l vs. 2.6±0.3 mmol/l, p<0.001), while pH values were significantly lower in the IHE group (7.27 vs. 7.38, p=0.001). The mean absolute relative difference (MARD) was 13.3±2.2% for IHE and 13.6±2.8% for CONT, suggesting comparable accuracy (p=0.90). Using Clarke Error Grid analysis, 100% of CGM values during both IHE and CONT fell in zones A and B (IHE: 77% and 23%; CONT: 78% and 22%). CONCLUSIONS The present study revealed good and comparable accuracy of the DG4P CGM system during intermittent high-intensity and continuous moderate-intensity exercise, despite marked differences in metabolic conditions. This corroborates the clinical robustness of CGM under differing exercise conditions. CLINICAL TRIAL REGISTRATION NUMBER ClinicalTrials.gov NCT02068638.
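
MARD, the accuracy metric reported above, is the mean of the absolute CGM-versus-reference differences relative to the reference values; a minimal sketch with illustrative paired readings:

```python
import numpy as np

# Mean absolute relative difference (MARD): the standard CGM accuracy metric
# reported in the abstract. The paired values below are illustrative.

def mard(cgm: np.ndarray, reference: np.ndarray) -> float:
    """Mean of |CGM - reference| / reference, as a percentage."""
    return float(np.mean(np.abs(cgm - reference) / reference) * 100)

ref = np.array([7.6, 7.1, 6.9, 8.0, 7.4])  # venous reference glucose, mmol/l
cgm = np.array([6.8, 7.9, 6.1, 8.9, 8.2])  # concurrent sensor readings

print(f"MARD = {mard(cgm, ref):.1f}%")  # ~11.1%
```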

Relevance: 30.00%

Abstract:

Environmental quality monitoring of water resources is challenged to provide the basis for safeguarding the environment against adverse biological effects of anthropogenic chemical contamination from diffuse and point sources. While current regulatory efforts focus on monitoring and assessing a few legacy chemicals, many more anthropogenic chemicals can be detected simultaneously in our aquatic resources. However, exposure to chemical mixtures does not necessarily translate into adverse biological effects, nor does it clearly show whether mitigation measures are needed. Thus, the question of which mixtures are present, and which have associated combined effects, becomes central to defining adequate monitoring and assessment strategies. Here we describe the vision of the international, EU-funded project SOLUTIONS, in which three routes are explored to link the occurrence of chemical mixtures at specific sites to the assessment of adverse biological combination effects. First, multi-residue target and non-target screening techniques covering a broader range of anticipated chemicals co-occurring in the environment are being developed. By improving sensitivity and detection limits for known bioactive compounds of concern, new analytical chemistry data for multiple components can be obtained and used to characterise priority mixtures. This information on chemical occurrence will be used to predict mixture toxicity and to derive combined effect estimates suitable for advancing environmental quality standards. Second, bioanalytical tools will be explored to provide aggregate bioactivity measures integrating all components that produce common (adverse) outcomes, even for mixtures of varying composition. The ambition is to provide comprehensive arrays of effect-based tools and trait-based field observations that link multiple chemical exposures to various environmental protection goals more directly, and to provide improved in situ observations for impact assessment of mixtures. Third, effect-directed analysis (EDA) will be applied to identify major drivers of mixture toxicity; refinements of EDA include the use of statistical approaches to monitoring information to guide experimental EDA studies. These three approaches will be explored in case studies in the Danube and Rhine river basins as well as rivers of the Iberian Peninsula. The synthesis of findings will be organised to provide guidance for future solution-oriented environmental monitoring and to explore more systematic ways of assessing mixture exposures and combination effects in future water quality monitoring.
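
One common default for predicting mixture toxicity from per-component potencies is concentration addition via toxic units; whether SOLUTIONS relies on exactly this model is not stated, so the sketch below is purely illustrative:

```python
# Concentration addition as a screening model for mixture toxicity: sum the
# toxic units TU_i = c_i / EC50_i over all detected components. Compound
# names, concentrations and EC50s below are illustrative assumptions.

def toxic_unit_sum(concentrations: dict[str, float], ec50s: dict[str, float]) -> float:
    """TU sum >= 1 flags a mixture expected to produce the reference effect
    under concentration addition."""
    return sum(c / ec50s[name] for name, c in concentrations.items())

measured = {"atrazine": 0.8, "diuron": 0.2, "diclofenac": 1.5}  # ug/l
ec50 = {"atrazine": 60.0, "diuron": 8.0, "diclofenac": 50.0}    # ug/l

print(f"Sum of toxic units: {toxic_unit_sum(measured, ec50):.3f}")  # 0.068
```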

Relevance: 30.00%

Abstract:

BACKGROUND Oesophageal clearance has scarcely been studied. AIMS Oesophageal clearance in endoscopy-negative heartburn was assessed to detect differences in bolus clearance time among patients sub-grouped according to impedance-pH findings. METHODS In 118 consecutive endoscopy-negative heartburn patients, impedance-pH monitoring was performed off therapy. Acid exposure time, number of refluxes, baseline impedance, post-reflux swallow-induced peristaltic wave index, and both automated and manual bolus clearance times were calculated. Patients were sub-grouped into pH/impedance positive (abnormal acid exposure and/or number of refluxes) and pH/impedance negative (normal acid exposure and number of refluxes), the former further subdivided on the basis of abnormal/normal acid exposure time (pH+/-) and abnormal/normal number of refluxes (impedance+/-). RESULTS A poor correlation (r=0.35) between automated and manual bolus clearance time was found. Manual bolus clearance time progressively decreased from pH+/impedance+ (42.6 s) through pH+/impedance- (27.1 s) and pH-/impedance+ (17.8 s) to pH-/impedance- (10.8 s). There was an inverse correlation between manual bolus clearance time and both baseline impedance and the post-reflux swallow-induced peristaltic wave index, and a direct correlation between manual bolus clearance time and acid exposure time. A manual bolus clearance time of 14.8 s had an accuracy of 93% in differentiating pH/impedance positive from pH/impedance negative patients. CONCLUSIONS When manually measured, bolus clearance time reflects reflux severity, confirming the pathophysiological relevance of oesophageal clearance in reflux disease.
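
Classifying patients by whether manual bolus clearance time exceeds the 14.8 s cutoff and scoring agreement with the impedance-pH classification yields an accuracy of the kind reported; a minimal sketch on synthetic values:

```python
import numpy as np

# Accuracy of a single cutoff (14.8 s in the abstract) for separating
# pH/impedance positive from negative patients. Sample values are synthetic.

def accuracy_at_cutoff(mbct_s: np.ndarray, positive: np.ndarray, cutoff_s: float) -> float:
    """Fraction of patients whose (MBCT > cutoff) prediction matches the
    impedance-pH classification."""
    predicted = mbct_s > cutoff_s
    return float(np.mean(predicted == positive))

mbct = np.array([42.6, 27.1, 17.8, 10.8, 12.0, 31.0])          # seconds
truth = np.array([True, True, False, False, False, True])       # pH/impedance positive?
print(f"Accuracy at 14.8 s: {accuracy_at_cutoff(mbct, truth, 14.8):.0%}")  # 83%
```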

Relevance: 30.00%

Abstract:

Syndromic surveillance (SyS) systems currently exploit various sources of health-related data, most of which are collected for purposes other than surveillance (e.g. economic). Several European SyS systems use data collected during meat inspection for syndromic surveillance of animal health, as some diseases may be more easily detected post-mortem than at their point of origin or during the ante-mortem inspection upon arrival at the slaughterhouse. In this paper we use simulation to evaluate the performance of a quasi-Poisson regression (also known as improved Farrington) algorithm for the detection of disease outbreaks during post-mortem inspection of slaughtered animals. When the algorithm was parameterized based on retrospective analyses of 6 years of historic data, the probability of detection was satisfactory for large (range 83-445 cases) outbreaks but poor for small (range 20-177 cases) outbreaks. Varying the amount of historical data used to fit the algorithm can help increase the probability of detection for small outbreaks. However, while the use of a 0.975 quantile generated a low false-positive rate, in most cases more than 50% of outbreak cases had already occurred at the time of detection. The high variance observed in the whole-carcass condemnation time series, and the lack of flexibility in the temporal distribution of simulated outbreaks resulting from the low (monthly) reporting frequency, constitute major challenges for early detection of outbreaks in the livestock population based on meat inspection data. Reporting frequency should be increased in the future to improve the timeliness of the SyS system, while increased sensitivity may be achieved by integrating meat inspection data into a multivariate system that simultaneously evaluates multiple sources of data on livestock health.
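
A simplified flavour of the quasi-Poisson (improved Farrington) detection step: fit the historic counts, then flag observations above an approximate 0.975-quantile bound. The sketch below omits the reweighting and trend corrections of the full algorithm and runs on simulated data:

```python
import numpy as np
import statsmodels.api as sm

# Simplified quasi-Poisson outbreak detection on monthly condemnation counts:
# fit a seasonal mean, estimate overdispersion from the Pearson chi-square,
# and flag months above an approximate 0.975 normal-quantile threshold.

rng = np.random.default_rng(0)
months = np.arange(72)  # 6 years of simulated monthly history
counts = rng.poisson(lam=40 + 5 * np.sin(2 * np.pi * months / 12))

X = sm.add_constant(np.column_stack([np.sin(2 * np.pi * months / 12),
                                     np.cos(2 * np.pi * months / 12)]))
fit = sm.GLM(counts, X, family=sm.families.Poisson()).fit(scale="X2")

mu = fit.predict(X)   # expected counts under the fitted model
phi = fit.scale       # quasi-Poisson overdispersion estimate
threshold = mu + 1.96 * np.sqrt(phi * mu)  # ~0.975 quantile, normal approximation

print("Months flagged:", months[counts > threshold])
```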