70 results for Self monitoring blood glucose
Abstract:
Most cows encounter a state of negative energy balance during the periparturient period, which may lead to metabolic disorders and impaired fertility. The aim of this study was to assess the potential of milk fatty acids as diagnostic tools for detrimental levels of blood plasma nonesterified fatty acids (NEFA), defined as NEFA concentrations of 0.6 mmol/L or more, in a data set of 92 early lactating cows fed a glucogenic or lipogenic diet and subjected to a 0-, 30-, or 60-d dry period before parturition. Milk was collected in wk 2, 3, 4, and 8 (n = 368) and blood was sampled weekly from wk 2 to 8 after parturition. Milk was analyzed for milk fatty acids and blood plasma for NEFA. Data were classified as "at risk of detrimental blood plasma NEFA" (NEFA ≥ 0.6 mmol/L) or "not at risk of detrimental blood plasma NEFA" (NEFA < 0.6 mmol/L). Concentrations of 45 milk fatty acids and the milk fat C18:1 cis-9-to-C15:0 ratio were subjected to a discriminant analysis. Milk fat C18:1 cis-9 proved to be the most discriminating variable for identifying detrimental blood plasma NEFA. At a false positive rate of 10%, 46% of the detrimental blood plasma NEFA cases could be diagnosed based on a milk fat C18:1 cis-9 concentration of at least 230 g/kg of milk fatty acids. Additionally, it was assessed whether the milk fat C18:1 cis-9 concentration of wk 2 could serve as an early warning for detrimental blood plasma NEFA risk during the first 8 wk of lactation. Cows with at least 240 g/kg of C18:1 cis-9 in milk fat had about a 50% chance of encountering blood plasma NEFA values of 0.6 mmol/L or more during the first 8 wk of lactation, with a false positive rate of 11.4%. Profit simulations were based on the costs incurred by cows suffering from detrimental blood plasma NEFA and the costs of preventive treatment, namely daily dosing of propylene glycol for 3 wk. Given the relatively low incidence rate (8% of all observations), continuous monitoring of milk fatty acids during the first 8 wk of lactation to diagnose detrimental blood plasma NEFA does not seem cost effective. In contrast, milk fat C18:1 cis-9 of the second lactation week could provide an early warning of cows at risk of detrimental blood plasma NEFA; in this case, selective treatment may be cost effective.
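To illustrate the kind of cutoff selection behind these figures (a minimal sketch, not the authors' actual discriminant analysis; the function name and input data are hypothetical), the following Python snippet picks the marker cutoff that caps the false positive rate at 10% and reports the fraction of at-risk cows detected:

import numpy as np

def detection_rate_at_fpr(marker, at_risk, target_fpr=0.10):
    """Choose the cutoff whose false positive rate is at most target_fpr
    and report the fraction of at-risk cases detected above it.

    marker  : milk fat C18:1 cis-9 per observation (g/kg of milk fatty
              acids); hypothetical input data
    at_risk : True where blood plasma NEFA >= 0.6 mmol/L
    """
    marker = np.asarray(marker, dtype=float)
    at_risk = np.asarray(at_risk, dtype=bool)
    # The (1 - target_fpr) quantile of the not-at-risk group: at most
    # 10% of not-at-risk observations exceed this cutoff.
    cutoff = np.quantile(marker[~at_risk], 1 - target_fpr)
    sensitivity = (marker[at_risk] > cutoff).mean()
    return cutoff, sensitivity

With the study's reported numbers, a cutoff near 230 g/kg of milk fatty acids would detect about 46% of at-risk observations.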
Abstract:
BACKGROUND The assessment of hemodynamic status is a crucial task in the initial evaluation of trauma patients. However, blood pressure and heart rate are often misleading, as multiple variables may affect these conventional parameters. More reliable methods such as pulmonary artery thermodilution for cardiac output measurement would be preferable, but their applicability in the Emergency Department is questionable due to their invasive nature. Non-invasive cardiac output monitoring devices may be a feasible alternative. METHODS A systematic literature review was conducted. Only studies that explicitly investigated non-invasive hemodynamic monitoring devices in trauma patients were considered. RESULTS A total of 7 studies, comprising 1,197 trauma patients, were identified as suitable and included in this review. These studies evaluated the accuracy of non-invasive hemodynamic monitoring devices by comparing measurements to pulmonary artery thermodilution, the gold standard for cardiac output measurement. The correlation coefficients r between the two methods ranged from 0.79 to 0.92. Bias and precision ranged from −0.02 ± 0.78 L/min/m² to −0.14 ± 0.73 L/min/m². Additionally, data on the practicality, limitations, and clinical impact of the devices were collected. CONCLUSION The accuracy of non-invasive cardiac output monitoring devices in trauma patients is broadly satisfactory. As the devices can be applied very early in the shock room or even preclinically, hemodynamic shock may be recognized much earlier and therapeutic interventions applied more rapidly and more adequately. The devices can be used in the daily routine of a busy ED, as they are non-invasive and easy to master.
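The bias and precision values quoted above are the typical outputs of a Bland-Altman agreement analysis between the device and the thermodilution reference. A minimal sketch of that computation (hypothetical paired cardiac-index readings, not data from the reviewed studies):

import numpy as np

def bland_altman(reference, device):
    """Bland-Altman agreement between paired cardiac-index measurements
    (L/min/m^2): bias = mean difference, precision = SD of differences,
    plus the 95% limits of agreement."""
    diff = np.asarray(device, float) - np.asarray(reference, float)
    bias = diff.mean()
    precision = diff.std(ddof=1)
    limits = (bias - 1.96 * precision, bias + 1.96 * precision)
    return bias, precision, limits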
Abstract:
Animal work implicates brain-derived neurotrophic factor (BDNF) in the function of the ventral striatum (VS), a region known for its role in processing valenced feedback. Recent evidence in humans shows that the BDNF Val66Met polymorphism modulates VS activity in anticipation of monetary feedback. However, it remains unclear whether the polymorphism affects the processing of self-attributed feedback differently from feedback attributed to an external agent. In this study, we emphasize the importance of feedback attribution because agency is central to computational accounts of the striatum and cognitive accounts of valence processing. We used functional magnetic resonance imaging and a task in which financial gains/losses are attributable either to performance (self-attributed, SA) or to chance (externally-attributed, EA) to ask whether the BDNF Val66Met polymorphism predicts VS activity. We found that the BDNF Val66Met polymorphism influenced how feedback valence and agency information were combined in the VS and in the right inferior frontal junction (IFJ). Specifically, Met carriers' VS response to valenced feedback depended on agency information, whereas Val/Val carriers' VS response did not. This context-specific modulation of valence effectively amplified VS responses to SA losses in Met carriers. The IFJ response to SA losses also differentiated Val/Val from Met carriers. These results may point to a reduced allocation of attention and altered motivational salience to SA losses in Val/Val compared with Met carriers. Implications for major depressive disorder are discussed.
Abstract:
The concentration of 11-nor-9-carboxy-Δ9-tetrahydrocannabinol (THCCOOH) in whole blood is used as a parameter for assessing the consumption behavior of cannabis users. The blood level of THCCOOH-glucuronide might provide additional information about the frequency of cannabis use. To verify this assumption, a column-switching liquid chromatography-tandem mass spectrometry (LC-MS/MS) method for the rapid and direct quantification of free and glucuronidated THCCOOH in human whole blood was developed. The method comprised protein precipitation, followed by injection of the processed sample onto a trapping column and subsequent gradient elution to an analytical column for separation and detection. The total LC run time was 4.5 min. Detection of the analytes was accomplished by electrospray ionization in positive ion mode and selected reaction monitoring on a triple-stage quadrupole mass spectrometer. The method was fully validated by evaluating the following parameters: linearity, lower limit of quantification, accuracy and imprecision, selectivity, extraction efficiency, matrix effect, carry-over, dilution integrity, analyte stability, and re-injection reproducibility. All parameters met the predefined acceptance criteria. Linearity ranged from 5.0 to 500 μg/L for both analytes. The method was successfully applied to whole blood samples from a large collective of cannabis consumers, demonstrating its applicability in the forensic field.
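Linearity in such a validation is commonly checked by fitting a calibration line over the working range (here 5.0-500 μg/L) and back-calculating each calibrator. The sketch below shows that generic check, not the validated method itself; the ±15% acceptance criterion is a common convention assumed for illustration:

import numpy as np

def check_linearity(conc, response, tolerance=0.15):
    """Fit an unweighted linear calibration and back-calculate each
    calibrator; flag points deviating more than `tolerance` (15% here,
    an assumed acceptance criterion) from their nominal concentration.

    conc     : nominal calibrator concentrations (ug/L)
    response : analyte/internal-standard peak-area ratios
    """
    conc = np.asarray(conc, float)
    slope, intercept = np.polyfit(conc, response, 1)
    back_calc = (np.asarray(response, float) - intercept) / slope
    rel_err = np.abs(back_calc - conc) / conc
    return slope, intercept, rel_err <= tolerance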
Abstract:
OBJECTIVES A dissociation between behavioural (in-control) and physiological parameters (indicating loss of control) is associated with cardiovascular risk in defensive coping (DefS) Africans. We evaluated relationships between DefS, sub-clinical atherosclerosis, low-grade inflammation, and hypercoagulation in a bi-ethnic sex cohort. METHODS Black Africans and white Africans (Caucasians) (n = 375; aged 44.6 ± 9.7 years) were included. Ambulatory BP, vascular structure (left carotid cross-sectional wall area (L-CSWA) and plaque counts), and markers of coagulation and inflammation were quantified. An ethnicity/coping style interaction was revealed only in DefS participants. RESULTS A hypertensive state, less plaque, low-grade inflammation, and hypercoagulation were more prevalent in DefS Africans (27-84%) than in DefS Caucasians (18-41%). Regression analyses demonstrated associations between L-CSWA and 24-hour systolic BP (R² = 0.38; β = 0.78; p < 0.05) in DefS African men but not in DefS African women or Caucasians. No associations between L-CSWA and coagulation markers were evident. CONCLUSION Novel findings revealed hypercoagulation, low-grade inflammation, and hyperkinetic BP (physiological loss-of-control responses) in DefS African men. Coupled with a self-reported in-control DefS behavioural profile, this reflects a dissociation between behaviour and physiology. It may explain changes in vascular structure, increasing cerebrovascular disease risk in a state of hyper-vigilant coping.
Abstract:
AIM Depending on intensity, exercise may induce a strong hormonal and metabolic response, including acid-base imbalances and changes in microcirculation, potentially interfering with the accuracy of continuous glucose monitoring (CGM). The present study aimed at comparing the accuracy of the Dexcom G4 Platinum (DG4P) CGM during continuous moderate and intermittent high-intensity exercise (IHE) in adults with type 1 diabetes (T1DM). METHODS Ten male individuals with well-controlled T1DM (HbA1c 7.0 ± 0.6% [54 ± 6 mmol/mol]) inserted the DG4P sensor 2 days prior to a 90 min cycling session (50% VO2peak), either with (IHE) or without (CONT) a 10 s all-out sprint every 10 min. Venous blood samples for reference glucose measurement were drawn every 10 min and euglycemia (target 7 mmol/L) was maintained using an oral glucose solution. Additionally, lactate and venous blood gas variables were determined. RESULTS Mean reference blood glucose was 7.6 ± 0.2 mmol/L during IHE and 6.7 ± 0.2 mmol/L during CONT (p < 0.001). IHE resulted in significantly higher levels of lactate (7.3 ± 0.5 mmol/L vs. 2.6 ± 0.3 mmol/L, p < 0.001), while pH values were significantly lower in the IHE group (7.27 vs. 7.38, p = 0.001). Mean absolute relative difference (MARD) was 13.3 ± 2.2% for IHE and 13.6 ± 2.8% for CONT, suggesting comparable accuracy (p = 0.90). Using Clarke error grid analysis, 100% of CGM values during both IHE and CONT were in zones A and B (IHE: 77% and 23%; CONT: 78% and 22%). CONCLUSIONS The present study revealed good and comparable accuracy of the DG4P CGM system during intermittent high-intensity and continuous moderate-intensity exercise, despite marked differences in metabolic conditions. This corroborates the clinical robustness of CGM under differing exercise conditions. CLINICAL TRIAL REGISTRATION NUMBER ClinicalTrials.gov NCT02068638.
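MARD, the headline accuracy metric here, is the mean of the absolute differences between paired sensor and reference glucose values, expressed relative to the reference. A minimal sketch with hypothetical paired readings (not study data):

import numpy as np

def mard(cgm, reference):
    """Mean absolute relative difference (%) between CGM readings and
    paired venous reference glucose values (both in mmol/L)."""
    cgm = np.asarray(cgm, float)
    reference = np.asarray(reference, float)
    return 100 * np.mean(np.abs(cgm - reference) / reference)

# Hypothetical pairs giving a MARD near the reported ~13%:
print(mard([7.0, 6.0, 8.5], [8.0, 7.0, 7.4]))  # -> ~13.9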
Abstract:
Introduction: Although it seems plausible that sports performance relies on high-acuity foveal vision, it has been shown empirically that myopic blur (up to +2 diopters) does not harm performance in sport tasks that require foveal information pick-up, such as golf putting (Bulson, Ciuffreda, & Hung, 2008). How myopic blur affects peripheral performance is as yet unknown. Attention might be less needed for processing visual cues foveally, leading to better performance because peripheral cues are better processed as a function of reduced foveal vision; this was tested in the current experiment. Methods: 18 sport science students with self-reported myopia volunteered as participants, all of them regularly wearing contact lenses. Exclusion criteria comprised visual correction other than myopic, correction of astigmatism, and use of contact lenses from outside the Swiss delivery area. For each participant, three pairs of additional contact lenses (besides their regular lenses, used in the "plano" condition) were manufactured with an individual overcorrection to a retinal defocus of +1 to +3 diopters (referred to as the "+1.00 D", "+2.00 D", and "+3.00 D" conditions, respectively). Gaze data were acquired while participants performed a multiple object tracking (MOT) task that required them to track 4 out of 10 moving stimuli. In addition, in 66.7% of all trials, one of the 4 targets suddenly stopped during the motion phase for a period of 0.5 s. Stimuli moved in front of a picture of a sports hall to allow for foveal processing. Due to the directional hypotheses, the level of significance for one-tailed tests on differences was set at α = .05, and a posteriori effect sizes were computed as partial eta squared (ηp²). Results: Due to problems with the gaze-data collection, 3 participants had to be excluded from further analyses. The expectation of a centroid strategy was confirmed, because gaze was closer to the centroid than to the targets (all p < .01). In comparison to the plano baseline, participants more often recalled all 4 targets under defocus conditions, F(1,14) = 26.13, p < .01, ηp² = .65. The three defocus conditions differed significantly, F(2,28) = 2.56, p = .05, ηp² = .16, with higher accuracy as a function of increasing defocus and significant contrasts between conditions +1.00 D and +2.00 D (p = .03) and between +1.00 D and +3.00 D (p = .03). For stop trials, significant differences were found neither between the plano baseline and the defocus conditions, F(1,14) = .19, p = .67, ηp² = .01, nor between the three defocus conditions, F(2,28) = 1.09, p = .18, ηp² = .07. Participants reacted faster in "4 correct+button" trials under defocus than under plano-baseline conditions, F(1,14) = 10.77, p < .01, ηp² = .44. The defocus conditions differed significantly, F(2,28) = 6.16, p < .01, ηp² = .31, with shorter response times as a function of increasing defocus and significant contrasts between +1.00 D and +2.00 D (p = .01) and between +1.00 D and +3.00 D (p < .01). Discussion: The results show that gaze behaviour in MOT is not affected to a relevant degree by a visual overcorrection of up to +3 diopters. Hence, it can be assumed that peripheral event detection was indeed investigated in the present study. This overcorrection does not harm the capability to peripherally track objects. Moreover, if an event has to be detected peripherally, neither response accuracy nor response time is negatively affected.
The findings are potentially relevant to all sport situations in which peripheral vision is required, which now calls for applied studies on this topic. References: Bulson, R. C., Ciuffreda, K. J., & Hung, G. K. (2008). The effect of retinal defocus on golf putting. Ophthalmic and Physiological Optics, 28, 334-344.
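The reported partial eta squared values can be recovered directly from each F statistic and its degrees of freedom via ηp² = F·df1 / (F·df1 + df2); a quick check in Python:

def partial_eta_squared(F, df1, df2):
    """Partial eta squared from an F statistic and its degrees of freedom."""
    return (F * df1) / (F * df1 + df2)

# Reproduces the effect size for the accuracy effect reported above:
print(round(partial_eta_squared(26.13, 1, 14), 2))  # -> 0.65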
Abstract:
The basophil activation test (BAT) has become a pervasive test for allergic response through the development of flow cytometry, the discovery of activation markers such as CD63, and unique markers identifying basophil granulocytes. The BAT measures the basophil response to allergen cross-linking of IgE on 150 to 2,000 basophil granulocytes in <0.1 mL of fresh blood. Dichotomous activation is assessed as the fraction of reacting basophils. In addition to clinical history, skin prick testing, and specific IgE determination, BAT can be part of the diagnostic evaluation of patients with food, insect venom, and drug allergy and chronic urticaria. It may be helpful in determining the clinically relevant allergen. Basophil sensitivity may be used to monitor patients on allergen immunotherapy, on anti-IgE treatment, or during the natural resolution of allergy. BAT may use fewer resources and be more reproducible than challenge testing. As it is less stressful for the patient and avoids severe allergic reactions, BAT ought to precede challenge testing. An important next step is to standardize BAT and make it available in diagnostic laboratories. The nature of basophil activation as an ex vivo challenge makes it a multifaceted and promising tool for the allergist. In this EAACI task force position paper, we provide an overview of the practical and technical details as well as the clinical utility of BAT in the diagnosis and management of allergic diseases.
Abstract:
BACKGROUND: Bioluminescence imaging is widely used for cell-based assays and animal imaging studies, both in biomedical research and drug development. Its main advantages include high-throughput applicability, affordability, high sensitivity, operational simplicity, and quantitative outputs. In malaria research, bioluminescence has been used for drug discovery in vivo and in vitro, for exploring host-pathogen interactions, and for studying multiple aspects of Plasmodium biology. While the number of fluorescent proteins available for imaging has expanded greatly over the last two decades, enabling simultaneous visualization of multiple molecular and cellular events, the expansion of available luciferases has lagged behind. The most widely used bioluminescent probe in malaria research is the Photinus pyralis firefly luciferase, followed by the more recently introduced click beetle and Renilla luciferases. Ultra-sensitive imaging of Plasmodium at low parasite densities has not previously been achieved. To overcome these challenges, a Plasmodium berghei line expressing the novel ultra-bright luciferase NanoLuc, called PbNLuc, has been generated and is presented in this work. RESULTS: NanoLuc shows a signal at least 150 times brighter than firefly luciferase in vitro, allowing single-parasite detection in mosquito, liver, and sexual and asexual blood stages. As a proof of concept, the PbNLuc parasites were used to image parasite development in the mosquito, liver, and blood stages of infection, and specifically to explore parasite liver-stage egress and the pre-patency period in vivo. CONCLUSIONS: PbNLuc is a suitable parasite line for sensitive imaging of the entire Plasmodium life cycle. Its sensitivity makes it a promising reference line for drug candidate testing, as well as for the characterization of mutant parasites to explore the function of parasite proteins, host-parasite interactions, and Plasmodium biology more broadly. Since the substrate requirements of NanoLuc differ from those of firefly luciferase, dual bioluminescence imaging for the simultaneous characterization of two lines, or of two separate biological processes, is possible, as demonstrated in this work.
Abstract:
In driving aptitude assessment (DAA), the analysis of several alcohol biomarkers, alongside psycho-medical exploration, is essential for the detection of alcohol intake. In Switzerland, EtG in hair (hEtG) is often the only direct marker used for abstinence monitoring in DAA. The suitability of phosphatidylethanol (PEth) as an additional biomarker was therefore investigated. PEth 16:0/18:1 and 16:0/18:2 were determined by online SPE-LC-MS/MS in 136 blood samples from persons undergoing DAA and compared to hEtG, determined in hair segments taken at the same time. With a PEth 16:0/18:1 threshold of 210 ng/mL for excessive alcohol consumption, all (n = 30) but one tested person also had hEtG values ≥ 30 pg/mg. In 54 cases, the results were consistent with abstinence, as neither PEth (<20 ng/mL) nor hEtG (<7 pg/mg) was detected. In eight cases, both markers indicated moderate consumption. Altogether, PEth and hEtG were in accordance in 68% of the samples, although they cover different time periods of alcohol consumption. Using receiver operating characteristic analysis, PEth was evaluated for differentiating abstinence, moderate, and excessive alcohol consumption in accordance with the hEtG limits. A PEth 16:0/18:1 threshold of 150 ng/mL yielded the best sensitivity (70.6%) and specificity (98.8%) for excessive consumption. Values between 20 and 150 ng/mL indicated moderate consumption, and values <20 ng/mL indicated abstinence. As PEth mostly has a shorter detection window (2-4 weeks) than hEtG (up to 6 months, depending on hair length), changes in drinking behavior can be detected earlier by PEth than by hEtG analysis alone. PEth therefore improves the diagnostic information and is a valuable additional alcohol marker for DAA.
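The resulting three-band decision rule is simple enough to state in code. A minimal sketch using the cutoffs derived in the study (the function itself is illustrative, not part of the published method):

def classify_peth(peth_ng_ml: float) -> str:
    """Classify drinking behavior from a PEth 16:0/18:1 concentration
    (ng/mL) using the study's cutoffs of 20 and 150 ng/mL."""
    if peth_ng_ml < 20:
        return "abstinence"
    if peth_ng_ml < 150:
        return "moderate consumption"
    return "excessive consumption"

print(classify_peth(85))  # -> moderate consumption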