965 results for angiotensin blood level
Abstract:
It is now commonly accepted that the chronic inflammation associated with obesity during aging induces insulin resistance in the liver. In the present study, we investigated whether the improvement in insulin sensitivity and insulin signaling mediated by acute exercise could be associated with modulation of protein-tyrosine phosphatase 1B (PTP-1B) in the liver of old rats. Aging rats were subjected to two 1.5-h swimming bouts separated by a 45-min rest period. Sixteen hours after the exercise, the rats were sacrificed and proteins of the insulin signaling pathway were analyzed by immunoblotting. Our results show that fat mass was increased in old rats. The reduction in glucose disappearance rate (Kitt) observed in aged rats was restored 16 h after exercise. Aging increased the hepatic content of PTP-1B and attenuated insulin signaling in the liver of rats, a phenomenon that was reversed by exercise. Aged rats also showed increased IRβ/PTP-1B and IRS-1/PTP-1B association in the liver when compared with young rats. Conversely, in the liver of exercised old rats, IRβ/PTP-1B and IRS-1/PTP-1B association was markedly decreased. Moreover, in the hepatic tissue of old rats, insulin signaling was decreased and PEPCK and G6Pase levels were increased when compared with young rats. Interestingly, 16 h after acute exercise, PEPCK and G6Pase protein levels were decreased in the old exercised group. These results provide new insights into the mechanisms by which exercise restores insulin signaling in the liver during aging. © 2013 Moura et al.; licensee BioMed Central Ltd.
Abstract:
Accumulating evidence suggests an association between volume overload and inflammation in chronic kidney disease. The purpose of this study was to evaluate the effect of reducing the dialysate sodium concentration on extracellular water volume, blood pressure (BP), and inflammatory state in hemodialysis (HD) patients. In this prospective controlled study, adult patients on HD for at least 90 days with C-reactive protein (CRP) levels ≥ 0.7 mg/dL were randomly allocated into two groups: group A, 29 patients treated with a reduction of the dialysate sodium concentration from 138 to 135 mEq/L; and group B, 23 HD patients not receiving dialysate sodium reduction (controls). Of these, 20 patients in group A and 18 in group B completed the study protocol. Inflammatory, biochemical, hematological, and nutritional markers were assessed at baseline and after 8 and 16 weeks. Baseline characteristics were not significantly different between the two groups. Group A showed a significant reduction in serum concentrations of tumor necrosis factor-α and interleukin-6 over the study period, while BP and extracellular water (ECW) did not change. In group B, there were no changes in serum concentrations of inflammatory markers, BP, or ECW. Dialysate sodium reduction is associated with attenuation of the inflammatory state, without changes in BP and ECW, suggesting inhibition of a salt-induced inflammatory response. Copyright © 2013 Informa Healthcare USA, Inc.
Abstract:
Background: Hypertension can be generated by a great number of mechanisms, including elevated uric acid (UA), which contributes to superoxide anion production. Physical exercise, in turn, is recommended to prevent and/or control high blood pressure (BP). The purpose of this study was to investigate the relationship between BP and UA and whether this relationship may be mediated by the functional fitness index. Methods: All participants (n = 123) performed the following tests: indirect maximal oxygen uptake (VO2max), the AAHPERD Functional Fitness Battery Test to determine the general functional fitness index (GFFI), systolic and diastolic blood pressure (SBP and DBP), body mass index (BMI), and blood sample collection to evaluate total cholesterol (CHOL), LDL-cholesterol (LDL-c), HDL-cholesterol (HDL-c), triglycerides (TG), uric acid (UA), nitrite (NO2) and thiobarbituric acid reactive substances (TBARS). After the physical, hemodynamic and metabolic evaluations, all participants were allocated into three groups according to their GFFI: G1 (regular), G2 (good) and G3 (very good). Results: Baseline blood pressure was higher in G1 than in G3 (+12% and +11% for SBP and DBP, respectively, p<0.05), and subjects with higher BP also presented higher values of UA. Although UA did not differ among GFFI groups, it correlated significantly with GFFI and VO2max. Nitrite concentration was also higher in G3 than in G1 (140±29 μM vs 111±29 μM, p<0.0001). Regarding the lipid profile, participants in G3 presented better values of CHOL and TG than those in G1. Conclusions: Taken together, the findings that subjects with higher BP had higher UA and lower nitrite values suggest that the relationship between blood pressure and the oxidative stress produced by uric acid may be mediated by training status. © 2013 Trapé et al.; licensee BioMed Central Ltd.
Abstract:
The aim of this study was to determine the relationship between blood lactate and glucose during an incremental test performed after exercise-induced lactic acidosis, under normal conditions and under acute β-adrenergic blockade. Eight fit males (cyclists or triathletes) performed a protocol to determine the intensity corresponding to the individual equilibrium point between lactate entry into and removal from the blood (incremental test after exercise-induced lactic acidosis), determined from the blood lactate (Lacmin) and glucose (Glucmin) responses. This protocol was performed twice, in a double-blind randomized order, after ingesting either propranolol (80 mg) or a placebo (dextrose) 120 min prior to the test. Blood lactate and glucose concentrations obtained 7 minutes after anaerobic exercise (Wingate test) were significantly lower (p<0.01) under acute β-adrenergic blockade (9.1±1.5 and 3.9±0.1 mM, respectively) than in the placebo condition (12.4±1.8 and 5.0±0.1 mM). There was no difference (p>0.05) between the exercise intensities determined by Lacmin (212.1±17.4 W) and Glucmin (218.2±22.1 W) during exercise performed without β-adrenergic blockade. The exercise intensity at Lacmin was lowered (p<0.05) from 212.1±17.4 to 181.0±15.6 W and the heart rate at Lacmin was reduced (p<0.01) from 161.2±8.4 to 129.3±6.2 beats·min-1 as a result of the blockade. It was not possible to determine the exercise intensity corresponding to Glucmin under β-adrenergic blockade, since the blood glucose concentration decreased continuously during the incremental test. We conclude that the similar response pattern of blood lactate and glucose during an incremental test after exercise-induced lactic acidosis is not present during β-adrenergic blockade, suggesting that this behavior depends, at least in part, on adrenergic stimulation.
Abstract:
The concentration of 11-nor-9-carboxy-Δ(9)-tetrahydrocannabinol (THCCOOH) in whole blood is used as a parameter for assessing the consumption behavior of cannabis consumers. The blood level of THCCOOH-glucuronide might provide additional information about the frequency of cannabis use. To verify this assumption, a column-switching liquid chromatography-tandem mass spectrometry (LC-MS/MS) method for the rapid and direct quantification of free and glucuronidated THCCOOH in human whole blood was newly developed. The method comprised protein precipitation, followed by injection of the processed sample onto a trapping column and subsequent gradient elution to an analytical column for separation and detection. The total LC run time was 4.5 min. Detection of the analytes was accomplished by electrospray ionization in positive ion mode and selected reaction monitoring using a triple-stage quadrupole mass spectrometer. The method was fully validated by evaluating the following parameters: linearity, lower limit of quantification, accuracy and imprecision, selectivity, extraction efficiency, matrix effect, carry-over, dilution integrity, analyte stability, and re-injection reproducibility. All parameters met the predefined acceptance criteria. Linearity ranged from 5.0 to 500 μg/L for both analytes. The method was successfully applied to whole blood samples from a large collective of cannabis consumers, demonstrating its applicability in the forensic field.
Abstract:
The relationship of body condition score (BCS) and blood urea and ammonia to pregnancy outcome was examined in Italian Mediterranean Buffalo cows mated by AI. The study was conducted on 150 buffaloes at 145 ± 83 days in milk that were fed a diet comprising 14.8% crude protein, 0.9 milk forage units·kg-1 dry matter and a non-structural carbohydrate/crude protein ratio of 2.14. The stage of the oestrous cycle was synchronised by the Ovsynch-TAI programme, and blood urea and ammonia levels were assessed on the day of AI. Energy-corrected milk (ECM) production and BCS were recorded bi-weekly. The pregnancy risk was 46.7% and was slightly lower in buffaloes with BCS < 6.0 and BCS > 7.5. There were no significant differences in ECM, urea or ammonia between pregnant and non-pregnant buffaloes. However, pregnancy outcome was better (P = 0.02) in buffaloes with blood urea < 6.83 mmol·L-1. The likelihood of pregnancy for buffaloes with a low blood urea level was 2.6 times greater than for those with a high urea level, and exposure to a high urea level lowered the probability of pregnancy by about 0.25. The findings indicate that buffaloes are similar to cattle in that increased blood levels of urea are associated with reduced fertility when animals are mated by AI.
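To see how a 2.6-fold difference and a drop of about 0.25 in probability can describe the same effect, the 2.6 can be read as an odds ratio; the figures below are purely illustrative and are not taken from the study:

\[ \text{odds}_{\text{low urea}} = 2.6 \times \frac{0.35}{1-0.35} \approx 1.40 \quad\Rightarrow\quad p_{\text{low urea}} = \frac{1.40}{1+1.40} \approx 0.58, \]

i.e., if the pregnancy probability in the high-urea group were about 0.35, the implied probability in the low-urea group would be about 0.58, a difference of roughly 0.23, consistent with the reported reduction of about 0.25.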
Abstract:
The vagus nerve is an important component of the efferent arm of the baroreflex. Blood pressure levels as well as baroreflex control of circulation are significantly different in male and female spontaneously hypertensive rats (SHR). We proposed to investigate the morphometric differences between genders using the vagus nerve of SHR. Adult animals (20 weeks old) were anesthetized and had their arterial pressure (AP) and heart rate (HR) recorded by a computerized system. The rats were then systemically perfused with a fixative solution and had their cervical vagus nerves prepared for light microscopy. Proximal and distal segments of the left and right vagus nerves were evaluated for morphometric parameters including fascicle area and diameter, and myelinated fiber number, density, area and diameter. Comparisons were made between sides and segments in the same gender as well as between genders. Differences were considered significant when p<0.05. Male SHR had significantly higher AP and HR. Morphometric data showed no differences between the same levels of both sides or between segments on the same side for male and female rats. In addition, no significant morphometric differences were observed when genders were compared. This is the first description of vagus nerve morphometry in SHR, indicating that gender differences in AP and HR cannot be attributed to dissimilarities in vagal innervation of the heart. These data provide a morphological basis for further studies involving functional investigations of the efferent arm of the baroreflex in hypertension. © 2007 Elsevier B.V. All rights reserved.
Abstract:
A preliminary study of the pharmacokinetic parameters of t-butylaminoethyl disulfide was performed after administration of two different single doses (35 and 300 mg/kg) of either the cold (unlabelled) or the labelled drug. Plasma or blood samples were treated with dithiothreitol and perchloric acid and, after filtration, submitted to further purification on an anionic resin. In the final step, the drug was retained on a cationic resin column, eluted with 1 M NaCl and detected according to the method of Ellman (1958). Alternatively, the radioactive drug was detected by liquid scintillation counting. The results for the smaller dose of total drug suggested pharmacokinetic behavior consistent with a one-compartment open model with the following parameters: area under the intravenous curve (AUC i.v.): 671 ± 14; AUC oral: 150 ± 40 µg·min·ml-1; elimination rate constant: 0.071 min-1; biological half-life: 9.8 min; distribution volume: 0.74 ml/g. For the higher dose, the results seemed to follow a more complex, undetermined model. Taken together, the results suggest a dose-dependent pharmacokinetic behavior, the drug being rapidly absorbed and rapidly eliminated, with elimination due mainly to metabolism. The drug seems to be more toxic when administered i.v. because by this route it escapes first-pass metabolism while being quickly distributed to the tissues. The maximum tolerated blood level seems to be around 16 µg/ml.
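As a quick consistency check on these one-compartment parameters (the derived quantities below are not reported in the abstract; they assume standard one-compartment relationships and equal doses by both routes):

\[ t_{1/2} = \frac{\ln 2}{k_{el}} = \frac{0.693}{0.071\ \text{min}^{-1}} \approx 9.8\ \text{min}, \qquad F \approx \frac{AUC_{\text{oral}}}{AUC_{\text{i.v.}}} = \frac{150}{671} \approx 0.22. \]

The first relation reproduces the reported biological half-life; the second would correspond to an oral bioavailability of roughly 22%, consistent with substantial first-pass metabolism, but it holds only if the intravenous and oral doses were identical, which the abstract does not state.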
Abstract:
QUESTION UNDER STUDY: To assess which high-risk acute coronary syndrome (ACS) patient characteristics played a role in prioritising access to the intensive care unit (ICU), and whether introducing clinical practice guidelines (CPG) explicitly stating ICU admission criteria altered this practice. PATIENTS AND METHODS: All consecutive patients with ACS admitted to our medical emergency centre over 3 months before and after CPG implementation were prospectively assessed. The impact of demographic and clinical characteristics (age, gender, cardiovascular risk factors, and clinical parameters upon admission) on ICU hospitalisation of high-risk patients (defined as retrosternal pain of prolonged duration with ECG changes and/or a positive troponin blood level) was studied by logistic regression. RESULTS: Before and after CPG implementation, 328 and 364 patients, respectively, were assessed for suspected ACS. Before CPG implementation, 36 of the 81 high-risk patients (44.4%) were admitted to the ICU. After CPG implementation, 35 of the 90 high-risk patients (38.9%) were admitted to the ICU. Male patients were more frequently admitted to the ICU before CPG implementation (OR=7.45, 95% CI 2.10-26.44), but not after (OR=0.73, 95% CI 0.20-2.66). Age played a significant role in both periods (OR=1.57, 95% CI 1.24-1.99), with both young and advanced age significantly reducing ICU admission, although to a lesser extent after CPG implementation. CONCLUSION: Prioritisation of access to the ICU for high-risk ACS patients was age-dependent, but focused on the cardiovascular risk factor profile. CPG implementation explicitly stating ICU admission criteria decreased discrimination against women, but other factors are likely to play a role in bed allocation.
Abstract:
Intoxications are a frequent problem in the ER. In the vast majority of cases, supportive treatment is sufficient. Severe intoxications with unknown agents are considered an indication for a urinary drug screen, as recommended by several toxicology centers. However, their usefulness for patient management remains uncertain. Study objectives: Evaluation of the impact of a urinary drug screen (Biosite Triage TOX Drug Screen) testing 11 substances (acetaminophen, amphetamines, methamphetamines, barbiturates, benzodiazepines, cocaine, methadone, opioids, phencyclidine, cannabis, tricyclic antidepressants) on initial adult patient management in the emergency department of a university hospital with ~35,000 annual admissions. Methods: Observational retrospective analysis of all tests performed between 09/2009 and 09/2010. A test was defined as useful if it resulted in the administration of a specific antidote (flumazenil/naloxone), the use of a quantitative confirmatory toxicologic test, or a change in the patient's disposition. Results: 57 tests were performed. Patient age was 32 ± 11 (SD) years; 58% were men; 30% were also intoxicated with alcohol. Two patients died (3.5%): the first of a diphenhydramine overdose, the other of a hypertensive intracerebral hemorrhage believed to be caused by cocaine abuse despite a negative urine test. Test indications were: 54% first psychotic episode; 25% acute respiratory failure; 18% coma; 12% seizure; 11% opioid toxidrome; 7% sympathomimetic toxidrome; 5% hypotension; 4% ventricular arrhythmia (VT, VF, torsades de pointes) or long QT. 75% of tests were positive for ≥1 substance (mean 1.7 ± 0.9). 47% of results were unexpected based on history. 18% of results influenced patient management: 7% had a negative test that confirmed the diagnosis of endogenous psychosis in a first psychotic episode and allowed transfer to psychiatry; 5% received flumazenil/naloxone; 2% had an acetaminophen blood level measured after a positive screen; finally, 4% had unexpected methadone abuse that required prolongation of the hospital stay. Conclusions: A rapid urinary toxicologic screen was seldom used in our emergency department, and its impact on patient management was marginal: only one in six tests influenced treatment decisions.
Abstract:
OBJECTIVES: Skin notations are used as a hazard identification tool to flag chemicals associated with a potential risk related to transdermal penetration. The transparency and rigorousness of the skin notation assignment process have recently been questioned. We compared different approaches proposed as criteria for these notations as a starting point for improving and systematizing current practice. METHODS: In this study, skin notations, dermal acute lethal dose 50 in mammals (LD(50)s) and two dermal risk indices derived from previously published work were compared using the lists of Swiss maximum allowable concentrations (MACs) and threshold limit values (TLVs) from the American Conference of Governmental Industrial Hygienists (ACGIH). The indices were both based on quantitative structure-activity relationship (QSAR) estimation of transdermal fluxes. One index compared the cumulative dose received through skin, given a specific exposure surface and duration, to that received through the lungs following inhalation for 8 h at the MAC or TLV. The other index estimated the blood level increase caused by adding skin exposure to the inhalation route at kinetic steady state. Dermal-to-other-route ratios of LD(50) were calculated as secondary indices of dermal penetrability. RESULTS: The working data set included 364 substances. Depending on the subdataset, agreement between the Swiss and ACGIH skin notations varied between 82 and 87%. Chemicals with a skin notation were more likely to have higher dermal risk indices and lower dermal LD(50) than chemicals without a notation (probabilities between 60 and 70%). The risk indices, based on cumulative dose and kinetic steady state, respectively, appeared proportional up to a constant independent of chemical-specific properties. They agreed well with dermal LD(50)s (Spearman correlation coefficients -0.42 to -0.43). Dermal-to-other-route LD(50) ratios were moderately associated with QSAR-based transdermal fluxes (Spearman correlation coefficients -0.2 to -0.3). CONCLUSIONS: The plausible but variable relationship between current skin notations and the different approaches tested confirms the need to improve current skin notations. QSAR-based risk indices and dermal toxicity data might be successfully integrated in a systematic alternative to current skin notations for detecting chemicals associated with potential dermal risk in the workplace. [Authors]
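In schematic form, a cumulative-dose index of the kind described above compares a QSAR-estimated dermal uptake with the dose inhaled over a work shift at the exposure limit; the symbols below are generic placeholders rather than the authors' notation:

\[ R_{\text{dermal}} = \frac{J_{\text{QSAR}} \times A \times t}{C_{\text{OEL}} \times V_{8\text{h}} \times f_{\text{ret}}}, \]

where \(J_{\text{QSAR}}\) is the estimated transdermal flux, \(A\) the exposed skin surface, \(t\) the exposure duration, \(C_{\text{OEL}}\) the MAC or TLV, \(V_{8\text{h}}\) the volume of air inhaled over 8 h, and \(f_{\text{ret}}\) an assumed pulmonary retention fraction. Values well above 1 would flag substances for which skin uptake under plausible exposure conditions rivals inhalation at the limit value.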
Abstract:
When requesting a blood level measurement in the context of "therapeutic drug monitoring" (TDM), numerous aspects have to be considered in the pre-analytical and analytical stages, as well as in the integration of the associated clinical data. This review presents therapeutic classes for which a clinical benefit of TDM is established or suggested, at least in some settings. For each class of drugs, the main pharmacokinetic, pre-analytical, analytical and clinical aspects are evaluated with respect to such monitoring. Each step of the TDM process is important and none should be neglected. Additional clinical trials are, however, warranted to better establish the exact conditions of use for such monitoring.
Abstract:
Requesting a blood level measurement of a drug is part of the global approach known as "therapeutic drug monitoring" (TDM). Diverse situations call for this monitoring approach, such as an inadequate response to treatment or organ failure. Not every drug, however, possesses the characteristics required for a TDM program: the therapeutic range of a TDM candidate has to be narrow and its interindividual pharmacokinetic variability wide. As the development of new drugs is currently slowing down, the precise management of existing treatments certainly deserves attention, but TDM needs to be applied rationally, starting from a valid indication, through blood sampling, and ending with a sound dosage-adaptation decision.
Abstract:
Some methadone maintenance treatment (MMT) programs prescribe inadequate daily methadone doses. Patients complain of withdrawal symptoms and continue illicit opioid use, yet practitioners are reluctant to increase doses above certain arbitrary thresholds. Serum methadone levels (SMLs) may guide practitioners' dosing decisions, especially for those patients who have low SMLs despite higher methadone doses. Such variation is due in part to the complexities of methadone metabolism. The medication itself is a racemic (50:50) mixture of two enantiomers: an active "R" form and an essentially inactive "S" form. Methadone is metabolized primarily in the liver, by up to five cytochrome P450 isoforms, and individual differences in enzyme activity help explain the wide range of active R-enantiomer concentrations in patients given identical doses of racemic methadone. Most clinical research studies have used methadone doses of less than 100 mg/day (d) and have not reported the corresponding SMLs. New research suggests that doses ranging from 120 mg/d to more than 700 mg/d, with correspondingly higher SMLs, may be optimal for many patients. Each patient presents a unique clinical challenge, and there is no way of prescribing a single best methadone dose to achieve a specific blood level as a "gold standard" for all patients. Clinical signs and patient-reported symptoms of abstinence syndrome, as well as continuing illicit opioid use, are effective indicators of dose inadequacy. There does not appear to be a maximum daily dose limit when determining what is adequately "enough" methadone in MMT.
Abstract:
Marijuana is the most widely used illicit drug, yet its effects on the cognitive functions underlying safe driving remain mostly unexplored. Our goal was to evaluate the impact of cannabis on the driving ability of occasional smokers by investigating changes in the brain network involved in a tracking task. The subject characteristics, the percentage of Δ(9)-tetrahydrocannabinol in the joint, and the inhaled dose were in accordance with real-life conditions. Thirty-one male volunteers were enrolled in this study, which included clinical and toxicological assessments together with functional magnetic resonance imaging (fMRI) of the brain and measurements of psychomotor skills. The fMRI paradigm was based on a visuo-motor tracking task, alternating active tracking blocks with passive tracking viewing and a rest condition. We show that cannabis smoking, even at low Δ(9)-tetrahydrocannabinol blood concentrations, decreases psychomotor skills and alters the activity of the brain networks involved in cognition. The relative decrease of the Blood Oxygen Level Dependent (BOLD) response after cannabis smoking in the anterior insula, dorsomedial thalamus, and striatum compared to placebo smoking suggests an alteration of the network involved in saliency detection. In addition, the decrease of the BOLD response in the right superior parietal cortex and in the dorsolateral prefrontal cortex indicates involvement of the executive control network known to operate once saliencies are identified. Furthermore, cannabis increases activity in the rostral anterior cingulate cortex and ventromedial prefrontal cortices, suggesting an increase in self-oriented mental activity. Subjects are more attracted by intrapersonal stimuli ("self") and fail to attend to task performance, leading to an insufficient allocation of task-oriented resources and to sub-optimal performance. These effects correlate with the subjective feeling of confusion rather than with the blood level of Δ(9)-tetrahydrocannabinol. These findings bolster the zero-tolerance policy adopted in several countries that prohibits the presence of any amount of drug in the blood while driving.