924 results for aspartate aminotransferase blood level
Abstract:
The use of blood spot collection cards is a simple way to obtain specimens for drug analysis for the purposes of therapeutic drug monitoring, assessing medication adherence and preventing toxicity in routine clinical settings. We describe the development and validation of a microanalytical technique for the determination of metformin from dried blood spots. The method is based on reversed-phase high-performance liquid chromatography with ultraviolet detection. Drug recovery with the developed method was found to be more than 84%. The limits of detection and quantification were calculated to be 90 and 150 ng/ml, respectively. The intraday and interday precision (measured as CV%) was always less than 9%. The accuracy (measured as relative error, %) was always less than 12%. Stability analysis showed that metformin is stable for at least 2 months when stored at -70 °C. The small volume of blood required (10 μL), combined with the simplicity of the analytical technique, makes this a useful procedure for monitoring metformin concentrations in routine clinical settings. The method is currently being applied to the analysis of blood spots taken from diabetic patients to assess medication adherence and the relationship between metformin levels and metabolic control of diabetes.
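The precision and accuracy figures quoted in this abstract (CV% and relative error, %) are standard bioanalytical validation metrics; a minimal sketch of how they are computed, using invented QC replicate values rather than data from the study:

```python
from statistics import mean, stdev

def cv_percent(replicates):
    """Precision: coefficient of variation (%) across replicate measurements."""
    return stdev(replicates) / mean(replicates) * 100

def relative_error_percent(measured_mean, nominal):
    """Accuracy: relative error (%) of the measured mean versus the nominal level."""
    return (measured_mean - nominal) / nominal * 100

# Invented QC replicates at a nominal 500 ng/ml metformin concentration
qc = [470.0, 510.0, 495.0, 520.0, 480.0]
print(round(cv_percent(qc), 1))                        # intraday precision
print(relative_error_percent(mean(qc), 500.0))         # accuracy
```

Acceptance limits such as the <9% CV and <12% relative error reported above are then checked against these quantities at each QC level.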
Abstract:
Background: This is an update of a previous review (McGuinness 2006). Hypertension and cognitive impairment are prevalent in older people. Hypertension is a direct risk factor for vascular dementia (VaD), and recent studies have suggested that hypertension affects the prevalence of Alzheimer's disease (AD). Does treatment of hypertension therefore prevent cognitive decline?
Objectives: To assess the effects of blood pressure lowering treatments for the prevention of dementia and cognitive decline in patients with hypertension but no history of cerebrovascular disease.
Search strategy: The Specialized Register of the Cochrane Dementia and Cognitive Improvement Group, The Cochrane Library, MEDLINE, EMBASE, PsycINFO, CINAHL, LILACS, as well as many trials databases and grey literature sources, were searched on 13 February 2008 using the terms: hypertens$ OR anti-hypertens$.
Selection criteria: Randomized, double-blind, placebo-controlled trials in which pharmacological or non-pharmacological interventions to lower blood pressure were given for at least six months.
Data collection and analysis: Two independent reviewers assessed trial quality and extracted data. The following outcomes were assessed: incidence of dementia, cognitive change from baseline, blood pressure level, incidence and severity of side effects and quality of life.
Main results: Four trials including 15,936 hypertensive subjects were identified. Average age was 75.4 years. Mean blood pressure at entry across the studies was 171/86 mmHg. The combined result of the four trials reporting incidence of dementia indicated no significant difference between treatment and placebo (236/7767 versus 259/7660, Odds Ratio (OR) = 0.89, 95% CI 0.74, 1.07), and there was considerable heterogeneity between the trials. The combined results from the three trials reporting change in Mini Mental State Examination (MMSE) did not indicate a benefit from treatment (Weighted Mean Difference (WMD) = 0.42, 95% CI 0.30, 0.53). Both systolic and diastolic blood pressure levels were reduced significantly in the three trials assessing this outcome (WMD = -10.22, 95% CI -10.78, -9.66 for systolic blood pressure; WMD = -4.28, 95% CI -4.58, -3.98 for diastolic blood pressure). Three trials reported adverse effects requiring discontinuation of treatment, and the combined results indicated no significant difference (OR = 1.01, 95% CI 0.92, 1.11). When analysed separately, however, patients on placebo in Syst Eur 1997 were more likely to discontinue treatment due to side effects; the converse was true in SHEP 1991. Quality of life data could not be analysed in the four studies. Analysis of the included studies in this review was problematic because many of the control subjects received antihypertensive treatment when their blood pressures exceeded pre-set values. In most cases the study became a comparison of the study drug against a usual antihypertensive regimen.
Authors' conclusions: There is no convincing evidence from the trials identified that blood pressure lowering in late life prevents the development of dementia or cognitive impairment in hypertensive patients with no apparent prior cerebrovascular disease. Significant problems were identified with analysing the data, however, due to the number of patients lost to follow-up and the number of placebo patients who received active treatment; this introduced bias. More robust results may be obtained by conducting a meta-analysis using individual patient data.
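As a cross-check on the pooled dementia result above, a crude odds ratio can be computed directly from the reported counts (236/7767 versus 259/7660); note that the review pools the trials with per-trial weighting, so this simple calculation only approximates the published OR = 0.89 (95% CI 0.74, 1.07):

```python
import math

def odds_ratio(events_t, n_t, events_c, n_c):
    """Crude odds ratio with a 95% CI from the log-OR normal approximation."""
    a, b = events_t, n_t - events_t          # treatment: events, non-events
    c, d = events_c, n_c - events_c          # control: events, non-events
    or_ = (a / b) / (c / d)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)    # SE of ln(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

or_, lo, hi = odds_ratio(236, 7767, 259, 7660)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```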
Abstract:
Background: There is growing interest in the potential utility of molecular diagnostics in improving the detection of life-threatening infection (sepsis). LightCycler® SeptiFast is a multipathogen probe-based real-time PCR system that targets DNA sequences of bacteria and fungi in blood samples and returns results within a few hours. We report here the protocol of the first systematic review of published clinical diagnostic accuracy studies of this technology when compared with blood culture in the setting of suspected sepsis. Methods/design: Data sources: the Cochrane Database of Systematic Reviews, the Database of Abstracts of Reviews of Effects (DARE), the Health Technology Assessment Database (HTA), the NHS Economic Evaluation Database (NHSEED), The Cochrane Library, MEDLINE, EMBASE, ISI Web of Science, BIOSIS Previews, MEDION and the Aggressive Research Intelligence Facility Database (ARIF). Study selection: diagnostic accuracy studies that compare the real-time PCR technology with standard culture results performed on a patient's blood sample during the management of sepsis. Data extraction: three reviewers, working independently, will determine the level of evidence, methodological quality and a standard data set relating to demographics and diagnostic accuracy metrics for each study. Statistical analysis/data synthesis: heterogeneity of studies will be investigated using a coupled forest plot of sensitivity and specificity and a scatter plot in Receiver Operator Characteristic (ROC) space. A bivariate model will be used to estimate summary sensitivity and specificity. The authors will investigate reporting biases using funnel plots based on effective sample size and regression tests of asymmetry. Subgroup analyses are planned for adults, children and infection setting (hospital vs community) if sufficient data are uncovered.
Dissemination: Recommendations will be made to the Department of Health (as part of an open-access HTA report) as to whether the real-time PCR technology has sufficient clinical diagnostic accuracy potential to move forward to efficacy testing during the provision of routine clinical care.
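The sensitivity and specificity that the review plans to summarize derive, per study, from a 2×2 table of index test (SeptiFast) versus reference standard (blood culture); a minimal sketch with invented counts:

```python
def diagnostic_accuracy(tp, fp, fn, tn):
    """Sensitivity and specificity of an index test against a reference standard."""
    sensitivity = tp / (tp + fn)   # proportion of reference-positives detected
    specificity = tn / (tn + fp)   # proportion of reference-negatives cleared
    return sensitivity, specificity

# Invented 2x2 counts: PCR result versus blood-culture reference
sens, spec = diagnostic_accuracy(tp=80, fp=20, fn=20, tn=180)
print(sens, spec)
```

Each study contributes one such (sensitivity, specificity) pair, which the planned coupled forest plot and ROC-space scatter plot then display jointly.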
Abstract:
Committees worldwide have set almost identical folate recommendations for the prevention of the first occurrence of neural tube defects (NTDs). We evaluate these recommendations by reviewing the results of intervention studies that examined the response of red blood cell folate to altered folate intake. Three options are suggested to achieve the extra 400 μg folic acid/d being recommended by the official committees: increased intake of folate-rich foods, dietary folic acid supplementation, and folic acid fortification of food. A significant increase in foods naturally rich in folates was shown to be a relatively ineffective means of increasing red blood cell folate status in women compared with equivalent intakes of folic acid-fortified food, presumably because the synthetic form of the vitamin is more stable and more bioavailable. Although folic acid supplements are highly effective in optimizing folate status, supplementation is not an effective strategy for the primary prevention of NTDs because of poor compliance. Thus, food fortification is seen by many as the only option likely to succeed. Mandatory folic acid fortification of grain products was introduced recently in the United States at a level projected to provide an additional mean intake of 100 μg folic acid/d, but some feel that this policy does not go far enough. A recent clinical trial predicted that the additional intake of folic acid in the United States will reduce NTDs by >20%, whereas 200 μg/d would be highly protective and is the dose also shown to be optimal in lowering plasma homocysteine, with possible benefits in preventing cardiovascular disease. Thus, an amount lower than the current target of an extra 400 μg/d may be sufficient to increase red blood cell folate to concentrations associated with the lowest risk of NTDs, but further investigation is warranted to establish the optimal amount.
Abstract:
We investigated the relationship between erythropoietin (Epo) and pH, PaO2 and haematocrit in 100 cord blood samples obtained at Caesarean section prior to labour. Of 82 term (>37 weeks) infants, 64 were appropriately grown (10th-90th centiles), and their mean cord serum Epo was 23 ± 8 mU/ml (mean ± SD). Strong inverse correlations were found between cord serum Epo and cord blood pH (r = -0.74; p < 0.0001), and between cord serum Epo and cord blood PaO2 (r = -0.55; p < 0.0001), but not between cord serum Epo and cord haematocrit (r = 0.02; p < 0.9). For the 18 preterm babies (gestation 32.4 ± 4.1 weeks, birth weight 1,820 ± 476 g), the Epo level was 36 ± 8 mU/ml, which was not significantly greater than for the term babies. Strong inverse correlations were again found between Epo and pH (r = -0.87; p < 0.0001) and between Epo and PaO2 (r = -0.69; p < 0.002). Babies from complicated pregnancies (intra-uterine growth retardation, pre-eclampsia, antepartum haemorrhage, diabetes mellitus and fetal distress) tended to have higher Epo levels. Thirteen babies had Epo levels > 40 mU/ml, and 11 (85%) of these required neonatal intensive care. Cord serum Epo correlates better with oxygen tension and pH at birth than with fetal growth and haematocrit, which are measures of chronic stress to the fetus.
Abstract:
Epidemiological studies show that elevated plasma levels of advanced glycation end products (AGEs) are associated with diabetes, kidney disease, and heart disease. Thus AGEs have been used as disease progression markers. However, the effects of variations in biological sample processing procedures on the level of AGEs in plasma/serum samples have not been investigated. The objective of this investigation was to assess the effect of variations in blood sample collection on measured N(ε)-(carboxymethyl)lysine (CML), the best characterised AGE, and its homolog, N(ε)-(carboxyethyl)lysine (CEL). The investigation examined the effect on CML and CEL of different blood collection tubes, inclusion of a stabilising cocktail, freeze-thaw cycles, different storage times and temperatures, and delayed centrifugation, using a pooled sample from healthy volunteers. CML and CEL were measured in extracted samples by ultra-performance liquid chromatography-tandem mass spectrometry. Median CML and CEL ranged from 0.132 to 0.140 mM/M lys and from 0.053 to 0.060 mM/M lys, respectively. No significant difference was shown for CML or CEL in plasma/serum samples. Therefore, samples collected as part of epidemiological studies that do not undergo specific sample treatment at collection are suitable for measuring CML and CEL.
Abstract:
Many of the physiological functions of von Willebrand Factor (VWF), including its binding interaction with blood platelets, are regulated by the magnitude of applied fluid/hydrodynamic stress. We applied two complementary strategies to study the effect of fluid forces on the solution structure of VWF. First, small-angle neutron scattering was used to measure protein conformation changes in response to laminar shear rates (G) up to 3000/s. Here, purified VWF was sheared in a quartz Couette cell and protein conformation was measured in real time over length scales from 2-140 nm. Second, changes in VWF structure up to 9600/s were quantified by measuring the binding of a fluorescent probe, 1,1'-bis(anilino)-4,4'-bis(naphthalene)-8,8'-disulfonate (bis-ANS), to hydrophobic pockets exposed in the sheared protein. Small-angle neutron scattering studies, coupled with quantitative modeling, showed that VWF undergoes structural changes at G < 3000/s. These changes were most prominent at length scales <10 nm (scattering vector (q) range >0.6/nm). A mathematical model attributes these changes to the rearrangement of domain-level features within the globular section of the protein. Studies with bis-ANS demonstrated a marked increase in bis-ANS binding at G > 2300/s. Together, the data suggest that local rearrangements at the domain level may precede changes at larger length scales that accompany exposure of protein hydrophobic pockets. The changes in VWF conformation reported here likely regulate protein function in response to fluid shear.
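The correspondence quoted above between scattering vector and real-space length scale (q > 0.6/nm for length scales < 10 nm) follows from the standard small-angle scattering relation d = 2π/q; a quick check:

```python
import math

def length_scale_nm(q_per_nm):
    """Real-space length scale d (nm) probed at scattering vector q (1/nm), d = 2*pi/q."""
    return 2 * math.pi / q_per_nm

# q = 0.6/nm corresponds to a length scale of roughly 10 nm
print(round(length_scale_nm(0.6), 1))
```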
Abstract:
AIMS: To determine whether alanine aminotransferase or gamma-glutamyltransferase levels, as markers of liver health and non-alcoholic fatty liver disease, might predict cardiovascular events in people with Type 2 diabetes.
METHODS: Data from the Fenofibrate Intervention and Event Lowering in Diabetes study were analysed to examine the relationship between liver enzymes and incident cardiovascular events (non-fatal myocardial infarction, stroke, coronary and other cardiovascular death, coronary or carotid revascularization) over 5 years.
RESULTS: Alanine aminotransferase level had a linear inverse relationship with the first cardiovascular event occurring in participants during the study period. After adjustment, for every 1 SD higher baseline alanine aminotransferase value (13.2 U/l), the risk of a cardiovascular event was 7% lower (95% CI 4-13; P=0.02). Participants with alanine aminotransferase levels below and above the reference range (8-41 U/l for women, 9-59 U/l for men) had hazard ratios for a cardiovascular event of 1.86 (95% CI 1.12-3.09) and 0.65 (95% CI 0.49-0.87), respectively (P=0.001). No relationship was found for gamma-glutamyltransferase.
CONCLUSIONS: The data may indicate that in people with Type 2 diabetes, which is associated with higher alanine aminotransferase levels because of prevalent non-alcoholic fatty liver disease, a low alanine aminotransferase level is a marker of hepatic or systemic frailty rather than health.
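The 'per 1 SD' effect size reported above (7% lower risk per 13.2 U/l) amounts to rescaling a per-unit Cox regression coefficient to the SD of the predictor; a minimal sketch, with an invented coefficient chosen to reproduce that figure:

```python
import math

def hr_per_sd(beta_per_unit, sd):
    """Hazard ratio per 1-SD increase, from a per-unit Cox log-hazard coefficient."""
    return math.exp(beta_per_unit * sd)

# Invented beta: chosen so the HR per SD (13.2 U/l) is ~0.93, i.e. 7% lower risk
beta = math.log(0.93) / 13.2
print(round(hr_per_sd(beta, 13.2), 2))
```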
Abstract:
Malaria, caused by Plasmodium falciparum (P. falciparum), ranks as one of the most baleful infectious diseases worldwide. New antimalarial treatments are needed to face existing or emerging drug-resistant strains. Protein degradation appears to play a significant role during the asexual intraerythrocytic developmental cycle (IDC) of P. falciparum. Inhibition of the ubiquitin proteasome system (UPS), a major intracellular proteolytic pathway, effectively reduces infection and parasite replication. The P. falciparum and erythrocyte UPS coexist during the IDC, but the nature of their relationship is largely unknown. We used an approach based on Tandem Ubiquitin-Binding Entities (TUBEs) and 1D gel electrophoresis followed by mass spectrometry to identify major components of the TUBEs-associated ubiquitin proteome of both host and parasite during the ring, trophozoite and schizont stages. Ring-exported protein (REX1), a P. falciparum protein located in Maurer's clefts and important for parasite nutrient import, was found to reach a maximum level of ubiquitylation in the trophozoite stage. The Homo sapiens (H. sapiens) TUBEs-associated ubiquitin proteome decreased during the infection, whereas its P. falciparum counterpart increased. Major cellular processes such as DNA repair, replication, stress response, vesicular transport and catabolic events appear to be regulated by ubiquitylation during the IDC of P. falciparum infection.
Abstract:
The traditional basis for assessing the effect of antihypertensive therapy is the blood pressure reading taken by a physician. However, several recent trials have been designed to evaluate the blood pressure lowering effect of various therapeutic agents during the patients' normal daytime activities, using a portable, semi-automatic blood pressure recorder. The results have shown that in a given patient, blood pressure measured at the physician's office often differs greatly from that prevailing during the rest of the day. This is true both in treated and untreated hypertensive patients. The difference between office and ambulatory recorded pressures cannot be predicted from blood pressure levels measured by the physician. Therefore, a prospective study was carried out in patients with diastolic blood pressures that were uncontrolled at the physician's office despite antihypertensive therapy. The purpose was to evaluate the response of recorded ambulatory blood pressure to treatment adjustments aimed at reducing office blood pressure below a pre-set target level. Only patients with high ambulatory blood pressures at the outset appeared to benefit from further changes in therapy. Thus, ambulatory blood pressure monitoring can be used to identify those patients who remain hypertensive only when facing the physician, despite antihypertensive therapy. Ambulatory monitoring could thus help to evaluate the efficacy of antihypertensive therapy and allow individual treatment.
Abstract:
Many studies based on either an experimental or an epidemiological approach, have shown that the ability to drive is impaired when the driver is under the influence of cannabis. Baseline performances of heavy users remain impaired even after several weeks of abstinence. Symptoms of cannabis abuse and dependence are generally considered incompatible with safe driving. Recently, it has been shown that traffic safety can be increased by reporting the long-term unfit drivers to the driver licensing authorities and referring the cases for further medical assessment. Evaluation of the frequency of cannabis use is a prerequisite for a reliable medical assessment of the fitness to drive. In a previous paper we advocated the use of two thresholds based on 11-nor-9-carboxy-Δ9-tetrahydrocannabinol (THCCOOH) concentration in whole blood to help to distinguish occasional cannabis users (≤3μg/L) from heavy regular smokers (≥40μg/L). These criteria were established on the basis of results obtained in a controlled cannabis smoking study with placebo, carried out with two groups of young male volunteers; the first group was characterized by a heavy use (≥10 joints/month) while the second group was made up of occasional users smoking at most 1 joint/week. However, to date, these cutoffs have not been adequately assessed under real conditions. Their validity can now be evaluated and confirmed with 146 traffic offenders' real cases in which the whole blood cannabinoid concentrations and the frequency of cannabis use are known. The two thresholds were not challenged by the presence of ethanol (40% of cases) and of other therapeutic and illegal drugs (24%). Thus, we propose the following procedure that can be very useful in the Swiss context but also in other countries with similar traffic policies: if the whole blood THCCOOH concentration is higher than 40μg/L, traffic offenders must be directed first and foremost toward medical assessment of their fitness to drive. 
This evaluation is not recommended if the THCCOOH concentration is lower than 3μg/L and if the self-rated frequency of cannabis use is less than 1 time/week. A THCCOOH level between these two thresholds cannot be reliably interpreted. In such a case, further medical assessment and follow-up of the fitness to drive are also suggested, but with lower priority.
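The two-threshold triage described above can be sketched as a simple decision rule; the 3 and 40 μg/L cut-offs come from the text, while the function name and returned labels are illustrative, not statutory wording:

```python
def triage(thccooh_ug_per_l, uses_per_week=None):
    """Triage a traffic offender by whole-blood THCCOOH concentration (ug/L).

    Cut-offs (3 and 40 ug/L) follow the two-threshold proposal; labels
    are illustrative.
    """
    if thccooh_ug_per_l > 40:
        # Heavy regular use suspected: assessment has top priority
        return "priority medical assessment of fitness to drive"
    if thccooh_ug_per_l < 3 and uses_per_week is not None and uses_per_week < 1:
        # Occasional use, consistent self-report: no assessment recommended
        return "no medical assessment recommended"
    # Between thresholds, or self-report missing/inconsistent
    return "medical assessment and follow-up, lower priority"

print(triage(55))
print(triage(2, uses_per_week=0.5))
print(triage(10))
```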
Abstract:
BACKGROUND AND OBJECTIVES: The SBP values to be achieved by antihypertensive therapy in order to maximize reduction of cardiovascular outcomes are unknown; neither is it clear whether, in patients with a previous cardiovascular event, the optimal values are lower than in low-to-moderate risk hypertensive patients, or whether a more cautious blood pressure (BP) reduction should be obtained. Because of the uncertainty whether 'the lower the better' or the 'J-curve' hypothesis is correct, the European Society of Hypertension and the Chinese Hypertension League have promoted a randomized trial comparing antihypertensive treatment strategies aiming at three different SBP targets in hypertensive patients with a recent stroke or transient ischaemic attack. As the optimal level of low-density lipoprotein cholesterol (LDL-C) is also unknown in these patients, LDL-C lowering has been included in the design. PROTOCOL DESIGN: The European Society of Hypertension-Chinese Hypertension League Stroke in Hypertension Optimal Treatment trial is a prospective multinational, randomized trial with a 3 × 2 factorial design comparing: three different SBP targets (1, 145-135 mmHg; 2, 135-125 mmHg; 3, <125 mmHg); two different LDL-C targets (target A, 2.8-1.8 mmol/l; target B, <1.8 mmol/l). The trial is to be conducted on 7500 patients aged at least 65 years (2500 in Europe, 5000 in China) with hypertension and a stroke or transient ischaemic attack 1-6 months before randomization. Antihypertensive and statin treatments will be initiated or modified using suitable registered agents chosen by the investigators, in order to maintain patients within the randomized SBP and LDL-C windows. All patients will be followed up every 3 months for BP and every 6 months for LDL-C. Ambulatory BP will be measured yearly. OUTCOMES: Primary outcome is time to stroke (fatal and non-fatal).
Important secondary outcomes are: time to first major cardiovascular event; cognitive decline (Montreal Cognitive Assessment) and dementia. All major outcomes will be adjudicated by committees blind to randomized allocation. A Data and Safety Monitoring Board has open access to data and can recommend trial interruption for safety. SAMPLE SIZE CALCULATION: It has been calculated that 925 patients would reach the primary outcome after a mean 4-year follow-up, and this should provide at least 80% power to detect a 25% stroke difference between SBP targets and a 20% difference between LDL-C targets.
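The published sample-size calculation involves a three-arm design and follow-up assumptions, but as a rough plausibility check, Freedman's standard approximation gives the number of events needed in a simple two-group log-rank comparison to detect a 25% reduction (HR 0.75) with 80% power and two-sided 5% alpha, which is well below the 925 expected events:

```python
import math

def freedman_events(hr, z_alpha=1.959964, z_beta=0.841621):
    """Freedman's approximation: total events needed in a two-group
    log-rank test to detect hazard ratio `hr`.

    Defaults: two-sided alpha = 0.05, power = 80%.
    """
    return math.ceil((z_alpha + z_beta) ** 2 * (1 + hr) ** 2 / (1 - hr) ** 2)

print(freedman_events(0.75))
```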
Abstract:
This article extends the existing discussion in the literature on probabilistic inference and decision making with respect to continuous hypotheses, which are prevalent in forensic toxicology. As its main aim, this research investigates the properties of a widely followed approach for quantifying the level of toxic substances in blood samples and compares this procedure with a Bayesian probabilistic approach. As an example, attention is confined to the presence of toxic substances, such as THC, in blood from car drivers. In this context, the interpretation of results from laboratory analyses needs to take into account legal requirements for establishing the 'presence' of target substances in blood. In the first part, the performance of the proposed Bayesian model for the estimation of an unknown parameter (here, the amount of a toxic substance) is illustrated and compared with the currently used method. In the second part, the model is used to approach, in a rational way, the decision component of the problem, that is, judicial questions of the kind 'Is the quantity of THC measured in the blood over the legal threshold of 1.5 μg/l?'. This is illustrated with a practical example.
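A minimal sketch of the Bayesian decision component described above: assuming a single measurement with known Gaussian analytical uncertainty and a flat prior, the posterior for the true concentration is itself normal, and the probability of exceeding the 1.5 μg/l threshold is a survival-function evaluation (the measured value and SD below are invented, and the paper's actual model may differ):

```python
import math

def prob_over_threshold(measured, sd, threshold):
    """Posterior P(true concentration > threshold) under a normal measurement
    model with known analytical SD and a flat prior (stdlib only)."""
    z = (threshold - measured) / sd
    # Normal survival function via the complementary error function
    return 0.5 * math.erfc(z / math.sqrt(2))

# Invented example: measured 1.8 ug/l THC, analytical SD 0.25 ug/l
p = prob_over_threshold(1.8, 0.25, 1.5)
print(round(p, 3))
```

A decision maker can then compare this posterior probability against a chosen probability standard (e.g. report 'over the threshold' only if p exceeds 0.95), which is exactly the kind of judicial question the article formalizes.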
Abstract:
Molecular evidence suggests that levels of vitamin D are associated with kidney function loss. Still, population-based studies are limited and few have considered the potential confounding effect of baseline kidney function. This study evaluated the association of serum 25-hydroxyvitamin D with change in eGFR, rapid eGFR decline (annual loss >3 ml/min per 1.73 m(2)), and incidence of CKD and albuminuria, using baseline (2003-2006) and 5.5-year follow-up data from a Swiss adult general population. Serum 25-hydroxyvitamin D was measured at baseline using liquid chromatography-tandem mass spectrometry. eGFR and albuminuria were collected at baseline and follow-up. Multivariate linear and logistic regression models were used, adjusting for potential confounding factors. Among the 4280 people included in the analysis, the mean±SD annual eGFR change was -0.57±1.78 ml/min per 1.73 m(2), and 287 (6.7%) participants presented rapid eGFR decline. Before adjustment for baseline eGFR, baseline 25-hydroxyvitamin D level was associated with both mean annual eGFR change and risk of rapid eGFR decline, independently of baseline albuminuria. Once adjusted for baseline eGFR, the associations were no longer significant. For every 10 ng/ml higher baseline 25-hydroxyvitamin D, the adjusted mean annual eGFR change was -0.005 ml/min per 1.73 m(2) (95% confidence interval, -0.063 to 0.053; P=0.87) and there was no association with rapid eGFR decline (odds ratio, 0.93; 95% confidence interval, 0.79 to 1.08; P=0.33). Baseline 25-hydroxyvitamin D level was not associated with incidence of CKD or albuminuria. The association of 25-hydroxyvitamin D with eGFR decline is confounded by baseline eGFR. Sufficient 25-hydroxyvitamin D levels do not seem to protect from eGFR decline independently of baseline eGFR.
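The 'rapid eGFR decline' definition used in this study (annual loss >3 ml/min per 1.73 m²) can be sketched as follows; the baseline and follow-up values are invented:

```python
def annual_egfr_change(egfr_baseline, egfr_followup, years):
    """Annualized eGFR change (ml/min per 1.73 m^2 per year); negative = decline."""
    return (egfr_followup - egfr_baseline) / years

def is_rapid_decline(annual_change, cutoff=3.0):
    """Rapid decline = annual loss greater than `cutoff`, per the study definition."""
    return -annual_change > cutoff

# Invented values over the study's 5.5-year follow-up
change = annual_egfr_change(95.0, 72.0, 5.5)
print(round(change, 2), is_rapid_decline(change))
```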