93 results for cut-off score
Abstract:
Introduction: Particularly in elderly patients, the brain responds to a systemic inflammatory response with an increased production of inflammatory mediators. This has hypothetically been linked to the development of postoperative cognitive dysfunction (POCD). Methods: We investigated 31 patients aged >65 yrs undergoing elective major surgery under standardized general anaesthesia (thiopental, sevoflurane, fentanyl, atracurium). Cognitive function was measured preoperatively and 7 days postoperatively using the extended version of the Consortium to Establish a Registry for Alzheimer's Disease - Neuropsychological Assessment Battery (CERAD-NAB, validated German version), for which we developed a diagnostic cut-off in healthy elderly volunteers. Systemic C-reactive protein (CRP) and interleukin 6 (IL-6) were measured preoperatively, 2 days postoperatively, and 7 days postoperatively. Values for CRP, IL-6, operative characteristics and hospital length of stay in patients with and without POCD were compared using the Mann-Whitney U test and are shown as median [range]. Results: Fourteen patients (45%) developed POCD. Values for CRP were not statistically different between patients with and without POCD but tended to be higher in patients with POCD 2 days postoperatively. Patients with POCD had significantly higher IL-6 values on postoperative days 2 and 7 (Table 1). These patients also had a significantly longer duration of anaesthesia (305 [195-620] vs. 190 [150-560] min, p = 0.034), larger intraoperative blood loss (425 [0-1600] vs. 100 [0-1500] ml, p = 0.018) and longer hospital stays (15 [8-45] vs. 8 [4-40] days, p = 0.008).
Table 1
                      POCD (n = 14)    No POCD (n = 17)   p value
CRP (mg/dl)
  preop.              4.0 [1.0-245]    4.2 [0.3-36.2]     0.6
  2 days postop.      223 [20-318]     98 [4.5-384]       0.07
  7 days postop.      58 [15-147]      44 [11-148]        0.2
IL-6 (U/ml)
  preop.              2 [2-28.1]       2 [2-7.3]          0.8
  2 days postop.      56 [17-315]      20 [2-123]         0.009
  7 days postop.      9 [2-77]         4 [2-16]           0.03
Interpretation: In this small group of patients, high postoperative IL-6 values were associated with POCD, supporting a role for systemic inflammation in the development of POCD. In patients with POCD, the duration of anaesthesia was significantly longer and intraoperative blood losses were larger. These risk factors will need to be confirmed in a larger group of patients. The difference in length of stay may be indicative of postoperative complications, which have been linked to POCD earlier.
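The group comparison above can be reproduced in outline with a short Python sketch, assuming two independent samples and the Mann-Whitney U test named in the Methods; the IL-6 values below are invented placeholders, not study data.

```python
# Illustrative only: comparing postoperative IL-6 between POCD and non-POCD
# groups with the Mann-Whitney U test, reporting median [range] as in the abstract.
import numpy as np
from scipy.stats import mannwhitneyu

il6_pocd = np.array([56.0, 120.0, 17.0, 315.0, 80.0])     # hypothetical POCD values (U/ml)
il6_no_pocd = np.array([20.0, 2.0, 45.0, 123.0, 10.0])    # hypothetical non-POCD values (U/ml)

u_stat, p_value = mannwhitneyu(il6_pocd, il6_no_pocd, alternative="two-sided")
print(f"U = {u_stat:.1f}, p = {p_value:.3f}")
print(f"POCD median [range]: {np.median(il6_pocd)} [{il6_pocd.min()}-{il6_pocd.max()}]")
```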
Abstract:
OBJECTIVE: This article analyses the influence of treatment duration on survival in patients with invasive carcinoma of the cervix treated by radical radiation therapy. METHOD: Three hundred and sixty patients with FIGO stage IB-IIIB carcinoma of the cervix were treated in Lausanne (Switzerland) with external radiation and brachytherapy as first-line therapy. Median therapy duration was 45 days. Patients were classified according to the duration of therapy, taking 60 days (the 75th percentile) as an arbitrary cut-off. RESULTS: The 5-year survival was 61% (S.E. = 3%) for the therapy duration group of less than 60 days and 53% (S.E. = 7%) for the group of more than 60 days. In terms of the univariate hazard ratio (HR), the relative difference between the two groups corresponds to a 50% increase in deaths (HR = 1.53, 95% CI = 1.03-2.28) for the longer therapy duration group (P = 0.044). In a multivariate analysis, the magnitude of the estimated relative hazard for the longer therapies is confirmed, though significance was reduced (HR = 1.52, 95% CI = 0.94-2.45, P = 0.084). CONCLUSION: These findings suggest that short treatment duration is a factor associated with longer survival in carcinoma of the cervix.
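As a rough illustration of the duration analysis (not the authors' code), the sketch below dichotomises treatment duration at the 60-day cut-off and estimates a univariate hazard ratio with a Cox model from the lifelines package; the data frame, column names and values are all invented.

```python
# Hedged sketch: univariate Cox model for treatment duration > 60 days
# (the study's 75th-percentile cut-off). Data are hypothetical.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "months_to_event": [24, 12, 60, 30, 40, 60, 18, 48],  # follow-up time (hypothetical)
    "died":            [1,  1,  0,  1,  1,  0,  1,  0],   # event indicator
    "treatment_days":  [65, 70, 62, 75, 42, 45, 44, 50],  # radiotherapy duration
})
df["long_treatment"] = (df["treatment_days"] > 60).astype(int)  # 60-day cut-off

cph = CoxPHFitter()
cph.fit(df[["months_to_event", "died", "long_treatment"]],
        duration_col="months_to_event", event_col="died")
print(cph.summary)  # exp(coef) is the estimated hazard ratio
```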
Abstract:
We evaluated the effectiveness of supplementation with a high dose of oral vitamin D3 to correct vitamin D insufficiency. We have shown that one or two oral boluses of 300,000 IU of vitamin D3 can correct vitamin D insufficiency in 50% of patients and that the patients who benefited most from supplementation were those with the lowest baseline levels. INTRODUCTION: Adherence with daily oral supplements of vitamin D3 is suboptimal. We evaluated the effectiveness of a single high dose of oral vitamin D3 (300,000 IU) to correct vitamin D insufficiency in a rheumatologic population. METHODS: Over 1 month, 292 patients had levels of 25-OH vitamin D determined. Results were classified as: deficiency <10 ng/ml, insufficiency ≥10 to <30 ng/ml, and normal ≥30 ng/ml. We added a category using the IOM-recommended cut-off of 20 ng/ml. Patients with deficient or normal levels were excluded, as were patients already supplemented with vitamin D3. Selected patients (141) with vitamin D insufficiency (18.5 ng/ml (10.2-29.1)) received a prescription for 300,000 IU of oral vitamin D3 and were asked to return after 3 (M3) and 6 months (M6). Patients still insufficient at M3 received a second prescription for 300,000 IU of oral vitamin D3. The relation between changes in 25-OH vitamin D between M3 and M0 and baseline values was assessed. RESULTS: 124 patients had a blood test at M3. Two (2%) had deficiency (8.1 ng/ml (7.5-8.7)) and 50 (40%) normal results (36.7 ng/ml (30.5-5.5)). Seventy-two (58%) were insufficient (23.6 ng/ml (13.8-29.8)) and received a second prescription for 300,000 IU of oral vitamin D3. Of the 50/124 patients who had normal results at M3 and did not receive a second prescription, 36 (72%) had a test at M6. Seventeen (47%) had normal results (34.8 ng/ml (30.3-42.8)) and 19 (53%) were insufficient (25.6 ng/ml (15.2-29.9)). Of the 72/124 patients who received a second prescription, 54 (75%) had a test at M6. Twenty-eight (52%) had insufficiency (23.2 ng/ml (12.8-28.7)) and 26 (48%) had normal results (33.8 ng/ml (30.0-43.7)). At M3, 84% of patients achieved a 25-OH vitamin D level >20 ng/ml. The lower the baseline value, the greater the change after 3 months (negative relation with a correlation coefficient r = -0.3, p = 0.0007). CONCLUSIONS: We have shown that one or two oral boluses of 300,000 IU of vitamin D3 can correct vitamin D insufficiency in 50% of patients.
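The classification rules used above reduce to simple thresholds; a minimal sketch, assuming the stated cut-offs (deficiency <10 ng/ml, insufficiency 10 to <30 ng/ml, normal ≥30 ng/ml, plus the IOM 20 ng/ml cut-off), is shown below. The function name and example values are ours.

```python
# Minimal sketch of the abstract's 25-OH vitamin D categories and the 20 ng/ml criterion.
def classify_25oh_vitd(level_ng_ml: float, iom_cutoff: float = 20.0) -> dict:
    if level_ng_ml < 10:
        category = "deficiency"
    elif level_ng_ml < 30:
        category = "insufficiency"
    else:
        category = "normal"
    return {"category": category,
            "above_iom_cutoff": level_ng_ml > iom_cutoff}  # the >20 ng/ml criterion

print(classify_25oh_vitd(18.5))   # median baseline level of the selected patients
print(classify_25oh_vitd(36.7))   # a normalised post-supplementation level
```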
Abstract:
BACKGROUND AND OBJECTIVES: Anabolic steroids are synthetic derivatives of testosterone, modified to enhance its anabolic actions (promotion of protein synthesis and muscle growth). They have numerous side effects, and are on the International Olympic Committee's list of banned substances. Gas chromatography-mass spectrometry allows identification and characterisation of steroids and their metabolites in the urine but may not distinguish between pharmaceutical and natural testosterone. Indirect methods to detect doping include determination of the testosterone/epitestosterone glucuronide ratio with suitable cut-off values. Direct evidence may be obtained with a method based on the determination of the carbon isotope ratio of the urinary steroids. This paper aims to give an overview of the use of anabolic-androgenic steroids in sport and methods used in anti-doping laboratories for their detection in urine, with special emphasis on doping with testosterone. METHODS: Review of the recent literature of anabolic steroid testing, athletic use, and adverse effects of anabolic-androgenic steroids. RESULTS: Procedures used for detection of doping with endogenous steroids are outlined. The World Anti-Doping Agency provided a guide in August 2004 to ensure that laboratories can report, in a uniform way, the presence of abnormal profiles of urinary steroids resulting from the administration of testosterone or its precursors, androstenediol, androstenedione, dehydroepiandrosterone or a testosterone metabolite, dihydrotestosterone, or a masking agent, epitestosterone. CONCLUSIONS: Technology developed for detection of testosterone in urine samples appears suitable when the substance has been administered intramuscularly. Oral administration leads to rapid pharmacokinetics, so urine samples need to be collected in the initial hours after intake. Thus there is a need to find specific biomarkers in urine or plasma to enable detection of long term oral administration of testosterone.
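The indirect screening approach mentioned above compares the testosterone/epitestosterone (T/E) glucuronide ratio with a cut-off; a hedged sketch follows. The abstract does not state a threshold, so the cut-off is a parameter here and the value passed in the example is purely illustrative.

```python
# Illustrative T/E ratio screen; the cut-off is an assumption, not taken from the abstract.
def te_ratio_flag(testosterone_gluc: float, epitestosterone_gluc: float,
                  cutoff: float) -> bool:
    """Return True if the T/E ratio exceeds the chosen cut-off (sample flagged for follow-up)."""
    if epitestosterone_gluc <= 0:
        raise ValueError("epitestosterone concentration must be positive")
    return (testosterone_gluc / epitestosterone_gluc) > cutoff

# hypothetical urinary glucuronide concentrations (same units for both analytes)
print(te_ratio_flag(48.0, 10.0, cutoff=4.0))  # cutoff=4.0 is only an example value
```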
Abstract:
Introduction: Coarctation of the aorta is a common congenital heart malformation. The mode of diagnosis has shifted from clinical examination to almost exclusively echocardiography and MRI. We aimed to find a new echocardiographic index, based on simple and reliable morphologic measurements, to facilitate the diagnosis of aortic coarctation in the newborn. We reproduced the same procedure in older children to validate this new index. Material and Methods: We reviewed echocardiographic studies of 47 neonates with a diagnosis of coarctation who underwent cardiac surgery between January 1997 and February 2003 and compared them with a matched control group. We measured 12 different sites of the aorta, aortic arch and great vessels on the echocardiographic recordings. In a second step, we reviewed 23 infants with the same measurements and compared them with a matched control group. Results: 47 neonates with coarctation were analysed, age 11.8 ± 10 days, weight 3.0 ± 0.6 kg, body surface area 0.20 ± 0.02 m². The control group consisted of 16 newborns aged 15.8 ± 10 days, weight 3.2 ± 0.9 kg and body surface area 0.20 ± 0.04 m². A significant difference was noted in many morphologic measurements between the two groups, the most significant being the distance between the left carotid artery and the left subclavian artery (coarctation vs control: 7.3 ± 3 mm vs 2.4 ± 0.8 mm, p < 0.0001). We then defined a new index, the carotid-subclavian arteries index (CSI), as the diameter of the distal transverse aortic arch divided by the distance from the left carotid artery to the left subclavian artery, which was also significantly different (coarctation vs control: 0.76 ± 0.86 vs 2.95 ± 1.24, p < 0.0001). With a cut-off value of 1.5 for this index, the sensitivity for aortic coarctation was 98% and the specificity 92%. In an older group of infants with coarctation (16 patients) we applied the same principle and found, for a cut-off value of 1.5, a sensitivity of 95% and a specificity of 100%. Conclusions: The CSI allows newborns and infants to be evaluated for aortic coarctation with simple morphologic measurements that do not depend on left ventricular function or on the presence or absence of a patent ductus arteriosus. Further aggressive evaluation of these patients with a CSI < 1.5 is indicated.
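The CSI defined above is a single ratio, and the reported sensitivity/specificity follow from applying the 1.5 cut-off; a small sketch with invented measurements illustrates both computations (it is not the authors' analysis).

```python
# Sketch: carotid-subclavian arteries index (CSI) and test performance at the 1.5 cut-off.
import numpy as np

def csi(distal_arch_diameter_mm: float, carotid_to_subclavian_mm: float) -> float:
    """CSI = distal transverse arch diameter / left carotid-to-left subclavian distance."""
    return distal_arch_diameter_mm / carotid_to_subclavian_mm

print(f"example CSI: {csi(4.2, 5.5):.2f}")     # hypothetical neonate measurements (mm)

# hypothetical cohort: CSI values and true coarctation status
values = np.array([0.8, 0.6, 1.2, 0.9, 2.8, 3.1, 2.4, 1.7])
truth  = np.array([1,   1,   1,   1,   0,   0,   0,   0], dtype=bool)

predicted = values < 1.5                        # a low CSI suggests coarctation
sensitivity = (predicted & truth).sum() / truth.sum()
specificity = (~predicted & ~truth).sum() / (~truth).sum()
print(f"sensitivity = {sensitivity:.0%}, specificity = {specificity:.0%}")
```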
Abstract:
Introduction: A substantial number of patients with cancer suffer considerable pain at some point during their disease, and approximately 25% of cancer patients die in pain. In cases of uncontrolled pain or intolerable side effects, an intrathecal drug delivery system (IDDS) is a recognised management option. Indeed, IDDS offers rapid and effective pain relief with fewer drug side effects compared to oral or parenteral administration. The aim of this study is to retrospectively review our series of cancer patients treated with IDDS. Method: Data were extracted from the institutional neuromodulation registry. Patients with cancer pain treated with IDDS from 01.01.1997 to 30.12.2009 were analysed for subjective improvement, changes in pain intensity (VAS) and survival time after implantation. Measurements were available for a decreasing number of patients as time since baseline increased. Results: During the studied period, 78 patients were implanted with IDDS for cancer pain. The mean survival time was 11.1 months (median: 3.8 months) and 14 patients (18%) were still alive at the end of the studied period. Subjective improvement was graded between 55 and 83% during the first year. Mean VAS during the first year remained lower than VAS at baseline. Discussion: IDDS has been shown to be cost-effective in several studies. Although initial costs of implantation are high, the cost benefits favour analgesia with implanted intrathecal pumps over epidural external systems after 3 to 6 months in cancer patients. Improved survival has been associated with IDDS, and in this series both the mean and median survival times were above the cut-off value of three months. The mean subjective improvement was above 50% during the whole first year, suggesting good efficacy of the treatment, a finding that is consistent with the results from other groups. Changes in pain intensity are difficult to interpret in the context of rapidly progressive disease such as terminal cancer. However, mean VAS from 1 through 12 months was lower than baseline, suggesting improved pain control with IDDS, or at least a stabilisation of the pain symptoms. Conclusion: Our retrospective series suggests IDDS is effective in intractable cancer pain and we believe it should be considered even in terminally ill patients with limited life expectancies.
Abstract:
On June 26-27, 2006, 60 academic and industry scientists gathered during the PROSAFE workshop to discuss recommendations on taxonomy, antibiotic resistance, in vitro assessment of virulence and in vivo assessment of safety of probiotics used for human consumption. For identification of lactic acid bacteria (LAB) intended for probiotic use, it was recommended that conventional biochemical methods should be complemented with molecular methods and that these should be performed by an expert lab. Using the newly developed LAB Susceptibility test Medium (LSM), tentative epidemiological cut-off values were proposed. It was recommended that potentially probiotic strains not belonging to the wild-type distributions of relevant antimicrobials should not be developed as future products for human or animal consumption. Furthermore, it was recommended that the use of strains harbouring known and confirmed virulence genes should be avoided. Finally, for in vivo assessment of safety by investigating strain pathogenicity in animal models, the rat endocarditis model appeared to be the most reliable model tested in the PROSAFE project. Moreover, consensus was reached on the necessity of a human colonisation study in a randomised placebo-controlled double-blind design; however, further discussions are needed on the details of such a study.
Abstract:
High performance liquid chromatography (HPLC) is the reference method for measuring concentrations of antimicrobials in blood. This technique requires careful sample preparation. Protocols using organic solvents and/or solid extraction phases are time consuming and entail several manipulations, which can lead to partial loss of the determined compound and increased analytical variability. Moreover, to obtain sufficient material for analysis, at least 1 ml of plasma is required. This constraint makes it difficult to determine drug levels when blood sample volumes are limited. However, drugs with low plasma-protein binding can be reliably extracted from plasma by ultra-filtration with a minimal loss due to the protein-bound fraction. This study validated a single-step ultra-filtration method for extracting fluconazole (FLC), a first-line antifungal agent with weak plasma-protein binding, from plasma to determine its concentration by HPLC. Spiked FLC standards and unknowns were prepared in human and rat plasma. Samples (240 microl) were transferred into disposable microtube filtration units containing cellulose or polysulfone filters with a 5 kDa cut-off. After centrifugation for 60 min at 15,000 g, FLC concentrations were measured by direct injection of the filtrate into the HPLC. Using cellulose filters, low molecular weight proteins were eluted early in the chromatogram and well separated from FLC, which eluted at 8.40 min as a sharp single peak. In contrast, with polysulfone filters several additional peaks interfering with the FLC peak were observed. Moreover, the FLC recovery using cellulose filters compared to polysulfone filters was higher and had better reproducibility. Cellulose filters were therefore used for the subsequent validation procedure. The quantification limit was 0.195 mg/l. Standard curves with a quadratic regression coefficient ≥0.9999 were obtained in the concentration range of 0.195-100 mg/l. The inter- and intra-run accuracies and precisions over the clinically relevant concentration range, 1.875-60 mg/l, fell well within the ±15% variation recommended by the current guidelines for the validation of analytical methods. Furthermore, no analytical interference was observed with commonly used antibiotics, antifungals, antivirals and immunosuppressive agents. Ultra-filtration of plasma with cellulose filters permits the extraction of FLC from small volumes (240 microl). The determination of FLC concentrations by HPLC after this single-step procedure is selective, precise and accurate.
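Since the validation relies on a quadratic standard curve and back-calculation of unknowns, a hedged Python sketch of that pattern is given below; the peak areas are invented and only the fitting/back-calculation logic is illustrated, not the published assay parameters.

```python
# Sketch of a quadratic calibration over the 0.195-100 mg/l range and
# back-calculation of an unknown FLC concentration from its peak area.
import numpy as np

conc = np.array([0.195, 1.875, 7.5, 15, 30, 60, 100])                 # mg/l standards
peak_area = np.array([0.9, 8.7, 35.1, 70.4, 141.0, 280.5, 462.0])     # hypothetical responses

coeffs = np.polyfit(conc, peak_area, deg=2)                            # quadratic calibration
fitted = np.polyval(coeffs, conc)
r2 = 1 - np.sum((peak_area - fitted) ** 2) / np.sum((peak_area - peak_area.mean()) ** 2)
print(f"regression coefficient R^2 = {r2:.5f}")

unknown_area = 55.0                                                    # hypothetical sample
roots = np.roots([coeffs[0], coeffs[1], coeffs[2] - unknown_area])     # invert the curve
in_range = np.isreal(roots) & (roots.real > 0) & (roots.real <= conc.max())
print(f"estimated concentration: {roots[in_range].real[0]:.2f} mg/l")
```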
Abstract:
OBJECTIVES: An article by the Swiss AIDS Commission states that patients with stably suppressed viraemia [i.e. several successive HIV-1 RNA plasma concentrations (viral loads, VL) below the limits of detection during 6 months or more of highly active antiretroviral therapy (HAART)] are unlikely to be infectious. Questions then arise: how reliable is the undetectability of the VL, given the history of measurements? What factors determine reliability? METHODS: We assessed the probability (henceforth termed reliability) that the (n+1)th VL would not exceed 50 or 1000 HIV-1 RNA copies/mL when the nth one had been <50 copies/mL in 6168 patients of the Swiss HIV Cohort Study who were continuing to take HAART between 2003 and 2007. General estimating equations were used to analyse potential factors of reliability. RESULTS: With a cut-off at 50 copies/mL, reliability was 84.5% (n=1), increasing to 94.5% (n=5). Compliance, the current type of HAART and the first antiretroviral therapy (ART) received (HAART or not) were predictive factors of reliability. With a cut-off at 1000 copies/mL, reliability was 97.5% (n=1), increasing to 99.1% (n=4). Chart review revealed that patients had stopped their treatment, admitted to major problems with compliance or were taking non-HAART ART in 72.2% of these cases. Viral escape caused by resistance was found in 5.6%. No explanation was found in the charts of 22.2% of cases. CONCLUSIONS: After several successive VLs at <50 copies/mL, reliability reaches approximately 94% with a cut-off of 50 copies/mL and approximately 99% with a cut-off at 1000 copies/mL. Compliance is the most important factor predicting reliability.
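A much-simplified illustration of the reliability estimate is sketched below: it conditions on a single preceding suppressed value and counts consecutive within-patient pairs, whereas the study used general estimating equations and histories of n suppressed measurements. All viral-load series are invented.

```python
# Simplified (non-GEE) sketch: fraction of consecutive pairs in which the next
# viral load stays at or below the cut-off, given the current one is <50 copies/mL.
def reliability(viral_loads_by_patient, cutoff=50, suppressed=50):
    kept, total = 0, 0
    for vls in viral_loads_by_patient:
        for current_vl, next_vl in zip(vls, vls[1:]):
            if current_vl < suppressed:
                total += 1
                kept += next_vl <= cutoff
    return kept / total if total else float("nan")

series = [[20, 20, 20, 600, 20], [20, 20, 20, 20], [40, 20, 1500, 20]]  # hypothetical copies/mL
print(f"reliability, 50 copies/mL cut-off:   {reliability(series, cutoff=50):.1%}")
print(f"reliability, 1000 copies/mL cut-off: {reliability(series, cutoff=1000):.1%}")
```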
Abstract:
We performed an international proficiency study of Human Papillomavirus (HPV) type 16 serology. A common methodology for serology based on virus-like particle (VLP) ELISA was used by 10 laboratories in 6 continents. The laboratories used the same VLP reference reagent, which was selected as the most stable, sensitive and specific VLP preparation out of VLPs donated from 5 different sources. A blinded proficiency panel consisting of 52 serum samples from women with PCR-verified HPV 16-infection, 11 control serum samples from virginal women and the WHO HPV 16 International Standard (IS) serum were distributed. The mean plus 3 standard deviations of the negative control serum samples was the most generally useful "cut-off" criterion for distinguishing positive and negative samples. Using sensitivity of at least 50% and a specificity of 100% as proficiency criteria, 6/10 laboratories were proficient. In conclusion, an international Standard Operating Procedure for HPV serology, an international reporting system in International Units (IU) and a common "cut-off" criterion have been evaluated in an international HPV serology proficiency study.
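The cut-off rule and proficiency criteria described above are easy to state in code; the sketch below assumes ELISA optical-density readings and invented values, and is not part of the study protocol.

```python
# Sketch: "mean + 3 SD of negative controls" cut-off, then sensitivity/specificity
# against the proficiency criteria (sensitivity >= 50%, specificity 100%).
import numpy as np

neg_controls = np.array([0.05, 0.07, 0.06, 0.08, 0.05, 0.06])   # OD, sera from virginal women
cutoff = neg_controls.mean() + 3 * neg_controls.std(ddof=1)
print(f"cut-off OD = {cutoff:.3f}")

cases = np.array([0.45, 0.30, 0.09, 0.60, 0.07, 0.25])          # OD, PCR-verified HPV16 sera
sensitivity = (cases > cutoff).mean()
specificity = (neg_controls <= cutoff).mean()
proficient = sensitivity >= 0.5 and specificity == 1.0
print(f"sensitivity = {sensitivity:.0%}, specificity = {specificity:.0%}, proficient = {proficient}")
```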
Abstract:
Robust estimators for accelerated failure time models with asymmetric (or symmetric) error distribution and censored observations are proposed. It is assumed that the error model belongs to a log-location-scale family of distributions and that the mean response is the parameter of interest. Since scale is a main component of the mean, scale is not treated as a nuisance parameter. A three-step procedure is proposed. In the first step, an initial high breakdown point S estimate is computed. In the second step, observations that are unlikely under the estimated model are rejected or down-weighted. Finally, a weighted maximum likelihood estimate is computed. To define the estimates, functions of censored residuals are replaced by their estimated conditional expectation given that the response is larger than the observed censored value. The rejection rule in the second step is based on an adaptive cut-off that, asymptotically, does not reject any observation when the data are generated according to the model. Therefore, the final estimate attains full efficiency at the model, with respect to the maximum likelihood estimate, while maintaining the breakdown point of the initial estimator. Asymptotic results are provided. The new procedure is evaluated with the help of Monte Carlo simulations. Two examples with real data are discussed.
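A compact way to read the setup above is the log-location-scale accelerated failure time model with right censoring; the notation below is ours and only sketches the structure described in the abstract, in particular the replacement of functions of censored residuals by conditional expectations.

```latex
% Sketch (our notation) of the AFT setup and the censored-residual device.
\[
  \log T_i = \mathbf{x}_i^{\top}\boldsymbol{\beta} + \sigma\,\varepsilon_i,
  \qquad \varepsilon_i \sim F_0 \ \text{(asymmetric or symmetric error law)},
\]
\[
  Y_i = \min(\log T_i, \log C_i), \qquad \delta_i = \mathbf{1}\{T_i \le C_i\}.
\]
% For a censored observation (delta_i = 0) with standardized residual
% r_i = (Y_i - x_i' beta)/sigma, a function psi of the residual is replaced by
\[
  \operatorname{E}\!\left[\psi(\varepsilon)\mid \varepsilon > r_i\right]
  = \frac{\int_{r_i}^{\infty}\psi(u)\,\mathrm{d}F_0(u)}{1 - F_0(r_i)} .
\]
```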
Abstract:
Background: Beryllium sensitization (BeS) is caused by exposure to beryllium in the workplace and may progress to chronic beryllium disease (CBD). This granulomatous lung disorder mimics sarcoidosis clinically but is characterized by a beryllium-specific CD4+ T-cell immune response. BeS is classically detected by the beryllium lymphocyte proliferation test (BeLPT), but this assay requires radioactivity and is not very sensitive. In the context of a study aiming to evaluate whether CBD patients are misdiagnosed as sarcoidosis patients in Switzerland, we developed an EliSpot assay and a CFSE-based beryllium flow cytometric test. Methods: 23 patients considered as having sarcoidosis (n = 21), CBD (n = 1) and possible CBD (n = 1) were enrolled. EliSpot was performed using plates coated with anti-IFN-gamma mAb. Cells were added to wells and incubated overnight at 37 °C with medium (negative control), SEB (positive control) or BeSO4 at 1, 10 and 100 microM. Biotinylated anti-IFN-gamma mAb was added and spots were visualized using streptavidin-horseradish peroxidase and AEC substrate reagent. Results were reported as spot-forming units (SFU). For beryllium-specific CFSE flow cytometry analysis, CFSE-labelled cells were cultured in the presence of SEB and 1, 10 or 100 microM BeSO4. Unstimulated CFSE-labelled cells were defined as controls. The cells were incubated for 6 days at 37 °C and 5% CO2. Surface labelling of T-lymphocytes and ViViD staining as a control of cell viability were performed at the time of harvest. Results: Using the EliSpot technology, we were able to detect BeS in 1/23 enrolled patients, with a mean of 780 SFU (cut-off value of 50 SFU). This positive result was confirmed using different concentrations of BeSO4. Among the 23 patients tested, 22 showed negative results with EliSpot. Using CFSE flow cytometry, 1/7 tested patients showed a positive result, with a beryllium-specific CD4+ count around 30% versus 45% for SEB stimulation as positive control and 0.6% for the negative control. This patient was the one with a positive EliSpot assay. Conclusions: These preliminary data demonstrate the feasibility of EliSpot and CFSE flow cytometry to detect BeS. The patient with a beryllium-specific positive EliSpot and CFSE flow cytometry result had been exposed to beryllium at her workplace 20 years ago and is still regularly followed for her pulmonary status. A positive BeLPT had already been described in 2001 in France for this patient. Further validation of these techniques is in progress.
Abstract:
Object The goal of this study was to establish whether clear patterns of initial pain freedom could be identified when treating patients with classic trigeminal neuralgia (TN) by using Gamma Knife surgery (GKS). The authors compared hypesthesia and pain recurrence rates to see if statistically significant differences could be found. Methods Between July 1992 and November 2010, 737 patients presenting with TN underwent GKS and prospective evaluation at Timone University Hospital in Marseille, France. In this study the authors analyzed the cases of 497 of these patients, who participated in follow-up longer than 1 year, did not have megadolichobasilar artery- or multiple sclerosis-related TN, and underwent GKS only once; in other words, the focus was on cases of classic TN with a single radiosurgical treatment. Radiosurgery was performed with a Leksell Gamma Knife (model B, C, or Perfexion) using both MR and CT imaging targeting. A single 4-mm isocenter was positioned in the cisternal portion of the trigeminal nerve at a median distance of 7.8 mm (range 4.5-14 mm) anterior to the emergence of the nerve. A median maximum dose of 85 Gy (range 70-90 Gy) was delivered. Using empirical methods and assisted by a chart with clear cut-off periods of pain free distribution, the authors were able to divide patients who experienced freedom from pain into 3 separate groups: patients who became pain free within the first 48 hours post-GKS; those who became pain free between 48 hours and 30 days post-GKS; and those who became pain free more than 30 days after GKS. Results The median age in the 497 patients was 68.3 years (range 28.1-93.2 years). The median follow-up period was 43.75 months (range 12-174.41 months). Four hundred fifty-four patients (91.34%) were initially pain free within a median time of 10 days (range 1-459 days) after GKS. One hundred sixty-nine patients (37.2%) became pain free within the first 48 hours (Group PF(≤ 48 hours)), 194 patients (42.8%) between posttreatment Day 3 and Day 30 (Group PF((>48 hours, ≤ 30 days))), and 91 patients (20%) after 30 days post-GKS (Group PF(>30 days)). Differences in postoperative hypesthesia were found: in Group PF(≤ 48 hours) 18 patients (13.7%) developed postoperative hypesthesia, compared with 30 patients (19%) in Group PF((>48 hours, ≤ 30 days)) and 22 patients (30.6%) in Group PF(>30 days) (p = 0.014). One hundred fifty-seven patients (34.4%) who initially became free from pain experienced a recurrence of pain with a median delay of 24 months (range 0.62-150.06 months). There were no statistically significant differences between the patient groups with respect to pain recurrence: 66 patients (39%) in Group PF(≤ 48 hours) experienced pain recurrence, compared with 71 patients (36.6%) in Group PF((>48 hours, ≤ 30 days)) and 27 patients (29.7%) in Group PF(>30 days) (p = 0.515). Conclusions A substantial number of patients (169 cases, 37.2%) became pain free within the first 48 hours. The rate of hypesthesia was higher in patients who became pain free more than 30 days after GKS, with a statistically significant difference between patient groups (p = 0.014).
Abstract:
Epigenetic silencing of the DNA repair protein O(6)-methylguanine-DNA methyltransferase (MGMT) by promoter methylation predicts successful alkylating agent therapy, such as with temozolomide, in glioblastoma patients. Stratified therapy assignment of patients in prospective clinical trials according to tumor MGMT status requires a standardized diagnostic test, suitable for high-throughput analysis of small amounts of formalin-fixed, paraffin-embedded tumor tissue. A direct, real-time methylation-specific PCR (MSP) assay was developed to determine methylation status of the MGMT gene promoter. Assay specificity was obtained by selective amplification of methylated DNA sequences of sodium bisulfite-modified DNA. The copy number of the methylated MGMT promoter, normalized to the beta-actin gene, provides a quantitative test result. We analyzed 134 clinical glioma samples, comparing the new test with the previously validated nested gel-based MSP assay, which yields a binary readout. A cut-off value for the MGMT methylation status was suggested by fitting a bimodal normal mixture model to the real-time results, supporting the hypothesis that there are two distinct populations within the test samples. Comparison of the tests showed high concordance of the results (82/91 [90%]; Cohen's kappa = 0.80; 95% confidence interval, 0.82-0.95). The direct, real-time MSP assay was highly reproducible (Pearson correlation 0.996) and showed valid test results for 93% (125/134) of samples compared with 75% (94/125) for the nested, gel-based MSP assay. This high-throughput test provides an important pharmacogenomic tool for individualized management of alkylating agent chemotherapy.
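In the same spirit as the bimodal normal mixture model mentioned above, a cut-off can be derived by fitting a two-component Gaussian mixture to the normalized copy-number values; the sketch below uses simulated data and scikit-learn, and the log transform is our assumption rather than part of the published assay.

```python
# Hedged sketch: two-component Gaussian mixture on simulated log2(MGMT/beta-actin)
# values, with the cut-off placed where posterior membership in the higher
# (methylated) component crosses 0.5.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
log_ratio = np.concatenate([
    rng.normal(loc=-6.0, scale=1.0, size=80),   # essentially unmethylated tumors (simulated)
    rng.normal(loc=-1.0, scale=1.0, size=54),   # methylated tumors (simulated)
]).reshape(-1, 1)

gmm = GaussianMixture(n_components=2, random_state=0).fit(log_ratio)

upper = int(np.argmax(gmm.means_.ravel()))               # higher-mean component
grid = np.linspace(gmm.means_.min(), gmm.means_.max(), 1000).reshape(-1, 1)
post_upper = gmm.predict_proba(grid)[:, upper]
cutoff = float(grid.ravel()[np.argmin(np.abs(post_upper - 0.5))])
print(f"suggested methylation cut-off (log2 ratio): {cutoff:.2f}")
```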
Abstract:
Excessive alcohol consumption represents a major risk factor for morbidity and mortality. It is therefore indispensable to be able to detect at-risk drinking. Ethyl glucuronide (EtG) is a specific marker of alcohol consumption. The determination of ethyl glucuronide in urine or blood can be used to prove recent driving under the influence of alcohol, even if ethanol is no longer detectable. The commercialization of an EtG-specific immunological assay now allows preliminary results to be obtained rapidly and easily with satisfactory sensitivity. Moreover, the detection of ethyl glucuronide in hair offers the opportunity to evaluate alcohol consumption over a long period. The EtG concentration in hair correlates with the amount of ingested alcohol. Thus, the analysis of ethyl glucuronide can be used to monitor abstinence, to detect alcohol relapse and to identify at-risk drinkers. However, a cut-off allowing reliable detection of chronic alcohol abuse does not yet exist. Therefore, it is recommended to perform the analysis of ethyl glucuronide as a complement to the existing blood markers. A study financed by the Swiss Foundation for Alcohol Research is currently being conducted by the West Switzerland University Center of Legal Medicine in order to establish an objective cut-off.