15 results for objective tests

at Université de Lausanne, Switzerland


Abstract:

BACKGROUND: First hospitalisation for a psychotic episode causes intense distress to patients and families, but offers an opportunity to make a diagnosis and start treatment. However, linkage to outpatient psychiatric care remains a notoriously difficult step for young psychotic patients, who frequently interrupt treatment after hospitalisation. Persistent symptoms and untreated psychosis may therefore remain a problem despite hospitalisation and proper diagnosis. With persisting psychotic symptoms, numerous complications may arise: breakdown in relationships, loss of family and social support, loss of employment or interruption of studies, denial of disease, depression, suicide, substance abuse and violence. Understanding the mechanisms that might promote linkage to outpatient psychiatric care is therefore a critical issue, especially in early intervention in psychotic disorders. OBJECTIVE: To study which factors hinder or promote linkage of young psychotic patients to outpatient psychiatric care after a first hospitalisation, in the absence of a vertically integrated program for early psychosis. METHOD: File audit study of all patients aged 18 to 30 who were admitted for the first time to the psychiatric University Hospital of Lausanne in the year 2000. For statistical analysis, chi-square tests were used for categorical variables and t-tests for dimensional variables; p < 0.05 was considered statistically significant. RESULTS: 230 patients aged 18 to 30 were admitted to the Lausanne University psychiatric hospital for the first time during the year 2000, 52 of them (23%) with a diagnosis of psychosis. Patients with psychosis were mostly male (83%, vs 49% of non-psychosis patients). Furthermore, they had (1) a mean duration of stay 10 days longer (24 vs 14 days), (2) a higher rate of compulsory admissions (53% vs 22%), and (3) were more often hospitalised by a psychiatrist rather than by a general practitioner (83% vs 53%). Other socio-demographic and clinical features at admission were similar in the two groups. Among the 52 psychotic patients, 10 did not stay in the catchment area for subsequent treatment. Of the 42 psychotic patients who remained in the catchment area after discharge, 20 (48%) did not attend the scheduled or rescheduled outpatient appointment. None of the socio-demographic characteristics was associated with attendance at outpatient appointments. On the other hand, voluntary admission and suicidal ideation before admission were significantly related to attending the initial appointment. Moreover, some elements of treatment seemed to be associated with a higher likelihood of attending outpatient treatment: (1) provision of information to the patient regarding diagnosis, (2) discussion of the treatment plan between in- and outpatient staff, (3) involvement of the outpatient team during hospitalisation, and (4) elaboration of concrete strategies to meet basic needs, organise daily activities or education, and reach for help in case of need. CONCLUSION: As in other studies, half of the patients admitted for a first psychotic episode failed to link to outpatient psychiatric care. Our study suggests that characteristics of treatment, rather than of patients, play a critical role in this phenomenon.
Development of a partnership and involvement of patients in the decision process, provision of good information regarding the illness, clear definition of the treatment plan, development of concrete strategies to cope with the illness and its potential complications, and involvement of the outpatient treating team already during hospitalisation all emerged as critical strategies to facilitate adherence to outpatient care. While the current rate of disengagement after admission is highly concerning, our findings are encouraging, since they point to strategies that can easily be implemented. An open approach to psychosis, the development of partnerships with patients and better coordination between inpatient and outpatient teams should therefore be among the targets of early intervention programs. These observations might help set priorities when conceptualising new programs and ease the implementation of services that facilitate engagement of patients in treatment during the critical initial phase of psychotic disorders.
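
To make the reported statistics concrete, here is a minimal Python sketch of the two tests named in the METHOD section (chi-square for categorical variables, t-test for dimensional ones). All counts and values are invented for illustration; this is not the study dataset or the authors' analysis code.

    # Chi-square and t-test sketch with hypothetical data (scipy assumed available).
    from scipy import stats

    # Categorical variable: sex by group, as a 2x2 contingency table.
    # Rows: psychosis / non-psychosis; columns: male / female (illustrative counts).
    table = [[43, 9], [87, 91]]
    chi2, p_cat, dof, expected = stats.chi2_contingency(table)

    # Dimensional variable: length of stay in days per group (illustrative values).
    stay_psychosis = [24, 30, 18, 22, 28]
    stay_other = [14, 10, 16, 12, 15]
    t, p_dim = stats.ttest_ind(stay_psychosis, stay_other)

    print(f"chi-square p = {p_cat:.3f}, t-test p = {p_dim:.3f}")  # p < 0.05 read as significant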

Abstract:

OBJECTIVE: To assess the impact of hypertrophy of the future liver remnant (FLR) induced by preoperative portal vein embolization (PVE) on immediate postoperative complications after a standardized major liver resection. SUMMARY BACKGROUND DATA: PVE is usually indicated when the FLR is estimated to be too small for major liver resection. However, few data exist on the exact quantification of the minimal functional hepatic volume required to avoid postoperative complications in patients with or without chronic liver disease. METHODS: All consecutive patients in whom an elective right hepatectomy was feasible and who fulfilled the inclusion and exclusion criteria between 1998 and 2000 were alternately assigned to either immediate surgery or surgery after PVE. Among 55 patients (25 liver metastases, 2 cholangiocarcinomas, and 28 hepatocellular carcinomas), 28 underwent right hepatectomy after PVE and 27 underwent immediate surgery. Twenty-eight patients had chronic liver disease. FLR volume and the estimated rate of functional future liver remnant (%FFLR) were assessed by computed tomography. RESULTS: The mean increases of FLR and %FFLR 4 to 8 weeks after PVE were respectively 44 ± 19% and 16 ± 7% in patients with normal liver and 35 ± 28% and 9 ± 3% in those with chronic liver disease. All patients with normal liver and 86% of those with chronic liver disease experienced hypertrophy after PVE. The postoperative course of patients with normal liver who underwent PVE before right hepatectomy was similar to that of those with immediate surgery. In contrast, PVE in patients with chronic liver disease significantly decreased the incidence of postoperative complications as well as the intensive care unit stay and total hospital stay after right hepatectomy. CONCLUSIONS: Before elective right hepatectomy, hypertrophy of the FLR induced by PVE had no beneficial effect on the postoperative course in patients with normal liver. In contrast, in patients with chronic liver disease, hypertrophy of the FLR induced by PVE significantly decreased the rate of postoperative complications.
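
The volumetric quantities above follow from simple ratios of CT volumes. A hedged sketch with invented volumes, assuming %FFLR is the FLR volume over the total functional liver volume (the precise definition may differ in the original paper):

    # FLR hypertrophy arithmetic with invented CT volumes (mL); not patient data.
    flr_before = 400.0          # FLR volume before portal vein embolization
    flr_after = 560.0           # FLR volume 4-8 weeks after PVE
    total_functional = 1400.0   # assumed total functional liver volume

    increase_pct = (flr_after - flr_before) / flr_before * 100   # hypertrophy, e.g. 40%
    ffr_pct = flr_after / total_functional * 100                 # assumed %FFLR definition
    print(f"FLR increase: {increase_pct:.0f}%, %FFLR: {ffr_pct:.0f}%")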

Abstract:

OBJECTIVE: The study tests the hypothesis that a low daily fat intake may induce a negative fat balance and impair catch-up growth in stunted children between 3 and 9 y of age. DESIGN: Randomized case-control study. SETTING: Three rural villages of the West Kiang District, The Gambia. SUBJECTS: Three groups of 30 stunted but not wasted children (height-for-age z-score ≤ -2.0, weight-for-height z-score ≥ -2.0) 3-9 y of age were selected by anthropometric survey. Groups were matched for age, sex, village, degree of stunting and season. INTERVENTION: Two groups were randomly assigned to be supplemented five days a week for one year with either a high-fat (n = 29) or a high-carbohydrate biscuit (n = 30), each containing approximately 1600 kJ. The third group was a non-supplemented control group (n = 29). Growth, nutritional status, dietary intake, resting energy expenditure and morbidity were compared. RESULTS: Neither the high-fat nor the high-carbohydrate supplement had an effect on weight or height gain. The high-fat supplement did slightly increase adipose tissue mass. There was no effect of supplementation on resting energy expenditure or morbidity. In addition, the annual growth rate was not associated with the morbidity score. CONCLUSIONS: Neither a high-fat nor a high-carbohydrate supplement given for 12 months to stunted Gambian children induced catch-up growth. The authors suggest that an adverse effect of the environment on catch-up growth persists despite the nutritional interventions.
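
The anthropometric selection criteria rest on z-scores against a growth reference. A minimal sketch of a height-for-age z-score; the reference median and SD below are invented placeholders, whereas real analyses look them up in WHO/NCHS reference tables by age and sex:

    # Height-for-age z-score (HAZ); reference values are placeholders, not WHO data.
    def height_for_age_z(height_cm: float, ref_median_cm: float, ref_sd_cm: float) -> float:
        return (height_cm - ref_median_cm) / ref_sd_cm

    z = height_for_age_z(97.0, ref_median_cm=105.0, ref_sd_cm=4.3)
    print(f"HAZ = {z:.2f}")  # HAZ <= -2.0 classifies the child as stunted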

Abstract:

Introduction: Coordination is a strategy chosen by the central nervous system to control movements and maintain stability during gait. Coordinated multi-joint movements require a complex interaction between nervous outputs, biomechanical constraints, and proprioception. Quantitatively understanding and modeling gait coordination remains a challenge, and surgeons lack a way to model and assess the coordination of patients before and after surgery of the lower limbs. Patients alter their gait patterns and their kinematic synergies when they walk faster or slower than normal speed, to maintain their stability and minimize the energy cost of locomotion. The goal of this study was to provide a dynamical-system approach to quantitatively describe human gait coordination and to apply it to patients before and after total knee arthroplasty. Methods: A new method of quantitative analysis of interjoint coordination during gait was designed, providing a general model that captures the whole dynamics and reveals the kinematic synergies at various walking speeds. The proposed model imposed a relationship among lower-limb joint angles (hips and knees) to parameterize the dynamics of locomotion of each individual. An integration of different analysis tools, namely harmonic analysis, principal component analysis, and artificial neural networks, helped overcome the high dimensionality, temporal dependence, and non-linear relationships of the gait patterns. Ten participants were studied using an ambulatory gait device (Physilog®). Each participant was asked to perform two 30 m walking trials at three different speeds and to complete an EQ-5D questionnaire, a WOMAC, and a Knee Society Score. Lower-limb rotations were measured by four miniature angular rate sensors mounted on each shank and thigh. The outcomes of the eight patients undergoing total knee arthroplasty, recorded pre-operatively and post-operatively at 6 weeks, 3 months, 6 months and 1 year, were compared with those of 2 age-matched healthy subjects. Results: The new method provided coordination scores at various walking speeds, ranging from 0 to 10. It determined the overall coordination of the lower limbs as well as the contribution of each joint to the total coordination. The differences between pre-operative and post-operative coordination values correlated with the improvements in the subjective outcome scores. Although the study group was small, the results showed a new way to objectively quantify the gait coordination of patients undergoing total knee arthroplasty using only portable body-fixed sensors. Conclusion: A new method for objective gait coordination analysis has been developed, with very encouraging results regarding the objective assessment of lower limb surgery outcomes.
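
Two stages of the described pipeline, harmonic (Fourier) features of joint-angle cycles followed by PCA, can be sketched as below on synthetic data. The neural-network stage and the exact 0-10 scoring are not reproduced; this illustrates the idea, not the published method's code.

    # Harmonic features + PCA on synthetic hip/knee angle cycles (numpy only).
    import numpy as np

    rng = np.random.default_rng(0)
    n_cycles, n_samples = 40, 100
    t = np.linspace(0, 2 * np.pi, n_samples)

    cycles = []
    for _ in range(n_cycles):
        amp = 1 + rng.normal(0, 0.1)    # per-cycle amplitude scaling (speed effect)
        phase = rng.normal(0, 0.05)     # per-cycle phase shift
        cycle = np.concatenate([
            30 * amp * np.sin(t + phase),                            # left hip (deg)
            60 * amp * np.clip(np.sin(t + phase), 0, None),          # left knee
            30 * amp * np.sin(t + phase + np.pi),                    # right hip
            60 * amp * np.clip(np.sin(t + phase + np.pi), 0, None),  # right knee
        ])
        cycles.append(cycle + rng.normal(0, 1, 4 * n_samples))       # measurement noise
    cycles = np.array(cycles)

    # Keep the first few Fourier coefficients of each cycle as harmonic features.
    n_harmonics = 5
    fft = np.fft.rfft(cycles, axis=1)[:, :n_harmonics]
    features = np.hstack([fft.real, fft.imag])

    # PCA via SVD: variance captured by the leading components reflects how
    # strongly the joint trajectories co-vary, i.e. the kinematic synergy.
    X = features - features.mean(axis=0)
    _, s, _ = np.linalg.svd(X, full_matrices=False)
    explained = s**2 / np.sum(s**2)
    print(f"variance explained by the first 2 components: {explained[:2].sum():.1%}")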

Abstract:

Point-of-care (POC) tests offer potentially substantial benefits for the management of infectious diseases, mainly by shortening the time to result and by making the test available at the bedside or at remote care centres. Commercial POC tests are already widely available for the diagnosis of bacterial and viral infections and for parasitic diseases, including malaria. Infectious diseases specialists and clinical microbiologists should be aware of the indications and limitations of each rapid test, so that they can use them appropriately and correctly interpret their results. The clinical applications and performance of the most relevant and commonly used POC tests are reviewed. Some of these tests exhibit insufficient sensitivity and should therefore be coupled with confirmatory tests when the results are negative (e.g. the Streptococcus pyogenes rapid antigen detection test), whereas the results of others need to be confirmed when positive (e.g. malaria). New molecular-based tests exhibit better sensitivity and specificity than earlier immunochromatographic assays (e.g. for Streptococcus agalactiae detection). In the coming years, further evolution of POC tests may lead to new diagnostic approaches, such as panel testing targeting not just a single pathogen but all the agents suspected in a specific clinical setting. To reach this goal, the development of serology-based and/or molecular-based microarrays/multiplexed tests will be needed. The availability of modern technology and new microfluidic devices will give clinical microbiologists the opportunity to be back at the bedside, proposing a large variety of POC tests that will allow quicker diagnosis and improved patient care.
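
The warning about insufficient sensitivity can be made concrete with Bayes' rule: after a negative result from a low-sensitivity rapid test, the residual disease probability may stay too high to rule out infection, hence the confirmatory test. A sketch with invented performance figures, not the measured characteristics of any particular assay:

    # Post-test probability after a negative rapid test (Bayes' rule); numbers invented.
    def post_test_prob_negative(prevalence: float, sensitivity: float, specificity: float) -> float:
        p_neg_if_disease = 1 - sensitivity
        p_neg = prevalence * p_neg_if_disease + (1 - prevalence) * specificity
        return prevalence * p_neg_if_disease / p_neg

    # E.g. a rapid antigen test with 85% sensitivity and 95% specificity,
    # at a 30% pre-test probability of infection:
    print(f"{post_test_prob_negative(0.30, 0.85, 0.95):.1%}")  # ~6% residual probability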

Abstract:

OBJECTIVE: To investigate the safety and efficacy of 50-Hz repetitive transcranial magnetic stimulation (rTMS) in the treatment of motor symptoms in Parkinson disease (PD). BACKGROUND: Progression of PD is characterized by the emergence of motor deficits that gradually respond less to dopaminergic therapy. rTMS has shown promising results in improving gait, a major cause of disability, and may provide a therapeutic alternative. Prior controlled studies suggest that an increase in stimulation frequency might enhance therapeutic efficacy. METHODS: In this randomized, double-blind, sham-controlled study, the authors investigated the safety and efficacy of 50-Hz rTMS of the motor cortices in 8 sessions over 2 weeks. Assessment of safety and clinical efficacy over a 1-month period included timed tests of gait and bradykinesia, the Unified Parkinson's Disease Rating Scale (UPDRS), and additional clinical, neurophysiological, and neuropsychological parameters. In addition, the safety of 50-Hz rTMS was tested with electromyography-electroencephalography (EMG-EEG) monitoring during and after stimulation. RESULTS: The authors investigated 26 patients with mild to moderate PD: 13 received 50-Hz rTMS and 13 sham stimulation. The 50-Hz rTMS did not improve gait, bradykinesia, or global and motor UPDRS scores, but there was a short-lived "on"-state improvement in activities of daily living (UPDRS II). The 50-Hz rTMS lengthened the cortical silent period, but other neurophysiological and neuropsychological measures remained unchanged. EMG-EEG monitoring recorded no pathological increase of cortical excitability or epileptic activity, and there were no adverse effects. CONCLUSION: 50-Hz rTMS of the motor cortices appears to be safe, but it fails to improve motor performance and functional status in PD. Prolonged stimulation or other rTMS techniques might be more efficacious but would need to be established in future research.

Abstract:

SETTING: An ambulatory paediatric clinic in Lausanne, Switzerland, a country with a significant proportion of tuberculosis (TB) among immigrants. AIM: To assess the factors associated with positive tuberculin skin tests (TST) among children examined during a health check-up or during TB contact tracing, notably the influence of BCG (Bacille Calmette-Guérin) vaccination and history of TB contact. METHOD: A descriptive study of children who had a TST (2 units RT23) between November 2002 and April 2004. Age, sex, history of TB contact, BCG vaccination status, country of origin and birth in or outside Switzerland were recorded. RESULTS: Of 234 children, 176 (75%) had a reaction equal to zero and 31 (13%) tested positive (>10 mm). In a linear regression model, the size of the TST varied significantly according to the history of TB contact, age, TB incidence in the country of origin and BCG vaccination status, but not according to sex or birth in or outside Switzerland. In a logistic regression model including all the recorded variables, age (odds ratio [OR] = 1.21, 95% CI 1.08-1.35), a history of TB contact (OR = 7.31, 95% CI 2.23-24) and the incidence of TB in the country of origin (OR = 1.01, 95% CI 1.00-1.02) were significantly associated with a positive TST, but sex (OR = 1.18, 95% CI 0.50-2.78) and BCG vaccination status (OR = 2.97, 95% CI 0.91-9.72) were not. CONCLUSIONS: TB incidence in the country of origin, BCG vaccination and age influence the TST reaction (size, or proportion of TST ≥ 10 mm). However, the most obvious risk factor for a positive TST is a history of contact with TB.
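
A minimal sketch of the kind of logistic model reported above, on synthetic data; odds ratios are the exponentiated coefficients. Effect sizes are loosely aligned with the abstract's directions, but nothing here reproduces the study data.

    # Logistic regression sketch (synthetic data; statsmodels assumed available).
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    n = 234
    age = rng.uniform(0, 16, n)
    tb_contact = rng.integers(0, 2, n)
    bcg = rng.integers(0, 2, n)
    # Synthetic outcome following the reported effect directions (invented coefficients).
    logit = -4 + 0.19 * age + 2.0 * tb_contact + 0.5 * bcg
    positive_tst = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(float)

    X = sm.add_constant(np.column_stack([age, tb_contact, bcg]))
    fit = sm.Logit(positive_tst, X).fit(disp=0)
    odds_ratios = np.exp(fit.params)   # e.g. OR per year of age, OR for TB contact
    ci_95 = np.exp(fit.conf_int())     # 95% CIs on the odds-ratio scale
    print(odds_ratios, ci_95)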

Abstract:

OBJECTIVE: Both subclinical hypothyroidism and the metabolic syndrome have been associated with increased risk of coronary heart disease events. It is unknown whether the prevalence and incidence of the metabolic syndrome are higher as TSH levels increase, or in individuals with subclinical hypothyroidism. We sought to determine the association between thyroid function and the prevalence and incidence of the metabolic syndrome in a cohort of older adults. DESIGN: Data were analysed from the Health, Aging and Body Composition Study, a prospective cohort of 3075 community-dwelling US adults. PARTICIPANTS: Two thousand one hundred and nineteen participants with measured TSH and data on metabolic syndrome components were included in the analysis. MEASUREMENTS: TSH was measured by immunoassay. Metabolic syndrome was defined per revised ATP III criteria. RESULTS: At baseline, 684 participants met criteria for the metabolic syndrome. At 6-year follow-up, incident metabolic syndrome had developed in 239 individuals. In fully adjusted models, each unit increase in TSH was associated with a 3% increase in the odds of prevalent metabolic syndrome (OR, 1.03; 95% CI, 1.01-1.06; P = 0.02), and the association was stronger for TSH within the normal range (OR, 1.16; 95% CI, 1.03-1.30; P = 0.02). Subclinical hypothyroidism with a TSH > 10 mIU/l was significantly associated with increased odds of prevalent metabolic syndrome (OR, 2.3; 95% CI, 1.0-5.0; P = 0.04); the odds ratio for incident metabolic syndrome was similar (OR, 2.2), but the confidence interval was wide (0.6-7.5). CONCLUSIONS: Higher TSH levels and subclinical hypothyroidism with a TSH > 10 mIU/l are associated with increased odds of prevalent but not incident metabolic syndrome.
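
The "3% increase in odds per unit TSH" reading comes from exponentiating a logistic coefficient; across a k-unit difference, the odds ratio compounds as OR^k. A short arithmetic sketch using only the OR quoted in the abstract, as an interpretive aid rather than part of the published analysis:

    # Interpreting a per-unit odds ratio (OR = 1.03 per 1 mIU/l of TSH, from the abstract).
    import math

    or_per_unit = 1.03
    beta = math.log(or_per_unit)   # logistic coefficient implied by that OR

    for k in (1, 5, 10):           # odds ratio across a k-unit TSH difference
        print(f"TSH difference of {k} mIU/l -> OR = {math.exp(beta * k):.2f}")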

Abstract:

PURPOSE: To investigate the effect of intraocular straylight (IOS) induced by white opacity filters (WOF) on threshold measurements for stimuli employed in three perimeters: standard automated perimetry (SAP), pulsar perimetry (PP) and the Moorfields motion displacement test (MDT). METHODS: Four healthy young observers (24-28 years old) were tested six times with each perimeter, once with each of five different WOFs and once without, inducing various levels of IOS (from 10% to 200%). The increase in IOS was measured with a straylight meter. The change in sensitivity from baseline was normalized, allowing comparison of standardized (z) scores (change divided by the SD of normative values) for each instrument. RESULTS: SAP and PP thresholds were significantly affected (P < 0.001) by moderate to large increases in IOS (50%-200%). The drop from baseline with WOF 5 was approximately 5 dB in both SAP and PP, which represents a clinically significant loss; in contrast, the change in motion displacement threshold with the MDT was on average 1 minute of arc, which is not likely to indicate a clinically significant loss. CONCLUSIONS: The Moorfields MDT is more robust to the effects of additional straylight than SAP or PP.
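
The normalization described, change from baseline divided by the SD of normative values, makes the three instruments' effects comparable despite their different units (dB vs minutes of arc). A sketch with invented numbers:

    # Standardized (z) score of a threshold change; all values are invented placeholders.
    def z_score(baseline: float, with_filter: float, normative_sd: float) -> float:
        return (with_filter - baseline) / normative_sd

    z_sap = z_score(baseline=30.0, with_filter=25.0, normative_sd=2.0)  # SAP, dB
    z_mdt = z_score(baseline=2.0, with_filter=3.0, normative_sd=1.5)    # MDT, min of arc
    print(f"SAP z = {z_sap:+.1f}, MDT z = {z_mdt:+.1f}")  # unit-free, hence comparable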

Abstract:

CONTEXT: In populations of older adults, prediction of coronary heart disease (CHD) events through traditional risk factors is less accurate than in middle-aged adults. Electrocardiographic (ECG) abnormalities are common in older adults and might be of value for CHD prediction. OBJECTIVE: To determine whether baseline ECG abnormalities or development of new and persistent ECG abnormalities are associated with increased CHD events. DESIGN, SETTING, AND PARTICIPANTS: A population-based study of 2192 white and black older adults aged 70 to 79 years from the Health, Aging, and Body Composition Study (Health ABC Study) without known cardiovascular disease. Adjudicated CHD events were collected over 8 years, between 1997-1998 and 2006-2007. Baseline and 4-year ECG abnormalities were classified according to the Minnesota Code as major or minor. Using Cox proportional hazards regression models, the addition of ECG abnormalities to traditional risk factors was examined for the prediction of CHD events. MAIN OUTCOME MEASURE: Adjudicated CHD events (acute myocardial infarction [MI], CHD death, and hospitalization for angina or coronary revascularization). RESULTS: At baseline, 276 participants (13%) had minor and 506 (23%) had major ECG abnormalities. During follow-up, 351 participants had CHD events (96 CHD deaths, 101 acute MIs, and 154 hospitalizations for angina or coronary revascularization). Both minor and major baseline ECG abnormalities were associated with an increased risk of CHD after adjustment for traditional risk factors (17.2 events per 1000 person-years among those with no abnormalities; 29.3 per 1000 person-years with minor abnormalities: hazard ratio [HR], 1.35; 95% CI, 1.02-1.81; and 31.6 per 1000 person-years with major abnormalities: HR, 1.51; 95% CI, 1.20-1.90). When ECG abnormalities were added to a model containing traditional risk factors alone, 13.6% of intermediate-risk participants with both major and minor ECG abnormalities were correctly reclassified (overall net reclassification improvement [NRI], 7.4%; 95% CI, 3.1%-19.0%; integrated discrimination improvement, 0.99%; 95% CI, 0.32%-2.15%). After 4 years, 208 participants had new and 416 had persistent abnormalities. Both new and persistent ECG abnormalities were associated with an increased risk of subsequent CHD events (HR, 2.01; 95% CI, 1.33-3.02; and HR, 1.66; 95% CI, 1.18-2.34; respectively). When added to the Framingham Risk Score, the NRI was not significant (5.7%; 95% CI, -0.4% to 11.8%). CONCLUSIONS: Major and minor ECG abnormalities in older adults were associated with an increased risk of CHD events. Depending on the model, adding ECG abnormalities was associated with improved risk prediction beyond traditional risk factors.
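
A minimal sketch of the Cox proportional-hazards setup underlying the reported HRs, on a synthetic data frame; hazard ratios are the exponentiated coefficients. The lifelines package is assumed available, and all data-generating numbers are invented.

    # Cox proportional-hazards sketch (synthetic data, lifelines assumed available).
    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(2)
    n = 500
    df = pd.DataFrame({
        "ecg_major": rng.integers(0, 2, n),            # major ECG abnormality at baseline
        "age": rng.uniform(70, 79, n),
        "follow_up_years": rng.exponential(6, n).clip(0.1, 8.0),
    })
    df["chd_event"] = (rng.random(n) < 0.15 + 0.08 * df["ecg_major"]).astype(int)

    cph = CoxPHFitter()
    cph.fit(df, duration_col="follow_up_years", event_col="chd_event")
    print(cph.hazard_ratios_)   # exp(coef), e.g. the HR for a major ECG abnormality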

Abstract:

Numerous studies have shown an increase in aptitude-test scores across generations (the "Flynn effect"). Various biological, social and/or educational hypotheses have been put forward to explain this phenomenon. The objective of this research is to examine the evolution of performance on aptitude tests on the basis of norms established in 1991 and in 2002. The results suggest a non-homogeneous reversal of the Flynn effect. The decline concerns in particular tests of scholastic aptitudes, such as those assessing the verbal and numerical factors. This study may reflect a change in the importance accorded to the various aptitudes that are little assessed in educational and vocational guidance.

Abstract:

Despite a low positive predictive value, diagnostic tests such as the complete blood count (CBC) and C-reactive protein (CRP) are commonly used to evaluate whether infants with risk factors for early-onset neonatal sepsis (EOS) should be treated with antibiotics. We investigated the impact of implementing a protocol aimed at reducing the number of diagnostic tests in infants with risk factors for EOS, in order to compare the diagnostic performance of repeated clinical examination with that of CBC and CRP measurement. The primary outcome was the time between birth and the first dose of antibiotics in infants treated for suspected EOS. Among the 11,503 infants born at ≥35 weeks during the study period, 222 were treated with antibiotics for suspected EOS. The proportion of infants receiving antibiotics for suspected EOS was 2.1% before and 1.7% after the change of protocol (p = 0.09). Reduction of diagnostic tests was associated with earlier antibiotic treatment in infants treated for suspected EOS (hazard ratio 1.58; 95% confidence interval [CI] 1.20-2.07; p < 0.001) and in infants with neonatal infection (hazard ratio 2.20; 95% CI 1.19-4.06; p = 0.01). There was no difference in the duration of hospital stay or in the proportion of infants requiring respiratory or cardiovascular support before and after the change of protocol. Reducing diagnostic tests such as CBC and CRP does not delay the initiation of antibiotic treatment in infants with suspected EOS. The importance of clinical examination in infants with risk factors for EOS should be emphasised.
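
The primary outcome is a time-to-event comparison (birth to first antibiotic dose, before vs after the protocol change). The published analysis reports hazard ratios; purely as an illustration of such a comparison, here is a log-rank sketch on synthetic times, with the lifelines package assumed available:

    # Time-to-first-antibiotics comparison on synthetic data (hours are invented).
    import numpy as np
    from lifelines.statistics import logrank_test

    rng = np.random.default_rng(3)
    hours_before = rng.exponential(6.0, 120)   # time to first dose, old protocol
    hours_after = rng.exponential(4.0, 100)    # time to first dose, reduced-testing protocol

    result = logrank_test(hours_before, hours_after)  # all events observed, no censoring
    print(f"log-rank p = {result.p_value:.3f}")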

Abstract:

BACKGROUND: In Switzerland, patients may undergo "blood tests" without being informed what these are screening for. Inadequate doctor-patient communication may result in patient misunderstanding. We examined what patients in the emergency department (ED) believed they had been screened for and explored their attitudes to routine (non-targeted) human immunodeficiency virus (HIV) screening. METHODS: Between 1 October 2012 and 28 February 2013, a questionnaire-based survey was conducted among patients aged 16-70 years presenting to the ED of Lausanne University Hospital. Patients were asked: (1) if they believed they had been screened for HIV; (2) if they agreed in principle to routine HIV screening; and (3) if they agreed to be HIV tested during their current ED visit. RESULTS: Of 466 eligible patients, 411 (88%) agreed to participate. Mean age was 46 ± 16 years; 192 patients (47%) were women; 366 (89%) were Swiss or European; 113 (27%) believed they had been screened for HIV, the proportion increasing with age (p ≤ 0.01); 297 (72%) agreed in principle with routine HIV testing in the ED; and 138 patients (34%) agreed to be HIV tested during their current ED visit. CONCLUSION: In this ED population, 27% incorrectly believed they had been screened for HIV. Over 70% agreed in principle with routine HIV testing and 34% agreed to be tested during their current visit. These results demonstrate patients' willingness regarding routine HIV testing in the ED and highlight the need for improved doctor-patient communication about what a blood test specifically screens for.

Abstract:

OBJECTIVE: The prevalence of ragweed allergy is increasing worldwide. Ragweed distribution and abundance are spreading in Europe across a wide area ranging from the Rhone valley in France to Hungary and Ukraine, where prevalence can peak as high as 12%. Low-grade ragweed colonisation was seen in Geneva and Ticino less than two decades ago, and there were fears that allergies to ragweed would increase in Switzerland. The intent of this study was to assess the prevalence of sensitisation and allergy to ragweed in the population living in the first rural Swiss setting where ragweed had been identified in 1996, and to evaluate indirectly the efficacy of elimination and containment strategies. MATERIAL AND METHODS: In 2009, 35 adults in a rural village in the Canton of Geneva were recruited. Data were collected by means of questionnaires, and skin-prick tests were done on each participant. The study was approved by the local Ethics Committee. RESULTS: Based on the questionnaires, 48.6% had rhinitis (95% confidence interval [CI] 32.9-64.4; n = 17/35) and 17.1% asthma (95% CI 8.1-32.6; n = 6/35). Atopy was diagnosed in 26.4% (95% CI 12.9-44.4) of the sample (n = 9/34). Ragweed sensitisation was found in 2.9% (95% CI 0.7-19.7; n = 1/34), mugwort sensitisation in 2.9% (95% CI 0.1-14.9; n = 1/35), alder sensitisation in 17.1% (95% CI 6.6-33.6; n = 6/35), ash sensitisation in 12.5% (95% CI 3.5-29.0; n = 4/32) and grass sensitisation in 22.9% (95% CI 10.4-40.1; n = 8/35). Ragweed allergy (95% CI 0.1-14.9; n = 1/34) and mugwort allergy (95% CI 0.1-14.9; n = 1/35) were each found in 2.9% of the population. CONCLUSION: This study showed a surprisingly low prevalence of ragweed sensitisation and allergy (2.9% each), 20 years after the first ragweed detection in Geneva. The feared rise in ragweed allergy appears not to have happened in Switzerland, in contrast to other ragweed-colonised countries. These results strongly support early field strategies against ragweed.
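
The wide intervals around small counts are typical of exact binomial confidence limits. A sketch using the Clopper-Pearson method, which for 1 positive of 35 (the mugwort figure) reproduces an interval of roughly 0.1%-14.9%; whether the authors used this exact method is an assumption:

    # Exact (Clopper-Pearson) binomial CI for a prevalence estimate (statsmodels assumed).
    from statsmodels.stats.proportion import proportion_confint

    count, n = 1, 35   # e.g. one mugwort-sensitised participant of 35
    low, high = proportion_confint(count, n, alpha=0.05, method="beta")  # "beta" = Clopper-Pearson
    print(f"prevalence {count / n:.1%}, 95% CI {low:.1%}-{high:.1%}")    # ~0.1%-14.9%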

Abstract:

Objective: Frequent Emergency Department (ED) users are vulnerable individuals, and discrimination is usually associated with increased vulnerability. The aim of this study was to investigate frequent ED users' perceptions of discrimination and to test whether they were associated with increased vulnerability. Methods: In total, 250 adult frequent ED users were interviewed at Lausanne University Hospital. Using a previously published questionnaire, we assessed 15 dichotomous sources of perceived discrimination. Vulnerability was assessed using health status: objective health status (evaluation by a healthcare practitioner covering somatic, mental health, behavioral and social issues; dichotomous variables) and subjective health status (self-evaluation including health-related quality of life [WHOQOL] and quality of life [EUROQOL]; mean scores). We computed the prevalence rates of perceived discrimination and tested associations between perceived discrimination and health status (Fisher's exact tests, Mann-Whitney U-tests).
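
A minimal sketch of the two tests named above, on invented data: Fisher's exact test for the dichotomous health-status variables and the Mann-Whitney U-test for the mean quality-of-life scores.

    # Fisher's exact and Mann-Whitney U-test sketch (scipy; all values invented).
    from scipy import stats

    # Perceived discrimination (yes/no) vs a dichotomous health problem: 2x2 counts.
    table = [[40, 60], [25, 125]]
    odds_ratio, p_fisher = stats.fisher_exact(table)

    # Quality-of-life scores in the two groups (illustrative values).
    qol_discriminated = [45, 50, 38, 60, 55, 42]
    qol_not_discriminated = [70, 65, 58, 72, 68, 61]
    u, p_mw = stats.mannwhitneyu(qol_discriminated, qol_not_discriminated)

    print(f"Fisher p = {p_fisher:.3f}, Mann-Whitney p = {p_mw:.3f}")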