39 results for starting
at Université de Lausanne, Switzerland
Abstract:
BACKGROUND: The factors that contribute to increasing obesity rates in human immunodeficiency virus (HIV)-positive persons and to body mass index (BMI) increase that typically occurs after starting antiretroviral therapy (ART) are incompletely characterized. METHODS: We describe BMI trends in the entire Swiss HIV Cohort Study (SHCS) population and investigate the effects of demographics, HIV-related factors, and ART on BMI change in participants with data available before and 4 years after first starting ART. RESULTS: In the SHCS, overweight/obesity prevalence increased from 13% in 1990 (n = 1641) to 38% in 2012 (n = 8150). In the participants starting ART (n = 1601), mean BMI increase was 0.92 kg/m² per year (95% confidence interval, .83-1.0) during year 0-1 and 0.31 kg/m² per year (0.29-0.34) during years 1-4. In multivariable analyses, annualized BMI change during year 0-1 was associated with older age (0.15 [0.06-0.24] kg/m²) and CD4 nadir <199 cells/µL compared to nadir >350 (P < .001). Annualized BMI change during years 1-4 was associated with CD4 nadir <100 cells/µL compared to nadir >350 (P = .001) and black compared to white ethnicity (0.28 [0.16-0.37] kg/m²). Individual ART combinations differed little in their contribution to BMI change. CONCLUSIONS: Increasing obesity rates in the SHCS over time occurred at the same time as aging of the SHCS population, demographic changes, earlier ART start, and increasingly widespread ART coverage. Body mass index increase after ART start was typically biphasic, the BMI increase in year 0-1 being as large as the increase in years 1-4 combined. The effect of ART regimen on BMI change was limited.
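The annualized changes above follow from the standard definition of BMI; below is a minimal illustrative sketch (not the SHCS analysis code, and using made-up weights and heights) of how such a yearly change is computed.

```python
# Illustrative sketch only: annualized BMI change between two visits,
# using BMI = weight (kg) / height (m)^2. Numbers are hypothetical.

def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index in kg/m^2."""
    return weight_kg / height_m ** 2

def annualized_bmi_change(weight0_kg: float, weight1_kg: float,
                          height_m: float, years_between: float) -> float:
    """Change in BMI per year between two measurements."""
    return (bmi(weight1_kg, height_m) - bmi(weight0_kg, height_m)) / years_between

# Example: 75 kg -> 78 kg over one year at 1.75 m gives ~0.98 kg/m^2 per year,
# in the range reported above for year 0-1 after starting ART.
print(round(annualized_bmi_change(75, 78, 1.75, 1.0), 2))
```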
Abstract:
BACKGROUND: Estimates of drug resistance incidence to modern first-line combination antiretroviral therapies against human immunodeficiency virus (HIV) type 1 are complicated by limited availability of genotypic drug resistance tests (GRTs) and uncertain timing of resistance emergence. METHODS: Five first-line combinations were studied (all paired with lamivudine or emtricitabine): efavirenz (EFV) plus zidovudine (AZT) (n = 524); EFV plus tenofovir (TDF) (n = 615); lopinavir (LPV) plus AZT (n = 573); LPV plus TDF (n = 301); and ritonavir-boosted atazanavir (ATZ/r) plus TDF (n = 250). Virological treatment outcomes were classified into 3 risk strata for emergence of resistance, based on whether undetectable HIV RNA levels were maintained during therapy and, if not, whether viral loads were >500 copies/mL during treatment. Probabilities for presence of resistance mutations were estimated from GRTs (n = 2876) according to risk stratum and therapy received at time of testing. On the basis of these data, events of resistance emergence were imputed for each individual and were assessed using survival analysis. Imputation was repeated 100 times, and results were summarized by median values (2.5th-97.5th percentile range). RESULTS: Six years after treatment initiation, EFV plus AZT showed the highest cumulative resistance incidence (16%) of all regimens (<11%). Confounder-adjusted Cox regression confirmed that first-line EFV plus AZT (reference) was associated with a higher median hazard for resistance emergence, compared with other treatments: EFV plus TDF (hazard ratio [HR], 0.57; range, 0.42-0.76), LPV plus AZT (HR, 0.63; range, 0.45-0.89), LPV plus TDF (HR, 0.55; range, 0.33-0.83), ATZ/r plus TDF (HR, 0.43; range, 0.17-0.83). Two-thirds of resistance events were associated with detectable HIV RNA level ≤500 copies/mL during treatment, and only one-third with virological failure (HIV RNA level, >500 copies/mL). CONCLUSIONS: The inclusion of TDF instead of AZT and ATZ/r was correlated with lower rates of resistance emergence, most likely because of improved tolerability and pharmacokinetics resulting from a once-daily dosage.
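A minimal sketch of the imputation scheme described in the methods, using hypothetical stratum probabilities and cohort composition rather than the study data: each individual's unobserved resistance event is drawn from a probability attached to their virological risk stratum, the imputation is repeated 100 times, and the result is summarized by the median and the 2.5th-97.5th percentile range.

```python
# Sketch of multiple imputation of unobserved resistance events
# (hypothetical numbers, not the cohort data).
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical per-stratum probabilities of harbouring resistance mutations.
p_resistance = {"suppressed": 0.02, "low_level_viraemia": 0.15, "failure": 0.50}

# Hypothetical cohort: a risk-stratum label for each of 1000 patients.
strata = rng.choice(list(p_resistance), size=1000, p=[0.7, 0.2, 0.1])
probs = np.array([p_resistance[s] for s in strata])

# 100 imputations of the (unobserved) resistance indicator per patient.
incidences = []
for _ in range(100):
    events = rng.random(probs.size) < probs   # impute event yes/no
    incidences.append(events.mean())          # cumulative incidence this round

print("median incidence:", np.median(incidences))
print("2.5th-97.5th percentile range:", np.percentile(incidences, [2.5, 97.5]))
```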
Abstract:
In societies with strong multigenerational links, economic uncertainty results in couples stopping at one child, sometimes in association with postponement of first births (e.g. Italy) and sometimes with early childbearing (e.g. Bulgaria). The interplay of intergenerational family practices in lowest-low fertility contexts is likely to play a role in differences in the timing of parenthood. In this paper, we focus on the phenomenon of women in Bulgaria who have one child in their early twenties and do not intend to have a second child. We argue that the key to this process is the persistence of extended multigenerational households in the Bulgarian context and their effect on young couples' fertility decision making. We use semi-structured interview data from the project Fertility Choices in Central and Eastern Europe together with ethnographic fieldnotes. The interviews were collected from a sample of 22 couples resident in Sofia, representing different permutations of educational level, marital status and number of children (0 or 1). The four-year ethnographic fieldwork was conducted in both rural and urban Bulgaria between 1997 and 2009. Results suggest that as long as the economic situation remains dire and young Bulgarians remain cynical about their future, multigenerational households represent the accepted pathway into parenthood for young families.
Abstract:
Background: Although CD4 cell count monitoring is used to decide when to start antiretroviral therapy in patients with HIV-1 infection, there are no evidence-based recommendations regarding its optimal frequency. It is common practice to monitor every 3 to 6 months, often coupled with viral load monitoring. We developed rules to guide the frequency of CD4 cell count monitoring in HIV infection before starting antiretroviral therapy, which we validated retrospectively in patients from the Swiss HIV Cohort Study. Methodology/Principal Findings: We built two prediction rules ("Snap-shot rule" for a single sample and "Track-shot rule" for multiple determinations) based on a systematic review of published longitudinal analyses of CD4 cell count trajectories. We applied the rules in 2608 untreated patients to classify their 18 061 CD4 counts as either justifiable or superfluous, according to their prior ≥5% or <5% chance of meeting predetermined thresholds for starting treatment. The percentage of measurements that both rules falsely deemed superfluous never exceeded 5%. Superfluous CD4 determinations represented 4%, 11%, and 39% of all actual determinations for treatment thresholds of 500, 350, and 200×10⁶/L, respectively. The Track-shot rule was only marginally superior to the Snap-shot rule. Both rules lose usefulness as CD4 counts approach the treatment threshold. Conclusions/Significance: Frequent CD4 count monitoring of patients with CD4 counts well above the threshold for initiating therapy is unlikely to identify patients who require therapy. It appears sufficient to measure the CD4 cell count 1 year after a count >650 for a threshold of 200, >900 for 350, or >1150 for 500×10⁶/L, respectively. When CD4 counts fall below these limits, increased monitoring frequency becomes advisable. These rules offer guidance for efficient CD4 monitoring, particularly in resource-limited settings.
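A minimal sketch of the single-sample ("Snap-shot") logic stated in the conclusions, using the quoted cut-offs of >650, >900 and >1150 ×10⁶/L for treatment thresholds of 200, 350 and 500; the 3-month fallback interval is an assumption taken from the "every 3 to 6 months" common practice mentioned above, not part of the published rule.

```python
# Sketch of the Snap-shot idea (illustrative only, not clinical guidance):
# if the current CD4 count is far enough above the treatment threshold,
# the next measurement can wait about a year; otherwise monitor sooner.
# Cut-offs are those quoted in the abstract (cells x 10^6/L).
SNAPSHOT_CUTOFF = {200: 650, 350: 900, 500: 1150}

def next_cd4_interval_months(current_cd4: int, treatment_threshold: int) -> int:
    """Suggested months until the next CD4 measurement."""
    cutoff = SNAPSHOT_CUTOFF[treatment_threshold]
    # 3 months is an assumed fallback within the 3-6 month range noted above.
    return 12 if current_cd4 > cutoff else 3

# Example: a count of 700 with a treatment threshold of 350 is below the
# 900 cut-off, so closer monitoring is suggested.
print(next_cd4_interval_months(700, 350))   # -> 3
print(next_cd4_interval_months(1000, 350))  # -> 12
```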
Abstract:
Despite increasing numbers of women attaining higher-level academic degrees, gender disparities remain in higher education and among university faculties. Some have posited that this may stem from inadequate academic identity development of women at the doctoral level. While existing gender differences may stem from multiple and variable origins, mentoring has been proposed as a viable means to promote academic identity development and address these gender gaps. This study used a qualitative, narrative case study design to evaluate "StartingDoc", a structured mentoring program launched across Swiss universities aimed at promoting networking and academic identity development among female doctoral students. Herein we describe the 9 emergent themes that arose from the small-group mentoring program and suggest that such an approach is both feasible and beneficial for young female academics. Further work is needed to elucidate the most effective strategies for developing and retaining women in academia.
Abstract:
BACKGROUND: Tenofovir is associated with reduced renal function, but it is not clear whether there is a greater decline in renal function when tenofovir is co-administered with a boosted protease inhibitor rather than with a nonnucleoside reverse transcriptase inhibitor (NNRTI). METHODS: We calculated the estimated glomerular filtration rate (eGFR) for patients in the Swiss HIV Cohort Study. We estimated the difference in eGFR over time between first therapies containing tenofovir and either the NNRTI efavirenz or the protease inhibitors lopinavir (LPV/r) or atazanavir (ATV/r), both boosted with ritonavir. RESULTS: Patients on a first therapy of tenofovir co-administered with efavirenz (n = 484), LPV/r (n = 269) and ATV/r (n = 187) were followed for a median of 1.7, 1.2 and 1.3 years, respectively. Relative to tenofovir and efavirenz, the estimated difference in eGFR for tenofovir and LPV/r was -2.6 ml/min per 1.73 m² [95% confidence interval (CI) -7.3 to 2.2] during the first 6 months of therapy, then followed by a difference of 0.0 ml/min per 1.73 m² (95% CI -1.1 to 1.1) for each additional 6 months of therapy. Relative to tenofovir and efavirenz, the estimated difference in eGFR for tenofovir and ATV/r was -7.6 ml/min per 1.73 m² (95% CI -11.8 to -3.4) during the first 6 months of therapy, then followed by a difference of -0.5 ml/min per 1.73 m² (95% CI -1.6 to 0.7) for each additional 6 months of therapy. CONCLUSION: Tenofovir with either boosted protease inhibitor leads to a greater initial decline in eGFR than tenofovir with efavirenz; this decline may be worse with ATV/r than with LPV/r.
Abstract:
BACKGROUND: Adverse effects of combination antiretroviral therapy (CART) commonly result in treatment modification and poor adherence. METHODS: We investigated predictors of toxicity-related treatment modification during the first year of CART in 1318 antiretroviral-naive human immunodeficiency virus (HIV)-infected individuals from the Swiss HIV Cohort Study who began treatment between January 1, 2005, and June 30, 2008. RESULTS: The total rate of treatment modification was 41.5 (95% confidence interval [CI], 37.6-45.8) per 100 person-years. Of these, switches or discontinuations because of drug toxicity occurred at a rate of 22.4 (95% CI, 19.5-25.6) per 100 person-years. The most frequent toxic effects were gastrointestinal tract intolerance (28.9%), hypersensitivity (18.3%), central nervous system adverse events (17.3%), and hepatic events (11.5%). In the multivariate analysis, combined zidovudine and lamivudine (hazard ratio [HR], 2.71 [95% CI, 1.95-3.83]; P < .001), nevirapine (1.95 [1.01-3.81]; P = .050), comedication for an opportunistic infection (2.24 [1.19-4.21]; P = .01), advanced age (1.21 [1.03-1.40] per 10-year increase; P = .02), female sex (1.68 [1.14-2.48]; P = .009), nonwhite ethnicity (1.71 [1.18-2.47]; P = .005), higher baseline CD4 cell count (1.19 [1.10-1.28] per 100/µL increase; P < .001), and HIV-RNA of more than 5.0 log₁₀ copies/mL (1.47 [1.10-1.97]; P = .009) were associated with higher rates of treatment modification. Almost 90% of individuals with treatment-limiting toxic effects were switched to a new regimen, and 85% achieved virologic suppression to less than 50 copies/mL at 12 months compared with 87% of those continuing CART (P = .56). CONCLUSIONS: Drug toxicity remains a frequent reason for treatment modification; however, it does not affect treatment success. Close monitoring and management of adverse effects and drug-drug interactions are crucial for the durability of CART.
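The rates quoted above are events per 100 person-years; a small sketch, with illustrative numbers rather than the cohort data, of how such a rate and an exact Poisson (Garwood) confidence interval can be computed.

```python
# Sketch of an event rate per 100 person-years with an exact Poisson
# (Garwood) confidence interval; the counts below are illustrative.
from scipy.stats import chi2

def rate_per_100py(events: int, person_years: float, alpha: float = 0.05):
    """Rate per 100 person-years with an exact Poisson CI."""
    lower = 0.5 * chi2.ppf(alpha / 2, 2 * events) if events > 0 else 0.0
    upper = 0.5 * chi2.ppf(1 - alpha / 2, 2 * (events + 1))
    scale = 100.0 / person_years
    return events * scale, lower * scale, upper * scale

# Example: 250 toxicity-related modifications over 1100 person-years.
rate, lo, hi = rate_per_100py(250, 1100)
print(f"{rate:.1f} per 100 PY (95% CI {lo:.1f}-{hi:.1f})")
```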
Abstract:
Invasive candidiasis is associated with high mortality rates (35% to 60%), similar to the range reported for septic shock. The most common types include candidemia, frequently observed in immunocompromised patients, and noncandidemic systemic candidiasis, which constitutes the majority of cases in critically ill patients. However, these infections are difficult to prove, and a definite diagnosis usually occurs late in the course of the disease, contributing to their poor prognosis. Early empirical treatment improves the prognosis and currently relies on the positive predictive value (PPV) of risk-assessment strategies (colonization index, Candida score, predictive rules) based on combinations of risk factors, but it may also have contributed substantially to the overuse of antifungal agents in critically ill patients. In this context, non-culture-based diagnostic methods, including specific and nonspecific biomarkers, may significantly improve the diagnosis of invasive candidiasis. Candida DNA and mannan antigen/antimannan antibodies are of limited interest for the diagnosis of invasive candidiasis, as they fail to identify noncandidemic systemic candidiasis despite early positivity in candidemic patients. The utility of 1,3-β-D-glucan (β-D-glucan), a panfungal cell wall antigen, has been demonstrated for the diagnosis of fungal infections in immunocompromised patients. Preliminary data suggest that it is also detectable early in critically ill patients developing noncandidemic systemic candidiasis. To take advantage of the high negative predictive value of risk-assessment strategies and the early increase in specific fungal biomarkers in high-risk patients, we propose a practical 2-step approach to improve the selection of patients likely to benefit from empirical antifungal treatment.
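A small worked sketch, with hypothetical sensitivity, specificity and prevalence, of why risk-assessment strategies tend to show a high negative predictive value but only a modest PPV at the low prevalence of invasive candidiasis seen in critically ill patients.

```python
# Sketch of PPV/NPV from test characteristics and disease prevalence.
# Sensitivity, specificity and prevalence below are hypothetical.

def ppv_npv(sensitivity: float, specificity: float, prevalence: float):
    """Positive and negative predictive value of a test or risk score."""
    tp = sensitivity * prevalence
    fp = (1 - specificity) * (1 - prevalence)
    tn = specificity * (1 - prevalence)
    fn = (1 - sensitivity) * prevalence
    return tp / (tp + fp), tn / (tn + fn)

ppv, npv = ppv_npv(sensitivity=0.80, specificity=0.70, prevalence=0.10)
print(f"PPV={ppv:.2f}, NPV={npv:.2f}")  # ~0.23 vs ~0.97
```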
Abstract:
In recent years major technical advancements greatly supported the development of mass spectrometry (MS)-based proteomics. Currently, this technique can efficiently detect, identify and quantify thousands of proteins.
However, it is not yet sufficiently powerful to provide a comprehensive analysis of the proteome changes correlated with biological phenomena. The aim of our project was the development of a new strategy for the specific detection and quantification of proteome variations based on measurements of protein synthesis rather than total protein amounts. The rationale for this approach was that changes in protein synthesis more closely reflect dynamic cellular responses than changes in total protein concentrations. Our starting idea was to couple "pulsed" stable-isotope labeling of proteins with a specific MS acquisition method based on precursor ion scan (PIS), to specifically detect proteins that incorporated the label and to simultaneously estimate their abundance relative to the unlabeled protein isoform. Such an approach could highlight proteins with the highest synthesis rate in a given time frame, including proteins specifically up-regulated by a given biological stimulus. As a first step, we tested different isotope-labeled amino acids in combination with dedicated PIS methods and showed that this leads to specific detection of labeled proteins. Sensitivity, however, turned out to be lower than that of an untargeted analysis run on a more recent instrument, owing to MS hardware limitations (Chapter 2.1). We next used metabolic labeling to distinguish proteins of cellular origin from a high background of unlabeled (serum) proteins, for the differential analysis of two serum-containing culture media conditioned by labeled human cancer cells (Chapter 2.2). As a parallel project we developed a new quantification method (named ISIS), which uses pairs of stable-isotope labeled amino acids able to produce specific reporter ions that can be used for relative quantification. The ISIS method was applied to the analysis of two fully, yet differentially, labeled cancer cell lines, as described in Chapter 2.3. Next, in line with the original purpose of this thesis, we used a "pulsed" variant of ISIS to detect proteome changes in HeLa cells after infection with human Herpes Simplex Virus-1 (Chapter 2.4). This virus is known to repress the synthesis of host cell proteins to exploit the translation machinery for the massive production of virions. As expected, high synthesis rates were measured for the detected viral proteins, confirming their up-regulation. Moreover, we identified a number of human proteins whose synthesis/degradation ratio (S/D) was affected by the viral infection and which could provide clues on the strategies used by the virus to hijack the cellular machinery. Overall, in this work we showed that metabolic labeling can be employed in alternative ways to investigate poorly explored dimensions of proteomics.
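A minimal sketch, with made-up intensities, of the quantity pulsed stable-isotope labeling gives access to: after the pulse, the heavy (labeled) fraction of a protein's signal approximates the share of that protein synthesized during the pulse.

```python
# Sketch of the pulsed-labeling readout: heavy signal ~ newly synthesized
# protein, light signal ~ pre-existing protein. Intensities are made up.

def newly_synthesized_fraction(heavy_intensity: float, light_intensity: float) -> float:
    """Fraction of the protein pool synthesized during the labeling pulse."""
    return heavy_intensity / (heavy_intensity + light_intensity)

# A viral protein expressed only after infection is almost fully labeled,
# while a repressed host protein shows little new synthesis.
print(newly_synthesized_fraction(9.0e6, 1.0e5))  # ~0.99
print(newly_synthesized_fraction(2.0e5, 4.0e6))  # ~0.05
```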
Abstract:
Background: Reduced renal function has been reported with tenofovir disoproxil fumarate (TDF). It is not clear whether TDF co-administered with a boosted protease inhibitor (PI) leads to a greater decline in renal function than TDF co-administered with a non-nucleoside reverse transcriptase inhibitor (NNRTI). Methods: We selected all antiretroviral therapy-naive patients in the Swiss HIV Cohort Study (SHCS) with calibrated or corrected serum creatinine measurements starting antiretroviral therapy with TDF and either efavirenz (EFV) or the ritonavir-boosted PIs lopinavir (LPV/r) or atazanavir (ATV/r). As a measure of renal function, we used the Chronic Kidney Disease Epidemiology Collaboration (CKD-EPI) equation to estimate the glomerular filtration rate (eGFR). We calculated the difference in eGFR over time between two therapies using a marginal model for repeated measures. In weighted analyses, observations were weighted by the product of their point-of-treatment and censoring weights to adjust for differences both in the sort of patients starting each therapy and in the sort of patients remaining on each therapy over time. Results: By March 2011, 940 patients with at least one creatinine measurement on a first therapy with either TDF and EFV (n=484), TDF and LPV/r (n=269) or TDF and ATV/r (n=187) had been followed for a median of 1.7, 1.2 and 1.3 years, respectively. Table 1 shows the difference in average estimated GFR (eGFR) over time since starting cART for two marginal models. The first model was not adjusted for potential confounders; the second model used weights to adjust for confounders. The results suggest a greater decline in renal function during the first 6 months if TDF is used with a PI rather than with an NNRTI, but no further difference between these therapies after the first 6 months. TDF and ATV/r may lead to a greater decline in the first 6 months than TDF and LPV/r. Conclusions: TDF co-administered with a boosted PI leads to a greater decline in renal function over the first 6 months of therapy than TDF co-administered with an NNRTI; this decline may be worse with ATV/r than with LPV/r.
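A sketch of the 2009 CKD-EPI creatinine equation named in the methods, with constants as commonly published (Levey et al. 2009) rather than taken from this abstract; any reuse should be checked against the study's own implementation.

```python
# Sketch of the CKD-EPI 2009 creatinine equation (constants from the
# published equation, not from this abstract). Creatinine in mg/dL.

def ckd_epi_egfr(scr_mg_dl: float, age: int, female: bool, black: bool) -> float:
    """Estimated GFR in mL/min per 1.73 m^2 (CKD-EPI 2009)."""
    kappa = 0.7 if female else 0.9
    alpha = -0.329 if female else -0.411
    egfr = (141
            * min(scr_mg_dl / kappa, 1.0) ** alpha
            * max(scr_mg_dl / kappa, 1.0) ** -1.209
            * 0.993 ** age)
    if female:
        egfr *= 1.018
    if black:
        egfr *= 1.159
    return egfr

# Example: a 40-year-old white man with creatinine 1.0 mg/dL -> ~94 mL/min/1.73 m^2.
print(round(ckd_epi_egfr(1.0, 40, female=False, black=False)))
```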
Abstract:
Evidence concerning the presence or absence of common neuron-glia lineages in the postnatal mammalian central nervous system is still a matter of speculation. We address this problem using optic nerve explants, which show extremely long survival in culture. Morphological, immunocytochemical and immunochemical methods were applied. The results obtained from in vitro tissue were compared with optic nerves (ONs) and whole-brain samples from animals of different ages. Newborn rat ONs represented the starting material of our tissue culture; they are composed of unmyelinated axons, astrocytes and progenitor cells but are devoid of neuronal cell bodies. At this age, Western blots of ONs were positively stained by neurofilament- and synapsin I-specific antibodies. These bands increased in intensity during postnatal in situ development. In explant cultures, the glial cells reach a stage of functional differentiation and they maintain, together with undifferentiated cells, a complex histotypic organization. After 6 days in vitro, neurofilaments and synapsin I could not be detected on immunoblots, indicating that 1) axonal degeneration was complete, and 2) neuronal somata were absent at that time. Surprisingly, after about 4-5 weeks in culture, a new cell type appeared, which showed characteristics typical of neurons. After 406 days in vitro, neurofilaments and synapsin I were unequivocally detectable on Western blots. Furthermore, both immunocytochemical staining and light and electron microscopic examination corroborated the presence of this earlier-observed cell type. These in vitro results clearly show the high developmental plasticity of ON progenitor cells, even late in development. The existence of a common neuron-glia precursor, which never gives rise to neurons in situ, is suggested.
Abstract:
BACKGROUND: Lipid-lowering therapy is costly but effective at reducing coronary heart disease (CHD) risk. OBJECTIVE: To assess the cost-effectiveness and public health impact of Adult Treatment Panel III (ATP III) guidelines and compare with a range of risk- and age-based alternative strategies. DESIGN: The CHD Policy Model, a Markov-type cost-effectiveness model. DATA SOURCES: National surveys (1999 to 2004), vital statistics (2000), the Framingham Heart Study (1948 to 2000), other published data, and a direct survey of statin costs (2008). TARGET POPULATION: U.S. population age 35 to 85 years. Time Horizon: 2010 to 2040. PERSPECTIVE: Health care system. INTERVENTION: Lowering of low-density lipoprotein cholesterol with HMG-CoA reductase inhibitors (statins). OUTCOME MEASURE: Incremental cost-effectiveness. RESULTS OF BASE-CASE ANALYSIS: Full adherence to ATP III primary prevention guidelines would require starting (9.7 million) or intensifying (1.4 million) statin therapy for 11.1 million adults and would prevent 20,000 myocardial infarctions and 10,000 CHD deaths per year at an annual net cost of $3.6 billion ($42,000/QALY) if low-intensity statins cost $2.11 per pill. The ATP III guidelines would be preferred over alternative strategies if society is willing to pay $50,000/QALY and statins cost $1.54 to $2.21 per pill. At higher statin costs, ATP III is not cost-effective; at lower costs, more liberal statin-prescribing strategies would be preferred; and at costs less than $0.10 per pill, treating all persons with low-density lipoprotein cholesterol levels greater than 3.4 mmol/L (>130 mg/dL) would yield net cost savings. RESULTS OF SENSITIVITY ANALYSIS: Results are sensitive to the assumptions that LDL cholesterol becomes less important as a risk factor with increasing age and that little disutility results from taking a pill every day. LIMITATION: Randomized trial evidence for statin effectiveness is not available for all subgroups. CONCLUSION: The ATP III guidelines are relatively cost-effective and would have a large public health impact if implemented fully in the United States. Alternate strategies may be preferred, however, depending on the cost of statins and how much society is willing to pay for better health outcomes. FUNDING: Flight Attendants' Medical Research Institute and the Swanson Family Fund. The Framingham Heart Study and Framingham Offspring Study are conducted and supported by the National Heart, Lung, and Blood Institute.
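A minimal sketch of the incremental cost-effectiveness logic underlying these comparisons, with hypothetical inputs (the QALY figure is back-calculated for illustration, not an output of the CHD Policy Model).

```python
# Sketch of an incremental cost-effectiveness ratio (ICER) judged against
# a willingness-to-pay threshold. All numbers are illustrative.

def icer(cost_new: float, cost_old: float, qaly_new: float, qaly_old: float) -> float:
    """Incremental cost per QALY gained for the new strategy vs the old one."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

WILLINGNESS_TO_PAY = 50_000  # $/QALY, the threshold discussed in the abstract

# Hypothetical example: $3.6 billion extra annual cost buying ~86,000 extra
# QALYs corresponds to roughly $42,000/QALY, below the threshold.
ratio = icer(cost_new=3.6e9, cost_old=0.0, qaly_new=86_000, qaly_old=0.0)
print(f"${ratio:,.0f}/QALY",
      "cost-effective" if ratio <= WILLINGNESS_TO_PAY else "not cost-effective")
```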
Abstract:
We conducted an experiment to assess the use of olfactory traces for spatial orientation in an open environment in rats, Rattus norvegicus. We trained rats to locate a food source at a fixed location from different starting points, in the presence or absence of visual information. A single food source was hidden in an array of 19 petri dishes regularly arranged in an open-field arena. Rats were trained to locate the food source either in white light (with full access to distant visuospatial information) or in darkness (without any visual information). In both cases, the goal was in a fixed location relative to the spatial frame of reference. The results of this experiment revealed that the presence of noncontrolled olfactory traces coherent with the spatial frame of reference enables rats to locate a unique position as accurately in darkness as with full access to visuospatial information. We hypothesize that the olfactory traces complement the use of other orientation mechanisms, such as path integration or the reliance on visuospatial information. This experiment demonstrates that rats can rely on olfactory traces for accurate orientation, and raises questions about the establishment of such traces in the absence of any other orientation mechanism. Copyright 1998 The Association for the Study of Animal Behaviour.
Abstract:
We assessed whether a 1-year nationwide, government-supported programme is effective in significantly increasing the number of smoking cessation clinics at major Swiss hospitals, as well as in providing basic training for the staff running them. We conducted a baseline evaluation of hospital services for smoking cessation, hypertension, and obesity by web search and telephone contact, followed by personal visits between October 2005 and January 2006 to 44 major public hospitals in the 26 cantons of Switzerland; we compared the number of active smoking cessation services and trained personnel between baseline and 1 year after starting the programme, which included a training workshop for doctors and nurses from all hospitals as well as two further follow-up visits. At baseline, 9 (21%) hospitals had active smoking cessation services, whereas 43 (98%) and 42 (96%) offered medical services for hypertension and obesity, respectively. Hospital directors and heads of Internal Medicine of 43 hospitals were interested in offering some form of help to smokers provided they received outside support, primarily funding to get started or to continue. At two identical workshops, 100 health professionals (27 in Lausanne, 73 in Zurich) were trained for one day. After the programme, 22 (50%) hospitals had an active smoking cessation service staffed with at least 1 trained doctor and 1 nurse. A one-year, government-supported national intervention resulted in a substantial increase in the number of hospitals allocating trained staff and offering smoking cessation services to smokers. Compared with the services offered for hypertension and obesity, however, this offer is still insufficient.