880 results for hospital discharge
Abstract:
Background and Objectives: Patients who survive acute kidney injury (AKI), especially those with partial renal recovery, present a higher long-term mortality risk. However, there is no consensus on the best time to assess renal function after an episode of AKI, nor agreement on the definition of renal recovery. In addition, only limited data regarding predictors of recovery are available. Design, Setting, Participants, & Measurements: From 1984 to 2009, 84 adult survivors of acute kidney injury were followed by the same nephrologist (RCRMA) for a median time of 4.1 years. Patients were seen at least once each year after discharge until end-stage renal disease (ESRD) or death. At each consultation, serum creatinine was measured and the glomerular filtration rate estimated. Renal recovery was defined as a glomerular filtration rate >= 60 mL/min/1.73 m2. Multiple logistic regression was performed to evaluate factors independently associated with renal recovery. Results: The median length of follow-up was 50 months (30-90 months). All patients had stabilized their glomerular filtration rates by 18 months, and 83% of them stabilized earlier, within 12 months. Renal recovery occurred in 16 patients (19%) at discharge and in 54 (64%) by 18 months. Six patients died and four progressed to ESRD during the follow-up period. Age (OR 1.09, p < 0.0001) and serum creatinine at hospital discharge (OR 2.48, p = 0.007) were independent factors associated with non-recovery of renal function. AKI severity, evaluated by peak serum creatinine and need for dialysis, was not associated with non-recovery. Conclusions: Renal recovery should be evaluated no earlier than one year after an AKI episode. Nephrology referral should be considered mainly for older patients and those with elevated serum creatinine at hospital discharge.
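The recovery criterion above (estimated GFR >= 60 mL/min/1.73 m2, computed from serum creatinine) can be sketched in code. The abstract does not state which estimating equation the authors used; the four-variable MDRD study equation below is an assumption chosen for illustration.

```python
# Sketch of the renal-recovery classification described in the abstract.
# The estimating equation is NOT specified in the source; the re-expressed
# four-variable MDRD study equation is assumed here purely for illustration.

def egfr_mdrd(scr_mg_dl: float, age: float, female: bool, black: bool) -> float:
    """Estimated GFR in mL/min/1.73 m^2 (re-expressed MDRD study equation)."""
    egfr = 175.0 * scr_mg_dl ** -1.154 * age ** -0.203
    if female:
        egfr *= 0.742
    if black:
        egfr *= 1.212
    return egfr

def renal_recovery(scr_mg_dl: float, age: float,
                   female: bool = False, black: bool = False) -> bool:
    """Recovery as defined in the study: eGFR >= 60 mL/min/1.73 m^2."""
    return egfr_mdrd(scr_mg_dl, age, female, black) >= 60.0
```

For example, a 50-year-old non-black man with a discharge creatinine of 1.0 mg/dL falls above the 60 mL/min/1.73 m2 threshold, while the same patient with a creatinine of 2.5 mg/dL falls below it.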
Abstract:
Background: Little is known in our country about regional differences in the treatment of acute coronary disease. Objective: To analyze regional behavior regarding the use of demonstrably effective therapies in acute coronary disease. Methods: A total of 71 hospitals were randomly selected, respecting the proportionality of the country in relation to geographic location, among other criteria. In the overall population, the use of aspirin, clopidogrel, ACE inhibitors/AT1 blockers, beta-blockers and statins was analyzed by region, separately and grouped into an individual score ranging from 0 (no drug used) to 100 (all drugs used). In myocardial infarction with ST elevation (STEMI), regional differences were analyzed regarding the use of recanalization therapies (fibrinolytics and primary angioplasty). Results: In the overall population, within the first 24 hours of hospitalization, the mean score in the North-Northeast (70.5 +/- 22.1) was lower (p < 0.05) than in the Southeast (77.7 +/- 29.5), Midwest (82 +/- 22.1) and South (82.4 +/- 21) regions. At hospital discharge, the score of the North-Northeast region (61.4 +/- 32.9) was lower (p < 0.05) than in the Southeast (69.2 +/- 31.6), Midwest (65.3 +/- 33.6) and South (73.7 +/- 28.1) regions; additionally, the score of the Midwest was lower (p < 0.05) than that of the South region. In STEMI, the use of recanalization therapies was highest in the Southeast (75.4%, p = 0.001 compared to the rest of the country) and lowest in the North-Northeast (52.5%, p < 0.001 compared to the rest of the country). Conclusion: The use of demonstrably effective therapies in the treatment of acute coronary disease leaves much to be desired in the country, with important regional differences.
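The per-patient score described above runs from 0 (no drug used) to 100 (all drugs used) over five evidence-based drug groups. The abstract states only the endpoints; equal weighting of the five groups (20 points each) is an assumption in the sketch below.

```python
# Sketch of the 0-100 therapy score described in the abstract.
# Equal weighting (20 points per drug group) is an ASSUMPTION; the source
# only defines the endpoints 0 (no drug used) and 100 (all drugs used).

DRUG_GROUPS = ("aspirin", "clopidogrel", "ace_inhibitor_or_at1_blocker",
               "beta_blocker", "statin")

def therapy_score(drugs_used: set) -> float:
    """Fraction of the five recommended drug groups in use, scaled to 0-100."""
    used = sum(1 for group in DRUG_GROUPS if group in drugs_used)
    return 100.0 * used / len(DRUG_GROUPS)
```

Under this reading, a patient discharged on aspirin, a statin and a beta-blocker would score 60, and regional means such as 70.5 or 82.4 are averages of these per-patient scores.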
Abstract:
Introduction: The benefits of higher positive end-expiratory pressure (PEEP) in patients with acute respiratory distress syndrome (ARDS) have been modest, but few studies have fully tested the "open-lung hypothesis". This hypothesis states that most of the collapsed lung tissue observed in ARDS can be reversed at an acceptable clinical cost, potentially resulting in better lung protection, but requiring more intensive maneuvers. The short-/middle-term efficacy of a maximum recruitment strategy (MRS) was recently described in a small physiological study. The present study extends those results, describing a case series of non-selected patients with early, severe ARDS submitted to MRS and followed until hospital discharge or death. Methods: MRS guided by thoracic computed tomography (CT) included two parts: a recruitment phase to calculate opening pressures (incremental steps under pressure-controlled ventilation up to maximum inspiratory pressures of 60 cmH2O, at constant driving pressures of 15 cmH2O); and a PEEP titration phase (decremental PEEP steps from 25 to 10 cmH2O) used to estimate the minimum PEEP needed to keep the lungs open. During all steps, we calculated the size of the non-aerated (-100 to +100 HU) compartment and the recruitability of the lungs (the percent mass of collapsed tissue re-aerated from baseline to maximum PEEP). Results: A total of 51 severe ARDS patients, with a mean age of 50.7 years (84% primary ARDS), were studied. The opening plateau pressure was 59.6 (+/- 5.9) cmH2O, and the mean PEEP titrated after MRS was 24.6 (+/- 2.9) cmH2O. The mean PaO2/FiO2 ratio increased from 125 (+/- 43) to 300 (+/- 103; P < 0.0001) after MRS and was sustained above 300 throughout seven days. Non-aerated parenchyma decreased significantly from 53.6% (interquartile range (IQR): 42.5 to 62.4) to 12.7% (IQR: 4.9 to 24.2) (P < 0.0001) after MRS. The potentially recruitable lung was estimated at 45% (IQR: 25 to 53).
We did not observe major barotrauma or significant clinical complications associated with the maneuver. Conclusions: MRS could efficiently reverse hypoxemia and most of the collapsed lung tissue during the course of ARDS, compatible with a high lung recruitability in non-selected patients with early, severe ARDS. This strategy should be tested in a prospective randomized clinical trial.
Abstract:
The objective of this study is to retrospectively report the results of interventions for controlling a vancomycin-resistant enterococcus (VRE) outbreak in a tertiary-care pediatric intensive care unit (PICU) of a university hospital. After identification of the outbreak, interventions were made at the following levels: patient care, microbiological surveillance, and medical and nursing staff training. Data were collected from computer-based databases and from the electronic prescription system. Vancomycin use progressively increased after March 2008, peaking in August 2009. Five cases of VRE infection were identified, with 3 deaths. After the interventions, we noted a significant reduction in vancomycin prescription and use (a 75% reduction), and the last case of VRE infection was identified 4 months later. The survivors remained colonized until hospital discharge. After the interventions there was a transient increase in PICU length of stay and mortality. Since then, the use of vancomycin has remained relatively constant and strict; no other cases of VRE infection or colonization have been identified, and length of stay and mortality returned to baseline. In conclusion, we showed that a bundled intervention aiming at strict control of vancomycin use and full compliance with the Hospital Infection Control Practices Advisory Committee guidelines, along with contact precautions and hand-hygiene promotion, can be effective in reducing vancomycin use and the emergence and spread of vancomycin-resistant bacteria in a tertiary-care PICU.
Abstract:
Background: The causes of death underlying long-term mortality after acute kidney injury (AKI) have not been well studied. The purpose of this study was to evaluate the role of comorbidities and of the causes of death in long-term mortality after AKI. Methodology/Principal Findings: We retrospectively studied 507 patients who experienced AKI in 2005-2006 and were discharged free from dialysis. In June 2008 (median: 21 months after AKI), we found that 193 (38%) patients had died. This mortality is much higher than that of the population of Sao Paulo City, even after adjustment for age. A multiple survival analysis was performed using a Cox proportional hazards regression model and showed that death was associated with a Khan's index indicating high risk [adjusted hazard ratio 2.54 (1.38-4.66)], chronic liver disease [1.93 (1.15-3.22)], admission to a non-surgical ward [1.85 (1.30-2.61)] and a second AKI episode during the same hospitalization [1.74 (1.12-2.71)]. AKI severity, evaluated either by the worst stage reached during AKI (P=0.20) or by the need for dialysis (P=0.12), was not associated with death. The cause of death was identified from a death certificate in 85% of the non-survivors. Among those who died from circulatory system diseases (the main cause of death), 59% had already suffered from hypertension, 34% from diabetes, 47% from heart failure, 38% from coronary disease, and 66% had a glomerular filtration rate <60 mL/min/1.73 m2 before the AKI episode. Among those who died from neoplasms, 79% already had the disease previously. Conclusions: Among AKI survivors discharged free from dialysis, the increased long-term mortality was associated with their pre-existing chronic conditions and not with the severity of the AKI episode. These findings suggest that these survivors should have medical follow-up after hospital discharge and that all efforts should be made to control their comorbidities.
Abstract:
Objective: To analyze the impact of the burden of care on the health-related quality of life (HRQOL) of caregivers of individuals with a spinal cord injury (SCI). Method: Cross-sectional observational study carried out by reviewing medical records and applying questionnaires. The Short Form 36 (SF-36) scale was used to assess HRQOL and the Caregiver Burden Scale (CBScale) to assess care burden. Results were analyzed quantitatively. Most patients with SCIs were male, with a mean age of 35.4 years, with a predominance of thoracic injuries followed by cervical injuries. Most caregivers were female, with a mean age of 44.8 years. Results: Tetraplegia and secondary complications stood out among the clinical characteristics that contributed to greater care burden and worse HRQOL. The association between care burden and HRQOL revealed that the greater the burden, the worse the HRQOL. Conclusion: Preventing care burden through strategies that prepare patients for hospital discharge, integrating the support network, and enabling access to health care services are interventions that could minimize the effects arising from care burden and contribute to improving HRQOL.
Abstract:
Background: Neuromuscular electrostimulation has become a promising option in cardiovascular rehabilitation. However, few articles have been published in the literature regarding neuromuscular electrostimulation in patients with heart failure during the hospital stay. Methods: This is a randomized controlled pilot trial that aimed to investigate the effect of neuromuscular electrostimulation on the distance walked in the six-minute walk test in 30 patients admitted to the ward for heart failure treatment in a tertiary cardiology hospital. Patients in the intervention group performed conventional rehabilitation plus neuromuscular electrostimulation. Patients underwent 60 minutes of electrostimulation (wave frequency 20 Hz, pulse duration 20 us) twice a day on consecutive days until hospital discharge. Results: The distance walked in the six-minute walk test improved 75% in the electrostimulation group (controls: from 379.7 +/- 43.5 to 372.9 +/- 46.9 meters; electrostimulation: from 372.9 +/- 62.4 to 500 +/- 68 meters; p<0.001), whereas the distance walked in the control group did not change. Conclusion: The neuromuscular electrostimulation group showed greater improvement in the distance walked in the six-minute walk test among patients admitted to the ward for compensation of heart failure.
Abstract:
Introduction: The development of postextubation swallowing dysfunction is well documented in the literature, with high prevalence in most studies. However, there are relatively few studies with specific outcomes that focus on the follow-up of these patients until hospital discharge. The purpose of our study was to determine prognostic indicators of dysphagia in ICU patients submitted to prolonged orotracheal intubation (OTI). Methods: We conducted a retrospective, observational cohort study from 2010 to 2012 of all patients over 18 years of age admitted to a university hospital ICU who were submitted to prolonged OTI and subsequently received a bedside swallow evaluation (BSE) by a speech pathologist. The prognostic factors analyzed included dysphagia severity rating at the initial swallowing assessment and at hospital discharge, age, time to initiate oral feeding, amount of individual treatment, number of orotracheal intubations, intubation time and length of hospital stay. Results: After we excluded patients with neurologic diseases, tracheostomy, esophageal dysphagia and those who were submitted to surgical procedures involving the head and neck, our study sample size was 148 patients. A logistic regression model was used to examine the relationships between the independent variables. In the univariate analyses, we found that statistically significant prognostic indicators of dysphagia included dysphagia severity rating at the initial swallowing assessment, time to initiate oral feeding and amount of individual treatment. In the multivariate analysis, we found that dysphagia severity rating at the initial swallowing assessment remained associated with good treatment outcomes. Conclusions: Studies of prognostic indicators in different populations with dysphagia can contribute to the design of more effective procedures when evaluating, treating, and monitoring individuals with this type of disorder.
Additionally, this study stresses the importance of the initial assessment ratings.
Abstract:
Introduction: Manual handling of loads has recently been proposed as a possible determinant of retinal detachment. To support this hypothesis, the incidence rates of surgically treated idiopathic rhegmatogenous retinal detachment (RRD) were analyzed among residents of Tuscany engaged in manual work, non-manual work and housework. Methods: Tuscany's hospital discharge records (SDO) also contain coded information on the broad category of employment. Data were used from all patients resident in Tuscany with an SDO issued by any Italian hospital in the period 1997-2009, with a principal diagnosis of RRD (ICD-9: 361.0-361.07 and 361.9) and DRG 36 ("retinal procedures"). After excluding subjects who did not meet the eligibility criteria, the study population was restricted to subjects aged 25-59 years, subsequently classified as manual workers, non-manual workers or housewives. Results: A total of 1,946 cases were identified. Among men, manual workers had an age-standardized incidence rate 1.8 times higher than non-manual workers (17.4 [95% CI, 16.1-18.7] vs. 9.8 [95% CI, 8.8-10.8]). Among women, the age-standardized incidence rates were 1.9 times higher in manual workers (11.1 [95% CI, 9.8-12.3]) and 1.7 times higher in housewives (9.5 [95% CI, 8.3-10.8]) than in non-manual workers (5.7 [95% CI, 4.8-6.6]). Conclusions: The study shows that manual workers are more affected by idiopathic RRD than non-manual workers.
These results support the hypothesis that manual handling of loads, a task unlikely to be part of non-manual work, may play a causal role in the genesis of the disease.
Abstract:
A study of the effect of ACD-CPR (active compression-decompression cardiopulmonary resuscitation) with gas flow blocked during the decompression phase (impedance threshold device) compared with the standard resuscitation technique on the short-term survival of patients with out-of-hospital cardiac arrest. The study compares ACD-ITD-CPR against standard CPR in patients with out-of-hospital cardiac arrest. The primary endpoint was the one-hour survival rate after hospital admission. Secondary endpoints were palpability of a pulse during CPR, return of spontaneous circulation (ROSC), the rate of hospital admissions, 24-hour survival and hospital discharge. Neurological outcome was also evaluated. The study took place in Mainz; Mainz is particularly well suited to conducting emergency medicine studies. The city's emergency medical service operates according to the two-tier system customary in Germany, with paramedics (RA/RS) on ambulances (RTW) and emergency physicians (NA) on physician-staffed vehicles (NAW/NEF). The study was conducted after a five-month pilot phase and extensive training in both techniques. In addition, some of the participants already had experience with ACD-CPR. ACD-ITD-CPR showed significant advantages over standard CPR in the primary endpoint (51% vs. 32%, p=0.006), as well as statistically notable advantages in pulse palpability during CPR (85% vs. 69%, p=0.008), return of spontaneous circulation (55% vs. 37%, p=0.016), the rate of hospital admissions (52% vs. 36%, p=0.023) and 24-hour survival (37% vs. 22%, p=0.033). No statistically notable differences were found in the rate of hospital discharge or in neurological outcome. The results suggest that the ACD-ITD can improve the short-term survival of patients with out-of-hospital cardiac arrest.
A prerequisite for this is sufficient and ongoing training of the personnel involved and/or the availability of technically improved solutions. Further studies on the effect on long-term survival and neurological outcome appear warranted.
Abstract:
In Italy, the process of de-institutionalization and the implementation of mental health care models have been characterized by a lack of evaluation. In particular, no initiatives have been undertaken to monitor the care activities for patients with psychiatric disorders. The objective of this thesis is therefore to carry out a comparative evaluation of mental health care pathways in the Departments of Mental Health and Pathological Addictions of the Emilia-Romagna region, using indicators derived from current administrative data flows. The data needed to construct the indicators were obtained through record linkage of the current regional administrative databases of hospital discharge records, of the community activities of the Mental Health Centers and of pharmaceutical prescriptions, with reference to the year 2010. The indicators were computed for all patients with a principal psychiatric diagnosis and then broken down by diagnostic category according to ICD9-CM. The set of indicators examined includes treated-prevalence and incidence rates of mental disorders, hospitalization rates, re-hospitalization at 7 and 30 days after discharge from psychiatric wards, hospital-community continuity of care, adherence to treatment, and drug consumption and prescribing appropriateness. Some problems emerged in reconstructing hospital-community continuity of care, together with some limitations of the drug-prescription indicators. The computation of indicators based on current administrative data flows proves feasible, albeit with limits related to the quality, completeness and accuracy of the available data.
The implementation of these indicators on a large scale (regional and national) and on a regular basis could be an opportunity to set up a system of surveillance, monitoring and evaluation of psychiatric care in the DSMs.
Abstract:
Cesarean delivery (CD) rates are rising in many parts of the world. In order to define strategies to reduce them, it is important to explore the role of clinical and organizational factors. This thesis aims to describe contemporary CD practice and to study clinical and organizational variables as determinants of CD in all women who gave birth between 2005 and June 2010 in the Emilia-Romagna region (Italy). All hospital discharge abstracts of women who delivered in the region in that period were selected and linked with birth certificates. In addition to descriptive statistics, multilevel Poisson regression models and a classification tree were used to study the role of clinical and organizational variables (teaching or non-teaching hospital, birth volumes, time and day of delivery). Substantial inter-hospital variability in the CD rate was found, and this was only partially explained by the considered variables. The most important risk factors for CD were: previous CD (RR 4.95; 95% CI: 4.85-5.05), cord prolapse (RR 3.51; 95% CI: 2.96-4.16), and malposition/malpresentation (RR 2.72; 95% CI: 2.66-2.77). Delivery between 7 pm and 7 am and during non-working days protected against CD in all subgroups, including those with a small number of elective CDs, while delivery at a teaching hospital and birth volumes were not statistically significant risk factors. The classification tree shows that previous CD and malposition/malpresentation are the most important variables discriminating between high and low risk of CD. These results indicate that other, unconsidered factors might explain CD variability, and they do not provide clear evidence that small hospitals perform poorly in terms of CD rate. Strategies to reduce CD could focus on the differences in delivery practice between day and night and between working-day and non-working-day deliveries.
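Figures such as "previous CD: RR 4.95" express how many times more likely a cesarean is among exposed women. As a reminder of what the crude version of this measure computes, here is a short sketch; the counts in the example are invented for illustration, and the study itself used multilevel Poisson regression rather than this crude calculation.

```python
# Crude risk ratio: risk in the exposed group divided by risk in the
# unexposed group. The counts in the usage example below are HYPOTHETICAL;
# the thesis estimated adjusted RRs via multilevel Poisson regression.

def risk_ratio(exposed_events: int, exposed_total: int,
               unexposed_events: int, unexposed_total: int) -> float:
    """Return (events/total in exposed) / (events/total in unexposed)."""
    risk_exposed = exposed_events / exposed_total
    risk_unexposed = unexposed_events / unexposed_total
    return risk_exposed / risk_unexposed

# Hypothetical example: 80 cesareans among 100 women with a previous CD
# versus 20 among 100 without one gives RR = 0.8 / 0.2 = 4.0.
```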
Abstract:
Severe acute kidney injury (AKI) requiring renal replacement therapy is a frequent complication in intensive care units (ICUs) and an independent risk factor for mortality. The aim of the study was to prospectively evaluate, in critically ill patients treated with continuous renal replacement therapy (CRRT) for AKI after cardiac surgery, the prevalence and prognostic significance of renal function recovery (RFR). Patients and Methods: Patients with AKI after elective or emergency cardiac surgery, with dysfunction of two or more organs, treated with CRRT. Results: From 1996 to 2011, 266 patients (195 male, 71 female, age 65.5±11.3 years) were treated with CRRT. Type of surgery: CABG (27.6%), aortic dissection (33%), valve replacement (21.1%), CABG plus valve replacement (12.6%), other (5.7%). Parameters at the start of treatment: BUN 86.1±39.4, serum creatinine (Cr) 3.96±1.86 mg/dL, mean arterial pressure 72.4±13.6 mmHg, APACHE II score 30.7±6.1, SOFA score 13.7±3. RIFLE: Risk (11%), Injury (31.4%), Failure (57.6%). Oliguric AKI (72.2%), mechanical ventilation (93.2%), inotropes (84.5%). Survival at 30 days and at discharge was 54.2% and 37.1%, respectively. Survival stratified by APACHE II: <24 = 85.1% and 66%; 25-29 = 63.5% and 48.1%; 30-34 = 51.8% and 31.8%; >34 = 31.6% and 17.7%. RFR allowed the discontinuation of CRRT in 87.8% (86/98) of survivors (Cr 1.4±0.6 mg/dL) and in 14.5% (24/166) of non-survivors (Cr 2.2±0.9 mg/dL), for an overall recovery rate of 41.4%. RFR was observed in 59.5% (44/74) of non-oliguric patients and in 34.4% (66/192) of oliguric patients. The distribution of patients by time to RFR (days) was: <8 = 38.2%, 8-14 = 20.9%, 15-21 = 11.8%, 22-28 = 10.9%, >28 = 18.2%. On multivariate analysis, oliguria, age and the cardiovascular SOFA score at 7 days from the start of CRRT proved to be unfavorable prognostic factors for RFR (>21 days). RFR was associated with high survival (78.2%).
Conclusions: RFR, significantly more frequent in non-oliguric patients, is associated with higher survival at discharge. The distribution of patients by APACHE II and SOFA score shows that survival and RFR are closely related to the severity of the illness.
Abstract:
Objective: To analyze the consumption and costs of systemic antibiotics in the hospitals of Emilia-Romagna from 2004 to 2011, with attention to inter-hospital variability and to the significance, in terms of bacterial resistance, of the increase in certain therapeutic groups; and to perform a sub-analysis of pediatric wards, identifying the critical therapeutic groups and evaluating the pediatric adverse reactions to antibiotics reported over the study period. Methods: Antibiotic consumption and expenditure data for 2004-2011 were obtained from the regional AFO database, and the inpatient days for each ward from the regional hospital discharge (SDO) database. Reports of suspected adverse reactions to antibiotics between January 2004 and December 2011 were extracted from the national VigiSegn database. Results: Over the eight years, antibiotic consumption in Emilia-Romagna hospitals increased by 27% and expenditure by 3%. Consumption was clearly higher in surgical than in medical wards. The leading class by consumption and expenditure was penicillins/beta-lactamase inhibitors. In pediatric wards, 65 different active substances were used, amoxicillin/clavulanic acid being the most used (26% of the 2011 total). Among the critical antibiotics, third-generation cephalosporins were the most consumed in all pediatric wards in 2011. Among the molecules whose hospital use is restricted, linezolid and teicoplanin stand out; in any case, they accounted for the largest shares of 2011 expenditure (18% and 15%, respectively). Regarding pharmacovigilance, children (3-13 years) were involved in 23 cases and infants (≤2 years) in only 4. Amoxicillin/clavulanic acid was the most frequently reported drug (n=7), and only 2 cases were serious.
Conclusions: The results show a critical picture regarding the massive use of third-generation cephalosporins and the increase in linezolid, which should be investigated further to determine whether it reflects inappropriate use or an increase in bacterial resistance.
Abstract:
Aims of the study: To assess the prevalence of antiepileptic drug (AED) exposure in pregnant women with or without epilepsy and the comparative risk of terminations of pregnancy (TOPs), spontaneous abortions, stillbirth, major congenital malformations (MCMs) and foetal growth retardation (FGR) following intrauterine AED exposure in the Emilia Romagna region (RER), Northern Italy (4 million inhabitants). Methods: Data were obtained from official regional registries: the Certificate of Delivery Assistance, the Hospital Discharge Card, reimbursed prescription databases and the Registry of Congenital Malformations. We identified all deliveries, hospitalized abortions and MCMs that occurred between January 2009 and December 2011. Results: We identified 145,243 pregnancies: 111,284 deliveries (112,845 live births and 279 stillbirths), 16,408 spontaneous abortions and 17,551 TOPs. Six hundred and eleven pregnancies (0.42%, 95% CI: 0.39-0.46) were exposed to AEDs. Twenty-one percent of pregnancies ended in TOP in the AED group vs 12% in the non-exposed (OR: 2.24; CI: 1.41-3.56). The rates of spontaneous abortion and stillbirth were comparable in the two groups. Three hundred and fifty-three babies (0.31%, 95% CI: 0.28-0.35) were exposed to AEDs during the first trimester. The rate of MCMs was 2.3% in the AED group (2.2% in babies exposed to monotherapy and 3.1% in babies exposed to polytherapy) vs 2.0% in the non-exposed. The risk of FGR was 12.7% in the exposed group compared to 10% in the non-exposed. Discussion and Conclusion: The prevalence of AED exposure in pregnancy in the RER was 0.42%. The rate of MCMs in children exposed to AEDs in utero was almost superimposable on that of the non-exposed; however, polytherapy carried a slightly increased risk. The rate of TOPs was significantly higher in the exposed women. Further studies are needed to clarify whether this high rate reflects a higher rate of MCMs detected prenatally or other, more elusive reasons.