254 results for Risk based maintenance
Abstract:
OBJECTIVES: To examine the association between socioeconomic status (SES) and several cardiovascular disease risk factors (CVRFs) and to assess whether this association has changed over a 15-year observation period. METHODS: Three independent population-based surveys of CVRFs were conducted in representative samples of all adults aged 25-64 years in the Seychelles, a small island state located east of Kenya, in 1989 (N=1081), 1994 (N=1067) and 2004 (N=1255). RESULTS: Among men, current smoking and heavy drinking were more prevalent in the low versus the high SES group, and obesity was less prevalent. The socioeconomic gradient in diabetes reversed over the study period from lower prevalence in the low versus the high SES group to higher prevalence in the low SES group. Hypercholesterolemia was less prevalent in the low versus the high SES group in 1989, but the prevalence was similar in the two groups in 2004. Hypertension showed no consistent socioeconomic pattern. Among women, the SES gradient in smoking tended to reverse over time from lower prevalence in the low SES group to lower prevalence in the high SES group. Obesity and diabetes were more common in the low versus the high SES group over the study period. Heavy drinking, hypertension and hypercholesterolemia were not socially patterned among women. CONCLUSION: The prevalence of several CVRFs was higher in low versus high SES groups in a rapidly developing country in the African region, and an increase in the burden of these CVRFs in the most disadvantaged groups of the population was observed over the 15-year study period.
Abstract:
BACKGROUND AND AIM: There is an ongoing debate on which obesity marker better predicts cardiovascular disease (CVD). In this study, the relationships between obesity markers and a high (>5%) 10-year risk of fatal CVD were assessed. METHODS AND RESULTS: A cross-sectional study was conducted including 3047 women and 2689 men aged 35-75 years. Body fat percentage was assessed by tetrapolar bioimpedance. CVD risk was assessed using the SCORE risk function, and gender- and age-specific cut points for body fat were derived. The diagnostic accuracy of each obesity marker was evaluated through receiver operating characteristic (ROC) analysis. In men, body fat presented a higher correlation (r=0.31) with 10-year CVD risk than waist/hip ratio (WHR, r=0.22), waist (r=0.22) or BMI (r=0.19); the corresponding values in women were 0.18, 0.15, 0.11 and 0.05, respectively (all p<0.05). In both genders, body fat showed the highest area under the ROC curve (AUC): in men, the AUCs (95% confidence intervals) were 76.0 (73.8-78.2), 67.3 (64.6-69.9), 65.8 (63.1-68.5) and 60.6 (57.9-63.5) for body fat, WHR, waist and BMI, respectively. In women, the corresponding values were 72.3 (69.2-75.3), 66.6 (63.1-70.2), 64.1 (60.6-67.6) and 58.8 (55.2-62.4). The body fat percentage criterion captured three times as many subjects with high CVD risk as the BMI criterion, and almost twice as many as the WHR criterion. CONCLUSION: Obesity defined by body fat percentage is more strongly related to 10-year risk of fatal CVD than obesity markers based on WHR, waist or BMI.
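The marker comparison above hinges on the AUC, which for a continuous marker equals the probability that a randomly chosen high-risk subject shows a higher value than a randomly chosen low-risk one. A minimal rank-based sketch (our illustration with invented toy data, not the study's analysis code):

```python
# Illustrative AUC computation (not the authors' code): the area under the
# ROC curve equals the Mann-Whitney statistic divided by n_pos * n_neg.

def roc_auc(marker_values, labels):
    """AUC of a continuous marker against binary labels (1 = high CVD risk)."""
    pos = [m for m, y in zip(marker_values, labels) if y == 1]
    neg = [m for m, y in zip(marker_values, labels) if y == 0]
    wins = 0.0
    for p in pos:
        for n in neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5  # ties count half
    return wins / (len(pos) * len(neg))

# Invented toy data: body fat (%) in subjects with (1) and without (0) high risk.
body_fat = [22, 30, 31, 35, 28, 27]
high_risk = [0, 0, 1, 1, 0, 1]
print(round(roc_auc(body_fat, high_risk), 2))  # → 0.78
```

The same function applied to WHR, waist, and BMI columns would reproduce the kind of head-to-head AUC comparison reported above.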
Abstract:
BACKGROUND: In high-quality cancer registration systems, about one in eight incident cancers are second primary cancers. This is due to a combination of careful diagnostic ascertainment, shared genetic determinants, shared exposure to environmental factors and consequences of treatment for the first cancer. METHODS: We used data derived from the Swiss population-based cancer registries of Vaud and Neuchâtel, covering 885,000 inhabitants. RESULTS: Among 107,238 (52% males) first cancers occurring between 1976 and 2010, a total of 126 second sarcomas were observed through active and passive follow-up versus 68.2 expected, corresponding to a standardized incidence ratio (SIR) of 1.85 (95% CI 1.5-2.2). Significant excess sarcoma risks were observed after skin melanoma (SIR = 3.0), breast cancer (2.2), corpus uteri (2.7), testicular (7.5), thyroid cancer (4.2), Hodgkin lymphoma (5.7) and leukemias (4.0). For breast cancer, the SIR for second sarcoma was 3.4 at ≥5 years after diagnosis. CONCLUSIONS: The common denominator of these neoplasms is the use of radiotherapy in their management. Some sarcomas following breast cancer may be due to shared genetic components (e.g., in the Li-Fraumeni syndrome), as well as possibly to environmental factors shared with sarcomas, including overweight and selected dietary and reproductive factors, which are, however, too poorly defined for any quantitative risk assessment.
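The SIR above is simply observed over expected second sarcomas; a short sketch (our reconstruction using Byar's approximation for the Poisson limits, not the registries' software) reproduces the published estimate:

```python
import math

# SIR = observed / expected, with an approximate 95% CI for the Poisson
# count via Byar's formula (our choice of method, not necessarily the
# authors'; the figures below are taken from the abstract).

def sir_with_ci(observed, expected, z=1.96):
    sir = observed / expected
    o = observed
    lo = o * (1 - 1 / (9 * o) - z / (3 * math.sqrt(o))) ** 3 / expected
    u = o + 1
    hi = u * (1 - 1 / (9 * u) + z / (3 * math.sqrt(u))) ** 3 / expected
    return sir, lo, hi

# 126 observed vs 68.2 expected second sarcomas.
sir, lo, hi = sir_with_ci(126, 68.2)
print(f"SIR = {sir:.2f} (95% CI {lo:.1f}-{hi:.1f})")  # → SIR = 1.85 (95% CI 1.5-2.2)
```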
Abstract:
BACKGROUND: The epidemiology of congenital small intestinal atresia (SIA) has not been well studied. This study describes the presence of additional anomalies, pregnancy outcomes, total prevalence and association with maternal age in SIA cases in Europe. METHODS: Cases of SIA delivered during January 1990 to December 2006 notified to 20 EUROCAT registers formed the population-based case series. Prevalence over time was estimated using multilevel Poisson regression, and heterogeneity between registers was evaluated from the random component of the intercept. RESULTS: In total, 1133 SIA cases were reported among 5,126,164 registered births. Of 1044 singleton cases, 215 (20.6%) were associated with a chromosomal anomaly. Of 829 singleton SIA cases with normal karyotype, 221 (26.7%) were associated with other structural anomalies. Considering cases with normal karyotype, the total prevalence per 10,000 births was 1.6 (95% CI 1.5 to 1.7) for SIA, 0.9 (95% CI 0.8 to 1.0) for duodenal atresia and 0.7 (95% CI 0.7 to 0.8) for jejunoileal atresia (JIA). There was no significant trend in SIA, duodenal atresia or JIA prevalence over time (RR=1.0, 95% credible interval (CrI): 1.0 to 1.0 for each), but SIA and duodenal atresia prevalence varied by geographical location (p=0.03 and p=0.04, respectively). There was weak evidence of an increased risk of SIA in mothers aged less than 20 years compared with mothers aged 20 to 29 years (RR=1.3, 95% CrI: 1.0 to 1.8). CONCLUSION: This study found no evidence of a temporal trend in the prevalence of SIA, duodenal atresia or JIA, although SIA and duodenal atresia prevalence varied significantly between registers.
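As a quick arithmetic check, the reported total prevalence follows directly from the counts above (a sketch; we assume the 829 singleton normal-karyotype SIA cases as the numerator):

```python
# Prevalence per 10,000 births, using figures quoted in the abstract.

def prevalence_per_10000(cases, births):
    return cases * 10_000 / births

print(round(prevalence_per_10000(829, 5_126_164), 1))  # → 1.6
```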
Abstract:
Moderate alcohol consumption has been associated with lower coronary artery disease (CAD) risk. However, data on the CAD risk associated with high alcohol consumption are conflicting. The aim of this study was to examine the impact of heavier drinking on 10-year CAD risk in a population with high mean alcohol consumption. In a population-based study of 5,769 adults (aged 35 to 75 years) without cardiovascular disease in Switzerland, 1-week alcohol consumption was categorized as 0, 1 to 6, 7 to 13, 14 to 20, 21 to 27, 28 to 34, and ≥35 drinks/week, or as nondrinkers (0 drinks/week), moderate (1 to 13 drinks/week), high (14 to 34 drinks/week), and very high (≥35 drinks/week) drinkers. Blood pressure and lipids were measured, and 10-year CAD risk was calculated according to the Framingham risk score. Seventy-three percent (n = 4,214) of the participants consumed alcohol; 16% (n = 909) were high drinkers and 2% (n = 119) very high drinkers. In multivariate analysis, increasing alcohol consumption was associated with higher high-density lipoprotein cholesterol (from a mean ± SE of 1.57 ± 0.01 mmol/L in nondrinkers to 1.88 ± 0.03 mmol/L in very high drinkers), triglycerides (1.17 ± 1.01 to 1.32 ± 1.05 mmol/L), and systolic and diastolic blood pressure (127.4 ± 0.4 to 132.2 ± 1.4 mm Hg and 78.7 ± 0.3 to 81.7 ± 0.9 mm Hg, respectively) (all p values for trend <0.001). Ten-year CAD risk increased from 4.31 ± 0.10% to 4.90 ± 0.37% (p = 0.03) with alcohol use, with a J-shaped relation. Increasing wine consumption was more related to high-density lipoprotein cholesterol levels, whereas beer and spirits were related to increased triglyceride levels. In conclusion, as measured by 10-year CAD risk, the protective effect of alcohol consumption disappears in very high drinkers, because the beneficial increase in high-density lipoprotein cholesterol is offset by the increases in blood pressure levels.
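The drinking categories used above can be mirrored in a small helper (the function name and structure are ours, not the study's variables):

```python
# Hypothetical grouping of 1-week alcohol consumption into the four
# categories described in the abstract.

def drinking_category(drinks_per_week):
    if drinks_per_week == 0:
        return "nondrinker"
    if drinks_per_week <= 13:
        return "moderate"
    if drinks_per_week <= 34:
        return "high"
    return "very high"

for d in (0, 6, 20, 40):
    print(d, "->", drinking_category(d))
```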
Abstract:
With some 30,000 dependent persons, opiate addiction constitutes a major public health problem in Switzerland. The Swiss Federal Office of Public Health (FOPH) has long played a leading role in the prevention and treatment of opiate addiction and in research on effective means of containing the epidemic of opiate addiction and its consequences. Major milestones on that path have been the successive "Methadone reports" published by that Office, providing guidance on the care of opiate addiction with substitution treatment. In view of updating the recommendations on the appropriateness of substitution treatment for opiate addiction, in particular the prescription of methadone, the FOPH commissioned a multi-component project involving the following elements: (1) a survey of current attitudes and practices in Switzerland related to opiate substitution treatment; (2) a review of Swiss literature on methadone substitution treatment; (3) a review of international literature on methadone substitution treatment; (4) a National Methadone Substitution Conference; and (5) a multidisciplinary expert panel to evaluate the appropriateness of substitution treatment. The present report documents the process and summarises the results of the last element above. The RAND appropriateness method (RAM) was used to distil, from literature-based evidence and systematically formulated expert opinion, areas where consensus exists on the appropriateness (or inappropriateness) of methadone maintenance treatment (MMT) and areas where disagreement or uncertainty persists and which should be further pursued.
The major areas addressed by this report are: initial assessment of candidates for MMT; appropriate settings for initiation of MMT (general and special cases); appropriateness of methadone supportive therapy; co-treatments and accompanying measures; dosage schedules and pharmacokinetic testing; withdrawal from MMT; miscellaneous questions; and appropriateness of other (non-methadone) substitution treatment. Summary statements for each of the above categories are derived from the panel meeting and presented in the report. In the "first round", agreement was observed for 31% of the 553 theoretical scenarios evaluated. The "second round" rating, following discussion of divergent ratings, resulted in a much higher agreement among panellists, reaching 53% of the 537 scenarios. Frank disagreement was encountered for 7% of all scenarios. Overall, 49% of the clinical situations (scenarios) presented were considered appropriate. The areas where at least 50% of the situations were considered appropriate were "initial assessment of candidates for MMT", "appropriate settings for initiation of MMT", "appropriate settings for methadone supportive treatment" and "appropriateness of other (non-methadone) substitution treatment". The area with the least consensus on appropriateness was "withdrawal from MMT" (6%). The report discusses the implications and limitations of the panel results and provides recommendations for the dissemination, application, and future use of the criteria for the appropriateness of MMT. The RAND appropriateness method proved to be an accepted and appreciated method to assess the appropriateness of methadone maintenance treatment for opiate addicts. In the next step, the results of the expert panel process must be combined with those of the Swiss and international literature reviews and the survey of current attitudes and practices in Switzerland, to be synthesized into formal practice guidelines.
Such guidelines should be disseminated to all concerned, promoted, used and rigorously evaluated for compliance and outcome.
Abstract:
Introduction: Osteoporosis (OP) is a systemic skeletal disease characterized by low bone mineral density (BMD) and micro-architectural (MA) deterioration. Clinical risk factors (CRFs) are often used as an approximation of MA. MA can now be evaluated in daily practice with the Trabecular Bone Score (TBS), a novel grey-level texture measurement reflecting bone micro-architecture, based on the use of experimental variograms of 2D projection images. TBS is very simple to obtain by reanalyzing a lumbar DXA scan, and has proven diagnostic and prognostic value, partially independent of CRFs and BMD. The aim of the OsteoLaus cohort is to combine in daily practice the CRFs and the information given by DXA (BMD, TBS and vertebral fracture assessment (VFA)) to better identify women at high fracture risk. Method: The OsteoLaus cohort (1,400 women aged 50 to 80 years living in Lausanne, Switzerland) started in 2010. It is derived from the CoLaus cohort, started in Lausanne in 2003, whose main goal is to obtain information on the epidemiology and genetic determinants of cardiovascular risk in 6,700 men and women. CRFs for OP, bone ultrasound of the heel, lumbar spine and hip BMD, VFA by DXA and MA evaluation by TBS are recorded in OsteoLaus. Preliminary results are reported. Results: We included 631 women: mean age 67.4±6.7 years, BMI 26.1±4.6, mean lumbar spine BMD 0.943±0.168 (T-score -1.4 SD), TBS 1.271±0.103. As expected, the correlation between BMD and site-matched TBS is low (r2=0.16). The prevalence of vertebral fracture (VFx) grade 2/3, major OP fracture (Fx) and all OP Fx is 8.4%, 17.0% and 26.0%, respectively. Age- and BMI-adjusted ORs (per SD decrease) are 1.8 (1.2-2.5), 1.6 (1.2-2.1) and 1.3 (1.1-1.6) for BMD for the different categories of fractures, and 2.0 (1.4-3.0), 1.9 (1.4-2.5) and 1.4 (1.1-1.7) for TBS, respectively. Taken separately, a BMD < -2.5 SD or a TBS < 1.200 identifies only 32 to 37% of women with an OP Fx.
If we combine BMD < -2.5 SD or TBS < 1.200, 54 to 60% of women with an osteoporotic Fx are identified. Conclusion: As in already published studies, these preliminary results confirm the partial independence between BMD and TBS. More importantly, combining TBS with BMD significantly increases the identification of women with prevalent OP Fx who would have been misclassified by BMD alone. For the first time we are able to obtain complementary information about fracture (VFA), density (BMD), and micro- and macro-architecture (TBS and HSA) from a single simple, low-radiation and inexpensive device: DXA. Such complementary information is very useful for the patient in daily practice, and is moreover likely to have an impact on cost-effectiveness analyses.
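The combined screening rule described in the conclusion can be sketched as follows (cut-offs from the abstract; the patient data are invented for illustration):

```python
# Flag a woman as at risk if either criterion fires: lumbar spine BMD
# T-score < -2.5 SD or TBS < 1.200 (thresholds quoted in the abstract).

def flagged(t_score, tbs, t_cut=-2.5, tbs_cut=1.200):
    return t_score < t_cut or tbs < tbs_cut

# Invented example patients: (T-score, TBS).
women = [(-2.7, 1.30), (-1.4, 1.15), (-1.0, 1.32), (-2.6, 1.10)]
print(sum(flagged(t, tbs) for t, tbs in women), "of", len(women), "flagged")  # → 3 of 4 flagged
```

Note that the second woman is caught only by the TBS criterion, which is the point of combining the two measures.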
Abstract:
PURPOSE: To determine the local control and complication rates for children with papillary and/or macular retinoblastoma progressing after chemotherapy and undergoing stereotactic radiotherapy (SRT) with a micromultileaf collimator. METHODS AND MATERIALS: Between 2004 and 2008, 11 children (15 eyes) with macular and/or papillary retinoblastoma were treated with SRT. The mean age was 19 months (range, 2-111). Of the 15 eyes, 7, 6, and 2 were classified as International Classification of Intraocular Retinoblastoma Group B, C, and E, respectively. The delivered dose of SRT was 50.4 Gy in 28 fractions using a dedicated micromultileaf collimator linear accelerator. RESULTS: The median follow-up was 20 months (range, 13-39). Local control was achieved in 13 eyes (87%). The actuarial 1- and 2-year local control rates were both 82%. SRT was well tolerated. Late adverse events were reported in 4 patients. Of the 4 patients, 2 had developed focal microangiopathy 20 months after SRT; 1 had developed a transient recurrence of retinal detachment; and 1 had developed bilateral cataracts. No optic neuropathy was observed. CONCLUSIONS: Linear accelerator-based SRT for papillary and/or macular retinoblastoma in children resulted in excellent tumor control rates with acceptable toxicity. Additional research regarding SRT and its intrinsic organ-at-risk sparing capability is justified in the framework of prospective trials.
Abstract:
QUESTION UNDER STUDY: Hospitals transferring patients retain responsibility until admission to the new health care facility. We define safe transfer conditions, based on appropriate risk assessment, and evaluate the impact of this strategy as implemented at our institution. METHODS: An algorithm defining transfer categories according to destination, equipment, monitoring, and medication was developed and tested prospectively over 6 months. Conformity with the algorithm criteria was assessed for every transfer and transfer category. After introduction of a transfer coordination centre with transfer nurses, the algorithm was implemented and the same survey was carried out over 1 year. RESULTS: Over the whole study period, the number of transfers increased by 40%, chiefly by ambulance from the emergency department to other hospitals and private clinics. Transfers to rehabilitation centres and nursing homes were reassigned to conventional vehicles. The percentage of patients requiring equipment during transfer, such as an intravenous line, decreased from 34% to 15%, while oxygen or i.v. drug requirements remained stable. The percentage of transfers considered below the theoretical safety level decreased from 6% to 4%, while 20% of transfers were considered safer than necessary. A substantial number of planned transfers could be "downgraded" by mutual agreement to a lower degree of supervision, and the system was stable on a short-term basis. CONCLUSION: A coordinated transfer system based on an algorithm determining transfer categories, developed on the basis of simple but valid medical and nursing criteria, reduced unnecessary ambulance transfers and treatment during transfer, and increased adequate supervision.
Abstract:
In the current climate context, Mediterranean regions are experiencing an intensification of extreme hydrometeorological events. In Morocco, flood risk has become problematic, as communities are vulnerable to extreme events; rapid and poorly controlled economic and urban development increases exposure to extreme phenomena. The Swiss Agency for Development and Cooperation (SDC) is actively involved in natural risk reduction in Morocco. Hazard mapping and its integration into land-use planning is an effective way to reduce spatial vulnerability, and the SDC therefore commissioned this project to adapt the Swiss hazard-mapping method to a Moroccan case study (the city of Beni Mellal, Tadla-Azilal region, Morocco). The Swiss method was adapted to the specific constraints of the terrain (semi-arid environment, piedmont morphology) and to the knowledge-transfer context (socio-economic characteristics and local practices). Following the Swiss guidelines, a map of flood phenomena was produced, containing the morphological evidence and the anthropogenic elements relevant to the development and aggravation of floods. Rainfall-runoff modeling for reference events, and hydraulic routing of the resulting flood hydrographs, allowed a quantitative estimate of the flood hazard. Field data (discharge estimates, extent of known floods) were used to verify the model results, and intensity and probability maps were obtained. Finally, an indicative flood danger map was produced on the basis of the Swiss hazard matrix, which crosses the intensity of an event with its probability of occurrence to assign danger degrees to the studied territory.
With a view to implementing the danger maps in land-use planning documents, we examined the current institutional risk management in Beni Mellal, studying how integrated the management is and how knowledge about risk influences the management process. An institutional vulnerability "map" was established on the basis of individual interviews with the main institutional actors in flood management. The analysis shows that management follows a top-down, hierarchical logic and prioritizes active protection measures over passive land-use planning measures. Knowledge about risk remains sectoral and often disconnected, and relationships between actors are dictated by the institutional hierarchy. Innovation in risk management emerges when actors collaborate horizontally despite the established hierarchy, or when they open up to external knowledge sources (e.g., universities). Methodological and institutional recommendations from this study were addressed to risk management stakeholders with a view to implementing the danger maps in planning. Hazard assessment and mapping are essential to an integrated risk management approach: more than risk-reduction tools, danger maps help communicate hazards to the public and thus contribute to establishing a risk culture.
Quality of Attachment, Perinatal Risk, and Mother-Infant Interaction in a High-Risk Premature Sample
Abstract:
Thirty-three families, each with a premature infant born at less than 33 gestational weeks, were observed in a longitudinal exploratory study. Infants were recruited in a neonatal intensive care unit, and follow-up visits took place at 4 and 12 months of corrected age. The severity of the perinatal problems was evaluated using the Perinatal Risk Inventory (PERI; A.P. Scheiner & M.E. Sexton, 1991). At 4 months, mother-infant play interaction was observed and coded according to the CARE-Index (P.M. Crittenden, 2003); at 12 months, the Strange Situation Procedure (SSP; M.D.S. Ainsworth, M.C. Blehar, E. Waters, & S. Wall, 1978) was administered. Results indicate a strong correlation between the severity of perinatal problems and the quality of attachment at 12 months. Based on the PERI, infants with high medical risk tended more frequently to be insecurely attached. There also was a significant correlation between insecure attachment and dyadic play interaction at 4 months (i.e., maternal controlling behavior and infant compulsive compliance). Moreover, specific dyadic interactive patterns could be identified as protective or as risk factors regarding the quality of attachment. Considering that attachment may have a long-term influence on child development, these results underline the need for particular attention to risk factors regarding attachment among premature infants.
Abstract:
Osteoporotic fracture (OF) is one of the major causes of morbidity and mortality in industrialized countries, and Switzerland is among the countries with the greatest risk. Our aims were (1) to calculate FRAX® in a selected Swiss population the day before the occurrence of an OF and (2) to compare the results with the proposed Swiss FRAX® thresholds. The Swiss Association Against Osteoporosis has proposed guidelines for the treatment of osteoporosis based on age-dependent thresholds. To identify a population at very high risk of osteoporotic fracture, we included all consecutive patients in the active OF pathway cohort of Lausanne University Hospital, Switzerland. FRAX® was calculated with the data available the day before the actual OF. Patients with a FRAX® BMI (body mass index) or a FRAX® BMD (bone mineral density) value lower than the Swiss thresholds were not considered at high risk. Two hundred thirty-seven patients were included, with a mean age of 77.2 years; 80% were female. Major types of fracture included hip (58%) and proximal humerus (25%) fractures. Mean FRAX® BMI values were 28.0, 10.0, 13.0, 26.0, and 37.0% for the age groups 50-59, 60-69, 70-79, and 80-89 years, respectively. Fifty percent of the population was not considered at high risk by FRAX® BMI. FRAX® BMD was available for 95 patients, and 45% had a T-score < -2.5 standard deviations. Only 30% of patients with a normal or osteopenic BMD were classified at high risk by FRAX® BMD. The currently proposed Swiss thresholds failed to classify 50 to 70% of the studied population as at high risk the day before a major OF.
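Age-dependent FRAX thresholds of the kind discussed above can be sketched as a simple lookup (the cut-off values here are invented placeholders, not the actual Swiss Association Against Osteoporosis figures):

```python
# Hypothetical age-dependent intervention thresholds: a patient counts as
# "high risk" when the 10-year fracture probability (%) meets the cut-off
# for his or her age band. Threshold values are invented placeholders.

def high_risk(age, frax_probability, thresholds=((50, 10), (60, 15), (70, 25), (80, 35))):
    cutoff = thresholds[0][1]
    for lower_age, t in thresholds:  # pick the cut-off of the highest band reached
        if age >= lower_age:
            cutoff = t
    return frax_probability >= cutoff

print(high_risk(77, 13.0))  # → False: such a patient would not be flagged
```

This illustrates the abstract's finding: with thresholds that rise with age, a patient can sustain a fracture the next day while sitting below the cut-off for her age band.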
Abstract:
The trabecular bone score (TBS) is a gray-level textural metric that can be extracted from the two-dimensional lumbar spine dual-energy X-ray absorptiometry (DXA) image. TBS is related to bone microarchitecture and provides skeletal information that is not captured from the standard bone mineral density (BMD) measurement. Based on experimental variograms of the projected DXA image, TBS has the potential to discern differences between DXA scans that show similar BMD measurements. An elevated TBS value correlates with better skeletal microstructure; a low TBS value correlates with weaker skeletal microstructure. Lumbar spine TBS has been evaluated in cross-sectional and longitudinal studies. The following conclusions are based upon publications reviewed in this article: 1) TBS gives lower values in postmenopausal women and in men with previous fragility fractures than in their nonfractured counterparts; 2) TBS is complementary to data available by lumbar spine DXA measurements; 3) TBS results are lower in women who have sustained a fragility fracture but in whom DXA does not indicate osteoporosis or even osteopenia; 4) TBS predicts fracture risk as well as lumbar spine BMD measurements in postmenopausal women; 5) efficacious therapies for osteoporosis differ in the extent to which they influence the TBS; 6) TBS is associated with fracture risk in individuals with conditions related to reduced bone mass or bone quality. Based on these data, lumbar spine TBS holds promise as an emerging technology that could well become a valuable clinical tool in the diagnosis of osteoporosis and in fracture risk assessment. © 2014 American Society for Bone and Mineral Research.
Abstract:
OBJECTIVE: To investigate whether first trimester exposure to lamotrigine (LTG) monotherapy is specifically associated with an increased risk of orofacial clefts (OCs) relative to other malformations, in response to a signal regarding increased OC risk. METHODS: Population-based case-control study with malformed controls, based on EUROCAT congenital anomaly registers. The study population covered 3.9 million births from 19 registries during 1995-2005. Registrations included congenital anomalies among livebirths, stillbirths, and terminations of pregnancy following prenatal diagnosis. Cases were 5,511 nonsyndromic OC registrations, of which 4,571 were isolated, 1,969 were cleft palate (CP), and 1,532 were isolated CP. Controls were 80,052 nonchromosomal, non-OC registrations. We compared first trimester LTG and antiepileptic drug (AED) use vs nonepileptic non-AED use, for mono- and polytherapy, adjusting for maternal age. An additional exploratory analysis compared the observed and expected distributions of malformation types associated with LTG use. RESULTS: There were 72 LTG-exposed (40 mono- and 32 polytherapy) registrations. The ORs for LTG monotherapy vs no AED use were 0.67 (95% CI 0.10-2.34) for OC relative to other malformations, 0.80 (95% CI 0.11-2.85) for isolated OC, 0.79 (95% CI 0.03-4.35) for CP, and 1.01 (95% CI 0.03-5.57) for isolated CP. ORs for any AED use vs no AED use were 1.43 (95% CI 1.03-1.93) for OC, 1.21 (95% CI 0.82-1.72) for isolated OC, 2.37 (95% CI 1.54-3.43) for CP, and 1.86 (95% CI 1.07-2.94) for isolated CP. The distribution of other nonchromosomal malformation types with LTG exposure was similar to that with non-AED exposure. CONCLUSION: We found no evidence of a specific increased risk of isolated orofacial clefts relative to other malformations due to LTG monotherapy. Our study was not designed to assess whether there is a generalized increased risk of malformations with LTG exposure.
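The odds ratios above come from a malformed-control case-control design. A generic sketch of an unadjusted OR with a Wald-type 95% CI follows (the counts below are invented, not EUROCAT data, and the published ORs were additionally adjusted for maternal age):

```python
import math

# Unadjusted odds ratio from a 2x2 table with a Wald-type 95% CI.
# a = exposed cases, b = unexposed cases, c = exposed controls,
# d = unexposed controls. All counts below are invented.

def odds_ratio_ci(a, b, c, d, z=1.96):
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

or_, lo, hi = odds_ratio_ci(10, 5000, 60, 80000)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

With the few exposed cases typical of rare-exposure studies like this one, the log-scale standard error is dominated by the 1/a term, which is why the published CIs around the LTG estimates are so wide.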