Abstract:
The present study identified and compared Coronary Heart Disease (CHD) risk factors, quantified as “CHD risk point standards” (CHDRPS), among tri-ethnic (White non-Hispanic [WNH], Hispanic [H], and Black non-Hispanic [BNH]) college students. All 300 tri-ethnic subjects completed the Cardiovascular Risk Assessment Instruments and had blood pressure readings recorded on three occasions. Bioelectrical Impedance Analysis (BIA) was used to measure body composition. Students' knowledge of CHD risk factors was also measured. In addition, a 15 ml fasting blood sample was collected from 180 subjects, and blood lipids and homocysteine (tHcy) levels were measured. Data were analyzed by gender and ethnicity using one-way Analysis of Variance (ANOVA) with Bonferroni's pairwise mean comparison procedure, Pearson correlation, and Chi-square tests with follow-up Bonferroni's Chi-square tests. The mean CHDRPS for all subjects was 19.15 ± 6.79. When assigned to CHD risk categories, the college students were at below-average risk of developing CHD. Males scored significantly (p < 0.013) higher for CHD risk than females, and BNHs scored significantly (p < 0.033) higher than WNHs. High consumption of dietary fat, saturated fat, and cholesterol resulted in a high CHDRPS among H males and females and WNH females. High alcohol consumption resulted in a high CHDRPS among all subjects. Mean tHcy ± SD for all subjects was 6.33 ± 3.15 μmol/L. Males had significantly (p < 0.001) higher tHcy than females. Black non-Hispanic females and H females had significantly (p < 0.003) lower tHcy than WNH females. Positive associations were found between tHcy levels and CHDRPS among females (p < 0.001), Hs (p < 0.001), H males (p < 0.049), H females (p < 0.009), and BNH females (p < 0.005).
Significant positive correlations were found between BMI levels and CHDRPS in males (p < 0.001), females (p < 0.001), WNHs (p < 0.008), Hs (p < 0.001), WNH males (p < 0.024), H males (p < 0.004), and H females (p < 0.001). The mean score on the CHD knowledge questions for all subjects was 71.70 ± 7.92 out of 100. Mean CHD knowledge was significantly higher for WNH males (p < 0.039) than for BNH males. A significant inverse correlation (r = -0.392, p < 0.032) was found between CHD knowledge and CHDRPS in WNH females. These findings indicate strong gender and ethnic differences in CHD risk factors among the college-age population.
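The Bonferroni-adjusted pairwise comparisons and Pearson correlations reported above can be sketched in a few lines. This is an illustrative implementation only; the group p-values and the BMI/CHDRPS data below are invented placeholders, not the study's data.

```python
from math import sqrt

def bonferroni(p_values):
    """Bonferroni adjustment: multiply each raw p-value by the number
    of comparisons, capping the result at 1.0."""
    m = len(p_values)
    return [min(p * m, 1.0) for p in p_values]

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical raw p-values from three pairwise group comparisons
raw = [0.010, 0.020, 0.040]
adjusted = bonferroni(raw)  # each raw p multiplied by m = 3

# Hypothetical BMI vs. CHDRPS values illustrating a positive correlation
bmi = [21, 24, 27, 30, 33]
chdrps = [12, 15, 19, 22, 27]
r = pearson_r(bmi, chdrps)
```

Note that Bonferroni adjustment preserves the overall (family-wise) error rate at the cost of power, which is why it is paired here with an omnibus ANOVA before pairwise testing.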
Abstract:
Childhood lead poisoning is a major consequence of environmental contamination affecting local populations, linking environmental health and environmental engineering. Environmental contamination is one of the pressing environmental concerns facing the world today. Current approaches often focus on large, industrial-scale contaminated sites that are designated by regulatory agencies for site remediation. Prior to this study, no published studies were known to have been conducted at the local, smaller scale, such as neighborhoods, where much of the contamination requiring remediation is often present. An environmental health study of local lead-poisoning data showed that Liberty City, Little Haiti, and eastern Little Havana in Miami-Dade County, Florida accounted for a disproportionately high number of the county’s reported childhood lead-poisoning cases. An engineering system was developed and designed for a comprehensive risk management methodology distinctively applicable to the geographical and environmental conditions of Miami-Dade County, Florida. Furthermore, a scientific approach for interpreting environmental health concerns, incorporating detailed environmental engineering control measures and methods for site remediation in contaminated media, was developed for implementation. Test samples were obtained from residents and sites in those specific communities in Miami-Dade County, Florida (Gasana and Chamorro 2002). Lead currently lacks an oral assessment, an inhalation assessment, and an oral slope factor, variables that are required to run a quantitative risk assessment. However, various institutional controls from federal agencies’ standards and regulations for lead-contaminated media yield adequate maximum concentration limits (MCLs). For this study, an MCL of 0.0015 mg/L was used. A risk management approach to contaminated media involving lead demonstrates that linking environmental health and environmental engineering can yield a feasible solution.
Abstract:
Subtitle D of the Resource Conservation and Recovery Act (RCRA) requires a post-closure period of 30 years for non-hazardous wastes in landfills. Post-closure care (PCC) activities under Subtitle D include leachate collection and treatment, groundwater monitoring, inspection and maintenance of the final cover, and monitoring to ensure that landfill gas does not migrate off site or into on-site buildings. The decision to reduce PCC duration requires exploration of a performance-based methodology applicable to Florida landfills. PCC should be based on whether the landfill is a threat to human health or the environment. Historically, no risk-based procedure has been available to establish an early end to PCC. Landfill stability depends on a number of factors, including variables that relate to operations both before and after the closure of a landfill cell. Therefore, PCC decisions should be based on location-specific factors, operational factors, design factors, post-closure performance, end use, and risk analysis. The question of the appropriate PCC period for Florida’s landfills requires in-depth case studies focusing on the analysis of performance data from closed landfills in Florida. Based on data availability, Davie Landfill was identified as the case-study site for a case-by-case analysis of landfill stability. The performance-based PCC decision system developed by Geosyntec Consultants was used to assess site conditions and project PCC needs. The available data on leachate and gas quantity and quality, groundwater quality, and cap conditions were evaluated. The quality and quantity data for leachate and gas were analyzed to project the levels of pollutants in leachate and groundwater in reference to the maximum contaminant level (MCL). In addition, projected gas quantities were estimated. A set of contaminants (including metals and organics) detected in groundwater was identified for health risk assessment.
These contaminants were selected based on their detection frequency and levels in leachate and groundwater, and on their historical and projected trends. During the evaluations, a range of discrepancies and problems related to data collection and documentation were encountered, and possible solutions were proposed. Based on the results of PCC performance integrated with risk assessment, future PCC monitoring needs and sustainable waste management options were identified. According to these results, landfill gas monitoring can be terminated, while leachate and groundwater monitoring for parameters above the MCL, together with surveys of cap integrity, should continue. The parameters that cause longer monitoring periods can be eliminated in future sustainable landfills. In conclusion, the 30-year PCC period can be reduced for some landfill components based on their potential impacts on human health and the environment (HH&E).
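Projecting pollutant levels against an MCL, as described above, can be sketched with a simple first-order decay model. This model choice and every number below are assumptions for illustration; the abstract does not specify the projection method, and these are not Davie Landfill data.

```python
from math import exp, log

def decay_rate(c0, c1, dt):
    """First-order decay constant k estimated from two concentrations
    measured dt years apart, assuming C(t) = C0 * exp(-k * t)."""
    return log(c0 / c1) / dt

def years_to_mcl(c0, k, mcl):
    """Years until the concentration falls to the MCL under first-order decay."""
    return log(c0 / mcl) / k

# Hypothetical monitoring data: 0.10 mg/L initially, 0.05 mg/L five years later
k = decay_rate(0.10, 0.05, 5.0)
# Years of continued monitoring before a hypothetical 0.015 mg/L MCL is met
t = years_to_mcl(0.10, k, 0.015)
```

Under this sketch, the projected time to reach the MCL is the kind of quantity that would feed a decision to shorten or extend a PCC monitoring period for a given parameter.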
Abstract:
Objective: To evaluate the ease of application of a heat illness prevention program (HIPP). Design: A mixed-method research design was used: questionnaire and semi-structured interview. Setting: Eleven South Florida high schools (mean ambient temperature = 84.0°F, mean relative humidity = 69.5%) participated in the HIPP in August. Participants: Certified Athletic Trainers (ATs) (n = 11; age = 22.2 ± 1.2 yr; 63.6% female, 36.4% male) implemented the HIPP with their football athletes, which included a pre-screening tool, the Heat Illness Index Score-Risk Assessment. Data Collection and Analysis: Participants completed a 17-item questionnaire, 4 items of which provided space for open-ended responses. Additionally, semi-structured interviews were voice recorded and separately transcribed. Results: Three participants (27.3%) were unable to implement the HIPP with any of their athletes. Of the 7 participants (63.6%) who implemented the HIPP with more than 50% of their athletes, a majority reported that the HIPP was difficult (54.5%) or exceedingly difficult (18.2%) to implement. Lack of appropriate instrumentation (81.8%, n = 9/11), lack of coaching staff/administrative support (54.5%, n = 6/11), insufficient support staff (54.5%, n = 6/11), too many athletes (45.5%, n = 5/11), and financial restrictions (36.4%, n = 4/11) deterred complete implementation of the HIPP. Conclusions: Because ATs in the high school setting often lack the resources, time, and coaches’ support to identify risk factors predisposing athletes to exertional heat illnesses (EHI), researchers should develop and validate a suitable screening tool. Further, ATs charged with the health care of high school athletes should seek out prevention programs and screening tools to identify high-risk athletes and monitor athletes throughout exercise in extreme environments.
Abstract:
Awareness of extreme high tide flooding in coastal communities has been increasing in recent years, reflecting growing concern over accelerated sea level rise. As a low-lying, urban coastal community with high-value real estate, Miami often tops worldwide rankings of cities vulnerable to sea level rise. Understanding perceptions of these changes and how communities are dealing with the impacts reveals much about vulnerability to climate change and the challenges of adaptation. This empirical study uses an innovative mixed-methods approach that combines ethnographic observations of high tide flooding, qualitative interviews, and analysis of tidal data to reveal the coping strategies used by residents and businesses as well as perceptions of sea level rise and climate change, and to assess the relationship between measurable sea levels and perceptions of flooding. I conduct a case study of Miami Beach's stormwater master planning process, one of the first in the nation to include sea level rise projections, which reveals the different and sometimes competing logics of planners, public officials, activists, residents, and business interests with regard to climate change adaptation. By taking a deeply contextual account of hazards and adaptation efforts in a local area, I demonstrate how this approach can shed light on some of the challenges posed by anthropogenic climate change and accelerated rates of sea level rise. The findings highlight challenges for infrastructure planning in low-lying, urban coastal areas, and for individual risk assessment in the context of rapidly evolving discourse about the threat of sea level rise. Recognition of the trade-offs and limits of incremental adaptation strategies points to transformative approaches, while also highlighting equity concerns in adaptation governance and planning.
This new impact assessment method contributes to the integration of social and physical science approaches to climate change, resulting in an improved understanding of socio-ecological vulnerability to environmental change.
Abstract:
Historically, humans have empirically acquired knowledge about the therapeutic applications of elements extracted from the natural environment to which they belonged. Over time, such knowledge culminated in the formation of traditional health systems. Among their features, the use of bioactive plant species, medicinal plants, stands out for its efficiency and high popular acceptance. Despite their importance for public health, the population still relies on open-air markets as the main source for acquiring these species. In these spaces, trade generally occurs informally, under conditions unfavorable both to product quality and to the financial sustainability of the business. In this context, this study aimed to characterize the socioeconomic, cultural, and sanitary aspects of the medicinal plant trade in municipalities of a semiarid region of Rio Grande do Norte and, additionally, to propose specific legislation for the activity. Socioeconomic data were collected through on-site interviews guided by a structured form. Observations of the hygienic and sanitary adequacy of the physical facilities and of the practices employed at the point of sale were conducted and recorded using an assessment tool developed for use in open markets. The suitability of the medicinal plants for consumption was determined by microbiological analysis. The activity was carried out predominantly by males aged between 21 and 81 years, with low educational levels and low income. The data showed a tendency toward extinction of the activity in all the districts studied. Hygiene and sanitation inadequacies characterizing very high health risk were observed at all the markets studied, representing a high probability of foodborne disease outbreaks. These conditions were reflected in the high percentage of analyzed medicinal plant samples unsuitable for consumption, illustrating the potential health risk to consumers.
To help correct the hygiene and sanitation inadequacies observed in the studied open-air markets, educational interventions were conducted to train traders in Good Practices. As a complement, specific legislation was drafted for the marketing of folk-medicine products in open-air markets. These actions and products, and their developments, will contribute significantly to improving the quality of products available to the population and to preserving the activity, potentially reducing risks to public health.
Abstract:
Dry eye syndrome is a multifactorial disease of the tear film, resulting from instability of the lacrimal functional unit that produces alterations in tear volume, composition, or distribution. In intensive care patients the risk is heightened by various factors, such as mechanical ventilation, sedation, lagophthalmos, and low temperatures, among others. The purpose of this study is to build an assessment tool for dry eye severity in patients hospitalized in intensive care units, based on the systematization of nursing care and its classification systems. This is a methodological study conducted in three stages: context analysis, concept analysis, and construction of the operational definitions and magnitudes of the nursing outcome. The first stage used the methodological framework of Hinds, Chaves, and Cypress (1992). The second stage used the model of Walker and Avant and an integrative review according to Whittemore and Knafl (2005). This stage enabled identification of the concept's attributes, antecedents, and consequents, and construction of the definitions for the nursing outcome severity of dry eye. For the construction of the operational definitions and magnitudes, the psychometric model proposed by Pasquali (1999) was used. The context analysis showed that the matter should be discussed and that nursing needs to attend to the problem of eye injury, so that strategies can be created to minimize this highly prevalent event. The integrative review located 19,853 titles across the database searches; 215 were selected, and based on the abstracts 96 articles were read in full. After reading, 10 were excluded, yielding a sample of 86 articles that were used for the concept analysis and the construction of the definitions.
The selected articles were found in greatest number in the Scopus database (55.82%), were conducted in the United States (39.53%), and were published mainly in the last five years (48.82%). In the concept analysis, the antecedents identified were: age, lagophthalmos, environmental factors, medication use, systemic diseases, mechanical ventilation, and ophthalmic surgery. The attributes were: TBUT < 10 s, Schirmer I test < 5 mm, Schirmer II test < 10 mm, and reduced osmolarity. The consequents were: ocular surface damage, ocular discomfort, and visual instability. The definitions were built, and indicators such as decreased blink mechanism and eyestrain were added.
Abstract:
Family health history (FHH) in the context of risk assessment has been shown to positively impact risk perception and behavior change. The added value of genetic risk testing is less certain. The aim of this study was to determine the impact of Type 2 Diabetes (T2D) FHH and genetic risk counseling on behavior and its cognitive precursors. Subjects were non-diabetic patients randomized to counseling that included FHH with or without T2D genetic testing. Measurements included weight, BMI, and fasting glucose at baseline and 12 months, and surveys of behavior and cognitive precursors (T2D risk perception and perceived control over disease development) at baseline, 3, and 12 months. Of 391 subjects enrolled, 312 completed the study. Behavioral and clinical outcomes did not differ across FHH or genetic risk, but cognitive precursors did. Higher FHH risk was associated with a stronger perceived T2D risk (pKendall < 0.001) and with a perception of "serious" risk (pKendall < 0.001). Genetic risk did not influence risk perception, but was correlated with an increase in perception of "serious" risk for moderate (pKendall = 0.04) and average FHH risk subjects (pKendall = 0.01), though not for the high FHH risk group. Perceived control over T2D risk was high and not affected by FHH or genetic risk. FHH appears to have a strong impact on cognitive precursors of behavior change, suggesting it could be leveraged to enhance risk counseling, particularly when lifestyle change is desirable. Genetic risk was able to alter perceptions about the seriousness of T2D risk in those with moderate and average FHH risk, suggesting that FHH could be used to selectively identify individuals who may benefit from genetic risk testing.
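The Kendall correlations reported above (pKendall) test monotonic association between ordered categories by comparing concordant and discordant pairs. A minimal sketch of Kendall's tau-a on made-up ordinal data (the FHH and perception values below are hypothetical, not the study's):

```python
from itertools import combinations

def kendall_tau_a(x, y):
    """Kendall's tau-a: (concordant - discordant) pairs divided by the
    total number of pairs. Simple form, suitable for data without ties."""
    concordant = discordant = 0
    for i, j in combinations(range(len(x)), 2):
        s = (x[i] - x[j]) * (y[i] - y[j])
        if s > 0:
            concordant += 1
        elif s < 0:
            discordant += 1
    n = len(x)
    return (concordant - discordant) / (n * (n - 1) / 2)

# Hypothetical ordinal FHH risk levels vs. perceived-risk scores
fhh = [1, 2, 3, 4, 5]
perceived = [2, 1, 4, 3, 5]
tau = kendall_tau_a(fhh, perceived)
```

In practice a tau-b variant (which corrects for ties) and an accompanying p-value would be used for survey data of this kind; the tau-a form above keeps the pair-counting idea visible.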
Abstract:
Estimation of absolute risk of cardiovascular disease (CVD), preferably with population-specific risk charts, has become a cornerstone of CVD primary prevention. Regular recalibration of risk charts may be necessary due to decreasing CVD rates and CVD risk factor levels. The SCORE risk charts for fatal CVD risk assessment were first calibrated for Germany with 1998 risk factor level data and 1999 mortality statistics. We present an update of these risk charts based on the SCORE methodology, including estimates of relative risks from SCORE, risk factor levels from the German Health Interview and Examination Survey for Adults 2008-11 (DEGS1), and official mortality statistics from 2012. Competing risks methods were applied and estimates were independently validated. Updated risk charts were calculated based on cholesterol, smoking, and systolic blood pressure risk factor levels, sex, and 5-year age groups. The absolute 10-year risk estimates of fatal CVD were lower according to the updated risk charts compared to the first calibration for Germany. In a nationwide sample of 3062 adults aged 40-65 years free of major CVD from DEGS1, the mean 10-year risk of fatal CVD estimated by the updated charts was lower by 29%, and the estimated proportion of high-risk people (10-year risk ≥ 5%) by 50%, compared to the older risk charts. This recalibration shows a need for regular updates of risk charts according to changes in mortality and risk factor levels in order to sustain the identification of people with a high CVD risk.
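In SCORE-style risk charts like those described above, an individual's absolute 10-year fatal CVD risk combines a calibrated baseline survival with relative risks from the risk factors; recalibration replaces the baseline with one matching current mortality and risk factor levels. The baseline survival and coefficients below are invented for illustration and are not the published German calibration.

```python
from math import exp

def ten_year_fatal_cvd_risk(s0, coefs, values, refs):
    """Absolute 10-year risk = 1 - S0(10)^exp(w), where w is the linear
    predictor of risk-factor deviations from reference values."""
    w = sum(b * (x - r) for b, x, r in zip(coefs, values, refs))
    return 1.0 - s0 ** exp(w)

# Hypothetical calibration: baseline 10-year survival and coefficients
# for total cholesterol (mmol/L), systolic BP (mmHg), smoking (0/1)
s0 = 0.98
coefs = [0.24, 0.018, 0.71]
refs = [6.0, 130.0, 0.0]

low = ten_year_fatal_cvd_risk(s0, coefs, [5.0, 120.0, 0], refs)   # favorable profile
high = ten_year_fatal_cvd_risk(s0, coefs, [7.5, 160.0, 1], refs)  # adverse profile
```

Lowering `s0`'s complement (i.e., raising baseline survival) in a recalibration shrinks every cell of the chart, which is how updated mortality statistics translate into the lower risks reported above.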
Abstract:
The goals of this program of research were to examine the link between self-reported vulvar pain and clinical diagnoses, and to create a user-friendly assessment tool to aid in that process. These goals were pursued through a series of four empirical studies (Chapters 2-6): one archival study, two online studies, and one study conducted in a Women’s Health clinic. In Chapter 2, the link between self-report and clinical diagnosis was confirmed by extracting data from multiple studies conducted in the Sexual Health Research Laboratory over the course of several years. We demonstrated the accuracy of diagnosis based on multiple factors, and explored the varied gynecological presentation of different diagnostic groups. Chapter 3 was based on an online study designed to create the Vulvar Pain Assessment Questionnaire (VPAQ) inventory. Following the construct validation approach, a large pool of potential items was created to capture a broad selection of vulvar pain symptoms. Nearly 300 participants completed the entire item pool, and a series of factor analyses was used to narrow down the items and create scales/subscales. Relationships were computed among the subscales and validated scales to establish convergent and discriminant validity. Chapters 4 and 5 were conducted in the Department of Obstetrics & Gynecology at Oregon Health & Science University. The brief screening version of the VPAQ was employed with patients of the Program in Vulvar Health at the Center for Women’s Health. The accuracy and usefulness of the VPAQscreen were determined from the perspective of patients as well as their health care providers, and the treatment-seeking experiences of patients were explored. Finally, a second online study was conducted to confirm the factor structure, internal consistency, and test-retest reliability of the VPAQ inventory.
The results presented in these chapters confirm the link between targeted questions and accurate diagnoses, and provide a guideline that is useful and accessible for providers and patients.
Abstract:
The recent crisis of the capitalist economic system has altered working conditions and occupations in the European Union. The recession has accelerated trends and brought about transformations that had been observed before. These changes have not looked the same in all countries of the Union. Social occupation norms, labour relations models, and the type of overall welfare provision can help explain some of these inequalities. Poor working conditions can expose workers to situations of great risk. This is one of the basic assumptions of the theoretical models and analytical studies of the psychosocial work environment approach. Changes in the working conditions of the population seem important in explaining the worst health states. To observe these features in the current period of economic recession, a comparative trend study was carried out using the 2005 and 2010 editions of the European Working Conditions Survey. Different multivariate logistic regression models were also fitted to explore potential associations with the worst conditions of employment and work. It seems that the economic crisis has intensified changes in working conditions and highlighted the effects of those conditions on the poor health of the working population. This conclusion cannot be extended to all EU countries; some differences were observed across overall welfare models.
Abstract:
PURPOSE: To evaluate the addition of cetuximab to neoadjuvant chemotherapy before chemoradiotherapy in high-risk rectal cancer. PATIENTS AND METHODS: Patients with operable magnetic resonance imaging-defined high-risk rectal cancer received four cycles of capecitabine/oxaliplatin (CAPOX) followed by capecitabine chemoradiotherapy, surgery, and adjuvant CAPOX (four cycles) or the same regimen plus weekly cetuximab (CAPOX+C). The primary end point was complete response (CR; pathologic CR or, in patients not undergoing surgery, radiologic CR) in patients with KRAS/BRAF wild-type tumors. Secondary end points were radiologic response (RR), progression-free survival (PFS), overall survival (OS), and safety in the wild-type and overall populations and a molecular biomarker analysis. RESULTS: One hundred sixty-five eligible patients were randomly assigned. Ninety (60%) of 149 assessable tumors were KRAS or BRAF wild type (CAPOX, n = 44; CAPOX+C, n = 46), and in these patients, the addition of cetuximab did not improve the primary end point of CR (9% v 11%, respectively; P = 1.0; odds ratio, 1.22) or PFS (hazard ratio [HR], 0.65; P = .363). Cetuximab significantly improved RR (CAPOX v CAPOX+C: after chemotherapy, 51% v 71%, respectively; P = .038; after chemoradiation, 75% v 93%, respectively; P = .028) and OS (HR, 0.27; P = .034). Skin toxicity and diarrhea were more frequent in the CAPOX+C arm. CONCLUSION: Cetuximab led to a significant increase in RR and OS in patients with KRAS/BRAF wild-type rectal cancer, but the primary end point of improved CR was not met.
Abstract:
AIMS: Our aims were to evaluate the distribution of troponin I concentrations in population cohorts across Europe, to characterize the association with cardiovascular outcomes, to determine the predictive value beyond the variables used in the ESC SCORE, to test a potentially clinically relevant cut-off value, and to evaluate the improved eligibility for statin therapy based on elevated troponin I concentrations retrospectively.
METHODS AND RESULTS: Based on the Biomarkers for Cardiovascular Risk Assessment in Europe (BiomarCaRE) project, we analysed individual level data from 10 prospective population-based studies including 74 738 participants. We investigated the value of adding troponin I levels to conventional risk factors for prediction of cardiovascular disease by calculating measures of discrimination (C-index) and net reclassification improvement (NRI). We further tested the clinical implication of statin therapy based on troponin concentration in 12 956 individuals free of cardiovascular disease in the JUPITER study. Troponin I remained an independent predictor with a hazard ratio of 1.37 for cardiovascular mortality, 1.23 for cardiovascular disease, and 1.24 for total mortality. The addition of troponin I information to a prognostic model for cardiovascular death constructed of ESC SCORE variables increased the C-index discrimination measure by 0.007 and yielded an NRI of 0.048, whereas the addition to prognostic models for cardiovascular disease and total mortality led to lesser C-index discrimination and NRI increment. In individuals above 6 ng/L of troponin I, a concentration near the upper quintile in BiomarCaRE (5.9 ng/L) and JUPITER (5.8 ng/L), rosuvastatin therapy resulted in higher absolute risk reduction compared with individuals <6 ng/L of troponin I, whereas the relative risk reduction was similar.
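The net reclassification improvement (NRI) used above can be sketched directly from its definition: the net proportion of events moved to a higher risk category plus the net proportion of non-events moved to a lower one. The risk categories and outcomes below are toy data, not BiomarCaRE values.

```python
def nri(old_cat, new_cat, event):
    """Categorical NRI: net upward reclassification among events plus
    net downward reclassification among non-events."""
    up_e = down_e = up_ne = down_ne = n_e = n_ne = 0
    for o, n, e in zip(old_cat, new_cat, event):
        if e:
            n_e += 1
            up_e += n > o
            down_e += n < o
        else:
            n_ne += 1
            up_ne += n > o
            down_ne += n < o
    return (up_e - down_e) / n_e + (down_ne - up_ne) / n_ne

# Toy example: risk category 0 (low) / 1 (high) before and after adding a
# biomarker to the model; event = 1 means the outcome occurred
old = [0, 0, 1, 1, 0, 1]
new = [1, 0, 1, 0, 0, 1]
evt = [1, 1, 1, 0, 0, 0]
improvement = nri(old, new, evt)
```

A positive NRI, like the 0.048 reported above for cardiovascular death, indicates that the added marker moves individuals between categories in the clinically correct direction on balance.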
CONCLUSION: In individuals free of cardiovascular disease, the addition of troponin I to variables of established risk score improves prediction of cardiovascular death and cardiovascular disease.
Abstract:
OBJECTIVE: To determine risk of Down syndrome (DS) in multiple relative to singleton pregnancies, and compare prenatal diagnosis rates and pregnancy outcome.
DESIGN: Population-based prevalence study based on EUROCAT congenital anomaly registries.
SETTING: Eight European countries.
POPULATION: 14.8 million births 1990-2009; 2.89% multiple births.
METHODS: DS cases included livebirths, fetal deaths from 20 weeks, and terminations of pregnancy for fetal anomaly (TOPFA). Zygosity is inferred from like/unlike sex for birth denominators, and from concordance for DS cases.
MAIN OUTCOME MEASURES: Relative risk (RR) of DS per fetus/baby from multiple versus singleton pregnancies and per pregnancy in monozygotic/dizygotic versus singleton pregnancies. Proportion of prenatally diagnosed and pregnancy outcome.
STATISTICAL ANALYSIS: Poisson and logistic regression stratified for maternal age, country and time.
RESULTS: Overall, the adjusted (adj) RR of DS for fetus/babies from multiple versus singleton pregnancies was 0.58 (95% CI 0.53-0.62), similar for all maternal ages except for mothers over 44, for whom it was considerably lower. In 8.7% of twin pairs affected by DS, both co-twins were diagnosed with the condition. The adjRR of DS for monozygotic versus singleton pregnancies was 0.34 (95% CI 0.25-0.44) and for dizygotic versus singleton pregnancies 1.34 (95% CI 1.23-1.46). DS fetuses from multiple births were less likely to be prenatally diagnosed than singletons (adjOR 0.62 [95% CI 0.50-0.78]) and following diagnosis less likely to be TOPFA (adjOR 0.40 [95% CI 0.27-0.59]).
CONCLUSIONS: The risk of DS per fetus/baby is lower in multiple than singleton pregnancies. These estimates can be used for genetic counselling and prenatal screening.
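A crude relative risk with a 95% confidence interval, of the kind refined above by stratified Poisson regression, is computed on the log scale. The counts below are invented to mirror the reported RR of 0.58 and are not the EUROCAT data; real adjusted RRs would come from the regression models, not this crude ratio.

```python
from math import exp, log, sqrt

def relative_risk_ci(a, n1, b, n2, z=1.96):
    """Relative risk (a/n1)/(b/n2) with a Wald confidence interval on the
    log scale: exp(log(RR) +/- z * SE(log RR))."""
    rr = (a / n1) / (b / n2)
    se = sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)
    lo = exp(log(rr) - z * se)
    hi = exp(log(rr) + z * se)
    return rr, lo, hi

# Hypothetical counts: 58 DS cases per 10,000 fetuses/babies from multiple
# pregnancies vs. 100 per 10,000 from singleton pregnancies
rr, lo, hi = relative_risk_ci(58, 10_000, 100, 10_000)
```

Because the log of a risk ratio is approximately normal, the interval is built on the log scale and exponentiated back, which is why CIs for RRs (like 0.53-0.62 above) are asymmetric around the point estimate.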