969 results for Pieter Visscher
Abstract:
BACKGROUND: A complete remission is essential for prolonging survival in patients with acute myeloid leukemia (AML). Daunorubicin is a cornerstone of the induction regimen, but the optimal dose is unknown. In older patients, it is usual to give daunorubicin at a dose of 45 to 50 mg per square meter of body-surface area. METHODS: Patients in whom AML or high-risk refractory anemia had been newly diagnosed and who were 60 to 83 years of age (median, 67) were randomly assigned to receive cytarabine, at a dose of 200 mg per square meter by continuous infusion for 7 days, plus daunorubicin for 3 days, either at the conventional dose of 45 mg per square meter (411 patients) or at an escalated dose of 90 mg per square meter (402 patients); this treatment was followed by a second cycle of cytarabine at a dose of 1000 mg per square meter every 12 hours [DOSAGE ERROR CORRECTED] for 6 days. The primary end point was event-free survival. RESULTS: The complete remission rates were 64% in the group that received the escalated dose of daunorubicin and 54% in the group that received the conventional dose (P=0.002); the rates of remission after the first cycle of induction treatment were 52% and 35%, respectively (P<0.001). There was no significant difference between the two groups in the incidence of hematologic toxic effects, 30-day mortality (11% and 12% in the two groups, respectively), or the incidence of moderate, severe, or life-threatening adverse events (P=0.08). Survival end points in the two groups did not differ significantly overall, but patients in the escalated-treatment group who were 60 to 65 years of age, as compared with the patients in the same age group who received the conventional dose, had higher rates of complete remission (73% vs. 51%), event-free survival (29% vs. 14%), and overall survival (38% vs. 23%). 
CONCLUSIONS: In patients with AML who are older than 60 years of age, escalation of the dose of daunorubicin to twice the conventional dose, with the entire dose administered in the first induction cycle, effects a more rapid response and a higher response rate than does the conventional dose, without additional toxic effects. (Current Controlled Trials number, ISRCTN77039377; and Netherlands National Trial Register number, NTR212.)
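Both dose levels in the trial are expressed per square meter of body-surface area. As an illustration of how such a dose converts to an absolute amount, the sketch below uses the Mosteller formula for BSA — one common convention, assumed here and not specified in the abstract — with the trial's two daunorubicin dose levels and a hypothetical patient.

```python
from math import sqrt

def bsa_mosteller(height_cm, weight_kg):
    # Mosteller approximation of body-surface area in m^2
    # (an assumed convention, not stated in the trial report)
    return sqrt(height_cm * weight_kg / 3600)

def absolute_dose_mg(dose_per_m2, height_cm, weight_kg):
    # total dose in mg for one administration at the given mg/m^2 level
    return dose_per_m2 * bsa_mosteller(height_cm, weight_kg)

# hypothetical 170 cm, 70 kg patient: conventional vs. escalated daunorubicin
bsa = bsa_mosteller(170, 70)                 # ~1.82 m^2
conventional = absolute_dose_mg(45, 170, 70)
escalated = absolute_dose_mg(90, 170, 70)    # exactly twice the conventional dose
```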
Abstract:
BACKGROUND: Phaeochromocytomas and paragangliomas are neuro-endocrine tumours that occur sporadically and in several hereditary tumour syndromes, including the phaeochromocytoma-paraganglioma syndrome. This syndrome is caused by germline mutations in succinate dehydrogenase B (SDHB), C (SDHC), or D (SDHD) genes. Clinically, the phaeochromocytoma-paraganglioma syndrome is often unrecognised, although 10-30% of apparently sporadic phaeochromocytomas and paragangliomas harbour germline SDH-gene mutations. Despite these figures, the screening of phaeochromocytomas and paragangliomas for mutations in the SDH genes to detect phaeochromocytoma-paraganglioma syndrome is rarely done because of time and financial constraints. We investigated whether SDHB immunohistochemistry could effectively discriminate between SDH-related and non-SDH-related phaeochromocytomas and paragangliomas in large retrospective and prospective tumour series. METHODS: Immunohistochemistry for SDHB was done on 220 tumours. Two retrospective series of 175 phaeochromocytomas and paragangliomas with known germline mutation status for phaeochromocytoma-susceptibility or paraganglioma-susceptibility genes were investigated. Additionally, a prospective series of 45 phaeochromocytomas and paragangliomas was investigated for SDHB immunostaining followed by SDHB, SDHC, and SDHD mutation testing. FINDINGS: SDHB protein expression was absent in all 102 phaeochromocytomas and paragangliomas with an SDHB, SDHC, or SDHD mutation, but was present in all 65 paraganglionic tumours related to multiple endocrine neoplasia type 2, von Hippel-Lindau disease, and neurofibromatosis type 1. 47 (89%) of the 53 phaeochromocytomas and paragangliomas with no syndromic germline mutation showed SDHB expression. The sensitivity and specificity of the SDHB immunohistochemistry to detect the presence of an SDH mutation in the prospective series were 100% (95% CI 87-100) and 84% (60-97), respectively. 
INTERPRETATION: Phaeochromocytoma-paraganglioma syndrome can be diagnosed reliably by an immunohistochemical procedure. SDHB, SDHC, and SDHD germline mutation testing is indicated only in patients with SDHB-negative tumours. SDHB immunohistochemistry on phaeochromocytomas and paragangliomas could improve the diagnosis of phaeochromocytoma-paraganglioma syndrome. FUNDING: The Netherlands Organisation for Scientific Research, Dutch Cancer Society, Vanderes Foundation, Association pour la Recherche contre le Cancer, Institut National de la Santé et de la Recherche Médicale, and a PHRC grant COMETE 3 for the COMETE network.
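The reported sensitivity of 100% (95% CI 87–100) and specificity of 84% (60–97) are the shape of exact (Clopper–Pearson) binomial intervals; counts of roughly 26 mutation-positive and 19 mutation-negative tumours would reproduce them, though those counts are inferred here for illustration, not taken from the paper. A minimal stdlib-only sketch:

```python
from math import comb

def binom_cdf(x, n, p):
    # P(X <= x) for X ~ Binomial(n, p)
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(x + 1))

def _bisect(f, lo=0.0, hi=1.0):
    # root of a monotone f on [lo, hi] with f(lo), f(hi) of opposite sign
    flo = f(lo)
    for _ in range(100):
        mid = (lo + hi) / 2
        if (f(mid) > 0) == (flo > 0):
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def clopper_pearson(x, n, alpha=0.05):
    # exact two-sided binomial confidence interval for a proportion x/n
    lo = 0.0 if x == 0 else _bisect(lambda p: (1 - binom_cdf(x - 1, n, p)) - alpha / 2)
    hi = 1.0 if x == n else _bisect(lambda p: binom_cdf(x, n, p) - alpha / 2)
    return lo, hi

# hypothetical counts: 26/26 mutated tumours SDHB-negative on staining,
# 16/19 non-mutated tumours SDHB-positive
sens_lo, sens_hi = clopper_pearson(26, 26)   # lower bound ~0.87, as in "87-100"
spec_lo, spec_hi = clopper_pearson(16, 19)
```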
Abstract:
Self-absorption of the Eu2+ emission is an important aspect in SrI2:Eu that affects its scintillation performance. To calculate the probability of self-absorption, we measured the light yield and the decay time of 1–15 mm thick SrI2:2%Eu samples at temperatures between 78 K and 600 K. The obtained properties of SrI2:2%Eu crystals were then compared to those of SrI2:5%Eu. The decay times of SrI2:5%Eu crystals were the same or somewhat longer compared to those of twice as thick SrI2:2%Eu crystals. Accordingly, doubling the thickness has the same effect on the probability of self-absorption as doubling the Eu concentration.
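The concluding equivalence — doubling thickness acting like doubling Eu concentration — is what a simple Beer–Lambert-style attenuation model predicts, since the absorbed fraction depends on the product of absorber concentration and path length. A hedged sketch (the scale factor k is an arbitrary illustrative parameter, not a fitted value from the paper):

```python
from math import exp

def self_absorption_prob(eu_concentration_pct, thickness_mm, k=0.05):
    # Beer-Lambert-style model: the absorbed fraction depends only on the
    # product concentration x path length (k is an arbitrary scale factor)
    return 1.0 - exp(-k * eu_concentration_pct * thickness_mm)

# doubling the thickness at 2% Eu gives the same self-absorption probability
# as doubling the concentration to 4% at the original thickness
p_thick = self_absorption_prob(2, 10)
p_conc = self_absorption_prob(4, 5)
```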
Improvement of LaBr3:5%Ce scintillation properties by Li+, Na+, Mg2+, Ca2+, Sr2+, and Ba2+ co-doping
Abstract:
This paper reports on the effects of Li+, Na+, Mg2+, Ca2+, Sr2+, and Ba2+ co-doping on the scintillation properties of LaBr3:5%Ce3+. Pulse-height spectra of various gamma and X-ray sources with energies from 8 keV to 1.33 MeV were measured from which the values of light yield and energy resolution were derived. Sr2+ and Ca2+ co-doped crystals showed excellent energy resolution as compared to standard LaBr3:Ce. The proportionality of the scintillation response to gamma and X-rays of Ca2+, Sr2+, and Ba2+ co-doped samples also considerably improves. The effects of the co-dopants on emission spectra, decay time, and temperature stability of the light yield were studied. Multiple thermoluminescence glow peaks, decrease of the light yield at temperatures below 295 K, and additional long scintillation decay components were observed and related to charge carrier traps appearing in LaBr3:Ce3+ with Ca2+, Sr2+, and Ba2+ co-doping.
Abstract:
Sr2+ co-doped LaBr3:5%Ce scintillators show a record low energy resolution of 2% at 662 keV and a considerably better proportional response compared to standard LaBr3:5%Ce. This paper reports on the optical properties and time response of Sr co-doped LaBr3:5%Ce. Multiple excitation and emission bands were observed in X-ray and optically excited luminescence measurements. Those bands are ascribed to three different Ce3+ sites. The first is the unperturbed site with the same luminescence properties as those of standard LaBr3:Ce. The other two are perturbed sites with red-shifted 4f-5d1 Ce3+ excitation and emission bands, longer Ce3+ decay times, and smaller Stokes shifts. The lowering of the lowest 5d level of Ce3+ was ascribed to larger crystal field interactions at the perturbed sites. Two types of point defects in the LaBr3 matrix were proposed to explain the observed results. No Ce4+ ions were detected in Sr co-doped LaBr3:5%Ce by diffuse reflectance measurements.
Abstract:
Background: Available studies vary in their estimated prevalence of attention deficit/hyperactivity disorder (ADHD) in substance use disorder (SUD) patients, ranging from 2 to 83%. A better understanding of the possible reasons for this variability and the effect of the change from DSM-IV to DSM-5 is needed. Methods: A two-stage international multi-center, cross-sectional study in 10 countries, among patients from inpatient and outpatient addiction treatment centers for alcohol and/or drug use disorder patients. A total of 3558 treatment-seeking SUD patients were screened for adult ADHD. A subsample of 1276 subjects, both screen-positive and screen-negative patients, participated in a structured diagnostic interview. Results: Prevalence of DSM-IV and DSM-5 adult ADHD varied for DSM-IV from 5.4% (95% CI: 2.4–8.3) for Hungary to 31.3% (95% CI: 25.2–37.5) for Norway, and for DSM-5 from 7.6% (95% CI: 4.1–11.1) for Hungary to 32.6% (95% CI: 26.4–38.8) for Norway. Using the same assessment procedures in all countries and centers resulted in a substantial reduction of the variability in the prevalence of adult ADHD reported in previous studies among SUD patients (2–83% → 5.4–31.3%). The remaining variability was partly explained by primary substance of abuse and by country (Nordic versus non-Nordic countries). Prevalence estimates for DSM-5 were slightly higher than for DSM-IV. Conclusions: Given the generally high prevalence of adult ADHD, all treatment-seeking SUD patients should be screened and, after a confirmed diagnosis, treated for ADHD, since the literature indicates poor prognoses of SUD in treatment-seeking SUD patients with ADHD.
Abstract:
Aims To determine comorbidity patterns in treatment-seeking substance use disorder (SUD) patients with and without adult attention deficit hyperactivity disorder (ADHD), with an emphasis on subgroups defined by ADHD subtype, taking into account differences related to gender and primary substance of abuse. Design Data were obtained from the cross-sectional International ADHD in Substance use disorder Prevalence (IASP) study. Setting Forty-seven centres of SUD treatment in 10 countries. Participants A total of 1205 treatment-seeking SUD patients. Measurements Structured diagnostic assessments were used for all disorders: presence of ADHD was assessed with the Conners' Adult ADHD Diagnostic Interview for DSM-IV (CAADID), the presence of antisocial personality disorder (ASPD), major depression (MD) and (hypo)manic episode (HME) was assessed with the Mini International Neuropsychiatric Interview-Plus (MINI Plus), and the presence of borderline personality disorder (BPD) was assessed with the Structured Clinical Interview for DSM-IV Axis II (SCID II). Findings The prevalence of DSM-IV adult ADHD in this SUD sample was 13.9%. ASPD [odds ratio (OR) = 2.8, 95% confidence interval (CI) = 1.8–4.2], BPD (OR = 7.0, 95% CI = 3.1–15.6 for alcohol; OR = 3.4, 95% CI = 1.8–6.4 for drugs), MD in patients with alcohol as primary substance of abuse (OR = 4.1, 95% CI = 2.1–7.8) and HME (OR = 4.3, 95% CI = 2.1–8.7) were all more prevalent in ADHD+ compared with ADHD− patients (P < 0.001). These results also indicate increased levels of BPD and MD for alcohol compared with drugs as primary substance of abuse. Comorbidity patterns differed between ADHD subtypes with increased MD in the inattentive and combined subtype (P < 0.01), increased HME and ASPD in the hyperactive/impulsive (P < 0.01) and combined subtypes (P < 0.001) and increased BPD in all subtypes (P < 0.001) compared with SUD patients without ADHD. 
Seventy-five per cent of ADHD patients had at least one additional comorbid disorder compared with 37% of SUD patients without ADHD. Conclusions Treatment-seeking substance use disorder patients with attention deficit hyperactivity disorder are at a very high risk for additional externalizing disorders.
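The odds ratios and 95% confidence intervals above are standard epidemiological quantities; the sketch below shows how an OR and its Woolf (log-scale) interval are computed from a 2x2 table, using hypothetical counts rather than the study data:

```python
from math import exp, log, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Woolf 95% CI for a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se_log = sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # standard error of log(OR)
    lo = exp(log(or_) - z * se_log)
    hi = exp(log(or_) + z * se_log)
    return or_, lo, hi

# hypothetical counts only -- e.g. a comorbid disorder tallied
# in ADHD+ versus ADHD- SUD patients
or_, lo, hi = odds_ratio_ci(10, 5, 4, 8)   # OR = 4.0
```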
Abstract:
Aims: To assess observations with multimodality imaging of the Absorb bioresorbable everolimus-eluting vascular scaffold performed in two consecutive cohorts of patients who were serially investigated either at 6 and 24 months or at 12 and 36 months. Methods and results: In the ABSORB multicentre single-arm trial, 45 patients (cohort B1) and 56 patients (cohort B2) underwent serial invasive imaging, specifically quantitative coronary angiography (QCA), intravascular ultrasound (IVUS), radiofrequency backscattering (IVUS-VH) and optical coherence tomography (OCT). Between one and three years, late luminal loss remained unchanged (6 months: 0.19 mm, 1 year: 0.27 mm, 2 years: 0.27 mm, 3 years: 0.29 mm) and the in-segment angiographic restenosis rate for the entire cohort B (n=101) at three years was 6%. On IVUS, mean lumen, scaffold, plaque and vessel area showed enlargement up to two years. Mean lumen and scaffold area remained stable between two and three years whereas significant reduction in plaque behind the struts occurred with a trend toward adaptive restrictive remodelling of EEM. Hyperechogenicity of the vessel wall, a surrogate of the bioresorption process, decreased from 23.1% to 10.4% with a reduction of radiofrequency backscattering for dense calcium and necrotic core. At three years, the count of strut cores detected on OCT increased significantly, probably reflecting the dismantling of the scaffold; 98% of struts were covered. In the entire cohort B (n=101), the three-year major adverse cardiac event rate was 10.0% without any scaffold thrombosis. Conclusions: The current investigation demonstrated the dynamics of vessel wall changes after implantation of a bioresorbable scaffold, resulting at three years in stable luminal dimensions, a low restenosis rate and a low clinical major adverse cardiac events rate.
Abstract:
BACKGROUND The safety and efficacy of drug-eluting stents (DES) in the treatment of coronary artery disease have been assessed in several randomised trials. However, none of these trials were powered to assess the safety and efficacy of DES in women because only a small proportion of recruited participants were women. We therefore investigated the safety and efficacy of DES in female patients during long-term follow-up. METHODS We pooled patient-level data for female participants from 26 randomised trials of DES and analysed outcomes according to stent type (bare-metal stents, early-generation DES, and newer-generation DES). The primary safety endpoint was a composite of death or myocardial infarction. The secondary safety endpoint was definite or probable stent thrombosis. The primary efficacy endpoint was target-lesion revascularisation. Analysis was by intention to treat. FINDINGS Of 43,904 patients recruited in 26 trials of DES, 11,557 (26·3%) were women (mean age 67·1 years [SD 10·6]). 1108 (9·6%) women received bare-metal stents, 4171 (36·1%) early-generation DES, and 6278 (54·3%) newer-generation DES. At 3 years, estimated cumulative incidence of the composite of death or myocardial infarction occurred in 132 (12·8%) women in the bare-metal stent group, 421 (10·9%) in the early-generation DES group, and 496 (9·2%) in the newer-generation DES group (p=0·001). Definite or probable stent thrombosis occurred in 13 (1·3%), 79 (2·1%), and 66 (1·1%) women in the bare-metal stent, early-generation DES, and newer-generation DES groups, respectively (p=0·01). The use of DES was associated with a significant reduction in the 3 year rates of target-lesion revascularisation (197 [18·6%] women in the bare-metal stent group, 294 [7·8%] in the early-generation DES group, and 330 [6·3%] in the newer-generation DES group, p<0·0001). Results did not change after adjustment for baseline characteristics in the multivariable analysis. 
INTERPRETATION The use of DES in women is more effective and safe than is use of bare-metal stents during long-term follow-up. Newer-generation DES are associated with an improved safety profile compared with early-generation DES, and should therefore be thought of as the standard of care for percutaneous coronary revascularisation in women. FUNDING Women in Innovation Initiative of the Society of Cardiovascular Angiography and Interventions.
Abstract:
OBJECTIVES This study sought to validate the Logistic Clinical SYNTAX (Synergy Between Percutaneous Coronary Intervention With Taxus and Cardiac Surgery) score in patients with non-ST-segment elevation acute coronary syndromes (ACS), in order to further legitimize its clinical application. BACKGROUND The Logistic Clinical SYNTAX score allows for an individualized prediction of 1-year mortality in patients undergoing contemporary percutaneous coronary intervention. It is composed of a "Core" Model (anatomical SYNTAX score, age, creatinine clearance, and left ventricular ejection fraction), and "Extended" Model (composed of an additional 6 clinical variables), and has previously been cross validated in 7 contemporary stent trials (>6,000 patients). METHODS One-year all-cause death was analyzed in 2,627 patients undergoing percutaneous coronary intervention from the ACUITY (Acute Catheterization and Urgent Intervention Triage Strategy) trial. Mortality predictions from the Core and Extended Models were studied with respect to discrimination, that is, separation of those with and without 1-year all-cause death (assessed by the concordance [C] statistic), and calibration, that is, agreement between observed and predicted outcomes (assessed with validation plots). Decision curve analyses, which weight the harms (false positives) against benefits (true positives) of using a risk score to make mortality predictions, were undertaken to assess clinical usefulness. RESULTS In the ACUITY trial, the median SYNTAX score was 9.0 (interquartile range 5.0 to 16.0); approximately 40% of patients had 3-vessel disease, 29% diabetes, and 85% underwent drug-eluting stent implantation. Validation plots confirmed agreement between observed and predicted mortality. 
The Core and Extended Models demonstrated substantial improvements in the discriminative ability for 1-year all-cause death compared with the anatomical SYNTAX score in isolation (C-statistics: SYNTAX score: 0.64, 95% confidence interval [CI]: 0.56 to 0.71; Core Model: 0.74, 95% CI: 0.66 to 0.79; Extended Model: 0.77, 95% CI: 0.70 to 0.83). Decision curve analyses confirmed the increasing ability to correctly identify patients who would die at 1 year with the Extended Model versus the Core Model versus the anatomical SYNTAX score, over a wide range of thresholds for mortality risk predictions. CONCLUSIONS Compared to the anatomical SYNTAX score alone, the Core and Extended Models of the Logistic Clinical SYNTAX score more accurately predicted individual 1-year mortality in patients presenting with non-ST-segment elevation acute coronary syndromes undergoing percutaneous coronary intervention. These findings support the clinical application of the Logistic Clinical SYNTAX score.
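The C-statistic used to compare the models is the probability that a randomly chosen patient who died was assigned a higher predicted risk than a randomly chosen survivor. A minimal pairwise-concordance sketch (the risks and outcomes are toy data, not trial data):

```python
def c_statistic(risks, died):
    # concordance: fraction of (death, survivor) pairs in which the patient
    # who died had the higher predicted risk; ties count as 0.5
    events = [r for r, d in zip(risks, died) if d]
    nonevents = [r for r, d in zip(risks, died) if not d]
    concordant = sum(1.0 if e > ne else 0.5 if e == ne else 0.0
                     for e in events for ne in nonevents)
    return concordant / (len(events) * len(nonevents))

# toy example: the model ranks the two deaths above one of the two survivors
c = c_statistic([0.9, 0.3, 0.2, 0.8], [True, True, False, False])  # 0.75
```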
Abstract:
OBJECTIVES The authors sought to examine the adoption of transcatheter aortic valve replacement (TAVR) in Western Europe and investigate factors that may influence the heterogeneous use of this therapy. BACKGROUND Since its commercialization in 2007, the number of TAVR procedures has grown exponentially. METHODS The adoption of TAVR was investigated in 11 European countries: Germany, France, Italy, United Kingdom, Spain, the Netherlands, Switzerland, Belgium, Portugal, Denmark, and Ireland. Data were collected from 2 sources: 1) lead physicians submitted nation-specific registry data; and 2) an implantation-based TAVR market tracker. Economic indexes such as healthcare expenditure per capita, sources of healthcare funding, and reimbursement strategies were correlated to TAVR use. Furthermore, we assessed the extent to which TAVR has penetrated its potential patient population. RESULTS Between 2007 and 2011, 34,317 patients underwent TAVR. Considerable variation in TAVR use existed across nations. In 2011, the number of TAVR implants per million individuals ranged from 6.1 in Portugal to 88.7 in Germany (33 ± 25). The annual number of TAVR implants performed per center across nations also varied widely (range 10 to 89). The weighted average TAVR penetration rate was low: 17.9%. Significant correlation was found between TAVR use and healthcare spending per capita (r = 0.80; p = 0.005). TAVR-specific reimbursement systems were associated with higher TAVR use than restricted systems (698 ± 232 vs. 213 ± 112 implants/million individuals ≥ 75 years; p = 0.002). CONCLUSIONS The authors' findings indicate that TAVR is underutilized in high and prohibitive surgical risk patients with severe aortic stenosis. National economic indexes and reimbursement strategies are closely linked with TAVR use and help explain the inequitable adoption of this therapy.
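The reported link between TAVR use and healthcare spending (r = 0.80) is a Pearson correlation across country-level data. A stdlib-only sketch of the coefficient, on made-up (spending, implants-per-million) pairs rather than the registry data:

```python
from math import sqrt

def pearson_r(xs, ys):
    # sample Pearson correlation coefficient
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / sqrt(vx * vy)

# hypothetical per-country (healthcare spending per capita,
# TAVR implants per million individuals) pairs
spending = [2000, 3500, 4300, 5000, 5600]
implants = [6, 20, 35, 60, 85]
r = pearson_r(spending, implants)   # strongly positive, as in the study
```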
Abstract:
AIMS To assess serially the edge vascular response (EVR) of a bioresorbable vascular scaffold (BVS) compared to a metallic everolimus-eluting stent (EES). METHODS AND RESULTS Non-serial evaluations of the Absorb BVS at one year have previously demonstrated proximal edge constrictive remodelling and distal edge changes in plaque composition with increase of the percent fibro-fatty (FF) tissue component. The 5 mm proximal and distal segments adjacent to the implanted devices were investigated serially with intravascular ultrasound (IVUS), post procedure, at six months and at two years, from the ABSORB Cohort B1 (n=45) and the SPIRIT II (n=113) trials. Twenty-two proximal and twenty-four distal edge segments were available for analysis in the ABSORB Cohort B1 trial. In the SPIRIT II trial, thirty-three proximal and forty-six distal edge segments were analysed. At the 5-mm proximal edge, the vessels treated with an Absorb BVS from post procedure to two years demonstrated a lumen loss (LL) of 6.68% (-17.33; 2.08) (p=0.027) with a trend toward plaque area increase of 7.55% (-4.68; 27.11) (p=0.06). At the 5-mm distal edge no major changes were evident at either time point. At the 5-mm proximal edge the vessels treated with a XIENCE V EES from post procedure to two years did not show any signs of LL, only plaque area decrease of 6.90% (-17.86; 4.23) (p=0.035). At the distal edge no major changes were evident with regard to either lumen area or vessel remodelling at the same time point. CONCLUSIONS The IVUS-based serial evaluation of the EVR up to two years following implantation of a bioresorbable everolimus-eluting scaffold shows a statistically significant proximal edge LL; however, this finding did not seem to have any clinical implications in the serial assessment. The upcoming imaging follow-up of the Absorb BVS at three years is anticipated to provide further information regarding the vessel wall behaviour at the edges.
Abstract:
BACKGROUND The long-term results after second generation everolimus eluting bioresorbable vascular scaffold (Absorb BVS) placement in small vessels are unknown. Therefore, we investigated the impact of vessel size on long-term outcomes, after Absorb BVS implantation. METHODS In ABSORB Cohort B Trial, out of the total study population (101 patients), 45 patients were assigned to undergo 6-month and 2-year angiographic follow-up (Cohort B1) and 56 patients to have angiographic follow-up at 1-year (Cohort B2). The pre-reference vessel diameter (RVD) was <2.5 mm (small-vessel group) in 41 patients (41 lesions) and ≥2.5 mm (large-vessel group) in 60 patients (61 lesions). Outcomes were compared according to pre-RVD. RESULTS At 2-year angiographic follow-up no differences in late lumen loss (0.29±0.16 mm vs 0.25±0.22 mm, p=0.4391), and in-segment binary restenosis (5.3% vs 5.3% p=1.0000) were demonstrated between groups. In the small-vessel group, intravascular ultrasound analysis showed a significant increase in vessel area (12.25±3.47 mm(2) vs 13.09±3.38 mm(2) p=0.0015), scaffold area (5.76±0.96 mm(2) vs 6.41±1.30 mm(2) p=0.0008) and lumen area (5.71±0.98 mm(2) vs 6.20±1.27 mm(2) p=0.0155) between 6-months and 2-year follow-up. No differences in plaque composition were reported between groups at either time point. At 2-year clinical follow-up, no differences in ischaemia-driven major adverse cardiac events (7.3% vs 10.2%, p=0.7335), myocardial infarction (4.9% vs 1.7%, p=0.5662) or ischaemia-driven target lesion revascularisation (2.4% vs 8.5%, p=0.3962) were reported between small and large vessels. No deaths or scaffold thrombosis were observed. CONCLUSIONS Similar clinical and angiographic outcomes at 2-year follow-up were reported in small and large vessel groups. A significant late lumen enlargement and positive vessel remodelling were observed in small vessels.
Abstract:
OBJECTIVES The aim of the current Valve Academic Research Consortium (VARC)-2 initiative was to revisit the selection and definitions of transcatheter aortic valve implantation (TAVI) clinical endpoints to make them more suitable to the present and future needs of clinical trials. In addition, this document is intended to expand the understanding of patient risk stratification and case selection. BACKGROUND A recent study confirmed that VARC definitions have already been incorporated into clinical and research practice and represent a new standard for consistency in reporting clinical outcomes of patients with symptomatic severe aortic stenosis (AS) undergoing TAVI. However, as the clinical experience with this technology has matured and expanded, certain definitions have become unsuitable or ambiguous. METHODS AND RESULTS Two in-person meetings (held in September 2011 in Washington, DC, and in February 2012 in Rotterdam, The Netherlands) involving VARC study group members, independent experts (including surgeons, interventional and noninterventional cardiologists, imaging specialists, neurologists, geriatric specialists, and clinical trialists), the US Food and Drug Administration (FDA), and industry representatives, provided much of the substantive discussion from which this VARC-2 consensus manuscript was derived. This document provides an overview of risk assessment and patient stratification that need to be considered for accurate patient inclusion in studies. Working groups were assigned to define the following clinical endpoints: mortality, stroke, myocardial infarction, bleeding complications, acute kidney injury, vascular complications, conduction disturbances and arrhythmias, and a miscellaneous category including relevant complications not previously categorized. Furthermore, comprehensive echocardiographic recommendations are provided for the evaluation of prosthetic valve (dys)function. Definitions for the quality of life assessments are also reported. 
These endpoints formed the basis for several recommended composite endpoints. CONCLUSIONS This VARC-2 document has provided further standardization of endpoint definitions for studies evaluating the use of TAVI, which will lead to improved comparability and interpretability of the study results, supplying an increasingly growing body of evidence with respect to TAVI and/or surgical aortic valve replacement. This initiative and document can furthermore be used as a model during current endeavors of applying definitions to other transcatheter valve therapies (for example, mitral valve repair).
Abstract:
Horses were domesticated from the Eurasian steppes 5,000-6,000 years ago. Since then, the use of horses for transportation, warfare, and agriculture, as well as selection for desired traits and fitness, has resulted in diverse populations distributed across the world, many of which have become or are in the process of becoming formally organized into closed, breeding populations (breeds). This report describes the use of a genome-wide set of autosomal SNPs and 814 horses from 36 breeds to provide the first detailed description of equine breed diversity. F(ST) calculations, parsimony, and distance analysis demonstrated relationships among the breeds that largely reflect geographic origins and known breed histories. Low levels of population divergence were observed between breeds that are relatively early on in the process of breed development, and between those with high levels of within-breed diversity, whether due to large population size, ongoing outcrossing, or large within-breed phenotypic diversity. Populations with low within-breed diversity included those which have experienced population bottlenecks, have been under intense selective pressure, or are closed populations with long breed histories. These results provide new insights into the relationships among and the diversity within breeds of horses. In addition these results will facilitate future genome-wide association studies and investigations into genomic targets of selection.
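F(ST), the fixation index used for the breed comparisons above, measures how much heterozygosity is lost to population subdivision. A hedged two-population, single-locus sketch using Wright's H_T versus H_S formulation (the allele frequencies are invented, and real breed analyses average many loci and correct for sample size):

```python
def fst_biallelic(p1, p2):
    # Wright's FST for one biallelic SNP between two equal-sized populations:
    # FST = (HT - HS) / HT, where HS is the mean within-population expected
    # heterozygosity and HT the heterozygosity of the pooled allele frequency
    hs = (2 * p1 * (1 - p1) + 2 * p2 * (1 - p2)) / 2
    pbar = (p1 + p2) / 2
    ht = 2 * pbar * (1 - pbar)
    return 0.0 if ht == 0 else (ht - hs) / ht

fst_biallelic(0.5, 0.5)   # 0.0 -- identical populations, no divergence
fst_biallelic(0.0, 1.0)   # 1.0 -- fixed for different alleles
fst_biallelic(0.2, 0.8)   # 0.36 -- partial divergence
```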