Abstract:
Although there has been a significant decrease in caries prevalence in developed countries, the slower progression of dental caries requires methods capable of detecting and quantifying lesions at an early stage. The aim of this study was to evaluate the effectiveness of fluorescence-based methods (DIAGNOdent 2095 laser fluorescence device [LF], DIAGNOdent 2190 pen [LFpen], and VistaProof fluorescence camera [FC]) in monitoring the progression of noncavitated caries-like lesions on smooth surfaces. Caries-like lesions were developed in 60 blocks of bovine enamel using a bacterial model of Streptococcus mutans and Lactobacillus acidophilus. Enamel blocks were evaluated by two independent examiners using the LF, LFpen, and FC at baseline (phase I), after the first cariogenic challenge (eight days) (phase II), and after a second cariogenic challenge (a further eight days) (phase III). Blocks were then submitted to surface microhardness (SMH) and cross-sectional microhardness analyses. The intraclass correlation coefficient for intra- and interexaminer reproducibility ranged from 0.49 (FC) to 0.94 (LF/LFpen). SMH values decreased and fluorescence values increased significantly across the three phases. Higher values for sensitivity, specificity, and area under the receiver operating characteristic curve were observed for FC (phase II) and LFpen (phase III). A significant correlation was found between fluorescence values and SMH in all phases, and with integrated loss of surface hardness (ΔKHN) in phase III. In conclusion, fluorescence-based methods were effective in monitoring noncavitated caries-like lesions on smooth surfaces, showing moderate correlation with SMH and allowing differentiation between sound and demineralized enamel.
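The sensitivity, specificity, and ROC AUC reported in this abstract can be computed from raw diagnostic scores and a gold standard; a minimal sketch in Python, with made-up fluorescence readings and a hypothetical cutoff (none of the study's actual data):

```python
# Sketch: sensitivity, specificity, and ROC AUC for a score-based diagnostic test.
# All values below are illustrative, not taken from the study.

def sens_spec(scores, truth, cutoff):
    """Classify score >= cutoff as positive; truth is 1 (diseased) / 0 (sound)."""
    tp = sum(1 for s, t in zip(scores, truth) if s >= cutoff and t == 1)
    fn = sum(1 for s, t in zip(scores, truth) if s < cutoff and t == 1)
    tn = sum(1 for s, t in zip(scores, truth) if s < cutoff and t == 0)
    fp = sum(1 for s, t in zip(scores, truth) if s >= cutoff and t == 0)
    return tp / (tp + fn), tn / (tn + fp)

def auc(scores, truth):
    """Mann-Whitney formulation: P(score_pos > score_neg), ties counted 0.5."""
    pos = [s for s, t in zip(scores, truth) if t == 1]
    neg = [s for s, t in zip(scores, truth) if t == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0 for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

scores = [5, 7, 9, 14, 18, 21, 25, 30]   # hypothetical LF readings
truth  = [0, 0, 0, 0, 1, 1, 1, 1]        # 1 = demineralized by microhardness
se, sp = sens_spec(scores, truth, cutoff=15)
print(se, sp, auc(scores, truth))        # 1.0 1.0 1.0 for this clean separation
```

In practice a library routine (e.g. scikit-learn's `roc_auc_score`) would be used; the hand-rolled version just makes the definitions explicit.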
Abstract:
The role of genetic polymorphisms in pediatric brain tumor (PBT) etiology is poorly understood. We hypothesized that single nucleotide polymorphisms (SNPs) identified in genome-wide association studies (GWAS) of adult glioma would also be associated with PBT risk. The analysis is based on Cefalo, a population-based multicenter case-control study. Saliva DNA from 245 cases and 489 controls, aged 7-19 years at diagnosis/reference date, was extracted and genotyped for 29 SNPs reported by GWAS to be significantly associated with risk of adult glioma. Data were analyzed using unconditional logistic regression. Stratified analyses were performed for two histological subtypes: astrocytoma alone and all other tumor types combined. The results indicated that four SNPs, CDKN2BAS rs4977756 (p = 0.036), rs1412829 (p = 0.037), rs2157719 (p = 0.018) and rs1063192 (p = 0.021), were associated with increased susceptibility to PBTs, whereas TERT rs2736100 was associated with a decreased risk (p = 0.018). Moreover, the stratified analyses showed a decreased risk of astrocytoma associated with RTEL1 rs6089953, rs6010620 and rs2297440 (p trend = 0.022, 0.042 and 0.029, respectively), as well as an increased risk of this subtype associated with RTEL1 rs4809324 (p trend = 0.033). In addition, SNPs rs10464870 and rs891835 in CCDC26 were associated with an increased risk of non-astrocytoma tumor subtypes (p trend = 0.009 and 0.007, respectively). Our findings indicate that SNPs in CDKN2BAS, TERT, RTEL1 and CCDC26 may be associated with the risk of PBTs. We therefore suggest that pediatric and adult brain tumors might share common genetic risk factors and similar etiological pathways.
Abstract:
PURPOSE The purpose of this study was to identify morphologic factors affecting type I endoleak formation and bird-beak configuration after thoracic endovascular aortic repair (TEVAR). METHODS Computed tomography (CT) data of 57 patients (40 males; median age, 66 years) undergoing TEVAR for thoracic aortic aneurysm (34 TAA, 19 TAAA) or penetrating aortic ulcer (n = 4) between 2001 and 2010 were retrospectively reviewed. In 28 patients, the Gore TAG® stent-graft was used, followed by the Medtronic Valiant® in 16 cases, the Medtronic Talent® in 8, and the Cook Zenith® in 5 cases. The proximal landing zone (PLZ) was in zone 1 in 13, zone 2 in 13, zone 3 in 23, and zone 4 in 8 patients. In 14 patients (25%), the procedure was urgent or emergent. In each case, pre- and postoperative CT angiography was analyzed using a dedicated image-processing workstation and complementary in-house software based on a 3D cylindrical intensity model to calculate aortic arch angulation and conicity of the landing zones (LZ). RESULTS The primary type Ia endoleak rate was 12% (7/57) and the subsequent re-intervention rate was 86% (6/7). Left subclavian artery (LSA) coverage (p = 0.036) and conicity of the PLZ (5.9 vs. 2.6 mm; p = 0.016) were significantly associated with an increased type Ia endoleak rate. Bird-beak configuration was observed in 16 patients (28%) and was associated with a smaller radius of the aortic arch curvature (42 vs. 65 mm; p = 0.049). Type Ia endoleak was not associated with a bird-beak configuration (p = 0.388). The primary type Ib endoleak rate was 7% (4/57) and the subsequent re-intervention rate was 100%. Conicity of the distal LZ was associated with an increased type Ib endoleak rate (8.3 vs. 2.6 mm; p = 0.038). CONCLUSIONS CT-based 3D aortic morphometry helps to identify risk factors for type I endoleak formation and bird-beak configuration during TEVAR.
These factors were LSA coverage and conicity within the landing zones for type I endoleak formation and steep aortic angulation for bird-beak configuration.
Abstract:
OBJECTIVES The main objective of the present randomized pilot study was to explore the effects of upstream prasugrel, ticagrelor, or clopidogrel in patients with ST-segment-elevation myocardial infarction (STEMI) undergoing primary percutaneous coronary intervention (PCI). BACKGROUND Administration of clopidogrel "as soon as possible" has been advocated for STEMI. Pretreatment with prasugrel or ticagrelor may improve reperfusion. Currently, the angiographic effects of upstream administration of these agents are poorly understood. METHODS A total of 132 patients with STEMI within the first 12 hr of chest pain referred for primary angioplasty were randomized to upstream clopidogrel (600 mg), prasugrel (60 mg), or ticagrelor (180 mg) while still in the emergency room. All patients underwent protocol-mandated thrombus aspiration. RESULTS Macroscopic thrombus material was retrieved in 79.5% of the clopidogrel group, 65.9% of the prasugrel group, and 54.3% of the ticagrelor group (P = 0.041). At baseline angiography, large thrombus burden was present in 97.7%, 87.8%, and 80.4% of the clopidogrel, prasugrel, and ticagrelor groups, respectively (P = 0.036). Also at baseline, 97.7% presented with an occluded target vessel in the clopidogrel group, 87.8% in the prasugrel group, and 78.3% in the ticagrelor group (P = 0.019). At the end of the procedure, the percentages of patients with combined TIMI grade III flow and myocardial blush grade III were 52.3% for clopidogrel, 80.5% for prasugrel, and 67.4% for ticagrelor (P = 0.022). CONCLUSIONS In patients with STEMI undergoing primary PCI within 12 hr, upstream clopidogrel, prasugrel, and ticagrelor yielded differing angiographic findings, with a trend toward better results for the latter two agents. © 2015 Wiley Periodicals, Inc.
Abstract:
BACKGROUND The aim of newborn screening (NBS) for CF is to detect children with 'classic' CF, for whom early treatment is possible and improves prognosis. Children with an inconclusive CF diagnosis (CFSPID) should not be detected, as there is no evidence of improvement through early treatment. No algorithm in current NBS guidelines explains what to do when the sweat test (ST) fails. This study compares the performance of three different algorithms for further diagnostic evaluation when the first ST is unsuccessful, with regard to the numbers of children detected with CF and CFSPID and the time until a definite diagnosis. METHODS In Switzerland, CF-NBS was introduced in January 2011 using an IRT-DNA-IRT algorithm followed by a ST. In children in whom ST was not possible (no or insufficient sweat), 3 different protocols were applied between 2011 and 2014: in 2011, ST was repeated until it was successful (protocol A); in 2012, we proceeded directly to diagnostic DNA testing (protocol B); and in 2013-2014, fecal elastase (FE) was measured in stool to detect pancreatic insufficiency needing immediate treatment (protocol C). RESULTS The ratio CF:CFSPID was 7:1 (27/4) with protocol A, 2:1 (22/10) with protocol B, and 14:1 (54/4) with protocol C. The mean time to definite diagnosis was significantly shorter with protocol C (33 days) than with protocol A or B (42 and 40 days; p = 0.014 compared to A, and p = 0.036 compared to B). CONCLUSIONS The algorithm for the diagnostic part of newborn screening used in CF centers matters: it affects the performance of a CF-NBS program with regard to the ratio CF:CFSPID and the time until definite diagnosis. Our results suggest including FE in the CF-NBS guidelines after initial sweat test failure, to keep the proportion of CFSPID low and the time until definite diagnosis short.
Abstract:
The effects of crystal chemistry and melt composition on clinopyroxene/melt element partitioning (D) during the assimilation of olivine/peridotite by felsic magma have been investigated in Mesozoic high-Mg diorites from North China. The assimilation resulted in a significant increase in Mg, Cr and Ni and only a slight (<30%) decrease in incompatible elements in the magma, and these compositional variations are mirrored by the normally and reversely zoned clinopyroxene microphenocrysts formed at the early stage of magma evolution. The Mg# [100 × Mg / (Mg + Fe)] values of the reversely zoned clinopyroxenes increase from 65–75 in the core to 85–90 in the high-Mg midsection, and fall back to 73–79 at the rim. Trace element profiles across all these clinopyroxene domains were measured by LA-ICP-MS. The melt trace element composition was constrained from bulk rock analyses of the fine-grained low- and high-Mg diorites. Clinopyroxene/melt partition coefficients for rare earth elements (REE) and Y in the high-Mg group zonings (Mg# > 73–79, DDy < 1.2) are positively correlated with tetrahedral Al (IVAl) and increase by a factor of 3–4 as IVAl increases from 0.01 to 0.1 per formula unit (pfu). These systematic variations are interpreted to be controlled by the clinopyroxene composition. In contrast, partition coefficients for the low-Mg group zonings (Mg# < 75–79, DDy > 1.2) are elevated by up to an order of magnitude (for REE and Y) or more (for Zr and Hf) at similar IVAl, indicating dominant control by melt composition/structure. DZr and DHf show a larger sensitivity to the compositional change of crystal and melt than DREE. DTi values for the low- and high-Mg zonings show a uniform dependence on IVAl. DSr and DLi are insensitive to the compositional change of clinopyroxene and melt, resulting in Sr depletions in the clinopyroxene zonings with elevated REE without crystallization of plagioclase.
Our observations show that crystal chemistry and melt composition/structure may alternately control clinopyroxene/melt partitioning during the assimilation of peridotite by felsic magma, and may be useful for deciphering clinopyroxene compositions and related crust–mantle processes.
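The Mg# defined in brackets above is simple arithmetic on Mg and Fe cation proportions; a tiny sketch with hypothetical clinopyroxene compositions (atoms per formula unit, chosen only to fall in the core/midsection/rim ranges quoted, not measured values):

```python
# Mg# = 100 * Mg / (Mg + Fe), computed from cation proportions per formula unit.
def mg_number(mg, fe):
    return 100.0 * mg / (mg + fe)

# Illustrative compositions (apfu, hypothetical):
zones = {"core": (0.80, 0.35), "midsection": (0.88, 0.12), "rim": (0.76, 0.24)}
for label, (mg, fe) in zones.items():
    print(label, round(mg_number(mg, fe), 1))
```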
Abstract:
BACKGROUND In an effort to reduce firearm mortality rates in the USA, US states have enacted a range of firearm laws to either strengthen or deregulate the existing main federal gun control law, the Brady Law. We set out to determine the independent association of different firearm laws with overall firearm mortality, homicide firearm mortality, and suicide firearm mortality across all US states. We also projected the potential reduction of firearm mortality if the three most strongly associated firearm laws were enacted at the federal level. METHODS We constructed a cross-sectional, state-level dataset from Nov 1, 2014, to May 15, 2015, using counts of firearm-related deaths in each US state for the years 2008-10 (stratified by intent [homicide and suicide]) from the US Centers for Disease Control and Prevention's Web-based Injury Statistics Query and Reporting System, data on 25 state firearm laws implemented in 2009, and state-specific characteristics such as firearm ownership for 2013, firearm export rates and non-firearm homicide rates for 2009, and unemployment rates for 2010. Our primary outcome measure was overall firearm-related mortality per 100 000 people in the USA in 2010. We used Poisson regression with robust variances to derive incidence rate ratios (IRRs) and 95% CIs. FINDINGS 31 672 firearm-related deaths occurred in 2010 in the USA (10·1 per 100 000 people; mean state-specific count 631·5 [SD 629·1]). Of 25 firearm laws, nine were associated with reduced firearm mortality, nine were associated with increased firearm mortality, and seven had an inconclusive association. After adjustment for relevant covariates, the three state laws most strongly associated with reduced overall firearm mortality were universal background checks for firearm purchase (multivariable IRR 0·39 [95% CI 0·23-0·67]; p=0·001), ammunition background checks (0·18 [0·09-0·36]; p<0·0001), and identification requirement for firearms (0·16 [0·09-0·29]; p<0·0001).
Projected federal-level implementation of universal background checks for firearm purchase could reduce national firearm mortality from 10·35 to 4·46 deaths per 100 000 people, background checks for ammunition purchase could reduce it to 1·99 per 100 000, and firearm identification to 1·81 per 100 000. INTERPRETATION Very few of the existing state-specific firearm laws are associated with reduced firearm mortality, and this evidence underscores the importance of focusing on relevant and effective firearms legislation. Implementation of universal background checks for the purchase of firearms or ammunition, and firearm identification nationally could substantially reduce firearm mortality in the USA. FUNDING None.
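As a first-order illustration of how an incidence rate ratio scales a rate: multiplying the national baseline rate by each IRR gives a naive back-of-envelope projection. Note the study's published projections come from the fitted model, so they differ slightly from this simple product; the IRRs and baseline below are taken from the abstract, the arithmetic is only illustrative.

```python
# Naive illustration: applying an IRR to a baseline mortality rate.
# This simple product is NOT the study's model-based projection, just a
# magnitude check using figures quoted in the abstract.
baseline = 10.1  # firearm deaths per 100 000 people, USA, 2010

irr = {
    "universal background checks": 0.39,
    "ammunition background checks": 0.18,
    "firearm identification": 0.16,
}
for law, ratio in irr.items():
    print(f"{law}: ~{baseline * ratio:.2f} per 100 000")
```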
Abstract:
PURPOSE The Geographic Atrophy Progression (GAP) study was designed to assess the rate of geographic atrophy (GA) progression and to identify prognostic factors by measuring the enlargement of the atrophic lesions using fundus autofluorescence (FAF) and color fundus photography (CFP). DESIGN Prospective, multicenter, noninterventional natural history study. PARTICIPANTS A total of 603 participants were enrolled in the study; 413 of those had gradable lesion data from FAF or CFP, and 321 had gradable lesion data from both FAF and CFP. METHODS Atrophic lesion areas were measured by FAF and CFP to assess lesion progression over time. Lesion size assessments and best-corrected visual acuity (BCVA) were conducted at screening/baseline (day 0) and at 3 follow-up visits: month 6, month 12, and month 18 (or early exit). MAIN OUTCOME MEASURES The GA lesion progression rate in disease subgroups and mean change from baseline visual acuity. RESULTS Mean (standard error) lesion size changes from baseline, determined by FAF and CFP, respectively, were 0.88 (0.1) and 0.78 (0.1) mm² at 6 months, 1.85 (0.1) and 1.57 (0.1) mm² at 12 months, and 3.14 (0.4) and 3.17 (0.5) mm² at 18 months. The mean change in lesion size from baseline to month 12 was significantly greater in participants who had eyes with multifocal atrophic spots compared with those with unifocal spots (P < 0.001) and in those with extrafoveal lesions compared with those with foveal lesions (P = 0.001). The mean (standard deviation) decrease in visual acuity was 6.2 (15.6) letters for patients with image data available. Atrophic lesions with a diffuse (mean 0.95 mm²) or banded (mean 1.01 mm²) FAF pattern grew more rapidly by month 6 compared with those with the "none" (mean 0.13 mm²) and focal (mean 0.36 mm²) FAF patterns. CONCLUSIONS Although differences were observed in mean lesion size measurements using FAF imaging compared with CFP, the measurements were highly correlated with one another.
Significant differences were found in lesion progression rates in participants stratified by hyperfluorescence pattern subtype. This large GA natural history study provides a strong foundation for future clinical trials.
Abstract:
OBJECTIVE Growth hormone (GH) has a strong lipolytic action and its secretion is increased during exercise. Data on fuel metabolism and its hormonal regulation during prolonged exercise in patients with growth hormone deficiency (GHD) are scarce. This study aimed at evaluating the hormonal and metabolic response during aerobic exercise in GHD patients. DESIGN Ten patients with confirmed GHD and 10 healthy control individuals (CI) matched for age, sex, BMI, and waist circumference performed a spiroergometric test to determine exercise capacity (VO2max). Throughout a subsequent 120-minute exercise bout on an ergometer at 50% of individual VO2max, free fatty acids (FFA), glucose, GH, cortisol, catecholamines, and insulin were measured. Additionally, substrate oxidation assessed by indirect calorimetry was determined at the beginning and end of exercise. RESULTS Exercise capacity was lower in GHD compared to CI (VO2max 35.5 ± 7.4 vs 41.5 ± 5.5 ml/min/kg, p = 0.05). GH area under the curve (AUC-GH), peak GH, and peak FFA were lower in GHD patients during exercise compared to CI (AUC-GH 100 ± 93.2 vs 908.6 ± 623.7 ng·min/ml, p < 0.001; peak GH 1.5 ± 1.53 vs 12.57 ± 9.36 ng/ml, p < 0.001; peak FFA 1.01 ± 0.43 vs 1.51 ± 0.56 mmol/l, p = 0.036, respectively). There were no significant differences for insulin, cortisol, catecholamines, and glucose. Fat oxidation at the end of exercise was higher in CI compared to GHD patients (295.7 ± 73.9 vs 187.82 ± 103.8 kcal/h, p = 0.025). CONCLUSION A reduced availability of FFA during a 2-hour aerobic exercise and reduced fat oxidation at the end of exercise may contribute to the decreased exercise capacity in GHD patients. Catecholamines and cortisol do not compensate for the lack of the lipolytic action of GH in patients with GHD.
Abstract:
Background. Various individual (demographic) and health-system characteristics have been identified in the literature as contributing to different rates of maternal health care utilization in developing countries. This study evaluated individual and quality-of-care predictors of maternal health care utilization in rural Jordanian villages. Methods. Data from a 2004 survey were used. Individual (predisposing and enabling) variables, quality-of-care variables, and maternal care utilization variables were selected for 477 women who had a live birth during the previous 5 years. The conceptual framework was the Aday-Andersen model of health services utilization. Results. 82.4% of women received at least one antenatal care visit. Among individual factors, village of residence (p=0.036), parity (p=0.048), education (p=0.006), and health insurance (p=0.029) were significant; respectful treatment (p=0.045) and clean facilities (p=0.001) were the only quality-of-care factors significant in predicting antenatal care use. In logistic regression, living in southern villages (OR=4.7, p=0.01) and availability of transportation (sometimes OR=3.2, p=0.01; never OR=2.4, p<0.05) were the only two factors to influence maternal care use. Conclusions. Living in the South and transportation are major barriers to maternal care utilization in rural Jordan. Other important cultural factors of interest in some villages should be addressed in future research, and women's perceptions of the quality of health services should be seriously taken into account.
Abstract:
Group B Streptococcus (GBS) is a leading cause of life-threatening infection in neonates and young infants, pregnant women, and non-pregnant adults with underlying medical conditions. Immunization has theoretical potential to prevent significant morbidity and mortality from GBS disease. Alpha C protein (α C), found in 70% of non-type III capsule polysaccharide group B Streptococcus, elicits antibodies protective against α C-expressing strains in experimental animals and is an appealing carrier for a GBS conjugate vaccine. We determined whether natural exposure to α C elicits antibodies in women and if high maternal α C-specific serum antibody at delivery is associated with protection against neonatal disease. An ELISA was designed to measure α C-specific IgM and IgG in human sera. A case-control design (1:3 ratio) was used to match α C-expressing GBS colonized and non-colonized women by age and compare quantified serum α C-specific IgM and IgG. Sera also were analyzed from bacteremic neonates and their mothers and from women with invasive GBS disease. Antibody concentrations were compared using t-tests on log-transformed data. Geometric mean concentrations of α C-specific IgM and IgG were similar in sera from 58 α C strain colonized and 174 age-matched non-colonized women (IgG 245 and 313 ng/ml; IgM 257 and 229 ng/ml, respectively). Delivery sera from mothers of 42 neonates with GBS α C sepsis had similar concentrations of α C-specific IgM (245 ng/ml) and IgG (371 ng/ml), but acute sera from 13 women with invasive α C-expressing GBS infection had significantly higher concentrations (IgM 383 and IgG 476 ng/ml [p=0.036 and 0.038, respectively]). Convalescent sera from 5 of these women 16-49 days later had high α C-specific IgM and IgG concentrations (1355 and 4173 ng/ml, respectively). In vitro killing of α C-expressing GBS correlated with total α C-specific antibody concentration. Invasive disease but not colonization elicits α C-specific IgM and IgG in adults. 
Whether α C-specific IgG induced by vaccine would protect against disease in neonates merits further investigation.
Abstract:
According to the United Nations Program on HIV/AIDS (UNAIDS, 2008), in 2007 about 67% of all HIV-infected patients in the world were in Sub-Saharan Africa, with 35% of new infections and 38% of AIDS deaths occurring in Southern Africa. Globally, the number of children younger than 15 years of age infected with HIV increased from 1.6 million in 2001 to 2.0 million in 2007, and almost 90% of these were in Sub-Saharan Africa (UNAIDS, 2008). Both clinical and laboratory monitoring of children on Highly Active Anti-Retroviral Therapy (HAART) are important and necessary to optimize outcomes. Laboratory monitoring of HIV viral load and genotype resistance testing, both important in patient follow-up to optimize treatment success, is generally expensive and beyond the healthcare budgets of most developing countries. This is especially true for the impoverished Sub-Saharan African nations. It is therefore important to identify the factors associated with virologic failure in HIV-infected Sub-Saharan African children, so that practitioners in these countries can predict which patients are more likely to develop virologic failure and target their limited laboratory monitoring budgets toward these at-risk patients. The objective of this study was to examine the factors associated with virologic failure in HIV-infected children taking HAART in Botswana, a developing Sub-Saharan African country. We examined these factors in a case-control study using medical records of HIV-infected children and adolescents on HAART at the Botswana-Baylor Children's Clinical Center of Excellence (BBCCCOE) in Gaborone, Botswana.
Univariate and multivariate regression analyses were performed to identify predictors of virologic failure in these children. The study population comprised 197 cases (those with virologic failure) and 544 controls (those with virologic success), with ages ranging from 3 months to 16 years at baseline. Poor adherence (pill count <95% on at least 3 consecutive occasions) was the strongest independent predictor of virologic failure (adjusted OR = 269.97, 95% CI = 104.13 to 699.92; P < 0.001). Other independent predictors of virologic failure identified were: first-line NNRTI with nevirapine (OR = 2.99, 95% CI = 1.19 to 7.54; P = 0.020), baseline HIV-1 viral load >750,000 copies/ml (OR = 2.57, 95% CI = 1.47 to 8.63; P = 0.005), positive history of PMTCT (OR = 11.65, 95% CI = 3.04 to 44.57; P < 0.001), multiple caregivers (≥3) (OR = 2.56, 95% CI = 1.06 to 6.19; P = 0.036), and residence in a village (OR = 2.85, 95% CI = 1.36 to 5.97; P = 0.005). The results of this study may help to improve virologic outcomes and reduce the costs of caring for HIV-infected children in resource-limited settings. Keywords: virologic failure, highly active anti-retroviral therapy, Sub-Saharan Africa, children, adherence.
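Odds ratios and 95% confidence intervals like those above come from exponentiating logistic-regression coefficients. As a sketch, the village-residence estimate can be reconstructed from the published OR of 2.85 and CI of 1.36 to 5.97; the standard error here is back-computed from that CI, an assumption for illustration rather than a reported value:

```python
import math

# Sketch: odds ratio and Wald 95% CI from a logistic-regression coefficient
# (beta) and its standard error (se). OR = exp(beta); CI = exp(beta +/- 1.96*se).
def odds_ratio_ci(beta, se, z=1.96):
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# se back-derived from the published CI: (ln(5.97) - ln(1.36)) / (2 * 1.96) ~ 0.377
beta, se = math.log(2.85), 0.377
or_, lo, hi = odds_ratio_ci(beta, se)
print(round(or_, 2), round(lo, 2), round(hi, 2))  # 2.85 1.36 5.97
```

The round trip back to the published interval is a quick sanity check that the reported OR and CI are internally consistent.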
Abstract:
Background and Objectives: African American (AA) women are disproportionately affected by hypertension (HTN). The aim of this randomized controlled trial was to evaluate the effectiveness of a 6-week culturally tailored educational intervention for AA women with primary HTN living in rural Northeast Texas. Methods: Sixty AA women, 29 to 86 years old (M = 57.98 ± 12.37), with primary HTN were recruited from four rural locations and randomized to intervention (n = 30) and wait-list control (n = 30) groups to determine the effectiveness of the intervention on knowledge, attitudes, beliefs, social support, adherence to a hypertension regimen, and blood pressure (BP) control. Survey and BP measurements were collected at baseline, 3 weeks, 6 weeks (post intervention), and 6 months post intervention. Culturally tailored educational classes were provided for 90 minutes once a week for 6 weeks in two local churches and a community center. The wait-list control group received usual care and was offered the education at the conclusion of data collection six months post-intervention. Linear mixed models were used to test for differences between the groups. Results: A significant overall main effect of time was found for both groups for systolic blood pressure, F(3, 174) = 11.104, p < .001, and diastolic blood pressure, F(3, 174) = 4.781, p = .003. Age was a significant covariate for diastolic blood pressure, F(1, 56) = 6.798, p = .012; participants 57 years or older (n = 30) had lower diastolic BPs than participants younger than 57 (n = 30). No significant differences were found between groups on knowledge, adherence, or attitudes. Participants with lower incomes had significantly less knowledge about HBP prevention (r=.036, p=.006). Conclusion: AA women who participated in a 6-week intervention program demonstrated a significant decrease in BP over a 6-month period regardless of whether they were in the intervention or control group.
These rural AA women had a relatively good knowledge of HTN and reported an average level of compliance compared to other populations. Satisfaction with the program was high and there was no attrition, suggesting that AA women will participate in research studies that are culturally tailored to them, held in familiar community locations, and conducted by a trusted person with whom they can identify. Future studies using a different program with larger sample sizes are warranted to try to decrease the high level of HTN-related complications in AA women.
Abstract:
This study analyzed the relationship between fasting blood glucose (FBG) and 8-year mortality in the Hypertension Detection and Follow-up Program (HDFP) population. FBG was examined both as a continuous variable and by specified strata: Normal (FBG 60–100 mg/dL), Impaired (FBG ≥100 and ≤125 mg/dL), and Diabetic (FBG >125 mg/dL or pre-existing diabetes). The relationship between type 2 diabetes and all-cause mortality was also examined. This thesis described and compared the characteristics of the FBG strata defined by recognized glucose cut-points, described mortality rates in the strata using Kaplan-Meier curves, and compared mortality risk across strata using Cox regression. Overall, mortality was significantly greater among Referred Care (RC) participants compared to Stepped Care (SC) participants (HR = 1.17; 95% CI 1.052–1.309; p = 0.004), as reported by the HDFP investigators in 1979. Compared with SC participants, the RC mortality rate was significantly higher in the Normal FBG group (HR = 1.18; 95% CI 1.029–1.363; p = 0.019) and the Impaired FBG group (HR = 1.34; 95% CI 1.036–1.734; p = 0.026). However, for the Diabetic group, 8-year mortality did not differ significantly between RC and SC after adjusting for race, gender, age, and smoking status (HR = 1.03; 95% CI 0.816–1.303; p = 0.798). This latter finding is possibly due to the lack of a difference in hypertension treatment between the RC and SC Diabetic participants. The largest difference in mortality between RC and SC was in the Impaired subgroup, suggesting that hypertensive patients with FBG between 100 and 125 mg/dL would benefit from aggressive antihypertensive therapy.
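The Kaplan-Meier curves used above estimate survival as a running product over event times, S(t) = prod(1 - d_i / n_i), where d_i is the deaths at time t_i and n_i the number still at risk. A compact, self-contained sketch on toy right-censored data (not HDFP's):

```python
# Kaplan-Meier product-limit estimator for right-censored follow-up data.
# times: follow-up in years; events: 1 = death, 0 = censored. Toy data only.
def kaplan_meier(times, events):
    order = sorted(zip(times, events))
    at_risk, surv, curve = len(order), 1.0, []
    i = 0
    while i < len(order):
        t = order[i][0]
        tied = sum(1 for tt, _ in order if tt == t)       # subjects at this time
        deaths = sum(e for tt, e in order if tt == t)     # events at this time
        if deaths:
            surv *= 1.0 - deaths / at_risk
            curve.append((t, surv))                       # step down at event times
        at_risk -= tied                                   # drop events and censored
        i += tied
    return curve

times  = [1, 2, 2, 3, 5, 8, 8, 8]
events = [1, 1, 0, 1, 0, 1, 1, 0]
curve = kaplan_meier(times, events)
print(curve)   # survival steps at t = 1, 2, 3, 8
```

In practice a library such as lifelines (`KaplanMeierFitter`) would handle ties, variance, and plotting; the hand-rolled version just shows the product-limit idea.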