929 results for: bandwidth 2.0 GHz to 2.45 GHz


Abstract:

OBJECTIVE The aim of this study was to evaluate whether the distribution pattern of early ischemic changes on the initial MRI allows a practical method for estimating leptomeningeal collateralization in acute ischemic stroke (AIS). METHODS Seventy-four patients with AIS underwent MRI followed by conventional angiography and mechanical thrombectomy. Diffusion restriction on diffusion-weighted imaging (DWI) and correlated T2-hyperintensity of the infarct were retrospectively analyzed and subdivided according to the Alberta Stroke Program Early CT Score (ASPECTS). Patients were angiographically graded into collateralization groups according to the method of Higashida and dichotomized into 2 groups: 29 subjects with collateralization grade 3 or 4 (well-collateralized group) and 45 subjects with grade 1 or 2 (poorly-collateralized group). Individual ASPECTS areas were compared between the groups. RESULTS Mean overall DWI-ASPECTS was 6.34 vs. 4.51 (well- vs. poorly-collateralized groups, respectively), and mean T2-ASPECTS was 9.34 vs. 8.96. A significant difference between groups was found for DWI-ASPECTS (p<0.001), but not for T2-ASPECTS (p = 0.088). Regarding the individual areas, only the insula, M1-M4 and M6 showed significantly fewer infarctions in the well-collateralized group (p-values <0.001 to 0.015). 89% of patients in the well-collateralized group showed 0-2 infarctions in these six areas (44.8% with 0 infarctions), while 59.9% of patients in the poorly-collateralized group showed 3-6 infarctions. CONCLUSION Patients with poor leptomeningeal collateralization show more infarcts on the initial MRI, particularly in the ASPECTS areas M1 to M4, M6 and the insula. DWI abnormalities in these areas may therefore serve as a surrogate marker for poor leptomeningeal collaterals and may be useful for estimating collateral status in routine clinical evaluation.
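
A minimal sketch of how the suggested surrogate could be applied, assuming per-region DWI readings are available; the region set comes from the abstract, but the ≤2-infarction threshold is only read off the reported group distributions, not a validated clinical cutoff:

```python
# Hypothetical helper: the six ASPECTS regions that discriminated collateral
# status in this study. The <=2 threshold is an assumption inferred from the
# reported distributions (89% of well-collateralized patients had 0-2 hits,
# while 59.9% of poorly collateralized patients had 3-6).
DISCRIMINATING_REGIONS = {"insula", "M1", "M2", "M3", "M4", "M6"}

def collateral_surrogate(dwi_positive_regions: set) -> str:
    """Crude collateral-status estimate from DWI-positive ASPECTS areas."""
    hits = len(DISCRIMINATING_REGIONS & dwi_positive_regions)
    return "suggests good collaterals" if hits <= 2 else "suggests poor collaterals"

print(collateral_surrogate({"M1", "insula"}))           # suggests good collaterals
print(collateral_surrogate({"M1", "M2", "M3", "M6"}))   # suggests poor collaterals
```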

Abstract:

BACKGROUND: We evaluated the feasibility of an augmented robotics-assisted tilt table (RATT) for incremental cardiopulmonary exercise testing (CPET) and exercise training in dependent-ambulatory stroke patients. METHODS: Stroke patients (Functional Ambulation Category ≤ 3) underwent familiarization, an incremental exercise test (IET) and a constant load test (CLT) on separate days. A RATT equipped with force sensors in the thigh cuffs, a work rate estimation algorithm and real-time visual feedback to guide the exercise work rate was used. Feasibility assessment considered technical feasibility, patient tolerability, and cardiopulmonary responsiveness. RESULTS: Eight patients (4 female) aged 58.3 ± 9.2 years (mean ± SD) were recruited and all completed the study. For IETs, peak oxygen uptake (V'O2peak), peak heart rate (HRpeak) and peak work rate (WRpeak) were 11.9 ± 4.0 ml/kg/min (45 % of predicted V'O2max), 117 ± 32 beats/min (72 % of predicted HRmax) and 22.5 ± 13.0 W, respectively. Peak ratings of perceived exertion (RPE) were in the range "hard" to "very hard". All 8 patients reached their limit of functional capacity in terms of either their cardiopulmonary or neuromuscular performance. A ventilatory threshold (VT) was identified in 7 patients and a respiratory compensation point (RCP) in 6 patients: mean V'O2 at VT and RCP was 8.9 and 10.7 ml/kg/min, respectively, which represent 75 % (VT) and 85 % (RCP) of mean V'O2peak. Incremental CPET provided sufficient information to satisfy the responsiveness criteria and identify key outcomes in all 8 patients. For CLTs, mean steady-state V'O2 was 6.9 ml/kg/min (49 % of V'O2 reserve), mean HR was 90 beats/min (56 % of HRmax), RPEs were > 2, and all patients maintained the active work rate for 10 min: these values meet recommended intensity levels for bouts of training. CONCLUSIONS: The augmented RATT is deemed feasible for incremental cardiopulmonary exercise testing and exercise training in dependent-ambulatory stroke patients: the approach was found to be technically implementable, acceptable to the patients, and it showed substantial cardiopulmonary responsiveness. This work has clinical implications for patients with severe disability who otherwise would not be able to be tested.
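
The reported relative intensities can be reproduced from the group means; a back-of-envelope sketch (the Fox formula HRmax = 220 - age is an assumption, since the abstract does not state how HRmax was predicted):

```python
# Back-of-envelope check of the reported percentages (group means only).
mean_age = 58.3
hr_peak = 117.0                   # beats/min
vo2_peak, vo2_at_vt = 11.9, 8.9   # ml/kg/min

predicted_hr_max = 220 - mean_age  # assumed Fox formula, ~161.7 beats/min
print(f"HRpeak = {hr_peak / predicted_hr_max:.0%} of predicted HRmax")  # ~72%
print(f"VT at {vo2_at_vt / vo2_peak:.0%} of V'O2peak")                  # ~75%
```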

Abstract:

IMPORTANCE Associations between subclinical thyroid dysfunction and fractures are unclear and clinical trials are lacking. OBJECTIVE To assess the association of subclinical thyroid dysfunction with hip, nonspine, spine, or any fractures. DATA SOURCES AND STUDY SELECTION The MEDLINE and EMBASE databases (inception to March 26, 2015) were searched without language restrictions for prospective cohort studies with thyroid function data and subsequent fractures. DATA EXTRACTION Individual participant data were obtained from 13 prospective cohorts in the United States, Europe, Australia, and Japan. Levels of thyroid function were defined as euthyroidism (thyroid-stimulating hormone [TSH], 0.45-4.49 mIU/L), subclinical hyperthyroidism (TSH <0.45 mIU/L), and subclinical hypothyroidism (TSH 4.50-19.99 mIU/L) with normal thyroxine concentrations. MAIN OUTCOMES AND MEASURES The primary outcome was hip fracture. Any fractures, nonspine fractures, and clinical spine fractures were secondary outcomes. RESULTS Among 70,298 participants, 4092 (5.8%) had subclinical hypothyroidism and 2219 (3.2%) had subclinical hyperthyroidism. During 762,401 person-years of follow-up, hip fracture occurred in 2975 participants (4.6%; 12 studies), any fracture in 2528 participants (9.0%; 8 studies), nonspine fracture in 2018 participants (8.4%; 8 studies), and spine fracture in 296 participants (1.3%; 6 studies). In age- and sex-adjusted analyses, the hazard ratio (HR) for subclinical hyperthyroidism vs euthyroidism was 1.36 for hip fracture (95% CI, 1.13-1.64; 146 events in 2082 participants vs 2534 in 56,471); for any fracture, HR was 1.28 (95% CI, 1.06-1.53; 121 events in 888 participants vs 2203 in 25,901); for nonspine fracture, HR was 1.16 (95% CI, 0.95-1.41; 107 events in 946 participants vs 1745 in 21,722); and for spine fracture, HR was 1.51 (95% CI, 0.93-2.45; 17 events in 732 participants vs 255 in 20,328). Lower TSH was associated with higher fracture rates: for TSH of less than 0.10 mIU/L, HR was 1.61 for hip fracture (95% CI, 1.21-2.15; 47 events in 510 participants); for any fracture, HR was 1.98 (95% CI, 1.41-2.78; 44 events in 212 participants); for nonspine fracture, HR was 1.61 (95% CI, 0.96-2.71; 32 events in 185 participants); and for spine fracture, HR was 3.57 (95% CI, 1.88-6.78; 8 events in 162 participants). Risks were similar after adjustment for other fracture risk factors. Endogenous subclinical hyperthyroidism (excluding thyroid medication users) was associated with HRs of 1.52 (95% CI, 1.19-1.93) for hip fracture, 1.42 (95% CI, 1.16-1.74) for any fracture, and 1.74 (95% CI, 1.01-2.99) for spine fracture. No association was found between subclinical hypothyroidism and fracture risk. CONCLUSIONS AND RELEVANCE Subclinical hyperthyroidism was associated with an increased risk of hip and other fractures, particularly among those with TSH levels of less than 0.10 mIU/L and those with endogenous subclinical hyperthyroidism. Further study is needed to determine whether treating subclinical hyperthyroidism can prevent fractures.
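
A hedged sketch of the kind of age- and sex-adjusted Cox model behind hazard ratios like these; the file name and column names are assumptions, and sex is assumed to be coded numerically:

```python
# Illustrative only: pooled individual participant data assumed to be in a
# flat file with one row per participant and numeric covariates.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("pooled_ipd.csv")                    # hypothetical file
df["subclin_hyper"] = (df["tsh"] < 0.45).astype(int)  # study's TSH cutoff

cph = CoxPHFitter()
cph.fit(df[["time_to_hip_fx", "hip_fx", "subclin_hyper", "age", "sex"]],
        duration_col="time_to_hip_fx", event_col="hip_fx")
cph.print_summary()  # exp(coef) for subclin_hyper is the adjusted HR
```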

Abstract:

BACKGROUND Whether screening for thrombophilia is useful for patients after a first episode of venous thromboembolism (VTE) is a controversial issue. However, the impact of thrombophilia on the risk of recurrence may vary depending on the patient's age at the time of the first VTE. PATIENTS AND METHODS Of 1221 VTE patients (42 % males) registered in the MAISTHRO (MAin-ISar-THROmbosis) registry, 261 experienced VTE recurrence during a 5-year follow-up after the discontinuation of anticoagulant therapy. RESULTS Thrombophilia was more common among patients with VTE recurrence than among those without (58.6 % vs. 50.3 %; p = 0.017). Stratifying patients by age at the time of their initial VTE, Cox proportional hazards analyses adjusted for age, sex and the presence or absence of established risk factors identified a heterozygous prothrombin (PT) G20210A mutation (hazard ratio (HR) 2.65; 95 %-confidence interval (CI) 1.71 - 4.12; p < 0.001), homozygosity/double heterozygosity for the factor V Leiden and/or PT mutation (HR 2.35; 95 %-CI 1.09 - 5.07; p = 0.030), and antithrombin deficiency (HR 2.12; 95 %-CI 1.12 - 4.10; p = 0.021) as predictors of recurrent VTE in patients aged 40 years or older, whereas lupus anticoagulants (HR 3.05; 95 %-CI 1.40 - 6.66; p = 0.005) increased the risk of recurrence in younger patients. Subgroup analyses revealed an increased risk of recurrence for a heterozygous factor V Leiden mutation only in young females without hormonal treatment, whereas the predictive value of a heterozygous PT mutation was restricted to males over the age of 40 years. CONCLUSIONS Our data do not support preferentially selecting younger patients for thrombophilia testing after a first venous thromboembolic event.
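
The paper's estimates come from adjusted Cox models; as a simpler, hedged illustration of the same kind of recurrence analysis, a Kaplan-Meier comparison by carrier status (the file and column names are invented):

```python
# Sketch: recurrence-free survival after anticoagulant withdrawal, compared
# between carriers and non-carriers of the PT G20210A mutation.
import pandas as pd
from lifelines import KaplanMeierFitter

df = pd.read_csv("vte_registry.csv")  # hypothetical registry extract
kmf = KaplanMeierFitter()
for carrier, grp in df.groupby("pt_g20210a_carrier"):
    kmf.fit(grp["years_to_recurrence"], event_observed=grp["recurred"],
            label=f"PT G20210A carrier = {carrier}")
    print(carrier, kmf.median_survival_time_)
```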

Abstract:

OBJECTIVES This study compared clinical outcomes and revascularization strategies among patients presenting with low ejection fraction, low-gradient (LEF-LG) severe aortic stenosis (AS) according to the assigned treatment modality. BACKGROUND The optimal treatment modality for patients with LEF-LG severe AS and concomitant coronary artery disease (CAD) requiring revascularization is unknown. METHODS Of 1,551 patients, 204 with LEF-LG severe AS (aortic valve area <1.0 cm^2, ejection fraction <50%, and mean gradient <40 mm Hg) were allocated to medical therapy (MT) (n = 44), surgical aortic valve replacement (SAVR) (n = 52), or transcatheter aortic valve replacement (TAVR) (n = 108). CAD complexity was assessed using the SYNTAX score (SS) in 187 of 204 patients (92%). The primary endpoint was mortality at 1 year. RESULTS LEF-LG severe AS patients undergoing SAVR were more likely to undergo complete revascularization (17 of 52, 35%) compared with TAVR (8 of 108, 8%) and MT (0 of 44, 0%) patients (p < 0.001). Compared with MT, both SAVR (adjusted hazard ratio [adj HR]: 0.16; 95% confidence interval [CI]: 0.07 to 0.38; p < 0.001) and TAVR (adj HR: 0.30; 95% CI: 0.18 to 0.52; p < 0.001) improved survival at 1 year. In TAVR and SAVR patients, CAD severity was associated with higher rates of cardiovascular death (no CAD: 12.2% vs. low SS [0 to 22], 15.3% vs. high SS [>22], 31.5%; p = 0.037) at 1 year. Compared with no CAD/complete revascularization, TAVR and SAVR patients undergoing incomplete revascularization had significantly higher 1-year cardiovascular death rates (adj HR: 2.80; 95% CI: 1.07 to 7.36; p = 0.037). CONCLUSIONS Among LEF-LG severe AS patients, SAVR and TAVR improved survival compared with MT. CAD severity was associated with worse outcomes and incomplete revascularization predicted 1-year cardiovascular mortality among TAVR and SAVR patients.
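
The revascularization comparison can be sanity-checked directly from the reported counts (17/52 SAVR, 8/108 TAVR, 0/44 MT); a quick sketch:

```python
# Chi-square test on the complete-revascularization counts given above.
from scipy.stats import chi2_contingency

table = [[17, 52 - 17],   # SAVR: complete vs incomplete/none
         [8, 108 - 8],    # TAVR
         [0, 44]]         # medical therapy
chi2, p, dof, _ = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.1e}")  # p < 0.001, as reported
```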

Abstract:

IMPORTANCE Despite the antirestenotic efficacy of coronary drug-eluting stents (DES) compared with bare metal stents (BMS), the relative risk of stent thrombosis and adverse cardiovascular events is unclear. Although dual antiplatelet therapy (DAPT) beyond 1 year provides ischemic event protection after DES, ischemic event risk is perceived to be less after BMS, and the appropriate duration of DAPT after BMS is unknown. OBJECTIVE To compare (1) rates of stent thrombosis and major adverse cardiac and cerebrovascular events (MACCE; composite of death, myocardial infarction, or stroke) after 30 vs 12 months of thienopyridine in patients treated with BMS taking aspirin and (2) the treatment duration effect within the combined cohorts of randomized patients treated with DES or BMS as prespecified secondary analyses. DESIGN, SETTING, AND PARTICIPANTS International, multicenter, randomized, double-blinded, placebo-controlled trial comparing extended (30-month) thienopyridine therapy vs placebo in patients taking aspirin who completed 12 months of DAPT without bleeding or ischemic events after receiving stents. The study was initiated in August 2009 with the last follow-up visit in May 2014. INTERVENTIONS Continued thienopyridine or placebo at months 12 through 30 after stent placement, in 11,648 randomized patients treated with aspirin, of whom 1687 received BMS and 9961 DES. MAIN OUTCOMES AND MEASURES Stent thrombosis, MACCE, and moderate or severe bleeding. RESULTS Among 1687 patients treated with BMS who were randomized to continued thienopyridine vs placebo, rates of stent thrombosis were 0.50% vs 1.11% (n = 4 vs 9; hazard ratio [HR], 0.49; 95% CI, 0.15-1.64; P = .24), rates of MACCE were 4.04% vs 4.69% (n = 33 vs 38; HR, 0.92; 95% CI, 0.57-1.47; P = .72), and rates of moderate/severe bleeding were 2.03% vs 0.90% (n = 16 vs 7; P = .07), respectively. Among all 11,648 randomized patients (both BMS and DES), stent thrombosis rates were 0.41% vs 1.32% (n = 23 vs 74; HR, 0.31; 95% CI, 0.19-0.50; P < .001), rates of MACCE were 4.29% vs 5.74% (n = 244 vs 323; HR, 0.73; 95% CI, 0.62-0.87; P < .001), and rates of moderate/severe bleeding were 2.45% vs 1.47% (n = 135 vs 80; P < .001). CONCLUSIONS AND RELEVANCE Among patients undergoing coronary stent placement with BMS who tolerated 12 months of thienopyridine, continuing thienopyridine for an additional 18 months compared with placebo did not result in statistically significant differences in rates of stent thrombosis, MACCE, or moderate or severe bleeding. However, the BMS subset may have been underpowered to identify such differences, and further trials are suggested. TRIAL REGISTRATION clinicaltrials.gov Identifier: NCT00977938.
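
A crude check of the pooled bleeding comparison is possible from the reported event counts, assuming roughly equal arms among the 11,648 randomized patients (the abstract does not give per-arm denominators, so this is only a plausibility sketch):

```python
# Two-proportion z-test on moderate/severe bleeding events, 30- vs 12-month
# thienopyridine arms. Equal arm sizes are an assumption.
from statsmodels.stats.proportion import proportions_ztest

events = [135, 80]
arms = [11648 // 2, 11648 // 2]
z, p = proportions_ztest(events, arms)
print(f"z = {z:.2f}, p = {p:.1e}")  # p < .001, in line with the report
```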

Abstract:

The objectives of this study were to describe a new spinal cord injury scale for dogs, evaluate its repeatability by determining inter-rater variability of scores, compare these scores to another established system (a modified Frankel scale), and determine whether the modified Frankel scale and the newly developed scale were useful as prognostic indicators for return to ambulation. A group of client-owned dogs with spinal cord injury was examined by 2 independent observers, who applied the new Texas Spinal Cord Injury Score (TSCIS) and a modified Frankel scale that has been used previously. The newly developed scale was designed to describe gait, postural reactions and nociception in each limb. Weighted kappa statistics were used to determine inter-rater variability for the modified Frankel scale and the individual components of the TSCIS. Comparisons were made between raters for the overall TSCIS score, and between scales, using Spearman's rho. An additional group of dogs with surgically treated thoracolumbar disk herniation was enrolled to examine the correlation of both scores with spinal cord signal characteristics on magnetic resonance imaging (MRI) and with ambulatory outcome at discharge. The actual agreement between raters for the modified Frankel scale was 88%, with a weighted kappa value of 0.93. The TSCIS had weighted kappa values for the gait, proprioceptive positioning and nociception components that ranged from 0.72 to 0.94. Correlation between raters for the overall TSCIS score was Spearman's rho=0.99 (P<0.001). Comparison of the overall TSCIS score to the modified Frankel score resulted in a Spearman's rho value of 0.90 (P<0.001). The modified Frankel score was weakly correlated with the ratio of spinal cord hyperintensity length to L2 vertebral body length on mid-sagittal T2-weighted MRI (Spearman's rho=-0.45, P=0.042), as was the overall TSCIS score (Spearman's rho=-0.47, P=0.037). There was also a significant difference in admitting modified Frankel scores (P=0.029) and admitting overall TSCIS scores (P=0.02) between dogs that were ambulatory at discharge and those that were not. Results from this study suggest that the TSCIS is an easy-to-administer scale for evaluating canine spinal cord injury based on the standard neurological exam, and that it correlates well with a previously described modified Frankel scale.
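
A minimal sketch of the inter-rater statistic used for the TSCIS components, on made-up ordinal scores; the abstract does not state whether linear or quadratic weights were used, so linear weighting is an assumption:

```python
# Weighted Cohen's kappa between two raters' ordinal gait scores (toy data).
from sklearn.metrics import cohen_kappa_score

rater_1 = [0, 1, 2, 3, 4, 4, 2, 1]
rater_2 = [0, 1, 2, 4, 4, 3, 2, 1]
print(cohen_kappa_score(rater_1, rater_2, weights="linear"))
```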

Abstract:

BACKGROUND There are concerns about the effects of in utero exposure to antiretroviral drugs (ARVs) on the development of HIV-exposed but uninfected (HEU) children. The aim of this study was to evaluate whether in utero exposure to ARVs is associated with lower birth weight/height and reduced growth during the first 2 years of life. METHODS This cohort study was conducted among HEU infants born between 1996 and 2010 at a tertiary children's hospital in Rio de Janeiro, Brazil. Weight was measured with a mechanical scale, and height with a measuring board. Z-scores for weight-for-age (WAZ), length-for-age (LAZ) and weight-for-length were calculated. We modeled trajectories with mixed-effects models and adjusted for mother's age, CD4 cell count, viral load, year of birth and family income. RESULTS A total of 588 HEU infants were included, of whom 155 (26%) were not exposed to ARVs, 114 (19%) were exposed early (first trimester) and 319 (54%) were exposed later. WAZ scores were lower among infants exposed early compared with infants exposed later: adjusted differences were -0.52 (95% confidence interval [CI]: -0.99 to -0.04, P = 0.02) at birth and -0.22 (95% CI: -0.47 to 0.04, P = 0.10) during follow-up. LAZ scores were lower during follow-up: -0.35 (95% CI: -0.63 to -0.08, P = 0.01). There were no differences in weight-for-length scores. Z-scores of infants exposed late in pregnancy were similar to those of unexposed infants. CONCLUSIONS In HEU children, early exposure to ARVs was associated with lower WAZ at birth and lower LAZ up to 2 years of life. Growth of HEU children needs to be monitored closely.
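
Z-scores like WAZ and LAZ are conventionally computed with an LMS transformation against a growth reference (WHO-style); a sketch with placeholder L, M, S values rather than actual reference tables:

```python
import math

def lms_zscore(x: float, L: float, M: float, S: float) -> float:
    """z = ((x/M)**L - 1) / (L*S), or log(x/M)/S when L == 0."""
    if L == 0:
        return math.log(x / M) / S
    return ((x / M) ** L - 1) / (L * S)

# Placeholder parameters, not real WHO values: a 3.0 kg birth weight against
# a hypothetical reference with median 3.3 kg.
print(lms_zscore(3.0, L=0.35, M=3.3, S=0.14))  # about -0.67
```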

Abstract:

OBJECTIVE To illustrate an approach to comparing CD4 cell count and HIV-RNA monitoring strategies in HIV-positive individuals on antiretroviral therapy (ART). DESIGN Prospective studies of HIV-positive individuals in Europe and the USA in the HIV-CAUSAL Collaboration and the Center for AIDS Research Network of Integrated Clinical Systems. METHODS Antiretroviral-naive individuals who initiated ART and became virologically suppressed within 12 months were followed from the date of suppression. We compared 3 CD4 cell count and HIV-RNA monitoring strategies: once every (1) 3 ± 1 months, (2) 6 ± 1 months, and (3) 9-12 ± 1 months. We used inverse-probability weighted models to compare these strategies with respect to clinical, immunologic, and virologic outcomes. RESULTS In 39,029 eligible individuals, there were 265 deaths and 690 AIDS-defining illnesses or deaths. Compared with the 3-month strategy, the mortality hazard ratios (95% CIs) were 0.86 (0.42 to 1.78) for the 6-month and 0.82 (0.46 to 1.47) for the 9-12 month strategy. The respective 18-month risk ratios (95% CIs) of virologic failure (HIV-RNA >200 copies/mL) were 0.74 (0.46 to 1.19) and 2.35 (1.56 to 3.54), and the 18-month mean CD4 differences (95% CIs) were -5.3 (-18.6 to 7.9) and -31.7 (-52.0 to -11.3) cells/μL. The estimates for the 2-year risk of AIDS-defining illness or death were similar across strategies. CONCLUSIONS Our findings suggest that the monitoring frequency of virologically suppressed individuals can be decreased from every 3 months to every 6, 9, or 12 months with respect to clinical outcomes. Because the effects of different monitoring strategies could take years to materialize, longer follow-up is needed to fully evaluate this question.
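
A toy version of the inverse-probability weighting idea: each individual is weighted by the inverse of the estimated probability of the monitoring strategy they actually followed. The model, file, and column names are all assumptions:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("monitoring.csv")                     # hypothetical analysis file
X = sm.add_constant(df[["age", "cd4_at_suppression"]])
p = sm.Logit(df["followed_3mo"], X).fit().predict(X)   # P(3-month strategy)
df["ipw"] = np.where(df["followed_3mo"] == 1, 1 / p, 1 / (1 - p))
# Outcome models are then fit with weights=df["ipw"] to compare strategies.
```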

Abstract:

Climate variability drives significant changes in the physical state of the North Pacific, and there may be important impacts of this variability on the upper ocean carbon balance across the basin. We address this issue by considering the response of seven biogeochemical ocean models to climate variability in the North Pacific. The models' upper ocean pCO2 and air-sea CO2 flux respond similarly to climate variability on seasonal to decadal timescales. Modeled seasonal cycles of pCO2 and its temperature- and non-temperature-driven components at three contrasting oceanographic sites capture the basic features found in observations (Takahashi et al., 2002, 2006; Keeling et al., 2004; Brix et al., 2004). However, particularly in the Western Subarctic Gyre, the models have difficulty representing the temporal structure of the total pCO2 seasonal cycle because it results from the difference of these two large and opposing components. In all but one model, the air-sea CO2 flux interannual variability (1 sigma) in the North Pacific is smaller (ranging across models from 0.03 to 0.11 PgC/yr) than in the Tropical Pacific (ranging across models from 0.08 to 0.19 PgC/yr), and the time series of the first or second EOF of the air-sea CO2 flux has a significant correlation with the Pacific Decadal Oscillation (PDO). Though air-sea CO2 flux anomalies are correlated with the PDO, their magnitudes are small (up to ±0.025 PgC/yr, 1 sigma). Flux anomalies are damped because anomalies in the key drivers of pCO2 (temperature, dissolved inorganic carbon (DIC), and alkalinity) are all of similar magnitude and have strongly opposing effects that damp total pCO2 anomalies.
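
The temperature-driven pCO2 component referenced above is conventionally computed following Takahashi et al. (2002), where pCO2 varies by about 4.23% per degree C at fixed chemistry; a sketch with made-up inputs:

```python
import numpy as np

def pco2_thermal(pco2_mean: float, sst: np.ndarray) -> np.ndarray:
    """Temperature component: pCO2_T = <pCO2> * exp(0.0423 * (T - <T>))."""
    return pco2_mean * np.exp(0.0423 * (sst - sst.mean()))

sst = np.array([8.0, 12.0, 16.0, 12.0])  # illustrative seasonal SSTs (deg C)
print(pco2_thermal(360.0, sst))          # the 360 uatm mean pCO2 is invented
```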

Abstract:

Anion exchange membranes (AEMs) are a potential method for determining the plant-available N status of soils; however, their capacity for use with turfgrass has not been researched extensively. The main objective of this experiment was to determine the relationship between soil nitrate desorbed from AEMs and the growth response and quality of turfgrass managed as a residential lawn. Two field experiments were conducted with a bluegrass-ryegrass-fescue mixture receiving four rates of N fertilizer (0, 98, 196, and 392 kg N ha^-1 yr^-1) with clippings returned or removed. The soils at the two sites were a Paxton fine sandy loam (coarse-loamy, mixed, active, mesic Oxyaquic Dystrudepts) and a variant of a Hinckley gravelly sandy loam (sandy-skeletal, mixed, mesic Typic Udorthents). Anion exchange membranes were inserted into plots and exchanged weekly during the growing seasons of 1998 and 1999. Nitrate-N was desorbed from the AEMs and quantified. As N fertilization rates increased, desorbed NO3-N increased. The relationship of desorbed NO3-N from AEMs to clipping yield and turfgrass quality was characterized using quadratic response plateau (QRP) and Cate-Nelson (C-N) models. Critical levels of desorbed NO3-N ranged from 0.86 to 8.0 µg cm^-2 d^-1 for relative dry matter yield (DMY) and from 2.3 to 12 µg cm^-2 d^-1 for turfgrass quality, depending upon experimental treatment. Anion exchange membranes show promise for indicating the critical levels of desorbed soil NO3-N necessary to achieve maximum turfgrass quality and yield without overapplication of N.
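
A sketch of the QRP model form behind the critical-level estimates: the response rises quadratically up to a join point x0 and is flat beyond it. The data below are invented and the starting values are guesses:

```python
import numpy as np
from scipy.optimize import curve_fit

def qrp(x, a, b, c):
    """Quadratic response plateau: join point x0 = -b/(2c); flat for x >= x0."""
    x0 = -b / (2 * c)
    xx = np.minimum(x, x0)
    return a + b * xx + c * xx ** 2

x = np.array([0.5, 1, 2, 4, 6, 8, 12])       # desorbed NO3-N, ug cm^-2 d^-1
y = np.array([40, 55, 75, 92, 98, 99, 100])  # made-up relative yield, %
(a, b, c), _ = curve_fit(qrp, x, y, p0=(30.0, 15.0, -1.0))
print("critical level x0 =", -b / (2 * c))
```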

Abstract:

Background. Surgical site infections (SSI) are one of the most common nosocomial infections in the United States. This study was conducted following an increase in the rate of SSI following spinal procedures at the study hospital. Methods. This study examined patient- and hospital-associated risk factors for SSI using existing data on patients who had spinal surgery performed at the study hospital between December 2003 and August 2005. There were 59 patients with SSI identified as cases; controls were randomly selected from patients who had spinal procedures performed at the study hospital during the study period but did not develop infection. Of the 245 original records reviewed, 5% were missing more than half the variables and were eliminated from the data set. A total of 234 patients were included in the final analysis, representing 55 cases and 179 controls. Multivariable analysis was conducted using logistic regression to control for confounding variables. Results. Three variables were found to be significant risk factors for SSI in the study population: presence of comorbidities (odds ratio 3.15, 95% confidence interval 1.20 to 8.26), cut time above the population median of 100 minutes (odds ratio 2.98, 95% confidence interval 1.12 to 5.49), and use of iodine only for preoperative skin antisepsis (odds ratio 0.16, 95% confidence interval 0.06 to 0.45). Several risk factors of specific concern to the study hospital, such as operating room, hospital staff involved in the procedures, and workers' compensation status, were not shown to be statistically significant. In addition, multiple factors that have been identified in prior studies, such as method of hair removal, smoking status, or incontinence, were not shown to be statistically significant in this population. Conclusions. This study confirms that increased cut time is a risk factor for post-operative infection. Use of iodine only was found to decrease the risk of infection; further study is recommended in a population with higher usage of chlorhexidine gluconate. Presence of comorbidities at the time of surgery was also found to be a risk factor for infection; however, specific comorbidities were not studied.
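
A hedged sketch of the kind of logistic model behind odds ratios like these; the file and variable names are assumptions:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("ssi_case_control.csv")  # hypothetical study extract
X = sm.add_constant(df[["comorbidity", "cut_time_gt_100", "iodine_only"]])
fit = sm.Logit(df["ssi"], X).fit()
print(np.exp(fit.params))      # odds ratios
print(np.exp(fit.conf_int()))  # 95% confidence intervals
```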

Abstract:

Hypertension (HT) is mediated by the interaction of many genetic and environmental factors. Previous genome-wide linkage analysis studies have found many loci that show linkage to HT or blood pressure (BP) regulation, but the results were generally inconsistent. Gene-by-environment interaction is among the reasons that potentially explain these inconsistencies between studies. Here we investigate influences of gene-by-smoking (GxS) interaction on HT and BP in European American (EA), African American (AA) and Mexican American (MA) families from the GENOA study. A variance component-based method was utilized to perform genome-wide linkage analysis of systolic blood pressure (SBP), diastolic blood pressure (DBP), and HT status, as well as bivariate analysis for SBP and DBP, for smokers, non-smokers, and combined groups. The most significant results were found for SBP in MA. The strongest signal was on chromosome 17q24 (LOD = 4.2), which increased to LOD = 4.7 in the bivariate analysis, but there was no evidence of GxS interaction at this locus (p = 0.48). Two signals were identified in only one group: on chromosome 15q26.2 (LOD = 3.37) in non-smokers and on chromosome 7q21.11 (LOD = 1.4) in smokers, both of which had strong evidence for GxS interaction (p = 0.00039 and p = 0.009, respectively). There were also two other signals, one on chromosome 20q12 (LOD = 2.45) in smokers, which became much stronger in the combined sample (LOD = 3.53), and one on chromosome 6p22.2 (LOD = 2.06) in non-smokers. Neither peak had very strong evidence for GxS interaction (p = 0.08 and p = 0.06, respectively). A fine-mapping association study was performed using 200 SNPs in 30 genes located under the linkage signals on chromosomes 15 and 17. Under the chromosome 15 peak, the association analysis identified 6 SNPs accounting for a 7 mmHg increase in SBP in MA non-smokers. For the chromosome 17 linkage peak, the association analysis identified 3 SNPs accounting for a 6 mmHg increase in SBP in MA. However, none of these SNPs was significant after correcting for multiple testing, and accounting for them in the linkage analysis produced very small reductions in the linkage signal. The linkage analysis of BP traits considering smoking status produced very interesting signals for SBP in the MA population. The fine-mapping association analysis gave some insight into the contribution of some SNPs to two of the identified signals, but since these SNPs did not remain significant after multiple-testing correction and did not explain the linkage peaks, more work is needed to confirm these exploratory results and identify the culprit variations under these linkage peaks.
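
For scale: a LOD score is a base-10 log likelihood ratio, and in variance-component linkage it maps onto a likelihood-ratio chi-square statistic; a worked conversion for the chromosome 15 signal:

```python
import math

lod = 3.37
print(10 ** lod)               # ~2344:1 odds favoring linkage at this locus
chi2 = 2 * math.log(10) * lod  # equivalent likelihood-ratio statistic, ~15.5
print(chi2)
```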

Abstract:

High rates of overweight and obesity in African American women have been attributed, in part, to poor health habits, such as physical inactivity, and cultural influences on body image perceptions. The purpose of this study was to determine the relationship among body mass index (BMI = kg/m^2), body image perception (perceived and desired) and physical activity, both self-reported and objectively measured. BMI was measured anthropometrically, and Pulvers' culturally relevant body image measure, physical activity and demographic data were collected from 249 African American women in Houston. Women (M = 44.8 yrs, SD = 9.5) were educated (53% college graduates) and overweight (M = 35.0 kg/m^2, SD = 9.2). Less than half of the women perceived their weight correctly, regardless of their actual weight (p < 0.001). Nearly three-fourths (73.9%) of women who were normal weight desired to be obese, and only 39.4% of women desired to be normal weight, regardless of actual or perceived weight. Women in all weight classes (normal, overweight and obese) varied in objective measures of physical activity (F(2,112) = 4.424, p = .014). Regression analyses showed objectively measured physical activity was significantly associated with BMI (Beta = -2.45, p < .01) and self-reported walking was significantly associated with perceived BMI (Beta = -.156, p = .017). Results suggest African American women who are smaller want to be larger and African American women who are larger want to be smaller, revealing dichotomous distortion in body images. Low rates of physical activity may be a factor. Research is needed to increase physical activity levels in African American women, leading to improved satisfaction with normal weight as desirable for health and beauty. Supported by NCI (NIH) 1R01CA109403.
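
For reference, the BMI definition used above with illustrative numbers; the cutoffs named in the comment are standard CDC/WHO categories, not values from this study:

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight (kg) divided by height (m) squared."""
    return weight_kg / height_m ** 2

# A sample BMI near this group's mean of 35.0 kg/m^2, the boundary between
# obesity class I (30.0-34.9) and class II (35.0-39.9).
print(bmi(95.0, 1.65))  # ~34.9
```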

Abstract:

Background. Obstructive genitourinary defects include all anomalies causing obstruction anywhere along the urinary tract. Previous studies have noted a large excess of males among infants affected by these types of defects. This is the first epidemiologic study focused solely on obstructive genitourinary defects (OGD). Methods. Data on 1,683 mild and 302 severe cases of isolated OGD born between 1999 and 2003 and ascertained by the Texas Birth Defects Registry were compared to all births in Texas during the same time period. Adjusted prevalence odds ratios (POR) were calculated for infant sex, birth weight, gestational age, mother's race/ethnicity, mother's age, mother's education, parity, birth year, start of prenatal care, multiple birth, and public health region of birth. Severe cases were defined as those that died prior to birth, died after birth, or underwent surgery for OGD in the first year of life. Cases of OGD that had other major birth defects besides OGD were excluded from this study. Results. Severe cases of OGD were more likely than mild cases to have multiple obstructive genitourinary anomalies (37.8% vs. 18.9%) and bilateral defects (40.9% vs. 31.3%). Males had a significantly greater risk of OGD than females for both severe and mild cases: adjusted POR = 3.26 (95% CI = 2.45-4.33) and adjusted POR = 2.60 (95% CI = 2.33-2.90), respectively. Infants with both severe and mild OGD were more likely to be very preterm at birth compared with infants without OGD: crude POR of 16.19 (95% CI = 10.60-24.74) and 4.75 (95% CI = 3.54-6.37), respectively. Among the severe group, minority races had a decreased risk of OGD, with an adjusted POR of 0.74 (95% CI = 0.55-0.98) compared with whites. Among the mild cases, increased rates of OGD were found in older mothers (adjusted POR = 1.10, 95% CI = 1.05-1.15), college- or higher-educated mothers (adjusted POR = 1.07, 95% CI = 1.01-1.13) and multiple births (adjusted POR = 1.28, 95% CI = 1.01-1.62). There was also a decreased risk of mild cases among black mothers compared to white mothers (adjusted POR = 0.63, 95% CI = 0.52-0.76). Compared to 1999, the prevalence of mild cases of OGD increased significantly over the 5-year study period, with an adjusted POR of 1.10 (95% CI = 1.06-1.15) by 2003. Conclusion. Risk factors for OGD, in both severe and mild forms, were male sex and preterm birth. Severe cases were more likely to have multiple OGD defects and to be affected bilaterally. The increase in the prevalence of mild cases of OGD over time and the differing rates among black, older, and more highly educated mothers in the mild group may be attributable to ultrasound use.
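
A sketch of a crude prevalence odds ratio with a Wald 95% CI from a 2x2 table; the counts below are invented, and the paper's PORs are covariate-adjusted rather than crude:

```python
import math

def odds_ratio_ci(a, b, c, d):
    """a, b = exposed cases/non-cases; c, d = unexposed cases/non-cases."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - 1.96 * se_log)
    hi = math.exp(math.log(or_) + 1.96 * se_log)
    return or_, lo, hi

print(odds_ratio_ci(120, 400, 60, 650))  # hypothetical male vs female counts
```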