Abstract:
INTRODUCTION The high risk of cardiovascular events in smokers requires adequate control of other cardiovascular risk factors (CVRFs) to curtail atherosclerosis progression. However, it is unclear which CVRFs have the most influence on atherosclerosis progression in smokers. METHODS In 260 smokers aged 40-70 years included in a smoking cessation trial, we analyzed the association between traditional CVRFs, high-sensitivity C-reactive protein (hs-CRP), smoking cessation and 3-year progression of carotid intima-media thickness (CIMT, assessed by repeated ultrasound measurements) in a longitudinal multivariate model. RESULTS Participants (mean age 52 years, 47% women) had a mean smoking duration of 32 years with a median daily consumption of 20 cigarettes. Baseline CIMT was 1185 μm (95% confidence interval [CI]: 1082-1287) and increased by 93 μm (95% CI: 25-161) and 108 μm (95% CI: 33-183) after 1 and 3 years, respectively. Age, male sex, daily cigarette consumption, and systolic blood pressure (SBP) were independently associated with baseline CIMT (all P ≤ .05), whereas low-density lipoprotein cholesterol and hs-CRP were not. Baseline SBP was also associated with 3-year atherosclerosis progression (P = .01), whereas low-density lipoprotein cholesterol and hs-CRP were not. The higher the SBP at baseline, the steeper the CIMT increase over the 3-year follow-up: CIMT rose by an additional 26 μm per 10-mmHg increase in baseline SBP at 1 year and by an additional 39 μm per 10-mmHg increase at 3 years. Due to insufficient statistical power, we could not exclude an effect of smoking abstinence on CIMT progression. CONCLUSION Control of blood pressure, besides support for smoking cessation, may be an important factor in limiting atherosclerosis progression in smokers.
IMPLICATIONS Among 260 smokers aged 40-70 years with a mean smoking duration of 32 years, baseline SBP was associated with atherosclerosis progression over 3 years, as measured by CIMT (P = .01), independently of smoking variables and other CVRFs. The higher the SBP at baseline, the steeper the CIMT increase over the 3-year follow-up. Our findings emphasize the importance of focusing not only on smoking cessation among smokers, but also on simultaneously controlling other CVRFs, particularly blood pressure, in order to prevent future cardiovascular disease.
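As a rough illustration of the SBP-CIMT slopes quoted above (26 μm and 39 μm of additional progression per 10 mmHg of baseline SBP, at 1 and 3 years), the implied extra progression for a hypothetical between-patient SBP difference can be computed by simple rescaling. This is a sketch only: the study's actual estimates come from a longitudinal multivariate model, which the function below does not reproduce.

```python
# Published slopes from the abstract, rescaled to per-mmHg units.
SLOPE_1Y = 26 / 10  # μm of extra CIMT progression per mmHg of baseline SBP, year 1
SLOPE_3Y = 39 / 10  # μm per mmHg, year 3

def extra_cimt_progression(delta_sbp_mmhg: float, years: int) -> float:
    """Extra CIMT progression (μm) implied by a baseline SBP difference,
    assuming a purely linear relationship (an illustrative simplification)."""
    slope = SLOPE_1Y if years == 1 else SLOPE_3Y
    return slope * delta_sbp_mmhg

print(extra_cimt_progression(20, 3))  # 20 mmHg higher baseline SBP → 78.0 μm
```

So a patient with 20 mmHg higher baseline SBP would be expected, under this linear reading, to show roughly 78 μm more CIMT progression at 3 years.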
Alcoholic Cirrhosis Increases Risk for Autoimmune Diseases: A Nationwide Registry-Based Cohort Study
Abstract:
BACKGROUND & AIMS Alcoholic cirrhosis is associated with hyperactivation and dysregulation of the immune system. In addition to increasing the risk for infections, this immune dysregulation may also increase the risk for autoimmune diseases. We studied the incidence of autoimmune diseases among patients with alcoholic cirrhosis vs controls in Denmark. METHODS We collected data from nationwide health care registries to identify and follow up all citizens of Denmark diagnosed with alcoholic cirrhosis from 1977 through 2010. Each patient was matched with 5 random individuals from the population (controls) of the same sex and age. The incidence rates of various autoimmune diseases were compared between patients with cirrhosis and controls and adjusted for the number of hospitalizations in the previous year (a marker for the frequency of clinical examination). RESULTS Of the 24,679 patients diagnosed with alcoholic cirrhosis, 532 developed an autoimmune disease, yielding an overall increased adjusted incidence rate ratio (aIRR) of 1.36 (95% confidence interval [CI], 1.24-1.50). The strongest associations were with Addison's disease (aIRR, 2.47; 95% CI, 1.04-5.85), inflammatory bowel disease (aIRR, 1.56; 95% CI, 1.26-1.92), celiac disease (aIRR, 5.12; 95% CI, 2.58-10.16), pernicious anemia (aIRR, 2.35; 95% CI, 1.50-3.68), and psoriasis (aIRR, 4.06; 95% CI, 3.32-4.97). There was no increase in the incidence rate for rheumatoid arthritis (aIRR, 0.89; 95% CI, 0.69-1.15); the incidence rate for polymyalgia rheumatica decreased in patients with alcoholic cirrhosis compared with controls (aIRR, 0.47; 95% CI, 0.33-0.67). CONCLUSIONS Based on a nationwide cohort study of patients in Denmark, alcoholic cirrhosis is a risk factor for several autoimmune diseases.
Abstract:
BACKGROUND & AIMS Non-selective beta-blockers (NSBB) are used in patients with cirrhosis and oesophageal varices. Experimental data suggest that NSBB inhibit angiogenesis and reduce bacterial translocation, which may prevent hepatocellular carcinoma (HCC). We therefore assessed the effect of NSBB on HCC by performing a systematic review with meta-analyses of randomized trials. METHODS Electronic and manual searches were combined. Authors were contacted for unpublished data. Included trials assessed NSBB for patients with cirrhosis; the control group could receive any intervention other than NSBB. Fixed- and random-effects meta-analyses were performed with I² as a measure of heterogeneity. Subgroup, sensitivity, regression and sequential analyses were performed to evaluate heterogeneity, bias and the robustness of the results after adjusting for multiple testing. RESULTS Twenty-three randomized trials on 2618 patients with cirrhosis were included, of which 12 reported HCC incidence and 23 reported HCC mortality. The mean duration of follow-up was 26 months (range 8-82). In total, 47 of 694 patients randomized to NSBB developed HCC vs 65 of 697 controls (risk difference -0.026; 95% CI, -0.052 to -0.001; number needed to treat, 38 patients). There was no heterogeneity (I² = 7%) or evidence of small-study effects (Egger's P = 0.402). The result was not confirmed in sequential analysis, which suggested that 3719 patients were needed to achieve the required information size. NSBB did not reduce HCC-related mortality (risk difference -0.011; 95% CI, -0.040 to 0.017). CONCLUSIONS Non-selective beta-blockers may prevent HCC in patients with cirrhosis.
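The risk difference and number needed to treat quoted in the results can be approximately recovered from the raw event counts. Note this is a crude two-by-two calculation for illustration only; the published figures come from the pooled meta-analysis, so small discrepancies (e.g. an NNT of 39 rather than 38) are expected from pooling and rounding.

```python
# Recomputing risk difference (RD) and number needed to treat (NNT)
# from the event counts given in the abstract.
events_nsbb, n_nsbb = 47, 694  # HCC cases / patients randomized to NSBB
events_ctrl, n_ctrl = 65, 697  # HCC cases / patients in control arms

risk_nsbb = events_nsbb / n_nsbb
risk_ctrl = events_ctrl / n_ctrl
rd = risk_nsbb - risk_ctrl     # negative RD favours NSBB
nnt = 1 / abs(rd)              # patients treated per HCC event averted

print(f"RD = {rd:.3f}, NNT ≈ {nnt:.0f}")  # RD = -0.026, NNT ≈ 39
```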
Abstract:
BACKGROUND The safety and efficacy of new-generation drug-eluting stents (DES) in women with multiple atherothrombotic risk (ATR) factors is unclear. METHODS AND RESULTS We pooled patient-level data for women enrolled in 26 randomized trials. The study population was categorized based on the presence or absence of high ATR, which was defined as a history of diabetes mellitus, prior percutaneous or surgical coronary revascularization, or prior myocardial infarction. The primary end point was major adverse cardiovascular events, defined as a composite of all-cause mortality, myocardial infarction, or target lesion revascularization at 3 years of follow-up. Of the 10,449 women included in the pooled database, 5333 (51%) were at high ATR. Compared with women not at high ATR, those at high ATR had significantly higher risk of major adverse cardiovascular events (15.8% versus 10.6%; adjusted hazard ratio: 1.53; 95% confidence interval: 1.34-1.75; P=0.006) and all-cause mortality. In high-ATR women, the use of new-generation DES was associated with significantly lower risk of 3-year major adverse cardiovascular events (adjusted hazard ratio: 0.69; 95% confidence interval: 0.52-0.92) compared with early-generation DES. The benefit of new-generation DES on major adverse cardiovascular events was uniform between high-ATR and non-high-ATR women, without evidence of interaction (P for interaction=0.14). At landmark analysis in high-ATR women, stent thrombosis rates were comparable between DES generations in the first year, whereas between 1 and 3 years, stent thrombosis risk was lower with new-generation devices. CONCLUSIONS Use of new-generation DES, even in women at high ATR, is associated with a benefit consistent over 3 years of follow-up and a substantial improvement in very-late thrombotic safety.
Abstract:
PURPOSE The safe clinical implementation of pencil beam scanning (PBS) proton therapy for lung tumors is complicated by the delivery uncertainties caused by breathing motion. The purpose of this feasibility study was to investigate whether a voluntary breath-hold technique could limit the delivery uncertainties resulting from interfractional motion. METHODS AND MATERIALS Data from 15 patients with peripheral lung tumors previously treated with stereotactic radiation therapy were included in this study. The patients had 1 computed tomography (CT) scan in voluntary breath-hold acquired before treatment and 3 scans during the treatment course. PBS proton treatment plans with 2 fields (2F) and 3 fields (3F), respectively, were calculated based on the planning CT scan and subsequently recalculated on the 3 repeated CT scans. Recalculated plans were considered robust if the V95% (volume receiving ≥95% of the prescribed dose) of the gross target volume (GTV) was within 5% of what was expected from the planning CT data throughout the simulated treatment. RESULTS A total of 14 of 15 simulated treatments met the robustness criteria for both 2F and 3F. Reduced V95% was associated with baseline shifts (2F, P=.056; 3F, P=.008) and tumor size (2F, P=.025; 3F, P=.025). Smaller tumors with large baseline shifts were also at risk for reduced V95% (interaction term baseline/size: 2F, P=.005; 3F, P=.002). CONCLUSIONS The breath-hold approach is a realistic clinical option for treating lung tumors with PBS proton therapy. Potential risk factors for reduced V95% are small targets in combination with large baseline shifts. On the basis of these results, the baseline shift of the tumor should be monitored (eg, through image-guided therapy), and appropriate measures should be taken accordingly. The intrafractional motion needs to be investigated to confirm that the breath-hold approach is robust.
Abstract:
We used meat-inspection data collected over a period of three years in Switzerland to evaluate slaughterhouse-level, farm-level and animal-level factors that may be associated with whole carcass condemnation (WCC) in cattle after slaughter. The objective of this study was to identify WCC risk factors so they can be communicated to, and managed by, the slaughter industry and veterinary services. During meat inspection, there were three main predictors of the risk of WCC: the slaughtered animal's sex and age, and the size of the slaughterhouse it was processed in. WCC for injuries and significant weight loss (visible welfare indicators) was almost exclusive to smaller slaughterhouses. Cattle exhibiting clinical syndromes that are not externally visible (e.g. pneumonia lesions) and that are associated with the fattening of cattle end up in larger slaughterhouses. For this reason, it is important for animal health surveillance to collect data from both types of slaughterhouses. Other important risk factors for WCC were the on-farm mortality rate and the number of cattle on the farm of origin. This study highlights that the risk factors for WCC are as complex as the production system itself, interacting with one another in ways that are sometimes difficult to interpret biologically. Risk-based surveillance aimed at farms with recurring health problems (e.g. a history of above-average condemnation rates) may be more appropriate than the selection of higher-risk animals arriving at slaughter. In Switzerland, the introduction of a benchmarking system providing farmers with feedback on condemnation reasons and on their performance compared with the national/regional average could be a first step towards improving herd management and financial returns for producers.
Abstract:
Sentinel lymph node (SLN) detection techniques have the potential to change the standard of surgical care for patients with prostate cancer. We performed a lymphatic mapping study and determined the value of fluorescence SLN detection with indocyanine green (ICG) for the detection of lymph node metastases in intermediate- and high-risk patients undergoing radical prostatectomy and extended pelvic lymph node dissection. A total of 42 patients received systematic or specific ICG injections into the prostate base, the midportion, the apex, the left lobe, or the right lobe. We found (1) that the external and internal iliac regions encompass the majority of SLNs, (2) that the common iliac regions contain up to 22% of all SLNs, (3) that a prostatic lobe can drain into the contralateral group of pelvic lymph nodes, and (4) that the fossa of Marcille also receives significant drainage. Among the 12 patients who received systematic ICG injections, 5 (42%) had a total of 29 lymph node metastases. Of these, 16 nodes were ICG positive, yielding 55% sensitivity. The complex drainage pattern of the prostate and the low sensitivity of ICG for the detection of lymph node metastases reported in our study highlight the difficulties related to the implementation of SLN techniques in prostate cancer. PATIENT SUMMARY There is controversy about how extensive lymph node dissection (LND) should be during prostatectomy. We investigated the lymphatic drainage of the prostate and whether sentinel node fluorescence techniques would be useful to detect node metastases. We found that the drainage pattern is complex and that the sentinel node technique is not able to replace extended pelvic LND.
Abstract:
Background. Nosocomial invasive aspergillosis (a highly fatal disease) is an increasing problem for immunocompromised patients. Aspergillus spp. can be transmitted via air (most commonly) and by water. The hypothesis for this prospective study was that there is an association between patient occupancy, housekeeping practices, patients, visitors, and Aspergillus spp. loading. Rooms were sampled as not terminally cleaned (“dirty”) and terminally cleaned (“clean”). The secondary hypothesis was that Aspergillus spp.-positive samples collected from more than one sampling location within the same patient room represent the same isolate. Methods. Between April and October 2004, 2873 environmental samples (713 air, 607 water, 1256 surface and 297 spore traps) were collected in and around 209 “clean” and “dirty” patient rooms in a large cancer center hospital. Water sources included aerosolized water from patient room showerheads, sinks, drains, and toilets. Bioaerosol samples were collected from the patient room, from the running shower and flushing toilet, and from outside the building. The surface samples included sink and shower drains, showerheads, and air grills. Aspergillus spp.-positive samples were also sent for PCR molecular typing (n = 89). Results. All water samples were negative for Aspergillus spp. There were a total of 130 positive culturable samples (5.1%). The predominant species found was Aspergillus niger. Among air samples, 106 (14.9%) were positive; among surface samples, 24 (3.8%) were positive. There were 147 spore trap samples, and 49.5% were positive for Aspergillus/Penicillium spp. Of the culturable positive samples sent for PCR, 16 were indistinguishable matches. There was no significant relationship between air and water samples and positive samples from the same room. Conclusion. Primarily patients, visitors and staff bring Aspergillus spp. into the hospital. The high number of A. niger samples suggests the spores are entering the hospital from outdoors.
Eliminating materials brought to the patient floors from outside, requiring employees, staff, and visitors to wear covering over their street clothes, and improving cleaning procedures could further reduce positive samples. Mold strains change frequently; it is probably more important to understand the pathogenicity of viable spores than to commit resources to molecular strain testing of environmental samples alone.
Abstract:
Background. The rise in survival rates, along with more detailed follow-up using sophisticated imaging studies, among non-small cell lung cancer (NSCLC) patients has led to an increased risk of second primary tumors (SPT) in these cases. Population- and hospital-based studies of lung cancer patients treated between 1974 and 1996 have found an increasing risk over time for the development of all cancers following treatment of NSCLC. During this time the primary modalities for treatment were surgery alone, radiation alone, surgery and post-operative radiation therapy, or combinations of chemotherapy and radiation (sequentially or concurrently). There is limited information in the literature about the impact of treatment modalities on the development of second primary tumors in these patients. Purpose. To investigate the impact of treatment modalities on the risk of second primary tumors in patients receiving treatment with curative intent for non-metastatic (Stage I–III) NSCLC. Methods. The hospital records of 1,095 NSCLC patients who were diagnosed between 1980 and 2001 and received treatment with curative intent at M.D. Anderson Cancer Center with surgery alone, radiation alone (with a minimum total radiation dose of at least 45 Gy), surgery and post-operative radiation therapy, radiation therapy in combination with chemotherapy, or surgery in combination with chemotherapy and radiation were retrospectively reviewed. A second primary malignancy was defined as any tumor histologically different from the initial cancer, or of another anatomic location, or a tumor of the same location and histology as the initial tumor with an interval between cancers of at least five years. Only primary tumors occurring after treatment for NSCLC qualified as second primary tumors for this study. Results. The incidence of second primary tumors was 3.3%/year, and the rate increased over time following treatment.
The type of NSCLC treatment was not found to have a striking effect on SPT development. Increased rates were observed in the radiation-only and chemotherapy-plus-radiation treatment groups, but these increases did not exceed expected random variation. Higher radiation treatment dose, older patient age, and weight loss prior to index NSCLC treatment were associated with higher SPT development.
Abstract:
Purpose. To determine whether self-efficacy (SE) changes predicted total fat (TF) and total fiber (TFB) intake, and to examine the relationship between SE changes and the two dietary outcomes. Design. This is a secondary analysis utilizing baseline and first follow-up (FFU) data from NULIFE, a randomized trial. Setting. Nutrition classes were taught in the Texas Medical Center in Houston, Texas. Participants. 79 pre-menopausal, 25- to 45-year-old African American women, with an 85% response rate at FFU. Method. Dietary intake was assessed with the Arizona Food Frequency Questionnaire and SE with the Self-Efficacy for Dietary Change Questionnaire. Analysis was done using Stata version 9. Linear and logistic regression were used, with adjustment for confounders. Results. Linear regression analyses showed that SE changes for eating fruits and vegetables predicted total fiber intake in the control group in both the univariate (P = 0.001) and multivariate (P = 0.01) models, while SE for eating fruits and vegetables at first follow-up predicted total fiber intake in the intervention group in both models (P < 0.001). Logistic regression analyses of low-fat SE changes and the goal of 30% or less of calories from total fat showed an adjusted OR of 0.22 (95% CI = 0.03, 1.48; P = 0.12) in the intervention group. Logistic regression analyses of SE changes for fruits and vegetables and the goal of 10 g or more of total fiber showed an adjusted OR of 6.25 (95% CI = 0.53, 72.78; P = 0.14) in the control group. Conclusion. SE for eating fruits and vegetables at first follow-up predicted the intervention group's TFB intake, and intervention women who increased their SE for eating a low-fat diet were more likely to achieve the study goal of 30% or less of calories from TF. SE changes for eating fruits and vegetables predicted the control group's TFB intake, and control women who increased their SE for eating fruits and vegetables were more likely to achieve the study goal of 10 g or more of TFB.
Limitations are the use of self-report measures, the small sample size, and possible control group contamination.
Abstract:
Apolipoprotein E (ApoE) plays a major role in the metabolism of high-density and low-density lipoproteins (HDL and LDL). Its common protein isoforms (E2, E3, E4) are risk factors for coronary artery disease (CAD) and explain between 16% and 23% of the inter-individual variation in plasma apoE levels. Linkage analysis has been completed for plasma apoE levels in the GENOA study (Genetic Epidemiology Network of Atherosclerosis). After stratification of the population by lipoprotein levels and body mass index (BMI) to create more homogeneity with regard to the biological context for apoE levels, Hispanic families showed significant linkage on chromosome 17q for two strata (LOD = 2.93 at 104 cM for a low-cholesterol group; LOD = 3.04 at 111 cM for a low-cholesterol, high-HDLC group). Replication of the 17q linkage was observed for apoB and apoE levels in the unstratified Hispanic and African-American populations, and for apoE levels in African-American families. Replication of this 17q linkage in different populations and strata provides strong support for the presence of gene(s) in this region with significant roles in the determination of inter-individual variation in plasma apoE levels. Through a positional and functional candidate gene approach, ten genes were identified in the 17q-linked region, and 62 polymorphisms in these genes were genotyped in the GENOA families. Association analysis was performed with FBAT, GEE, and variance-component-based tests, followed by conditional linkage analysis. Association studies with partial coverage of tag SNPs in the gene coding for apolipoprotein H (APOH) were performed, and significant results were found for 2 SNPs (APOH_20951 and APOH_05407) in the Hispanic low-cholesterol stratum, accounting for 3.49% of the inter-individual variation in plasma apoE levels.
Among the other candidate genes, we identified a haplotype block in the ACE1 gene that contains two major haplotypes associated with apoE levels as well as total cholesterol, apoB and LDLC levels in the unstratified Hispanic population. Identifying the genes responsible for the remaining 60% of inter-individual variation in plasma apoE levels will yield new insights into the genetic interactions involved in lipid metabolism and a more precise understanding of the risk factors leading to CAD.
Abstract:
Several studies have examined the association between high glycemic index (GI) and glycemic load (GL) diets and the risk for coronary heart disease (CHD). However, most of these studies were conducted primarily in white populations. The primary aim of this study was to examine whether high-GI and high-GL diets are associated with increased risk for developing CHD in whites and African Americans, in non-diabetics and diabetics, and within stratifications of body mass index (BMI) and hypertension (HTN). Baseline and 17-year follow-up data from the ARIC (Atherosclerosis Risk in Communities) study were used. The study population (13,051) consisted of 74% whites, 26% African Americans, 89% non-diabetics, 11% diabetics, 43% male, and 57% female, aged 44 to 66 years at baseline. Data from the ARIC food frequency questionnaire at baseline were analyzed to provide GI and GL indices for each subject. Increases of 25 and 30 units for GI and GL, respectively, were used to describe relationships with incident CHD risk. Hazard ratios adjusted for propensity score, with 95% confidence intervals (CI), were used to assess associations. During 17 years of follow-up (1987 to 2004), 1,683 cases of CHD were recorded. Glycemic index was associated with a 2.12-fold (95% CI: 1.05, 4.30) increased incident CHD risk for all African Americans, and GL was associated with a 1.14-fold (95% CI: 1.04, 1.25) increased CHD risk for all whites. In addition, GL was also an important CHD risk factor for white non-diabetics (HR = 1.59; 95% CI: 1.33, 1.90). Furthermore, within the stratum of BMI 23.0 to 29.9 in non-diabetics, GI was associated with an increased hazard ratio of 11.99 (95% CI: 2.31, 62.18) for CHD in African Americans, and GL was associated with a 1.23-fold (95% CI: 1.08, 1.39) increased CHD risk in whites. Body mass index modified the effect of GI and GL on CHD risk in all whites and white non-diabetics.
For HTN, both systolic and diastolic blood pressure modified the effect of GI and GL on CHD risk in all whites and African Americans, in white and African American non-diabetics, and in white diabetics. Further studies should examine other factors that could influence the effects of GI and GL on CHD risk, including dietary factors, physical activity, and diet-gene interactions.
Abstract:
Screening for latent tuberculosis infection (LTBI) is an integral component of an effective tuberculosis control strategy, but one that is often relegated to the lowest priority. In a state with higher-than-national-average rates of tuberculosis, due consideration should be given to LTBI screening. Recent large-scale contact investigations in the middle school of Del Rio, Texas, raised questions about the status of school screening for LTBI. An evidence-based approach was used to evaluate school screening in high-risk areas of Texas. A review of the literature revealed that the current recommendation for LTBI screening in children is administration of a questionnaire covering the four main risk factors that have been identified for LTBI in children. Six representative areas in Texas were identified for evaluation of the occurrence of contact investigations in schools for the period 2006 to 2009 and of any use of school screening programs. Of the five reporting areas that responded, only one utilized a school screening program; this reporting area had the lowest percentage of contact investigations occurring in schools. Contact investigations were most common in middle schools and least common in elementary schools. In metropolitan areas, colleges represented up to 42.9% of contact investigations. The number of contact investigations increased from 2006 to 2008. This report represents a small sample, and further research into the frequency, distribution, and risk of contact investigations in schools, as well as the efficacy of screening programs, should be done.
Abstract:
Background. Primary liver cancer, the majority of which is hepatocellular carcinoma (HCC), is the third most common cause of mortality from cancer. It has one of the worst prognoses and an overall 5-year survival of only 5-6%. Hepatocellular carcinoma shows wide variation in geographic distribution, and there is a marked difference in incidence between races and between sexes. Previously low-rate countries, including the US, have seen the incidence of HCC double during the past two decades. Even though the incidence of HCC is higher in males than in females, female hormones, especially estrogens, have been postulated to play a role in the development of hepatocellular carcinoma at the molecular level. Despite the frequent use of oral contraceptive pills (OCP) and, previously, hormone replacement therapy (HRT), their role in HCC development has not been studied thoroughly. We aimed to examine the association between exogenous hormone intake (oral contraceptives and post-menopausal hormone replacement therapy) and the development of HCC. Methods. This study is part of an ongoing hospital-based case-control study conducted at the Department of Gastrointestinal Oncology at The University of Texas M. D. Anderson Cancer Center. From January 2005 to January 2008, a total of 77 women with pathologically confirmed hepatocellular carcinoma (cases) and 277 healthy women (controls) were included in the investigation. Information about the use of hormonal contraceptives, hormone replacement therapy, and risk factors for hepatocellular cancer was collected by personal interview. Univariate and multivariate logistic regression analyses were done to estimate crude odds ratios (OR) and adjusted odds ratios (AOR). Results. We found a statistically significant protective effect of HRT use on the development of HCC, AOR = 0.42 (95% CI, 0.21, 0.81).
The significance was observed for estrogen replacement, AOR = 0.43 (95% CI, 0.22, 0.83), but not for progesterone replacement, AOR = 0.49 (95% CI, 0.10, 2.35). On the other hand, any hormonal contraceptive use, which encompasses oral contraceptive pills, implants, and injections, did not reach statistical significance in either the crude OR = 0.58 (95% CI, 0.33, 1.01) or the AOR = 0.56 (95% CI, 0.26, 1.18). Conclusions. As corroborated by previous studies, HRT was associated with a 58% reduction in HCC risk among American women. The more important question of the association between hormonal contraceptives and HCC remains controversial. Further studies are warranted to explore the mechanism of the protective effect of HRT and the relationship between hormonal contraception and HCC.
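The "58% risk reduction" in the conclusion follows directly from the adjusted odds ratio, read as an approximate relative risk (a common interpretation when the outcome, here HCC, is rare in the population):

```python
def percent_risk_reduction(odds_ratio: float) -> float:
    """Approximate percent risk reduction implied by an odds ratio below 1,
    treating the OR as a relative risk (valid only for rare outcomes)."""
    return (1 - odds_ratio) * 100

# HRT overall (AOR = 0.42) → roughly a 58% reduction
print(f"{percent_risk_reduction(0.42):.0f}%")  # 58%
```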
Abstract:
Earlier age at puberty is a known risk factor for breast cancer and is suspected to influence prostate cancer, yet few studies have assessed early-life risk factors for puberty. The overall objective was to determine the relationship between birth-weight-for-gestational-age (BWGA), weight gain in infancy, and pubertal status in girls and boys at 10.8 and 11.8 years, respectively, who were born to preeclamptic (PE) and normotensive (NT) mothers. Data for this study were collected from hospital and public health medical records and at a follow-up visit at 10.8 and 11.8 years for girls and boys, respectively. We used stratified analysis and multivariable logistic regression modeling to assess effect measure modification and to determine the relationship between BWGA, weight gain in infancy and childhood, and pubertal status. There was no difference in the relationship between BWGA and pubertal status by maternal PE status for girls and boys; however, there was a non-significant increase in the odds of having been born small-for-gestational-age (SGA) in girls who were pubertal for breast or pubic hair Tanner stage 2+ compared with those who were B1 or PH1. In contrast, boys who were pubertal for genital and pubic hair Tanner stage 2+ had lower odds of having been born SGA than those who were prepubertal (G1 or PH1). In girls who were pubertal for breast development, the odds of having gained one additional SD unit of weight were highest between 3-6 months and 6-12 months for those who were B2+ vs. B1. For pubic hair development, weight gain between 6 and 12 months had the greatest effect for girls of PE mothers only. In boys, there were no statistically significant associations between weight gain and genital Tanner stage at any of the intervals; however, weight gain between 3 and 6 months did affect pubic hair Tanner stage in boys of NT mothers.
This study provides important evidence regarding the role of SGA and weight gain at specific age intervals on puberty; however, larger studies are needed to shed light on modifiable exposures for behavioral interventions in pregnancy, postpartum, and in childhood.