867 results for Poisson Regression
Abstract:
Background. Although acquired immune deficiency syndrome-associated morbidity has diminished due to excellent viral control, multimorbidity may be increasing among human immunodeficiency virus (HIV)-infected persons compared with the general population. Methods. We assessed the prevalence of comorbidities and multimorbidity in participants of the Swiss HIV Cohort Study (SHCS) compared with the population-based CoLaus study and the primary care-based FIRE (Family Medicine ICPC-Research using Electronic Medical Records) records. The incidence of the respective endpoints was assessed among SHCS and CoLaus participants. Poisson regression models were adjusted for age, sex, body mass index, and smoking. Results. Overall, 74,291 participants contributed data to prevalence analyses (3,230 HIV-infected; 71,061 controls). In CoLaus, FIRE, and SHCS, multimorbidity was present among 26%, 13%, and 27% of participants, respectively. Compared with nonsmoking individuals from CoLaus, the incidence of cardiovascular disease was elevated among smoking individuals but independent of HIV status (HIV-negative smoking: incidence rate ratio [IRR] = 1.7, 95% confidence interval [CI] = 1.2-2.5; HIV-positive smoking: IRR = 1.7, 95% CI = 1.1-2.6; HIV-positive nonsmoking: IRR = 0.79, 95% CI = 0.44-1.4). Compared with nonsmoking HIV-negative persons, multivariable Poisson regression identified associations of HIV infection with hypertension (nonsmoking: IRR = 1.9, 95% CI = 1.5-2.4; smoking: IRR = 2.0, 95% CI = 1.6-2.4), kidney disease (nonsmoking: IRR = 2.7, 95% CI = 1.9-3.8; smoking: IRR = 2.6, 95% CI = 1.9-3.6), and liver disease (nonsmoking: IRR = 1.8, 95% CI = 1.4-2.4; smoking: IRR = 1.7, 95% CI = 1.4-2.2). No evidence was found for an association of HIV infection or smoking with diabetes mellitus. Conclusions. Multimorbidity is more prevalent and incident in HIV-positive than in HIV-negative individuals. Smoking, but not HIV status, has a strong impact on cardiovascular risk and multimorbidity.
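To make the modeling approach concrete, here is a minimal sketch of an adjusted Poisson incidence model of the kind described above, written with statsmodels. The input file and all column names (events, person_years, hiv, smoking, age, sex, bmi) are hypothetical placeholders, not the SHCS/CoLaus variable names; exponentiated coefficients give incidence rate ratios such as those reported.

```python
# Hypothetical sketch: adjusted Poisson model for incidence rate ratios (IRRs).
# The input file and all column names are illustrative assumptions.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("cohort.csv")  # hypothetical per-participant data

# Event counts with log person-time as offset; the hiv*smoking interaction
# allows separate IRRs by HIV and smoking status, as in the abstract.
fit = smf.glm(
    "events ~ hiv * smoking + age + sex + bmi",
    data=df,
    family=sm.families.Poisson(),
    offset=np.log(df["person_years"]),
).fit()

print(np.exp(fit.params))      # IRRs
print(np.exp(fit.conf_int()))  # 95% CIs
```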
Abstract:
Children living near highways are exposed to higher concentrations of traffic-related carcinogenic pollutants. Several studies have reported an increased risk of childhood cancer associated with traffic exposure, but the published evidence is inconclusive. We investigated whether cancer risk is associated with proximity of residence to highways in a nationwide cohort study including all children aged <16 years from the Swiss national censuses in 1990 and 2000. Cancer incidence was investigated in time-to-event analyses (1990-2008) using Cox proportional hazards models and in incidence density analyses (1985-2008) using Poisson regression. Adjustments were made for socio-economic factors, ionising background radiation, and electromagnetic fields. In the time-to-event analysis, based on 532 cases, the adjusted hazard ratio for leukaemia comparing children living <100 m from a highway with unexposed children (≥500 m) was 1.43 (95% CI 0.79, 2.61). Results were similar in the incidence density analysis including 1367 leukaemia cases (incidence rate ratio (IRR) 1.57; 95% CI 1.09, 2.25). Associations were similar for acute lymphoblastic leukaemia (IRR 1.64; 95% CI 1.10, 2.43) and stronger for leukaemia in children aged <5 years (IRR 1.92; 95% CI 1.22, 3.04). Little evidence of association was found for other tumours. Our study suggests that young children living close to highways are at increased risk of developing leukaemia.
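As one illustration, the incidence density part of such an analysis might look like the following statsmodels sketch. The aggregated table and every column name (cases, person_years, distance_band, ses, radiation) are hypothetical; the Treatment() contrast sets the unexposed ≥500 m band as the reference so that exponentiated coefficients are IRRs.

```python
# Hypothetical sketch: Poisson incidence-density analysis with
# distance-to-highway categories; all column names are assumptions.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

pt = pd.read_csv("person_time.csv")  # hypothetical aggregated person-time table

fit = smf.glm(
    "cases ~ C(distance_band, Treatment('>=500m')) + ses + radiation",
    data=pt,
    family=sm.families.Poisson(),
    offset=np.log(pt["person_years"]),
).fit()

print(np.exp(fit.params))  # IRRs relative to the unexposed (>=500 m) band
```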
Abstract:
Trabecular bone score (TBS) is a grey-level textural index of bone microarchitecture derived from lumbar spine dual-energy X-ray absorptiometry (DXA) images. TBS is a BMD-independent predictor of fracture risk. The objective of this meta-analysis was to determine whether TBS predicted fracture risk independently of FRAX probability and to examine their combined performance by adjusting the FRAX probability for TBS. We utilized individual-level data from 17,809 men and women in 14 prospective population-based cohorts. Baseline evaluation included TBS and the FRAX risk variables; outcomes during follow-up (mean 6.7 years) comprised major osteoporotic fractures. The association between TBS, FRAX probabilities, and the risk of fracture was examined using an extension of the Poisson regression model in each cohort and for each sex, and expressed as the gradient of risk (GR; hazard ratio per 1 SD change in the risk variable in the direction of increased risk). FRAX probabilities were adjusted for TBS using an adjustment factor derived from an independent cohort (the Manitoba Bone Density Cohort). Overall, the GR of TBS for major osteoporotic fracture was 1.44 (95% CI: 1.35-1.53) when adjusted for age and time since baseline and was similar in men and women (p > 0.10). When additionally adjusted for the FRAX 10-year probability of major osteoporotic fracture, TBS remained a significant, independent predictor of fracture (GR 1.32, 95% CI: 1.24-1.41). The adjustment of FRAX probability for TBS resulted in a small increase in the GR (1.76, 95% CI: 1.65-1.87 vs. 1.70, 95% CI: 1.60-1.81). A smaller change in GR for hip fracture was observed (FRAX hip fracture probability GR 2.25 vs. 2.22). TBS is a significant predictor of fracture risk independently of FRAX. The findings support the use of TBS as a potential adjustment for FRAX probability, though the impact of the adjustment remains to be determined in the context of clinical assessment guidelines.
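A gradient of risk can be read off a rate model by standardizing the predictor first. The sketch below, with hypothetical columns (fracture, futime, tbs, age), approximates this with an ordinary Poisson model; the meta-analysis itself used an extension of the Poisson model fitted per cohort and per sex.

```python
# Hypothetical sketch: gradient of risk (rate ratio per 1 SD of TBS).
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("cohort.csv")  # hypothetical per-person follow-up data

# Standardize TBS so exp(coef) is the ratio per 1 SD change.
df["tbs_z"] = (df["tbs"] - df["tbs"].mean()) / df["tbs"].std()

fit = smf.glm(
    "fracture ~ tbs_z + age",
    data=df,
    family=sm.families.Poisson(),
    offset=np.log(df["futime"]),
).fit()

gr = np.exp(fit.params["tbs_z"])
lo, hi = np.exp(fit.conf_int().loc["tbs_z"])
print(f"GR per 1 SD: {gr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```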
Abstract:
PURPOSE To identify the influence of fixed prosthesis type on biologic and technical complication rates in the context of screw versus cement retention. Furthermore, a multivariate analysis was conducted to determine which factors, when considered together, influence the complication and failure rates of fixed implant-supported prostheses. MATERIALS AND METHODS Electronic searches of MEDLINE (PubMed), EMBASE, and the Cochrane Library were conducted. Selected inclusion and exclusion criteria were used to limit the search. Data were analyzed statistically with simple and multivariate random-effects Poisson regressions. RESULTS Seventy-three articles qualified for inclusion in the study. Screw-retained prostheses showed a tendency toward more technical complications than cemented prostheses with single crowns, and significantly more with fixed partial prostheses. Resin chipping and ceramic veneer chipping had high mean event rates, at 10.04 and 8.95 per 100 years, respectively, for full-arch screwed prostheses. For "all fixed prostheses" (prosthesis type not reported or not known), significantly fewer biologic and technical complications were seen with screw retention. Full-arch prostheses, cantilevered prostheses, and "all fixed prostheses" had significantly higher complication rates than single crowns, and multivariate analysis revealed a significantly greater incidence of technical and biologic complications with cemented prostheses. CONCLUSION Screw-retained fixed partial prostheses demonstrated a significantly higher rate of technical complications, and screw-retained full-arch prostheses demonstrated a notably high rate of veneer chipping. When "all fixed prostheses" were considered, significantly higher rates of technical and biologic complications were seen for cement-retained prostheses. Multivariate Poisson regression analysis failed to show a significant difference between screw- and cement-retained prostheses with respect to the incidence of failure but demonstrated a higher rate of technical and biologic complications for cement-retained prostheses. The incidence of technical complications was more dependent on prosthesis and retention type than on prosthesis or abutment material.
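For the event-rate side of such a review, a simple fixed-effect Poisson sketch is shown below with hypothetical study-level counts; the review itself used random-effects Poisson regressions, which would add a per-study random intercept on top of this structure.

```python
# Hypothetical sketch: complication event rates per 100 prosthesis-years
# by retention type (fixed-effect stand-in for a random-effects model).
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

studies = pd.DataFrame({
    "events": [4, 11, 7, 2],                  # hypothetical complication counts
    "prosthesis_years": [120, 340, 210, 95],  # hypothetical exposure time
    "retention": ["screw", "cement", "screw", "cement"],
})

fit = smf.glm(
    "events ~ retention",
    data=studies,
    family=sm.families.Poisson(),
    exposure=studies["prosthesis_years"],  # enters as a log-exposure offset
).fit()

# Mean event rate per 100 years for screw retention (cement is the reference).
rate = 100 * np.exp(fit.params["Intercept"] + fit.params["retention[T.screw]"])
print(round(rate, 2))
```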
Abstract:
BACKGROUND We investigated the rate of severe hypoglycemic events and confounding factors in patients with type 2 diabetes treated with sulfonylurea (SU) at specialized diabetes centers, as documented in the German/Austrian DPV-Wiss database. METHODS Data from 29,485 SU-treated patients were analyzed (median [IQR] age 70.8 [62.2-77.8] years, diabetes duration 8.2 [4.3-12.8] years). The primary objective was to estimate the event rate of severe hypoglycemia (requiring external help, causing unconsciousness/coma/convulsion, and/or emergency hospitalization). Secondary objectives included exploration of confounding risk factors through group comparison and Poisson regression. RESULTS Severe hypoglycemic events were reported in 826 (2.8%) of all patients during their most recent year of SU treatment. Of these, n = 531 (1.8%) had coma and n = 501 (1.7%) were hospitalized at least once. The adjusted event rate of severe hypoglycemia [95% CI] was 3.9 [3.7-4.2] events/100 patient-years (coma: 1.9 [1.8-2.1]; hospitalization: 1.6 [1.5-1.8]). Adjusted event rates by diabetes treatment were 6.7 (SU + insulin), 4.9 (SU + insulin + other OAD), 3.1 (SU + other OAD), and 3.8 (SU only). Patients with ≥1 severe event were older (p < 0.001) and had longer diabetes duration (p = 0.020) than patients without severe events. Participation in educational diabetes programs and indirect measures of insulin resistance (increased BMI, plasma triglycerides) were associated with fewer events (all p < 0.001). Impaired renal function was common (N = 3,113 with eGFR ≤30 mL/min) and was associated with an increased rate of severe events (≤30 mL/min: 7.7; 30-60 mL/min: 4.8; >60 mL/min: 3.9). CONCLUSIONS These real-life data showed a rate of severe hypoglycemia of 3.9/100 patient-years in SU-treated patients from specialized diabetes centers. Higher risk was associated with known risk factors including lack of diabetes education, older age, and decreased eGFR, but also with lower BMI and lower triglyceride levels, suggesting that SU treatment in such patients should be considered with caution.
Abstract:
BACKGROUND In an effort to reduce firearm mortality rates in the USA, US states have enacted a range of firearm laws to either strengthen or deregulate the existing main federal gun control law, the Brady Law. We set out to determine the independent association of different firearm laws with overall firearm mortality, homicide firearm mortality, and suicide firearm mortality across all US states. We also projected the potential reduction of firearm mortality if the three most strongly associated firearm laws were enacted at the federal level. METHODS We constructed a cross-sectional, state-level dataset from Nov 1, 2014, to May 15, 2015, using counts of firearm-related deaths in each US state for the years 2008-10 (stratified by intent [homicide and suicide]) from the US Centers for Disease Control and Prevention's Web-based Injury Statistics Query and Reporting System, data about 25 firearm state laws implemented in 2009, and state-specific characteristics such as firearm ownership for 2013, firearm export rates, and non-firearm homicide rates for 2009, and unemployment rates for 2010. Our primary outcome measure was overall firearm-related mortality per 100 000 people in the USA in 2010. We used Poisson regression with robust variances to derive incidence rate ratios (IRRs) and 95% CIs. FINDINGS 31 672 firearm-related deaths occurred in 2010 in the USA (10·1 per 100 000 people; mean state-specific count 631·5 [SD 629·1]). Of 25 firearm laws, nine were associated with reduced firearm mortality, nine were associated with increased firearm mortality, and seven had an inconclusive association. After adjustment for relevant covariates, the three state laws most strongly associated with reduced overall firearm mortality were universal background checks for firearm purchase (multivariable IRR 0·39 [95% CI 0·23-0·67]; p=0·001), ammunition background checks (0·18 [0·09-0·36]; p<0·0001), and identification requirement for firearms (0·16 [0·09-0·29]; p<0·0001). Projected federal-level implementation of universal background checks for firearm purchase could reduce national firearm mortality from 10·35 to 4·46 deaths per 100 000 people, background checks for ammunition purchase could reduce it to 1·99 per 100 000, and firearm identification to 1·81 per 100 000. INTERPRETATION Very few of the existing state-specific firearm laws are associated with reduced firearm mortality, and this evidence underscores the importance of focusing on relevant and effective firearms legislation. Implementation of universal background checks for the purchase of firearms or ammunition, and firearm identification nationally could substantially reduce firearm mortality in the USA. FUNDING None.
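The "Poisson regression with robust variances" used here corresponds to a sandwich variance estimator on a count model. A minimal sketch follows; the hypothetical state-level table (deaths, population, one dummy per law, covariates) and its column names are assumptions for illustration.

```python
# Hypothetical sketch: state-level Poisson model with robust (sandwich) SEs.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

states = pd.read_csv("state_laws.csv")  # hypothetical 50-state dataset

fit = smf.glm(
    "deaths ~ universal_checks + ammo_checks + firearm_id"
    " + ownership + unemployment",
    data=states,
    family=sm.families.Poisson(),
    offset=np.log(states["population"]),  # deaths per capita
).fit(cov_type="HC0")  # robust variance estimator

print(np.exp(fit.params))      # IRRs per law/covariate
print(np.exp(fit.conf_int()))  # robust 95% CIs
```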
Abstract:
Survivors of childhood cancer have higher mortality than the general population. We describe cause-specific long-term mortality in a population-based cohort of childhood cancer survivors. We included all children diagnosed with cancer in Switzerland (1976-2007) at age 0-14 years who survived ≥5 years after diagnosis, and followed survivors until December 31, 2012. We obtained causes of death (COD) from the Swiss mortality statistics and used data from the Swiss general population to calculate age-, calendar year-, and sex-standardized mortality ratios (SMRs) and absolute excess risks (AERs) for different CODs by Poisson regression. We included 3,965 survivors and 49,704 person-years at risk. Of these, 246 (6.2%) died, which was 11 times higher than expected (SMR 11.0). Mortality was particularly high for diseases of the respiratory (SMR 14.8) and circulatory (SMR 12.7) systems, and for second cancers (SMR 11.6). The pattern of cause-specific mortality differed by primary cancer diagnosis and changed with time since diagnosis. In the first 10 years after 5-year survival, 78.9% of excess deaths were caused by recurrence of the original cancer (AER 46.1). Twenty-five years after diagnosis, only 36.5% (AER 9.1) were caused by recurrence, while 21.3% were caused by second cancers (AER 5.3) and 33.3% by circulatory diseases (AER 8.3). Our study confirms elevated mortality in survivors of childhood cancer for at least 30 years after diagnosis, with an increasing proportion of deaths caused by late toxicities of treatment. The results underline the importance of clinical follow-up continuing years after the end of treatment for childhood cancer.
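SMRs of this kind drop out of a Poisson model when the expected number of deaths (computed from general-population rates) enters as an offset. The sketch below uses a hypothetical stratified table with observed and expected columns.

```python
# Hypothetical sketch: standardized mortality ratios via a Poisson offset model.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

strata = pd.read_csv("survivor_strata.csv")  # hypothetical observed/expected counts

# Intercept-only model: exp(intercept) is the overall SMR (observed/expected).
# Adding covariates (e.g. cause-of-death group) gives stratum-specific SMRs.
fit = smf.glm(
    "observed ~ 1",
    data=strata,
    family=sm.families.Poisson(),
    offset=np.log(strata["expected"]),
).fit()

print(np.exp(fit.params["Intercept"]))  # overall SMR, e.g. ~11 in this study
```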
Abstract:
Syndromic surveillance (SyS) systems currently exploit various sources of health-related data, most of which are collected for purposes other than surveillance (e.g. economic). Several European SyS systems use data collected during meat inspection for syndromic surveillance of animal health, as some diseases may be more easily detected post-mortem than at their point of origin or during the ante-mortem inspection upon arrival at the slaughterhouse. In this paper we use simulation to evaluate the performance of a quasi-Poisson regression (also known as improved Farrington) algorithm for the detection of disease outbreaks during post-mortem inspection of slaughtered animals. When the algorithm was parameterized based on retrospective analyses of 6 years of historic data, the probability of detection was satisfactory for large outbreaks (range 83-445 cases) but poor for small ones (range 20-177 cases). Varying the amount of historical data used to fit the algorithm can help increase the probability of detection for small outbreaks. However, while the use of a 0·975 quantile generated a low false-positive rate, in most cases more than 50% of outbreak cases had already occurred at the time of detection. The high variance observed in the whole-carcass condemnation time series, and the lack of flexibility in the temporal distribution of simulated outbreaks resulting from the low (monthly) reporting frequency, constitute major challenges for early detection of outbreaks in the livestock population based on meat inspection data. Reporting frequency should be increased in the future to improve the timeliness of the SyS system, while increased sensitivity may be achieved by integrating meat inspection data into a multivariate system that simultaneously evaluates multiple sources of data on livestock health.
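A stripped-down version of a quasi-Poisson detection threshold is sketched below; the actual improved Farrington algorithm additionally uses seasonal reference windows, a power transformation for the threshold, and down-weighting of past outbreaks, so this is only the skeleton, with made-up monthly counts.

```python
# Hypothetical sketch: quasi-Poisson outbreak-detection threshold.
import numpy as np
import statsmodels.api as sm
from scipy import stats

history = np.array([12, 9, 15, 11, 14, 10, 13, 12, 16, 11, 9, 14])  # made-up baseline
t = np.arange(len(history))
X = sm.add_constant(t)  # intercept + linear trend

# Quasi-Poisson: Poisson mean with dispersion estimated from Pearson chi-square.
fit = sm.GLM(history, X, family=sm.families.Poisson()).fit(scale="X2")

mu = fit.predict([[1.0, len(history)]])[0]  # expected count for the new month
var = fit.scale * mu                        # quasi-Poisson variance

# Approximate upper 0.975 prediction bound (normal approximation).
threshold = mu + stats.norm.ppf(0.975) * np.sqrt(var)
current = 23  # made-up current observation
print("alarm" if current > threshold else "no alarm", round(threshold, 1))
```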
Abstract:
BACKGROUND: Cardiovascular diseases are the leading cause of death worldwide and in Switzerland. When applied, treatment guidelines for patients with acute ST-segment elevation myocardial infarction (STEMI) improve the clinical outcome and should eliminate treatment differences by sex and age for patients whose clinical situations are identical. In Switzerland, the rate at which STEMI patients receive revascularization may vary by patient and hospital characteristics. AIMS: To examine all hospitalizations in Switzerland from 2010-2011 to determine whether patient or hospital characteristics affected the rate of revascularization (receiving either a percutaneous coronary intervention or a coronary artery bypass graft) in acute STEMI patients. DATA AND METHODS: We used national data sets on hospital stays and on hospital infrastructure and operating characteristics for the years 2010 and 2011 to identify all emergency patients admitted with a main diagnosis of acute STEMI. We then calculated the proportion of patients who were treated with revascularization. We used multivariable multilevel Poisson regression to determine whether receipt of revascularization varied by patient and hospital characteristics. RESULTS: Of the 9,696 cases we identified, 71.6% received revascularization. Patients were less likely to receive revascularization if they were female or 80 years or older. In the multivariable multilevel Poisson regression analysis, there was a trend toward fewer revascularizations at small-volume hospitals, but this was not statistically significant, while being female (relative proportion = 0.91, 95% CI: 0.86 to 0.97) and being older than 80 years remained associated with less frequent revascularization. CONCLUSION: Female and older patients were less likely to receive revascularization. Further research needs to clarify whether this reflects differential application of treatment guidelines or limitations of this kind of routine data.
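The "relative proportion" reported here is what a Poisson model yields when applied to a binary outcome (often called modified Poisson regression). The sketch below approximates the multilevel structure with hospital-cluster-robust variances rather than random effects, and uses hypothetical column names throughout.

```python
# Hypothetical sketch: modified Poisson regression for a binary outcome,
# with cluster-robust variances standing in for hospital-level random effects.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("stemi_stays.csv")  # hypothetical hospitalization records

fit = smf.glm(
    "revascularized ~ female + age_80_plus + small_volume_hospital",
    data=df,
    family=sm.families.Poisson(),
).fit(cov_type="cluster", cov_kwds={"groups": df["hospital_id"]})

print(np.exp(fit.params))  # relative proportions (e.g. ~0.91 for female)
```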
Abstract:
The population mixing hypothesis proposes that childhood leukaemia (CL) might be a rare complication of a yet unidentified subclinical infection. Large population influxes into previously isolated rural areas may foster localised epidemics of the postulated infection, causing a subsequent increase in CL. While marked population growth after a period of stability was central to the formulation of the hypothesis and to the early studies on population mixing, there is a lack of objective criteria to define such growth patterns. We aimed to determine whether periods of marked population growth coincided with increases in the risk of CL in Swiss municipalities. We identified incident cases of CL aged 0-15 years for the period 1985-2010 from the Swiss Childhood Cancer Registry. Annual data on population counts in Swiss municipalities were obtained for 1980-2010. As exposures, we defined (1) cumulative population growth during a 5-year moving time window centred on each year (1985-2010) and (2) periods of 'take-off growth' identified by segmented linear regression. We compared CL incidence across exposure categories using Poisson regression and tested for effect modification by degree of urbanisation. Our study included 1500 incident cases and 2561 municipalities. The incidence rate ratio (IRR) comparing the highest to the lowest quintile of 5-year population growth was 1.18 (95% CI 0.96, 1.46) in all municipalities and 1.33 (95% CI 0.93, 1.92) in rural municipalities (p for interaction = 0.36). In municipalities with take-off growth, the IRR comparing the take-off period (>6% annual population growth) with the initial period of low or negative growth (<2%) was 2.07 (95% CI 0.95, 4.51) overall and 2.99 (95% CI 1.11, 8.05) in rural areas (p for interaction = 0.52). Our study provides further support for the population mixing hypothesis and underlines the need to distinguish take-off growth from other growth patterns in future research.
Abstract:
Southeast Texas, including Houston, has a large presence of industrial facilities and has been documented to have poorer air quality and significantly higher cancer rates than the remainder of Texas. Given citizens' concerns in this fourth-largest city in the U.S., Mayor Bill White recently partnered with the UT School of Public Health to determine methods to evaluate the health risks of hazardous air pollutants (HAPs). Sexton et al. (2007) published a report that strongly encouraged analytic studies linking these pollutants with health outcomes. In response, we set out to complete the following aims: 1. determine the optimal exposure assessment strategy to assess the association between childhood cancer rates and increased ambient levels of benzene and 1,3-butadiene (in an ecologic setting), and 2. evaluate whether census tracts with the highest levels of benzene or 1,3-butadiene have a higher incidence of childhood lymphohematopoietic cancer compared with census tracts with the lowest levels of benzene or 1,3-butadiene, using Poisson regression. The first aim was achieved by evaluating the usefulness of four data sources: geographic information systems (GIS) to identify proximity to point sources of industrial air pollution, industrial emission data from the U.S. EPA's Toxic Release Inventory (TRI), routine monitoring data from the U.S. EPA Air Quality System (AQS) from 1999-2000, and modeled ambient air levels from the U.S. EPA's 1999 National Air Toxics Assessment (NATA) ASPEN model. Once these four data sources were evaluated, we narrowed them down to two: the routine monitoring data from the AQS for the years 1998-2000 and the 1999 U.S. EPA NATA ASPEN modeled data. We applied kriging (spatial interpolation) methodology to the monitoring data and compared the kriged values to the ASPEN modeled data. Our results indicated poor agreement between the two methods. Relative to the U.S. EPA ASPEN modeled estimates, relying on kriging to classify census tracts into exposure groups would have caused a great deal of misclassification. To address the second aim, we additionally obtained childhood lymphohematopoietic cancer data for 1995-2004 from the Texas Cancer Registry. The U.S. EPA ASPEN modeled data were used to estimate ambient levels of benzene and 1,3-butadiene in separate Poisson regression analyses. All data were analyzed at the census tract level. We found that census tracts with the highest benzene levels had elevated rates of all leukemia (rate ratio (RR) = 1.37; 95% confidence interval (CI), 1.05-1.78). Among census tracts with the highest 1,3-butadiene levels, we observed an RR of 1.40 (95% CI, 1.07-1.81) for all leukemia. We detected no associations between benzene or 1,3-butadiene levels and childhood lymphoma incidence. This study is the first to examine this association in Harris and surrounding counties in Texas and is among the first to correlate monitored levels of HAPs with childhood lymphohematopoietic cancer incidence, evaluating several analytic methods in an effort to determine the most appropriate approach to test this association. Despite the recognized weaknesses of ecologic analyses, our analysis suggests an association between childhood leukemia and hazardous air pollution.
Abstract:
Generalized linear Poisson and logistic regression models were utilized to examine the relationship between temperature and precipitation and cases of Saint Louis encephalitis virus spread in the Houston metropolitan area. The models were investigated with and without repeated measures, with a first-order autoregressive (AR1) correlation structure used for the repeated measures model. The two types of Poisson regression models, with and without the correlation structure, showed that a unit increase in temperature (measured in degrees Fahrenheit) increases the occurrence of the virus 1.7 times, and a unit increase in precipitation (measured in inches) increases the occurrence of the virus 1.5 times. Logistic regression did not show these covariates to be significant predictors of encephalitis activity in Houston for either correlation structure. This discrepancy for the logistic model could be attributed to the small data set. Keywords: Saint Louis Encephalitis; Generalized Linear Model; Poisson; Logistic; First Order Autoregressive; Temperature; Precipitation.
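The repeated-measures Poisson model with an AR(1) working correlation can be fitted as a generalized estimating equation; in the sketch below the data file and the cases, temp_f, precip_in, site, and week columns are all hypothetical.

```python
# Hypothetical sketch: Poisson GEE with a first-order autoregressive (AR1)
# working correlation for weekly virus-activity counts.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("sle_surveillance.csv")  # hypothetical weekly counts

fit = smf.gee(
    "cases ~ temp_f + precip_in",
    groups="site",
    data=df,
    time=df["week"],  # ordering used by the AR(1) structure
    family=sm.families.Poisson(),
    cov_struct=sm.cov_struct.Autoregressive(),
).fit()

# exp(coef): multiplicative change in expected cases per unit increase
# (the study above reports ~1.7 per degree F and ~1.5 per inch).
print(np.exp(fit.params))
```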
Abstract:
Additive and multiplicative models of relative risk were used to measure the effect of cancer misclassification and DS86 random errors on lifetime risk projections in the Life Span Study (LSS) of Hiroshima and Nagasaki atomic bomb survivors. The true number of cancer deaths in each stratum of the cancer mortality cross-classification was estimated using sufficient statistics from the EM algorithm. Average survivor doses in the strata were corrected for DS86 random error (σ = 0.45) by use of reduction factors. Poisson regression was used to model the corrected and uncorrected mortality rates with covariates for age at time of bombing, age at time of death, and gender. Excess risks were in good agreement with risks in RERF Report 11 (Part 2) and the BEIR V report. Bias due to DS86 random error typically ranged from -15% to -30% for both sexes and all sites and models. The total bias, including diagnostic misclassification, of the excess risk of nonleukemia for exposure to 1 Sv from age 18 to 65 under the non-constant relative projection model was -37.1% for males and -23.3% for females. Total excess risks of leukemia under the relative projection model were biased -27.1% for males and -43.4% for females. Thus, nonleukemia risks for 1 Sv from ages 18 to 85 (DDREF = 2) increased from 1.91%/Sv to 2.68%/Sv among males and from 3.23%/Sv to 4.02%/Sv among females. Leukemia excess risks increased from 0.87%/Sv to 1.10%/Sv among males and from 0.73%/Sv to 1.04%/Sv among females. Bias was dependent on the gender, site, correction method, exposure profile, and projection model considered. Future studies that use LSS data for U.S. nuclear workers may be downwardly biased if lifetime risk projections are not adjusted for random and systematic errors. (Supported by U.S. NRC Grant NRC-04-091-02.)
Abstract:
Purpose. This project was designed to describe the association between wasting and CD4 cell counts in HIV-infected men in order to better understand the role of wasting in the progression of HIV infection. Methods. Baseline and prevalence data were collected from a cross-sectional survey of 278 HIV-infected men seen at the Houston Veterans Affairs Medical Center Special Medicine Clinic from June 1, 1991 to January 1, 1994. A follow-up study was conducted among those at risk to investigate the incidence of wasting and the association between wasting and low CD4 cell counts. Wasting was described by four methods: Z-scores for age-, sex-, and height-adjusted weight; sex- and age-adjusted mid-arm muscle circumference (MAMC); fat-free mass (FFM); and a ratio of extracellular mass (ECM) to body cell mass (BCM) > 1.20. FFM, ECM, and BCM were estimated from bioelectrical impedance analysis. MAMC was calculated from triceps skinfold and mid-arm circumference. The relationship between wasting and covariates was examined with logistic regression in the cross-sectional study and with Poisson regression in the follow-up study. The association between death and wasting was examined with Cox's regression. Results. The prevalence of wasting ranged from 5% (weight and ECM:BCM) to almost 14% (MAMC and FFM) among the 278 men examined. The odds of wasting associated with a baseline CD4 cell count <200 were significant for each method but weight, and ranged from 4.6 to 12.7. Use of antiviral therapy was significantly protective of MAMC, FFM, and ECM:BCM (OR ≈ 0.2), whereas the need for antibacterial therapy was a risk (OR 3.1, 95% CI 1.1-8.7). The average incidence of wasting ranged from 4 to 16 per 100 person-years among the approximately 145 men followed for 160 person-years. A low CD4 cell count seemed to increase the risk of wasting, but statistical significance was not reached; the effect of the small sample size on the power to detect a significant association should be considered. Wasting by MAMC and FFM was significantly associated with death after adjusting for baseline serum albumin concentration and CD4 cell count. Conclusions. Wasting by MAMC and FFM was strongly associated with baseline CD4 cell counts in both the prevalence and incidence studies and was a strong predictor of death. Of the two methods, MAMC is convenient, has available reference population data, and may be the most appropriate for assessing the nutritional status of HIV-infected men.
Abstract:
Invasive pneumococcal disease (IPD) causes a significant health burden in the US, is responsible for the majority of bacterial meningitis, and causes more deaths than any other vaccine-preventable bacterial disease in the US. The estimated national IPD rate is 14.3 cases per 100,000 population, with a case-fatality rate of 1.5 cases per 100,000 population. Although cases of IPD are routinely reported to the local health department in Harris County, Texas, the incidence (IR) and case-fatality (CFR) rates have not been reported. Additionally, it is important to know which serotypes of S. pneumoniae are circulating in Harris County, Texas, and to determine whether 'replacement disease' is occurring. This study reported incidence and case-fatality rates from 2003 to 2009 and described the trends in IPD, including the IPD serotypes circulating in Harris County, Texas, during the study period, particularly in 2008 and 2010. Annual incidence rates were calculated and reported for 2003 to 2009 using complete surveillance-year data. Geographic information system (GIS) software was used to create a series of maps of the data reported during the study period. Cluster and outlier analysis and hot spot analysis were conducted using both case counts by census tract and disease rates by census tract. IPD age- and race-adjusted IRs for Harris County, Texas, and their 95% confidence intervals (CIs) were 1.40 (95% CI 1.0, 1.8), 1.71 (95% CI 1.24, 2.17), 3.13 (95% CI 2.48, 3.78), 3.08 (95% CI 2.43, 3.74), 5.61 (95% CI 4.79, 6.43), 8.11 (95% CI 7.11, 9.1), and 7.65 (95% CI 6.69, 8.61) for the years 2003 to 2009, respectively (rates were age- and race-adjusted to each year's midyear US population estimates). A Poisson regression model demonstrated a statistically significant increasing trend of about 32 percent per year in the IPD rates over the course of the study period. IPD age- and race-adjusted case-fatality rates (CFRs) for Harris County, Texas, were also calculated and reported. A Poisson regression model demonstrated a statistically significant increasing trend of about 26 percent per year in the IPD case-fatality rates from 2003 through 2009. A logistic regression model associated the risk of dying from IPD with alcohol abuse (OR 4.69, 95% CI 2.57, 8.56) and with meningitis (OR 2.42, 95% CI 1.46, 4.03). The prevalence of non-vaccine serotypes (NVT) among IPD cases with serotyped isolates was 98.2 percent. In 2008, the year with the sample most geographically representative of all areas of Harris County, Texas, the prevalence was 96 percent. Given these findings, it is reasonable to conclude that 'replacement disease' is occurring in Harris County, Texas, meaning that the majority of IPD is caused by serotypes not included in the PCV7 vaccine. In conclusion, IPD rates increased during the study period in Harris County, Texas.
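A trend of this kind is typically read off the year coefficient of a Poisson model with a population offset: exp(coef) ≈ 1.32 corresponds to the reported 32-percent annual increase. The sketch below uses made-up counts and denominators purely for illustration.

```python
# Hypothetical sketch: annual trend in IPD rates from a Poisson model.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

trend = pd.DataFrame({
    "year": np.arange(2003, 2010),
    "cases": [50, 62, 115, 116, 215, 317, 304],  # made-up counts
    "population": [3.6e6] * 7,                   # made-up denominators
})

fit = smf.glm(
    "cases ~ year",
    data=trend,
    family=sm.families.Poisson(),
    offset=np.log(trend["population"]),
).fit()

pct = 100 * (np.exp(fit.params["year"]) - 1)
print(f"estimated change in IPD rate: {pct:+.1f}% per year")
```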