867 results for Acute Diarrheal Disease
Abstract:
Globally, the main contributors to morbidity and mortality are chronic diseases, including cardiovascular disease and diabetes. Chronic diseases are costly and partially avoidable, with around sixty percent of deaths and nearly fifty percent of the global disease burden attributable to these conditions. By 2020, chronic illnesses will likely be the leading cause of disability worldwide. Existing national and international health care systems that focus on acute, episodic health conditions cannot address the worldwide transition to chronic illness; nor are they appropriate for the ongoing care and management of those already afflicted with chronic diseases. International and Australian strategic planning documents articulate similar elements for managing chronic disease, including the need to align sectoral policies for health, form partnerships and engage communities in decision-making. The Australian National Chronic Disease Strategy focuses on four core areas for managing chronic disease: prevention across the continuum, early detection and treatment, integrated and coordinated care, and self-management. Such a comprehensive approach incorporates the entire population continuum, from the ‘healthy’, to those with risk factors, through to people suffering from chronic conditions and their sequelae. This chapter examines a comprehensive approach to the prevention, management and care of the population with non-communicable, chronic diseases and communicable diseases. It analyses models of care in the context of need, service delivery options and the potential for prevention and early intervention in chronic and communicable diseases. Managing chronic diseases requires integrated approaches that incorporate interventions targeted at both individuals and populations, and emphasise the shared risk factors of different conditions. Communicable diseases are a common and significant contributor to ill health throughout the world. In many countries, this impact has been minimised by the combined efforts of preventative health measures and improved treatment of infectious diseases. However, in underdeveloped nations, communicable diseases continue to contribute significantly to the burden of disease. The aim of this chapter is to outline the impact that chronic and communicable diseases have on the health of the community, the public health strategies used to reduce the burden of those diseases, and the old and emerging risks to public health from infectious diseases.
Abstract:
Objective: With growing recognition of the role of inflammation in the development of chronic and acute disease, fish oil is increasingly used as a therapeutic agent, but the nature of the intervention may pose barriers to adherence in clinical populations. Our objective was to investigate the feasibility of using a fish oil supplement in hemodialysis patients. Design: This was a nonrandomized intervention study. Setting: Eligible patients were recruited at the Hemodialysis Unit of Wesley Hospital, Brisbane, Queensland, Australia. Patients: The sample included 28 maintenance hemodialysis patients out of 43 eligible patients in the unit. Exclusion criteria were regular use of a fish oil supplement at baseline, receiving hemodialysis for less than 3 months, or being unable to give informed consent. Intervention: Eicosapentaenoic acid (EPA) was administered at 2000 mg/day (4 capsules) for 12 weeks. Adherence was measured at baseline and weekly throughout the study according to changes in plasma EPA, and was further measured subjectively by self-report. Results: Twenty patients (74%) adhered to the prescription based on changes in plasma EPA, and an additional two patients self-reported good adherence. There was a positive relationship between fish oil intake and change in plasma EPA. Most patients did not report problems with taking the fish oil. Using the baseline data, it was not possible to characterize adherent patients. Conclusions: Despite potential barriers, including the need to take a large number of prescribed medications already, 74% of hemodialysis patients adhered to the intervention. This study demonstrated the feasibility of using fish oil in a clinical population.
Acute exercise improves postprandial cardiovascular risk factors in overweight and obese individuals
Abstract:
Objectives The effects of 30 min of exercise on postprandial lipaemia in the overweight and obese are unknown, as previous studies have only investigated bouts of at least 60 min in lean, healthy individuals. The aim of this study was to investigate whether a single 30-min bout of resistance, aerobic or combined exercise at moderate intensity would decrease postprandial lipaemia, glucose and insulin levels, and increase resting energy expenditure and fat oxidation, following a high-fat meal consumed 14 h after the exercise bout in overweight and obese individuals, compared with no exercise. We also compared the effects of the different exercise modalities. Methods This study was a randomized cross-over design that examined the postprandial effects of 30 min of different types of exercise in the evening prior to a breakfast meal in overweight and obese men and women. Participants completed four conditions, each one week apart, in randomized order: no exercise, aerobic exercise, resistance exercise, or a combination of aerobic and resistance exercise. Results An acute bout of combination training did not have any significant effect on postprandial measurements compared with no exercise. However, aerobic exercise significantly reduced postprandial triglyceride levels by 8% compared with no exercise (p = 0.02), and resistance exercise decreased postprandial insulin levels by 30% compared with aerobic exercise (p = 0.01). Conclusion These results indicate that a single 30-min bout of moderate-intensity aerobic or resistance exercise improves risk factors associated with cardiovascular disease in overweight and obese individuals.
Abstract:
Physical inactivity is a leading factor associated with cardiovascular disease and a major contributor to the global burden of disease in developed countries. Subjective mood states associated with acute exercise are likely to influence future exercise adherence and warrant further investigation. The present study examined the effects of a single bout of vigorous exercise on mood and anxiety in individuals with substantially different exercise participation histories. Mood and anxiety were assessed one day before an exercise test (baseline), 5 minutes before (pre-test), and again 10 and 25 minutes post-exercise. Participants were 31 university students (16 males, 15 females; mean age = 20 years); 16 reported a history of regular exercise and the remaining 15 reported that they did not exercise regularly. Each participant completed an incremental exercise test on a Monark cycle ergometer to volitional exhaustion. Regular exercisers reported significant post-exercise improvements in mood and reductions in state anxiety. By contrast, non-regular exercisers reported an initial decline in post-exercise mood and increased anxiety, followed by an improvement in mood and a reduction in anxiety back to pre-exercise levels. Our findings suggest that previous exercise participation mediates affective responses to acute bouts of vigorous exercise. We suggest that to maximise positive mood changes following exercise, practitioners should carefully consider the individual’s exercise participation history before prescribing new regimes.
Abstract:
While much of the genetic variation in RNA viruses arises because of the error-prone nature of their RNA-dependent RNA polymerases, much larger changes may occur as a result of recombination. An extreme example of genetic change is found in defective interfering (DI) viral particles, where large sections of the genome of a parental virus have been deleted and the residual sub-genomic fragment is replicated by complementation by co-infecting functional viruses. While most reports of DI particles have referred to studies in vitro, there is some evidence for the presence of DI particles in chronic viral infections in vivo. In this study, short fragments of dengue virus (DENV) RNA containing only key regulatory elements at the 3' and 5' ends of the genome were recovered from the sera of patients infected with any of the four DENV serotypes. Identical RNA fragments were detected in the supernatant from cultures of Aedes mosquito cells that were infected by the addition of sera from dengue patients, suggesting that the sub-genomic RNA might be transmitted between human and mosquito hosts in defective interfering (DI) viral particles. In vitro transcribed sub-genomic RNA corresponding to that detected in vivo could be packaged in virus-like particles in the presence of wild-type virus and transmitted for at least three passages in cell culture. DENV preparations enriched for these putative DI particles reduced the yield of wild-type dengue virus following co-infections of C6/36 cells. This is the first report of DI particles in an acute arboviral infection in nature. The internal genomic deletions described here are the most extensive defects observed in DENV and may be part of a much broader disease-attenuating process that is mediated by defective viruses.
Abstract:
Older adults, especially those acutely ill, are vulnerable to developing malnutrition due to a range of risk factors. The high prevalence and extensive consequences of malnutrition in hospitalised older adults have been widely reported. However, there are few well-designed longitudinal studies that report the independent relationship between malnutrition and clinical outcomes after adjustment for a wide range of covariates. Acutely ill older adults are exceptionally prone to nutritional decline during hospitalisation, but few reports have studied this change and its impact on clinical outcomes. In the rapidly ageing Singapore population, all this evidence is lacking, and the characteristics associated with the risk of malnutrition are also not well documented. Despite the evidence on malnutrition prevalence, it is often under-recognised and under-treated. It is therefore crucial that validated nutrition screening and assessment tools are used for early identification of malnutrition. Although many nutrition screening and assessment tools are available, there is no universally accepted method for defining malnutrition risk and nutritional status. Most existing tools have been validated amongst Caucasians using various approaches, but they are rarely reported in Asian elderly populations and none has been validated in Singapore. Due to the multiethnic, cultural and language differences among Singapore older adults, the results from non-Asian validation studies may not be applicable. It is therefore important to identify validated, population- and setting-specific nutrition screening and assessment methods to accurately detect and diagnose malnutrition in Singapore. The aims of this study are therefore to: i) characterise hospitalised elderly in a Singapore acute hospital; ii) describe the extent and impact of admission malnutrition; iii) identify and evaluate suitable methods for nutritional screening and assessment; and iv) examine changes in nutritional status during admission and their impact on clinical outcomes. A total of 281 participants, with a mean (±SD) age of 81.3 (±7.6) years, were recruited from three geriatric wards in Tan Tock Seng Hospital over a period of eight months. They were predominantly Chinese (83%) and community-dwellers (97%). They were screened within 72 hours of admission by a single dietetic technician using four nutrition screening tools [Tan Tock Seng Hospital Nutrition Screening Tool (TTSH NST), Nutritional Risk Screening 2002 (NRS 2002), Mini Nutritional Assessment-Short Form (MNA-SF), and Short Nutritional Assessment Questionnaire (SNAQ©)] administered in no particular order. The total scores were not computed during the screening process, so that the dietetic technician was blinded to the results of all the tools. Nutritional status was assessed by a single dietitian, who was blinded to the screening results, using four malnutrition assessment methods [Subjective Global Assessment (SGA), Mini Nutritional Assessment (MNA), body mass index (BMI), and corrected arm muscle area (CAMA)]. The SGA rating was completed prior to computation of the total MNA score to minimise bias. Participants were reassessed for weight, arm anthropometry (mid-arm circumference, triceps skinfold thickness), and SGA rating at discharge from the ward.
The nutritional assessment tools and indices were validated against clinical outcomes (length of stay (LOS) >11 days, discharge to higher level care, 3-month readmission, 6-month mortality, and 6-month Modified Barthel Index) using multivariate logistic regression. The covariates included age, gender, race, dementia (defined using DSM-IV criteria), depression (defined using a single question, “Do you often feel sad or depressed?”), severity of illness (defined using a modified version of the Severity of Illness Index), comorbidities (defined using the Charlson Comorbidity Index), number of prescribed drugs, and admission functional status (measured using the Modified Barthel Index; MBI). The nutrition screening tools were validated against the SGA, which was found to be the most appropriate nutritional assessment tool in this study (refer to Section 5.6). Prevalence of malnutrition on admission was 35% (defined by SGA), and it was significantly associated with characteristics such as swallowing impairment (malnourished vs well-nourished: 20% vs 5%), poor appetite (77% vs 24%), dementia (44% vs 28%), depression (34% vs 22%), and poor functional status (MBI 48.3±29.8 vs 65.1±25.4). The SGA had the highest completion rate (100%) and was predictive of the highest number of clinical outcomes: LOS >11 days (OR 2.11, 95% CI [1.17-3.83]), 3-month readmission (OR 1.90, 95% CI [1.05-3.42]) and 6-month mortality (OR 3.04, 95% CI [1.28-7.18]), independent of a comprehensive range of covariates including functional status, disease severity and cognitive function. The SGA is therefore the most appropriate nutritional assessment tool for defining malnutrition. The TTSH NST was identified as the most suitable nutritional screening tool, with the best diagnostic performance against the SGA (AUC 0.865, sensitivity 84%, specificity 79%). Overall, 44% of participants experienced weight loss during hospitalisation, and 27% had weight loss >1% per week over a median LOS of 9 days (range 2-50). Well-nourished (45%) and malnourished (43%) participants were equally prone to decline in nutritional status (defined by weight loss >1% per week). Those with reduced nutritional status were more likely to be discharged to higher level care (adjusted OR 2.46, 95% CI [1.27-4.70]). This study is the first to characterise malnourished hospitalised older adults in Singapore. It is also one of the very few studies to (a) evaluate the association of admission malnutrition with clinical outcomes in a multivariate model; (b) determine the change in nutritional status during admission; and (c) evaluate the validity of nutritional screening and assessment tools amongst hospitalised older adults in an Asian population. The results clearly highlight that admission malnutrition and deterioration in nutritional status are prevalent and are associated with adverse clinical outcomes in hospitalised older adults. With older adults being vulnerable to the risks and consequences of malnutrition, it is important that they are systematically screened so that timely and appropriate intervention can be provided. The findings highlighted in this thesis provide an evidence base for, and confirm the validity of, the nutrition screening and assessment tools currently used among hospitalised older adults in Singapore.
As older adults may have developed malnutrition prior to hospital admission, or may experience clinically significant weight loss of >1% per week during hospitalisation, screening of the elderly should be initiated in the community, and continuous nutritional monitoring should extend beyond hospitalisation.
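As an illustration of the kind of validation described in the abstract above, the following is a minimal Python sketch of how a screening tool's diagnostic performance (AUC, sensitivity, specificity) against an SGA-defined reference might be computed. The data, cut-off and variable names are simulated assumptions, not the study's actual analysis.

```python
# Hypothetical sketch: evaluating a screening tool's scores against a binary
# reference standard (e.g. SGA-defined malnutrition). Data, cut-off and
# variable names are illustrative only.
import numpy as np
from sklearn.metrics import roc_auc_score, confusion_matrix

rng = np.random.default_rng(0)
sga_malnourished = rng.integers(0, 2, size=281)               # 1 = malnourished by SGA
screen_score = sga_malnourished * 2 + rng.normal(0, 1, 281)   # higher score = higher risk

auc = roc_auc_score(sga_malnourished, screen_score)

cutoff = 1.0                                                  # illustrative threshold
screen_positive = (screen_score >= cutoff).astype(int)
tn, fp, fn, tp = confusion_matrix(sga_malnourished, screen_positive).ravel()
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
print(f"AUC={auc:.3f}, sensitivity={sensitivity:.2f}, specificity={specificity:.2f}")
```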
Abstract:
Background: Access to cardiac services is essential for appropriate implementation of evidence-based therapies to improve outcomes. The Cardiac Accessibility and Remoteness Index for Australia (Cardiac ARIA) aimed to derive an objective, geographic measure reflecting access to cardiac services. Methods: An expert panel defined an evidence-based clinical pathway. Using Geographic Information Systems (GIS), a numeric/alpha index was developed at two points along the continuum of care. The acute category (numeric) measured the time from the emergency call to arrival at an appropriate medical facility via road ambulance. The aftercare category (alpha) measured access to four basic services (family doctor, pharmacy, cardiac rehabilitation, and pathology services) when a patient returned to their community. Results: The numeric index ranged from 1 (access to a principal referral center with cardiac catheterization service ≤ 1 hour) to 8 (no ambulance service, > 3 hours to a medical facility, air transport required). The alphabetic index ranged from A (all 4 services available within 1 hour drive-time) to E (no services available within 1 hour). 13.9 million Australians (71%) resided within Cardiac ARIA 1A locations (hospital with cardiac catheterization laboratory and all aftercare within 1 hour). Those outside Cardiac ARIA 1A locations were over-represented by people aged over 65 years (32%) and Indigenous people (60%). Conclusion: The Cardiac ARIA index demonstrated substantial inequity in access to cardiac services in Australia. This methodology can be used to inform cardiology health service planning and could be applied to other common disease states in other regions of the world.
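The two-part index described above lends itself to a simple classification sketch. The toy Python functions below are illustrative assumptions only; the real Cardiac ARIA categories were derived from GIS modelling of the full clinical pathway, and the thresholds and band rules here are simplified stand-ins.

```python
# Toy sketch of the numeric (acute) and alphabetic (aftercare) categories.
# Rules, bands and service names are simplified assumptions, not the published index.

def acute_category(minutes_to_cath_lab, minutes_to_any_facility):
    """Return a simplified 1-8 acute-access category (toy rules only)."""
    if minutes_to_cath_lab is not None and minutes_to_cath_lab <= 60:
        return 1                              # cath-lab hospital within 1 hour
    if minutes_to_any_facility is None or minutes_to_any_facility > 180:
        return 8                              # >3 h by road, or air transport needed
    # grade the remaining cases 2-7 in assumed 30-minute travel-time bands
    return min(7, 2 + max(0, int((minutes_to_any_facility - 60) // 30)))


def aftercare_category(drive_minutes):
    """Grade A-E by how many of the four aftercare services are within a 1-hour drive."""
    services = ["family_doctor", "pharmacy", "cardiac_rehab", "pathology"]
    within_hour = sum(drive_minutes.get(s, float("inf")) <= 60 for s in services)
    return "ABCDE"[4 - within_hour]           # all 4 services -> A, none -> E


print(acute_category(45, 45),
      aftercare_category({"family_doctor": 20, "pharmacy": 25,
                          "cardiac_rehab": 70, "pathology": 30}))
```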
Abstract:
Background Coronary heart disease (CHD) and depression are leading causes of disease burden globally, and the two often co-exist. Depression is common after Myocardial Infarction (MI), and it has been estimated that 15-35% of patients experience depressive symptoms. Co-morbid depression can impair health-related quality of life (HRQOL), decrease medication adherence and appropriate utilisation of health services, lead to increased morbidity and suicide risk, and is associated with poorer CHD risk factor profiles and reduced survival. We aim to determine the feasibility of conducting a randomised, multi-centre trial designed to compare a tele-health program (MoodCare) for depression and CHD secondary prevention with Usual Care (UC). Methods Over 1600 patients are being screened for depression following an index admission for Acute Coronary Syndrome (ACS) at six metropolitan hospitals in the Australian states of Victoria and Queensland. Consenting participants are then contacted at two weeks post-discharge for baseline assessment. One hundred eligible participants are to be randomised to an intervention or a usual medical care control group (50 per group). The intervention consists of up to 10 × 30-40 minute structured telephone sessions, delivered by registered psychologists, commencing within two weeks of baseline screening. The intervention focuses on depression management, lifestyle factors (physical activity, healthy eating, smoking cessation, alcohol intake), medication adherence and managing co-morbidities. Data collection occurs at baseline (Time 1), 6 months (post-intervention) (Time 2), 12 months (Time 3) and 24 months follow-up for longer-term effects (Time 4). We are comparing depression (Cardiac Depression Scale [CDS]) and HRQOL (Short Form-12 [SF-12]) scores between treatment and UC groups, assessing the feasibility of the program through patient acceptability, and exploring long-term maintenance effects. A cost-effectiveness analysis of the costs and outcomes for patients in the intervention and control groups is being conducted from the perspective of health care costs to the government. Discussion This manuscript presents the protocol for a randomised, multi-centre trial to evaluate the feasibility of a tele-based depression management and CHD secondary prevention program for ACS patients. The results of this trial will provide valuable new information about the potential psychological and wellbeing benefits, cost-effectiveness and acceptability of an innovative tele-based depression management and secondary prevention program for CHD patients experiencing depression.
Abstract:
Background Seasonal changes in cardiovascular disease (CVD) risk factors may be due to exposure to seasonal environmental variables like temperature and acute infections or seasonal behavioural patterns in physical activity and diet. Investigating the seasonal pattern of risk factors should help determine the causes of the seasonal pattern in CVD. Few studies have investigated the seasonal variation in risk factors using repeated measurements from the same individual, which is important as individual and population seasonal patterns may differ. Methods The authors investigated the seasonal pattern in systolic and diastolic blood pressure, heart rate, body weight, total cholesterol, triglycerides, high-density lipoprotein cholesterol, C reactive protein and fibrinogen. Measurements came from 38 037 participants in the population-based cohort, the Tromsø Study, examined up to eight times from 1979 to 2008. Individual and population seasonal patterns were estimated using a cosinor in a mixed model. Results All risk factors had a highly statistically significant seasonal pattern with a peak time in winter, except for triglycerides (peak in autumn), C reactive protein and fibrinogen (peak in spring). The sizes of the seasonal variations were clinically modest. Conclusions Although the authors found highly statistically significant individual seasonal patterns for all risk factors, the sizes of the changes were modest, probably because this subarctic population is well adapted to a harsh climate. Better protection against seasonal risk factors like cold weather could help reduce the winter excess in CVD observed in milder climates.
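The "cosinor in a mixed model" mentioned in the abstract above amounts to regressing each risk factor on sine and cosine terms of calendar time, with a random effect per participant to handle repeated measurements. The following is a minimal sketch with simulated data and illustrative variable names, not the Tromsø Study analysis.

```python
# Minimal cosinor-in-a-mixed-model sketch: seasonality is modelled with
# sine/cosine terms for day of year inside a mixed model with a random
# intercept per participant. All data are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_people, n_visits = 200, 4
day = rng.integers(1, 366, size=n_people * n_visits)
person = np.repeat(np.arange(n_people), n_visits)
# simulate a winter peak (~day 15) plus between-person variation
sbp = (130 + 3 * np.cos(2 * np.pi * (day - 15) / 365.25)
       + rng.normal(0, 5, n_people)[person] + rng.normal(0, 8, day.size))

df = pd.DataFrame({
    "sbp": sbp,
    "person": person,
    "cos_t": np.cos(2 * np.pi * day / 365.25),
    "sin_t": np.sin(2 * np.pi * day / 365.25),
})

model = smf.mixedlm("sbp ~ cos_t + sin_t", df, groups=df["person"]).fit()
amplitude = np.hypot(model.params["cos_t"], model.params["sin_t"])
peak_day = (np.arctan2(model.params["sin_t"], model.params["cos_t"])
            % (2 * np.pi)) / (2 * np.pi) * 365.25
print(model.summary())
print(f"seasonal amplitude ~ {amplitude:.2f} mmHg, peak around day {peak_day:.0f}")
```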
Abstract:
Rationale: The Australasian Nutrition Care Day Survey (ANCDS) evaluated whether malnutrition and decreased food intake are independent risk factors for negative outcomes in hospitalised patients. Methods: A multicentre (56 hospitals) cross-sectional survey was conducted in two phases. Phase 1 evaluated nutritional status (defined by Subjective Global Assessment) and 24-hour food intake, recorded as 0, 25, 50, 75, or 100% intake. Phase 2 data, which included length of stay (LOS), readmissions and mortality, were collected 90 days post-Phase 1. Logistic regression was used to control for confounders: age, gender, disease type and severity (using Patient Clinical Complexity Level scores). Results: Of 3122 participants (53% males, mean age: 65±18 years), 32% were malnourished and 23% consumed ≤25% of the offered food. Median LOS for malnourished (MN) patients was higher than for well-nourished (WN) patients (15 vs. 10 days, p<0.0001). Median LOS for patients consuming ≤25% of the food was higher than for those consuming ≥50% (13 vs. 11 days, p<0.0001). MN patients had higher readmission rates (36% vs. 30%, p = 0.001). The odds of 90-day in-hospital mortality were 1.8 times greater for MN patients (CI: 1.03-3.22, p = 0.04) and 2.7 times greater for those consuming ≤25% of the offered food (CI: 1.54-4.68, p = 0.001). Conclusion: The ANCDS demonstrates that malnutrition and/or decreased food intake are associated with longer LOS and readmissions. The survey also establishes that malnutrition and decreased food intake are independent risk factors for in-hospital mortality in acute care patients, and highlights the need for appropriate nutritional screening and support during hospitalisation. Disclosure of Interest: None Declared.
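As a hedged illustration of the adjusted analysis reported above, the sketch below fits a logistic regression of in-hospital mortality on malnutrition while controlling for confounders, and reads off adjusted odds ratios from the exponentiated coefficients. All data, coefficients and variable names are simulated, not the ANCDS dataset.

```python
# Sketch of an adjusted odds-ratio analysis with logistic regression.
# 'complexity' is a stand-in for a clinical complexity score; data are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 3000
df = pd.DataFrame({
    "malnourished": rng.integers(0, 2, n),
    "age": rng.normal(65, 18, n),
    "male": rng.integers(0, 2, n),
    "complexity": rng.integers(1, 5, n),
})
# simulate mortality with a true effect of malnutrition plus confounders
logit_p = -4 + 0.6 * df.malnourished + 0.03 * (df.age - 65) + 0.2 * df.complexity
df["died"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

fit = smf.logit("died ~ malnourished + age + male + complexity", df).fit(disp=False)
odds_ratios = np.exp(fit.params)          # adjusted ORs
conf_int = np.exp(fit.conf_int())         # 95% CIs on the OR scale
print(pd.concat([odds_ratios.rename("OR"), conf_int], axis=1))
```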
Abstract:
Review question/objective What is the effect of using the teach-back method for health education to improve adherence to treatment regimen and self-management in chronic disease? Inclusion criteria Types of participants This review will consider all studies that include adult patients (aged 18 years and over) in any healthcare setting, either as inpatients (eg acute care, medical and surgical wards) or those who attend primary health care, family medical practice, general medical practice, clinics, outpatient departments, rehabilitation or community settings. Participants need to have been diagnosed as having one or more chronic diseases including heart failure, diabetes, cardiovascular disease, cancer, respiratory disease, asthma, chronic obstructive pulmonary disease, chronic kidney disease, arthritis, epilepsy or a mental health condition. Studies that include seriously ill patients, and/or those who have impairments in verbal communication and cognitive function will be excluded. Types of intervention This review will consider studies that investigate the use of the teach-back method alone or in combination with other supporting education, either in routine or research intervention education programs; regardless of how long the programs were and whether or not a follow-up was conducted. The intervention could be delivered by any healthcare professional. The comparator will be any health education for chronic disease that does not include the teach-back method. Types of outcomes Primary outcomes of interest are disease-specific knowledge, adherence, and self-management knowledge, behavior and skills measured using patient report, nursing observation or validated measurement scales. Secondary outcomes include knowledge retention, self-efficacy, hospital readmission, hospitalization, and quality of life, also measured using patient report, nursing observation, hospital records or validated measurement scales.
Abstract:
Background Acute respiratory illness, a leading cause of cough in children, accounts for a substantial proportion of childhood morbidity and mortality worldwide. In some children acute cough progresses to chronic cough (>4 weeks' duration), impacting on morbidity and decreasing quality of life. Despite the importance of chronic cough as a cause of substantial childhood morbidity and associated economic, family and social costs, data on the prevalence, predictors, aetiology and natural history of the symptom are scarce. This study aims to comprehensively describe the epidemiology, aetiology and outcomes of cough during and after acute respiratory illness in children presenting to a tertiary paediatric emergency department. Methods/design A prospective cohort study of children aged <15 years attending the Royal Children's Hospital Emergency Department, Brisbane, for a respiratory illness that includes parent-reported cough (wet or dry) as a symptom. The primary objective is to determine the prevalence and predictors of chronic cough (≥4 weeks' duration) post presentation with acute respiratory illness. Demographic, epidemiological, risk factor, microbiological and clinical data are completed at enrolment. Subjects complete daily cough diaries and weekly follow-up contacts for 28 (±3) days to ascertain cough persistence. Children who continue to cough for 28 days post-enrolment are referred to a paediatric respiratory physician for review. The primary analysis will be the proportion of children with persistent cough at day 28 (±3). Multivariate analyses will be performed to evaluate variables independently associated with chronic cough at day 28 (±3). Discussion Our protocol will be the first to comprehensively describe the natural history, epidemiology, aetiology and outcomes of cough during and after acute respiratory illness in children. The results will contribute to studies leading to the development of evidence-based clinical guidelines to improve the early detection and management of chronic cough in children during and after acute respiratory illness.
Abstract:
Background Recurrent protracted bacterial bronchitis (PBB), chronic suppurative lung disease (CSLD) and bronchiectasis are characterised by a chronic wet cough and are important causes of childhood respiratory morbidity globally. Haemophilus influenzae and Streptococcus pneumoniae are the most commonly associated pathogens. As respiratory exacerbations impair quality of life and may be associated with disease progression, we will determine if the novel 10-valent pneumococcal-Haemophilus influenzae protein D conjugate vaccine (PHiD-CV) reduces exacerbations in these children. Methods A multi-centre, parallel group, double-blind, randomised controlled trial in tertiary paediatric centres from three Australian cities is planned. Two hundred and six children aged 18 months to 14 years with recurrent PBB, CSLD or bronchiectasis will be randomised to receive either two doses of PHiD-CV or control meningococcal (ACYW135) conjugate vaccine 2 months apart, and followed for 12 months after the second vaccine dose. Randomisation will be stratified by site, age (<6 years and ≥6 years) and aetiology (recurrent PBB or CSLD/bronchiectasis). Clinical histories, respiratory status (including spirometry in children aged ≥6 years), nasopharyngeal and saliva swabs, and serum will be collected at baseline and at 2, 3, 8 and 14 months post-enrolment. Local and systemic reactions will be recorded on daily diaries for 7 and 30 days, respectively, following each vaccine dose, and serious adverse events monitored throughout the trial. Fortnightly parental contact will help record respiratory exacerbations. The primary outcome is the incidence of respiratory exacerbations in the 12 months following the second vaccine dose. Secondary outcomes include: nasopharyngeal carriage of H. influenzae and S. pneumoniae vaccine and vaccine-related serotypes; systemic and mucosal immune responses to H. influenzae proteins and S. pneumoniae vaccine and vaccine-related serotypes; impact upon lung function in children aged ≥6 years; and vaccine safety. Discussion As H. influenzae is the most common bacterial pathogen associated with these chronic respiratory diseases in children, a novel pneumococcal conjugate vaccine that also impacts upon H. influenzae and helps prevent respiratory exacerbations would assist clinical management, with potential short- and long-term health benefits. Our study will be the first to assess vaccine efficacy targeting H. influenzae in children with recurrent PBB, CSLD and bronchiectasis.
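A randomisation scheme stratified by site, age group and aetiology, as described in this protocol, is commonly implemented with permuted blocks within each stratum. The following is a hypothetical sketch only; the block size, stratum labels and 1:1 allocation are assumptions, not the trial's actual procedure.

```python
# Hypothetical stratified permuted-block randomisation sketch.
# Strata: (site, age group, aetiology); blocks of 4 with 1:1 allocation.
import random
from collections import defaultdict

BLOCK = ["PHiD-CV", "PHiD-CV", "MenACYW", "MenACYW"]   # 1:1 within blocks of 4
rng = random.Random(42)
_blocks = defaultdict(list)                            # remaining slots per stratum


def allocate(site, age_group, aetiology):
    """Draw the next allocation from the permuted block for this stratum."""
    stratum = (site, age_group, aetiology)
    if not _blocks[stratum]:                           # refill with a freshly shuffled block
        _blocks[stratum] = rng.sample(BLOCK, k=len(BLOCK))
    return _blocks[stratum].pop()


for _ in range(6):
    print(allocate("Brisbane", "<6 years", "recurrent PBB"))
```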
Abstract:
Background Australian Indigenous children are the only population worldwide to receive the 7-valent pneumococcal conjugate vaccine (7vPCV) at 2, 4, and 6 months of age and the 23-valent pneumococcal polysaccharide vaccine (23vPPV) at 18 months of age. We evaluated this program's effectiveness in reducing the risk of hospitalization for acute lower respiratory tract infection (ALRI) in Northern Territory (NT) Indigenous children aged 5-23 months. Methods We conducted a retrospective cohort study involving all NT Indigenous children born from 1 April 2000 through 31 October 2004. Person-time at risk after 0, 1, 2, and 3 doses of 7vPCV and after 0 and 1 dose of 23vPPV, together with the number of ALRI episodes following each dose, was used to calculate dose-specific rates of ALRI for children 5-23 months of age. Rates were compared using Cox proportional hazards models, with the number of doses of each vaccine serving as time-dependent covariates. Results There were 5482 children and 8315 child-years at risk, with 2174 episodes of ALRI requiring hospitalization (overall incidence, 261 episodes per 1000 child-years at risk). Elevated risk of ALRI requiring hospitalization was observed after each dose of the 7vPCV vaccine, compared with that for children who received no doses, and an even greater elevation in risk was observed after each dose of the 23vPPV (adjusted hazard ratio [HR] vs no dose, 1.39; 95% confidence interval [CI], 1.12-1.71; P = .002). Risk was highest among children vaccinated with the 23vPPV who had received <3 doses of the 7vPCV (adjusted HR, 1.81; 95% CI, 1.32-2.48). Conclusions Our results suggest an increased risk of ALRI requiring hospitalization after pneumococcal vaccination, particularly after receipt of the 23vPPV booster. The use of the 23vPPV booster should be reevaluated.
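Treating the cumulative number of doses as a time-dependent covariate, as in the analysis above, is typically expressed with a long-format dataset in which each child contributes one row per interval between dose changes. The sketch below uses the lifelines CoxTimeVaryingFitter on a handful of simulated records; the data and column names are illustrative, not the study's.

```python
# Sketch of a Cox model with cumulative dose count as a time-dependent covariate.
# One row per child per interval; all records are simulated.
import pandas as pd
from lifelines import CoxTimeVaryingFitter

records = pd.DataFrame([
    # id, start, stop (months), doses in force, event (1 = ALRI hospitalisation ends follow-up)
    (1, 0.0, 2.0, 0, 0),
    (1, 2.0, 4.0, 1, 0),
    (1, 4.0, 9.5, 2, 1),
    (2, 0.0, 2.1, 0, 0),
    (2, 2.1, 6.0, 1, 0),
    (2, 6.0, 18.0, 2, 0),
    (3, 0.0, 5.0, 0, 1),
    (4, 0.0, 3.0, 0, 0),
    (4, 3.0, 18.0, 1, 0),
    (5, 0.0, 2.5, 0, 0),
    (5, 2.5, 7.0, 1, 1),
], columns=["id", "start", "stop", "doses", "event"])

ctv = CoxTimeVaryingFitter()
ctv.fit(records, id_col="id", event_col="event",
        start_col="start", stop_col="stop")
ctv.print_summary()   # hazard ratio = exp(coef) per additional dose
```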