892 results for 1,01


Relevance: 60.00%

Abstract:

Background: Specialised disease management programmes for chronic heart failure (CHF) improve survival, quality of life and reduce healthcare utilisation. The overall efficacy of structured telephone support or telemonitoring as an individual component of a CHF disease management strategy remains inconclusive. Objectives: To review randomised controlled trials (RCTs) of structured telephone support or telemonitoring compared to standard practice for patients with CHF in order to quantify the effects of these interventions over and above usual care for these patients. Search strategy: Databases (the Cochrane Central Register of Controlled Trials (CENTRAL), Database of Abstracts of Reviews of Effects (DARE) and Health Technology Assessment Database (HTA) on The Cochrane Library, MEDLINE, EMBASE, CINAHL, AMED and Science Citation Index Expanded and Conference Citation Index on ISI Web of Knowledge) and various search engines were searched from 2006 to November 2008 to update a previously published non-Cochrane review. Bibliographies of relevant studies and systematic reviews and abstract conference proceedings were handsearched. No language limits were applied. Selection criteria: Only peer-reviewed, published RCTs comparing structured telephone support or telemonitoring to usual care of CHF patients were included. Unpublished abstract data were included in sensitivity analyses. The intervention or usual care could not include a home visit or more than the usual (four to six weeks) clinic follow-up. Data collection and analysis: Data were presented as risk ratio (RR) with 95% confidence intervals (CI). Primary outcomes included all-cause mortality and all-cause and CHF-related hospitalisations, which were meta-analysed using fixed effects models. Other outcomes included length of stay, quality of life, acceptability and cost, and these were described and tabulated. Main results: Twenty-five studies and five published abstracts were included. Of the 25 full peer-reviewed studies meta-analysed, 16 evaluated structured telephone support (5613 participants), 11 evaluated telemonitoring (2710 participants), and two tested both interventions (and are counted in both groups). Telemonitoring reduced all-cause mortality (RR 0.66, 95% CI 0.54 to 0.81, P < 0.0001), with structured telephone support demonstrating a non-significant positive effect (RR 0.88, 95% CI 0.76 to 1.01, P = 0.08). Both structured telephone support (RR 0.77, 95% CI 0.68 to 0.87, P < 0.0001) and telemonitoring (RR 0.79, 95% CI 0.67 to 0.94, P = 0.008) reduced CHF-related hospitalisations. For both interventions, several studies reported improved quality of life, reduced healthcare costs and acceptability to patients. Improvements in prescribing, patient knowledge and self-care, and New York Heart Association (NYHA) functional class were observed. Authors' conclusions: Structured telephone support and telemonitoring are effective in reducing the risk of all-cause mortality and CHF-related hospitalisations in patients with CHF; they improve quality of life and evidence-based prescribing, and reduce costs.
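
The primary outcomes above were pooled with fixed-effect (inverse-variance) meta-analysis. As a rough illustration of that calculation only, here is a minimal Python sketch; the study-level risk ratios below are invented, not the review's data.

    import numpy as np

    # Hypothetical per-study risk ratios with 95% CIs (illustrative only).
    rr = np.array([0.70, 0.62, 0.85])
    ci_low = np.array([0.50, 0.40, 0.60])
    ci_high = np.array([0.98, 0.96, 1.20])

    log_rr = np.log(rr)
    se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)  # SE recovered from CI width
    w = 1 / se**2                                         # inverse-variance weights

    pooled = np.sum(w * log_rr) / np.sum(w)
    pooled_se = np.sqrt(1 / np.sum(w))
    print(f"Pooled RR {np.exp(pooled):.2f} "
          f"(95% CI {np.exp(pooled - 1.96 * pooled_se):.2f} "
          f"to {np.exp(pooled + 1.96 * pooled_se):.2f})")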

Relevance: 60.00%

Abstract:

Background: Rapid weight gain in infancy is an important predictor of obesity in later childhood. Our aim was to determine which modifiable variables are associated with rapid weight gain in early life. Methods: Subjects were healthy infants enrolled in NOURISH, a randomised controlled trial evaluating an intervention to promote positive early feeding practices. This analysis used the birth and baseline data for NOURISH. Birthweight was collected from hospital records, and infants were also weighed at the baseline assessment, when they were aged 4-7 months and before randomisation. Infant feeding practices and demographic variables were collected from the mother using a self-administered questionnaire. Rapid weight gain was defined as an increase in weight-for-age Z-score (using WHO standards) above 0.67 SD from birth to baseline assessment, which is interpreted clinically as crossing centile lines on a growth chart. Variables associated with rapid weight gain were evaluated using a multivariable logistic regression model. Results: Complete data were available for 612 infants (88% of the total sample recruited) with a mean (SD) age of 4.3 (1.0) months at baseline assessment. After adjusting for maternal age, smoking in pregnancy, BMI and education, and for infant birthweight, age, gender and introduction of solid foods, the only two modifiable factors significantly associated with rapid weight gain were formula feeding [OR=1.72 (95%CI 1.01-2.94), P=0.047] and feeding on schedule [OR=2.29 (95%CI 1.14-4.61), P=0.020]. Male gender and lower birthweight were non-modifiable factors associated with rapid weight gain. Conclusions: This analysis supports the contention that there is an association between formula feeding, feeding to schedule and weight gain in the first months of life. Mechanisms may include the actual content of formula milk (e.g. higher protein intake) or differences in feeding styles, such as feeding to schedule, which increase the risk of overfeeding. Trial Registration: Australian Clinical Trials Registry ACTRN12608000056392
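
The rapid-weight-gain criterion above is directly computable. A minimal sketch, assuming a simplified Z-score in place of the full WHO LMS tables and invented reference values:

    def waz(weight_kg, median_kg, sd_kg):
        """Simplified weight-for-age Z-score (real analyses use WHO LMS tables)."""
        return (weight_kg - median_kg) / sd_kg

    def rapid_weight_gain(birth_w, base_w, birth_ref, base_ref):
        """True if WAZ rose by more than 0.67 SD from birth to baseline."""
        dz = waz(base_w, *base_ref) - waz(birth_w, *birth_ref)
        return dz > 0.67

    # Hypothetical (median, SD) reference values at birth and at 4 months.
    print(rapid_weight_gain(3.2, 7.6, birth_ref=(3.3, 0.45), base_ref=(7.0, 0.75)))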

Relevance: 60.00%

Abstract:

Objective: To investigate the mental and general health of infertile women who had not sought medical advice for their recognized infertility and were therefore not represented in clinical populations. Design: Longitudinal cohort study. Setting: Population based. Patient(s): Participants in the Australian Longitudinal Study on Women's Health aged 28-33 years in 2006 who had ever tried to conceive or had been pregnant (n = 5,936). Intervention(s): None. Main Outcome Measure(s): Infertility, not seeking medical advice. Result(s): Compared with fertile women (n = 4,905), infertile women (n = 1,031) had higher odds of self-reported depression (odds ratio [OR] 1.20, 95% confidence interval [CI] 1.01-1.43), endometriosis (5.43, 4.01-7.36), polycystic ovary syndrome (9.52, 7.30-12.41), irregular periods (1.99, 1.68-2.36), type II diabetes (4.70, 1.79-12.37), or gestational diabetes (1.66, 1.12-2.46). Compared with infertile women who sought medical advice (n = 728), those who had not sought medical advice (n = 303) had higher odds of self-reported depression (1.67, 1.18-2.37), other mental health problems (3.14, 1.14-8.64), urinary tract infections (1.67, 1.12-2.49), heavy periods (1.63, 1.16-2.29), or a cancer diagnosis (11.33, 2.57-49.89). Infertile women who had or had not sought medical advice had similar odds of reporting an anxiety disorder or anxiety-related symptoms. Conclusion(s): Women with self-reported depression were unlikely to have sought medical advice for infertility. Depression and depressive symptoms may be barriers to seeking medical advice for infertility.
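
Each odds ratio above compares the odds of an outcome between two groups; as a reminder of the underlying arithmetic, a sketch with an invented 2x2 table (not the study's counts):

    import math

    a, b = 220, 811   # infertile women: outcome present / absent (hypothetical)
    c, d = 890, 4015  # fertile women:   outcome present / absent (hypothetical)

    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # standard error of log(OR)
    lo, hi = (math.exp(math.log(or_) + z * se) for z in (-1.96, 1.96))
    print(f"OR {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")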

Relevance: 60.00%

Abstract:

This study examined the effects of pre-cooling duration on performance and neuromuscular function during self-paced intermittent-sprint shuttle running in the heat. Eight male team-sport athletes completed two 35-min bouts of intermittent-sprint shuttle running separated by a 15-min recovery on three separate occasions (33°C, 34% relative humidity). Mixed-method pre-cooling was applied for 20 min (COOL20) or 10 min (COOL10), or no cooling was applied (CONT), and cooling was reapplied for 5 min mid-exercise. Performance was assessed via sprint times, percentage decline and shuttle-running distance covered. Maximal voluntary contractions (MVC), voluntary activation (VA) and evoked twitch properties were recorded pre- and post-intervention and mid- and post-exercise. Core temperature (Tc), skin temperature, heart rate, capillary blood metabolites, sweat losses, perceptual exertion and thermal stress were monitored throughout. Venous blood samples drawn pre- and post-exercise were analyzed for markers of muscle damage and inflammation. Shuttle-running distance covered increased by 5.2 ± 3.3% following COOL20 (P < 0.05), with no differences observed between COOL10 and CONT (P > 0.05). COOL20 aided the maintenance of mid- and post-exercise MVC (P < 0.05; d > 0.80), despite no conditional differences in VA (P > 0.05). Pre-exercise Tc was reduced by 0.15 ± 0.13°C with COOL20 (P < 0.05; d > 1.10), and remained lower throughout both COOL20 and COOL10 compared to CONT (P < 0.05; d > 0.80). Pre-cooling reduced sweat losses by 0.4 ± 0.3 kg (P < 0.02; d > 1.15), with losses in COOL20 0.2 ± 0.4 kg less than in COOL10 (P = 0.19; d = 1.01). Increased pre-cooling duration lowered physiological demands during exercise heat stress and facilitated the maintenance of self-paced intermittent-sprint performance in the heat. Importantly, the dose-response interaction of pre-cooling duration and sustained neuromuscular responses may explain the improved exercise performance in hot conditions.
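
The d values quoted above are Cohen's d effect sizes, i.e. standardized mean differences between conditions. A minimal sketch with invented torque values:

    import numpy as np

    def cohens_d(x, y):
        """Cohen's d using the pooled standard deviation."""
        nx, ny = len(x), len(y)
        pooled_var = ((nx - 1) * np.var(x, ddof=1) +
                      (ny - 1) * np.var(y, ddof=1)) / (nx + ny - 2)
        return (np.mean(x) - np.mean(y)) / np.sqrt(pooled_var)

    cool20_mvc = np.array([262, 271, 255, 266, 259, 268, 274, 261])  # hypothetical N·m
    cont_mvc = np.array([241, 235, 252, 238, 247, 233, 244, 240])
    print(f"d = {cohens_d(cool20_mvc, cont_mvc):.2f}")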

Relevance: 60.00%

Abstract:

Remote monitoring for heart failure has been evaluated in numerous systematic reviews. The aim of this meta-review was to appraise their quality and synthesise results. We electronically searched online databases, performed a forward citation search and hand-searched bibliographies. Systematic reviews of remote monitoring interventions that were used for surveillance of heart failure patients were included; seventeen met these criteria. Seven (41%) systematic reviews pooled results for meta-analysis. Eight (47%) considered all non-invasive remote monitoring strategies. Five (29%) focused on telemonitoring. Four (24%) included both non-invasive and invasive technologies. According to AMSTAR criteria, ten (58%) systematic reviews were of poor methodological quality. In high quality reviews, the relative risk of mortality in patients who received remote monitoring ranged from 0.53 (95% CI=0.29-0.96) to 0.88 (95% CI=0.76-1.01). High quality reviews also reported that remote monitoring reduced the relative risk of all-cause hospitalizations (from 0.52; 95% CI=0.28-0.96 to 0.96; 95% CI=0.90-1.03) and heart failure-related hospitalizations (from 0.72; 95% CI=0.64-0.81 to 0.79; 95% CI=0.67-0.94) and, as a consequence, healthcare costs. As the high quality reviews reported that remote monitoring reduced hospitalizations, mortality and healthcare costs, research efforts should now be directed towards optimising these interventions in preparation for more widespread implementation.

Relevance: 60.00%

Abstract:

It has been reported that poor nutritional status, in the form of weight loss and resulting body mass index (BMI) changes, is an issue in people with Parkinson's disease (PWP). The symptoms resulting from Parkinson's disease (PD) and the side effects of PD medication have been implicated in the aetiology of nutritional decline. However, the evidence on which these claims are based is, on one hand, contradictory, and on the other, restricted primarily to otherwise healthy PWP. Despite the claims that PWP suffer from poor nutritional status, evidence is lacking to inform nutrition-related care for the management of malnutrition in PWP. The aims of this thesis were to better quantify the extent of poor nutritional status in PWP, determine the important factors differentiating the well-nourished from the malnourished and evaluate the effectiveness of an individualised nutrition intervention on nutritional status. Phase DBS: Nutritional status in people with Parkinson's disease scheduled for deep-brain stimulation surgery The pre-operative rate of malnutrition in a convenience sample of people with Parkinson's disease (PWP) scheduled for deep-brain stimulation (DBS) surgery was determined. Poorly controlled PD symptoms may result in a higher risk of malnutrition in this sub-group of PWP. Fifteen patients (11 male, median age 68.0 (42.0 – 78.0) years, median PD duration 6.75 (0.5 – 24.0) years) participated and data were collected during hospital admission for the DBS surgery. The scored PG-SGA was used to assess nutritional status, anthropometric measures (weight, height, mid-arm circumference, waist circumference, body mass index (BMI)) were taken, and body composition was measured using bioelectrical impedance spectroscopy (BIS). Six (40%) of the participants were malnourished (SGA-B) while 53% reported significant weight loss following diagnosis. BMI was significantly different between SGA-A and SGA-B (25.6 vs 23.0 kg/m2, p<.05). There were no differences in any other variables, including PG-SGA score and the presence of non-motor symptoms. The conclusion was that malnutrition in this group is higher than that in other studies reporting malnutrition in PWP, and it is under-recognised. As poorer surgical outcomes are associated with poorer pre-operative nutritional status in other surgeries, it might be beneficial to identify patients at nutritional risk prior to surgery so that appropriate nutrition interventions can be implemented. Phase I: Nutritional status in community-dwelling adults with Parkinson's disease The rate of malnutrition in community-dwelling adults (>18 years) with Parkinson's disease was determined. One hundred twenty-five PWP (74 male, median age 70.0 (35.0 – 92.0) years, median PD duration 6.0 (0.0 – 31.0) years) participated. The scored PG-SGA was used to assess nutritional status, and anthropometric measures (weight, height, mid-arm circumference (MAC), calf circumference, waist circumference, body mass index (BMI)) were taken. Nineteen (15%) of the participants were malnourished (SGA-B). All anthropometric indices were significantly different between SGA-A and SGA-B (BMI 25.9 vs 20.0 kg/m2; MAC 29.1 vs 25.5 cm; waist circumference 95.5 vs 82.5 cm; calf circumference 36.5 vs 32.5 cm; all p<.05). The PG-SGA score was also significantly higher in the malnourished (2 vs 8, p<.05). The nutrition impact symptoms which differentiated between the well-nourished and the malnourished were lack of appetite, constipation, diarrhoea, problems swallowing and feeling full quickly.
This study concluded that malnutrition in community-dwelling PWP is higher than that documented in community-dwelling elderly (2 – 11%), yet is likely to be under-recognised. Nutrition impact symptoms play a role in reduced intake. Appropriate screening and referral processes should be established for early detection of those at risk. Phase I: Nutrition assessment tools in people with Parkinson's disease There are a number of validated and reliable nutrition screening and assessment tools available for use. None of these tools have been evaluated in PWP. In the sample described above, the use of the World Health Organisation (WHO) cut-off (≤18.5kg/m2), age-specific BMI cut-offs (≤18.5kg/m2 for under 65 years, ≤23.5kg/m2 for 65 years and older) and the revised Mini-Nutritional Assessment short form (MNA-SF) were evaluated as nutrition screening tools. The PG-SGA (including the SGA classification) and the MNA full form were evaluated as nutrition assessment tools using the SGA classification as the gold standard. For screening, the MNA-SF performed the best with sensitivity (Sn) of 94.7% and specificity (Sp) of 78.3%. For assessment, the PG-SGA with a cut-off score of 4 (Sn 100%, Sp 69.8%) performed better than the MNA (Sn 84.2%, Sp 87.7%). As the MNA has been recommended more for use as a nutrition screening tool, the MNA-SF might be more appropriate and take less time to complete. The PG-SGA might be useful to inform and monitor nutrition interventions. Phase I: Predictors of poor nutritional status in people with Parkinson's disease A number of assessments were conducted as part of the Phase I research, including those for the severity of PD motor symptoms, cognitive function, depression, anxiety, non-motor symptoms, constipation, freezing of gait and the ability to carry out activities of daily living. A higher score in all of these assessments indicates greater impairment. In addition, information about medical conditions, medications, age, age at PD diagnosis and living situation was collected. These were compared between those classified as SGA-A and as SGA-B. Regression analysis was used to identify which factors were predictive of malnutrition (SGA-B). Differences between the groups included disease severity (more severe disease in 4% of SGA-A vs 21% of SGA-B, p<.05), activities of daily living score (13 SGA-A vs 18 SGA-B, p<.05), depressive symptom score (8 SGA-A vs 14 SGA-B, p<.05) and gastrointestinal symptoms (4 SGA-A vs 6 SGA-B, p<.05). Significant predictors of malnutrition according to SGA were age at diagnosis (OR 1.09, 95% CI 1.01 – 1.18), amount of dopaminergic medication per kg body weight (mg/kg) (OR 1.17, 95% CI 1.04 – 1.31), more severe motor symptoms (OR 1.10, 95% CI 1.02 – 1.19), less anxiety (OR 0.90, 95% CI 0.82 – 0.98) and more depressive symptoms (OR 1.23, 95% CI 1.07 – 1.41). Significant predictors of a higher PG-SGA score included living alone (β=0.14, 95% CI 0.01 – 0.26), more depressive symptoms (β=0.02, 95% CI 0.01 – 0.02) and more severe motor symptoms (β=0.01, 95% CI 0.01 – 0.02). More severe disease is associated with malnutrition, and this may be compounded by lack of social support. Phase II: Nutrition intervention Nineteen of the people identified in Phase I as requiring nutrition support were included in Phase II, in which a nutrition intervention was conducted.
Nine participants were in the standard care group (SC), which received an information sheet only, and the other 10 participants were in the intervention group (INT), which received individualised nutrition information and weekly follow-up. INT gained 2.2% of starting body weight over the 12-week intervention period, resulting in significant increases in weight, BMI, mid-arm circumference and waist circumference. The SC group gained 1% of starting weight over the 12 weeks, which did not result in any significant changes in anthropometric indices. Energy and protein intake increased in both groups (energy: 18.3 kJ/kg in INT vs 3.8 kJ/kg in SC; protein: 0.3 g/kg vs 0.15 g/kg), although the increase in protein intake was only significant in the SC group, and the changes in intake did not differ between the groups. There were no significant changes in any motor or non-motor symptoms or in "off" times or dyskinesias in either group. Aspects of quality of life also improved over the 12 weeks, especially emotional well-being. This thesis makes a significant contribution to the evidence base for the presence of malnutrition in Parkinson's disease as well as for the identification of those who would potentially benefit from nutrition screening and assessment. The nutrition intervention demonstrated that a traditional high protein, high energy approach to the management of malnutrition resulted in improved nutritional status and anthropometric indices with no effect on the presence of Parkinson's disease symptoms and a positive effect on quality of life.
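
The screening-tool comparison above rests on sensitivity (Sn) and specificity (Sp) against the SGA classification as the gold standard. A minimal sketch of that calculation with invented scores:

    def sn_sp(screen_positive, gold_positive):
        """Sensitivity and specificity of a screen against a gold standard."""
        pairs = list(zip(screen_positive, gold_positive))
        tp = sum(1 for s, g in pairs if s and g)
        fn = sum(1 for s, g in pairs if not s and g)
        tn = sum(1 for s, g in pairs if not s and not g)
        fp = sum(1 for s, g in pairs if s and not g)
        return tp / (tp + fn), tn / (tn + fp)

    # Hypothetical data: PG-SGA score >= 4 flags risk; SGA-B (1) is the gold standard.
    pg_sga_scores = [2, 8, 5, 1, 4, 9, 3, 0, 6, 2]
    sga_b = [0, 1, 1, 0, 0, 1, 0, 0, 1, 0]
    sn, sp = sn_sp([s >= 4 for s in pg_sga_scores], [bool(g) for g in sga_b])
    print(f"Sn {sn:.1%}, Sp {sp:.1%}")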

Relevance: 60.00%

Abstract:

Objective To evaluate the effectiveness of the 7-valent pneumococcal conjugate vaccine (PCV7) in preventing pneumonia, diagnosed radiologically according to World Health Organization (WHO) criteria, among indigenous infants in the Northern Territory of Australia. Methods We conducted a historical cohort study of consecutive indigenous birth cohorts between 1 April 1998 and 28 February 2005. Children were followed up to 18 months of age. The PCV7 programme commenced on 1 June 2001. All chest X-rays taken within 3 days of any hospitalization were assessed. The primary endpoint was a first episode of WHO-defined pneumonia requiring hospitalization. Cox proportional hazards models were used to compare disease incidence. Findings There were 526 pneumonia events among 10 600 children - an incidence of 3.3 per 1000 child-months; 183 episodes (34.8%) occurred before 5 months of age and 247 (47.0%) by 7 months. Of the children studied, 27% had received 3 doses of vaccine by 7 months of age. Hazard ratios for endpoint pneumonia were 1.01 for 1 versus 0 doses; 1.03 for 2 versus 0 doses; and 0.84 for 3 versus 0 doses. Conclusion There was limited evidence that PCV7 reduced the incidence of radiologically confirmed pneumonia among Northern Territory indigenous infants, although there was a non-significant trend towards an effect after receipt of the third dose. These findings might be explained by lack of timely vaccination and/or occurrence of disease at an early age. Additionally, the relative contribution of vaccine-type pneumococcus to severe pneumonia in a setting where multiple other pathogens are prevalent may differ with respect to other settings where vaccine efficacy has been clearly established.
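
Incidence comparisons of this kind are commonly fitted with a Cox proportional-hazards model. A sketch of the general approach using the lifelines library; the data frame is hypothetical, and unlike the study, dose is treated here as a fixed rather than time-varying covariate:

    import pandas as pd
    from lifelines import CoxPHFitter

    df = pd.DataFrame({
        "months_at_risk": [18, 6.5, 18, 11, 18, 4, 18, 15],  # follow-up to 18 months
        "pneumonia": [0, 1, 0, 1, 0, 1, 0, 1],               # first WHO-defined episode
        "doses": [3, 0, 3, 1, 2, 0, 3, 2],                   # PCV7 doses received
    })

    cph = CoxPHFitter(penalizer=0.1)  # small ridge penalty for this tiny toy dataset
    cph.fit(df, duration_col="months_at_risk", event_col="pneumonia")
    cph.print_summary()               # hazard ratio per additional dose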

Relevance: 60.00%

Abstract:

Purpose: Inaccurate accommodation during nearwork and subsequent accommodative hysteresis may influence myopia development. Myopia is highly prevalent in Singapore; an untested theory is that Chinese children are prone to these accommodation characteristics. We measured the accuracy of accommodation responses during, and nearwork-induced transient myopia (NITM) after, periods spent reading Chinese and English texts. Methods: Refractions of 40 emmetropic and 43 myopic children were measured with a free-space autorefractor for four 10-minute reading tasks: Chinese (SimSun, 10.5 points) and English (Times New Roman, 12 points) texts at 25 cm and 33 cm. Accuracy was obtained by subtracting the accommodation response from the accommodation demand. Nearwork-induced transient myopia was obtained by subtracting the pretask distance refraction from the posttask refraction, and regression was determined as the time for the posttask refraction to return to pretask levels. Results: There were significant, but small, effects of text type (Chinese, 0.97 ± 0.32 diopters [D] vs. English, 1.00 ± 0.37 D; F1,1230 = 7.24, p = 0.007) and reading distance (33 cm, 1.01 ± 0.30 D vs. 25 cm, 0.97 ± 0.39 D; F1,1230 = 7.74, p = 0.005) on accommodation accuracy across all participants. Accuracy was similar for emmetropic and myopic children across all reading tasks. Neither text type nor reading distance had significant effects on NITM or its regression. Myopes had greater NITM (by 0.07 D) (F1,81 = 5.05, p = 0.03) that took longer (by 50 s) (F1,81 = 31.08, p < 0.01) to dissipate. Conclusions: Reading Chinese text caused smaller accommodative lags than reading English text, but the small differences were not clinically significant. Myopic children had significantly greater NITM and longer regression than emmetropic children for both texts. Whether differences in NITM are a cause or consequence of myopia cannot be answered from this study.
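
These measures reduce to simple dioptric arithmetic: demand is the reciprocal of the viewing distance in metres, lag (accuracy) is demand minus response, and NITM is the post-task minus pre-task distance refraction. A sketch with invented values:

    def demand_d(distance_cm):
        """Accommodative demand in dioptres for a viewing distance in cm."""
        return 100.0 / distance_cm

    def lag_d(distance_cm, response_d):
        """Lag of accommodation: demand minus response."""
        return demand_d(distance_cm) - response_d

    def nitm_d(pretask_refraction, posttask_refraction):
        """NITM: post-task minus pre-task refraction (myopic shift is negative)."""
        return posttask_refraction - pretask_refraction

    print(f"Demand at 33 cm: {demand_d(33):.2f} D")                   # ~3.03 D
    print(f"Lag for a 2.00 D response at 33 cm: {lag_d(33, 2.00):.2f} D")
    print(f"NITM: {nitm_d(-0.25, -0.38):.2f} D")                      # -0.13 D shift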

Relevance: 60.00%

Abstract:

Background Trials of new technologies to remotely monitor for signs and symptoms of worsening heart failure are continually emerging. The extent to which technological differences impact the effectiveness of non-invasive remote monitoring for heart failure management is unknown. Objective To examine the effect of the specific technology used for non-invasive remote monitoring of people with heart failure on all-cause mortality and heart failure-related hospitalisations. Methods A sub-analysis of a large systematic review and meta-analysis was conducted. Studies were stratified according to the specific type of technology used and separate meta-analyses were performed. Four different types of non-invasive remote monitoring technologies were identified, including structured telephone calls, videophone, interactive voice response devices and telemonitoring. Results Only structured telephone calls and telemonitoring were effective in reducing the risk of all-cause mortality (RR 0.87; 95% CI=0.75-1.01; p=0.06 and RR 0.62; 95% CI=0.50-0.77; p<0.0001, respectively) and heart failure-related hospitalisations (RR 0.77; 95% CI=0.68-0.87; p<0.001 and RR 0.75; 95% CI=0.63-0.91; p=0.003, respectively). More research data are required for videophone and interactive voice response technologies. Conclusions This sub-analysis identified that only two of the four specific technologies used for non-invasive remote monitoring in heart failure improved outcomes. When results of studies that involved these disparate technologies were combined in previous meta-analyses, significant improvements in outcomes were identified. As such, this study has highlighted implications for future meta-analyses of randomised controlled trials focused on evaluating the effectiveness of remote monitoring in heart failure.

Relevance: 60.00%

Abstract:

BACKGROUND Malaria remains a public health problem in the remote and poor areas of Yunnan Province, China. Yunnan faces an increasing risk of imported malaria infections from its Mekong River neighbouring countries. This study aimed to identify the high-risk area of malaria transmission in Yunnan Province, and to estimate the effects of climatic variability on the transmission of Plasmodium vivax and Plasmodium falciparum in the identified area. METHODS We identified spatial clusters of malaria cases using spatial cluster analysis at a county level in Yunnan Province, 2005-2010, and estimated the weekly effects of climatic factors on P. vivax and P. falciparum based on a dataset of daily malaria cases and climatic variables. A distributed lag nonlinear model was used to estimate the impact of temperature, relative humidity and rainfall up to 10-week lags on both types of malaria parasite after adjusting for seasonal and long-term effects. RESULTS The primary cluster area was identified along the China-Myanmar border in western Yunnan. A 1°C increase in minimum temperature was associated with an increased relative risk (RR) at lags of 4 to 9 weeks, with the highest effect at a lag of 7 weeks for P. vivax (RR = 1.03; 95% CI, 1.01, 1.05) and 6 weeks for P. falciparum (RR = 1.07; 95% CI, 1.04, 1.11); a 10-mm increment in rainfall was associated with increased RRs at lags of 2-4 weeks and 9-10 weeks, with the highest effect at 3 weeks for both P. vivax (RR = 1.03; 95% CI, 1.01, 1.04) and P. falciparum (RR = 1.04; 95% CI, 1.01, 1.06); and the RRs for a 10% rise in relative humidity were significant from lags of 3 to 8 weeks, with the highest RR of 1.24 (95% CI, 1.10, 1.41) for P. vivax at a 5-week lag. CONCLUSIONS Our findings suggest that the China-Myanmar border is a high-risk area for malaria transmission. Climatic factors appeared to be among the major determinants of malaria transmission in this area. The estimated lag effects for the association between temperature and malaria are consistent with the life cycles of both the mosquito vector and the malaria parasite. These findings will be useful for malaria surveillance-response systems in the Mekong River region.
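
The study fitted a distributed lag nonlinear model (DLNM, typically built from cross-basis splines as in R's dlnm package). The simplified sketch below only illustrates the distributed-lag idea, using unconstrained weekly lags in a negative binomial GLM on simulated data:

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    weeks = 200
    df = pd.DataFrame({
        "cases": rng.poisson(5, weeks),  # simulated weekly case counts
        "tmin": 15 + 8 * np.sin(np.arange(weeks) * 2 * np.pi / 52)
                + rng.normal(0, 1, weeks),
    })
    for lag in range(1, 11):             # minimum-temperature lags of 1-10 weeks
        df[f"tmin_lag{lag}"] = df["tmin"].shift(lag)
    df = df.dropna()

    X = sm.add_constant(df.filter(like="tmin_lag"))
    fit = sm.GLM(df["cases"], X, family=sm.families.NegativeBinomial()).fit()
    print(np.exp(fit.params))            # RR per 1°C increase at each lag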

Relevance: 60.00%

Abstract:

BACKGROUND Dengue fever (DF) outbreaks often arise from imported DF cases in Cairns, Australia. Few studies have incorporated imported DF cases in the estimation of the relationship between weather variability and the incidence of autochthonous DF. The study aimed to examine the impact of weather variability on autochthonous DF infection after accounting for imported DF cases, and then to explore the possibility of developing an empirical forecast system. METHODOLOGY/PRINCIPAL FINDINGS Data on weather variables, notified DF cases (including those acquired locally and overseas), and population size in Cairns were supplied by the Australian Bureau of Meteorology, Queensland Health, and the Australian Bureau of Statistics. A time-series negative-binomial hurdle model was used to assess the effects of imported DF cases and weather variability on autochthonous DF incidence. Our results showed that monthly autochthonous DF incidence was significantly associated with monthly imported DF cases (Relative Risk (RR): 1.52; 95% confidence interval (CI): 1.01-2.28), monthly minimum temperature (°C) (RR: 2.28; 95% CI: 1.77-2.93), monthly relative humidity (%) (RR: 1.21; 95% CI: 1.06-1.37), monthly rainfall (mm) (RR: 0.50; 95% CI: 0.31-0.81) and the monthly standard deviation of daily relative humidity (%) (RR: 1.27; 95% CI: 1.08-1.50). In the zero hurdle component, the occurrence of monthly autochthonous DF cases was significantly associated with monthly minimum temperature (Odds Ratio (OR): 1.64; 95% CI: 1.01-2.67). CONCLUSIONS/SIGNIFICANCE Our research suggests that monthly autochthonous DF incidence was strongly positively associated with monthly imported DF cases, local minimum temperature and inter-month relative humidity variability in Cairns. Moreover, DF outbreaks in Cairns were driven by imported DF cases only under favourable seasonal and weather conditions during the study period.
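
The hurdle structure separates whether any autochthonous cases occur in a month from how many occur when they do. A rough two-part sketch on simulated data; a faithful implementation would use a truncated negative binomial for the count part:

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    n = 120  # months
    df = pd.DataFrame({
        "local_cases": rng.poisson(0.8, n),
        "imported": rng.poisson(1.0, n),
        "tmin": 18 + 4 * np.sin(np.arange(n) * 2 * np.pi / 12),
    })
    df["any_local"] = (df["local_cases"] > 0).astype(int)

    # Part 1: zero hurdle - does any autochthonous case occur this month?
    hurdle = smf.logit("any_local ~ imported + tmin", data=df).fit(disp=0)
    # Part 2: count model on months with at least one case (NB approximation).
    positives = df[df["local_cases"] > 0]
    counts = smf.glm("local_cases ~ imported + tmin", data=positives,
                     family=sm.families.NegativeBinomial()).fit()
    print(np.exp(hurdle.params))  # odds ratios, occurrence part
    print(np.exp(counts.params))  # rate ratios, positive-count part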

Relevance: 60.00%

Abstract:

BACKGROUND: Physical activity, particularly walking, is greatly beneficial to health; yet a sizeable proportion of older adults are insufficiently active. The importance of built environment attributes for walking is known, but few studies of older adults have examined neighbourhood destinations and none have investigated access to specific, objectively-measured commercial destinations and walking. METHODS: We undertook a secondary analysis of data from the Western Australian state government's health surveillance survey for those aged 65-84 years and living in the Perth metropolitan region from 2003-2009 (n = 2,918). Individual-level road network service areas were generated at 400 m and 800 m distances, and the presence or absence of six commercial destination types within the neighbourhood service areas identified (food retail, general retail, medical care services, financial services, general services, and social infrastructure). Adjusted logistic regression models examined access to and mix of commercial destination types within neighbourhoods for associations with self-reported walking behaviour. RESULTS: On average, the sample was aged 72.9 years (SD = 5.4), and was predominantly female (55.9%) and married (62.0%). Overall, 66.2% reported some weekly walking and 30.8% reported sufficient walking (≥150 min/week). Older adults with access to general services within 400 m (OR = 1.33, 95% CI = 1.07-1.66) and 800 m (OR = 1.20, 95% CI = 1.02-1.42), and social infrastructure within 800 m (OR = 1.19, 95% CI = 1.01-1.40) were more likely to engage in some weekly walking. Access to medical care services within 400 m (OR = 0.77, 95% CI = 0.63-0.93) and 800 m (OR = 0.83, 95% CI = 0.70-0.99) reduced the odds of sufficient walking. Access to food retail, general retail, financial services, and the mix of commercial destination types within the neighbourhood were all unrelated to walking. CONCLUSIONS: The types of neighbourhood commercial destinations that encourage older adults to walk appear to differ slightly from those reported for adult samples. Destinations that facilitate more social interaction, for example eating at a restaurant or church involvement, or provide opportunities for some incidental social contact, for example visiting the pharmacy or hairdresser, were the strongest predictors for walking among seniors in this study. This underscores the importance of planning neighbourhoods with proximate access to social infrastructure, and highlights the need to create residential environments that support activity across the life course.
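
Destination access of this kind can be approximated with buffers and a spatial join. The sketch below substitutes plain circular buffers for the study's road-network service areas, with invented coordinates (EPSG:32750 is a metre-based CRS covering Perth):

    import geopandas as gpd
    from shapely.geometry import Point

    homes = gpd.GeoDataFrame({"id": [1, 2]},
                             geometry=[Point(0, 0), Point(2000, 0)],
                             crs="EPSG:32750")
    shops = gpd.GeoDataFrame({"type": ["general_services"]},
                             geometry=[Point(300, 100)],
                             crs="EPSG:32750")

    for dist in (400, 800):
        buffered = homes.copy()
        buffered["geometry"] = homes.geometry.buffer(dist)  # metres in this CRS
        hits = gpd.sjoin(buffered, shops, predicate="intersects", how="left")
        homes[f"access_{dist}m"] = (hits.groupby("id")["index_right"]
                                    .count().values > 0)

    print(homes[["id", "access_400m", "access_800m"]])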

Relevance: 60.00%

Abstract:

Background: Physical activity after breast cancer diagnosis is associated with improved survival. This study examines levels of and changes in physical activity following breast cancer diagnosis, overall and by race. Methods: The Carolina Breast Cancer Study, Phase III, assessed pre- and post-diagnosis physical activity levels in a cohort of 1,735 women, aged 20-74, diagnosed with invasive breast cancer between 2008 and 2011 in 44 counties of North Carolina. Logistic regression and analysis of variance were used to examine whether demographic, behavioral and clinical characteristics were associated with activity levels. Results: Only 35% of breast cancer survivors met current physical activity guidelines post-diagnosis. A decrease in activity following diagnosis was reported by 59% of patients, with the average study participant reducing their activity by 230 minutes (95% CI: 190, 270). Following adjustment for potential confounders, when compared to white women, African-American women were less likely to meet national physical activity guidelines post-diagnosis (odds ratio: 1.38, 95% CI: 1.01, 1.88), reported less weekly post-diagnosis physical activity (182 vs. 215 minutes; p = 0.13), and reported larger average reductions in weekly activity from pre- to post-diagnosis (262 vs. 230 minutes; p = 0.13). Conclusion: Despite compelling evidence demonstrating the benefits of physical activity after breast cancer, it is clear that more work needs to be done to promote physical activity in breast cancer patients, especially among African-American women.

Relevance: 60.00%

Abstract:

Importance Myopia is a significant public health problem, making it important to determine whether a bifocal spectacle treatment involving near prism slows myopia progression in children. Objective To determine whether bifocal and prismatic bifocal spectacles control myopia in children with high rates of myopia progression and to assess whether the treatment effect is dependent on the lag of accommodation and/or near phoria status. Design, Setting, and Participants This 3-year randomized clinical trial was conducted in a private practice. A total of 135 (73 female and 62 male) Chinese-Canadian children (aged 8-13 years; mean [SE] age, 10.29 [0.15] years; mean [SE] myopia, −3.08 [0.10] D) with myopia progression of at least 0.50 D in the preceding year were randomly assigned to 1 of 3 treatments. A total of 128 (94.8%) completed the trial. Interventions Single-vision lenses (control, n = 41), +1.50-D executive bifocals (n = 48), and +1.50-D executive bifocals with 3-Δ base-in prism in the near segment of each lens (n = 46). Main Outcomes and Measures Myopia progression (primary) measured using an automated refractor following cycloplegia and increase in axial length (secondary) measured using ultrasonography at intervals of 6 months for 36 months. Results Myopia progression over 3 years was an average (SE) of −2.06 (0.13) D for the single-vision lens group, −1.25 (0.10) D for the bifocal group, and −1.01 (0.13) D for the prismatic bifocal group. Axial length increased an average (SE) of 0.82 (0.05) mm, 0.57 (0.07) mm, and 0.54 (0.06) mm, respectively. The treatment effect of bifocals (0.81 D) and prismatic bifocals (1.05 D) was significant (P < .001). Both bifocal groups had less axial elongation (0.25 mm and 0.28 mm, respectively) than the single-vision lens group (P < .001). For children with high lags of accommodation (≥1.01 D), the treatment effect of both bifocals and prismatic bifocals was similar (1.1 D) (P < .001). For children with low lags (<1.01 D), the treatment effect of prismatic bifocals (0.99 D) was greater than that of bifocals (0.50 D) (P = .03). The treatment effect of both bifocals and prismatic bifocals was independent of the near phoria status. Conclusions and Relevance Bifocal spectacles can slow myopia progression in children with an annual progression rate of at least 0.50 D after 3 years. These results suggest that prismatic bifocals are more effective for myopic children with low lags of accommodation.
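
The quoted treatment effects are the differences in mean 3-year progression between each bifocal group and the single-vision control; the short check below reproduces them from the abstract's own figures:

    progression_d = {  # mean 3-year myopia progression (D), from the abstract
        "single_vision": -2.06,
        "bifocal": -1.25,
        "prismatic_bifocal": -1.01,
    }
    for group in ("bifocal", "prismatic_bifocal"):
        effect = progression_d[group] - progression_d["single_vision"]
        print(f"{group}: treatment effect {effect:.2f} D")  # 0.81 D and 1.05 D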

Relevance: 60.00%

Abstract:

Background The high recurrence rate of chronic venous leg ulcers has a significant impact on an individual's quality of life and on healthcare costs. Objectives This study aimed to identify risk and protective factors for recurrence of venous leg ulcers, using a framework of self and family management of chronic conditions as the theoretical underpinning. Design Secondary analysis of combined data collected from three previous prospective longitudinal studies. Setting The contributing studies' participants were recruited from two metropolitan hospital outpatient wound clinics and three community-based wound clinics. Participants Data were available on a sample of 250 adults with a leg ulcer of primarily venous aetiology, who were followed after ulcer healing for a median follow-up time of 17 months (range: 3 to 36 months). Methods Data from the three studies were combined. The original participant data were collected through medical records and self-reported questionnaires upon healing and every 3 months thereafter. A Cox proportional-hazards regression analysis was undertaken to determine the influential factors on leg ulcer recurrence based on the proposed conceptual framework. Results The median time to recurrence was 42 weeks (95% CI 31.9–52.0), with 22% (54 of 250 participants) experiencing recurrence within three months of healing, 39% (91 of 235 participants) by six months, 57% (111 of 193) by 12 months, 73% (53 of 72) by two years and 78% (41 of 52) by three years. A Cox proportional-hazards regression model revealed that the risk factors for recurrence included a history of deep vein thrombosis (HR 1.7, 95% CI 1.07–2.67, p=0.024), a history of multiple previous leg ulcers (HR 4.4, 95% CI 1.84–10.5, p=0.001), and longer duration (in weeks) of the previous ulcer (HR 1.01, 95% CI 1.003–1.01, p<0.001); the protective factors were elevating the legs for at least 30 minutes per day (HR 0.33, 95% CI 0.19–0.56, p<0.001), higher levels of self-efficacy (HR 0.95, 95% CI 0.92–0.99, p=0.016), and walking around for at least three hours per day (HR 0.66, 95% CI 0.44–0.98, p=0.040). Conclusions Results from this study provide a comprehensive examination of risk and protective factors associated with leg ulcer recurrence based on the chronic disease self and family management framework. These results in turn provide essential steps towards developing and testing interventions to promote optimal prevention strategies for venous leg ulcer recurrence.
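
Median time to recurrence of this kind is estimated from censored follow-up data, typically with a Kaplan-Meier curve. A sketch using the lifelines library on invented data (the risk-factor analysis itself used the Cox model described above):

    from lifelines import KaplanMeierFitter

    weeks_followed = [10, 42, 55, 30, 80, 120, 16, 60, 95, 24]  # hypothetical
    recurred = [1, 1, 0, 1, 1, 0, 1, 1, 0, 1]                   # 0 = censored

    kmf = KaplanMeierFitter()
    kmf.fit(weeks_followed, event_observed=recurred)
    print(f"Median time to recurrence: {kmf.median_survival_time_} weeks")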