864 results for age of entomology in Britain


Relevance:

100.00%

Publisher:

Abstract:

Objective Harassment from motorists is a major, under-researched constraint on cycling. We examined the incidence and correlates of harassment of cyclists. Methods Cyclists in Queensland, Australia were surveyed in 2009 about their experiences of harassment from motor vehicle occupants while cycling. Respondents also indicated the forms of harassment they experienced. Logistic regression modeling was used to examine gender and other correlates of harassment. Results Of 1830 respondents, 76% of men and 72% of women reported harassment in the previous 12 months. The most commonly reported forms of harassment were driving too close (66%), shouting abuse (63%), and making obscene gestures/sexual harassment (45%). Older age, overweight/obesity, less cycling experience (< 2 years) and less frequent cycling (< 3 days/week) were associated with a lower likelihood of harassment, while living in highly advantaged areas (SEIFA deciles 8 or 9), cycling for recreation, and cycling for competition were associated with an increased likelihood of harassment. Gender was not associated with reports of harassment. Conclusions Efforts to decrease harassment should include a closer examination of the circumstances that give rise to it, as well as fostering road environments and driver attitudes and behaviors that recognize cyclists as legitimate road users.
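The logistic regression above reports correlates of harassment on the odds scale. As a minimal sketch of how a fitted logit coefficient maps to an odds ratio and a predicted probability (the coefficients here are invented for illustration, not the study's estimates):

```python
import math

def odds_ratio(beta):
    """Convert a logistic-regression coefficient to an odds ratio."""
    return math.exp(beta)

def predicted_probability(intercept, coefs, x):
    """Probability of the outcome from the linear predictor (logit link)."""
    logit = intercept + sum(b * xi for b, xi in zip(coefs, x))
    return 1.0 / (1.0 + math.exp(-logit))

# Hypothetical example: a coefficient of -0.5 for "cycling < 3 days/week"
# corresponds to an odds ratio of about 0.61 (lower odds of harassment).
print(round(odds_ratio(-0.5), 2))          # 0.61
print(predicted_probability(0.0, [], []))  # 0.5 at the intercept alone
```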


Background We investigated the geographical variation of water supply and sanitation indicators (WS&S) and their contribution to the risk of schistosomiasis and hookworm infection in school-age children in West Africa. The aim was to predict large-scale geographical variation in WS&S, quantify the attributable risk of S. haematobium, S. mansoni and hookworm infections due to WS&S, and identify communities where sustainable transmission control could be targeted across the region. Methods National cross-sectional household-based demographic health surveys were conducted in 24,542 households in Burkina Faso, Ghana and Mali in 2003–2006. We generated spatially explicit predictions of areas without piped water, toilet facilities and finished floors in West Africa, adjusting for household covariates. Using recently published helminth prevalence data, we developed Bayesian geostatistical models of S. haematobium, S. mansoni and hookworm infection in West Africa, including environmental covariates and the mapped outputs for WS&S. Using these models we estimated the effect of WS&S on parasite risk, quantified their attributable fraction of infection, and mapped the risk of infection in West Africa. Findings Our maps show that most areas in West Africa are very poorly served by water supply, except in major urban centers. Geographical coverage is better for toilet availability and improved household flooring. We estimated a smaller attributable risk for water supply in S. mansoni (47%) than in S. haematobium (71%), and 5% of hookworm cases could be averted by improving sanitation. Greater levels of inadequate sanitation increased the risk of schistosomiasis, and greater levels of unsafe water supply increased the risk of hookworm. The contribution of floor type to S. haematobium infection (21%) was comparable to that for S. mansoni (16%), but was significantly higher for hookworm infection (86%). S. haematobium and hookworm maps accounting for WS&S show smaller clusters of maximal prevalence in areas bordering Burkina Faso and Mali. The map of S. mansoni shows that this parasite is much more widespread across the north of the Niger River basin than previously predicted. Interpretation Our maps identify areas where the Millennium Development Goal for water and sanitation is lagging behind. Our results show that WS&S are important contributors to the burden of major helminth infections of children in West Africa. Including information about WS&S as well as the “traditional” environmental risk factors in spatial models of helminth risk yielded a substantial gain both in model fit and in the proportion of spatial variance in helminth risk explained. Mapping the distribution of infection risk adjusted for WS&S allowed the identification of communities in West Africa where integrated preventive chemotherapy and engineering interventions will yield the greatest public health benefits.
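The attributable fractions quoted above (e.g. 47% of S. mansoni cases attributable to unsafe water supply) are instances of the standard population attributable fraction; a sketch using Levin's formula, with made-up exposure prevalence and relative risk rather than the study's fitted values:

```python
def attributable_fraction(exposure_prevalence, relative_risk):
    """Population attributable fraction (Levin's formula): the share of
    cases that would be averted if the exposure were removed."""
    excess = exposure_prevalence * (relative_risk - 1.0)
    return excess / (1.0 + excess)

# Hypothetical inputs: 50% of households exposed, relative risk 3.0
print(attributable_fraction(0.5, 3.0))  # 0.5
```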


PURPOSE: To examine the visual predictors of falls and injurious falls among older adults with glaucoma. METHODS: Prospective falls data were collected for one year, using monthly falls diaries, for 71 community-dwelling adults with primary open-angle glaucoma (mean age 73.9 ± 5.7 years). Baseline assessment of central visual function included high-contrast visual acuity and Pelli-Robson contrast sensitivity. Binocular integrated visual fields were derived from monocular Humphrey Field Analyser plots. Rate ratios (RR) for falls and injurious falls with 95% confidence intervals (CIs) were based on negative binomial regression models. RESULTS: During the one-year follow-up, 31 (44%) participants experienced at least one fall and 22 (31%) experienced falls that resulted in an injury. Greater visual impairment was associated with an increased falls rate, independent of age and gender. In a multivariate model, more extensive field loss in the inferior region was associated with a higher rate of falls (RR 1.57, 95% CI 1.06–2.32) and of falls with injury (RR 1.80, 95% CI 1.12–2.98), adjusted for all other vision measures and potential confounding factors. Visual acuity, contrast sensitivity, and superior field loss were not associated with the rate of falls; topical beta-blocker use was also not associated with increased falls risk. CONCLUSIONS: Falls are common among older adults with glaucoma and occur more frequently in those with greater visual impairment, particularly in the inferior field region. This finding highlights the importance of the inferior visual field region in falls risk and assists in identifying older adults with glaucoma at risk of future falls, for whom potential interventions should be targeted. KEY WORDS: glaucoma, visual field, visual impairment, falls, injury
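The rate ratios above come from negative binomial regression; the crude, unadjusted version of such a ratio, with a log-scale 95% confidence interval, can be sketched as follows (toy counts, not the study's data):

```python
import math

def rate_ratio(events_a, time_a, events_b, time_b):
    """Crude incidence rate ratio with an approximate 95% CI
    computed on the log scale."""
    rr = (events_a / time_a) / (events_b / time_b)
    se = math.sqrt(1.0 / events_a + 1.0 / events_b)  # SE of log(RR)
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, lo, hi

# Hypothetical: 20 falls in 40 person-years vs 10 falls in 40 person-years
rr, lo, hi = rate_ratio(20, 40.0, 10, 40.0)
print(round(rr, 2), round(lo, 2), round(hi, 2))
```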


The aim of this study was to determine whether spatiotemporal interactions between footballers and the ball in 1 vs. 1 sub-phases are influenced by their proximity to the goal area. Twelve participants (age 15.3 ± 0.5 years) performed as attackers and defenders in 1 vs. 1 dyads across three field positions: (a) attacking the goal, (b) in midfield, and (c) advancing away from the goal area. In each position, the dribbler was required to move beyond an immediate defender with the ball towards the opposition goal. Interactions of attacker-defender dyads were filmed with player and ball displacement trajectories digitized using manual tracking software. One-way repeated measures analysis of variance was used to examine differences in mean defender-to-ball distance after this value had stabilized. Maximum attacker-to-ball distance was also compared as a function of proximity-to-goal. Significant differences were observed for defender-to-ball distance between locations (a) and (c) at the moment when the defender-to-ball distance had stabilized (a: 1.69 ± 0.64 m; c: 1.15 ± 0.59 m; P < 0.05). Findings indicate that proximity-to-goal influenced the performance of players, particularly when attacking or advancing away from goal areas, providing implications for training design in football. In this study, the task constraints of football revealed subtly different player interactions than observed in previous studies of dyadic systems in basketball and rugby union.
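The one-way repeated-measures ANOVA used above partitions variability into condition, subject, and error components before forming the F statistic. A minimal from-scratch sketch with made-up defender-to-ball distances (rows = players, columns = field positions; not the study's data):

```python
def repeated_measures_anova(data):
    """One-way repeated-measures ANOVA.
    data: list of rows, one per subject, each row one value per condition.
    Returns the F statistic with (k-1, (k-1)(n-1)) degrees of freedom."""
    n = len(data)      # subjects
    k = len(data[0])   # conditions
    grand = sum(sum(row) for row in data) / (n * k)
    cond_means = [sum(row[j] for row in data) / n for j in range(k)]
    subj_means = [sum(row) / k for row in data]
    ss_cond = n * sum((m - grand) ** 2 for m in cond_means)
    ss_subj = k * sum((m - grand) ** 2 for m in subj_means)
    ss_total = sum((x - grand) ** 2 for row in data for x in row)
    ss_error = ss_total - ss_cond - ss_subj   # residual after removing subjects
    df_cond, df_error = k - 1, (k - 1) * (n - 1)
    return (ss_cond / df_cond) / (ss_error / df_error)

# Hypothetical distances (m) for three players in two positions
print(repeated_measures_anova([[1.0, 2.0], [2.0, 2.5], [3.0, 4.5]]))  # 12.0
```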


Education might be conceptualized as a swarm of signs. Deleuze, in Proust and Signs (1964/2000), suggests that “Everything that teaches us something emits signs” (p. 4). Such conceptualizations regard education as fluid, multiple and temporal; a young child can display great skill in decoding some signs but not others. Regarding education as temporal and complex operates at some distance from the sociocultural concepts suggested by Vygotsky (1978), which focus on linear sequences of gaining managed, culturally loaded knowledge from more experienced others. Despite differing theorizations around apprenticeship, during early years education a child becomes sensitive to signs that collectively prioritize conventionalized knowledge acquisition and communication practices. Drawing for learning and communicating exemplifies apprenticeship as a creative process rather than as sequential or culturally driven, and serves to exemplify Deleuzian concepts around the relationships between time and learning, rather than age or developmental stage and learning.


Background and significance: Older adults with chronic diseases are at increasing risk of hospital admission and readmission. Approximately 75% of adults have at least one chronic condition, and the odds of developing a chronic condition increase with age. Chronic diseases consume about 70% of total Australian health expenditure, and about 59% of hospital events for chronic conditions are potentially preventable. These figures highlight the importance of chronic disease management in the growing older population. Many studies have endeavoured to develop effective chronic disease management programs by applying social cognitive theory. However, few studies have focused on chronic disease self-management in older adults at high risk of hospital readmission. Moreover, although the majority of studies have covered wide and valuable outcome measures, there is scant evidence examining fundamental health outcomes such as nutritional status, functional status and health-related quality of life. Aim: The aim of this research was to test social cognitive theory in relation to self-efficacy in managing chronic disease and three health outcomes, namely nutritional status, functional status, and health-related quality of life, in older adults at high risk of hospital readmission. Methods: A cross-sectional design was employed, comprising three studies. Study One examined nutritional status and the validation of a nutritional screening tool; Study Two explored the relationships between participants' characteristics, self-efficacy beliefs, and health outcomes based on the study's hypothesized model; Study Three tested a theoretical model based on social cognitive theory, examining potential mechanisms for the mediation effects of social support and self-efficacy beliefs. One hundred and fifty-seven patients aged 65 years and older, with a medical admission and at least one risk factor for readmission, were recruited. Data were collected from medical records on demographics and medical history, and from self-report questionnaires. The nutrition data were collected by two registered nurses. For Study One, a contingency table and the kappa statistic were used to determine the validity of the Malnutrition Screening Tool. In Study Two, standard multiple regression, hierarchical multiple regression and logistic regression were undertaken to determine the significant predictors of the three health outcome measures. For Study Three, a structural equation modelling approach was taken to test the hypothesized self-efficacy model. Results: Study One suggested that malnutrition remains a concern in older adults, with a prevalence of 20.6% according to the Subjective Global Assessment. Additionally, the findings confirmed that the Malnutrition Screening Tool is a valid nutritional screening tool for hospitalized older adults at risk of readmission when compared with the Subjective Global Assessment, with high sensitivity (94%) and specificity (89%) and substantial agreement between the two methods (k = .74, p < .001; 95% CI .62–.86). Analysis of the Study Two data found that depressive symptoms and perceived social support were the two strongest influences on self-efficacy in managing chronic disease in a hierarchical multiple regression. Results of multivariable regression models suggested that advancing age, depressive symptoms and less tangible support were three important predictors of malnutrition. In terms of functional status, a standard regression model found that social support was the strongest predictor of the Instrumental Activities of Daily Living, followed by self-efficacy in managing chronic disease. The results of standard multiple regression revealed that the number of hospital readmission risk factors adversely affected the physical component score, while depressive symptoms and self-efficacy beliefs were two significant predictors of the mental component score. In Study Three, structural equation modelling found that self-efficacy partially mediated the effects of health characteristics and depression on health-related quality of life. Health characteristics had strong direct effects on functional status and body mass index. The results also indicated that social support partially mediated the relationship between health characteristics and functional status. With regard to the joint effects of social support and self-efficacy, social support fully mediated the effect of health characteristics on self-efficacy, and self-efficacy partially mediated the effect of social support on functional status and health-related quality of life. The models fitted the data well, with relatively high variance explained, implying that the hypothesized constructs were highly relevant and supporting the application of social cognitive theory in this context. Conclusion: This thesis highlights the applicability of social cognitive theory to chronic disease self-management in older adults at risk of hospital readmission. Further studies are recommended to validate and extend the development of social cognitive theory on chronic disease self-management in older adults, to improve their nutritional and functional status and health-related quality of life.
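Study One's validity figures (sensitivity, specificity, Cohen's kappa) all derive from a 2×2 agreement table between the screening tool and the reference assessment; a sketch with invented counts rather than the thesis data:

```python
def screening_validity(tp, fn, fp, tn):
    """Sensitivity, specificity and Cohen's kappa from a 2x2 table
    (rows: screening tool result, columns: reference standard)."""
    total = tp + fn + fp + tn
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    observed = (tp + tn) / total
    # Chance agreement from the marginal proportions
    expected = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / total ** 2
    kappa = (observed - expected) / (1.0 - expected)
    return sensitivity, specificity, kappa

# Hypothetical counts, not the thesis data
sens, spec, kappa = screening_validity(tp=9, fn=1, fp=2, tn=88)
print(round(sens, 2), round(spec, 3), round(kappa, 3))
```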


Purpose There has been little community-based research regarding multiple-type victimization experiences of young people in Asia, and none in Malaysia. This study aimed to estimate prevalence, explore gender differences, as well as describe typical perpetrators and family and social risk factors among Malaysian adolescents. Methods A cross-sectional survey of 1,870 students was conducted in 20 randomly selected secondary schools in Selangor state (mean age: 16 years; 58.8% female). The questionnaire included items on individual, family, and social background and different types of victimization experiences in childhood. Results Emotional and physical types of victimization were most common. A significant proportion of adolescents (22.1%) were exposed to more than one type, with 3% reporting all four types. Compared with females, males reported more physical, emotional, and sexual victimization. The excess of sexual victimization among boys was due to higher exposure to noncontact events, whereas prevalence of forced intercourse was equal for both genders (3.0%). Although adult male perpetrators predominate, female adults and peers of both genders also contribute substantially. Low quality of parent–child relationships and poor school and neighborhood environments had the strongest associations with victimization. Family structure (parental divorce, presence of step-parent or single parent, or household size), parental drug use, and rural/urban location were not influential in this sample. Conclusion This study extends the analysis of multiple-type victimization to a Malaysian population. Although some personal, familial, and social factors correlate with those found in western nations, there are cross-cultural differences, especially with regard to the nature of sexual violence based on gender and the influence of family structure.


Aim: Children with Down syndrome have been identified as having difficulty delaying gratification when compared to mental-age-matched children who are developing typically. This study investigated the association between individual characteristics hypothesized to be associated with the ability to delay gratification, as well as the strategies children used in a waiting task. Method: Thirty-two children with Down syndrome and 50 typically developing children matched for mental age completed the tasks. Observations of their behaviour while waiting were video-recorded for later analysis. In addition, parents completed questionnaires with respect to their child’s personality and behaviour. Results: Children with Down syndrome were significantly less able to delay gratification than the comparison group. Different patterns of association between the observational and questionnaire measures and delay time were found for the two groups. Conclusions: Children with Down syndrome have greater difficulty delaying gratification than would be predicted on the basis of their mental age. The contributors to delay appear to differ from those for typically developing children, and these differences need to be considered when planning interventions for developing this skill.


Introduction Critical care patients frequently receive blood transfusions. Some reports show an association between aged or stored blood and increased morbidity and mortality, including the development of transfusion-related acute lung injury (TRALI). However, the existence of conflicting data endorses the need for research to either reject this association, or to confirm it and elucidate the underlying mechanisms. Methods Twenty-eight sheep were randomised into two groups, receiving saline or lipopolysaccharide (LPS). Sheep were further randomised to also receive a transfusion of pooled, heat-inactivated supernatant from fresh (day 1) or stored (day 42) non-leucoreduced human packed red blood cells (PRBC), or an infusion of saline. TRALI was defined by hypoxaemia during or within two hours of transfusion and histological evidence of pulmonary oedema. Regression modelling compared physiology between groups, and with a previous study that used stored platelet concentrates (PLT). Samples of the transfused blood products also underwent cytokine array and biochemical analyses, and their neutrophil priming ability was measured in vitro. Results TRALI did not develop in sheep that first received the saline infusion. In contrast, 80% of sheep that first received the LPS infusion developed TRALI following transfusion with stored PRBC. The decreases in mean arterial pressure and cardiac output, and the increases in central venous pressure and body temperature, were more severe for TRALI induced by stored PRBC than by stored PLT. Storage-related accumulation of several factors was demonstrated in both stored PRBC and stored PLT, and was associated with increased in vitro neutrophil priming. Concentrations of several factors were higher in the stored PRBC than in the stored PLT; however, there was no difference in neutrophil priming in vitro. Conclusions In this in vivo ovine model, both recipient and blood product factors contributed to the development of TRALI. Sick (LPS-infused) rather than healthy (saline-infused) sheep predominantly developed TRALI when transfused with supernatant from stored but not fresh PRBC. Stored PRBC induced a more severe injury than stored PLT and had a different storage lesion profile, suggesting that these outcomes may be associated with storage lesion factors unique to each blood product type. Therefore, transfusion of fresh rather than stored PRBC may minimise the risk of TRALI.


Background: Women who birth in private facilities in Australia are more likely to have a caesarean birth than women who birth in public facilities, and these differences remain after accounting for sector differences in the demographic and health risk profiles of women. However, the extent to which women’s preferences and/or freedom to choose their mode of birth further account for differences in the likelihood of caesarean birth between the sectors remains untested. Method: Women who birthed in Queensland, Australia during a two-week period in 2009 were mailed a self-report survey approximately three months after birth. Seven hundred and fifty-seven women provided cross-sectional retrospective data on where they birthed (public or private facility), mode of birth (vaginal or caesarean) and risk factors, along with their preferences and freedom to choose their mode of birth. A hierarchical logistic regression was conducted to determine the extent to which maternal risk and freedom to choose one’s mode of birth explain sector differences in the likelihood of having a caesarean birth. Findings: While there was no sector difference in women’s preference for mode of birth, women who birthed in private facilities had higher odds of feeling able to choose either a vaginal or caesarean birth, and of feeling able to choose only a caesarean birth. Women had higher odds of having a caesarean birth if they birthed in private facilities, even after accounting for significant risk factors such as age, body mass index, previous caesarean and use of assisted reproductive technology. However, there was no association between place of birth and the odds of having a caesarean birth after also accounting for freedom to choose one’s mode of birth. Conclusions: These findings call into question suggestions that the higher caesarean birth rate in the private sector in Australia is attributable to increased levels of obstetric risk among women birthing in the private sector, or to maternal preferences alone. Instead, the determinants of sector differences in the likelihood of caesarean birth are complex and are linked to differences in the perceived choices for mode of birth between women birthing in the private and public systems.


People with Parkinson’s disease (PD) have been reported to be at higher risk of malnutrition than an age-matched population, due to PD motor and non-motor symptoms and pharmacotherapy side effects. The prevalence of malnutrition in PD has yet to be well defined. Community-dwelling people with PD, aged > 18 years, were recruited (n = 97, 61 M, 36 F). The Patient-Generated Subjective Global Assessment (PG-SGA) was used to assess nutritional status, the Parkinson’s Disease Questionnaire (PDQ-39) was used to assess quality of life, and the Beck Depression Inventory (BDI) was used to measure depression. Levodopa equivalent doses (LEDs) were calculated based on reported Parkinson’s disease medication. Weight, height, mid-arm circumference (MAC) and calf circumference were measured. Cognitive function was measured using the Addenbrooke’s Cognitive Examination. Average age was 70.0 (9.1, 35–92) years. Based on the SGA rating, 16 (16.5%) were moderately malnourished (SGA B) while none were severely malnourished (SGA C). The well-nourished participants (SGA A) had a better quality of life, t(90) = −2.28, p < 0.05, and reported fewer depressive symptoms, t(94) = −2.68, p < 0.05, than malnourished participants. Age, years since diagnosis, cognitive function and LEDs did not significantly differ between the groups. The well-nourished participants had lower PG-SGA scores, t(95) = −5.66, p < 0.001, higher BMIs, t(95) = 3.44, p < 0.05, larger MACs, t(95) = 3.54, p < 0.05, and larger calf circumferences, t(95) = 2.29, p < 0.05, than malnourished participants. The prevalence of malnutrition in community-dwelling adults with PD in this study is comparable to that in other studies of community-dwelling adults without PD, and is higher than in other PD studies where a nutritional status assessment tool was used. Further research is required to understand the primary risk factors for malnutrition in this group.
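The group comparisons above (well-nourished vs malnourished) rest on two-sample t statistics; computed from group summary statistics, a Welch-style sketch looks like this (invented numbers, not the study's values):

```python
import math

def welch_t(mean1, sd1, n1, mean2, sd2, n2):
    """Welch two-sample t statistic from group summary statistics
    (does not assume equal variances)."""
    se = math.sqrt(sd1 ** 2 / n1 + sd2 ** 2 / n2)
    return (mean1 - mean2) / se

# Hypothetical quality-of-life scores: two groups of 16 participants
print(round(welch_t(10.0, 2.0, 16, 8.0, 2.0, 16), 3))  # 2.828
```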


Background: The dopamine D2 receptor (DRD2) is thought to be critical in regulating the dopaminergic pathway in the brain, which is known to be important in the aetiology of schizophrenia. It is therefore not surprising that most antipsychotic medication acts on the dopamine D2 receptor. DRD2 is widely expressed in the brain, levels are reduced in the brains of schizophrenia patients, and DRD2 polymorphisms have been associated with reduced brain expression. We have previously identified a genetic variant in DRD2, rs6277, as strongly implicated in schizophrenia susceptibility. Methods: To identify new associations between the DRD2 gene and disease status and clinical severity, we genotyped seven single nucleotide polymorphisms (SNPs) in DRD2 using a multiplex mass spectrometry method. SNPs were chosen using a haplotype block-based gene-tagging approach so that the entire DRD2 gene was represented. Results: One polymorphism, rs2734839, was found to be significantly associated with schizophrenia as well as with late age of onset. Individuals carrying the genetic variant were more than twice as likely to have schizophrenia as controls. Conclusions: Our results suggest that DRD2 genetic variation is a good indicator of schizophrenia risk and may also be used as a predictor of age of onset.
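"More than twice as likely" corresponds to an odds ratio around 2 for carriers of the variant; the usual case-control calculation with a log-scale 95% CI can be sketched as follows (the genotype counts are invented, not the study's data):

```python
import math

def odds_ratio_ci(a, b, c, d):
    """Odds ratio and 95% CI for a 2x2 case-control table:
    a = carriers among cases,    b = non-carriers among cases,
    c = carriers among controls, d = non-carriers among controls."""
    orr = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(orr) - 1.96 * se)
    hi = math.exp(math.log(orr) + 1.96 * se)
    return orr, lo, hi

# Hypothetical genotype carrier counts
orr, lo, hi = odds_ratio_ci(30, 70, 15, 85)
print(round(orr, 2), round(lo, 2), round(hi, 2))
```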


Background: Understanding the spatial distribution of suicide can inform the planning, implementation and evaluation of suicide prevention activity. This study explored spatial clusters of suicide in Australia, and investigated likely socio-demographic determinants of these clusters. Methods: National suicide and population data at the statistical local area (SLA) level were obtained from the Australian Bureau of Statistics for the period 1999 to 2003. Standardised mortality ratios (SMRs) were calculated at the SLA level, and Geographic Information System (GIS) techniques were applied to investigate the geographical distribution of suicides and detect clusters of high risk in Australia. Results: Male suicide incidence was relatively high in the northeast of Australia and in parts of the east coast, central and southeast inland, compared with the national average. Among the total male population and males aged 15 to 34, Mornington Shire contained all or part of the primary high-risk cluster for suicide, followed by the Bathurst–Melville area in the north coastal area of the Northern Territory, one of the secondary clusters. Other secondary clusters changed with the selection of cluster radius and age group. For males aged 35 to 54 years, only one cluster, in the east of the country, was identified. There was only one significant female suicide cluster, near Melbourne; other SLAs had very few female suicide cases and were not identified as clusters. Male suicide clusters had a higher proportion of Indigenous population and a lower median Socio-Economic Indexes for Areas (SEIFA) score than the national average, but their shapes changed with the choice of maximum cluster radius. Conclusion: This study found high suicide risk clusters at the SLA level in Australia, which appeared to be associated with lower median socio-economic status and a higher proportion of Indigenous population. Future suicide prevention programs should focus on these high-risk areas.
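The SMRs mapped at SLA level compare observed suicides with the count expected if national age-specific rates applied to the local population; a sketch with made-up numbers (three age bands, illustrative rates):

```python
def smr(observed, local_population, reference_rates):
    """Standardised mortality ratio = observed / expected deaths.
    local_population and reference_rates are parallel lists by age group;
    rates are per person (e.g. per person-year)."""
    expected = sum(n * r for n, r in zip(local_population, reference_rates))
    return observed / expected

# Hypothetical SLA: three age bands with national reference rates
population = [5000, 8000, 3000]
rates = [10e-5, 20e-5, 15e-5]   # reference (national) rates per person-year
print(round(smr(4, population, rates), 2))  # > 1 means excess mortality
```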


From human biomonitoring data that are increasingly collected in the United States, Australia, and other countries from large-scale field studies, we obtain snapshots of the concentration levels of various persistent organic pollutants (POPs) within a cross-section of the population at different times. Not only can we observe the trends within this population over time, but we can also gain information going beyond the obvious time trends. By combining the biomonitoring data with pharmacokinetic modeling, we can reconstruct the time-variant exposure to individual POPs, determine their intrinsic elimination half-lives in the human body, and predict future levels of POPs in the population. Different approaches have been employed to extract information from human biomonitoring data. Pharmacokinetic (PK) models were combined with longitudinal data [1], with single [2] or multiple [3] average concentrations from cross-sectional data (CSD), or with multiple CSD with or without empirical exposure data [4]. In the latter study, for the first time, the authors based their modeling outputs on two sets of CSD and empirical exposure data, which allowed their model outputs to be further constrained by the extensive body of empirical measurements. Here we use a PK model to analyze recent levels of PBDE concentrations measured in the Australian population. In this study, we are able to base our model results on four sets [5-7] of CSD; we focus on two PBDE congeners that have been shown [3,5,8-9] to differ in intake rates and half-lives, with BDE-47 being associated with high intake rates and a short half-life, and BDE-153 with lower intake rates and a longer half-life. By fitting the model to PBDE levels measured in different age groups in different years, we determine the level of intake of BDE-47 and BDE-153, as well as the half-lives of these two chemicals in the Australian population.
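In its simplest one-compartment form, the pharmacokinetic reasoning above reduces to first-order elimination balanced against a constant intake, dB/dt = I − kB with k = ln 2 / t½. A sketch of the analytic solution (illustrative numbers, not the fitted PBDE parameters):

```python
import math

def body_burden(intake, half_life, t, b0=0.0):
    """Body burden at time t under constant intake and first-order
    elimination: B(t) = I/k + (B0 - I/k) * exp(-k t), with k = ln2/half-life."""
    k = math.log(2) / half_life
    steady_state = intake / k
    return steady_state + (b0 - steady_state) * math.exp(-k * t)

# Illustrative: 1 ng/day intake, 2-year half-life (expressed in days)
half_life = 2 * 365.0
print(round(body_burden(1.0, half_life, 10 * half_life), 1))
# approaches the steady state I/k = half_life / ln(2), about 1053 ng
```

A short half-life congener (like BDE-47 in the text) reaches this plateau quickly and tracks changes in intake, while a long half-life congener (like BDE-153) accumulates slowly, which is why fitting the model across age groups and survey years can separate intake from elimination.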