Abstract:

MC1R gene variants have previously been associated with red hair and fair skin color; moreover, skin ultraviolet sensitivity and a strong association with melanoma have been demonstrated for three variant alleles that actively influence pigmentation: Arg151Cys, Arg160Trp, and Asp294His. This study confirmed these pigmentary associations with MC1R genotype in a collection of 220 individuals drawn from the Nambour community in Queensland, Australia, 111 of whom were at high risk and 109 at low risk of basal cell carcinoma and squamous cell carcinoma. Comparative allele frequencies for nine MC1R variants reported in the Caucasian population were determined for these two groups, and an association between the prevalence of basal cell carcinoma, squamous cell carcinoma and solar keratosis and the same three active MC1R variant alleles was demonstrated [odds ratio = 3.15, 95% CI (1.7, 5.82)]. Three other commonly occurring variant alleles, Val60Leu, Val92Met, and Arg163Gln, were identified as having minimal impact on pigmentation phenotype as well as on basal cell carcinoma and squamous cell carcinoma risk. A significant heterozygote effect was demonstrated, whereby individuals carrying a single MC1R variant allele were more likely to have fair, sun-sensitive skin and to carry a solar lesion than individuals with a consensus MC1R genotype. After adjusting for the effects of pigmentation on the association between MC1R variant alleles and basal cell carcinoma and squamous cell carcinoma risk, the association persisted, confirming that the presence of at least one variant allele remains informative for predicting the risk of developing a solar-induced skin lesion beyond the information gained through observation of pigmentation phenotype.
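
The reported effect size is an odds ratio with a 95% confidence interval. As a reminder of how such figures are derived (not a reproduction of the study's analysis), the sketch below computes the standard 2x2 odds ratio with a Wald-type interval; the counts are hypothetical placeholders.

```python
import math

def odds_ratio_with_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI for a 2x2 table:
        a = variant carriers with a lesion,   b = variant carriers without
        c = non-carriers with a lesion,       d = non-carriers without
    """
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of ln(OR)
    lower = math.exp(math.log(or_) - z * se_log_or)
    upper = math.exp(math.log(or_) + z * se_log_or)
    return or_, (lower, upper)

# Hypothetical counts for illustration only, not the Nambour data.
print(odds_ratio_with_ci(70, 41, 40, 69))
```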

Abstract:

AIM: To document and compare current practice in the nutrition assessment of Parkinson’s disease by dietitians in Australia and Canada, in order to identify priority areas for the review and development of practice guidelines and to direct future research. METHODS: An online survey was distributed to DAA members and PEN subscribers through their email newsletters. The survey captured current practice across the phases of the Nutrition Care Plan; the results of the assessment phase are presented here. RESULTS: Eighty-four dietitians responded. Differences in practice existed in the choice of nutrition screening and assessment tools, including the BMI ranges considered appropriate. Nutrition impact symptoms were commonly assessed, but information about Parkinson’s disease medication interactions was not consistently assessed. CONCLUSIONS: The variation in practice related to the use of screening and assessment methods may result in the identification of different goals for subsequent interventions. Even more variation was evident for items specific to Parkinson’s disease, which may reflect the lack of evidence to guide practice. Further research is required to support decisions in the nutrition assessment of Parkinson’s disease.

Abstract:

Aim: This study aimed to demonstrate how supervisors and students use their time during the three domains of nutrition and dietetic clinical placement, and to what extent patient care and non-patient activities change during placement compared with the pre- and post-placement periods. Methods: A cohort survey design was used in 2010 with students from two Queensland universities and their supervisors. Participants recorded their time use in either a paper-based or an electronic survey. Supervisors’ and students’ time use was calculated as independent daily means according to the time-use categories reported over the length of the placement. The mean daily number of occasions of service, length of occasions of service, and project and other time use in minutes were reported as productivity output indicators, and the data were imputed. A linear mixed modelling approach was used to describe the relationship between the stage of placement and time use in minutes. Results: Combined students’ (n = 21) and supervisors’ (n = 29) time use, measured as occasions of service or length of occasions of service in patient care activities, differed significantly before, during and after placement. On project-based placements in food service management and community public health nutrition, supervisors’ project activity time decreased significantly during placements, with students undertaking more of the project activity time. Conclusions: This study showed that students do not reduce occasions of service in patient care and that they enhance project activities in food service and community public health nutrition while on placement. A larger study is required to confirm these results.
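
The abstract names a linear mixed modelling approach but gives no specification. Purely as an illustration of that kind of model (not the authors' code or data), the sketch below fits daily time use on placement stage and role with a random intercept per participant using statsmodels; the data frame and column names are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

# Hypothetical long-format data: one row per participant-day.
rows = []
for pid in range(20):                      # 20 made-up participants
    role = "student" if pid < 10 else "supervisor"
    for stage, base in [("pre", 150), ("during", 170), ("post", 155)]:
        for _ in range(5):                 # five observed days per stage
            rows.append({
                "participant": f"p{pid:02d}",
                "role": role,
                "stage": stage,
                "minutes": base + rng.normal(0, 20),
            })
df = pd.DataFrame(rows)

# Fixed effects for placement stage and role; random intercept per participant.
model = smf.mixedlm("minutes ~ C(stage, Treatment('pre')) + role",
                    data=df, groups=df["participant"])
print(model.fit().summary())
```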

Abstract:

The International Classification of Diseases, Tenth Revision, Australian Modification (ICD-10-AM) is commonly used to classify diseases in hospital patients. ICD-10-AM defines malnutrition as “BMI < 18.5 kg/m2 or unintentional weight loss of ≥ 5% with evidence of suboptimal intake resulting in subcutaneous fat loss and/or muscle wasting”. The Australasian Nutrition Care Day Survey (ANCDS) is the most comprehensive survey to evaluate malnutrition prevalence in acute care patients in Australian and New Zealand hospitals [1]. This study determined whether malnourished participants were assigned malnutrition-related codes in accordance with ICD-10-AM. The ANCDS recruited acute care patients from 56 hospitals. Hospital-based dietitians evaluated participants’ nutritional status using BMI and Subjective Global Assessment (SGA). In keeping with the ICD-10-AM definition, malnutrition was defined as BMI < 18.5 kg/m2, SGA-B (moderately malnourished) or SGA-C (severely malnourished). Three months later, in this prospective cohort study, the hospitals’ health information/medical records departments provided coding results for the malnourished participants. Although malnutrition was prevalent in 32% (n = 993) of the cohort (N = 3122), a significantly smaller number were coded for malnutrition (n = 162, 16%; p < 0.001). In 21 hospitals, none of the malnourished participants were coded. This is the largest study to provide a snapshot of malnutrition coding in Australian and New Zealand hospitals. The findings highlight gaps in malnutrition documentation and/or subsequent coding, which could result in a significant loss of casemix-related revenue for hospitals. Dietitians must lead the way in developing structured processes for malnutrition identification, documentation and coding.
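
The study's operational definition (BMI < 18.5 kg/m2, or SGA rating B or C) is a small decision rule, and the coding gap it reports is simple arithmetic on the figures above. The sketch below encodes both for illustration; the function names and argument layout are mine, not part of ICD-10-AM or the ANCDS protocol.

```python
def meets_malnutrition_definition(bmi: float, sga_rating: str) -> bool:
    """Operational definition used in the study: BMI < 18.5 kg/m2,
    or SGA rating B (moderately) or C (severely malnourished)."""
    return bmi < 18.5 or sga_rating.upper() in {"B", "C"}


def coding_gap(malnourished: int, coded: int) -> float:
    """Proportion of malnourished participants who were never coded."""
    return 1 - coded / malnourished


print(meets_malnutrition_definition(bmi=21.0, sga_rating="B"))   # True
# Figures reported in the abstract: 993 malnourished, 162 coded.
print(f"{coding_gap(993, 162):.0%} of malnourished patients not coded")
```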

Abstract:

The Australasian Nutrition Care Day Survey (ANCDS) reported that two in five patients in Australian and New Zealand hospitals consume ≤50% of the food offered. The ANCDS also found a significant association between poor food intake and increased in-hospital mortality after controlling for confounders (nutritional status, age, disease type and severity) [1]. Evidence for the effectiveness of medical nutrition therapy (MNT) in hospital patients who eat poorly is lacking. An exploratory study was conducted in the respiratory, neurology and orthopaedic wards of an Australian hospital. At baseline, 24-hour food intake (0%, 25%, 50%, 75% or 100% of offered meals) was evaluated for patients hospitalised for ≥2 days and not under dietetic review. Patients consuming ≤50% of offered meals because of nutrition-impact symptoms were referred to ward dietitians for MNT, and food intake was re-evaluated on day 7. In total, 184 patients were observed over four weeks. Sixty-two patients (34%) consumed ≤50% of the offered meals. Simple interventions (feeding/menu assistance, diet texture modifications) improved intake to ≥75% in 30 patients, who did not require further MNT. Of the 32 patients referred for MNT, baseline and day-7 data were available for 20 (68 ± 17 years, 65% female, BMI 22 ± 5 kg/m2; median energy and protein intake 2250 kJ and 25 g, respectively). On day 7, 17 participants (85%) demonstrated significantly higher consumption (4300 kJ, 53 g; p < 0.01). Three participants demonstrated no improvement because of ongoing nutrition-impact symptoms. “Percentage food intake” was a quick tool for identifying patients in whom simple interventions could enhance intake. MNT was associated with improved dietary intake in hospital patients. Further research is needed to establish a causal relationship.
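
The screening pathway described above (record intake as a percentage of offered meals, try simple interventions, refer persistent poor eaters for MNT) amounts to a small triage rule. The sketch below restates that pathway in code for illustration; apart from the 50% and 75% cut-offs quoted in the abstract, the names and structure are mine.

```python
from enum import Enum

class Action(Enum):
    NO_ACTION = "adequate intake; continue routine monitoring"
    SIMPLE_INTERVENTION = "offer feeding/menu assistance or texture modification"
    REFER_FOR_MNT = "refer to ward dietitian for medical nutrition therapy"

def triage(intake_pct: int, has_nutrition_impact_symptoms: bool,
           improved_after_simple_intervention: bool = False) -> Action:
    """Triage rule mirroring the pathway in the abstract; intake is recorded
    as 0/25/50/75/100% of offered meals."""
    if intake_pct > 50:
        return Action.NO_ACTION
    if improved_after_simple_intervention:      # intake restored to >=75%
        return Action.NO_ACTION
    if has_nutrition_impact_symptoms:
        return Action.REFER_FOR_MNT
    return Action.SIMPLE_INTERVENTION

print(triage(50, has_nutrition_impact_symptoms=True))
```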

Abstract:

Australian authorities have set ambitious policy objectives to shift Australia’s current transport profile away from heavy reliance on private motor cars towards sustainable modes. Improving the accessibility of public transport is a central component of that objective. Past studies of accessibility to public transport focus on walking time and/or waiting time. However, travellers’ perceptions of the interface legs of their journeys may depend not only on these direct and tangible factors but also on social and psychological factors. This paper extends previous research that identified five salient perspectives of rail access by means of a statement-sorting activity and cluster analysis with a small sample of rail passengers in three Australian cities (Zuniga et al., 2013). This study collects a new data set of 144 responses from Brisbane and Melbourne to an online survey made up of a Likert-scaled statement-sorting exercise and a questionnaire. It employs factor analysis to examine the statement rankings and uncovers seven underlying factors in an exploratory manner: station, safety, access, transfer, service attitude, traveller’s physical activity level, and environmental concern. Respondents from groups stratified by rail-use frequency are compared in terms of their scores on these factors. Findings from this study indicate a need to reconceptualize accessibility to intra-urban rail travel in line with the current policy agenda, and to target behavioural interventions at the multiple dimensions of accessibility that influence passengers’ travel choices. The arguments in this paper are not limited to intra-urban rail transit, but may also be relevant to public transport in general.
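
The method here is exploratory factor analysis of Likert-scaled statement rankings. As a rough illustration of that step only (not the authors' code, data, or rotation choices), the sketch below fits a seven-factor model to a simulated response matrix with scikit-learn.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)

# Simulated stand-in for the survey: 144 respondents rating 30 Likert
# statements on a 1-5 scale (the real statements are in Zuniga et al., 2013).
responses = rng.integers(1, 6, size=(144, 30)).astype(float)

# Standardise the items, then extract seven factors exploratorily.
X = StandardScaler().fit_transform(responses)
fa = FactorAnalysis(n_components=7, random_state=0)
scores = fa.fit_transform(X)          # per-respondent factor scores
loadings = fa.components_.T           # item-by-factor loading matrix

print(scores.shape)    # (144, 7) -- scores could then be compared across
print(loadings.shape)  # (30, 7)      groups stratified by rail-use frequency
```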

Abstract:

Crashes that occur on motorways contribute to a significant proportion (40-50%) of non-recurrent motorway congestion. Hence, reducing the frequency of crashes helps to address congestion (Meyer, 2008). Crash likelihood estimation studies commonly focus on traffic conditions in a short time window around the time of a crash, while longer-term pre-crash traffic flow trends are neglected. In this paper we show, through data mining techniques, that a relationship exists between pre-crash traffic flow patterns and crash occurrence on motorways. We compare these patterns with normal traffic trends and show that this knowledge has the potential to improve the accuracy of existing models and opens the path for new development approaches. The data for the analysis were extracted from records collected between 2007 and 2009 on the Shibuya and Shinjuku lines of the Tokyo Metropolitan Expressway in Japan. The dataset includes a total of 824 rear-end and sideswipe crashes that were matched with the corresponding traffic flow data using an incident detection algorithm. Traffic trends (traffic speed time series) revealed that crashes can be clustered with regard to the dominant traffic patterns prior to the crash. The crashes were clustered using the K-means method with a Euclidean distance function. Normal-situation data were then extracted based on the time distribution of the crashes and clustered for comparison with the “high risk” clusters. Five major trends were found in the clustering results for both high-risk and normal conditions, and the identified traffic regimes differed in their speed trends. Based on these findings, crash likelihood estimation models can be fine-tuned to the monitored traffic conditions, using a 30-minute sliding window, to increase the accuracy of the results and minimise false alarms.
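
The clustering step is standard K-means over fixed-length pre-crash speed time series. The sketch below shows that step on simulated 30-minute speed profiles (one row per crash, one column per minute); the window length and number of clusters follow the abstract, but the data and variable names are placeholders.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(7)

# Simulated pre-crash speed profiles: 824 crashes x 30 one-minute mean speed
# readings (km/h) leading up to each crash.
n_crashes, window_minutes = 824, 30
base = rng.uniform(40, 90, size=(n_crashes, 1))        # free-flow level per crash
trend = np.linspace(0, 1, window_minutes) * rng.uniform(-30, 5, size=(n_crashes, 1))
speeds = base + trend + rng.normal(0, 3, size=(n_crashes, window_minutes))

# K-means with Euclidean distance; five clusters, matching the five major
# trends reported in the study.
km = KMeans(n_clusters=5, n_init=10, random_state=0).fit(speeds)

# Mean speed profile of each "high risk" cluster; the same clustering applied
# to normal-condition windows sampled at matching times of day would give the
# comparison set described in the abstract.
for k in range(5):
    profile = speeds[km.labels_ == k].mean(axis=0)
    print(f"cluster {k}: start {profile[0]:.0f} km/h, end {profile[-1]:.0f} km/h")
```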

Abstract:

Dwell time at a busway station has a significant effect on bus capacity and delay. Dwell time has conventionally been estimated using models developed from field survey data. However, field surveys are resource and cost intensive, so dwell time estimates based on limited observations can be somewhat inaccurate. Most public transport systems are now equipped with Automatic Passenger Count (APC) and/or Automatic Fare Collection (AFC) systems. AFC in particular reduces on-board ticketing time and the driver’s workload, and ultimately reduces bus dwell time. AFC systems record all passenger transactions, providing transit agencies with access to vast quantities of data. AFC data provide transaction timestamps; however, this information differs from dwell time because passengers may tag on or tag off at times other than when the doors open and close. This research contended that models could be developed to reliably estimate dwell time distributions when the measured distributions of transaction times are known. Developing the models required calibration and validation against field survey data of actual dwell times, and an appreciation of another component of transaction time: bus time in queue. Models are developed for a peak period and an off-peak period at a busway station on the South East Busway (SEB) in Brisbane, Australia.
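
The estimation models themselves are not given in the abstract. Purely to illustrate the raw quantity they start from, the sketch below computes, for each bus visit to a station, the span between the first and last AFC tag timestamps; as the abstract notes, this is only a rough proxy for dwell time because tags can occur while the bus is still in queue or away from the doors-open interval. The field names, station and data are hypothetical.

```python
import pandas as pd

# Hypothetical AFC transaction log: one row per tag-on/tag-off event.
afc = pd.DataFrame({
    "trip_id": ["T1", "T1", "T1", "T2", "T2"],
    "station": ["Buranda"] * 5,
    "timestamp": pd.to_datetime([
        "2013-05-06 07:31:02", "2013-05-06 07:31:18", "2013-05-06 07:31:41",
        "2013-05-06 07:35:10", "2013-05-06 07:35:22",
    ]),
})

# Transaction-time span per bus visit: last tag minus first tag, in seconds.
# The study's models go further, calibrating against surveyed dwell times and
# accounting for bus time in queue.
span = (afc.groupby(["trip_id", "station"])["timestamp"]
           .agg(lambda t: (t.max() - t.min()).total_seconds())
           .rename("transaction_span_s"))
print(span)
```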

Abstract:

The Bus Rapid Transit (BRT) station is the interface between passengers and services, and it is crucial to line operation because it is typically the only location where buses can pass each other. Congestion may occur here when buses maneuvering into and out of the platform lane interfere with bus flow, or when a queue of buses forms upstream of the platform lane and blocks the passing lane. Further, some systems operate express services whose buses do not stop at the station, resulting in a proportion of non-stopping buses. It is important to understand station operation under these conditions and its effect on BRT line capacity. This study uses microscopic traffic simulation modeling to examine BRT station operation and to analyze the relationship between station bus capacity and BRT line bus capacity. First, the simulation model is developed for the limit-state scenario, and then a statistical model is defined and calibrated for a specified range of controlled scenarios of dwell time characteristics. A field survey was conducted to verify parameters such as dwell time, clearance time and the coefficient of variation of dwell time, in order to obtain a relevant station bus capacity. The proposed model of BRT bus capacity provides a better understanding of BRT line capacity and is useful to transit authorities in BRT planning, design and operation.
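
The paper's own statistical model is not reproduced in the abstract, but the parameters it lists (dwell time, clearance time, coefficient of variation of dwell time) are the inputs of the widely used Transit Capacity and Quality of Service Manual loading-area capacity formula, sketched below as a reference point rather than as the authors' model; the numeric values are illustrative.

```python
from statistics import NormalDist

def loading_area_bus_capacity(dwell_s, clearance_s, cv_dwell,
                              failure_rate=0.1, g_over_c=1.0):
    """TCQSM-style loading-area bus capacity in buses/hour:
        B = 3600 (g/C) / (t_c + t_d (g/C) + Z c_v t_d)
    where t_d is mean dwell time, t_c is clearance time, c_v is the
    coefficient of variation of dwell time, Z is the one-tailed standard
    normal value for the design failure rate, and g/C is the green ratio
    (1.0 for an unsignalised, off-line busway station).
    """
    z = NormalDist().inv_cdf(1 - failure_rate)
    return 3600 * g_over_c / (clearance_s + dwell_s * g_over_c
                              + z * cv_dwell * dwell_s)

# Illustrative values: 30 s mean dwell, 10 s clearance, dwell-time c_v = 0.6.
print(round(loading_area_bus_capacity(30, 10, 0.6), 1), "buses/h per loading area")
```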

Abstract:

Based on regional-scale studies, aboveground production and litter decomposition are thought to positively covary, because they are driven by shared biotic and climatic factors. Until now, we have been unable to test whether production and decomposition are generally coupled across climatically dissimilar regions, because we lacked replicated data collected within a single vegetation type across multiple regions, obscuring the drivers and generality of the association between production and decomposition. Furthermore, our understanding of the relationships between production and decomposition rests heavily on separate meta-analyses of each response, because no studies have simultaneously measured production and the accumulation or decomposition of litter using consistent methods at globally relevant scales. Here, we use a multi-country grassland dataset collected using a standardized protocol to show that live plant biomass (an estimate of aboveground net primary production) and litter disappearance (represented by mass loss of aboveground litter) do not strongly covary. Live biomass and litter disappearance varied at different spatial scales: there was substantial variation in live biomass among continents, sites and plots, whereas among-continent differences accounted for most of the variation in litter disappearance rates. Although there were strong associations among aboveground biomass, litter disappearance and climatic factors in some regions (e.g. the U.S. Great Plains), these relationships were inconsistent within and among the regions represented by this study. These results highlight the importance of replication among regions and continents when characterizing the correlations between ecosystem processes and interpreting their global-scale implications for carbon flux. Caution is needed when parameterizing litter decomposition and aboveground production in future regional and global carbon models, because their relationship is complex.

Abstract:

Rail operators recognize a need to increase ridership in order to improve the economic viability of rail service and to magnify the role that rail travel plays in making cities liveable. This study extends previous research that used cluster analysis with a small sample of rail passengers to identify five salient perspectives of rail access (Zuniga et al., 2013). In this project stage, we used correlation techniques to determine how those perspectives would resonate with two larger study populations: a relatively homogeneous sample of university students in Brisbane, Australia, and a diverse sample of rail passengers in Melbourne, Australia. Findings from Zuniga et al. (2013) described a complex typology of current passengers based on respondents’ subjective attitudes and perceptions rather than the socio-demographic or travel behaviour characteristics commonly used for segmentation analysis. The typology included five qualitative perspectives of rail travel. Based on the transport accessibility literature, we expected that perspectives emphasizing physical access to rail stations would be shared by current and potential rail passengers who live further from rail stations, while other perspectives might be shared among respondents who live nearby, since the relevance of distance would be diminished. The population living nearby thus represents an important target group for increasing ridership, since making rail travel accessible to them does not require the expansion of costly infrastructure such as new lines or stations. By measuring the prevalence of each perspective in a larger respondent pool, results from this study provide insight into the typical socio-demographic and travel behaviour characteristics that correspond to each perspective of intra-urban rail travel. In several instances, our quantitative findings reinforced Zuniga et al.’s (2013) qualitative descriptions of passenger types, further validating the original research. This work may directly inform rail operators’ approaches to increasing ridership through marketing and through improvements to service quality and the station experience. Operators in other parts of Australia and internationally may also choose to replicate the study locally to fine-tune their understanding of diverse customer bases, and developing regional and international collaboration would provide additional opportunities to evaluate and benchmark service and station amenities as they address the various access dimensions.

Abstract:

The objective of exercise training is to initiate desirable physiological adaptations that ultimately enhance physical work capacity. Optimal training prescription requires an individualized approach, with an appropriate balance of training stimulus and recovery and optimal periodization. Recovery from exercise involves integrated physiological responses. The cardiovascular system plays a fundamental role in facilitating many of these responses, including thermoregulation and delivery/removal of nutrients and waste products. As a marker of cardiovascular recovery, cardiac parasympathetic reactivation following a training session is highly individualized. It appears to parallel the acute/intermediate recovery of the thermoregulatory and vascular systems, as described by the supercompensation theory. The physiological mechanisms underlying cardiac parasympathetic reactivation are not completely understood. However, changes in cardiac autonomic activity may provide a proxy measure of the changes in autonomic input into organs and (by default) the blood flow requirements to restore homeostasis. Metaboreflex stimulation (e.g. muscle and blood acidosis) is likely a key determinant of parasympathetic reactivation in the short term (0–90 min post-exercise), whereas baroreflex stimulation (e.g. exercise-induced changes in plasma volume) probably mediates parasympathetic reactivation in the intermediate term (1–48 h post-exercise). Cardiac parasympathetic reactivation does not appear to coincide with the recovery of all physiological systems (e.g. energy stores or the neuromuscular system). However, this may reflect the limited data currently available on parasympathetic reactivation following strength/resistance-based exercise of variable intensity. In this review, we quantitatively analyse post-exercise cardiac parasympathetic reactivation in athletes and healthy individuals following aerobic exercise, with respect to exercise intensity and duration, and fitness/training status. Our results demonstrate that the time required for complete cardiac autonomic recovery after a single aerobic-based training session is up to 24 h following low-intensity exercise, 24–48 h following threshold-intensity exercise and at least 48 h following high-intensity exercise. Based on limited data, exercise duration is unlikely to be the greatest determinant of cardiac parasympathetic reactivation. Cardiac autonomic recovery occurs more rapidly in individuals with greater aerobic fitness. Our data lend support to the concept that in conjunction with daily training logs, data on cardiac parasympathetic activity are useful for individualizing training programmes. In the final sections of this review, we provide recommendations for structuring training microcycles with reference to cardiac parasympathetic recovery kinetics. Ultimately, coaches should structure training programmes tailored to the unique recovery kinetics of each individual.
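
The review's headline recovery windows (up to 24 h after low-intensity, 24-48 h after threshold-intensity, and at least 48 h after high-intensity aerobic sessions) lend themselves to a simple scheduling rule for structuring microcycles. The sketch below encodes those windows as a helper for spacing demanding sessions; the function itself and the conservative choice of upper bounds are illustrative, not a prescription from the review.

```python
from datetime import datetime, timedelta

# Minimum cardiac-parasympathetic recovery windows reported in the review
# for a single aerobic training session, keyed by session intensity.
RECOVERY_HOURS = {
    "low": 24,        # up to 24 h
    "threshold": 48,  # 24-48 h; the upper bound is used here to be conservative
    "high": 48,       # at least 48 h
}

def earliest_next_hard_session(last_session_end: datetime,
                               intensity: str) -> datetime:
    """Earliest time another demanding session could be scheduled if cardiac
    autonomic recovery is used as the pacing constraint; individual monitoring
    (e.g. daily parasympathetic indices and training logs) should refine this."""
    return last_session_end + timedelta(hours=RECOVERY_HOURS[intensity])

# Example: a threshold-intensity session finishing Monday at 18:00.
print(earliest_next_hard_session(datetime(2014, 3, 10, 18, 0), "threshold"))
```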