904 results for Time study


Relevance:

30.00%

Publisher:

Abstract:

Electrical resistivity of soils and sediments is strongly influenced by the presence of interstitial water. Taking advantage of this dependency, electrical-resistivity imaging (ERI) can be effectively utilized to estimate subsurface soil-moisture distributions. The ability to obtain spatially extensive data combined with time-lapse measurements provides further opportunities to understand links between land use and climate processes. In natural settings, spatial and temporal changes in temperature and porewater salinity influence the relationship between soil moisture and electrical resistivity. Apart from environmental factors, technical, theoretical, and methodological ambiguities may also interfere with accurate estimation of soil moisture from ERI data. We have examined several of these complicating factors using data from a two-year study at a forest-grassland ecotone, a boundary between neighboring but different plant communities. At this site, temperature variability accounts for approximately 20–45% of resistivity changes from cold winter to warm summer months. Temporal changes in groundwater conductivity (mean = 650 µS/cm, σ = 57.7 µS/cm) and a roughly 100-µS/cm spatial difference between the forest and grassland had only a minor influence on the moisture estimates. Significant seasonal fluctuations in temperature and precipitation had negligible influence on the basic measurement errors in data sets. Extracting accurate temporal changes from ERI can be hindered by nonuniqueness of the inversion process and uncertainties related to time-lapse inversion schemes. The accuracy of soil moisture obtained from ERI depends on all of these factors, in addition to empirical parameters that define the petrophysical soil-moisture/resistivity relationship. Many of the complicating factors and modifying variables needed to accurately quantify soil-moisture changes with ERI can be accounted for using field and theoretical principles.
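The petrophysical step the abstract refers to is commonly handled with Archie's law plus a temperature normalization of the resistivity readings. The sketch below illustrates that pipeline; the exponent values, the ~2%/°C temperature coefficient, and the reference temperature are illustrative assumptions, not parameters from this study.

```python
def to_reference_temperature(rho, temp_c, ref_c=25.0, alpha=0.02):
    """Normalize a field resistivity reading to a reference temperature.

    Uses the common linear ratio model in which pore-water conductivity
    rises ~2% per degree C (alpha and ref_c are illustrative defaults).
    """
    return rho * (1.0 + alpha * (temp_c - ref_c))


def saturation_from_resistivity(rho, rho_water, porosity, m=1.5, n=2.0):
    """Invert Archie's law, rho = rho_water * porosity**(-m) * S**(-n),
    for water saturation S. The cementation (m) and saturation (n)
    exponents are assumed here; they must be calibrated per site.
    """
    return (rho / (rho_water * porosity ** (-m))) ** (-1.0 / n)


# Round trip: a half-saturated soil should be recovered exactly.
rho_w = 15.4  # ~650 uS/cm groundwater conductivity, expressed in ohm-m
rho_obs = rho_w * 0.4 ** (-1.5) * 0.5 ** (-2.0)
S = saturation_from_resistivity(rho_obs, rho_w, porosity=0.4)
```

In practice the empirical parameters would be fitted against co-located moisture-probe data before applying the inversion to the full ERI section.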

Relevance:

30.00%

Publisher:

Abstract:

This paper investigates quality of service (QoS) and resource productivity implications of transit route passenger loading and travel time. It highlights the value of occupancy load factor as a direct passenger comfort QoS measure. Automatic Fare Collection (AFC) data for a premium radial bus route in Brisbane, Australia, is used to investigate time-series correlation between occupancy load factor and passenger average travel time. Correlation is strong across the entire span of service in both directions. Passengers tend to be making longer, peak-direction commuter trips under significantly less comfortable conditions than off-peak. The Transit Capacity and Quality of Service Manual uses segment-based load factor as a measure of onboard loading comfort QoS. This paper provides additional insight into QoS by relating the two route-based dimensions of occupancy load factor and passenger average travel time together in a two-dimensional format, from both the passenger's and operator's perspectives. Future research will apply Value of Time to QoS measurement, reflecting perceived passenger comfort through crowding and average time spent onboard. This would also assist in transit service quality econometric modeling. The methodology can be readily applied in a practical setting where AFC data for fixed scheduled routes is available. The study outcomes also provide valuable research and development directions.

Relevance:

30.00%

Publisher:

Abstract:

This presentation investigates quality of service (QoS) and resource productivity implications of transit route passenger loading and travel time. It highlights the value of occupancy load factor as a direct passenger comfort QoS measure. Automatic Fare Collection (AFC) data for a premium radial bus route in Brisbane, Australia, is used to investigate time-series correlation between occupancy load factor and passenger average travel time. Correlation is strong across the entire span of service in both directions. Passengers tend to be making longer, peak-direction commuter trips under significantly less comfortable conditions than off-peak. The Transit Capacity and Quality of Service Manual uses segment-based load factor as a measure of onboard loading comfort QoS. This presentation provides additional insight into QoS by relating the two route-based dimensions of occupancy load factor and passenger average travel time together in a two-dimensional format, from both the passenger's and operator's perspectives. Future research will apply Value of Time to QoS measurement, reflecting perceived passenger comfort through crowding and average time spent onboard. This would also assist in transit service quality econometric modeling. The methodology can be readily applied in a practical setting where AFC data for fixed scheduled routes is available. The study outcomes also provide valuable research and development directions.

Relevance:

30.00%

Publisher:

Abstract:

Background Ascites, the most frequent complication of cirrhosis, is associated with poor prognosis and reduced quality of life. Recurrent hospital admissions are common and often unplanned, resulting in increased use of hospital services. Aims To examine use of hospital services by patients with cirrhosis and ascites requiring paracentesis, and to investigate factors associated with early unplanned readmission. Methods A retrospective review of the medical chart and clinical databases was performed for patients who underwent paracentesis between October 2011 and October 2012. Clinical parameters at index admission were compared between patients with and without early unplanned hospital readmissions. Results The 41 patients requiring paracentesis had 127 hospital admissions, 1164 occupied bed days and 733 medical imaging services. Most admissions (80.3%) were for management of ascites, of which 41.2% were unplanned. Of those eligible, 69.7% were readmitted and 42.4% had an early unplanned readmission. Twelve patients died and nine developed spontaneous bacterial peritonitis. Of those eligible for readmission, more patients died (P = 0.008) and/or developed spontaneous bacterial peritonitis (P = 0.027) if they had an early unplanned readmission during the study period. Markers of liver disease, as well as haemoglobin (P = 0.029), haematocrit (P = 0.024) and previous heavy alcohol use (P = 0.021) at index admission, were associated with early unplanned readmission. Conclusion Patients with cirrhosis and ascites comprise a small population who account for substantial use of hospital services. Markers of disease severity may identify patients at increased risk of early readmission. Alternative models of care should be considered to reduce unplanned hospital admissions, healthcare costs and pressure on emergency services.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND Measuring disease and injury burden in populations requires a composite metric that captures both premature mortality and the prevalence and severity of ill-health. The 1990 Global Burden of Disease study proposed disability-adjusted life years (DALYs) to measure disease burden. No comprehensive update of disease burden worldwide incorporating a systematic reassessment of disease and injury-specific epidemiology has been done since the 1990 study. We aimed to calculate disease burden worldwide and for 21 regions for 1990, 2005, and 2010 with methods to enable meaningful comparisons over time. METHODS We calculated DALYs as the sum of years of life lost (YLLs) and years lived with disability (YLDs). DALYs were calculated for 291 causes, 20 age groups, both sexes, and for 187 countries, and aggregated to regional and global estimates of disease burden for three points in time with strictly comparable definitions and methods. YLLs were calculated from age-sex-country-time-specific estimates of mortality by cause, with each death weighted by the standard lost life expectancy at that age. YLDs were calculated as prevalence of 1160 disabling sequelae, by age, sex, and cause, and weighted by new disability weights for each health state. Neither YLLs nor YLDs were age-weighted or discounted. Uncertainty around cause-specific DALYs was calculated incorporating uncertainty in levels of all-cause mortality, cause-specific mortality, prevalence, and disability weights. FINDINGS Global DALYs remained stable from 1990 (2.503 billion) to 2010 (2.490 billion). Crude DALYs per 1000 decreased by 23% (472 per 1000 to 361 per 1000). An important shift has occurred in DALY composition, with the contribution of deaths and disability among children (younger than 5 years of age) declining from 41% of global DALYs in 1990 to 25% in 2010.
YLLs typically account for about half of disease burden in more developed regions (high-income Asia Pacific, western Europe, high-income North America, and Australasia), rising to over 80% of DALYs in sub-Saharan Africa. In 1990, 47% of DALYs worldwide were from communicable, maternal, neonatal, and nutritional disorders, 43% from non-communicable diseases, and 10% from injuries. By 2010, this had shifted to 35%, 54%, and 11%, respectively. Ischaemic heart disease was the leading cause of DALYs worldwide in 2010 (up from fourth rank in 1990, increasing by 29%), followed by lower respiratory infections (top rank in 1990; 44% decline in DALYs), stroke (fifth in 1990; 19% increase), diarrhoeal diseases (second in 1990; 51% decrease), and HIV/AIDS (33rd in 1990; 351% increase). Major depressive disorder increased from 15th to 11th rank (37% increase) and road injury from 12th to 10th rank (34% increase). Substantial heterogeneity exists in rankings of leading causes of disease burden among regions. INTERPRETATION Global disease burden has continued to shift away from communicable to non-communicable diseases and from premature death to years lived with disability. In sub-Saharan Africa, however, many communicable, maternal, neonatal, and nutritional disorders remain the dominant causes of disease burden. The rising burden from mental and behavioural disorders, musculoskeletal disorders, and diabetes will impose new challenges on health systems. Regional heterogeneity highlights the importance of understanding local burden of disease and setting goals and targets for the post-2015 agenda taking such patterns into account. Because of improved definitions, methods, and data, these results for 1990 and 2010 supersede all previously published Global Burden of Disease results.
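The DALY arithmetic described in the methods reduces to a simple sum of the two components. The sketch below uses invented numbers for a single hypothetical cause, not GBD estimates.

```python
def yll(deaths, standard_life_expectancy):
    """Years of life lost: deaths weighted by the standard lost life
    expectancy at the age of death (no age-weighting, no discounting,
    per the GBD 2010 methods)."""
    return deaths * standard_life_expectancy

def yld(prevalent_cases, disability_weight):
    """Years lived with disability: prevalence times disability weight."""
    return prevalent_cases * disability_weight

def daly(deaths, sle, prevalent_cases, dw):
    """DALY = YLL + YLD."""
    return yll(deaths, sle) + yld(prevalent_cases, dw)

# Illustrative cause: 1,000 deaths each losing 40 standard years, plus
# 50,000 prevalent cases with a disability weight of 0.2.
burden = daly(1_000, 40, 50_000, 0.2)
```

In the full study this sum is taken over causes, sequelae, ages, sexes, countries, and the three time points, with uncertainty propagated through each input.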

Relevance:

30.00%

Publisher:

Abstract:

The reliable response to weak biological signals requires that they be amplified with fidelity. In E. coli, the flagellar motors that control swimming can switch direction in response to very small changes in the concentration of the signaling protein CheY-P, but how this works is not well understood. A recently proposed allosteric model based on cooperative conformational spread in a ring of identical protomers seems promising, as it is able to qualitatively reproduce switching, locked-state behavior and Hill coefficient values measured for the rotary motor. In this paper we undertook a comprehensive simulation study to analyze the behavior of this model in detail and made predictions on three experimentally observable quantities: the switch-time distribution, the locked-state interval distribution, and the Hill coefficient of the switch response. We parameterized the model using experimental measurements, finding excellent agreement with published data on motor behavior. Analysis of the simulated switching dynamics revealed a mechanism for chemotactic ultrasensitivity, in which cooperativity is indispensable for realizing both coherent switching and effective amplification. These results showed how cells can combine elements of analog and digital control to produce switches that are simultaneously sensitive and reliable. © 2012 Ma et al.
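A minimal Metropolis-style sketch of a conformational-spread ring is below. The ring size, coupling energy, and activity bias are illustrative stand-ins, not the fitted parameters from the paper, and the full model uses a kinetic (rate-based) scheme rather than this equilibrium sampler.

```python
import math
import random

def simulate_ring(n=34, steps=20_000, e_a=0.0, e_c=2.0, seed=1):
    """Metropolis sketch of conformational spread on a ring of protomers.

    Each protomer is 0 (CCW) or 1 (CW). A flip's energy change (in kT)
    has a coupling term e_c rewarding agreement with the two neighbours
    and a bias term e_a standing in for CheY-P-dependent activation.
    Returns the fraction of CW protomers after each attempted flip.
    """
    rng = random.Random(seed)
    state = [0] * n
    activity = []
    for _ in range(steps):
        i = rng.randrange(n)
        s = state[i]
        agree_before = sum(1 for j in (i - 1, (i + 1) % n) if state[j] == s)
        # Neighbours are unchanged by the flip, so agreements invert:
        # agree_after = 2 - agree_before.
        d_e = e_c * (2 * agree_before - 2) + e_a * (1 - 2 * s)
        if d_e <= 0 or rng.random() < math.exp(-d_e):
            state[i] = 1 - s
        activity.append(sum(state) / n)
    return activity
```

With strong coupling the ring spends long intervals near all-CCW or all-CW (locked states) and switches coherently between them, which is the qualitative behavior the paper quantifies.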

Relevance:

30.00%

Publisher:

Abstract:

A sound understanding of travellers' behavioural changes and adaptation when facing a natural disaster is a key factor in efficiently and effectively managing transport networks at such times. This study specifically investigates the importance of travel/traffic information and its impact on travel behaviour during natural disasters. Using the 2011 Brisbane flood as a case study, survey respondents' perceptions of the importance of travel/traffic information before, during, and after the flood were modelled using random-effects ordered logit. A hysteresis phenomenon was observed: respondents' perceptions of the importance of travel/traffic information increased during the flood, and although its perceived importance decreased after the flood, it did not return to the pre-flood level. Results also reveal that socio-demographic features (such as gender and age) have a significant impact on respondents' perceptions of the importance of travel/traffic information. The roles of travel time and safety in a respondent's trip planning are also significantly correlated with their perception of the importance of this information. The analysis further shows that during the flood, respondents generally thought that travel/traffic information was important, and adjusted their travel plans according to information received. When controlling for other factors, the estimated odds of changing routes and cancelling trips for a respondent who thought that travel/traffic information was important are respectively about three times and seven times the estimated odds for a respondent who thought that travel/traffic information was not important. In contrast, after the flood, the influence of travel/traffic information on respondents' travel behaviour diminishes. Finally, the analysis shows no evidence of an influence of travel/traffic information on respondents' travel mode; this indicates that inducing travel mode change is a challenging task.
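The reported effects live on a logit scale: a coefficient maps to an odds ratio via exp(β), and in an ordered logit, cumulative cutpoints give the category probabilities. A sketch with made-up coefficients and cutpoints, not the paper's estimates:

```python
from math import exp

def odds_ratio(beta):
    """Multiplicative change in odds per one-unit change in a predictor."""
    return exp(beta)

def ordered_logit_probs(xb, cutpoints):
    """Category probabilities for an ordered logit with linear index xb
    and ascending cutpoints: P(Y <= k) = logistic(cut_k - xb)."""
    def logistic(z):
        return 1.0 / (1.0 + exp(-z))
    cum = [logistic(c - xb) for c in cutpoints] + [1.0]
    return [cum[0]] + [cum[k] - cum[k - 1] for k in range(1, len(cum))]

# A coefficient near log(3) ~ 1.10 on "information rated important"
# corresponds to roughly tripled odds of changing route (illustrative).
or_route = odds_ratio(1.0986)
probs = ordered_logit_probs(xb=0.5, cutpoints=[-1.0, 0.0, 1.5])
```

The random-effects extension adds a respondent-specific intercept to xb; the odds-ratio reading of the coefficients is unchanged.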

Relevance:

30.00%

Publisher:

Abstract:

A comprehensive literature review has been undertaken exploring the stressors placed on the personal relationships of Australian Army personnel through service life and overseas deployments. This work is the first step in a program of research aimed at developing a screening tool that will act as an early warning system, enabling the right assistance to be given to affected personnel at the earliest possible time. It is envisioned that this tool will be utilised by the day-to-day managers of Australian Army personnel, the vast majority of whom are not health practitioners. This review has identified the commonalities of relationships that last through service life and/or deployments, and of those that fail. These factors will aid the development of the screening tool and enable the early identification of Australian Army personnel who are at risk of having their personal relationship break down. Several of the known relationship stressors are relevant to other 'high intensity' professions, such as paramedics. Personal experience as an Army Officer has helped to highlight the importance of this research, and the benefits of developing a tool tailored to the unique social microclimate that is the Australian Army are clear. This research is, to the author's knowledge, unique in the Australian context.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Registered nurses and midwives play an essential role in detecting patients at risk of deterioration through ongoing assessment and action in response to changing health status. Yet, evidence suggests that clinical deterioration frequently goes unnoticed in hospitalised patients. While much attention has been paid to early warning and rapid response systems, little research has examined factors related to physical assessment skills. OBJECTIVES: To determine a minimum data set of core skills used during nursing assessment of hospitalised patients and identify nurse and workplace predictors of the use of physical assessment to detect patient deterioration. DESIGN: The study used a single-centre, cross-sectional survey design. SETTING AND PARTICIPANTS: The study included 434 registered nurses and midwives (Grades 5-7) involved in clinical care of patients on acute care wards, including medicine, surgery, oncology, mental health and maternity service areas, at a 929-bed tertiary referral teaching hospital in Southeast Queensland, Australia. METHODS: We conducted a hospital-wide survey of registered nurses and midwives using the 133-item Physical Assessment Skills Inventory and the 58-item Barriers to Registered Nurses' Use of Physical Assessment scale. Median frequency for each physical assessment skill was calculated to determine core skills. To explore predictors of core skill utilisation, backward stepwise general linear modelling was conducted. Means and regression coefficients are reported with 95% confidence intervals. A p value < .05 was considered significant for all analyses. RESULTS: Core skills used by most nurses every time they worked included assessment of temperature, oxygen saturation, blood pressure, breathing effort, skin, wound and mental status.
Reliance on others and technology (F = 35.77, p < .001), lack of confidence (F = 5.52, p = .02), work area (F = 3.79, p = .002), and clinical role (F = 44.24, p < .001) were significant predictors of the extent of physical assessment skill use. CONCLUSIONS: The increasing acuity of the acute care patient plausibly warrants more than vital signs assessment; however, our study confirms that nurses' core physical assessment skill set consists mainly of vital signs. The focus on these endpoints of deterioration as dictated by early warning and rapid response systems may divert attention from and devalue comprehensive nursing assessment that could detect subtle changes in health status earlier in the patient's hospitalisation.
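The minimum-data-set step (flagging skills whose median use frequency sits at the top of the reporting scale) can be sketched as below; the 0-4 coding and the example ratings are assumptions for illustration, not the inventory's actual data.

```python
from statistics import median

def core_skills(ratings_by_skill, top_of_scale=4):
    """Return skills whose median use-frequency rating is at the top of
    the scale (here 0 = never ... 4 = every time I work)."""
    return [skill for skill, ratings in ratings_by_skill.items()
            if median(ratings) >= top_of_scale]

# Hypothetical per-nurse frequency ratings for three skills.
sample = {
    "temperature": [4, 4, 4, 3],        # near-universal -> core
    "oxygen saturation": [4, 4, 4, 4],  # universal -> core
    "lung auscultation": [1, 2, 0, 1],  # infrequent -> not core
}
core = core_skills(sample)
```

Applied to the full 133-item inventory, this yields the small vital-signs-dominated core set the study reports.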

Relevance:

30.00%

Publisher:

Abstract:

Background: While weight gain following breast cancer is considered common, results supporting these findings are dated. This work describes changes in body weight following breast cancer over 72 months, compares weight with normative data and explores whether weight changes over time are associated with personal, diagnostic, treatment or behavioral characteristics. Methods: A population-based sample of 287 Australian women diagnosed with early-stage invasive breast cancer was assessed prospectively at 6, 12, 18 and 72 months post-surgery. Weight was clinically measured and linear mixed models were used to explore associations between weight and participant characteristics (collected via self-administered questionnaire). Those with BMI changes of one or more units were considered to have experienced clinically significant changes in weight. Results: More than half (57%) of participants were overweight or obese at 6 months post-surgery, and by 72 months post-surgery 68% of women were overweight or obese. Among those who gained more weight than age-matched norms, clinically significant weight gain between 6 and 18 months and between 6 and 72 months post-surgery was observed in 24% and 39% of participants, respectively (median [range] weight gain: 3.9 kg [2.0-11.3 kg] and 5.2 kg [0.6-28.7 kg], respectively). Clinically significant weight losses were observed in up to 24% of the sample (median [range] weight loss between 6 and 72 months post-surgery: -6.4 kg [-1.9 to -24.6 kg]). More extensive lymph node removal, being treated on the non-dominant side, receiving radiation therapy and lower physical activity levels at 6 months were associated with higher body weights post-breast cancer (group differences >3 kg; all p<0.05). Conclusions: While average weight gain among breast cancer survivors in the long term is small, subgroups of women experience greater gains linked with adverse health and above that experienced by age-matched counterparts.
Weight change post-breast cancer is a contemporary public health issue and the integration of healthy weight education and support into standard breast cancer care has potential to significantly improve the length and quality of cancer survivorship.
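The study's "clinically significant" criterion (a BMI shift of at least one unit) is easy to make concrete. The heights and weights below are invented for illustration.

```python
def bmi(weight_kg, height_m):
    """Body mass index: weight (kg) divided by height (m) squared."""
    return weight_kg / height_m ** 2

def weight_change_category(weight_start, weight_end, height_m, threshold=1.0):
    """Classify a follow-up weight change: a BMI shift of >= `threshold`
    units counts as clinically significant gain or loss."""
    delta = bmi(weight_end, height_m) - bmi(weight_start, height_m)
    if delta >= threshold:
        return "gain"
    if delta <= -threshold:
        return "loss"
    return "stable"
```

For a 1.65 m woman, the study's median gain of 3.9 kg corresponds to a BMI shift of about 1.4 units, comfortably past the one-unit threshold.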

Relevance:

30.00%

Publisher:

Abstract:

Background Drink driving remains an important issue to address in terms of health and injury prevention, even though research shows that over time there has been a steady decline in drink driving. This has been attributed to the introduction of countermeasures such as random breath testing (RBT), changing community attitudes and norms leading to less acceptance of the behaviour and, to a lesser degree, the implementation of programs designed to deter offenders from engaging in drink driving. Most of the research to date has focused on hard-core offenders: those with high blood alcohol content at the time of arrest, and those who have more than one offence. Aims There has been little research on differences within the first-offender population or on factors contributing to second offences. This research aims to fill the gap by reporting on those factors in a sample of offenders. Methods This paper reports on a study that involved interviewing 198 first offenders in court and following up this group 6-8 months post offence. Of these original participants, 101 offenders were able to be followed up, with 88 included in this paper on the basis that they had driven a vehicle since the offence. Results Interestingly, while the rate of reported apprehended second offences was low in that time frame (3%), a surprisingly high proportion of offenders (27%) reported that they had driven under the influence. That is, a large proportion of first offenders were willing to risk the much larger penalties associated with a second offence in order to engage in drink driving. Discussion and conclusions Key characteristics of this follow-up group are examined to inform the development of an evidence-based brief intervention program that targets first-time offenders with the goal of decreasing the rate of repeat drink driving.

Relevance:

30.00%

Publisher:

Abstract:

Background Despite the widely recognised importance of sustainable health care systems, health services research remains generally underfunded in Australia. The Australian Centre for Health Services Innovation (AusHSI) is funding health services research in the state of Queensland. AusHSI has developed a streamlined protocol for applying and awarding funding using a short proposal and accelerated peer review. Method An observational study of proposals for four health services research funding rounds from May 2012 to November 2013. A short proposal of less than 1,200 words was submitted using a secure web-based portal. The primary outcome measures are: time spent preparing proposals; a simplified scoring of grant proposals (reject, revise or accept for interview) by a scientific review committee; and progressing from submission to funding outcomes within eight weeks. Proposals outside of health services research were deemed ineligible. Results There were 228 eligible proposals across 4 funding rounds: from 29% to 79% were shortlisted and 9% to 32% were accepted for interview. Success rates increased from 6% (in 2012) to 16% (in 2013) of eligible proposals. Applicants were notified of the outcomes within two weeks from the interview, which was a maximum of eight weeks after the submission deadline. Applicants spent 7 days on average preparing their proposal. Applicants with a ranking of reject or revise received written feedback and suggested improvements for their proposals, and resubmissions comprised one-third of the 2013 rounds. Conclusions The AusHSI funding scheme is a streamlined application process that has simplified the process of allocating health services research funding for both applicants and peer reviewers. The AusHSI process has minimised the time from submission to notification of funding outcomes.

Relevance:

30.00%

Publisher:

Abstract:

Purpose Many haematological cancer survivors report long-term physiological and psychosocial effects, which persist far beyond treatment completion. Cancer services have been required to extend care to the post-treatment phase to implement survivorship care strategies into routine practice. As key members of the multidisciplinary team, cancer nurses' perspectives are essential to inform future developments in survivorship care provision. Methods This is a pilot survey study involving 119 nurses caring for patients with haematological malignancy in an Australian tertiary cancer care centre. The participants completed an investigator-developed survey designed to assess cancer care nurses' perspectives on their attitudes, confidence levels, and practice in relation to post-treatment survivorship care for patients with a haematological malignancy. Results Overall, the majority of participants agreed that all of the survivorship interventions included in the survey should be within the scope of the nursing role. Nurses reported being least confident in discussing fertility and employment/financial issues with patients and conducting psychosocial distress screening. The interventions performed least often included discussing fertility, intimacy and sexuality issues, and communicating survivorship care with the patient's primary health care providers. Nurses identified lack of time, limited educational resources, lack of dedicated end-of-treatment consultation and insufficient skills/knowledge as the key barriers to survivorship care provision. Conclusion Cancer centres should implement an appropriate model of survivorship care and provide improved training and educational resources for nurses to enable them to deliver quality survivorship care and meet the needs of haematological cancer survivors.

Relevance:

30.00%

Publisher:

Abstract:

This study considered the relationship between professional learning, teacher agency and school improvement. Specifically, it explored the principal's role in supporting teacher agency in professional learning. It found that, with appropriate pressure and support from principals, school improvement for the betterment of student learning is attainable through teacher professional learning that is based 'within' a school. In particular, it ascertained that schools need to give greater attention to the allocation of time for teacher professional learning, specifically time before, during and after professional learning activities. Privileging time efficiently and effectively heightens teacher agency in their learning.

Relevance:

30.00%

Publisher:

Abstract:

Background The high recurrence rate of chronic venous leg ulcers has a significant impact on an individual's quality of life and healthcare costs. Objectives This study aimed to identify risk and protective factors for recurrence of venous leg ulcers using a theoretical approach, applying a framework of self and family management of chronic conditions to underpin the study. Design Secondary analysis of combined data collected from three previous prospective longitudinal studies. Setting The contributing studies' participants were recruited from two metropolitan hospital outpatient wound clinics and three community-based wound clinics. Participants Data were available on a sample of 250 adults, with a leg ulcer of primarily venous aetiology, who were followed after ulcer healing for a median of 17 months (range: 3 to 36 months). Methods Data from the three studies were combined. The original participant data were collected through medical records and self-reported questionnaires upon healing and every 3 months thereafter. A Cox proportional-hazards regression analysis was undertaken to determine the influential factors on leg ulcer recurrence based on the proposed conceptual framework. Results The median time to recurrence was 42 weeks (95% CI 31.9-52.0), with an incidence of 22% (54 of 250 participants) recurrence within three months of healing, 39% (91 of 235 participants) for those who were followed for six months, 57% (111 of 193) by 12 months, 73% (53 of 72) by two years and 78% (41 of 52) of those who were followed up for three years.
A Cox proportional-hazards regression model revealed that the risk factors for recurrence included a history of deep vein thrombosis (HR 1.7, 95% CI 1.07–2.67, p=0.024), history of multiple previous leg ulcers (HR 4.4, 95% CI 1.84–10.5, p=0.001), and longer duration (in weeks) of previous ulcer (HR 1.01, 95% CI 1.003–1.01, p<0.001); while the protective factors were elevating legs for at least 30 minutes per day (HR 0.33, 95% CI 0.19–0.56, p<0.001), higher levels of self-efficacy (HR 0.95, 95% CI 0.92–0.99, p=0.016), and walking around for at least three hours/day (HR 0.66, 95% CI 0.44–0.98, p=0.040). Conclusions Results from this study provide a comprehensive examination of risk and protective factors associated with leg ulcer recurrence based on the chronic disease self and family management framework. These results in turn provide essential steps towards developing and testing interventions to promote optimal prevention strategies for venous leg ulcer recurrence.
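Because the Cox model is multiplicative in hazards, per-factor hazard ratios combine by multiplication (equivalently, summing on the log scale). The sketch below applies the study's reported HRs to a hypothetical patient profile; treating the factors as independent is a simplifying assumption.

```python
from math import exp, log

def combined_hazard_ratio(hazard_ratios):
    """Combine per-factor hazard ratios on the log scale, as implied by
    the multiplicative Cox proportional-hazards model. Assumes the
    factors act independently (no interactions)."""
    return exp(sum(log(h) for h in hazard_ratios))

# Hypothetical profile using HRs reported above: DVT history (1.7),
# legs elevated >= 30 min/day (0.33), walking >= 3 h/day (0.66).
relative_hazard = combined_hazard_ratio([1.7, 0.33, 0.66])
```

For this profile the protective behaviours more than offset the DVT history, giving a net recurrence hazard well below baseline.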