669 results for setting time
Abstract:
Objective: To estimate the time spent by researchers preparing grant proposals, and to examine whether spending more time increases the chances of success. Design: Observational study. Setting: The National Health and Medical Research Council (NHMRC) of Australia. Participants: Researchers who submitted one or more NHMRC Project Grant proposals in March 2012. Main outcome measures: Total researcher time spent preparing proposals; funding success as predicted by the time spent. Results: The NHMRC received 3727 proposals, of which 3570 were reviewed and 731 (21%) were funded. Among our 285 participants who submitted 632 proposals, 21% were successful. Preparing a new proposal took an average of 38 working days of researcher time and a resubmitted proposal took 28 working days, an overall average of 34 days per proposal. An estimated 550 working years of researchers' time (95% CI 513 to 589) was spent preparing the 3727 proposals, which translates into annual salary costs of AU$66 million. More time spent preparing a proposal did not increase the chances of success for the lead researcher (prevalence ratio (PR) of success per 10-day increase = 0.91, 95% credible interval 0.78 to 1.04) or for other researchers (PR = 0.89, 95% CI 0.67 to 1.17). Conclusions: Considerable time is spent preparing NHMRC Project Grant proposals. As success rates are historically 20–25%, much of this time has no immediate benefit to either the researcher or society, and there are large opportunity costs in lost research output. The application process could be shortened so that only information relevant for peer review, not administration, is collected. This would have little impact on the quality of peer review, and the time saved could be reinvested into research.
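The headline arithmetic in this abstract can be checked with a short sketch. The working-days-per-year conversion is an assumption back-solved from the reported totals, not a figure stated in the abstract:

```python
# Back-of-envelope check of the abstract's totals. WORKING_DAYS_PER_YEAR is an
# assumption chosen to match the reported figures, not a value from the paper.
proposals = 3727
avg_days_per_proposal = 34        # overall average reported in the abstract
WORKING_DAYS_PER_YEAR = 230       # assumed conversion from days to working years

total_days = proposals * avg_days_per_proposal            # 126,718 researcher-days
working_years = total_days / WORKING_DAYS_PER_YEAR
print(round(working_years))       # ~551, consistent with the reported 550 (95% CI 513 to 589)

# Implied average annual salary cost, from the reported AU$66 million total
print(round(66_000_000 / working_years))  # roughly AU$120,000 per working year
```

The totals reproduce to within rounding, which suggests a conversion factor near 230 working days per researcher-year underlies the published estimate.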
Abstract:
Introduction: Nursing in the cardiac catheterisation laboratory (CCL) varies globally in terms of scope and deployment. In the US, all allied staff are cross-trained into all CCL roles. Australia and New Zealand, by contrast, have legislative frameworks that reserve specific functions for nurses. Yet the nursing role within the CCL is poorly researched and defined. Aim: This study sought to gain a deeper understanding of the perceived role of CCL nurses in Australia and New Zealand. Method: A descriptive qualitative study using semi-structured in-depth interviews was conducted. A cross-sectional sample of 23 senior clinical nurses or nursing managers representing 16 CCLs across Australia and New Zealand was obtained. Data were digitally recorded and transcribed verbatim prior to analysis by three researchers. Results: Five major themes emerged from the data. These themes were: 1. The CCL is a unique environment; 2. CCL nursing is a unique and advanced cardiac nursing discipline; 3. The recruitment attributes for CCL nurses are advanced; 4. Education needs to be standardised; and 5. The evidence to support practice is poor. Discussion: The CCL environment is a dynamic, deeply interdisciplinary setting, with CCL nursing seen as a unique advanced practice role. Participants expressed that the time has come for a scope of practice, educational standards, guidelines and competencies. Conclusion: Nursing in the CCL is an advanced practice role working within a complex interdisciplinary environment. Further work is required to define the role of CCL nurses, together with the evidence base for their practice.
Abstract:
Several fringing coral reefs in Moreton Bay, Southeast Queensland, some 300 km south of the Great Barrier Reef (GBR), are set in a relatively high latitude, estuarine environment that is considered marginal for coral growth. Previous work indicated that these marginal reefs, as with many fringing reefs of the inner GBR, ceased accreting in the mid-Holocene. This research presents, for the first time, data from the subsurface profile of the mid-Holocene fossil reef at Wellington Point, comprising U/Th dates of in situ and framework corals and trace element analysis of the age-constrained carbonate fragments. Based on trace element proxies, the palaeo-water quality during reef accretion was reconstructed. Results demonstrate that the reef initiated more than 7,000 yr BP during the post-glacial transgression, and initiation progressed to the west as sea level rose. In situ micro-atolls indicate that sea level was at least 1 m above present mean sea level by 6,680 years ago. The reef remained in "catch-up" mode, with a seaward-sloping upper surface, until it stopped aggrading abruptly at ca. 6,000 yr BP; no lateral progradation occurred. Changes in sediment composition encountered in the cores suggest that after the laterite substrate was covered by the reef, most of the sediment was produced by the carbonate factory with minimal terrigenous influence. Rare earth element, Y and Ba proxies indicate that water quality during reef accretion was similar to oceanic waters, considered suitable for coral growth. A slight decline in water quality, inferred from increased Ba in the later stages of growth, may be related to increased riverine input and partial closing of the bay due to tidal delta progradation, climatic change and/or a slight sea level fall.
The age data suggest that termination of reef growth coincided with a slight lowering of sea level, activation of ENSO and a consequent increase in seasonality, lowering of temperatures and constriction of oceanic flushing. At the cessation of reef accretion, the environmental conditions in western Moreton Bay were changing from open marine to estuarine. The living coral community appears to be similar to the fossil community, but without the branching Acropora spp. that were more common in the fossil reef. In this marginal setting, coral growth periods do not always correspond to periods of reef accretion due to insufficient coral abundance. Owing to several environmental constraints, modern coral growth is insufficient for reef growth. Based on these findings, Moreton Bay may be unsuitable as a long-term coral refuge for most species currently living in the GBR.
Abstract:
BACKGROUND: The prevalence of protein-energy malnutrition in older adults is reported to be as high as 60% and is associated with poor health outcomes. Inadequate feeding assistance and mealtime interruptions may contribute to malnutrition and poor nutritional intake during hospitalisation. Despite being widely implemented in practice in the United Kingdom and increasingly in Australia, there have been few studies examining the impact of strategies such as Protected Mealtimes and dedicated feeding assistant roles on nutritional outcomes of elderly inpatients. AIMS: The aim of this research was to implement and compare three system-level interventions designed to specifically address mealtime barriers and improve energy intakes of medical inpatients aged ≥65 years. This research also aimed to evaluate the sustainability of any changes to mealtime routines six months post-intervention and to gain an understanding of staff perceptions of the post-intervention mealtime experience. METHODS: Three mealtime assistance interventions were implemented in three medical wards at Royal Brisbane and Women's Hospital: AIN-only: Additional assistant-in-nursing (AIN) with dedicated nutrition role. PM-only: Multidisciplinary approach to meals, including Protected Mealtimes. PM+AIN: Combined intervention: AIN + multidisciplinary approach to meals. An action research approach was used to carefully design and implement the three interventions in partnership with ward staff and managers. Significant time was spent in consultation with staff throughout the implementation period to facilitate ownership of the interventions and increase likelihood of successful implementation. A pre-post design was used to compare the implementation and nutritional outcomes of each intervention to a pre-intervention group. 
Using the same wards, eligible participants (medical inpatients aged ≥65 years) were recruited to the pre-intervention group between November 2007 and March 2008 and to the intervention groups between January and June 2009. The primary nutritional outcome was daily energy and protein intake, which was determined by visually estimating plate waste at each meal and mid-meal on Day 4 of admission. Energy and protein intakes were compared between the pre- and post-intervention groups. Data were collected on a range of covariates (demographics, nutritional status and known risk factors for poor food intake), which allowed for multivariate analysis of the impact of the interventions on nutritional intake. The provision of mealtime assistance to participants and the activities of ward staff (including mealtime interruptions) were observed in the pre-intervention and intervention groups, with staff observations repeated six months post-intervention. Focus groups were conducted with nursing and allied health staff in June 2009 to explore their attitudes and behaviours in response to the three mealtime interventions. These focus group discussions were analysed using thematic analysis. RESULTS: A total of 254 participants were recruited to the study (pre-intervention: n=115, AIN-only: n=58, PM-only: n=39, PM+AIN: n=42). Participants had a mean age of 80 years (SD 8), and 40% (n=101) were malnourished on hospital admission, 50% (n=108) had anorexia and 38% (n=97) required some assistance at mealtimes. Occasions of mealtime assistance significantly increased in all interventions (p<0.01). However, no change was seen in mealtime interruptions. No significant difference was seen in mean total energy and protein intake between the pre-intervention and intervention groups.
However, when total kilojoule intake was compared with estimated requirements at the individual level, participants in the intervention groups were more likely to achieve adequate energy intake (OR=3.4, p=0.01), with no difference noted between interventions (p=0.29). Despite small improvements in nutritional adequacy, the majority of participants in the intervention groups (76%, n=103) had inadequate energy intakes to meet their estimated energy requirements. Patients with cognitive impairment or feeding dependency appeared to gain substantial benefit from mealtime assistance interventions. The increase in occasions of mealtime assistance by nursing staff during the intervention period was maintained six months post-intervention. Staff focus groups highlighted the importance of clearly designating and defining mealtime responsibilities in order to provide adequate mealtime care. While the purpose of the dedicated feeding assistant was to increase levels of mealtime assistance, staff indicated that responsibility for mealtime duties may have merely shifted from nursing staff to the assistant. Implementing the multidisciplinary interventions empowered nursing staff to "protect" the mealtime from external interruptions, but further work is required to empower nurses to prioritise mealtime activities within their own work schedules. Staff reported an increase in the profile of nutritional care on all wards, with additional non-nutritional benefits noted, including improved mobility and functional independence and better identification of swallowing difficulties. IMPLICATIONS: The PhD research provides clinicians with practical strategies to immediately introduce change to deliver better mealtime care in the hospital setting, and, as such, has initiated local and state-wide roll-out of mealtime assistance programs.
Improved nutritional intake of elderly inpatients was observed; however, given the modest effect size and decreasing lengths of hospital stay, better nutritional outcomes may be achieved by targeting the hospital-to-home transition period. Findings from this study suggest that mealtime assistance interventions for elderly inpatients with cognitive impairment and/or functional dependency show promise.
Abstract:
This paper describes the theory and practice of stable haptic teleoperation of a flying vehicle. It extends the passivity-based control framework for haptic teleoperation of aerial vehicles to the longest intercontinental setting to date, which presents great challenges. The practicality of the control architecture has been shown in maneuvering and obstacle-avoidance tasks over the internet in the presence of significant time-varying delays and packet losses. Experimental results are presented for teleoperation of a slave quadrotor in Australia from a master station in the Netherlands. The results show that the remote operator is able to safely maneuver the flying vehicle through a structure using haptic feedback of the state of the slave and the perceived obstacles.
Abstract:
Basing signature schemes on strong lattice problems has been a long-standing open issue. Today, two families of lattice-based signature schemes are known: those based on the hash-and-sign construction of Gentry et al., and Lyubashevsky’s schemes, which are based on the Fiat-Shamir framework. In this paper we show for the first time how to adapt the schemes of Lyubashevsky to the ring signature setting. In particular, we transform the scheme of ASIACRYPT 2009 into a ring signature scheme that provides strong security properties in the random oracle model. Anonymity is ensured in the sense that signatures of different users are within negligible statistical distance even under full key exposure. In fact, the scheme satisfies a notion stronger than the classical full key exposure setting: even if the keypair of the signing user is adversarially chosen, the statistical distance between signatures of different users remains negligible. Considering unforgeability, the best lattice-based ring signature schemes provide either unforgeability against arbitrary chosen-subring attacks or against insider corruption in log-sized rings. In this paper we present two variants of our scheme. In the basic one, unforgeability is ensured in those two settings. By increasing signature and key sizes by a factor k (typically 80–100), we provide a variant in which unforgeability is ensured against insider corruption attacks for arbitrary rings. The technique used is quite general and can be adapted to other existing schemes.
Abstract:
Technological advances have led to an influx of affordable hardware that supports sensing, computation and communication. This hardware is increasingly deployed in public and private spaces, tracking and aggregating a wealth of real-time environmental data. Although these technologies are the focus of several research areas, there is a lack of research dealing with the problem of making these capabilities accessible to everyday users. This thesis represents a first step towards developing systems that will allow users to leverage the available infrastructure and create custom tailored solutions. It explores how this notion can be utilized in the context of energy monitoring to improve conventional approaches. The project adopted a user-centered design process to inform the development of a flexible system for real-time data stream composition and visualization. This system features an extensible architecture and defines a unified API for heterogeneous data streams. Rather than displaying the data in a predetermined fashion, it makes this information available as building blocks that can be combined and shared. It is based on the insight that individual users have diverse information needs and presentation preferences. Therefore, it allows users to compose rich information displays, incorporating personally relevant data from an extensive information ecosystem. The prototype was evaluated in an exploratory study to observe its natural use in a real-world setting, gathering empirical usage statistics and conducting semi-structured interviews. The results show that a high degree of customization does not guarantee sustained usage. Other factors were identified, yielding recommendations for increasing the impact on energy consumption.
Abstract:
Introduction Radiographer abnormality detection systems that highlight abnormalities on trauma radiographs (‘red dot’ system) have been operating for more than 30 years. Recently, a number of pitfalls have been identified. These limitations initiated the evolution of a radiographer commenting system, whereby a radiographer provides a brief description of abnormalities identified in emergency healthcare settings. This study investigated radiographers' participation in abnormality detection systems, their perceptions of benefits, barriers and enablers to radiographer commenting, and perceptions of potential radiographer image interpretation services for emergency settings. Methods A cross-sectional survey was implemented. Participants included radiographers from four metropolitan hospitals in Queensland, Australia. Conventional descriptive statistics, histograms and thematic analysis were undertaken. Results Seventy-three surveys were completed and included in the analysis (68% response rate); 30 (41%) of respondents reported participating in abnormality detection in 20% or less of examinations, and 26 (36%) reported participating in 80% or more of examinations. Five overarching perceived benefits of radiographer commenting were identified: assisting multidisciplinary teams, patient care, radiographer ability, professional benefits and quality of imaging. Frequently reported perceived barriers included ‘difficulty accessing image interpretation education’, ‘lack of time’ and ‘low confidence in interpreting radiographs’. Perceived enablers included ‘access to image interpretation education’ and ‘support from radiologist colleagues’. Conclusions A range of factors are likely to contribute to the successful implementation of radiographer commenting in addition to abnormality detection in emergency settings.
Effective image interpretation education that radiographers can feasibly complete would likely prove valuable in preparing radiographers for participation in abnormality detection and commenting systems in emergency settings.
Abstract:
Background Early feeding practices lay the foundation for children’s eating habits and weight gain. Questionnaires are available to assess parental feeding but overlapping and inconsistent items, subscales and terminology limit conceptual clarity and between study comparisons. Our aim was to consolidate a range of existing items into a parsimonious and conceptually robust questionnaire for assessing feeding practices with very young children (<3 years). Methods Data were from 462 mothers and children (age 21–27 months) from the NOURISH trial. Items from five questionnaires and two study-specific items were submitted to a priori item selection, allocation and verification, before theoretically-derived factors were tested using Confirmatory Factor Analysis. Construct validity of the new factors was examined by correlating these with child eating behaviours and weight. Results Following expert review 10 factors were specified. Of these, 9 factors (40 items) showed acceptable model fit and internal reliability (Cronbach’s α: 0.61-0.89). Four factors reflected non-responsive feeding practices: ‘Distrust in Appetite’, ‘Reward for Behaviour’, ‘Reward for Eating’, and ‘Persuasive Feeding’. Five factors reflected structure of the meal environment and limits: ‘Structured Meal Setting’, ‘Structured Meal Timing’, ‘Family Meal Setting’, ‘Overt Restriction’ and ‘Covert Restriction’. Feeding practices generally showed the expected pattern of associations with child eating behaviours but none with weight. Conclusion The Feeding Practices and Structure Questionnaire (FPSQ) provides a new reliable and valid measure of parental feeding practices, specifically maternal responsiveness to children’s hunger/satiety signals facilitated by routine and structure in feeding. Further validation in more diverse samples is required.
Abstract:
Background Many countries are scaling up malaria interventions towards elimination. This transition changes demands on malaria diagnostics from diagnosing ill patients to detecting parasites in all carriers, including asymptomatic infections and infections with low parasite densities. Detection methods suitable to local malaria epidemiology must be selected prior to transitioning a malaria control programme to elimination. A baseline malaria survey conducted in Temotu Province, Solomon Islands in late 2008, as the first step in a provincial malaria elimination programme, provided malaria epidemiology data and an opportunity to assess how well different diagnostic methods performed in this setting. Methods During the survey, 9,491 blood samples were collected and examined by microscopy for Plasmodium species and density, with a subset also examined by polymerase chain reaction (PCR) and rapid diagnostic tests (RDTs). The performance of these diagnostic methods was compared. Results A total of 256 samples were positive by microscopy, giving a point prevalence of 2.7%. The species distribution was 17.5% Plasmodium falciparum and 82.4% Plasmodium vivax. In this low transmission setting, only 17.8% of the P. falciparum and 2.9% of the P. vivax infected subjects were febrile (≥38°C) at the time of the survey. A significant proportion of infections detected by microscopy, 40% and 65.6% for P. falciparum and P. vivax respectively, had parasite density below 100/μL. There was an age correlation for the proportion of parasite density below 100/μL for P. vivax infections, but not for P. falciparum infections. PCR detected substantially more infections than microscopy (point prevalence of 8.71%), indicating a large number of subjects had sub-microscopic parasitemia. The concordance between PCR and microscopy in detecting single species was greater for P. vivax (135/162) than for P. falciparum (36/118). The malaria RDT detected the 12 microscopy- and PCR-positive P. falciparum infections, but failed to detect 12/13 microscopy- and PCR-positive P. vivax infections. Conclusion Asymptomatic malaria infections and infections with low and sub-microscopic parasite densities are highly prevalent in Temotu Province, where malaria transmission is low. This presents a challenge for elimination, since a large proportion of the parasite reservoir will not be detected by standard active and passive case detection. Therefore, effective mass screening and treatment campaigns will most likely need more sensitive assays, such as a field-deployable molecular-based assay.
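A minimal sketch reproducing the prevalence and concordance figures reported above; all input counts are taken directly from the abstract:

```python
# Recompute the survey's headline rates from the counts given in the abstract.
samples = 9491            # blood samples examined by microscopy
micro_positive = 256      # microscopy-positive samples

micro_prevalence = 100 * micro_positive / samples
print(f"microscopy point prevalence: {micro_prevalence:.1f}%")   # 2.7%, as reported

# Single-species concordance between PCR and microscopy
pv_concordant, pv_total = 135, 162   # P. vivax
pf_concordant, pf_total = 36, 118    # P. falciparum
print(f"P. vivax concordance: {100 * pv_concordant / pv_total:.0f}%")
print(f"P. falciparum concordance: {100 * pf_concordant / pf_total:.0f}%")
```

The recomputed concordance (roughly 83% for P. vivax versus roughly 31% for P. falciparum) makes the species gap described in the abstract explicit.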
Abstract:
OBJECTIVES: To determine risk factors for herpes simplex 2 (HSV2) infection in women in a polygynous rural Gambian population. METHODS: Data from women who participated in a cross-sectional survey of reproductive health were matched to their own and, for women who had been or were married (ever-married), their spouses' data collected in a cross-sectional survey of fertility interests, including information on marital histories. RESULTS: Data were available on 150 never-married and 525 ever-married women. HSV2 prevalence was 16% amongst never-married women and 36% amongst ever-married women. For ever-married women, their own personal characteristics (age, ethnicity and genital cutting status) and events from their husbands' marriage history were important determinants of HSV2 infection. Women whose husbands married for the first time over age 35 were at greater risk than women whose husbands married by age 24 [odds ratio (OR) 2.72, 95% confidence interval (CI) 1.20-6.10]. Women whose husband reported interest in a new marriage were more likely to be HSV2 positive (OR 1.91, 95% CI 1.18-3.09). Women whose husbands were currently monogamous but had had previous marriages (OR 2.76, 95% CI 1.30-5.88) and women in currently polygynous marriages (OR 2.88, 95% CI 1.66-5.01) were almost three times as likely to be HSV2 positive as women who were their husband's only wife ever. CONCLUSION: Much transmission of HSV2 in this setting occurs within marriage, where opportunity for personal protection is limited. High levels of transmission within marriage may undermine the impact of sexual behaviour change programmes aiming to reduce HSV2 and HIV incidence and complicate their evaluation.
Abstract:
Objective: To measure alcohol-related harms to the health of young people presenting to emergency departments (EDs) of Gold Coast public hospitals before and after the increase in the federal government "alcopops" tax in 2008. Design, setting and participants: Interrupted time series analysis over 5 years (28 April 2005 to 27 April 2010) of 15-29-year-olds presenting to EDs with alcohol-related harms compared with presentations of selected control groups. Main outcome measures: Proportion of 15-29-year-olds presenting to EDs with alcohol-related harms compared with (i) 30-49-year-olds with alcohol-related harms, (ii) 15-29-year-olds with asthma or appendicitis, and (iii) 15-29-year-olds with any non-alcohol and non-injury related ED presentation. Results: Over a third of 15-29-year-olds presented to ED with alcohol-related conditions, as opposed to around a quarter for all other age groups. There was no significant decrease in alcohol-related ED presentations of 15-29-year-olds compared with any of the control groups after the increase in the tax. We found similar results for males and females, narrow and broad definitions of alcohol-related harms, under-19s, and visitors to and residents of the Gold Coast. Conclusions: The increase in the tax on alcopops was not associated with any reduction in alcohol-related harms in this population in a unique tourist and holiday region. A more comprehensive approach to reducing alcohol harms in young people is needed.
Abstract:
The factors influencing both teacher and student readiness to use Facebook as part of their teaching and learning in a vocational educational institution were studied through a qualitative case study. Data included teacher and student questionnaires and focus group interviews. While it was found that the students demonstrated readiness and willingness to incorporate Facebook into their current learning, the teachers were more reluctant. Different perceptions around control of learning and time, and concerns around compartmentalisation of learning and social lives, would need to be addressed before Facebook could be used as a formal learner engagement strategy.
Abstract:
The foundation of mental health nursing has historically been grounded in an interpersonal, person-centred process of health care, yet recent evidence suggests that the interactional work of mental health nursing is being eroded. Literature emphasizes the importance of person-centred care on consumer outcomes, a model reliant upon the intimate engagement of nurses and consumers. Yet, the arrival of medical interventions in psychiatry has diverted nursing work from the therapeutic nursing role to task-based roles delegated by medicine, distancing nurses from consumers. This study used work sampling methodology to observe the proportion of time nurses working in an inpatient mental health setting spend on direct care, indirect care and service-related activities. Nurses spent 32% of their time in direct care, 52% in indirect care and 17% in service-related activities. Mental health nurses need to re-establish their therapeutic availability to maximize consumer experiences and outcomes.
Abstract:
This paper investigates quality of service (QoS) and resource productivity implications of transit route passenger loading and travel time. It highlights the value of occupancy load factor as a direct passenger comfort QoS measure. Automatic Fare Collection (AFC) data for a premium radial bus route in Brisbane, Australia, is used to investigate time series correlation between occupancy load factor and passenger average travel time. Correlation is strong across the entire span of service in both directions. Passengers tend to be making longer, peak-direction commuter trips under significantly less comfortable conditions than off-peak. The Transit Capacity and Quality of Service Manual uses segment-based load factor as a measure of onboard loading comfort QoS. This paper provides additional insight into QoS by relating the two route-based dimensions of occupancy load factor and passenger average travel time together in a two-dimensional format, from both the passenger's and the operator's perspectives. Future research will apply Value of Time to QoS measurement, reflecting perceived passenger comfort through crowding and average time spent onboard. This would also assist in transit service quality econometric modeling. The methodology can be readily applied in a practical setting where AFC data for fixed-schedule routes is available. The study outcomes also provide valuable research and development directions.