820 results for seedling emergency
Abstract:
- Objective To explore the potential for using a basic text search of routine emergency department data to identify product-related injury in infants and to compare the patterns from routine ED data and specialised injury surveillance data. - Methods Data were sourced from the Emergency Department Information System (EDIS) and the Queensland Injury Surveillance Unit (QISU) for all injured infants between 2009 and 2011. A basic text search was developed to identify the top five infant products in QISU. Sensitivity, specificity, and positive predictive value were calculated and a refined search was used with EDIS. Results were manually reviewed to assess validity. Descriptive analysis was conducted to examine patterns between datasets. - Results The basic text search for all products showed high sensitivity and specificity, and most searches showed high positive predictive value. EDIS patterns were similar to QISU patterns with strikingly similar month-of-age injury peaks, admission proportions and types of injuries. - Conclusions This study demonstrated a capacity to identify a sample of valid cases of product-related injuries for specified products using simple text searching of routine ED data. - Implications As the capacity for large datasets grows and the capability to reliably mine text improves, opportunities for expanded sources of injury surveillance data increase. This will ultimately assist stakeholders such as consumer product safety regulators and child safety advocates to appropriately target prevention initiatives.
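To make the reported validation metrics concrete: sensitivity, specificity and positive predictive value all derive from a 2×2 confusion matrix comparing the text-search flag against the QISU-coded reference. A minimal sketch follows; the counts are hypothetical, not figures from the study.

```python
# Sensitivity, specificity and positive predictive value from a 2x2
# confusion matrix: text-search hit vs. QISU-coded product involvement.
# The counts below are illustrative only, not figures from the study.
true_pos = 180   # search hit, QISU codes the product
false_neg = 20   # search miss, QISU codes the product
false_pos = 15   # search hit, QISU does not code the product
true_neg = 785   # search miss, QISU does not code the product

sensitivity = true_pos / (true_pos + false_neg)
specificity = true_neg / (true_neg + false_pos)
ppv = true_pos / (true_pos + false_pos)

print(f"sensitivity={sensitivity:.2f} specificity={specificity:.2f} PPV={ppv:.2f}")
```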
Abstract:
Social media platforms such as Facebook and Twitter are now widely recognised as playing an increasingly important role in the dissemination of information during crisis events. They are used by emergency management organisations as well as by the public to share information and advice. However, the official use of social media for crisis communication within emergency management organisations is still relatively new and ad hoc, rather than being systematically embedded within or effectively coordinated across agencies. This policy report suggests a more effectively coordinated approach to leverage social media use, involving stronger networking between social media staff within emergency management organisations. This could be realised by establishing a national network of social media practitioners managed by the Australia-New Zealand Emergency Management Committee (ANZEMC), reinforced by a Federal government task force that promotes further policy initiatives in this space.
Abstract:
One of the major impediments to the use of UAVs in civilian environments is the need to replicate some of the functionality of safe manned aircraft operations. One critical aspect is emergency landing. Once the possible landing sites have been rated, a decision on the most suitable site to land is required. This is a multi-criteria decision making (MCDM) problem, which needs to take various factors into account when selecting a landing site. This report summarises relevant literature on MCDM in the context of emergency forced landing and proposes and compares two algorithms for this task.
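The report's two algorithms are not named in this abstract; as an illustration of the MCDM formulation, a simple weighted-sum ranking of candidate landing sites is sketched below. The criteria, weights and scores are assumptions, not the report's.

```python
# Weighted-sum multi-criteria ranking of candidate landing sites.
# Criteria, weights and site scores are hypothetical; the report's own
# algorithms and criteria may differ.
criteria = ["surface_quality", "distance", "obstacle_clearance", "population_risk"]
weights = {"surface_quality": 0.35, "distance": 0.20,
           "obstacle_clearance": 0.25, "population_risk": 0.20}

# Each candidate site is scored 0-1 per criterion (1 = best).
sites = {
    "site_A": {"surface_quality": 0.9, "distance": 0.4,
               "obstacle_clearance": 0.7, "population_risk": 0.8},
    "site_B": {"surface_quality": 0.6, "distance": 0.9,
               "obstacle_clearance": 0.5, "population_risk": 0.9},
}

def weighted_score(scores):
    return sum(weights[c] * scores[c] for c in criteria)

for name, scores in sites.items():
    print(name, round(weighted_score(scores), 3))
print("selected:", max(sites, key=lambda s: weighted_score(sites[s])))
```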
Abstract:
Emergency Medical Dispatchers (EMDs) are charged with taking the calls of those who ring the national emergency number for urgent medical assistance, dispatching paramedical crews, and providing as much assistance as can be offered remotely until paramedics arrive. In a job role filled with vicarious trauma, emergency situations, pressure, abuse, grief and loss, EMDs are often challenged in maintaining their mental health. The seemingly senseless death of a teenager who commits suicide, the devastating loss of a baby to Sudden Infant Death Syndrome, lives lost through natural disasters, and multiple vehicle fatalities are only a few of the types of experiences EMDs are faced with in the course of their work. However, amongst the horror are positive stories, such as coaching a caller through the birth of a baby or saving a life in jeopardy from heart failure. EMDs need to cope with the daily challenges of the role, make sense of their work, and create meaning in order to have a fulfilled and sustainable career. Although some people in this work struggle greatly to withstand the impacts of vicarious trauma, there are also stories of personal growth. In this chapter we use a case study to explore how meaning is made for those who are an auditory witness to a continual flux of trauma for others, and how the traumatic experiences EMDs bear witness to can also be a catalyst for posttraumatic growth.
Abstract:
- Objective Ambulance personnel provide emergency medical services to the community, often attending to highly challenging and traumatic scenes in complex and chaotic circumstances. Currently the assessment of predictors of psychological well-being remains limited. The current study investigated whether workplace belongingness was significant in predicting psychological distress as well as the presence of resilience in ambulance personnel whilst controlling for more routinely examined factors. - Method Australian ambulance officers (N = 740) completed a survey battery including the Kessler 10 (Kessler & Mroczek, 1994), Brief Resilience Scale (Smith et al., 2008) and Psychological Sense of Organisational Membership (Cockshaw & Shochet, 2010) scale. - Results Controlling for more commonly examined factors such as severity of trauma exposure and length of service, hierarchical multiple regression analyses demonstrated that workplace belongingness was significantly associated with reduced distress levels and enhanced resilience levels. - Conclusions Results suggest that strategies to enhance a sense of workplace belongingness in emergency service organisations could promote the well-being of emergency workers despite routine exposure to potentially traumatic events.
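A minimal sketch of the hierarchical regression design described above, using statsmodels on synthetic data: the control variables enter at step 1 and workplace belongingness at step 2, with the change in R² as the quantity of interest. The column names (e.g. distress_k10, belongingness) and data are illustrative, not the study's variables or results.

```python
# Hierarchical multiple regression: controls entered at step 1,
# workplace belongingness added at step 2; the increase in R-squared
# is the quantity of interest. Data and column names are illustrative.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 740
df = pd.DataFrame({
    "trauma_exposure": rng.normal(size=n),
    "length_of_service": rng.normal(size=n),
    "belongingness": rng.normal(size=n),
})
df["distress_k10"] = 1.0 * df["trauma_exposure"] - 0.8 * df["belongingness"] + rng.normal(size=n)

step1 = smf.ols("distress_k10 ~ trauma_exposure + length_of_service", data=df).fit()
step2 = smf.ols("distress_k10 ~ trauma_exposure + length_of_service + belongingness", data=df).fit()
print("R2 step 1:", round(step1.rsquared, 3))
print("R2 step 2:", round(step2.rsquared, 3))
print("Delta R2 :", round(step2.rsquared - step1.rsquared, 3))
```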
Abstract:
We have tested the efficacy of putative microsatellite simple sequence repeat (SSR) markers, previously identified in a 2-49 (Gluyas Early/Gala) × Janz doubled haploid wheat (Triticum aestivum) population, as being linked to partial seedling resistance to crown rot disease caused by Fusarium pseudograminearum. The quantitative trait loci (QTLs) delineated by these markers have been tested for linkage to resistance in an independent Gluyas Early × Janz doubled haploid population. The presence of a major QTL on chromosome 1DL (QCr.usq-1D1) and a minor QTL on chromosome 2BS (QCr.usq-2B1) was confirmed. However, a putative minor QTL on chromosome 2A was not confirmed. The QTL on 1D was inherited from Gluyas Early, a direct parent of 2-49, whereas the 2B QTL was inherited from Janz. Three other putative QTLs identified in 2-49 × Janz (on 1AL, 4BL, and 7BS) were inherited by 2-49 from Gala and could not be confirmed in this study. The screening of SSR markers on a small sample of elite wheat genotypes indicated that not all of the most tightly linked SSR markers flanking the major QTLs on 1D and 1A were polymorphic in all backgrounds, indicating the need for additional flanking markers when backcrossing into some elite pedigrees. Comparison of SSR haplotypes with those of other genotypes exhibiting partial crown rot resistance suggests that additional, novel sources of crown rot resistance are available.
Abstract:
Root system characteristics are of fundamental importance to soil exploration and below-ground resource acquisition. Root architectural traits determine the in situ space-filling properties of a root system or root architecture. The growth angle of root axes is a principal component of root system architecture that has been strongly associated with acquisition efficiency in many crop species. The aims of this study were to examine the extent of genotypic variability for the growth angle and number of seminal roots in 27 current Australian and 3 CIMMYT wheat (Triticum aestivum L.) genotypes, and to quantify using fractal analysis the root system architecture of a subset of wheat genotypes contrasting in drought tolerance and seminal root characteristics. The growth angle and number of seminal roots showed significant genotypic variation among the wheat genotypes, with values ranging from 36° to 56° and from 3 to 5 per plant, respectively. Cluster analysis of wheat genotypes based on similarity in their seminal root characteristics resulted in four groups. The group composition reflected to some extent the genetic background and environmental adaptation of genotypes. Wheat cultivars grown widely in the Mediterranean environments of southern and western Australia generally had a wider growth angle and a lower number of seminal axes. In contrast, cultivars with superior performance on deep clay soils in the northern cropping region, such as SeriM82, Baxter, Babax, and Dharwar Dry, exhibited a narrower angle of seminal axes. The wheat genotypes also showed significant variation in fractal dimension (D). The D values calculated for the individual segments of each root system suggested that, compared to the standard cultivar Hartog, the drought-tolerant genotypes adapted to the northern region tended to distribute relatively more roots in the soil volume directly underneath the plant. These findings suggest that wheat root system architecture is closely linked to the angle of seminal root axes at the seedling stage. The implications of genotypic variation in the seminal root characteristics and fractal dimension for specific adaptation to drought environment types are discussed with emphasis on the possible exploitation of root architectural traits in breeding for improved wheat cultivars for water-limited environments.
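Fractal dimension (D) of a root system is commonly estimated by box counting: count the boxes of side s that intersect the root and take the slope of log N(s) against log(1/s). The sketch below applies this to a synthetic binary image; it illustrates the general method, not the study's own fractal-analysis procedure or data.

```python
# Box-counting estimate of fractal dimension D for a binary root image.
# The image here is a synthetic placeholder.
import numpy as np

def box_count_dimension(binary_img, box_sizes=(2, 4, 8, 16, 32)):
    counts = []
    for s in box_sizes:
        # Trim so the image divides evenly into s x s boxes.
        h, w = (binary_img.shape[0] // s) * s, (binary_img.shape[1] // s) * s
        img = binary_img[:h, :w]
        blocks = img.reshape(h // s, s, w // s, s)
        counts.append(blocks.any(axis=(1, 3)).sum())
    # Slope of log N(s) vs log(1/s) gives D.
    slope, _ = np.polyfit(np.log(1.0 / np.array(box_sizes)), np.log(counts), 1)
    return slope

# Synthetic diagonal "root" for demonstration; D should be close to 1.
img = np.zeros((128, 128), dtype=bool)
for i in range(128):
    img[i, i] = True
print("estimated D:", round(box_count_dimension(img), 2))
```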
Abstract:
Seed persistence is poorly quantified for invasive plants of subtropical and tropical environments and Lantana camara, one of the world's worst weeds, is no exception. We investigated germination, seedling emergence, and seed survival of two lantana biotypes (Pink and pink-edged red [PER]) in southeastern Queensland, Australia. Controlled experiments were undertaken in 2002 and repeated in 2004, with treatments comprising two differing environmental regimes (irrigated and natural rainfall) and sowing depths (0 and 2 cm). Seed survival and seedling emergence were significantly affected by all factors (time, biotype, environment, sowing depth, and cohort) (P < 0.001). Seed dormancy varied with treatment (environment, sowing depth, biotype, and cohort) (P < 0.001), but declined rapidly after 6 mo. Significant differential responses by the two biotypes to sowing depth and environment were detected for both seed survival and seedling emergence (P < 0.001). Seed mass was consistently lower in the PER biotype at the population level (P < 0.001), but this variation did not adequately explain the differential responses. Moreover, under natural rainfall the magnitude of the biotype effect was unlikely to result in ecologically significant differences. Seed survival after 36 mo under natural rainfall ranged from 6.8 to 21.3%. Best-fit regression analysis of the decline in seed survival over time yielded an exponential decay model with a lower asymptote approaching −0.38 (% seed survival = [(55 − (−0.38)) × e^(k × t)] + (−0.38); R² = 88.5%; 9 df). Environmental conditions and burial affected the slope parameter or k value significantly (P < 0.01). Seed survival projections from the model were greatest for buried seeds under natural rainfall (11 yr) and least under irrigation (3 yr). Experimental data and model projections suggest that lantana has a persistent seed bank and this should be considered in management programs, particularly those aimed at eradication.
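The decay model quoted above has the general form survival(t) = (a − c)·e^(k·t) + c, with a ≈ 55, lower asymptote c ≈ −0.38, and a rate constant k that varied with burial and environment. A minimal curve-fitting sketch with scipy is shown below; the time and survival values are hypothetical, and only the model form follows the abstract.

```python
# Fit an exponential decay with a lower asymptote,
#   survival(t) = (a - c) * exp(k * t) + c,
# to seed-survival observations. The data below are made up for
# illustration; only the model form follows the abstract.
import numpy as np
from scipy.optimize import curve_fit

def decay(t, a, c, k):
    return (a - c) * np.exp(k * t) + c

t_months = np.array([0, 6, 12, 18, 24, 30, 36], dtype=float)
survival = np.array([55, 38, 27, 19, 14, 11, 9], dtype=float)  # % remaining

params, _ = curve_fit(decay, t_months, survival, p0=(55.0, 0.0, -0.05))
a, c, k = params
print(f"a={a:.1f}  lower asymptote c={c:.2f}  k={k:.4f}")
```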
Abstract:
This research aims to develop an Integrated Lean Six Sigma approach to investigate and resolve the patient flow problems in hospital emergency departments. It was proposed that the voice of the customer and the voice of the process should be considered simultaneously to investigate the current process of patient flow. Statistical analysis, visual process mapping with A3 problem solving sheet, and cause and effect diagrams have been used to identify the major patient flow issues. This research found that engaged frontline workers, long-term leadership obligation, an understanding of patients' requirements and the implementation of a systematic integration of lean strategies could continuously improve patient flow, health care service and growth in the emergency departments.
Abstract:
Bellyache bush (Jatropha gossypiifolia L.) is an invasive weed that has the potential to greatly reduce biodiversity and pasture productivity in northern Australia’s rangelands. This paper reports an approach to develop best practice options for controlling medium to dense infestations of bellyache bush using combinations of control methods. The efficacy of five single treatments (foliar spraying, slashing, stick raking, burning and doing nothing as a control) was compared against that of 15 combinations of these treatments over 4 successive years. Treatments were evaluated using several attributes, including plant mortality, changes in population demographics, seedling recruitment, pasture yield and cost of treatment. Foliar spraying once each year for 4 years proved the most cost-effective control strategy, with no bellyache bush plants recorded at the end of the study. Single applications of slashing, stick raking and, to a lesser extent, burning, when followed up with foliar spraying, also led to significantly reduced densities of bellyache bush and changed the population from a growing one to a declining one. Total experimental cost estimates over 4 successive years for treatments where burning, stick raking, foliar spraying, and slashing were followed with foliar spraying were AU$408, AU$584, AU$802 and AU$789 ha⁻¹, respectively. Maximum pasture yield of 5.4 t ha⁻¹ occurred with repeated foliar spraying. This study recommends treatment combinations using either foliar spraying alone, or foliar spraying as a follow-up to slashing, stick raking or burning, as best practice options after consideration of the level of control, changes in pasture yield and cost-effectiveness.
Abstract:
Development of a web-accessible national diagnostic database for Emergency Plant Pests.
Abstract:
Objective The aim of this study was to gather patients' perceptions regarding their choice between public and private hospital EDs for those who hold private health insurance. The findings of this study will contribute to knowledge regarding patients' decision-making processes and therefore may contribute to the development of evidence-based public policies. Methods An in-depth semi-structured guide was used to interview participants at public and private hospital EDs. Questions sought to identify the issues participants considered when deciding to attend that hospital ED, previous ED experience, expectations of ED services and perceived benefits and barriers to accessing services. Interviews were audio-recorded, transcribed verbatim and analysed using content and thematic approaches. Results Four core themes emerged that may explain patients' choice: prior good experience with the hospital, perceived quality of care, perceived waiting times and perceived costs. Patients' choice between public and private EDs can be explained by the interaction of these core themes. The principal issues appear to be concern about gap payments at private hospital EDs and waiting times at public hospital EDs. Conclusions Patients who choose to attend public EDs appear to place financial concerns ahead of waiting time; those who choose to attend private EDs appear to place waiting time ahead of financial concerns.
Abstract:
Objectives: We sought to characterise the demographics, length of admission, final diagnoses, long-term outcome and costs associated with the population who presented to an Australian emergency department (ED) with symptoms of possible acute coronary syndrome (ACS). Design, setting and participants: Prospectively collected data on ED patients presenting with suspected ACS between November 2008 and February 2011 were used, including data on presentation and at 30 days after presentation. Information on patient disposition, length of stay and costs incurred was extracted from hospital administration records. Main outcome measures: Primary outcomes were mean and median cost and length of hospital stay. Secondary outcomes were diagnosis of ACS, other cardiovascular conditions or non-cardiovascular conditions within 30 days of presentation. Results: An ACS was diagnosed in 103 (11.1%) of the 926 patients recruited. A further 193 patients (20.8%) were diagnosed with other cardiovascular-related conditions and 622 patients (67.2%) had non-cardiac-related chest pain. ACS events occurred in 0 and 11 (1.9%) of the low-risk and intermediate-risk groups, respectively. Ninety-two (28.0%) of the 329 high-risk patients had an ACS event. Patients with a proven ACS, high-grade atrioventricular block, pulmonary embolism and other respiratory conditions had the longest length of stay. The mean cost was highest in the ACS group ($13 509; 95% CI, $11 794–$15 223) followed by other cardiovascular conditions ($7283; 95% CI, $6152–$8415) and non-cardiovascular conditions ($3331; 95% CI, $2976–$3685). Conclusions: Most ED patients with symptoms of possible ACS do not have a cardiac cause for their presentation. The current guideline-based process of assessment is lengthy, costly and consumes significant resources. Investigation of strategies to shorten this process or reduce the need for objective cardiac testing in patients at intermediate risk according to the National Heart Foundation and Cardiac Society of Australia and New Zealand guideline is required.
Abstract:
Differences in morphology have provided a basis for detecting natural interspecific hybridisation in forest trees for decades but have come to prominence again more recently as a means for directly measuring gene flow from planted forests. Here we examined the utility of seedling morphology for hybrid discrimination in three hybrid groups relevant to the monitoring of gene flow from plantings of Corymbia (L.D. Pryor & L.A.S. Johnson ex Brooker) taxa in subtropical Australia. Thirty leaf and stem characters were assessed on 907 eight-month-old seedlings from four parental and six hybrid taxa grown in a common garden. Outbred F1 hybrids between spotted gums (Corymbia citriodora subspecies variegata, C. citriodora subspecies citriodora and Corymbia henryi) tended to more closely resemble their maternal Corymbia torelliana parent, and the most discriminating characters were the ratio of blade length to maximum perpendicular width, the presence or absence of a lignotuber, and specific leaf weight. Assignment of individuals into genealogical classes based on a multivariate model limited to a set of the more discriminating and independent characters was highest in the hybrid group, where parental taxa were genetically most divergent. Overall power to resolve among outbred F1 hybrids from both parental taxa was low to moderate, but this may not be a limitation to its likely major application of identifying hybrids in seedlots from native spotted gum stands. Advanced generation hybrids (outbred F2 and outbred backcrosses) were more difficult to resolve reliably due to the higher variances of hybrid taxa and the tendency of backcrosses to resemble their recurrent parents. Visual assessments of seedling morphology may provide a filter allowing screening of the large numbers needed to monitor gene flow, but will need to be combined with other hybrid detection methods to ensure hybrids are detected.
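The abstract does not name the multivariate model used to assign seedlings to genealogical classes; as an illustration, linear discriminant analysis on a few of the reported discriminating characters is sketched below, with made-up measurements and two hypothetical classes.

```python
# Classifying seedlings into genealogical classes from a few
# discriminating morphological characters. Linear discriminant analysis
# is shown purely as an illustration, on synthetic measurements.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(1)

# Characters: blade length / max perpendicular width, lignotuber (0/1),
# specific leaf weight. Two hypothetical classes: parental vs F1 hybrid.
parental = np.column_stack([rng.normal(3.0, 0.3, 50),
                            rng.integers(0, 2, 50),
                            rng.normal(9.0, 1.0, 50)])
hybrid = np.column_stack([rng.normal(2.2, 0.3, 50),
                          rng.integers(0, 2, 50),
                          rng.normal(7.0, 1.0, 50)])
X = np.vstack([parental, hybrid])
y = np.array([0] * 50 + [1] * 50)

model = LinearDiscriminantAnalysis().fit(X, y)
print("training accuracy:", model.score(X, y))
```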
Abstract:
Digital image