862 results for "Diagnosis related groups Australia"
Abstract:
Serum immunoreactive cationic trypsinogen levels were determined in 99 control subjects and 381 cystic fibrosis (CF) patients. To evaluate the status of the exocrine pancreas, all CF patients had previously undergone fecal fat balance studies and/or pancreatic stimulation tests. Three hundred fourteen CF patients had fat malabsorption and/or inadequate pancreatic enzyme secretion (pancreatic insufficiency) requiring oral pancreatic enzyme supplements with meals. Sixty-seven CF patients did not have fat malabsorption and/or had adequate enzyme secretion (pancreatic sufficiency) and were not receiving pancreatic enzyme supplements with meals. Mean serum trypsinogen in the 99 control subjects was 31.4 ± 14.8 µg/liter (± 2 SD), and levels did not vary with age or sex. In CF infants (< 2 yr) with pancreatic insufficiency, mean serum trypsinogen was significantly above the non-CF values (p < 0.001); ninety-one percent of these CF infants had elevated levels. Serum trypsinogen values in the pancreatic insufficient group declined steeply up to 5 years of age, reaching subnormal values by age 6. An equation was developed which described these age-related changes very accurately. Only six CF patients with pancreatic insufficiency had serum trypsinogen levels above the 95% confidence limits of this equation. In contrast, there was no age-related decline in serum trypsinogen among the CF group with pancreatic sufficiency. Under 7 yr of age, serum trypsinogen failed to distinguish the two groups. In those over 7 yr of age, however, serum trypsinogen was significantly higher than in the CF group with pancreatic insufficiency (p < 0.001), and 93% had values within or above the control range. In conclusion, serum trypsinogen appears to be a useful screening test for CF in infancy. Between 2 and 7 yr of age the test is of little diagnostic value. After 7 yr of age, serum trypsinogen can reliably distinguish between CF patients with and without pancreatic insufficiency.
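The abstract reports an equation fitted to the age-related decline in serum trypsinogen but does not state its form. A minimal sketch of how such a curve might be fitted, assuming an exponential decay to a plateau and invented illustrative data:

```python
# A minimal sketch of fitting an age-decline curve like the one the abstract
# describes. The functional form (exponential decay to a plateau) and all
# data values are assumptions; the paper's own equation is not given here.
import numpy as np
from scipy.optimize import curve_fit

def trypsinogen_model(age_yr, a, k, c):
    """Serum trypsinogen (ug/liter) modelled as exponential decay to plateau c."""
    return a * np.exp(-k * age_yr) + c

# hypothetical (age, serum trypsinogen) observations for the PI group
ages = np.array([0.5, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 8.0, 12.0])
tryp = np.array([95.0, 80.0, 55.0, 35.0, 20.0, 12.0, 8.0, 5.0, 4.0])

params, cov = curve_fit(trypsinogen_model, ages, tryp, p0=(100.0, 0.5, 5.0))
pred = trypsinogen_model(ages, *params)
resid_sd = np.std(tryp - pred, ddof=3)   # 3 fitted parameters
upper95 = pred + 1.96 * resid_sd         # crude 95% upper limits on new values
print(params, upper95)
```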
Abstract:
Kirramyces destructans is a serious pathogen causing a leaf, bud and shoot blight disease of Eucalyptus plantations in the subtropics and tropics of South-East Asia. During surveillance of eucalypt taxa trials in northern Queensland, symptoms resembling those of K. destructans were observed on Eucalyptus grandis and E. grandis × E. camaldulensis. Phylogenetic and morphological studies revealed that the Kirramyces sp. associated with these symptoms represents a new taxon described here as K. viscidus sp. nov., which is closely related to K. destructans. Plantation assessments revealed that while E. grandis from the Copperload provenance, collected in northern Queensland, recovered from disease, E. grandis × E. camaldulensis hybrids from South America were highly susceptible to infection by K. viscidus and are not recommended for planting in northern Queensland. Preliminary results suggest the fungus probably originates from Australia. K. viscidus is closely related to K. destructans and causes a disease with similar symptoms, suggesting that it could seriously damage Australian eucalypt plantations, especially those planted off-site.
Abstract:
An adaptive conjoint analysis was used to evaluate stakeholders' opinions of welfare indicators for ship-transported sheep and cattle, both onboard and in pre-export depots. In consultations with two nominees of each identified stakeholder group (government officials, animal welfare representatives, animal scientists, stockpersons, producers/pre-export depot operators, exporters/ship owners and veterinarians), 18 potential indicators were identified. Three levels were assigned to each using industry statistics and expert opinion, representing those observed on the best and worst 5% of voyages and an intermediate value. A computer-based questionnaire was completed by 135 stakeholders (48% of those invited). All indicators were ranked by respondents in the assigned order, except fodder intake, in which case providing the amount necessary to maintain bodyweight was rated better than over- or underfeeding, and time in the pre-export assembly depot, in which case 5 days was rated better than 0 or 10 days. The respective Importance Values (a relative rating given by the respondent) for each indicator were, in order of declining importance: mortality (8.6%), clinical disease incidence (8.2%), respiration rate (6.8%), space allowance (6.2%), ammonia levels (6.1%), weight change (6.0%), wet bulb temperature (6.0%), time in assembly depot (5.4%), percentage of animals in hospital pen (5.4%), fodder intake (5.2%), stress-related metabolites (5.0%), percentage of feeding trough utilised (5.0%), injuries (4.8%), percentage of animals able to access food troughs at any one time (4.8%), percentage of animals lying down (4.7%), cortisol concentration (4.5%), noise (3.9%), and photoperiod (3.4%). The different stakeholder groups were relatively consistent in their ranking of the indicators, with all groups nominating the same top two and at least five of the top seven indicators. Some of the top indicators, in particular mortality, disease incidence and temperature, are already recorded in the Australian industry, but the study identified potential new welfare indicators for exported livestock, such as space allowance and ammonia concentration, which could be used to improve welfare standards if validated by scientific data. The top indicators would also be useful worldwide for countries engaging in long-distance sea transport of livestock.
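The Importance Values above are conventionally derived in conjoint analysis as each attribute's part-worth utility range divided by the sum of all ranges. A minimal sketch of that calculation, using invented part-worth utilities rather than the study's data:

```python
# A minimal sketch of how conjoint Importance Values are conventionally
# computed: each indicator's share is its part-worth utility range divided
# by the sum of all ranges. The utilities below are illustrative only.
part_worths = {
    # indicator: utilities for (worst 5%, intermediate, best 5%) levels
    "mortality": (-2.1, 0.3, 1.9),
    "clinical disease incidence": (-1.9, 0.2, 1.8),
    "respiration rate": (-1.5, 0.1, 1.5),
}

ranges = {k: max(v) - min(v) for k, v in part_worths.items()}
total = sum(ranges.values())
importance = {k: 100.0 * r / total for k, r in ranges.items()}

for indicator, iv in sorted(importance.items(), key=lambda x: -x[1]):
    print(f"{indicator}: {iv:.1f}%")
```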
Abstract:
Synthetic backcross-derived bread wheats (SBWs) from CIMMYT were grown in north-west Mexico (CIANO) and at sites across Australia during 3 seasons. A different set of lines was evaluated each season, as new materials became available from the CIMMYT crop enhancement program. Previously, we have evaluated both the performance of genotypes across environments and the genotype × environment interaction (G × E). The objective of this study was to interpret the G × E for yield in terms of crop attributes measured at individual sites and to identify the potential environmental drivers of this interaction. Groups of SBWs with consistent yield performance were identified, often comprising closely related lines. However, contrasting performance was also relatively common among sister lines or between a recurrent parent and its SBWs. Early flowering was a common feature among lines with broad adaptation and/or high yield in the northern Australian wheatbelt, while yields in the southern region did not show any association with maturity type. Lines with high yields in the southern and northern regions had cooler canopies during flowering and early grain filling. Among the SBWs with Australian genetic backgrounds, lines best adapted to CIANO were tall (>100 cm), with slightly higher ground cover. These lines also displayed a higher concentration of water-soluble carbohydrates in the stem at flowering, which was negatively correlated with stem number per unit area when evaluated in southern Australia (Horsham). Possible reasons for these patterns are discussed. Selection for yield at CIANO did not specifically identify the lines best adapted to northern Australia, although they were not the most poorly adapted either. In addition, groups of lines with specific adaptation to the south would not have been selected by choosing the highest yielding lines at CIANO. These findings suggest that selection at CIMMYT for Australian environments may be improved by either trait-based selection or yield data combined with trait information. Flowering date, canopy temperature around flowering, tiller density, and water-soluble carbohydrate concentration in the stem at flowering seem likely candidates.
Abstract:
The current study explored the influence of moral values (measured by ethical ideology) on self-reported driving anger and aggressive driving responses. A convenience sample of drivers aged 17-73 years (n = 280) in Queensland, Australia, completed a self-report survey. Measures included sensation seeking, trait aggression, driving anger, endorsement of aggressive driving responses and ethical ideology (Ethical Position Questionnaire, EPQ). Scores on the two underlying dimensions of the EPQ, idealism (highI/lowI) and relativism (highR/lowR), were used to categorise drivers into four ideological groups: Situationists (highI/highR), Absolutists (highI/lowR), Subjectivists (lowI/highR) and Exceptionists (lowI/lowR). Mean aggressive driving scores suggested that Exceptionists were significantly more likely to endorse aggressive responses. After accounting for demographic variables, sensation seeking and driving anger, ethical ideological category added significantly, though modestly, to the prediction of aggressive driving responses. Patterns in the results suggest that drivers in ideological groups characterised by greater concern to avoid affecting others negatively (i.e. highI: Situationists, Absolutists) may be less likely to endorse aggressive driving responses, even when angry. In contrast, Subjectivists (lowI/highR) reported the lowest levels of driving anger yet were significantly more likely to endorse aggressive responses. This provides further insight into why high levels of driving anger may not always translate into more aggressive driving.
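A minimal sketch of the four-way EPQ categorisation described above; the cut points (sample medians here) are an assumption, as studies variously split idealism and relativism at medians or scale midpoints:

```python
# A minimal sketch of the high/low split on idealism (I) and relativism (R)
# that yields the four ideological groups. Cut points and scores are
# hypothetical illustrations.
from statistics import median

def classify(idealism, relativism, i_cut, r_cut):
    high_i, high_r = idealism > i_cut, relativism > r_cut
    if high_i and high_r:
        return "Situationist"   # highI/highR
    if high_i:
        return "Absolutist"     # highI/lowR
    if high_r:
        return "Subjectivist"   # lowI/highR
    return "Exceptionist"       # lowI/lowR

# hypothetical respondent (idealism, relativism) scores
scores = [(7.2, 5.1), (6.8, 2.9), (3.1, 6.4), (2.5, 3.0)]
i_cut = median(s[0] for s in scores)
r_cut = median(s[1] for s in scores)
print([classify(i, r, i_cut, r_cut) for i, r in scores])
```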
Abstract:
Three major changes in drink driving enforcement have occurred in South Australia since 1981. The effect of these changes on a number of surrogate measures of alcohol involvement in accidents was investigated. The surrogates included alcohol involvement of driver fatalities, and combinations of casualty, serious casualty, single-vehicle and nighttime accidents. Data from previous studies were also cited. It was found that relationships between surrogate measures were inconsistent, and incompatible with assumptions about drink driving levels and related accidents. It was concluded that until these effects are understood, the use of surrogate measures should be treated with caution.
Abstract:
Due to the improved prognosis of many forms of cancer, an increasing number of cancer survivors are willing to return to work after their treatment. It is generally believed, however, that people with cancer are either unemployed, stay at home, or retire more often than people without cancer. This study investigated the problems that cancer survivors experience on the labour market, as well as the disease-related, sociodemographic and psychosocial factors at work that are associated with the employment and work ability of cancer survivors. The impact of cancer on employment was studied by combining data from the Finnish Cancer Registry with census data of the years 1985, 1990, 1995 or 1997 from Statistics Finland. There were two data sets containing 46 312 and 12 542 people with cancer. The results showed that cancer survivors were slightly less often employed than their referents. Two to three years after the diagnosis, the employment rate of the cancer survivors was 9 percentage points lower than that of their referents (64% vs. 73%), whereas the employment rate was the same before the diagnosis (78%). The employment rate varied greatly according to cancer type and education. The probability of being employed was greater in the lower than in the higher educational groups. People with cancer were less often employed than people without cancer mainly because of their higher retirement rate (34% vs. 27%). Like employment, retirement varied by cancer type. The risk of retirement was twofold for people having cancer of the nervous system or leukaemia compared to their referents, whereas people with skin cancer, for example, did not have an increased risk of retirement. The aim of the first questionnaire study was to investigate whether the work ability of cancer survivors differs from that of people without cancer and whether cancer had impaired their work ability. There were 591 cancer survivors and 757 referents in the data. Even though current work ability did not differ between the survivors and their referents, 26% of cancer survivors reported that their physical work ability, and 19% that their mental work ability, had deteriorated due to cancer. The survivors who had other diseases or had had chemotherapy most often reported impaired work ability, whereas survivors with a strong commitment to their work organization, or a good social climate at work, reported impairment less frequently. The aim of the second questionnaire study, containing 640 people with a history of cancer, was to examine the extent of social support that cancer survivors needed and had received from their work community. The cancer survivors had received most support from their co-workers, and they hoped for more support especially from the occupational health care personnel (39% of women and 29% of men). More support was especially needed by men who had lymphoma, had received chemotherapy or had a low education level. The results of this study show that the majority of the survivors are able to return to work. There is, however, a group of cancer survivors who leave work life early, have impaired work ability due to their illness, and suffer from a lack of support from their workplace and the occupational health services. Treatment-related as well as sociodemographic factors play an important role in survivors' work-related problems, and presumably their possibilities to continue working.
Abstract:
CONTEXT People meeting diagnostic criteria for anxiety or depressive disorders tend to score high on the personality scale of neuroticism. Studying this personality dimension can give insights into the etiology of these important psychiatric disorders. OBJECTIVES To undertake a comprehensive genome-wide linkage study of neuroticism using large study samples that have been measured multiple times and to compare the results between countries for replication and across time within countries for consistency. DESIGN Genome-wide linkage scan. SETTING Twin individuals and their family members from Australia and the Netherlands. PARTICIPANTS Nineteen thousand six hundred thirty-five sibling pairs completed self-report questionnaires for neuroticism up to 5 times over a period of up to 22 years. Five thousand sixty-nine sibling pairs were genotyped with microsatellite markers. METHODS Nonparametric linkage analyses were conducted in MERLIN-REGRESS for the mean neuroticism scores averaged across time. Additional analyses were conducted for the time-specific measures of neuroticism from each country to investigate the consistency of linkage results. RESULTS Three chromosomal regions exceeded empirically derived thresholds for suggestive linkage using mean neuroticism scores: 10p at 5 Kosambi centimorgans (cM) (Dutch study sample), 14q at 103 cM (Dutch study sample), and 18q at 117 cM (combined Australian and Dutch study sample), but only 14q retained significance after correction for multiple testing. These regions all showed evidence for linkage in individual time-specific measures of neuroticism, and one (18q) showed some evidence for replication between countries. Linkage intervals for these regions all overlap with regions identified in other studies of neuroticism or related traits and/or in studies of anxiety in mice. CONCLUSIONS Our results demonstrate the value of the availability of multiple measures over time and add to the optimism reported in recent reviews for replication of linkage regions for neuroticism. These regions are likely to harbor causal variants for neuroticism and its related psychiatric disorders and can inform prioritization of results from genome-wide association studies.
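A minimal sketch of the phenotype-preparation step implied by the METHODS, averaging each individual's repeated neuroticism scores across occasions; column names and data are hypothetical, and the linkage analysis itself (run in MERLIN-REGRESS) is not reproduced:

```python
# A minimal sketch of averaging repeated neuroticism measures per individual
# before regression-based linkage. The long-format data below are invented;
# the study's actual files and the MERLIN-REGRESS run are not shown.
import pandas as pd

long = pd.DataFrame({
    "individual":  ["A", "A", "A", "B", "B", "C"],
    "occasion":    [1, 2, 3, 1, 2, 1],
    "neuroticism": [14.0, 16.0, 15.0, 22.0, 20.0, 9.0],
})

# mean score per individual across however many occasions were completed
mean_scores = (long.groupby("individual")["neuroticism"]
                   .mean()
                   .rename("mean_neuroticism")
                   .reset_index())
print(mean_scores)
```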
Abstract:
A telial rust on leaves of Acacia pennata ssp. kerrii from Cape York Peninsula is described as Sphaerophragmium quadricellulare sp. nov. No other spore stages have been observed. Brief notes on other related rusts occurring in Australia are given.
Abstract:
Wheat crops in southeast Queensland (Qld) and northern New South Wales (NSW) were infected with fusarium head blight (FHB)-like symptoms during the 2010–11 wheat growing season. Wheat crops in this region were surveyed at the soft dough or early maturity stage to determine the distribution, severity, aetiology and toxigenicity of FHB. FHB was widespread on bread wheat and durum, and Fusarium graminearum and/or F. pseudograminearum were diagnosed from 42 of the 44 sites using species-specific PCR primers directly on spikelets or from monoconidial cultures obtained from spikelets. Stem base browning due to crown rot (CR) was also evident in some samples from both states. The overall FHB and CR severity was higher for NSW than Qld. Deoxynivalenol (DON) concentration of immature grains was more than 1 mg/kg in samples from 11 Qld and 14 NSW sites, but only 13 of 498 mature grain samples sourced from the affected areas had more than 1 mg/kg DON. DON concentration in straw also exceeded 1 mg/kg at eight Qld and all but one NSW sites, but this was not linked to DON concentration of immature grains. The proportion of spikelets with a positive diagnosis for F. graminearum and/or F. pseudograminearum and weather-related factors influenced DON levels in immature grains. The average monthly rainfall for August–November, during crop anthesis and maturation, exceeded the long-term monthly average by 10–150%. Weather played a critical role in FHB epidemics for the Qld sites but this was not apparent for the NSW sites, as weather was generally favourable at all sites.
Abstract:
BACKGROUND: The recent development of very high resistance to phosphine in the rusty grain beetle, Cryptolestes ferrugineus (Stephens), seriously threatens stored-grain biosecurity. The aim was to characterise this resistance, to develop a rapid bioassay for its diagnosis to support pest management, and to document the distribution of resistance in Australia in 2007–2011. RESULTS: Bioassays of purified laboratory reference strains and field-collected samples revealed three phenotypes: susceptible, weakly resistant and strongly resistant. With resistance factors of > 1000×, resistance to phosphine expressed by the strong resistance phenotype was higher than reported for any stored-product insect species. The new time-to-knockdown assay rapidly and accurately diagnosed each resistance phenotype within 6 h. Although less frequent in Western Australia, weak resistance was detected throughout all grain production regions. Strong resistance occurred predominantly in central storages in eastern Australia. CONCLUSION: Resistance to phosphine in the rusty grain beetle is expressed through two identifiable phenotypes: weak and strong. Strong resistance requires urgent changes to current fumigation dosages. The development of a rapid assay for diagnosis of resistance enables the provision of same-day advice to expedite resistance management decisions. (c) 2012 Commonwealth of Australia. Published by John Wiley & Sons, Ltd.
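A minimal sketch of how a time-to-knockdown assay can diagnose the three phenotypes; the time thresholds are invented for illustration, not the paper's calibrated values:

```python
# A minimal sketch of a time-to-knockdown classifier of the kind described.
# The thresholds (minutes to knockdown at a fixed discriminating phosphine
# concentration) are hypothetical; the study derived its own cut-offs from
# purified reference strains.
SUSCEPTIBLE_MAX_MIN = 60   # knocked down within 1 h -> susceptible
WEAK_MAX_MIN = 180         # knocked down within 3 h -> weakly resistant

def phenotype(knockdown_minutes: float) -> str:
    if knockdown_minutes <= SUSCEPTIBLE_MAX_MIN:
        return "susceptible"
    if knockdown_minutes <= WEAK_MAX_MIN:
        return "weakly resistant"
    return "strongly resistant"   # still active past 3 h (assay read by 6 h)

print([phenotype(t) for t in (35, 150, 320)])
```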
Abstract:
Objectives: We sought to characterise the demographics, length of admission, final diagnoses, long-term outcomes and costs associated with the population who presented to an Australian emergency department (ED) with symptoms of possible acute coronary syndrome (ACS). Design, setting and participants: Prospectively collected data on ED patients presenting with suspected ACS between November 2008 and February 2011 were used, including data on presentation and at 30 days after presentation. Information on patient disposition, length of stay and costs incurred was extracted from hospital administration records. Main outcome measures: Primary outcomes were mean and median cost and length of hospital stay. Secondary outcomes were diagnosis of ACS, other cardiovascular conditions or non-cardiovascular conditions within 30 days of presentation. Results: An ACS was diagnosed in 103 (11.1%) of the 926 patients recruited. 193 patients (20.8%) were diagnosed with other cardiovascular-related conditions and 622 patients (67.2%) had non-cardiac-related chest pain. ACS events occurred in none of the low-risk group and 11 (1.9%) of the intermediate-risk group. Ninety-two (28.0%) of the 329 high-risk patients had an ACS event. Patients with a proven ACS, high-grade atrioventricular block, pulmonary embolism and other respiratory conditions had the longest lengths of stay. The mean cost was highest in the ACS group ($13 509; 95% CI, $11 794–$15 223), followed by other cardiovascular conditions ($7283; 95% CI, $6152–$8415) and non-cardiovascular conditions ($3331; 95% CI, $2976–$3685). Conclusions: Most ED patients with symptoms of possible ACS do not have a cardiac cause for their presentation. The current guideline-based process of assessment is lengthy, costly and consumes significant resources. Investigation of strategies to shorten this process or reduce the need for objective cardiac testing in patients at intermediate risk according to the National Heart Foundation and Cardiac Society of Australia and New Zealand guideline is required.
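A minimal sketch of the mean-cost-with-95%-CI summary reported above, using hypothetical costs and a normal approximation (hospital cost data are right-skewed, so bootstrap intervals are also common):

```python
# A minimal sketch of computing a mean cost and normal-approximation 95% CI.
# The cost values are invented, not the study's data.
import numpy as np

costs = np.array([4200.0, 9800.0, 15300.0, 22100.0, 11800.0, 17650.0])
mean = costs.mean()
sem = costs.std(ddof=1) / np.sqrt(len(costs))   # standard error of the mean
ci_low, ci_high = mean - 1.96 * sem, mean + 1.96 * sem
print(f"mean ${mean:,.0f}; 95% CI, ${ci_low:,.0f}-${ci_high:,.0f}")
```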
Abstract:
This case study discusses in detail, for the first time, the diagnosis and management of a case of leishmaniosis in a dog imported to Australia. The dog presented with epistaxis and a non-regenerative anaemia five years after being imported from Europe. Protozoa were identified within macrophages on bone marrow and splenic cytology. A Leishmania indirect fluorescent antibody test was positive, while an Ehrlichia canis antibody test was negative. Polymerase chain reaction of the ITS-1 and ITS-2 regions was positive for Leishmania infantum in skin, lymph node, spleen and bone marrow. The dog was treated with amphotericin B, with a strong clinical response. The importance of thorough diagnostics in non-endemic areas, particularly Australia, and treatment with amphotericin B are discussed. Vigilance, disease reporting and response frameworks are recommended for non-endemic areas. © 2014 Elsevier B.V.
Abstract:
There is uncertainty over the potential changes to rainfall across northern Australia under climate change. Since rainfall is a key driver of pasture growth, cattle numbers and the resulting animal productivity and beef business profitability, the ability to anticipate possible management strategies within such uncertainty is crucial. The Climate Savvy Grazing project used existing research, expert knowledge and computer modelling to explore the best-bet management strategies within best-, median- and worst-case future climate scenarios. All three scenarios indicated changes to the environment and resources upon which the grazing industry of northern Australia depends. Well-adapted management strategies under a changing climate are very similar to best practice within current climatic conditions. Maintaining good land condition builds resource resilience, maximises opportunities in higher rainfall years and reduces the risk of degradation during drought and failed wet seasons. Matching stocking rate to the safe long-term carrying capacity of the land is essential; reducing stock numbers in response to poor seasons and conservatively increasing stock numbers in response to better seasons generally improves profitability and maintains land in good condition. Spelling over the summer growing season will improve land condition under a changing climate as it does under current conditions. Six regions were included within the project. Of these, the Victoria River District in the Northern Territory, the Gulf country of Queensland and the Kimberley region of Western Australia had projections of similar or higher than current rainfall and the potential for carrying capacity to increase. The Alice Springs, Maranoa-Balonne and Fitzroy regions had projections of generally drying conditions and the greatest risk of reduced pasture growth and carrying capacity. Encouraging producers to consider and act on the risks, opportunities and management options inherent in climate change was a key goal of the project. More than 60,000 beef producers, advisors and stakeholders are now more aware of the management strategies which build resource resilience, and of how that resilience helps buffer against the effects of variable and changing climatic conditions. Over 700 producers have stated that they have improved confidence, skills and knowledge to attempt new practices to build resilience. During the course of the project, more than 165 beef producers reported that they had implemented changes to build resource and business resilience.
Abstract:
Changes in energy-related CO2 emissions aggregate intensity, total CO2 emissions and per-capita CO2 emissions in Australia are decomposed by using a Logarithmic Mean Divisia Index (LMDI) method for the period 1978-2010. Results indicate improvements in energy efficiency played a dominant role in the measured 17% reduction in CO2 emissions aggregate intensity in Australia over the period. Structural changes in the economy, such as changes in the relative importance of the services sector vis-à-vis manufacturing, have also played a major role in achieving this outcome. Results also suggest that, without these mitigating factors, income per capita and population effects could well have produced an increase in total emissions of more than 50% higher than actually occurred over the period. Perhaps most starkly, the results indicate that, without these mitigating factors, the growth in CO2 emissions per capita could have been over 150% higher than actually observed. Notwithstanding this, the study suggests that, for Australia to meet its Copenhagen commitment, the relative average per annum effectiveness of these mitigating factors during 2010-2020 probably needs to be almost three times what it was in the 2005-2010 period, a very daunting challenge indeed for Australia's policymakers.
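A minimal sketch of an additive LMDI-I decomposition of the kind used above, splitting an emissions change into activity, energy-intensity and carbon-intensity effects; the two-period aggregate identity and the numbers are illustrative, whereas the study works at sector level and also decomposes population and per-capita income effects:

```python
# A minimal sketch of additive LMDI-I for C = Q * (E/Q) * (C/E):
# activity Q, energy intensity E/Q, and carbon intensity of energy C/E.
# All numbers are illustrative, not the paper's data.
import math

def logmean(a: float, b: float) -> float:
    """Logarithmic mean L(a, b) = (a - b) / ln(a/b), with L(a, a) = a."""
    return a if a == b else (a - b) / math.log(a / b)

# period 0 and period T: activity Q (e.g. GDP), energy E, emissions C
Q0, E0, C0 = 100.0, 50.0, 40.0
QT, ET, CT = 150.0, 60.0, 45.0

w = logmean(CT, C0)  # common LMDI-I weight
effect_activity   = w * math.log(QT / Q0)
effect_energy_int = w * math.log((ET / QT) / (E0 / Q0))
effect_carbon_int = w * math.log((CT / ET) / (C0 / E0))

# the three effects sum exactly to the total change CT - C0
print(effect_activity + effect_energy_int + effect_carbon_int, CT - C0)
```

Because the factor ratios multiply out to CT/C0, the three effects are guaranteed to sum to the total change, which is the perfect-decomposition property that makes LMDI attractive for this kind of emissions accounting.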