Abstract:
Runoff, soil loss, and nutrient loss were assessed on a Red Ferrosol in tropical Australia over 3 years. The experiment was conducted using bounded, 100-m² field plots cropped to peanuts, maize, or grass. A bare plot, without cover or crop, was also established as an extreme treatment. Results showed the importance of cover in reducing runoff, soil loss, and nutrient loss from these soils. Runoff ranged from 13% of incident rainfall under conventional cultivation to 29% under bare conditions during the highest rainfall year, and was well correlated with event rainfall and rainfall energy. Soil loss ranged from 30 t/ha per year under bare conditions to <6 t/ha per year under cropping. Nutrient losses of 35 kg N and 35 kg P/ha per year under bare conditions and 17 kg N and 11 kg P/ha per year under cropping were measured. Soil carbon analyses showed a relationship with treatment runoff, suggesting that soil properties influenced the rainfall-runoff response. The cropping systems model PERFECT was calibrated using runoff, soil loss, and soil water data. Runoff and soil loss showed good agreement with observed data in the calibration, and soil water and yield showed reasonable agreement. Long-term runs using historical weather data showed the episodic nature of runoff and soil loss events in this region and emphasise the need to manage land using protective measures such as conservation cropping practices. Farmers involved in related action-learning activities wished to incorporate conservation cropping findings into their systems but also needed clear production benefits to hasten practice change.
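The runoff figures quoted as a percentage of incident rainfall are simple event ratios; a minimal sketch of the calculation (illustrative depths, not the study's measurements):

```python
def runoff_coefficient(runoff_mm: float, rainfall_mm: float) -> float:
    """Event runoff expressed as a fraction of incident rainfall."""
    if rainfall_mm <= 0:
        raise ValueError("rainfall must be positive")
    return runoff_mm / rainfall_mm

# Illustrative: 290 mm of runoff from 1000 mm of rainfall
frac = runoff_coefficient(290.0, 1000.0)  # 0.29, i.e. 29% of incident rainfall
```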
Abstract:
The impact of excessive sediment loads entering the Great Barrier Reef lagoon has led to increased awareness of land condition in grazing lands. Improved ground cover and land condition have been identified as two important factors in reducing sediment loads. This paper reports the economics of land regeneration using case studies for two different land types in the Fitzroy Basin. The results suggest that, for sediment reduction to be achieved from land regeneration of more fertile land types (brigalow blackbutt), the most efficient method of allocating funds would be through extension and education. However, for less productive country (narrow-leaved ironbark woodlands), incentives will be required. The analysis also highlights the need for further scientific data to undertake similar financial assessments of land regeneration for other locations in Queensland.
Abstract:
Growing legume fallow crops has proven to be an important factor in reducing the yield-decline effect in sugarcane production. Legumes can also provide a direct economic benefit to sugarcane farmers by providing a source of nitrogen. Further, in some instances, income can flow from the sale of grain or seed. The following case study provides an insight into the changes made by Russell Young, a sugarcane farmer situated in the Rita Island area of the Burdekin district. The case study focuses on the economics of the old farming system versus a new farming system. The old farming system is based on the conventional practices used by the Young family in 2002; the 2006 farming system involves reduced tillage and the use of a soybean rotation crop for seed production. A whole-of-farm economic analysis was used to assess the impact of the new farming system on farm profitability. Such an analysis looks at the impact of a change in farming practice across the whole business, rather than focusing on a single component. This case study is specific to an individual grower's situation and is not representative of all situations. When evaluating a farming system change, it is important to have a detailed plan.
Abstract:
Background: Fatigue is one of the most distressing and commonly experienced symptoms in patients with advanced cancer. Although the self-management (SM) of cancer-related symptoms has received increasing attention, no research instrument assessing fatigue SM outcomes for patients with advanced cancer is available. Objectives: To describe the development and preliminary testing of an interviewer-administered instrument for assessing the frequency and perceived levels of effectiveness and self-efficacy associated with fatigue SM behaviors in patients with advanced cancer. Methods: The development and testing of the Self-efficacy in Managing Symptoms Scale - Fatigue Subscale for Patients with Advanced Cancer (SMSFS-A) involved a number of procedures: item generation using a comprehensive literature review and semi-structured interviews, content validity evaluation using expert panel reviews, and face validity and test-retest reliability evaluation using pilot testing. Results: Initially, 23 items (22 specific behaviors with one global item) were generated from the literature review and semi-structured interviews. After two rounds of expert panel review, the final scale was reduced to 17 items (16 behaviors with one global item). Participants in the pilot test (n=10) confirmed that the questions in this scale were clear and easy to understand. Bland-Altman analysis showed agreement of results over a one-week interval. Conclusions: The SMSFS-A items were generated using multiple sources. This tool demonstrated preliminary validity and reliability. Implications for practice: The SMSFS-A has the potential to be used for clinical and research purposes. Nurses can use this instrument for collecting data to inform the initiation of appropriate fatigue SM support for this population.
Abstract:
Phosphorus (P) retention properties of soils typical of boreal forest, i.e. podzolic and peat soils, vary significantly, but the range of this variation has not been sufficiently documented. To assess the usefulness of buffer zones used in forestry in removing P from discharge by chemical sorption in soil, and to estimate the risk of P leaching after forestry operations, more data are needed on soil P retention properties. P retention properties were studied at clear-cut areas, at unharvested buffer zones adjoining the clear-cuts, and at peatland buffer zone areas. Desorption-sorption isotherms were determined for the humus layer, for the mineral soil horizons E, B and C of the Podzol profile, and for the surface (0-15 cm) and subsurface (15-30 cm) peat layers. The efficiency of buffer zones in retaining P was studied at six peatland buffer zone areas by adding P-containing solute to the inflow. A tracer study was conducted at one of the buffer zone areas to determine the allocation of the added P in soil and vegetation. Measured sorption or desorption, rather than parameter values of fitted sorption equations, described P desorption and sorption behaviour in soil. The highest P retention efficiency was in the B horizon; consequently, if contact occurred or was established between soluble P in the water and the soil B horizon, the risk of P leaching was low. The humus layer was incapable of retaining P after clear-cutting. In the buffer zones, the decrease in P retention properties in the humus layer and the low amount of P sorbed by it indicated that this layer contributes little to the functioning of buffer zones. The peatland buffer zone areas were efficient in retaining soluble P from inflow. P sorption properties of the peat soil at the buffer zone areas varied widely, but the contribution of P sorption in the peat was particularly important during high flow in spring, when the vegetation was not fully developed.
Factors contributing to efficient P retention were large buffer size and low hydrological load, whereas high hydrological load combined with the formation of preferential flow paths, especially during early spring or late autumn, was disadvantageous. However, even small buffer zone areas may be effective in reducing P load.
Abstract:
An ecological risk assessment of the East Coast Otter Trawl Fishery in the Great Barrier Reef Region was undertaken in 2010 and 2011. It assessed the risks posed by this fishery to achieving fishery-related and broader ecological objectives of both the Queensland and Australian governments, including risks to the values and integrity of the Great Barrier Reef World Heritage Area. The risks assessed included direct and indirect effects on the species caught in the fishery as well as on the structure and functioning of the ecosystem. This ecosystem-based approach included an assessment of the impacts on harvested species, by-catch, species of conservation concern, marine habitats, species assemblages and ecosystem processes. The assessment took into account the management arrangements and fishing practices current at the time of the assessment. The main findings were that: (i) current risk levels from trawling activities are generally low, although some risks from trawling remain; (ii) risks from trawling have reduced in the Great Barrier Reef Region; (iii) trawl fishing effort is a key driver of ecological risk, and zoning has been important in reducing risks; (iv) reducing identified unacceptable risks requires a range of management responses; (v) the commercial fishing industry is supportive and being proactive; (vi) further reductions in trawl by-catch, high compliance with rules and accurate information from ongoing risk monitoring are important; and (vii) trawl fishing is just one of the sources of risk to the Great Barrier Reef.
Abstract:
Aims: To investigate, using culture-independent techniques, the presence and diversity of methanogenic archaea in the foregut of kangaroos. Methods and Results: DNA was extracted from forestomach contents of 42 kangaroos (three species), three sheep and three cattle. Four qualitative and quantitative PCR assays targeting the archaeal domain (16S rRNA gene) or the functional methanogenesis gene, mcrA, were used to determine the presence and population density of archaea in kangaroos and whether they were likely to be methanogens. All ruminal samples were positive for archaea, produced PCR product of the expected size, and contained high numbers of archaea and high numbers of cells with mcrA genes. Results for kangaroos were much more variable and contradictory. Fourteen kangaroos had detectable archaea, with numbers 10- to 1000-fold fewer than sheep and cattle. Many kangaroos in which archaea were not detected were nevertheless positive for the mcrA gene and had detectable numbers of cells with this gene, and vice versa. DNA sequence analysis of the kangaroos' archaeal 16S rRNA gene clones showed that many methanogens were related to Methanosphaera stadtmanae. Other sequences were related to non-methanogenic archaea (Thermoplasma sp.), and a number of kangaroos had mcrA gene sequences related to methane-oxidising archaea (ANME). Conclusions: Discrepancies between the qualitative and quantitative PCR assays for archaea and the mcrA gene suggest that the archaeal communities are very diverse and that novel species may exist. Significance and Impact of the Study: Archaea were below detectable limits in many kangaroos, especially Red kangaroos; when present, they occurred in lower numbers than in ruminants, and were not necessarily methanogenic. Determining why this is the case in the kangaroo foregut could assist in reducing emissions from other ecosystems in the future.
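Fold-differences in population density such as "10- to 1000-fold fewer" are typically derived from quantitative PCR cycle-threshold (Ct) values; a generic sketch, assuming perfect doubling per cycle (the study's own calibration may differ):

```python
def qpcr_fold_difference(ct_target: float, ct_reference: float,
                         efficiency: float = 2.0) -> float:
    """Fold-difference in template abundance implied by qPCR Ct values.

    A later Ct means less starting template; efficiency=2.0 assumes a
    perfect doubling of product every cycle (an idealisation).
    """
    return efficiency ** (ct_target - ct_reference)

# Illustrative: a Ct 10 cycles later implies ~1000-fold less template
print(qpcr_fold_difference(30.0, 20.0))  # 1024.0
```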
Abstract:
Wildfire represents a major risk to pine plantations. This risk is particularly great for young plantations (generally less than 10 m in height), where prescribed fire cannot be used to manipulate fuel biomass and where flammable grasses are abundant in the understorey. We report results from a replicated field experiment designed to determine the effects of two rates of glyphosate (450 g L-1) application and two extents of application (inter-row only, and inter-row and row), applied once or twice, on understorey fine fuel biomass, fuel structure and composition in south-east Queensland, Australia. Two herbicide applications (~9 months apart) were more effective than a once-off treatment for reducing standing biomass, grass continuity, grass height, percentage grass dry weight and the density of shrubs. In addition, the 6-L ha-1 rate of application was more effective than the 3-L ha-1 rate in periodically reducing grass continuity and shrub density in the inter-rows and in reducing standing biomass in the tree rows, and application in the inter-rows and rows significantly reduced shrub density relative to the inter-row-only application. Herbicide treatment in the inter-rows and rows is likely to be useful for managing fuels before prescribed fire in young pine plantations because such treatment minimised tree scorch height during prescribed burns. Further, herbicide treatments had no adverse effects on plantation trees, and in some cases tree growth was enhanced by treatments. However, the effectiveness of herbicide treatments in reducing the risk of tree damage or mortality under wildfire conditions remains untested.
Abstract:
An overwhelming majority of all research on soil phosphorus (P) has been carried out with samples taken from surface soils only, and our understanding of the forms and reactions of P at the soil-profile scale is based on few observations. In Finland, interest in studying P in complete soil profiles has been particularly limited because of the lack of a tradition of studying soil genesis, morphology, or classification. In this thesis, the P reserves and the retention of orthophosphate phosphorus (PO4-P) were examined in four cultivated mineral soil profiles in Finland (three Inceptisols and one Spodosol). The soils were classified according to U.S. Soil Taxonomy and soil samples were taken from the genetic horizons in the profiles. The samples were analyzed for total P concentration, Chang and Jackson P fractions, P sorption properties, concentrations of water-extractable P, and concentrations of oxalate-extractable Al and Fe. Theoretical P sorption capacities and degrees of P saturation were calculated from the data of the oxalate extractions and the P fractionations. The studied profiles can be divided into sections with clearly differing P characteristics by their master horizons Ap, B and C. The C (or transitional BC) horizons below an approximate depth of 70 cm were dominated by, presumably apatitic, H2SO4-soluble P. The concentration of total P in the C horizons ranged from 729 to 810 mg/kg. In the B horizons, between depths of 30 and 70 cm, a significant part of the primary acid-soluble P had been weathered and transformed to secondary P forms. The mean weathering rate of primary P in the soils was estimated to vary between 230 and 290 g/ha per year. The degrees of P saturation in the B and C horizons were less than 7%, and the solubility of PO4-P was negligible. The P conditions in the Ap horizons differed drastically from those in the subsurface horizons.
The high concentrations of total P (689-1870 mg/kg) in the Ap horizons are most likely attributable to long-term cultivation with positive P balances. A significant proportion of the P in the Ap horizons occurred in the NH4F- and NaOH-extractable forms and as organic P. These three P pools, together with the concentrations of oxalate-extractable Al and Fe, appear to control the dynamics of PO4-P in the soils. The degrees of P saturation in the Ap horizons were greater (8-36%) than in the subsurface horizons. This was also reflected in the sorption experiments: only the Ap horizons were able to maintain elevated PO4-P concentrations in the solution phase; all the subsoil horizons acted as sinks for PO4-P. Most of the available sorption capacity in the soils is located in the B horizons. The results suggest that this capacity could be utilized to reduce losses of soluble P from excessively fertilized soils by mixing highly sorptive material from the B horizons into the P-enriched surface soil. The drastic differences in P characteristics observed between adjoining horizons have to be taken into consideration when conducting soil sampling. Sampling of subsoils has to be made according to the genetic horizons or at small depth increments; otherwise, contrasting materials are likely to be mixed in the same sample, and the results from such samples are not representative of any material present in the studied profile. Air-drying of soil samples was found to alter the results of the sorption experiments and the water extractions, indicating that studies of the most labile P forms in soil should be carried out with moist samples.
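The degree of P saturation mentioned above is conventionally computed from oxalate-extractable P, Al and Fe on a molar basis, scaled by a factor α (often taken as 0.5) to estimate the sorption capacity; the thesis's exact convention is not stated here, so this is a generic sketch with illustrative values:

```python
# Molar masses (g/mol) of P, Al and Fe
M_P, M_AL, M_FE = 30.97, 26.98, 55.85

def degree_of_p_saturation(p_ox: float, al_ox: float, fe_ox: float,
                           alpha: float = 0.5) -> float:
    """Degree of P saturation (%) from oxalate-extractable P, Al and Fe.

    Inputs are in mg/kg; the sorption capacity is estimated as
    alpha * (Al_ox + Fe_ox) in mmol/kg (alpha = 0.5 is a common choice).
    """
    p = p_ox / M_P        # mmol/kg
    al = al_ox / M_AL     # mmol/kg
    fe = fe_ox / M_FE     # mmol/kg
    return 100.0 * p / (alpha * (al + fe))

# Illustrative values: 5 mmol P, 100 mmol Al, 100 mmol Fe per kg
dps = degree_of_p_saturation(154.85, 2698.0, 5585.0)  # ~5%
```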
Abstract:
The present study focuses on the translational strategies of Cocksfoot mottle virus (CfMV, genus Sobemovirus), which infects monocotyledonous plants. CfMV RNA lacks the 5' cap and the 3' poly(A) tail that ensure efficient translation of cellular messenger RNAs (mRNAs). Instead, CfMV RNA is covalently linked to a viral protein, VPg (viral protein, genome-linked). This indicates that the viral untranslated regions (UTRs) must functionally compensate for the lack of the cap and poly(A) tail. We examined the efficiency of translation initiation in CfMV by comparing it to well-studied viral translational enhancers. Although insertion of the CfMV 5'UTR (CfMVe) into plant expression vectors improved gene expression in barley more than the other translational enhancers examined, studies at the RNA level showed that CfMVe alone, or in combination with the CfMV 3'UTR, did not provide the RNAs with a translational advantage. Mutation analysis revealed that translation initiation from CfMVe involved scanning. Interestingly, CfMVe also promoted translation initiation from an intercistronic position of dicistronic mRNAs in vitro. Furthermore, internal initiation occurred with similar efficiency in translation lysates with reduced concentrations of eukaryotic initiation factor (eIF) 4E, suggesting that initiation was independent of eIF4E. In contrast, reduced translation in eIF4G-depleted lysates indicated that translation from internally positioned CfMVe was eIF4G-dependent. After successful translation initiation, leaky scanning brings the ribosomes to the second open reading frame (ORF). The CfMV polyprotein is produced from this and the following overlapping ORF via programmed -1 ribosomal frameshift (-1 PRF). Two signals in the mRNA at the beginning of the overlap program approximately every fifth ribosome to slip one nucleotide backwards and continue translation in the new -1 frame.
This leads to the production of a C-terminally extended polyprotein, which includes the viral RNA-dependent RNA polymerase (RdRp). The -1 PRF event in CfMV was very efficient, even though it was programmed by a simple stem-loop structure instead of the pseudoknot usually required for high -1 PRF frequencies. Interestingly, regions surrounding the -1 PRF signals improved the -1 PRF frequencies. Viral protein P27 inhibited the -1 PRF event in vivo, putatively by binding to the -1 PRF site, suggesting that P27 could regulate the occurrence of -1 PRF. Initiation of viral replication requires that viral proteins are released from the polyprotein. This is catalyzed by the viral serine protease, which is itself encoded within the polyprotein. N-terminal amino acid sequencing of CfMV VPg revealed that the junction of the protease and VPg was cleaved between glutamate (E) and asparagine (N) residues. This suggested that the processing sites used in CfMV differ from the glutamate/serine (E/S) or glutamate/threonine (E/T) sites utilized in other sobemoviruses. However, further analysis revealed that E/S and E/T sites may also be used to cleave out some of the CfMV proteins.
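A -1 PRF frequency of roughly one ribosome in five corresponds to ~20% frameshift efficiency; in dual-reporter assays this is commonly computed as a normalised ratio of downstream to upstream reporter activity (a generic sketch, not necessarily the assay used in this study):

```python
def prf_efficiency(test_fluc: float, test_rluc: float,
                   control_fluc: float, control_rluc: float) -> float:
    """-1 PRF efficiency (%) from dual-luciferase readings.

    Downstream (firefly) activity requires the frameshift; upstream
    (Renilla) activity does not. Normalising to an in-frame control
    construct cancels differences in expression and enzyme activity.
    """
    return 100.0 * (test_fluc / test_rluc) / (control_fluc / control_rluc)

# Illustrative readings: 20% of ribosomes shift frame
eff = prf_efficiency(20.0, 100.0, 100.0, 100.0)  # ~20%, i.e. every fifth ribosome
```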
Abstract:
A recent report to the Australian Government identified concerns relating to Australia's capacity to respond to a medium to large outbreak of FMD. To assess the resources required, the AusSpread disease simulation model was used to develop a plausible outbreak scenario that included 62 infected premises in five states at the time of detection, 28 days after the disease entered the first property in Victoria. Movements of infected animals and/or contaminated product or equipment led to smaller outbreaks in NSW, Queensland, South Australia and Tasmania. With unlimited staff resources, the outbreak was eradicated in 63 days with 54 infected premises and a 98% chance of eradication within 3 months. This unconstrained response was estimated to involve 2724 personnel. Unlimited personnel was considered unrealistic; therefore, the course of the outbreak was modelled using three levels of staffing, and the probability of achieving eradication within 3 or 6 months of introduction was determined. Under the baseline staffing level, there was only a 16% probability that the outbreak would be eradicated within 3 months, and a 60% probability of eradication within 6 months. Deployment of an additional 60 personnel in the first 3 weeks of the response increased the likelihood of eradication within 3 months to 68%, and within 6 months to 100%. Deployment of further personnel incrementally increased the likelihood of timely eradication and decreased the duration and size of the outbreak. Targeted use of vaccination in high-risk areas, coupled with the baseline personnel resources, increased the probability of eradication within 3 months to 74% and within 6 months to 100%. This required 25 vaccination teams commencing 12 days into the control program, increasing to 50 vaccination teams 3 weeks later. Deploying an equal number of additional personnel to surveillance and infected-premises operations was equally effective in reducing the outbreak size and duration.
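Probabilities such as a 68% chance of eradication within 3 months are typically estimated as the proportion of stochastic model replicates meeting the criterion; a minimal sketch with hypothetical replicate durations (not AusSpread output):

```python
def eradication_probability(durations_days, horizon_days):
    """Fraction of simulation replicates eradicated within the horizon."""
    n = len(durations_days)
    if n == 0:
        raise ValueError("no replicates supplied")
    return sum(d <= horizon_days for d in durations_days) / n

# Hypothetical outbreak durations (days) from ten model replicates
runs = [55, 70, 85, 95, 110, 60, 80, 100, 75, 88]
p_3_months = eradication_probability(runs, 90)  # 7 of 10 runs finish by day 90
```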
Abstract:
Conyza bonariensis is a major weed infesting zero-tilled cropping systems in subtropical Australia, particularly in wheat and winter fallows. Uncontrolled C. bonariensis survives to become a problem weed in the following crops or fallows. As no herbicide has been registered for C. bonariensis in wheat, the effectiveness of 11 herbicides currently registered for other broad-leaved weeds in wheat was evaluated in two pot and two field experiments. As previous research showed that the age of C. bonariensis and, to a lesser extent, the soil moisture at spraying affected herbicide efficacy, these factors were also investigated. The efficacy of the majority of herbicide treatments was reduced when large rosettes (5-15 cm diameter) were treated, compared with small rosettes (<5 cm diameter). However, for the majority of herbicide treatments, soil moisture did not affect herbicide efficacy in the pot experiments. In the field, a delay in herbicide treatment of 2 weeks reduced herbicide efficacy consistently across treatments, which was related to weed age but not to soil moisture differences. Across all the experiments, four herbicides controlled C. bonariensis in wheat consistently (83-100%): 2,4-D; aminopyralid + fluroxypyr; picloram + MCPA + metsulfuron; and picloram + high rates of 2,4-D. Thus, this problem weed can be effectively and consistently controlled in wheat, particularly when small rosettes are treated, and C. bonariensis will therefore have a less adverse impact on the following fallow or crop.
Abstract:
Pratylenchus thornei is a major pathogen of wheat crops in the northern grain region of eastern Australia, with an estimated annual yield loss of $38 million. Damaged crops show symptoms of water and nutrient stress that suggest uptake is significantly affected. To understand the mechanisms involved in reducing water uptake and consequently plant yield, detailed measurements of water extraction and leaf area were conducted on a range of wheat cultivars with differing levels of tolerance and resistance to P. thornei.
Abstract:
Background: There has been considerable publicity regarding population ageing and hospital emergency department (ED) overcrowding. Our study aims to investigate the impact of one intervention piloted in Queensland, Australia, the Hospital in the Nursing Home (HiNH) program, on reducing ED and hospital attendances from residential aged care facilities (RACFs). Methods: A quasi-experimental study was conducted at an intervention hospital undertaking the program and a control hospital with normal practice. Routine Queensland health information system data were extracted for analysis. Results: Significant reductions in the number of ED presentations per 1000 RACF beds (rate ratio (95% CI): 0.78 (0.67-0.92); p = 0.002), the number of hospital admissions per 1000 RACF beds (0.62 (0.50-0.76); p < 0.0001), and the number of hospital admissions per 100 ED presentations (0.61 (0.43-0.85); p = 0.004) were observed in the intervention hospital after the intervention, whereas there were no significant differences between the intervention and control hospitals before the intervention. Pre-test and post-test comparison in the intervention hospital also showed significant decreases in the ED presentation rate (0.75 (0.65-0.86); p < 0.0001) and the hospital admission rate per RACF bed (0.66 (0.54-0.79); p < 0.0001), and a non-significant reduction in the hospital admission rate per ED presentation (0.82 (0.61-1.11); p = 0.196). Conclusions: The Hospital in the Nursing Home program could be effective in reducing ED presentations and hospital admissions from RACF residents. Implementation of the program across a variety of settings is needed to fully assess the ongoing benefits for patients and any possible cost savings.
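A rate ratio such as 0.78 (95% CI 0.67-0.92) compares event rates between two groups, with the confidence interval computed on the log scale; a standard sketch using hypothetical counts (not the study's data):

```python
import math

def rate_ratio_ci(events1, persontime1, events2, persontime2, z=1.96):
    """Rate ratio (group 1 vs group 2) with a Wald 95% CI on the log scale.

    The standard error of log(RR) for Poisson counts is
    sqrt(1/events1 + 1/events2); z = 1.96 gives a 95% interval.
    """
    rr = (events1 / persontime1) / (events2 / persontime2)
    se = math.sqrt(1.0 / events1 + 1.0 / events2)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Hypothetical: 78 vs 100 ED presentations per 1000 bed-years
rr, lo, hi = rate_ratio_ci(78, 1000, 100, 1000)  # rr ~ 0.78
```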
Abstract:
Two trials were conducted in this project. One was a continuation of work started under a previous GRDC/SRDC-funded activity, 'Strategies to improve the integration of legumes into cane based farming systems'. This trial aimed to assess the impact of trash and tillage management options and nematicide application on nematodes and crop performance. Methods and results are contained in the following publication: Halpin NV, Stirling GR, Rehbein WE, Quinn B, Jakins A, Ginns SP. The impact of trash and tillage management options and nematicide application on crop performance and plant-parasitic nematode populations in a sugarcane/peanut farming system. Proc. Aust. Soc. Sugar Cane Technol. 37, 192-203. Nematicide application in the plant crop significantly reduced total numbers of plant-parasitic nematodes (PPN) but had no impact on yield. Application of nematicide to the ratoon crop significantly reduced sugar yield. The study confirmed other work demonstrating that strategies such as reduced tillage lowered total PPN populations, suggesting that the soil was more suppressive to PPN in those treatments. The second trial, a variety trial, demonstrated the limited value of nematicide application in sugarcane farming systems. This study has highlighted that growers should not view nematicides as a 'cure-all' for paddocks that have historically had high PPN numbers. Nematicides have high mammalian toxicity, have the potential to contaminate groundwater (Kookana et al. 1995) and are costly. The cost of nematicide used in R1 was approximately $320-$350/ha, adding $3.50/t of cane in a 100 t/ha crop. Also, our study demonstrated that a single nematicide treatment at the application rate registered for sugarcane is not very effective in reducing populations of nematode pests. There appear to be some levels of resistance to nematodes within the current suite of varieties available to the southern canelands.
For example, the soil in plots growing Q183 had 560% more root-knot nematodes per 200 mL of soil than plots growing Q245. The authors see great value in investment in a nematode screening program that could rate varieties into groups of susceptibility to both major sugarcane nematode pests. Such a rating could then be built into a decision support 'tree' or tool to better enable producers to select varieties on a paddock-by-paddock basis.
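The $3.50/t figure follows from spreading the per-hectare nematicide cost over the crop yield; a minimal sketch using the numbers quoted above:

```python
def cost_per_tonne(cost_per_ha: float, yield_t_per_ha: float) -> float:
    """Treatment cost spread over the crop yield ($/t of cane)."""
    if yield_t_per_ha <= 0:
        raise ValueError("yield must be positive")
    return cost_per_ha / yield_t_per_ha

# $350/ha of nematicide in a 100 t/ha crop
print(cost_per_tonne(350.0, 100.0))  # 3.5
```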