Abstract:
The efficacy of supported covers was investigated under field conditions using a series of prototypes deployed on an anaerobic pond treating typical piggery waste. Research focused on identifying effective cover support materials and deployment methods, quantifying odour reduction, and estimating the life expectancy of various permeable cover materials. Over a 10-month period, median odour emission rates were five to eight times lower from supported straw cover surfaces and a non-woven, spun fibre polypropylene weed control material than from the adjacent uncovered pond surface. While the straw covers visually appeared to degrade very rapidly, they continued to reduce odour emissions effectively. The polypropylene cover appeared to offer advantages from the perspectives of cost, reduced maintenance and ease of manufacture.
Abstract:
Nutrition influences reproductive efficiency and the survival of lambs and weaners but the costs of supplementary feeding or maintaining low stocking rates are not justified by the resulting income from higher lamb weaning rates and reduced weaner mortality. The current practice of segmenting the ewe flock using ultrasound scanning to determine the number of foetuses still results in groups of ewes with a wide range of condition scores and with widely differing nutritional requirements. This report describes an approach to precision management of pregnant ewes and weaners that is based on the e-sheep platform of technologies and uses computer-directed drafting for nutritional management of individual animals and walk-through weighing to monitor changing nutritional status. It is estimated that the cost of feeding a thousand-ewe flock can be reduced from $14,000 for feeding all animals to $3300 for targeted feeding of 25% of ewes requiring additional nutrition and 20% of weaners at risk of dying. The cost of the targeted feeding strategy is more than justified by the value of additional 12-month-old animals, which is $9000. The e-sheep precision nutrition system is not attractive to industry at this stage because of the cost of the e-sheep infrastructure, the perceived complexity of the technology and the requirement for further research, but it is expected to be a commercial option within three years.
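The economics quoted above can be checked with simple arithmetic. The sketch below uses only the dollar figures stated in the abstract; the variable names and the net-benefit framing are our own illustration:

```python
# Figures quoted in the abstract for a 1000-ewe flock (AUD).
blanket_feeding_cost = 14_000   # feeding all animals
targeted_feeding_cost = 3_300   # 25% of ewes plus 20% of at-risk weaners
extra_animal_value = 9_000      # value of additional 12-month-old animals

# Direct saving from targeting feed rather than feeding the whole flock.
feed_saving = blanket_feeding_cost - targeted_feeding_cost

# The targeted strategy more than pays for itself: the additional
# animals are worth more than the targeted feed bill.
net_benefit = extra_animal_value - targeted_feeding_cost

print(f"Feed saving: ${feed_saving:,}; net benefit: ${net_benefit:,}")
```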
Abstract:
Dairy farms in subtropical Australia use irrigated, annually sown short-term ryegrass (Lolium multiflorum) or mixtures of short-term ryegrass and white (Trifolium repens) and Persian (shaftal) (T. resupinatum) clover during the winter-spring period in all-year-round milk production systems. A series of small plot cutting experiments was conducted in 3 dairying regions (tropical upland, north Queensland, and subtropical southeast Queensland and northern New South Wales) to determine the most effective rate and frequency of application of nitrogen (N) fertiliser. The experiments were not grazed, nor was harvested material returned to the plots after sampling. Rates up to 100 kg N/ha.month (as urea or calcium ammonium nitrate) and up to 200 kg N/ha every 2 months (as urea) were applied to pure stands of ryegrass in 1991. In 1993 and 1994, urea, at rates up to 150 kg N/ha.month and up to 200 kg N/ha every 2 months, was applied to pure stands of ryegrass; urea, at rates up to 50 kg N/ha.month, was also applied to ryegrass-clover mixtures. The results indicate that applications of 50-85 kg N/ha.month can be recommended for short-term ryegrass pastures throughout the subtropics and tropical uplands of eastern Australia, irrespective of soil type. At this rate, dry matter yields will reach about 90% of their potential, forage nitrogen concentration will be increased, the risk of nitrate poisoning to stock will be minimal, and there will be no substantial increase in soil N. The recommended rate of N for ryegrass-clover pastures is slightly higher than for pure ryegrass but, at these rates, the clover component will be suppressed. However, increased ryegrass yields and higher forage nitrogen concentrations will compensate for the reduced clover component.
At application rates up to 100 kg N/ha.month, build-up of NO3⁻-N and NH4⁺-N in soil was generally restricted to the surface layers (0-20 cm), but there was a substantial increase throughout the soil profile at 150 kg N/ha.month. The build-up of NO3⁻-N and NH4⁺-N was greater, and occurred at lower application rates, on the lighter soil compared with heavy clays. Generally, most of the soil N was in the NO3⁻-N form and most was in the top 20 cm.
Abstract:
In recent years, dieback of durian has become a major problem in mature orchards in the northern Queensland wet tropics region. A survey of 13 durian orchards was conducted during the dry season (July-September 2001) and following wet season (February-April 2002), with roots and soil from the root zone of affected trees being sampled. Phytophthora palmivora was recovered from the roots of affected trees on 12 of the 13 farms in the dry season, and all farms in the wet season. Pythium vexans was recovered from all 13 farms in both seasons. P. palmivora and P. vexans were recovered from diseased roots of 3-month-old durian seedlings cv. Monthong artificially inoculated with these organisms.
Abstract:
Multiple sclerosis (MS) is a chronic, inflammatory disease of the central nervous system, characterized especially by myelin and axon damage. Cognitive impairment in MS is common but difficult to detect without a neuropsychological examination. Valid and reliable methods are needed in clinical practice and research to detect deficits, follow their natural evolution, and verify treatment effects. The Paced Auditory Serial Addition Test (PASAT) is a measure of sustained and divided attention, working memory, and information processing speed, and it is widely used in the neuropsychological evaluation of MS patients. Additionally, the PASAT is the sole cognitive measure in an assessment tool primarily designed for MS clinical trials, the Multiple Sclerosis Functional Composite (MSFC). The aims of the present study were to determine a) the frequency, characteristics, and evolution of cognitive impairment among relapsing-remitting MS patients, and b) the validity and reliability of the PASAT in measuring cognitive performance in MS patients. The subjects were 45 relapsing-remitting MS patients from the Department of Neurology, Seinäjoki Central Hospital, and 48 healthy controls. Both groups underwent comprehensive neuropsychological assessments, including the PASAT, twice in a one-year follow-up; additionally, a sample of 10 patients and controls was evaluated with the PASAT in serial assessments five times in one month. The frequency of cognitive dysfunction among relapsing-remitting MS patients in the present study was 42%. Impairments were characterized especially by slowed information processing speed and memory deficits. During the one-year follow-up, cognitive performance was relatively stable among MS patients at the group level. However, practice effects in cognitive tests were less pronounced among MS patients than healthy controls.
At an individual level, the spectrum of MS patients' cognitive deficits was wide with regard to their characteristics, severity, and evolution. The PASAT was moderately accurate in detecting MS-associated cognitive impairment, and 69% of patients were correctly classified as cognitively impaired or unimpaired when a comprehensive neuropsychological assessment was used as the "gold standard". Self-reported nervousness and poor arithmetical skills seemed to explain misclassifications. MS-related fatigue was objectively demonstrated as fading performance towards the end of the test. Despite the observed practice effect, the reliability of the PASAT was excellent, and it was sensitive to the cognitive decline taking place during the follow-up in a subgroup of patients. The PASAT can be recommended for use in the neuropsychological assessment of MS patients. The test is fairly sensitive but less specific; consequently, the reasons for low scores have to be carefully identified before interpreting them as clinically significant.
Abstract:
Feral pigs (Sus scrofa) are believed to have a severe negative impact on the ecological values of tropical rainforests in north Queensland, Australia. Most perceptions of the environmental impacts of feral pigs focus on their disturbance of the soil or surface material (diggings). Spatial and temporal patterns of feral pig diggings were identified in this study: most diggings occurred in the early dry season and predominantly in moist soil (swamp and creek) microhabitats, with only minimal pig diggings found elsewhere through the general forest floor. On average, pig diggings disturbed 0.09% of the rainforest floor per day. Most diggings occurred 3-4 months after the month of maximum rainfall. Most pig diggings were recorded in highland swamps, with over 80% of the swamp areas dug by pigs at some time during the 18-month study period. These results suggest that management of feral pig impacts should focus on protecting swamp and creek microhabitats in the rainforest, which are preferred by pigs for digging and which have high environmental significance.
Abstract:
ADHD (attention deficit hyperactivity disorder) is a developmental neurobiological disability. In adults, the prevalence of ADHD has been estimated at about 4%. In addition to attention difficulties, problems in executive functioning are typical, and psychiatric comorbidities are common. The most extensively studied treatments are pharmacological. There is also evidence for the usefulness of cognitive-behavioural therapy (CBT) in the treatment of adults with ADHD. There are some preliminary results on the effectiveness of cognitive training and hypnosis in children, but no scientific evidence in adults. This dissertation is based on two intervention studies. In the first study, the usefulness of a new group CBT (n = 29) and the maintenance of symptom reduction over a six-month follow-up were studied. In the second study, the usefulness of short hypnotherapy (n = 9), short individual CBT (n = 10) and computerized cognitive training (n = 9) was examined by comparing the groups with each other and with a control group (n = 10). Participation in the group CBT and the participants' satisfaction were good. There were no changes in self-reports during a three-month waiting period. After the rehabilitation, symptoms decreased. Participants showing symptom reduction during rehabilitation maintained their benefit through the 6-month follow-up period. In a combined ADHD symptom score based on self-reports, seven participants in the hypnotherapy group, six in the CBT group, two in the cognitive training group and two controls improved. Using independent evaluations, improvement was found in six of the hypnotherapy, seven of the CBT, two of the cognitive training and three of the control participants. There was no treatment-related improvement in cognitive performance. Thus, in the hypnotherapy and CBT groups, some encouraging improvement was seen.
In the cognitive training group, there was improvement in the trained tasks but no generalization of the improvement. The results support earlier findings on the usefulness of CBT in the treatment of adults with ADHD. Hypnotherapy also appears to be a useful form of rehabilitation. More research is needed to evaluate the usefulness of cognitive training. These promising results warrant further studies with more participants and longer treatment durations. Different measures of cognitive functioning and quality of life are also needed. In addition to medication, it is important to arrange psychosocial interventions for adults with ADHD.
Abstract:
Approximately one-third of stroke patients experience depression. Stroke also has a profound effect on the lives of caregivers of stroke survivors. However, depression in this latter population has received little attention. In this study the objectives were to determine which factors are associated with and can be used to predict depression at different points in time after stroke; to compare different depression assessment methods among stroke patients; and to determine the prevalence, course and associated factors of depression among the caregivers of stroke patients. A total of 100 consecutive hospital-admitted patients no older than 70 years of age were followed for 18 months after their first ischaemic stroke. Depression was assessed according to the Diagnostic and Statistical Manual of Mental Disorders (DSM-III-R), Beck Depression Inventory (BDI), Hamilton Rating Scale for Depression (HRSD), Visual Analogue Mood Scale (VAMS), Clinical Global Impression (CGI) and caregiver ratings. Neurological assessments and a comprehensive neuropsychological test battery were performed. Depression in caregivers was assessed by the BDI. Depressive symptoms had an early onset in most cases. Mild depressive symptoms were often persistent, with little change during the 18-month follow-up, although there was an increase in major depression over the same interval. Stroke severity was associated with depression, especially from 6 to 12 months post-stroke. At the acute phase, older patients were at higher risk of depression, and a higher proportion of men were depressed at 18 months post-stroke. Of the various depression assessment methods, none stood clearly apart from the others. The feasibility of each did not differ greatly, but prevalence rates differed widely according to the different criteria. When compared against DSM-III-R criteria, sensitivity and specificity were acceptable for the CGI, BDI and HRSD. The CGI and BDI had better sensitivity than the more specific HRSD.
The VAMS did not seem to be a reliable method for assessing depression among stroke patients. The caregivers often rated patients' depression as more severe than did the patients themselves. Moreover, their ratings seemed to be influenced by their own depression. Of the caregivers, 30-33% were depressed. At the acute phase, caregiver depression was associated with the severity of the stroke and the older age of the patient. The best predictor of caregiver depression at later follow-up was caregiver depression at the acute phase. The results suggest that depression should be assessed during the early post-stroke period and that the follow-up of those at risk of a poor emotional outcome should be extended beyond the first year post-stroke. Further, the assessment of the well-being of caregivers should be included as part of the rehabilitation plan for stroke patients.
Abstract:
Background Psychotic-like experiences (PLEs) are subclinical delusional ideas and perceptual disturbances that have been associated with a range of adverse mental health outcomes. This study reports a qualitative and quantitative analysis of the acceptability, usability and short term outcomes of Get Real, a web program for PLEs in young people. Methods Participants were twelve respondents to an online survey, who reported at least one PLE in the previous 3 months, and were currently distressed. Ratings of the program were collected after participants trialled it for a month. Individual semi-structured interviews then elicited qualitative feedback, which was analyzed using Consensual Qualitative Research (CQR) methodology. PLEs and distress were reassessed at 3 months post-baseline. Results User ratings supported the program's acceptability, usability and perceived utility. Significant reductions in the number, frequency and severity of PLE-related distress were found at 3 months follow-up. The CQR analysis identified four qualitative domains: initial and current understandings of PLEs, responses to the program, and context of its use. Initial understanding involved emotional reactions, avoidance or minimization, limited coping skills and non-psychotic attributions. After using the program, participants saw PLEs as normal and common, had greater self-awareness and understanding of stress, and reported increased capacity to cope and accept experiences. Positive responses to the program focused on its normalization of PLEs, usefulness of its strategies, self-monitoring of mood, and information putting PLEs into perspective. Some respondents wanted more specific and individualized information, thought the program would be more useful for other audiences, or doubted its effectiveness. The program was mostly used in low-stress situations. Conclusions The current study provided initial support for the acceptability, utility and positive short-term outcomes of Get Real. 
The program now requires efficacy testing in randomized controlled trials.
Abstract:
Aim Psychotic-like experiences (PLEs) are common in young people and are associated with both distress and adverse outcomes. The Community Assessment of Psychic Experiences-Positive Scale (CAPE-P) provides a 20-item measure of lifetime PLEs. A 15-item revision of this scale was recently published (CAPE-P15). Although the CAPE-P has been used to assess PLEs in the last 12 months, there is no version of the CAPE for assessing more recent PLEs (e.g. 3 months). This study aimed to determine the reliability and validity of the current CAPE-P15 and assess its relationship with current distress. Method A cross-sectional online survey of 489 university students (17–25 years) assessed lifetime and current substance use, current distress, and lifetime and 3-month PLEs on the CAPE-P15. Results Confirmatory factor analysis indicated that the current CAPE-P15 retained the same three-factor structure as the lifetime version consisting of persecutory ideation, bizarre experiences and perceptual abnormalities. The total score of the current version was lower than the lifetime version, but the two were strongly correlated (r = .64). The current version was highly predictive of generalized distress (r = .52) and indices that combined symptom frequency with associated distress did not confer greater predictive power than frequency alone. Conclusion This study provided preliminary data that the current CAPE-P15 provides a valid and reliable measure of current PLEs. The current CAPE-P15 is likely to have substantial practical utility if it is later shown to be sensitive to change, especially in prevention and early intervention for mental disorders in young people.
Abstract:
[Excerpt] This second issue in the current four-volume series of Social Security Programs Throughout the World reports on the countries of Asia and the Pacific. The combined findings of this series, which also includes volumes on Europe, Africa, and the Americas, are published at 6-month intervals over a 2-year period. Each volume highlights features of social security programs in the particular region. This guide serves as an overview of programs in all regions. A few political jurisdictions have been excluded because they have no social security system or have issued no information regarding their social security legislation. In the absence of recent information, national programs reported in previous volumes may also be excluded. In this volume on Asia and the Pacific, the data reported are based on laws and regulations in force in July 2006 or on the last date for which information has been received. Information for each country on types of social security programs, types of mandatory systems for retirement income, contribution rates, and demographic and other statistics related to social security is shown in Tables 1–4 at the end of the guide. The country summaries show each system's major features. Separate programs in the public sector and specialized funds for such groups as agricultural workers, collective farmers, or the self-employed have not been described in any detail. Benefit arrangements of private employers or individuals are not described in any detail, even though such arrangements may be mandatory in some countries or available as alternatives to statutory programs. The country summaries also do not refer to international social security agreements that may be in force between two or more countries. Those agreements may modify coverage, contributions, and benefit provisions of national laws summarized in the country write-ups.
Since the summary format requires brevity, technical terms have been developed that are concise as well as comparable and are applied to all programs. The terminology may therefore differ from national concepts or usage.
Abstract:
Polioencephalomalacia was diagnosed histologically in cattle from two herds on the Darling Downs, Queensland, during July-August 2007. In the first incident, 8 of 20 18-month-old Aberdeen Angus steers died while grazing pastures comprising 60% Sisymbrium irio (London rocket) and 40% Capsella bursa-pastoris (shepherd's purse). In the second incident, 2 of 150 mixed-breed adult cattle died, and another was successfully treated with thiamine, while grazing a pasture comprising almost 100% Raphanus raphanistrum (wild radish). Affected cattle were found dead or comatose, or in some cases were seen apparently blind and head-pressing. For both incidents, plant and water assays were used to calculate the total dietary sulfur content in dry matter as 0.62% and 1.01% respectively, both exceeding the recommended maximum of 0.5% for cattle eating more than 40% forage. Blood and tissue assays for lead were negative in both cases. No access to thiaminase, concentrated sodium ion or extrinsic hydrogen sulfide sources was identified in either incident. Below-median late summer and autumn rainfall followed by above-median unseasonal winter rainfall promoted weed growth at the expense of wholesome pasture species before these incidents.
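The total dietary sulfur figures above combine sulfur in the forage with sulfur ingested via drinking water. A minimal sketch of that kind of calculation follows; the function name and the example intake figures are hypothetical (not from the abstract), and sulfate is taken as roughly 33.3% sulfur by mass:

```python
def total_dietary_sulfur_pct(forage_s_pct, water_sulfate_mg_l,
                             water_l_per_day, dm_intake_kg_per_day):
    """Total dietary sulfur as a percentage of dry matter (DM) intake.

    Sulfur supplied in drinking water (sulfate is ~33.3% S by mass) is
    expressed relative to daily DM intake and added to the forage S
    concentration.
    """
    s_from_water_mg = water_sulfate_mg_l * 0.333 * water_l_per_day
    dm_intake_mg = dm_intake_kg_per_day * 1_000_000
    water_s_pct = s_from_water_mg / dm_intake_mg * 100
    return forage_s_pct + water_s_pct

# Hypothetical example: forage at 0.45% S, water at 1500 mg/L sulfate,
# 40 L/day water intake, 10 kg/day DM intake.
total = total_dietary_sulfur_pct(0.45, 1500, 40, 10)
print(f"{total:.2f}% S in DM")  # compare against the 0.5% guideline
```

With these illustrative inputs, water alone contributes about 0.2% S, pushing the total well past the 0.5% guideline even though the forage itself is below it.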
Abstract:
For pasture growth in the semi-arid tropics of north-east Australia, where up to 80% of annual rainfall occurs between December and March, the timing and distribution of rainfall events is often more important than the total amount. In particular, the timing of the 'green break of the season' (GBOS) at the end of the dry season, when new pasture growth becomes available as forage and a live-weight gain is measured in cattle, affects several important management decisions that prevent overgrazing and pasture degradation. Currently, beef producers in the region use a GBOS rule based on rainfall (e.g. 40 mm of rain over three days by 1 December) to define the event and make their management decisions. A survey of 16 beef producers in north-east Queensland showed that three-quarters of respondents use a rainfall amount that occurs in only half or less than half of all years at their location. In addition, only half the producers expect the GBOS to occur within two weeks of the median date calculated by the CSIRO plant growth days model GRIM. This result suggests that in the producer rules, either the rainfall quantity or the period of time over which the rain is expected is unrealistic. Despite only 37% of beef producers indicating that they use a southern oscillation index (SOI) forecast in their decisions, cross-validated LEPS (linear error in probability space) analyses showed that both the average 3-month July-September SOI and the 2-month August-September SOI have significant forecast skill in predicting the probability of both the amount of wet season rainfall and the timing of the GBOS. The communication and implementation of a rigorous and realistic definition of the GBOS, and the likely impacts of anthropogenic climate change on the region, are discussed in the context of the sustainable management of northern Australian rangelands.
Abstract:
When recapturing satellite-collared wild dogs that had been trapped one month previously in padded foothold traps, we noticed varying degrees of pitting on the pads of their trapped paw. Veterinary advice, based on images taken of the injuries, suggests that the necrosis was caused by vascular compromise. Five of the six dingoes we recaptured had varying degrees of necrosis restricted to the trapped foot, ranging from single 5 mm holes to 25% sections of the toe pads missing or deformed, including loss of nails. The traps used were rubber-padded, two-coiled, Victor Soft Catch #3 traps. The springs were not standard Victor springs but Beefer springs; this modification slightly increases trap speed and the jaw pressure on the trapped foot. Despite this modification, the spring pressure is still relatively mild in comparison to conventional long-spring or four-coiled wild dog traps. The five wild dogs that developed necrosis were trapped in November 2006 at 5-6 months of age. Traps were checked each morning, so the dogs were unlikely to have been restrained in the trap for more than 12 hours. All dogs exhibited a small degree of paw damage at capture, which presented as a swollen paw with compression at the capture point. In contrast, eight 7-8-month-old wild dogs were captured two months later in February. Upon their release, on advice from a veterinarian, we massaged the trapped foot and applied a bruise treatment (Heparinoid 8.33 mg/ml) to assist in restoring blood flow. These animals were subsequently recaptured several months later and showed no signs of necrosis. While post-capture foot injuries are unlikely to be an issue in conventional control programs where the animal is immediately destroyed, caution needs to be used when releasing accidentally captured domestic dogs or research animals captured in rubber-padded traps.
We have demonstrated that 7-8-month-old dogs can be trapped and released without any evidence of subsequent necrosis following minimal veterinary treatment. We suspect that the rubber padding on traps may increase the tourniquet effect by wrapping around the paw, and we recommend the evaluation of offset laminated steel jaw traps as an alternative. Offset laminated steel jaw traps have been shown to be relatively humane, producing as few foot injuries as rubber-jawed traps.
Abstract:
A total of 2115 heifers from two tropical genotypes (1007 Brahman and 1108 Tropical Composite) raised in four locations in northern Australia were ovarian-scanned every 4-6 weeks to determine the age at the first-observed corpus luteum (CL), and this was used to define the age at puberty for each heifer. Other traits recorded at each ovarian scanning were liveweight, fat depths and body condition score. Reproductive tract size was measured close to the start of the first joining period. Results showed significant effects of location and birth month on the age at first CL and associated puberty traits. Genotypes did not differ significantly for the age or weight at first CL; however, Brahman were fatter at first CL and had a smaller reproductive tract size than Tropical Composite. Genetic analyses estimated the age at first CL to be moderately to highly heritable for Brahman (0.57) and Tropical Composite (0.52). The associated traits were also moderately heritable, except for reproductive tract size in Brahman (0.03) and, for Tropical Composite, the presence of an observed CL on the scanning day closest to the start of joining (0.07). Genetic correlations among puberty traits were mostly moderate to high and generally larger in magnitude for Brahman than for Tropical Composite. Genetic correlations between the age at CL and heifer- and steer-production traits showed important genotype differences. For Tropical Composite, the age at CL was negatively correlated with the heifer growth rate in their first postweaning wet season (-0.40) and carcass marbling score (-0.49), but was positively correlated with carcass P8 fat depth (0.43). For Brahman, the age at CL was moderately negatively genetically correlated with heifer measures of bodyweight, fatness, body condition score and IGF-I, in both their first postweaning wet and second dry seasons, but was positively correlated with the dry-season growth rate.
For Brahman, genetic correlations between the age at CL and steer traits showed possible antagonisms with feedlot residual feed intake (-0.60) and meat colour (0.73). Selection can be used to change the heifer age at puberty in both genotypes, with few major antagonisms with steer- and heifer-production traits.