991 results for SOIL NEMATODE COMMUNITY
Abstract:
Nitrogen relations of natural and disturbed tropical plant communities in northern Australia (Kakadu National Park) were studied. Plant and soil N characteristics suggested that differences in N source utilisation occur at the community and species levels. Leaf and xylem sap N concentrations of plants in different communities were correlated with the availability of inorganic soil N (NH4+ and NO3-). In general, rates of leaf NO3- assimilation were low. Even in communities with a higher N status, including deciduous monsoon forest, disturbed wetland, and a revegetated mine waste rock dump, levels of leaf nitrate reductase and of xylem and leaf NO3- were considerably lower than those that have been reported for eutrophic communities. Although NO3- assimilation in escarpment and eucalypt woodlands, and wetland, was generally low, within these communities there was a suite of species that exhibited a greater capacity for NO3- assimilation. These high-NO3- species were mainly annuals, resprouting herbs or deciduous trees that had leaves with high N contents. Ficus, a high-NO3- species, was associated with soil exhibiting higher rates of net mineralisation and net nitrification. Low-NO3- species were evergreen perennials with low leaf N concentrations. A third group of plants, which assimilated NO3- (albeit at lower rates than the high-NO3- species) and had high-N leaves, comprised the leguminous species. Acacia species, common in woodlands, had the highest leaf N contents of all woody species. Acacia species appeared to have the greatest potential to utilise the entire spectrum of available N sources. This versatility in N source utilisation may be important in relation to their high tissue N status and comparatively short life cycle. Differences in N utilisation are discussed in the context of species life strategies and mycorrhizal associations.
Abstract:
Seventy-two epaulette sharks, Hemiscyllium ocellatum (Bonnaterre), were infected with the nematode parasite Proleptus australis Bayliss, 1933. The parasite population was overdispersed. Infection intensity ranged from 3 to 1002 worms per fish stomach, and there was a positive correlation between shark length and number of parasites present. The majority of worms were attached to the stomach wall, and scanning electron microscopy and histological examination showed that worms penetrated the stomach lining. Worms were observed within the lamina propria of the stomach and occasionally penetrated the muscularis mucosa. Little to no inflammatory or cellular immune reaction to the presence of the parasites was observed, except in one case where a worm was being degraded by a host tissue response. There was a large amount of connective tissue proliferation as a result of nematode attachment, but no obvious effects on the overall health of the sharks were seen. Three sharks were also found to be infected by the cestode Callitetrarhynchus sp.
Abstract:
A multisegment percolation system (MSPS) consisting of 25 individual collection wells was constructed to study the effects of localised soil heterogeneities on the transport of solutes in the vadose zone. In particular, this paper discusses the transport of water and nutrients (NO3-, Cl-, PO43-) through structurally stable, free-draining agricultural soil from Victoria, Australia. A solution of nutrients was irrigated onto the surface of a large undisturbed soil core over a 12-h period. This was followed by a continuous irrigation of distilled water, at a rate which did not cause ponding, for a further 18 days. During this time, the volume of leachate and the concentration of nutrients in the leachate of each well were measured. Very significant variation in drainage patterns across a small spatial scale was observed. Leaching of nitrate-nitrogen and chloride from the core occurred two days after initial application. However, less than 1% of the total applied phosphate-phosphorus leached from the soil during the 18-day experiment, indicating strong adsorption. Our experiments indicate considerable heterogeneity in water flow patterns and solute leaching on a small spatial scale. These results have significant ramifications for modelling solute transport and predicting nutrient loadings on a larger scale.
Abstract:
Coronary heart disease is a leading cause of death in Australia, with the Coalfields district of New South Wales having one of the country's highest rates. Identification of the Coalfields epidemic in the 1970s led to the formation of a community awareness program in the late 1980s (the healthy heart support group), followed by a more intense community action program in 1990, the Coalfields Healthy Heartbeat (CHHB). CHHB is a coalition of community members, local government officers, health workers and University researchers. We evaluate the CHHB program, examining both the nature and sustainability of heart health activities undertaken, as well as trends in risk factor levels and rates of coronary events in the Coalfields in comparison with nearby local government areas. Process data reveal difficulties mobilising the community as a whole; activities had to be selected for interested subgroups such as families of heart disease patients, school children, retired people and women concerned with family nutrition and body maintenance. Outcome data show a significantly larger reduction in case fatality for Coalfields men (although nonfatal heart attacks did not decline), while changes in risk factor levels were comparable with surrounding areas. We explain positive responses to the CHHB by schools, heart attack survivors and women interested in body maintenance in terms of the meaning these subgroups find in health promotion discourses based on their embodied experiences. When faced with a threat to one's identity, health discourse suddenly becomes meaningful, along with the regimens for health improvement. General public disinterest in heart health promotion is examined in the context of historical patterns of outsiders criticising the lifestyle of miners, an orientation toward communal rather than individual responsibility for health (i.e. community 'owned' emergency services and hospitals) and anger about risks from environmental hazards imposed by industrialists.
Abstract:
Application of geographic information system (GIS) and global positioning system (GPS) technology in the Hlabisa community-based tuberculosis treatment programme documents the increase in accessibility to treatment after the expansion of the service from health facilities to include community workers and volunteers.
Abstract:
OBJECTIVE: Although little studied in developing countries, multidrug-resistant tuberculosis (MDR-TB) is considered a major threat. We report the molecular epidemiology, clinical features and outcome of an emerging MDR-TB epidemic. METHODS: In 1996 all tuberculosis suspects in the rural Hlabisa district, South Africa, had sputum cultured, and drug susceptibility patterns of mycobacterial isolates were determined. Isolates with MDR-TB (resistant to both isoniazid and rifampicin) were DNA fingerprinted by restriction fragment length polymorphism (RFLP) using IS6110 and polymorphic guanine-cytosine-rich sequence-based (PGRS) probes. Patients with MDR-TB were traced to determine outcome. Data were compared with results from a survey of drug susceptibility done in 1994. RESULTS: The rate of MDR-TB among smear-positive patients increased six-fold from 0.36% (1/275) in 1994 to 2.3% (13/561) in 1996 (P = 0.04). A further eight smear-negative cases were identified in 1996 from culture, six of whom had not been diagnosed with tuberculosis. MDR disease was clinically suspected in only five of the 21 cases (24%). Prevalence of primary and acquired MDR-TB was 1.8% and 4.1%, respectively. Twelve MDR-TB cases (67%) were in five RFLP-defined clusters. Among 20 traced patients, 10 (50%) had died, five had active disease (25%) and five (25%) were apparently cured. CONCLUSIONS: The rate of MDR-TB has risen rapidly in Hlabisa, apparently due to both reactivation disease and recent transmission. Many patients were not diagnosed with tuberculosis and many were not suspected of drug-resistant disease, and outcome was poor.
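The "six-fold" increase reported above follows directly from the two survey proportions quoted in the abstract. A minimal arithmetic sketch using only those counts (1/275 in 1994, 13/561 in 1996):

```python
# Rates of MDR-TB among smear-positive patients, from the counts quoted
# in the abstract: 1/275 (1994 survey) and 13/561 (1996 survey).
def mdr_rate(cases, total):
    """Proportion of MDR-TB cases among smear-positive patients."""
    return cases / total

r_1994 = mdr_rate(1, 275)    # ~0.36%
r_1996 = mdr_rate(13, 561)   # ~2.3%
fold_increase = r_1996 / r_1994
print(f"1994: {r_1994:.2%}  1996: {r_1996:.2%}  increase: {fold_increase:.1f}-fold")
```

The exact ratio is about 6.4, which the abstract rounds to "six-fold"; with a numerator of 1 in the baseline year, the ratio is of course very sensitive to a single case.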
Abstract:
Community awareness of the sustainable use of land, water and vegetation resources is increasing. The sustainable use of these resources is pivotal to sustainable farming systems. However, techniques for monitoring the sustainable management of these resources are poorly understood and untested. We propose a framework to benchmark and monitor resources in the grains industry. Eight steps are listed below to achieve these objectives: (i) define industry issues; (ii) identify the issues through grower, stakeholder and community consultation; (iii) identify indicators (measurable attributes, properties or characteristics) of sustainability through consultation with growers, stakeholders, experts and community members, relating to: crop productivity; resource maintenance/enhancement; biodiversity; economic viability; community viability; and institutional structure; (iv) develop and use selection criteria to select indicators that consider: responsiveness to change; ease of capture; community acceptance and involvement; interpretation; measurement error; stability, frequency and cost of measurement; spatial scale issues; and mapping capability in space and through time. The appropriateness of indicators can be evaluated using a decision making system such as a multiobjective decision support system (MO-DSS, a method to assist in decision making from multiple and conflicting objectives); (v) involve stakeholders and the community in the definition of goals and setting benchmarking and monitoring targets for sustainable farming; (vi) take preventive and corrective/remedial action; (vii) evaluate effectiveness of actions taken; and (viii) revise indicators as part of a continual improvement principle designed to achieve best management practice for sustainable farming systems.
The major recommendations are to: (i) implement the framework for resources (land, water and vegetation, economic, community and institution) benchmarking and monitoring, and integrate this process with current activities so that awareness, implementation and evolution of sustainable resource management practices become normal practice in the grains industry; (ii) empower the grains industry to take the lead by using relevant sustainability indicators to benchmark and monitor resources; (iii) adopt a collaborative approach by involving various industry, community, catchment management and government agency groups to minimise implementation time. Monitoring programs such as Waterwatch, Soilcheck, Grasscheck and Topcrop should be utilised; (iv) encourage the adoption of a decision making system by growers and industry representatives as a participatory decision and evaluation process. Widespread use of sustainability indicators would assist in validating and refining these indicators and evaluating sustainable farming systems. The indicators could also assist in evaluating best management practices for the grains industry.
Abstract:
Under certain soil conditions, e.g. hardsetting clay B-horizons of South-Eastern Australia, wheat plants do not perform as well as would be expected given measurements of bulk soil attributes. In such soils, measurement indicates that a large proportion (80%) of roots are preferentially located in the soil within 1 mm of macropores. This paper addresses the question of whether there are biological and soil chemical effects concomitant with this observed spatial relationship. The properties of soil manually dissected from the 1-3 mm wide region surrounding macropores, the macropore sheath, were compared to those measured in a conventional manner on the bulk soil. Field specimens of two different soil materials were dissected to examine biological differentiation. To ascertain whether the macropore sheath soil differs from rhizosphere soil, wheat was grown in structured and repacked cores under laboratory conditions. The macropore sheath soil contained more microbial biomass per unit mass than both the bulk soil and the rhizosphere. The bacterial population in the macropore sheath was able to utilise a wider range of carbon substrates, and to a greater extent, than the bacterial population in the corresponding bulk soil. These differences between the macropore sheath and bulk soil were almost non-existent in the repacked cores. Evidence for larger numbers of propagules of the broad host range fungus Pythium in the macropore sheath soil was also obtained.
Abstract:
Plants require roots to supply water, nutrients and oxygen for growth. The spatial distribution of roots in relation to the macropore structure of the soil in which they are growing influences how effective they are at accessing these resources. A method for quantifying root-macropore associations from horizontal soil sections is illustrated using two black vertisols from the Darling Downs, Queensland, Australia. Two-dimensional digital images were obtained of the macropore structure and root distribution for an area 55 x 55 mm at a resolution of 64 μm. The spatial distribution of roots was quantified over a range of distances using the K-function. In all specimens, roots were shown to be clustered at short distances (1-10 mm), becoming more random at longer distances. Root location in relation to macropores was estimated using the function describing the distance of each root to the nearest macropore. From this function, a summary variable, termed the macropore sheath, was defined. The macropore sheath is the distance from macropores within which 80% of roots are located. Measured root locations were compared to random simulations of root distribution to establish if there was a preferential association between roots and macropores. More roots were found in and around macropores than expected at random.
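The macropore sheath summary described above amounts to an 80th percentile of root-to-nearest-macropore distances. A minimal sketch on synthetic coordinates (not the paper's image data), using a nearest-rank percentile:

```python
import math
import random

def nearest_macropore_distance(root, macropores):
    """Distance from one root to its nearest macropore (2-D points)."""
    return min(math.dist(root, m) for m in macropores)

def macropore_sheath(roots, macropores, fraction=0.8):
    """Smallest distance within which `fraction` of roots lie
    (nearest-rank percentile of root-to-nearest-macropore distances)."""
    d = sorted(nearest_macropore_distance(r, macropores) for r in roots)
    k = max(0, math.ceil(fraction * len(d)) - 1)  # nearest-rank index
    return d[k]

random.seed(1)
# A synthetic 55 x 55 mm section, mirroring the imaged area described above.
macropores = [(random.uniform(0, 55), random.uniform(0, 55)) for _ in range(30)]
roots = [(random.uniform(0, 55), random.uniform(0, 55)) for _ in range(200)]
print(f"macropore sheath = {macropore_sheath(roots, macropores):.1f} mm")
```

The paper's comparison against random simulations corresponds to repeating this computation for uniformly placed "roots", as in the last two lines, and asking whether the observed sheath distance is smaller than the simulated ones.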
Abstract:
SETTING: Hlabisa Tuberculosis Programme, Hlabisa, South Africa. OBJECTIVE: To determine trends in and risk factors for interruption of tuberculosis treatment. METHODS: Data were extracted from the control programme database starting in 1991. Temporal trends in treatment interruption are described; independent risk factors for treatment interruption were determined with a multiple logistic regression model, and Kaplan-Meier survival curves for treatment interruption were constructed for patients treated in 1994-1995. RESULTS: Overall, 629 of 3610 surviving patients (17%) failed to complete treatment; this proportion increased from 11% (n = 79) in 1991/1992 to 22% (n = 201) in 1996. Independent risk factors for treatment interruption were diagnosis between 1994-1996 compared with 1991-1993 (odds ratio [OR] 1.9, 95% confidence interval [CI] 1.6-2.4); human immunodeficiency virus (HIV) positivity compared with HIV negativity (OR 1.8, 95% CI 1.4-2.4); supervision by village clinic compared with community health worker (OR 1.9, 95% CI 1.4-2.6); and male versus female sex (OR 1.3, 95% CI 1.1-1.6). Few patients interrupted treatment during the first 2 weeks, and the treatment interruption rate thereafter was constant at 1% per 14 days. CONCLUSIONS: Frequency of treatment interruption in this programme has increased recently. The strongest risk factor was year of diagnosis, perhaps reflecting the impact of an increased caseload on programme performance. Ensuring adherence to therapy in communities with a high level of migration remains a challenge even within community-based directly observed therapy programmes.
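Odds ratios like those above come from the logistic model after adjustment; the unadjusted version is computed directly from a 2x2 table with a Wald 95% confidence interval. A hedged sketch with invented counts (not the Hlabisa programme's data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio and Wald 95% CI for a 2x2 table:
    a = exposed who interrupted,   b = exposed who completed,
    c = unexposed who interrupted, d = unexposed who completed."""
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log_or)
    hi = math.exp(math.log(or_) + z * se_log_or)
    return or_, lo, hi

# Invented illustrative counts, not from the study database:
or_, lo, hi = odds_ratio_ci(120, 380, 80, 420)
print(f"OR {or_:.1f}, 95% CI {lo:.1f}-{hi:.1f}")
```

A multiple logistic regression, as used in the study, generalises this by estimating each log(OR) while holding the other covariates constant.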
Abstract:
At the end of World War II, Soviet occupation forces removed countless art objects from German soil. Some of them were returned during the 1950s, but most either disappeared for good or were stored away secretly in cellars of Soviet museums. The Cold War then covered the issue with silence. After the collapse of the Soviet Union, museums in St Petersburg and Moscow started to exhibit some of the relocated art for the first time in half a century. The unusual quality of the paintings, mostly impressionist masterpieces, not only attracted the attention of the international art community, but also triggered a diplomatic row between Russia and Germany. Both governments advanced moral and legal claims to ownership. To make things even more complicated, many of the paintings once belonged to private collectors, some of whom were Jews. Their descendants also entered the dispute. The basic premise of this article is that the political and ethical dimensions of relocated art can be understood most adequately by eschewing a single authorial standpoint. Various positions, sometimes incommensurable ones, are thus explored in an attempt to outline possibilities for an ethics of representation and a dialogical solution to the international problem that relocated art has become.
Abstract:
Background: This study examined rates of and risk factors associated with suicide attempts by psychiatric patients under active care. It was especially focussed on the relative rates across three standard treatment settings: acute inpatient care, long-stay inpatient care and community-based care. Methods: A total of 12,229 patients in 13,632 episodes of care were rated on the Health of the Nation Outcome Scales (HoNOS) Item 2. For the purposes of the current investigation, a score of 4 was deemed to indicate a suicide attempt. Results: Incidence densities per 1000 episode days were 5.4 (95% CI = 4.8-6.1) for patients under care in acute inpatient settings, 0.6 (95% CI = 0.5-0.8) for patients under care in long-stay inpatient settings, and 0.5 (95% CI = 0.5-0.6) for patients under care in community-based arrangements. Predictors varied by treatment setting. Risk was elevated for personality disorders across all settings: 22.7 attempts per 1000 episode days (95% CI = 17.2-30.0) in acute inpatient care; 2.1 (95% CI = 1.0-4.5) in long-stay inpatient care; and 2.3 (95% CI = 1.7-3.0) in community-based care. This effect remained after adjustment for demographics. Conclusion: Rates of suicide attempts among psychiatric patients are a major issue facing contemporary mental health care systems, and risk factors vary across different treatment settings.
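The incidence densities above are simple rates: events divided by total episode days of care, scaled to 1000. A sketch with hypothetical counts (the abstract reports the rates, not the underlying numerators and denominators):

```python
def incidence_density(events, episode_days, per=1000):
    """Events per `per` episode days of care."""
    return events / episode_days * per

# Hypothetical numbers chosen only to illustrate a rate of 5.4, as reported
# for acute inpatient settings; the study's actual counts are not given.
print(f"{incidence_density(270, 50_000):.1f} attempts per 1000 episode days")
```

Expressing risk per episode day, rather than per patient, is what makes the three settings comparable despite very different lengths of stay.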
Abstract:
Background The introduction of community care in psychiatry is widely thought to have resulted in more offending among the seriously mentally ill. This view affects public policy towards and public perceptions of such people. We investigated the association between the introduction of community care and the pattern of offending in patients with schizophrenia in Victoria, Australia. Methods We established patterns of offending from criminal records in two groups of patients with schizophrenia over their lifetime to date and in the 10 years after their first hospital admission. One group was first admitted in 1975, before major deinstitutionalisation in Victoria, the second group in 1985, when community care was becoming the norm. Each patient was matched to a control, by age, sex, and place of residence, to allow for changing patterns of offending over time in the wider community. Findings Compared with controls, significantly more of those with schizophrenia were convicted at least once for all categories of criminal offending except sexual offences (relative risk of offending in 1975=3.5 [95% CI 2.0-5.5], p=0.001; in 1985=3.0 [1.9-4.9], p=0.001). Among men, more offences were committed in the 1985 group than the 1975 group, but this was matched by a similar increase in convictions among the community controls. Those with schizophrenia who had also received treatment for substance abuse accounted for a disproportionate amount of offending. Analysis of admission data for the patients and the total population of admissions with schizophrenia showed that although there had been an increase of 74 days per annum spent in the community for each of the study population as a whole, first admissions spent only 1 more day in the community in 1985 compared with 1975. Interpretation Increased rates in criminal conviction for those with schizophrenia over the last 20 years are consistent with change in the pattern of offending in the general community.
Deinstitutionalisation does not adequately explain such change. Mental-health services should aim to reduce the raised rates of criminal offending associated with schizophrenia, but turning the clock back on community care is unlikely to contribute towards any positive outcome.
Abstract:
Contemporary strategies for rural development in Australia are based upon notions of self-help and bottom-up, community-based initiatives which are said to 'empower' the individual from the imposing structures of government intervention. While such strategies are not entirely new to Australia, they have, it seems, been inadequately theorised to date and are generally regarded, in rather functionalist terms, as indicative of attempts to cut back on public expenditure. Harnessing itself to the 'governmentality' perspective, this paper explores government and 'expert' discourses of rural community development in Queensland and suggests, instead, that these strategies are indicative of an advanced liberal form of rule which seeks to 'govern through community'. With this in mind, three basic research questions are identified as worthy of further exploration: how are the notions of self-governing individuals and communities constructed in political discourse; what political rationalities are used to justify current levels of (non-)intervention; and finally, what are the discourses, forms and outcomes of empowerment at the local level? The paper concludes by arguing that while the empowering effects of self-help are frequently cited as its greatest virtue, it is not so much control as the added burden of responsibility that is being devolved to local people. Given the emphasis of the governmentality perspective on strategies for 'governing at a distance', however, these conclusions can hardly be unexpected.
Abstract:
Background and Purpose: The goal of the present study was to identify risk factors for vascular disease in the elderly. Methods: We conducted a prospective study of control subjects from a population-based study of stroke in Perth, Western Australia, that was completed in 1989 to 1990, and used record linkage and a survey of survivors to identify deaths and nonfatal vascular events. Data validated through reference to medical records were analyzed with the use of Cox proportional hazards models. Results: Follow-up for the 931 subjects was 88% complete. By June 24, 1994, 198 (24%) of the subjects had died (96 from vascular disease), and there had been 45 nonfatal strokes or myocardial infarctions. The hazard ratio for diabetes exceeded 2.0 for all end points, whereas the consumption of meat >4 times weekly was associated with a reduction in risk of ≤30%. In most models, female sex and consumption of alcohol were associated with reduced risks, whereas previous myocardial infarction was linked to an increase in risk. Conclusions: There are only limited associations between lifestyle and major vascular illness in old age. Effective health promotion activities in early and middle life may be the key to a longer and healthier old age.