823 results for Community-based coral reef conservation
Abstract:
Ocean acidification (OA) is expected to reduce the net ecosystem calcification (NEC) rates and overall accretion of coral reef ecosystems. However, although sediments are the most abundant form of calcium carbonate (CaCO3) in coral reef ecosystems and their dissolution may be more sensitive to OA than biogenic calcification, the impacts of OA-induced sediment dissolution on coral reef NEC rates and CaCO3 accretion are poorly constrained. Carbon dioxide addition and light attenuation experiments were performed at Heron Island, Australia in an attempt to tease apart the influence of OA and organic metabolism (e.g. respiratory CO2 production) on CaCO3 dissolution. Overall, CaCO3 dissolution rates were an order of magnitude more sensitive to elevated CO2 and decreasing seawater aragonite saturation state (Omega Ar; 300-420% increase in dissolution per unit decrease in Omega Ar) than published reductions in biologically mediated calcification due to OA. Light attenuation experiments led to a 70% reduction in net primary production (NPP), which subsequently induced an increase in daytime (115%) and net diel (375%) CaCO3 dissolution rates. High CO2 and low light acted in synergy to drive a 575% increase in net diel dissolution rates. Importantly, disruptions to the balance of photosynthesis and respiration (P/R) had a significant effect on daytime CaCO3 dissolution, while average water column Omega Ar was the main driver of nighttime dissolution rates. A simple model of platform-integrated dissolution rates was developed, demonstrating that seasonal changes in photosynthetically active radiation (PAR) can have an important effect on platform-integrated CaCO3 sediment dissolution rates. The considerable response of CaCO3 sediment dissolution to elevated CO2 means that much of the response of coral reef communities and ecosystems to OA could be due to increases in CaCO3 sediment and framework dissolution, and not decreases in biogenic calcification.
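The reported sensitivity (a 300-420% increase in dissolution per unit decrease in Omega Ar) can be illustrated with a minimal linear scaling sketch. This is not the authors' model; the baseline rate, reference saturation state, and function name are hypothetical, chosen only to make the per-unit arithmetic concrete.

```python
# Illustrative sketch (not the study's model): scale a baseline sediment
# dissolution rate by a fractional increase per unit decrease in the
# aragonite saturation state (Omega Ar). All numbers are hypothetical;
# the sensitivity of 3.0 sits at the low end of the 300-420% range above.

def dissolution_rate(baseline, omega_ref, omega, sensitivity=3.0):
    """Baseline dissolution scaled by `sensitivity` (fractional increase
    per unit decrease in Omega Ar) relative to `omega_ref`."""
    return baseline * (1.0 + sensitivity * (omega_ref - omega))

# A one-unit drop in Omega Ar at sensitivity 3.0 quadruples dissolution.
print(dissolution_rate(baseline=1.0, omega_ref=3.5, omega=2.5))  # 4.0
```

Under this toy linear form, the 300-420% band simply corresponds to sensitivities between 3.0 and 4.2.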
Abstract:
Emerging infectious diseases are a growing concern in wildlife conservation. Documenting outbreak patterns and determining the ecological drivers of transmission risk are fundamental to predicting disease spread and assessing potential impacts on population viability. However, evaluating disease in wildlife populations requires expansive surveillance networks that often do not exist in remote and developing areas. Here, we describe the results of a community-based research initiative conducted in collaboration with indigenous harvesters, the Inuit, in response to a new series of Avian Cholera outbreaks affecting Common Eiders (Somateria mollissima) and other commingling species in the Canadian Arctic. Avian Cholera is a virulent disease of birds caused by the bacterium Pasteurella multocida. Common Eiders are a valuable subsistence resource for Inuit, who hunt the birds for meat and visit breeding colonies during the summer to collect eggs and feather down for use in clothing and blankets. We compiled the observations of harvesters about the growing epidemic and with their assistance undertook field investigation of 131 colonies distributed over >1200 km of coastline in the affected region. Thirteen locations were identified where Avian Cholera outbreaks have occurred since 2004. Mortality rates ranged from 1% to 43% of the local breeding population at these locations. Using a species-habitat model (Maxent), we determined that the distribution of outbreak events has not been random within the study area and that colony size, vegetation cover, and a measure of host crowding in shared wetlands were significantly correlated with outbreak risk. In addition, outbreak locations have been spatially structured with respect to hypothesized introduction foci and clustered along the migration corridor linking Arctic breeding areas with wintering areas in Atlantic Canada.
At present, Avian Cholera remains a localized threat to Common Eider populations in the Arctic; however, expanded community-based surveillance will be required to track disease spread.
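The habitat-risk idea above can be sketched with a simple logistic score over the three predictors the abstract reports as significant. This is a stand-in for illustration only, not Maxent itself, and the coefficients, intercept, and feature values are entirely invented.

```python
import math

# NOT the study's Maxent model: a minimal logistic habitat-risk sketch
# using the three predictors reported as significant (colony size,
# vegetation cover, host crowding). Weights and inputs are hypothetical
# standardized values, chosen only to show the scoring mechanics.

def outbreak_risk(colony_size, vegetation_cover, host_crowding,
                  weights=(0.8, 0.5, 1.2), intercept=-3.0):
    """Logistic risk score in (0, 1) from standardized predictors."""
    z = intercept + sum(w * x for w, x in
                        zip(weights, (colony_size, vegetation_cover, host_crowding)))
    return 1.0 / (1.0 + math.exp(-z))

# A large, crowded colony scores far higher than a small, sparse one.
print(outbreak_risk(2.0, 1.0, 2.0) > outbreak_risk(-1.0, 0.0, -1.0))  # True
```

Maxent fits a maximum-entropy distribution over presence-only records rather than a logistic regression, but both map habitat covariates onto a relative risk surface in this same spirit.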
Abstract:
This paper addresses current changes in the highly diverse European landscape, and the way these transitions are being treated in policy and landscape management in the fragmented, heterogeneous and dynamic context of today’s Europe. It appears that intersecting driving forces are increasing the complexity of European landscapes and causing polarising developments in agricultural land use, biodiversity conservation and cultural landscape management. On the one hand, multifunctional rural landscapes, especially in peri-urban regions, provide services and functions that serve the citizens in their demand for identity, support their sense of belonging and offer opportunities for recreation and involvement in practical landscape management. On the other hand, industrial agricultural production on increasingly large farms produces food, feed, fibre and energy to serve expanding international markets, with rural livability and accessibility as minor issues. The intermediate areas of traditionally dominant small and family farms in Europe seem to be gradually declining in profitability. The paper discusses the potential of a governance approach that can cope with the requirement of optimising land-sharing conditions and community-based landscape development, while adapting to global market conditions.
Abstract:
Heavily used and highly valuable, the Florida Reef is one of the world's most threatened ecosystems. Stakeholders from a densely urbanized coastal region in proximity to the reef system recognize its degradation, but their comprehension of climate change and commitment to pay for sustainable management and research funding have been unclear. With an emphasis on recreational anglers, residential stakeholders were surveyed online about their marine activities, perceptions of resources and threats, and willingness to pay (WTP) for dedicated coral reef research funding in Florida. The majority of stakeholders are wealthy, well educated, and politically independent. Supermajorities favored the two scenarios of taxation for a Florida Coral Reef Research Fund, and the scenario with matching federal funds earned higher support. In regression analyses, several factors emerged as significant contributors to stakeholders’ preferences, and the four recurring factors in extended models were prioritizing the environment over the economy, donating to environmental causes, concern about coral reefs, and concern about climate change, with the latter indicating a recent shift of opinion. Status in terms of income and education was found insignificant, and, surprisingly, income was negatively correlated with WTP. Perceptions through lenses of environmental and emotional attachments appear to overwhelm conventional status-based factors. Applied statewide, the first scenario's extrapolated WTP (based on a sales tax rate of 2.9%) would generate $675 million annually, and the extrapolated WTP under the second scenario, with matching federal funds (based on a sales tax rate of 3.0%), would generate $1.4 billion.
Keywords: willingness to pay, coral reef research, taxation, climate change, stakeholder, perceptions, Florida Reef, recreational fishing, anglers
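The statewide extrapolation step is, at its core, a multiplication of a mean household WTP by the number of households it is applied to. The sketch below is purely illustrative: the function name and both input figures are invented, chosen only so the product lands at the order of magnitude of the first scenario's $675 million.

```python
# Hypothetical illustration of the extrapolation arithmetic: mean WTP per
# household times the number of households. These are NOT the survey's
# actual inputs; they are invented to reproduce the scale of the result.

def extrapolate_wtp(mean_wtp_per_household, n_households):
    """Statewide annual revenue implied by a mean household WTP."""
    return mean_wtp_per_household * n_households

# e.g. $75/household across 9 million households -> $675 million/year,
# matching the magnitude reported for the first taxation scenario.
print(extrapolate_wtp(75.0, 9_000_000))  # 675000000.0
```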
Abstract:
Coral reefs can exist as coral- and macroalgae-dominated habitats often separated by only a few hundred metres. While herbivorous fish are known to depress the abundance of algae and help maintain the function of coral-dominated habitats, less is known about their influence in algae-dominated habitats. Here, we quantified herbivorous fish and benthic algal communities over a 6 mo period in coral-dominated (back-reef) and algal-dominated (lagoon) habitats in a relatively undisturbed fringing coral reef (Ningaloo, Western Australia). Simultaneously, we tested the effects of herbivorous fish on algal recruitment in both habitats using recruitment tiles and fish exclusion cages. The composition of established algal communities differed consistently between habitats, with the back-reef hosting a more diverse community than the Sargassum-dominated lagoon. However, total algal biomass and cover only differed between habitats in autumn, coinciding with maximum Sargassum biomass. The back-reef hosted high coral cover and a diverse herbivorous fish community, with herbivore biomass an order of magnitude greater than the lagoon. Despite these differences in herbivore composition, exclusion of large herbivores had a similar positive effect on foliose macroalgae recruitment on experimental tiles in both back-reef and lagoon habitats. Additionally, territorial damselfish found in the back-reef increased turf algae cover and decreased crustose coralline algae cover on recruitment tiles. Collectively, our results show that disparate herbivorous fish communities in coral- and algae-dominated habitats are similarly able to limit the recruitment of foliose macroalgae, but suggest that when herbivorous fish biomass and diversity are relatively low, macroalgal communities are able to escape herbivore control through increased growth.
Abstract:
Background: Population antimicrobial use may influence resistance emergence. Resistance is an ecological phenomenon due to potential transmissibility. We investigated spatial and temporal patterns of ciprofloxacin (CIP) population consumption related to E. coli resistance emergence and dissemination in a major Brazilian city. A total of 4,372 urinary tract infection E. coli cases, 723 of them CIP resistant, were identified in 2002 from two outpatient centres. Cases were geocoded by address in a digital map. Raw CIP consumption data were transformed into usage density in DDDs by determining the influence zones of CIP selling points. A stochastic model coupled with a Geographical Information System was applied to relate resistance to usage density and to detect city areas of high/low resistance risk. Results: Emergence of an E. coli CIP-resistant cluster was detected and significantly related to usage density at a level of 5 to 9 CIP DDDs. There were clustered hot-spots and a significant global spatial variation in the residual resistance risk after allowing for usage density. Conclusions: The usage density of 5-9 CIP DDDs per 1,000 inhabitants within the same influence zone was the resistance-triggering level. This level led to E. coli resistance clustering, indicating that individual resistance emergence and dissemination were affected by antimicrobial population consumption.
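The consumption metric used here, DDDs per 1,000 inhabitants within an influence zone, can be sketched in a few lines. The grams-sold and population figures below are hypothetical; the WHO ATC/DDD index assigns oral ciprofloxacin a defined daily dose of 1 g, which the sketch assumes.

```python
# Sketch of the usage-density metric: defined daily doses (DDDs) per
# 1,000 inhabitants in a selling point's influence zone. The WHO defined
# daily dose for oral ciprofloxacin is taken as 1 g; the sales volume and
# zone population below are invented for illustration.

CIP_DDD_GRAMS = 1.0  # WHO defined daily dose, oral ciprofloxacin

def usage_density(grams_sold, population):
    """DDDs per 1,000 inhabitants within one influence zone."""
    ddds = grams_sold / CIP_DDD_GRAMS
    return ddds * 1000.0 / population

# A zone of 20,000 people where 140 g were dispensed -> 7 DDDs/1,000,
# inside the 5-9 DDD resistance-triggering band reported above.
print(usage_density(140.0, 20_000))  # 7.0
```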
Abstract:
Objectives. We sought to estimate the risk of death and recurrent myocardial infarction associated with the use of calcium antagonists after myocardial infarction in a population-based cohort study. Background. Calcium antagonists are commonly prescribed after myocardial infarction, but their long-term effects are not well established. Methods. Patients 25 to 69 years old with a suspected myocardial infarction were identified and followed up through a community-based register of myocardial infarction and cardiac death (part of the World Health Organization Monitoring Trends and Determinants in Cardiovascular Disease [MONICA] Project in Newcastle, Australia). Data were collected by review of medical records, in-hospital interview and review of death certificates. Results. From 1989 to 1993, 3,982 patients with a nonfatal suspected myocardial infarction were enrolled in the study. At hospital discharge, 1,001 patients were treated with beta-adrenergic blocking agents, 923 with calcium antagonists, 711 with both beta-blockers and calcium antagonists and 1,346 with neither drug. Compared with patients given beta-blockers, patients given calcium antagonists were more likely to suffer myocardial infarction or cardiac death (adjusted relative risk [RR] 1.4, 95% confidence interval [CI] 1.0 to 1.9), cardiac death (RR 1.6, 95% CI 1.0 to 2.7) and death from all causes (RR 1.7, 95% CI 1.1 to 2.6). Compared with patients given neither beta-blockers nor calcium antagonists, patients given calcium antagonists were not at increased risk of myocardial infarction or cardiac death (RR 1.0, 95% CI 0.8 to 1.3), cardiac death (RR 0.9, 95% CI 0.6 to 1.2) or death from all causes (RR 1.0, 95% CI 0.7 to 1.3). 
No excess in risk of myocardial infarction or cardiac death was observed among patients taking verapamil (RR 0.9, 95% CI 0.6 to 1.6), diltiazem (RR 1.1, 95% CI 0.8 to 1.4) or nifedipine (RR 1.3, 95% CI 0.7 to 2.2) compared with patients taking neither calcium antagonists nor beta-blockers. Conclusions. These results are consistent with randomized trial data showing benefit from beta-blockers after myocardial infarction and no effect on the risk of recurrent myocardial infarction and death with the use of calcium antagonists. Comparisons between beta-blockers and calcium antagonists favor beta-blockers because of the beneficial effects of beta-blockers and not because of adverse effects of calcium antagonists. (C) 1998 by the American College of Cardiology.
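The relative-risk comparisons above can be illustrated with a crude (unadjusted) RR calculation. The counts below are invented, and the study's published RRs were adjusted for covariates, which this sketch is not; it only shows the ratio-of-risks arithmetic and the standard log-RR normal approximation for a 95% CI.

```python
import math

# Crude relative risk with a 95% CI via the log-RR normal approximation.
# Event counts and group sizes are hypothetical; the study's RRs were
# covariate-adjusted, which this illustration does not attempt.

def relative_risk(events_exp, n_exp, events_ref, n_ref):
    """Return (RR, lower 95% bound, upper 95% bound), unadjusted."""
    rr = (events_exp / n_exp) / (events_ref / n_ref)
    se = math.sqrt(1/events_exp - 1/n_exp + 1/events_ref - 1/n_ref)
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, lo, hi

# 70/1000 events vs 50/1000 events -> RR 1.4, echoing the scale of the
# adjusted RR reported for calcium antagonists vs beta-blockers.
rr, lo, hi = relative_risk(events_exp=70, n_exp=1000, events_ref=50, n_ref=1000)
print(round(rr, 2))  # 1.4
```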
Abstract:
Application of geographic information system (GIS) and global positioning system (GPS) technology in the Hlabisa community-based tuberculosis treatment programme documents the increase in accessibility of treatment after the service was expanded from health facilities to include community workers and volunteers.
Abstract:
OBJECTIVE: Although little studied in developing countries, multidrug-resistant tuberculosis (MDR-TB) is considered a major threat. We report the molecular epidemiology, clinical features and outcome of an emerging MDR-TB epidemic. METHODS: In 1996 all tuberculosis suspects in the rural Hlabisa district, South Africa, had sputum cultured, and drug susceptibility patterns of mycobacterial isolates were determined. Isolates with MDR-TB (resistant to both isoniazid and rifampicin) were DNA fingerprinted by restriction fragment length polymorphism (RFLP) using IS6110 and polymorphic guanine-cytosine-rich sequence-based (PGRS) probes. Patients with MDR-TB were traced to determine outcome. Data were compared with results from a survey of drug susceptibility done in 1994. RESULTS: The rate of MDR-TB among smear-positive patients increased six-fold from 0.36% (1/275) in 1994 to 2.3% (13/561) in 1996 (P = 0.04). A further eight smear-negative cases were identified in 1996 from culture, six of whom had not been diagnosed with tuberculosis. MDR disease was clinically suspected in only five of the 21 cases (24%). Prevalence of primary and acquired MDR-TB was 1.8% and 4.1%, respectively. Twelve MDR-TB cases (67%) were in five RFLP-defined clusters. Among 20 traced patients, 10 (50%) had died, five (25%) had active disease and five (25%) were apparently cured. CONCLUSIONS: The rate of MDR-TB has risen rapidly in Hlabisa, apparently due to both reactivation disease and recent transmission. Many patients were not diagnosed with tuberculosis, many were not suspected of drug-resistant disease, and outcome was poor.
Abstract:
SETTING: Hlabisa Tuberculosis Programme, Hlabisa, South Africa. OBJECTIVE: To determine trends in and risk factors for interruption of tuberculosis treatment. METHODS: Data were extracted from the control programme database starting in 1991. Temporal trends in treatment interruption are described; independent risk factors for treatment interruption were determined with a multiple logistic regression model, and Kaplan-Meier survival curves for treatment interruption were constructed for patients treated in 1994-1995. RESULTS: Overall 629 of 3610 surviving patients (17%) failed to complete treatment; this proportion increased from 11% (n = 79) in 1991/1992 to 22% (n = 201) in 1996. Independent risk factors for treatment interruption were diagnosis in 1994-1996 compared with 1991-1993 (odds ratio [OR] 1.9, 95% confidence interval [CI] 1.6-2.4); human immunodeficiency virus (HIV) positivity compared with HIV negativity (OR 1.8, 95% CI 1.4-2.4); supervision by village clinic compared with community health worker (OR 1.9, 95% CI 1.4-2.6); and male versus female sex (OR 1.3, 95% CI 1.1-1.6). Few patients interrupted treatment during the first 2 weeks, and the treatment interruption rate thereafter was constant at 1% per 14 days. CONCLUSIONS: The frequency of treatment interruption in this programme has increased recently. The strongest risk factor was year of diagnosis, perhaps reflecting the impact of an increased caseload on programme performance. Ensuring adherence to therapy in communities with a high level of migration remains a challenge even within community-based directly observed therapy programmes.
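The constant hazard reported above (roughly 1% per 14-day period after an interruption-free first fortnight) lends itself to a back-of-envelope survival sketch. The treatment length and the assumption of zero interruptions in the first period are illustrative simplifications; the sketch ignores deaths, censoring, and the covariates in the regression model.

```python
# Toy constant-hazard model of treatment interruption: ~1% risk per
# 14-day period after the first two weeks, per the abstract. Course
# length and the exempt first period are illustrative assumptions only.

def completion_probability(periods, hazard=0.01, exempt_periods=1):
    """P(no interruption) over `periods` 14-day intervals at a constant
    per-period hazard, with the first `exempt_periods` interruption-free."""
    at_risk = max(periods - exempt_periods, 0)
    return (1.0 - hazard) ** at_risk

# A 6-month course (~13 fortnights): 12 at-risk periods at 1% each.
print(round(completion_probability(13), 3))
```

This geometric decay is exactly what a flat Kaplan-Meier hazard beyond week 2 would imply, though the observed 17% overall non-completion also reflects the higher-risk subgroups identified in the regression.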
Abstract:
Recruiting coral reef fish larvae from 38 species and 19 families from New Caledonia were examined for parasites. We found 13 parasite species (Platyhelminthes: Monogenea, Cestoda and Trematoda) but no acanthocephalan, crustacean or nematode parasites. Over 23% of individual fish were infected. Didymozoid metacercariae were the most abundant parasites. We conclude that most of the parasites are pelagic species that become 'lost' once the fish larvae have recruited to the reef. Larval coral reef fish probably contribute little to the dispersal of the parasites of the adult fish so that parasite dispersal is more difficult than that of the fish themselves. (C) 2000 Australian Society for Parasitology Inc. Published by Elsevier Science Ltd. All rights reserved.
Abstract:
Degradation of coral reef ecosystems began centuries ago, but there is no global summary of the magnitude of change. We compiled records, extending back thousands of years, of the status and trends of seven major guilds of carnivores, herbivores, and architectural species from 14 regions. Large animals declined before small animals and architectural species, and Atlantic reefs declined before reefs in the Red Sea and Australia, but the trajectories of decline were markedly similar worldwide. All reefs were substantially degraded long before outbreaks of coral disease and bleaching. Regardless of these new threats, reefs will not survive without immediate protection from human exploitation over large spatial scales.