Abstract:
This technical memorandum documents the design, implementation, data preparation, and descriptive results for the 2006 Annual Economic Survey of Federal Gulf Shrimp Permit Holders. The data collection was designed by the NOAA Fisheries Southeast Fisheries Science Center Social Science Research Group to track the financial and economic status and performance of vessels holding a federal moratorium permit for harvesting shrimp in the Gulf of Mexico. A two-page, self-administered mail survey collected total annual costs broken out into seven categories and auxiliary economic data. In May 2007, 580 vessels were randomly selected, stratified by state, from a preliminary population of 1,709 vessels with federal permits to shrimp in offshore waters of the Gulf of Mexico. The survey was implemented during the rest of 2007. After many reminder and verification phone calls, 509 surveys were deemed complete, for an ineligibility-adjusted response rate of 90.7%. The linking of each individual vessel’s cost data to its revenue data from a different data collection was imperfect, and hence the final number of observations used in the analyses is 484. Based on various measures and tests of validity throughout the technical memorandum, the quality of the data is high. The results are presented in a standardized table format, linking vessel characteristics and operations to simple balance sheet, cash flow, and income statements. In the text, results are discussed for the total fleet, the Gulf shrimp fleet, the active Gulf shrimp fleet, and the inactive Gulf shrimp fleet. Additional results for shrimp vessels grouped by state, by vessel characteristics, by landings volume, and by ownership structure are available in the appendices. The general conclusion of this report is that the financial and economic situation is bleak for the average vessel in most of the categories that were evaluated. With few exceptions, cash flow for the average vessel is positive while the net revenue from operations and the “profit” are negative. With negative net revenue from operations, the economic return for average shrimp vessels is less than zero. Only with the help of government payments does the average owner just about break even. In the short term, this will discourage any new investments in the industry. The financial situation in 2006, especially if it endures over multiple years, also is economically unsustainable for the average established business. Vessels in the active and inactive Gulf shrimp fleet are, on average, 69 feet long, weigh 105 gross tons, are powered by 505 hp motor(s), and are 23 years old. Three-quarters of the vessels have steel hulls and 59% use a freezer for refrigeration. The average market value of these vessels was $175,149 in 2006, about $100,000 less than the average original purchase price. The outstanding loans averaged $91,955, leading to an average owner equity of $83,194. Based on the sample, 85% of the federally permitted Gulf shrimp fleet was actively shrimping in 2006. Of these 386 active Gulf shrimp vessels, just under half (46%) were owner-operated. On average, these vessels burned 52,931 gallons of fuel, landed 101,268 pounds of shrimp, and received $2.47 per pound of shrimp. Non-shrimp landings added less than 1% to cash flow, indicating that the federal Gulf shrimp fishery is very specialized. The average total cash outflow was $243,415, of which $108,775 was due to fuel expenses alone.
The expenses for hired crew and captains averaged $54,866, which indicates the importance of the industry as a source of wage income. The resulting average net cash flow is $16,225 but has a large standard deviation. For the population of active Gulf shrimp vessels, we can state with 95% confidence that the average net cash flow was between $9,500 and $23,000 in 2006. The median net cash flow was $11,843. Based on the income statement for active Gulf shrimp vessels, average fixed costs accounted for just under a quarter of operating expenses (23.1%), labor costs for just over a quarter (25.3%), and non-labor variable costs for just over half (51.6%). Fuel costs alone accounted for 42.9% of total operating expenses in 2006. It should be noted that the labor cost category in the income statement includes both the actual cash payments to hired labor and an estimate of the opportunity cost of owner-operators’ time spent as captain. The average labor contribution (as captain) of an owner-operator is estimated at about $19,800. The average net revenue from operations is negative $7,429 and, despite a large standard deviation, is statistically significantly less than zero. The economic return to Gulf shrimping is negative 4%. Including non-operating activities, foremost an average government payment of $13,662, leads to an average loss before taxes of $907 for the vessel owners. The 95% confidence interval of this value straddles zero, so we cannot reject the hypothesis that the population average is zero. The average inactive Gulf shrimp vessel is generally of a smaller scale than the average active vessel. Inactive vessels are physically smaller, are valued much lower, and are less dependent on loans. Fixed costs account for nearly three-quarters of the total operating expenses of $11,926, and only 6% of these vessels have hull insurance. With an average net cash flow of negative $7,537, the inactive Gulf shrimp fleet has a major liquidity problem. On average, net revenue from operations is negative $11,396, which amounts to a negative 15% economic return, and owners lose $9,381 on their vessels before taxes. To sustain such losses, and especially to survive the negative cash flow, many of the owners must be subsidizing their shrimp vessels with other income or wealth sources or are drawing down their equity. Active Gulf shrimp vessels in all states but Texas exhibited negative returns. The Alabama and Mississippi fleets have the highest assets (vessel values), on average, yet they generate zero cash flow and negative $32,224 net revenue from operations. Due to their high (loan) leverage ratio, the negative 11% economic return is amplified into a negative 21% return on equity. In contrast, for Texas vessels, which actually have the highest leverage ratio among the states, a 1% economic return is amplified into a 13% return on equity. From a financial perspective, the average Florida and Louisiana vessels conform roughly to the overall average of the active Gulf shrimp fleet. It should be noted that these results are averages and hence hide the variation that clearly exists within all fleets and all categories. Although the financial situation for the average vessel is bleak, some vessels are profitable. (PDF contains 101 pages)
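The leverage amplification described above follows a standard DuPont-style decomposition; the identity below is a sketch assuming the economic return is measured on total assets (vessel value) and the return on equity on the same net revenue figure, which the report's exact definitions may adjust for interest costs.

\[
\text{return on equity} \;=\; \frac{\text{net revenue}}{\text{owner equity}} \;=\; \underbrace{\frac{\text{net revenue}}{\text{total assets}}}_{\text{economic return}} \times \underbrace{\frac{\text{total assets}}{\text{owner equity}}}_{\text{leverage ratio}}
\]

Under this identity, a leverage ratio near 2 would amplify the Alabama and Mississippi fleets' negative 11% economic return to roughly the reported negative 21% return on equity, while the still-higher leverage of the Texas fleet amplifies its small positive return much more strongly.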
Abstract:
ADMB2R is a collection of AD Model Builder routines for saving complex data structures into a file that can be read in the R statistics environment with a single command. ADMB2R provides both the means to transfer data structures significantly more complex than simple tables, and an archive mechanism to store data for future reference. We developed this software because we write and run computationally intensive numerical models in Fortran, C++, and AD Model Builder. We then analyse results with R. We desired to automate data transfer to speed diagnostics during working-group meetings. We thus developed the ADMB2R interface to write an R data object (of type list) to a plain-text file. The master list can contain any number of matrices, values, dataframes, vectors or lists, all of which can be read into R with a single call to the dget function. This allows easy transfer of structured data from compiled models to R. Having the capacity to transfer model data, metadata, and results has sharply reduced the time spent on diagnostics, and at the same time, our diagnostic capabilities have improved tremendously. The simplicity of this interface and the capabilities of R have enabled us to automate graph and table creation for formal reports. Finally, the persistent storage in files makes it easier to treat model results in analyses or meta-analyses devised months—or even years—later. We offer ADMB2R to others in the hope that they will find it useful. (PDF contains 30 pages)
Abstract:
C2R is a collection of C routines for saving complex data structures into a file that can be read in the R statistics environment with a single command. C2R provides both the means to transfer data structures significantly more complex than simple tables, and an archive mechanism to store data for future reference. We developed this software because we write and run computationally intensive numerical models in Fortran, C++, and AD Model Builder. We then analyse results with R. We desired to automate data transfer to speed diagnostics during working-group meetings. We thus developed the C2R interface to write an R data object (of type list) to a plain-text file. The master list can contain any number of matrices, values, dataframes, vectors or lists, all of which can be read into R with a single call to the dget function. This allows easy transfer of structured data from compiled models to R. Having the capacity to transfer model data, metadata, and results has sharply reduced the time spent on diagnostics, and at the same time, our diagnostic capabilities have improved tremendously. The simplicity of this interface and the capabilities of R have enabled us to automate graph and table creation for formal reports. Finally, the persistent storage in files makes it easier to treat model results in analyses or meta-analyses devised months—or even years—later. We offer C2R to others in the hope that they will find it useful. (PDF contains 27 pages)
Abstract:
For2R is a collection of Fortran routines for saving complex data structures into a file that can be read in the R statistics environment with a single command. For2R provides both the means to transfer data structures significantly more complex than simple tables, and an archive mechanism to store data for future reference. We developed this software because we write and run computationally intensive numerical models in Fortran, C++, and AD Model Builder. We then analyse results with R. We desired to automate data transfer to speed diagnostics during working-group meetings. We thus developed the For2R interface to write an R data object (of type list) to a plain-text file. The master list can contain any number of matrices, values, dataframes, vectors or lists, all of which can be read into R with a single call to the dget function. This allows easy transfer of structured data from compiled models to R. Having the capacity to transfer model data, metadata, and results has sharply reduced the time spent on diagnostics, and at the same time, our diagnostic capabilities have improved tremendously. The simplicity of this interface and the capabilities of R have enabled us to automate graph and table creation for formal reports. Finally, the persistent storage in files makes it easier to treat model results in analyses or meta-analyses devised months—or even years—later. We offer For2R to others in the hope that they will find it useful. (PDF contains 31 pages)
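The R-side workflow is identical for all three interfaces (ADMB2R, C2R, For2R): the compiled model writes one plain-text file, and a single call to dget recovers the full list in R. Below is a minimal sketch; the file name and list components are hypothetical illustrations, not the packages' actual output.

    # Read the entire structured output of a compiled model run in one call.
    # "model_run.rdat" and the component names below are hypothetical.
    results <- dget("model_run.rdat")

    str(results)       # inspect the master list: values, vectors, matrices, dataframes
    results$parms      # e.g., a named vector of parameter estimates
    head(results$fit)  # e.g., a dataframe of observed vs. predicted values

    # Because the files persist as plain text, archived runs can be reloaded
    # months or years later for meta-analysis (file names hypothetical):
    runs <- lapply(c("run2005.rdat", "run2006.rdat"), dget)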
Abstract:
Introduction: The National Oceanic and Atmospheric Administration’s Biogeography Branch has conducted surveys of reef fish in the Caribbean since 1999. Surveys were initially undertaken to identify essential fish habitat, but later were used to characterize and monitor reef fish populations and benthic communities over time. The Branch’s goals are to develop knowledge and products on the distribution and ecology of living marine resources and provide resource managers, scientists, and the public with an improved ecosystem basis for making decisions. The Biogeography Branch monitors reef fishes and benthic communities in three study areas: (1) St. John, USVI, (2) Buck Island, St. Croix, USVI, and (3) La Parguera, Puerto Rico. In addition, the Branch has characterized the reef fish and benthic communities in the Flower Garden Banks National Marine Sanctuary, Gray’s Reef National Marine Sanctuary, and around the island of Vieques, Puerto Rico. Reef fish data are collected using a stratified random sampling design and stringent measurement protocols. Over time, the sampling design has changed in order to meet different management objectives (i.e., identification of essential fish habitat vs. monitoring), but the designs have always been:
• Probabilistic – to allow inferences to a larger targeted population,
• Objective – to satisfy management objectives, and
• Stratified – to reduce sampling costs and obtain population estimates for strata.
There are two aspects of the sampling design which are now under consideration and are the focus of this report: first, the application of a sample frame, identified as a set of points or grid elements from which a sample is selected; and second, the application of subsampling in a two-stage sampling design. To evaluate these considerations, the pros and cons of implementing a sampling frame and subsampling are discussed. Particular attention is paid to the impacts of each design on accuracy (bias), feasibility, and sampling cost (precision). Further, this report presents an analysis of data to determine the optimal number of subsamples to collect if subsampling were used. (PDF contains 19 pages)
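For reference, two-stage sampling theory gives a standard closed form for the cost-optimal number of subsamples per primary unit (e.g., Cochran 1977); the sketch below assumes the usual linear cost model and may differ from the analysis actually presented in the report.

\[
m_{\mathrm{opt}} \;=\; \sqrt{\frac{c_1}{c_2} \cdot \frac{s_2^2}{s_1^2}}
\]

Here c_1 is the cost of reaching and setting up a primary unit, c_2 the cost of one subsample, s_1^2 the among-unit variance, and s_2^2 the within-unit variance; more subsamples per unit pay off when site access is expensive relative to subsampling or when within-unit variability dominates.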
Abstract:
A study was conducted to assess ecological condition and potential human-health risks in subtidal estuarine waters throughout the North Carolina National Estuarine Research Reserve System (NERRS) (Currituck Sound, Rachel Carson, Masonboro Island, and Zeke’s Island). Field work was conducted in September 2006 and incorporated multiple indicators of ecosystem condition, including measures of water quality (dissolved oxygen, salinity, temperature, pH, nutrients and chlorophyll, suspended solids), sediment quality (granulometry, organic matter content, chemical contaminant concentrations), biological condition (diversity and abundances of benthic fauna, fish contaminant levels and pathologies), and human dimensions (fish-tissue contaminant levels relative to human-health consumption limits, various aesthetic properties). A probabilistic sampling design permitted statistical estimation of the spatial extent of degraded versus non-degraded condition across these estuaries relative to specified threshold levels of the various indicators (where possible). With some exceptions, these reserves appeared to be in relatively good to fair ecological condition overall, with the majority of the area (about 54%) having various water quality, sediment quality, and biological (benthic) condition indicators rated in the healthy to intermediate range of corresponding guideline thresholds. Only three stations, representing 10.5% of the area, had one or more of these indicators rated as poor/degraded in all three categories. While such a conclusion is encouraging from a coastal management perspective, it should be viewed with some caution. For example, although co-occurrences of adverse biological and abiotic environmental conditions were limited, at least one indicator of ecological condition rated in the poor/degraded range was observed over a broader area (35.5%) represented by 11 of the 30 stations sampled. In addition, the fish-tissue contaminant data were not included in these overall spatial estimates; however, the majority of samples (77% of fish analyzed, from 79% of stations where fish were caught) contained inorganic arsenic above the consumption limits for human cancer risks, though most likely derived from natural sources. Similarly, aesthetic indicators are not reflected in these spatial estimates of ecological condition, though there was evidence of noxious odors in sediments at many of the stations. Such symptoms reflect a growing realization that North Carolina estuaries are under multiple pressures from a variety of natural and human influences. These data also suggest that, while the current status of overall ecological condition appears to be good to fair, long-term monitoring is warranted to track potential changes in the future. This study establishes an important baseline of overall ecological condition within NC NERRS that can be used to evaluate any such future changes and to trigger appropriate management actions in this rapidly evolving coastal environment. (PDF contains 76 pages)
Abstract:
A study was conducted, in association with the Sapelo Island and North Carolina National Estuarine Research Reserves (NERRs), to evaluate the impacts of coastal development on sentinel habitats (e.g., tidal creek ecosystems), including potential impacts to human health and well-being. Uplands associated with southeastern tidal creeks and the salt marshes they drain are popular locations for building homes, resorts, and recreational facilities because of the high quality of life and mild climate associated with these environments. Tidal creeks form part of the estuarine ecosystem characterized by high biological productivity, great ecological value, complex environmental gradients, and numerous interconnected processes. This research combined a watershed-level study integrating ecological, public health, and human dimension attributes with watershed-level land use data. The approach used for this research was based upon a comparative watershed and ecosystem approach that sampled tidal creek networks draining developed watersheds (e.g., suburban, urban, and industrial) as well as undeveloped sites. The primary objective of this work was to clearly define the relationships between coastal development, with its concomitant land use changes and non-point source pollution loading, and the ecological and human health and well-being status of tidal creek ecosystems. Nineteen tidal creek systems, located along the southeastern United States coast from southern North Carolina to southern Georgia, were sampled during summer (June-August) of 2005 and 2006. Within each system, creeks were divided into two primary segments based upon tidal zoning: intertidal (i.e., shallow, narrow headwater sections) and subtidal (i.e., deeper and wider sections), and watersheds were delineated for each segment. In total, we report findings on 24 intertidal and 19 subtidal creeks. Indicators sampled throughout each creek included water quality (e.g., dissolved oxygen concentration, salinity, nutrients, chlorophyll-a levels), sediment quality (e.g., characteristics, contaminant levels including emerging contaminants), pathogen and viral indicators, and abundance and genetic responses of biological resources (e.g., macrobenthic and nektonic communities, shellfish tissue contaminants, oyster microarray responses). For many indicators, the intertidally-dominated or headwater portions of tidal creeks were found to respond differently than the subtidally-dominated or larger and deeper portions of tidal creeks. Study results indicate that the integrity and productivity of headwater tidal creeks were impaired by land use changes and associated non-point source pollution, suggesting these habitats are valuable early warning sentinels of ensuing ecological impacts and potential public health threats. For these headwater creeks, this research has assisted the validation of a previously developed conceptual model for the southeastern US region. This conceptual model identified adverse changes that generally occurred in the physical and chemical environment (e.g., water quality indicators such as indicator bacteria for sewage pollution or sediment chemical contamination) when impervious cover levels in the watershed reached 10-20%. Ecological characteristics responded and were generally impaired when impervious cover levels exceeded 20-30%.
Estimates of impervious cover levels defining where human uses are impaired are currently being determined, but it appears that shellfish bed closures and the flooding vulnerability of headwater regions become a concern when impervious cover values exceed 10-30%. This information can be used to forecast the impacts of changing land use patterns on tidal creek environmental quality as well as associated human health and well-being. In addition, this study applied tools and technologies that are adaptable, transferable, and repeatable among the high quality NERRS sites as comparable reference entities to other nearby developed coastal watersheds. The findings herein will be of value in addressing local, regional, and national needs for understanding multiple stressor (anthropogenic and human impacts) effects upon estuarine ecosystems and response trends in ecosystem condition with changing coastal impacts (i.e., development, climate change). (PDF contains 88 pages)
Abstract:
INTRODUCTION: This report summarizes the results of NOAA's sediment toxicity, chemistry, and benthic community studies in the Chesapeake Bay estuary. As part of the National Status and Trends (NS&T) Program, NOAA has conducted studies to determine the spatial extent and severity of chemical contamination and associated adverse biological effects in coastal bays and estuaries of the United States since 1991. Sediment contamination in U.S. coastal areas is a major environmental issue because of its potential toxic effects on biological resources and often, indirectly, on human health. Thus, characterizing and delineating areas of sediment contamination and toxicity and demonstrating their effects on benthic living resources are viewed as important goals of coastal resource management. Benthic community studies have a history of use in regional estuarine monitoring programs and have been shown to be an effective indicator for describing the extent and magnitude of pollution impacts in estuarine ecosystems, as well as for assessing the effectiveness of management actions. Chesapeake Bay is the largest estuarine system in the United States. Including tidal tributaries, the Bay has approximately 18,694 km of shoreline (more than the entire US West Coast). The watershed is over 165,000 km² (64,000 mi²) and includes portions of six states (Delaware, Maryland, New York, Pennsylvania, Virginia, and West Virginia) and the District of Columbia. The population of the watershed exceeds 15 million people. There are 150 rivers and streams in the Chesapeake drainage basin. Within the watershed, five major rivers (the Susquehanna, Potomac, Rappahannock, York, and James) provide almost 90% of the freshwater to the Bay. The Bay receives an equal volume of water from the Atlantic Ocean. In the upper Bay and tributaries, sediments are fine-grained silts and clays. Sediments in the middle Bay are mostly silts and clays derived from shoreline erosion. In the lower Bay, by contrast, the sediments are sandy; these particles come from shore erosion and inputs from the Atlantic Ocean. The introduction of European-style agriculture and large-scale clearing of the watershed produced massive shifts in sediment dynamics of the Bay watershed. As early as the mid-1700s, some navigable rivers were filled in by sediment, and sedimentation caused several colonial seaports to become landlocked. Toxic contaminants enter the Bay via atmospheric deposition, dissolved and particulate runoff from the watershed, or direct discharge. While contaminants enter the Bay from several sources, sediments accumulate many toxic contaminants and thus reveal the status of input for these constituents. In the watershed, loading estimates indicate that the major sources of contaminants are point sources, stormwater runoff, atmospheric deposition, and spills. Point sources and urban runoff in the Bay proper contribute large quantities of contaminants. Pesticide inputs to the Bay have not been quantified. Baltimore Harbor and the Elizabeth River remain among the most contaminated areas in the United States. In the mainstem, deep sediment core analyses indicate that sediment accumulation rates are 2-10 times higher in the northern Bay than in the middle and lower Bay, and that sedimentation rates are 2-10 times higher than before European settlement throughout the Bay (NOAA 1998).
The core samples show a decline in selected PAH compounds over the past several decades, but absolute concentrations are still 1 to 2 orders of magnitude above 'pristine' conditions. Core data also indicate that concentrations of PAHs, PCBs, and organochlorine pesticides do not demonstrate consistent trends over 25 years, but remain 10 times lower than concentrations in tributary sediments. In contrast, tributyltin (TBT) concentrations in the deep cores have declined significantly since its use was severely restricted. (PDF contains 241 pages)
Abstract:
This document describes the analytical methods used to quantify core organic chemicals in tissue and sediment collected as part of NOAA’s National Status and Trends (NS&T) Program for the years 2000-2006. Organic contaminant analytical methods used during the early years of the program are described in NOAA Technical Memoranda NOS ORCA 71 and 130 (Lauenstein and Cantillo, 1993; Lauenstein and Cantillo, 1998) for the years 1984-1992 and 1993-1996, respectively. These reports are available from our website (http://www.ccma.nos.gov). The methods detailed in this document were utilized by the Mussel Watch Project and Bioeffects Project, which are both part of the NS&T Program. The Mussel Watch Project has been monitoring contaminants in bivalves and sediments since 1986 and is the longest active national contaminant monitoring program operating in U.S. coastal waters. Approximately 280 Mussel Watch sites are sampled on biennial and decadal timescales for bivalve tissue and sediment, respectively. Similarly, the Bioeffects Assessment Project began in 1986 to characterize estuaries and near-coastal environs. Using the sediment quality triad approach, which measures (1) levels of contaminants in sediments, (2) incidence and severity of toxicity, and (3) benthic macrofaunal communities, the Bioeffects Project describes the spatial extent of sediment toxicity. Contaminant assessment is a core function of both projects. These methods, while discussed here in the context of sediment and bivalve tissue, were also used with other matrices, including fish fillet, fish liver, nepheloid layer, and suspended particulate matter. The methods described herein are for the core organic contaminants monitored in the NS&T Program and include polycyclic aromatic hydrocarbons (PAHs), polychlorinated biphenyls (PCBs), butyltins, and organochlorines that have been analyzed consistently over the past 15-20 years. Organic contaminants such as dioxins, perfluoro compounds, and polybrominated diphenyl ethers (PBDEs) were analyzed periodically in special studies of the NS&T Program and will be described in another document. All of the analytical techniques described in this document were used by B&B Laboratories, Inc., an affiliate of TDI-Brooks International, Inc., in College Station, Texas, under contract to NOAA. The NS&T Program uses a performance-based system approach to obtain the best possible data quality and comparability, and requires laboratories to demonstrate precision, accuracy, and sensitivity to ensure results-based performance goals and measures. (PDF contains 75 pages)
Abstract:
This report was developed to help establish National Ocean Service priorities and chart new directions for research and development of models for estuarine, coastal and ocean ecosystems based on user-driven requirements and supportive of sound coastal management, stewardship, and an ecosystem approach to management. (PDF contains 63 pages)
Abstract:
Over the past four decades, the state of Hawaii has developed a system of eleven Marine Life Conservation Districts (MLCDs) to conserve and replenish marine resources around the state. Initially established to provide opportunities for public interaction with the marine environment, these MLCDs vary in size, habitat quality, and management regimes, providing an excellent opportunity to test hypotheses concerning marine protected area (MPA) design and function using multiple discrete sampling units. NOAA/NOS/NCCOS/Center for Coastal Monitoring and Assessment’s Biogeography Team developed digital benthic habitat maps for all MLCD and adjacent habitats. These maps were used to evaluate the efficacy of existing MLCDs for biodiversity conservation and fisheries replenishment, using a spatially explicit stratified random sampling design. Coupling the distribution of habitats and species habitat affinities using GIS technology elucidates species habitat utilization patterns at scales that are commensurate with ecosystem processes and is useful in defining essential fish habitat and biologically relevant boundaries for MPAs. Analysis of benthic cover validated the a priori classification of habitat types and provided justification for using these habitat strata to conduct stratified random sampling and analyses of fish habitat utilization patterns. Results showed that the abundance and distribution of species and assemblages exhibited strong correlations with habitat types. Fish assemblages in the colonized and uncolonized hardbottom habitats were found to be most similar among all of the habitat types. Much of the macroalgae habitat sampled was macroalgae growing on hard substrate, and as a result showed similarities with the other hardbottom assemblages. The fish assemblages in the sand habitats were highly variable but distinct from the other habitat types. Management regime also played an important role in the abundance and distribution of fish assemblages. MLCDs had higher values for most fish assemblage characteristics (e.g., biomass, size, diversity) compared with adjacent fished areas and Fisheries Management Areas (FMAs) across all habitat types. In addition, apex predators and other targeted resource species were more abundant and larger in the MLCDs, illustrating the effectiveness of these closures in conserving fish populations. Habitat complexity, quality, size, and level of protection from fishing were important determinants of MLCD effectiveness with respect to their associated fish assemblages. (PDF contains 217 pages)
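Estimates from such spatially explicit stratified random designs rest on the standard stratified estimator; the following is a sketch in conventional notation (not the report's own symbols), ignoring finite population corrections.

\[
\bar{y}_{st} \;=\; \sum_{h=1}^{H} W_h \,\bar{y}_h,
\qquad
\widehat{\mathrm{Var}}\left(\bar{y}_{st}\right) \;=\; \sum_{h=1}^{H} W_h^{2}\,\frac{s_h^{2}}{n_h}
\]

Here W_h is the areal weight of habitat stratum h, \bar{y}_h and s_h^2 are the sample mean and variance of a fish assemblage metric within that stratum, and n_h is the number of sites sampled there; stratifying by mapped habitat is what yields per-stratum estimates while keeping survey costs down.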
Abstract:
This report describes the workings of the National Centers for Coastal Ocean Science (NCCOS) Wave Exposure Model (WEMo), which predicts the exposure of a site in estuarine and enclosed waters to local wind-generated waves. WEMo works in two different modes: the Representative Wave Energy (RWE) mode calculates exposure using physical parameters like wave energy and wave height, while the Relative Exposure Index (REI) mode empirically calculates exposure as a unitless index. The detailed operation of the model in both modes and their procedures are described, along with a few sample runs. WEMo model output in RWE mode (wave height and wave energy) is compared against data collected from wave sensors near Harkers Island, North Carolina, for validation purposes. Computed results agreed well with the wave sensor data, indicating that WEMo can be an effective tool for predicting local wave energy in enclosed estuarine environments. (PDF contains 31 pages)
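Relative exposure indices of this kind are typically fetch-based: for each compass heading, the mean wind speed, the frequency of wind from that heading, and the effective fetch are multiplied, and the products are summed. The sketch below is a minimal single-site version with hypothetical wind and fetch values; it illustrates the class of index, not WEMo's exact formulation.

    # Relative exposure index for one site: sum over 8 compass headings of
    # (mean wind speed) x (frequency of wind from that heading) x (effective fetch).
    # All values are hypothetical illustrations.
    wind_speed <- c(5.1, 6.3, 7.8, 6.0, 4.2, 3.9, 5.5, 6.7)         # m/s per heading
    wind_freq  <- c(0.10, 0.15, 0.20, 0.12, 0.08, 0.10, 0.11, 0.14) # fraction of record
    fetch_km   <- c(1.2, 3.4, 8.9, 4.1, 0.6, 0.4, 2.2, 5.0)         # effective fetch

    rei <- sum(wind_speed * wind_freq * fetch_km)  # unitless ranking index
    rei

Because the index is unitless, it is suited to ranking sites by relative exposure rather than to estimating absolute wave energy, which is what the RWE mode provides.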
Ongoing monitoring of Tortugas Ecological Reserve: Assessing the consequences of reserve designation
Abstract:
Over the past five years, a biogeographic characterization of Tortugas Ecological Reserve (TER) has been carried out to measure the post-implementation effects of TER as a refuge for exploited species. Our results demonstrate that there is substantial microalgal biomass at depths between 10 and 30 m in the soft sediments at the coral reef interface, and that this community may play an important role in the food web supporting reef organisms. In addition, preliminary stable isotope data, in conjunction with prior results from the west Florida shelf, suggest that the shallow water benthic habitats surrounding the coral reefs of TER will prove to be an important source of the primary production ultimately fueling fish production throughout TER. The majority of the fish analyzed so far have exhibited a C isotope signature consistent with a food web that relies heavily on benthic primary production. Fish counts indicate a marked increase in the abundance of large fish (>20 cm) within the Reserve relative to the Out and Park strata, across years. Faunal collections from open and protected soft bottom habitat near the northern boundary of Tortugas North strongly suggest that relaxation of trawling pressure has increased benthic biomass and diversity in this area of TER. These data, employing an integrated Before-After Control-Impact (BACI) design at multiple spatial scales, will allow us to continue to document and quantify the post-implementation effects of TER. (PDF contains 58 pages)
Abstract:
Almost all extreme events lasting less than several weeks that significantly impact ecosystems are weather related. This review examines the response of estuarine systems to intense short-term perturbations caused by major weather events such as hurricanes. Current knowledge concerning these effects is limited to relatively few studies where hurricanes and storms impacted estuaries with established environmental monitoring programs. Freshwater inputs associated with these storms were found to initially result in increased primary productivity. When hydrographic conditions are favorable, bacterial consumption of organic matter produced by the phytoplankton blooms and deposited during the initial runoff event can contribute to significant oxygen deficits during subsequent warmer periods. Salinity stress and habitat destruction associated with freshwater inputs, as well as anoxia, adversely affect benthic populations and fish. In contrast, mobile invertebrate species such as shrimp, which have a short life cycle and the ability to migrate during the runoff event, initially benefit from the increased primary productivity and decreased abundance of fish predators. Events studied so far indicate that estuaries rebound in one to three years following major short-term perturbations. However, repeated storm events without sufficient recovery time may cause a fundamental shift in ecosystem structure (Scavia et al. 2002). This is a scenario consistent with the predicted increase in hurricanes for the east coast of the United States. More work on the response of individual species to these stresses is needed so management of commercial resources can be adjusted to allow sufficient recovery time for affected populations.
Abstract:
Policy makers, natural resource managers, regulators, and the public often call on scientists to estimate the potential ecological changes caused by both natural and human-induced stresses, and to determine how those changes will impact people and the environment. To develop accurate forecasts of ecological changes, we need to: 1) increase understanding of ecosystem composition, structure, and functioning; 2) expand ecosystem monitoring and apply advanced scientific information to make these complex data widely available; and 3) develop and improve forecast and interpretative tools that use a scientific basis to assess the results of management and science policy actions. (PDF contains 120 pages)