8 results for injury data quality

in Aquatic Commons


Relevance: 90.00%

Publisher:

Abstract:

The mapping and geospatial analysis of benthic environments are multidisciplinary tasks that have become more accessible in recent years because of advances in technology and cost reductions in survey systems. The complex relationships that exist among physical, biological, and chemical seafloor components require advanced, integrated analysis techniques to enable scientists and others to visualize patterns and, in so doing, allow inferences to be made about benthic processes. Effective mapping, analysis, and visualization of marine habitats are particularly important because the subtidal seafloor environment is not readily viewed directly by eye. Research in benthic environments relies heavily, therefore, on remote sensing techniques to collect effective data. Because many benthic scientists are not mapping professionals, they may not adequately consider the links between data collection, data analysis, and data visualization. Projects often start with clear goals, but may be hampered by the technical details and skills required for maintaining data quality through the entire process from collection through analysis and presentation. The lack of technical understanding of the entire data handling process can represent a significant impediment to success. While many benthic mapping efforts have detailed their methodology as it relates to the overall scientific goals of a project, only a few published papers and reports focus on the analysis and visualization components (Paton et al. 1997, Weihe et al. 1999, Basu and Saxena 1999, Bruce et al. 1997). In particular, the benthic mapping literature often briefly describes data collection and analysis methods, but fails to provide sufficiently detailed explanation of particular analysis techniques or display methodologies so that others can employ them. 
In general, such techniques are in large part guided by the data acquisition methods, which can include both aerial and water-based remote sensing methods to map the seafloor without physical disturbance, as well as physical sampling methodologies (e.g., grab or core sampling). The terms benthic mapping and benthic habitat mapping are often used synonymously to describe seafloor mapping conducted for the purpose of benthic habitat identification. There is a subtle yet important difference, however, between general benthic mapping and benthic habitat mapping. The distinction is important because it dictates the sequential analysis and visualization techniques that are employed following data collection. In this paper general seafloor mapping for identification of regional geologic features and morphology is defined as benthic mapping. Benthic habitat mapping incorporates the regional scale geologic information but also includes higher resolution surveys and analysis of biological communities to identify the biological habitats. In addition, this paper adopts the definition of habitats established by Kostylev et al. (2001) as a “spatially defined area where the physical, chemical, and biological environment is distinctly different from the surrounding environment.” (PDF contains 31 pages)

Relevance: 80.00%

Publisher:

Abstract:

(Document pdf contains 193 pages)

Executive Summary (pdf, < 0.1 Mb)
1. Introduction (pdf, 0.2 Mb)
   1.1 Data sharing, international boundaries and large marine ecosystems
2. Objectives (pdf, 0.3 Mb)
3. Background (pdf, < 0.1 Mb)
   3.1 North Pacific Ecosystem Metadatabase
   3.2 First federation effort: NPEM and the Korea Oceanographic Data Center
   3.2 Continuing effort: Adding Japan’s Marine Information Research Center
4. Metadata Standards (pdf, < 0.1 Mb)
   4.1 Directory Interchange Format
   4.2 Ecological Metadata Language
   4.3 Dublin Core
      4.3.1 Elements of DC
   4.4 Federal Geographic Data Committee
   4.5 The ISO 19115 Metadata Standard
   4.6 Metadata stylesheets
   4.7 Crosswalks
   4.8 Tools for creating metadata
5. Communication Protocols (pdf, < 0.1 Mb)
   5.1 Z39.50
      5.1.1 What does Z39.50 do?
      5.1.2 Isite
6. Clearinghouses (pdf, < 0.1 Mb)
7. Methodology (pdf, 0.2 Mb)
   7.1 FGDC metadata
      7.1.1 Main sections
      7.1.2 Supporting sections
      7.1.3 Metadata validation
   7.2 Getting a copy of Isite
   7.3 NSDI Clearinghouse
8. Server Configuration and Technical Issues (pdf, 0.4 Mb)
   8.1 Hardware recommendations
   8.2 Operating system – Red Hat Linux Fedora
   8.3 Web services – Apache HTTP Server version 2.2.3
   8.4 Create and validate FGDC-compliant metadata in XML format
   8.5 Obtaining, installing and configuring Isite for UNIX/Linux
      8.5.1 Download the appropriate Isite software
      8.5.2 Untar the file
      8.5.3 Name your database
      8.5.4 The zserver.ini file
      8.5.5 The sapi.ini file
      8.5.6 Indexing metadata
      8.5.7 Start the Clearinghouse Server process
      8.5.8 Testing the zserver installation
   8.6 Registering with NSDI Clearinghouse
   8.7 Security issues
9. Search Tutorial and Examples (pdf, 1 Mb)
   9.1 Legacy NSDI Clearinghouse search interface
   9.2 New GeoNetwork search interface
10. Challenges (pdf, < 0.1 Mb)
11. Emerging Standards (pdf, < 0.1 Mb)
12. Future Activity (pdf, < 0.1 Mb)
13. Acknowledgments (pdf, < 0.1 Mb)
14. References (pdf, < 0.1 Mb)
15. Acronyms (pdf, < 0.1 Mb)
16. Appendices
   16.1 KODC-NPEM meeting agendas and minutes (pdf, < 0.1 Mb)
      16.1.1 Seattle meeting agenda, August 22–23, 2005
      16.1.2 Seattle meeting minutes, August 22–23, 2005
      16.1.3 Busan meeting agenda, October 10–11, 2005
      16.1.4 Busan meeting minutes, October 10–11, 2005
   16.2 MIRC-NPEM meeting agendas and minutes (pdf, < 0.1 Mb)
      16.2.1 Seattle meeting agenda, August 14–15, 2006
      16.2.2 Seattle meeting minutes, August 14–15, 2006
      16.2.3 Tokyo meeting agenda, October 19–20, 2006
      16.2.4 Tokyo meeting minutes, October 19–20, 2006
   16.3 XML stylesheet conversion crosswalks (pdf, < 0.1 Mb)
      16.3.1 FGDCI to DIF stylesheet converter
      16.3.2 DIF to FGDCI stylesheet converter
      16.3.3 String-modified stylesheet
   16.4 FGDC Metadata Standard (pdf, 0.1 Mb)
      16.4.1 Overall structure
      16.4.2 Section 1: Identification information
      16.4.3 Section 2: Data quality information
      16.4.4 Section 3: Spatial data organization information
      16.4.5 Section 4: Spatial reference information
      16.4.6 Section 5: Entity and attribute information
      16.4.7 Section 6: Distribution information
      16.4.8 Section 7: Metadata reference information
      16.4.9 Sections 8, 9 and 10: Citation information, time period information, and contact information
   16.5 Images of the Isite server directory structure and the files contained in each subdirectory after Isite installation (pdf, 0.2 Mb)
   16.6 Listing of NPEM’s Isite configuration files (pdf, < 0.1 Mb)
      16.6.1 zserver.ini
      16.6.2 sapi.ini
   16.7 Java program to extract records from the NPEM metadatabase and write one XML file for each record (pdf, < 0.1 Mb)
   16.8 Java program to execute the metadata extraction program (pdf, < 0.1 Mb)
A1 Addendum 1: Instructions for Isite for Windows (pdf, 0.6 Mb)
A2 Addendum 2: Instructions for Isite for Windows ADHOST (pdf, 0.3 Mb)
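The stylesheet crosswalks in section 16.3 convert records between metadata standards (e.g., FGDC and DIF) by mapping element names from one schema to the other. A minimal Python sketch of that idea follows; the field names are simplified, illustrative stand-ins, not the standards' full element sets, and the report itself uses XML stylesheets rather than Python:

```python
# Illustrative crosswalk between two metadata schemas.
# The keys below are simplified stand-ins, not the complete
# FGDC or DIF element definitions.
FGDC_TO_DIF = {
    "title": "Entry_Title",
    "abstract": "Summary",
    "origin": "Data_Set_Citation",
}

def crosswalk(record, mapping):
    """Rename a record's keys per the mapping; drop unmapped fields."""
    return {mapping[k]: v for k, v in record.items() if k in mapping}

fgdc_record = {"title": "SST at Hopkins", "abstract": "Daily SST series.",
               "edition": "1"}
dif_record = crosswalk(fgdc_record, FGDC_TO_DIF)
# "edition" has no counterpart in this mapping, so it is dropped.
```

Dropped fields are the usual lossy part of a crosswalk, which is why the report documents the mappings explicitly.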

Relevance: 80.00%

Publisher:

Abstract:

This document describes the analytical methods used to quantify core organic chemicals in tissue and sediment collected as part of NOAA’s National Status and Trends Program (NS&T) for the years 2000-2006. Organic contaminant analytical methods used during the early years of the program are described in NOAA Technical Memoranda NOS ORCA 71 and 130 (Lauenstein and Cantillo, 1993; Lauenstein and Cantillo, 1998) for the years 1984-1992 and 1993-1996, respectively. These reports are available from our website (http://www.ccma.nos.gov). The methods detailed in this document were utilized by the Mussel Watch Project and the Bioeffects Project, which are both part of the NS&T Program. The Mussel Watch Project has been monitoring contaminants in bivalves and sediments since 1986 and is the longest active national contaminant monitoring program operating in U.S. coastal waters. Approximately 280 Mussel Watch sites are sampled on biennial and decadal timescales for bivalve tissue and sediment, respectively. Similarly, the Bioeffects Assessment Project began in 1986 to characterize estuaries and near-coastal environs. Using the sediment quality triad approach, which measures (1) levels of contaminants in sediments, (2) incidence and severity of toxicity, and (3) benthic macrofaunal communities, the Bioeffects Project describes the spatial extent of sediment toxicity. Contaminant assessment is a core function of both projects. These methods, while discussed here in the context of sediment and bivalve tissue, were also used with other matrices, including fish fillet, fish liver, nepheloid layer, and suspended particulate matter. The methods described herein are for the core organic contaminants monitored in the NS&T Program and include polycyclic aromatic hydrocarbons (PAHs), polychlorinated biphenyls (PCBs), butyltins, and organochlorines that have been analyzed consistently over the past 15-20 years.
Organic contaminants such as dioxins, perfluorinated compounds, and polybrominated diphenyl ethers (PBDEs) were analyzed periodically in special studies of the NS&T Program and will be described in another document. All of the analytical techniques described in this document were used by B&B Laboratories, Inc., an affiliate of TDI-Brooks International, Inc., in College Station, Texas, under contract to NOAA. The NS&T Program uses a performance-based system approach to obtain the best possible data quality and comparability, and requires laboratories to demonstrate precision, accuracy, and sensitivity to ensure results-based performance goals and measures. (PDF contains 75 pages)
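A performance-based system judges a laboratory by demonstrated precision and accuracy rather than by mandating one method. A minimal sketch of two such measures, computed from hypothetical replicate analyses of a certified reference material; the concentrations and the certified value below are illustrative, not NS&T acceptance criteria:

```python
import statistics

def percent_rsd(replicates):
    """Precision: relative standard deviation of replicate measurements."""
    return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

def percent_recovery(measured_mean, certified_value):
    """Accuracy: measured mean as a percentage of the certified value."""
    return 100.0 * measured_mean / certified_value

# Hypothetical replicate PAH concentrations (ng/g) for a reference
# sediment certified at 100 ng/g.
reps = [96.0, 101.0, 99.0, 104.0]
precision = percent_rsd(reps)
accuracy = percent_recovery(statistics.mean(reps), 100.0)
```

A laboratory would compare these statistics against the program's published acceptance limits before its data are accepted.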

Relevance: 80.00%

Publisher:

Abstract:

EXECUTIVE SUMMARY: The Coastal Change Analysis Program (C-CAP) is developing a nationally standardized database on land cover and habitat change in the coastal regions of the United States. C-CAP is part of the Estuarine Habitat Program (EHP) of NOAA's Coastal Ocean Program (COP). C-CAP inventories coastal submersed habitats, wetland habitats, and adjacent uplands and monitors changes in these habitats on a one- to five-year cycle. This type of information and frequency of detection are required to improve scientific understanding of the linkages of coastal and submersed wetland habitats with adjacent uplands and with the distribution, abundance, and health of living marine resources. The monitoring cycle will vary according to the rate and magnitude of change in each geographic region. Satellite imagery (primarily Landsat Thematic Mapper), aerial photography, and field data are interpreted, classified, analyzed, and integrated with other digital data in a geographic information system (GIS). The resulting land-cover change databases are disseminated in digital form for use by anyone wishing to conduct geographic analysis in the completed regions. C-CAP spatial information on coastal change will be input to EHP conceptual and predictive models to support coastal resource policy planning and analysis. C-CAP products will include 1) spatially registered digital databases and images, 2) tabular summaries by state, county, and hydrologic unit, and 3) documentation. Aggregations to larger areas (representing habitats, wildlife refuges, or management districts) will be provided on a case-by-case basis. Ongoing C-CAP research will continue to explore techniques for remote determination of biomass, productivity, and functional status of wetlands and will evaluate new technologies (e.g., remote sensor systems, global positioning systems, image processing algorithms) as they become available.
Selected hardcopy land-cover change maps will be produced at local (1:24,000) to regional (1:500,000) scales for distribution. Digital land-cover change data will be provided to users for the cost of reproduction. Much of the guidance contained in this document was developed through a series of professional workshops and interagency meetings that focused on a) coastal wetlands and uplands; b) coastal submersed habitat, including aquatic beds; c) user needs; d) regional issues; e) classification schemes; f) change detection techniques; and g) data quality. Invited participants included technical and regional experts and representatives of key State and Federal organizations. Coastal habitat managers and researchers were given an opportunity for review and comment. This document summarizes C-CAP protocols and procedures that are to be used by scientists throughout the United States to develop consistent and reliable coastal change information for input to the C-CAP nationwide database. It also provides useful guidelines for contributors working on related projects. It is considered a working document subject to periodic review and revision. (PDF file contains 104 pages.)
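The change databases described above come from comparing classified land-cover maps across dates. A toy post-classification sketch of that comparison, tallying class-to-class transitions between two tiny hypothetical label grids (actual C-CAP processing classifies full Landsat scenes in a GIS; the grids and class codes here are invented for illustration):

```python
from collections import Counter

# Hypothetical 3x3 land-cover grids for two dates;
# codes: W = water, M = marsh, U = upland.
date1 = ["WWM",
         "WMM",
         "MUU"]
date2 = ["WWM",
         "WMU",
         "UUU"]

# Count each from->to transition, cell by cell.
changes = Counter(
    (a, b)
    for row1, row2 in zip(date1, date2)
    for a, b in zip(row1, row2)
)
unchanged = sum(n for (a, b), n in changes.items() if a == b)
marsh_loss = changes[("M", "U")]   # marsh converted to upland
```

The `changes` counter is effectively a change matrix, the same structure behind the tabular summaries by state, county, and hydrologic unit mentioned above.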

Relevance: 80.00%

Publisher:

Abstract:

The use of self-contained, low-maintenance sensor systems installed on commercial vessels is becoming an important monitoring and scientific tool in many regions around the world. These systems integrate data from meteorological and water quality sensors with GPS data into a data stream that is automatically transferred from ship to shore. To begin linking some of this developing expertise, the Alliance for Coastal Technologies (ACT) and the European Coastal and Ocean Observing Technology (ECOOT) organized a workshop on this topic in Southampton, United Kingdom, October 10-12, 2006. The participants included technology users, technology developers, and shipping representatives. They collaborated to identify sensors currently employed on integrated systems, users of these data, limitations associated with these systems, and ways to overcome those limitations. The group also identified additional technologies that could be employed on future systems and examined whether standard architectures and data protocols for integrated systems should be established. Participants at the workshop defined 17 different parameters currently being measured by integrated systems. They found that diverse user groups, ranging from resource management agencies such as the Environmental Protection Agency (EPA) to local tourism groups and educational organizations, utilize information from these systems. Among the limitations identified were instrument compatibility and interoperability, data quality control and quality assurance, and sensor calibration and/or maintenance frequency. Standardization of these integrated systems was viewed as both advantageous and disadvantageous: while participants believed that standardization could be beneficial on many levels, they also felt that users may be hesitant to purchase a suite of instruments from a single manufacturer, and that a "plug and play" system including sensors from multiple manufacturers may be difficult to achieve.
A priority recommendation and conclusion for the general integrated sensor system community was to provide vessel operators with real-time access to relevant data (e.g., ambient temperature and salinity to increase the efficiency of water treatment systems, and meteorological data for increased vessel safety and operating efficiency) for broader system value. Simplified data displays are also required for education and public outreach/awareness. Other key recommendations were to encourage the use of integrated sensor packages within observing systems such as IOOS and EuroGOOS, identify additional customers of sensor system data, and publish results of previous work in peer-reviewed journals to increase agency and scientific awareness of, and confidence in, the technology. Priority recommendations and conclusions for ACT entailed highlighting the value of integrated sensor systems for vessels of opportunity through articles in the popular and marine science press. [PDF contains 28 pages]
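The core integration step these systems perform is stamping each set of sensor readings with position and time before the record is transmitted ashore. A minimal sketch of that merge; the field names and JSON serialization are illustrative only, not a protocol from the workshop:

```python
import json

def make_record(gps_fix, met, water):
    """Bundle one GPS fix with concurrent meteorological and
    water-quality readings into a single transmittable record
    (serialized here as JSON)."""
    return json.dumps({
        "time": gps_fix["time"],
        "lat": gps_fix["lat"],
        "lon": gps_fix["lon"],
        "air_temp_c": met["air_temp_c"],
        "wind_ms": met["wind_ms"],
        "sst_c": water["sst_c"],
        "salinity_psu": water["salinity_psu"],
    })

record = make_record(
    {"time": "2006-10-10T12:00:00Z", "lat": 50.90, "lon": -1.40},
    {"air_temp_c": 14.2, "wind_ms": 6.1},
    {"sst_c": 15.8, "salinity_psu": 35.1},
)
```

Agreeing on a shared record schema like this, across manufacturers, is exactly the standardization question the workshop debated.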

Relevance: 80.00%

Publisher:

Abstract:

Daily sea surface temperatures have been acquired at the Hopkins Marine Station in Pacific Grove, California, since January 20, 1919. This time series is one of the longest oceanographic records along the U.S. west coast. Because of its length it is well suited for studying climate-related and oceanic variability on interannual, decadal, and interdecadal time scales. The record, however, is not homogeneous, has numerous gaps, contains possible outliers, and the observations were not always collected at the same time each day. Because of these problems we have undertaken the task of reconstructing this long and unique series. We describe the steps that were taken and the methods that were used in this reconstruction. Although the methods employed are basic, we believe that they are consistent with the quality of the data. The reconstructed record has values, original or estimated, at every time point, and has been adjusted for time-of-day variations where this information was available. Possible outliers have also been examined and replaced where their credibility could not be established. Many of the studies that have employed the Hopkins time series have not discussed the issue of data quality and how these problems were addressed. Because of growing interest in this record, it is important that a single, well-documented version be adopted, so that the results of future analyses can be directly compared. Although additional work may be done to further improve the quality of this record, it is now available via the internet. [PDF contains 48 pages]
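Two of the reconstruction steps described above, filling gaps and screening outliers, can be sketched simply. This toy version linearly interpolates interior gaps and replaces values far from a local median; the window size, threshold, and data are illustrative, not the report's actual procedure:

```python
import statistics

def fill_gaps(series):
    """Linearly interpolate runs of None (interior gaps only)."""
    out = list(series)
    i = 0
    while i < len(out):
        if out[i] is None:
            j = i
            while j < len(out) and out[j] is None:
                j += 1
            if 0 < i and j < len(out):          # interior gap
                left, right = out[i - 1], out[j]
                for k in range(i, j):
                    frac = (k - i + 1) / (j - i + 1)
                    out[k] = left + frac * (right - left)
            i = j
        else:
            i += 1
    return out

def screen_outliers(series, window=5, max_dev=3.0):
    """Replace values deviating from the local median by > max_dev."""
    out = list(series)
    for i, v in enumerate(series):
        lo, hi = max(0, i - window), min(len(series), i + window + 1)
        med = statistics.median(series[lo:hi])
        if abs(v - med) > max_dev:
            out[i] = med
    return out

# Toy daily SST (deg C) with a two-day gap and one suspect spike.
sst = [12.0, 12.2, None, None, 12.8, 25.0, 12.9]
sst = screen_outliers(fill_gaps(sst))
```

A real reconstruction would also handle leading/trailing gaps and judge each flagged outlier individually, as the report describes, rather than replacing automatically.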

Relevance: 80.00%

Publisher:

Abstract:

Assessing the vulnerability of stocks to fishing practices in U.S. federal waters was recently highlighted by the National Marine Fisheries Service (NMFS), National Oceanic and Atmospheric Administration, as an important factor to consider when 1) identifying stocks that should be managed and protected under a fishery management plan; 2) grouping data-poor stocks into relevant management complexes; and 3) developing precautionary harvest control rules. To assist the regional fishery management councils in determining vulnerability, NMFS elected to use a modified version of a productivity and susceptibility analysis (PSA) because it can be based on qualitative data, has a history of use in other fisheries, and is recommended by several organizations as a reasonable approach for evaluating risk. A PSA scores a number of productivity and susceptibility attributes for a stock; from these attributes, index scores and measures of uncertainty are computed and graphically displayed. To demonstrate the utility of the resulting vulnerability evaluation, we evaluated six U.S. fisheries targeting 162 stocks that exhibited varying degrees of productivity and susceptibility, and for which data quality varied. Overall, the PSA was capable of differentiating the vulnerability of stocks along the gradient of susceptibility and productivity indices, although fixed thresholds separating low-, moderate-, and highly vulnerable species were not observed. The PSA can be used as a flexible tool that can incorporate region-specific information on fishery and management activity.
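In a typical PSA, each attribute is scored on a 1-3 scale, the scores are averaged into a productivity index and a susceptibility index, and vulnerability is taken as the stock's distance from the low-risk corner of the productivity-susceptibility plot. A minimal unweighted sketch; the attribute lists, scores, and the distance-from-(3, 1) convention follow common PSA practice but are illustrative, not the modified NMFS implementation described above:

```python
import math

def psa_vulnerability(productivity_scores, susceptibility_scores):
    """Average attribute scores (1 = low .. 3 = high) into two indices,
    then take the Euclidean distance from the least-vulnerable corner:
    high productivity (3) and low susceptibility (1)."""
    p = sum(productivity_scores) / len(productivity_scores)
    s = sum(susceptibility_scores) / len(susceptibility_scores)
    return math.hypot(3.0 - p, s - 1.0)

# Hypothetical stock: low productivity, high susceptibility.
prod = [1, 1, 2, 1]   # e.g. growth rate, fecundity, max age, age at maturity
susc = [3, 2, 3, 3]   # e.g. overlap with fishery, selectivity, mortality
v = psa_vulnerability(prod, susc)
```

Real applications weight the attributes and carry data-quality scores alongside the indices, which is how the uncertainty measures mentioned above arise.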

Relevance: 80.00%

Publisher:

Abstract:

Nonindigenous species (NIS) are a major threat to marine ecosystems, with possible dramatic effects on biodiversity, biological productivity, habitat structure, and fisheries. The Papahānaumokuākea Marine National Monument (PMNM) has taken active steps to mitigate the threats of NIS in the Northwestern Hawaiian Islands (NWHI). Of particular concern are the 13 NIS already detected in NWHI and two invasive species found among the main Hawaiian Islands, snowflake coral (Carijoa riisei) and a red alga (Hypnea musciformis). Much of the information regarding NIS in NWHI has been collected or informed by surveys using conventional SCUBA or fishing gear. These technologies have significant drawbacks: SCUBA is generally constrained to depths shallower than 40 m, and several NIS of concern have been detected well below this limit (e.g., L. kasmira at 256 m), while fishing gear is highly selective. Consequently, not all habitats or species can be properly represented. Effective management of NIS requires knowledge of their spatial distribution and abundance over their entire range. Surveys which provide this requisite information can be expensive, especially in the marine environment and even more so in deep water. Technologies which minimize costs, increase the probability of detection, and are capable of satisfying multiple objectives simultaneously are desired. This report examines survey technologies, with a focus on towed camera systems (TCSs), and modeling techniques which can increase NIS detection and sampling efficiency in deepwater habitats of NWHI, thus filling a critical data gap in present datasets. A pilot study conducted in 2008 at French Frigate Shoals and Brooks Banks was used to investigate the application of TCSs for surveying NIS in habitats deeper than 40 m. Cost and data quality were assessed. Over 100 hours of video were collected, in which 124 sightings of NIS were made among benthic habitats from 20 to 250 m.
Most sightings were of a single cosmopolitan species, Lutjanus kasmira, but Cephalopholis argus and Lutjanus fulvus were also detected. The data expand the spatial distributions of observed NIS into deepwater habitats, identify algal plain as an important habitat, and complement existing data collected using SCUBA and fishing gear. The technology’s principal drawback was its inability to identify organisms of particular concern, such as Carijoa riisei and Hypnea musciformis, due to inadequate camera resolution and an inability to thoroughly inspect sites. To address this issue we recommend incorporating high-resolution cameras into TCSs, or using alternative technologies, such as technical SCUBA diving or remotely operated vehicles, in place of TCSs. We compared several different survey technologies by cost and their ability to detect NIS; these results are summarized in Table 3.