978 results for National Space Science Data Center
Abstract:
The purpose of the study was to explore how a public IT-services transferor organization, comprised of autonomous entities, can develop and organize its data center cost recovery mechanisms effectively and fairly. The lack of a well-defined charging model and cost recovery scheme can cause various problems; for example, one entity may end up subsidizing the costs of another. Transfer pricing is in the best interest of each autonomous entity in a CCA. While transfer pricing plays a pivotal role in setting the prices of services and intangible assets, TCE focuses on the arrangement at the boundary between entities. TCE is concerned with the cost, autonomy, and cooperation issues of an organization; the theory examines the factors that influence intra-firm transaction costs and attempts to expose the problems involved in determining the charges or prices of transactions. This study was carried out as a single case study in a public organization. The organization intended to take over the IT services of its affiliated public entities and was in the process of establishing a joint municipal data center. Nine semi-structured interviews, including two pilot interviews, were conducted with experts and managers of the case company and its affiliated entities. The purpose of these interviews was to explore the charging and pricing issues of intra-firm transactions. To process and summarize the findings, the study employed qualitative techniques with multiple methods of data collection. By reviewing TCE theory and a sample of the transfer pricing literature, the study created an IT services pricing framework as a conceptual tool for illustrating the structure of transferred costs. Antecedents and consequences of the transfer price based on TCE were developed, and an explanatory fair charging model was eventually developed and suggested.
The findings of the study suggested that a chargeback system is an inappropriate scheme for an organization with affiliated autonomous entities. The main contribution of the study is the application of transfer pricing methodologies in the public sphere, with no tax considerations involved.
Abstract:
Modern data centers host hundreds of thousands of servers to achieve economies of scale. Such a huge number of servers creates challenges for the data center network (DCN) to provide proportionally large bandwidth. In addition, the deployment of virtual machines (VMs) in data centers raises the requirements for efficient resource allocation and fine-grained resource sharing. Further, the large number of servers and switches in the data center consumes significant amounts of energy. Even though servers are becoming more energy efficient through various energy-saving techniques, the DCN still accounts for 20% to 50% of the energy consumed by the entire data center. The objective of this dissertation is to enhance DCN performance as well as its energy efficiency by optimizing both the host and network sides. First, as the DCN demands huge bisection bandwidth to interconnect all the servers, we propose a parallel packet switch (PPS) architecture that directly processes variable-length packets without segmentation-and-reassembly (SAR). The proposed PPS achieves large bandwidth by combining the switching capacities of multiple fabrics, and it further improves switch throughput by avoiding the padding bits required in SAR. Second, since certain resource demands of a VM are bursty and stochastic in nature, we propose the Max-Min Multidimensional Stochastic Bin Packing (M3SBP) algorithm to satisfy both deterministic and stochastic demands in VM placement. M3SBP calculates an equivalent deterministic value for the stochastic demands and maximizes the minimum resource utilization ratio of each server. Third, to provide the necessary traffic isolation for VMs that share the same physical network adapter, we propose the Flow-level Bandwidth Provisioning (FBP) algorithm. By reducing the flow scheduling problem to multiple stages of packet queuing problems, FBP guarantees the provisioned bandwidth and delay performance of each flow.
Finally, although DCNs are typically provisioned with full bisection bandwidth, DCN traffic exhibits fluctuating patterns; we therefore propose a joint host-network optimization scheme to enhance the energy efficiency of DCNs during off-peak traffic hours. The proposed scheme uses a unified representation that converts the VM placement problem into a routing problem, and employs depth-first and best-fit search to find efficient paths for flows.
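The M3SBP idea described in this abstract (an equivalent deterministic value for stochastic demands, plus a max-min utilization objective) can be sketched roughly as follows. This is a simplified illustration, not the dissertation's actual algorithm: the normal approximation, the z-value, and all function names are assumptions.

```python
# Hypothetical z-value for an overflow probability of ~5% under a
# normal approximation of the stochastic demand (an assumption here;
# the dissertation's exact model is not given in the abstract).
Z_ALPHA = 1.645

def effective_demand(mean, std):
    """Equivalent deterministic value for a stochastic demand."""
    return mean + Z_ALPHA * std

def place_vms(vms, servers):
    """Greedy max-min multidimensional placement sketch.

    vms:     list of per-dimension (mean, std) demand tuples, e.g.
             [[(cpu_mu, cpu_sd), (mem_mu, mem_sd)], ...]
    servers: list of per-dimension capacities, e.g. [[1.0, 1.0], ...]
    Returns a list mapping each VM index to a server index (or None
    if no server can host it without overflow).
    """
    used = [[0.0] * len(cap) for cap in servers]
    placement = []
    for vm in vms:
        demand = [effective_demand(mu, sd) for mu, sd in vm]
        best, best_score = None, -1.0
        for s, cap in enumerate(servers):
            # Skip servers that would overflow in any dimension.
            if any(used[s][d] + demand[d] > cap[d] for d in range(len(cap))):
                continue
            # Score = minimum utilization ratio across dimensions
            # after placing this VM (the max-min objective).
            score = min((used[s][d] + demand[d]) / cap[d]
                        for d in range(len(cap)))
            if score > best_score:
                best, best_score = s, score
        if best is not None:
            for d in range(len(demand)):
                used[best][d] += demand[d]
        placement.append(best)
    return placement
```

Picking the feasible server whose post-placement minimum utilization is largest packs VMs tightly in every dimension at once, which is the intuition the abstract attributes to M3SBP.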
Abstract:
Implementation of the GEOSS/GMES initiative requires the creation and integration of service providers, most of which deliver geospatial data output from a Grid system to interactive users. In this paper, approaches to integrating DOS centers (service providers) used in the Ukrainian segment of GEOSS/GMES are considered, and template solutions for geospatial data visualization subsystems are suggested. The developed patterns are implemented in the DOS center of the Space Research Institute of the National Academy of Sciences of Ukraine and the National Space Agency of Ukraine (NASU-NSAU).
Abstract:
Summary: This cruise report is a summary of a field survey conducted within the Stellwagen Bank National Marine Sanctuary (SBNMS), located between Cape Cod and Cape Ann at the mouth of Massachusetts Bay. The survey was conducted June 14 - June 21, 2008 on NOAA Ship NANCY FOSTER Cruise NF-08-09-CCEHBR. Multiple indicators of ecological condition and human dimensions were sampled synoptically at each of 30 stations throughout SBNMS using a random probabilistic sampling design. Samples were collected for the analysis of benthic community structure and composition; concentrations of chemical contaminants (metals, pesticides, PAHs, PCBs, PBDEs) in sediments and target demersal biota; nutrient and chlorophyll levels in the water column; and other basic habitat characteristics such as depth, salinity, temperature, dissolved oxygen, turbidity, pH, sediment grain size, and organic carbon content. In addition to the fish samples that were collected for analysis of chemical contaminants relative to human-health consumption limits, other human-dimension indicators were sampled as well, including the presence or absence of fishing gear, vessels, surface trash, marine mammals, and noxious sediment odors. The overall purpose of the survey was to collect data to assess the status of ecosystem condition and potential stressor impacts throughout SBNMS, based on these various indicators and corresponding management thresholds, and to provide this information as a baseline for determining how such conditions may be changing with time. While sample analysis is still ongoing, a few preliminary results and observations are reported here. A final report will be completed once all data have been processed.
The results are anticipated to be of value in supporting goals of the SBNMS and National Marine Sanctuary Program aimed at the characterization, protection, and management of sanctuary resources (pursuant to the National Marine Sanctuary Reauthorization Act), as well as a new priority of NCCOS and NOAA to apply ecosystem-based management (EBM) approaches to coastal resources through Integrated Ecosystem Assessments (IEAs) conducted in various coastal regions of the U.S., including the Northeast Atlantic continental shelf. This was a multi-disciplinary partnership effort made possible by scientists from the following organizations: NOAA, National Ocean Service (NOS), National Centers for Coastal Ocean Science (NCCOS), Center for Coastal Environmental Health and Biomolecular Research (CCEHBR), Charleston, SC. U.S. Environmental Protection Agency (EPA), National Health and Environmental Effects Research Laboratory (NHEERL), Atlantic Ecology Division (AED), Narragansett, RI. U.S. Environmental Protection Agency (EPA), National Health and Environmental Effects Research Laboratory (NHEERL), Gulf Ecology Division (GED), Gulf Breeze, FL. U.S. Geological Survey (USGS), National Wetlands Research Center, Gulf Breeze Project Office, Gulf Breeze, FL. NOAA, Office of Marine and Aviation Operations (OMAO), NOAA Ship Nancy Foster. (31pp) (PDF contains 58 pages)
Abstract:
The US National Oceanic and Atmospheric Administration (NOAA) Fisheries Continuous Plankton Recorder (CPR) Survey has sampled four routes: Boston-Nova Scotia (1961-present), New York toward Bermuda (1976-present), Narragansett Bay-Mount Hope Bay-Rhode Island Sound (1998-present) and eastward of Chesapeake Bay (1974-1980). NOAA involvement began in 1974, when it assumed responsibility for the existing Boston-Nova Scotia route from what is now the UK's Sir Alister Hardy Foundation for Ocean Science (SAHFOS). Training, equipment and computer software were provided by SAHFOS to ensure continuity on this route and standard protocols for any new routes. Data for the first 14 years of this route were provided to NOAA by SAHFOS. Comparison of collection methods; sample processing; and sample identification, staging and counting techniques revealed near-consistency between NOAA and SAHFOS. One departure involved phytoplankton counting standards; this has since been addressed and the data corrected. Within- and between-survey taxonomic and life-stage names and their consistency through time were, and continue to be, an issue. For this, a cross-reference table has been generated that contains the SAHFOS taxonomic code, NOAA taxonomic code, NOAA life-stage code, National Oceanographic Data Center (NODC) taxonomic code, Integrated Taxonomic Information System (ITIS) serial number and authority, and consistent use/route. This table is available for review and use by other CPR surveys. Details of the NOAA and SAHFOS comparison and analytical techniques unique to NOAA are presented.
Abstract:
Most multidimensional projection techniques rely on distance (dissimilarity) information between data instances to embed high-dimensional data into a visual space. When data are endowed with Cartesian coordinates, an extra computational effort is necessary to compute the needed distances, making multidimensional projection prohibitive in applications dealing with interactivity and massive data. The novel multidimensional projection technique proposed in this work, called Part-Linear Multidimensional Projection (PLMP), has been tailored to handle multivariate data represented in Cartesian high-dimensional spaces, requiring only distance information between pairs of representative samples. This characteristic renders PLMP faster than previous methods when processing large data sets while still being competitive in terms of precision. Moreover, knowing the range of variation for data instances in the high-dimensional space, we can make PLMP a truly streaming data projection technique, a trait absent in previous methods.
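The part-linear strategy described in this abstract (pairwise distances only for a small set of representative samples, then a cheap mapping for everything else) can be sketched as below. This is an illustration under assumptions, not the paper's PLMP implementation: classical MDS is used here as a stand-in for the landmark projection, and the function name and parameters are invented for the example.

```python
import numpy as np

def project_plmp(X, n_landmarks=50, seed=0):
    """Sketch of a part-linear projection in the spirit of PLMP.

    1. Pick a small set of representative samples (landmarks).
    2. Project only the landmarks with an expensive distance-based
       method (classical MDS here, as a stand-in).
    3. Fit a linear map P by least squares so X_land @ P ~= Y_land,
       then project every remaining instance with one matmul.
    """
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), size=min(n_landmarks, len(X)), replace=False)
    X_land = X[idx]

    # Classical MDS on the landmark distance matrix -- the only step
    # that needs pairwise distances at all.
    D2 = np.square(np.linalg.norm(X_land[:, None] - X_land[None, :], axis=-1))
    n = len(X_land)
    J = np.eye(n) - np.ones((n, n)) / n        # centering matrix
    B = -0.5 * J @ D2 @ J
    w, V = np.linalg.eigh(B)
    top = np.argsort(w)[::-1][:2]              # two largest eigenvalues
    Y_land = V[:, top] * np.sqrt(np.maximum(w[top], 0.0))

    # Least-squares linear map from data space to the visual space.
    P, *_ = np.linalg.lstsq(X_land, Y_land, rcond=None)
    return X @ P  # projects the full data set with a single matmul
```

Because step 3 is a single matrix multiply, new instances can be projected as they arrive, which is the streaming trait the abstract highlights.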
Abstract:
Context. The ESO public survey VISTA Variables in the Via Lactea (VVV) started in 2010. VVV targets 562 sq. deg in the Galactic bulge and an adjacent plane region and is expected to run for about five years. Aims. We describe the progress of the survey observations in the first observing season, the observing strategy, and the quality of the data obtained. Methods. The observations are carried out on the 4-m VISTA telescope in the ZYJHKs filters. In addition to the multi-band imaging, the variability monitoring campaign in the Ks filter has started. Data reduction is carried out using the pipeline at the Cambridge Astronomical Survey Unit. The photometric and astrometric calibration is performed via the numerous 2MASS sources observed in each pointing. Results. The first data release contains the aperture photometry and astrometric catalogues for 348 individual pointings in the ZYJHKs filters taken in the 2010 observing season. The typical image quality is ~0.9-1.0 arcsec. The stringent photometric and image quality requirements of the survey are satisfied in 100% of the JHKs images in the disk area and 90% of the JHKs images in the bulge area. The completeness in the Z and Y images is 84% in the disk and 40% in the bulge. The first season catalogues contain 1.28 × 10^8 stellar sources in the bulge and 1.68 × 10^8 in the disk area detected in at least one of the photometric bands. The combined, multi-band catalogues contain more than 1.63 × 10^8 stellar sources. About 10% of these are double detections because of overlapping adjacent pointings; these overlapping multiple detections are used to characterise the quality of the data. The images in the JHKs bands extend typically ~4 mag deeper than 2MASS. The magnitude limit and photometric quality depend strongly on crowding in the inner Galactic regions. The astrometry for Ks = 15-18 mag has an rms of ~35-175 mas. Conclusions.
The VVV Survey data products offer a unique dataset to map the stellar populations in the Galactic bulge and the adjacent plane and provide an exciting new tool for the study of the structure, content, and star-formation history of our Galaxy, as well as for investigations of the newly discovered star clusters, star-forming regions in the disk, high proper motion stars, asteroids, planetary nebulae, and other interesting objects.
Abstract:
Historical, i.e. pre-1957, upper-air data are a valuable source of information on the state of the atmosphere, in some parts of the world dating back to the early 20th century. However, to date, reanalyses have only partially made use of these data, and only of observations made after 1948. Even for the period between 1948 (the starting year of the NCEP/NCAR (National Centers for Environmental Prediction/National Center for Atmospheric Research) reanalysis) and the International Geophysical Year in 1957 (the starting year of the ERA-40 reanalysis), when global upper-air coverage reached more or less its current status, many observations have not yet been digitised. The Comprehensive Historical Upper-Air Network (CHUAN) has already compiled a large collection of pre-1957 upper-air data. In the framework of the European project ERA-CLIM (European Reanalysis of Global Climate Observations), significant amounts of additional upper-air data have been catalogued (> 1.3 million station days), imaged (> 200 000 images) and digitised (> 700 000 station days) in order to prepare a new input data set for upcoming reanalyses. The records cover large parts of the globe, focussing on regions that are so far less well covered, such as the tropics, the polar regions and the oceans, and on very early upper-air data from Europe and the US. The total number of digitised/inventoried records is 61/101 for moving upper-air data, i.e. data from ships, etc., and 735/1783 for fixed upper-air stations. Here, we give a detailed description of the resulting data set, including the metadata and the quality checking procedures applied. The data will be included in the next version of CHUAN. The data are available at doi:10.1594/PANGAEA.821222
Abstract:
The present data set includes 268,127 vertical in situ fluorescence profiles obtained from several available online databases and from published and unpublished individual sources. Metadata about each profile are given in further detail in the file provided here. The majority of the profiles come from the National Oceanographic Data Center (NODC) and from the fluorescence profiles acquired by Bio-Argo floats available on the Oceanographic Autonomous Observations (OAO) platform (63.7% and 12.5%, respectively).
Different modes of acquisition were used to collect the data presented in this study: (1) CTD profiles are acquired using a fluorometer mounted on a CTD rosette; (2) OSD (Ocean Station Data) profiles are derived from water samples and are defined as low-resolution profiles; (3) UOR (Undulating Oceanographic Recorder) profiles are acquired by a
Abstract:
Includes bibliographies.