377 results for "archiving"


Relevance: 10.00%

Abstract:

When patients enter our emergency room with suspected multiple injuries, Statscan provides a full-body anterior and lateral image for initial diagnosis and can then zoom in on specific smaller areas for a more detailed evaluation. To examine the possible role of Statscan in the management of multiply injured patients, we implemented a modified ATLS® algorithm in which X-rays of the C-spine, chest and pelvis were replaced by a single total-body a.p./lat. radiograph. Between 15 October 2006 and 1 February 2007, 143 trauma patients (mean ISS 15 ± 14, range 3-75) were included. We compared their time in the resuscitation room to that of 650 patients (mean ISS 14 ± 14, range 3-75) who were treated between 1 January 2002 and 1 January 2004 according to the conventional ATLS protocol. The total-body scanning time was 3.5 min (3-6 min), compared to 25.7 min (8-48 min) for conventional X-rays. The total ER time was unchanged: 28.7 min (13-58 min) compared to 29.1 min (15-65 min) with conventional plain radiography. In 116/143 patients, additional CT scans were necessary; in 98/116 of these, full-body trauma CT scans were performed, and in 18/116, selective CT scans were ordered based on Statscan findings. In 43/143 patients, additional conventional X-rays had to be performed, mainly because of inadequate a.p. views of fractured bones. All radiographs were transmitted over the hospital network (Picture Archiving and Communication System, PACS) for immediate simultaneous viewing at different locations. The rapid availability of images for interpretation, owing to their digital nature, and the reduced need for repeat exposures caused by faulty radiography are also felt to be strengths.

Relevance: 10.00%

Abstract:

Organizing and archiving statistical results and processing a subset of those results for publication are important and often underestimated issues in conducting statistical analyses. Because automation of these tasks is often poor, processing results produced by statistical packages is quite laborious and vulnerable to error. I will therefore present a new package called estout that facilitates and automates some of these tasks. This new command can be used to produce regression tables for use with spreadsheets, LaTeX, HTML, or word processors. For example, the results for multiple models can be organized in spreadsheets and can thus be archived in an orderly manner. Alternatively, the results can be directly saved as a publication-ready table for inclusion in, for example, a LaTeX document. estout is implemented as a wrapper for estimates table but has many additional features, such as support for mfx. However, despite its flexibility, estout is—I believe—still very straightforward and easy to use. Furthermore, estout can be customized via so-called defaults files. A tool to make available supplementary statistics called estadd is also provided.
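estout itself is a Stata command, so the snippet below is only a language-neutral sketch of the core bookkeeping it automates: aligning coefficients from several models into one table, with a blank cell where a model omits a regressor. All names and numbers here are invented for illustration.

```python
# Illustrative sketch (not estout itself): collect per-model estimates
# into one aligned table, as estout does for Stata estimation results.

def make_table(models):
    """models: {model_name: {coef_name: estimate}} -> list of rows."""
    # Collect every coefficient name, preserving first-seen order.
    coefs = []
    for est in models.values():
        for name in est:
            if name not in coefs:
                coefs.append(name)
    header = ["coef"] + list(models)
    rows = [header]
    for c in coefs:
        # A blank cell marks a coefficient absent from that model.
        rows.append([c] + [f"{models[m][c]:.3f}" if c in models[m] else ""
                           for m in models])
    return rows

def to_csv(rows):
    """Render the table for a spreadsheet; LaTeX/HTML work the same way."""
    return "\n".join(",".join(r) for r in rows)

models = {
    "model1": {"price": 0.512, "weight": -1.204, "_cons": 3.1},
    "model2": {"price": 0.487, "_cons": 2.9},
}
print(to_csv(make_table(models)))
```

The same row-alignment step is what lets results for multiple models be archived side by side and later re-exported as a publication-ready table.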

Relevance: 10.00%

Abstract:

The active plate margin of South America is characterized by the frequent occurrence of large and devastating subduction earthquakes. Here we focus on marine sedimentary records off Southern Chile that archive the regional paleoseismic history over the Holocene and Late Pleistocene. The investigated records, Ocean Drilling Program (ODP) Site 1232 and SONNE core 50SL, are located at ~40°S and ~38°S within the Perú-Chile trench and are characterized by frequent interbedded strata of turbiditic and hemipelagic origin. On the basis of the sedimentological characteristics and the association with the active margin of Southern Chile, we assume that the turbidites are mainly seismically triggered and may be considered paleo-megaearthquake indicators. However, the long-term changes in turbidite recurrence times appear to be strongly influenced by climate and sea-level changes as well. During sea-level highstands in the Holocene and Marine Isotope Stage (MIS) 5, recurrence times of turbiditic layers are substantially longer, primarily reflecting a climate-induced reduction of sediment availability and enhanced slope stability. In addition, segmented tectonic uplift changes and related drainage inversions likely influenced the postglacial decrease in turbidite frequencies. Glacial turbidite recurrence times (including MIS 2, MIS 3, cold substages of MIS 5, and MIS 6), on the other hand, are of the same order of magnitude as earthquake recurrence times derived from the historical record and other terrestrial paleoseismic archives of the region. Only during these cold stages were sediment availability and slope instability high enough to record the complete sequence of large earthquakes in Southern Chile. Our data thus suggest that earthquake recurrence times on the order of 100 to 200 years have been a persistent feature at least during the last glacial period.

Relevance: 10.00%

Abstract:

The oceans play a critical role in the Earth's climate, but the extent of this role is only partially understood. One major obstacle is the difficulty of making high-quality, globally distributed observations, a feat that is nearly impossible using only ships and other ocean-based platforms. The data collected by satellite-borne ocean color instruments, however, give environmental scientists a synoptic look at the productivity and variability of the Earth's oceans and atmosphere on high-resolution temporal and spatial scales. Three such instruments, the Sea-viewing Wide Field-of-view Sensor (SeaWiFS) onboard ORBIMAGE's OrbView-2 satellite and two Moderate Resolution Imaging Spectroradiometers (MODIS) onboard the National Aeronautics and Space Administration's (NASA) Terra and Aqua satellites, have been in continuous operation since September 1997, February 2000, and June 2002, respectively. To facilitate the assembly of a suitably accurate data set for climate research, members of the NASA Sensor Intercomparison and Merger for Biological and Interdisciplinary Oceanic Studies (SIMBIOS) and SeaWiFS Project Offices devote significant attention to the calibration and validation of these and other ocean color instruments. This article briefly presents results from the SIMBIOS and SeaWiFS Project Offices' (SSPO) satellite ocean color validation activities and describes the SeaWiFS Bio-optical Archive and Storage System (SeaBASS), a state-of-the-art system for archiving, cataloging, and distributing the in situ data used in these activities.

Relevance: 10.00%

Abstract:

This study aims to estimate the influence of open access on the publication patterns of the Argentine scientific community in different subject fields (Medicine; Physics and Astronomy; Agriculture and Biological Sciences; and Social Sciences and Humanities), based on an analysis of the access model of the journals chosen to communicate research results in the period 2008-2010. The output was retrieved from the SCOPUS database, and the journals' access models were determined by consulting DOAJ, e-revist@s, SciELO, Redalyc, PubMed, Romeo-Sherpa and Dulcinea. The real and potential accessibility of the national scientific output via the gold and green roads, respectively, was analysed, as well as access by subscription through the Electronic Library of Science and Technology of Argentina's Ministry of Science, Technology and Productive Innovation. The results show that, on average and across the subject fields studied, 70% of the Argentine scientific output visible in SCOPUS is published in journals that adhere in one way or another to the open access movement, split into 27% for the gold road and 43% for journals that permit self-archiving via the green road. Between 16% and 30% (depending on the subject area) of the articles published in journals that permit self-archiving are accessed by subscription. The share of journals without open access is around 30% in Social Sciences and Humanities and reaches about 45% in the remaining areas. We conclude that Argentina has very favourable conditions for opening up a high percentage of the scientific literature generated in the country through institutional repositories and self-archiving mandates, which would also help increase the accessibility and long-term preservation of the national scientific and technological output.


Relevance: 10.00%

Abstract:

In 2008, the 50th anniversary of the IGY (International Geophysical Year), WDCMARE presents with this CD publication 3632 data sets in Open Access, comprising some of the most important results from 73 cruises of the research vessel METEOR between 1964 and 1985. The archive is a coherently organized collection of published and unpublished data sets produced by scientists of all marine research disciplines who participated in METEOR expeditions, measured environmental parameters during cruises, and investigated sample material post-cruise in the labs of the participating institutions. In most cases, the data were gathered from the Meteor Forschungsergebnisse, published by the Deutsche Forschungsgemeinschaft (DFG). A second important data source is the time series and radiosonde ascents from more than 20 years of shipboard weather observations, provided by the Deutscher Wetterdienst, Hamburg. The final inclusion of all data in the PANGAEA information system ensures secure archiving, future updates, and widespread distribution in electronic, machine-readable form with long-term access via the Internet. To produce this publication, all data sets with metadata were extracted from PANGAEA and organized in a directory structure on a CD, together with a search capability.

Relevance: 10.00%

Abstract:

This data set contains grain size analyses of bottom sediments collected by scientists from the V.P. Zenkovich Laboratory of Shelf and Sea Coasts (P.P. Shirshov Institute of Oceanology, Russian Academy of Sciences) during the Project "Arctic Shelf of the Eurasia in the Late Quaternary" in a number of expeditions to the Barents, Kara, East Siberian and Chukchi Seas on board the research vessels R/V Professor Shtokman, H/V Dmitry Laptev, H/V Malygin, and icebreaker Georgy Sedov since 1978. The analyses were carried out according to the methods published by Petelin (1967) in the Analytical Laboratory of the P.P. Shirshov Institute of Oceanology. Archiving and electronic publication were performed through a data rescue by Evgeny Gurvich in 2003.

Relevance: 10.00%

Abstract:

This data set contains the mineralogical analyses (binocular counting) of the 100-50 µm grain size fraction from bottom sediments collected by scientists of the V.P. Zenkovich Laboratory of Shelf and Sea Coasts (P.P. Shirshov Institute of Oceanology, Russian Academy of Sciences) during the Project "Arctic Shelf of the Eurasia in the Late Quaternary" in a number of expeditions to the Barents, Kara, East Siberian and Chukchi Seas on board the research vessels R/V Professor Shtokman, H/V Dmitry Laptev, H/V Malygin, and icebreaker Georgy Sedov between 1978 and 1990. The analyses were carried out according to the methods published by Petelin V.P. (1961) in the Analytical Laboratory of the P.P. Shirshov Institute of Oceanology. Archiving and electronic publication were performed through a data rescue by Evgeny Gurvich in 2003.

Relevance: 10.00%

Abstract:

This data set contains the chemical composition of bottom sediments collected by scientists from the V.P. Zenkovich Laboratory of Shelf and Sea Coasts (P.P. Shirshov Institute of Oceanology, Russian Academy of Sciences) during the Project "Arctic Shelf of the Eurasia in the Late Quaternary" in a number of expeditions to the Barents, Kara, East Siberian and Chukchi Seas on board the research vessels R/V Professor Shtokman, H/V Dmitry Laptev, H/V Malygin, and icebreaker Georgy Sedov between 1978 and 1990. The analyses were carried out in the Analytical Laboratory of the P.P. Shirshov Institute of Oceanology. Archiving and electronic publication were performed through a data rescue by Evgeny Gurvich in 2003.

Relevance: 10.00%

Abstract:

This research is concerned with the experimental software engineering area, specifically experiment replication. Replication has traditionally been viewed as a complex task in software engineering, possibly due to the present immaturity of the experimental paradigm as applied to software development. Researchers usually use replication packages to replicate an experiment. However, replication packages do not solve all the information management problems that crop up as successive replications of an experiment accumulate. This research borrows ideas from the software configuration management and software product line paradigms to support the replication process. We believe that configuration management can help to manage and administer information from one replication to another: hypotheses, designs, data analysis, etc. The software product line paradigm can help to organize and manage any changes introduced into the experiment by each replication. We expect the union of the two paradigms to improve the planning, design and execution of further replications and their alignment with existing replications. Additionally, this research will contribute a web support environment for archiving information related to different experiment replications. It will also provide information management support flexible enough to run replications with different numbers and types of changes, and it will afford massive storage of data from different replications. Experimenters working collaboratively on the same experiment must all have access to the different replications.
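As a purely illustrative sketch of the kind of record such a configuration-management-backed environment might keep, the structure below stores, per replication, the artifacts the abstract lists (hypotheses, design, changes, data). The abstract specifies no schema, so every name here is hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class Replication:
    """One replication of a base experiment (hypothetical schema)."""
    replication_id: str
    hypotheses: list = field(default_factory=list)
    design: str = ""
    changes: list = field(default_factory=list)   # deviations from base
    data_files: list = field(default_factory=list)

@dataclass
class Experiment:
    """A base experiment plus its accumulated replications."""
    name: str
    replications: dict = field(default_factory=dict)

    def add_replication(self, rep: Replication):
        self.replications[rep.replication_id] = rep

    def changed_factors(self):
        """Union of all changes introduced across replications, the kind
        of variability a product-line view would manage."""
        out = set()
        for rep in self.replications.values():
            out.update(rep.changes)
        return out

exp = Experiment("pair-programming-effect")
exp.add_replication(Replication("R1", changes=["subjects: students"]))
exp.add_replication(Replication("R2", changes=["subjects: professionals",
                                               "site: remote"]))
print(sorted(exp.changed_factors()))
```

Keeping each replication as a versioned record like this is what lets collaborating experimenters query any earlier replication's hypotheses and changes.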

Relevance: 10.00%

Abstract:

BACKGROUND: Clinical Trials (CTs) are essential for bridging the gap between experimental research on new drugs and their clinical application. Just as CTs for traditional drugs and biologics have helped accelerate the translation of biomedical findings into medical practice, CTs for nanodrugs and nanodevices could advance novel nanomaterials as agents for diagnosis and therapy. Although there is publicly available information about nanomedicine-related CTs, the online archiving of this information is carried out without criteria that discriminate between studies involving nanomaterials or nanotechnology-based processes (nano) and CTs that do not involve nanotechnology (non-nano). Determining from CT summaries alone whether nanodrugs or nanodevices were involved in a study is a challenging task. At the time of writing, CTs archived in the well-known online registry ClinicalTrials.gov are not easily distinguished as nano or non-nano, even by domain experts, owing to the lack of both a common definition of nanotechnology and standards for reporting nanomedical experiments and results. METHODS: We propose a supervised learning approach for classifying CT summaries from ClinicalTrials.gov as nano or non-nano. Our method involves several stages: i) extraction and manual annotation of CTs as nano vs. non-nano, ii) pre-processing and automatic classification, and iii) performance evaluation using several state-of-the-art classifiers under different transformations of the original dataset. RESULTS AND CONCLUSIONS: The performance of the best automated classifier closely matches that of experts (AUC over 0.95), suggesting that it is feasible to automatically detect the presence of nanotechnology products in CT summaries with a high degree of accuracy.
This can significantly speed up the process of finding whether reports on ClinicalTrials.gov might be relevant to a particular nanoparticle or nanodevice, which is essential to discover any precedents for nanotoxicity events or advantages for targeted drug therapy.
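The proposed pipeline (annotate summaries, vectorize, classify) can be illustrated with a deliberately tiny, stdlib-only sketch. It uses a term-frequency nearest-centroid classifier rather than the state-of-the-art classifiers the study evaluates, and the training summaries below are invented.

```python
import re
from collections import Counter

def tokens(text):
    """Minimal pre-processing: lowercase word tokens."""
    return re.findall(r"[a-z]+", text.lower())

def centroid(docs):
    """Average term-frequency vector of a class's training summaries."""
    total = Counter()
    for d in docs:
        total.update(tokens(d))
    n = len(docs)
    return {w: c / n for w, c in total.items()}

def score(text, cen):
    """Dot product between the text's term counts and a class centroid."""
    tf = Counter(tokens(text))
    return sum(tf[w] * cen.get(w, 0.0) for w in tf)

def classify(text, nano_cen, other_cen):
    """Assign the summary to the class with the higher centroid score."""
    return "nano" if score(text, nano_cen) > score(text, other_cen) else "non-nano"

# Invented, manually annotated training summaries (stage i).
nano_train = ["liposomal nanoparticle doxorubicin delivery",
              "gold nanoparticle contrast agent imaging"]
other_train = ["oral aspirin dose response trial",
               "behavioral therapy for insomnia"]
nano_cen, other_cen = centroid(nano_train), centroid(other_train)
print(classify("targeted nanoparticle drug delivery study",
               nano_cen, other_cen))   # -> nano
```

A real system would replace the centroid rule with the classifiers the paper benchmarks and evaluate them by AUC on held-out annotated summaries.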

Relevance: 10.00%

Abstract:

A methodology is presented to determine both the short-term and the long-term influence of spectral variations on the performance of Multi-Junction (MJ) solar cells and Concentrating Photovoltaic (CPV) modules. Component cells with the same optical behavior as MJ solar cells are used to characterize the spectrum. A set of parameters, the Spectral Matching Ratios (SMRs), is used to characterize spectrally a particular Direct Normal Irradiance (DNI) by comparison with the reference spectrum (AM1.5D-ASTM-G173-03). Furthermore, the spectrally corrected DNI for a given MJ solar cell technology is defined, providing a way to estimate the losses associated with spectral variations. The last section analyzes how the spectrum evolves over a year at a given site, and the set of SMRs representative of that location is calculated. This information can be used to maximize the energy harvested by the MJ solar cell throughout the year. As an example, three years of data recorded in Madrid show that losses lower than 5% are expected due to current mismatch for state-of-the-art MJ solar cells. ("This is the peer reviewed version of the following article: R. Núñez, C. Domínguez, S. Askins, M. Victoria, R. Herrero, I. Antón, and G. Sala, 'Determination of spectral variations by means of component cells useful for CPV rating and design,' Prog. Photovolt: Res. Appl., 2015, which has been published in final form at http://onlinelibrary.wiley.com/doi/10.1002/pip.2715/full. This article may be used for non-commercial purposes in accordance with Wiley Terms and Conditions for Self-Archiving [http://olabout.wiley.com/WileyCDA/Section/id-820227.html#terms].")
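The SMR comparison described above can be sketched numerically. The definition below (the measured top/middle component-cell photocurrent ratio normalized by the same ratio under the reference spectrum) follows common CPV usage and may differ in detail from the article; the mismatch-loss estimate is a simplification added here for illustration only.

```python
def smr(i_top_meas, i_mid_meas, i_top_ref, i_mid_ref):
    """Spectral Matching Ratio between the top and middle component
    cells: the measured top/middle photocurrent ratio normalized by
    the same ratio under the reference spectrum.  SMR = 1 means the
    incident spectrum matches the reference for this cell pair."""
    return (i_top_meas / i_mid_meas) / (i_top_ref / i_mid_ref)

def mismatch_loss(i_top, i_mid):
    """Simplified fractional current loss in a series-connected stack:
    the stack current is set by the limiting subcell, so current the
    other subcell could deliver above that level is lost."""
    return 1.0 - min(i_top, i_mid) / max(i_top, i_mid)

# Spectrum identical to the reference: SMR = 1, no spectral mismatch.
print(smr(2.0, 4.0, 1.0, 2.0))   # -> 1.0
# A blue-rich spectrum boosts the top cell relative to the middle cell.
print(smr(3.0, 4.0, 1.0, 2.0))   # -> 1.5
```

Aggregating such SMR values over a year of DNI measurements is what yields the site-representative spectral statistics the abstract mentions.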
