891 results for Data Storage Solutions


Relevance: 30.00%

Abstract:

Time series of geocenter coordinates were determined with data of two global navigation satellite systems (GNSSs), namely the U.S. GPS (Global Positioning System) and the Russian GLONASS (Global’naya Navigatsionnaya Sputnikovaya Sistema). The data were recorded in the years 2008–2011 by a global network of 92 permanently observing GPS/GLONASS receivers. Two types of daily solutions were generated independently for each GNSS, one including the estimation of geocenter coordinates and one without these parameters. Fair agreement between GPS and GLONASS was found in the geocenter x- and y-coordinate series. Our tests, however, clearly reveal artifacts in the z-component determined with the GLONASS data. Large periodic excursions in the GLONASS geocenter z-coordinates of about 40 cm peak-to-peak are related to the maximum elevation angles of the Sun above/below the orbital planes of the satellite system and thus have a period of about 4 months (a third of a year). A detailed analysis revealed that the artifacts are almost uniquely governed by the differences of the estimates of direct solar radiation pressure (SRP) in the two solution series (with and without geocenter estimation). A simple formula is derived describing the relation between the geocenter z-coordinate and the corresponding SRP parameter. The effect can be explained by first-order perturbation theory of celestial mechanics. The theory also predicts a heavy impact on the GNSS-derived geocenter if once-per-revolution SRP parameters are estimated in the direction of the satellite’s solar panel axis. Specific experiments using GPS observations revealed that this is indeed the case. Although the main focus of this article is on GNSS, the theory developed is applicable to all satellite observing techniques. We applied the theory to satellite laser ranging (SLR) solutions using LAGEOS. It turns out that the correlation between geocenter and SRP parameters is not a critical issue for the SLR solutions. The reasons are threefold: the direct SRP is about a factor of 30–40 smaller for typical geodetic SLR satellites than for GNSS satellites, making it possible in most cases not to solve for SRP parameters (ruling out the correlation between these parameters and the geocenter coordinates); the orbital arc length of 7 days (typically used in SLR analysis) contains more than 50 revolutions of the LAGEOS satellites, compared to about two revolutions of GNSS satellites for the daily arcs used in GNSS analysis; and the orbit geometry is not as critical for LAGEOS as for GNSS satellites, because the elevation angle of the Sun with respect to the orbital plane usually changes significantly over 7 days.
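The quoted ~4-month period can be made concrete with a short calculation. The sketch below is not from the paper; it assumes circular solar motion on the ecliptic, fixed ascending nodes (the slow GLONASS nodal precession is neglected), and the nominal inclination of 64.8°. It computes the Sun's elevation angle β above each of the three GLONASS orbital planes over one year; the |β| maxima of the three planes are staggered by roughly a third of a year.

```python
import numpy as np

# Sketch: Sun elevation angle (beta) above the three GLONASS orbital planes.
# Assumptions (not from the paper): circular Sun motion on the ecliptic,
# fixed RAAN (nodal precession is neglected), inclination i = 64.8 deg.
EPS = np.radians(23.44)                   # obliquity of the ecliptic
INC = np.radians(64.8)                    # GLONASS inclination
RAANS = np.radians([0.0, 120.0, 240.0])   # three planes, 120 deg apart

days = np.arange(0, 365)
lam = 2 * np.pi * days / 365.25           # approximate ecliptic longitude of the Sun

# Unit vector to the Sun in equatorial coordinates
sun = np.stack([np.cos(lam),
                np.cos(EPS) * np.sin(lam),
                np.sin(EPS) * np.sin(lam)], axis=1)

for k, raan in enumerate(RAANS, start=1):
    # Orbit normal in equatorial coordinates
    n = np.array([np.sin(INC) * np.sin(raan),
                  -np.sin(INC) * np.cos(raan),
                  np.cos(INC)])
    beta = np.degrees(np.arcsin(sun @ n))  # sin(beta) = n_hat . s_hat
    a = np.abs(beta)
    is_peak = np.r_[False, (a[1:-1] > a[:-2]) & (a[1:-1] > a[2:]), False]
    print(f"plane {k}: |beta| maxima near days {list(days[is_peak])}")
# Across the constellation, the |beta| maxima of the three planes recur
# staggered by about 1/3 year, matching the ~4-month artifact period.
```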

Relevance: 30.00%

Abstract:

LARES is a new spherical geodetic satellite designed for SLR observations. It is made of a solid tungsten alloy covered with 92 corner cubes. Due to its very small area-to-mass ratio, the sensitivity of LARES orbits to non-gravitational forces is greatly reduced. We processed 82 weeks (February 2012–August 2013) of LARES observations from a global SLR network and analyzed the contribution of LARES data to the current SLR products (e.g., global scale and geocenter coordinates). The quality of the combined LARES+LAGEOS-1/2 solutions is also addressed in the paper.

Relevance: 30.00%

Abstract:

The current state of health and biomedicine includes an enormous number of heterogeneous data "silos", collected for different purposes and represented differently, that are presently impossible to share or analyze in toto. The greatest challenge for large-scale and meaningful analyses of health-related data is to achieve a uniform data representation for data extracted from heterogeneous source representations. Based upon an analysis and categorization of heterogeneities, a process for achieving comparable data content by using a uniform terminological representation is developed. This process addresses the types of representational heterogeneities that commonly arise in healthcare data integration problems. Specifically, it uses a reference terminology and associated "maps" to transform heterogeneous data to a standard representation for comparability and secondary use. Capturing the quality and precision of the "maps" between local terms and reference terminology concepts enhances the meaning of the aggregated data, empowering end users with better-informed queries for subsequent analyses. A data integration case study in the domain of pediatric asthma illustrates the development and use of a reference terminology for creating comparable data from heterogeneous source representations. The contribution of this research is a generalized process for the integration of data from heterogeneous source representations, a process that can be applied and extended to other problems where heterogeneous data need to be merged.
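As a minimal sketch of the mapping step described above (all terms, codes, and the map structure here are hypothetical illustrations, not the paper's artifacts), local codes can be rewritten to reference-terminology concepts while carrying a map-quality flag along:

```python
# Minimal sketch of transforming local terms to a reference terminology.
# All terms, codes, and the map structure are hypothetical illustrations.
from dataclasses import dataclass

@dataclass
class Mapping:
    concept_id: str   # reference-terminology concept
    quality: str      # e.g. "exact", "broader", "approximate"

# One map per source system: local term -> reference concept + map quality
MAPS = {
    "siteA": {"ASTHMA-J45": Mapping("REF:Asthma", "exact")},
    "siteB": {"wheeze_dx":  Mapping("REF:Asthma", "broader")},
}

def to_reference(source: str, local_term: str) -> Mapping | None:
    """Return the reference concept for a local term, or None if unmapped."""
    return MAPS.get(source, {}).get(local_term)

records = [("siteA", "ASTHMA-J45"), ("siteB", "wheeze_dx")]
for source, term in records:
    m = to_reference(source, term)
    print(source, term, "->", m.concept_id if m else "UNMAPPED",
          f"({m.quality})" if m else "")
```

Keeping the map quality attached to each record is what lets downstream queries decide whether "broader" matches are acceptable for a given analysis.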

Relevance: 30.00%

Abstract:

OBJECTIVE: To determine whether algorithms developed for the World Wide Web can be applied to the biomedical literature in order to identify articles that are important as well as relevant. DESIGN AND MEASUREMENTS A direct comparison of eight algorithms: simple PubMed queries, clinical queries (sensitive and specific versions), vector cosine comparison, citation count, journal impact factor, PageRank, and machine learning based on polynomial support vector machines. The objective was to prioritize important articles, defined as being included in a pre-existing bibliography of important literature in surgical oncology. RESULTS Citation-based algorithms were more effective than noncitation-based algorithms at identifying important articles. The most effective strategies were simple citation count and PageRank, which on average identified over six important articles in the first 100 results compared to 0.85 for the best noncitation-based algorithm (p < 0.001). The authors saw similar differences between citation-based and noncitation-based algorithms at 10, 20, 50, 200, 500, and 1,000 results (p < 0.001). Citation lag affects performance of PageRank more than simple citation count. However, in spite of citation lag, citation-based algorithms remain more effective than noncitation-based algorithms. CONCLUSION Algorithms that have proved successful on the World Wide Web can be applied to biomedical information retrieval. Citation-based algorithms can help identify important articles within large sets of relevant results. Further studies are needed to determine whether citation-based algorithms can effectively meet actual user information needs.
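For readers unfamiliar with the algorithm, a minimal PageRank over a toy citation graph (illustrative only; not the authors' implementation or data) looks like this, with the raw citation count shown alongside:

```python
# Minimal PageRank over a toy citation graph (illustrative only).
import numpy as np

# cites[i] = list of articles that article i cites
cites = {0: [1, 2], 1: [2], 2: [], 3: [2, 1], 4: [0, 2]}
n = len(cites)

# Column-stochastic link matrix: M[j, i] = 1/outdeg(i) if i cites j
M = np.zeros((n, n))
for i, refs in cites.items():
    if refs:
        for j in refs:
            M[j, i] = 1.0 / len(refs)
    else:
        M[:, i] = 1.0 / n          # dangling node: spread rank uniformly

d, r = 0.85, np.full(n, 1.0 / n)   # damping factor, uniform start
for _ in range(100):               # power iteration
    r = (1 - d) / n + d * (M @ r)

citation_count = [sum(i in refs for refs in cites.values()) for i in range(n)]
for i in np.argsort(-r):
    print(f"article {i}: PageRank={r[i]:.3f}, citations={citation_count[i]}")
```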

Relevance: 30.00%

Abstract:

Information overload is a significant problem for modern medicine. Searching MEDLINE for common topics often retrieves more relevant documents than users can review. Therefore, we must identify documents that are not only relevant, but also important. Our system ranks articles using citation counts and the PageRank algorithm, incorporating data from the Science Citation Index. However, citation data is usually incomplete. Therefore, we explore the relationship between the quantity of citation information available to the system and the quality of the result ranking. Specifically, we test the ability of citation count and PageRank to identify "important articles" as defined by experts from large result sets with decreasing citation information. We found that PageRank performs better than simple citation counts, but both algorithms are surprisingly robust to information loss. We conclude that even an incomplete citation database is likely to be effective for importance ranking.
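The degradation experiment can be sketched roughly as follows (toy random graph; the study's corpus and expert labels are not reproduced): remove a growing fraction of citation links and measure how stable the top of the citation-count ranking remains.

```python
# Sketch: robustness of citation-count ranking to missing citation links.
# Toy random graph; the study's data and "important article" labels are not used.
import random

random.seed(0)
N = 200
edges = [(i, j) for i in range(N) for j in range(N)
         if i != j and random.random() < 0.05]   # (citing, cited)

def top_by_citations(edge_list, k=20):
    counts = {}
    for _, cited in edge_list:
        counts[cited] = counts.get(cited, 0) + 1
    return set(sorted(counts, key=counts.get, reverse=True)[:k])

full_top = top_by_citations(edges)
for loss in (0.0, 0.25, 0.5, 0.75):
    kept = [e for e in edges if random.random() > loss]
    overlap = len(full_top & top_by_citations(kept)) / len(full_top)
    print(f"{loss:.0%} of links removed -> top-20 overlap {overlap:.0%}")
```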

Relevance: 30.00%

Abstract:

People often use tools to search for information. In order to improve the quality of an information search, it is important to understand how internal information, stored in the user’s mind, and external information, represented by the interface of tools, interact with each other. How information is distributed between internal and external representations significantly affects information search performance. However, few studies have examined the relationship between types of interface and types of search task in the context of information search. For a distributed information search task, how data are distributed, represented, and formatted significantly affects user search performance in terms of response time and accuracy. Guided by UFuRT (User, Function, Representation, Task), a human-centered process, I propose a search model and a task taxonomy. The model defines its relationship with other existing information models. The taxonomy clarifies the legitimate operations for each type of search task over relational data. Based on the model and taxonomy, I have also developed interface prototypes for the search tasks of relational data. These prototypes were used for experiments. The experiments described in this study are of a within-subject design with a sample of 24 participants recruited from the graduate schools located in the Texas Medical Center. Participants performed one-dimensional nominal search tasks over nominal, ordinal, and ratio displays, and one-dimensional nominal, ordinal, interval, and ratio tasks over table and graph displays. Participants also performed the same task and display combinations for two-dimensional searches. Distributed cognition theory has been adopted as a theoretical framework for analyzing and predicting the search performance for relational data. It has been shown that the representation dimensions and data scales, as well as the search task types, are the main factors determining search efficiency and effectiveness. In particular, the more external representations are used, the better the search task performance, and the results suggest that the ideal search performance occurs when the question type and the corresponding data scale representation match. The implications of the study lie in contributing to the effective design of search interfaces for relational data, especially laboratory results, which are often used in healthcare activities.

Relevance: 30.00%

Abstract:

The implications of the new research presented in Volume 2, Issue 1 (Human Trafficking) of the Journal of Applied Research on Children are explored, calling attention to the need for increased awareness, greater availability of data, and proactive policy solutions to combat child trafficking.

Relevance: 30.00%

Abstract:

The global ocean is a significant sink for anthropogenic carbon (Cant), absorbing roughly a third of the CO2 emitted by human activities over the industrial period. Robust estimates of the magnitude and variability of the storage and distribution of Cant in the ocean are therefore important for understanding the human impact on climate. In this synthesis we review observational and model-based estimates of the storage and transport of Cant in the ocean. We pay particular attention to the uncertainties and potential biases inherent in different inference schemes. On a global scale, three data-based estimates of the distribution and inventory of Cant are now available. While the inventories are found to agree within their uncertainties, there are considerable differences in the spatial distribution. We also review the progress made in the application of inverse and data assimilation techniques that combine ocean interior estimates of Cant with numerical ocean circulation models. Such methods are especially useful for estimating the air–sea flux and interior transport of Cant, quantities that are otherwise difficult to observe directly. However, the results are found to be highly dependent on the modeled circulation, with the spread due to different ocean models at least as large as that from the different observational methods used to estimate Cant. Our review also highlights the importance of repeat measurements of hydrographic and biogeochemical parameters for estimating the storage of Cant on decadal timescales in the presence of the variability in circulation that is neglected by other approaches. Data-based Cant estimates provide important constraints on forward ocean models, which exhibit both broad similarities and regional errors relative to the observational fields. A compilation of inventories of Cant gives a "best" estimate of the global ocean inventory of anthropogenic carbon in 2010 of 155 ± 31 PgC (±20% uncertainty). This estimate draws on a broad range of values, suggesting that a combination of approaches is necessary to achieve a robust quantification of the ocean sink of anthropogenic CO2.
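One standard way to combine independent estimates with stated uncertainties is inverse-variance weighting; the sketch below uses made-up numbers and is not the compilation method actually used in the review:

```python
# Sketch: inverse-variance weighted combination of independent inventory
# estimates (illustrative numbers; not the actual estimates from the review).
estimates = [(150.0, 35.0), (160.0, 40.0), (155.0, 45.0)]  # (PgC, 1-sigma)

w = [1.0 / s**2 for _, s in estimates]                     # weight = 1/variance
mean = sum(wi * x for wi, (x, _) in zip(w, estimates)) / sum(w)
sigma = (1.0 / sum(w)) ** 0.5                              # combined 1-sigma
print(f"combined inventory: {mean:.0f} +/- {sigma:.0f} PgC")
```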

Relevance: 30.00%

Abstract:

There is growing interest in the location of Treatment, Storage, and Disposal Facility (TSDF) sites in relation to minority communities. A number of studies have been completed, and their results have varied: some have shown a strong positive correlation between the location of TSDF sites and minority populations, while a few have found no significant relationship. The major difference between these studies has been the areal unit used. This study compared the minority populations of Texas census tracts and ZIP codes containing a TSDF, using the associated county as the comparison population. The hypothesis was that there is no difference between using census tracts and ZIP codes to analyze the relationship between minority populations and TSDFs. The census data were from 1990, and the initial list of TSDF sites was supplied by the Texas Natural Resource Conservation Commission. The TSDF site locations were checked using geographic information system (GIS) software in order to increase the accuracy of the identification of exposed ZIP codes and census tracts. The minority populations of the exposed areal units were compared using proportional differences, cross-tabulations, maps, and logistic regression. The dependent variable was the exposure status of the areal units under study (counties, census tracts, and ZIP codes). The independent variables included minority group proportion and groupings of those proportions, educational status, household income, and home value. In all cases, education was significant or near significant at the .05 level. Education, rather than minority proportion, was therefore the most significant predictor of the exposure status of a census tract or ZIP code.
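The regression setup can be sketched as follows (synthetic data and hypothetical variable choices; the study's census files and coding are not reproduced):

```python
# Sketch of the logistic regression on areal-unit exposure status.
# Variable names and data are hypothetical illustrations.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 500
minority = rng.uniform(0, 1, n)        # minority proportion of areal unit
education = rng.uniform(0, 1, n)       # proportion with high-school education
income = rng.normal(30_000, 8_000, n)  # median household income
X = np.column_stack([minority, education, income])
X = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize predictors

# Synthetic outcome with an education effect only (toy relationship)
p = 1 / (1 + np.exp(-(0.2 - 1.5 * X[:, 1])))
y = rng.random(n) < p                  # exposure status (contains a TSDF)

model = LogisticRegression().fit(X, y)
print("coefficients (minority, education, income):", model.coef_[0])
```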

Relevance: 30.00%

Abstract:

In this study, we assess the climate mitigation potential of afforestation in a mountainous, snow-rich region (Switzerland) with strongly varying environmental conditions. Using radiative forcing calculations, we quantify both the carbon sequestration potential and the effect of albedo change at high resolution. We calculate the albedo radiative forcing based on remotely sensed data sets of albedo, global radiation, and snow cover. Carbon sequestration is estimated from changes in carbon stocks based on national inventories. We first estimate the spatial pattern of radiative forcing (RF) across Switzerland assuming homogeneous transitions from open land to forest. This highlights where forest expansion still exhibits climatic benefits once the radiative forcing of albedo change is included. Second, given that forest expansion is currently the dominant land-use change process in the Swiss Alps, we calculate the radiative forcing that occurred between 1985 and 1997. Our results show that the net RF of forest expansion ranges from −24 W m⁻² at low elevations of the northern Prealps to 2 W m⁻² at high elevations of the Central Alps. The albedo RF increases with increasing altitude, offsetting the CO2 RF at high elevations with long snow-covered periods, high global radiation, and low carbon sequestration. Albedo RF is particularly relevant during transitions from open land to open forest, but not in later stages of forest development. Between 1985 and 1997, when overall forest expansion in Switzerland was approximately 4%, the albedo RF offset the CO2 RF by an average of 40%. We conclude that the albedo RF should be considered at an appropriately high resolution when estimating the climatic effect of afforestation in temperate mountainous regions.
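The trade-off between the two forcing terms can be sketched with a toy calculation (textbook-style simplifications, assumed albedo values, a crude top-of-atmosphere factor, and invented numbers; this is not the paper's method):

```python
# Sketch: net radiative forcing of an open-land -> forest transition as the sum
# of a cooling CO2-sequestration term and a warming albedo-darkening term.
# All formulas and numbers are illustrative simplifications.

def albedo_rf(sw_down, snow_frac, toa_factor=0.25):
    """Warming from surface darkening (W m^-2). toa_factor crudely folds in
    atmospheric absorption and clouds (assumed value, not from the paper)."""
    alpha_open = 0.70 * snow_frac + 0.20 * (1 - snow_frac)  # snow vs. grass
    alpha_forest = 0.12            # conifer canopy largely masks snow
    return toa_factor * sw_down * (alpha_open - alpha_forest)

# (global radiation W m^-2, snow-covered fraction of year, CO2 RF W m^-2)
cases = {"low elevation":  (130.0, 0.10, -30.0),
         "high elevation": (180.0, 0.50, -8.0)}
for name, (sw, snow, co2) in cases.items():
    a = albedo_rf(sw, snow)
    print(f"{name}: albedo RF {a:+.1f}, CO2 RF {co2:+.1f}, "
          f"net {co2 + a:+.1f} W m^-2")
# High snow cover, strong radiation, and weak sequestration push the net RF
# toward (or past) zero, which is the qualitative pattern the paper reports.
```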

Relevance: 30.00%

Abstract:

Information-centric networking (ICN) has been proposed to cope with the drawbacks of the Internet Protocol, namely scalability and security. The majority of research efforts in ICN have focused on routing and caching in wired networks, while little attention has been paid to optimizing communication and caching efficiency in wireless networks. In this work, we study the application of Raptor codes to Named Data Networking (NDN), a popular ICN architecture, in order to minimize the number of transmitted messages and accelerate content retrieval times. We propose RC-NDN, an NDN-compatible Raptor code architecture. In contrast to other coding-based NDN solutions that employ network codes, RC-NDN retains the security architecture inherent to NDN. Moreover, unlike existing network-coding-based solutions for NDN, RC-NDN does not require significant computational resources, which makes it appropriate for low-cost networks. We evaluate RC-NDN in mobile scenarios with high mobility. Evaluations show that RC-NDN significantly outperforms the original NDN. RC-NDN is particularly efficient in dense environments, where retrieval times can be reduced by 83% and the number of Data transmissions by 84.5% compared to NDN.
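The rateless-coding idea underlying Raptor codes can be sketched as follows. This shows only an LT-style XOR layer with a naive degree distribution; real Raptor codes add a precode and a carefully designed distribution, and RC-NDN's actual packet handling is not shown.

```python
# Sketch of LT-style rateless encoding: each coded symbol XORs a random
# subset of source blocks. Raptor codes add a precode on top of this layer.
import random

def encode(blocks: list[bytes], seed: int) -> tuple[list[int], bytes]:
    """Produce one coded symbol: (indices of XORed blocks, payload)."""
    rng = random.Random(seed)
    degree = rng.randint(1, len(blocks))       # toy degree distribution
    idx = rng.sample(range(len(blocks)), degree)
    payload = bytes(len(blocks[0]))
    for i in idx:
        payload = bytes(a ^ b for a, b in zip(payload, blocks[i]))
    return idx, payload

blocks = [b"chunkA__", b"chunkB__", b"chunkC__"]
for seed in range(5):                 # a receiver that collects slightly more
    idx, sym = encode(blocks, seed)   # symbols than source blocks can decode
    print(seed, idx, sym.hex())       # them by iterative peeling
```

Because any sufficiently large subset of coded symbols suffices for decoding, losses in mobile wireless scenarios do not require retransmitting specific packets, which is what drives down the message count.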

Relevance: 30.00%

Abstract:

The liquid–vapor interface is difficult to access experimentally but is of interest from both theoretical and applied points of view, and it has particular importance in atmospheric aerosol chemistry. Here we examine the liquid–vapor interface for mixtures of water, sodium chloride, and formic acid, an abundant chemical in the atmosphere. We compare the results of surface tension and X-ray photoelectron spectroscopy (XPS) measurements over a wide range of formic acid concentrations. Surface tension measurements provide a macroscopic characterization of solutions ranging from 0 to 3 M sodium chloride and from 0 to over 0.5 mole fraction formic acid. Sodium chloride was found to be a weak salting-out agent for formic acid, with the surface excess depending only slightly on salt concentration. In situ XPS provides a complementary molecular-level description of the liquid–vapor interface. XPS measurements over an experimental probe depth of 51 Å gave the C 1s to O 1s ratio for both total oxygen and oxygen from water. XPS also provides detailed electronic-structure information that is inaccessible by surface tension. Density functional theory calculations were performed to understand the observed shift of C 1s binding energies to lower values with increasing formic acid concentration. Part of the experimental −0.2 eV shift can be assigned to the solution composition changing from predominantly monomers of formic acid to a combination of monomers and dimers; however, the lack of an appropriate reference to calibrate the absolute binding-energy scale at high formic acid mole fraction complicates the interpretation. Our data are consistent with surface tension yielding a significantly more surface-sensitive measurement than XPS, owing to the relatively weak propensity of formic acid for the interface. A simple model allowed us to replicate the XPS results under the assumption that the surface excess is contained in the top four angstroms of solution.
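The surface excess mentioned here is conventionally obtained from surface tension via the Gibbs adsorption isotherm; that the authors used exactly this form is an assumption.

```latex
% Gibbs adsorption isotherm (dilute-solution form): the surface excess
% \Gamma_2 of the solute (formic acid) follows from the slope of the surface
% tension \gamma with respect to the logarithm of the solute activity a_2.
\Gamma_2 = -\frac{1}{RT}\,\frac{\partial \gamma}{\partial \ln a_2}
```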

Relevance: 30.00%

Abstract:

A feasibility study by Pail et al. (Can GOCE help to improve temporal gravity field estimates? In: Ouwehand L (ed) Proceedings of the 4th International GOCE User Workshop, ESA Publication SP-696, 2011b) shows that GOCE (‘Gravity field and steady-state Ocean Circulation Explorer’) satellite gravity gradiometer (SGG) data, in combination with GPS-derived orbit data (satellite-to-satellite tracking: SST-hl), can be used to stabilize and reduce the striping pattern of a bi-monthly GRACE (‘Gravity Recovery and Climate Experiment’) gravity field estimate. In this study several monthly (and bi-monthly) combinations of GRACE with GOCE SGG and GOCE SST-hl data on the basis of normal equations are investigated. Our aim is to assess the role of the gradients alone in the combination and whether one month of GOCE observations already provides sufficient data to have an impact on the combination. The estimation of clean and stable monthly GOCE SGG normal equations at high resolution (> d/o 150) is found to be difficult, and the SGG component alone does not add significant value to monthly and bi-monthly GRACE gravity fields. Comparisons of GRACE-only and combined monthly and bi-monthly solutions show that the striping pattern can only be reduced when both GOCE observation types (SGG, SST-hl) are used, and mainly between d/o 45 and 60.
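Combination on the basis of normal equations amounts to adding the systems before inversion. Below is a generic least-squares sketch (illustrative only; not the actual GRACE/GOCE processing chain):

```python
# Sketch: combining two observation types by stacking normal equations.
# N_k = A_k^T P_k A_k and b_k = A_k^T P_k y_k; the combined solution solves
# (N_1 + w N_2) x = b_1 + w b_2, with w an inter-technique weight.
import numpy as np

rng = np.random.default_rng(1)
x_true = np.array([2.0, -1.0, 0.5])

def normals(n_obs, noise):
    A = rng.normal(size=(n_obs, 3))                 # design matrix
    y = A @ x_true + rng.normal(scale=noise, size=n_obs)
    P = np.eye(n_obs) / noise**2                    # weight = inverse variance
    return A.T @ P @ A, A.T @ P @ y

N1, b1 = normals(100, 0.1)   # first technique (e.g. GRACE-like, less noisy)
N2, b2 = normals(300, 1.0)   # second technique (e.g. GOCE-like, noisier)
w = 1.0                      # relative weight of the second technique
x = np.linalg.solve(N1 + w * N2, b1 + w * b2)
print("combined estimate:", x)
```

Because only the small normal matrices are exchanged, each observation type can be processed separately and weighted at combination time, which is why this scheme is convenient for multi-technique gravity field solutions.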

Relevance: 30.00%

Abstract:

AIM: Chemical decontamination increases the availability of bone grafts; however, it is unclear whether antiseptic processing changes the biological activity of bone. MATERIALS AND METHODS: Bone chips were incubated with 4 different antiseptic solutions: (1) povidone-iodine (0.5%), (2) chlorhexidine digluconate (0.2%), (3) hydrogen peroxide (1%), and (4) sodium hypochlorite (0.25%). After 10 minutes of incubation, changes in the capacity of the bone-conditioned medium to modulate gene expression in gingival fibroblasts were investigated. RESULTS: Conditioned medium obtained from freshly prepared bone chips increased the expression of the TGF-β target genes interleukin 11 (IL11), proteoglycan 4 (PRG4), and NADPH oxidase 4 (NOX4), and decreased the expression of adrenomedullin (ADM) and pentraxin 3 (PTX3) in gingival fibroblasts. Incubation of bone chips with 0.2% chlorhexidine, followed by vigorous washing, resulted in a bone-conditioned medium with even higher expression of IL11, PRG4, and NOX4. These changes were accompanied by a decrease in cell viability and an activation of apoptosis signaling. Chlorhexidine alone, at low concentrations, increased IL11, PRG4, and NOX4 expression, independent of TGF-β receptor I kinase activity. In contrast, 0.25% sodium hypochlorite almost entirely abolished the activity of the bone-conditioned medium, while the other two antiseptic solutions, 1% hydrogen peroxide and 0.5% povidone-iodine, had comparatively little impact. CONCLUSION: These in vitro findings demonstrate that incubation of bone chips with chlorhexidine affects the activity of the resulting bone-conditioned medium differently from the other antiseptic solutions. The data further suggest that the main effects are caused by chlorhexidine remaining in the bone-conditioned medium after repeated washing of the bone chips. KEYWORDS: autografts; TGF-β; antiseptic solution; bone; bone-conditioned medium; bone supernatant; chlorhexidine; hydrogen peroxide; povidone-iodine; sodium hypochlorite