51 results for downloading of data

in CentAUR: Central Archive University of Reading - UK


Relevance:

100.00%

Publisher:

Abstract:

Experimental data for the title reaction were modeled using master equation (ME)/RRKM methods based on the Multiwell suite of programs. The starting point for the exercise was the empirical fitting provided by the NASA (Sander, S. P.; Finlayson-Pitts, B. J.; Friedl, R. R.; Golden, D. M.; Huie, R. E.; Kolb, C. E.; Kurylo, M. J.; Molina, M. J.; Moortgat, G. K.; Orkin, V. L.; Ravishankara, A. R. Chemical Kinetics and Photochemical Data for Use in Atmospheric Studies, Evaluation Number 15; Jet Propulsion Laboratory: Pasadena, California, 2006) and IUPAC (Atkinson, R.; Baulch, D. L.; Cox, R. A.; Hampson, R. F., Jr.; Kerr, J. A.; Rossi, M. J.; Troe, J. J. Phys. Chem. Ref. Data 2000, 29, 167) data evaluation panels, which represents the data in the experimental pressure ranges rather well. Despite the availability of quite reliable parameters for these calculations (molecular vibrational frequencies (Parthiban, S.; Lee, T. J. J. Chem. Phys. 2000, 113, 145) and a value (Orlando, J. J.; Tyndall, G. S. J. Phys. Chem. 1996, 100, 19398) of the bond dissociation energy, D₂₉₈(BrO–NO₂) = 118 kJ mol⁻¹, corresponding to ΔH°₀ = 114.3 kJ mol⁻¹ at 0 K) and the use of RRKM/ME methods, fitting calculations to the reported data or the empirical equations was anything but straightforward. Using these molecular parameters resulted in a discrepancy between the calculations and the database of rate constants of a factor of ca. 4 at, or close to, the low-pressure limit. Agreement between calculation and experiment could be achieved in two ways: either by increasing ΔH°₀ to an unrealistically high value (149.3 kJ mol⁻¹) or by increasing ⟨ΔE⟩_down, the average energy transferred in a downward collision, to an unusually large value (> 5000 cm⁻¹). The discrepancy could also be reduced by making all overall rotations fully active. The system was relatively insensitive to changing the moments of inertia in the transition state to increase the centrifugal effect. The possibility of involvement of BrOONO was tested and cannot account for the difficulties of fitting the data.
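
For context, the empirical fitting supplied by the evaluation panels is a pressure falloff expression interpolating between the low-pressure limit k0·[M] and the high-pressure limit k∞. A minimal Python sketch of the JPL-style form follows; the parameter values are placeholders, not the evaluated numbers, which must be taken from the NASA/JPL or IUPAC compilations.

    import math

    def k_falloff(k0, kinf, M, Fc=0.6):
        """Effective bimolecular rate constant k([M]) interpolating between
        the low-pressure (k0*[M]) and high-pressure (kinf) limits, using the
        JPL broadening factor Fc = 0.6."""
        ratio = k0 * M / kinf
        f = Fc ** (1.0 / (1.0 + math.log10(ratio) ** 2))
        return (k0 * M / (1.0 + ratio)) * f

    # Placeholder parameters only -- take k0 and kinf from the evaluations.
    print(k_falloff(k0=5.0e-31, kinf=7.0e-12, M=2.5e19))  # [M] at ~1 atm, 298 K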

Relevance:

100.00%

Publisher:

Abstract:

This paper discusses many of the issues associated with formally publishing data in academia, focusing primarily on the structures that need to be put in place for peer review and formal citation of datasets. Data publication is becoming increasingly important to the scientific community, as it will provide a mechanism for those who create data to receive academic credit for their work and will allow the conclusions arising from an analysis to be more readily verifiable, thus promoting transparency in the scientific process. Peer review of data will also provide a mechanism for ensuring the quality of datasets, and we provide suggestions on the types of activities one expects to see in the peer review of data. A simple taxonomy of data publication methodologies is presented and evaluated, and the paper concludes with a discussion of dataset granularity, transience and semantics, along with a recommended human-readable citation syntax.

Relevance:

100.00%

Publisher:

Abstract:

In basic network transactions, a datagram travels from source to destination through numerous routers and paths, chosen according to whichever paths are free and uncongested; the resulting route can be too long, incurring greater delay, jitter and congestion and reduced throughput. One of the major problems of packet-switched networks is cell delay variation, or jitter, which arises from queuing delay and depends on the applied loading conditions. The accumulation of delay and jitter across the nodes along a transmission route, together with dropped packets, adds further complexity for multimedia traffic, because there is no guarantee that each traffic stream will be delivered according to its own jitter constraints; the effects of jitter therefore need to be analyzed. IP routers provide a single path for the transmission of all packets. Multi-Protocol Label Switching (MPLS), by contrast, separates packet forwarding from routing, enabling packets to use appropriate routes and allowing the behavior of transmission paths to be optimized and controlled, thereby correcting some of the shortfalls of IP routing. MPLS has therefore been utilized in the analysis of effective transmission through the various networks. This paper analyzes the effect of delay, congestion, interference, jitter and packet loss in the transmission of signals from source to destination. The impact of link failures and repair paths in various physical topologies, namely bus, star, mesh and hybrid, is also analyzed under standard network conditions.
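
As a concrete illustration of the jitter being analyzed, the sketch below implements the standard RTP (RFC 3550) interarrival-jitter estimator, which smooths the variation in packet spacing. It is offered as a generic example, not as the paper's own measurement method.

    def interarrival_jitter(send_times, recv_times):
        """Return the final smoothed jitter estimate for one packet stream."""
        j = 0.0
        for i in range(1, len(send_times)):
            # D = change in transit time between consecutive packets
            d = (recv_times[i] - recv_times[i - 1]) - (send_times[i] - send_times[i - 1])
            j += (abs(d) - j) / 16.0  # gain of 1/16 per RFC 3550
        return j

    # Example: a queue that adds growing delay produces nonzero jitter.
    send = [0.0, 0.02, 0.04, 0.06]
    recv = [0.10, 0.125, 0.155, 0.19]
    print(interarrival_jitter(send, recv))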

Relevance:

100.00%

Publisher:

Abstract:

This paper analyses the appraisal of a specialized form of real estate - data centres - that has a unique blend of locational, physical and technological characteristics that differentiates it from conventional real estate assets. Market immaturity, limited trading and a lack of pricing signals enhance levels of appraisal uncertainty and disagreement relative to conventional real estate assets. Given the problems of applying standard discounted cash flow methods, an approach to appraisal is proposed that uses pricing signals from traded cash flows that are similar to the cash flows generated from data centres. Based upon ‘the law of one price’, it is assumed that two assets that are expected to generate identical cash flows in the future must have the same value now. It is suggested that the expected cash flow of assets should be analysed over the life cycle of the building. Corporate bond yields are used to provide a proxy for the appropriate discount rates for lease income. Since liabilities are quite diverse, a number of proxies are suggested as discount and capitalisation rates, including index-linked, fixed-interest and zero-coupon bonds.
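
A minimal worked example of the proposed bond analogy, with hypothetical numbers: contracted lease income is discounted at the yield of a corporate bond of comparable term and credit quality, per the law of one price.

    def present_value(cash_flows, rate):
        """PV of a list of annual cash flows at a flat discount rate."""
        return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

    lease_income = [1.2e6] * 10  # 10 years of contracted rent (assumed figures)
    bond_yield = 0.055           # proxy: yield on a comparable corporate bond
    print(f"Appraised value of lease income: {present_value(lease_income, bond_yield):,.0f}")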

Relevance:

100.00%

Publisher:

Abstract:

Background and Aims: Forest trees directly contribute to carbon cycling in forest soils through the turnover of their fine roots. In this study we aimed to calculate root turnover rates of common European forest tree species and to compare them with the most frequently published values.

Methods: We compiled the available European data and applied various turnover rate calculation methods to the resulting database. We used the Decision Matrix and the Maximum-Minimum formula as suggested in the literature.

Results: Mean turnover rates obtained by the combination of sequential coring and the Decision Matrix were 0.86 yr−1 for Fagus sylvatica and 0.88 yr−1 for Picea abies when maximum biomass data were used for the calculation, and 1.11 yr−1 for both species when mean biomass data were used. Using mean rather than maximum biomass resulted in about 30% higher values of root turnover. Using the Decision Matrix to calculate turnover rate doubled the rates compared to the Maximum-Minimum formula; the Decision Matrix, however, makes use of more input information than the Maximum-Minimum formula.

Conclusions: We propose that calculations using the Decision Matrix with mean biomass give the most reliable estimates of root turnover rates in European forests and should preferentially be used in models and C reporting.
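
For illustration, one common formulation of the Maximum-Minimum approach is sketched below (the paper's Decision Matrix uses more input information and is not reproduced here); the biomass values are hypothetical.

    def max_min_turnover(biomass, use_mean=False):
        """Annual production = max - min of sequential biomass samples;
        turnover rate = production / (mean or maximum) standing biomass."""
        production = max(biomass) - min(biomass)
        denominator = sum(biomass) / len(biomass) if use_mean else max(biomass)
        return production / denominator

    cores = [180.0, 240.0, 310.0, 260.0]  # fine-root biomass, g m^-2 (assumed)
    print(max_min_turnover(cores))                 # normalised by maximum biomass
    print(max_min_turnover(cores, use_mean=True))  # higher, as the paper notes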

Relevance:

100.00%

Publisher:

Abstract:

Data quality is a difficult notion to define precisely, and different communities have different views and understandings of the subject. This causes confusion, a lack of harmonization of data across communities and the omission of vital quality information. For some existing data infrastructures, data quality standards cannot address the problem adequately and cannot fulfil all user needs or cover all concepts of data quality. In this study, we discuss some philosophical issues concerning data quality. We identify actual user needs for data quality, review existing standards and specifications on data quality, and propose an integrated model for data quality in the field of Earth observation (EO). We also propose a practical mechanism for applying the integrated quality information model to a large number of datasets through metadata inheritance. While our data quality management approach is in the domain of EO, we believe that the ideas and methodologies for data quality management can be applied to wider domains and disciplines to facilitate quality-enabled scientific research.
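
A minimal sketch of the metadata-inheritance idea, under an assumed structure rather than the paper's actual schema: a dataset answers quality queries from its own records, falling back to its parent collection for anything it does not override.

    class QualityRecord:
        def __init__(self, own_quality=None, parent=None):
            self.own_quality = own_quality or {}  # e.g. {"uncertainty": "5%"}
            self.parent = parent                  # enclosing collection, if any

        def quality(self):
            """Merge inherited quality metadata with local overrides."""
            inherited = self.parent.quality() if self.parent else {}
            return {**inherited, **self.own_quality}  # local entries win

    collection = QualityRecord({"validation": "ground-truthed", "uncertainty": "10%"})
    dataset = QualityRecord({"uncertainty": "5%"}, parent=collection)
    print(dataset.quality())  # {'validation': 'ground-truthed', 'uncertainty': '5%'}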

Relevance:

100.00%

Publisher:

Abstract:

A system for continuous data assimilation is presented and discussed. To simulate the dynamical development, a channel version of a balanced barotropic model is used, and geopotential (height) data are assimilated into the model's computations as they become available. In the first experiment the updating is performed every 24, 12 and 6 hours with a given network. The stations are distributed at random in four groups in order to simulate four areas with different densities of stations. Optimum interpolation is performed on the difference between the forecast and the valid observations. The RMS error of the analyses is reduced over time, the error being smaller the more frequently the updating is performed. Updating every 6 hours yields an analysis error smaller than the RMS error of the observations. In a second experiment the updating is performed with data from a moving satellite with a side-scan capability of about 15°. If the satellite data are analysed at every time step before they are introduced into the system, the analysis error falls below the RMS error of the observations after only 24 hours, and the result is on the whole better than updating from a fixed network. If the satellite data are introduced without any modification, the analysis error is reduced much more slowly, and it takes about 4 days to reach a result comparable to the one obtained when the data have been analysed.
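
For context, a minimal optimum-interpolation update in Python, with toy numbers rather than the paper's channel model: the forecast (background) is corrected by the observation-minus-forecast differences, weighted by the optimal gain.

    import numpy as np

    def oi_update(xb, y, H, B, R):
        """Analysis = background + K (y - H xb), with optimal gain
        K = B H^T (H B H^T + R)^-1."""
        K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
        return xb + K @ (y - H @ xb)

    xb = np.array([500.0, 520.0])           # background heights at two grid points
    H = np.array([[1.0, 0.0]])              # one station observing point 1
    y = np.array([506.0])                   # observed height
    B = np.array([[4.0, 2.0], [2.0, 4.0]])  # background error covariance (assumed)
    R = np.array([[1.0]])                   # observation error variance (assumed)
    print(oi_update(xb, y, H, B, R))        # the unobserved point is also corrected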

Relevance:

100.00%

Publisher:

Abstract:

Aim: Species distribution models (SDMs) based on current species ranges underestimate the potential distribution when projected in time and/or space. A multi-temporal model calibration approach has been suggested as an alternative, and we evaluate this using 13,000 years of data.

Location: Europe.

Methods: We used fossil-based records of presence for Picea abies, Abies alba and Fagus sylvatica and six climatic variables for the period 13,000 to 1000 yr BP. To measure the contribution of each 1000-year time step to the total niche of each species (the niche measured by pooling all the data), we employed a principal components analysis (PCA) calibrated with data over the entire range of possible climates. We then projected both the total niche and the partial niches from single time frames into the PCA space, and tested whether the partial niches were more similar to the total niche than expected at random. Using an ensemble forecasting approach, we calibrated SDMs for each time frame and for the pooled database. We projected each model to the current climate and evaluated the results against current pollen data. We also projected all models into the future.

Results: Niche similarity between the partial and the total SDMs was almost always statistically significant and increased through time. SDMs calibrated from single time frames gave different results when projected to the current climate, providing evidence of a change in the species' realized niches through time. Moreover, they predicted limited climate suitability compared with the total SDMs. The same results were obtained when projecting to future climates.

Main conclusions: The realized climatic niche of the species differed for current and future climates when SDMs were calibrated considering different past climates. Building the niche as an ensemble through time represents a way forward to a better understanding of a species' range and its ecology in a changing climate.
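
A rough sketch of the niche comparison step, with synthetic data standing in for the fossil and climate records: a PCA is calibrated on the full climate space, and occurrences from one time slice and from the pooled record are projected into it for comparison.

    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(0)
    background_climate = rng.normal(size=(1000, 6))    # six climatic variables
    pca = PCA(n_components=2).fit(background_climate)  # calibrated on the full climate space

    pooled = rng.normal(0.3, 1.0, size=(200, 6))  # occurrences pooled over all time slices (synthetic)
    one_slice = pooled[:25]                       # occurrences from a single 1000-yr window

    total_niche = pca.transform(pooled)
    partial_niche = pca.transform(one_slice)
    # Crude similarity measure: distance between niche centroids in PCA space.
    print(np.linalg.norm(total_niche.mean(axis=0) - partial_niche.mean(axis=0)))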