846 results for Semantic Publishing, Linked Data, Bibliometrics, Informetrics, Data Retrieval, Citations
Abstract:
The Along-Track Scanning Radiometers (ATSRs) provide a long time-series of measurements suitable for the retrieval of cloud properties. This work evaluates the freely available Global Retrieval of ATSR Cloud Parameters and Evaluation (GRAPE) dataset (version 3) created from the ATSR-2 (1995–2003) and Advanced ATSR (AATSR; 2002 onwards) records. Users are recommended to consider only retrievals flagged as high-quality, where there is a good consistency between the measurements and the retrieved state (corresponding to about 60% of converged retrievals over sea, and more than 80% over land). Cloud properties are found to be generally free of any significant spurious trends relating to satellite zenith angle. Estimates of the random error on retrieved cloud properties are suggested to be generally appropriate for optically thick clouds, and up to a factor of two too small for optically thin cases. The correspondence between ATSR-2 and AATSR cloud properties is high, but a relative calibration difference between the sensors of order 5–10% at 660 nm and 870 nm limits the potential of the current version of the dataset for trend analysis. As ATSR-2 is thought to have the better absolute calibration, the discussion focuses on this portion of the record. Cloud-top heights from GRAPE compare well to ground-based data at four sites, particularly for shallow clouds. Clouds forming in boundary-layer inversions are typically around 1 km too high in GRAPE due to poorly resolved inversions in the modelled temperature profiles used. Global cloud fields are compared to satellite products derived from the Moderate Resolution Imaging Spectroradiometer (MODIS), Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP) measurements, and a climatology of liquid water content derived from satellite microwave radiometers. In all cases the main reasons for differences are linked to differing sensitivity to, and treatment of, multi-layer cloud systems. The correlation coefficient between GRAPE and the two MODIS products considered is generally high (greater than 0.7 for most cloud properties), except for liquid and ice cloud effective radius, which also show biases between the datasets. For liquid clouds, part of the difference is linked to the choice of wavelengths used in the retrieval. Total cloud cover is slightly lower in GRAPE (0.64) than in the CALIOP dataset (0.66). GRAPE underestimates liquid cloud water path relative to microwave radiometers by up to 100 g m−2 near the Equator and overestimates by around 50 g m−2 in the storm tracks. Finally, potential future improvements to the algorithm are outlined.
Abstract:
The rapid expansion of the TMT sector in the late 1990s and a more recent growing regulatory and corporate focus on business continuity and security have raised the profile of data centres. Data centres offer a unique blend of occupational, physical and technological characteristics compared to conventional real estate assets. Limited trading and the heterogeneity of data centres also cause higher levels of appraisal uncertainty. In practice, the application of conventional discounted cash flow approaches requires information about a wide range of inputs that is difficult to derive from limited market signals or to estimate analytically. This paper outlines an approach that uses pricing signals from similar traded cash flows. Based upon ‘the law of one price’, the method draws upon the premise that two identical future cash flows must have the same value now. Given the difficulties of estimating exit values, an alternative is that the expected cash flows of a data centre are analysed over the life cycle of the building, with corporate bond yields used to provide a proxy for the appropriate discount rates for lease income. Since liabilities are quite diverse, a number of proxies are suggested as discount and capitalisation rates, including index-linked, fixed-interest and zero-coupon bonds. Although there are rarely assets that have identical cash flows and some approximation is necessary, the level of appraiser subjectivity is dramatically reduced.
Abstract:
This paper analyses the appraisal of a specialized form of real estate - data centres - that has a unique blend of locational, physical and technological characteristics that differentiate it from conventional real estate assets. Market immaturity, limited trading and a lack of pricing signals increase levels of appraisal uncertainty and disagreement relative to conventional real estate assets. Given the problems of applying standard discounted cash flow, an approach to appraisal is proposed that uses pricing signals from traded cash flows that are similar to the cash flows generated by data centres. Based upon ‘the law of one price’, it is assumed that two assets that are expected to generate identical cash flows in the future must have the same value now. It is suggested that the expected cash flows of such assets should be analysed over the life cycle of the building. Corporate bond yields are used to provide a proxy for the appropriate discount rates for lease income. Since liabilities are quite diverse, a number of proxies are suggested as discount and capitalisation rates, including index-linked, fixed-interest and zero-coupon bonds.
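For intuition, here is a minimal Python sketch of the proposed valuation logic: lease income is discounted at the yield of a traded bond taken as a proxy for cash flows of comparable risk. All figures (rent, lease term, yields) are hypothetical, and the single flat-rate discounting is a simplification of the life-cycle analysis described above.

```python
# Minimal sketch of the 'law of one price' appraisal logic: value lease
# income by discounting at the yield of a traded bond whose cash flows are
# of comparable risk. All figures below are hypothetical.

def present_value(cash_flows, discount_rate):
    """Discount a series of annual cash flows at a flat annual rate."""
    return sum(cf / (1 + discount_rate) ** t
               for t, cf in enumerate(cash_flows, start=1))

# Hypothetical 15-year lease analysed over the building's life cycle
# (no estimated exit value), with a fixed annual rent of 1.2m.
rent = [1_200_000] * 15

# Proxy discount rates from traded bonds (hypothetical yields):
fixed_interest_yield = 0.055  # fixed-interest corporate bond, for fixed rent
index_linked_yield = 0.025    # index-linked (real) yield, for indexed rent

value_fixed = present_value(rent, fixed_interest_yield)
# If the rent were indexed to inflation, the flat stream is a real cash
# flow and the index-linked real yield is the matching discount rate.
value_indexed = present_value(rent, index_linked_yield)
print(f"fixed-rent value:   {value_fixed:,.0f}")
print(f"indexed-rent value: {value_indexed:,.0f}")
```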
Abstract:
Non-Gaussian/non-linear data assimilation is becoming an increasingly important area of research in the Geosciences as the resolution and non-linearity of models are increased and more and more non-linear observation operators are being used. In this study, we look at the effect of relaxing the assumption of a Gaussian prior on the impact of observations within the data assimilation system. Three different measures of observation impact are studied: the sensitivity of the posterior mean to the observations, mutual information and relative entropy. The sensitivity of the posterior mean is derived analytically when the prior is modelled by a simplified Gaussian mixture and the observation errors are Gaussian. It is found that the sensitivity is a strong function of the value of the observation and proportional to the posterior variance. Similarly, relative entropy is found to be a strong function of the value of the observation. However, the errors in estimating these two measures using a Gaussian approximation to the prior can differ significantly. This hampers conclusions about the effect of the non-Gaussian prior on observation impact. Mutual information does not depend on the value of the observation and is seen to be close to its Gaussian approximation. These findings are illustrated with the particle filter applied to the Lorenz ’63 system. The article concludes with a discussion of the appropriateness of these measures of observation impact for different situations.
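For orientation, in the purely scalar Gaussian case (not the Gaussian-mixture prior analysed in the paper) the three impact measures have closed forms that mirror the properties noted above: the sensitivity of the posterior mean loses its dependence on the observation value but remains proportional to the posterior variance, mutual information is independent of the observation value, and relative entropy depends on it through the posterior mean. With prior N(x_b, σ_b²), observation y with error variance σ_o², and posterior N(x_a, σ_a²), the standard results are:

```latex
% Scalar Gaussian reference case: prior N(x_b, \sigma_b^2), observation y
% with error variance \sigma_o^2, posterior N(x_a, \sigma_a^2).
\[
  \frac{\partial x_a}{\partial y}
    = \frac{\sigma_b^2}{\sigma_b^2 + \sigma_o^2}
    = \frac{\sigma_a^2}{\sigma_o^2},
  \qquad
  \mathrm{MI} = \frac{1}{2}\ln\frac{\sigma_b^2}{\sigma_a^2},
  \qquad
  \mathrm{RE} = \frac{1}{2}\left[
      \ln\frac{\sigma_b^2}{\sigma_a^2}
      + \frac{\sigma_a^2}{\sigma_b^2}
      + \frac{(x_a - x_b)^2}{\sigma_b^2} - 1
    \right].
\]
```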
Abstract:
We describe a model-data fusion (MDF) inter-comparison project (REFLEX), which compared various algorithms for estimating carbon (C) model parameters consistent with both measured carbon fluxes and states and a simple C model. Participants were provided with the model and with both synthetic net ecosystem exchange (NEE) of CO2 and leaf area index (LAI) data, generated from the model with added noise, and observed NEE and LAI data from two eddy covariance sites. Participants endeavoured to estimate model parameters and states consistent with the model for all cases over the two years for which data were provided, and to generate predictions for one additional year without observations. Nine participants contributed results using Metropolis algorithms, Kalman filters and a genetic algorithm. For the synthetic data case, parameter estimates compared well with the true values. The results of the analyses indicated that parameters linked directly to gross primary production (GPP) and ecosystem respiration, such as those related to foliage allocation and turnover, or temperature sensitivity of heterotrophic respiration, were best constrained and characterised. Poorly estimated parameters were those related to the allocation to and turnover of fine root/wood pools. Estimates of confidence intervals varied among algorithms, but several algorithms successfully located the true values of annual fluxes from synthetic experiments within relatively narrow 90% confidence intervals, achieving a >80% success rate and mean NEE confidence intervals <110 gC m−2 year−1 for the synthetic case. Annual C flux estimates generated by participants generally agreed with gap-filling approaches using half-hourly data. The estimation of ecosystem respiration and GPP through MDF agreed well with outputs from partitioning studies using half-hourly data. Confidence limits on annual NEE increased by an average of 88% in the prediction year compared to the previous year, when data were available. Confidence intervals on annual NEE increased by 30% when observed data were used instead of synthetic data, reflecting and quantifying the addition of model error. Finally, our analyses indicated that incorporating additional constraints using data on C pools (wood, soil and fine roots) would help to reduce uncertainties for model parameters poorly served by eddy covariance data.
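As a concrete illustration of the model-data fusion setting, the following Python sketch runs a Metropolis random walk to recover a single parameter of a toy seasonal NEE model from synthetic noisy observations. The toy model, noise level and proposal width are assumptions for illustration only; this is not the REFLEX carbon model or any participant's algorithm.

```python
# Minimal Metropolis sketch of the model-data fusion idea: sample model
# parameters consistent with noisy synthetic NEE observations.
import numpy as np

rng = np.random.default_rng(42)

def model(param, t):
    return param * np.sin(2 * np.pi * t / 365.0)  # toy seasonal NEE cycle

t = np.arange(730.0)                              # two years of daily data
true_param, obs_sigma = 3.0, 0.5
obs = model(true_param, t) + rng.normal(0.0, obs_sigma, t.size)

def log_likelihood(param):
    resid = obs - model(param, t)
    return -0.5 * np.sum((resid / obs_sigma) ** 2)

# Metropolis random walk over the single parameter (flat prior assumed).
chain, current = [], 1.0
ll_current = log_likelihood(current)
for _ in range(5000):
    proposal = current + rng.normal(0.0, 0.1)
    ll_prop = log_likelihood(proposal)
    if np.log(rng.uniform()) < ll_prop - ll_current:
        current, ll_current = proposal, ll_prop
    chain.append(current)

samples = np.array(chain[1000:])                  # discard burn-in
print(f"posterior mean {samples.mean():.3f}, "
      f"90% CI ({np.quantile(samples, 0.05):.3f}, "
      f"{np.quantile(samples, 0.95):.3f})")
```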
Abstract:
We explored the impact of a degraded semantic system on lexical, morphological and syntactic complexity in language production. We analysed transcripts from connected speech samples from eight patients with semantic dementia (SD) and eight age-matched healthy speakers. The frequency distributions of nouns and verbs were compared for hand-scored data and data extracted using text-analysis software. Lexical measures showed the predicted pattern for nouns and verbs in hand-scored data, and for nouns in software-extracted data, with fewer low frequency items in the speech of the patients relative to controls. The distribution of complex morpho-syntactic forms for the SD group showed a reduced range, with fewer constructions that required multiple auxiliaries and inflections. Finally, the distribution of syntactic constructions also differed between groups, with a pattern that reflects the patients’ characteristic anomia and constraints on morpho-syntactic complexity. The data are in line with previous findings of an absence of gross syntactic errors or violations in SD speech. Alterations in the distributions of morphology and syntax, however, support constraint satisfaction models of speech production in which there is no hard boundary between lexical retrieval and grammatical encoding.
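A minimal sketch of the lexical analysis step described above: classifying the nouns a speaker produces into corpus-frequency bands and comparing the resulting distributions. The frequency lookup, band threshold and word lists are hypothetical stand-ins; the study used hand scoring alongside dedicated text-analysis software.

```python
# Classify produced nouns into corpus-frequency bands and compare the
# distributions across speaker groups. All data here are hypothetical.
from collections import Counter

# Hypothetical corpus frequencies (occurrences per million words).
corpus_freq = {"thing": 600.0, "place": 310.0, "dog": 55.0,
               "kettle": 6.0, "trellis": 0.8}

def freq_band(word, threshold=100.0):
    return "high" if corpus_freq.get(word, 0.0) >= threshold else "low"

patient_nouns = ["thing", "thing", "place", "dog", "thing"]
control_nouns = ["kettle", "trellis", "dog", "place", "thing"]

for label, nouns in [("patient", patient_nouns), ("control", control_nouns)]:
    bands = Counter(freq_band(w) for w in nouns)
    print(label, dict(bands))  # patients skew towards high-frequency items
```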
Abstract:
This Editorial presents the focus, scope and policies of the inaugural issue of Nature Conservation, a new open access, peer-reviewed journal bridging natural sciences, social sciences and hands-on applications in conservation management. The journal covers all aspects of nature conservation and aims particularly at facilitating better interaction between scientists and practitioners. The journal will impose no restrictions on manuscript size or the use of colour. We will use an XML-based editorial workflow and several cutting-edge innovations in publishing and information dissemination. These include semantic mark-up of, and enhancements to, published text and data, and extensive cross-linking within the journal and to external sources. We believe the journal will make an important contribution to better linking science and practice, offering rapid, peer-reviewed and flexible publication for authors and unrestricted access to content.
Abstract:
There is large uncertainty about the magnitude of warming and how rainfall patterns will change in response to any given scenario of future changes in atmospheric composition and land use. The models used for future climate projections were developed and calibrated using climate observations from the past 40 years. The geologic record of environmental responses to climate changes provides a unique opportunity to test model performance outside this limited climate range. Evaluation of model simulations against palaeodata shows that models reproduce the direction and large-scale patterns of past changes in climate, but tend to underestimate the magnitude of regional changes. As part of the effort to reduce model-related uncertainty and produce more reliable estimates of twenty-first century climate, the Palaeoclimate Modelling Intercomparison Project is systematically applying palaeoevaluation techniques to simulations of the past run with the models used to make future projections. This evaluation will provide assessments of model performance, including whether a model is sufficiently sensitive to changes in atmospheric composition, as well as providing estimates of the strength of biosphere and other feedbacks that could amplify the model response to these changes and modify the characteristics of climate variability.
Abstract:
Brain activity can be measured with several non-invasive neuroimaging modalities, but each modality has inherent limitations with respect to resolution, contrast and interpretability. It is hoped that multimodal integration will address these limitations by using the complementary features of already available data. However, purely statistical integration can prove problematic owing to the disparate signal sources. As an alternative, we propose here an advanced neural population model implemented on an anatomically sound cortical mesh with freely adjustable connectivity, which features proper signal expression through a realistic head model for the electroencephalogram (EEG), as well as a haemodynamic model for functional magnetic resonance imaging based on blood oxygen level dependent contrast (fMRI BOLD). It hence allows simultaneous and realistic predictions of EEG and fMRI BOLD from the same underlying model of neural activity. As proof of principle, we investigate here the influence on simulated brain activity of strengthening visual connectivity. In the future we plan to fit multimodal data with this neural population model. This promises novel, model-based insights into the brain's activity in sleep, rest and task conditions.
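To make the forward-modelling idea concrete, the sketch below drives a standard Friston-type haemodynamic (balloon) model with a toy neural activity trace to produce a BOLD prediction. The parameter values are commonly cited defaults used here as assumptions; they are not the coupling of the paper's neural population model, and the EEG pathway through the head model is omitted.

```python
# Minimal sketch: one neural activity trace driving a balloon-type
# haemodynamic model to predict BOLD. Parameters are common defaults,
# assumed for illustration.
import numpy as np

def bold_from_activity(z, dt=0.01):
    """Euler-integrate Friston-style haemodynamic ODEs driven by activity z."""
    kappa, gamma, tau, alpha, E0, V0 = 0.65, 0.41, 0.98, 0.32, 0.34, 0.04
    k1, k2, k3 = 7.0 * E0, 2.0, 2.0 * E0 - 0.2   # classic BOLD coefficients
    s, f, v, q = 0.0, 1.0, 1.0, 1.0              # signal, flow, volume, dHb
    bold = np.empty_like(z, dtype=float)
    for i, zi in enumerate(z):
        ds = zi - kappa * s - gamma * (f - 1.0)
        df = s
        dv = (f - v ** (1.0 / alpha)) / tau
        dq = (f * (1.0 - (1.0 - E0) ** (1.0 / f)) / E0
              - v ** (1.0 / alpha - 1.0) * q) / tau
        s += dt * ds
        f += dt * df
        v += dt * dv
        q += dt * dq
        bold[i] = V0 * (k1 * (1.0 - q) + k2 * (1.0 - q / v) + k3 * (1.0 - v))
    return bold

t = np.arange(0.0, 30.0, 0.01)
activity = ((t > 5.0) & (t < 10.0)).astype(float)  # 5 s block of activity
print(bold_from_activity(activity)[::500])         # coarse time samples
```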
Abstract:
Remote sensing observations often have correlated errors, but the correlations are typically ignored in data assimilation for numerical weather prediction. The assumption of zero correlations is often used with data thinning methods, resulting in a loss of information. As operational centres move towards higher-resolution forecasting, there is a requirement to retain data providing detail on appropriate scales. Thus an alternative approach to dealing with observation error correlations is needed. In this article, we consider several approaches to approximating observation error correlation matrices: diagonal approximations, eigendecomposition approximations and Markov matrices. These approximations are applied in incremental variational assimilation experiments with a 1-D shallow water model using synthetic observations. Our experiments quantify analysis accuracy in comparison with a reference or ‘truth’ trajectory, as well as with analyses using the ‘true’ observation error covariance matrix. We show that it is often better to include an approximate correlation structure in the observation error covariance matrix than to incorrectly assume error independence. Furthermore, by choosing a suitable matrix approximation, it is feasible and computationally cheap to include error correlation structure in a variational data assimilation algorithm.
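The Python sketch below constructs the three kinds of approximation named above for a small illustrative correlation matrix; the 'true' matrix, decay rate and truncation rank are assumptions. A first-order Markov matrix is attractive in practice because its inverse is tridiagonal, which keeps the variational cost cheap.

```python
# Three approximations to an observation error correlation matrix:
# diagonal, truncated eigendecomposition, and first-order Markov.
# The 'true' matrix here is an illustrative exponential-decay structure.
import numpy as np

n = 10
dist = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
R_true = 0.8 ** dist + 0.1 * np.eye(n)  # hypothetical correlated truth
R_true /= R_true[0, 0]                  # normalise to unit diagonal

# 1. Diagonal approximation: discard all off-diagonal correlations.
R_diag = np.eye(n)

# 2. Eigendecomposition: keep the k leading eigenpairs, then restore the
#    diagonal so the approximation keeps unit variances.
vals, vecs = np.linalg.eigh(R_true)     # ascending eigenvalues
k = 3
R_eig = (vecs[:, -k:] * vals[-k:]) @ vecs[:, -k:].T
R_eig += np.diag(np.diag(R_true - R_eig))

# 3. Markov matrix: correlations decay as rho**|i-j|; its inverse is
#    tridiagonal, so it is cheap to apply in a variational scheme.
rho = 0.8
R_markov = rho ** dist

for name, R in [("diag", R_diag), ("eig", R_eig), ("markov", R_markov)]:
    print(name, np.linalg.norm(R - R_true))  # Frobenius-norm error
```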
Abstract:
An analysis method for diffusion tensor (DT) magnetic resonance imaging data is described, which, contrary to the standard method (multivariate fitting), does not require a specific functional model for diffusion-weighted (DW) signals. The method uses principal component analysis (PCA) under the assumption of a single fibre per pixel. PCA and the standard method were compared using simulations and human brain data. The two methods were equivalent in determining fibre orientation. PCA-derived fractional anisotropy and DT relative anisotropy had similar signal-to-noise ratio (SNR) and dependence on fibre shape. PCA-derived mean diffusivity had similar SNR to the respective DT scalar, and it depended on fibre anisotropy. Appropriate scaling of the PCA measures resulted in very good agreement between PCA and DT maps. In conclusion, the assumption of a specific functional model for DW signals is not necessary for characterization of anisotropic diffusion in a single fibre.
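For reference, here is a minimal Python sketch of the standard multivariate-fitting baseline against which the PCA method is compared: a log-linear least-squares tensor fit on a synthetic single-fibre signal, followed by eigen-analysis for fibre orientation, mean diffusivity and fractional anisotropy. The gradient scheme and tensor values are illustrative, and this shows the standard method, not the paper's PCA approach.

```python
# Standard diffusion tensor fit: solve ln(S/S0) = -b * B d by least squares,
# then eigen-analyse the tensor. Gradient scheme and values are illustrative.
import numpy as np

b = 1000.0  # s/mm^2
# Six non-collinear gradient directions (a common minimal scheme).
g = np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1],
              [1, 1, 0], [1, 0, 1], [0, 1, 1]], float)
g /= np.linalg.norm(g, axis=1, keepdims=True)

# Design matrix for the six unique elements Dxx, Dyy, Dzz, Dxy, Dxz, Dyz.
B = np.column_stack([g[:, 0]**2, g[:, 1]**2, g[:, 2]**2,
                     2*g[:, 0]*g[:, 1], 2*g[:, 0]*g[:, 2],
                     2*g[:, 1]*g[:, 2]])

# Synthetic single-fibre tensor aligned with x (units mm^2/s).
D_true = np.diag([1.7e-3, 0.3e-3, 0.3e-3])
S0 = 1.0
S = S0 * np.exp(-b * np.einsum('ij,jk,ik->i', g, D_true, g))

d = np.linalg.lstsq(-b * B, np.log(S / S0), rcond=None)[0]
D = np.array([[d[0], d[3], d[4]],
              [d[3], d[1], d[5]],
              [d[4], d[5], d[2]]])

evals, evecs = np.linalg.eigh(D)
md = evals.mean()                                  # mean diffusivity
fa = np.sqrt(1.5 * np.sum((evals - md)**2) / np.sum(evals**2))
print("principal direction:", evecs[:, -1], "FA: %.3f" % fa)
```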