969 results for on-disk data layout
Abstract:
With the introduction of new observing systems based on asynoptic observations, the analysis problem has changed in character. In the near future we may expect that a considerable part of meteorological observations will be unevenly distributed in four dimensions, i.e. three dimensions in space and one in time. The term analysis, or objective analysis in meteorology, means the process of interpolating meteorological observations from unevenly distributed locations to a network of regularly spaced grid points. Necessitated by the requirement of numerical weather prediction models to solve the governing finite-difference equations on such a grid lattice, objective analysis is a three-dimensional (or, more often, two-dimensional) interpolation technique. As a consequence of the structure of the conventional synoptic network, with separated data-sparse and data-dense areas, four-dimensional analysis has in fact been used intensively for many years. Weather services have thus based their analyses not only on synoptic data at the time of the analysis and on climatology, but also on the fields predicted from the previous observation hour and valid at the time of the analysis. The inclusion of the time dimension in objective analysis will be called four-dimensional data assimilation. From one point of view it seems possible to apply the conventional technique to the new data sources by simply reducing the time interval in the analysis-forecasting cycle. This could in fact be justified for the conventional observations as well. We have fairly good coverage of surface observations 8 times a day, and several upper-air stations make radiosonde and radiowind observations 4 times a day. If we have a 3-hour step in the analysis-forecasting cycle instead of the commonly applied 12 hours, we may without difficulty treat all observations as synoptic.
No observation would thus be more than 90 minutes off time, and even during strong transient motion the observations would fall within a horizontal mesh of 500 km × 500 km.
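The cycle-interval arithmetic above is simple enough to sketch: with an analysis-forecasting cycle of length T, an observation assigned to the nearest analysis time is at most T/2 away from it. A minimal illustration (the function name is ours, not from the paper):

```python
def max_time_offset_minutes(cycle_hours: float) -> float:
    """Maximum time offset of any observation treated as synoptic,
    when each observation is assigned to the nearest analysis time
    in a cycle of length `cycle_hours`."""
    return cycle_hours * 60.0 / 2.0

# A 3-hour cycle bounds the offset at 90 minutes; the usual 12-hour
# cycle would allow offsets of up to 6 hours.
print(max_time_offset_minutes(3))   # 3-hour step
print(max_time_offset_minutes(12))  # conventional 12-hour step
```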
Abstract:
Full-waveform laser scanning data acquired with a Riegl LMS-Q560 instrument were used to classify an orange orchard into orange trees, grass and ground using waveform parameters alone. Gaussian decomposition was performed on these data, captured during the National Airborne Field Experiment in November 2006, using a custom peak-detection procedure and a trust-region-reflective algorithm for fitting Gaussian functions. Calibration was carried out using waveforms returned from a road surface, and the backscattering coefficient c was derived for every waveform peak. The processed data were then analysed according to the number of returns detected within each waveform and classified into three classes based on pulse width and c. For single-peak waveforms, the scatterplot of c versus pulse width was used to distinguish between ground, grass and orange trees. In the case of multiple returns, the relationship between first (or first plus middle) and last return c values was used to separate ground from other targets. Refinement of this classification, and further sub-classification into grass and orange trees, was performed using the c versus pulse width scatterplots of last returns. In all cases the separation was carried out using a decision tree with empirical relationships between the waveform parameters. Ground points were successfully separated from orange tree points. The most difficult class to separate and verify was grass, but those points in general corresponded well with the grass areas identified in the aerial photography. The overall accuracy reached 91%, using photography and relative elevation as ground truth. The overall accuracy for two classes, orange tree and a combined class of grass and ground, reached 95%. Finally, the backscattering coefficient c of single-peak waveforms was also used to derive reflectance values of the three classes.
The reflectance of the orange tree class (0.31) and ground class (0.60) are consistent with published values at the wavelength of the Riegl scanner (1550 nm). The grass class reflectance (0.46) falls in between the other two classes as might be expected, as this class has a mixture of the contributions of both vegetation and ground reflectance properties.
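The Gaussian decomposition step can be sketched with standard tools: SciPy's `curve_fit` supports a trust-region-reflective solver via `method="trf"`. The following fits a single synthetic waveform peak; it is an illustration of the fitting technique, not the authors' custom peak-detection procedure, and all numbers and names are invented for the example:

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(t, amp, mu, sigma):
    """Single Gaussian pulse model for one waveform peak."""
    return amp * np.exp(-((t - mu) ** 2) / (2.0 * sigma ** 2))

# Synthetic single-peak waveform: amplitude 120, centre 42 ns, sigma 3 ns,
# plus a little noise (values chosen only for the example).
t = np.linspace(0.0, 100.0, 201)
rng = np.random.default_rng(0)
y = gaussian(t, 120.0, 42.0, 3.0) + rng.normal(0.0, 1.0, t.size)

# Trust-region-reflective fit, bounded to keep parameters physical.
p0 = [y.max(), t[np.argmax(y)], 2.0]
popt, _ = curve_fit(gaussian, t, y, p0=p0, method="trf",
                    bounds=([0.0, 0.0, 0.1], [np.inf, 100.0, 20.0]))
amp, mu, sigma = popt

# One common definition of pulse width is the full width at half maximum.
fwhm = 2.0 * np.sqrt(2.0 * np.log(2.0)) * sigma
```

In the real pipeline each detected peak in a waveform would get its own Gaussian, and the calibrated backscattering coefficient would be derived from the fitted amplitude and width.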
Abstract:
BACKGROUND: Genetic polymorphisms of transcription factor 7-like 2 (TCF7L2) have been associated with type 2 diabetes and BMI. OBJECTIVE: The objective was to investigate whether TCF7L2 HapA is associated with weight development and whether such an association is modulated by protein intake or by the glycemic index (GI). DESIGN: The investigation was based on prospective data from 5 cohort studies nested within the European Prospective Investigation into Cancer and Nutrition. Weight change was followed up for a mean (±SD) of 6.8 ± 2.5 y. TCF7L2 rs7903146 and rs10885406 were successfully genotyped in 11,069 individuals and used to derive HapA. Multiple logistic and linear regression analysis was applied to test for the main effect of HapA and its interaction with dietary protein or GI. Analyses from the cohorts were combined by random-effects meta-analysis. RESULTS: HapA was associated neither with baseline BMI (0.03 ± 0.07 BMI units per allele; P = 0.6) nor with annual weight change (8.8 ± 11.7 g/y per allele; P = 0.5). However, a previously shown positive association between intake of protein, particularly of animal origin, and subsequent weight change in this population proved to be attenuated by TCF7L2 HapA (P-interaction = 0.01). We showed that weight gain becomes independent of protein intake with an increasing number of HapA alleles. Substitution of protein with either fat or carbohydrates showed the same effects. No interaction with GI was observed. CONCLUSION: TCF7L2 HapA attenuates the positive association between animal protein intake and long-term body weight change in middle-aged Europeans but does not interact with the GI of the diet.
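The random-effects combination of per-cohort estimates can be illustrated with the widely used DerSimonian-Laird estimator; the abstract does not state which random-effects estimator was applied, so this is a generic sketch with invented inputs:

```python
import math

def dersimonian_laird(betas, variances):
    """Pool per-study estimates under a random-effects model using the
    DerSimonian-Laird moment estimator of between-study variance tau^2.
    `betas` are study effect estimates, `variances` their squared SEs."""
    k = len(betas)
    w = [1.0 / v for v in variances]            # fixed-effect weights
    sw = sum(w)
    b_fixed = sum(wi * bi for wi, bi in zip(w, betas)) / sw
    q = sum(wi * (bi - b_fixed) ** 2 for wi, bi in zip(w, betas))
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - (k - 1)) / c)          # truncated at zero
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * bi for wi, bi in zip(w_star, betas)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, se, tau2
```

With homogeneous studies tau^2 collapses to zero and the pooled value reduces to the inverse-variance fixed-effect estimate.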
Abstract:
A comprehensive quality assessment of the ozone products from 18 limb-viewing satellite instruments is provided by means of a detailed intercomparison. The ozone climatologies, in the form of monthly zonal mean time series covering the upper troposphere to the lower mesosphere, are obtained from LIMS, SAGE I/II/III, UARS-MLS, HALOE, POAM II/III, SMR, OSIRIS, MIPAS, GOMOS, SCIAMACHY, ACE-FTS, ACE-MAESTRO, Aura-MLS, HIRDLS, and SMILES for the period 1978–2010. The intercomparisons focus on mean biases of annual zonal mean fields, interannual variability, and seasonal cycles. Additionally, the physical consistency of the data is tested through diagnostics of the quasi-biennial oscillation and the Antarctic ozone hole. The comprehensive evaluations reveal that the uncertainty in our knowledge of the atmospheric ozone mean state is smallest in the tropical and midlatitude middle stratosphere, with a 1σ multi-instrument spread of less than ±5%. While the overall agreement among the climatological data sets is very good for large parts of the stratosphere, individual discrepancies have been identified, including unrealistic month-to-month fluctuations, large biases in particular atmospheric regions, and inconsistencies in the seasonal cycle. Notable differences between the data sets exist in the tropical lower stratosphere (with a spread of ±30%) and at high latitudes (±15%). In particular, large relative differences are identified in the Antarctic during the time of the ozone hole, with a spread between the monthly zonal mean fields of ±50%. The evaluations provide guidance on which data sets are the most reliable for applications such as studies of ozone variability, model-measurement comparisons, detection of long-term trends, and data-merging activities.
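The 1σ multi-instrument spread quoted above is, in essence, the standard deviation across instruments expressed as a percentage of the multi-instrument mean. A minimal sketch (assuming a sample standard deviation; the study's exact convention may differ):

```python
import statistics

def relative_spread_percent(values):
    """1-sigma spread across instruments, as a percentage of the
    multi-instrument mean. `values` are co-located zonal mean ozone
    values, one per instrument. Uses the sample standard deviation."""
    mean = statistics.fmean(values)
    return 100.0 * statistics.stdev(values) / mean

# Hypothetical ozone values (ppmv) from three instruments at one
# latitude/altitude bin: a spread of 5% of the mean.
print(relative_spread_percent([4.75, 5.0, 5.25]))
```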
Abstract:
Policy makers are in broad agreement that demand response should play a major role in EU electricity systems and provide much needed future system flexibility. Yet, little demand response has been forthcoming in member states to date. This paper identifies some of the technical potential for demand response, based on empirical data from one UK demand aggregator. Half-hourly electricity readings of demand during normal operation and during response events have been analysed for different industry and service sectors. We review these findings in the context of ongoing EU policy developments, with particular focus on the role of appropriate arrangements in enhancing the available resource. We conclude that in some sectors appropriate policy and regulation could triple the available response capacity and thereby lead to stronger commercial uptake of demand response.
Abstract:
The CHARMe project enables the annotation of climate data with key pieces of supporting information that we term “commentary”. Commentary reflects the experience that has built up in the user community, and can help new or less-expert users (such as consultants, SMEs, experts in other fields) to understand and interpret complex data. In the context of global climate services, the CHARMe system will record, retain and disseminate this commentary on climate datasets, and provide a means for feeding back this experience to the data providers. Based on novel linked data techniques and standards, the project has developed a core system, data model and suite of open-source tools to enable this information to be shared, discovered and exploited by the community.
Abstract:
We present an overview of the MELODIES project, which is developing new data-intensive environmental services based on data from Earth Observation satellites, government databases, national and European agencies and more. We focus here on the capabilities and benefits of the project’s “technical platform”, which applies cloud computing and Linked Data technologies to enable the development of these services, providing flexibility and scalability.
Abstract:
The Environmental Data Abstraction Library (EDAL) provides a modular data management library for bringing new and diverse datatypes together for visualisation within numerous software packages, including the ncWMS viewing service, which already has very wide international uptake. The structure of EDAL is presented along with examples of its use to compare satellite, model and in situ data types within the same visualisation framework. We emphasize the value of this capability for cross-calibration of datasets and evaluation of model products against observations, including preparation for data assimilation.
Abstract:
Levels of mobility in the Roman Empire have long been assumed to be relatively high, as attested by epigraphy, demography, material culture and, most recently, isotope analysis and the skeletons themselves. Building on recent data from a range of Romano-British sites (Poundbury in Dorset, York, Winchester, Gloucester, Catterick and Scorton), this article explores the significance of the presence of migrants at these sites and the impact they may have had on their host societies. The authors explore the usefulness of diaspora theory, and in particular the concept of imperial and colonial diasporas, to illustrate the complexities of identities in later Roman Britain.
Abstract:
Classical regression methods take vectors as covariates and estimate the corresponding vectors of regression parameters. When addressing regression problems on covariates of more complex form, such as multi-dimensional arrays (i.e. tensors), traditional computational models can be severely compromised by ultrahigh dimensionality as well as complex structure. By exploiting the special structure of tensor covariates, the tensor regression model provides a promising way to reduce the model’s dimensionality to a manageable level, thus leading to efficient estimation. Most existing tensor-based methods estimate each individual regression problem independently, based on tensor decompositions that allow the simultaneous projection of an input tensor onto more than one direction along each mode. In practice, however, multi-dimensional data are collected under the same or very similar conditions, so that the data share some common latent components but can also have their own independent parameters for each regression task. It is therefore beneficial to analyse the regression parameters of all the regressions in a linked way. In this paper, we propose a tensor regression model based on Tucker decomposition, which simultaneously identifies both the common components of the parameters across all the regression tasks and the independent factors contributing to each particular task. Under this paradigm, the number of independent parameters along each mode is constrained by a sparsity-preserving regulariser. Linked multiway parameter analysis and sparsity modelling further reduce the total number of parameters, giving a lower memory cost than other tensor-based counterparts. The effectiveness of the new method is demonstrated on real data sets.
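Tucker decomposition, the building block of the proposed model, factorises a tensor into a small core tensor plus one factor matrix per mode. A minimal NumPy sketch of the truncated higher-order SVD (HOSVD) illustrates the decomposition itself; it is not the paper's linked, sparsity-regularised regression, and the function names are ours:

```python
import numpy as np

def unfold(tensor, mode):
    """Mode-n unfolding: matrix whose rows index the given mode."""
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

def mode_multiply(tensor, matrix, mode):
    """Multiply `tensor` along `mode` by `matrix` (shape: out_dim x in_dim)."""
    moved = np.moveaxis(tensor, mode, 0)
    return np.moveaxis(np.tensordot(matrix, moved, axes=1), 0, mode)

def hosvd(tensor, ranks):
    """Truncated HOSVD: per-mode factor matrices and a core tensor."""
    factors = []
    for mode, r in enumerate(ranks):
        u, _, _ = np.linalg.svd(unfold(tensor, mode), full_matrices=False)
        factors.append(u[:, :r])
    core = tensor
    for mode, u in enumerate(factors):
        core = mode_multiply(core, u.T, mode)
    return core, factors

def tucker_reconstruct(core, factors):
    """Rebuild the tensor from its Tucker core and factor matrices."""
    out = core
    for mode, u in enumerate(factors):
        out = mode_multiply(out, u, mode)
    return out
```

With full ranks the reconstruction is exact; truncating the ranks is what makes the parameter count manageable, which is the property the regression model exploits.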
Abstract:
The purpose of this paper is to report on the facilities available for, organisation of, and staff attitudes to early years outdoor education in schools within the south east of England, focusing on provision for children aged three to five. One component of the successful education of the child involves providing an ‘environment for learning’, including the facilities, layout and routines. This paper presents findings concerning the type and variety of facilities available outside; the various styles of organisation of the space; and staff attitudes about their roles, their aims for the environment, children’s behaviour and learning, and perceived drawbacks to practice. The paper draws on empirical data collected from schools within the University of Reading partnership. The findings suggest that although all early years settings must adhere to the statutory framework, there is a range of facilities available and a number of ways in which this environment is organised. Further, there appears to be uncertainty about the adult role outside and about the aims for activities. The conclusions drawn indicate that staff do not appear to be linking their aims for outdoor education to the facilities provided or to their actions outside; there is thus no clear link between what staff provide outside and their declared ambitions for learning. This study is important because all educators need to be certain about their aims for education to ensure the best outcomes for children. The implications of these findings for early years teachers are that they need to be able to articulate their aims for outdoor education and to provide the appropriate facilities to achieve these aims. Finally, this study was undertaken to raise debate, posit questions and ascertain the parameters for further research on the early years outdoor environment.
Abstract:
In order to calculate unbiased microphysical and radiative quantities in the presence of a cloud, it is necessary to know not only the mean water content but also the distribution of this water content. This article describes a study of the in-cloud horizontal inhomogeneity of ice water content, based on CloudSat data. In particular, by focusing on the relations with variables that are already available in general circulation models (GCMs), a parametrization of inhomogeneity that is suitable for inclusion in GCM simulations is developed. Inhomogeneity is defined in terms of the fractional standard deviation (FSD), which is given by the standard deviation divided by the mean. The FSD of ice water content is found to increase with the horizontal scale over which it is calculated and also with the thickness of the layer. The connection to cloud fraction is more complicated: for small cloud fractions, FSD increases as cloud fraction increases, while for overcast scenes FSD decreases sharply. The relations to horizontal scale, layer thickness and cloud fraction are parametrized in a relatively simple equation. The performance of this parametrization is tested on an independent set of CloudSat data. The parametrization is shown to be a significant improvement on the assumption of a single-valued global FSD.
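The FSD definition used throughout is straightforward to compute. A minimal sketch (using the population standard deviation; the study's exact convention is not stated in the abstract):

```python
import statistics

def fractional_standard_deviation(values):
    """FSD = standard deviation divided by the mean, here over a set of
    in-cloud ice water content samples within one region and layer."""
    mean = statistics.fmean(values)
    return statistics.pstdev(values) / mean

# A perfectly homogeneous cloud has FSD 0; more variable water
# content raises FSD regardless of the mean's magnitude.
print(fractional_standard_deviation([0.2, 0.2, 0.2]))
print(fractional_standard_deviation([0.1, 0.3]))
```

Because FSD is normalised by the mean, it lets the parametrization describe inhomogeneity consistently across regimes with very different absolute water contents.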
Abstract:
Purpose: To investigate the relationship between research data management (RDM) and data sharing in the formulation of RDM policies and development of practices in higher education institutions (HEIs). Design/methodology/approach: Two strands of work were undertaken sequentially: firstly, content analysis of 37 RDM policies from UK HEIs; secondly, two detailed case studies of institutions with different approaches to RDM based on semi-structured interviews with staff involved in the development of RDM policy and services. The data are interpreted using insights from Actor Network Theory. Findings: RDM policy formation and service development has created a complex set of networks within and beyond institutions involving different professional groups with widely varying priorities shaping activities. Data sharing is considered an important activity in the policies and services of HEIs studied, but its prominence can in most cases be attributed to the positions adopted by large research funders. Research limitations/implications: The case studies, as research based on qualitative data, cannot be assumed to be universally applicable but do illustrate a variety of issues and challenges experienced more generally, particularly in the UK. Practical implications: The research may help to inform development of policy and practice in RDM in HEIs and funder organisations. Originality/value: This paper makes an early contribution to the RDM literature on the specific topic of the relationship between RDM policy and services, and openness – a topic which to date has received limited attention.
Abstract:
Previous versions of the Consortium for Small-scale Modelling (COSMO) numerical weather prediction model have used a constant sea-ice surface temperature, but observations show a high degree of variability on sub-daily timescales. To account for this, we have implemented a thermodynamic sea-ice module in COSMO and performed simulations at a resolution of 15 km and 5 km for the Laptev Sea area in April 2008. Temporal and spatial variability of surface and 2-m air temperature are verified by four automatic weather stations deployed along the edge of the western New Siberian polynya during the Transdrift XIII-2 expedition and by surface temperature charts derived from Moderate Resolution Imaging Spectroradiometer (MODIS) satellite data. A remarkable agreement between the new model results and these observations demonstrates that the implemented sea-ice module can be applied for short-range simulations. Prescribing the polynya areas daily, our COSMO simulations provide a high-resolution and high-quality atmospheric data set for the Laptev Sea for the period 14-30 April 2008. Based on this data set, we derive a mean total sea-ice production rate of 0.53 km³/day for all Laptev Sea polynyas under the assumption that the polynyas are ice-free, and a rate of 0.30 km³/day if a 10 cm thin-ice layer is assumed. Our results indicate that ice production in Laptev Sea polynyas has been overestimated in previous studies.
Abstract:
In order to gain insight into events and issues that may cause errors and outages in parts of IP networks, intelligent methods that capture and express causal relationships online (in real time) are needed. Whereas generalised rule induction has been explored for non-streaming data applications, its application and adaptation to streaming data is mostly undeveloped or based on periodic and ad hoc training with batch algorithms. Some association rule mining approaches for streaming data do exist; however, they can only express binary causal relationships. This paper presents ongoing work on Online Generalised Rule Induction (OGRI), which aims to create expressive and adaptive rule sets in real time that can be applied to a broad range of applications, including network telemetry data streams.
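The contrast with binary association rules can be made concrete with a toy online counter that maintains rule support and confidence incrementally over a stream. This is our own minimal sketch, not the OGRI algorithm; it can only express the pairwise attribute=value relationships the paper aims to go beyond:

```python
from collections import defaultdict

class OnlineRuleCounter:
    """Toy incremental counter for rules of the form
    (attribute_a = value_a) -> (attribute_b = value_b),
    updated one stream record at a time."""

    def __init__(self):
        self.n = 0
        self.item_counts = defaultdict(int)   # counts per (attr, value)
        self.pair_counts = defaultdict(int)   # counts per unordered item pair

    def update(self, record):
        """Ingest one record (dict of attribute -> value) from the stream."""
        self.n += 1
        items = list(record.items())
        for item in items:
            self.item_counts[item] += 1
        for i, left in enumerate(items):
            for right in items[i + 1:]:
                self.pair_counts[frozenset((left, right))] += 1

    def confidence(self, antecedent, consequent):
        """Estimated P(consequent | antecedent) from counts seen so far."""
        if self.item_counts[antecedent] == 0:
            return 0.0
        pair = frozenset((antecedent, consequent))
        return self.pair_counts[pair] / self.item_counts[antecedent]

# Hypothetical telemetry stream: link=down always co-occurs with errors=high.
counter = OnlineRuleCounter()
for _ in range(5):
    counter.update({"link": "down", "errors": "high"})
counter.update({"link": "up", "errors": "low"})
```

Generalised rule induction, by contrast, seeks richer consequents and antecedents (e.g. numeric conditions and multi-attribute targets), which is exactly what such pairwise counters cannot represent.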