49 results for DIGITAL DATA
in CentAUR: Central Archive, University of Reading - UK
Abstract:
Burst timing synchronisation is maintained in a digital data decoder during multiple burst reception in a TDMA system. The data within a multiple burst are streamed into memory storage and data corresponding to a first burst in the series of bursts are selected on the basis of a current timing estimate derived from a synchronisation burst. Selections of data corresponding to other bursts in the series of bursts are modified in accordance with updated timing estimates derived from previously processed bursts.
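As a purely illustrative sketch (the decoder's internal structure is not given in the abstract), the selection logic could be expressed as follows in Python; the buffer layout, burst_starts, burst_len and the estimate_offset routine are assumptions introduced for the example.

import numpy as np

def select_bursts(samples, burst_starts, burst_len, initial_offset, estimate_offset):
    """Select burst windows from a streamed sample buffer.

    The first burst is cut out using the timing estimate derived from the
    synchronisation burst; each later burst reuses the offset estimated
    from the burst processed just before it.
    """
    offset = initial_offset                      # from the synchronisation burst
    bursts = []
    for start in burst_starts:
        window = samples[start + offset:start + offset + burst_len]
        bursts.append(window)
        offset = estimate_offset(window)         # updated estimate for the next burst
    return bursts

# Usage with a dummy estimator that always reports a zero offset:
samples = np.arange(1000)
print(len(select_bursts(samples, [0, 200, 400], 148, 2, lambda w: 0)))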
Abstract:
Providing reliable estimates for mapping soil properties for precision agriculture requires intensive sampling and costly laboratory analyses. If the spatial structure of ancillary data, such as yield, digital information from aerial photographs, and soil electrical conductivity (EC) measurements, relates to that of soil properties, they could be used to guide the sampling intensity for soil surveys. Variograms of permanent soil properties at two study sites on different parent materials were compared with each other and with those for ancillary data. The ranges of spatial dependence identified by the variograms of both sets of properties are of similar orders of magnitude for each study site. Maps of the ancillary data appear to show similar patterns of variation, and these seem to relate to those of the permanent properties of the soil. Correlation analysis has confirmed these relations. Maps of kriged estimates from sub-sampled data and the original variograms showed that the main patterns of variation were preserved when a sampling interval of less than half the average variogram range of the ancillary data was used. Digital data from aerial photographs for different years and EC appear to show a more consistent relation with the soil properties than does yield. Aerial photographs, in particular those of bare soil, seem to be the most useful ancillary data, and they are often cheaper to obtain than yield and EC data.
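The paper's practical rule, that the main patterns of variation survive sub-sampling when the sampling interval is less than half the average variogram range of the ancillary data, reduces to a one-line calculation; the ranges below are illustrative numbers, not values from the study.

# Illustrative only: variogram ranges (metres) for three ancillary variables.
ancillary_ranges_m = {"yield": 120.0, "aerial_photo": 150.0, "soil_EC": 180.0}

average_range = sum(ancillary_ranges_m.values()) / len(ancillary_ranges_m)
max_sampling_interval = 0.5 * average_range      # half the average range

print(f"average variogram range: {average_range:.0f} m")
print(f"sample at intervals below {max_sampling_interval:.0f} m")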
Abstract:
The collection of wind speed time series by means of digital data loggers occurs in many domains, including civil engineering, environmental sciences and wind turbine technology. Since averaging intervals are often significantly larger than typical system time scales, the information lost has to be recovered in order to reconstruct the true dynamics of the system. In the present work we describe a simple algorithm capable of generating a real-time wind speed time series from data logger records containing the average, maximum, and minimum values of the wind speed in a fixed interval, as well as the standard deviation. The signal is generated from a generalized random Fourier series. The spectrum can be matched to any desired theoretical or measured frequency distribution. Extreme values are specified through a postprocessing step based on the concept of constrained simulation. Applications of the algorithm to 10-min wind speed records logged at a test site at 60 m height above the ground show that the recorded 10-min values can be reproduced by the simulated time series to a high degree of accuracy.
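A minimal sketch of the idea, not the authors' implementation: a random-phase Fourier series is shaped by a target spectrum, rescaled to the logged standard deviation and shifted to the logged mean. The flat default spectrum and the omission of the extreme-value (constrained-simulation) step are simplifications.

import numpy as np

def synth_wind(mean, std, n=600, dt=1.0, spectrum=None, seed=None):
    """Reconstruct a wind-speed series inside one logging interval from its
    recorded mean and standard deviation, using a random-phase Fourier series."""
    rng = np.random.default_rng(seed)
    freqs = np.fft.rfftfreq(n, dt)[1:]            # drop the DC component
    amps = np.ones_like(freqs) if spectrum is None else spectrum(freqs)
    phases = rng.uniform(0.0, 2.0 * np.pi, freqs.size)
    t = np.arange(n) * dt
    fluct = (amps[:, None]
             * np.cos(2.0 * np.pi * freqs[:, None] * t + phases[:, None])).sum(axis=0)
    fluct *= std / fluct.std()                    # match the logged standard deviation
    return mean + fluct

# Example: a 10-min record logged as mean 8.2 m/s and std 1.1 m/s, sampled at 1 Hz.
series = synth_wind(8.2, 1.1, n=600, dt=1.0, seed=0)
print(round(series.mean(), 2), round(series.std(), 2))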
Abstract:
The nature and magnitude of climatic variability during the period of middle Pliocene warmth (ca 3.29–2.97 Ma) is poorly understood. We present a suite of palaeoclimate modelling experiments incorporating an advanced atmospheric general circulation model (GCM), coupled to a Q-flux ocean model, for 3.29, 3.12 and 2.97 Ma BP. Astronomical solutions for the periods in question were derived from the Berger and Loutre BL2 astronomical solution. Boundary conditions, excluding sea surface temperatures (SSTs), which were predicted by the slab-ocean model, were provided from the USGS PRISM2 2°×2° digital data set. The model results indicate that little annual variation (0.5°C) in SSTs, relative to a ‘control’ experiment, occurred during the middle Pliocene in response to the altered orbital configurations. Annual surface air temperatures also displayed little variation. Seasonally, surface air temperatures displayed a trend of cooler temperatures during December, January and February, and warmer temperatures during June, July and August. This pattern is consistent with altered seasonality resulting from the prescribed orbital configurations. Precipitation changes follow the seasonal trend observed for surface air temperature. Compared to the present day, surface wind strength and wind stress over the North Atlantic, North Pacific and Southern Ocean remained greater in each of the Pliocene experiments. This suggests that wind-driven gyral circulation may have been consistently greater during the middle Pliocene. The trend of climatic variability predicted by the GCM for the middle Pliocene accords with geological data. However, it is unclear whether the model correctly simulates the magnitude of the variation. This uncertainty derives from (a) the relative insensitivity of the GCM to perturbation of the imposed boundary conditions, (b) a lack of detailed time-series data concerning changes to terrestrial ice cover and greenhouse gas concentrations for the middle Pliocene, and (c) difficulties in representing the effects of ‘climatic history’ in snap-shot GCM experiments.
Abstract:
The principles of operation of an experimental prototype instrument known as J-SCAN are described, along with the derivation of formulae for the rapid calculation of normalized impedances; the structure of the instrument; relevant probe design parameters; digital quantization errors; and approaches to the optimization of single-frequency operation. An eddy current probe is used as the inductance element of a passive tuned circuit which is repeatedly excited with short impulses. Each impulse excites an oscillation whose decay depends upon the values of the tuned-circuit components: resistance, inductance and capacitance. Changing conditions under the probe that affect the resistance and inductance of this circuit will thus be detected through changes in the transient response. These changes in the transient response, its oscillation frequency and rate of decay, are digitized, and normalized values for the probe resistance and inductance changes are then calculated immediately in a microprocessor. This approach, coupled with a minimum of analogue processing and a maximum of digital processing, has advantages over conventional eddy current instruments: in particular, the absence of an out-of-balance condition and the flexibility and stability of digital data processing.
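Although the abstract does not reproduce the formulae, the inversion it describes follows from the standard underdamped response of the tuned circuit; a sketch, with $C$ the known tuned-circuit capacitance: after each impulse the probe circuit rings down as
\[
  v(t) = V_0\, e^{-\alpha t} \cos(\omega_d t + \phi),
  \qquad \alpha = \frac{R}{2L},
  \qquad \omega_d = \sqrt{\frac{1}{LC} - \alpha^2},
\]
so measuring the decay rate $\alpha$ and the oscillation frequency $\omega_d$ from the digitised transient gives
\[
  L = \frac{1}{C\,(\omega_d^{2} + \alpha^{2})}, \qquad R = 2\alpha L.
\]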
Abstract:
There are still major challenges in the area of automatic indexing and retrieval of digital data. The main problem arises from the ever-increasing mass of digital media and the lack of efficient methods for indexing and retrieval of such data based on the semantic content rather than keywords. To enable intelligent web interactions, or even web filtering, we need to be capable of interpreting the information base in an intelligent manner. Research has been ongoing for some years in the field of ontological engineering, with the aim of using ontologies to add knowledge to information. In this paper we describe the architecture of a system designed to automatically and intelligently index huge repositories of special effects video clips, based on their semantic content, using a network of scalable ontologies to enable intelligent retrieval.
Abstract:
Automatic indexing and retrieval of digital data poses major challenges. The main problem arises from the ever-increasing mass of digital media and the lack of efficient methods for indexing and retrieval of such data based on the semantic content rather than keywords. To enable intelligent web interactions, or even web filtering, we need to be capable of interpreting the information base in an intelligent manner. For a number of years research has been ongoing in the field of ontological engineering with the aim of using ontologies to add such (meta) knowledge to information. In this paper, we describe the architecture of a system (Dynamic REtrieval Analysis and semantic metadata Management (DREAM)) designed to automatically and intelligently index huge repositories of special effects video clips, based on their semantic content, using a network of scalable ontologies to enable intelligent retrieval. The DREAM Demonstrator has been evaluated as deployed in the film post-production phase to support the storage, indexing and retrieval of large data sets of special effects video clips as an exemplar application domain. This paper reports its performance and usability results and highlights the scope for future enhancements of the DREAM architecture, which has proven successful in its first and possibly most challenging proving ground, namely film production, where it is already in routine use within our test-bed partners' creative processes. (C) 2009 Published by Elsevier B.V.
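As a purely hypothetical sketch (the DREAM architecture is not detailed in the abstract), semantic retrieval over a small concept network might look like the following; the concept graph, the clip index and the query-expansion step are all invented for illustration.

# Hypothetical illustration: expand a query concept through a toy concept
# network before matching against concepts assigned to clips at indexing time.
concept_network = {
    "explosion": {"fire", "smoke", "debris"},
    "fire": {"smoke", "flame"},
    "rain": {"water", "storm"},
}

clip_index = {                      # clip id -> concepts assigned at indexing time
    "fx_0001": {"fire", "smoke"},
    "fx_0002": {"storm", "water"},
    "fx_0003": {"debris", "flame"},
}

def retrieve(query_concept):
    """Return clips annotated with the query concept or a directly related one."""
    related = {query_concept} | concept_network.get(query_concept, set())
    return [cid for cid, tags in clip_index.items() if tags & related]

print(retrieve("explosion"))        # ['fx_0001', 'fx_0003']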
Abstract:
There is a renewed interest in immersive visualization to navigate the digital data sets associated with large building and infrastructure projects. Following work with a fully immersive visualization facility at the University, this paper details the development of a complementary mobile visualization environment. It articulates progress on the requirements for this facility; the overall design of hardware and software; and the laboratory testing and planning for user pilots in construction applications. Like our fixed facility, this new lightweight mobile solution enables a group of users to navigate a 3D model at 1:1 scale and to work collaboratively with structured asset information. However, it offers greater flexibility: two users can assemble it and start using it at a new location within an hour. The solution has been developed and tested in a laboratory and will be piloted in engineering design review and stakeholder engagement applications on a major construction project.
Abstract:
Flood modelling of urban areas is still at an early stage, partly because until recently topographic data of sufficiently high resolution and accuracy have been lacking in urban areas. However, Digital Surface Models (DSMs) generated from airborne scanning laser altimetry (LiDAR) having sub-metre spatial resolution have now become available, and these are able to represent the complexities of urban topography. The paper describes the development of a LiDAR post-processor for urban flood modelling based on the fusion of LiDAR and digital map data. The map data are used in conjunction with LiDAR data to identify different object types in urban areas, though pattern recognition techniques are also employed. Post-processing produces a Digital Terrain Model (DTM) for use as model bathymetry, and also a friction parameter map for use in estimating spatially-distributed friction coefficients. In vegetated areas, friction is estimated from LiDAR-derived vegetation height, and (unlike most vegetation removal software) the method copes with short vegetation less than ~1m high, which may occupy a substantial fraction of even an urban floodplain. The DTM and friction parameter map may also be used to help to generate an unstructured mesh of a vegetated urban floodplain for use by a 2D finite element model. The mesh is decomposed to reflect floodplain features having different frictional properties to their surroundings, including urban features such as buildings and roads as well as taller vegetation features such as trees and hedges. This allows a more accurate estimation of local friction. The method produces a substantial node density due to the small dimensions of many urban features.
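The friction-map step can be illustrated with a short raster calculation; the object classes, the friction coefficient values and the vegetation-height relation below are assumptions for the example, not the paper's calibrated values.

import numpy as np

def friction_map(veg_height, building_mask, road_mask):
    """Assign a spatially distributed friction coefficient to each grid cell:
    class-based values where the map data identify roads or buildings, and a
    value that grows with LiDAR-derived vegetation height elsewhere."""
    n = np.full(veg_height.shape, 0.035)                       # default: open ground
    veg = veg_height > 0.0
    n[veg] = 0.04 + 0.02 * np.minimum(veg_height[veg], 2.0)    # short and tall vegetation
    n[road_mask] = 0.02                                        # smooth paved surfaces
    n[building_mask] = 0.10                                    # buildings as high-friction blocks
    return n

# Tiny 2 x 3 example grid
veg = np.array([[0.0, 0.6, 1.8], [0.0, 0.0, 0.3]])
bld = np.array([[False, False, False], [True, False, False]])
rd  = np.array([[True, False, False], [False, True, False]])
print(friction_map(veg, bld, rd))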
Abstract:
We have designed and implemented a low-cost digital system that uses closed-circuit television cameras coupled to a digital acquisition system to record in vivo behavioral data in rodents, allowing more than 10 animals to be observed and recorded simultaneously at a reduced cost compared with commercially available solutions. This system has been validated using two experimental rodent models: one involving chemically induced seizures and one assessing appetite and feeding. We present observational results showing comparable or improved levels of accuracy and observer consistency between this new system and traditional methods in these experimental models, discuss the advantages of the presented system over conventional analog systems and commercially available digital systems, and propose possible extensions to the system and applications to non-rodent studies.
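A minimal sketch of the capture side, assuming OpenCV and locally attached capture devices; the device indices, codec, frame rate and file names are placeholders rather than the authors' configuration.

import cv2  # OpenCV for frame capture and encoding

def record(camera_indices, seconds=10, fps=15.0):
    """Poll several cameras in turn and write one video file per camera."""
    caps = [cv2.VideoCapture(i) for i in camera_indices]
    writers = {}
    fourcc = cv2.VideoWriter_fourcc(*"XVID")
    try:
        for _ in range(int(seconds * fps)):
            for idx, cap in zip(camera_indices, caps):
                ok, frame = cap.read()
                if not ok:
                    continue
                if idx not in writers:                       # open writer on first frame
                    h, w = frame.shape[:2]
                    writers[idx] = cv2.VideoWriter(f"cage_{idx}.avi", fourcc, fps, (w, h))
                writers[idx].write(frame)
    finally:
        for cap in caps:
            cap.release()
        for wr in writers.values():
            wr.release()

# record([0, 1, 2])   # e.g. three attached capture devices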
Abstract:
Although accuracy of digital elevation models (DEMs) can be quantified and measured in different ways, each is influenced by three main factors: terrain character, sampling strategy and interpolation method. These parameters, and their interaction, are discussed. The generation of DEMs from digitised contours is emphasised because this is the major source of DEMs, particularly within member countries of OEEPE. Such DEMs often exhibit unwelcome artifacts, depending on the interpolation method employed. The origin and magnitude of these effects and how they can be reduced to improve the accuracy of the DEMs are also discussed.
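To make the interpolation point concrete, the sketch below grids the same scattered elevation samples with two interpolators and reports how far each grid departs from a known synthetic surface; the surface and the sampling are invented for illustration and are not contour data from a real map.

import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(1)
pts = rng.uniform(0.0, 1.0, (400, 2))                        # scattered sample locations
z = np.sin(3.0 * pts[:, 0]) + 0.5 * np.cos(5.0 * pts[:, 1])  # synthetic terrain heights

gx, gy = np.meshgrid(np.linspace(0.05, 0.95, 50), np.linspace(0.05, 0.95, 50))
truth = np.sin(3.0 * gx) + 0.5 * np.cos(5.0 * gy)

for method in ("linear", "cubic"):
    dem = griddata(pts, z, (gx, gy), method=method)
    rmse = np.sqrt(np.nanmean((dem - truth) ** 2))
    print(f"{method:6s} interpolation RMSE: {rmse:.4f}")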
Abstract:
One of the main tasks of the mathematical knowledge management community must surely be to enhance access to mathematics on digital systems. In this paper we present a spectrum of approaches to solving the various problems inherent in this task, arguing that a variety of approaches is both necessary and useful. The main ideas presented concern the differences between digitised mathematics, digitally represented mathematics and formalised mathematics. Each has its part to play in managing mathematical information in a connected world. Digitised material is that which is embodied in a computer file, accessible and displayable locally or globally. Represented material is digital material in which there is some structure (usually syntactic in nature) which maps to the mathematics contained in the digitised information. Formalised material is that in which both the syntax and the semantics of the represented material are automatically accessible. Given the range of mathematical information to which access is desired, and the limited resources available for managing that information, we must ensure that these resources are applied to digitise, form representations of, or formalise existing and new mathematical information in such a way as to extract the most benefit from the least expenditure of resources. We also analyse some of the various social and legal issues which surround the practical tasks.
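The three levels can be made concrete with a single statement: a scanned page containing it is merely digitised, its LaTeX source is a syntactic representation, and the statement below is formalised, with machine-checkable semantics. A brief sketch, assuming Lean 4 with Mathlib (which supplies the lemma irrational_sqrt_two):

-- Digitised: a scanned image of the page; no structure a machine can exploit.
-- Represented: the LaTeX source "$\sqrt{2} \notin \mathbb{Q}$" carries syntax
--   that maps to the mathematics, but no machine-checkable meaning.
-- Formalised: the same statement with checked semantics (Lean 4 + Mathlib).
import Mathlib

example : Irrational (Real.sqrt 2) := irrational_sqrt_two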