63 results for Real data
Abstract:
Satellite-based rainfall monitoring is widely used for climatological studies because of its full global coverage, but it is also of great importance for operational purposes, especially in areas such as Africa where ground-based rainfall data are scarce. Satellite rainfall estimates have enormous potential benefits as input to hydrological and agricultural models because of their real-time availability, low cost and full spatial coverage. One issue that needs to be addressed is the uncertainty in these estimates. This is particularly important in assessing the likely errors in the output of non-linear models (rainfall-runoff or crop yield) which use the rainfall estimates, aggregated over an area, as input. Correct assessment of the uncertainty in the rainfall is non-trivial, as it must take account of:
• the difference in spatial support between the satellite information and the independent data used for calibration;
• uncertainties in the independent calibration data;
• the non-Gaussian distribution of rainfall amount;
• the spatial intermittency of rainfall;
• the spatial correlation of the rainfall field.
This paper describes a method for estimating the uncertainty in satellite-based rainfall values that takes account of these factors. The method involves, firstly, a stochastic calibration which completely describes the probability of rainfall occurrence and the pdf of rainfall amount for a given satellite value, and, secondly, the generation of an ensemble of rainfall fields based on the stochastic calibration but with the correct spatial correlation structure within each ensemble member. This is achieved by the use of geostatistical sequential simulation. The ensemble generated in this way may be used to estimate uncertainty at larger spatial scales. A case study of daily rainfall monitoring in The Gambia, West Africa, for the purpose of crop yield forecasting is presented to illustrate the method.
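A minimal sketch, under simplifying assumptions, of the ensemble-generation step described above: a spatially correlated Gaussian field is drawn (here by Cholesky factorization of an exponential covariance, standing in for the geostatistical sequential simulation used in the paper), thresholded to reproduce a calibrated probability of rainfall occurrence, and mapped through a calibrated gamma distribution of rainfall amount. Grid size, correlation length and the calibration parameters (p_rain, a, scale) are illustrative placeholders, not values from the study.

```python
import numpy as np
from scipy.stats import norm, gamma

rng = np.random.default_rng(0)
n = 20                                     # 20 x 20 grid of pixels
xs, ys = np.meshgrid(np.arange(n), np.arange(n))
pts = np.column_stack([xs.ravel(), ys.ravel()])

# Exponential covariance with an assumed correlation length of 5 pixels
d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=2)
C = np.exp(-d / 5.0)
z = np.linalg.cholesky(C + 1e-8 * np.eye(n * n)) @ rng.standard_normal(n * n)

u = norm.cdf(z)                            # spatially correlated uniforms in (0, 1)
p_rain = 0.4                               # assumed calibrated P(rain | satellite value)
wet = u > (1.0 - p_rain)                   # spatially intermittent rain mask
rain = np.zeros(n * n)
# Map wet-pixel uniforms through an assumed calibrated gamma amount distribution
rain[wet] = gamma(a=0.8, scale=10.0).ppf((u[wet] - (1.0 - p_rain)) / p_rain)
field = rain.reshape(n, n)                 # one ensemble member; repeat for more
print(f"wet fraction {wet.mean():.2f}, mean rainfall {field.mean():.1f} mm")
```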
Abstract:
The long-term stability, high accuracy, all-weather capability, high vertical resolution, and global coverage of Global Navigation Satellite System (GNSS) radio occultation (RO) suggest it as a promising tool for global monitoring of atmospheric temperature change. With the aim of investigating and quantifying how well a GNSS RO observing system is able to detect climate trends, we are currently performing a (climate) observing system simulation experiment over the 25-year period 2001 to 2025, which involves quasi-realistic modeling of the neutral atmosphere and the ionosphere. We carried out two climate simulations with the general circulation model MAECHAM5 (Middle Atmosphere European Centre/Hamburg Model Version 5) of the MPI-M Hamburg, covering the period 2001–2025: one control run with natural variability only, and one run also including anthropogenic forcings due to greenhouse gases, sulfate aerosols, and tropospheric ozone. On this basis, we perform quasi-realistic simulations of RO observables for a small GNSS receiver constellation (six satellites), state-of-the-art data processing for atmospheric profile retrieval, and a statistical analysis of temperature trends in both the “observed” climatology and the “true” climatology. Here we describe the setup of the experiment and results from a test bed study conducted to obtain a basic set of realistic estimates of observational errors (instrument- and retrieval-processing-related errors) and sampling errors (due to spatial-temporal undersampling). The test bed results, obtained for a typical summer season and compared to the climatic 2001–2025 trends from the MAECHAM5 simulation including anthropogenic forcing, were found to be encouraging for performing the full 25-year experiment. They indicated that observational and sampling errors (each contributing about 0.2 K) are consistent with recent estimates of these errors from real RO data and that they should be sufficiently small for monitoring expected temperature trends in the global atmosphere over the next 10 to 20 years in most regions of the upper troposphere and lower stratosphere (UTLS). Inspection of the MAECHAM5 trends in different RO-accessible atmospheric parameters (microwave refractivity and pressure/geopotential height in addition to temperature) indicates complementary climate change sensitivity in different regions of the UTLS, so that optimized climate monitoring should combine information from all climatic key variables retrievable from GNSS RO data.
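To give a feel for the trend-detection step at the core of such an experiment, here is an illustrative sketch (not the authors' processing chain): fit a linear trend to a 25-year mean UTLS temperature series and compare it with its standard error. The assumed trend of 0.03 K/yr, the 0.15 K interannual variability, and the roughly 0.2 K combined observational and sampling error quoted above are placeholders.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
years = np.arange(2001, 2026)
truth = 0.03 * (years - 2001)                  # assumed anthropogenic trend, K
natural = rng.normal(0.0, 0.15, years.size)    # assumed interannual variability, K
obs_err = rng.normal(0.0, 0.2, years.size)     # observational + sampling error, K
observed = truth + natural + obs_err

# Ordinary least-squares trend and its significance for the "observed" series
res = stats.linregress(years, observed)
print(f"trend = {res.slope:.3f} +/- {res.stderr:.3f} K/yr, p = {res.pvalue:.3f}")
```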
Abstract:
A recent area for investigation into the development of adaptable robot control is the use of living neuronal networks to control a mobile robot. The so-called Animat paradigm comprises a neuronal network (the ‘brain’) connected to an external embodiment (in this case a mobile robot), facilitating potentially robust, adaptable robot control and increased understanding of neural processes. Sensory input from the robot is provided to the neuronal network via stimulation on a number of electrodes embedded in a specialist Petri dish known as a multi-electrode array (MEA); accurate control of this stimulation is vital. We present software tools allowing precise, near real-time control of electrical stimulation on MEAs, with fast switching between electrodes and the application of custom stimulus waveforms. These Linux-based tools are compatible with the widely used MEABench data acquisition system. Benefits include rapid stimulus modulation in response to neuronal activity (closed loop) and batch processing of stimulation protocols.
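To make the closed-loop idea concrete, here is a hypothetical sketch of the sense-decide-stimulate cycle such tools enable. The MEADevice class and its methods are invented stand-ins for illustration only; they are not the MEABench or stimulation-tool API described in the paper.

```python
import time
import numpy as np

class MEADevice:
    """Stand-in for a stimulator/acquisition driver; hypothetical, not the real API."""
    def read_spike_counts(self, window_s):
        # Placeholder: per-electrode spike counts seen in the last window
        return np.random.poisson(2.0, size=60)

    def stimulate(self, electrode, waveform):
        # Placeholder: deliver the custom waveform on the chosen electrode
        pass

dev = MEADevice()
# A biphasic pulse as an example custom stimulus waveform (amplitudes in volts)
biphasic = np.concatenate([np.full(40, -0.5), np.full(40, 0.5)])

for _ in range(100):                       # closed loop: sense -> decide -> stimulate
    counts = dev.read_spike_counts(window_s=0.1)
    target = int(np.argmax(counts))        # e.g. respond on the most active electrode
    dev.stimulate(target, biphasic)
    time.sleep(0.1)                        # ~100 ms near-real-time cycle
```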
Abstract:
Classical computer vision methods can only weakly emulate the multi-level parallelism in signal processing and information sharing that takes place in different parts of the primate visual system, which enables it to accomplish many diverse functions of visual perception. One of the main functions of primate vision is to detect and recognise objects in natural scenes despite all the linear and non-linear variations of the objects and their environment. The superior performance of the primate visual system compared to what machine vision systems have achieved to date motivates scientists and researchers to further explore this area in pursuit of more efficient vision systems inspired by natural models. In this paper, building blocks for a hierarchical, efficient object recognition model are proposed. Incorporating attention-based processing would lead to a system that processes the visual data in a non-linear way, focusing only on the regions of interest and hence reducing the processing time needed to achieve real-time performance. Further, it is suggested to modify the visual cortex model for recognizing objects by adding non-linearities in the ventral path, consistent with earlier discoveries reported by researchers in the neurophysiology of vision.
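As a rough illustration of the attention-based processing advocated above (not the authors' model), the sketch below computes a crude center-surround saliency map and restricts the expensive recognition stage to the few most salient regions; the image, patch size and the omitted classify step are all placeholders.

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(2)
img = rng.random((128, 128))                       # placeholder grayscale scene

center = ndimage.gaussian_filter(img, sigma=1)     # fine-scale response
surround = ndimage.gaussian_filter(img, sigma=8)   # coarse-scale response
saliency = np.abs(center - surround)               # center-surround contrast

patch = 16                                         # tile the image into 8 x 8 patches
scores = saliency.reshape(8, patch, 8, patch).mean(axis=(1, 3))
for k in np.argsort(scores.ravel())[::-1][:4]:     # four most salient patches only
    i, j = divmod(int(k), 8)
    roi = img[i * patch:(i + 1) * patch, j * patch:(j + 1) * patch]
    # classify(roi)  <- the expensive recognition stage runs only on these regions
    print(f"attend to patch ({i}, {j}), saliency {scores[i, j]:.3f}")
```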
Abstract:
We describe a new methodology for comparing satellite radiation budget data with a numerical weather prediction (NWP) model. This is applied to data from the Geostationary Earth Radiation Budget (GERB) instrument on Meteosat-8. The methodology brings together, in near-real time, GERB broadband shortwave and longwave fluxes with simulations based on analyses produced by the Met Office global NWP model. Results for the period May 2003 to February 2005 illustrate the progressive improvements in the data products as various initial problems were resolved. In most areas the comparisons reveal systematic errors in the model's representation of surface properties and clouds, which are discussed elsewhere. However, for clear-sky regions over the oceans the model simulations are believed to be sufficiently accurate to allow the quality of the GERB fluxes themselves to be assessed and any changes over time in the performance of the instrument to be identified. Using model and radiosonde profiles of temperature and humidity as input to a single-column version of the model's radiation code, we conduct sensitivity experiments which provide estimates of the expected model errors over the ocean of about ±5–10 W m−2 in clear-sky outgoing longwave radiation (OLR) and ±0.01 in clear-sky albedo. For the more recent data the differences between the observed and modeled OLR and albedo are well within these error estimates. The close agreement between the observed and modeled values, particularly for the most recent period, illustrates the value of the methodology. It also contributes to the validation of the GERB products and increases confidence in the quality of the data, prior to their release.
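A sketch of the comparison statistic implied above, with synthetic arrays standing in for collocated data (this is not GERB processing code): given observed and modelled clear-sky ocean OLR, compute the mean difference and check it against the quoted ±5–10 W m−2 model-error band.

```python
import numpy as np

rng = np.random.default_rng(3)
olr_model = 280.0 + rng.normal(0, 8, 1000)       # W m-2, placeholder simulations
olr_gerb = olr_model + rng.normal(3, 5, 1000)    # W m-2, placeholder observations

diff = olr_gerb - olr_model
bias, rms = diff.mean(), np.sqrt((diff ** 2).mean())
within = abs(bias) <= 10.0                       # inside the estimated model-error band?
print(f"bias = {bias:.1f} W m-2, rms = {rms:.1f} W m-2, within band: {within}")
```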
Abstract:
This study examines the efficacy of published δ18O data from the calcite of Late Miocene surface dwelling planktonic foraminifer shells, for sea surface temperature (SST) estimates for the pre-Quaternary. The data are from 33 Late Miocene (Messinian) marine sites from a modern latitudinal gradient of 64°N to 48°S. They give estimates of SSTs in the tropics/subtropics (to 30°N and S) that are mostly cooler than present. Possible causes of this temperature discrepancy are ecological factors (e.g. calcification of shells at levels below the ocean mixed layer), taphonomic effects (e.g. diagenesis or dissolution), inaccurate estimation of Late Miocene seawater oxygen isotope composition, or a real Late Miocene cool climate. The scale of apparent cooling in the tropics suggests that the SST signal of the foraminifer calcite has been reset, at least in part, by early diagenetic calcite with higher δ18O, formed in the foraminifer shells in cool sea bottom pore waters, probably coupled with the effects of calcite formed below the mixed layer during the life of the foraminifera. This hypothesis is supported by the markedly cooler SST estimates from low latitudes—in some cases more than 9 °C cooler than present—where the gradients of temperature and the δ18O composition of seawater between sea surface and sea bottom are most marked, and where ocean surface stratification is high. At higher latitudes, particularly N and S of 30°, the temperature signal is still cooler, though maximum temperature estimates overlap with modern SSTs N and S of 40°. Comparison of Late Miocene SST estimates from alkenone unsaturation analysis in the eastern tropical Atlantic at Ocean Drilling Program (ODP) Site 958, which suggest a sea surface warmer than present by 2–4 °C, with oxygen isotope estimates at Deep Sea Drilling Project (DSDP) Site 366 and ODP Site 959, which indicate cooler than present SSTs, also suggests a significant impact on the δ18O signal. Nevertheless, much of the original SST variation is clearly preserved in the primary calcite formed in the mixed layer, and records secular and temporal oceanographic changes at the sea surface, such as movement of the Antarctic Polar Front in the Southern Ocean. Cooler SSTs in the tropics and sub-tropics are also consistent with the Late Miocene latitudinal contraction of the coral reef belt and with interrupted reef growth on the Queensland Plateau of eastern Australia, though it is not possible to quantify absolute SSTs with the existing oxygen isotope data. Reconstruction of an accurate global SST dataset for Neogene time-slices from the existing published DSDP/ODP isotope data, for use in general circulation models, may require a detailed re-assessment of taphonomy at many sites.
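For context, SST estimates of this kind rest on a calcite-seawater oxygen-isotope palaeotemperature relation. The sketch below uses the widely cited quadratic calibration of Shackleton (1974); the δ18O values are illustrative, not data from the study, and show how a small diagenetic increase in calcite δ18O translates into an apparently cooler SST.

```python
def sst_from_d18o(d18o_calcite, d18o_seawater):
    """Temperature (deg C) from calcite and seawater d18O (per mil), Shackleton (1974)."""
    d = d18o_calcite - d18o_seawater
    return 16.9 - 4.38 * d + 0.10 * d * d

# A diagenetic overprint raising calcite d18O by ~0.5 per mil cools the apparent SST:
print(sst_from_d18o(-1.5, 0.0))   # ~23.7 deg C from primary mixed-layer calcite
print(sst_from_d18o(-1.0, 0.0))   # ~21.4 deg C after the overprint
```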
Abstract:
Recent severe flooding in the UK has highlighted the need for better information on flood risk, increasing the pressure on engineers to enhance the capabilities of computer models for flood prediction. This paper evaluates the benefits to be gained from the use of remotely sensed data to support flood modelling. The available remotely sensed data can be used either to produce high-resolution digital terrain models (DTMs), from light detection and ranging (Lidar) data, or to generate accurate inundation maps of past flood events, from airborne synthetic aperture radar (SAR) data and aerial photography. The paper reports on the modelling of real flood events that occurred at two UK sites on the rivers Severn and Ouse. At these sites a combination of remotely sensed data and recorded hydrographs was available. It is concluded, first, that Lidar-generated DTMs support the generation of considerably better models and enhance the visualisation of model results, and, second, that flood outlines obtained from airborne SAR or aerial images help develop an appreciation of the hydraulic behaviour of important model components and facilitate model validation. The need for further research is highlighted by a number of limitations, namely: the difficulties in obtaining an adequate representation of hydraulically important features such as embankment crests and walls; uncertainties in the validation data; and difficulties in extracting flood outlines from airborne SAR images in urban areas.
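One common way such flood outlines support model validation is through an areal fit statistic. The sketch below, with random placeholder masks standing in for real SAR and model output, computes the widely used measure F: the ratio of correctly predicted wet area to the union of observed and modelled wet areas, so F = 1 is a perfect match.

```python
import numpy as np

rng = np.random.default_rng(4)
observed = rng.random((100, 100)) > 0.6     # wet/dry mask from airborne SAR (placeholder)
modelled = rng.random((100, 100)) > 0.6     # wet/dry mask from the flood model (placeholder)

intersection = np.logical_and(observed, modelled).sum()
union = np.logical_or(observed, modelled).sum()
print(f"flood-extent fit F = {intersection / union:.2f}")
```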
Abstract:
The influence matrix is used in ordinary least-squares applications for monitoring statistical multiple-regression analyses. Concepts related to the influence matrix provide diagnostics on the influence of individual data on the analysis: the analysis change that would occur by leaving one observation out, and the effective information content (degrees of freedom for signal) in any subset of the analysed data. In this paper, the corresponding concepts have been derived in the context of linear statistical data assimilation in numerical weather prediction. An approximate method to compute the diagonal elements of the influence matrix (the self-sensitivities) has been developed for a large-dimension variational data assimilation system (the four-dimensional variational system of the European Centre for Medium-Range Weather Forecasts). Results show that, in the boreal spring 2003 operational system, 15% of the global influence is due to the assimilated observations in any one analysis, and the complementary 85% is the influence of the prior (background) information, a short-range forecast containing information from earlier assimilated observations. About 25% of the observational information is currently provided by surface-based observing systems, and 75% by satellite systems. Low-influence data points usually occur in data-rich areas, while high-influence data points are in data-sparse areas or in dynamically active regions. Background-error correlations also play an important role: high correlation diminishes the observation influence and amplifies the importance of the surrounding real and pseudo observations (prior information in observation space). Incorrect specifications of background and observation-error covariance matrices can be identified, interpreted and better understood by the use of influence-matrix diagnostics for the variety of observation types and observed variables used in the data assimilation system.
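A toy linear-analysis illustration of these diagnostics (not the ECMWF 4D-Var code): for an analysis x_a = x_b + K(y − Hx_b) with gain K = BHᵀ(HBHᵀ + R)⁻¹, the influence matrix is S = HK, its diagonal elements are the self-sensitivities, and its trace is the degrees of freedom for signal; the dimensions and covariances below are arbitrary assumptions.

```python
import numpy as np

n, m = 5, 3                              # state and observation dimensions (toy)
H = np.eye(m, n)                         # observe the first 3 state components
B = 0.5 * np.eye(n)                      # background-error covariance (assumed)
R = 0.2 * np.eye(m)                      # observation-error covariance (assumed)

K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)   # Kalman gain
S = H @ K                                       # influence matrix in observation space
print("self-sensitivities:", np.round(np.diag(S), 3))
print("observation influence (DFS):", round(np.trace(S), 3))
print("background influence:", round(m - np.trace(S), 3))
```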
Abstract:
Leaf blotch, caused by Rhynchosporium secalis, was studied in a range of winter barley cultivars using a combination of traditional plant pathological techniques and newly developed multiplex and real-time polymerase chain reaction (PCR) assays. Using PCR, symptomless leaf blotch colonization was shown to occur throughout the growing season in the resistant winter barley cv. Leonie. The dynamics of colonization throughout the growing season were similar in both Leonie and Vertige, a susceptible cultivar. However, pathogen DNA levels were approximately 10-fold higher in the susceptible cultivar, which expressed symptoms throughout the growing season. Visual assessments and PCR were also used to determine levels of R. secalis colonization and infection in samples from a field experiment used to test a range of winter barley cultivars with different levels of leaf blotch resistance. The correlation between the PCR and visual assessment data was better at higher infection levels (R² = 0.81 for leaf samples with >0.3% disease). Although resistance ratings did not correlate well with levels of disease for all cultivars tested, low levels of infection were observed in the cultivar with the highest resistance rating and high levels of infection in the cultivar with the lowest resistance rating.
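A sketch of the kind of correlation check reported above, using synthetic numbers rather than the study data: regress real-time PCR pathogen-DNA levels against visual disease scores, restricted to leaves with >0.3% disease as in the abstract, and report R².

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
visual = rng.uniform(0.0, 5.0, 80)                  # % leaf area with symptoms (synthetic)
dna = 10 ** (0.8 * np.log10(visual + 0.01) + rng.normal(0, 0.2, 80))  # synthetic DNA levels

mask = visual > 0.3                                 # higher-infection samples only
res = stats.linregress(np.log10(visual[mask]), np.log10(dna[mask]))
print(f"R^2 = {res.rvalue ** 2:.2f}")
```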
Abstract:
The relationship between speed and crashes has been well established in the literature, with the consequence that speed reduction through enforcement or other means should lead to a reduction in crashes. The extent to which the public regard speeding as a problem that requires enforcement is less clear. Analysis was conducted on public perceptions of antisocial behaviors including speeding traffic. The data were collected as part of the British Crime Survey, a face-to-face interview with UK residents on issues relating to crime. The antisocial behavior section required participants to state the degree to which they perceived 16 antisocial behaviors to be a problem in their area. Results revealed that speeding traffic was perceived as the greatest problem in local communities, regardless of whether respondents were male or female, young, middle-aged, or old. The rating of speeding traffic as the greatest problem in the community was replicated in a second, smaller postal survey, where respondents also provided strong support for enforcement on residential roads, and indicated that traveling immediately above the speed limit on residential roads was unacceptable. Results are discussed in relation to practical implications for speed enforcement, and the prioritization of limited police resources.
Abstract:
There is remarkable agreement in expectations today for vastly improved ocean data management a decade from now -- capabilities that will help to bring significant benefits to ocean research and to society. Advancing data management to such a degree, however, will require cultural and policy changes that are slow to effect. The technological foundations upon which data management systems are built are certain to continue advancing rapidly in parallel. These considerations argue for adopting attitudes of pragmatism and realism when planning data management strategies. In this paper we adopt those attitudes as we outline opportunities for progress in ocean data management. We begin with a synopsis of expectations for integrated ocean data management a decade from now. We discuss factors that should be considered by those evaluating candidate “standards”. We highlight challenges and opportunities in a number of technical areas, including “Web 2.0” applications, data modeling, data discovery and metadata, real-time operational data, archival of data, biological data management and satellite data management. We discuss the importance of investments in the development of software toolkits to accelerate progress. We conclude the paper by recommending a few specific, short-term implementation targets that we believe to be both significant and achievable, and by calling for action by community leadership to effect these advancements.
Abstract:
The modelling of nonlinear stochastic dynamical processes from data involves solving the problems of data gathering, preprocessing, model architecture selection, learning or adaptation, parametric evaluation and model validation. For a given model architecture, such as associative memory networks, a common problem in nonlinear modelling is the "curse of dimensionality". A series of complementary data-based constructive identification schemes, mainly based on, but not limited to, operating-point-dependent fuzzy models, are introduced in this paper with the aim of overcoming the curse of dimensionality. These include (i) a mixture-of-experts algorithm based on a forward constrained regression algorithm; (ii) an inherently parsimonious piecewise local linear modelling concept based on a Delaunay input-space partition; (iii) a neurofuzzy model constructive approach based on forward orthogonal least squares and optimal experimental design; and finally (iv) a neurofuzzy model construction algorithm based on basis functions that are Bézier–Bernstein polynomial functions and the additive decomposition. Illustrative examples demonstrate their applicability, showing that the final major hurdle in data-based modelling has almost been removed.
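A minimal sketch, under simplifying assumptions, of forward orthogonal least squares term selection, one of the constructive schemes listed above: at each step the candidate regressor with the largest error-reduction ratio, once orthogonalized against the terms already chosen, is added to the model. The candidate matrix and target here are synthetic.

```python
import numpy as np

rng = np.random.default_rng(6)
X = rng.standard_normal((200, 10))          # outputs of 10 candidate basis functions
y = 2.0 * X[:, 3] - X[:, 7] + 0.1 * rng.standard_normal(200)

selected, Q = [], []
for _ in range(2):                          # grow a two-term model
    best_j, best_err, best_w = None, -1.0, None
    for j in range(X.shape[1]):
        if j in selected:
            continue
        w = X[:, j].copy()
        for q in Q:                         # orthogonalize against chosen regressors
            w = w - (q @ w) / (q @ q) * q
        err = (w @ y) ** 2 / ((w @ w) * (y @ y))   # error-reduction ratio
        if err > best_err:
            best_j, best_err, best_w = j, err, w
    selected.append(best_j)
    Q.append(best_w)

print("selected terms:", selected)          # expected to recover columns 3 and 7
```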
Abstract:
In a world of almost permanent and rapidly increasing electronic data availability, techniques for filtering, compressing, and interpreting these data to transform them into valuable and easily comprehensible information are of utmost importance. One key topic in this area is the capability to deduce future system behavior from a given data input. This book brings together for the first time the complete theory of data-based neurofuzzy modelling and the linguistic attributes of fuzzy logic in a single cohesive mathematical framework. After introducing the basic theory of data-based modelling, new concepts including extended additive and multiplicative submodels are developed, and their extensions to state estimation and data fusion are derived. All these algorithms are illustrated with benchmark and real-life examples to demonstrate their efficiency. Chris Harris and his group have carried out pioneering work which has tied together the fields of neural networks and linguistic rule-based algorithms. This book is aimed at researchers and scientists in time series modeling, empirical data modeling, knowledge discovery, data mining, and data fusion.
Abstract:
We are developing computational tools supporting the detailed analysis of the dependence of neural electrophysiological response on dendritic morphology. We approach this problem by combining simulations of faithful models of neurons (real-life experimental morphological data with known models of channel kinetics) with algorithmic extraction of morphological and physiological parameters and statistical analysis. In this paper, we present a novel method for the automatic recognition of spike trains in voltage traces, which eliminates the need for human intervention. This enables classification of waveforms with consistent criteria across all the analyzed traces and so amounts to a reduction of the noise in the data. The method allows for automatic extraction of relevant physiological parameters necessary for further statistical analysis. In order to illustrate the usefulness of this procedure for analyzing voltage traces, we characterized the influence of the somatic current injection level on several electrophysiological parameters in a set of modeled neurons. This application suggests that such algorithmic processing of physiological data extracts parameters in a form suitable for further investigation of structure-activity relationships in single neurons.
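A sketch of automatic spike detection of the general kind described (the threshold, the synthetic trace, and the injected spike shape are assumptions, not the authors' algorithm): find upward threshold crossings of a voltage trace and record spike times, from which parameters such as firing rate and inter-spike intervals follow.

```python
import numpy as np

dt = 0.1e-3                                     # 0.1 ms sampling step
t = np.arange(0, 1.0, dt)
rng = np.random.default_rng(7)
v = np.full(t.size, -65.0) + rng.normal(0, 0.5, t.size)   # noisy resting trace, mV
for spike_t in (0.1, 0.35, 0.6, 0.61, 0.9):     # inject stereotyped spikes (assumed shape)
    i = int(spike_t / dt)
    v[i:i + 20] += 80.0 * np.exp(-np.arange(20) / 5.0)

threshold = -20.0                               # mV, assumed detection level
crossings = np.flatnonzero((v[:-1] < threshold) & (v[1:] >= threshold))
spike_times = t[crossings + 1]
isi = np.diff(spike_times)                      # inter-spike intervals, s
print(f"{spike_times.size} spikes, mean rate = {spike_times.size / t[-1]:.1f} Hz")
print("inter-spike intervals (s):", np.round(isi, 3))
```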