749 results for Hydrological classification


Relevance: 20.00%

Abstract:

Various complex oscillatory processes are involved in the generation of the motor command. The temporal dynamics of these processes were studied for movement detection from single-trial electroencephalogram (EEG). Autocorrelation analysis was performed on the EEG signals to find robust markers of movement detection. The evolution of the autocorrelation function was characterised via its relaxation time, obtained by exponential curve fitting. It was observed that the time constant of the fitted exponential increased during movement, indicating that the autocorrelation function decays more slowly during motor execution. Significant differences were observed between movement and no-movement tasks. Additionally, a linear discriminant analysis (LDA) classifier was used to identify movement trials with a peak accuracy of 74%.
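
Below is a minimal sketch of the relaxation-time feature and LDA classification step described in this abstract, using synthetic signals in place of real single-trial EEG; the lag range, smoothing and trial counts used to fabricate the data are illustrative assumptions, not the paper's settings.

```python
import numpy as np
from scipy.optimize import curve_fit
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def relaxation_time(x, max_lag=100):
    """Fit an exponential decay to the normalised autocorrelation; return its time constant."""
    x = x - x.mean()
    acf = np.correlate(x, x, mode="full")[len(x) - 1:]
    acf = acf / acf[0]                                  # normalise so acf[0] == 1
    lags = np.arange(max_lag)
    tau, _ = curve_fit(lambda t, tc: np.exp(-t / tc), lags, acf[:max_lag], p0=[10.0])
    return tau[0]

rng = np.random.default_rng(0)
# Synthetic "trials": more heavily smoothed noise stands in for movement trials,
# whose autocorrelation decays more slowly.
movement = [np.convolve(rng.standard_normal(1000), np.ones(20) / 20, "same") for _ in range(30)]
rest     = [np.convolve(rng.standard_normal(1000), np.ones(5) / 5,  "same") for _ in range(30)]

X = np.array([[relaxation_time(tr)] for tr in movement + rest])
y = np.array([1] * 30 + [0] * 30)

lda = LinearDiscriminantAnalysis().fit(X, y)
print("training accuracy:", lda.score(X, y))
```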

Relevance: 20.00%

Abstract:

Increasing concentrations of greenhouse gases in the atmosphere are expected to modify the global water cycle with significant consequences for terrestrial hydrology. We assess the impact of climate change on hydrological droughts in a multimodel experiment including seven global impact models (GIMs) driven by bias-corrected climate from five global climate models under four representative concentration pathways (RCPs). Drought severity is defined as the fraction of land under drought conditions. Results show a likely increase in the global severity of hydrological drought at the end of the 21st century, with systematically greater increases for RCPs describing stronger radiative forcings. Under RCP8.5, droughts exceeding 40% of analyzed land area are projected by nearly half of the simulations. This increase in drought severity has a strong signal-to-noise ratio at the global scale, and Southern Europe, the Middle East, the Southeast United States, Chile, and South West Australia are identified as possible hotspots for future water security issues. The uncertainty due to GIMs is greater than that from global climate models, particularly when a GIM that accounts for the dynamic response of plants to CO2 and climate is included, as this model simulates little or no increase in drought frequency. Our study demonstrates that different representations of terrestrial water-cycle processes in GIMs are responsible for a much larger uncertainty in the response of hydrological drought to climate change than previously thought. When assessing the impact of climate change on hydrology, it is therefore critical to consider a diverse range of GIMs to better capture the uncertainty.
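
As a small worked example of the severity metric named in this abstract (the fraction of land under drought conditions), the sketch below applies a variable-threshold definition to synthetic gridded runoff; the 10th-percentile reference threshold, the grid and the random data are illustrative assumptions, not the study's multimodel setup.

```python
import numpy as np

rng = np.random.default_rng(1)
ref_runoff = rng.gamma(2.0, 1.0, size=(360, 45, 72))   # reference period: (time, lat, lon)
fut_runoff = rng.gamma(1.7, 1.0, size=(360, 45, 72))   # future scenario, slightly drier

# Drought threshold per cell: the 10th percentile of reference-period runoff,
# i.e. the value exceeded 90% of the time.
q10 = np.quantile(ref_runoff, 0.10, axis=0)

# A cell is "under drought" at a time step when runoff falls below its threshold.
under_drought = fut_runoff < q10                        # boolean (time, lat, lon)

# Area weights proportional to cos(latitude) so high-latitude cells do not dominate.
lats = np.linspace(-87.5, 87.5, 45)
w = np.cos(np.deg2rad(lats))[None, :, None]
w = np.broadcast_to(w, under_drought.shape)

severity = (under_drought * w).sum(axis=(1, 2)) / w.sum(axis=(1, 2))
print("mean fraction of land under drought:", severity.mean().round(3))
```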

Relevance: 20.00%

Abstract:

Future changes in runoff can have important implications for water resources and flooding. In this study, runoff projections from ISI-MIP (Inter-sectoral Impact Model Inter-comparison Project) simulations forced with HadGEM2-ES bias-corrected climate data under the Representative Concentration Pathway 8.5 have been analysed for differences between impact models. Projections of change from a baseline period (1981-2010) to the future (2070-2099) from 12 impact models which contributed to the hydrological and biomes sectors of ISI-MIP were studied. The biome models differed from the hydrological models by the inclusion of CO2 impacts, and most also included a dynamic vegetation distribution. The biome and hydrological models agreed on the sign of runoff change for most regions of the world. However, in West Africa, the hydrological models projected drying, while the biome models projected moistening. The biome models tended to produce larger increases and smaller decreases in regionally averaged runoff than the hydrological models, although there is large inter-model spread. The timing of runoff change was similar, but there were differences in magnitude, particularly at peak runoff. The impact of vegetation distribution change was much smaller than the projected change over time, while elevated CO2 had an effect as large as the magnitude of change over time projected by some models in some regions. The effect of CO2 on runoff was not consistent across the models, with two models showing increases and two decreases. There was also more spread in projections from the runs with elevated CO2 than with constant CO2. The biome models which gave increased runoff from elevated CO2 were also those which differed most from the hydrological models. Spatially, the regions with the largest differences between model types also tended to be those projected to be most affected by elevated CO2, and seasonal differences followed a similar pattern, so elevated CO2 can partly explain the differences between the hydrological and biome model runoff change projections. This shows that a range of impact models should be considered in order to capture the full range of uncertainty in impact studies.
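
The sketch below illustrates the basic quantity being compared in this abstract: regionally averaged runoff change from a baseline period to a future period for several impact models, plus a check of sign agreement. The model names and numbers are placeholders, not ISI-MIP output.

```python
import numpy as np

rng = np.random.default_rng(2)
models = ["hydro_1", "hydro_2", "biome_1", "biome_2"]

# Regional-mean annual runoff (mm/yr) for a 30-year baseline and a 30-year future period.
baseline = {m: rng.normal(500, 40, 30) for m in models}
future   = {m: rng.normal(480, 60, 30) for m in models}

changes = {m: 100.0 * (future[m].mean() - baseline[m].mean()) / baseline[m].mean()
           for m in models}
for m, c in changes.items():
    print(f"{m}: {c:+.1f}% change in regional mean runoff")

signs = np.sign(list(changes.values()))
print("models agree on sign of change:", bool(np.all(signs == signs[0])))
```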

Relevance: 20.00%

Abstract:

Seven catchments of diverse size in Mediterranean Europe were investigated in order to understand the main aspects of their hydrological functioning. The methods included the analysis of daily and monthly precipitation, monthly potential evapotranspiration rates, flow duration curves, rainfall-runoff relationships and, for the smaller and more heavily instrumented catchments, catchment-internal data. The results showed that the catchments were less dry than initially considered. Only one of them was truly semi-arid throughout the year. All the remaining catchments had wet seasons when precipitation exceeded potential evapotranspiration, allowing aquifer recharge, wet runoff generation mechanisms and a relevant baseflow contribution. Nevertheless, local infiltration-excess (Hortonian) overland flow was inferred during summer storms in some catchments, and urban overland flow in some others. Karstic groundwater, human disturbance and low winter temperatures were identified as having an important impact on the hydrological regime in some of the catchments.
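
The sketch below illustrates two of the analyses listed in this abstract, a flow duration curve and a simple annual rainfall-runoff relationship, on synthetic daily data; the catchment area, unit conversions and random series are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
days = 3 * 365
precip_mm = rng.gamma(0.4, 8.0, days)                  # daily precipitation (mm)
discharge_m3s = np.convolve(precip_mm, np.exp(-np.arange(30) / 7), "full")[:days] * 0.05

# Flow duration curve: discharge sorted in descending order vs. exceedance probability.
q_sorted = np.sort(discharge_m3s)[::-1]
exceedance = np.arange(1, days + 1) / (days + 1)
q95 = np.interp(0.95, exceedance, q_sorted)            # flow exceeded 95% of the time (low flow)
print(f"Q95 (low flow index): {q95:.2f} m3/s")

# Annual runoff coefficient: runoff depth over the catchment divided by precipitation depth.
area_km2 = 150.0                                       # assumed catchment area
runoff_mm = discharge_m3s * 86400 / (area_km2 * 1e6) * 1000
for yr in range(3):
    s = slice(yr * 365, (yr + 1) * 365)
    print(f"year {yr}: runoff coefficient = {runoff_mm[s].sum() / precip_mm[s].sum():.2f}")
```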

Relevance: 20.00%

Abstract:

When studying hydrological processes with a numerical model, global sensitivity analysis (GSA) is essential if one is to understand the impact of model parameters and model formulation on results. However, different definitions of sensitivity can lead to a difference in the ranking of importance of the different model factors. Here we combine a fuzzy performance function with different methods of calculating global sensitivity to perform a multi-method global sensitivity analysis (MMGSA). We illustrate this new methodology using an application of a finite element subsurface flow model (ESTEL-2D) to a flood inundation event on a floodplain of the River Severn. We demonstrate the utility of the method for model understanding and show how the prediction of state variables, such as Darcian velocity vectors, can be affected by such an MMGSA. This paper is a first attempt to use GSA with a numerically intensive hydrological model.
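
As a rough illustration of the multi-method idea, the sketch below ranks the parameters of a toy model with two different global sensitivity measures and compares the rankings; the toy model and both measures are assumptions for illustration, not the fuzzy performance function or the ESTEL-2D application used in the paper.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 5000
X = rng.uniform(0, 1, size=(n, 3))                     # three model "parameters"
Y = 4 * X[:, 0] + np.sin(6 * X[:, 1]) + 0.3 * rng.standard_normal(n)   # toy model output

def corr_rank(X, Y):
    """Rank parameters by absolute Pearson correlation with the output."""
    return np.argsort([-abs(np.corrcoef(X[:, i], Y)[0, 1]) for i in range(X.shape[1])])

def binned_variance_rank(X, Y, bins=20):
    """Rank by a crude first-order variance measure: variance of binned conditional means."""
    scores = []
    for i in range(X.shape[1]):
        idx = np.digitize(X[:, i], np.linspace(0, 1, bins + 1)[1:-1])
        cond_means = [Y[idx == b].mean() for b in range(bins)]
        scores.append(np.var(cond_means) / np.var(Y))
    return np.argsort([-s for s in scores])

print("ranking by correlation:     ", corr_rank(X, Y))
print("ranking by binned variance: ", binned_variance_rank(X, Y))
```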

Relevance: 20.00%

Abstract:

Information was collated on the seed storage behaviour of 67 tree species native to the Amazon rainforest of Brazil; 38 appeared to show orthodox, 23 recalcitrant and six intermediate seed storage behaviour. A previously developed double-criteria key, based on thousand-seed weight and seed moisture content at shedding, for estimating likely seed storage behaviour showed good agreement with these classifications. The key can therefore considerably aid the identification of seed storage behaviour.
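
A double-criteria key of this kind can be expressed as a simple decision function, sketched below; the threshold values are placeholders for illustration, not those of the published key.

```python
def likely_storage_behaviour(thousand_seed_weight_g: float, moisture_pct: float) -> str:
    """Return a provisional storage-behaviour class from two seed traits."""
    # Placeholder rule: small, dry seeds tend towards orthodox behaviour;
    # large, moist seeds towards recalcitrant; anything else is flagged for testing.
    if thousand_seed_weight_g < 100 and moisture_pct < 20:
        return "probably orthodox"
    if thousand_seed_weight_g > 500 and moisture_pct > 40:
        return "probably recalcitrant"
    return "intermediate / requires experimental confirmation"

for tsw, mc in [(12.0, 9.0), (800.0, 48.0), (150.0, 30.0)]:
    print(f"TSW={tsw} g, moisture={mc}% -> {likely_storage_behaviour(tsw, mc)}")
```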

Relevance: 20.00%

Abstract:

This paper discusses ECG classification after parametrizing the ECG waveforms in the wavelet domain. The aim of the work is to develop an accurate classification algorithm that can be used to diagnose cardiac beat abnormalities detected using a mobile platform such as smart-phones. Continuous-time recurrent neural network classifiers are considered for this task. Records from the European ST-T Database are decomposed in the wavelet domain using discrete wavelet transform (DWT) filter banks, and the resulting DWT coefficients are filtered and used as inputs for training the neural network classifier. Advantages of the proposed methodology are the reduced memory requirement for the signals, which is of relevance to mobile applications, as well as an improvement in the generalization ability of the neural network due to the more parsimonious representation of the signal at its inputs.
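
The sketch below illustrates the wavelet-domain parametrisation step described here: a waveform is decomposed with a DWT filter bank and the coefficients are reduced to a compact feature vector. The synthetic waveform, wavelet choice and summary features are illustrative assumptions; a real pipeline would use annotated ECG beats from the database.

```python
import numpy as np
import pywt

fs = 250                                               # assumed sampling rate (Hz)
t = np.arange(0, 2, 1 / fs)
beat = np.exp(-((t - 1.0) ** 2) / 0.001)               # crude stand-in for a QRS complex
signal = beat + 0.05 * np.random.default_rng(5).standard_normal(t.size)

# 4-level discrete wavelet transform; returns [cA4, cD4, cD3, cD2, cD1].
coeffs = pywt.wavedec(signal, "db4", level=4)

# Parsimonious representation: a few summary statistics per sub-band instead of raw samples.
features = np.concatenate([[c.mean(), c.std(), np.abs(c).max()] for c in coeffs])
print("original samples:", signal.size, "-> feature vector length:", features.size)
```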

Relevance: 20.00%

Abstract:

This paper discusses ECG signal classification after parametrizing the ECG waveforms in the wavelet domain. Signal decomposition using perfect reconstruction quadrature mirror filter banks can provide a very parsimonious representation of ECG signals. In the current work, the filter parameters are adjusted by a numerical optimization algorithm in order to minimize a cost function associated with the filter cut-off sharpness. The goal is to achieve a better compromise between frequency selectivity and time resolution at each decomposition level than standard orthogonal filter banks such as those of the Daubechies and Coiflet families. Our aim is to optimally decompose the signals in the wavelet domain so that they can subsequently be used as inputs for training a neural network classifier.
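
The sketch below shows one plausible way to score the cut-off sharpness of a wavelet decomposition low-pass filter from its frequency response, the kind of quantity such a cost function might build on; the transition-band definition (from -1 dB to -20 dB) is an illustrative assumption, not the cost actually minimised in the paper.

```python
import numpy as np
import pywt
from scipy.signal import freqz

def transition_width(dec_lo, n_freq=2048):
    """Width of the band where the magnitude response falls from -1 dB to -20 dB."""
    w, h = freqz(dec_lo, worN=n_freq)
    mag_db = 20 * np.log10(np.abs(h) / np.abs(h).max() + 1e-12)
    above = w[mag_db > -1.0]           # end of the (approximate) passband
    below = w[mag_db < -20.0]          # start of the (approximate) stopband
    return below.min() - above.max()   # smaller is sharper

for name in ["db2", "db4", "coif2", "coif4"]:
    cost = transition_width(pywt.Wavelet(name).dec_lo)
    print(f"{name}: transition width = {cost:.3f} rad/sample")
```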

Relevance: 20.00%

Abstract:

Advances in hardware and software technologies make it possible to capture streaming data. The area of Data Stream Mining (DSM) is concerned with the analysis of these vast amounts of data as they are generated in real time. Data stream classification is one of the most important DSM techniques, allowing previously unseen data instances to be classified. Unlike traditional classifiers for static data, data stream classifiers need to adapt to concept changes (concept drift) in the stream in real time in order to reflect the most recent concept in the data as accurately as possible. A recent addition to the data stream classifier toolbox is eRules, which induces and updates a set of expressive rules that can easily be interpreted by humans. However, like most rule-based data stream classifiers, eRules exhibits poor computational performance when confronted with continuous attributes. In this work, we propose an approach to deal with continuous data effectively and accurately in rule-based classifiers by using the Gaussian distribution as a heuristic for building rule terms on continuous attributes. We show, on the example of eRules, that incorporating our method for continuous attributes indeed speeds up the real-time rule induction process while maintaining a similar level of accuracy compared with the original eRules classifier. We term this new version of eRules, incorporating our approach, G-eRules.
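
The sketch below illustrates the general idea of using a fitted Gaussian to form a rule term on a continuous attribute; the interval width (mean plus/minus one standard deviation) and the coverage check are illustrative assumptions, not the exact heuristic used in G-eRules.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(6)
# Toy data: one continuous attribute, two classes with different means.
x_pos = rng.normal(5.0, 1.0, 200)           # attribute values for the target class
x_neg = rng.normal(2.0, 1.5, 300)           # attribute values for the other classes

mu, sigma = x_pos.mean(), x_pos.std(ddof=1)
low, high = mu - sigma, mu + sigma           # candidate rule term: low <= attribute <= high

def coverage(x, low, high):
    return np.mean((x >= low) & (x <= high))

print(f"rule term: {low:.2f} <= attribute <= {high:.2f}")
print(f"covers {coverage(x_pos, low, high):.0%} of target class, "
      f"{coverage(x_neg, low, high):.0%} of other classes")
# The fitted density can also score how 'typical' a new value is for the target class.
print("density at x=4.8:", norm.pdf(4.8, mu, sigma).round(3))
```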

Relevance: 20.00%

Abstract:

Advances in hardware technologies make it possible to capture and process data in real time, and the resulting high-throughput data streams require novel data mining approaches. The research area of Data Stream Mining (DSM) develops data mining algorithms that allow us to analyse these continuous streams of data in real time. The creation and real-time adaptation of classification models from data streams is one of the most challenging DSM tasks. Current classifiers for streaming data address this problem by using incremental learning algorithms. However, even though these algorithms are fast, they are challenged by high-velocity data streams, where data instances arrive at a fast rate. This is problematic for applications that require little or no delay between changes in the patterns of the stream and the absorption of these patterns by the classifier. Problems of scalability to Big Data of traditional data mining algorithms for static (non-streaming) datasets have been addressed through the development of parallel classifiers. However, there is very little work on the parallelisation of data stream classification techniques. In this paper we investigate K-Nearest Neighbours (KNN) as the basis for a real-time adaptive and parallel methodology for scalable data stream classification tasks.
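
The sketch below illustrates the basic ingredients discussed here, a sliding-window KNN classifier for a labelled stream whose distance computations are split across worker threads; the window size, k and thread pool are illustrative choices, not the paper's parallel architecture.

```python
import numpy as np
from collections import deque
from concurrent.futures import ThreadPoolExecutor

class SlidingWindowKNN:
    """KNN over a sliding window of recent labelled instances, with parallel distances."""

    def __init__(self, k=5, window=1000, n_workers=4):
        self.k = k
        self.n_workers = n_workers
        self.window = deque(maxlen=window)       # old instances drop out (concept drift)
        self.pool = ThreadPoolExecutor(n_workers)

    def learn_one(self, x, y):
        self.window.append((np.asarray(x, dtype=float), y))

    def predict_one(self, x):
        X = np.array([xi for xi, _ in self.window])
        y = np.array([yi for _, yi in self.window])
        dists = np.empty(len(X))
        chunks = np.array_split(np.arange(len(X)), self.n_workers)

        def work(idx):                           # each worker fills one chunk of distances
            dists[idx] = np.linalg.norm(X[idx] - x, axis=1)

        list(self.pool.map(work, chunks))
        nearest = y[np.argsort(dists)[: self.k]]
        return np.bincount(nearest).argmax()     # majority vote among the k nearest

rng = np.random.default_rng(7)
clf = SlidingWindowKNN()
for _ in range(2000):                            # simulate a labelled stream
    label = int(rng.integers(0, 2))
    clf.learn_one(rng.normal(label * 2.0, 1.0, size=3), label)
print("prediction near class 1:", clf.predict_one(np.array([2.0, 2.0, 2.0])))
```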

Relevance: 20.00%

Abstract:

The skill of a forecast can be assessed by comparing the relative proximity of both the forecast and a benchmark to the observations. Example benchmarks include climatology or a naïve forecast. Hydrological ensemble prediction systems (HEPS) are currently transforming the hydrological forecasting environment, but in this new field there is little information to guide researchers and operational forecasters on how benchmarks can best be used to evaluate their probabilistic forecasts. In this study, it is shown that the calculated forecast skill can vary depending on the benchmark selected and that the selection of a benchmark for determining forecasting system skill is sensitive to a number of hydrological and system factors. A benchmark intercomparison experiment is then undertaken using the continuous ranked probability score (CRPS), a reference forecasting system and a suite of 23 different methods to derive benchmarks. The benchmarks are assessed within the operational set-up of the European Flood Awareness System (EFAS) to determine those that are ‘toughest to beat’ and so give the most robust discrimination of forecast skill, particularly for the spatial average fields that EFAS relies upon. Evaluating against an observed discharge proxy, the benchmark with the most utility for EFAS, and the one that best avoids naïve skill across different hydrological situations, is found to be meteorological persistency. This benchmark uses the latest meteorological observations of precipitation and temperature to drive the hydrological model. Hydrological long-term average benchmarks, which are currently used in EFAS, are very easily beaten by the forecasting system, and their use produces considerable naïve skill. When decomposed into seasons, the advanced meteorological benchmarks, which make use of meteorological observations from the past 20 years at the same calendar date, have the best skill discrimination. They are also good at discriminating skill in low flows and for all catchment sizes. Simpler meteorological benchmarks are particularly useful for high flows. Recommendations for EFAS are to move to routine use of meteorological persistency, an advanced meteorological benchmark and a simple meteorological benchmark in order to provide a robust evaluation of forecast skill. This work provides the first comprehensive evidence on how benchmarks can be used to evaluate skill in probabilistic hydrological forecasts and which benchmarks are most useful for skill discrimination and avoidance of naïve skill in a large-scale HEPS. It is recommended that all HEPS use the evidence and methodology provided here to evaluate which benchmarks to employ, so that forecasters can trust their skill evaluation and have confidence that their forecasts are indeed better.
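
The sketch below shows how a benchmark-based skill score can be computed with the CRPS: the CRPS of an ensemble forecasting system is compared with that of a benchmark. The synthetic ensembles and the persistence-style benchmark are illustrative assumptions, not the EFAS set-up or its 23 benchmark methods.

```python
import numpy as np

def crps_ensemble(ens, obs):
    """CRPS of an ensemble forecast for a single observation (lower is better)."""
    ens = np.asarray(ens, dtype=float)
    term1 = np.mean(np.abs(ens - obs))
    term2 = 0.5 * np.mean(np.abs(ens[:, None] - ens[None, :]))
    return term1 - term2

rng = np.random.default_rng(8)
n_days, n_members = 200, 20
obs = 50 + 10 * np.sin(np.linspace(0, 6 * np.pi, n_days)) + rng.normal(0, 2, n_days)

# "System": ensembles centred near the truth; "benchmark": yesterday's observation.
system_crps = np.mean([crps_ensemble(obs[t] + rng.normal(0, 3, n_members), obs[t])
                       for t in range(1, n_days)])
benchmark_crps = np.mean([crps_ensemble(np.full(n_members, obs[t - 1]), obs[t])
                          for t in range(1, n_days)])

skill = 1.0 - system_crps / benchmark_crps     # > 0 means the system beats the benchmark
print(f"CRPS system={system_crps:.2f}, benchmark={benchmark_crps:.2f}, skill score={skill:.2f}")
```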

Relevance: 20.00%

Abstract:

Three coupled knowledge transfer partnerships used pattern recognition techniques to produce an e-procurement system which, the National Audit Office reports, could save the National Health Service £500 million per annum. An extension to the system, GreenInsight, allows the environmental impact of procurements to be assessed and savings made. Both systems require suitable products to be discovered and equivalent products recognised, for which classification is a key component. This paper describes the innovative work done for product classification, feature selection and reducing the impact of mislabelled data.

Relevance: 20.00%

Abstract:

We extend extreme learning machine (ELM) classifiers to complex Reproducing Kernel Hilbert Spaces (RKHS) where the input/output variables as well as the optimization variables are complex-valued. A new family of classifiers, called complex-valued ELM (CELM), suitable for complex-valued multiple-input–multiple-output processing is introduced. In the proposed method, the associated Lagrangian is computed using induced RKHS kernels, adopting a Wirtinger calculus approach formulated as a constrained optimization problem, similarly to the conventional ELM classifier formulation. When training the CELM, the Karush–Kuhn–Tucker (KKT) theorem is used to solve the dual optimization problem, which consists of simultaneously satisfying the smallest-training-error and smallest-norm-of-output-weights criteria. The proposed formulation also addresses aspects of quaternary classification within a Clifford algebra context. For 2D complex-valued inputs, user-defined complex-coupled hyper-planes divide the classifier input space into four partitions. For 3D complex-valued inputs, the formulation generates three pairs of complex-coupled hyper-planes through orthogonal projections. The six hyper-planes then divide the 3D space into eight partitions. It is shown that the CELM problem formulation is equivalent to solving six real-valued ELM tasks, which are induced by projecting the chosen complex kernel across the different user-defined coordinate planes. A classification example of powdered samples on the basis of their terahertz spectral signatures is used to demonstrate the advantages of the CELM classifiers compared to their SVM counterparts. The proposed classifiers retain the advantages of their ELM counterparts, in that they can perform multiclass classification with lower computational complexity than SVM classifiers. Furthermore, because of their ability to perform classification tasks fast, the proposed formulations are of interest to real-time applications.
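
As a simplified, real-valued counterpart of the classifiers described here, the sketch below implements a basic ELM: hidden-layer weights are random and only the output weights are solved for by regularised least squares. The hidden size, activation and regularisation constant are illustrative assumptions; the complex-valued, kernel-based formulation of the paper is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(9)

def train_elm(X, y, n_hidden=64, C=1.0):
    """Return (W, b, beta): a random hidden layer and least-squares output weights."""
    n_classes = y.max() + 1
    T = np.eye(n_classes)[y]                           # one-hot targets
    W = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    H = np.tanh(X @ W + b)                             # random feature map
    # Regularised least squares: beta = (H^T H + I/C)^(-1) H^T T
    beta = np.linalg.solve(H.T @ H + np.eye(n_hidden) / C, H.T @ T)
    return W, b, beta

def predict_elm(X, W, b, beta):
    return np.argmax(np.tanh(X @ W + b) @ beta, axis=1)

# Toy four-class problem, loosely echoing the quadrant-style partitions mentioned above.
X = rng.normal(0, 1, (400, 2))
y = (X[:, 0] > 0).astype(int) * 2 + (X[:, 1] > 0).astype(int)
W, b, beta = train_elm(X, y)
print("training accuracy:", np.mean(predict_elm(X, W, b, beta) == y))
```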