36 results for vision-based performance analysis
Abstract:
Over the last forty years, applying dendrogeomorphology to palaeoflood analysis has improved estimates of the frequency and magnitude of past floods worldwide. This paper reviews the main results obtained by applying dendrogeomorphology to flood research in several case studies in Central Spain. These dendrogeomorphological studies focused on the following topics: (1) anatomical analysis to understand the physiological response of trees to flood damage and improve sampling efficiency; (2) compiling robust flood chronologies in ungauged mountain streams; (3) determining flow depth and estimating flood discharge using two-dimensional hydraulic modelling, and comparing them with other palaeostage indicators; (4) calibrating hydraulic model parameters (i.e. Manning roughness); and (5) implementing stochastic cost–benefit analysis to select optimal mitigation measures. The progress made in these areas is presented with suggestions for further research to improve the applicability of dendrogeomorphology to palaeoflood studies. Further developments will include new methods for better identification of the causes of specific types of flood damage to trees (e.g. tilted trees) or stable isotope analysis of tree rings to identify the climatic conditions associated with periods of increasing flood magnitude or frequency.
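Item (4) rests on Manning's equation, which links discharge to flow depth through the roughness coefficient n. The sketch below is a deliberately simplified 1D rectangular-channel version with hypothetical values (the studies themselves used 2D hydraulic modelling), just to make the depth-to-discharge step concrete:

```python
def manning_discharge(width_m, depth_m, slope, n):
    """Manning's equation for a rectangular channel:
    Q = (1/n) * A * R^(2/3) * S^(1/2)."""
    area = width_m * depth_m                    # flow cross-section A (m^2)
    wetted_perimeter = width_m + 2.0 * depth_m
    hydraulic_radius = area / wetted_perimeter  # R (m)
    return (1.0 / n) * area * hydraulic_radius ** (2.0 / 3.0) * slope ** 0.5

# Hypothetical reach: a scar height indicating a 1.8 m flow depth in a 12 m
# wide channel, slope 1%, Manning n = 0.045
print(round(manning_discharge(12.0, 1.8, 0.01, 0.045), 1))  # ≈ 59.6 m^3/s
```

Calibrating n against known flood stages, as in topic (4), amounts to adjusting this roughness parameter until modelled and observed water levels agree.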
Abstract:
Dendrogeomorphology uses information sources recorded in the roots, trunks and branches of trees and bushes located in the fluvial system to complement (or sometimes even replace) systematic and palaeohydrological records of past floods. The application of dendrogeomorphic data sources and methods to palaeoflood analysis over nearly 40 years has allowed improvements to be made in frequency and magnitude estimations of past floods. Nevertheless, research carried out so far has shown that the dendrogeomorphic indicators traditionally used (mainly scar evidence), and their use to infer frequency and magnitude, have been restricted to a small, limited set of applications. New possibilities with enormous potential remain unexplored. Future research into palaeoflood frequency and magnitude using dendrogeomorphic data sources should: (1) test the application of isotopic indicators (16O/18O ratio) to determine the meteorological origin of past floods; (2) use different dendrogeomorphic indicators to estimate peak flows with 2D (and 3D) hydraulic models and study how they relate to other palaeostage indicators; (3) investigate improved calibration of 2D hydraulic model parameters (roughness); and (4) apply statistics-based cost–benefit analysis to select optimal mitigation measures. This paper presents an overview of these innovative methodologies, with a focus on their capabilities and limitations in the reconstruction of recent floods and palaeofloods.
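Point (1) relies on the standard delta notation for oxygen isotope ratios, in which a sample's isotopic composition is expressed as a per-mil deviation from a reference standard. A minimal sketch, assuming the VSMOW reference ratio (the sample ratio below is hypothetical):

```python
def delta_18o(ratio_sample, ratio_standard=0.0020052):
    """δ18O in per mil: the sample's 18O/16O ratio expressed as a deviation
    from a reference standard (default: the VSMOW ratio, ~2005.2e-6)."""
    return (ratio_sample / ratio_standard - 1.0) * 1000.0

# A hypothetical, isotopically depleted sample (e.g. snowmelt-fed flood water)
print(round(delta_18o(0.0019852), 2))   # ≈ -9.97 per mil
```

Distinct meteorological flood sources (frontal rain, snowmelt, convective storms) tend to leave distinct δ18O signatures, which is what makes the indicator attractive for attributing past floods.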
Abstract:
Besides its primary role in producing food and fiber, agriculture also has relevant effects on several other functions, such as management of renewable natural resources. Climate change (CC) may lead to new trade-offs between agricultural functions or aggravate existing ones, but suitable agricultural management may maintain or even improve the ability of agroecosystems to supply these functions. Hence, it is necessary to identify relevant drivers (e.g., cropping practices, local conditions) and their interactions, and how they affect agricultural functions in a changing climate. The goal of this study was to use a modeling framework to analyze the sensitivity of indicators of three important agricultural functions, namely crop yield (food and fiber production function), soil erosion (soil conservation function), and nutrient leaching (clean water provision function), to a wide range of agricultural practices for current and future climate conditions. In a two-step approach, cropping practices that explain high proportions of variance of the different indicators were first identified by an analysis of variance-based sensitivity analysis. Then, the most suitable combinations of practices to achieve best performance with respect to each indicator were extracted, and trade-offs were analyzed. The procedure was applied to a region in western Switzerland, considering two different soil types to test the importance of local environmental constraints. Results show that the sensitivity of crop yield and soil erosion to management is high, while nutrient leaching mostly depends on soil type. We found that the influence of most agricultural practices does not change significantly with CC; only irrigation becomes more relevant as a consequence of decreasing summer rainfall. Trade-offs were identified when focusing on best performances of each indicator separately, and these were amplified under CC. For adaptation to CC in the selected study region, conservation soil management and the use of cropped grasslands appear to be the most suitable options to avoid trade-offs.
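The ANOVA-based sensitivity step can be pictured as a first-order variance decomposition: how much of the output variance is explained by each factor's conditional mean. The two-factor toy model below is hypothetical, not the study's crop model, and serves only to illustrate the mechanics:

```python
import itertools
import statistics

def first_order_sensitivity(levels_a, levels_b, model):
    """First-order variance-based sensitivity indices for a two-factor model
    evaluated on a full factorial design (ANOVA-style decomposition)."""
    runs = [(a, b, model(a, b)) for a, b in itertools.product(levels_a, levels_b)]
    total_var = statistics.pvariance([y for _, _, y in runs])
    def index(pos, levels):
        # variance of the conditional means, normalised by total variance
        cond_means = [statistics.mean(y for *x, y in runs if x[pos] == lvl)
                      for lvl in levels]
        return statistics.pvariance(cond_means) / total_var
    return index(0, levels_a), index(1, levels_b)

# Hypothetical crop-yield response dominated by irrigation (first factor)
toy_yield = lambda irrigation, tillage: 3.0 * irrigation + 0.5 * tillage
s_irr, s_till = first_order_sensitivity([0, 1, 2], [0, 1, 2], toy_yield)
print(round(s_irr, 3), round(s_till, 3))   # irrigation dominates
```

For an additive model like this one the indices sum to one; interaction effects would leave a residual share of variance unexplained by the first-order terms.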
Abstract:
Computer vision-based food recognition could be used to estimate a meal's carbohydrate content for diabetic patients. This study proposes a methodology for automatic food recognition, based on the Bag of Features (BoF) model. An extensive technical investigation was conducted for the identification and optimization of the best performing components involved in the BoF architecture, as well as the estimation of the corresponding parameters. For the design and evaluation of the prototype system, a visual dataset with nearly 5,000 food images was created and organized into 11 classes. The optimized system computes dense local features, using the scale-invariant feature transform on the HSV color space, builds a visual dictionary of 10,000 visual words using hierarchical k-means clustering, and finally classifies the food images with a linear support vector machine classifier. The system achieved a classification accuracy on the order of 78%, demonstrating the feasibility of the proposed approach on a very challenging image dataset.
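The core of the BoF pipeline is vector quantization: every local descriptor is assigned to its nearest "visual word" in the dictionary, and the image is represented by the histogram of word counts. The sketch below shows just that step on toy 2-D descriptors (the actual system used dense SIFT features on HSV and a 10,000-word dictionary built with hierarchical k-means):

```python
import numpy as np

def bof_histogram(descriptors, codebook):
    """Quantize local descriptors against a visual dictionary and return
    an L1-normalised bag-of-features histogram (one bin per visual word)."""
    # squared Euclidean distance from every descriptor to every codeword
    d2 = ((descriptors[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
    words = d2.argmin(axis=1)                       # index of nearest word
    hist = np.bincount(words, minlength=len(codebook)).astype(float)
    return hist / hist.sum()

# Tiny illustration: 4 two-dimensional "descriptors", 2 visual words
codebook = np.array([[0.0, 0.0], [10.0, 10.0]])
desc = np.array([[0.1, 0.2], [9.8, 9.9], [0.3, -0.1], [10.2, 10.1]])
print(bof_histogram(desc, codebook))   # -> [0.5 0.5]
```

The resulting fixed-length histogram is what the linear SVM classifies, regardless of how many local features each image produced.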
Abstract:
In patients diagnosed with pharmaco-resistant epilepsy, cerebral areas responsible for seizure generation can be defined by performing implantation of intracranial electrodes. The identification of the epileptogenic zone (EZ) is based on visual inspection of the intracranial electroencephalogram (IEEG) performed by highly qualified neurophysiologists. New computer-based quantitative EEG analyses have been developed in collaboration with the signal analysis community to expedite EZ detection. The aim of the present report is to compare different signal analysis approaches developed in four different European laboratories working in close collaboration with four European Epilepsy Centers. Computer-based signal analysis methods were retrospectively applied to IEEG recordings performed in four patients undergoing pre-surgical exploration of pharmaco-resistant epilepsy. The four methods elaborated by the different teams to identify the EZ are based either on frequency analysis, on nonlinear signal analysis, on connectivity measures or on statistical parametric mapping of epileptogenicity indices. All methods converge on the identification of the EZ in patients who present with fast activity at seizure onset. When traditional visual inspection was not successful in detecting the EZ on IEEG, the different signal analysis methods produced highly discordant results. Quantitative analysis of IEEG recordings complements clinical evaluation by contributing to the study of epileptogenic networks during seizures. We demonstrate that the degree of sensitivity of different computer-based methods to detect the EZ with respect to visual EEG inspection depends on the specific seizure pattern.
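The frequency-analysis family of methods builds on a simple observation: channels in the EZ often show a shift of spectral power toward fast activity at seizure onset. The sketch below computes a high-to-low power ratio on a synthetic signal; it is a simplified illustration of that intuition, not any of the four published methods (band edges and signal are my assumptions):

```python
import numpy as np

def high_low_power_ratio(signal, fs, low_band=(3.5, 12.0), high_band=(12.0, 97.0)):
    """Ratio of high- to low-frequency spectral power on one IEEG channel;
    a high value flags the fast activity typical of seizure onset."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    power = np.abs(np.fft.rfft(signal)) ** 2
    low = power[(freqs >= low_band[0]) & (freqs < low_band[1])].sum()
    high = power[(freqs >= high_band[0]) & (freqs < high_band[1])].sum()
    return high / low

fs = 256
t = np.arange(fs) / fs                                 # one second of signal
background = np.sin(2 * np.pi * 6 * t)                 # slow theta-range rhythm
onset = background + 2.0 * np.sin(2 * np.pi * 40 * t)  # added gamma activity
print(high_low_power_ratio(onset, fs))                 # ≈ 4.0
```

Ranking channels by such a ratio over sliding windows is the basic mechanism by which fast-activity-based indices localise candidate EZ contacts.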
Abstract:
Cloud Computing has evolved to become an enabler for delivering access to large scale distributed applications running on managed network-connected computing systems. This makes possible hosting Distributed Enterprise Information Systems (dEISs) in cloud environments, while enforcing strict performance and quality of service requirements, defined using Service Level Agreements (SLAs). SLAs define the performance boundaries of distributed applications, and are enforced by a cloud management system (CMS) dynamically allocating the available computing resources to the cloud services. We present two novel VM-scaling algorithms focused on dEIS systems, which detect the most appropriate scaling conditions using performance models of distributed applications derived from constant-workload benchmarks, together with SLA-specified performance constraints. We simulate the VM-scaling algorithms in a cloud simulator and compare them against trace-based performance models of dEISs. We compare a total of three SLA-based VM-scaling algorithms (one using prediction mechanisms) based on a real-world application scenario involving a large variable number of users. Our results show that it is beneficial to use autoregressive predictive SLA-driven scaling algorithms in cloud management systems for guaranteeing performance invariants of distributed cloud applications, as opposed to using only reactive SLA-based VM-scaling algorithms.
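The reactive-versus-predictive distinction can be sketched in a few lines: a reactive policy acts only after a utilisation threshold is breached, while a predictive one extrapolates the workload trend and provisions ahead of the breach. The code below is a minimal illustration under assumed thresholds and a one-step trend extrapolation, not the paper's algorithms or its autoregressive model:

```python
import math

def reactive_scale(vms, utilisation, high=0.8, low=0.3):
    """Reactive SLA-based scaling: react only to the utilisation just observed."""
    if utilisation > high:
        return vms + 1
    if utilisation < low and vms > 1:
        return vms - 1
    return vms

def predictive_scale(vms, load_history, capacity_per_vm, high=0.8):
    """Predictive scaling: extrapolate the next workload with a one-step
    autoregressive-style trend and provision for it in advance."""
    if len(load_history) < 2:
        return vms
    predicted = load_history[-1] + (load_history[-1] - load_history[-2])
    # smallest fleet keeping predicted per-VM utilisation under the SLA bound
    needed = max(1, math.ceil(predicted / (capacity_per_vm * high)))
    return max(vms, needed)

# Ramp-up: 4 VMs of 50 req/s capacity each, load climbing 140 -> 180 req/s
print(reactive_scale(4, 180 / (4 * 50)))     # reacts after the breach -> 5
print(predictive_scale(4, [140, 180], 50))   # provisions for the ramp -> 6
```

On a sustained ramp the reactive policy adds one VM per breach and lags the load, whereas the predictive policy sizes the fleet for the extrapolated demand, which is the qualitative advantage the paper's results quantify.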