39 results for FLASH

in BORIS: Bern Open Repository and Information System - Bern - Switzerland


Relevance: 20.00%

Abstract:

Flash pulmonary edema (FPE) is a general clinical term used to describe a particularly dramatic form of acute decompensated heart failure. Well-established risk factors for heart failure such as hypertension, coronary ischemia, valvular heart disease, and diastolic dysfunction are associated with acute decompensated heart failure as well as with FPE. However, endothelial dysfunction, possibly secondary to excessive activity of the renin-angiotensin-aldosterone system, impaired nitric oxide synthesis, increased endothelin levels, and/or excessive circulating catecholamines, may cause excessive pulmonary capillary permeability and facilitate FPE formation. Renal artery stenosis, particularly when bilateral, has been identified as a common cause of FPE. Lack of diurnal variation in blood pressure and a widened pulse pressure have been identified as risk factors for FPE. This review attempts to delineate the clinical and pathophysiological mechanisms responsible for FPE and to distinguish the pathophysiologic, clinical, and therapeutic aspects of FPE from those of acute decompensated heart failure.

Relevance: 20.00%

Abstract:

The reconstruction of past flash floods in ungauged basins involves a high level of uncertainty, which increases when other processes are involved, such as the transport of large wood material. An important flash flood occurred in 1997 in Venero Claro (Central Spain), causing significant economic losses. The wood material clogged bridge sections, raising the water level upstream. The aim of this study was to reconstruct this event, analysing the influence of woody debris transport on the flood hazard pattern. Because the reach in question was affected by backwater effects due to bridge clogging, using only high-water marks or palaeostage indicators may overestimate discharges, so other methods are required to estimate peak flows. Therefore, the peak discharge was estimated (123 ± 18 m³ s⁻¹) using indirect methods, and one-dimensional hydraulic simulation was also used to validate these indirect estimates through an iterative process (127 ± 33 m³ s⁻¹) and to reconstruct the bridge obstruction, yielding the blockage ratio during the 1997 event (~48%) and the bridge clogging curves. Rainfall-runoff modelling with stochastic simulation of different rainfall field configurations also helped to confirm that a peak discharge greater than 150 m³ s⁻¹ is very unlikely and that the estimated discharge range is consistent with the estimated rainfall amount (233 ± 27 mm). It was observed that the backwater effect due to the obstruction (water level ~7 m) made the 1997 flood (~35-year return period) equivalent to the 50-year flood. This allowed the equivalent return period to be defined as the recurrence interval of an event of specified magnitude which, where large woody debris is present, is equivalent in water depth and extent of flooded area to a more extreme event of greater magnitude. These results highlight the need to include obstruction phenomena in flood hazard analysis.
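Indirect peak-discharge estimates of this kind are commonly based on the slope-area approach using Manning's equation. The minimal sketch below illustrates the calculation only; the cross-section area, wetted perimeter, energy slope and roughness coefficient are hypothetical illustrative values, not the Venero Claro survey data.

```python
# Minimal sketch of the slope-area (Manning) indirect method for
# estimating a flash-flood peak discharge from a surveyed cross-section.
# All channel values below are hypothetical, not data from the study.

def manning_discharge(area_m2, wetted_perimeter_m, slope, n):
    """Peak discharge Q = (1/n) * A * R^(2/3) * S^(1/2), in SI units."""
    hydraulic_radius = area_m2 / wetted_perimeter_m
    return (1.0 / n) * area_m2 * hydraulic_radius ** (2.0 / 3.0) * slope ** 0.5

# Hypothetical surveyed cross-section and energy slope:
q = manning_discharge(area_m2=42.0, wetted_perimeter_m=25.0, slope=0.012, n=0.045)
print(f"Estimated peak discharge: {q:.0f} m^3/s")
```

In practice such estimates are repeated over several cross-sections and roughness assumptions, which is why the study reports a range rather than a single value.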

Relevance: 20.00%

Abstract:

One of the main problems of flood hazard assessment in ungauged or poorly gauged basins is the lack of runoff data. In an attempt to overcome this problem, we have combined archival records, dendrogeomorphic time series and instrumental data (daily rainfall and discharge) from four ungauged and poorly gauged mountain basins in Central Spain with the aim of reconstructing and compiling information on 41 flash flood events since the end of the 19th century. Estimation of historical discharges and the incorporation of uncertainty into the at-site and regional flood frequency analysis were performed with an empirical rainfall-runoff assessment as well as stochastic and Bayesian Markov Chain Monte Carlo (MCMC) approaches. Results for each of the ungauged basins include flood frequency, severity, seasonality and triggers (synoptic meteorological situations). The reconstructed data series clearly demonstrate how uncertainty can be reduced by including historical information, but also point to the considerable influence of the different approaches on quantile estimation. This uncertainty should be taken into account when these data are used for flood risk management.
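As a minimal sketch of the Bayesian MCMC side of such an at-site flood frequency analysis, the following fits a Gumbel distribution to annual maxima with a random-walk Metropolis sampler and reports a posterior interval for a design quantile. The Gumbel model, flat priors, tuning constants and synthetic data are all illustrative assumptions, not the paper's actual setup.

```python
# Minimal sketch: Bayesian (Metropolis) fit of a Gumbel distribution to
# annual-maximum discharges, with a posterior for the 100-year quantile.
# Synthetic data; priors and proposal scales are illustrative choices.
import numpy as np

rng = np.random.default_rng(0)
ams = rng.gumbel(loc=60.0, scale=20.0, size=40)  # synthetic annual maxima (m^3/s)

def log_likelihood(loc, scale, x):
    if scale <= 0:
        return -np.inf
    z = (x - loc) / scale
    return np.sum(-np.log(scale) - z - np.exp(-z))  # Gumbel log-pdf summed

# Random-walk Metropolis over (loc, scale) with flat priors.
theta = np.array([np.mean(ams), np.std(ams)])
samples = []
for _ in range(20000):
    proposal = theta + rng.normal(scale=[2.0, 1.0])
    if log_likelihood(*proposal, ams) - log_likelihood(*theta, ams) > np.log(rng.uniform()):
        theta = proposal
    samples.append(theta.copy())
post = np.array(samples[5000:])  # discard burn-in

# Gumbel quantile: Q_T = loc - scale * ln(-ln(1 - 1/T))
T = 100.0
q100 = post[:, 0] - post[:, 1] * np.log(-np.log(1.0 - 1.0 / T))
lo, med, hi = np.percentile(q100, [5, 50, 95])
print(f"100-year flood: {med:.0f} m^3/s (90% credible interval {lo:.0f}-{hi:.0f})")
```

Historical floods reconstructed from archives or tree rings would enter such a scheme as additional (often censored) likelihood terms, which is how including historical information narrows the posterior interval.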

Relevance: 20.00%

Abstract:

This paper investigates machine learning (ML) classification techniques to assist in the problem of flash flood nowcasting. We have been building a Wireless Sensor Network (WSN) to collect measurements from a river located in an urban area. The ML classification methods were investigated with the aim of enabling flash flood nowcasting, which in turn allows the WSN to issue alerts to the local population. We evaluated several types of ML classifier, taking into account the different nowcasting stages (i.e. the number of future time steps to forecast). We also evaluated different data representations to be used as input to the ML techniques. The results show that different data representations can lead to significantly better results at different stages of nowcasting.
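A minimal sketch of this kind of nowcasting classifier is given below, assuming a raw sliding-window representation and a random forest; the paper's actual classifiers and representations are not reproduced here, and the river-level series, flood threshold, window and forecast horizon are synthetic placeholders.

```python
# Minimal sketch of a flash-flood nowcasting classifier: sliding-window
# features from river-level readings, labelled by whether a flood
# threshold is exceeded `HORIZON` steps ahead. All data are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import f1_score

rng = np.random.default_rng(42)
level = np.cumsum(rng.normal(0, 0.1, 5000)) + rng.normal(0, 0.02, 5000)  # synthetic river level
THRESHOLD = np.percentile(level, 90)  # hypothetical flood threshold
WINDOW, HORIZON = 12, 3               # past readings used; steps ahead to forecast

# Raw-window representation; an alternative representation (e.g. deltas
# via np.diff) could be swapped in here, as the abstract suggests.
X = np.array([level[i - WINDOW:i] for i in range(WINDOW, len(level) - HORIZON)])
y = (level[WINDOW + HORIZON:] > THRESHOLD).astype(int)

# Chronological split: train on the past, test on the future (no leakage).
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, shuffle=False)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print(f"F1 at horizon {HORIZON}: {f1_score(y_te, clf.predict(X_te)):.2f}")
```

Repeating this over several values of HORIZON and several representations reproduces, in miniature, the kind of comparison the abstract describes across nowcasting stages.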

Relevance: 10.00%

Abstract:

Data gathering, either for event recognition or for monitoring applications, is the primary purpose of sensor network deployments. In many cases, data is acquired periodically and autonomously and simply logged onto secondary storage (e.g. flash memory), either for delayed offline analysis or for on-demand burst transfer. Moreover, operational data such as connectivity information and node and network state is typically kept as well. Naturally, measurement and/or connectivity logging comes at a cost, and the space for doing so is limited. Finding a good representative model for the data and providing clever coding of the information, i.e. data compression, may be a means of using the available space to its best. In this paper, we explore the design space of data compression for wireless sensor and mesh networks by profiling common, publicly available algorithms. Several goals, such as low overhead in terms of memory and compression time as well as a decent compression ratio, have to be well balanced in order to find a simple yet effective compression scheme.
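A minimal sketch of such profiling is shown below, using Python's standard-library codecs as stand-ins for the publicly available algorithms the paper profiles; the sensor log is synthetic, and the two metrics measured (compression ratio and wall-clock time) mirror the trade-off the abstract describes.

```python
# Minimal sketch: profile common compressors on a synthetic sensor log,
# measuring compression ratio and time. Python's stdlib codecs stand in
# for the embedded implementations a real sensor node would use.
import bz2
import lzma
import time
import zlib
import numpy as np

rng = np.random.default_rng(1)
# Synthetic periodic sensor log: slowly varying readings, 16-bit samples.
readings = 1000 * np.sin(np.arange(100000) / 50.0) + rng.normal(0, 5, 100000)
data = readings.astype(np.int16).tobytes()

for name, compress in [("zlib", zlib.compress), ("bz2", bz2.compress), ("lzma", lzma.compress)]:
    t0 = time.perf_counter()
    out = compress(data)
    dt = time.perf_counter() - t0
    print(f"{name:5s} ratio={len(data) / len(out):5.2f} time={dt * 1e3:7.1f} ms")

# A simple "representative model" of the data: delta-encode first, then
# compress the (smaller, more repetitive) residuals. Ratio is approximate,
# since the delta stream is two bytes shorter than the original.
deltas = np.diff(readings.astype(np.int16)).astype(np.int16).tobytes()
print(f"zlib on deltas: ratio={len(data) / len(zlib.compress(deltas)):5.2f}")
```

On a resource-constrained node, the memory footprint of the codec matters as much as the ratio and time measured here, which is why the paper weighs all three against each other.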