940 results for flood frequency evaluation


Relevance:

80.00%

Publisher:

Abstract:

The need to improve the accuracy of flood frequency estimation is widely accepted well beyond the community of hydrologists, and it becomes more pressing as management systems define the various thresholds tied to flood frequency. Both the scientific and the management communities accept the necessity of living with determined levels of flood risk. Most approaches to “Advancing Methods” concentrate on statistical refinements, even though climate is in fact not a stationary process. The question is examined here in the light of the SMARTeST research and its final highlights, policy conclusions and recommendations. The paper argues for a better agreement between hydrology and the climate system as a whole, seen as the result of the global thermal machine, and adopts a mainly historical approach, aiming to show the need for a wider collection and analysis of climate data to support statistical approaches.
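As a minimal sketch of the stationary statistical approach such methods build on (and that the paper questions under a changing climate), the following fits a Gumbel (EV1) distribution to annual peak discharges by the method of moments and reads off return levels; the discharge series and return periods are hypothetical, for illustration only:

```python
import math

def gumbel_fit(annual_maxima):
    """Fit a Gumbel (EV1) distribution by the method of moments."""
    n = len(annual_maxima)
    mean = sum(annual_maxima) / n
    var = sum((x - mean) ** 2 for x in annual_maxima) / (n - 1)
    beta = math.sqrt(var) * math.sqrt(6) / math.pi   # scale parameter
    mu = mean - 0.5772 * beta                        # location (Euler-Mascheroni constant)
    return mu, beta

def return_level(mu, beta, T):
    """Discharge with return period T years under the fitted Gumbel law."""
    return mu - beta * math.log(-math.log(1 - 1 / T))

peaks = [812, 956, 1043, 770, 1288, 905, 1132, 987, 860, 1201]  # m^3/s, hypothetical
mu, beta = gumbel_fit(peaks)
q100 = return_level(mu, beta, 100)   # 100-year design flood estimate
```

The fitted quantile extrapolates beyond the observed record, which is precisely where the stationarity assumption matters most.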

Relevance:

80.00%

Publisher:

Abstract:

National Highway Traffic Safety Administration, Washington, D.C.

Relevance:

80.00%

Publisher:

Abstract:

This paper proposes a theoretical explanation of the variation of the sediment delivery ratio (SDR) with catchment area and of the complex patterns in the behavior of sediment transfer processes at the catchment scale. Taking into account the effects of erosion source types, deposition, and hydrological controls, we propose a simple conceptual model that consists of two linear stores arranged in series: a hillslope store that addresses transport to the nearest streams and a channel store that addresses sediment routing in the channel network. The model identifies four dimensionless scaling factors, which enable us to analyze a variety of effects on SDR estimation, including (1) interacting processes of erosion sources and deposition, (2) different temporal averaging windows, and (3) catchment runoff response. We show that the interactions between storm duration and hillslope/channel travel times are the major controls of peak-value-based sediment delivery and its spatial variations. The interplay between depositional timescales and the travel/residence times determines the spatial variations of total-volume-based SDR. In practical terms, this parsimonious, minimal-complexity model could provide a sound physical basis for diagnosing catchment-to-catchment variability of sediment transport if the proposed scaling factors can be quantified using climatic and catchment properties.
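The two-store idea can be sketched as a toy discrete-time routing of an erosion pulse through a hillslope store and a channel store in series. The explicit-Euler balance, residence times, and storm input below are illustrative assumptions, not the paper's formulation:

```python
def two_store_delivery(inputs, k_h, k_c, dt=1.0):
    """Route a sediment input series through two linear stores in series.
    k_h, k_c: residence times of the hillslope and channel stores."""
    s_h = s_c = 0.0
    out = []
    for u in inputs:
        q_h = s_h / k_h              # hillslope outflow to the channel
        q_c = s_c / k_c              # channel outflow at the catchment outlet
        s_h += (u - q_h) * dt        # hillslope store mass balance
        s_c += (q_h - q_c) * dt      # channel store mass balance
        out.append(q_c)
    return out

# a pulse of hillslope erosion during a storm, then recession
storm = [10.0] * 5 + [0.0] * 45
yield_series = two_store_delivery(storm, k_h=3.0, k_c=6.0)
sdr_peak = max(yield_series) / max(storm)   # peak-value-based SDR, below 1
```

The series arrangement attenuates and delays the input pulse, which is why the peak-value-based delivery ratio depends on how storm duration compares with the two travel times.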

Relevance:

80.00%

Publisher:

Abstract:

This paper presents a scientific and technical description of the modelling framework and the main results of modelling the long-term average sediment delivery at hillslope to medium-scale catchments over the entire Murray Darling Basin (MDB). A theoretical development that relates long-term averaged sediment delivery to the statistics of rainfall and catchment parameters is presented. The derived flood frequency approach was adapted to investigate the problem of regionalization of the sediment delivery ratio (SDR) across the Basin. SDR, a measure of catchment response to the upland erosion rate, was modelled by two lumped linear stores arranged in series: hillslope transport to the nearest streams and flow routing in the channel network. The theory shows that the ratio of catchment sediment residence time (SRT) to average effective rainfall duration is the most important control in the sediment delivery processes. In this study, catchment SRTs were estimated using travel time for overland flow multiplied by an enlargement factor which is a function of particle size. Rainfall intensity and effective duration statistics were regionalized by using long-term measurements from 195 pluviograph sites within and around the Basin. Finally, the model was implemented across the MDB by using spatially distributed soil, vegetation, topographical and land use properties within a Geographic Information System (GIS) environment. The results predict strong variations in SDR from close to 0 in floodplains to 70% in the eastern uplands of the Basin. (c) 2005 Elsevier Ltd. All rights reserved.
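The SRT recipe described above (overland-flow travel time scaled by a particle-size enlargement factor) and its controlling ratio can be sketched as follows. The exponential SDR mapping and every parameter value here are assumptions for illustration, not the paper's derived expressions:

```python
import math

def srt(travel_time_h, enlargement):
    # Catchment sediment residence time: overland-flow travel time
    # multiplied by a particle-size enlargement factor, per the paper's recipe.
    return travel_time_h * enlargement

def toy_sdr(srt_h, rain_duration_h):
    # Toy monotone mapping: delivery falls as the ratio of residence time
    # to effective rainfall duration grows. The exponential form is an
    # illustrative assumption only.
    return math.exp(-srt_h / rain_duration_h)

# coarser particles -> larger enlargement factor -> longer SRT -> lower SDR
sdr_coarse = toy_sdr(srt(2.0, 5.0), rain_duration_h=4.0)
sdr_fine = toy_sdr(srt(2.0, 1.5), rain_duration_h=4.0)
```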

Relevance:

60.00%

Publisher:

Abstract:

Index-flood related regional frequency analysis (RFA) procedures are in use by hydrologists to estimate design quantiles of hydrological extreme events at data-sparse/ungauged locations in river basins. There is a dearth of attempts to establish which among those procedures is better for RFA in the L-moment framework. This paper evaluates the performance of the conventional index flood (CIF), the logarithmic index flood (LIF), and two variants of the population index flood (PIF) procedures in estimating flood quantiles for ungauged locations by Monte Carlo simulation experiments and a case study on watersheds in Indiana in the U.S. To evaluate the PIF procedure, L-moment formulations are developed for implementing the procedure in situations where the regional frequency distribution (RFD) is the generalized logistic (GLO), generalized Pareto (GPA), generalized normal (GNO) or Pearson type III (PE3), as those formulations are unavailable. Results indicate that one of the variants of the PIF procedure, which utilizes the regional information on the first two L-moments, is more effective than the CIF and LIF procedures. The improvement in quantile estimation using this variant of the PIF procedure as compared with the CIF procedure is significant when the RFD is a generalized extreme value, GLO, GNO, or PE3, and marginal when it is GPA. (C) 2015 American Society of Civil Engineers.
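A minimal sketch of the L-moment machinery behind such procedures: the first two sample L-moments computed from unbiased probability-weighted moments, and the conventional index-flood (CIF) quantile as the site index flood times a regional growth factor. The data and growth factor are illustrative:

```python
def sample_l_moments(x):
    """First two sample L-moments via unbiased probability-weighted moments."""
    x = sorted(x)
    n = len(x)
    b0 = sum(x) / n
    # b1 = (1/n) * sum over order statistics of ((i-1)/(n-1)) * x_(i), i = 1..n
    b1 = sum((i / (n - 1)) * x[i] for i in range(n)) / n
    l1 = b0            # L-location (mean)
    l2 = 2 * b1 - b0   # L-scale
    return l1, l2

def cif_quantile(site_index_flood, regional_growth_factor):
    """CIF estimate: site quantile = site index flood x regional growth factor."""
    return site_index_flood * regional_growth_factor

l1, l2 = sample_l_moments([1.0, 2.0, 3.0, 4.0, 5.0])
q = cif_quantile(site_index_flood=100.0, regional_growth_factor=2.5)
```

The PIF variants differ from CIF in which regional L-moment information they carry into the at-site estimate; the paper's formulations for GLO, GPA, GNO and PE3 are not reproduced here.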

Relevance:

40.00%

Publisher:

Abstract:

Dynamic Voltage and Frequency Scaling (DVFS) offers a huge potential for designing trade-offs involving energy, power, temperature and performance of computing systems. In this paper, we evaluate three different DVFS schemes - our enhancement of a Petri net performance model based DVFS method for sequential programs to stream programs, a simple profile based Linear Scaling method, and an existing hardware based DVFS method for multithreaded applications - using multithreaded stream applications, in a full system Chip Multiprocessor (CMP) simulator. From our evaluation, we find that the software based methods achieve significant Energy/Throughput² (ET⁻²) improvements. The hardware based scheme degrades performance heavily and suffers an ET⁻² loss. Our results indicate that the simple profile based scheme achieves the benefits of the complex Petri net based scheme for stream programs, and present a strong case for the need for independent voltage/frequency control for different cores of CMPs, which is lacking in most of the state-of-the-art CMPs. This is in contrast to the conclusions of a recent evaluation of per-core DVFS schemes for multithreaded applications for CMPs.
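The ET⁻² (Energy/Throughput²) figure of merit used in the comparison can be computed as below; the energy and throughput numbers are hypothetical, not results from the paper:

```python
def et2_metric(energy_j, throughput):
    """Energy/Throughput^2 (ET^-2): lower is better. Dividing by throughput
    squared penalizes schemes that save energy by sacrificing performance."""
    return energy_j / throughput ** 2

def et2_improvement(baseline, scheme):
    """Fractional ET^-2 improvement of a DVFS scheme over a baseline."""
    return 1 - scheme / baseline

base = et2_metric(energy_j=120.0, throughput=50.0)   # no DVFS (hypothetical)
prof = et2_metric(energy_j=90.0, throughput=48.0)    # profile-based scaling (hypothetical)
gain = et2_improvement(base, prof)                   # positive: the scheme wins
```

A scheme that cuts energy 25% while losing only 4% throughput still improves ET⁻², whereas a large performance loss can outweigh any energy saving, which is the failure mode reported for the hardware scheme.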

Relevance:

40.00%

Publisher:

Abstract:

A low-order harmonic pulsating torque is a major concern in high-power drives, high-speed drives, and motor drives operating in an overmodulation region. This paper attempts to minimize the low-order harmonic torques in induction motor drives, operated at a low pulse number (i.e., a low ratio of switching frequency to fundamental frequency), through a frequency domain (FD) approach as well as a synchronous reference frame (SRF) based approach. This paper first investigates FD-based approximate elimination of harmonic torque as suggested by classical works. This is then extended into a procedure for minimization of low-order pulsating torque components in the FD, which is independent of machine parameters and mechanical load. Furthermore, an SRF-based optimal pulse width modulation (PWM) method is proposed to minimize the low-order harmonic torques, considering the motor parameters and load torque. The two optimal methods are evaluated and compared with sine-triangle (ST) PWM and selective harmonic elimination (SHE) PWM through simulations and experimental studies on a 3.7-kW induction motor drive. The SRF-based optimal PWM results in marginally better performance than the FD-based one. However, the selection of the optimal switching angles for any modulation index (M) takes much longer in the case of the SRF-based approach than the FD-based one. The FD-based optimal solutions can be used as good starting solutions and/or to reasonably restrict the search space for optimal solutions in the SRF-based approach. Both the FD-based and SRF-based optimal PWM methods reduce the low-order pulsating torque significantly, compared to ST PWM and SHE PWM, as shown by the simulation and experimental results.
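A minimal sketch of the sine-triangle (ST) PWM baseline at a low pulse number (here fc/f0 = 9), generated by comparing a sinusoidal reference against a unit triangle carrier; the amplitudes and frequencies are illustrative, and this is the baseline scheme, not the paper's optimal PWM:

```python
import math

def st_pwm(m_index, f0, fc, n=10000):
    """Bipolar sine-triangle PWM over one fundamental period: the output is
    +1 when the sine reference (amplitude m_index, frequency f0) is at or
    above a unit triangle carrier at fc, else -1."""
    out = []
    for k in range(n):
        t = k / (n * f0)                                     # sample times in [0, 1/f0)
        ref = m_index * math.sin(2 * math.pi * f0 * t)
        phase = (fc * t) % 1.0
        tri = 4 * phase - 1 if phase < 0.5 else 3 - 4 * phase  # triangle in [-1, 1]
        out.append(1 if ref >= tri else -1)
    return out

wave = st_pwm(m_index=0.8, f0=50.0, fc=450.0)   # pulse number fc/f0 = 9
transitions = sum(1 for a, b in zip(wave, wave[1:]) if a != b)
```

With a synchronized carrier and m_index below 1, the reference crosses the carrier twice per carrier cycle, giving 2 x 9 = 18 switching transitions per fundamental period; it is this coarse switching that produces the low-order harmonics the paper targets.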

Relevance:

40.00%

Publisher:

Abstract:

A two-stage sampling design is used to estimate the variances of the numbers of yellowfin in different age groups caught in the eastern Pacific Ocean. For purse seiners, the primary sampling unit (n) is a brine well containing fish from a month-area stratum; the fish lengths (m) measured from each well are the secondary units. The fish cannot be selected at random from the wells because of practical limitations. The effects of different sampling methods and other factors on the reliability and precision of statistics derived from the length-frequency data were therefore examined. Modifications are recommended where necessary. Lengths of fish measured during the unloading of six test wells revealed two forms of inherent size stratification: 1) short-term disruptions of the existing pattern of sizes, and 2) transition zones between long-term trends in sizes. To some degree, all wells exhibited cyclic changes in mean size and variance during unloading. In half of the wells, it was observed that size selection by the unloaders induced a change in mean size. As a result of stratification, the sequence of sizes removed from all wells was non-random, regardless of whether a well contained fish from a single set or from more than one set. The number of modal sizes in a well was not related to the number of sets. In an additional well composed of fish from several sets, an experiment on vertical mixing indicated that a representative sample of the contents may be restricted to the bottom half of the well. The contents of the test wells were used to generate 25 simulated wells and to compare the results of three sampling methods applied to them. The methods were: (1) random sampling (also used as a standard), (2) protracted sampling, in which the selection process was extended over a large portion of a well, and (3) measuring fish consecutively during removal from the well.
Repeated sampling by each method and different combinations of n and m indicated that, because the principal source of size variation occurred among primary units, increasing n was the most effective way to reduce the variance estimates of both the age-group sizes and the total number of fish in the landings. Protracted sampling largely circumvented the effects of size stratification, and its performance was essentially comparable to that of random sampling. Sampling by this method is recommended. Consecutive-fish sampling produced more biased estimates with greater variances. Analysis of the 1988 length-frequency samples indicated that, for age groups that appear most frequently in the catch, a minimum sampling frequency of one primary unit in six for each month-area stratum would reduce the coefficients of variation (CV) of their size estimates to approximately 10 percent or less. Additional stratification of samples by set type, rather than month-area alone, further reduced the CV's of scarce age groups, such as the recruits, and potentially improved their accuracy. The CV's of recruitment estimates for completely-fished cohorts during the 1981-1984 period were in the vicinity of 3 to 8 percent. Recruitment estimates and their variances were also relatively insensitive to changes in the individual quarterly catches and variances, respectively, of which they were composed. (PDF contains 70 pages)
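The central finding, that increasing the number of primary units n reduces variance far more than increasing the fish measured per well m, follows from the standard two-stage variance decomposition sketched below; the well means and within-well variance are hypothetical:

```python
def two_stage_var_of_mean(primary_means, m, s2_within):
    """Estimated variance of the overall mean under two-stage sampling:
    between-well component / n plus within-well component / (n * m).
    When variation between wells dominates, only increasing n helps much."""
    n = len(primary_means)
    grand = sum(primary_means) / n
    s2_between = sum((y - grand) ** 2 for y in primary_means) / (n - 1)
    return s2_between / n + s2_within / (n * m)

wells = [62.0, 75.0, 68.0, 80.0, 71.0]   # hypothetical mean length (cm) per sampled well
v_base = two_stage_var_of_mean(wells, m=50, s2_within=120.0)
v_more_fish = two_stage_var_of_mean(wells, m=100, s2_within=120.0)   # double m, same n
```

Doubling m only shrinks the already-small second term, while adding wells shrinks both terms, mirroring the study's recommendation.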

Relevance:

40.00%

Publisher:

Abstract:

We develop general model-free adjustment procedures for the calculation of unbiased volatility loss functions based on practically feasible realized volatility benchmarks. The procedures, which exploit recent nonparametric asymptotic distributional results, are both easy-to-implement and highly accurate in empirically realistic situations. We also illustrate that properly accounting for the measurement errors in the volatility forecast evaluations reported in the existing literature can result in markedly higher estimates for the true degree of return volatility predictability.
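A minimal sketch of the realized-volatility benchmark such evaluations rely on: daily realized variance as the sum of squared high-frequency returns, plus the realized quarticity used in the nonparametric asymptotic theory to gauge RV's measurement error. The return values are hypothetical:

```python
import math

def realized_variance(intraday_returns):
    """Realized variance: sum of squared high-frequency returns within a day."""
    return sum(r * r for r in intraday_returns)

def realized_vol(intraday_returns):
    """Realized volatility: square root of realized variance."""
    return math.sqrt(realized_variance(intraday_returns))

def realized_quarticity(intraday_returns):
    """(n/3) * sum of fourth-power returns: estimates integrated quarticity,
    which governs the asymptotic variance of RV around true volatility."""
    n = len(intraday_returns)
    return (n / 3) * sum(r ** 4 for r in intraday_returns)

rets = [0.001, -0.002, 0.0015, 0.0005, -0.001]   # hypothetical 5-minute returns
rv = realized_variance(rets)
rq = realized_quarticity(rets)
```

Because RV is itself a noisy proxy of the latent volatility, comparing forecasts against raw RV understates predictability; the paper's adjustment corrects the loss function for exactly this sampling error.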

Relevance:

40.00%

Publisher:

Abstract:

A replicate evaluation of increased micronucleus (MN) frequencies in peripheral lymphocytes of workers occupationally exposed to formaldehyde (FA) was undertaken to verify the observed effect and to determine scoring variability. May–Grünwald–Giemsa-stained slides were obtained from a previously performed cytokinesis-block micronucleus test (CBMNT) with 56 workers in anatomy and pathology laboratories and 85 controls. The first evaluation by one scorer (scorer 1) had led to a highly significant difference between workers and controls (3.96 vs 0.81 MN per 1000 cells). The slides were coded before re-evaluation and the code was broken after the complete re-evaluation of the study. A total of 1000 binucleated cells (BNC) were analysed per subject and the frequency of MN (in ‰) was determined. Slides were distributed equally and randomly between two scorers, so that the scorers had no knowledge of the exposure status. Scorer 2 (32 exposed, 36 controls) measured increased MN frequencies in exposed workers (9.88 vs 6.81). Statistical analysis with the two-sample Wilcoxon test indicated that this difference was not significant (p = 0.17). Scorer 3 (20 exposed, 46 controls) obtained a similar result, but slightly higher values for the comparison of exposed and controls (19.0 vs 12.89; p = 0.089). Combining the results of the two scorers (13.38 vs 10.22), a significant difference between exposed and controls (p = 0.028) was obtained when the stratified Wilcoxon test with the scorers as strata was applied. Interestingly, the re-evaluation of the slides led to clearly higher MN frequencies for exposed and controls compared with the first evaluation. Bland–Altman plots indicated that the agreement between the measurements of the different scorers was very poor, as shown by mean differences of 5.9 between scorer 1 and scorer 2 and 13.0 between scorer 1 and scorer 3. 
Calculation of the intra-class correlation coefficient (ICC) revealed that all scorer comparisons in this study were far from acceptable for the reliability of this assay. Possible implications for the use of the CBMNT in human biomonitoring studies are discussed.
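A Bland-Altman comparison of the kind reported above can be sketched as follows: the bias (mean difference) between two scorers on the same slides and the 95% limits of agreement. The per-slide MN counts are hypothetical, not the study's data:

```python
import math

def bland_altman(scores_a, scores_b):
    """Bland-Altman agreement between two scorers on the same slides:
    returns the bias (mean difference) and the 95% limits of agreement
    (bias +/- 1.96 standard deviations of the differences)."""
    diffs = [a - b for a, b in zip(scores_a, scores_b)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# hypothetical MN frequencies (per 1000 BNC) scored on the same ten slides
scorer_a = [4, 6, 3, 8, 5, 7, 2, 9, 6, 4]
scorer_b = [10, 13, 8, 15, 11, 14, 7, 18, 12, 9]
bias, (lo, hi) = bland_altman(scorer_a, scorer_b)
```

A large systematic bias like the one in this toy example (scorer_b consistently higher) is precisely the pattern that undermines between-scorer comparability in the CBMNT.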