933 results for rainfall-runoff empirical statistical model
Abstract:
In the study of the spatial characteristics of the visual channels, the power spectrum model of visual masking is one of the most widely used. When the task is to detect a signal masked by visual noise, this classical model assumes that the signal and the noise are first processed by a bank of linear channels and that the power of the signal at threshold is proportional to the power of the noise passing through the visual channel that mediates detection. The model also assumes that this visual channel has the highest ratio of signal power to noise power at its output. Accordingly, there are masking conditions where the highest signal-to-noise ratio (SNR) occurs in a channel centered at a spatial frequency different from that of the signal (off-frequency looking). Under these conditions the channel mediating detection could vary with the type of noise used in the masking experiment, which could affect the estimation of the shape and bandwidth of the visual channels. It is generally believed that notched noise, white noise, and double bandpass noise prevent off-frequency looking, whereas high-pass, low-pass, and bandpass noises can promote it, independently of the channel's shape. In this study, by means of a procedure that finds the channel maximizing the SNR at its output, we performed numerical simulations using the power spectrum model to study the characteristics of masking caused by six types of one-dimensional noise (white, high-pass, low-pass, bandpass, notched, and double bandpass) for two types of channel shape (symmetric and asymmetric). Our simulations confirm that (1) high-pass, low-pass, and bandpass noises do not prevent off-frequency looking, and (2) white noise satisfactorily prevents off-frequency looking independently of the shape and bandwidth of the visual channel; notably, we show for the first time that (3) notched and double bandpass noises prevent off-frequency looking only when the noise cutoffs around the spatial frequency of the signal match the shape of the visual channel (symmetric or asymmetric) involved in the detection. To test the explanatory power of the model with empirical data, we performed six visual masking experiments. We show that this model, with only two free parameters, fits the empirical masking data with high precision. Finally, we provide equations of the power spectrum model for the six masking noises used in the simulations and in the experiments.
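At the heart of the model is the rule that detection is mediated by the channel with the highest output SNR. The following minimal sketch illustrates that rule under assumed forms (a symmetric log-Gaussian channel and a narrowband signal; neither is this paper's exact parameterization):

```python
# Illustrative sketch of the power spectrum model's channel-selection rule:
# detection is mediated by the channel whose output SNR is highest.
# Channel shape, bandwidth, and the noise spectrum here are assumptions.
import numpy as np

f = np.linspace(0.1, 32, 2048)        # spatial frequency axis (c/deg)
f_signal = 4.0                        # signal spatial frequency

def channel_gain(f, fc, octave_bw=1.0):
    """Symmetric log-Gaussian channel centered at fc (assumed shape)."""
    sigma = octave_bw / (2 * np.sqrt(2 * np.log(2)))
    return np.exp(-0.5 * ((np.log2(f) - np.log2(fc)) / sigma) ** 2)

def output_snr(fc, noise_psd):
    gain2 = channel_gain(f, fc) ** 2
    signal_power = np.interp(f_signal, f, gain2)   # narrowband signal
    noise_power = np.trapz(gain2 * noise_psd, f)   # noise power through channel
    return signal_power / noise_power

# Low-pass noise, one of the types that can promote off-frequency looking
noise_psd = np.where(f <= 3.0, 1.0, 0.0)
centers = np.linspace(1, 16, 200)
best = centers[np.argmax([output_snr(fc, noise_psd) for fc in centers])]
print(f"channel mediating detection: {best:.2f} c/deg (signal at {f_signal})")
```

With the low-pass noise cut off just below the signal frequency, the maximizing channel shifts above the signal frequency, which is exactly the off-frequency looking the simulations probe.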
Abstract:
This study analyzes how the dimensions of Entrepreneurial Orientation (EO) manifest in Project Management Systems (PMS). We used a qualitative approach to conduct exploratory research through a literature review and a pilot case study in a software company. Data were collected from semi-structured interviews, documents, and records on file, then triangulated and treated with content analysis. The model proposed for the relationship between the types of PMS (ad hoc, Classic PM, innovation, entrepreneurship/intrapreneurship) and the dimensions of EO (innovativeness, risk-taking, proactiveness, competitive aggressiveness, and autonomy) was partially corroborated by empirical studies. Further studies are suggested to validate the applicability and setup of the model.
Abstract:
This research investigates the management of knowledge flows in a Mauritian multinational organisation. A case study method was used to gather data, which were analysed using the SECI model. Results show that all four quadrants of the model were applied by the conglomerate in transferring knowledge to its newly acquired manufacturing operations in Madagascar. This paper discusses some of the knowledge management strategies employed.
Abstract:
Extreme weather events related to deep convection are high-impact critical phenomena whose reliable numerical simulation remains challenging. High-resolution (convection-permitting) model setups allow switching off the physical parameterizations responsible for substantial errors in the representation of convection. A new convection-permitting reanalysis over Italy (SPHERA) has been produced at ARPAE to enhance the representation and understanding of extreme weather situations. SPHERA is obtained through a dynamical downscaling of the global reanalysis ERA5 using the non-hydrostatic model COSMO at 2.2 km grid spacing over 1995-2020. This thesis aims to verify the expectations placed on SPHERA by analyzing two weather phenomena that are particularly challenging to simulate: heavy rainfall and hail. A quantitative statistical analysis of daily and hourly precipitation over Italy during 2003-2017 compares the performance of SPHERA with that of its driver ERA5, taking the national network of rain gauges as reference. Furthermore, two extreme precipitation events are investigated in depth. SPHERA shows a quantitative added skill over ERA5 for moderate to severe and rapid accumulations in terms of adherence to the observations, finer detail in the spatial fields, and more precise temporal matching. These results prompted the use of SPHERA for the investigation of hailstorms, for which combining multiple sources of information is crucial to reduce the substantial uncertainties that permeate their understanding. A proxy for hail is developed by combining hail-favoring environmental predictors with ESWD hail reports and satellite overshooting-top detections. The procedure is applied to the extended summer season (April-October) of 2016-2018 over the whole SPHERA spatial domain. The results indicate maximum hail likelihood over pre-Alpine regions and the northern Adriatic Sea around 15 UTC in June-July, in agreement with recent European hail climatologies. The method demonstrates enhanced performance for severe hail occurrences and the ability to discriminate between ambient signatures according to hail severity.
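The precipitation comparison rests on standard categorical verification against the rain gauges. A hedged illustration of that kind of scoring follows; the choice of the Equitable Threat Score and the thresholds are assumptions, not necessarily the thesis's exact setup, and the data are synthetic:

```python
# Hedged illustration of categorical verification against rain gauges.
# The Equitable Threat Score and thresholds are assumed choices; data are synthetic.
import numpy as np

def equitable_threat_score(forecast, observed, threshold):
    """ETS for exceedance of a precipitation threshold (mm/day)."""
    f = forecast >= threshold
    o = observed >= threshold
    hits = np.sum(f & o)
    false_alarms = np.sum(f & ~o)
    misses = np.sum(~f & o)
    hits_random = (hits + false_alarms) * (hits + misses) / f.size
    return (hits - hits_random) / (hits + false_alarms + misses - hits_random)

rng = np.random.default_rng(0)
obs = rng.gamma(0.6, 8.0, size=5000)                    # gauge accumulations
reanalysis = obs * rng.lognormal(0.0, 0.3, size=5000)   # stand-in reanalysis values
for thr in (1, 10, 30):                                 # light to severe rain
    print(thr, "mm/day ETS:", round(equitable_threat_score(reanalysis, obs, thr), 3))
```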
Abstract:
This study evaluates the performance of seasonal forecasts from the regional climate model RegCM3, nested in the CPTEC/COLA global model. The RegCM3 forecasts used a horizontal resolution of 60 km over a domain covering a large part of South America. The RegCM3 and CPTEC/COLA forecasts were evaluated against the rainfall and air temperature analyses of the Climate Prediction Center (CPC) and the National Centers for Environmental Prediction (NCEP), respectively. Between May 2005 and July 2007, 27 seasonal forecasts of rainfall and air temperature (except for CPTEC/COLA temperature, with 26 forecasts) were evaluated over three regions of Brazil: Northeast (NDE), Southeast (SDE), and South (SUL). The RegCM3 forecasts were also compared with the climatologies of the analyses. According to the statistical indices (bias, correlation coefficient, root mean square error, and coefficient of efficiency), in all three regions (NDE, SDE, and SUL) the seasonal rainfall predicted by RegCM3 is closer to the observations than that predicted by CPTEC/COLA. Moreover, RegCM3 is also a better predictor of seasonal rainfall than the mean of the observations in the three regions. For temperature, the RegCM3 forecasts outperform CPTEC/COLA over NDE and SUL, while CPTEC/COLA is superior over SDE. Finally, the RegCM3 rainfall and temperature forecasts are closer to the observations than the observed climatology. These results indicate the potential of RegCM3 for seasonal forecasting, which should be explored in the future through ensemble forecasting.
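The four verification indices named above are straightforward to compute. A short sketch, assuming the Nash-Sutcliffe form for the coefficient of efficiency (the data below are synthetic placeholders, not the CPC/NCEP analyses):

```python
# Sketch of the four verification indices named above; the Nash-Sutcliffe
# form is assumed for the coefficient of efficiency. Data are placeholders.
import numpy as np

def verification_indices(forecast, observed):
    err = forecast - observed
    bias = err.mean()
    corr = np.corrcoef(forecast, observed)[0, 1]
    rmse = np.sqrt((err ** 2).mean())
    # Coefficient of efficiency: 1 is perfect; 0 means no better than
    # forecasting the observed mean (the climatology benchmark).
    ce = 1 - (err ** 2).sum() / ((observed - observed.mean()) ** 2).sum()
    return {"bias": bias, "corr": corr, "rmse": rmse, "ce": ce}

rng = np.random.default_rng(1)
obs = rng.normal(0, 50, 27)               # 27 seasonal rainfall anomalies (mm)
fcst = obs + rng.normal(5, 30, 27)        # stand-in model forecasts
print(verification_indices(fcst, obs))
```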
Abstract:
We describe an estimation technique for biomass burning emissions in South America based on a combination of remote-sensing fire products and field observations: the Brazilian Biomass Burning Emission Model (3BEM). For each fire pixel detected by remote sensing, the mass of the emitted tracer is calculated from field observations of fire properties for the type of vegetation burning. The burnt area is estimated from the instantaneous fire size retrieved by remote sensing, when available, or from the statistical properties of the burn scars. The sources are then distributed in space and time and assimilated daily by the Coupled Aerosol and Tracer Transport model to the Brazilian developments on the Regional Atmospheric Modeling System (CATT-BRAMS) to forecast the associated tracer concentrations. Three other biomass burning inventories, including GFEDv2 and EDGAR, are used simultaneously to compare emission strengths in terms of the resulting tracer distributions. We also assess the effect of the daily time resolution of the fire emissions by including runs with monthly averaged emissions. We evaluate the performance of the model under the different emission estimation techniques by comparing the model results with near-surface and airborne direct measurements of carbon monoxide, as well as with remote-sensing-derived products. The model results obtained with the 3BEM methodology introduced in this paper show relatively good agreement with the direct measurements and the MOPITT data product, suggesting that the model is reliable at local to regional scales.
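Per-pixel emission estimates of this kind typically follow the Seiler-Crutzen formulation (emitted mass = burnt area x biomass density x combustion fraction x emission factor). A sketch under that assumption, with illustrative vegetation parameters rather than 3BEM's actual field-derived values:

```python
# Seiler-Crutzen-style per-fire-pixel emission estimate. The per-vegetation
# values below are illustrative placeholders, NOT 3BEM's field-derived tables.
FIRE_PROPS = {
    #               biomass   combustion  CO emission factor
    #               kg/m^2    fraction    g CO per kg burned
    "forest":      (16.0,     0.40,       104.0),
    "savanna":     ( 0.7,     0.95,        65.0),
    "grassland":   ( 0.5,     0.98,        59.0),
}

def fire_pixel_emission(burnt_area_m2, vegetation):
    """Mass of CO emitted by one detected fire pixel, in kg."""
    biomass, comb_frac, ef_g_per_kg = FIRE_PROPS[vegetation]
    dry_matter_burned = burnt_area_m2 * biomass * comb_frac      # kg
    return dry_matter_burned * ef_g_per_kg / 1000.0              # kg CO

# A 50-ha savanna burn scar detected by remote sensing:
print(fire_pixel_emission(50e4, "savanna"), "kg CO")
```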
Abstract:
We introduce a simple mean-field lattice model to describe the behavior of nematic elastomers. This model combines the Maier-Saupe-Zwanzig approach to liquid crystals and an extension to lattice systems of the Warner-Terentjev theory of elasticity, with the addition of quenched random fields. We use standard techniques of statistical mechanics to obtain analytic solutions for the full range of parameters. Among other results, we show the existence of a stress-strain coexistence curve below a freezing temperature, analogous to the P-V diagram of a simple fluid, with the disorder strength playing the role of temperature. Below a critical value of disorder, the tie lines in this diagram resemble the experimental stress-strain plateau and may be interpreted as signatures of the characteristic polydomain-monodomain transition. Also, in the monodomain case, we show that random fields may soften the first-order transition between nematic and isotropic phases, provided the samples are formed in the nematic state.
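For orientation, the Maier-Saupe ingredient of the model reduces, on its own, to a one-line self-consistency for the nematic order parameter. A minimal sketch of that piece alone (the Warner-Terentjev elastic coupling and the quenched random fields, which are this paper's actual subject, are omitted):

```python
# Minimal Maier-Saupe self-consistency: the order parameter S solves
# S = <P2(cos theta)> under the mean-field weight exp(beta*S*P2(cos theta)).
# The elastic and random-field terms of the full model are omitted here.
import numpy as np

theta = np.linspace(0, np.pi, 2001)
p2 = 0.5 * (3 * np.cos(theta) ** 2 - 1)
sin_t = np.sin(theta)

def order_parameter(beta, s0=0.6, iters=500):
    s = s0
    for _ in range(iters):                 # fixed-point iteration
        w = np.exp(beta * s * p2) * sin_t
        s = np.trapz(p2 * w, theta) / np.trapz(w, theta)
    return s

for beta in (4.0, 4.55, 5.0):              # first-order transition near beta ~ 4.54
    print(beta, round(order_parameter(beta), 3))
```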
Abstract:
We numerically study the dynamics of a discrete spring-block model introduced by Olami, Feder, and Christensen (OFC) to mimic earthquakes, and investigate to what extent this simple model is able to reproduce the observed spatiotemporal clustering of seismicity. Following a recently proposed method to characterize such clustering by networks of recurrent events [J. Davidsen, P. Grassberger, and M. Paczuski, Geophys. Res. Lett. 33, L11304 (2006)], we find that for synthetic catalogs generated by the OFC model these networks have many nontrivial statistical properties. These include characteristic degree distributions very similar to those observed for real seismicity. There are, however, also significant differences between the OFC model and earthquake catalogs, indicating that this simple model is insufficient to account for certain aspects of the spatiotemporal clustering of seismicity.
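The OFC dynamics themselves are compact enough to state in a few lines: uniform slow loading until some site reaches threshold, then a cascade that transfers a fraction alpha of the relaxed stress to each neighbor. A sketch with illustrative parameter choices (lattice size, alpha, open boundaries):

```python
# Sketch of the OFC spring-block dynamics. Lattice size, alpha, and the
# boundary handling are illustrative choices, not the paper's exact setup.
import numpy as np

L, ALPHA, F_TH = 64, 0.2, 1.0             # alpha < 0.25 => non-conservative
rng = np.random.default_rng(42)
F = rng.uniform(0, F_TH, (L, L))

def ofc_event(F):
    """Drive to threshold, relax the avalanche, return its size."""
    F += F_TH - F.max()                   # uniform slow drive
    size = 0
    while True:
        unstable = np.argwhere(F >= F_TH)
        if unstable.size == 0:
            return size
        for i, j in unstable:
            f = F[i, j]
            F[i, j] = 0.0                 # relax the toppling site
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < L and 0 <= nj < L:    # open boundaries lose stress
                    F[ni, nj] += ALPHA * f         # partial stress transfer
            size += 1

sizes = [ofc_event(F) for _ in range(2000)]       # synthetic "catalog"
print("mean avalanche size:", np.mean(sizes))
```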
Abstract:
Efficient automatic protein classification is of central importance in genomic annotation. As an independent check on the reliability of the classification, we propose a statistical approach to test whether two sets of protein domain sequences coming from two families of the Pfam database are significantly different. We model protein sequences as realizations of Variable Length Markov Chains (VLMC) and use the context trees as a signature of each protein family. Our approach is based on a Kolmogorov-Smirnov-type goodness-of-fit test proposed by Balding et al. [Limit theorems for sequences of random trees (2008), DOI: 10.1007/s11749-008-0092-z]. The test statistic is a supremum, over the space of trees, of a function of the two samples; its computation grows, in principle, exponentially with the maximal number of nodes of the potential trees. We show how to transform this problem into a max-flow problem over a related graph, which can be solved using a Ford-Fulkerson algorithm in time polynomial in that number. We apply the test to 10 randomly chosen protein domain families from the seed of the Pfam-A database (high-quality, manually curated families). The test shows that the distributions of context trees coming from different families are significantly different. We emphasize that this is a novel mathematical approach to validate the automatic clustering of sequences in any context. We also study the performance of the test via simulations on Galton-Watson related processes.
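The reduction from a supremum over tree space to a max-flow computation is the paper's key computational step. As a minimal illustration of the max-flow building block only (the graph below is a toy, not the authors' construction), networkx solves such problems in polynomial time:

```python
# Toy max-flow computation with networkx, illustrating the polynomial-time
# building block; this is NOT the authors' actual graph construction.
import networkx as nx

G = nx.DiGraph()
G.add_edge("s", "a", capacity=3.0)
G.add_edge("s", "b", capacity=2.0)
G.add_edge("a", "t", capacity=2.5)
G.add_edge("a", "b", capacity=1.0)
G.add_edge("b", "t", capacity=3.0)

flow_value, flow_dict = nx.maximum_flow(G, "s", "t")
print("max flow:", flow_value)            # 5.0 for this toy network
```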
Abstract:
The Random Parameter model was proposed to explain the structure of the covariance matrix in problems where most, but not all, of the eigenvalues of the covariance matrix can be explained by Random Matrix Theory. In this article, we explore the scaling properties of the model, as observed in the multifractal structure of the simulated time series. We use the Wavelet Transform Modulus Maxima technique to obtain the dependence of the multifractal spectrum on the parameters of the model. The model shows a scaling structure compatible with the stylized facts for a reasonable choice of parameter values.
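The Random Matrix Theory baseline invoked above amounts to comparing empirical correlation-matrix eigenvalues with the Marchenko-Pastur bulk; eigenvalues outside the bulk carry genuine structure. A brief sketch with synthetic i.i.d. returns (so essentially all eigenvalues should fall inside):

```python
# Marchenko-Pastur check: with i.i.d. returns, correlation eigenvalues
# should fall inside the MP bulk; deviations signal real structure.
import numpy as np

T, N = 2000, 100                          # time samples, assets
rng = np.random.default_rng(7)
returns = rng.standard_normal((T, N))     # synthetic i.i.d. returns

C = np.corrcoef(returns, rowvar=False)    # N x N empirical correlation matrix
eigvals = np.linalg.eigvalsh(C)

q = N / T
lam_min = (1 - np.sqrt(q)) ** 2           # MP bulk edges (unit variance)
lam_max = (1 + np.sqrt(q)) ** 2
inside = np.mean((eigvals >= lam_min) & (eigvals <= lam_max))
print(f"fraction of eigenvalues inside MP bulk: {inside:.2f}")
```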
Abstract:
In this work we study an agent-based model to investigate the role of the degree of information asymmetry in market evolution. The model is quite simple and can be treated analytically, since consumers evaluate the quality of a good taking into account only the quality of the last good purchased plus their perceptive capacity beta. As a consequence, the system evolves according to a stationary Markov chain. The value of a good offered by the firms increases with quality according to an exponent alpha, which is a measure of the technology: it incorporates the technological capacity of the production system, such as education, scientific development, and techniques that change productivity rates. The technological level plays an important role in explaining how information asymmetry affects market evolution in this model. We observe that, at high technological levels, the market can detect adverse selection. The model allows us to compute the maximum degree of information asymmetry before the market collapses. Below this critical point the market evolves for a limited period of time and then dies out completely. As beta approaches 1 (symmetric information), the market becomes more profitable for high-quality goods, although high- and low-quality markets coexist. The maximum level of information asymmetry is a consequence of an ergodicity breakdown in the process of quality evaluation.
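As a toy rendering of the core loop (the update rule and functional forms below are illustrative assumptions, not the authors' specification), the quality dynamics can be simulated as a Markov chain in a few lines:

```python
# Toy sketch of a quality-evaluation Markov chain; the update rule and
# functional forms are illustrative assumptions, not the authors' model.
import numpy as np

ALPHA, BETA = 1.5, 0.8        # technology exponent, perceptive capacity
rng = np.random.default_rng(3)

def perceived(q):
    """Noisy evaluation: beta = 1 is full information, beta = 0 pure noise."""
    return BETA * q + (1 - BETA) * rng.uniform()

last = 0.5                    # quality of the last good purchased
traded = []
for _ in range(10_000):       # the chain depends only on the last purchase
    offer = rng.uniform()     # quality of the good currently on offer
    if perceived(offer) >= perceived(last):
        last = offer          # buy: the consumer's reference updates
    traded.append(last)

q_mean = np.mean(traded)
print(f"mean traded quality {q_mean:.3f}, value ~ q^alpha = {q_mean ** ALPHA:.3f}")
```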
Abstract:
The literature presents a huge number of simulations of gas-solid flows in risers applying two-fluid modeling. In spite of that, the related question of quantitative accuracy remains mostly untouched. This state of affairs seems to be mainly a consequence of modeling shortcomings, notably the lack of realistic closures. In this article, predictions from a two-fluid model are compared to other published two-fluid model predictions applying the same closures, and to experimental data. A particular matter of concern is whether or not the predictions are generated inside the statistical steady-state regime that characterizes riser flows. The present simulation was performed inside the statistical steady-state regime. Time-averaged results are presented for time-averaging intervals of 5, 10, 15 and 20 s inside the statistical steady-state regime. The independence of the averaged results from the time-averaging interval is addressed, and the results averaged over the 10 and 20 s intervals are compared to both the experiment and the other two-fluid predictions. It is concluded that the two-fluid model used is still very crude and cannot provide quantitatively accurate results, at least for the particular case considered.
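The time-averaging check described above is simple in principle: within the statistical steady-state regime, averages over growing windows should converge to the same value. A sketch with synthetic stand-in data (not riser simulation output):

```python
# Time-averaging independence check over 5/10/15/20 s windows.
# The "signal" is synthetic stand-in data, not riser simulation output.
import numpy as np

rng = np.random.default_rng(11)
dt = 0.001                                 # s, sample interval
t = np.arange(0, 20, dt)
# stationary fluctuating signal, e.g. solids volume fraction at a probe point
signal = 0.15 + 0.02 * rng.standard_normal(t.size)

for window in (5, 10, 15, 20):             # s
    n = int(window / dt)
    print(f"{window:>2d} s average: {signal[:n].mean():.5f}")
```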
Abstract:
The purpose of this research is to analyze the contribution of human resources management throughout the evolutionary stages of environmental management in Brazilian companies. A theoretical framework concerning environmental management and its evolution, and the 'greening' of the functional and competitive dimensions of human resource management, was developed. A methodological triangulation was carried out in two complementary phases. In the first phase, data were collected from 94 Brazilian companies with ISO 14001 certification and analyzed using statistical techniques. The conclusions of the first phase supported the second phase of this empirical research, which consisted of a multiple-case study of four Brazilian companies. The results constitute, to our knowledge, the first empirical evidence of the contributions of human resource dimensions throughout the stages of environmental management in Brazilian manufacturing companies.
Abstract:
An updated flow pattern map was developed for CO2 on the basis of the previous Cheng-Ribatski-Wojtan-Thome CO2 flow pattern map [1,2], extending it to a wider range of conditions. A new annular flow to dryout transition (A-D) and a new dryout to mist flow transition (D-M) are proposed. In addition, a bubbly flow region, which generally occurs at high mass velocities and low vapor qualities, was added to the updated map. The updated flow pattern map is applicable to a much wider range of conditions: tube diameters from 0.6 to 10 mm, mass velocities from 50 to 1500 kg/m² s, heat fluxes from 1.8 to 46 kW/m², and saturation temperatures from -28 to +25 °C (reduced pressures from 0.21 to 0.87). The updated flow pattern map was compared to independent experimental data of flow patterns for CO2 in the literature and predicts the flow patterns well. A database of CO2 two-phase flow pressure drop results from the literature was then set up and compared to the leading empirical pressure drop models: the correlations by Chisholm [3], Friedel [4], Gronnerud [5] and Müller-Steinhagen and Heck [6], a modified Chisholm correlation by Yoon et al. [7], and the flow pattern based model of Moreno Quiben and Thome [8-10]. None of these models predicted the CO2 pressure drop data well. Therefore, a new flow pattern based phenomenological model of two-phase frictional pressure drop for CO2 was developed by modifying the model of Moreno Quiben and Thome using the updated flow pattern map developed in this study, and it predicts the CO2 pressure drop database quite well overall.
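Of the correlations listed, the Müller-Steinhagen and Heck model is the most compact: dp/dz = G_MSH (1-x)^(1/3) + B x^3, with G_MSH = A + 2(B-A)x, where A and B are the single-phase gradients with the whole mass flux flowing as liquid and as vapor, respectively. A hedged sketch of that correlation only (the CO2 properties are rough values near 0 °C, used purely for illustration):

```python
# Mueller-Steinhagen and Heck (1986) two-phase frictional pressure gradient.
# Fluid properties below are rough CO2 values near 0 degC, illustrative only.
def single_phase_gradient(G, d, rho, mu):
    """Darcy friction gradient (Pa/m) with a Blasius friction factor."""
    Re = G * d / mu
    f = 0.316 * Re ** -0.25 if Re > 2300 else 64.0 / Re
    return f * G ** 2 / (2.0 * rho * d)

def mueller_steinhagen_heck(G, x, d, rho_l, rho_v, mu_l, mu_v):
    A = single_phase_gradient(G, d, rho_l, mu_l)   # liquid-only gradient
    B = single_phase_gradient(G, d, rho_v, mu_v)   # vapor-only gradient
    g = A + 2.0 * (B - A) * x
    return g * (1.0 - x) ** (1.0 / 3.0) + B * x ** 3

dp_dz = mueller_steinhagen_heck(G=300.0, x=0.4, d=0.006,
                                rho_l=928.0, rho_v=97.6,
                                mu_l=99e-6, mu_v=15e-6)
print(f"frictional gradient: {dp_dz:.0f} Pa/m")
```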
Abstract:
Cementitious stabilization of aggregates and soils is an effective technique for increasing the stiffness of base and subbase layers. Cementitious bases can also improve the fatigue behavior of asphalt surface layers and mitigate subgrade rutting over the short and long term. However, stabilization can introduce additional distresses, such as shrinkage and fatigue, in the stabilized layers. Extensive research has tested and characterized these materials experimentally; however, very little of it attempts to correlate the mechanical properties of the stabilized layers with their performance. The Mechanistic-Empirical Pavement Design Guide (MEPDG) provides a promising theoretical framework for modeling pavements containing cementitiously stabilized materials (CSMs). However, significant improvements are needed to bring the modeling of semirigid pavements in the MEPDG to the same level as that of flexible and rigid pavements. Furthermore, the MEPDG does not model CSMs in a manner similar to hot-mix asphalt or portland cement concrete materials. As a result, performance gains from stabilized layers are difficult to assess with the MEPDG. This study evaluates the current characterization of CSMs and discusses issues with CSM modeling and characterization in the MEPDG. Addressing these issues will help designers quantify the benefits of stabilization for pavement service life.