882 results for Predicted Distribution Data


Relevance: 30.00%

Publisher:

Abstract:

Vehicle activated signs (VAS) display a warning message when drivers exceed a particular speed threshold. VAS are often installed on local roads to display a warning message depending on the speed of approaching vehicles. VAS are usually powered by mains electricity; however, battery- and solar-powered VAS are also commonplace. This thesis investigated the development of an automatic trigger speed for vehicle activated signs in order to influence driver behaviour, the effect of which was measured in terms of reduced mean speed and lower standard deviation. A comprehensive understanding of the effect of the VAS trigger speed on driver behaviour was established by systematically collecting data. Specifically, data on time of day, speed, length and direction of each vehicle were collected using Doppler radar installed at the roadside. A data-driven calibration method for the radar used in the experiment was also developed and evaluated. Results indicate that the trigger speed of the VAS had a variable effect on drivers' speed at different sites and at different times of day. It is evident that the optimal trigger speed should be set near the 85th percentile speed in order to lower the standard deviation. In the case of battery- and solar-powered VAS, trigger speeds between the 50th and 85th percentiles offered the best compromise between safety and power consumption. Results also indicate that different classes of vehicles show differences in mean speed and standard deviation; on a highway, the mean speed of cars differs only slightly from the mean speed of trucks, whereas a significant difference was observed between vehicle classes on local roads. A differential trigger speed was therefore investigated for the sake of completeness. A data-driven approach using random forests was found to be appropriate for predicting trigger speeds for each type of vehicle and traffic condition.
The fact that the predicted trigger speed was consistently found to be around the 85th percentile speed justifies the choice of the automatic model.
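As a minimal sketch of the percentile rule the thesis converges on, the snippet below computes a candidate trigger speed as the 85th percentile of observed speeds per vehicle class. The radar records here are synthetic stand-ins; the thesis's actual differential model is a random forest over richer features (time of day, vehicle length, traffic conditions).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical radar records: speeds (km/h) for two vehicle classes on a
# local road. Vehicle class is inferred from measured length in the thesis.
speeds = {
    "car": rng.normal(52.0, 6.0, 500),
    "truck": rng.normal(46.0, 4.0, 200),
}

def trigger_speed(class_speeds, percentile=85.0):
    """Candidate VAS trigger speed: the given percentile of observed speeds."""
    return float(np.percentile(class_speeds, percentile))

for cls, v in speeds.items():
    print(cls, round(trigger_speed(v), 1))
```

Lowering the percentile toward the 50th trades some safety benefit for less frequent sign activation, which is the power-saving compromise the thesis reports for battery- and solar-powered signs.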

Relevance: 30.00%

Publisher:

Abstract:

Instrumentation and automation play a vital role in managing the water industry. These systems generate vast amounts of data that must be effectively managed to enable intelligent decision making. Time-series data management software, commonly known as data historians, is used for collecting and managing real-time (time-series) information. More advanced software solutions provide a data infrastructure, or utility-wide Operations Data Management System (ODMS), that stores, manages, calculates, displays, shares, and integrates data from the multiple disparate automation and business systems used daily in water utilities. These ODMS solutions are proven and can manage data ranging from smart water meters to data shared across third-party corporations. This paper focuses on practical utility successes in the water industry, where utility managers are leveraging instantaneous access to data from proven, commercial off-the-shelf ODMS solutions to enable better real-time decision making. Successes include saving $650,000 per year in water loss control, safeguarding water quality, and saving millions of dollars in energy and asset management. Immediate opportunities exist to integrate the research being done in academia with these ODMS solutions in the field and to extend these successes to utilities around the world.

Relevance: 30.00%

Publisher:

Abstract:

In this research the 3DVAR data assimilation scheme is implemented in the numerical model DIVAST in order to optimize the performance of the numerical model by selecting an appropriate turbulence scheme and tuning its parameters. Two turbulence closure schemes, the Prandtl mixing length model and the two-equation k-ε model, were incorporated into DIVAST and examined with respect to their universality of application, complexity of solutions, computational efficiency and numerical stability. A square harbour with one symmetrical entrance subject to tide-induced flows was selected to investigate the structure of turbulent flows. The experimental part of the research was conducted in a tidal basin. A significant advantage of such a laboratory experiment is the fully controlled environment, where domain setup and forcing are user-defined. The research shows that the Prandtl mixing length model and the two-equation k-ε model, with default parameterization predefined according to literature recommendations, overestimate eddy viscosity, which in turn results in a significant underestimation of velocity magnitudes in the harbour. Assimilating the model-predicted velocities with the laboratory observations significantly improves the predictions of both turbulence models by adjusting the modelled flows in the harbour to match the de-errored observations. 3DVAR also makes it possible to identify and quantify shortcomings of the numerical model. Such comprehensive analysis gives an optimal solution from which the numerical model parameters can be estimated. The process of turbulence model optimization by reparameterization and tuning towards the optimal state led to new constants that may potentially be applied to complex turbulent flows, such as rapidly developing flows or recirculating flows.
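For reference, the 3DVAR analysis state is obtained by minimizing the standard variational cost function, written here in generic notation (the DIVAST-specific operators are not reproduced from the thesis):

```latex
J(\mathbf{x}) = \tfrac{1}{2}\,(\mathbf{x}-\mathbf{x}_b)^{\mathsf T}\mathbf{B}^{-1}(\mathbf{x}-\mathbf{x}_b)
              + \tfrac{1}{2}\,\bigl(\mathbf{y}-H(\mathbf{x})\bigr)^{\mathsf T}\mathbf{R}^{-1}\bigl(\mathbf{y}-H(\mathbf{x})\bigr)
```

where x_b is the background (model-predicted) state, y the laboratory observations, H the observation operator, and B and R the background- and observation-error covariance matrices.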

Relevance: 30.00%

Publisher:

Abstract:

Drinking water distribution networks are exposed to the risk of malicious or accidental contamination. Several levels of response are conceivable. One of them consists of installing a sensor network to monitor the system in real time. Once a contamination has been detected, it is also important to take appropriate counter-measures. In the SMaRT-OnlineWDN project, this relies on modelling to predict both hydraulics and water quality. Using the model online makes it possible to identify the contaminant source and to simulate the contaminated area. The objective of this paper is to present the SMaRT-OnlineWDN experience and research results for hydraulic state estimation with a sampling frequency of a few minutes. A least squares problem with bound constraints is formulated to adjust demand class coefficients to best fit the observed values at a given time. The criterion is a Huber function, which limits the influence of outliers. A Tikhonov regularization is introduced to take prior information on the parameter vector into account. The Levenberg-Marquardt algorithm, which uses derivative information to limit the number of iterations, is then applied. Confidence intervals for the state prediction are also given. The results are presented and discussed on real networks in France and Germany.

Relevance: 30.00%

Publisher:

Abstract:

Hydrological loss is a vital component of many hydrological models, which are used in forecasting floods and evaluating water resources for both surface and subsurface flows. Due to the complex and random nature of the rainfall-runoff process, hydrological losses are not yet fully understood. Consequently, practitioners often use representative loss values for design applications such as rainfall-runoff modelling, which has led to inaccurate quantification of water quantities in the resulting applications. The existing hydrological loss models must therefore be revisited, and modellers should be encouraged to utilise other available data sets. This study is based on three unregulated catchments situated in the Mt. Lofty Ranges of South Australia (SA). The paper focuses on conceptual models relating initial loss (IL), continuing loss (CL) and proportional loss (PL) to rainfall characteristics (total rainfall (TR) and storm duration (D)) and antecedent wetness (AW) conditions. The paper introduces two methods that can be implemented to estimate IL as a function of TR, D and AW. The IL distribution patterns and parameters for the study catchments are determined using multivariate analysis and descriptive statistics. The possibility of generalising the methods, and the limitations of doing so, are also discussed. This study will yield improvements to existing loss models and will encourage practitioners to utilise multiple data sets to estimate losses, instead of using hypothetical or representative values to generalise real situations.
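The kind of fit the paper describes, estimating IL as a function of TR, D and AW, can be sketched with an ordinary multivariate least-squares regression on synthetic storm events. The linear form and all numbers below are illustrative assumptions, not the paper's actual models.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical storm events: total rainfall TR (mm), duration D (h),
# antecedent wetness AW (mm). The linear form of IL below is an
# illustrative assumption only.
n = 200
TR = rng.uniform(5, 80, n)
D = rng.uniform(1, 24, n)
AW = rng.uniform(0, 40, n)
IL = 12.0 + 0.10 * TR - 0.20 * D - 0.15 * AW + rng.normal(0, 1.0, n)

# Multivariate least-squares fit of IL as a function of TR, D and AW.
X = np.column_stack([np.ones(n), TR, D, AW])
coef, *_ = np.linalg.lstsq(X, IL, rcond=None)
print(np.round(coef, 2))
```

The fitted coefficients recover the assumed dependence: IL grows with total rainfall and falls with storm duration and antecedent wetness, the qualitative behaviour one would expect of an initial loss.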

Relevance: 30.00%

Publisher:

Abstract:

This work examined the characteristics of stock portfolios optimized under the mean-variance criterion and formed using robust estimates of risk and return. The motivation for this is the typical distribution of financial assets, which exhibits outliers and more kurtosis than the normal distribution. To compare the portfolios, the following properties were considered: stability, variability and the Sharpe ratios they obtained. The overall result shows that portfolios obtained through robust estimates of risk and return exhibit improved stability and variability; however, this improvement is insufficient to distinguish their Sharpe ratios from those of portfolios obtained through maximum likelihood estimates of risk and return.
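A minimal sketch of the comparison described above: minimum-variance weights computed from a classical (maximum-likelihood) covariance versus a covariance estimated from winsorized returns. Winsorizing is only a crude stand-in for the proper robust estimators the work uses, and the return series is synthetic.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical daily returns for 3 assets, with a few gross outliers mixed in
# to mimic the fat tails and excess kurtosis of financial assets.
R = rng.normal(0.0005, 0.01, size=(500, 3))
R[::97] += 0.15                  # occasional fat-tail shocks

def min_variance_weights(cov):
    """Mean-variance minimum-variance portfolio: w proportional to inv(Sigma) @ 1."""
    ones = np.ones(cov.shape[0])
    w = np.linalg.solve(cov, ones)
    return w / w.sum()

# Crude robust proxy: winsorize returns at the 1st/99th percentiles
# before estimating the covariance matrix.
lo, hi = np.percentile(R, [1, 99], axis=0)
R_wins = np.clip(R, lo, hi)

w_ml = min_variance_weights(np.cov(R, rowvar=False))
w_rob = min_variance_weights(np.cov(R_wins, rowvar=False))
print(np.round(w_ml, 3), np.round(w_rob, 3))
```

Comparing the stability of `w_ml` and `w_rob` across resampled windows is the kind of exercise behind the stability and variability comparison reported in the abstract.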

Relevance: 30.00%

Publisher:

Abstract:

Access has been one of the main difficulties companies have faced in emerging markets (PRAHALAD, 2005). The capillarity of the market; the existence of small, non-professionalized and sometimes informal retailers; the lack of infrastructure; and high transportation costs are some of the distribution challenges companies face in poorer regions. The literature on the Base of the Pyramid (BoP) is still recent, and only after the seminal article by Prahalad and Hart (2002) did it evolve into many different management perspectives. However, there is a lack of research concerning distribution strategies for the BoP. Therefore, the main objective of this research is to identify, in the perception of executives working in the market, the conditions associated with satisfactory distribution for the BoP market in Brazil, and to build a substantive theory that helps shed light on the distribution strategies adopted by consumer goods companies to reach the BoP market in Brazil. To accomplish the objectives of this thesis, a grounded theory methodology (Glaser; Strauss, 1967; Corbin; Strauss, 2008) was used. This approach helped to identify the channel strategies used by local and global companies in the market. Several data collection techniques were applied, the most important being in-depth interviews with 26 executives from 24 different consumer goods companies in Brazil. The companies included small, medium and large enterprises, grouped as manufacturers, distributors and retailers. Furthermore, secondary data were examined to identify business strategies for reaching the BoP and to map global distribution initiatives. A consumer-panel database was also used to analyze what non-durable goods BoP consumers purchase, and where. It was verified that small, traditional retailing is a very strong format in BoP markets and in the Northern/Northeastern regions, and that the cash-and-carry format is growing rapidly.
On the other hand, hypermarkets are little used by the low-income population. The results suggest that three major categories are associated with satisfactory distribution: (a) willingness, meaning the effort, knowledge and enthusiasm a firm brings to operating in BoP markets; (b) well-done execution, which is related to designing the marketing channel correctly and operating efficiently in an environment full of obstacles such as lack of infrastructure, capillarity, lack of safety, regional differences and informality; and (c) relationship, which was perceived to be friendlier and essential in BoP markets, since it is very difficult for manufacturers to reach the entire market alone. Satisfactory distribution is more likely when manufacturers establish strong relationships in the marketing channel. Moreover, small retailers have a perception of isolation and expect a higher level of relationship. These major categories also explain the competitive advantage that local companies have over MNCs and large companies. Despite the limitations of an exploratory study, it is expected that this thesis will contribute to BoP knowledge and to the identification of the peculiarities of distribution in BoP markets.

Relevance: 30.00%

Publisher:

Abstract:

To assess the quality of school education, much of educational research is concerned with comparisons of test score means or medians. In this paper, we shift this focus and explore test score data by addressing some often neglected questions. In the case of Brazil, the mean test score in Math for students of the fourth grade declined by approximately 0.2 standard deviations in the late 1990s. But what about changes in the distribution of scores? It is unclear whether the decline was caused by deterioration in student performance in the upper and/or lower tails of the distribution. To answer this question, we propose the use of the relative distribution method developed by Handcock and Morris (1999). The advantage of this methodology is that it compares two distributions of test score data through a single distribution that synthesizes all the differences between them. Moreover, it is possible to decompose the total difference between two distributions into a level effect (changes in median) and a shape effect (changes in the shape of the distribution). We find that the decline in average test scores is mainly caused by a worsening in the position of students throughout the whole distribution of scores, and is not specific to any quantile of the distribution.
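The core of the relative distribution method can be sketched in a few lines: each comparison score is placed at its percentile rank in the reference distribution, and departures of those ranks from uniformity summarize the distributional change. The test-score data below are synthetic, not the Brazilian survey data.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical test scores: a reference cohort and a comparison cohort whose
# whole distribution has shifted down (a "level effect" in the paper's terms).
reference = rng.normal(250, 50, 5000)
comparison = rng.normal(240, 50, 5000)

def relative_data(reference, comparison):
    """Relative distribution method (Handcock & Morris): place each comparison
    score at its percentile rank r = F0(y) in the reference distribution.
    If the two distributions were identical, r would be uniform on [0, 1]."""
    ref_sorted = np.sort(reference)
    return np.searchsorted(ref_sorted, comparison) / len(ref_sorted)

r = relative_data(reference, comparison)
print(round(float(r.mean()), 3))   # below 0.5 indicates a downward shift overall
```

A histogram of `r` is the relative density itself: a uniform histogram means no change, mass piled below 0.5 means worsening across the distribution, which is the pattern the paper reports for Brazil.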

Relevance: 30.00%

Publisher:

Abstract:

Competitive Strategy literature predicts three different mechanisms of performance generation, thus distinguishing between firms that have competitive advantage, firms that have competitive disadvantage, and firms that have neither. Nonetheless, previous work in the field has fitted a single normal distribution to model firm performance. Here, we develop a new approach that distinguishes among performance-generating mechanisms and allows the identification of firms with competitive advantage or disadvantage. Theorizing on the positive feedback loops by which firms with competitive advantage gain facilitated access to new resources, we propose a distribution that we believe data on firm performance should follow. We illustrate our model by assessing its fit to data on firm performance, addressing its theoretical implications and comparing it to previous work.
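The proposal above amounts to modelling performance as a mixture rather than a single normal. A plain EM fit of a three-component 1-D Gaussian mixture on synthetic data, one component per mechanism, illustrates the idea; this is a sketch, not the distribution the paper actually proposes.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical firm-performance data drawn from three regimes: competitive
# disadvantage, parity, and advantage. A 3-component normal mixture
# (instead of the single normal of prior work) lets each regime be recovered.
x = np.concatenate([
    rng.normal(-2.0, 0.5, 200),   # disadvantage
    rng.normal(0.0, 0.5, 600),    # neither
    rng.normal(2.5, 0.5, 200),    # advantage
])

def em_gmm(x, means, sds, weights, iters=200):
    """Plain EM for a 1-D Gaussian mixture (illustrative, not the paper's fit)."""
    for _ in range(iters):
        # E-step: responsibilities of each component for each observation.
        dens = np.exp(-0.5 * ((x[:, None] - means) / sds) ** 2) / (sds * np.sqrt(2 * np.pi))
        resp = weights * dens
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: re-estimate parameters from the responsibilities.
        nk = resp.sum(axis=0)
        means = (resp * x[:, None]).sum(axis=0) / nk
        sds = np.sqrt((resp * (x[:, None] - means) ** 2).sum(axis=0) / nk)
        weights = nk / len(x)
    return means, sds, weights

means, sds, weights = em_gmm(x, means=np.array([-1.0, 0.5, 1.5]),
                             sds=np.array([1.0, 1.0, 1.0]),
                             weights=np.array([1 / 3, 1 / 3, 1 / 3]))
print(np.round(np.sort(means), 2))
```

The fitted component means separate the three performance-generating mechanisms, and the responsibilities assign each firm a probability of belonging to the advantage or disadvantage group.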

Relevance: 30.00%

Publisher:

Abstract:

According to Diamond (1977), one of the reasons for the existence of social security systems is that they function as an income redistribution mechanism. There is an extensive literature testing whether social security systems produce the desired results in developed countries (mainly for the U.S.A.). Nevertheless, there is no obvious consensus about this property of social security, and there is little evidence for developing countries. In this article, we test this property for the Brazilian Social Security System. In addition, we also examine a question that has not been answered in the previous literature: is the trend of social security systems increasingly progressive or regressive? We conclude that the changes in Brazilian Social Security legislation reduced inequality between 1987 and 1996, but only for the elderly; for the other age groups, the trend is stable. Results for the period between 1996 and 2006 reveal that the Brazilian system is neutral for all cohorts. We therefore find that social security systems are not an effective mechanism for income redistribution, as predicted by previous studies.
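Inequality comparisons of the kind reported above are commonly made with the Gini coefficient. The sketch below, on invented income vectors, only illustrates such a test; it is not the paper's actual methodology or data.

```python
import numpy as np

def gini(income):
    """Gini coefficient via the sorted-index formula:
    G = 2 * sum(i * x_i) / (n * sum(x)) - (n + 1) / n, with x sorted, i = 1..n."""
    x = np.sort(np.asarray(income, dtype=float))
    n = len(x)
    i = np.arange(1, n + 1)
    return 2 * np.sum(i * x) / (n * np.sum(x)) - (n + 1) / n

# Hypothetical incomes for a cohort before and after a benefit change;
# a flat benefit increase is progressive in relative terms, so Gini falls.
before = np.array([200.0, 300.0, 500.0, 900.0, 2100.0])
after = before + 300.0
print(round(gini(before), 3), round(gini(after), 3))
```

A legislative change is redistributive in this sense when the after-benefit Gini is lower than the before-benefit one, which is the comparison the paper runs across cohorts and periods.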

Relevance: 30.00%

Publisher:

Abstract:

This work explores an important concept developed by Breeden and Litzenberger to extract the information contained in interest rate options in the Brazilian market (the IDI option), traded on the São Paulo Securities, Commodities and Futures Exchange (BM&FBOVESPA), in the days before and after the COPOM decision on the Selic rate. The method consists of determining the probability distribution from the prices of IDI options, after computing the implied volatility surface using two techniques widespread in the market: cubic spline interpolation and the Black (1976) model. The first four moments of the distribution are analyzed: expected value, variance, skewness and kurtosis, as well as their respective variations.
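The Breeden and Litzenberger result the work relies on states that the risk-neutral density is the discounted second strike-derivative of the call price, f(K) = e^{rT} d²C/dK². The sketch below applies it by finite differences to Black (1976) prices on an invented forward; the numbers are made up, not BM&FBOVESPA data.

```python
import math

import numpy as np

# Hypothetical Black (1976) market: forward F, flat implied vol sigma.
F, sigma, T, r = 100.0, 0.25, 0.5, 0.10
df = math.exp(-r * T)   # discount factor e^{-rT}

def norm_cdf(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def black76_call(K):
    d1 = (math.log(F / K) + 0.5 * sigma ** 2 * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return df * (F * norm_cdf(d1) - K * norm_cdf(d2))

# Breeden-Litzenberger: f(K) = e^{rT} * d2C/dK2, approximated by a
# central second difference on a strike grid.
K = np.linspace(50.0, 200.0, 601)
h = K[1] - K[0]
C = np.array([black76_call(k) for k in K])
density = (C[2:] - 2.0 * C[1:-1] + C[:-2]) / h ** 2 / df

# The recovered risk-neutral density should integrate to approximately one.
print(round(float(density.sum() * h), 3))
```

In the paper's setting the flat `sigma` would be replaced by the interpolated implied volatility surface before differentiating, so the recovered density reflects the market's skew and kurtosis around COPOM meetings.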

Relevance: 30.00%

Publisher:

Abstract:

This paper examines the evolution of wage inequality in Brazil in the 1980s and 1990s. It investigates the role played by changing economic returns to education and to experience over this period, together with the evolution of within-group inequality. It applies a quantile regression approach on grouped data to the Brazilian case. Results using repeated cross-sections of a Brazilian annual household survey indicate that: i) male wage dispersion remained basically constant overall in the 1980s and 1990s but increased substantially within education and age groups; ii) returns to experience increased significantly over this period, with the rise concentrated in the illiterate/primary school group; iii) returns to college education rose over time, whereas returns to intermediate and high school education fell; iv) the apparent rise in within-group inequality seems to be the result of a fall in real wages, since the difference in wage levels declined substantially over the period, especially within the highly educated sample; v) returns to experience rise with education; vi) returns to education rise over the life cycle; vii) wage inequality increases over the life cycle. The next step in this research will attempt to reconcile all these stylised facts.
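Quantile regression of the kind used in the paper can be sketched by minimizing the pinball (check) loss directly. The wage data, functional form and solver choice below are illustrative assumptions only, not the paper's specification.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(5)

# Hypothetical wage data: log wages as a function of years of schooling, with
# dispersion that grows with schooling, so quantile slopes differ.
school = rng.uniform(0, 16, 1000)
logw = 0.5 + 0.10 * school + (0.2 + 0.05 * school) * rng.normal(size=1000)

def quantile_fit(x, y, tau):
    """Linear quantile regression: minimize the pinball (check) loss at tau."""
    def pinball(beta):
        u = y - (beta[0] + beta[1] * x)
        return np.mean(np.maximum(tau * u, (tau - 1.0) * u))
    return minimize(pinball, x0=np.zeros(2), method="Nelder-Mead").x

b10 = quantile_fit(school, logw, 0.10)
b90 = quantile_fit(school, logw, 0.90)
# The 90th-percentile slope exceeds the 10th-percentile slope when
# within-group dispersion widens with education.
print(round(b10[1], 3), round(b90[1], 3))
```

Comparing slopes across quantiles, rather than a single mean regression, is what lets the paper separate changes in returns to education from changes in within-group inequality.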