945 results for Probability Distribution Function


Relevance: 80.00%

Publisher:

Abstract:

Distributed generation, unlike centralized electrical generation, aims to generate electrical energy on a small scale as near as possible to load centers, interchanging electric power with the network. This work presents a probabilistic methodology conceived to assist electric system planning engineers in the selection of distributed generation locations, taking into account the hourly load changes, or the daily load cycle. The hourly load centers, for each of the different hourly load scenarios, are calculated deterministically. These location points, properly weighted according to their load magnitude, are used to fit the best-fitting probability distribution. This distribution is used to determine the maximum-likelihood perimeter of the area where each distributed generation source should preferably be located by the planning engineers. This takes into account, for example, the availability and cost of land lots, factors of special relevance in urban areas, as well as several obstacles important for the final selection of candidate distributed generation points. The proposed methodology has been applied to a real case, assuming three different bivariate probability distributions: the Gaussian distribution, a bivariate version of Freund's exponential distribution and the Weibull probability distribution. The methodology's algorithm has been programmed in MATLAB. Results are presented and discussed for the application of the methodology to a realistic case, and demonstrate the ability of the proposed methodology to efficiently determine the best locations for distributed generation and the corresponding distribution networks.
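The weighted fit of a bivariate distribution to hourly load centers can be sketched as follows for the Gaussian case. The coordinates, weights and the 90% level are illustrative assumptions, not the paper's data, and the paper's actual implementation is in MATLAB:

```python
import numpy as np

# Hypothetical hourly load-center coordinates (km) and load weights (MW);
# the study's real data and MATLAB implementation are not reproduced here.
centers = np.array([[2.0, 3.0], [2.5, 3.4], [1.8, 2.9], [2.2, 3.1]])
weights = np.array([10.0, 14.0, 8.0, 12.0])

# Weighted maximum-likelihood estimates of a bivariate Gaussian:
w = weights / weights.sum()
mu = w @ centers                        # weighted mean location
diff = centers - mu
cov = (w[:, None] * diff).T @ diff      # weighted covariance matrix

# A candidate point lies inside the 90% likelihood perimeter when its
# squared Mahalanobis distance is below chi2_{2, 0.90} ~= 4.605.
d2 = np.einsum('ij,jk,ik->i', diff, np.linalg.inv(cov), diff)
inside_90 = d2 <= 4.605
print(mu, inside_90)
```

The perimeter test is what lets planners intersect the high-likelihood area with land availability and cost constraints.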

Relevance: 80.00%

Publisher:

Abstract:

A new effective isotropic potential is proposed for the dipolar hard-sphere fluid, on the basis of recent results by others for its angle-averaged radial distribution function. The new effective potential is shown to exhibit oscillations even for moderately high densities and moderately strong dipole moments, which are absent from earlier effective isotropic potentials. The validity and significance of this result are briefly discussed.

Relevance: 80.00%

Publisher:

Abstract:

Modern real-time systems, with a more flexible and adaptive nature, demand approaches for timeliness evaluation based on probabilistic measures of meeting deadlines. In this context, simulation can emerge as an adequate solution to understand and analyze the timing behaviour of actual systems. However, care must be taken with the obtained outputs, at the risk of producing results that lack credibility. It is particularly important to consider that we are more interested in values from the tail of a probability distribution (near worst-case probabilities) than in deriving confidence on mean values. We approach this subject by considering the random nature of simulation output data. We start by discussing well-known approaches for estimating distributions from simulation output, and the confidence which can be attributed to their mean values. This is the basis for a discussion on the applicability of such approaches to derive confidence on the tail of distributions, where the worst case is expected to be.
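The contrast between confidence on a mean and confidence on a tail probability can be illustrated with a minimal sketch. The response-time distribution, deadline and confidence level below are illustrative assumptions, not the systems analysed in the paper:

```python
import math
import random

random.seed(1)

# Hypothetical simulated response times (ms) and a deadline.
deadline = 9.0
samples = [random.gauss(5.0, 2.0) for _ in range(10_000)]

# Point estimate of the deadline-miss probability (a tail quantity):
misses = sum(1 for t in samples if t > deadline)
p_hat = misses / len(samples)

# Normal-approximation 95% confidence interval on the miss probability.
# For small tail probabilities this interval is very wide *relative* to
# the estimate itself, unlike intervals on central statistics.
z = 1.96
half = z * math.sqrt(p_hat * (1 - p_hat) / len(samples))
print(p_hat, p_hat - half, p_hat + half)
```

With the true miss probability near 2%, the relative half-width here is roughly 13% of the estimate, while the same sample size pins the mean response time to well under 1%.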

Relevance: 80.00%

Publisher:

Abstract:

Wind resource evaluation at two sites located in Portugal was performed using the mesoscale modelling system Weather Research and Forecasting (WRF) and the wind resource analysis tool commonly used within the wind power industry, the Wind Atlas Analysis and Application Program (WAsP) microscale model. Wind measurement campaigns were conducted at the selected sites, allowing for a comparison between in situ measurements and simulated wind, in terms of flow characteristics and energy yield estimates. Three different methodologies were tested, aiming to provide an overview of their benefits and limitations for wind resource estimation. In the first methodology the mesoscale model acts as a "virtual" wind measuring station: wind data was computed by WRF for both sites and inserted directly as input in WAsP. In the second approach, the same procedure was followed, but the terrain influences induced by the mesoscale model's low-resolution terrain data were removed from the simulated wind data. In the third methodology, the simulated wind data was extracted at the top of the planetary boundary layer for both sites, aiming to assess whether the use of geostrophic winds (which, by definition, are not influenced by the local terrain) can bring any improvement in the models' performance. The results obtained with these methodologies were compared with those resulting from in situ measurements, in terms of mean wind speed, Weibull probability density function parameters and production estimates, considering the installation of one wind turbine at each site. Results showed that the second approach produces values closest to the measured ones, and fairly acceptable deviations were found using this coupling technique in terms of estimated annual production. However, mesoscale output should not be used directly in wind farm siting projects, mainly due to the poor resolution of the mesoscale model terrain data. Instead, the use of mesoscale output in microscale models should be seen as a valid alternative to in situ data, mainly for preliminary wind resource assessments, although the application of mesoscale-microscale coupling in areas with complex topography should be done with extreme caution.
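Two of the Weibull-based quantities used in such comparisons can be computed in closed form from the shape and scale parameters. The parameter values and turbine operating range below are illustrative assumptions, not the fitted values from either site:

```python
import math

# Hypothetical Weibull parameters for one site (shape k, scale c in m/s);
# in the study these would be fitted to measured or simulated wind data.
k, c = 2.1, 7.5

# Mean wind speed implied by a Weibull pdf: v_mean = c * Gamma(1 + 1/k)
v_mean = c * math.gamma(1.0 + 1.0 / k)

# Fraction of time inside a turbine's operating range [cut-in, cut-out],
# from the Weibull CDF F(v) = 1 - exp(-(v/c)^k):
def weibull_cdf(v):
    return 1.0 - math.exp(-((v / c) ** k))

cut_in, cut_out = 3.0, 25.0
p_operating = weibull_cdf(cut_out) - weibull_cdf(cut_in)
print(round(v_mean, 2), round(p_operating, 3))
```

Comparing measured versus simulated (k, c) pairs through quantities like these is one way deviations in annual production estimates arise.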

Relevance: 80.00%

Publisher:

Abstract:

We consider a Bertrand duopoly model with unknown costs. Each firm's aim is to choose the price of its product according to the well-known concept of Bayesian Nash equilibrium. The choices are made simultaneously by both firms. In this paper, we suppose that each firm has two different technologies, and uses one of them according to a certain probability distribution. The use of either one or the other technology affects the unitary production cost. We show that this game has exactly one Bayesian Nash equilibrium. We analyse the advantages, for firms and for consumers, of using the technology with the highest production cost versus the one with the lowest production cost. We prove that the expected profit of each firm increases with the variance of its production costs. We also show that the expected price of each good increases with both expected production costs, with the effect of the rival's expected production costs dominated by the effect of the firm's own expected production costs.

Relevance: 80.00%

Publisher:

Abstract:

This paper studies the effect of ship speed and water depth on the propagation of ship-generated waves. The ship is represented by a moving pressure distribution function at the free surface that is able to reproduce most of the phenomena involved in wave propagation. Results are obtained for a ship sailing along a coastal stretch made of a sloping bottom and a constant-depth region. The results show that over the sloping bottom the wave crests are bent along the slope, while in the constant-depth region the standard Kelvin wave pattern is found in the subcritical regime. In the critical regime the wave system is characterized by significant diverging waves, and in the supercritical regime the transverse waves disappear. © 2015 Taylor & Francis Group, London.
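The sub/critical/supercritical regimes above are conventionally classified by the depth Froude number. A minimal sketch, where the split at Fr_h = 1 is standard but the width of the "critical band" is an illustrative assumption:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def depth_froude(speed, depth):
    """Depth Froude number Fr_h = U / sqrt(g * h)."""
    return speed / math.sqrt(G * depth)

def regime(speed, depth):
    # The transition at Fr_h = 1 is standard; the +-5% band marking the
    # critical regime is illustrative, not taken from the paper.
    fr = depth_froude(speed, depth)
    if fr < 0.95:
        return "subcritical"   # full Kelvin pattern (transverse + diverging)
    if fr <= 1.05:
        return "critical"      # dominated by diverging waves
    return "supercritical"     # transverse waves disappear

print(regime(5.0, 10.0), regime(10.0, 10.0), regime(15.0, 10.0))
```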

Relevance: 80.00%

Publisher:

Abstract:

Energy resource scheduling is becoming increasingly important as more distributed generators and electric vehicles are connected to the distribution network. This paper proposes a methodology to be used by Virtual Power Players (VPPs) for energy resource scheduling in smart grids, considering day-ahead, hour-ahead and real-time time horizons. The method considers that energy resources are managed by a VPP which establishes contracts with their owners. The full AC power flow calculation included in the model takes network constraints into account. In this paper, distribution function errors are used to simulate variations between time horizons, and to measure the performance of the proposed methodology. A 33-bus distribution network with a large number of distributed resources is used.
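The idea of forecast errors shrinking as the time horizon approaches real time can be sketched as follows. The Gaussian error shape, the error magnitudes and the single-resource setup are illustrative assumptions, not the error distributions estimated in the paper:

```python
import random

random.seed(42)

# Illustrative relative error levels per scheduling horizon.
horizons = {"day-ahead": 0.15, "hour-ahead": 0.05, "real-time": 0.01}
forecast = 100.0  # kW, hypothetical forecast for one distributed resource

def realised(value, sigma_frac):
    # Zero-mean relative Gaussian error, truncated at zero output.
    return max(0.0, value * (1.0 + random.gauss(0.0, sigma_frac)))

# Sample each horizon and record the spread of realised values:
spreads = {}
for name, sigma in horizons.items():
    samples = [realised(forecast, sigma) for _ in range(5000)]
    spreads[name] = max(samples) - min(samples)
print({k: round(v, 1) for k, v in spreads.items()})
```

The shrinking spread is what makes rescheduling closer to real time worthwhile for a VPP.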

Relevance: 80.00%

Publisher:

Abstract:

The use of appropriate acceptance criteria in the risk assessment process for occupational accidents is an important issue but often overlooked in the literature, particularly when new risk assessment methods are proposed and discussed. In most cases, there is no information on how or by whom they were defined, or even how companies can adapt them to their own circumstances. Bearing this in mind, this study analysed the problem of the definition of risk acceptance criteria for occupational settings, defining the quantitative acceptance criteria for the specific case study of the Portuguese furniture industrial sector. The key steps to be considered in formulating acceptance criteria were analysed in the literature review. By applying the identified steps, the acceptance criteria for the furniture industrial sector were then defined. The Cumulative Distribution Function (CDF) for the injury statistics of the industrial sector was identified as the maximum tolerable risk level. The acceptable threshold was defined by adjusting the CDF to the Occupational Safety & Health (OSH) practitioners' risk acceptance judgement. Adjustments of acceptance criteria to the companies' safety cultures were exemplified by adjusting the Burr distribution parameters. An example of a risk matrix was also used to demonstrate the integration of the defined acceptance criteria into a risk metric. This work has provided substantial contributions to the issue of acceptance criteria for occupational accidents, which may be useful in overcoming the practical difficulties faced by authorities, companies and experts.
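How adjusting Burr distribution parameters shifts an acceptance threshold can be sketched with the Burr Type XII CDF. The parameter values below are illustrative, not the sector statistics estimated in the study:

```python
# Burr (Type XII) CDF, F(x) = 1 - (1 + (x/scale)^c)^(-k), a flexible
# family often used for skewed severity data.
def burr12_cdf(x, c, k, scale=1.0):
    return 1.0 - (1.0 + (x / scale) ** c) ** (-k)

# CDF value at a given severity under a baseline parameterisation, and the
# effect of tightening the criteria by raising a shape parameter. All
# numbers are illustrative.
baseline = burr12_cdf(2.0, c=2.0, k=1.5, scale=1.0)
stricter = burr12_cdf(2.0, c=2.0, k=3.0, scale=1.0)
print(round(baseline, 3), round(stricter, 3))
```

Raising k thins the upper tail, so a larger share of the probability mass falls below any fixed severity, which is how a company's safety culture could translate into stricter or looser criteria.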

Relevance: 80.00%

Publisher:

Abstract:

In this paper, we consider a Stackelberg duopoly competition with differentiated goods and with unknown costs. The firms' aim is to choose the output levels of their products according to the well-known concept of perfect Bayesian equilibrium. One firm (F1) first chooses the quantity q1 of its good; the other firm (F2) observes q1 and then chooses the quantity q2 of its good. We suppose that each firm has two different technologies, and uses one of them following a probability distribution. The use of either one or the other technology affects the unitary production cost. We show that there is exactly one perfect Bayesian equilibrium for this game. We analyse the advantages, for firms and for consumers, of using the technology with the highest production cost versus the one with the cheapest cost.

Relevance: 80.00%

Publisher:

Abstract:

The conclusions of the Bertrand model of competition are substantially altered by the presence of either differentiated goods or asymmetric information about the rival's production costs. In this paper, we consider a Bertrand competition with differentiated goods. Furthermore, we suppose that each firm has two different technologies, and uses one of them according to a certain probability distribution. The use of either one or the other technology affects the unitary production cost. We show that this game has exactly one Bayesian Nash equilibrium. We do ex-ante and ex-post analyses of firms' profits and market prices. We prove that the expected profit of each firm increases with the variance of its production costs. We also show that the expected price of each good increases with both expected production costs, with the effect of the rival's expected production costs dominated by the effect of the firm's own expected production costs.
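The claim that expected profit increases with cost variance can be checked numerically in a textbook differentiated-goods Bertrand game. The linear demand form q_i = alpha - beta*p_i + gamma*p_j, the parameter values and the two-point cost distributions are illustrative assumptions, not the paper's specification:

```python
# Demand q_i = alpha - beta*p_i + gamma*p_j (illustrative parameters).
alpha, beta, gamma = 10.0, 1.0, 0.5

def expected_profit(costs, probs, e_rival_price):
    # Each cost type best-replies to the rival's expected price:
    #   p_i(c) = (alpha + gamma*E[p_j] + beta*c) / (2*beta)
    total = 0.0
    for c, pr in zip(costs, probs):
        p = (alpha + gamma * e_rival_price + beta * c) / (2.0 * beta)
        total += pr * (p - c) * (alpha - beta * p + gamma * e_rival_price)
    return total

# Symmetric firms with mean cost 2.0; the expected equilibrium price
# solves E[p] = (alpha + gamma*E[p] + beta*2) / (2*beta):
e_price = (alpha + beta * 2.0) / (2.0 * beta - gamma)

low_var = expected_profit([1.9, 2.1], [0.5, 0.5], e_price)   # Var = 0.01
high_var = expected_profit([1.0, 3.0], [0.5, 0.5], e_price)  # Var = 1.00
print(round(low_var, 3), round(high_var, 3), high_var > low_var)
```

Both cost distributions have the same mean, yet the higher-variance one yields the larger expected profit, because equilibrium profit is convex in the realised cost.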

Relevance: 80.00%

Publisher:

Abstract:

In this paper, we consider a Stackelberg duopoly competition with differentiated goods, linear and symmetric demand, and unknown costs. In our model, the two firms play a non-cooperative game with two stages: in the first stage, firm F1 chooses the quantity q1 that it is going to produce; in the second stage, firm F2 observes the quantity q1 produced by firm F1 and chooses its own quantity q2. Firms choose their output levels in order to maximise their profits. We suppose that each firm has two different technologies, and uses one of them following a certain probability distribution. The use of either one or the other technology affects the unitary production cost. We show that there is exactly one perfect Bayesian equilibrium for this game. We analyse the variations of the expected profits with the parameters of the model, namely with the parameters of the probability distributions, and with the parameters of demand and differentiation.
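The two-stage structure solves by backward induction. A simplified sketch with inverse demand p_i = a - q_i - b*q_j, where the two-technology uncertainty enters only through each firm's expected unit cost; the full perfect Bayesian analysis, which conditions F2's choice on the observed q1 and the cost realisations, is richer than this. All parameter values are illustrative:

```python
# Inverse demand p_i = a - q_i - b*q_j (illustrative parameters).
a, b = 10.0, 0.5

# Each firm uses one of two technologies (unit costs) with probability p:
def expected_cost(c_low, c_high, p_low):
    return p_low * c_low + (1.0 - p_low) * c_high

c1 = expected_cost(1.0, 3.0, 0.5)  # leader's expected unit cost -> 2.0
c2 = expected_cost(2.0, 4.0, 0.5)  # follower's expected unit cost -> 3.0

# Stage 2: follower's best response maximises (a - q2 - b*q1 - c2)*q2,
# giving q2(q1) = (a - c2 - b*q1) / 2.
# Stage 1: the leader substitutes q2(q1) into its own profit and maximises:
q1 = (a - c1 - b * (a - c2) / 2.0) / (2.0 * (1.0 - b * b / 2.0))
q2 = (a - c2 - b * q1) / 2.0
print(round(q1, 3), round(q2, 3))
```

With these numbers the lower-cost leader produces more than the follower, the usual first-mover pattern in quantity competition.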

Relevance: 80.00%

Publisher:

Abstract:

We consider two firms, located in different countries, selling the same homogeneous good in both countries. In each country there is a non-negative tariff on imports of the good produced in the other country. We suppose that each firm has two different technologies, and uses one of them according to a certain probability distribution. The use of either one or the other technology affects the unitary production cost. We analyse the effect of the production cost uncertainty on the profits of the firms and also on the welfare of the governments.

Relevance: 80.00%

Publisher:

Abstract:

We consider a dynamic price-setting duopoly model in which a dominant (leader) firm moves first and a subordinate (follower) firm moves second. We suppose that each firm has two different technologies, and uses one of them according to a certain probability distribution. The use of either one or the other technology affects the unitary production cost. We analyse the effect of the production cost uncertainty on the profits of the firms, for different values of the demand intercept parameters.

Relevance: 80.00%

Publisher:

Abstract:

Dissertation presented to obtain the degree of Doctor in Environmental Engineering, Universidade Nova de Lisboa, Faculdade de Ciências e Tecnologia
