408 results for Probability distribution


Relevance:

20.00%

Publisher:

Abstract:

Re-supplying loads during an outage through cross-connection from adjacent feeders in a distribution system may cause voltage drops and hence require load shedding. However, the surplus PV generation in some of the LV feeders can prevent load shedding and improve reliability. To measure these effects, this paper proposes the application of the Direct Load Flow method [1] to the reliability evaluation of distribution systems with PV units. As part of this study, seasonal impacts on load consumption, together with surplus PV output power injection into higher-voltage networks, are also considered. New indices are proposed to measure the yearly expected energy export, from LV to MV and from MV to the higher-voltage network.
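
A minimal sketch of how such a yearly expected energy-export figure could be accumulated from hourly surplus PV, assuming hourly PV and load profiles per feeder; the function name, the profiles and the simple hourly sum below are illustrative assumptions, not the paper's indices or its Direct Load Flow reliability calculation.

import math

# Hedged sketch: accumulate a yearly expected energy export (e.g. LV -> MV)
# from hourly surplus PV power. Illustrative only; the paper's indices and the
# Direct Load Flow reliability evaluation are not reproduced here.
def yearly_expected_energy_export(pv_kw, load_kw):
    """Sum of hourly surplus PV power (kW) over a year, returned in kWh."""
    return sum(max(p - l, 0.0) for p, l in zip(pv_kw, load_kw))

# Example with a made-up flat 3 kW load and a 5 kW-peak daytime PV profile.
hours = range(8760)
pv = [5.0 * max(math.sin(math.pi * ((h % 24) - 6) / 12), 0.0) for h in hours]
load = [3.0] * 8760
print(f"Yearly expected export: {yearly_expected_energy_export(pv, load):.0f} kWh")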

Relevance:

20.00%

Publisher:

Abstract:

This chapter presents a stability analysis, based on bifurcation theory, of the distribution static compensator (DSTATCOM) operating in both current control mode and voltage control mode. Bifurcation analysis makes it possible to delimit the operating zones of nonlinear power systems, and hence the computation of these boundaries is of interest for practical design and planning purposes. Suitable mathematical representations of the DSTATCOM are proposed to carry out the bifurcation analyses efficiently. The stability regions in the Thevenin equivalent plane are computed for different power factors at the Point of Common Coupling (PCC). In addition, the stability regions in the control gain space are computed, and the impact of the DC and AC capacitors on stability is analyzed in detail. It is shown through bifurcation analysis that the loss of stability in the DSTATCOM is generally due to the emergence of oscillatory dynamics. The observations are verified through detailed simulation studies.
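
As a hedged aside (a textbook condition, not the chapter's specific DSTATCOM model): the "emergence of oscillatory dynamics" noted above is the signature of a Hopf bifurcation. For a smooth system $\dot{x} = f(x,\mu)$ with equilibrium branch $x^{*}(\mu)$, a Hopf bifurcation occurs at $\mu = \mu_{0}$ when the Jacobian $D_{x}f(x^{*}(\mu_{0}),\mu_{0})$ has a simple pair of purely imaginary eigenvalues $\lambda_{1,2} = \pm j\omega_{0}$ with $\omega_{0} > 0$, no other eigenvalues on the imaginary axis, and the pair crosses the axis transversally, i.e. $\frac{d}{d\mu}\,\mathrm{Re}\,\lambda(\mu)\big|_{\mu=\mu_{0}} \neq 0$; beyond this point a small-amplitude oscillation (limit cycle) emerges.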

Relevance:

20.00%

Publisher:

Abstract:

This work examined the suitability of the PAGAT gel dosimeter for use in dose distribution measurements around high-density implants. An assessment of the gel's reactivity with various metals was performed and no corrosive effects were observed. An artefact reduction technique was also investigated in order to minimise scattering of the laser light in the optical CT scans. The potential for attenuation and backscatter measurements using this gel dosimeter was examined for a temporary tissue expander's internal magnetic port.

Relevance:

20.00%

Publisher:

Abstract:

The estimation of the critical gap has been an issue since the 1970s, when gap acceptance was introduced to evaluate the capacity of unsignalized intersections. The critical gap is the shortest gap that a driver is assumed to accept. A driver's critical gap cannot be measured directly, and a number of techniques have been developed to estimate the mean critical gap of a sample of drivers. This paper reviews the ability of the Maximum Likelihood technique and the Probability Equilibrium Method (PEM) to predict the mean and standard deviation of the critical gap, using a simulation of 100 drivers repeated 100 times for each flow condition. The Maximum Likelihood method gave consistent and unbiased estimates of the mean critical gap, whereas the Probability Equilibrium Method had a significant bias that depended on the flow in the priority stream. Both methods were reasonably consistent, although the Maximum Likelihood method was slightly better. If drivers are inconsistent, then again the Maximum Likelihood method is superior. A criticism levelled at the Maximum Likelihood method is that a distribution of the critical gap has to be assumed; it was shown that this does not significantly affect its ability to predict the mean and standard deviation of the critical gaps. Finally, the Maximum Likelihood method can produce reasonable estimates from observations of as few as 25 to 30 drivers. A spreadsheet procedure for using the Maximum Likelihood method is provided in this paper. The PEM can be improved if the maximum rejected gap is used.
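
As a hedged illustration of the Maximum Likelihood idea only (not the paper's spreadsheet procedure), the sketch below fits a log-normal critical-gap distribution to each driver's largest rejected gap and accepted gap; the driver data, the coarse grid search and all names are made up for illustration.

import math

# Hedged sketch of the Maximum Likelihood critical-gap estimator, assuming a
# log-normal critical-gap distribution. For driver i the critical gap is taken
# to lie between the largest rejected gap r_i and the accepted gap a_i, so the
# likelihood is prod_i [F(ln a_i) - F(ln r_i)], with F a Normal(mu, s) CDF.
def _norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def neg_log_likelihood(mu, s, rejected, accepted):
    nll = 0.0
    for r, a in zip(rejected, accepted):
        lo = _norm_cdf((math.log(max(r, 1e-6)) - mu) / s)
        hi = _norm_cdf((math.log(a) - mu) / s)
        nll -= math.log(max(hi - lo, 1e-12))
    return nll

def fit_critical_gap(rejected, accepted, grid=60):
    # Coarse grid search kept simple for illustration; a real implementation
    # would use a proper optimiser (e.g. Newton iteration).
    best = None
    for i in range(grid):
        mu = 0.8 + 1.0 * i / grid          # log-scale mean, roughly 2.2 s to 6 s
        for j in range(grid):
            s = 0.05 + 0.5 * j / grid      # log-scale standard deviation
            nll = neg_log_likelihood(mu, s, rejected, accepted)
            if best is None or nll < best[0]:
                best = (nll, mu, s)
    _, mu, s = best
    mean = math.exp(mu + s ** 2 / 2.0)               # mean critical gap (s)
    std = mean * math.sqrt(math.exp(s ** 2) - 1.0)   # its standard deviation (s)
    return mean, std

# Made-up observations for 25 drivers (seconds): largest rejected and accepted gaps.
rejected = [2.1, 3.0, 2.8, 1.9, 3.4, 2.5, 3.1, 2.2, 2.9, 3.3, 2.0, 2.7, 3.5,
            2.4, 3.2, 2.6, 2.3, 3.0, 2.8, 3.6, 2.1, 2.9, 3.1, 2.5, 3.4]
accepted = [r + 2.0 for r in rejected]
print(fit_critical_gap(rejected, accepted))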

Relevance:

20.00%

Publisher:

Abstract:

In this thesis, various schemes using custom power devices for power quality improvement in low-voltage distribution networks are studied. Customer-operated distributed generators make a typical network non-radial and affect power quality. A scheme considering different DSTATCOM control algorithms is proposed for power circulation and islanded operation of the system. To compensate for reactive power overflow and to facilitate unity power factor, a UPQC is introduced. Stochastic analysis is carried out for different scenarios to obtain a comprehensive picture of a real-life distribution network. The combined operation of a static compensator and a voltage regulator is tested for optimum power quality and system stability.

Relevance:

20.00%

Publisher:

Abstract:

Long-term measurements of particle number size distribution (PNSD) produce a very large number of observations, and their analysis requires an efficient approach in order to produce results in the least possible time and with maximum accuracy. Clustering techniques are a family of sophisticated methods which have recently been employed to analyse PNSD data; however, very little information is available comparing the performance of different clustering techniques on PNSD data. This study aims to apply several clustering techniques (i.e. K-means, PAM, CLARA and SOM) to PNSD data, in order to identify and apply the optimum technique to PNSD data measured at 25 sites across Brisbane, Australia. A new method, based on the Generalised Additive Model (GAM) with a basis of penalised B-splines, was proposed to parameterise the PNSD data, and the temporal weight of each cluster was also estimated using the GAM. In addition, each cluster was associated with its possible source based on the results of this parameterisation, together with the characteristics of each cluster. The performance of the four clustering techniques was compared using the Dunn index and Silhouette width validation values; the K-means technique was found to perform best, with five clusters being the optimum, so five clusters were identified within the data using K-means. The diurnal occurrence of each cluster was used together with other air quality parameters, temporal trends and the physical properties of each cluster in order to attribute each cluster to its source and origin. The five clusters were attributed to three major sources and origins: regional background particles, photochemically induced nucleated particles and vehicle-generated particles. Overall, clustering was found to be an effective technique for attributing each particle size spectrum to its source, and the GAM was suitable for parameterising the PNSD data. Together, these two techniques can greatly assist researchers in analysing PNSD data for characterisation and source apportionment purposes.
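
A hedged, minimal sketch of one step of such an analysis: K-means with the Silhouette width used to choose the number of clusters. It assumes scikit-learn and synthetic data in place of the parameterised Brisbane spectra, and does not reproduce the GAM / penalised B-spline parameterisation or the PAM, CLARA and SOM comparisons.

import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

# Hedged sketch: pick the number of K-means clusters for PNSD-like spectra
# using the Silhouette width, one of the validation measures named above.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(loc=m, scale=0.3, size=(200, 50))   # 50 "size bins"
               for m in (0.0, 1.0, 2.0, 3.0, 4.0)])           # 5 latent groups

scores = {}
for k in range(2, 9):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    scores[k] = silhouette_score(X, labels)

best_k = max(scores, key=scores.get)
print(f"Best k by Silhouette width: {best_k}")   # expected to be 5 for this toy data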

Relevance:

20.00%

Publisher:

Abstract:

An expanding education market, targeted through ‘bridging material’ enabling cineliteracies, has the potential to offer Australian producers increased distribution opportunities, educators targeted teaching aids, and students enhanced learning outcomes. For Australian documentary producers, the key to unlocking the potential of the education sector is engaging with its curriculum-based requirements at the earliest stages of pre-production. Two key mechanisms can lead to effective educational engagement: the established area of study guides produced in association with the Australian Teachers of Media (ATOM), and the emerging area of philanthropic funding coordinated by the Documentary Australia Foundation (DAF). DAF has acted as a key financial and cultural philanthropic bridge between individuals, foundations, corporations and the Australian documentary sector for over 14 years. DAF does not make or commission films but, through the management and receipt of grants and donations, provides ‘expertise, information, guidance and resources to help each sector work together to achieve their goals’. The DAF application process also requires film-makers to detail their ‘Education and Outreach Strategy’ for each film, with 582 films registered and 39 completed as of June 2014. These education strategies, which range from detailed to cursory efforts, offer valuable insights into the Australian documentary sector's historical and current expectations of education as a receptive and dynamic audience for quality factual content. A recurring film-maker education strategy found in the DAF data is an engagement with ATOM to create a study guide for the film. This study guide then acts as ‘bridging material’ between content and the education audience. The frequency of this effort suggests that these study guides enable greater educator engagement with content and increased interest in, and distribution of, the film to educators. The paper ‘Education paths for documentary distribution: DAF, ATOM and the study guides that bind them’ will address issues arising out of the changing needs of the education sector and the impact that targeting ‘cineliteracy’ outcomes may have on Australian documentary distribution.

Relevance:

20.00%

Publisher:

Abstract:

This study examines hospital care system performance in Iran. We first briefly review the hospital care delivery system in Iran, and then investigate it from financial, utilization, and quality perspectives. In particular, we examined the extent to which the health care system in Iran protects people from the financial consequences of health care expenses, and whether inpatient care is distributed according to need. We also empirically analyzed the quality of hospital care in Iran using patient satisfaction information collected in a national health service survey. The Iranian health care system is characterized by unequal access to hospital care; a mismatch between the distribution of services and inpatients' needs; and a high probability of financial catastrophe due to out-of-pocket payments for inpatient services. Our analysis indicates that the quality of hospital care among Iranian provinces favors patients residing in provinces with high numbers of hospital beds per capita, such as Esfahan and Yazd. Patients living in provinces with low levels of accessibility to hospital care (e.g. Gilan, Kermanshah, Hamadan, Chahar Mahall and Bakhtiari, Khuzestan, and Sistan and Baluchestan) receive lower-quality services. These findings suggest that policymakers in Iran should work on several fronts, including utilization, financing, and service quality, to improve hospital care.

Relevance:

20.00%

Publisher:

Abstract:

Overvoltage and overloading due to high utilization of PVs are the main power quality concerns for future distribution power systems. This paper proposes a distributed control coordination strategy to manage multiple PVs within a network and overcome these issues. The PVs' reactive power is used to deal with overvoltages, and their active power curtailment is regulated to avoid overloading. The proposed control structure is used to share the required contribution fairly among the PVs, in proportion to their ratings. This approach is examined on a practical distribution network with multiple PVs.
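
A minimal sketch of the proportional-sharing rule described above, assuming only that each inverter's contribution is scaled by its rating; the function name and values are illustrative, and the paper's voltage/overload detection, communication and network model are not reproduced.

# Hedged sketch: split a required compensation among PV inverters in proportion
# to their ratings, the fairness rule described above. Values are made up.
def share_by_rating(total_required_kvar, ratings_kva):
    """Return each inverter's share of a total reactive-power request."""
    total_rating = sum(ratings_kva)
    return [total_required_kvar * r / total_rating for r in ratings_kva]

# Example: 90 kvar of voltage support shared among three inverters.
print(share_by_rating(90.0, [5.0, 10.0, 15.0]))   # -> [15.0, 30.0, 45.0]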

Relevance:

20.00%

Publisher:

Abstract:

Aim: To quantify the consequences of major threats to biodiversity, such as climate and land-use change, it is important to use explicit measures of species persistence, such as extinction risk. The extinction risk of metapopulations can be approximated through simple models, providing a regional snapshot of the extinction probability of a species. We evaluated the extinction risk of three species under different climate change scenarios in three different regions of the Mexican cloud forest, a highly fragmented habitat that is particularly vulnerable to climate change. Location: Cloud forests in Mexico. Methods: Using Maxent, we estimated the potential distribution of cloud forest for three different time horizons (2030, 2050 and 2080) and their overlap with protected areas. Then, we calculated the extinction risk of three contrasting vertebrate species for two scenarios: (1) climate change only (all suitable areas of cloud forest through time) and (2) climate and land-use change (only suitable areas within a currently protected area), using an explicit patch-occupancy approximation model and calculating the joint probability of all populations becoming extinct when the number of remaining patches was less than five. Results: Our results show that the extent of environmentally suitable areas for cloud forest in Mexico will sharply decline in the next 70 years. We discovered that if all habitat outside protected areas is transformed, then only species with small area requirements are likely to persist. With habitat loss through climate change only, high dispersal rates are sufficient for persistence, but this requires protection of all remaining cloud forest areas. Main conclusions: Even if high dispersal rates mitigate the extinction risk of species due to climate change, the synergistic impacts of changing climate and land use further threaten the persistence of species with higher area requirements. Our approach for assessing the impacts of threats on biodiversity is particularly useful when there is little time or data for detailed population viability analyses. © 2013 John Wiley & Sons Ltd.
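
A hedged sketch of the "joint probability of all populations becoming extinct", under a simple independence assumption between patches; the per-patch probabilities below are invented for illustration, and the paper's patch-occupancy approximation model is not reproduced.

# Hedged sketch: joint probability that every remaining population goes extinct,
# assuming independence between patches. Illustrative only.
def joint_extinction_probability(patch_extinction_probs):
    p = 1.0
    for pe in patch_extinction_probs:
        p *= pe
    return p

# Example: four remaining patches (fewer than five), with made-up probabilities.
print(round(joint_extinction_probability([0.8, 0.9, 0.7, 0.85]), 3))   # 0.428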

Relevance:

20.00%

Publisher:

Abstract:

So far, low-probability differentials for the key schedule of block ciphers have been used as a straightforward proof of security against related-key differential analysis. To achieve resistance, it is believed that for a cipher with a k-bit key it suffices for the upper bound on the probability to be 2^-k. Surprisingly, we show that this reasonable assumption is incorrect, and that the probability should be (much) lower than 2^-k. Our counterexample is a related-key differential analysis of the well-established block cipher CLEFIA-128. We show that although the key schedule of CLEFIA-128 prevents differentials with a probability higher than 2^-128, the linear part of the key schedule that produces the round keys, and the Feistel structure of the cipher, make it possible to exploit particularly chosen differentials with a probability as low as 2^-128. CLEFIA-128 has 2^14 such differentials, which translate to 2^14 pairs of weak keys. The probability of each differential is too low, but the weak keys have a special structure which, with a divide-and-conquer approach, allows an advantage of 2^7 over generic analysis to be gained. We exploit this advantage to give a membership test for the weak-key class and provide an analysis of the hashing modes. The proposed analysis has been tested with computer experiments on small-scale variants of CLEFIA-128. Our results do not threaten the practical use of CLEFIA.