980 results for reliability theory


Relevance:

20.00%

Publisher:

Abstract:

In this paper, a combined subtransmission and distribution reliability analysis of SEQEB's outer suburban network is presented. The reliability analysis was carried out with a commercial software package that evaluates both energy and customer indices. Various reinforcement options were investigated to ascertain their impact on the reliability of supply seen by the customers. The customer and energy indices produced by the combined subtransmission and distribution reliability studies helped direct capital expenditure to the most effective areas of the network.
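
The abstract does not name the indices the package evaluates; in distribution reliability studies the customer indices are typically SAIFI and SAIDI and the energy index EENS. A minimal sketch of how such indices are computed from outage records might look as follows (the records, customer count, and index choice are illustrative assumptions, not SEQEB data):

```python
# Hypothetical outage records: (customers_interrupted, outage_hours, unserved_MWh)
outages = [(1200, 1.5, 3.2), (400, 4.0, 2.1), (2500, 0.5, 1.8)]
total_customers = 10_000  # assumed number of customers served

# SAIFI: average number of interruptions per customer served
saifi = sum(c for c, _, _ in outages) / total_customers
# SAIDI: average interruption duration (hours) per customer served
saidi = sum(c * h for c, h, _ in outages) / total_customers
# EENS: expected energy not supplied over the period (MWh)
eens = sum(e for _, _, e in outages)

print(f"SAIFI = {saifi:.2f} int/cust, SAIDI = {saidi:.2f} h/cust, EENS = {eens:.1f} MWh")
```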

Relevance:

20.00%

Publisher:

Abstract:

Capacity probability models of generating units are commonly used in many power system reliability studies at hierarchical level one (HLI). Analytical modelling of a generating system with many units, or of generating units with many derated states, can result in an extensive number of states in the capacity model. Limitations on the available memory and computational time of present computer facilities can pose difficulties for the assessment of such systems in many studies. A clustering procedure using the nearest centroid sorting method was previously applied to the IEEE-RTS load model and proved very effective in producing a highly similar model with substantially fewer states. This paper presents an extended application of the clustering method to include the capacity probability representation. A series of sensitivity studies is illustrated using the IEEE-RTS generating system and load models. The loss of load and energy expectations (LOLE, LOEE) are used as indicators to evaluate the application.
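
Neither the clustering step nor the index calculation is spelled out above; a minimal sketch of nearest centroid sorting applied to a capacity outage table, followed by a LOLE estimate against a small load model, could look like this (the state probabilities, cluster count, and load values are illustrative, not the IEEE-RTS figures):

```python
# Capacity outage table (available capacity MW, probability) - illustrative only
states = [(0, 1e-6), (50, 1e-4), (100, 0.002), (150, 0.01),
          (200, 0.08), (250, 0.2079), (300, 0.7)]
K = 3
centroids = [0.0, 150.0, 300.0]  # initial guesses spanning the range

# Nearest centroid sorting: assign each state to its closest centroid,
# then move each centroid to the probability-weighted mean of its cluster.
for _ in range(20):
    clusters = [[] for _ in range(K)]
    for cap, p in states:
        nearest = min(range(K), key=lambda k: abs(cap - centroids[k]))
        clusters[nearest].append((cap, p))
    centroids = [sum(c * p for c, p in cl) / sum(p for _, p in cl) if cl
                 else centroids[k] for k, cl in enumerate(clusters)]

# Collapse each cluster into a single state of the reduced capacity model
reduced = [(centroids[k], sum(p for _, p in cl))
           for k, cl in enumerate(clusters) if cl]

# LOLE: expected number of hours in which load exceeds available capacity
load = [180, 220, 260, 240, 200]  # toy hourly load model (MW)
lole = sum(p for hour in load for cap, p in reduced if cap < hour)
print(f"reduced model: {reduced}")
print(f"LOLE = {lole:.4f} h over {len(load)} h")
```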

Relevance:

20.00%

Publisher:

Abstract:

This paper presents an efficient method using the system state sampling technique in Monte Carlo simulation for reliability evaluation of multi-area power systems at Hierarchical Level One (HLI). System state sampling is one of the common methods used in Monte Carlo simulation, but its CPU time and memory requirements can be a problem. The combination of analytical and Monte Carlo methods, known as the hybrid method and presented in this paper, can enhance the efficiency of the solution. The load model in this study can be incorporated either by sampling or by enumeration. Both cases are examined in this paper by applying the methods to the Roy Billinton Test System (RBTS).
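
As a rough illustration of the system state sampling technique the paper builds on (shown for a single area, since the multi-area and hybrid details are not given in the abstract), each unit's state can be drawn against its forced outage rate and the loss-of-load probability estimated from the sampled states; the unit data and load below are assumptions:

```python
import random

random.seed(42)  # reproducibility of the illustration

# Illustrative generating units: (capacity MW, forced outage rate)
units = [(40, 0.03), (40, 0.03), (60, 0.02), (100, 0.04)]
load = 180.0      # assumed constant system load (MW)
N = 100_000       # number of sampled system states

# System state sampling: a unit is down if a uniform draw falls below its
# forced outage rate; the system state is the set of all unit states.
loss_of_load = 0
for _ in range(N):
    available = sum(cap for cap, q in units if random.random() >= q)
    if available < load:
        loss_of_load += 1

print(f"Estimated LOLP = {loss_of_load / N:.5f}")
```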

Relevance:

20.00%

Publisher:

Abstract:

The reliable operation of the electrical system at Callide Power Station is of extreme importance to the normal everyday running of the station. This study applied reliability principles to analyse the electrical system at Callide Power Station. It was found that the level of expected outage cost increased exponentially as the level of maintenance declined, leading to the conclusion that even in a harsh electricity market, in which CS Energy pushes its plants to the limit, maintenance must not be neglected. A number of system configurations were found to increase the reliability of the system and reduce the expected outage costs. A number of other advantages were also identified as a result of applying reliability principles to the Callide electrical system configuration.

Relevance:

20.00%

Publisher:

Abstract:

This paper presents a reliability assessment of a substation that forms part of the Queensland transmission network in Australia. As part of maintenance considerations, the study uses the substation reliability assessment package STAREL to quantitatively compare the reliability improvement achieved by two circuit breaker reinforcement alternatives for Swanbank: replacement or refurbishment. Substation reliability is interpreted on the basis of outage frequency and outage duration indices for each individual transmission line terminated in the Swanbank 'B' substation. By considering the reliability indices in this paper together with the associated cost study conducted by POWERLINK Queensland, a Swanbank 'B' reinforcement alternative can be selected that optimises both transmission line security and the costs incurred in achieving it.
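
The abstract reports outage frequency and duration indices per terminated line but does not give the combination rules; a minimal sketch of the standard first-order frequency-and-duration formulas for series and parallel pairs of components (STAREL's actual algorithm is not shown here, and all rates and repair times are illustrative) might be:

```python
# Frequency-and-duration rules for two components (rates in f/yr, repair in h)
HOURS_PER_YEAR = 8760.0

def series(l1, r1, l2, r2):
    """Either component out -> supply out."""
    lam = l1 + l2                    # combined outage frequency (f/yr)
    r = (l1 * r1 + l2 * r2) / lam    # weighted average outage duration (h)
    return lam, r

def parallel(l1, r1, l2, r2):
    """Both components out simultaneously -> supply out (first-order approx.)."""
    lam = l1 * l2 * (r1 + r2) / HOURS_PER_YEAR  # overlap frequency (f/yr)
    r = r1 * r2 / (r1 + r2)                     # overlap duration (h)
    return lam, r

# Illustrative figures only
print(series(0.05, 10.0, 0.4, 8.0))   # e.g. a breaker in series with a line
print(parallel(0.4, 8.0, 0.4, 8.0))   # e.g. duplicated supply paths
```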

Relevance:

20.00%

Publisher:

Abstract:

Reliability is an integral component of modern power system design, planning and management. This paper applies the Markov approach to substation reliability evaluation, using dedicated reliability software. The technique was applied to yield reliability indices for an existing and important substation in the POWERLINK Queensland 275 kV transmission network. Reliability indices were also determined for several reinforcement alternatives for this substation, with the aim of improving substation reliability. The economic feasibility of achieving higher levels of reliability was also taken into account.
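
The abstract does not show the Markov formulation itself; for a single repairable component the approach reduces to solving the steady-state equations of a two-state continuous-time Markov chain, as in this minimal sketch (the failure and repair rates are illustrative, not values from the POWERLINK study):

```python
import numpy as np

# Two-state Markov model of a substation component: state 0 = up, state 1 = down
lam = 0.5   # failure rate (transitions/yr) - illustrative
mu = 50.0   # repair rate (transitions/yr)  - illustrative

# Transition rate matrix Q; each row sums to zero
Q = np.array([[-lam, lam],
              [mu, -mu]])

# Steady-state probabilities: solve pi Q = 0 together with sum(pi) = 1
A = np.vstack([Q.T, np.ones(2)])
b = np.array([0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

print(f"Availability = {pi[0]:.5f} (analytic mu/(lam+mu) = {mu / (lam + mu):.5f})")
```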

Relevance:

20.00%

Publisher:

Abstract:

My quantitative study asks how Chinese Australians' "Chineseness" and their various resources influence their Chinese language proficiency, using an online survey and snowball sampling. Operationalisation is a challenging process which ensures that the survey design talks back to the informing theory and forward to the analysis model. It requires attention to two core methodological concerns, namely validity and reliability. Construction of a high-quality questionnaire is critical to achieving valid and reliable operationalisation. A series of strategies was chosen to ensure the quality of the questions, and thus of the eventual data. These strategies enable the use of structural equation modelling to examine how well the data fit the theoretical framework, which was constructed in light of Bourdieu's theory of habitus, capital and field.
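
The abstract names reliability as a core concern without stating which statistic is checked; one common scale-reliability check in survey research is Cronbach's alpha, sketched here on made-up Likert responses (the choice of statistic and all data are assumptions, not taken from the study):

```python
# Cronbach's alpha for a multi-item survey scale (responses are illustrative)
def cronbach_alpha(items):
    """items: one list of responses per scale item, all of equal length."""
    k = len(items)
    n = len(items[0])

    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_vars = sum(var(it) for it in items)
    totals = [sum(it[j] for it in items) for j in range(n)]  # per-respondent total
    return k / (k - 1) * (1 - item_vars / var(totals))

# Four Likert items answered by six respondents (made-up data)
items = [[4, 5, 3, 4, 2, 5],
         [4, 4, 3, 5, 2, 4],
         [5, 5, 2, 4, 1, 5],
         [3, 4, 3, 4, 2, 4]]
print(f"alpha = {cronbach_alpha(items):.3f}")
```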

Relevance:

20.00%

Publisher:

Abstract:

What are the information practices of teen content creators? In the United States, over two-thirds of teens have participated in creating and sharing content in online communities that are developed for the purpose of allowing users to be producers of content. This study investigates how teens participating in digital participatory communities find and use information, as well as how they experience it. From this investigation emerged a model of their information practices while creating and sharing content such as film-making, visual artwork, storytelling, music, programming, and website design in digital participatory communities. The research uses grounded theory methodology in a social constructionist framework to investigate the research problem: what are the information practices of teen content creators? Data were gathered through semi-structured interviews and observation of teens' digital communities. Analysis occurred concurrently with data collection, and the principle of constant comparison was applied in analysis. As findings were constructed from the data, additional data were collected until a substantive theory was constructed and no new information emerged from data collection. The theory constructed from the data describes five information practices of teen content creators: learning community, negotiating aesthetic, negotiating control, negotiating capacity, and representing knowledge. Describing the five information practices requires three descriptive components: the community of practice, the experiences of information, and the information actions. The experiences of information include information as participation, inspiration, collaboration, process, and artifact. Information actions include activities in the categories of gathering, thinking and creating. The experiences of information and the information actions intersect in the information practices, which are situated within a specific community of practice, such as a digital participatory community. Finally, the information practices interact and build upon one another; this is represented in a graphic model and explanation.

Relevance:

20.00%

Publisher:

Abstract:

Grounded theory, first developed by Glaser and Strauss in the 1960s, was introduced into nursing education as a distinct research methodology in the 1970s. The theory is grounded in a critique of the dominant contemporary approach to social inquiry, which imposed "enduring" theoretical propositions onto study data. Rather than starting from a set theoretical framework, grounded theory relies on researchers distinguishing meaningful constructs from generated data and then identifying an appropriate theory. Grounded theory is thus particularly useful in investigating complex issues and behaviours not previously addressed, as well as concepts and relationships in particular populations or places that are still undeveloped or weakly connected. Grounded theory data analysis processes include open, axial and selective levels of coding. The purpose of this article was to explore the grounded theory research process and provide an initial understanding of this methodology.

Relevance:

20.00%

Publisher:

Abstract:

The ability to estimate asset reliability and the probability of failure is critical to reducing maintenance costs, operational downtime, and safety hazards. Predicting the survival time and the probability of failure in future time is an indispensable requirement in prognostics and asset health management. In traditional reliability models, the lifetime of an asset is estimated using failure event data alone; however, statistically sufficient failure event data are often difficult to attain in real-life situations due to poor data management, effective preventive maintenance, and the small population of identical assets in use. Condition indicators and operating environment indicators are two types of covariate data that are normally obtained in addition to failure event and suspended data. These data contain significant information about the state and health of an asset: condition indicators reflect the level of degradation of assets, while operating environment indicators accelerate or decelerate the lifetime of assets. When these data are available, an alternative to traditional reliability analysis is to model the condition indicators, the operating environment indicators, and their failure-generating mechanisms using a covariate-based hazard model.

The literature review indicates that a number of covariate-based hazard models have been developed, all of them based on the underlying theory of the Proportional Hazards Model (PHM). However, most of these models have not attracted much attention in the field of machinery prognostics. Moreover, due to the prominence of PHM, attempts at developing alternative models have to some extent been stifled, although a number of alternatives to PHM have been suggested. The existing covariate-based hazard models neglect to fully utilise the three types of asset health information (failure event data, i.e. observed and/or suspended, condition data, and operating environment data) in one model for more effective hazard and reliability predictions. In addition, current research shows that condition indicators and operating environment indicators have different characteristics and are non-homogeneous covariate data: condition indicators act as response (dependent) variables, whereas operating environment indicators act as explanatory (independent) variables. Nevertheless, these non-homogeneous covariate data were modelled in the same way for hazard prediction in the existing covariate-based hazard models. The related and yet more imperative question is how both of these indicators should be effectively modelled and integrated into a covariate-based hazard model.

This work presents a new approach to addressing the aforementioned challenges. The new covariate-based hazard model, termed the Explicit Hazard Model (EHM), explicitly and effectively incorporates all three types of available asset health information into the modelling of hazard and reliability predictions, and also derives the relationship between actual asset health and both condition measurements and operating environment measurements. The theoretical development of the model and its parameter estimation method are demonstrated in this work. EHM assumes that the baseline hazard is a function of both time and the condition indicators.

Condition indicators provide information about the health condition of an asset; they therefore update and reform the baseline hazard of EHM according to the health state of the asset at a given time t. Examples of condition indicators include the vibration of rotating machinery, the level of metal particles in engine oil analysis, and wear in a component, to name but a few. Operating environment indicators in this model are failure accelerators and/or decelerators that are included in the covariate function of EHM and may increase or decrease the value of the hazard relative to the baseline hazard. These indicators are caused by the environment in which an asset operates and have not been explicitly captured by the condition indicators (e.g. loads, environmental stresses, and other dynamically changing environmental factors). While the effects of the operating environment indicators could be nil in EHM, the condition indicators are always present, because they are observed and measured for as long as an asset is operational and has survived.

EHM has several advantages over the existing covariate-based hazard models. One is that it utilises three different sources of asset health data (population characteristics, condition indicators, and operating environment indicators) to effectively predict hazard and reliability. Another is that EHM explicitly investigates the relationship between the condition and operating environment indicators associated with the hazard of an asset. Furthermore, the proportionality assumption, from which most covariate-based hazard models suffer, does not exist in EHM. Depending on the sample size of failure/suspension times, EHM is developed in two forms: semi-parametric and non-parametric. The semi-parametric EHM assumes a specified lifetime distribution (i.e. the Weibull distribution) for the form of the baseline hazard. However, in many industry applications failure event data are sparse, and the analysis of such data often involves complex distributional shapes about which little is known. Therefore, to avoid the restrictive assumption of a specified lifetime distribution for the failure event histories, the non-parametric EHM, a distribution-free model, has been developed. The development of EHM in two forms is another merit of the model.

A case study was conducted using laboratory experiment data to validate the practicality of both the semi-parametric and non-parametric EHMs. The performance of the newly developed models is appraised by comparing their estimated results with those of the other existing covariate-based hazard models. The comparison demonstrated that both the semi-parametric and non-parametric EHMs outperform the existing covariate-based hazard models. Future research directions are also identified, including the new parameter estimation method in the case of time-dependent covariate effects and missing data, the application of EHM to both repairable and non-repairable systems using field data, and a decision support model linked to the estimated reliability results.
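
The abstract describes EHM only in words; a minimal numerical sketch of a hazard of the stated form (a Weibull baseline reformed by a condition indicator, with operating environment covariates acting as multiplicative accelerators) might look like the following. Every functional form, parameter value, and variable name here is an assumption for illustration, not the thesis's actual model:

```python
import math

# Sketch in the spirit of EHM: the baseline hazard depends on both time and a
# condition indicator z(t); operating environment covariates w enter a
# multiplicative covariate function. All forms and values are illustrative.
beta_shape, eta_scale = 2.5, 1000.0  # assumed Weibull baseline parameters
alpha = 0.8                          # assumed weight on the condition indicator
gamma = [0.4, -0.2]                  # assumed operating environment coefficients

def hazard(t, z, w):
    """h(t) = h0(t, z) * exp(gamma . w), with a Weibull baseline reformed by z."""
    h0 = (beta_shape / eta_scale) * (t / eta_scale) ** (beta_shape - 1)
    h0 *= math.exp(alpha * z)  # condition indicator updates the baseline
    return h0 * math.exp(sum(g * wi for g, wi in zip(gamma, w)))

def reliability(t, z_of, w, steps=1000):
    """R(t) = exp(-cumulative hazard), midpoint-rule numerical integration."""
    dt = t / steps
    H = sum(hazard((i + 0.5) * dt, z_of((i + 0.5) * dt), w) * dt
            for i in range(steps))
    return math.exp(-H)

# Example: a vibration level rising linearly with operating time,
# under an assumed high-load, mild-stress operating environment.
z_of = lambda t: 0.001 * t
print(f"R(500 h) = {reliability(500.0, z_of, [1.0, 0.5]):.4f}")
```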

Relevance:

20.00%

Publisher:

Abstract:

Power system restoration after a large-area outage involves many factors, and the procedure is usually very complicated. A decision-making support system can therefore be developed to find the optimal black-start strategy. In order to evaluate candidate black-start strategies, several indices, usually both qualitative and quantitative, are employed. However, it may not be possible to synthesise these indices directly, and different degrees of interaction may exist among them. In the existing black-start decision-making methods, qualitative and quantitative indices cannot be well synthesised, and the interactions among different indices are not taken into account. The vague set, an extended version of the well-developed fuzzy set, can be employed to deal with decision-making problems with interacting attributes. Against this background, the vague set is first employed in this work to represent the indices so as to facilitate comparisons among them. Then, the concept of the vague-valued fuzzy measure is presented, and on that basis a mathematical model for black-start decision-making is developed. Compared with existing methods, the proposed method can deal with the interactions among indices and represent the fuzzy information more reasonably. Finally, an actual power system is employed to demonstrate the basic features of the developed model and method.
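
The abstract leaves the vague set machinery implicit; the core idea is that each index value is an interval [t, 1 - f] bounded by truth and falsity memberships, as in this minimal sketch (the scores and the ranking function are illustrative assumptions, not the paper's model):

```python
from dataclasses import dataclass

# A vague set assigns each element an interval [t, 1 - f]: t is the degree of
# support (truth membership), f the degree of opposition (false membership),
# with t + f <= 1; the gap 1 - t - f expresses hesitation.
@dataclass
class VagueValue:
    t: float  # truth membership
    f: float  # false membership (t + f must not exceed 1)

    def interval(self):
        return (self.t, 1.0 - self.f)

    def score(self):
        """A common ranking score: support minus opposition."""
        return self.t - self.f

# Two candidate black-start strategies rated on one index (made-up values)
a = VagueValue(t=0.6, f=0.2)  # interval [0.6, 0.8]
b = VagueValue(t=0.5, f=0.1)  # interval [0.5, 0.9]
print(a.interval(), a.score())
print(b.interval(), b.score())
```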

Relevance:

20.00%

Publisher:

Abstract:

In this paper, a new comprehensive planning methodology is proposed for implementing distribution network reinforcement. Load growth, voltage profile, distribution line loss, and reliability are considered in this procedure. A time-segmentation technique is employed to reduce the computational load. The options considered range from supporting the load growth using the traditional approach of upgrading the conventional equipment in the distribution network, through to the use of dispatchable distributed generators (DDGs). The objective function is composed of the construction cost, the loss cost and the reliability cost. As constraints, the bus voltages and the feeder currents must be maintained within standard levels, and for efficiency reasons the DDG output power should not fall below a given fraction of its rated power. A hybrid optimization method, called modified discrete particle swarm optimization, is employed to solve this nonlinear, discrete optimization problem. A comparison is performed between the solution optimized by planning capacitors, tap-changing transformers and line upgrades alone, and the solution obtained when DDGs are also included in the optimization.
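
The paper's modified discrete particle swarm optimization is not specified in the abstract; a generic discrete PSO sketch, with positions rounded to integer upgrade options and a toy objective standing in for the construction/loss/reliability cost, conveys the idea (everything here is a stand-in for the actual planning model):

```python
import random

random.seed(1)

# Choose one upgrade option (0..3) per feeder to minimise an illustrative cost.
N_FEEDERS, OPTIONS, SWARM, ITERS = 5, 4, 12, 60

def cost(x):  # toy objective standing in for construction + loss + reliability cost
    return sum((o - 2) ** 2 + 0.5 * o for o in x)

def clamp(o):  # discretise a continuous position back to a valid option
    return max(0, min(OPTIONS - 1, int(round(o))))

particles = [[random.randrange(OPTIONS) for _ in range(N_FEEDERS)]
             for _ in range(SWARM)]
vel = [[0.0] * N_FEEDERS for _ in range(SWARM)]
pbest = [p[:] for p in particles]
gbest = min(pbest, key=cost)

for _ in range(ITERS):
    for i, p in enumerate(particles):
        for d in range(N_FEEDERS):
            r1, r2 = random.random(), random.random()
            vel[i][d] = (0.7 * vel[i][d] + 1.5 * r1 * (pbest[i][d] - p[d])
                         + 1.5 * r2 * (gbest[d] - p[d]))
            p[d] = clamp(p[d] + vel[i][d])
        if cost(p) < cost(pbest[i]):
            pbest[i] = p[:]
    gbest = min(pbest + [gbest], key=cost)

print("best plan:", gbest, "cost:", cost(gbest))
```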

Relevance:

20.00%

Publisher:

Abstract:

Climate change and land-use pressures are making environmental monitoring increasingly important. As environmental health is degrading at an alarming rate, ecologists have tried to tackle the problem by monitoring the composition and condition of the environment. However, traditional monitoring methods relying on experts are manual and expensive; to address this issue, government organisations designed a simpler and faster surrogate-based assessment technique for consultants, landholders and ordinary citizens. However, it remains complex, subjective and error-prone, which makes the collected data difficult to interpret and compare. In this paper we describe a work-in-progress mobile application designed to address these shortcomings through the use of augmented reality and multimedia smartphone technology.

Relevance:

20.00%

Publisher:

Abstract:

Lankes and Silverstein (2006) introduced the "participatory library" and suggested that the nature and form of the library should be explored. Over the last several years, some attempts have been made to develop contemporary library models, often known as Library 2.0. However, little of this research has been based on empirical data, and such models have had a strong focus on technical aspects but less focus on participation. The research presented in this paper fills this gap. A grounded theory approach was adopted for the study: six librarians took part in in-depth individual interviews. As a preliminary result, five main factors of the participatory library emerged: technological, human, educational, socio-economic, and environmental. Five factors influencing participation in libraries were also identified: finance, technology, education, awareness, and policy. The study's findings provide a fresh perspective on the contemporary library and create a basis for further studies in this area.

Relevance:

20.00%

Publisher:

Abstract:

Whole-body computer control interfaces present new opportunities to engage children with games for learning. Stomp is a suite of educational games that uses such technology, allowing young children to use their whole body to interact with a digital environment projected on the floor. To maximise the effectiveness of this technology, tenets of self-determination theory (SDT) are applied to the design of Stomp experiences. By meeting user needs for competence, autonomy, and relatedness, our aim is to increase children's engagement with the Stomp learning platform. Analysis of Stomp's design suggests that these tenets are met. Observations from a case study of Stomp being used by young children show that they were highly engaged and motivated by it. This analysis demonstrates that continued application of SDT to Stomp will further enhance user engagement. It is also suggested that SDT, when applied more widely to other whole-body multi-user interfaces, could produce similar positive effects.