301 results for Bayesian operation


Relevance:

20.00%

Publisher:

Abstract:

This paper proposes the use of Bayesian approaches with the cross likelihood ratio (CLR) as a criterion for speaker clustering within a speaker diarization system, using eigenvoice modeling techniques. The CLR has previously been shown to be an effective decision criterion for speaker clustering using Gaussian mixture models. Recently, eigenvoice modeling has become an increasingly popular technique, due to its ability to adequately represent a speaker based on sparse training data, as well as to better capture differences in speaker characteristics. The integration of eigenvoice modeling into the CLR framework, to capitalize on the advantages of both techniques, has also been shown to be beneficial for the speaker clustering task. Building on that success, this paper proposes the use of Bayesian methods to compute the conditional probabilities within the CLR, thus effectively combining the eigenvoice-CLR framework with the advantages of a Bayesian approach to the diarization problem. Results obtained on the 2002 Rich Transcription (RT-02) Evaluation dataset show improved clustering performance, with a 33.5% relative improvement in overall Diarization Error Rate (DER) compared to the baseline system.
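
The abstract does not reproduce the CLR itself; as a rough illustration of the decision criterion it builds on, the sketch below computes the standard GMM-based CLR between two clusters against a universal background model (UBM). The GMM stand-ins, feature dimensions, and toy data are assumptions; the eigenvoice and Bayesian extensions proposed in the paper are not shown.

```python
# Minimal sketch of the cross likelihood ratio (CLR), assuming GMM cluster
# models and a UBM; all data and model sizes here are illustrative.
import numpy as np
from sklearn.mixture import GaussianMixture

def fit_gmm(X, n_components=8, seed=0):
    """Fit a diagonal-covariance GMM to the feature frames of one cluster."""
    return GaussianMixture(n_components=n_components, covariance_type="diag",
                           random_state=seed).fit(X)

def clr(X_a, X_b, gmm_a, gmm_b, ubm):
    """CLR(a, b): how much better each cluster's frames are explained by the
    other cluster's model than by the UBM. GaussianMixture.score returns the
    mean log-likelihood per frame, i.e. (1/N) log p(X | model)."""
    return (gmm_b.score(X_a) - ubm.score(X_a)) + \
           (gmm_a.score(X_b) - ubm.score(X_b))

# Toy usage: in agglomerative clustering, merge the pair with the highest CLR.
rng = np.random.default_rng(0)
X_a, X_b = rng.normal(0.0, 1.0, (200, 12)), rng.normal(0.1, 1.0, (180, 12))
ubm = fit_gmm(np.vstack([X_a, X_b]))
print(clr(X_a, X_b, fit_gmm(X_a), fit_gmm(X_b), ubm))
```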

Relevance:

20.00%

Publisher:

Abstract:

The use of Bayesian methodologies for solving optimal experimental design problems has increased. Many of these methods have been found to be computationally intensive for design problems that require a large number of design points. This paper presents a simulation-based approach that can be used to solve optimal design problems in which one is interested in finding a large number of (near) optimal design points for a small number of design variables. The approach uses lower-dimensional parameterisations consisting of a few design variables, each of which generates multiple design points. Using this approach, one simply has to search over a few design variables rather than over a large number of optimal design points, providing substantial computational savings. The methodologies are demonstrated on four applications, including the selection of sampling times for pharmacokinetic and heat transfer studies, all involving nonlinear models. Several Bayesian design criteria are compared and contrasted, as are several different lower-dimensional parameterisation schemes for generating the many design points.
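
As an illustration of the parameterisation idea, the sketch below searches over just two design variables that generate fifteen sampling times; the one-compartment decay model, the prior, and the D-utility style criterion are assumptions for the example, not the paper's.

```python
# Minimal sketch: optimise 2 design variables (t_max, alpha) that generate
# n = 15 sampling times, instead of optimising the 15 times directly.
import numpy as np
from scipy.optimize import minimize

# Fixed prior draws (common random numbers keep the objective deterministic).
k_prior = np.random.default_rng(1).lognormal(np.log(0.3), 0.3, 200)

def design_points(t_max, alpha, n=15):
    """Map two design variables to n sampling times (power-law spacing)."""
    return t_max * (np.arange(1, n + 1) / n) ** alpha

def neg_utility(dv):
    """Negative pseudo-Bayesian D-utility for the toy model y = exp(-k t):
    expected log Fisher information about k, averaged over the prior."""
    t = design_points(*dv)
    s = -t * np.exp(-np.outer(k_prior, t))   # dy/dk, one row per prior draw
    info = np.einsum('ij,ij->i', s, s)       # scalar information per draw
    return -np.mean(np.log(info))

res = minimize(neg_utility, x0=[24.0, 1.0],
               bounds=[(1.0, 48.0), (0.2, 3.0)], method="L-BFGS-B")
t_opt = design_points(*res.x)                # 15 (near) optimal sampling times
```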

Relevance:

20.00%

Publisher:

Abstract:

The reliable operation of the electrical system at Callide Power Station is of extreme importance to the normal everyday running of the Station. This study applied reliability principles to analyse the electrical system at Callide Power Station. It was found that the expected outage cost increased exponentially as the level of maintenance declined, leading to the conclusion that even in a harsh electricity market, in which CS Energy pushes its plants to the limit, maintenance must not be neglected. A number of system configurations were found to increase the reliability of the system and reduce the expected outage costs, and several other advantages were identified as a result of applying reliability principles to the Callide electrical system configuration.
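
A minimal sketch of the kind of expected-outage-cost calculation such a reliability study rests on; every failure rate, repair time, cost figure, and the maintenance-degradation rule below is an invented placeholder, not data from the Callide study.

```python
# Toy expected-outage-cost model: failure rates scale up as maintenance is
# deferred, so expected cost grows sharply with declining maintenance.
failure_rate = {"transformer": 0.02, "switchboard": 0.05, "cable": 0.08}  # /yr
repair_hours = {"transformer": 72.0, "switchboard": 24.0, "cable": 8.0}
cost_per_hour = 50_000.0  # assumed lost-generation cost, $/h

def expected_outage_cost(maintenance_factor):
    """maintenance_factor = 1.0 means full maintenance; smaller values mean
    deferred maintenance, crudely modelled as inflated failure rates."""
    scale = 1.0 / maintenance_factor
    unavail_hours = sum(failure_rate[c] * scale * repair_hours[c]
                        for c in failure_rate)       # expected outage h/yr
    return unavail_hours * cost_per_hour             # expected $/yr

for m in (1.0, 0.75, 0.5, 0.25):
    print(f"maintenance {m:.2f}: ${expected_outage_cost(m):,.0f}/yr")
```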

Relevance:

20.00%

Publisher:

Abstract:

Advances in algorithms for approximate sampling from a multivariable target function have led to solutions to challenging statistical inference problems that would otherwise not be considered by the applied scientist. Such sampling algorithms are particularly relevant to Bayesian statistics, since the target function is the posterior distribution of the unobservables given the observables. In this thesis we develop, adapt and apply Bayesian algorithms, whilst addressing substantive applied problems in biology and medicine as well as other applications. For an increasing number of high-impact research problems, the primary models of interest are often sufficiently complex that the likelihood function is computationally intractable. Rather than discard these models in favour of inferior alternatives, a class of Bayesian "likelihood-free" techniques, often termed approximate Bayesian computation (ABC), has emerged in the last few years, which avoids direct likelihood computation by repeatedly sampling data from the model and comparing observed and simulated summary statistics. In Part I of this thesis we utilise sequential Monte Carlo (SMC) methodology to develop new algorithms for ABC that are more efficient in terms of the number of model simulations required and are almost black-box, since very little algorithmic tuning is required. In addition, we address the issue of deriving appropriate summary statistics for ABC via a goodness-of-fit statistic and indirect inference. Another important problem in statistics is the design of experiments: that is, how one should select the values of the controllable variables in order to achieve some design goal. The presence of parameter and/or model uncertainty is a computational obstacle when designing experiments, but can lead to inefficient designs if not accounted for correctly. The Bayesian framework accommodates such uncertainties in a coherent way. If the amount of uncertainty is substantial, it can be of interest to perform adaptive designs in order to accrue information and make better decisions about future design points. This is of particular interest if the data can be collected sequentially; in a sense, the current posterior distribution becomes the new prior distribution for the next design decision. Part II of this thesis creates new algorithms for Bayesian sequential design that accommodate parameter and model uncertainty using SMC. The algorithms are substantially faster than previous approaches, allowing the simulation properties of various design utilities to be investigated in a more timely manner. Furthermore, the approach offers convenient estimation of Bayesian utilities and other quantities that are particularly relevant in the presence of model uncertainty. Finally, Part III of this thesis tackles a substantive medical problem. A neurological disorder known as motor neuron disease (MND) progressively causes motor neurons to lose the ability to innervate muscle fibres, causing the muscles to eventually waste away. When this occurs the motor unit effectively 'dies'. There is no cure for MND, and fatality often results from a lack of muscle strength to breathe. The prognosis for many forms of MND (particularly amyotrophic lateral sclerosis (ALS)) is particularly poor, with patients usually surviving only a small number of years after the initial onset of disease. Measuring the progress of diseases of the motor units, such as ALS, is a challenge for clinical neurologists.
Motor unit number estimation (MUNE) is an attempt to directly assess underlying motor unit loss, rather than relying on indirect techniques such as muscle strength assessment, which is generally unable to detect progression owing to the body's natural attempts at compensation. Part III of this thesis builds upon a previous Bayesian technique that develops a sophisticated statistical model taking into account physiological information about motor unit activation and various sources of uncertainty. More specifically, we develop a more reliable MUNE method by marginalising over latent variables in order to improve the performance of a previously developed reversible jump Markov chain Monte Carlo sampler. We also make other subtle changes to the model and algorithm to improve the robustness of the approach.
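
As a pointer to the likelihood-free machinery Part I builds on, the sketch below shows plain ABC rejection sampling (the thesis's SMC-based refinements are not shown); the toy normal model, summary statistics, prior, and tolerance are all assumptions.

```python
# Minimal ABC rejection sketch: accept prior draws whose simulated summary
# statistics land within a tolerance of the observed summaries.
import numpy as np

rng = np.random.default_rng(0)
y_obs = rng.normal(loc=2.0, scale=1.0, size=100)   # "observed" data
s_obs = np.array([y_obs.mean(), y_obs.std()])      # summary statistics

def simulate(theta, n=100):
    """Simulator stand-in for a model with intractable likelihood."""
    return rng.normal(loc=theta, scale=1.0, size=n)

accepted, eps = [], 0.3                            # ABC tolerance
while len(accepted) < 500:
    theta = rng.uniform(-5.0, 5.0)                 # draw from the prior
    y_sim = simulate(theta)
    s_sim = np.array([y_sim.mean(), y_sim.std()])
    if np.linalg.norm(s_sim - s_obs) < eps:        # compare summaries
        accepted.append(theta)                     # keep as posterior draw

print("ABC posterior mean:", np.mean(accepted))
```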

Relevance:

20.00%

Publisher:

Abstract:

Reliable pollutant build-up prediction plays a critical role in the accuracy of urban stormwater quality modelling outcomes. However, water quality data collection is resource demanding compared to streamflow data monitoring, for which a greater quantity of data is generally available. Consequently, available water quality data sets span only relatively short time scales, unlike water quantity data. This constrains the ability to take due account of the variability associated with pollutant processes and natural phenomena, which in turn gives rise to uncertainty in modelling outcomes, as research has shown that pollutant loadings on catchment surfaces and rainfall within an area can vary considerably over space and time. The assessment of model uncertainty is therefore an essential element of informed decision making in urban stormwater management. This paper presents the application of a range of regression approaches, namely ordinary least squares regression, weighted least squares regression, and Bayesian weighted least squares regression, for estimating the uncertainty associated with pollutant build-up prediction using limited data sets. The study outcomes confirmed that ordinary least squares regression with fixed model inputs and limited observational data may not provide realistic estimates: the stochastic nature of the dependent and independent variables needs to be taken into consideration in pollutant build-up prediction. It was found that the Bayesian approach combined with Monte Carlo simulation provides a powerful tool that makes the best use of the available knowledge in the prediction, thereby offering a practical means of counteracting the limitations otherwise imposed on water quality modelling.
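
A minimal sketch of the Bayesian-plus-Monte-Carlo idea the abstract describes, applied to a toy build-up curve: posterior parameter draws are propagated to a prediction to give a credible band. The power-law build-up form, priors, and noise level are assumptions, not the paper's calibrated model.

```python
# Bayesian regression with Monte Carlo uncertainty propagation for a toy
# pollutant build-up curve B = a * D^b (D = antecedent dry days, assumed).
import numpy as np

rng = np.random.default_rng(0)
D = np.array([1, 2, 3, 5, 7, 10, 14], dtype=float)   # limited observations
B = 2.0 * D**0.4 * rng.lognormal(0.0, 0.15, D.size)  # noisy build-up (g/m^2)

# Linearise: log B = log a + b log D + noise, then conjugate Bayesian OLS
# with a zero-mean Gaussian prior on the coefficients.
X = np.column_stack([np.ones_like(D), np.log(D)])
y = np.log(B)
tau2, sigma2 = 10.0, 0.15**2                          # prior var, noise var
V = np.linalg.inv(X.T @ X / sigma2 + np.eye(2) / tau2)   # posterior cov
m = V @ (X.T @ y) / sigma2                                # posterior mean

# Monte Carlo: propagate posterior parameter draws to a prediction at D = 20.
draws = rng.multivariate_normal(m, V, size=5000)
pred = np.exp(draws @ np.array([1.0, np.log(20.0)]))
lo, hi = np.percentile(pred, [5, 95])                 # 90% credible band
print(f"build-up at D=20: {lo:.2f} to {hi:.2f} g/m^2")
```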

Relevance:

20.00%

Publisher:

Abstract:

This paper focuses on the super/sub-synchronous operation of the doubly fed induction generator (DFIG) system. The impact of a damping controller on the different modes of operation of the DFIG-based wind generation system is investigated. The coordinated tuning of the damping controller to enhance the damping of the oscillatory modes using the bacteria foraging (BF) technique is presented. Results from eigenvalue analysis are presented to demonstrate the effectiveness of the tuned damping controller in the DFIG system. The robustness of the damping controller is also investigated.
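
The abstract's tuning loop amounts to scoring candidate controller parameters by the damping of the system's oscillatory eigenvalues; the sketch below illustrates that objective on a toy second-order system, with a crude grid search standing in for the bacteria foraging algorithm. The state matrix is an assumption, not a linearised DFIG model.

```python
# Eigenvalue-based damping objective: the search maximises the minimum
# damping ratio over the oscillatory (complex) modes.
import numpy as np

def damping_ratios(A):
    """zeta = -Re(lambda) / |lambda| for each complex (oscillatory) mode."""
    lam = np.linalg.eigvals(A)
    osc = lam[np.abs(lam.imag) > 1e-6]
    return -osc.real / np.abs(osc)

def objective(gain):
    """Minimum damping ratio of a toy system whose damping term is shaped
    by the controller gain; a BF/PSO search would maximise this."""
    A = np.array([[0.0, 1.0],
                  [-25.0, -0.5 - gain]])
    z = damping_ratios(A)
    return z.min() if z.size else 1.0

best = max(np.linspace(0.0, 8.0, 81), key=objective)  # grid search stand-in
print("tuned gain:", best, "min damping:", objective(best))
```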

Relevance:

20.00%

Publisher:

Abstract:

An energy storage system (ESS) can provide ancillary services such as frequency regulation and reserves, as well as smooth the fluctuations of wind power outputs, and hence improve the security and economics of the power system concerned. The combined operation of a wind farm and an ESS has become a widely accepted operating mode. Hence, it appears necessary to consider this operating mode in transmission system expansion planning, and this is the issue systematically addressed in this work. Firstly, the relationship between the cost of the NaS-based ESS and its discharging cycle life is analyzed. A strategy for the combined operation of a wind farm and an ESS is next presented, so as to achieve a good compromise between the operating cost of the ESS and the smoothing of wind power output fluctuations. Then, a transmission system expansion planning model is developed, with the objective function to be minimized being the sum of the transmission investment costs, the investment and operating costs of ESSs, and the penalty cost of lost wind energy. An improved particle swarm optimization algorithm is employed to solve the developed planning model. Finally, the essential features of the developed model and adopted algorithm are demonstrated on 18-bus and 46-bus test systems.
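
A minimal sketch of the particle swarm optimisation loop such a planning model would be solved with (the paper's "improved" variant is not specified here); the two-variable quadratic objective is a stand-in for the investment-plus-operation-plus-penalty cost, and all coefficients are assumptions.

```python
# Standard PSO minimising a toy planning cost; in the paper the decision
# vector would encode line additions and ESS sizing instead.
import numpy as np

rng = np.random.default_rng(0)

def total_cost(x):
    """Toy stand-in: line investment + ESS cost + wind-energy penalty."""
    return (x[0] - 3.0)**2 + 2.0 * (x[1] - 1.0)**2 + 5.0

n, dim, iters = 30, 2, 100
x = rng.uniform(0.0, 5.0, (n, dim))       # particle positions (plan variables)
v = np.zeros((n, dim))                    # velocities
pbest = x.copy()
pcost = np.apply_along_axis(total_cost, 1, x)
gbest = pbest[pcost.argmin()].copy()

for _ in range(iters):
    r1, r2 = rng.random((n, dim)), rng.random((n, dim))
    v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
    x = x + v
    c = np.apply_along_axis(total_cost, 1, x)
    improved = c < pcost
    pbest[improved], pcost[improved] = x[improved], c[improved]
    gbest = pbest[pcost.argmin()].copy()

print("minimum-cost plan:", gbest)        # converges near [3, 1] here
```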

Relevance:

20.00%

Publisher:

Abstract:

Manufacturing companies have strived to enhance managerial and technical capabilities to improve business performance. Building these capabilities requires effective sharing of knowledge, a key strategic resource. Specifically, knowledge sharing (KS) between different manufacturing departments can improve manufacturing processes, since leveraging organisational knowledge plays an essential role in achieving competitive advantage. This paper presents an empirical investigation into the impact of KS on the effectiveness of supply chain management (SCM) and the product development process (PDP) in achieving desired business performance (BP). A questionnaire survey was administered to electronic manufacturing companies operating in Taiwan; 168 valid responses were received and used to statistically examine the relationships between the concepts (SCM, PDP, KS, BP). The study findings reveal that, within the Taiwanese electronic manufacturing companies, KS is an essential enabler for facilitating the effectiveness of SCM and PDP in achieving desired BP.

Relevance:

20.00%

Publisher:

Abstract:

Effective wayfinding is the successful interplay of human and environmental factors resulting in a person successfully moving from their current position to a desired location in a timely manner. To date this process has not been modelled to reflect this interplay. This paper proposes a complex-systems modelling approach to wayfinding, using Bayesian networks to model the process, and applies the model to airports. The model suggests that human factors have a greater impact on effective wayfinding in airports than environmental factors. The greatest influences on human factors are found to be the level of spatial anxiety experienced by travellers and their cognitive and spatial skills. The model also predicts that the navigation pathway a traveller must traverse has a larger impact on the effectiveness of an airport's environment in promoting effective wayfinding than the terminal design.
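
A toy Bayesian network in the spirit of the model described, with a binary human-factors node and a binary environment node feeding a wayfinding-success node; all probabilities below are invented for illustration (chosen so human factors dominate, as the model predicts) and are not the paper's calibrated values.

```python
# Tiny two-parent Bayesian network evaluated by enumeration.
from itertools import product

p_H = {True: 0.6, False: 0.4}   # adequate spatial/cognitive skills, low anxiety
p_E = {True: 0.7, False: 0.3}   # supportive terminal environment
# P(success | H, E): human factors weigh more heavily than environment.
p_success = {(True, True): 0.95, (True, False): 0.80,
             (False, True): 0.45, (False, False): 0.15}

# Marginal probability of effective wayfinding: sum over parent states.
p_wf = sum(p_H[h] * p_E[e] * p_success[(h, e)]
           for h, e in product([True, False], repeat=2))

# Diagnostic query by Bayes' rule: P(H = True | wayfinding succeeded).
p_h_given_wf = sum(p_H[True] * p_E[e] * p_success[(True, e)]
                   for e in [True, False]) / p_wf

print(f"P(success) = {p_wf:.3f}, P(H | success) = {p_h_given_wf:.3f}")
```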

Relevance:

20.00%

Publisher:

Abstract:

A hydrogen gas sensor based on a Pt/nanostructured ZnO Schottky diode has been developed. Our proposed theoretical model explains the superior dynamic performance of the reverse-biased diode compared with forward-bias operation. The sensor was evaluated with low-concentration H2 gas exposures over a temperature range of 280°C to 430°C. Upon exposure to H2 gas, the effective change in free carrier concentration at the Pt/nanostructured ZnO interface is amplified by an enhancement factor, effectively lowering the reverse barrier and producing a large voltage shift. The lowering of the reverse barrier permits a faster response in reverse-bias operation than in forward-bias operation.
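
As a back-of-the-envelope illustration of the mechanism described, the sketch below uses the standard thermionic-emission relation: an assumed hydrogen-induced lowering of the effective barrier changes the saturation current exponentially, which at a fixed bias current appears as a voltage shift. The barrier heights, diode area, and Richardson constant are assumptions, not the paper's measured values.

```python
# Thermionic emission: I_s = A A* T^2 exp(-phi_B / kT); a barrier lowering
# delta_phi multiplies I_s by exp(delta_phi / kT).
import numpy as np

k_B = 8.617e-5                  # Boltzmann constant, eV/K
T = 300.0 + 273.15              # K; within the 280-430 C operating range

def saturation_current(phi_b, area=1e-6, a_star=32.0):
    """phi_b in eV, area in m^2, a_star in A cm^-2 K^-2 (units illustrative)."""
    return area * a_star * T**2 * np.exp(-phi_b / (k_B * T))

phi_air, phi_h2 = 0.80, 0.72    # eV; assumed 80 meV hydrogen-induced lowering
ratio = saturation_current(phi_h2) / saturation_current(phi_air)
# At a fixed measurement current, an ideal diode's I-V curve translates by
# roughly the barrier change expressed in volts.
delta_v = phi_air - phi_h2      # ~0.08 V shift, order of magnitude only
print(f"current ratio {ratio:.1f}, approx voltage shift {delta_v:.2f} V")
```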

Relevance:

20.00%

Publisher:

Abstract:

A Bus Rapid Transit (BRT) station is the interface between passengers and service. The station is crucial to line operation, as it is typically the only location where buses can pass each other. Congestion may occur here when buses maneuvering into and out of the platform lane interfere with bus flow, or when a queue of buses forms upstream of the platform lane, blocking the passing lane. However, some systems include operation where express buses pass the critical station, resulting in a proportion of non-stopping buses. It is important to understand the operation of the critical busway station under this type of operation, as it affects busway line capacity. This study uses microsimulation to model BRT station operation and to analyze the relationship between the station's limit-state bus capacity (B_ls) and total bus capacity (B_ttl). First, a simulation model is developed for the limit-state scenario; a mathematical model is then defined and calibrated for a specified range of controlled scenarios of dwell time mean and coefficient of variation. Thereafter, the proposed B_ls model is extended to consider non-stopping buses and the B_ttl model is defined. The proposed models provide a better understanding of BRT line capacity and are useful to transit authorities in designing better BRT operations.
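
For orientation, the sketch below evaluates a classic dwell-time-based loading-area capacity relation (in the spirit of the Transit Capacity and Quality of Service Manual) to show how the dwell-time mean and coefficient of variation drive station bus capacity; it is not the B_ls or B_ttl model calibrated in the paper, and the clearance time and failure-rate z-score are assumed values.

```python
# Loading-area capacity with an operating margin for dwell-time variability:
# B = 3600 (g/C) / (t_c + t_d (g/C) + z * c_v * t_d)

def station_capacity(t_d, c_v, t_c=10.0, z=1.28, g_over_c=1.0):
    """Buses/hour through one platform: t_d = mean dwell (s), c_v = coeff.
    of variation of dwell, t_c = clearance time (s), z = z-score for the
    design failure rate, g_over_c = green ratio (1.0 off-signal)."""
    t_om = z * c_v * t_d          # operating margin for dwell variability
    return 3600.0 * g_over_c / (t_c + t_d * g_over_c + t_om)

# Higher dwell variability erodes capacity even at the same mean dwell.
for c_v in (0.2, 0.4, 0.6):
    print(f"c_v = {c_v}: {station_capacity(t_d=30.0, c_v=c_v):.0f} buses/h")
```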

Relevance:

20.00%

Publisher:

Abstract:

Approximate Bayesian computation has become an essential tool for the analysis of complex stochastic models when the likelihood function is numerically unavailable. However, the well-established statistical method of empirical likelihood provides another route to such settings that bypasses simulations from the model and the choices of the approximate Bayesian computation parameters (summary statistics, distance, tolerance), while being convergent in the number of observations. Furthermore, bypassing model simulations may lead to significant time savings in complex models, for instance those found in population genetics. The Bayesian computation with empirical likelihood algorithm we develop in this paper also provides an evaluation of its own performance through an associated effective sample size. The method is illustrated using several examples, including estimation of standard distributions, time series, and population genetics models.
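
A minimal sketch of the Bayesian-computation-with-empirical-likelihood idea for the simplest case, a population mean: prior draws of theta are importance-weighted by Owen's profile empirical likelihood, with no simulation from the model, and an effective sample size monitors performance as the abstract mentions. The normal prior, toy data, and mean-only estimating equation are assumptions; the paper's applications use richer constraints.

```python
# Empirical-likelihood-weighted prior draws: no model simulation required.
import numpy as np
from scipy.optimize import brentq

rng = np.random.default_rng(0)
x = rng.normal(2.0, 1.0, size=50)          # observed data

def log_el(theta):
    """Profile empirical log-likelihood for E[X] = theta (Owen's EL).
    Weights w_i = 1 / (n (1 + lam z_i)) with lam solving the dual score."""
    z = x - theta
    if z.max() <= 0 or z.min() >= 0:       # theta outside data range: EL = 0
        return -np.inf
    def g(lam):                            # monotone decreasing in lam
        return np.sum(z / (1.0 + lam * z))
    eps = 1e-8
    lam = brentq(g, -1.0 / z.max() + eps, -1.0 / z.min() - eps)
    return -np.sum(np.log(1.0 + lam * z)) - x.size * np.log(x.size)

# Importance sampling: weight prior draws by their empirical likelihood.
thetas = rng.normal(0.0, 5.0, size=2000)   # draws from the prior
logw = np.array([log_el(t) for t in thetas])
w = np.exp(logw - logw.max()); w /= w.sum()
post_mean = np.sum(w * thetas)
ess = 1.0 / np.sum(w**2)                   # effective sample size diagnostic
print(f"posterior mean {post_mean:.3f}, ESS {ess:.0f}")
```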

Relevance:

20.00%

Publisher:

Abstract:

In this paper we present a new simulation methodology for obtaining exact or approximate Bayesian inference for models for low-valued count time series data that have computationally demanding likelihood functions. The algorithm fits within the framework of particle Markov chain Monte Carlo (PMCMC) methods. The particle filter requires only model simulations and, in this regard, our approach has connections with approximate Bayesian computation (ABC). However, an advantage of using the PMCMC approach in this setting is that simulated data can be matched with observed data one at a time, rather than attempting to match the full dataset simultaneously or a low-dimensional non-sufficient summary statistic, as is common practice in ABC. For low-valued count time series data we find that it is often computationally feasible to match simulated data with observed data exactly. Our particle filter maintains N particles by repeating the simulation until N+1 exact matches are obtained. Our algorithm yields an unbiased estimate of the likelihood, resulting in exact posterior inference when included in an MCMC algorithm. In cases where exact matching is computationally prohibitive, a tolerance is introduced as per ABC. A novel aspect of our approach is that we introduce auxiliary variables into the particle filter so that partially observed and/or non-Markovian models can be accommodated. We demonstrate that Bayesian model choice problems can be easily handled in this framework.
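
A minimal sketch of the "repeat until N+1 exact matches" device described above, for an iid Poisson toy model (iid keeps the sketch free of state propagation; the paper's filter handles dependent and partially observed models inside PMCMC). Each likelihood term is estimated by the negative-binomial identity p-hat = N/(T-1), where T is the number of simulations needed to reach N+1 matches, which keeps the estimate unbiased.

```python
# Exact-matching likelihood estimation for low-valued counts.
import numpy as np

rng = np.random.default_rng(0)
y = np.array([1, 2, 2, 3, 1])             # observed low-valued counts

def log_lik_hat(lam, N=100, max_tries=10**6):
    """Unbiased log-likelihood estimate for an iid Poisson(lam) model by
    simulating until N+1 exact matches per observation."""
    total = 0.0
    for y_t in y:
        matches, tries = 0, 0
        while matches < N + 1:            # simulate until N+1 exact matches
            tries += 1
            if tries > max_tries:
                return -np.inf            # guard when lam is far from data
            if rng.poisson(lam) == y_t:
                matches += 1
        total += np.log(N / (tries - 1))  # unbiased estimate of P(Y = y_t)
    return total

# Compare two parameter values by their estimated log-likelihoods.
print(log_lik_hat(1.8), log_lik_hat(4.0))
```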

Relevance:

20.00%

Publisher:

Abstract:

Database security techniques are widely available. Among these techniques, encryption is a well-certified and established technology for protecting sensitive data. However, once encrypted, the data can no longer be easily queried. Database performance depends on how the sensitive data are encrypted and on the searching and retrieval approach implemented. In this paper we analyze database queries and data properties and propose a suitable mechanism for querying the encrypted database. We propose and analyze a new database encryption algorithm using a Bloom filter with the bucket index method. Finally, we demonstrate the superiority of the proposed algorithm through several experiments, the results of which should be useful for database-encryption research and application activities.
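
A minimal sketch of equality search over encrypted rows via a bucket index plus Bloom-filter bits, in the spirit of the scheme described: each ciphertext row stores a coarse bucket id and a small bit signature, so queries can discard non-matching rows without decryption. The hash constructions, filter size, and bucket count are assumptions, and the placeholder strings stand in for real ciphertexts.

```python
# Bucket index + Bloom filter over "encrypted" rows; only candidate rows
# (bucket match and bit-superset match) would ever be decrypted and checked.
import hashlib

M, K = 64, 3                             # filter bits, hash functions

def bloom_bits(value: str) -> int:
    """K hash positions packed into an M-bit integer."""
    bits = 0
    for i in range(K):
        h = hashlib.sha256(f"{i}:{value}".encode()).digest()
        bits |= 1 << (int.from_bytes(h[:4], "big") % M)
    return bits

def bucket(value: str, n_buckets: int = 16) -> int:
    """Coarse bucket index stored alongside the ciphertext."""
    return int.from_bytes(hashlib.sha256(value.encode()).digest()[:2],
                          "big") % n_buckets

# Indexed table rows: (opaque ciphertext, bucket id, bloom bits).
rows = [("<ct1>", bucket("alice"), bloom_bits("alice")),
        ("<ct2>", bucket("bob"), bloom_bits("bob"))]

def candidates(query: str):
    """Rows that may match the query value (false positives possible)."""
    qb, qf = bucket(query), bloom_bits(query)
    return [ct for ct, b, f in rows if b == qb and (f & qf) == qf]

print(candidates("alice"))               # ['<ct1>'] plus any false positives
```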