599 results for Effectiveness Estimation

Relevance: 20.00%

Abstract:

Parliamentary questions are an integral part of most Westminster parliamentary systems, serving as a major form of legislative oversight and constituency service (Glassman 2008). There are two types of parliamentary questions: ‘questions without notice’ and ‘questions on notice’. Questions without notice are asked and answered orally during ‘Question Time’; questions on notice are asked in writing, and the relevant minister answers in writing. Parliamentary questions provide a mechanism for holding the executive accountable on the floor of the House, and barely ‘any aspect of the executive department’s powers and activities can be shielded from questions’ (Crick 1964: 237). In terms of media coverage, this practice is the most widely reported legislative device; to a casual observer, the working of parliament is therefore synonymous with Question Time.

Abstract:

The common approach to estimating bus dwell time at a BRT station platform is to apply the traditional dwell time methodology derived for suburban bus stops. Current dwell time models are sensitive to bus type, fare collection policy, and the number of boarding and alighting passengers. However, they fall short in accounting for the effect of passengers walking along a comparatively long BRT station platform. Analysis presented in this paper shows that the average walking time of a passenger at a BRT platform is about ten times that at a bus stop. The need to walk to the bus entry door at the BRT station platform may therefore lead to the bus experiencing a higher dwell time. This paper presents a theory for a BRT network that explains the loss of station capacity during peak-period operation. It also highlights shortcomings of currently available bus dwell time models suggested for the analysis of BRT operation.
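The dwell-time effect described above can be sketched with a minimal linear model. The coefficients, the parallel-door assumption and the `dwell_time` function below are illustrative assumptions, not values from the paper:

```python
# Illustrative linear dwell model with an added walking-time term.
# All coefficients are hypothetical placeholders, not calibrated values.

BOARD_TIME = 4.0    # seconds per boarding passenger (assumed)
ALIGHT_TIME = 2.5   # seconds per alighting passenger (assumed)
DOOR_TIME = 5.0     # fixed door open/close overhead in seconds (assumed)

def dwell_time(boardings, alightings, walk_time=0.0):
    """Traditional linear dwell model plus an optional walking delay.

    walk_time represents the extra seconds a passenger needs to walk
    along a long BRT platform to reach the bus entry door; at a
    suburban stop it is typically negligible.
    """
    # Boarding and alighting usually proceed in parallel through
    # different doors, so the slower stream governs service time.
    service = max(boardings * BOARD_TIME, alightings * ALIGHT_TIME)
    return DOOR_TIME + service + walk_time

# Same passenger demand; the BRT platform walk takes ~10x longer.
stop_dwell = dwell_time(10, 5, walk_time=2.0)    # suburban stop
brt_dwell = dwell_time(10, 5, walk_time=20.0)    # BRT platform
```

With identical demand, the longer platform walk alone raises the modelled dwell from 47 s to 65 s, which is the capacity-loss mechanism the paper highlights.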

Abstract:

A basic element in advertising strategy is the choice of an appeal. In business-to-business (B2B) marketing communication, a long-standing approach relies on literal, factual, benefit-laden messages. Given the highly complex, costly and involved processes of business purchases, such approaches are certainly understandable. This project challenges the traditional B2B approach and asks whether an alternative approach, using symbolic messages that operate at a more intrinsic or emotional level, is effective in the B2B arena. As an alternative to literal (factual) messages, an emerging body of literature asserts that stronger, more enduring results can be achieved through symbolic messages (imagery or text) in an advertisement. The present study contributes to this stream of research. From a theoretical standpoint, the study explores differences in literal-symbolic message content in B2B advertisements. There has been much discussion, mainly in the consumer literature, of the ability of symbolic messages to motivate a prospect to process advertising information by necessitating more elaborate processing and comprehension. Business buyers are regarded as less receptive to indirect or implicit appeals because their purchase decisions are based on direct evidence of product superiority. It is argued here that these same buyers may be equally influenced by advertising that stimulates internally directed motivation, feelings and cognitions about the brand. Thus far, studies on the effects of literalism and symbolism are fragmented, and few focus on the B2B market. While there have been many studies of the effects of symbolism, no adequate scale exists to measure the literalism-symbolism continuum. Therefore, a first task for this study was to develop such a scale. Following scale development, a content analysis of 748 B2B print advertisements was undertaken to investigate whether differences in literalism-symbolism led to higher advertising performance.
Variations across time and industry were also measured. From a practical perspective, the results challenge the prevailing B2B practice of relying on literal messages. While definitive support was not established for the use of symbolic message content, literal messages also failed to predict advertising performance. If the ‘fact- and benefit-laden’ assumption within B2B advertising cannot be supported, then other approaches used in the business-to-consumer (B2C) sector, such as symbolic messages, may also be appropriate in business markets. Further research will need to test the potential effects of such messages, thereby building a revised foundation that can help drive advances in B2B advertising. Finally, the study offers a contribution to the growing body of knowledge on symbolism in advertising. While the specific focus of the study relates to B2B advertising, the Literalism-Symbolism scale developed here provides a reliable measure for evaluating literal and symbolic message content in all print advertisements. The value of this scale in advancing our understanding of message strategy may be significant in future consumer and business advertising research.

Abstract:

The ability to accurately predict the remaining useful life of machine components is critical for continuous machine operation, and can also improve productivity and enhance system safety. In condition-based maintenance (CBM), maintenance is performed based on information collected through condition monitoring and assessment of machine health. Effective diagnostics and prognostics are important aspects of CBM, allowing maintenance engineers to schedule repairs and to acquire replacement components before the components actually fail. Although a variety of prognostic methodologies have been reported recently, their application in industry is still relatively new and mostly focused on the prediction of specific component degradations. Furthermore, they require a sufficiently large number of fault indicators to accurately predict component faults. Hence, better use of health indicators for effective interpretation of the machine degradation process is still required. Major challenges for accurate long-term prediction of remaining useful life (RUL) remain to be addressed. Therefore, continuous development and improvement of machine health management systems and accurate long-term prediction of machine remnant life are required in real industrial applications. This thesis presents an integrated diagnostics and prognostics framework based on health state probability estimation for accurate, long-term prediction of machine remnant life. In the proposed model, prior empirical (historical) knowledge is embedded in the integrated diagnostics and prognostics system for classification of impending faults in the machine system and accurate probability estimation of discrete degradation stages (health states). The methodology assumes that machine degradation consists of a series of degraded states (health states) which effectively represent the dynamic and stochastic process of machine failure.
The estimation of discrete health state probabilities for the prediction of machine remnant life is performed using classification algorithms. To select an appropriate classifier for health state probability estimation in the proposed model, comparative intelligent diagnostic tests were conducted using five different classifiers applied to progressive fault data for three different faults in a high-pressure liquefied natural gas (HP-LNG) pump. As a result of this comparison study, support vector machines (SVMs) were employed for health state probability estimation in this research. The proposed prognostic methodology has been successfully tested and validated using a number of case studies, from simulation tests to real industry applications. The results of two actual failure case studies using simulations and experiments indicate that accurate estimation of health states is achievable and that the proposed method provides accurate long-term prediction of machine remnant life. In addition, the results of experimental tests show that the proposed model can provide early warning of abnormal machine operating conditions by identifying the transitional states of machine fault conditions. Finally, the proposed prognostic model is validated through two industrial case studies. The optimal number of health states, which minimises the model training error without a significant decrease in prediction accuracy, was also examined across several health states of bearing failure. The results were very encouraging and show that the proposed prognostic model based on health state probability estimation has the potential to be used as a generic and scalable asset health estimation tool in industrial machinery.
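The health-state probability idea can be sketched as follows. The thesis uses SVM probability outputs; here, as a stand-in, a softmax over distances to assumed state centroids produces the probabilities, and the feature values and per-state RUL figures are invented for illustration:

```python
import numpy as np

# Health states: 0 = healthy, 1 = degraded, 2 = near failure.
# Assumed centroids of condition-monitoring features for each state.
CENTROIDS = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 2.0]])

# Assumed remaining life for each health state, in hours.
STATE_RUL = np.array([1000.0, 400.0, 50.0])

def state_probabilities(sample):
    """Probability of each health state given a feature vector.

    A softmax over negative centroid distances stands in here for the
    probability outputs of the SVM classifier used in the thesis.
    """
    d = np.linalg.norm(CENTROIDS - np.asarray(sample, dtype=float), axis=1)
    w = np.exp(-d)
    return w / w.sum()

def expected_rul(sample):
    """Remnant life as the probability-weighted average over states."""
    return float(state_probabilities(sample) @ STATE_RUL)

rul_healthy = expected_rul([0.0, 0.0])   # dominated by the healthy state
rul_failing = expected_rul([2.0, 2.0])   # dominated by the failure state
```

Because the RUL is a probability-weighted blend of states rather than a hard classification, rising probability mass in a transitional state provides the kind of early warning described above.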

Abstract:

This paper presents a method for measuring the in-bucket payload volume on a dragline excavator, for the purpose of estimating the material's bulk density in real time. Knowledge of the payload's bulk density can provide feedback to mine planning and scheduling to improve blasting, and thereby provide a more uniform bulk density across the excavation site. This allows a single optimal bucket size to be used for maximum overburden removal per dig, in turn reducing costs and emissions in dragline operation and maintenance. The proposed solution uses a range-bearing laser to locate and scan full buckets between the lift and dump stages of the dragline cycle. The bucket is segmented from the scene using cluster analysis, and the pose of the bucket is calculated using the Iterative Closest Point (ICP) algorithm. Payload points are identified using a known model and subsequently converted into a height grid for volume estimation. Results from both scaled and full-scale implementations show that this method can achieve an accuracy above 95%.
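The final height-grid step can be sketched as follows, assuming payload points that have already been segmented and transformed into the bucket frame; the cell size and the `volume_from_points` helper are illustrative, not the paper's implementation:

```python
import numpy as np

CELL = 0.1  # grid cell size in metres (assumed)

def volume_from_points(points, cell=CELL):
    """Estimate payload volume from (x, y, z) points via a height grid.

    Points are binned into cells in the x-y plane; the maximum z per
    cell approximates the visible payload surface, and the volume is
    the sum of cell heights times cell area.
    """
    pts = np.asarray(points, dtype=float)
    ix = np.floor(pts[:, 0] / cell).astype(int)
    iy = np.floor(pts[:, 1] / cell).astype(int)
    heights = {}
    for cx, cy, z in zip(ix, iy, pts[:, 2]):
        heights[(cx, cy)] = max(heights.get((cx, cy), 0.0), z)
    return sum(heights.values()) * cell * cell

# Sanity check: a flat 1 m x 1 m surface at height 0.5 m is ~0.5 m^3.
xs, ys = np.meshgrid(np.linspace(0.0, 0.95, 20), np.linspace(0.0, 0.95, 20))
pts = np.column_stack([xs.ravel(), ys.ravel(), np.full(400, 0.5)])
vol = volume_from_points(pts)
```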

Abstract:

Over recent years a significant amount of research has been undertaken to develop prognostic models that can be used to predict the remaining useful life of engineering assets. Implementations by industry have had only limited success. By design, models are subject to specific assumptions and approximations, some of which are mathematical, while others relate to practical implementation issues such as the amount of data required to validate and verify a proposed model. Appropriate model selection for successful practical implementation therefore requires not only a mathematical understanding of each model type, but also an appreciation of how a particular business intends to utilise a model and its outputs. This paper discusses business issues that need to be considered when selecting an appropriate modelling approach for trial. It also presents classification tables and process flow diagrams to assist industry and research personnel in selecting appropriate prognostic models for predicting the remaining useful life of engineering assets within their specific business environment. The paper then explores the strengths and weaknesses of the main prognostic model classes to establish what makes them better suited to certain applications than to others, and summarises how each has been applied to engineering prognostics. Consequently, this paper should provide a starting point for young researchers first considering options for remaining useful life prediction. The models described in this paper are knowledge-based (expert and fuzzy), life-expectancy (stochastic and statistical), artificial neural network, and physical models.

Abstract:

Asset health inspections can produce two types of indicators: (1) direct indicators (e.g. the thickness of a brake pad, or the crack depth on a gear) which directly relate to a failure mechanism; and (2) indirect indicators (e.g. indicators extracted from vibration signals and oil analysis data) which can only partially reveal a failure mechanism. While direct indicators enable more precise references to asset health condition, they are often more difficult to obtain than indirect indicators. The state space model provides an efficient approach to estimating direct indicators from indirect indicators. However, existing state space models for estimating direct indicators largely depend on assumptions such as discrete time, discrete state, linearity, and Gaussianity. The discrete time assumption requires fixed inspection intervals. The discrete state assumption entails discretising continuous degradation indicators, which often introduces additional errors. The linear and Gaussian assumptions are inconsistent with the nonlinear and irreversible degradation processes of most engineering assets. This paper proposes a state space model free of these assumptions. Monte Carlo-based algorithms are developed to estimate the model parameters and the remaining useful life, and are evaluated using numerical simulations in MATLAB. The results show that both the parameters and the remaining useful life are estimated accurately. Finally, the new state space model is used to process vibration and crack depth data from an accelerated gearbox test. In this application, the new state space model shows a better fit than a state space model with linear and Gaussian assumptions.
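A minimal particle filter shows how Monte Carlo algorithms can estimate a direct indicator (e.g. crack depth) from an indirect one (e.g. a vibration feature) without linear or Gaussian assumptions. The model forms, noise levels and names below are illustrative assumptions, not the paper's model:

```python
import numpy as np

rng = np.random.default_rng(1)

N = 2000  # number of particles

def propagate(x):
    # Irreversible, nonlinear degradation: the increment is always
    # positive and grows with the current depth.
    return x + np.abs(rng.normal(0.02 * (1 + x), 0.01, size=np.shape(x)))

def observe(x):
    # The indirect indicator rises nonlinearly with the direct one.
    return x ** 1.5

# Simulate a "true" degradation path and noisy indirect measurements.
true_x, xs, ys = 0.1, [], []
for _ in range(50):
    true_x = float(propagate(true_x))
    xs.append(true_x)
    ys.append(observe(true_x) + rng.normal(0.0, 0.05))

# Particle filter: propagate, weight by likelihood, resample.
particles = np.full(N, 0.1)
for y in ys:
    particles = propagate(particles)
    w = np.exp(-0.5 * ((y - observe(particles)) / 0.05) ** 2) + 1e-300
    particles = rng.choice(particles, size=N, p=w / w.sum())

estimate = float(particles.mean())   # estimated final crack depth
```

Because the filter only needs to simulate the transition and observation models, the monotone increments and the nonlinear observation map pose no difficulty, unlike for a Kalman-type filter.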

Abstract:

This study examined the effectiveness of the ‘lockout policy’, integrated within a broader police enforcement strategy, in reducing alcohol-related harm in and around late-night licensed premises in major drinking precincts. First-response operational police (n = 280) recorded all alcohol- and non-alcohol-related incidents they attended in and around late-night liquor trading premises. A before-and-after study design was used, with police completing modified activity logs prior to and following the introduction of the lockout policy in two policing regions: Gold Coast (n = 12,801 incidents) and Brisbane City/Fortitude Valley (n = 9,117 incidents). Qualitative information from key stakeholders (e.g., police, security staff and politicians; n = 20) was also obtained. The number of alcohol-related offences requiring police attention was significantly reduced in some policing areas and for some types of offences (e.g., sex offences, street disturbances, traffic incidents). However, there was no change for a number of other offence categories (e.g., assault). Interviews with licensees revealed that although all were initially opposed to the lockout policy, most perceived benefits from its introduction. This study was the first of its kind to comprehensively examine the impact of a lockout policy, and it provides supportive evidence that a lockout policy, integrated with police enforcement, can enhance public safety in some areas in and around late-night liquor trading premises.

Abstract:

Many traffic situations require drivers to cross or merge into a stream with higher priority. Gap acceptance theory enables us to model such processes to analyse traffic operation. This discussion demonstrated that numerical search, fine-tuned by statistical analysis, can be used to determine the most likely critical gap for a sample of drivers, based on each driver's largest rejected gap and accepted gap. This method shares some common features with the maximum likelihood estimation technique (Troutbeck 1992), but lends itself well to contemporary analysis tools such as spreadsheets and is particularly transparent analytically. The method is considered not to bias the critical gap estimate on account of very small or very large rejected gaps. However, it requires a sample large enough to give reasonable representation of largest rejected gap/accepted gap pairs within a fairly narrow highest-likelihood search band.
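The numerical search can be sketched as a grid search over the parameters of an assumed log-normal critical-gap distribution: each driver's critical gap lies between the largest rejected gap r and the accepted gap a, so that driver contributes F(log a) − F(log r) to the likelihood. The gap data and grid ranges below are invented for illustration:

```python
import math
from statistics import NormalDist

# Largest rejected gap / accepted gap pairs in seconds (illustrative).
pairs = [(2.1, 4.5), (3.0, 5.2), (1.8, 3.9), (2.7, 4.1),
         (3.5, 6.0), (2.4, 4.8), (3.1, 5.5), (2.0, 3.6)]

def log_likelihood(mu, sigma):
    """Log-likelihood for log-normal critical gaps: each driver's
    critical gap lies between the largest rejected and accepted gap,
    so each pair contributes F(log a) - F(log r)."""
    F = NormalDist(mu, sigma).cdf
    return sum(math.log(max(F(math.log(a)) - F(math.log(r)), 1e-300))
               for r, a in pairs)

# Transparent grid search over (mu, sigma) of log(critical gap), in
# the spirit of a spreadsheet-based numerical search.
grid = ((m / 100, s / 100) for m in range(80, 161) for s in range(5, 61))
best_mu, best_sigma = max(grid, key=lambda p: log_likelihood(*p))
critical_gap = math.exp(best_mu)   # most likely critical gap (seconds)
```

Only the interval (r, a) enters each driver's likelihood term, which is why extreme rejected gaps do not bias the estimate.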

Abstract:

Markov chain Monte Carlo (MCMC) estimation provides a solution to the complex integration problems faced in the Bayesian analysis of statistical problems. The implementation of MCMC algorithms is, however, code-intensive and time-consuming. We have developed a Python package, PyMCMC, that aids in the construction of MCMC samplers, substantially reduces the likelihood of coding errors, and minimises repetitive code. PyMCMC contains classes for Gibbs, Metropolis Hastings, independent Metropolis Hastings, random walk Metropolis Hastings, orientational bias Monte Carlo and slice samplers, as well as specific modules for common models, such as a module for Bayesian regression analysis. PyMCMC is straightforward to optimise, taking advantage of the Python libraries Numpy and Scipy, and is readily extensible with C or Fortran.
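As a generic illustration of the kind of sampler PyMCMC packages up (this sketch does not use the PyMCMC API), a random-walk Metropolis Hastings sampler for a standard normal target can be written in a few lines:

```python
import numpy as np

rng = np.random.default_rng(42)

def log_target(x):
    # Unnormalised log density of the target: a standard normal.
    return -0.5 * x * x

def rw_metropolis(log_p, x0, n_samples, step=1.0):
    """Random-walk Metropolis Hastings: propose x + step*N(0,1) and
    accept with probability min(1, p(proposal)/p(current))."""
    samples = np.empty(n_samples)
    x, lp = x0, log_p(x0)
    for i in range(n_samples):
        prop = x + step * rng.normal()
        lp_prop = log_p(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            x, lp = prop, lp_prop
        samples[i] = x
    return samples

draws = rw_metropolis(log_target, 0.0, 20000)
# The draws should reproduce the target's mean (0) and std (1).
```

Writing the proposal, acceptance and bookkeeping by hand for every model is exactly the repetitive, error-prone code the package aims to eliminate.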

Abstract:

Organisations are increasingly investing in complex technological innovations, such as enterprise information systems, with the aim of improving business operations and thereby gaining competitive advantage. However, the implementation of technological innovations tends to focus excessively on either technology innovation effectiveness or the resulting operational effectiveness; focusing on only one of them is detrimental to long-term performance. Cross-functional teams have been used by many organisations as a way of involving expertise from different functional areas in the implementation of technologies. The role of boundary-spanning actors is discussed, as they bring a common language to cross-functional teams. Multiple regression analysis was used to identify the structural relationships and to explain the influence of cross-functional teams, technology innovation effectiveness and operational effectiveness on the continuous improvement of operational performance. The findings indicate that cross-functional teams have an indirect influence on the continuous improvement of operational performance, through the alignment between technology innovation effectiveness and operational effectiveness.

Abstract:

We study model selection strategies based on penalized empirical loss minimization. We point out a tight relationship between error estimation and data-based complexity penalization: any good error estimate may be converted into a data-based penalty function and the performance of the estimate is governed by the quality of the error estimate. We consider several penalty functions, involving error estimates on independent test data, empirical VC dimension, empirical VC entropy, and margin-based quantities. We also consider the maximal difference between the error on the first half of the training data and the second half, and the expected maximal discrepancy, a closely related capacity estimate that can be calculated by Monte Carlo integration. Maximal discrepancy penalty functions are appealing for pattern classification problems, since their computation is equivalent to empirical risk minimization over the training data with some labels flipped.
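The flipped-label computation can be sketched for one-dimensional threshold classifiers, where empirical risk minimisation is exact by enumeration. Flipping the labels on the first half makes the modified empirical error equal to (1 − err₁(f) + err₂(f))/2, so the maximal discrepancy equals 1 − 2·(minimum modified error). The data and hypothesis class below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(7)

# Labelled sample: a noisy 1-D threshold concept (illustrative data).
n = 200
X = rng.normal(size=n)
y = np.where(X > 0.3, 1, -1)
y[rng.random(n) < 0.1] *= -1          # 10% label noise

def erm_error(X, y):
    """Exact empirical risk minimisation over threshold classifiers
    sign(x - t), including both sign orientations."""
    order = np.argsort(X)
    Xs, ys = X[order], y[order]
    # Candidate thresholds: below all points, and between neighbours.
    thresholds = np.concatenate([[-np.inf], (Xs[:-1] + Xs[1:]) / 2])
    best = 1.0
    for t in thresholds:
        err = np.mean(np.where(Xs > t, 1, -1) != ys)
        best = min(best, err, 1.0 - err)  # 1 - err: flipped orientation
    return best

# Maximal discrepancy: flip the labels on the first half and minimise
# the empirical error on the modified sample.
y_flip = y.copy()
y_flip[: n // 2] *= -1
discrepancy = 1.0 - 2.0 * erm_error(X, y_flip)
```

Because both halves are drawn from the same distribution, the discrepancy stays well below its worst-case value of 1, and it is this quantity that serves as the data-based complexity penalty.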