953 results for function estimation
Abstract:
Effective machine fault prognostic technologies can lead to the elimination of unscheduled downtime, increase machine useful life and, consequently, reduce maintenance costs as well as prevent human casualties in real engineering asset management. This paper presents a technique for accurate assessment of the remnant life of machines based on a health state probability estimation technique and historical failure knowledge embedded in a closed-loop diagnostic and prognostic system. To estimate a discrete machine degradation state that can represent the complex nature of machine degradation effectively, the proposed prognostic model employs a classification algorithm that can use a number of damage-sensitive features, in contrast to conventional time series analysis techniques, for accurate long-term prediction. To validate the feasibility of the proposed model, data at five different severity levels for four typical faults from High Pressure Liquefied Natural Gas (HP-LNG) pumps were used to compare intelligent diagnostic tests using five different classification algorithms. In addition, two sets of impeller-rub data were analysed and employed to predict the remnant life of the pump based on estimation of health state probability using the Support Vector Machine (SVM) classifier. The results obtained were very encouraging and showed that the proposed prognostic system has the potential to be used as an estimation tool for machine remnant life prediction in real-life industrial applications.
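As a minimal sketch of the health-state-probability idea (not the authors' implementation), a probabilistic classifier can be trained on damage-sensitive features labelled with discrete degradation states, and the predicted state probabilities weighted by assumed state-specific remaining lives; the features, state count and remaining-life values below are illustrative assumptions.

```python
# Minimal sketch (not the authors' code): classify machine health states with an
# SVM and turn the predicted state probabilities into a remaining-life estimate.
import numpy as np
from sklearn.svm import SVC

# Hypothetical training data: damage-sensitive feature vectors labelled with
# discrete degradation states 0 (healthy) .. 3 (near failure).
rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 6)) + np.repeat(np.arange(4), 50)[:, None]
y_train = np.repeat(np.arange(4), 50)

clf = SVC(kernel="rbf", probability=True).fit(X_train, y_train)

# Assumed mean remaining life (hours) associated with each degradation state,
# e.g. taken from historical failure records.
remaining_life_by_state = np.array([5000.0, 2000.0, 500.0, 50.0])

x_new = rng.normal(size=(1, 6)) + 2.0          # current feature vector
state_probs = clf.predict_proba(x_new)[0]      # P(state | features)
rul_estimate = state_probs @ remaining_life_by_state
print(state_probs, rul_estimate)
```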
Abstract:
In Chapters 1 through 9 of the book (with the exception of a brief discussion on observers and integral action in Section 5.5 of Chapter 5) we considered constrained optimal control problems for systems without uncertainty, that is, with no unmodelled dynamics or disturbances, and where the full state was available for measurement. More realistically, however, it is necessary to consider control problems for systems with uncertainty. This chapter addresses some of the issues that arise in this situation. As in Chapter 9, we adopt a stochastic description of uncertainty, which associates probability distributions to the uncertain elements, that is, disturbances and initial conditions. (See Section 12.6 for references to alternative approaches to model uncertainty.) When incomplete state information exists, a popular observer-based control strategy in the presence of stochastic disturbances is to use the certainty equivalence (CE) principle, introduced in Section 5.5 of Chapter 5 for deterministic systems. In the stochastic framework, CE consists of estimating the state and then using these estimates as if they were the true state in the control law that results if the problem were formulated as a deterministic problem (that is, without uncertainty). This strategy is motivated by the unconstrained problem with a quadratic objective function, for which CE is indeed the optimal solution (Åström 1970, Bertsekas 1976). One of the aims of this chapter is to explore the issues that arise from the use of CE in RHC in the presence of constraints. We then turn to the obvious question about the optimality of the CE principle. We show that CE is, indeed, not optimal in general. We also analyse the possibility of obtaining truly optimal solutions for single input linear systems with input constraints and uncertainty related to output feedback and stochastic disturbances. We first find the optimal solution for the case of horizon N = 1, and then we indicate the complications that arise in the case of horizon N = 2. Our conclusion is that, for the case of linear constrained systems, the extra effort involved in the optimal feedback policy is probably not justified in practice. Indeed, we show by example that CE can give near optimal performance. We thus advocate this approach in real applications.
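To illustrate the certainty equivalence idea for the simplest case mentioned above (horizon N = 1, a single constrained input), the toy sketch below estimates the state with a simple observer and then applies the deterministic one-step optimal law to the estimate. The scalar system, weights and observer gain are illustrative assumptions, not the book's formulation.

```python
# Toy sketch of certainty equivalence (CE) for horizon N = 1 with |u| <= u_max.
# The state is estimated by a simple observer and then used as if it were the
# true state in the deterministic one-step optimal control law.
import numpy as np

A, B, C = 0.9, 1.0, 1.0          # scalar linear system (assumed values)
Q, R = 1.0, 0.1                  # stage weights on state and input
u_max = 1.0
L = 0.5                          # assumed observer gain (illustrative value)

def ce_control(x_hat):
    # Deterministic N = 1 problem: minimise Q*(A x + B u)^2 + R*u^2 over |u| <= u_max.
    u_unc = -Q * A * B * x_hat / (Q * B**2 + R)   # unconstrained minimiser
    return np.clip(u_unc, -u_max, u_max)          # for a scalar quadratic, clipping is exact

# One simulated run with output feedback and stochastic disturbances
x, x_hat = 2.0, 0.0
for _ in range(20):
    y = C * x + 0.05 * np.random.randn()          # noisy measurement of the current state
    x_hat = x_hat + L * (y - C * x_hat)           # measurement update
    u = ce_control(x_hat)                         # CE: treat the estimate as the true state
    x = A * x + B * u + 0.05 * np.random.randn()  # true system with process noise
    x_hat = A * x_hat + B * u                     # time update of the estimate
print(x, x_hat)
```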
Abstract:
We investigate the utility to computational Bayesian analyses of a particular family of recursive marginal likelihood estimators characterized by the (equivalent) algorithms known as "biased sampling" or "reverse logistic regression" in the statistics literature and "the density of states" in physics. Through a pair of numerical examples (including mixture modeling of the well-known galaxy dataset) we highlight the remarkable diversity of sampling schemes amenable to such recursive normalization, as well as the notable efficiency of the resulting pseudo-mixture distributions for gauging prior-sensitivity in the Bayesian model selection context. Our key theoretical contributions are to introduce a novel heuristic ("thermodynamic integration via importance sampling") for qualifying the role of the bridging sequence in this procedure, and to reveal various connections between these recursive estimators and the nested sampling technique.
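A compact sketch of the recursive normalisation at the heart of these estimators (the biased-sampling / reverse-logistic fixed point) is given below for a tempered sequence of one-dimensional Gaussian-type densities; the bridging sequence, sample sizes and target are illustrative assumptions rather than the paper's examples.

```python
# Sketch of the recursive ("biased sampling" / reverse logistic regression)
# estimator of normalising constants for a tempered 1-D target. Illustrative only.
import numpy as np

rng = np.random.default_rng(1)
betas = np.array([0.1, 0.4, 0.7, 1.0])            # bridging (tempering) sequence
def log_q(x, beta):                                # unnormalised log density at temperature beta
    return -0.5 * beta * x**2

# Pretend we already have n draws from each tempered distribution.
n = 2000
samples = [rng.normal(scale=1.0 / np.sqrt(b), size=n) for b in betas]
x_all = np.concatenate(samples)                    # pool all draws
counts = np.full(len(betas), n)

# Fixed-point iteration for the normalising constants z_k (up to a common scale):
#   z_k <- sum_i q_k(x_i) / sum_j (n_j / z_j) q_j(x_i)
log_z = np.zeros(len(betas))
for _ in range(200):
    log_qmat = np.array([log_q(x_all, b) for b in betas])                        # K x N
    denom = np.logaddexp.reduce(np.log(counts)[:, None] - log_z[:, None] + log_qmat, axis=0)
    log_z_new = np.array([np.logaddexp.reduce(log_qmat[k] - denom) for k in range(len(betas))])
    log_z = log_z_new - log_z_new[0]               # fix the arbitrary overall scale
print(log_z)   # estimates of log(z_k / z_0); exact values are 0.5 * log(beta_0 / beta_k)
```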
Abstract:
The exchange of physical forces in both cell-cell and cell-matrix interactions plays a significant role in a variety of physiological and pathological processes, such as cell migration, cancer metastasis, inflammation and wound healing. Therefore, great interest exists in accurately quantifying the forces that cells exert on their substrate during migration. Traction Force Microscopy (TFM) is the most widely used method for measuring cell traction forces. Several mathematical techniques have been developed to estimate forces from TFM experiments. However, certain simplifications are commonly assumed, such as linear elasticity of the materials and/or free geometries, which in some cases may lead to inaccurate results. Here, cellular forces are numerically estimated by solving a minimization problem that combines multiple non-linear FEM solutions. Our simulations, free from constraints on the geometrical and mechanical conditions, show that forces are predicted with higher accuracy than when using the standard approaches.
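Schematically, traction recovery of this kind can be posed as an optimisation over candidate tractions whose predicted displacements best match the measured substrate displacements. In the sketch below a simple linear placeholder stands in for the non-linear FEM forward solves described in the abstract; sizes, the regularisation and the synthetic data are assumptions.

```python
# Schematic of traction recovery as an optimisation: find tractions t such that the
# forward-model displacements match the measured substrate displacements. The forward
# model is a placeholder linear map standing in for non-linear FEM solves.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)
n_nodes = 30
G = rng.normal(size=(n_nodes, n_nodes)) * 0.1 + np.eye(n_nodes)   # stand-in forward operator

def forward_displacements(tractions):
    """Placeholder for a (possibly non-linear) FEM solve: tractions -> displacements."""
    return G @ tractions

t_true = np.zeros(n_nodes)
t_true[10:15] = 1.0                                   # synthetic "cell" tractions
u_meas = forward_displacements(t_true) + 0.01 * rng.normal(size=n_nodes)

def objective(t, alpha=1e-3):
    resid = forward_displacements(t) - u_meas
    return resid @ resid + alpha * (t @ t)            # data misfit + Tikhonov regularisation

res = minimize(objective, x0=np.zeros(n_nodes), method="L-BFGS-B")
print(np.round(res.x[8:17], 2))                       # recovered tractions near the loaded nodes
```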
Abstract:
A key derivation function (KDF) is a function that transforms secret non-uniformly random source material together with some public strings into one or more cryptographic keys. These cryptographic keys are used with a cryptographic algorithm for protecting electronic data during both transmission over insecure channels and storage. In this thesis, we propose a new method for constructing a generic stream cipher based key derivation function. We show that our proposed key derivation function based on stream ciphers is secure if the underlying stream cipher is secure. We simulate instances of this stream cipher based key derivation function using three eSTREAM finalists: Trivium, Sosemanuk and Rabbit. The simulation results show these stream cipher based key derivation functions offer efficiency advantages over the more commonly used key derivation functions based on block ciphers and hash functions.
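The general shape of such a construction can be sketched as an extract-then-expand skeleton in which the expansion step is a stream cipher keystream; the thesis's actual construction and parameter sizes may differ, the extraction step shown uses HMAC-SHA256 purely for illustration, and stream_keystream() is a hypothetical placeholder for a Trivium/Sosemanuk/Rabbit implementation.

```python
# Skeleton of a stream-cipher-based KDF (illustrative, not the thesis's construction):
# condense the non-uniform secret into a cipher key, then take keystream bytes as
# the derived key material.
import hmac, hashlib

def stream_keystream(key: bytes, iv: bytes, nbytes: int) -> bytes:
    """Placeholder: return `nbytes` of keystream from a secure stream cipher."""
    raise NotImplementedError("plug in Trivium / Sosemanuk / Rabbit here")

def kdf(source_material: bytes, salt: bytes, context: bytes, out_len: int) -> bytes:
    # Extract: condense the non-uniform secret into a uniform-looking cipher key
    # (key and IV sizes here are illustrative assumptions).
    cipher_key = hmac.new(salt, source_material, hashlib.sha256).digest()[:16]
    # Expand: bind the derived keys to their context via the IV, then emit keystream bytes.
    iv = hashlib.sha256(context).digest()[:8]
    return stream_keystream(cipher_key, iv, out_len)
```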
Abstract:
This paper presents a method for the estimation of thrust model parameters of uninhabited airborne systems using specific flight tests. Particular tests are proposed to simplify the estimation. The proposed estimation method is based on three steps. The first step uses a regression model in which the thrust is assumed constant. This allows us to obtain biased initial estimates of the aerodynamic coefficients of the surge model. In the second step, a robust nonlinear state estimator is implemented using the initial parameter estimates, and the model is augmented by considering the thrust as a random walk. In the third step, the estimate of the thrust obtained by the observer is used to fit a polynomial model in terms of the propeller advance ratio. We consider a numerical example based on Monte Carlo simulations to quantify the sampling properties of the proposed estimator under realistic flight conditions.
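The first and third steps can be illustrated on synthetic data as below: an ordinary least-squares regression of a simplified surge model with constant thrust, followed by a polynomial fit of a thrust estimate against the advance ratio. Step 2 (the robust nonlinear state estimator with thrust modelled as a random walk) is omitted, and the model form and all numbers are illustrative assumptions.

```python
# Sketch of steps 1 and 3 on synthetic data; step 2 (the observer) is omitted.
import numpy as np

rng = np.random.default_rng(3)
V = np.linspace(20.0, 30.0, 200)                            # airspeed samples from the test
accel = 2.0 - 0.004 * V**2 + 0.02 * rng.normal(size=200)    # "measured" acceleration

# Step 1: regress accel on [1, V^2] -> constant-thrust term and a drag-like coefficient.
X = np.column_stack([np.ones_like(V), V**2])
theta, *_ = np.linalg.lstsq(X, accel, rcond=None)
thrust_term, drag_coef = theta

# Step 3: fit a polynomial of thrust vs advance ratio J (thrust_est stands in for the
# observer's thrust estimate from step 2).
J = np.linspace(0.3, 0.9, 200)
thrust_est = 5.0 - 3.0 * J - 1.5 * J**2 + 0.05 * rng.normal(size=200)
poly_coeffs = np.polyfit(J, thrust_est, deg=2)
print(thrust_term, drag_coef, np.round(poly_coeffs, 2))
```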
Abstract:
This study explored how the social context influences the stress-buffering effects of social support on employee adjustment. It was anticipated that the positive relationship between support from colleagues and employee adjustment would be more marked for those strongly identifying with their work team. Furthermore, as part of a three-way interactive effect, it was predicted that high identification would increase the efficacy of coworker support as a buffer of two role stressors (role overload and role ambiguity). One hundred and fifty-five employees recruited from first-year psychology courses at two Australian universities were surveyed. Hierarchical multiple regression analyses revealed that the negative main effect of role ambiguity on job satisfaction was significant for those employees with low levels of team identification, whereas high team identifiers were buffered from the deleterious effect of role ambiguity on job satisfaction. There was also a significant interaction between coworker support and team identification. The positive effect of coworker support on job satisfaction was significant for high team identifiers, whereas coworker support was not a source of satisfaction for those employees with low levels of team identification. A three-way interaction emerged among the focal variables in the prediction of psychological well-being, suggesting that the combined benefits of coworker support and team identification under conditions of high demand may be limited and are more likely to be observed when demands are low.
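For readers unfamiliar with the analysis, a moderated regression of this kind is fit by entering main effects and product (interaction) terms; the sketch below uses simulated data and assumed variable names, following only the modelling pattern described in the abstract.

```python
# Illustrative moderated regression: role ambiguity, coworker support and team
# identification predicting job satisfaction, with interaction (buffering) terms.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 155
df = pd.DataFrame({
    "ambiguity": rng.normal(size=n),
    "support": rng.normal(size=n),
    "identification": rng.normal(size=n),
})
df["satisfaction"] = (-0.4 * df.ambiguity + 0.3 * df.support
                      + 0.3 * df.ambiguity * df.identification   # buffering term
                      + rng.normal(scale=0.8, size=n))

# Predictors are already mean-centred (standard normal), so the interaction terms
# are directly interpretable.
model = smf.ols("satisfaction ~ ambiguity * identification + support * identification",
                data=df).fit()
print(model.summary().tables[1])
```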
Abstract:
Inflammation of the spinal cord after traumatic spinal cord injury leads to destruction of healthy tissue. This “secondary degeneration” is more damaging than the initial physical damage and is the major contributor to permanent loss of function. In our previous study we showed that combined delivery of two growth factors, vascular endothelial growth factor (VEGF) and platelet-derived growth factor (PDGF), significantly reduced secondary degeneration after hemi-section injury of the spinal cord in the rat. Growth factor treatment reduced the size of the lesion cavity at 30d compared to control animals, and the cavity was further reduced at 90d in treated animals, whereas in control animals the lesion cavity continued to increase in size. Growth factor treatment also reduced astrogliosis and microglia/macrophage activation around the injury site. Treatment with individual growth factors alone had effects similar to control treatments. The present study investigated whether growth factor treatment would improve locomotor behaviour after spinal contusion injury, a more relevant preclinical model of spinal cord injury. The growth factors were delivered to the injury site via osmotic minipump for the first 7d. Locomotor behaviour was monitored at 1-28d after injury using the BBB score and at 30d using automated gait analysis. Treated animals had BBB scores of 18; control animals scored 10. Treated animals had significantly reduced lesion cavities and reduced microglia/macrophage activation around the injury site. We conclude that growth factor treatment preserved spinal cord tissue after contusion injury, thereby allowing functional recovery. This treatment has the potential to significantly reduce the severity of human spinal cord injuries.
Abstract:
This paper considers two problems that frequently arise in dynamic discrete choice problems but have not received much attention with regard to simulation methods. The first problem is how to construct unbiased simulators of probabilities conditional on past history. The second is how to simulate a discrete transition probability model when the underlying dependent variable is actually continuous. Both methods work well relative to reasonable alternatives in the application discussed. However, in both cases, for this application, simpler methods also provide reasonably good results.
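The second problem can be pictured with a toy example: when the "discrete" states are really a coarsening of a continuous process, the transition probabilities can be simulated by drawing the continuous variable and then discretising it. The AR(1) process and the bin edges below are illustrative assumptions, not the paper's application.

```python
# Toy illustration: simulate transition probabilities between discrete states that
# are a discretisation of an underlying continuous AR(1) variable.
import numpy as np

rng = np.random.default_rng(5)
rho, sigma = 0.8, 1.0
edges = np.array([-np.inf, -1.0, 0.0, 1.0, np.inf])    # 4 discrete states

def simulate_transition_matrix(n_draws=200_000):
    x = rng.normal(scale=sigma / np.sqrt(1 - rho**2), size=n_draws)   # stationary draws
    x_next = rho * x + sigma * rng.normal(size=n_draws)
    s, s_next = np.digitize(x, edges) - 1, np.digitize(x_next, edges) - 1
    P = np.zeros((4, 4))
    np.add.at(P, (s, s_next), 1.0)
    return P / P.sum(axis=1, keepdims=True)             # row-normalised frequencies

print(np.round(simulate_transition_matrix(), 3))
```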
Abstract:
In this paper, I present a number of leading examples in the empirical literature that use simulation-based estimation methods. For each example, I describe the model, why simulation is needed, and how to simulate the relevant object. There is a section on simulation methods and another on simulation-based estimation methods. The paper concludes by considering the significance of each of the examples discussed and commenting on potential future areas of interest.
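As a generic illustration of the class of methods surveyed (and not one of the paper's own examples), the sketch below implements a simple method of simulated moments: the parameter is chosen so that moments of data simulated from the candidate model match the observed moments. The lognormal model, moments and weighting are illustrative assumptions.

```python
# Generic method-of-simulated-moments sketch on a lognormal toy model.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(6)
data = rng.lognormal(mean=0.5, sigma=0.8, size=1_000)
data_moments = np.array([data.mean(), data.var()])

common_shocks = rng.normal(size=20_000)                   # fixed draws keep the objective smooth

def simulated_moments(theta):
    mu, log_sigma = theta
    sim = np.exp(mu + np.exp(log_sigma) * common_shocks)  # simulate from the candidate model
    return np.array([sim.mean(), sim.var()])

def msm_objective(theta):
    g = simulated_moments(theta) - data_moments
    return g @ g                                          # identity weighting for simplicity

res = minimize(msm_objective, x0=np.array([0.0, 0.0]), method="Nelder-Mead")
print(round(res.x[0], 2), round(np.exp(res.x[1]), 2))     # roughly (0.5, 0.8)
```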
Semiparametric estimates of the supply and demand effects of disability on labor force participation
Abstract:
This paper modifies and uses the semiparametric methods of Ichimura and Lee (1991) on standard cross-section data to decompose the effect of disability on labor force participation into a demand effect and a supply effect. It shows that straightforward use of Ichimura and Lee leads to meaningless results, while imposing monotonicity on the unknown function leads to substantive results. The paper finds that supply effects dominate the demand effects of disability.
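The monotonicity restriction mentioned above can be pictured with a small example that is not the Ichimura and Lee estimator itself: isotonic regression fits an unknown function subject to a shape constraint, here a decreasing relationship such as participation falling with disability severity. Data, names and direction are illustrative assumptions.

```python
# Illustration of imposing monotonicity on an unknown function via isotonic regression.
import numpy as np
from sklearn.isotonic import IsotonicRegression

rng = np.random.default_rng(7)
severity = np.sort(rng.uniform(0, 1, size=300))
participation = 0.9 - 0.5 * severity + 0.15 * rng.normal(size=300)   # noisy, decreasing

# An unconstrained flexible fit can wiggle; the isotonic fit is forced to be monotone.
unconstrained = np.poly1d(np.polyfit(severity, participation, deg=7))(severity)
monotone = IsotonicRegression(increasing=False).fit_transform(severity, participation)
print(np.round(monotone[:5], 2), np.round(unconstrained[:5], 2))
```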
Abstract:
This paper develops a semiparametric estimation approach for mixed count regression models based on series expansion for the unknown density of the unobserved heterogeneity. We use the generalized Laguerre series expansion around a gamma baseline density to model unobserved heterogeneity in a Poisson mixture model. We establish the consistency of the estimator and present a computational strategy to implement the proposed estimation techniques in the standard count model as well as in truncated, censored, and zero-inflated count regression models. Monte Carlo evidence shows that the finite sample behavior of the estimator is quite good. The paper applies the method to a model of individual shopping behavior. © 1999 Elsevier Science S.A. All rights reserved.
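The modelling idea can be sketched as follows: the density of the multiplicative unobserved heterogeneity is a gamma baseline reshaped by a squared generalised Laguerre polynomial, and the count density is the Poisson mixed over that heterogeneity. The parameter values, the low order K = 2 and the normalisation below are illustrative assumptions; the paper's exact parameterisation may differ.

```python
# Sketch of a series-expanded mixed Poisson density: gamma baseline times a squared
# generalised Laguerre polynomial for the heterogeneity, mixed into a Poisson.
import numpy as np
from scipy import integrate, stats
from scipy.special import eval_genlaguerre

alpha = 2.0                       # gamma baseline shape (rate = alpha keeps E[v] = 1)
a = np.array([1.0, 0.3, -0.1])    # series coefficients a_0..a_2 (assumed values)

def heterogeneity_density(v):
    poly = sum(a[k] * eval_genlaguerre(k, alpha - 1, v) for k in range(len(a)))
    return stats.gamma.pdf(v, a=alpha, scale=1.0 / alpha) * poly**2

norm_const, _ = integrate.quad(heterogeneity_density, 0, np.inf)

def mixed_poisson_pmf(y, mu):
    """P(Y = y | x) = E_v[ Poisson(y; v * mu) ] under the series-expanded density."""
    integrand = lambda v: stats.poisson.pmf(y, v * mu) * heterogeneity_density(v) / norm_const
    val, _ = integrate.quad(integrand, 0, np.inf)
    return val

print([round(mixed_poisson_pmf(y, mu=1.5), 4) for y in range(5)])
```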
Abstract:
This paper addresses the problem of determining optimal designs for biological process models with intractable likelihoods, with the goal of parameter inference. The Bayesian approach is to choose a design that maximises the mean of a utility, and the utility is a function of the posterior distribution. Therefore, its estimation requires likelihood evaluations. However, many problems in experimental design involve models with intractable likelihoods, that is, likelihoods that are neither analytic nor can be computed in a reasonable amount of time. We propose a novel solution using indirect inference (II), a well-established method in the literature, and the Markov chain Monte Carlo (MCMC) algorithm of Müller et al. (2004). Indirect inference employs an auxiliary model with a tractable likelihood in conjunction with the generative model, the assumed true model of interest, which has an intractable likelihood. Our approach is to estimate a map between the parameters of the generative and auxiliary models, using simulations from the generative model. An II posterior distribution is formed to expedite utility estimation. We also present a modification to the utility that allows the Müller algorithm to sample from a substantially sharpened utility surface, with little computational effort. Unlike competing methods, the II approach can handle complex design problems for models with intractable likelihoods on a continuous design space, with possible extension to many observations. The methodology is demonstrated using two stochastic models: a simple tractable death process used to validate the approach, and a motivating stochastic model for the population evolution of macroparasites.
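The expected-utility step of such a design problem can be sketched generically: draw parameters from the prior, simulate data from the intractable-likelihood generative model at a candidate design, and score the design through a posterior summary built from a tractable auxiliary model. The sketch below illustrates only that Monte Carlo step (not the Müller MCMC or the paper's II mapping), and every function below is a named placeholder.

```python
# Schematic Monte Carlo estimate of the expected utility of a design d; all model
# components are illustrative placeholders, not the paper's implementation.
import numpy as np

def prior_draw(rng):                                 # placeholder prior sampler
    return rng.gamma(2.0, 1.0)

def simulate_generative(theta, design, rng):         # placeholder stochastic simulator
    return rng.poisson(theta * design, size=20)

def auxiliary_posterior_variance(data, design):      # placeholder II-style posterior summary
    return np.var(data) / (len(data) * design**2 + 1e-9)

def expected_utility(design, n_mc=500, seed=0):
    rng = np.random.default_rng(seed)                # common random numbers across designs
    utilities = []
    for _ in range(n_mc):
        theta = prior_draw(rng)
        y = simulate_generative(theta, design, rng)
        utilities.append(-auxiliary_posterior_variance(y, design))   # higher is better
    return np.mean(utilities)

designs = np.linspace(0.5, 5.0, 10)
print(max(designs, key=expected_utility))            # best design on this coarse grid
```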
Abstract:
This article lays down the foundations of the renormalization group (RG) approach for differential equations characterized by multiple scales. The renormalization of constants through an elimination process and the subsequent derivation of the amplitude equation [Chen, Phys. Rev. E 54, 376 (1996)] are given a rigorous but not abstract mathematical form whose justification is based on the implicit function theorem. Developing the theoretical framework that underlies the RG approach leads to a systematization of the renormalization process and to the derivation of explicit closed-form expressions for the amplitude equations that can be carried out with symbolic computation for both linear and nonlinear scalar differential equations and first order systems but independently of their particular forms. Certain nonlinear singular perturbation problems are considered that illustrate the formalism and recover well-known results from the literature as special cases. © 2008 American Institute of Physics.
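As a standard textbook illustration of the mechanism (not drawn from this article), the weakly damped oscillator shows how renormalizing the amplitude removes the secular term from the naive expansion and produces an amplitude equation:

```latex
% Weakly damped oscillator: naive expansion, secular term, and the RG amplitude equation.
\begin{align}
  \ddot y + \epsilon \dot y + y &= 0, \qquad 0 < \epsilon \ll 1,\\
  y(t) &\simeq A_0\cos(t+\phi_0) - \frac{\epsilon}{2} A_0\,(t-t_0)\cos(t+\phi_0) + O(\epsilon^2)
  \quad \text{(secular growth in } t - t_0\text{)}.
\end{align}
% Renormalize the amplitude at an arbitrary time \tau,
%   A_0 = A(\tau)\,[1 + \tfrac{\epsilon}{2}(\tau - t_0)] + O(\epsilon^2),
% and require that y not depend on \tau:
\begin{equation}
  \frac{\partial y}{\partial \tau} = 0
  \;\Longrightarrow\;
  \frac{dA}{d\tau} = -\frac{\epsilon}{2}\,A
  \;\Longrightarrow\;
  A(t) = A(0)\,e^{-\epsilon t/2},
\end{equation}
% which reproduces the slow envelope of the exact solution with no secular terms.
```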
Abstract:
In a paper published in FSE 2007, a way of obtaining near-collisions and in theory also collisions for the FORK-256 hash function was presented [8]. The paper contained examples of near-collisions for the compression function, but in practice the attack could not be extended to the full function due to large memory requirements and computation time. In this paper we improve the attack and show that it is possible to find near-collisions in practice for any given value of IV. In particular, this means that the full hash function with the prespecified IV is vulnerable in practice, not just in theory. We exhibit an example near-collision for the complete hash function.