996 results for APPLIED PROBABILITY
Abstract:
We examine the representation of judgements of stochastic independence in probabilistic logics. We focus on a relational logic where (i) judgements of stochastic independence are encoded by directed acyclic graphs, and (ii) probabilistic assessments are flexible in the sense that they are not required to specify a single probability measure. We discuss issues of knowledge representation and inference that arise from our particular combination of graphs, stochastic independence, logical formulas and probabilistic assessments. (C) 2007 Elsevier B.V. All rights reserved.
Abstract:
Systems of distributed artificial intelligence can be powerful tools in a wide variety of practical applications. Their most striking characteristic, emergent behavior, is also the one most responsible for the difficulty in designing these systems. This work proposes a tool capable of generating individual strategies for the elements of a multi-agent system, thereby providing the group with the means to obtain the desired results while working in a coordinated and cooperative manner. As an application example, a problem was chosen in which a group of predators must catch a prey in a continuous three-dimensional environment. A strategy-synthesis system was implemented whose internal mechanism integrates simulators through the Particle Swarm Optimization (PSO) algorithm, a Swarm Intelligence technique. The system was tested in several simulation settings and was able to automatically synthesize successful hunting strategies, substantiating that the developed tool can provide, as long as it works with well-elaborated patterns, satisfactory solutions to problems of a complex nature that are difficult to solve through analytical approaches. (c) 2007 Elsevier Ltd. All rights reserved.
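For illustration only, the following is a minimal particle swarm optimization sketch in Python; the objective function, bounds and coefficients are placeholder assumptions and do not correspond to the authors' predator-prey simulator.

```python
import numpy as np

def pso(objective, bounds, n_particles=30, n_iters=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimization: minimizes `objective` over a box."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    dim = lo.shape[0]
    pos = rng.uniform(lo, hi, size=(n_particles, dim))     # particle positions
    vel = np.zeros_like(pos)                                # particle velocities
    pbest = pos.copy()                                      # personal best positions
    pbest_val = np.array([objective(p) for p in pos])       # personal best values
    gbest = pbest[np.argmin(pbest_val)].copy()              # global best position

    for _ in range(n_iters):
        r1, r2 = rng.random((2, n_particles, dim))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)
        vals = np.array([objective(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest, pbest_val.min()

# Toy usage: a quadratic stands in for the hunting-strategy cost function.
best, best_val = pso(lambda x: np.sum(x ** 2), (np.full(3, -5.0), np.full(3, 5.0)))
```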
Abstract:
Simulated annealing (SA) is an optimization technique that can handle cost functions with arbitrary degrees of nonlinearity, discontinuity and stochasticity, as well as arbitrary boundary conditions and constraints imposed on these cost functions. The SA technique is applied to the problem of robot path planning. Three situations are considered here: the path represented as a polyline, as a Bezier curve, and as a spline-interpolated curve. In the proposed SA algorithm, the sensitivity of each continuous parameter is evaluated at each iteration, increasing the number of accepted solutions. The sensitivity of each parameter is associated with its probability distribution in the definition of the next candidate solution. (C) 2010 Elsevier Ltd. All rights reserved.
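A minimal simulated annealing sketch along these lines is given below in Python; the per-parameter step scaling is only a loose stand-in for the sensitivity-driven proposal distribution described in the abstract, and the cost function is a placeholder rather than an actual path-planning objective.

```python
import numpy as np

def simulated_annealing(cost, x0, step=0.5, t0=1.0, alpha=0.95, n_iters=2000, seed=0):
    """Minimal SA: Gaussian proposals whose per-parameter scale adapts to acceptance."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    fx = cost(x)
    best, fbest = x.copy(), fx
    scale = np.full(x.shape, step)            # per-parameter proposal scale ("sensitivity")
    t = t0
    for _ in range(n_iters):
        cand = x + rng.normal(0.0, scale)     # candidate drawn around the current point
        fc = cost(cand)
        if fc < fx or rng.random() < np.exp(-(fc - fx) / t):
            # accepted: widen the scale of the parameters that moved the most
            scale *= 1.0 + 0.05 * np.abs(cand - x) / (np.abs(cand - x).max() + 1e-12)
            x, fx = cand, fc
            if fc < fbest:
                best, fbest = cand.copy(), fc
        else:
            scale *= 0.99                      # rejected: shrink all proposal scales
        t *= alpha                             # geometric cooling schedule
    return best, fbest

# Toy usage: in a path-planning setting, `cost` would score waypoint or control-point
# coordinates for path length and obstacle clearance; here it is just a quadratic.
best, val = simulated_annealing(lambda x: float(np.sum((x - 1.0) ** 2)), np.zeros(4))
```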
Abstract:
The purpose of this article is to present a quantitative analysis of the human failure contribution in the collision and/or grounding of oil tankers, considering the recommendation of the "Guidelines for Formal Safety Assessment" of the International Maritime Organization. Initially, the employed methodology is presented, emphasizing the use of the technique for human error prediction to reach the desired objective. Later, this methodology is applied to a ship operating on the Brazilian coast and, thereafter, the procedure to isolate the human actions with the greatest potential to reduce the risk of an accident is described. Finally, the management and organizational factors presented in the "International Safety Management Code" are associated with these selected actions. Therefore, an operator will be able to decide where to act in order to obtain an effective reduction in the probability of accidents. Even though this study does not present a new methodology, it can be considered as a reference in the human reliability analysis for the maritime industry, which, in spite of having some guides for risk analysis, has few studies related to human reliability effectively applied to the sector.
Abstract:
This work investigates the influence of the addition of cerium (IV) ions on the anticorrosion properties of organic-inorganic hybrid coatings applied to passivated tin coated steel. In order to evaluate the specific effect of cerium (IV) addition on nanostructural features of the organic and inorganic phases of the hybrid coating, the hydrolytic polycondensation of silicon alkoxide and the radical polymerization of the methyl methacrylate (MMA) function were induced separately. The corrosion resistance of the coatings was evaluated by means of linear polarization, Tafel type curves and electrochemical impedance measurements. The impedance results obtained for the hybrid coatings were discussed based on an electrical equivalent circuit used to fit the experimental data. The electrochemical results clearly showed the improvement of the protective properties of the organic-inorganic hybrid coating mainly when the cerium (IV) was added to the organic phase solution precursor, which seemed to be due to the formation of a more uniform and densely reticulated siloxane-PMMA film. (C) 2010 Elsevier Ltd. All rights reserved.
Abstract:
The present work reports the effects of the thermal annealing process, the number of layers and the electrochemical process on the optical response quality of Bragg and microcavity devices applied as organic solvent sensors. These devices were obtained using porous silicon (PS) technology. The optical characterization of the Bragg reflector, before annealing, showed a broad photonic band-gap structure that blue-shifted and narrowed after the annealing process. The electrochemical process used to obtain the PS-based device imposes a limit on the number of layers because of the chemical dissolution effect. Interface roughness in the devices was minimized by using a double electrochemical cell setup. The microcavity devices showed good sensitivity for organic solvent detection. The thermally annealed device showed better sensitivity, a result attributed to passivation of the device surfaces. (c) 2007 Elsevier Ltd. All rights reserved.
Abstract:
The atomic force microscope (AFM) introduced surface investigation with true atomic resolution. In the frequency modulation technique (FM-AFM), both the amplitude and the frequency of oscillation of the micro-cantilever must be kept constant even in the presence of tip-surface interaction forces. For that reason, the proper design of the Phase-Locked Loop (PLL) used in FM-AFM is vital to system performance. Here, the mathematical model of the FM-AFM control system is derived considering a high-order PLL. In addition, a method to design stable third-order Phase-Locked Loops is presented. (C) 2010 Elsevier B.V. All rights reserved.
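As a hedged illustration of one way to verify the stability of a third-order loop, the sketch below checks the roots of an assumed closed-loop characteristic polynomial; the coefficients are placeholders and do not come from the paper's FM-AFM model.

```python
import numpy as np

def is_stable(coeffs):
    """Return True if all roots of the characteristic polynomial lie in the open
    left half-plane (continuous-time stability). `coeffs` are highest power first."""
    roots = np.roots(coeffs)
    return bool(np.all(roots.real < 0))

# Hypothetical third-order characteristic polynomial s^3 + a2 s^2 + a1 s + a0.
# For this form, the Routh-Hurwitz conditions are a2, a1, a0 > 0 and a2 * a1 > a0.
a2, a1, a0 = 6.0, 11.0, 6.0
print(is_stable([1.0, a2, a1, a0]))                      # True: roots are -1, -2, -3
print(a2 > 0 and a1 > 0 and a0 > 0 and a2 * a1 > a0)     # equivalent algebraic check
```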
Abstract:
This paper considers two aspects of the nonlinear H(infinity) control problem: the use of weighting functions for performance and robustness improvement, as in the linear case, and the development of a successive Galerkin approximation method for the solution of the Hamilton-Jacobi-Isaacs equation that arises in the output-feedback case. Design of nonlinear H(infinity) controllers obtained by the well-established Taylor approximation and by the proposed Galerkin approximation method applied to a magnetic levitation system are presented for comparison purposes.
Abstract:
Eight different models to represent the effect of friction in control valves are presented: four models based on physical principles and four empirical ones. The physical models, both static and dynamic, have the same structure. The models are implemented in Simulink/Matlab (R) and compared, using different friction coefficients and input signals. Three of the models were able to reproduce the stick-slip phenomenon and passed all the tests, which were applied following ISA standards. (C) 2008 Elsevier Ltd. All rights reserved.
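A rough Python sketch of a Karnopp-style static-plus-Coulomb friction element, the kind of building block such valve friction models rest on, is given below; the coefficients and the velocity threshold are illustrative assumptions and are not the specific models compared in the paper.

```python
def friction_force(velocity, applied_force, f_static=1.2, f_coulomb=1.0,
                   f_viscous=0.1, dv=1e-3):
    """Karnopp-style friction: inside a small velocity band the element sticks and
    balances the applied force up to the static (breakaway) level; outside it,
    Coulomb plus viscous friction opposes the motion."""
    if abs(velocity) < dv:                         # stick phase
        if abs(applied_force) <= f_static:
            return -applied_force                  # friction cancels the applied force
        return -f_static if applied_force > 0 else f_static
    # slip phase
    sign = 1.0 if velocity > 0 else -1.0
    return -(f_coulomb * sign + f_viscous * velocity)

# Example: a small applied force at near-zero velocity is fully held by stiction.
print(friction_force(0.0, 0.8))    # -0.8  (stuck)
print(friction_force(0.5, 0.8))    # -1.05 (sliding)
```

Integrating this force into a valve-stem mass balance and switching between the stick and slip branches is what produces the stick-slip cycles mentioned in the abstract.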
Abstract:
Accurate price forecasting for agricultural commodities can have significant decision-making implications for suppliers, especially those of biofuels, where the agriculture and energy sectors intersect. Environmental pressures and high oil prices affect demand for biofuels and have reignited the discussion about effects on food prices. Suppliers in the sugar-alcohol sector need to decide the ideal proportion of ethanol and sugar to optimise their financial strategy. Prices can be affected by exogenous factors, such as exchange rates and interest rates, as well as non-observable variables like the convenience yield, which is related to supply shortages. The literature generally uses two approaches: artificial neural networks (ANNs), which are recognised as being at the forefront of exogenous-variable analysis, and stochastic models such as the Kalman filter, which is able to account for non-observable variables. This article proposes a hybrid model for forecasting the prices of agricultural commodities that is built upon both approaches and is applied to forecast the price of sugar. The Kalman filter considers the structure of the stochastic process that describes the evolution of prices. Neural networks allow the inclusion of variables that can impact asset prices in an indirect, nonlinear way, which cannot easily be incorporated into traditional econometric models.
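As a minimal sketch of the filtering building block of such a hybrid, the Python code below implements a scalar Kalman filter for a latent price state; the model matrices and noise variances are placeholder assumptions, not the article's specification, and the neural-network component is only mentioned in the comment.

```python
import numpy as np

def kalman_filter_1d(observations, a=1.0, q=0.01, r=0.25, x0=0.0, p0=1.0):
    """Scalar Kalman filter for a latent state x_t = a*x_{t-1} + w_t observed as
    y_t = x_t + v_t, with process variance q and observation variance r."""
    x, p = x0, p0
    filtered = []
    for y in observations:
        # predict
        x, p = a * x, a * a * p + q
        # update
        k = p / (p + r)                          # Kalman gain
        x, p = x + k * (y - x), (1.0 - k) * p
        filtered.append(x)
    return np.array(filtered)

# In a hybrid scheme along the lines described above, the filter tracks the latent
# price component, and a neural network could then be fit to the remaining residuals
# using exogenous inputs such as exchange and interest rates.
prices = np.array([10.2, 10.4, 10.1, 10.8, 11.0])
print(kalman_filter_1d(prices))
```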
Abstract:
In this paper, we compare three residuals to assess departures from the error assumptions as well as to detect outlying observations in log-Burr XII regression models with censored observations. These residuals can also be used for the log-logistic regression model, which is a special case of the log-Burr XII regression model. For different parameter settings, sample sizes and censoring percentages, various simulation studies are performed and the empirical distribution of each residual is displayed and compared with the standard normal distribution. These studies suggest that the residual analysis usually performed in normal linear regression models can be straightforwardly extended to the modified martingale-type residual in log-Burr XII regression models with censored data.
Abstract:
The aim of this study was to investigate the behavior of the association between atrazine and glyphosate in the soil through mineralization and degradation tests. Soil treatments consisted of the combination of a field dose of glyphosate (2.88 kg ha-1) with 0, 1/2, 1 and 2 times a field dose of atrazine (3.00 kg ha-1) and a field dose of atrazine with 0, 1/2, 1 and 2 times a field dose of glyphosate. The herbicide mineralization rates were measured after 0, 3, 7, 14, 21, 28, 35, 42, 49, 56 and 63 days of soil application, and degradation rates after 0, 7, 28 and 63 days. Although glyphosate mineralization rate was higher in the presence of 1 (one) dose of atrazine when compared with glyphosate alone, no significant differences were found when half or twice the atrazine dose was applied, meaning that differences in glyphosate mineralization rates cannot be attributed to the presence of atrazine. On the other hand, the influence of glyphosate on atrazine mineralization was evident, since increasing doses of glyphosate increased the atrazine mineralization rate and the lowest dose of glyphosate accelerated atrazine degradation.
Abstract:
Survival models involving frailties are commonly applied in studies where correlated event time data arise due to natural or artificial clustering. In this paper we present an application of such models in the animal breeding field. Specifically, a mixed survival model with a multivariate correlated frailty term is proposed for the analysis of data from over 3611 Brazilian Nellore cattle. The primary aim is to evaluate parental genetic effects on the length of time, in days, that their progeny need to achieve a commercially specified standard weight gain. This trait is not measured directly but can be estimated from growth data. Results point to the importance of genetic effects and suggest that these models constitute a valuable data analysis tool for beef cattle breeding.
Abstract:
Mixed models have become important in analyzing the results of experiments, particularly those that require more complicated models (e.g., those that involve longitudinal data). This article describes a method for deriving the terms in a mixed model. Our approach extends an earlier method by Brien and Bailey to explicitly identify terms for which autocorrelation and smooth trend arising from longitudinal observations need to be incorporated in the model. At the same time we retain the principle that the model used should include, at least, all the terms that are justified by the randomization. This is done by dividing the factors into sets, called tiers, based on the randomization and determining the crossing and nesting relationships between factors. The method is applied to formulate mixed models for a wide range of examples. We also describe the mixed model analysis of data from a three-phase experiment to investigate the effect of time of refinement on Eucalyptus pulp from four different sources. Cubic smoothing splines are used to describe differences in the trend over time and unstructured covariance matrices between times are found to be necessary.
Abstract:
Joint generalized linear models and double generalized linear models (DGLMs) were designed to model outcomes for which the variability can be explained using factors and/or covariates. When such factors operate, the usual normal regression models, which inherently exhibit constant variance, will under-represent variation in the data and hence may lead to erroneous inferences. For count and proportion data, such noise factors can generate a so-called overdispersion effect, and the use of binomial and Poisson models underestimates the variability and, consequently, incorrectly indicates significant effects. In this manuscript, we propose a DGLM from a Bayesian perspective, focusing on the case of proportion data, where the overdispersion can be modeled using a random effect that depends on some noise factors. The posterior joint density function was sampled using Markov chain Monte Carlo algorithms, allowing inferences over the model parameters. An application to a data set on apple tissue culture is presented, for which it is shown that the Bayesian approach is quite feasible, even when limited prior information is available, thereby generating valuable insight for the researcher about their experimental results.
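A minimal Markov chain Monte Carlo sketch in Python for an overdispersed proportion model is given below, in the spirit of the Bayesian approach described; the data, the beta-binomial parameterization, the priors and the proposal scale are all placeholder assumptions, not the apple tissue culture analysis.

```python
import numpy as np
from scipy.special import betaln, gammaln, expit

# Placeholder proportion data: successes y out of n trials per experimental unit.
y = np.array([18, 22, 15, 30, 9, 25])
n = np.array([30, 30, 30, 40, 20, 30])

def log_post(theta):
    """Log-posterior of a beta-binomial model with mean mu = expit(theta[0]) and
    precision phi = exp(theta[1]), under vague normal priors on theta."""
    mu, phi = expit(theta[0]), np.exp(theta[1])
    a, b = mu * phi, (1.0 - mu) * phi
    loglik = np.sum(
        gammaln(n + 1) - gammaln(y + 1) - gammaln(n - y + 1)
        + betaln(y + a, n - y + b) - betaln(a, b)
    )
    logprior = -0.5 * np.sum(theta ** 2 / 10.0 ** 2)      # N(0, 10^2) priors
    return loglik + logprior

def metropolis(log_target, theta0, n_samples=5000, step=0.15, seed=0):
    """Random-walk Metropolis sampler."""
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    lp = log_target(theta)
    samples = []
    for _ in range(n_samples):
        cand = theta + rng.normal(0.0, step, size=theta.shape)
        lp_cand = log_target(cand)
        if np.log(rng.random()) < lp_cand - lp:
            theta, lp = cand, lp_cand
        samples.append(theta.copy())
    return np.array(samples)

draws = metropolis(log_post, np.zeros(2))
print(draws[1000:].mean(axis=0))     # posterior means of (logit mu, log phi)
```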