99 results for forecasting models
Abstract:
A thermodynamic approach to predict bulk glass-forming compositions in binary metallic systems was recently proposed. In this approach, the parameter gamma* = Delta H-amor/(Delta H-inter - Delta H-amor) indicates the glass-forming ability (GFA) from the standpoint of the driving force to form different competing phases, where Delta H-amor and Delta H-inter are the enthalpies for glass and intermetallic formation, respectively. Good glass-forming compositions should have a large negative enthalpy for glass formation and a very small difference from that of intermetallic formation, thus making the glassy phase easily reachable even under low cooling rates. The gamma* parameter showed a good correlation with GFA experimental data in the Ni-Nb binary system. In this work, a simple extension of the gamma* parameter is applied to the ternary Al-Ni-Y system. The calculated gamma* isocontours in the ternary diagram are compared with experimental results of glass formation in that system. Despite some misfitting, the best glass formers are found quite close to the highest gamma* values, leading to the conclusion that this thermodynamic approach can be extended to ternary systems, serving as a useful tool for the development of new glass-forming compositions. Finally, the thermodynamic approach is compared with the topological instability criteria used to predict the thermal behavior of glassy Al alloys. (C) 2007 Elsevier B. V. All rights reserved.
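The gamma* criterion above reduces to a one-line computation. The sketch below uses illustrative enthalpy values (in kJ/mol), not data from the paper:

```python
# Sketch of the gamma* glass-forming-ability parameter described above:
# gamma* = dH_amor / (dH_inter - dH_amor), where dH_amor and dH_inter are
# the formation enthalpies of the amorphous and intermetallic phases.
# The enthalpy values used below are illustrative, not from the paper.

def gamma_star(dh_amor: float, dh_inter: float) -> float:
    """Return gamma* for given amorphous and intermetallic enthalpies."""
    return dh_amor / (dh_inter - dh_amor)

# A large negative dh_amor combined with a small gap to dh_inter
# yields a large gamma*, signalling a good glass former.
print(gamma_star(-40.0, -45.0))  # -40 / (-45 - (-40)) = 8.0
```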
Abstract:
Chloride attack in marine environments, or in structures where deicing salts are used, does not always produce profiles with concentrations that decrease from the external surface to the interior of the concrete. Some profiles show chloride concentrations that increase with depth up to a certain point, forming a peak. This type of profile must be analyzed differently from the traditional model based on Fick's second law in order to generate more precise service-life models. A model for forecasting the penetration of chloride ions as a function of time for profiles with a peak was previously proposed. To confirm the efficiency of this model, it is necessary to observe the behavior of a chloride profile with a peak in a specific structure over a period of time. To achieve this, two chloride profiles of different ages (22 and 27 years) were extracted from the same structure. The profile obtained from the 22-year sample was used to estimate the chloride profile at 27 years using three models: a) the traditional model using Fick's second law and extrapolating the value of C(S), the external surface chloride concentration; b) the traditional model using Fick's second law and shifting the x-axis to the peak depth; c) the previously proposed model. The results from these models were compared with the actual profile measured in the 27-year sample and analyzed. The proposed model showed good precision in this case study, although it still needs to be tested on other structures in use.
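As context for baseline (a), the traditional Fick's-second-law profile is the error-function solution C(x,t) = Cs * (1 - erf(x / (2*sqrt(D*t)))). The D and Cs values below are illustrative assumptions, not measurements from the study:

```python
# Illustrative implementation of the traditional chloride-ingress model
# (error-function solution of Fick's second law) used as baseline (a)
# above. D (apparent diffusion coefficient) and Cs (surface chloride
# concentration) are made-up values, not taken from the study.
import math

def chloride_profile(x_mm: float, t_years: float,
                     cs: float, d_mm2_per_year: float) -> float:
    """Chloride concentration at depth x_mm after t_years."""
    return cs * (1.0 - math.erf(x_mm / (2.0 * math.sqrt(d_mm2_per_year * t_years))))

# Concentration decays monotonically with depth -- exactly the behaviour
# that a profile with a peak violates, motivating the modified model.
print(chloride_profile(0.0, 22.0, 0.6, 20.0))         # 0.6 at the surface
print(chloride_profile(25.0, 22.0, 0.6, 20.0) < 0.6)  # True
```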
Abstract:
In this paper, a comparative analysis of the long-term electric power forecasting methodologies used in some South American countries is presented. The purpose of this study is to compare these methodologies, observe whether they share similarities, and examine the behavior of the results when they are applied to the Brazilian electric market. The power forecasts were performed for the four main consumption classes (residential, industrial, commercial and rural), which are responsible for approximately 90% of the national consumption. The tool used in this analysis was the SAS (c) program. The outcome of this study allowed the identification of several methodological similarities, mainly those related to the econometric variables used by these methods. This fact strongly conditioned the comparative results obtained.
Abstract:
This work deals with the determination of crack openings in 2D reinforced concrete structures using the Finite Element Method with either a smeared rotating crack model or an embedded crack model. In the smeared crack model, the strong discontinuity associated with the crack is spread throughout the finite element. As is well known, the continuity of the displacement field assumed for these models is incompatible with the actual discontinuity. However, this type of model has been used extensively due to the relative computational simplicity it provides by treating cracks in a continuum framework, as well as the reportedly good predictions of the structural behavior of reinforced concrete members. On the other hand, by enriching the displacement field within each finite element crossed by the crack path, the embedded crack model is able to describe the effects of actual discontinuities (cracks). This paper presents a comparative study of the abilities of these 2D models in predicting the mechanical behavior of reinforced concrete structures. Structural responses are compared with experimental results from the literature, including crack patterns, crack openings and rebar stresses predicted by both models.
Abstract:
This paper presents both the theoretical and the experimental approaches of the development of a mathematical model to be used in multi-variable control system designs of an active suspension for a sport utility vehicle (SUV), in this case a light pickup truck. A complete seven-degree-of-freedom model is quickly and successfully identified, with very satisfactory results in simulations and in real experiments conducted with the pickup truck. The novelty of the proposed methodology is the use of commercial software in the early stages of the identification to speed up the process and to minimize the need for a large number of costly experiments. The paper also presents major contributions to the identification of uncertainties in vehicle suspension models and to the development of identification methods using sequential quadratic programming, where an innovation regarding the calculation of the objective function is proposed and implemented. Results from simulations of and practical experiments with the real SUV are presented, analysed, and compared, showing the potential of the method.
Abstract:
The ideal conditions for the operation of tandem cold mills are connected to a set of references generated by models and used by dynamic regulators. Aiming at the optimization of the friction and yield stress coefficients, an adaptation algorithm is proposed in this paper. Experimental results obtained from an industrial cold rolling mill are presented. (C) 2008 Elsevier B.V. All rights reserved.
Abstract:
This letter addresses the optimization and complexity reduction of switch-reconfigured antennas. A new optimization technique based on graph models is investigated. This technique is used to minimize the redundancy in a reconfigurable antenna structure and reduce its complexity. A graph modeling rule for switch-reconfigured antennas is proposed, and examples are presented.
Abstract:
Distribution of timing signals is an essential factor for the development of digital systems for telecommunication networks, integrated circuits and manufacturing automation. Originally, this distribution was implemented by using the master-slave architecture, with a precise master clock generator sending signals to phase-locked loops (PLLs) working as slave oscillators. Nowadays, wireless networks with dynamical connectivity and the increase in size and operation frequency of integrated circuits suggest that the distribution of clock signals could be more efficient if mutually connected architectures were used. Here, mutually connected PLL networks are studied and conditions for the existence of synchronous states are analytically derived, depending on individual node parameters and network connectivity, considering that the nodes are nonlinear oscillators with nonlinear coupling conditions. An expression for the network synchronisation frequency is obtained. The lock-in range and the transmission error bounds are analysed, providing hints for the design of this kind of clock distribution system.
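As rough intuition for mutual synchronisation (not the paper's PLL equations), a Kuramoto-style phase-oscillator sketch shows how symmetrically coupled nodes lock to a common frequency. The coupling gain and free-running frequencies below are arbitrary assumptions:

```python
# Toy mutually coupled phase-oscillator network, in the spirit of the
# mutually connected architecture above (Kuramoto-style, not the paper's
# PLL model): each node nudges its phase toward its neighbours, and with
# strong enough symmetric coupling the phases lock together.
import math

def simulate(freqs, k=2.0, dt=0.01, steps=5000):
    """Forward-Euler integration of all-to-all coupled phase oscillators."""
    n = len(freqs)
    phases = [0.0] * n
    for _ in range(steps):
        new = []
        for i in range(n):
            coupling = sum(math.sin(phases[j] - phases[i]) for j in range(n))
            new.append(phases[i] + dt * (freqs[i] + (k / n) * coupling))
        phases = new
    return phases

p = simulate([0.9, 1.0, 1.1])
# After locking, the phase spread stays small and bounded.
print(max(p) - min(p) < 0.5)  # True: the network phase-locks
```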
Abstract:
Eight different models to represent the effect of friction in control valves are presented: four models based on physical principles and four empirical ones. The physical models, both static and dynamic, have the same structure. The models are implemented in Simulink/Matlab (R) and compared, using different friction coefficients and input signals. Three of the models were able to reproduce the stick-slip phenomenon and passed all the tests, which were applied following ISA standards. (C) 2008 Elsevier Ltd. All rights reserved.
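For intuition, a minimal static friction law with Coulomb and stiction terms, in the spirit of the physically based models compared above, can be sketched as follows. The structure and the coefficients fc and fs are illustrative assumptions, not one of the paper's eight models:

```python
# Minimal static friction sketch: Coulomb friction while sliding plus
# stiction at rest. fc (Coulomb level) and fs (stiction level) are
# illustrative coefficients, not values from the paper.
def friction_force(velocity: float, applied: float,
                   fc: float = 1.0, fs: float = 1.5) -> float:
    """Friction force opposing motion; sticks while |applied| <= fs at rest."""
    if velocity != 0.0:                        # sliding: Coulomb friction
        return -fc if velocity > 0 else fc
    if abs(applied) <= fs:                     # sticking: cancels the input
        return -applied
    return -fs if applied > 0 else fs          # breakaway at stiction level

print(friction_force(1.0, 0.0))   # -1.0: Coulomb force opposes motion
print(friction_force(0.0, 1.0))   # -1.0: stuck, friction cancels input
```

Because the breakaway level fs exceeds the sliding level fc, the valve stem jumps when it breaks free, which is the stick-slip behaviour the tested models must reproduce.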
Abstract:
This paper presents two strategies for the upgrade of set-up generation systems for tandem cold mills. Even though these mills have been modernized mainly due to quality requirements, their upgrades may be made with the intent of replacing pre-calculated reference tables. In this case, the Bryant and Osborn mill model without adaptive technique is proposed. As a more demanding modernization, the Bland and Ford model including adaptation is recommended, although it requires more complex computational hardware. Advantages and disadvantages of these two systems are compared and discussed, and experimental results obtained from an industrial cold mill are shown.
Abstract:
Computer viruses are an important risk to computational systems, endangering both corporations of all sizes and personal computers used for domestic applications. Here, classical epidemiological models for disease propagation are adapted to computer networks and, by using simple systems identification techniques, a model called SAIC (Susceptible, Antidotal, Infectious, Contaminated) is developed. Real data about computer viruses are used to validate the model. (c) 2008 Elsevier Ltd. All rights reserved.
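A generic four-compartment sketch conveys the flavour of such a model; the coupling terms and rates below are hypothetical and are not the SAIC equations from the paper:

```python
# Generic four-compartment (S, A, I, C) sketch in the spirit of the SAIC
# model above. The transition structure and the rates beta (infection),
# alpha (antidote adoption) and delta (contamination) are hypothetical
# illustrations, not the paper's identified equations.
def step(s, a, i, c, beta=0.3, alpha=0.1, delta=0.05, dt=0.1):
    """One forward-Euler step of the compartmental ODEs."""
    ds = -beta * s * i - alpha * s   # susceptibles infected or protected
    da = alpha * s                   # antidotal (protected) machines
    di = beta * s * i - delta * i    # infectious machines
    dc = delta * i                   # contaminated (disabled) machines
    return s + dt * ds, a + dt * da, i + dt * di, c + dt * dc

state = (0.9, 0.0, 0.1, 0.0)         # fractions of the machine population
for _ in range(100):
    state = step(*state)
print(round(sum(state), 6))          # 1.0: total population is conserved
```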
Abstract:
In this paper, we compare three residuals to assess departures from the error assumptions as well as to detect outlying observations in log-Burr XII regression models with censored observations. These residuals can also be used for the log-logistic regression model, which is a special case of the log-Burr XII regression model. For different parameter settings, sample sizes and censoring percentages, various simulation studies are performed and the empirical distribution of each residual is displayed and compared with the standard normal distribution. These studies suggest that the residual analysis usually performed in normal linear regression models can be straightforwardly extended to the modified martingale-type residual in log-Burr XII regression models with censored data.
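For illustration, a martingale-type residual for censored data takes the form r = delta + log S(t), where delta is the event indicator (1 = observed, 0 = censored) and S is the fitted survival function. The exponential survival function below is only a stand-in for a fitted log-Burr XII model:

```python
# Martingale-type residual sketch for censored data: r = delta + log S(t).
# The exponential survival S(t) = exp(-t/scale) is an assumed stand-in
# for the fitted model, chosen only to keep the example self-contained.

def martingale_residual(t: float, delta: int, scale: float = 1.0) -> float:
    """delta = 1 for an observed event, 0 for a censored observation."""
    log_s = -t / scale               # log-survival under the assumed model
    return delta + log_s

print(martingale_residual(1.0, 1))   # 0.0: event exactly as expected
print(martingale_residual(2.0, 0))   # -2.0: long censored time
```

Censored observations always yield negative residuals, which is why the empirical distribution is checked against a reference distribution by simulation rather than assumed normal outright.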
Abstract:
Mixed models have become important in analyzing the results of experiments, particularly those that require more complicated models (e.g., those that involve longitudinal data). This article describes a method for deriving the terms in a mixed model. Our approach extends an earlier method by Brien and Bailey to explicitly identify terms for which autocorrelation and smooth trend arising from longitudinal observations need to be incorporated in the model. At the same time we retain the principle that the model used should include, at least, all the terms that are justified by the randomization. This is done by dividing the factors into sets, called tiers, based on the randomization and determining the crossing and nesting relationships between factors. The method is applied to formulate mixed models for a wide range of examples. We also describe the mixed model analysis of data from a three-phase experiment to investigate the effect of time of refinement on Eucalyptus pulp from four different sources. Cubic smoothing splines are used to describe differences in the trend over time and unstructured covariance matrices between times are found to be necessary.
Abstract:
In this paper, we present various diagnostic methods for polyhazard models. Polyhazard models are a flexible family for fitting lifetime data. Their main advantage over single hazard models, such as the Weibull and the log-logistic models, is that they include a large number of nonmonotone hazard shapes, such as bathtub and multimodal curves. Some influence methods, such as the local influence and the total local influence of an individual, are derived, analyzed and discussed. A discussion of the computation of the likelihood displacement, as well as of the normal curvature in the local influence method, is presented. Finally, an example with real data is given for illustration.
Abstract:
This paper proposes a regression model based on the modified Weibull distribution. This distribution can be used to model bathtub-shaped failure rate functions. Assuming censored data, we consider maximum likelihood and Jackknife estimators for the parameters of the model. We derive the appropriate matrices for assessing local influence on the parameter estimates under different perturbation schemes, and we also present some methods for assessing global influence. In addition, for different parameter settings, sample sizes and censoring percentages, various simulations are performed and the empirical distribution of the modified deviance residual is displayed and compared with the standard normal distribution. These studies suggest that the residual analysis usually performed in normal linear regression models can be straightforwardly extended to a martingale-type residual in log-modified Weibull regression models with censored data. Finally, we analyze a real data set under log-modified Weibull regression models. A diagnostic analysis and a model check based on the modified deviance residual are performed to select appropriate models. (c) 2008 Elsevier B.V. All rights reserved.
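For intuition, one commonly used parameterization of a modified Weibull hazard, h(t) = a*(g + l*t)*t^(g-1)*exp(l*t), produces the bathtub shape mentioned above when g < 1. This parameterization and the coefficient values are assumptions for illustration, not necessarily the paper's:

```python
# Illustrative bathtub-shaped hazard from one common parameterization of
# the modified Weibull distribution: h(t) = a*(g + l*t)*t**(g-1)*exp(l*t).
# The parameterization and the values of a, g, l are assumptions here.
import math

def hazard(t: float, a: float = 1.0, g: float = 0.5, l: float = 1.0) -> float:
    """Hazard rate at time t > 0."""
    return a * (g + l * t) * t ** (g - 1.0) * math.exp(l * t)

# With g < 1 the rate first falls (early failures), then rises (wear-out):
early, middle, late = hazard(0.01), hazard(0.3), hazard(3.0)
print(early > middle and late > middle)  # True: bathtub shape
```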