936 results for MODEL ANALYSIS
Abstract:
The objective of this paper is to develop and validate a mechanistic model for the degradation of phenol by the Fenton process. Experiments were performed in semi-batch operation, in which phenol, catechol and hydroquinone concentrations were measured. Using the methodology described in Pontes and Pinto [R.F.F. Pontes, J.M. Pinto, Analysis of integrated kinetic and flow models for anaerobic digesters, Chemical Engineering Journal 122 (1-2) (2006) 65-80], a stoichiometric model was first developed, with 53 reactions and 26 compounds, followed by the corresponding kinetic model. Sensitivity analysis was performed to determine the most influential kinetic parameters of the model, which were then estimated from the experimental results. The adjusted model was used to analyze the impact of the initial concentration and flow rate of reactants on the efficiency of the Fenton process in degrading phenol. Moreover, the model was applied to evaluate the cost of treating wastewater contaminated with phenol in order to meet environmental standards. (C) 2009 Elsevier B.V. All rights reserved.
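The paper's kinetic model has 53 reactions; as a hedged illustration only, the sketch below integrates a drastically lumped pseudo-first-order scheme for phenol and its two measured intermediates. The rate constants K1, K2 and the branching fraction are hypothetical stand-ins, not the paper's fitted values.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical lumped pseudo-first-order rate constants (1/min) -- stand-ins
# for the paper's 53-reaction kinetic model, chosen only for illustration.
K1, K2 = 0.15, 0.05   # phenol -> intermediates, intermediates -> ring-opening products
BRANCH = 0.6          # fraction of phenol routed to catechol (rest: hydroquinone)

def rhs(t, y):
    ph, cat, hq = y
    return [-K1 * ph,
            BRANCH * K1 * ph - K2 * cat,
            (1 - BRANCH) * K1 * ph - K2 * hq]

# Semi-batch run, 60 min, normalized initial phenol concentration of 1.0
sol = solve_ivp(rhs, (0.0, 60.0), [1.0, 0.0, 0.0])
ph_end, cat_end, hq_end = sol.y[:, -1]
```

The same structure (species balances driven by a reaction network) scales to the full stoichiometric model by expanding `rhs` to all 26 compounds.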
Abstract:
Cooling towers are widely used as a cooling medium in many industrial and utility plants, where their thermal performance is of vital importance. Despite the wide interest in cooling tower design and rating, and their importance in energy conservation, there are few investigations concerning the integrated analysis of cooling systems. This work presents an approach for the systemic performance analysis of a cooling water system that combines experimental design with mathematical modeling. An experimental investigation was carried out to characterize the mass transfer in the packing of the cooling tower as a function of the liquid and gas flow rates, with results within the range of the measurement accuracy. Then, an integrated model was developed that relies on the mass and heat transfer of the cooling tower, as well as on the hydraulic and thermal interactions with a heat exchanger network. The integrated model for the cooling water system was simulated, and the temperature results agree with experimental data from the real operation of the pilot plant. A case study illustrates the interactions in the system and the need for a systemic analysis of cooling water systems. The proposed mathematical and experimental analysis should be useful for performance analysis of real-world cooling water systems. (C) 2009 Elsevier Ltd. All rights reserved.
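The packing mass-transfer characteristic of a counterflow tower is commonly summarized by the Merkel number Me = KaV/L. A minimal sketch, assuming a hypothetical cubic fit for saturated-air enthalpy and a simple air-side energy balance (not the paper's correlations):

```python
import numpy as np

CPW = 4.186  # kJ/(kg K), specific heat of water

def h_sat(T):
    """Hypothetical cubic fit of saturated-air enthalpy (kJ/kg dry air) vs T (deg C)."""
    return 4.79 + 2.57 * T - 0.030 * T**2 + 0.0017 * T**3

def merkel_number(tw_in, tw_out, ha_in, L_over_G, n=200):
    """Me = KaV/L = integral of cpw dTw / (h_sat(Tw) - h_air(Tw)),
    evaluated by the trapezoid rule along the water temperature range."""
    tw = np.linspace(tw_out, tw_in, n)
    ha = ha_in + CPW * L_over_G * (tw - tw_out)  # counterflow air-side energy balance
    f = CPW / (h_sat(tw) - ha)                   # Merkel integrand
    return float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(tw)))

# Illustrative operating point: 40 -> 30 C water, inlet air enthalpy 60 kJ/kg, L/G = 1
me = merkel_number(tw_in=40.0, tw_out=30.0, ha_in=60.0, L_over_G=1.0)
```

In a characterization study like the one described, Me would be regressed against the liquid and gas flow rates measured in the pilot tower.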
Abstract:
We derive an easy-to-compute approximate bound for the range of step-sizes for which the constant-modulus algorithm (CMA) will remain stable if initialized close to a minimum of the CM cost function. Our model highlights the influence of the signal constellation used in the transmission system: for smaller variation in the modulus of the transmitted symbols, the algorithm will be more robust, and the steady-state misadjustment will be smaller. The theoretical results are validated through several simulations, for long and short filters and channels.
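For reference, the CMA update to which such a step-size bound applies minimizes J = E[(|y|^2 - R2)^2] by stochastic gradient. A minimal sketch on QPSK (constant modulus, so R2 = 1) through a hypothetical mild channel; the step size, channel taps and equalizer length are illustrative choices, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
N, L = 5000, 5                        # number of symbols, equalizer taps
s = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], size=N) / np.sqrt(2)  # unit-modulus QPSK
h = np.array([1.0, 0.25, -0.1])       # hypothetical mild ISI channel
x = np.convolve(s, h)[:N]             # received signal (noise-free for simplicity)

R2 = np.mean(np.abs(s)**4) / np.mean(np.abs(s)**2)  # CM dispersion constant (= 1 for QPSK)
w = np.zeros(L, complex)
w[L // 2] = 1.0                       # center-spike initialization (near a CM minimum)
mu = 1e-3                             # step size, assumed inside the stable range

for n in range(L, N):
    u = x[n - L:n][::-1]              # regressor, most recent sample first
    y = np.conj(w) @ u                # equalizer output
    e = y * (np.abs(y)**2 - R2)       # CM error term
    w -= mu * np.conj(e) * u          # stochastic-gradient (CMA) update
```

After convergence the output dispersion E[(|y|^2 - R2)^2] should fall below its unequalized value, which is the behavior the stability bound guarantees for admissible step sizes.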
Abstract:
The TCP/IP architecture has been consolidated as the standard for distributed systems. However, there is considerable research and discussion on alternatives for the evolution of this architecture; in this context, this work presents the Title Model, which supports application needs through the use of a cross-layer ontology and horizontal addressing in a next-generation Internet. From a practical viewpoint, the reduction in network cost is shown for a distributed programming example in networks with layer 2 connectivity. To demonstrate the improvement provided by the Title Model, a network analysis was performed for a message passing interface application that sends a vector of integers and returns its sum. This analysis confirms that the current proposal allows, in this environment, a reduction of 15.23% in the total network traffic, in bytes.
Abstract:
This paper presents an analysis of the performance of a baseband multiple-input single-output (MISO) time reversal ultra-wideband system (TR-UWB) incorporating a symbol spaced decision feedback equalizer (DFE). A semi-analytical performance analysis based on a Gaussian approach is considered, which matched well with simulation results, even for the DFE case. The channel model adopted is based on the IEEE 802.15.3a model, considering correlated shadowing across antenna elements. In order to provide a more realistic analysis, channel estimation errors are considered for the design of the TR filter. A guideline for the choice of equalizer length is provided. The results show that the system's performance improves with an increase in the number of transmit antennas and when a symbol spaced equalizer is used with a relatively small number of taps compared to the number of resolvable paths in the channel impulse response. Moreover, it is possible to conclude that due to the time reversal scheme, the error propagation in the DFE does not play a role in the system's performance.
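The time-reversal prefilter is the time-reversed complex conjugate of the (estimated) channel impulse response, so the effective channel seen by the receiver is the channel autocorrelation, which peaks at zero lag; this temporal focusing is what lets a short equalizer suffice. A toy sketch with a hypothetical exponentially decaying CIR:

```python
import numpy as np

rng = np.random.default_rng(1)
M = 20
h = rng.standard_normal(M) * np.exp(-0.2 * np.arange(M))  # toy decaying multipath CIR
g = np.conj(h[::-1])                  # time-reversal prefilter (conjugated, reversed CIR)
heff = np.convolve(g, h)              # effective channel = autocorrelation of h
peak = int(np.argmax(np.abs(heff)))   # focusing peak at zero lag (index M - 1)
```

With imperfect channel estimates (as studied in the paper), `g` would be built from a noisy copy of `h`, which lowers and broadens the focusing peak.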
Abstract:
Lightning-induced overvoltages have a considerable impact on the power quality of overhead distribution and telecommunications systems, and various models have been developed for the computation of the electromagnetic transients caused by indirect strokes. The most adequate has been shown to be the one proposed by Agrawal et al.; the Rusck model can be visualized as a particular case, as both models are equivalent when the lightning channel is perpendicular to the ground plane. In this paper, an extension of the Rusck model that enables the calculation of lightning-induced transients considering flashes to nearby elevated structures and realistic line configurations is tested against data obtained from both natural lightning and scale model experiments. The latter, performed under controlled conditions, can also be used to verify the validity of other coupling models and relevant codes. The so-called Extended Rusck Model, which is shown to be sufficiently accurate, is applied to the analysis of lightning-induced voltages on lines with a shield wire and/or surge arresters. The investigation conducted indicates that the ratio between the peak values of the voltages induced by typical first and subsequent strokes can be either greater or smaller than unity, depending on the line configuration.
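For the particular case the extension starts from (vertical channel, infinite single-wire line over perfectly conducting ground), Rusck's simplified formula gives the peak induced voltage at the point of the line closest to the stroke. A sketch with illustrative parameter values (not the paper's measured cases):

```python
import math

Z0 = 30.0  # ohm: (1/(4*pi)) * sqrt(mu0/eps0)

def rusck_peak_voltage(i0, h, y, v_over_c=0.3):
    """Rusck's simplified peak induced voltage (V) for an infinite single-wire
    line of height h (m) over perfect ground: stroke current i0 (A), lateral
    distance y (m), return-stroke speed as a fraction of c."""
    b = v_over_c
    return Z0 * i0 * h / y * (1.0 + (b / math.sqrt(2.0)) / math.sqrt(1.0 - b * b / 2.0))

# Illustrative: 30 kA stroke, 10 m line height, at 50 m and 200 m lateral distance
v_near = rusck_peak_voltage(30e3, 10.0, 50.0)
v_far = rusck_peak_voltage(30e3, 10.0, 200.0)
```

The peak voltage falls off as 1/y with distance; the Extended Rusck Model generalizes this setup to inclined channels, nearby elevated struck objects and realistic line configurations.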
Abstract:
A rigorous derivation of non-linear equations governing the dynamics of an axially loaded beam is given, with a clear focus on developing robust low-dimensional models. Two important loading scenarios were considered, in which a structure is subjected to a uniformly distributed axial force and to a thrust force. These loads mimic the main forces acting on an offshore riser, for which an analytical methodology has been developed and applied. In particular, non-linear normal modes (NNMs) and non-linear multi-modes (NMMs) have been constructed by using the method of multiple scales, in order to effectively analyse the transversal vibration responses by monitoring the modal responses and mode interactions. The developed analytical models have been cross-checked against results from FEM simulation. The FEM model, having 26 elements and 77 degrees-of-freedom, gave results similar to those of the low-dimensional (one degree-of-freedom) non-linear oscillator, which was developed by constructing a so-called invariant manifold. The comparisons of the dynamical responses were made in terms of time histories, phase portraits and mode shapes. (C) 2008 Elsevier Ltd. All rights reserved.
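As a hedged illustration of the kind of one degree-of-freedom reduced model discussed, the sketch below integrates a damped, harmonically forced Duffing-type oscillator and extracts a time history; all coefficients are hypothetical, not the paper's identified modal parameters:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical single-mode reduction of the axially loaded beam:
#   q'' + 2*zeta*w0*q' + w0^2 * q + gamma * q^3 = f * cos(W * t)
ZETA, W0, GAMMA = 0.05, 1.0, 0.5   # damping ratio, natural frequency, cubic stiffness
F, OMEGA = 0.3, 1.0                # forcing amplitude and frequency (near resonance)

def rhs(t, y):
    q, dq = y
    return [dq, -2 * ZETA * W0 * dq - W0**2 * q - GAMMA * q**3 + F * np.cos(OMEGA * t)]

sol = solve_ivp(rhs, (0.0, 200.0), [0.0, 0.0], max_step=0.05)
q = sol.y[0]   # transversal modal response time history
```

Phase portraits (q vs. q') of such a model are one of the comparison tools the abstract mentions; the hardening cubic term bends the resonance peak, which is what the multiple-scales analysis captures analytically.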
Abstract:
Accurate price forecasting for agricultural commodities can have significant decision-making implications for suppliers, especially those of biofuels, where the agriculture and energy sectors intersect. Environmental pressures and high oil prices affect demand for biofuels and have reignited the discussion about effects on food prices. Suppliers in the sugar-alcohol sector need to decide the ideal proportion of ethanol and sugar to optimise their financial strategy. Prices can be affected by exogenous factors, such as exchange rates and interest rates, as well as non-observable variables like the convenience yield, which is related to supply shortages. The literature generally uses two approaches: artificial neural networks (ANNs), which are recognised as being at the forefront of exogenous-variable analysis, and stochastic models such as the Kalman filter, which is able to account for non-observable variables. This article proposes a hybrid model for forecasting the prices of agricultural commodities that is built upon both approaches and is applied to forecast the price of sugar. The Kalman filter considers the structure of the stochastic process that describes the evolution of prices. Neural networks accommodate variables that can impact asset prices in an indirect, nonlinear way, which cannot easily be incorporated into traditional econometric models.
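The stochastic component of such a hybrid can be illustrated with its simplest case: a scalar local-level Kalman filter tracking a latent (non-observable) log-price from noisy observations. The process and measurement variances below are hypothetical, and the ANN stage that would model the exogenous, nonlinear effects is omitted:

```python
import numpy as np

def kalman_filter_1d(z, q=1e-4, r=1e-2):
    """Local-level Kalman filter: x_t = x_{t-1} + w_t, z_t = x_t + v_t,
    with hypothetical process variance q and measurement variance r."""
    x, p = z[0], 1.0
    out = np.empty(len(z))
    for t, zt in enumerate(z):
        p = p + q                    # predict: variance grows by process noise
        k = p / (p + r)              # Kalman gain
        x = x + k * (zt - x)         # update with the innovation
        p = (1.0 - k) * p
        out[t] = x
    return out

rng = np.random.default_rng(2)
true = np.cumsum(rng.normal(0.0, 0.01, 300)) + 10.0   # latent log-price (random walk)
obs = true + rng.normal(0.0, 0.1, 300)                # noisy market observations
est = kalman_filter_1d(obs)
```

In the hybrid scheme described, the filter's state estimate would feed the econometric structure while an ANN handles exchange-rate and interest-rate effects.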
Abstract:
Background: The presence of the periodontal ligament (PDL) makes it possible to absorb and distribute loads produced during masticatory function and other tooth contacts into the alveolar process via the alveolar bone proper. However, several factors affect the integrity of periodontal structures causing the destruction of the connective matrix and cells, the loss of fibrous attachment, and the resorption of alveolar bone. Methods: The purpose of this study was to evaluate the stress distribution by finite element analysis in a PDL in three-dimensional models of the upper central incisor under three different load conditions: 100 N occlusal loading at 45 degrees (model 1: masticatory load); 500 N at the incisal edge at 45 degrees (model 2: parafunctional habit); and 800 N at the buccal surface at 90 degrees (model 3: trauma case). The models were built from computed tomography scans. Results: The stress distribution was quite different among the models. The most significant values (harmful) of tensile and compressive stresses were observed in models 2 and 3, with similarly distinct patterns of stress distributions along the PDL. Tensile stresses were observed along the internal and external aspects of the PDL, mostly at the cervical and middle thirds. Conclusions: The stress generation in these models may affect the integrity of periodontal structures. A better understanding of the biomechanical behavior of the PDL under physiologic and traumatic loading conditions might enhance the understanding of the biologic reaction of the PDL in health and disease. J Periodontol 2009;80:1859-1867.
Abstract:
In this paper, we compare three residuals to assess departures from the error assumptions as well as to detect outlying observations in log-Burr XII regression models with censored observations. These residuals can also be used for the log-logistic regression model, which is a special case of the log-Burr XII regression model. For different parameter settings, sample sizes and censoring percentages, various simulation studies are performed and the empirical distribution of each residual is displayed and compared with the standard normal distribution. These studies suggest that the residual analysis usually performed in normal linear regression models can be straightforwardly extended to the modified martingale-type residual in log-Burr XII regression models with censored data.
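The martingale-type residuals referred to build on the basic martingale residual r_i = delta_i + log S_hat(t_i), i.e. the censoring indicator minus the estimated cumulative hazard. A minimal sketch on toy exponential survival data (the modified versions studied in the paper transform this further):

```python
import numpy as np

def martingale_residuals(surv_prob, delta):
    """r_i = delta_i + log S_hat(t_i): event indicator delta (1 = observed event,
    0 = censored) minus the estimated cumulative hazard -log S_hat(t_i)."""
    return np.asarray(delta) + np.log(np.asarray(surv_prob))

# Toy example: fitted exponential(1) model, S_hat(t) = exp(-t)
t = np.array([0.5, 1.2, 0.3, 2.0])
delta = np.array([1, 0, 1, 1])
r = martingale_residuals(np.exp(-t), delta)
```

Under a well-specified model these residuals behave like censored-data martingales (bounded above by 1, skewed left), which is why the paper transforms them before comparing against the standard normal.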
Abstract:
A bathtub-shaped failure rate function is very useful in survival analysis and reliability studies. The well-known lifetime distributions do not have this property. For the first time, we propose a location-scale regression model based on the logarithm of an extended Weibull distribution which has the ability to deal with bathtub-shaped failure rate functions. We use the method of maximum likelihood to estimate the model parameters and some inferential procedures are presented. We reanalyze a real data set under the new model and the log-modified Weibull regression model. We perform a model check based on martingale-type residuals and generated envelopes and the statistics AIC and BIC to select appropriate models. (C) 2009 Elsevier B.V. All rights reserved.
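The bathtub shape can be seen in the modified Weibull distribution, which the abstract uses as the comparison model: its hazard is h(t) = a (gamma + lambda t) t^(gamma-1) e^(lambda t), and for gamma < 1 it first decreases and then increases. A sketch with illustrative parameter values:

```python
import numpy as np

def modified_weibull_hazard(t, a=1.0, gam=0.5, lam=1.0):
    """Hazard of the modified Weibull distribution:
    h(t) = a * (gam + lam*t) * t**(gam - 1) * exp(lam * t).
    For gam < 1 the hazard is bathtub-shaped (decreasing, then increasing)."""
    return a * (gam + lam * t) * t**(gam - 1.0) * np.exp(lam * t)

# Early life, useful life, wear-out: the hazard should dip in the middle
t = np.array([0.01, 0.2, 2.0])
h = modified_weibull_hazard(t)
```

A location-scale regression model such as the paper's arises by modeling log-lifetime as a linear predictor plus a scaled error whose distribution has this hazard shape.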
Abstract:
This paper proposes a regression model considering the modified Weibull distribution. This distribution can be used to model bathtub-shaped failure rate functions. Assuming censored data, we consider maximum likelihood and Jackknife estimators for the parameters of the model. We derive the appropriate matrices for assessing local influence on the parameter estimates under different perturbation schemes and we also present some ways to perform global influence. In addition, for different parameter settings, sample sizes and censoring percentages, various simulations are performed and the empirical distribution of the modified deviance residual is displayed and compared with the standard normal distribution. These studies suggest that the residual analysis usually performed in normal linear regression models can be straightforwardly extended to a martingale-type residual in log-modified Weibull regression models with censored data. Finally, we analyze a real data set under log-modified Weibull regression models. A diagnostic analysis and a model checking based on the modified deviance residual are performed to select appropriate models. (c) 2008 Elsevier B.V. All rights reserved.
Abstract:
In this study, regression models are evaluated for grouped survival data when the effect of censoring time is considered in the model and the regression structure is modeled through four link functions. The methodology for grouped survival data is based on life tables, and the times are grouped in k intervals so that ties are eliminated. Thus, the data modeling is performed by considering discrete lifetime regression models. The model parameters are estimated by using the maximum likelihood and jackknife methods. To detect influential observations in the proposed models, diagnostic measures based on case deletion, termed global influence, and influence measures based on small perturbations of the data or of the model, termed local influence, are used; the total local influence estimate is also employed. Various simulation studies are performed to compare the performance of the four link functions of the regression models for grouped survival data under different parameter settings, sample sizes and numbers of intervals. Finally, a data set is analyzed by using the proposed regression models. (C) 2010 Elsevier B.V. All rights reserved.
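In discrete-time (grouped) survival models, the link function maps a linear predictor to the hazard probability within each interval. The abstract does not name the four links, so the sketch below uses four common choices purely as illustration:

```python
import math

def inv_links(eta):
    """Interval-hazard probabilities from a linear predictor eta under four
    common links for grouped survival data (illustrative choices; the paper's
    four links are not named in the abstract)."""
    Phi = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))  # standard normal CDF
    return {
        "logit":   1.0 / (1.0 + math.exp(-eta)),
        "cloglog": 1.0 - math.exp(-math.exp(eta)),   # proportional-hazards analogue
        "probit":  Phi(eta),
        "loglog":  math.exp(-math.exp(-eta)),
    }

p = inv_links(-0.3)   # hazards in one interval for a given covariate profile
```

The grouped-data likelihood is then a product over intervals of these hazards (for events) and their complements (for survivors), which is what the maximum likelihood and jackknife estimation operate on.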
Abstract:
Joint generalized linear models and double generalized linear models (DGLMs) were designed to model outcomes for which the variability can be explained using factors and/or covariates. When such factors operate, the usual normal regression models, which inherently exhibit constant variance, will under-represent variation in the data and hence may lead to erroneous inferences. For count and proportion data, such noise factors can generate a so-called overdispersion effect, and the use of binomial and Poisson models underestimates the variability and, consequently, incorrectly indicates significant effects. In this manuscript, we propose a DGLM from a Bayesian perspective, focusing on the case of proportion data, where the overdispersion can be modeled using a random effect that depends on some noise factors. The posterior joint density function was sampled using Markov chain Monte Carlo (MCMC) algorithms, allowing inference on the model parameters. An application to a data set on apple tissue culture is presented, for which it is shown that the Bayesian approach is quite feasible, even when limited prior information is available, thereby generating valuable insight for the researcher about the experimental results.
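As a minimal illustration of the MCMC machinery, far simpler than the paper's DGLM (which adds a linear predictor and a random effect for the overdispersion), here is a random-walk Metropolis sampler for a single binomial proportion under a flat prior; the data and tuning constants are toy values:

```python
import numpy as np

rng = np.random.default_rng(3)
y, n = 27, 40   # toy proportion data: 27 successes in 40 trials

def log_post(p):
    """Log posterior of a binomial proportion under a flat Beta(1, 1) prior."""
    if p <= 0.0 or p >= 1.0:
        return -np.inf
    return y * np.log(p) + (n - y) * np.log(1.0 - p)

# Random-walk Metropolis: propose p' = p + N(0, 0.1), accept with
# probability min(1, posterior ratio).
p, chain = 0.5, []
for _ in range(20000):
    prop = p + rng.normal(0.0, 0.1)
    if np.log(rng.uniform()) < log_post(prop) - log_post(p):
        p = prop
    chain.append(p)
post_mean = float(np.mean(chain[5000:]))   # discard burn-in
```

With a flat prior the exact posterior is Beta(y + 1, n - y + 1), so the chain's mean can be checked against the conjugate answer; in the DGLM no such closed form exists, which is why MCMC is needed.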
Abstract:
Using data from a logging experiment in the eastern Brazilian Amazon region, we develop a matrix growth and yield model that captures the dynamic effects of harvest system choice on forest structure and composition. Multinomial logistic regression is used to estimate the growth transition parameters for a 10-year time step, while a Poisson regression model is used to estimate recruitment parameters. The model is designed to be easily integrated with an economic model of decision-making to perform tropical forest policy analysis. The model is used to compare the long-run structure and composition of a stand arising from the choice of implementing either conventional logging techniques or more carefully planned and executed reduced-impact logging (RIL) techniques, contrasted against a baseline projection of an unlogged forest. Results from log-and-leave scenarios show that a stand logged according to Brazilian management requirements will require well over 120 years to recover its initial commercial volume, regardless of logging technique employed. Implementing RIL, however, accelerates this recovery. Scenarios imposing a 40-year cutting cycle raise the possibility of sustainable harvest volumes, although at significantly lower levels than is implied by current regulations. Meeting current Brazilian forest policy goals may require an increase in the planned total area of permanent production forest or the widespread adoption of silvicultural practices that increase stand recovery and volume accumulation rates after RIL harvests. Published by Elsevier B.V.
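The projection step of a matrix growth model is y_{t+1} = G y_t + r, where G holds the (multinomial-estimated) 10-year transition probabilities between size classes and r the mean (Poisson-estimated) recruitment. A toy three-class sketch with hypothetical values, not the paper's fitted parameters:

```python
import numpy as np

# Toy 3-class stand state (small/medium/large stems per ha) and hypothetical
# 10-year transition probabilities: stay in class, grow up one class, or die.
G = np.array([[0.70, 0.00, 0.00],   # to small:  70% of small stems remain
              [0.20, 0.75, 0.00],   # to medium: 20% of small grow up, 75% stay
              [0.00, 0.15, 0.85]])  # to large:  15% of medium grow up, 85% stay
r = np.array([25.0, 0.0, 0.0])      # mean recruitment into the smallest class

def project(y, steps):
    """Iterate y_{t+1} = G @ y_t + r over the given number of 10-year steps."""
    for _ in range(steps):
        y = G @ y + r
    return y

y0 = np.array([300.0, 120.0, 60.0])
y12 = project(y0, 12)               # 120-year projection, cf. the recovery horizon
```

Harvest-system choice enters such a model by modifying G, r and the post-harvest state y0 (e.g. higher residual damage under conventional logging than under RIL), which is what drives the differing recovery trajectories.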