323 results for Embankment Model Tests
Abstract:
This paper presents the design and implementation of an embedded soft sensor, i.e., a generic and autonomous hardware module that can be applied to many complex plants in which a certain variable cannot be directly measured. It is implemented based on a fuzzy identification algorithm called "Limited Rules", employed to model continuous nonlinear processes. The fuzzy model has a Takagi-Sugeno-Kang structure, and the premise parameters are defined based on the Fuzzy C-Means (FCM) clustering algorithm. The firmware contains the soft sensor, which runs online, estimating the target variable from other available variables. Tests have been performed using a simulated pH neutralization plant, and the results of the embedded soft sensor have been considered satisfactory. A complete embedded inferential control system is also presented, including a soft sensor and a PID controller. (c) 2007, ISA. Published by Elsevier Ltd. All rights reserved.
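The FCM step used above to set the premise parameters can be sketched as follows. This is a minimal, generic Fuzzy C-Means implementation, not the authors' "Limited Rules" algorithm; the data and cluster count are hypothetical.

```python
import numpy as np

def fcm(X, c, m=2.0, iters=100, seed=0):
    """Fuzzy C-Means: alternately update memberships U and centers V."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(iters):
        Um = U ** m
        V = (Um.T @ X) / Um.sum(axis=0)[:, None]           # weighted centroids
        d = np.linalg.norm(X[:, None, :] - V[None, :, :], axis=2) + 1e-12
        inv = d ** (-2.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True)            # standard FCM update
    return V, U

# Two well-separated 1-D groups: the centers should land near 0 and 10.
rng = np.random.default_rng(1)
X = np.concatenate([rng.normal(0.0, 0.5, 50), rng.normal(10.0, 0.5, 50)])[:, None]
V, U = fcm(X, c=2)
centers = sorted(V.ravel())
```

In a TSK identification scheme, each cluster center would then anchor one rule's premise membership function, with a local linear model fitted as the consequent.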
Abstract:
Accurate price forecasting for agricultural commodities can have significant decision-making implications for suppliers, especially those of biofuels, where the agriculture and energy sectors intersect. Environmental pressures and high oil prices affect demand for biofuels and have reignited the discussion about effects on food prices. Suppliers in the sugar-alcohol sector need to decide the ideal proportion of ethanol and sugar to optimise their financial strategy. Prices can be affected by exogenous factors, such as exchange rates and interest rates, as well as non-observable variables like the convenience yield, which is related to supply shortages. The literature generally uses two approaches: artificial neural networks (ANNs), which are recognised as being at the forefront of exogenous-variable analysis, and stochastic models such as the Kalman filter, which is able to account for non-observable variables. This article proposes a hybrid model for forecasting the prices of agricultural commodities that is built upon both approaches and is applied to forecast the price of sugar. The Kalman filter considers the structure of the stochastic process that describes the evolution of prices. Neural networks accommodate variables that can impact asset prices in indirect, nonlinear ways, which cannot easily be incorporated into traditional econometric models.
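The state-space half of such a hybrid can be illustrated with a one-dimensional local-level Kalman filter. This is a simplified stand-in for the price model in the abstract; the noise variances and the synthetic data are illustrative assumptions.

```python
import numpy as np

def kalman_level(y, q=0.01, r=1.0):
    """Local-level Kalman filter: x_t = x_{t-1} + w_t, y_t = x_t + v_t,
    with Var(w) = q and Var(v) = r (values here are illustrative)."""
    x, P = y[0], 1.0
    est = []
    for obs in y:
        P = P + q                  # time update (predict)
        K = P / (P + r)            # Kalman gain
        x = x + K * (obs - x)      # measurement update
        P = (1.0 - K) * P
        est.append(x)
    return np.array(est)

rng = np.random.default_rng(0)
y = 5.0 + rng.normal(0.0, 1.0, 200)   # noisy observations of a constant level
est = kalman_level(y)                 # filtered estimate converges toward 5
```

In the hybrid scheme described above, the filtered state (or its innovations) would then be passed to an ANN together with exogenous variables such as exchange and interest rates.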
Abstract:
Recently, the development of industrial processes has given rise to technologically complex systems. This development created a need for research on mathematical techniques capable of dealing with design complexity and validation. Fuzzy models have received particular attention in the area of nonlinear systems identification and analysis due to their capacity to approximate nonlinear behavior and deal with uncertainty. A fuzzy rule-based model suitable for approximating many systems and functions is the Takagi-Sugeno (TS) fuzzy model. TS fuzzy models are nonlinear systems described by a set of if-then rules that give local linear representations of an underlying system; such models can approximate a wide class of nonlinear systems. In this paper, a performance analysis of a TS fuzzy inference system for the calibration of electronic compass devices is considered. The contribution of the evaluated TS fuzzy inference system is to reduce the error obtained in data acquisition from a digital electronic compass. For the reliable operation of the TS fuzzy inference system, adequate error measurements must be taken, and the error noise must be filtered before the TS fuzzy inference system is applied. The proposed method demonstrated an effectiveness of 57% in reducing the total error in the tests considered. (C) 2011 Elsevier Ltd. All rights reserved.
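The rule-blending mechanism that makes TS models universal approximators can be sketched in a few lines. The two rules below approximate y = |x| and are purely illustrative; they are not the compass-calibration rule base of the paper.

```python
import numpy as np

def ts_infer(x, centers, sigma, consequents):
    """First-order Takagi-Sugeno inference: Gaussian premise memberships,
    linear consequents y_i = a_i*x + b_i, output = membership-weighted mean."""
    w = np.exp(-0.5 * ((x - centers) / sigma) ** 2)
    local = consequents[:, 0] * x + consequents[:, 1]
    return float((w * local).sum() / w.sum())

# Toy rule base approximating y = |x|:
#   rule 1 (premise centered at -1): y = -x
#   rule 2 (premise centered at +1): y =  x
centers = np.array([-1.0, 1.0])
consequents = np.array([[-1.0, 0.0], [1.0, 0.0]])
f = lambda x: ts_infer(x, centers, 0.5, consequents)
```

Away from the overlap region one local linear model dominates; near x = 0 the output blends both, which is exactly the "local linear representations" property the abstract describes.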
Abstract:
The thermodynamic assessment of the Al2O3-MnO pseudo-binary system has been carried out using an ionic model. By applying the electro-neutrality principle together with the constitutive relations between the site fractions of the species on each sub-lattice, the thermodynamic description of each solid phase was determined, making a description of the solubility possible. Based on these thermodynamic descriptions, together with thermo-chemical data obtained from the literature, the Gibbs energy functions of each phase of the Al2O3-MnO system were optimized with the support of the PARROT module of the Thermo-Calc package. A thermodynamic database was obtained, in agreement with the thermo-chemical data extracted from the literature, that describes the Al2O3-MnO system, including the solubility of the solid phases. (C) 2009 Elsevier Ltd. All rights reserved.
Abstract:
This work discusses a 4D lung reconstruction method from unsynchronized MR sequential images. The lung, unlike the heart, has no muscles of its own, making it impossible to observe its real movements, and visualization of the lung in motion is a current topic of research in medicine. CT (Computerized Tomography) can obtain spatio-temporal images of the heart by synchronizing with electrocardiographic waves, but the FOV of the heart is small compared to that of the lung, and the lung's movement is not periodic and is susceptible to variations in the degree of respiration. Compared to CT, MR (Magnetic Resonance) imaging involves longer acquisition times, and it is not possible to obtain instantaneous 3D images of the lung: for each slice, only one temporal sequence of 2D images can be obtained. However, methods using MR are preferable because they do not involve radiation. In this paper, an animated B-Rep solid model of the lung is created from unsynchronized MR images. The 3D animation represents the lung's motion associated with one selected sequence of MR images. The proposed method can be divided into two parts. First, the lung silhouettes moving in time are extracted by detecting the presence of a respiratory pattern in 2D spatio-temporal MR images. This approach enables us to determine the lung's silhouette for every frame, even on frames with obscure edges. The extracted silhouettes are unsynchronized sagittal and coronal silhouettes. Using our algorithm, it is possible to reconstruct a 3D lung starting from a silhouette of either type (coronal or sagittal) selected from any instant in time. A wire-frame model of the lung is created by composing coronal and sagittal planar silhouettes representing cross-sections. The silhouette composition is severely underconstrained: many wire-frame models can be created from the observed sequences of silhouettes in time. Finally, a B-Rep solid model is created using a meshing algorithm.
Using the B-Rep solid model, the volumes of the right and left lungs over time were calculated, and it was possible to recognize several characteristics of the real 3D right and left lungs in the shaded model. (C) 2007 Elsevier Ltd. All rights reserved.
Abstract:
Since computer viruses pose a serious problem to individual and corporate computer systems, a lot of effort has been dedicated to studying how to avoid their deleterious actions, trying to create anti-virus programs that act as vaccines in personal computers or in strategic network nodes. Another way to combat virus propagation is to establish preventive policies based on the whole operation of a system, which can be modeled with population models similar to those used in epidemiological studies. Here, a modified version of the SIR (Susceptible-Infected-Removed) model is presented, and it is explained how its parameters are related to network characteristics. Then, disease-free and endemic equilibrium points are calculated, stability and bifurcation conditions are derived, and some numerical simulations are shown. The relations among the model parameters in the several bifurcation conditions allow a network design that minimizes virus risks. (C) 2009 Elsevier Inc. All rights reserved.
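The threshold behaviour behind such bifurcation conditions can be illustrated with the classic SIR equations (the unmodified baseline, not the paper's network-aware version; the rates below are illustrative).

```python
def simulate_sir(beta, gamma, s0=0.99, i0=0.01, dt=0.01, steps=20000):
    """Forward-Euler integration of the classic SIR model:
    s' = -beta*s*i,  i' = beta*s*i - gamma*i,  r = 1 - s - i."""
    s, i = s0, i0
    for _ in range(steps):
        ds = -beta * s * i
        di = beta * s * i - gamma * i
        s, i = s + dt * ds, i + dt * di
    return s, i, 1.0 - s - i

# Above threshold (R0 = beta/gamma = 5): a large outbreak sweeps the network.
s_epi, i_epi, r_epi = simulate_sir(beta=0.5, gamma=0.1)
# Below threshold (R0 = 0.5): the infection dies out on its own.
s_free, i_free, r_free = simulate_sir(beta=0.05, gamma=0.1)
```

The basic reproduction number R0 = beta/gamma separates the disease-free regime from the endemic/outbreak regime, which is the kind of parameter relation the paper exploits for network design.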
Abstract:
This work deals with a procedure for model re-identification of a process in closed loop with an already existing commercial MPC. The controller considered here has a two-layer structure in which the upper layer performs a target calculation based on a simplified steady-state optimization of the process. A methodology is proposed whereby a test signal is introduced in a tuning parameter of the target calculation layer. When the outputs are controlled inside zones instead of at fixed set points, the approach allows continuous operation of the process without excessive disruption of the operating objectives, since process constraints and product specifications remain satisfied during the identification test. The application of the method is illustrated through the simulation of two processes from the oil refining industry. (c) 2008 Elsevier Ltd. All rights reserved.
Abstract:
Owing to its toxicity, aluminum (Al), one of the most abundant metals, inhibits the productivity of many crops and affects microbial metabolism. The aim of this work was to investigate the capacity of sugar cane vinasse to mitigate the adverse effects of Al on cell growth, viability, and budding, as the likely result of a chelating action. For this purpose, Fleischmann's yeast (Saccharomyces cerevisiae) was used in growth tests performed in 125-mL Erlenmeyer flasks containing 30 mL of YED medium (5.0 g/L yeast extract plus 20 g/L glucose) supplemented with selected amounts of either vinasse or Al in the form of AlCl3·6H2O. Without vinasse, the addition of increasing levels of Al up to 54 mg/L reduced the specific growth rate by 18%, whereas no significant reduction was observed in its presence. The toxic effect of Al on S. cerevisiae growth and the mitigating effect of sugar cane vinasse were quantified by the exponential model of Ciftci et al. (Biotechnol Bioeng 25:2007-2023, 1983). Cell viability decreased from 97.7% at the start to 84.0% at the end of runs without vinasse and to 92.3% with vinasse. On the other hand, cell budding increased from 7.62% at the start to 8.84% at the end of runs without vinasse and to 17.8% with vinasse. These results demonstrate the ability of this raw material to stimulate cell growth and mitigate the toxic effect of Al.
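The kind of fit involved can be sketched with a generic exponential inhibition law, mu(C) = mu0·exp(-C/K). This is a hedged stand-in, not the exact Ciftci et al. formulation; only the reported 18% reduction at 54 mg/L of Al is taken from the abstract, and K is back-calculated from it.

```python
import math

# Back-calculate the inhibition constant K from the reported data point:
# mu(54)/mu0 = 0.82  =>  K = 54 / ln(1/0.82)   (assumed exponential law)
K = 54.0 / math.log(1.0 / 0.82)

def mu(C, mu0=1.0):
    """Relative specific growth rate at Al concentration C (mg/L),
    under the assumed exponential inhibition law."""
    return mu0 * math.exp(-C / K)
```

Under this law, the vinasse runs (no significant reduction observed) would correspond to a much larger effective K, i.e. a flatter inhibition curve.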
Abstract:
A bathtub-shaped failure rate function is very useful in survival analysis and reliability studies, but the well-known lifetime distributions do not have this property. For the first time, we propose a location-scale regression model based on the logarithm of an extended Weibull distribution, which is able to accommodate bathtub-shaped failure rate functions. We use the method of maximum likelihood to estimate the model parameters, and some inferential procedures are presented. We reanalyze a real data set under the new model and the log-modified Weibull regression model. We perform a model check based on martingale-type residuals and generated envelopes, and use the AIC and BIC statistics to select appropriate models. (C) 2009 Elsevier B.V. All rights reserved.
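What "bathtub-shaped" means for an extended Weibull can be seen numerically. The example uses Chen's (2000) two-parameter distribution as a representative extended Weibull with this property; it is not necessarily the distribution of the paper, and the parameter values are illustrative.

```python
import numpy as np

def chen_hazard(t, lam=0.5, beta=0.4):
    """Hazard of Chen's two-parameter distribution,
    h(t) = lam * beta * t**(beta-1) * exp(t**beta);
    bathtub-shaped whenever beta < 1."""
    return lam * beta * t ** (beta - 1.0) * np.exp(t ** beta)

t = np.linspace(0.05, 6.0, 600)
h = chen_hazard(t)
k = int(np.argmin(h))   # an interior minimum confirms the bathtub shape
```

The hazard first falls (infant mortality), bottoms out, then rises (wear-out), which is exactly the shape the standard Weibull cannot produce.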
Abstract:
In a sample of censored survival times, the presence of an immune proportion of individuals who are not subject to death, failure or relapse may be indicated by a relatively high number of individuals with large censored survival times. In this paper, the generalized log-gamma model is modified to allow for the possibility that long-term survivors are present in the data. The model attempts to separately estimate the effects of covariates on the surviving fraction, that is, the proportion of the population for which the event never occurs. The logistic function is used for the regression model of the surviving fraction. Inference for the model parameters is considered via maximum likelihood. Some influence methods, such as the local influence and the total local influence of an individual, are derived, analyzed and discussed. Finally, a data set from the medical area is analyzed under the generalized log-gamma mixture model. A residual analysis is performed in order to select an appropriate model.
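The cure-fraction structure described above can be written as a mixture survival function, S(t) = pi + (1 - pi)·S0(t). The sketch below uses a Weibull S0 for the susceptible fraction purely for illustration, standing in for the generalized log-gamma of the paper; pi and the Weibull parameters are hypothetical.

```python
import numpy as np

def mixture_surv(t, pi, shape=1.5, scale=10.0):
    """Mixture cure model: S(t) = pi + (1 - pi) * S0(t), where pi is the
    immune (cured) fraction and S0 is the survival of susceptibles
    (a Weibull here, for illustration only)."""
    s0 = np.exp(-(t / scale) ** shape)
    return pi + (1.0 - pi) * s0

t = np.linspace(0.0, 100.0, 5)
s = mixture_surv(t, pi=0.3)   # the curve plateaus at the cure fraction
```

The plateau at pi, rather than a decay to zero, is what produces the "relatively high number of large censored survival times" the abstract mentions.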
Abstract:
Interval-censored survival data, in which the event of interest is not observed exactly but is only known to occur within some time interval, arise very frequently. In some situations, event times may be censored into different, possibly overlapping intervals of variable widths; in other situations, information is available for all units at the same observed visit times. In the latter case, interval-censored data are termed grouped survival data. Here we present alternative approaches for analyzing interval-censored data, and we illustrate these techniques using a survival data set involving mango tree lifetimes, an example of grouped survival data.
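The likelihood idea common to these approaches is that each unit contributes F(R) - F(L), the probability of failing inside its interval (L, R]. A minimal sketch with an exponential model and synthetic grouped data (distribution, rate and grouping scheme are all assumptions, not the mango data):

```python
import numpy as np

rng = np.random.default_rng(3)
true_rate = 0.5
t = rng.exponential(1.0 / true_rate, 400)      # latent exact event times
# Group each time into a unit-width interval (floor(t), floor(t) + 1]
L, R = np.floor(t), np.floor(t) + 1.0

def loglik(lam):
    """Interval-censored exponential log-likelihood: sum of
    log(F(R) - F(L)) = log(exp(-lam*L) - exp(-lam*R))."""
    return float(np.sum(np.log(np.exp(-lam * L) - np.exp(-lam * R))))

# Simple grid-search MLE for the rate parameter
grid = np.linspace(0.05, 2.0, 400)
lam_hat = float(grid[np.argmax([loglik(l) for l in grid])])
```

Even though no exact event time is ever used, the interval likelihood recovers the rate; the same construction generalizes to regression models for grouped data.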
Abstract:
Joint generalized linear models and double generalized linear models (DGLMs) were designed to model outcomes for which the variability can be explained using factors and/or covariates. When such factors operate, the usual normal regression models, which inherently exhibit constant variance, under-represent the variation in the data and hence may lead to erroneous inferences. For count and proportion data, such noise factors can generate a so-called overdispersion effect, and the use of binomial and Poisson models underestimates the variability and, consequently, incorrectly indicates significant effects. In this manuscript, we propose a DGLM from a Bayesian perspective, focusing on the case of proportion data, where the overdispersion can be modeled using a random effect that depends on some noise factors. The posterior joint density function was sampled using Markov chain Monte Carlo (MCMC) algorithms, allowing inferences over the model parameters. An application to a data set on apple tissue culture is presented, for which it is shown that the Bayesian approach is quite feasible, even when limited prior information is available, thereby generating valuable insight for the researcher about the experimental results.
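The MCMC machinery involved can be illustrated at toy scale: a random-walk Metropolis sampler on the logit of a single binomial proportion with a vague normal prior. This is a deliberately stripped-down stand-in for sampling a full DGLM posterior; the data (30 successes in 100 trials) and tuning constants are hypothetical.

```python
import numpy as np

def metropolis_logit(y, n, draws=5000, step=0.3, seed=0):
    """Random-walk Metropolis on eta = logit(p) for binomial data,
    with a vague Normal(0, 10^2) prior on eta."""
    rng = np.random.default_rng(seed)
    def logpost(eta):
        p = 1.0 / (1.0 + np.exp(-eta))
        return y * np.log(p) + (n - y) * np.log1p(-p) - eta ** 2 / 200.0
    eta, chain = 0.0, []
    for _ in range(draws):
        prop = eta + step * rng.normal()
        if np.log(rng.random()) < logpost(prop) - logpost(eta):
            eta = prop                        # accept the proposal
        chain.append(eta)
    return np.array(chain)

chain = metropolis_logit(y=30, n=100)
# Posterior mean of p after discarding a burn-in period
p_hat = float((1.0 / (1.0 + np.exp(-chain[1000:]))).mean())
```

In the DGLM of the paper, the same accept/reject logic runs jointly over the mean and dispersion parameters, with the random effect absorbing the overdispersion.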
Abstract:
The application of airborne laser scanning (ALS) technologies in forest inventories has shown great potential to improve the efficiency of forest planning activities. Precise estimates, fast assessment and relatively low complexity explain the good results in terms of efficiency. The evolution of GPS and inertial measurement technologies, as well as the lower assessment costs observed when these technologies are applied to large-scale studies, explain the increasing dissemination of ALS technologies. The good quality of the results can be expressed by estimates of volume and basal area with estimated errors below 8.4%, depending on the size of the sampled area, the quantity of laser pulses per square meter and the number of control plots. This paper analyzes the potential of an ALS assessment to produce certain forest inventory statistics in plantations of cloned Eucalyptus spp with precision equal or superior to conventional methods. The statistics of interest were: volume, basal area, mean height and mean height of dominant trees. The ALS flight for data assessment covered two strips of approximately 2 × 20 km, in which clouds of points were sampled in circular plots with a radius of 13 m. Plots were sampled in different parts of the strips to cover different stand ages. From the clouds of points generated by the ALS assessment, the following statistics were calculated: overall mean height, standard error, five percentiles (the heights below which 10%, 30%, 50%, 70% and 90% of the ALS points above ground level fall), and the density of points above ground level in each percentile. The ALS statistics were used in regression models to estimate mean diameter, mean height, mean height of dominant trees, basal area and volume. Conventional forest inventory sample plots provided the reference data.
For volume, an exploratory assessment involving different combinations of ALS statistics allowed the definition of the most promising relationships and fitting tests based on well-known forest biometric models. The models based on ALS statistics that produced the best results involved: the 30% percentile to estimate mean diameter (R² = 0.88 and MQE% = 0.0004); the 10% and 90% percentiles to estimate mean height (R² = 0.94 and MQE% = 0.0003); the 90% percentile to estimate dominant height (R² = 0.96 and MQE% = 0.0003); the 10% percentile and the mean height of ALS points to estimate basal area (R² = 0.92 and MQE% = 0.0016); and age and the 30% and 90% percentiles to estimate volume (R² = 0.95 and MQE% = 0.002). Among the tested forest biometric models, the best fits were provided by the modified Schumacher model using age and the 90% percentile, the modified Clutter model using age, the mean height of ALS points and the 70% percentile, and the modified Buckman model using age, the mean height of ALS points and the 10% percentile.
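The regression step above amounts to ordinary least squares on plot-level ALS percentiles. A minimal sketch on synthetic stand data (the predictors, coefficients and noise level are invented for illustration; only the idea of regressing volume on a height percentile and age comes from the abstract):

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical stand-level data: 90th height percentile (m) and age (yr)
p90 = rng.uniform(10.0, 35.0, 60)
age = rng.uniform(2.0, 7.0, 60)
# Synthetic "true" volume relation plus noise (coefficients are made up)
vol = 4.0 + 8.0 * p90 + 12.0 * age + rng.normal(0.0, 5.0, 60)

# Ordinary least squares: vol ~ 1 + p90 + age
X = np.column_stack([np.ones_like(p90), p90, age])
beta, *_ = np.linalg.lstsq(X, vol, rcond=None)
resid = vol - X @ beta
r2 = 1.0 - (resid ** 2).sum() / ((vol - vol.mean()) ** 2).sum()
```

With real data the same fit would be run for each candidate percentile combination, keeping the specification with the best R² and MQE%, as the exploratory assessment in the paper does.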
Abstract:
Using data from a logging experiment in the eastern Brazilian Amazon region, we develop a matrix growth and yield model that captures the dynamic effects of harvest system choice on forest structure and composition. Multinomial logistic regression is used to estimate the growth transition parameters for a 10-year time step, while a Poisson regression model is used to estimate recruitment parameters. The model is designed to be easily integrated with an economic model of decision-making to perform tropical forest policy analysis. It is used to compare the long-run structure and composition of a stand arising from the choice of implementing either conventional logging techniques or more carefully planned and executed reduced-impact logging (RIL) techniques, contrasted against a baseline projection of an unlogged forest. Results from log-and-leave scenarios show that a stand logged according to Brazilian management requirements will require well over 120 years to recover its initial commercial volume, regardless of the logging technique employed; implementing RIL, however, accelerates this recovery. Scenarios imposing a 40-year cutting cycle raise the possibility of sustainable harvest volumes, although at significantly lower levels than is implied by current regulations. Meeting current Brazilian forest policy goals may require an increase in the planned total area of permanent production forest or the widespread adoption of silvicultural practices that increase stand recovery and volume accumulation rates after RIL harvests. Published by Elsevier B.V.
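A matrix growth model projects a vector of stems per diameter class with a transition matrix plus a recruitment vector. The three-class example below is purely illustrative (the transition probabilities, recruitment and initial stand are invented; the real model estimates these via multinomial logistic and Poisson regression).

```python
import numpy as np

# Hypothetical transitions over one 10-year step for 3 diameter classes.
# Column sums are < 1; the remainder is mortality in each class.
G = np.array([[0.80, 0.00, 0.00],    # stay in class 1
              [0.15, 0.85, 0.00],    # grow from class 1 into class 2
              [0.00, 0.10, 0.95]])   # grow from class 2 into class 3
r = np.array([25.0, 0.0, 0.0])       # recruitment into the smallest class

def project(n, steps):
    """Iterate the stand vector: n_{t+1} = G n_t + r."""
    for _ in range(steps):
        n = G @ n + r
    return n

n0 = np.array([100.0, 50.0, 20.0])   # hypothetical initial stems/ha
n_eq = project(n0, 200)              # long-run (steady-state) structure
```

The long-run stand satisfies n* = (I - G)⁻¹ r, which for these numbers is (125, 125, 250); harvest scenarios are analyzed by perturbing n and re-projecting, as the paper does for conventional versus RIL harvests.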
Abstract:
Using a dynamic systems model specifically developed for the Piracicaba, Capivari and Jundiaí River Water Basins (BH-PCJ) as a tool to help policy and decision makers analyze water resources management alternatives, five simulations over a 50-year timeframe were performed. The model estimates water supply and demand, as well as the wastewater generated by the consumers at BH-PCJ. A run was performed keeping the mean precipitation value constant and maintaining the current water supply and demand rates, the business-as-usual scenario. Under these assumptions, water demand is expected to increase by about 76%, about 39% of the available water volume will come from wastewater reuse, and the waste load will increase by about 91%. The Falkenmark Index will change from 1,403 m³ person⁻¹ year⁻¹ in 2004 to 734 m³ person⁻¹ year⁻¹ by 2054, and the Sustainability Index from 0.44 to 0.20. Another four simulations were performed by scaling the annual precipitation to 90% and 110% of its mean, considering an ecological flow equal to 30% of the mean daily flow, and keeping the same rates for all other factors except the ecological flow and household water consumption. All of them showed a tendency toward a water crisis at BH-PCJ in the near future.
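The business-as-usual demand trajectory can be reproduced arithmetically. Only the roughly 76% growth over 50 years comes from the abstract; the demand index base of 100 and the constant annual growth rate are our simplifying assumptions, not part of the stock-and-flow model itself.

```python
# Constant annual growth rate that compounds to +76% over 50 years:
# (1 + g)^50 = 1.76  =>  g = 1.76**(1/50) - 1  (about 1.14% per year)
years = 50
growth = 1.76 ** (1.0 / years) - 1.0
demand = 100.0                       # water-demand index, base year = 100
for _ in range(years):
    demand *= 1.0 + growth           # one simulated year per step
```

In the full model this flow interacts with supply, reuse and waste-load stocks; here it only shows the compounding that drives the basin toward the reported crisis indicators.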