40 results for Interaction modeling. Model-based development. Interaction evaluation.
Abstract:
This contribution describes the development of a continuous emulsion copolymerization process for vinyl acetate and n-butyl acrylate in a tubular reactor. Special features of this reactor include the use of oscillatory (pulsed) flow and internals (sieve plates) to prevent polymer fouling and promote good radial mixing, along with a controlled amount of axial mixing. The copolymer system studied (vinyl acetate and butyl acrylate) is strongly prone to composition drift due to very different reactivity ratios. An axially dispersed plug flow model, based on classical free radical copolymerization kinetics, was developed for this process and used successfully to optimize the lateral feeding profile to reduce compositional drift. An energy balance was included in the model equations to predict the effect of temperature variations on the process. The model predictions were validated with experimental data for monomer conversion, copolymer composition, average particle size, and temperature measured along the reactor length.
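The composition drift described above can be illustrated with the classical Mayo-Lewis (instantaneous copolymer composition) equation from free radical copolymerization kinetics. This is a generic sketch, not the paper's reactor model; the reactivity ratios below are typical literature-style values for vinyl acetate (1) / butyl acrylate (2), used here for illustration only.

```python
# Instantaneous copolymer composition (Mayo-Lewis equation).
# f1: mole fraction of monomer 1 in the feed; r1, r2: reactivity ratios.
def mayo_lewis(f1, r1, r2):
    f2 = 1.0 - f1
    num = r1 * f1**2 + f1 * f2
    den = r1 * f1**2 + 2.0 * f1 * f2 + r2 * f2**2
    return num / den  # F1: instantaneous fraction of monomer 1 in the copolymer

# Illustrative reactivity ratios for vinyl acetate (1) / butyl acrylate (2):
r_vac, r_ba = 0.04, 5.5
for f1 in (0.2, 0.5, 0.8):
    print(f"feed f1 = {f1:.1f}  ->  copolymer F1 = {mayo_lewis(f1, r_vac, r_ba):.3f}")
```

Because F1 stays far below f1, the copolymer is strongly enriched in butyl acrylate, which is why lateral feeding of the faster-reacting monomer can be optimized to counteract the drift.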
Abstract:
Oxidation processes can be used to treat industrial wastewater containing non-biodegradable organic compounds. However, the presence of dissolved salts may inhibit or retard the treatment process. In this study, wastewater desalination by electrodialysis (ED) associated with an advanced oxidation process (photo-Fenton) was applied to an aqueous NaCl solution containing phenol. The influence of process variables on the demineralization factor was investigated for ED at pilot scale, and a correlation was obtained between the phenol, salt, and water fluxes and the driving force. The oxidation process was investigated in a laboratory batch reactor, and a model based on artificial neural networks was developed by fitting the experimental data, describing the reaction rate as a function of the input variables. With the experimental parameters of both processes, a dynamic model was developed for ED and a continuous model, using a plug flow reactor approach, for the oxidation process. Finally, the hybrid model simulation could validate different scenarios of the integrated system and can be used for process optimization.
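A network of the kind used above to fit a reaction rate can be sketched as a single hidden layer mapping scaled inputs to a predicted rate. The weights below are random placeholders, not fitted values from this study, and the choice of inputs is hypothetical.

```python
import numpy as np

# Minimal one-hidden-layer network for a reaction-rate surrogate model.
# In practice W1, b1, W2, b2 would be obtained by fitting experimental data.
def ann_rate(x, W1, b1, W2, b2):
    h = np.tanh(W1 @ x + b1)        # hidden layer with tanh activation
    return float(W2 @ h + b2)       # linear output: predicted reaction rate

rng = np.random.default_rng(0)
n_in, n_hidden = 3, 4               # e.g. phenol conc., oxidant conc., temperature
W1 = rng.normal(size=(n_hidden, n_in))
b1 = rng.normal(size=n_hidden)
W2 = rng.normal(size=n_hidden)
b2 = 0.0
x = np.array([1.0, 0.5, 0.3])       # illustrative scaled inputs
print("predicted rate:", ann_rate(x, W1, b1, W2, b2))
```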
Abstract:
Among several process variability sources, valve friction and inadequate controller tuning are supposed to be two of the most prevalent. Friction quantification methods can be applied to the development of model-based compensators or to diagnose valves that need repair, whereas accurate process models can be used in controller retuning. This paper extends existing methods that jointly estimate the friction and process parameters, so that a nonlinear structure is adopted to represent the process model. The developed estimation algorithm is tested with three different data sources: a simulated first order plus dead time process, a hybrid setup (composed of a real valve and a simulated pH neutralization process) and three industrial datasets corresponding to real control loops. The results demonstrate that the friction is accurately quantified and that "good" process models are estimated in several situations. Furthermore, when a nonlinear process model is considered, the proposed extension presents significant advantages: (i) greater accuracy for friction quantification and (ii) reasonable estimates of the nonlinear steady-state characteristics of the process. (C) 2010 Elsevier Ltd. All rights reserved.
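The stick-slip behavior that friction quantification methods try to capture can be sketched with a simple one-parameter valve model: the stem holds its last position until the controller output moves more than a friction band `d` away from it. This is a generic simplification for illustration, not the estimation method proposed in the paper.

```python
# One-parameter stick-slip valve model: the stem sticks until the control
# signal u escapes the friction band d around the current stem position x.
def sticky_valve(u_seq, d, x0=0.0):
    x, out = x0, []
    for u in u_seq:
        if abs(u - x) > d:   # static friction overcome: stem jumps to input
            x = u
        out.append(x)        # otherwise the stem holds its last position
    return out

u = [0.0, 0.2, 0.4, 0.6, 0.55, 0.5, 0.45, 0.2]
print(sticky_valve(u, d=0.15))
```

The flat segments in the output while the input reverses are the signature that friction estimators exploit when quantifying `d` from closed-loop data.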
Abstract:
Colletotrichum gossypii var. cephalosporioides, the fungus that causes ramulosis disease of cotton, is widespread in Brazil and can cause severe yield loss. Because weather conditions greatly affect disease development, the objective of this work was to develop weather-based models to assess disease favorability. Latent period, incidence, and severity of ramulosis symptoms were evaluated in controlled environment experiments using factorial combinations of temperature (15, 20, 25, 30, and 35 degrees C) and leaf wetness duration (0, 4, 8, 16, 32, and 64 h after inoculation). Severity was modeled as an exponential function of leaf wetness duration and temperature. At the optimum temperature of disease development, 27 degrees C, average latent period was 10 days. Maximum ramulosis severity occurred from 20 to 30 degrees C, with sharp decreases at lower and higher temperatures. Ramulosis severity increased as wetness periods were increased from 4 to 32 h. In field experiments at Piracicaba, Sao Paulo State, Brazil, cotton plots were inoculated (10(5) conidia ml(-1)) and ramulosis severity was evaluated weekly. The model obtained from the controlled environment study was used to generate a disease favorability index for comparison with disease progress rate in the field. Hourly measurements of solar radiation, temperature, relative humidity, leaf wetness duration, rainfall, and wind speed were also evaluated as possible explanatory variables. Both the disease favorability model and a model based on rainfall explained ramulosis growth rate well, with R(2) of 0.89 and 0.91, respectively. They are proposed as models of ramulosis development rate on cotton in Brazil, and weather-disease relationships revealed by this work can form the basis of a warning system for ramulosis development.
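A model of the class described above, severity as a function of leaf wetness duration and temperature, can be sketched as a temperature response peaking near the reported 27 degrees C optimum, scaled by a saturating exponential in wetness hours. The functional form and every parameter value below are illustrative, not the fitted model from this work.

```python
import math

# Beta-type temperature response, equal to 1 at the optimum Topt.
def temp_response(T, Tmin=10.0, Topt=27.0, Tmax=37.0):
    if not (Tmin < T < Tmax):
        return 0.0
    p = (Tmax - Topt) / (Topt - Tmin)
    return ((T - Tmin) / (Topt - Tmin)) * ((Tmax - T) / (Tmax - Topt)) ** p

# Severity: temperature response times a saturating exponential in wetness.
def severity(T, wetness_h, k=0.08):
    return temp_response(T) * (1.0 - math.exp(-k * wetness_h))

for T in (15, 20, 27, 30, 35):
    print(T, "C, 32 h wetness ->", round(severity(T, 32), 3))
```

This reproduces the qualitative pattern in the abstract: severity is maximal near 27 degrees C, falls off sharply at the temperature extremes, and increases with wetness duration up to saturation.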
Abstract:
Overcommitment of development capacity or development resource deficiencies are important problems in new product development (NPD). Existing approaches to development resource planning have largely neglected the issue of resource magnitude required for NPD. This research aims to fill the void by developing a simple higher-level aggregate model based on an intuitive idea: The number of new product families that a firm can effectively undertake is bound by the complexity of its products or systems and the total amount of resources allocated to NPD. This study examines three manufacturing companies to verify the proposed model. The empirical results confirm the study's initial hypothesis: The more complex the product family, the smaller the number of product families that are launched per unit of revenue. Several suggestions and implications for managing NPD resources are discussed, such as how this study's model can establish an upper limit for the capacity to develop and launch new product families.
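The aggregate idea above can be reduced to a back-of-the-envelope calculation: the number of product families a firm can sustain is bounded by its total NPD resources divided by the complexity-dependent cost of one family. All numbers below are hypothetical and serve only to show the direction of the relationship.

```python
# Upper bound on concurrent product families under a fixed NPD budget.
# cost_per_family_base: resource cost of one family at unit complexity;
# complexity_factor: multiplier for more complex products/systems.
def max_product_families(npd_budget, cost_per_family_base, complexity_factor):
    cost_per_family = cost_per_family_base * complexity_factor
    return int(npd_budget // cost_per_family)

print(max_product_families(100.0, 5.0, 1.0))  # simpler products
print(max_product_families(100.0, 5.0, 4.0))  # more complex products
```

As in the study's hypothesis, raising complexity while holding resources fixed shrinks the feasible number of families.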
Abstract:
This paper uses a fully operational inter-regional computable general equilibrium (CGE) model implemented for the Brazilian economy, based on previous work by Haddad and Hewings, in order to assess the likely economic effects of road transportation policy changes in Brazil. Among the features embedded in this framework, modelling of external scale economies and transportation costs provides an innovative way of dealing explicitly with theoretical issues related to integrated regional systems. The model is calibrated for 109 regions. The explicit modelling of transportation costs built into the inter-regional CGE model, based on origin-destination flows, which takes into account the spatial structure of the Brazilian economy, creates the capability of integrating the inter-regional CGE model with a geo-coded transportation network model, enhancing the potential of the framework in understanding the role of infrastructure on regional development. The transportation model used is the so-called Highway Development and Management model, developed by the World Bank, implemented using the software TransCAD. Further extensions of the current model specification for integrating other features of transport planning in a continental industrialising country like Brazil are discussed, with the goal of building a bridge between conventional transport planning practices and the innovative use of CGE models. In order to illustrate the analytical power of the integrated system, the authors present a set of simulations, which evaluate the ex ante economic impacts of physical/qualitative changes in the Brazilian road network (for example, a highway improvement), in accordance with recent policy developments in Brazil. Rather than providing a critical evaluation of this debate, they intend to emphasise the likely structural impacts of such policies. They expect that the results will reinforce the need to better specify spatial interactions in inter-regional CGE models.
Abstract:
FS CMa type stars are a recently described group of objects with the B[e] phenomenon which exhibit strong emission-line spectra and strong IR excesses. In this paper, we report the first attempt at a detailed modeling of IRAS 00470+6429, for which we have the best set of observations. Our modeling is based on two key assumptions: the star has a main-sequence luminosity for its spectral type (B2) and the circumstellar (CS) envelope is bimodal, composed of a slowly outflowing disklike wind and a fast polar wind. Both outflows are assumed to be purely radial. We adopt a novel approach to describe the dust formation site in the wind that employs timescale arguments for grain condensation and a self-consistent solution for the dust destruction surface. With the above assumptions we were able to satisfactorily reproduce many observational properties of IRAS 00470+6429, including the H I line profiles and the overall shape of the spectral energy distribution. Our adopted recipe for dust formation proved successful in reproducing the correct amount of dust formed in the CS envelope. Possible shortcomings of our model, as well as suggestions for future improvements, are discussed.
Abstract:
The evolution of commodity computing led to the possibility of efficient usage of interconnected machines to solve computationally-intensive tasks, which were previously solvable only by using expensive supercomputers. This, however, required new methods for process scheduling and distribution, considering the network latency, communication cost, heterogeneous environments and distributed computing constraints. An efficient distribution of processes over such environments requires an adequate scheduling strategy, as the cost of inefficient process allocation is unacceptably high. Therefore, knowledge and prediction of application behavior are essential to perform effective scheduling. In this paper, we overview the evolution of scheduling approaches, focusing on distributed environments. We also evaluate the current approaches for process behavior extraction and prediction, aiming at selecting an adequate technique for online prediction of application execution. Based on this evaluation, we propose a novel model for application behavior prediction, considering chaotic properties of such behavior and the automatic detection of critical execution points. The proposed model is applied and evaluated for process scheduling in cluster and grid computing environments. The obtained results demonstrate that prediction of the process behavior is essential for efficient scheduling in large-scale and heterogeneous distributed environments, outperforming conventional scheduling policies by a factor of 10, and even more in some cases. Furthermore, the proposed approach proves to be efficient for online predictions due to its low computational cost and good precision. (C) 2009 Elsevier B.V. All rights reserved.
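One simple, generic way to forecast process behavior with chaotic traits, in the spirit of the prediction problem above though not the model proposed in the paper, is nearest-neighbor prediction over delay-embedded history: find past windows that resemble the current one and return the value that followed them.

```python
# Nearest-neighbour predictor over delay vectors of length `dim`.
def knn_predict(series, dim, k=1):
    # pair each length-`dim` window with the value that followed it
    train = [(tuple(series[i:i + dim]), series[i + dim])
             for i in range(len(series) - dim)]
    query = tuple(series[-dim:])
    dist = lambda w: sum((a - b) ** 2 for a, b in zip(w, query))
    nearest = sorted(train, key=lambda t: dist(t[0]))[:k]
    return sum(v for _, v in nearest) / k

# Example signal: the logistic map, a standard chaotic test case.
x, xs = 0.3, []
for _ in range(200):
    xs.append(x)
    x = 3.9 * x * (1 - x)
print("predicted next value:", knn_predict(xs, dim=3))
```

Its appeal for online scheduling-style use is the one named in the abstract: low computational cost per prediction, since no model has to be retrained as new observations arrive.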
Abstract:
In interval-censored survival data, the event of interest is not observed exactly but is only known to occur within some time interval. Such data appear very frequently. In this paper, we are concerned only with parametric forms, and so a location-scale regression model based on the exponentiated Weibull distribution is proposed for modeling interval-censored data. We show that the proposed log-exponentiated Weibull regression model for interval-censored data represents a parametric family of models that includes other regression models that are broadly used in lifetime data analysis. Assuming the use of interval-censored data, we employ a frequentist analysis, a jackknife estimator, a parametric bootstrap and a Bayesian analysis for the parameters of the proposed model. We derive the appropriate matrices for assessing local influences on the parameter estimates under different perturbation schemes and present some ways to assess global influences. Furthermore, for different parameter settings, sample sizes and censoring percentages, various simulations are performed; in addition, the empirical distributions of some modified residuals are displayed and compared with the standard normal distribution. These studies suggest that the residual analysis usually performed in normal linear regression models can be straightforwardly extended to a modified deviance residual in log-exponentiated Weibull regression models for interval-censored data. (C) 2009 Elsevier B.V. All rights reserved.
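The exponentiated Weibull distribution and the interval-censored likelihood it enters can be sketched directly: the CDF is F(t) = [1 - exp(-(t/alpha)^gamma)]^theta, and an observation known only to lie in (L, R] contributes F(R) - F(L) to the likelihood. Parameter values and data below are illustrative, not from the paper.

```python
import math

# Exponentiated Weibull CDF with scale alpha and shape parameters gamma, theta.
def ew_cdf(t, alpha, gamma, theta):
    if t <= 0:
        return 0.0
    return (1.0 - math.exp(-((t / alpha) ** gamma))) ** theta

# Log-likelihood for interval-censored observations (L, R].
def interval_loglik(intervals, alpha, gamma, theta):
    return sum(math.log(ew_cdf(r, alpha, gamma, theta)
                        - ew_cdf(l, alpha, gamma, theta))
               for l, r in intervals)

data = [(0.5, 1.5), (1.0, 2.0), (2.5, 4.0)]   # illustrative (L, R] intervals
print(interval_loglik(data, alpha=2.0, gamma=1.5, theta=1.0))
```

The nesting mentioned in the abstract is visible in the parameterization: theta = 1 recovers the Weibull, and gamma = theta = 1 the exponential distribution.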
Abstract:
The adsorption behavior of several amphiphilic polyelectrolytes of poly(maleic anhydride-alt-styrene) functionalized with naphthyl and phenyl groups onto amino-terminated silicon wafers has been studied by means of null-ellipsometry, atomic force microscopy (AFM) and contact angle measurements. The maximum of adsorption, Gamma(plateau), varies with the ionic strength, the polyelectrolyte structure and the chain length. Values of Gamma(plateau) obtained at low and high ionic strengths indicate that the adsorption follows the "screening-reduced adsorption" regime. Large aggregates were detected in solution by means of dynamic light scattering and fluorescence measurements. However, AFM indicated the formation of smooth layers and the absence of aggregates. A model based on a two-step adsorption behavior was proposed. In the first step, isolated chains in equilibrium with the aggregates in solution adsorb onto the amino-terminated surface. The adsorption is driven by electrostatic interaction between the protonated surface and carboxylate groups. This first layer exposes naphthyl or phenyl groups to the solution. The second layer adsorption is driven by hydrophobic interaction between the surface and chains and exposes carboxylate groups to the medium, which repel forthcoming chains by electrostatic repulsion. Upon drying, some hydrophobic naphthyl or phenyl groups might be oriented to the air, as revealed by contact angle measurements. Such amphiphilic polyelectrolyte layers worked well for the building-up of multilayers with chitosan. (C) 2010 Elsevier Ltd. All rights reserved.
Abstract:
A dynamic atmosphere generator with a naphthalene emission source has been constructed and used for the development and evaluation of a bioluminescence sensor based on the bacteria Pseudomonas fluorescens HK44 immobilized in 2% agar gel (101 cell mL(-1)) placed in sampling tubes. A steady naphthalene emission rate (around 7.3 nmol min(-1) at 27 degrees C and 7.4 mL min(-1) of purified air) was obtained by covering the diffusion unit containing solid naphthalene with a PTFE filter membrane. The time elapsed from gelation of the agar matrix to analyte exposure ("maturation time") was found to be relevant for the bioluminescence assays, being most favorable between 1.5 and 3 h. The maximum light emission, observed after 80 min, is dependent on the analyte concentration and the exposure time (evaluated between 5 and 20 min), but not on the flow rate of naphthalene in the sampling tube over the range of 1.8-7.4 nmol min(-1). A good linear response was obtained between 50 and 260 nmol L(-1), with a limit of detection estimated at 20 nmol L(-1), far below the recommended threshold limit value for naphthalene in air. (c) 2008 Elsevier B.V. All rights reserved.
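The linear-response and detection-limit figures above come from a standard calibration workflow: fit a straight line to signal versus concentration and take LOD = 3 * sigma(blank) / slope. The calibration points and blank standard deviation below are invented for illustration; only the 50-260 nmol L(-1) range mirrors the abstract.

```python
# Least-squares line fit and 3-sigma detection limit for a linear calibration.
def fit_line(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

conc = [50, 100, 150, 200, 260]          # nmol L-1 (illustrative standards)
signal = [12.0, 24.5, 36.8, 49.1, 63.9]  # arbitrary luminescence units
slope, intercept = fit_line(conc, signal)
sd_blank = 1.6                            # assumed blank standard deviation
lod = 3 * sd_blank / slope
print(f"slope = {slope:.4f}, LOD ~ {lod:.1f} nmol L-1")
```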
Abstract:
OBJECTIVES: The complexity and heterogeneity of human bone, as well as ethical issues, almost always hinder the performance of clinical trials. Thus, in vitro studies become an important source of information for the understanding of biomechanical events on implant-supported prostheses, although study results cannot be considered reliable unless validation studies are conducted. The purpose of this work was to validate an artificial experimental model based on its modulus of elasticity, to simulate the performance of human bone in vivo in biomechanical studies of implant-supported prostheses. MATERIAL AND METHODS: In this study, fast-curing polyurethane (F16 polyurethane, Axson) was used to build 40 specimens that were divided into five groups. The following reagent ratios (part A/part B) were used: Group A (0.5/1.0), Group B (0.8/1.0), Group C (1.0/1.0), Group D (1.2/1.0), and Group E (1.5/1.0). A universal testing machine (Kratos model K - 2000 MP) was used to measure modulus of elasticity values by compression. RESULTS: Mean modulus of elasticity values were: Group A - 389.72 MPa, Group B - 529.19 MPa, Group C - 571.11 MPa, Group D - 470.35 MPa, Group E - 437.36 MPa. CONCLUSION: The best mechanical characteristics and modulus of elasticity value comparable to that of human trabecular bone were obtained when A/B ratio was 1:1.
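The quantity measured above, the modulus of elasticity in compression, is simply stress over strain in the elastic region: E = (F/A) / (dL/L0). The test values below are invented for illustration, not measurements from this study.

```python
# Elastic modulus from a compression test: E = stress / strain.
# force_N on a cross-section of area_mm2, compressing a specimen of
# original length L0_mm by delta_L_mm.
def elastic_modulus(force_N, area_mm2, delta_L_mm, L0_mm):
    stress = force_N / area_mm2          # MPa, since N/mm^2 = MPa
    strain = delta_L_mm / L0_mm          # dimensionless
    return stress / strain               # MPa

# e.g. 500 N on a 25 mm^2 cross-section shortening a 20 mm specimen by 0.7 mm
print(round(elastic_modulus(500.0, 25.0, 0.7, 20.0), 1), "MPa")
```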
Abstract:
In this work, the development and evaluation of a hyphenated flow injection-capillary electrophoresis (FIA-CE) system with on-line pre-concentration is described. Preliminary tests were performed to investigate the influence of flow rates on the analytical signals. Results revealed losses in sensitivity of the FIA-CE system when compared to the conventional CE system. To overcome the signal decrease and make the system more efficient, a lower flow rate was set and an anionic resin column was added to the flow manifold in order to pre-concentrate the analyte. The pre-concentration FIA-CE system presented a sensitivity improvement of about 660%, with only a small increase of 8% in total peak dispersion. These results confirm the great potential of the proposed system for many analytical tasks, especially for low-concentration samples.
Abstract:
A compact frequency standard based on an expanding cold (133)Cs cloud is under development in our laboratory. In a first experiment, cold Cs atoms were prepared by a magneto-optical trap in a vapor cell, and a microwave antenna was used to transmit the radiation for the clock transition. The signal obtained from fluorescence of the expanding cold atom cloud is used to lock a microwave chain. In this way the overall system stability is evaluated. A theoretical model based on a two-level system interacting with the two microwave pulses enables interpretation of the observed features, especially the poor Ramsey fringe contrast. (C) 2008 Optical Society of America.
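For an ideal two-level system, the two-pulse Ramsey transition probability is P(delta) = (1/2)(1 + cos(delta T)) for detuning delta and free-evolution time T between the pulses. The sketch below shows only this idealized fringe pattern; the finite-pulse envelope and the cloud-expansion effects that degrade the contrast in the experiment are deliberately omitted, and the 10 ms value of T is illustrative.

```python
import math

# Ideal Ramsey fringe: transition probability vs angular detuning delta
# after two short pulses separated by free-evolution time T (seconds).
def ramsey_probability(delta, T):
    return 0.5 * (1.0 + math.cos(delta * T))

T = 0.01                       # 10 ms free evolution -> 1/T = 100 Hz fringe spacing
for delta_hz in (0, 25, 50):
    delta = 2 * math.pi * delta_hz
    print(delta_hz, "Hz:", round(ramsey_probability(delta, T), 3))
```

Comparing measured fringes against this unit-contrast ideal is what lets the model quantify how much contrast is lost to the expanding cloud and pulse imperfections.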
Abstract:
The aim of this article is to present the main contributions of human resource management to developing sustainable organizations. The relationship between human resources and organizational sustainability, which is based on economic, social and environmental performance, involves some important aspects concerning management such as innovation, cultural diversity and the environment. The integration of items from the triple bottom line approach leads to developing a model based on a strategic and central posture of human resource management. Based on this model, propositions and recommendations for future research on this theme are presented.