147 results for Process Models

in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo (BDPI/USP)


Relevance: 70.00%

Abstract:

Among several sources of process variability, valve friction and inadequate controller tuning are considered two of the most prevalent. Friction quantification methods can be applied to the development of model-based compensators or to diagnose valves that need repair, whereas accurate process models can be used in controller retuning. This paper extends existing methods that jointly estimate the friction and process parameters, so that a nonlinear structure is adopted to represent the process model. The developed estimation algorithm is tested with three different data sources: a simulated first-order-plus-dead-time process, a hybrid setup (composed of a real valve and a simulated pH neutralization process), and three industrial datasets corresponding to real control loops. The results demonstrate that friction is accurately quantified and that "good" process models are estimated in several situations. Furthermore, when a nonlinear process model is considered, the proposed extension presents significant advantages: (i) greater accuracy in friction quantification and (ii) reasonable estimates of the nonlinear steady-state characteristics of the process. (C) 2010 Elsevier Ltd. All rights reserved.
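As a hedged illustration of the process-model half of this problem (not the authors' joint friction-and-process estimator), the sketch below fits a first-order-plus-dead-time model to noisy step-response data by least squares; the parameter values, the noise level and the scipy-based routine are assumptions made only for the example.

```python
# Toy FOPDT identification sketch (illustrative only, not the paper's algorithm):
# fit gain K, time constant tau and dead time theta of the model
#   tau * dy/dt = -y + K * u(t - theta)
# to a noisy unit-step response by nonlinear least squares.
import numpy as np
from scipy.optimize import least_squares

def fopdt_step(t, K, tau, theta):
    """Analytic unit-step response of a first-order-plus-dead-time model."""
    y = np.zeros_like(t)
    active = t > theta
    y[active] = K * (1.0 - np.exp(-(t[active] - theta) / tau))
    return y

# Synthetic "measured" data (assumed true values K=2.0, tau=5.0, theta=1.5).
rng = np.random.default_rng(0)
t = np.linspace(0.0, 30.0, 200)
y_meas = fopdt_step(t, 2.0, 5.0, 1.5) + 0.02 * rng.standard_normal(t.size)

residual = lambda p: fopdt_step(t, *p) - y_meas
fit = least_squares(residual, x0=[1.0, 1.0, 0.5],
                    bounds=([0.0, 0.1, 0.0], [10.0, 50.0, 10.0]))
print("estimated K, tau, theta:", fit.x)
```

A joint friction/process estimator would additionally place the valve stiction nonlinearity between the controller output and u(t); this sketch covers only the linear process part.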

Relevance: 70.00%

Abstract:

This paper presents two strategies for upgrading set-up generation systems for tandem cold mills. Although these mills are modernized mainly to meet quality requirements, upgrades may also be made with the intent of replacing pre-calculated reference tables. In this case, the Bryant and Osborn mill model without an adaptive technique is proposed. For a more demanding modernization, the Bland and Ford model including adaptation is recommended, although it requires more complex computational hardware. Advantages and disadvantages of these two systems are compared and discussed, and experimental results obtained from an industrial cold mill are shown.

Relevance: 60.00%

Abstract:

Atmospheric aerosol particles serving as cloud condensation nuclei (CCN) are key elements of the hydrological cycle and climate. We have measured and characterized CCN at water vapor supersaturations in the range S = 0.10-0.82% in pristine tropical rainforest air during the AMAZE-08 campaign in central Amazonia. The effective hygroscopicity parameters describing the influence of chemical composition on the CCN activity of aerosol particles varied in the range κ ≈ 0.1-0.4 (0.16 ± 0.06, arithmetic mean and standard deviation). The overall median value of κ ≈ 0.15 was lower by a factor of two than the values typically observed for continental aerosols in other regions of the world. Aitken mode particles were less hygroscopic than accumulation mode particles (κ ≈ 0.1 at D ≈ 50 nm; κ ≈ 0.2 at D ≈ 200 nm), in agreement with earlier hygroscopicity tandem differential mobility analyzer (H-TDMA) studies. The CCN measurement results are consistent with aerosol mass spectrometry (AMS) data, showing that the organic mass fraction (f_org) was on average as high as ~90% in the Aitken mode (D ≤ 100 nm) and decreased with increasing particle diameter in the accumulation mode (~80% at D ≈ 200 nm). The κ values exhibited a negative linear correlation with f_org (R² = 0.81), and extrapolation yielded the following effective hygroscopicity parameters for the organic and inorganic particle components: κ_org ≈ 0.1, which can be regarded as the effective hygroscopicity of biogenic secondary organic aerosol (SOA), and κ_inorg ≈ 0.6, which is characteristic of ammonium sulfate and related salts. Both the size dependence and the temporal variability of effective particle hygroscopicity could be parameterized as a function of AMS-based organic and inorganic mass fractions (κ_p = κ_org × f_org + κ_inorg × f_inorg). The CCN number concentrations predicted with κ_p were in fair agreement with the measurement results (~20% average deviation). The median CCN number concentrations at S = 0.10-0.82% ranged from N_CCN,0.10 ≈ 35 cm⁻³ to N_CCN,0.82 ≈ 160 cm⁻³, the median concentration of aerosol particles larger than 30 nm was N_CN,30 ≈ 200 cm⁻³, and the corresponding integral CCN efficiencies were in the range N_CCN,0.10/N_CN,30 ≈ 0.1 to N_CCN,0.82/N_CN,30 ≈ 0.8. Although the number concentrations and hygroscopicity parameters were much lower in pristine rainforest air, the integral CCN efficiencies observed were similar to those in highly polluted megacity air. Moreover, model calculations of N_CCN,S assuming an approximate global average value of κ ≈ 0.3 for continental aerosols led to systematic overpredictions, but the average deviations exceeded ~50% only at low water vapor supersaturation (0.1%) and low particle number concentrations (≤ 100 cm⁻³). Model calculations assuming a constant aerosol size distribution led to higher average deviations at all investigated levels of supersaturation: ~60% for the campaign average distribution and ~1600% for a generic remote continental size distribution.
These findings confirm earlier studies suggesting that aerosol particle number and size are the major predictors for the variability of the CCN concentration in continental boundary layer air, followed by particle composition and hygroscopicity as relatively minor modulators. Depending on the required and applicable level of detail, the information and parameterizations presented in this paper should enable efficient description of the CCN properties of pristine tropical rainforest aerosols of Amazonia in detailed process models as well as in large-scale atmospheric and climate models.
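The mixing rule quoted above (κ_p = κ_org × f_org + κ_inorg × f_inorg) and standard κ-Köhler theory (Petters and Kreidenweis, 2007) can be illustrated with a short sketch; the κ values are the approximate campaign numbers from the abstract, while the water properties and temperature are generic assumptions, so the output is only indicative.

```python
# Sketch of the kappa mixing rule and a standard kappa-Koehler estimate of the
# critical supersaturation; constants are assumed, not campaign data.
import numpy as np

KAPPA_ORG, KAPPA_INORG = 0.1, 0.6          # effective hygroscopicities from the abstract

def kappa_mix(f_org):
    """kappa_p = kappa_org * f_org + kappa_inorg * (1 - f_org)."""
    return KAPPA_ORG * f_org + KAPPA_INORG * (1.0 - f_org)

def critical_supersaturation(d_dry, kappa, T=298.15):
    """Critical supersaturation (%) of a dry particle of diameter d_dry (m)."""
    sigma_w, M_w, rho_w, R = 0.072, 0.018, 1000.0, 8.314   # SI units, assumed
    A = 4.0 * sigma_w * M_w / (R * T * rho_w)              # Kelvin parameter (m)
    return 100.0 * np.sqrt(4.0 * A**3 / (27.0 * kappa * d_dry**3))

kappa_p = kappa_mix(0.9)                    # ~90% organic mass fraction (Aitken mode)
print(f"kappa_p = {kappa_p:.2f}")           # ~0.15, close to the reported median
print(f"S_c(50 nm) = {critical_supersaturation(50e-9, kappa_p):.2f} %")
```

With these assumed constants, a 50 nm particle with κ_p ≈ 0.15 activates at roughly 0.8-0.9% supersaturation, i.e. at the upper end of the range sampled in the campaign.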

Relevance: 40.00%

Abstract:

The exact physical conditions generating the abundances of r-process elements in environments such as supernova explosions are still under debate. We evaluated the characteristics expected for the neutrino wind in the proposed model of a type-II supernova driven by the conversion of nuclear matter to strange matter. Neutrinos will change the final abundance of elements after freeze-out of r-process nucleosynthesis, especially those close to the mass peaks.

Relevance: 30.00%

Abstract:

With the objective of comparing women's satisfaction with the childbirth experience in three models of care, a descriptive study with a quantitative approach was carried out in two public hospitals in São Paulo, one following the "Typical" model and the other operating an in-hospital birth center (the "CPNIH" model) and a peri-hospital one (the "CPNPH" model). The sample consisted of 90 postpartum women, 30 from each model. Comparison of the results concerning women's satisfaction with the care provided by the health professionals, the quality of the assistance, the reasons for satisfaction and dissatisfaction, whether they would indicate or recommend the services received, the sense of safety during the process, and suggestions for improvement showed that the CPNPH model was rated best, followed by the CPNIH model and, last, the Typical model. It is concluded that the peri-hospital model of childbirth care should receive greater support from the SUS, since it is a service with which women are satisfied with the care received.

Relevance: 30.00%

Abstract:

The mass function of cluster-size halos and their redshift distribution are computed for 12 distinct accelerating cosmological scenarios and confronted with the predictions of the conventional flat Lambda CDM model. The comparison with Lambda CDM is performed by a two-step process. First, we determine the free parameters of all models through a joint analysis involving the latest cosmological data, using type Ia supernovae, the cosmic microwave background shift parameter, and baryon acoustic oscillations. Apart from a braneworld-inspired cosmology, it is found that the derived Hubble relation of the remaining models reproduces the Lambda CDM results approximately with the same degree of statistical confidence. Second, in order to attempt to distinguish the different dark energy models from the expectations of Lambda CDM, we analyze the predicted cluster-size halo redshift distribution on the basis of two future cluster surveys: (i) an X-ray survey based on the eROSITA satellite, and (ii) a Sunyaev-Zel'dovich survey based on the South Pole Telescope. As a result, we find that the predictions of 8 out of 12 dark energy models can be clearly distinguished from the Lambda CDM cosmology, while the predictions of 4 models are statistically equivalent to those of the Lambda CDM model, as far as the expected cluster mass function and redshift distribution are concerned. The present analysis suggests that such a technique appears to be very competitive with independent tests probing the late-time evolution of the Universe and the associated dark energy effects.

Relevance: 30.00%

Abstract:

With each directed acyclic graph (this includes some D-dimensional lattices) one can associate some Abelian algebras that we call directed Abelian algebras (DAAs). On each site of the graph one attaches a generator of the algebra. These algebras depend on several parameters and are semisimple. Using any DAA, one can define a family of Hamiltonians which give the continuous-time evolution of a stochastic process. The calculation of the spectra and ground-state wave functions (stationary-state probability distributions) is an easy algebraic exercise. If one considers D-dimensional lattices and chooses Hamiltonians linear in the generators, in finite-size scaling the Hamiltonian spectrum is gapless with a critical dynamic exponent z = D. One possible application of the DAA is to sandpile models. In the paper we present this application, considering one- and two-dimensional lattices. In the one-dimensional case, when the DAA conserves the number of particles, the avalanches belong to the random-walker universality class (critical exponent σ_τ = 3/2). We study the local density of particles inside large avalanches, showing a depletion of particles at the source of the avalanche and an enrichment at its end. In two dimensions we did extensive Monte Carlo simulations and found σ_τ = 1.780 ± 0.005.
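The "random-walker universality class" cited above can be illustrated without the DAA machinery: for a symmetric one-dimensional random walk, the first-return time follows a power law with the same exponent 3/2, so its survival function decays as t^(-1/2). The Monte Carlo sketch below (a plain random walk, not the sandpile dynamics) recovers that slope; the cap on the walk length and the sample size are arbitrary choices.

```python
# Toy Monte Carlo: first-return times of a symmetric random walk are distributed
# as t^(-3/2) (the "random walker" exponent sigma_tau = 3/2 quoted above), so the
# survival function P(T >= t) should show a log-log slope near -1/2.
import numpy as np

rng = np.random.default_rng(1)

def first_return_time(max_steps=10_000):
    """Steps until a +/-1 random walk started at 0 first returns to 0 (capped)."""
    pos = 0
    for step in range(1, max_steps + 1):
        pos += 1 if rng.random() < 0.5 else -1
        if pos == 0:
            return step
    return max_steps

times = np.array([first_return_time() for _ in range(5_000)])
t_grid = np.unique(times)
survival = np.array([(times >= t).mean() for t in t_grid])
slope = np.polyfit(np.log(t_grid), np.log(survival), 1)[0]
print(f"log-log slope of P(T >= t): {slope:.2f} (theory: -0.5, i.e. sigma_tau = 3/2)")
```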

Relevance: 30.00%

Abstract:

The machining of hardened steels has always been a great challenge in metal cutting, particularly for drilling operations. Drilling is generally the machining process that is most difficult to cool because of the tool's geometry. The aim of this work is to determine the heat flux and the coefficient of convection in drilling using the inverse heat conduction method. Temperature was assessed during the drilling of hardened AISI H13 steel using the embedded thermocouple technique. Dry machining and two cooling/lubrication systems were used, and thermocouples were fixed at distances very close to the hole's wall. Tests were replicated for each condition and were carried out with new and worn drills. An analytical heat conduction model was used to calculate the temperature at the tool-workpiece interface and to define the heat flux and the coefficient of convection. In all tests with new and worn drills, taking the dry condition as reference, the lowest temperatures and the largest decrease in heat flux were observed with the flooded system, followed by the MQL system. The decrease in temperature was directly proportional to the amount of lubricant applied and was significant in the MQL system when compared to dry cutting. (C) 2011 Elsevier Ltd. All rights reserved.
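As a generic illustration of such an inverse estimation (not the paper's analytical model, geometry or data), the sketch below recovers a constant surface heat flux from synthetic near-surface temperature readings using the textbook semi-infinite-solid solution; the material properties, thermocouple depth and flux value are assumptions.

```python
# Inverse heat-conduction sketch: estimate a constant surface heat flux q from
# temperature readings at depth x, using the semi-infinite-solid solution
#   T(x,t) - T0 = (2*q/k) * sqrt(alpha*t) * ierfc(x / (2*sqrt(alpha*t))).
import numpy as np
from scipy.special import erfc
from scipy.optimize import least_squares

k, alpha = 28.0, 7.6e-6        # assumed conductivity (W/m.K) and diffusivity (m^2/s)
x, T0 = 0.5e-3, 25.0           # assumed thermocouple depth (m) and ambient temperature (C)

def ierfc(z):
    return np.exp(-z**2) / np.sqrt(np.pi) - z * erfc(z)

def temperature(t, q):
    s = np.sqrt(alpha * t)
    return T0 + (2.0 * q / k) * s * ierfc(x / (2.0 * s))

# Synthetic "thermocouple" data generated with q = 1.2e6 W/m^2 plus 0.5 C noise.
rng = np.random.default_rng(0)
t = np.linspace(0.05, 5.0, 100)
T_meas = temperature(t, 1.2e6) + 0.5 * rng.standard_normal(t.size)

fit = least_squares(lambda p: temperature(t, p[0]) - T_meas, x0=[1e5])
print(f"estimated heat flux: {fit.x[0]:.3e} W/m^2")
```

The same least-squares structure extends to estimating a convection coefficient once a convective boundary-condition solution replaces the prescribed-flux one.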

Relevance: 30.00%

Abstract:

Product lifecycle management (PLM) innovates as it defines both the product as a central element for aggregating enterprise information and the lifecycle as a new time dimension for information integration and analysis. Because of its potential benefits in shortening innovation lead-times and reducing costs, PLM has attracted a lot of attention in industry and in research. However, the current PLM implementation stage at most organisations still does not apply the lifecycle management concepts thoroughly. In order to close the existing realisation gap, this article presents a process-oriented framework to support effective PLM implementation. The framework's central point is a set of lifecycle-oriented business process reference models which link the necessary fundamental concepts, enterprise knowledge and software solutions to effectively deploy PLM. (c) 2007 Elsevier B.V. All rights reserved.

Relevance: 30.00%

Abstract:

The health sector requires continuous investments to ensure the improvement of products and services from a technological standpoint, the use of new materials, equipment and tools, and the application of process management methods. Methods associated with the process management approach, such as the development of reference models of business processes, can provide significant innovations in the health sector and respond to the current market trend for modern management in this sector (Gunderman et al. (2008) [4]). This article proposes a process model for diagnostic medical X-ray imaging, from which it derives a primary reference model and describes how this information leads to gains in quality and improvements. (C) 2010 Elsevier Ireland Ltd. All rights reserved.

Relevance: 30.00%

Abstract:

Ecological niche modelling combines species occurrence points with environmental raster layers in order to obtain models for describing the probabilistic distribution of species. The process to generate an ecological niche model is complex. It requires dealing with a large amount of data, use of different software packages for data conversion, for model generation and for different types of processing and analyses, among other functionalities. A software platform that integrates all requirements under a single and seamless interface would be very helpful for users. Furthermore, since biodiversity modelling is constantly evolving, new requirements are constantly being added in terms of functions, algorithms and data formats. This evolution must be accompanied by any software intended to be used in this area. In this scenario, a Service-Oriented Architecture (SOA) is an appropriate choice for designing such systems. According to SOA best practices and methodologies, the design of a reference business process must be performed prior to the architecture definition. The purpose is to understand the complexities of the process (business process in this context refers to the ecological niche modelling problem) and to design an architecture able to offer a comprehensive solution, called a reference architecture, that can be further detailed when implementing specific systems. This paper presents a reference business process for ecological niche modelling, as part of a major work focused on the definition of a reference architecture based on SOA concepts that will be used to evolve the openModeller software package for species modelling. The basic steps that are performed while developing a model are described, highlighting important aspects, based on the knowledge of modelling experts. In order to illustrate the steps defined for the process, an experiment was developed, modelling the distribution of Ouratea spectabilis (Mart.) Engl. (Ochnaceae) using openModeller. As a consequence of the knowledge gained with this work, many desirable improvements on the modelling software packages have been identified and are presented. Also, a discussion on the potential for large-scale experimentation in ecological niche modelling is provided, highlighting opportunities for research. The results obtained are very important for those involved in the development of modelling tools and systems, for requirement analysis and to provide insight on new features and trends for this category of systems. They can also be very helpful for beginners in modelling research, who can use the process and the experiment example as a guide to this complex activity. (c) 2008 Elsevier B.V. All rights reserved.
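A toy version of the core modelling step described above (occurrence points plus environmental raster layers yielding a suitability surface) is sketched below. It is a simple BIOCLIM-style percentile envelope on synthetic data, not the openModeller implementation nor the Ouratea spectabilis experiment; grid sizes, layer ranges and percentiles are arbitrary choices.

```python
# BIOCLIM-style envelope sketch on synthetic data (illustrative only).
import numpy as np

rng = np.random.default_rng(42)

# Two synthetic environmental layers (e.g. temperature, precipitation) on a 50x50 grid.
temp = rng.uniform(10.0, 35.0, size=(50, 50))
prec = rng.uniform(500.0, 3000.0, size=(50, 50))
layers = np.stack([temp, prec])                    # shape: (n_layers, rows, cols)

# Synthetic occurrence points, given as (row, col) indices of presence cells.
occ = rng.integers(0, 50, size=(30, 2))
occ_env = layers[:, occ[:, 0], occ[:, 1]]          # environmental values at occurrences

# Envelope: 5th-95th percentile of each layer across the occurrence points.
lo = np.percentile(occ_env, 5, axis=1)
hi = np.percentile(occ_env, 95, axis=1)

# Suitability of each cell = fraction of layers whose value falls inside the envelope.
inside = (layers >= lo[:, None, None]) & (layers <= hi[:, None, None])
suitability = inside.mean(axis=0)
print("cells predicted fully suitable:", int((suitability == 1.0).sum()))
```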

Relevance: 30.00%

Abstract:

This paper presents the theoretical and experimental approaches to the development of a mathematical model to be used in multi-variable control system designs of an active suspension for a sport utility vehicle (SUV), in this case a light pickup truck. A complete seven-degree-of-freedom model is quickly and successfully identified, with very satisfactory results in simulations and in real experiments conducted with the pickup truck. The novelty of the proposed methodology is the use of commercial software in the early stages of the identification to speed up the process and to minimize the need for a large number of costly experiments. The paper also presents major contributions to the identification of uncertainties in vehicle suspension models and to the development of identification methods using sequential quadratic programming, where an innovation regarding the calculation of the objective function is proposed and implemented. Results from simulations of and practical experiments with the real SUV are presented, analysed, and compared, showing the potential of the method.
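A much-reduced sketch of SQP-based identification is given below: a single mass-spring-damper (not the seven-degree-of-freedom model) whose stiffness and damping are fitted to a noisy free response with scipy's SLSQP implementation of sequential quadratic programming; the mass, true parameter values, noise level and bounds are all assumptions for the example.

```python
# SQP (SLSQP) parameter identification sketch for a 1-DOF mass-spring-damper.
import numpy as np
from scipy.optimize import minimize

m, d0 = 400.0, 0.05                         # assumed mass (kg) and initial deflection (m)
t = np.linspace(0.0, 2.0, 200)

def response(k, c):
    """Underdamped free response x(t) for x(0) = d0, v(0) = 0."""
    wn = np.sqrt(k / m)
    zeta = c / (2.0 * np.sqrt(k * m))
    wd = wn * np.sqrt(1.0 - zeta**2)
    return d0 * np.exp(-zeta * wn * t) * (np.cos(wd * t) + zeta * wn / wd * np.sin(wd * t))

# Synthetic measurement with assumed true values k = 20 kN/m, c = 1.5 kN.s/m.
rng = np.random.default_rng(0)
x_meas = response(20_000.0, 1_500.0) + 1e-3 * rng.standard_normal(t.size)

objective = lambda p: float(np.sum((response(p[0], p[1]) - x_meas) ** 2))
fit = minimize(objective, x0=[15_000.0, 1_000.0], method="SLSQP",
               bounds=[(10_000.0, 100_000.0), (100.0, 3_000.0)])  # keeps the system underdamped
print("estimated k (N/m), c (N.s/m):", fit.x)
```

In the full problem the objective would compare a multi-body simulation with measured vehicle responses, but the optimizer call and the bound handling follow the same pattern.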

Relevance: 30.00%

Abstract:

The objective of this paper is to develop and validate a mechanistic model for the degradation of phenol by the Fenton process. Experiments were performed in semi-batch operation, in which phenol, catechol and hydroquinone concentrations were measured. Using the methodology described in Pontes and Pinto [R.F.F. Pontes, J.M. Pinto, Analysis of integrated kinetic and flow models for anaerobic digesters, Chemical Engineering Journal 122 (1-2) (2006) 65-80], a stoichiometric model was first developed, with 53 reactions and 26 compounds, followed by the corresponding kinetic model. Sensitivity analysis was performed to determine the most influential kinetic parameters of the model, which were then estimated from the experimental results. The adjusted model was used to analyze the impact of the initial concentration and flow rate of reactants on the efficiency of the Fenton process in degrading phenol. Moreover, the model was applied to evaluate the cost of treating wastewater contaminated with phenol in order to meet environmental standards. (C) 2009 Elsevier B.V. All rights reserved.
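A drastically simplified sketch of such a kinetic model is given below: two lumped reactions instead of the paper's 53, integrated in batch rather than semi-batch mode, with rate constants taken as rough literature orders of magnitude purely for illustration.

```python
# Lumped Fenton kinetics sketch (illustrative only, not the paper's mechanism):
#   R1: Fe2+ + H2O2 -> Fe3+ + OH- + *OH       (rate constant k1, assumed)
#   R2: *OH + phenol -> oxidation products    (rate constant k2, assumed)
from scipy.integrate import solve_ivp

k1, k2 = 70.0, 6.6e9                  # M^-1 s^-1, assumed order-of-magnitude values

def rhs(t, y):
    fe2, h2o2, oh, phenol = y
    r1 = k1 * fe2 * h2o2
    r2 = k2 * oh * phenol
    return [-r1, -r1, r1 - r2, -r2]

y0 = [5e-4, 5e-3, 0.0, 1e-3]          # initial Fe2+, H2O2, *OH, phenol (mol/L), assumed
sol = solve_ivp(rhs, (0.0, 60.0), y0, method="LSODA", rtol=1e-8, atol=1e-12)
conversion = 1.0 - sol.y[3, -1] / y0[3]
print(f"phenol conversion after 60 s: {100 * conversion:.1f} %")
```

Because Fe2+ is the limiting reagent in this toy setup, it predicts roughly 50% phenol conversion; the full mechanism additionally tracks intermediates such as catechol and hydroquinone and the semi-batch reactant feed.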

Relevance: 30.00%

Abstract:

The ideal conditions for the operation of tandem cold mills are connected to a set of references generated by models and used by dynamic regulators. Aiming at the optimization of the friction and yield stress coefficients, an adaptation algorithm is proposed in this paper. Experimental results obtained from an industrial cold rolling mill are presented. (C) 2008 Elsevier B.V. All rights reserved.

Relevance: 30.00%

Abstract:

Sepsis remains a major cause of morbidity and mortality, mainly because of sepsis-induced multiple organ dysfunction. In contrast to preclinical studies, most clinical trials of promising new treatment strategies for sepsis have failed to demonstrate efficacy. Although many reasons could account for this discrepancy, the misinterpretation of preclinical data obtained from experimental studies and especially the use of animal models that do not adequately mimic human sepsis may have been contributing factors. In this review, the potentials and limitations of various animal models of sepsis are discussed to clarify to what extent these findings are relevant to human sepsis. Such models include intravascular infusion of endotoxin or live bacteria, bacterial peritonitis, cecal ligation and perforation, soft tissue infection, and pneumonia or meningitis models using different animal species, including rats, mice, rabbits, dogs, pigs, sheep, and nonhuman primates. Despite several limitations, animal models remain essential in the development of all new therapies for sepsis and septic shock, because they provide fundamental information about the pharmacokinetics, toxicity, and mechanism of drug action that cannot be obtained by other methods. New therapeutic agents should be studied in infection models, even after the initiation of the septic process. Furthermore, debility conditions need to be reproduced to avoid the exclusive use of healthy animals, which often do not represent the human septic patient.