926 results for Process Models


Relevance: 30.00%

Abstract:

In this paper we examine the time T to reach a critical number K0 of infections during an outbreak in an epidemic model with infective and susceptible immigrants. The underlying process X, first introduced by Ridler-Rowe (1967), is related to recurrent diseases and appears to be analytically intractable. We present an approximating model inspired by the use of extreme values, and we derive formulae for the Laplace-Stieltjes transform of T and its moments, which are evaluated using an iterative procedure. Numerical examples illustrate the effects of the contact and removal rates on the expected value of T and the threshold K0 when the initial time instant corresponds to an invasion time. We also study the exact reproduction number Rexact,0 and the population transmission number Rp, which are random versions of the basic reproduction number R0.
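
The threshold time T can be illustrated with a toy stochastic simulation. The sketch below is not the Ridler-Rowe process studied in the paper; it is a minimal Gillespie-style model with assumed contact, removal and immigration rates, showing how the time to reach K0 infections can be sampled:

```python
import random

def time_to_threshold(K0, beta=0.8, gamma=0.5, eta_s=0.3, eta_i=0.1,
                      S0=50, I0=1, seed=1):
    """Gillespie simulation of a toy epidemic with susceptible and
    infective immigration.  Returns the first time t at which the
    cumulative number of infections reaches K0."""
    rng = random.Random(seed)
    S, I, t, infections = S0, I0, 0.0, 0
    while infections < K0:
        rates = [beta * S * I,   # infection of a susceptible
                 gamma * I,      # removal of an infective
                 eta_s,          # arrival of a susceptible immigrant
                 eta_i]          # arrival of an infective immigrant
        total = sum(rates)
        t += rng.expovariate(total)          # exponential waiting time
        u = rng.uniform(0.0, total)
        if u < rates[0]:
            S -= 1; I += 1; infections += 1
        elif u < rates[0] + rates[1]:
            I -= 1
        elif u < rates[0] + rates[1] + rates[2]:
            S += 1
        else:
            I += 1
    return t
```

Averaging `time_to_threshold` over many seeds gives a Monte Carlo estimate of E[T], which is what the paper's transform-based formulae compute analytically.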

Relevance: 30.00%

Abstract:

Soil erosion is a naturally occurring process involving the detachment, transport, and deposition of soil particles. Disturbances such as thinning and wildfire can greatly reduce cover and increase erosion rates. Forest managers may use erosion prediction tools, such as the Universal Soil Loss Equation (USLE) and the Water Erosion Prediction Project (WEPP), to estimate erosion rates and develop techniques to manage erosion. However, it is important to understand the differences between these models and the applications of each. Erosion rates were generated with each model, and the model most applicable to the study site, Los Alamos, New Mexico, was determined; that model was then used to find the amount of cover needed to stabilize the soil. The USLE is simpler than a computer model such as WEPP, and thus easier to manipulate to estimate cover values. Predicted cover values were compared to field cover values. Cover is necessary to establish effective erosion control guidelines.
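
The USLE itself is a simple multiplicative formula, A = R·K·LS·C·P, which is why it is easy to manipulate to back out cover values. A minimal sketch with made-up factor values:

```python
def usle_soil_loss(R, K, LS, C, P):
    """Universal Soil Loss Equation: A = R * K * LS * C * P.

    R  - rainfall-runoff erosivity factor
    K  - soil erodibility factor
    LS - slope length and steepness factor
    C  - cover-management factor (more cover -> lower C)
    P  - support practice factor
    Returns average annual soil loss A (e.g. tons/acre/year in US units).
    """
    return R * K * LS * C * P

# Illustrative (made-up) values: halving the cover factor halves the loss.
base = usle_soil_loss(R=40.0, K=0.3, LS=1.5, C=0.2, P=1.0)
covered = usle_soil_loss(R=40.0, K=0.3, LS=1.5, C=0.1, P=1.0)
```

Because the equation is a plain product, solving for the C (cover) value that keeps A below a target erosion rate is a one-line division, in contrast to a process-based simulator like WEPP.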

Relevance: 30.00%

Abstract:

Phase equilibrium data regression is an essential task in obtaining appropriate parameter values for any model used in separation equipment design for chemical process simulation and optimization. The accuracy of this process depends on several factors, such as the quality of the experimental data, the selected model and the calculation algorithm. The present paper summarizes the results and conclusions of our research on the capabilities and limitations of existing GE models and on strategies that can be included in correlation algorithms to improve convergence and avoid inconsistencies. The NRTL model has been selected as a representative local composition model. New capabilities of this model, but also several relevant limitations, have been identified, and examples of the application of a modified NRTL equation are discussed. Furthermore, a regression algorithm has been developed that allows the advisable simultaneous regression of all the condensed phase equilibrium regions present in ternary systems at constant T and P. It includes specific strategies designed to avoid some of the pitfalls frequently found in commercial regression tools for phase equilibrium calculations. Most of the proposed strategies are based on the geometrical interpretation of the lowest common tangent plane equilibrium criterion, which allows an unambiguous understanding of the behavior of the mixtures. The paper presents this work as a whole in order to reveal the effort that must still be devoted to overcoming the difficulties that remain in the phase equilibrium data regression problem.
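
As background, the binary form of the NRTL activity coefficient model fits in a few lines. The sketch below uses the standard textbook equations, not the authors' modified NRTL variant:

```python
import math

def nrtl_gamma(x1, tau12, tau21, alpha=0.3):
    """Binary NRTL activity coefficients (gamma1, gamma2).

    tau12, tau21 are the dimensionless interaction parameters and
    alpha is the non-randomness parameter (0.2-0.47 is typical).
    """
    x2 = 1.0 - x1
    G12 = math.exp(-alpha * tau12)
    G21 = math.exp(-alpha * tau21)
    ln_g1 = x2**2 * (tau21 * (G21 / (x1 + x2 * G21))**2
                     + tau12 * G12 / (x2 + x1 * G12)**2)
    ln_g2 = x1**2 * (tau12 * (G12 / (x2 + x1 * G12))**2
                     + tau21 * G21 / (x1 + x2 * G21)**2)
    return math.exp(ln_g1), math.exp(ln_g2)
```

Regression then means adjusting tau12, tau21 (and possibly alpha) so that the activity coefficients reproduce the measured equilibrium data; the pitfalls the paper discusses arise because this fit can converge to parameter sets that violate the tangent plane stability criterion.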

Relevance: 30.00%

Abstract:

The exponential growth of subjective information in the framework of Web 2.0 has created the need for Natural Language Processing tools able to analyse and process such data for multiple practical applications. These tools require training on specifically annotated corpora, whose level of detail must be fine enough to capture the phenomena involved. This paper presents EmotiBlog, a fine-grained annotation scheme for subjectivity. We show how it is built and demonstrate the benefits it brings to the systems that use it for training, through experiments on opinion mining and emotion detection. We employ corpora of different textual genres: a set of annotated reported speech extracted from news articles, the set of news titles annotated with polarity and emotion from SemEval 2007 (Task 14), and ISEAR, a corpus of real-life self-expressed emotion. We also show how the model built from the EmotiBlog annotations can be enhanced with external resources. The results demonstrate that EmotiBlog, through its structure and annotation paradigm, offers high-quality training data for systems dealing with both opinion mining and emotion detection.

Relevance: 30.00%

Abstract:

After the 2010 Haiti earthquake, which hit the city of Port-au-Prince, the capital of Haiti, a multidisciplinary working group of specialists (seismologists, geologists, engineers and architects) from different Spanish universities and from Haiti joined efforts under the SISMO-HAITI project (financed by the Universidad Politecnica de Madrid) with one objective: the evaluation of seismic hazard and risk in Haiti and its application to seismic design, urban planning, emergency planning and resource management. In this paper, as a first step towards estimating the structural damage caused by future earthquakes in the country, damage functions have been calibrated by means of a two-stage procedure. After compiling a database of the damage observed in the city after the earthquake, the exposure model (building stock) was classified, and through an iterative two-step calibration process a specific set of damage functions for the country has been proposed. Additionally, Next Generation Attenuation (NGA) models and Vs30 models have been analysed to choose the most appropriate ones for the seismic risk estimation in the city. In a subsequent paper, these functions will be used to estimate a seismic risk scenario for a future earthquake.
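
Damage functions of this kind are commonly expressed as lognormal fragility curves. The sketch below assumes that standard lognormal form, with a hypothetical median capacity theta and dispersion beta; it is illustrative only, not the calibrated functions of the paper:

```python
import math

def damage_probability(pga, theta, beta):
    """Lognormal fragility curve: P(damage state reached | ground motion).

    pga   - peak ground acceleration of the scenario (g)
    theta - median capacity, i.e. the PGA giving 50% probability (g)
    beta  - lognormal standard deviation (dispersion) of the curve
    Uses Phi(z) = 0.5 * (1 + erf(z / sqrt(2))) for the normal CDF.
    """
    z = math.log(pga / theta) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
```

Calibration, in this framing, means adjusting theta and beta per building class until the predicted damage probabilities match the observed damage database.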

Relevance: 30.00%

Abstract:

The optimization of chemical processes in which the flowsheet topology is not kept fixed is a challenging discrete-continuous optimization problem. Usually, this task has been performed through equation-based models. This approach presents several problems, such as tedious and complicated estimation of component properties or the handling of very large problems (with thousands of equations and variables). We propose a GDP approach, coupled with a flowsheet program, as an alternative to MINLP models. The novelty of this approach lies in using a commercial modular process simulator in which the superstructure is drawn directly on the graphical user interface of the simulator. This methodology combines the advantages of modular process simulators (specially tailored numerical methods, reliability, and robustness) with the flexibility of the GDP formulation for modeling and solution. The proposed optimization tool is successfully applied to the synthesis of a methanol plant where different alternatives are available for the streams, equipment and process conditions.

Relevance: 30.00%

Abstract:

We present a derivative-free optimization algorithm coupled with a chemical process simulator for the optimal design of individual and complex distillation processes using a rigorous tray-by-tray model. The proposed approach serves as an alternative to the various models based on nonlinear programming (NLP) or mixed-integer nonlinear programming (MINLP). It combines the advantages of a commercial process simulator (Aspen Hysys), including numerical methods especially suited to the convergence of distillation columns, with the benefits of the particle swarm optimization (PSO) metaheuristic, which requires no gradient information and is able to escape from local optima. Our method inherits the superstructure developed by Yeomans, H.; Grossmann, I. E. Optimal design of complex distillation columns using rigorous tray-by-tray disjunctive programming models. Ind. Eng. Chem. Res. 2000, 39 (11), 4326-4335, in which nonexisting trays are treated as simple bypasses of liquid and vapor flows. The implemented tool provides the optimal configuration of distillation column systems, involving both continuous and discrete variables, through the minimization of the total annual cost (TAC). The robustness and flexibility of the method are demonstrated through the successful design and synthesis of three distillation systems of increasing complexity.
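
The PSO metaheuristic at the core of the approach can be sketched compactly. The code below is a generic, minimal PSO minimizing a simple quadratic as a stand-in for the TAC objective; it is not the simulator-coupled implementation described in the paper:

```python
import random

def pso(f, bounds, n_particles=20, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimization for box-constrained minimization."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                 # personal best positions
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]),
                                bounds[d][1])   # clip to the box
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Toy objective standing in for an expensive flowsheet evaluation.
best, val = pso(lambda x: (x[0] - 1.0)**2 + (x[1] + 2.0)**2, [(-5, 5), (-5, 5)])
```

In the paper's setting, each call to `f` would be a full column simulation returning TAC, and discrete tray decisions are encoded through the bypass superstructure, which keeps the objective evaluable for any candidate position.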

Relevance: 30.00%

Abstract:

In this article, a new methodology is presented to obtain representation models for an a priori relation z = u(x1, x2, ..., xn) (1), given an experimental dataset (zi; x1i, x2i, ..., xni), i = 1, 2, ..., p. In this methodology, a potential energy is initially defined over each possible model for relation (1), which allows the application of Lagrangian mechanics to the derived system. Solving the Euler-Lagrange equations of this system yields the optimal solution according to the principle of least action. The Lagrangian defined corresponds to a continuous medium, over which an n-dimensional finite element model is applied, so a solution can be obtained by solving a compatible, determined, symmetric linear system of equations. The computational implementation of the methodology improves on the representation models previously obtained and published by the authors.
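
The final step, solving a compatible, determined, symmetric linear system, is analogous to ordinary least-squares fitting via the normal equations. A minimal sketch (a plain line fit, far simpler than the paper's finite element formulation):

```python
def solve_symmetric(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]   # augmented matrix
    for k in range(n):
        p = max(range(k, n), key=lambda r: abs(M[r][k]))
        M[k], M[p] = M[p], M[k]
        for r in range(k + 1, n):
            f = M[r][k] / M[k][k]
            for c in range(k, n + 1):
                M[r][c] -= f * M[k][c]
    x = [0.0] * n
    for k in range(n - 1, -1, -1):
        x[k] = (M[k][n] - sum(M[k][c] * x[c] for c in range(k + 1, n))) / M[k][k]
    return x

def fit_line(xs, zs):
    """Least-squares line z = a + b*x via the symmetric normal equations."""
    n = len(xs)
    Sx, Sxx = sum(xs), sum(x * x for x in xs)
    Sz, Sxz = sum(zs), sum(x * z for x, z in zip(xs, zs))
    return solve_symmetric([[n, Sx], [Sx, Sxx]], [Sz, Sxz])
```

Here the "energy" being minimized is the sum of squared residuals; in the paper the functional also encodes the Lagrangian of a continuous medium, but the resulting discrete problem has the same symmetric-linear-system shape.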

Relevance: 30.00%

Abstract:

In an open system, each disequilibrium causes a force, and each force causes a flow process. These flows are represented by flow variables whose governing relations, written formally as equations, are called flow equations; if each flow tends to equilibrate the system, these equations mathematically represent the tendency toward that equilibrium. In this paper, based on the concepts of conjugate forces and fluxes and the dissipation function developed by Onsager and Prigogine, the authors put forward the following hypothesis: in Prigogine's theorem, the flow is replaced by its flow equation, or by a flow orbital, with the conjugate force regarded as a gradient. This makes it possible to obtain a dissipation function for each flow equation and an orbital dissipation function.
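
For reference, the classical Onsager-Prigogine background can be stated as follows (these are the standard relations, not the authors' orbital construction):

```latex
% Linear phenomenological laws with Onsager reciprocity
J_i = \sum_j L_{ij} X_j, \qquad L_{ij} = L_{ji}

% Dissipation function (entropy production), non-negative
\sigma = \sum_k J_k X_k = \sum_{i,j} L_{ij} X_i X_j \ge 0
```

Here each flux $J_k$ is conjugate to a thermodynamic force $X_k$; the paper's hypothesis substitutes the flow equation (or a flow orbital) for $J_k$ in this bilinear form.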

Relevance: 30.00%

Abstract:

This paper describes a study and analysis of surface-normal-based descriptors for 3D object recognition. Specifically, we evaluate the behaviour of the descriptors in the recognition process using virtual models of objects created with CAD software. We then test them in real scenes using synthetic objects created from the virtual models with a 3D printer. In both cases, the same virtual models are used in the matching process to find similarity; the difference between the two experiments lies in the type of views used in the tests. Our analysis evaluates three aspects: the effectiveness of the 3D descriptors depending on the camera viewpoint, the geometric complexity of the model, and both the runtime of the recognition process and the success rate in recognizing a view of an object among the models stored in the database.
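
Once descriptors are computed, matching a query view against the database reduces to a nearest-neighbor search over descriptor vectors. A minimal sketch using cosine similarity and hypothetical 3-component descriptors (real surface-normal descriptors are much longer histograms):

```python
import math

def cosine(u, v):
    """Cosine similarity between two descriptor vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def best_match(query, database):
    """Return the model name whose stored descriptor is most similar
    to the descriptor computed from the query view."""
    return max(database, key=lambda name: cosine(query, database[name]))

# Hypothetical database of per-model descriptors.
db = {"cube": [0.9, 0.1, 0.0], "sphere": [0.1, 0.8, 0.3]}
match = best_match([0.85, 0.15, 0.05], db)
```

The success rate the paper measures is, in this framing, the fraction of query views for which `best_match` returns the correct model.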

Relevance: 30.00%

Abstract:

Superstructure approaches are the solution to the difficult problem of the rigorous economic design of a distillation column. These methods require complex initialization procedures and are hard to solve, and for this reason they have not been used extensively. In this work, we present a methodology for the rigorous optimization of chemical processes implemented on a commercial simulator using surrogate models based on kriging interpolation. Several examples were studied; in this paper, we perform the optimization of a superstructure for a non-sharp separation to show the efficiency and effectiveness of the method. It is noteworthy that sufficiently accurate surrogate models can be obtained with up to seven degrees of freedom.
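
The surrogate idea can be illustrated with an even simpler interpolant than kriging. The sketch below uses inverse-distance weighting as a stand-in: a cheap predictor built from a few expensive "simulator" evaluations (here just f(x) = x**2):

```python
def idw_surrogate(samples, power=2.0):
    """Build a cheap surrogate of an expensive function from sampled
    (x, y) pairs using inverse-distance weighting.  This is a simple
    stand-in for the kriging interpolation used in the paper (kriging
    additionally provides a prediction variance)."""
    def predict(x):
        num = den = 0.0
        for xi, yi in samples:
            d = abs(x - xi)
            if d == 0.0:
                return yi            # exact at the sample points
            w = 1.0 / d**power
            num += w * yi
            den += w
        return num / den
    return predict

# Pretend f(x) = x**2 is an expensive flowsheet evaluation.
surrogate = idw_surrogate([(x, x * x) for x in (0.0, 1.0, 2.0, 3.0)])
```

The optimizer then works on `surrogate` instead of the simulator, refining the sample set around promising regions; this is why the number of degrees of freedom (here one) governs how many samples an accurate surrogate needs.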

Relevance: 30.00%

Abstract:

The process of liquid silicon infiltration is investigated for channels with radii from 0.25 to 0.75 mm drilled in compact carbon preforms. The advantage of this setup is that it simplifies the study of the phenomenon. For comparison purposes, we attempt to work out a framework for evaluating the accuracy of simulations. The approach relies on dimensionless numbers involving the properties of the surface reaction. It turns out that the complex hydrodynamic behavior derived from Newton's second law can be made consistent with Lattice-Boltzmann simulations. The experiments give clear evidence that the growth of silicon carbide proceeds in two different stages, and the basic mechanisms are highlighted. Lattice-Boltzmann simulations prove to be an effective tool for describing the growth phase, since essential experimental constraints can be implemented. As a result, the existing models are useful for gaining more insight into the process of reactive infiltration into porous media in the first stage of penetration, i.e. up to pore closure due to surface growth. A way to incorporate the resistance arising from the chemical reaction into Darcy's law is also proposed.
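
For the purely hydrodynamic part of the first stage, a classical back-of-the-envelope estimate of infiltration depth is the Lucas-Washburn relation; the sketch below uses rough literature-style property values for liquid silicon as assumptions, and neglects the surface reaction that the paper shows to be essential:

```python
import math

def washburn_depth(t, r, gamma=0.72, mu=8e-4, theta_deg=30.0):
    """Lucas-Washburn infiltration depth in a capillary channel:

        l(t) = sqrt(gamma * r * cos(theta) * t / (2 * mu))

    t in s, channel radius r in m, surface tension gamma in N/m,
    viscosity mu in Pa*s, contact angle theta in degrees.  The default
    property values are illustrative assumptions, not fitted data.
    """
    return math.sqrt(gamma * r * math.cos(math.radians(theta_deg)) * t
                     / (2.0 * mu))
```

The characteristic sqrt(t) growth of this estimate is what reactive-infiltration models must correct for once silicon carbide growth starts narrowing the channel.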

Relevance: 30.00%

Abstract:

In this work, we propose a new methodology for the large-scale optimization and process integration of complex chemical processes that have been simulated using modular chemical process simulators. Units with significant numerical noise or large CPU times are substituted by surrogate models based on kriging interpolation. Using a degree-of-freedom analysis, some of those units can be aggregated into a single unit to reduce the complexity of the resulting model. As a result, we solve a hybrid simulation-optimization model formed by units in the original flowsheet, kriging models, and explicit equations. We present a case study of the optimization of a sour water stripping plant in which we simultaneously consider economics, heat integration and environmental impact using the ReCiPe indicator, which incorporates recent advances in Life Cycle Assessment (LCA). The optimization strategy guarantees convergence to a local optimum within the tolerance of the numerical noise.

Relevance: 30.00%

Abstract:

Statistical machine translation (SMT) is an approach to machine translation (MT) that uses statistical models whose parameters are estimated from the analysis of existing human translations (contained in bilingual corpora). From a translation student's standpoint, this dissertation aims to explain how a phrase-based SMT system works, to determine the role of the statistical models it uses in the translation process, and to assess the quality of the translations produced when the system is trained with in-domain, good-quality corpora. To that end, a phrase-based SMT system based on Moses has been trained and subsequently used for the English-to-Spanish translation of two texts related in topic to the training data. Finally, the quality of the texts output by the system has been assessed through a quantitative evaluation carried out with three different automatic evaluation measures, and a qualitative evaluation based on the Multidimensional Quality Metrics (MQM).
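
Automatic MT evaluation measures of the kind mentioned typically build on modified n-gram precision, the core component of BLEU. A minimal sketch (single reference, no brevity penalty):

```python
from collections import Counter

def modified_precision(candidate, reference, n):
    """Modified n-gram precision used in BLEU: each candidate n-gram
    count is clipped by its maximum count in the reference, so a
    candidate cannot score by repeating a correct word."""
    def ngrams(tokens):
        return Counter(tuple(tokens[i:i + n])
                       for i in range(len(tokens) - n + 1))
    cand = ngrams(candidate.split())
    ref = ngrams(reference.split())
    clipped = sum(min(c, ref[g]) for g, c in cand.items())
    total = sum(cand.values())
    return clipped / total if total else 0.0
```

Full BLEU combines these precisions for n = 1..4 via a geometric mean and multiplies by a brevity penalty; metrics like this are what the quantitative evaluation reports, while MQM covers the error categories such scores cannot see.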

Relevance: 30.00%

Abstract:

Doctoral thesis, Psychology (Clinical Psychology), Universidade de Lisboa, Faculdade de Psicologia, 2016