896 results for Simulation Systems Analysis


Relevance:

90.00%

Publisher:

Abstract:

Dissertation presented to obtain a Master's degree in Biotechnology

Relevance:

90.00%

Publisher:

Abstract:

Dynamic model, tubular reactor, polyethylene, LDPE, discretization, simulation, sensitivity analysis, nonlinear analysis

Relevance:

90.00%

Publisher:

Abstract:

Because of the increase in workplace automation and the diversification of industrial processes, workplaces have become more and more complex. The classical approaches used to address workplace hazard concerns, such as checklists or sequence models, are therefore of limited use in such complex systems. Moreover, because of the multifaceted nature of workplaces, the use of single-oriented methods, such as AEA (man oriented), FMEA (system oriented), or HAZOP (process oriented), is not satisfactory. The use of a dynamic modeling approach that allows multiple-oriented analyses may constitute an alternative to overcome this limitation. The qualitative modeling aspects of the MORM (man-machine occupational risk modeling) model are discussed in this article. The model, realized with an object-oriented Petri net tool (CO-OPN), has been developed to simulate and analyze industrial processes from an OH&S perspective. The industrial process is modeled as a set of interconnected subnets (state spaces), which describe its constitutive machines. Process-related factors are introduced explicitly through machine interconnections and flow properties. Man-machine interactions are modeled as triggering events for the state spaces of the machines, and the CREAM cognitive behavior model is used to establish the relevant triggering events. In the CO-OPN formalism, the model is expressed as a set of interconnected CO-OPN objects defined over data types expressing the measure attached to the flow of entities transiting through the machines. Constraints on the measures assigned to these entities are used to determine the state changes in each machine. Interconnecting machines implies the composition of such flows and consequently the interconnection of the measure constraints. This is reflected in the construction of constraint enrichment hierarchies, which can be used for simulation and analysis optimization in a clear mathematical framework. The use of Petri nets to perform multiple-oriented analysis opens perspectives in the field of industrial risk management. It may significantly reduce the duration of the assessment process, but most of all it opens perspectives in the field of risk comparisons and integrated risk management. Moreover, because of the generic nature of the model and tool used, the same concepts and patterns may be used to model a wide range of systems and application fields.
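
To make the machine-as-state-space idea above concrete, here is a small, hypothetical Python sketch of machines connected by flows whose "measures" carry constraints that decide whether a triggering event fires. It is an illustrative toy, not the CO-OPN/MORM formalism or tool, and all names and values are assumptions.

```python
class Machine:
    """A machine as a tiny state space; a transition fires only if the measure
    constraint on the incoming entity holds."""
    def __init__(self, name, constraint=lambda measure: True):
        self.name = name
        self.state = "idle"
        self.buffer = []
        self.constraint = constraint

    def trigger(self, measure):
        """Triggering event (e.g. a man-machine interaction): accept the entity
        only if the constraint on its measure holds."""
        if self.state == "idle" and self.constraint(measure):
            self.buffer.append(measure)
            self.state = "working"
            return True
        return False

    def release(self):
        """Finish the operation and pass the entity's measure downstream."""
        self.state = "idle"
        return self.buffer.pop()

# Interconnecting machines composes their flows, and hence their measure constraints.
press = Machine("press", constraint=lambda m: m["mass"] <= 5.0)
oven = Machine("oven", constraint=lambda m: m["temperature"] < 80.0)

item = {"mass": 2.0, "temperature": 40.0}
if press.trigger(item):
    item = press.release()
    item["temperature"] += 30.0            # the process step enriches the measure
    print("oven accepts:", oven.trigger(item))   # True: 70 degrees < 80 degrees
```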

Relevance:

90.00%

Publisher:

Abstract:

Granular flow phenomena are frequently encountered in the design of process and industrial plants in the traditional fields of the chemical, nuclear and oil industries, as well as in other activities such as food and materials handling. Multi-phase flow is one important branch of granular flow. Granular materials behave unusually compared with ordinary solids or fluids. Although some of their characteristics are still not well understood, one thing is confirmed: particle-particle interaction plays a key role in the dynamics of granular materials, especially dense granular materials. The first part of this thesis presents in detail the development of two models describing this interaction, based on the results of finite-element simulation, dimensional analysis and numerical simulation. The first model describes the normal collision of viscoelastic particles. Building on existing models, additional parameters are introduced, which allow the model to predict experimental results more accurately. The second model addresses oblique collision and includes the effects of tangential velocity, angular velocity and surface friction based on Coulomb's law. The theoretical predictions of this model agree with those of finite-element simulation. In the latter chapters of this thesis, the models are used to predict industrial granular flows, and the agreement between simulations and experiments further validates the new models. The first case presents the simulation of granular flow passing over a circular obstacle. The simulations successfully predict the existence of a parabolic steady layer and show how particle characteristics, such as the coefficients of restitution and surface friction, affect the separation results. The second case is a spinning container filled with granular material. Employing the previous models, the simulation also reproduces experimentally observed phenomena, such as a depression in the center at high rotation frequencies. The third application concerns gas-solid mixed flow in a vertically vibrated device. Gas-phase motion is added and coupled with the particle motion. The governing equations of the gas phase are solved using large eddy simulation (LES) and particle motion is predicted using the Lagrangian method. The simulation reproduces pattern formations reported in experiments.
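
As a point of reference for the collision models discussed above, the following sketch implements only the elementary hard-sphere normal collision with a constant coefficient of restitution; the viscoelastic normal model and the oblique model with Coulomb friction developed in the thesis add further parameters on top of this basic picture. All numbers are illustrative.

```python
import numpy as np

def normal_collision(m1, m2, v1, v2, n, e=0.9):
    """Post-collision velocities of two spheres.
    m1, m2: masses; v1, v2: velocity vectors; n: unit normal from particle 1 to 2;
    e: coefficient of restitution (1 = elastic, 0 = perfectly inelastic)."""
    n = np.asarray(n, dtype=float)
    n /= np.linalg.norm(n)
    v_rel = np.dot(np.asarray(v1, float) - np.asarray(v2, float), n)  # approach speed along n
    if v_rel <= 0.0:                     # particles are separating: no impact
        return np.asarray(v1, float), np.asarray(v2, float)
    # impulse magnitude from momentum conservation and Newton's impact law
    j = (1.0 + e) * v_rel / (1.0 / m1 + 1.0 / m2)
    return v1 - j * n / m1, v2 + j * n / m2

v1p, v2p = normal_collision(1.0, 1.0, [1.0, 0.0], [0.0, 0.0], [1.0, 0.0], e=0.8)
print(v1p, v2p)   # equal masses: most of the momentum is transferred to particle 2
```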

Relevance:

90.00%

Publisher:

Abstract:

A sign of presence in virtual environments is that people respond to situations and events as if they were real, where response may be considered at many different levels, ranging from unconscious physiological responses through to overt behavior, emotions, and thoughts. In this paper we consider two responses that gave different indications of the onset of presence in a gradually forming environment. Two aspects of the response of people to an immersive virtual environment were recorded: their eye scanpath, and their skin conductance response (SCR). The scenario was formed over a period of 2 min, by introducing an increasing number of its polygons in random order in a head-tracked head-mounted display. For one group of experimental participants (n = 8) the environment formed into one in which they found themselves standing on top of a 3 m high column. For a second group of participants (n = 6) the environment was otherwise the same except that the column was only 1 cm high, so that they would be standing at normal ground level. For a third group of participants (n = 14) the polygons never formed into a meaningful environment. The participants who stood on top of the tall column exhibited a significant decrease in entropy of the eye scanpath and an increase in the number of SCR by 99 s into the scenario, at a time when only 65% of the polygons had been displayed. The ground level participants exhibited a similar decrease in scanpath entropy, but not the increase in SCR. The random scenario grouping did not exhibit this decrease in eye scanpath entropy. A drop in scanpath entropy indicates that the environment had cohered into a meaningful perception. An increase in the rate of SCR indicates the perception of an aversive stimulus. These results suggest that on these two dimensions (scanpath entropy and rate of SCR) participants were responding realistically to the scenario shown in the virtual environment. In addition, the response occurred well before the entire scenario had been displayed, suggesting that once a set of minimal cues exists within a scenario, it is enough to form a meaningful perception. Moreover, at the level of the sympathetic nervous system, the participants who were standing on top of the column exhibited arousal as if their experience might be real. This is an important practical aspect of the concept of presence.
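
A minimal sketch of the scanpath-entropy measure referred to above: gaze samples are binned into a hypothetical grid of regions and the Shannon entropy of the resulting distribution is computed. The grid, the data and the binning are illustrative assumptions; the paper's exact entropy definition may differ (for instance, it may be computed over fixation transitions).

```python
import math
from collections import Counter

def scanpath_entropy(gaze_points, grid=(4, 4)):
    """Shannon entropy (bits) of gaze positions binned on a rows x cols grid.
    gaze_points: iterable of (x, y) with x, y in [0, 1)."""
    rows, cols = grid
    counts = Counter((int(y * rows), int(x * cols)) for x, y in gaze_points)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Concentrated gaze (low entropy) versus wandering gaze (high entropy).
focused = [(0.55, 0.56), (0.57, 0.55), (0.60, 0.58), (0.56, 0.60)]
wandering = [(0.1, 0.1), (0.9, 0.2), (0.3, 0.8), (0.7, 0.6)]
print(scanpath_entropy(focused), scanpath_entropy(wandering))   # 0.0 vs 2.0 bits
```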

Relevance:

90.00%

Publisher:

Abstract:

The production of stone groundwood pulp is energy intensive. Over 90 percent of the energy used is converted into heat. About half of the power converted into heat at the groundwood mill can be transferred to the paper machine. The water circulations of mechanical pulp production and the paper machine are separated from each other to prevent the transport of detrimental substances. Separating the water circulations also cuts off the transfer of heat from the groundwood mill to the paper machine with the pulp flows. By using heat exchangers to cool the groundwood mill waters, the shower water temperature of the grinders can be lowered. At the paper machine, this heat transfer raises the headbox temperature through the dilution of the dosing pulps. The task of this work was defined as eliminating the need for raw-water cooling of the grinder shower water that occurs during the summer months, primarily in such a way that the excess heat is utilized at the mill. Other objectives were the control of the dosing pulp temperatures, especially changes that allow the dosing temperature of broke pulp to be raised. The experimental part of the work was carried out at the UPM Kymmene Oyj Kajaani mills during autumn 2004. Using models built with the WinGEMS simulation program, the work examined heat transfer between the groundwood mill and paper machine 2, as well as heat transfer out of the balance area. A detailed simulation model of the current situation was built to match the existing production process, and different alternatives were derived from it to solve the tasks set for the study. With changes to the process connections, over 85% of the excess heat of the grinder shower water could be transferred from the groundwood mill without new equipment purchases. Finally, by installing a heat exchanger in the groundwood mill's clear filtrate line, the cooling need of the grinder shower water was eliminated completely. At the same time, the temperature of the pulp going to bleaching was lowered, reducing the chemical consumption of peroxide bleaching by more than 10%. A pinch analysis of the summer situation was made for the heat exchanger network to determine the heating and cooling needs of the process. The analysis showed that the connections do not violate the pinch rules and that the process exhibits a threshold problem, in which the process requires only cooling.
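
For context, the following sketch shows the problem-table (cascade) calculation that underlies a pinch analysis of the kind described above. The stream data and dT_min are hypothetical and unrelated to the WinGEMS model; a threshold problem appears as zero hot-utility demand, i.e. the process needs only cooling.

```python
def problem_table(streams, dT_min):
    """streams: list of (T_supply, T_target, CP) in degrees C and kW/degree C.
    Hot streams have T_supply > T_target. Returns (Q_hot_min, Q_cold_min) in kW."""
    shifted = []
    for Ts, Tt, cp in streams:
        shift = -dT_min / 2 if Ts > Tt else dT_min / 2     # hot shifted down, cold up
        shifted.append((Ts + shift, Tt + shift, cp, Ts > Tt))
    temps = sorted({T for s in shifted for T in s[:2]}, reverse=True)
    cascade = [0.0]
    for T_hi, T_lo in zip(temps, temps[1:]):
        net_cp = 0.0
        for a, b, cp, is_hot in shifted:
            if min(a, b) <= T_lo and max(a, b) >= T_hi:    # stream spans the interval
                net_cp += -cp if is_hot else cp            # cold streams demand heat
        cascade.append(cascade[-1] - net_cp * (T_hi - T_lo))
    q_hot = max(0.0, -min(cascade))                        # make the cascade non-negative
    q_cold = cascade[-1] + q_hot
    return q_hot, q_cold

# one hot stream to cool and one small cold stream to heat (CP in kW/degree C)
print(problem_table([(70, 40, 10.0), (20, 35, 2.0)], dT_min=10))   # (0.0, 270.0): threshold case
```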

Relevance:

90.00%

Publisher:

Abstract:

The goal of this study is to develop managerial recommendations for international vendors and system integrators that offer Software as a Service for enterprise information systems on the Russian market. These recommendations can be used to develop marketing, sales, new product and service level agreement strategies. To that end, the factors affecting SaaS adoption were determined and their influence on the intention to adopt was examined.

Relevance:

90.00%

Publisher:

Abstract:

This paper presents a study of the dynamics of the rattling problem in gearboxes under non-ideal excitation. The subject has been analyzed by a number of authors, such as Karagiannis and Pfeiffer (1991), for the ideal excitation case. An interesting model of the same problem by Moon (1992) has recently been used by Souza and Caldas (1999) to detect chaotic behavior. We consider two spur gears with different diameters and gaps between the teeth. The motion of one gear is assumed to be given, while the motion of the other is governed by its own dynamics. In the ideal case, the driving wheel is supposed to undergo a sinusoidal motion with given constant amplitude and frequency. In this paper, we consider the motion to be a function of the system response, and a limited energy source is adopted. Thus an extra degree of freedom is introduced into the problem. The equations of motion are obtained via a Lagrangian approach with some assumed characteristic torque curves. Extensive numerical integration is then used to detect some interesting geometrical aspects of regular and irregular motions of the system response.
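
For orientation, here is a sketch of the classical rattling model under ideal sinusoidal excitation (the paper itself studies the non-ideal, limited-energy-source case): the driven wheel flies freely inside the backlash gap and rebounds from the gap boundaries according to Newton's impact law. All parameter values are assumed.

```python
import math

A, omega = 1.0, 2.0      # amplitude and frequency of the driving wheel (assumed)
gap, e = 0.2, 0.8        # half backlash gap and coefficient of restitution (assumed)
dt, t_end = 1e-4, 50.0

s = lambda t: A * math.sin(omega * t)            # prescribed driving wheel motion
sdot = lambda t: A * omega * math.cos(omega * t)

x, v, t = 0.0, 0.0, 0.0   # driven wheel position/velocity; free flight between impacts
impacts = []
while t < t_end:
    x, t = x + v * dt, t + dt                    # free flight: no torque on the driven wheel
    r = x - s(t)                                 # relative displacement inside the gap
    v_rel = v - sdot(t)
    if abs(r) >= gap and r * v_rel > 0:          # tooth contact while still closing in
        v = sdot(t) - e * v_rel                  # reverse and damp the relative velocity
        x = s(t) + math.copysign(gap, r)         # put the wheel back on the gap boundary
        impacts.append(t)

print(f"{len(impacts)} impacts in {t_end:.0f} s; last few at",
      [round(ti, 2) for ti in impacts[-3:]])
```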

Relevance:

90.00%

Publisher:

Abstract:

This thesis presents a one-dimensional, semi-empirical dynamic model for the simulation and analysis of a calcium looping process for post-combustion CO2 capture. Reducing greenhouse gas emissions from fossil fuel power production requires rapid action, including the development of efficient carbon capture and sequestration technologies. The development of new carbon capture technologies can be expedited by using modelling tools. Techno-economic evaluation of new capture processes can be done quickly and cost-effectively with computational models before building expensive pilot plants. Post-combustion calcium looping is a developing carbon capture process which utilizes fluidized bed technology with lime as a sorbent. The main objective of this work was to analyse the technological feasibility of the calcium looping process at different scales with a computational model. A one-dimensional dynamic model was applied to the calcium looping process, simulating the behaviour of the interconnected circulating fluidized bed reactors. The model couples fundamental mass and energy balance solvers with semi-empirical models describing solid behaviour in a circulating fluidized bed and the chemical reactions occurring in the calcium loop. In addition, fluidized bed combustion, heat transfer and core-wall layer effects were modelled. The calcium looping model framework was successfully applied to a 30 kWth laboratory scale unit and a 1.7 MWth pilot scale unit, and was used to design a conceptual 250 MWth industrial scale unit. Valuable information was gathered on the behaviour of the small scale laboratory device. In addition, the interconnected behaviour of the pilot plant reactors and the effect of solid fluidization on the thermal and carbon dioxide balances of the system were analysed. The scale-up study provided practical information on the thermal design of an industrial sized unit, the selection of particle size and operability in different load scenarios.
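
As a back-of-the-envelope companion to the abstract above, the following zero-dimensional carbonator balance shows the basic bookkeeping of CO2 capture and heat release. The thesis model is one-dimensional and far more detailed (hydrodynamics, core-wall layer, reaction kinetics); all stream values below are assumed.

```python
F_CO2 = 10.0       # mol/s of CO2 entering with the flue gas (assumed)
F_CaO = 40.0       # mol/s of sorbent circulated from the calciner (assumed)
X_ave = 0.20       # average carbonation conversion of the sorbent (assumed)
E_eq = 0.90        # equilibrium-limited maximum capture efficiency (assumed)
dH_carb = -178e3   # J/mol, enthalpy of CaO + CO2 -> CaCO3 (exothermic)

# CO2 capture is limited either by the sorbent's carrying capacity or by equilibrium.
captured = min(F_CaO * X_ave, F_CO2 * E_eq)
efficiency = captured / F_CO2
heat_release = -captured * dH_carb        # W that must be extracted from the carbonator

print(f"capture efficiency: {efficiency:.0%}, heat to remove: {heat_release / 1e6:.2f} MW")
```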

Relevance:

90.00%

Publisher:

Abstract:

The main objective of this master's thesis is to examine whether Weibull analysis is a suitable method for warranty forecasting in the Case Company. The Case Company has used Reliasoft's Weibull++ software, which is based on the Weibull method, but has noticed that the analysis has not given correct results. The study was conducted by running Weibull simulations in different profit centers of the Case Company and then comparing actual and forecast costs. Simulations were made using different time frames and two methods for determining future deliveries. The first sub-objective is to examine which simulation parameters give the best result for each profit center. The second sub-objective is to create a simple control model for following forecast costs against actual realized costs. The third sub-objective is to document all Qlikview parameters of the profit centers. This is a constructive study, and solutions to the company's problems are worked out in this master's thesis. The theory part introduces quality topics, for example what quality is, quality costing and the cost of poor quality. Quality is one of the major concerns in the Case Company, so understanding the link between quality and warranty forecasting is important. Warranty management and other tools for warranty forecasting were also introduced, as were the Weibull method, its mathematical properties and reliability engineering. The main result of this master's thesis is that the Weibull analysis forecast costs that were too high when calculating the provision. Although some forecast values for profit centers were lower than the actual values, the method works better for planning purposes. One of the reasons is that quality improvement, or alternatively quality deterioration, does not show in the results of the analysis in the short run. The other reason for the excessive values is that the products of the Case Company are complex and the analyses were made at the profit center level. The Weibull method was developed for standard products, but the products of the Case Company consist of many complex components. According to the theory, the method was developed for homogeneous data. The most important finding is therefore that the analysis should be made at the product level, not the profit center level, where the data is more homogeneous.
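
A minimal sketch of the Weibull-based warranty forecast discussed above: the fraction of units expected to fail within the warranty period is taken from the Weibull CDF and multiplied by the delivered volume. The shape and scale parameters below are hypothetical; in practice they would be estimated from claim data, as the Case Company does with Weibull++.

```python
import math

def expected_claims(units_delivered, warranty_months, beta, eta):
    """beta: Weibull shape parameter, eta: scale (characteristic life, months).
    Assumes each unit produces at most one claim (non-repairable view)."""
    fraction_failed = 1.0 - math.exp(-((warranty_months / eta) ** beta))
    return units_delivered * fraction_failed

# e.g. 500 units delivered, 24-month warranty, wear-out behaviour (beta > 1)
print(round(expected_claims(500, 24, beta=1.5, eta=120.0), 1))
```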

Relevance:

90.00%

Publisher:

Abstract:

In the field of molecular biology, scientists adopted for decades a reductionist perspective in their inquiries, being predominantly concerned with the intricate mechanistic details of subcellular regulatory systems. Integrative thinking was nevertheless applied at a smaller scale in molecular biology, to understand the underlying processes of cellular behaviour, for at least half a century. It was not until the genomic revolution at the end of the previous century that model building was required to account for the systemic properties of cellular activity. Our system-level understanding of cellular function is to this day hindered by drastic limitations in our capability to predict cellular behaviour in a way that reflects system dynamics and system structures. To this end, systems biology aims for a system-level understanding of functional intra- and inter-cellular activity. Modern biology brings about a high volume of data, which cannot even be apprehended without computational support. Computational modelling thus bridges modern biology and computer science, enabling a number of assets that prove invaluable in the analysis of complex biological systems: a rigorous characterization of the system structure, simulation techniques, perturbation analysis, etc. Computational biomodels have grown considerably in size in recent years, with major contributions made towards the simulation and analysis of large-scale models, starting with signalling pathways and culminating in whole-cell models, tissue-level models, organ models and full-scale patient models. The simulation and analysis of models of such complexity very often requires the integration of various sub-models, entwined at different levels of resolution and whose organization spans several levels of hierarchy. This thesis revolves around the concept of quantitative model refinement in relation to the process of model building in computational systems biology. It proposes a sound computational framework for the stepwise augmentation of a biomodel. One starts with an abstract, high-level representation of a biological phenomenon, which is materialised into an initial model that is validated against a set of existing data. The model is then refined to include more details regarding its species and/or reactions. The framework is employed in the development of two models, one for the heat shock response in eukaryotes and the second for the ErbB signalling pathway. The thesis spans several formalisms used in computational systems biology that are inherently quantitative (reaction-network models, rule-based models and Petri net models), as well as a recent, intrinsically qualitative formalism: reaction systems. The choice of modelling formalism is, however, determined by the nature of the question the modeller aims to answer. Quantitative model refinement turns out to be not only essential in the model development cycle, but also beneficial for the compilation of large-scale models, whose development requires the integration of several sub-models across various levels of resolution and underlying formal representations.
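
To illustrate what quantitative model refinement means in the simplest possible setting, the sketch below refines a toy reaction network A -> B by splitting B into two subspecies while preserving the original kinetics, so that the refined model reproduces the abstract one. The species, rates and forward-Euler integration are illustrative assumptions, not the thesis's heat shock or ErbB models.

```python
k, p = 0.5, 0.3          # rate constant and branching fraction (assumed)
dt, steps = 0.01, 1000

# abstract model: dA/dt = -k*A, dB/dt = k*A
A, B = 1.0, 0.0
# refined model: dA/dt = -k*A, dB1/dt = p*k*A, dB2/dt = (1 - p)*k*A
Ar, B1, B2 = 1.0, 0.0, 0.0

max_gap = 0.0
for _ in range(steps):
    A, B = A - k * A * dt, B + k * A * dt
    flux = k * Ar * dt
    Ar, B1, B2 = Ar - flux, B1 + p * flux, B2 + (1 - p) * flux
    max_gap = max(max_gap, abs(B - (B1 + B2)))

print(f"max |B - (B1 + B2)| over the run: {max_gap:.2e}")   # ~0: the refinement is consistent
```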

Relevance:

90.00%

Publisher:

Abstract:

This document first discusses the various attempts at modelling and simulating anguilliform swimming, and then develops a new technique based on the generalized immersed boundary method and Reissner-Simo beam theory. The latter, like the equations of polar fluids, is derived from continuum mechanics, and the resulting equations are then discretized to make them amenable to numerical solution. For the first time, the theory of additive Runge-Kutta schemes is combined with that of Runge-Kutta-Munthe-Kaas schemes to generate a method of arbitrary formal order of convergence. In addition, the interpolation and spreading operations are treated from a new point of view, which suggests the use of nodal interpolating splines in place of the traditional spreading functions. Finally, numerous numerical verifications are performed before the swimming simulations are considered.
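
For background only, here is a generic explicit Runge-Kutta step driven by a Butcher tableau. It is not the additive / Runge-Kutta-Munthe-Kaas combination developed in the thesis, merely the classical explicit construction whose formal order is set by the chosen tableau; the test problem is illustrative.

```python
def rk_step(f, t, y, h, a, b, c):
    """One explicit Runge-Kutta step for y' = f(t, y) with Butcher tableau (a, b, c)."""
    k = []
    for i in range(len(b)):
        yi = y + h * sum(a[i][j] * k[j] for j in range(i))
        k.append(f(t + c[i] * h, yi))
    return y + h * sum(bi * ki for bi, ki in zip(b, k))

# classical fourth-order tableau
A4 = [[0, 0, 0, 0], [0.5, 0, 0, 0], [0, 0.5, 0, 0], [0, 0, 1, 0]]
B4 = [1 / 6, 1 / 3, 1 / 3, 1 / 6]
C4 = [0, 0.5, 0.5, 1]

# integrate y' = -y from y(0) = 1; the exact value at t = 1 is exp(-1) = 0.3679...
y, t, h = 1.0, 0.0, 0.1
while t < 1.0 - 1e-12:
    y, t = rk_step(lambda t, u: -u, t, y, h, A4, B4, C4), t + h
print(y)
```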

Relevance:

90.00%

Publisher:

Abstract:

In a hyperconnected, dynamic and uncertainty-laden world such as today's, conventional analytical methods and models are showing their limitations. Organizations therefore require useful tools that employ information technology and computational simulation models as mechanisms for decision making and problem solving. One of the most recent, powerful and promising is agent-based modelling and simulation (MSBA, by its Spanish acronym). Many organizations, including consulting firms, use this technique to understand phenomena, evaluate strategies and solve problems of various kinds. Nevertheless, there is, to our knowledge, no situational overview of MSBA and its application to organizational research. It should also be noted that, owing to its novelty, the topic is not yet sufficiently widespread or developed in Latin America. Consequently, this project aims to produce a situational overview of MSBA and its impact on organizational research.

Relevance:

90.00%

Publisher:

Abstract:

Virtual Reality (VR) is widely used in visualizing medical datasets. This interest has emerged due to the usefulness of its techniques and features. Such features include immersion, collaboration, and interactivity. In a medical visualization context, immersion is important, because it allows users to interact directly and closely with detailed structures in medical datasets. Collaboration on the other hand is beneficial, because it gives medical practitioners the chance to share their expertise and offer feedback and advice in a more effective and intuitive approach. Interactivity is crucial in medical visualization and simulation systems, because responsive and instantaneous actions are key attributes in applications, such as surgical simulations. In this paper we present a case study that investigates the use of VR in a collaborative networked CAVE environment from a medical volumetric visualization perspective. The study will present a networked CAVE application, which has been built to visualize and interact with volumetric datasets. We will summarize the advantages of such an application and the potential benefits of our system. We also will describe the aspects related to this application area and the relevant issues of such implementations.

Relevance:

90.00%

Publisher:

Abstract:

An enterprise is viewed as a complex system which can be engineered to accomplish organisational objectives. Systems analysis and modelling enable the planning and development of the enterprise and its IT systems. Many IT systems design methods focus on the functional and non-functional requirements of the IT systems. Most methods are capable of addressing one but leave out the other aspects. Analysing and modelling both business and IT systems may therefore have to call on techniques from various suites of methods, which may rest on different philosophical and methodological underpinnings. Coherence and consistency between the analyses are hard to ensure. This paper introduces the Problem Articulation Method (PAM), which facilitates the design of an enterprise system infrastructure on which an IT system is built. The outcomes of this analysis represent requirements which can be further used for planning and designing a technical system. As a case study, a finance system for e-procurement, Agresso, is used in this paper to illustrate the applicability of PAM in modelling complex systems.