948 results for Process simulation


Relevance:

30.00%

Publisher:

Abstract:

Global energy consumption has been increasing yearly, and a large portion of it is used in rotating electrical machines, so it is clear that these machines should use energy efficiently. The aim of this dissertation is to improve the design process of high-speed electrical machines, especially from the mechanical engineering perspective, in order to achieve more reliable and efficient machines. The design process of high-speed machines is challenging because of high performance demands and the many interactions between engineering disciplines such as mechanical, electrical and energy engineering. A multidisciplinary design flow chart that makes use of computer simulation is proposed for a specific type of high-speed machine. In addition to using simulation in parallel with the design process, two simulation studies are presented. The first is used to find the limits of two ball bearing models. The second studies how the load capacity of the machine can be improved in a compressor application so as to exceed the limits of current machinery. The proposed flow chart and simulation studies show clearly that the design process of high-speed machinery can be improved. Engineers designing high-speed machines can use the flow chart and the simulation results as guidelines during the design phase to achieve more reliable and efficient machines that use energy efficiently across the required range of operating conditions.


This dissertation describes an approach for developing a real-time simulation of working mobile vehicles based on multibody modeling. Multibody modeling allows a comprehensive description of the constrained motion of the mechanical systems involved and permits solving the equations of motion in real time. By carefully selecting the multibody formulation method, it is possible to increase the accuracy of the multibody model while still solving the equations of motion in real time. In this study, a multibody procedure based on semi-recursive and augmented Lagrangian methods for real-time dynamic simulation is studied in detail. In the semi-recursive approach, a velocity transformation matrix is introduced to map the dependent coordinates into relative (joint) coordinates, which reduces the number of generalized coordinates. The augmented Lagrangian method is based on global coordinates, and constraints are accounted for using an iterative process. A multibody system can be modelled with either rigid or flexible bodies. When using flexible bodies, the system can be described with a floating frame of reference formulation, in which the required deformation modes can be obtained from a finite element model. As the finite element model typically involves a large number of degrees of freedom, a reduced set of deformation modes can be obtained by employing model order reduction methods such as Guyan reduction, the Craig-Bampton method and Krylov subspace methods, as shown in this study. The constrained motion of working mobile vehicles is actuated by forces from hydraulic actuators. In this study, the hydraulic system is modeled using lumped fluid theory, in which the hydraulic circuit is divided into discrete volumes and the pressure wave propagation in hoses and pipes is neglected. Contact modeling is divided into two stages: contact detection and contact response.
Contact detection determines when and where contact occurs, and contact response provides the force acting at the collision point. The friction between tire and ground is modelled using the LuGre friction model, which describes the frictional force between two surfaces. Typically, the equations of motion are solved in full-matrix format, where the sparsity of the matrices is not considered. Increasing the number of bodies and constraint equations makes the system matrices large and sparse in structure. To increase computational efficiency, a technique for the solution of sparse matrices is proposed in this dissertation and its implementation demonstrated. To assess computing efficiency, the augmented Lagrangian and semi-recursive methods are implemented employing this sparse matrix technique. The numerical examples show that the proposed approach is applicable and produces appropriate results within the real-time period.
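The LuGre model referred to above is compact enough to sketch. The parameter values and the explicit Euler integration below are illustrative choices, not those used in the dissertation:

```python
import math

# LuGre friction sketch with illustrative parameters (not the dissertation's).
SIGMA0 = 1e5    # bristle stiffness [N/m]
SIGMA1 = 300.0  # bristle damping [N s/m]
SIGMA2 = 0.4    # viscous coefficient [N s/m]
FC, FS = 1.0, 1.5  # Coulomb and static friction levels [N]
VS = 0.01       # Stribeck velocity [m/s]

def g(v):
    """Stribeck curve: velocity-dependent friction level."""
    return FC + (FS - FC) * math.exp(-(v / VS) ** 2)

def lugre_step(z, v, dt):
    """Advance the bristle deflection z one explicit Euler step; return (z, force)."""
    zdot = v - SIGMA0 * abs(v) / g(v) * z
    z_new = z + zdot * dt
    force = SIGMA0 * z_new + SIGMA1 * zdot + SIGMA2 * v
    return z_new, force

# Constant sliding speed: z tends to g(v)/SIGMA0 and the force to g(v) + SIGMA2*v.
z, v, dt = 0.0, 0.05, 1e-5
for _ in range(20000):
    z, force = lugre_step(z, v, dt)
```

At constant sliding speed the force settles to g(v) + SIGMA2*v, which is a convenient sanity check for any implementation of the model.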


This study has two main objectives. First, the phlebotomy process at the St. Catharines Site of the Niagara Health System is investigated; the process starts when an order for a blood test is placed and ends when the specimen arrives at the lab. The performance measure is the flow time of the process, which reflects the concerns and interests of both the hospital and the patients. Three popular operational methodologies are applied to reduce the flow time and improve the process: DMAIC from Six Sigma, lean principles, and simulation modeling. Suggestions are provided for the St. Catharines Site that could reduce the flow time by an average of seven minutes. The second objective addresses the fact that these three methodologies have not previously been combined in a process improvement effort. A structured framework combining them is developed to benefit future studies of phlebotomy and other hospital processes.


Pedagogical innovation for its own sake is sometimes questionable, but it is justified when teachers face their students' learning difficulties. In particular, certain physics concepts are known to be hard for students to grasp, as is the case for the photoelectric effect, which is often poorly understood by college-level students. This research attempts to determine whether, within a physics course, simulating the photoelectric effect on mobile devices in a collaborative setting fosters an evolution of students' conceptions of light. We therefore designed a collaborative learning scenario integrating a simulation of the photoelectric effect on a handheld computer. The design of the scenario was first shaped by our socioconstructivist view of learning. We conducted two preliminary studies to complete the learning scenario and to validate the MobileSim platform and the simulator interface used in our experiment: the first with desktop computers and the second with handheld computers. Two groups of students followed two different courses, one based on a traditional teaching approach, the other on the collaborative learning scenario we developed. Both groups took a test assessing conceptual change regarding the nature of light, the photoelectric effect and related concepts, twice: first before the students engaged in the course, and again after the experiments. The pre-test and post-test results are complemented by semi-structured individual interviews with all the students, by video recordings, and by traces recovered from log files or on paper.
The students in the experimental group obtained much better post-test results than those in the control group. We recorded a mean learning gain rated as moderate according to Hake (1998). The interviews revealed several conceptual learning difficulties among the students. Analysis of the video recordings, questionnaires and recovered traces gave us a better understanding of the collaborative learning process and revealed that the number and duration of interactions between students are strongly correlated with the learning gain. This research project is first of all a success in designing a learning scenario for a phenomenon as complex as the photoelectric effect while satisfying several criteria (collaboration, simulation, mobile devices) that had seemed extremely utopian to combine in a classroom learning situation. The scenario can be adapted to the learning of other physics concepts and can inform the design of innovative mobile collaborative learning environments centred on learners' needs, integrating the technologies at the right time and for the right activity.
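The physics at the heart of the simulated scenario is Einstein's photoelectric relation E_k = h*f - phi, which is easy to sketch numerically; the cesium work function below is an illustrative value, not a parameter of the MobileSim simulator:

```python
# Einstein's photoelectric relation sketch; the cesium work function (~2.1 eV)
# is an illustrative value.
H = 6.626e-34         # Planck constant [J s]
C = 2.998e8           # speed of light [m/s]
E_CHARGE = 1.602e-19  # elementary charge [C]

def max_kinetic_energy_eV(wavelength_nm, work_function_eV):
    """Maximum photoelectron kinetic energy in eV; 0 below threshold."""
    photon_eV = H * C / (wavelength_nm * 1e-9) / E_CHARGE
    return max(0.0, photon_eV - work_function_eV)

# 400 nm light on cesium ejects electrons of about 1 eV...
ek = max_kinetic_energy_eV(400.0, 2.1)
# ...while 700 nm light is below threshold, however intense:
ek_red = max_kinetic_energy_eV(700.0, 2.1)
```

The second call illustrates the conceptual point such a scenario targets: below-threshold light ejects no electrons regardless of intensity.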


Corteo is a program that implements the Monte Carlo (MC) method to simulate ion beam analysis (IBA) spectra for several techniques by following ion trajectories until a sufficiently large fraction of the ions reach the detector to generate a spectrum. It therefore fully accounts for effects such as multiple scattering (MS). Here, a version of Corteo is presented in which the target can be a 2D or 3D image. This image can be derived from micrographs in which the different compounds are identified, bringing extra information into the solution of an IBA spectrum and potentially constraining the solution significantly. The image intrinsically includes details such as the actual surface or interfacial roughness, or the actual shape and distribution of nanostructures. This can, for example, lead to the unambiguous identification of the stoichiometry of structures in a layer, or at least to better constraints on their composition. Because MC computes the ion trajectories in detail, it accurately simulates many aspects of ion transport, such as ions coming back into the target after leaving it (re-entry) and ions passing through nanostructures of various shapes and orientations. We show, for example, that when the ions' angle of incidence becomes shallower than the inclination distribution of a rough surface, re-entry tends to make the effective roughness smaller than in a comparable 1D simulation (i.e. it narrows the thickness distribution compared to a slab simulation). Also, in ordered nanostructures, target re-entry can lead to replications of a peak in the spectrum. In addition, the bitmap description of the target can be used to simulate depth profiles such as those resulting from ion implantation, diffusion, and intermixing. Other improvements to Corteo include the possibility of interpolating the cross-section in angle-energy tables, and the generation of energy-depth maps.
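The image-based target idea can be illustrated with a toy straight-line march through a 2D bitmap of material indices; Corteo itself follows fully scattered MC trajectories, and every name and value below is a made-up illustration:

```python
import math

# Toy version of the image-based target: a straight ray marched through a
# 2D bitmap of material indices, tallying path length per material.
bitmap = [
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 0, 0],
    [0, 0, 0, 0],
]  # 0 = substrate, 1 = nanostructure
PIXEL = 1.0  # pixel size [nm]

def path_lengths(x, y, angle_deg, step=0.01, max_len=10.0):
    """March along a straight ray; return {material: path length in nm}."""
    lengths = {}
    dx = math.cos(math.radians(angle_deg)) * step
    dy = math.sin(math.radians(angle_deg)) * step
    travelled = 0.0
    while travelled < max_len:
        i, j = int(y // PIXEL), int(x // PIXEL)
        if not (0 <= i < len(bitmap) and 0 <= j < len(bitmap[0])):
            break  # the ray left the target (full Corteo would allow re-entry)
        m = bitmap[i][j]
        lengths[m] = lengths.get(m, 0.0) + step
        x, y = x + dx, y + dy
        travelled += step
    return lengths

# A ray entering the nanostructure region from the top, heading straight down:
out = path_lengths(2.5, 0.005, 90.0)
```

Per-material path lengths are the basic quantity a spectrum simulation needs from such a geometry.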


An Ising-like model, with interactions ranging up to next-nearest-neighbor pairs, is used to simulate the process of interface alloying. Interactions are chosen to stabilize an intermediate "antiferromagnetic" ordered structure. The dynamics proceeds exclusively by atom-vacancy exchanges. In order to characterize the process, the time evolution of the width of the intermediate ordered region and the diffusion length is studied. Both lengths are found to follow a power-law evolution with exponents depending on the characteristic features of the model.
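A minimal sketch of such atom-vacancy exchange dynamics, reduced to nearest-neighbour couplings and illustrative parameters (the model in the abstract also includes next-nearest-neighbour interactions), might look like this:

```python
import math
import random

# A|B bilayer on a small periodic lattice with a single vacancy and
# Metropolis-accepted atom-vacancy swaps. All values are illustrative.
random.seed(0)
L_SIZE = 8
J = 1.0     # nearest-neighbour coupling
BETA = 2.0  # inverse temperature

# +1 marks A atoms (top half), -1 marks B atoms (bottom half), 0 the vacancy.
grid = [[1 if i < L_SIZE // 2 else -1 for _ in range(L_SIZE)] for i in range(L_SIZE)]
vac = (L_SIZE // 2, 0)
grid[vac[0]][vac[1]] = 0

def site_energy(i, j, s):
    """Interaction energy of a spin s placed at site (i, j)."""
    e = 0.0
    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        e += -J * s * grid[(i + di) % L_SIZE][(j + dj) % L_SIZE]
    return e

def attempt_swap():
    """One atom-vacancy exchange attempt with a Metropolis test."""
    global vac
    i, j = vac
    di, dj = random.choice(((1, 0), (-1, 0), (0, 1), (0, -1)))
    ni, nj = (i + di) % L_SIZE, (j + dj) % L_SIZE
    s = grid[ni][nj]
    e_old = site_energy(ni, nj, s)
    grid[ni][nj] = 0  # tentatively move the atom into the vacancy
    d_e = site_energy(i, j, s) - e_old
    if d_e <= 0 or random.random() < math.exp(-BETA * d_e):
        grid[i][j] = s
        vac = (ni, nj)
    else:
        grid[ni][nj] = s  # reject: restore

for _ in range(5000):
    attempt_swap()
```

Tracking the width of the interdiffused region over such sweeps is what yields the power-law exponents discussed in the abstract.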


Agent-based simulation is a rapidly developing area in artificial intelligence, and simulation studies are extensively used in different areas of disaster management. This work studies an agent-based evacuation simulation designed to handle various evacuation behaviors. Various emergent behaviors of agents are addressed, and dynamic grouping behaviors of agents are studied. Collision detection and obstacle avoidance are also incorporated in this approach. Evacuation is studied with single and multiple exits, and efficiency is measured in terms of evacuation rate, collision rate, etc. NetLogo is the tool used, which supports efficient modeling of evacuation scenarios.
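Although the study uses NetLogo, the core update loop of such an evacuation model can be sketched in a few lines; the grid size, exits and greedy movement rule below are illustrative assumptions:

```python
import random

# Minimal grid-evacuation sketch (the study itself uses NetLogo).
random.seed(4)
H, W = 15, 15
EXITS = [(0, W // 2), (H - 1, W // 2)]  # two exits; keep one for single-exit runs

def evacuate(n_agents=40, max_ticks=200):
    """Greedy evacuation; one agent per cell stands in for collision avoidance."""
    agents = set()
    while len(agents) < n_agents:
        agents.add((random.randrange(H), random.randrange(W)))
    evacuated = 0
    for tick in range(1, max_ticks + 1):
        moved = set()
        for (i, j) in agents:
            ex = min(EXITS, key=lambda e: abs(e[0] - i) + abs(e[1] - j))
            target = (i + (ex[0] > i) - (ex[0] < i), j + (ex[1] > j) - (ex[1] < j))
            if target in EXITS:
                evacuated += 1     # agent reaches an exit and leaves the room
            elif target in moved or target in agents:
                moved.add((i, j))  # blocked: wait this tick
            else:
                moved.add(target)
        agents = moved
        if not agents:
            return evacuated, tick
    return evacuated, max_ticks

done, ticks = evacuate()
```

The evacuation rate mentioned in the abstract corresponds here to the count of evacuated agents per tick.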


The centralised control rooms of large industrial plants have separated people from the processes they are supposed to control. Perception is restricted mainly to the visual sense; only telephone or radio links provide narrow-band voice communication with maintenance personnel down in the plant. Multimedia equipment can perceptually bring the operator back into the plant while keeping him bodily in the comfortable and safe control room. This involves video and audio transmission from process components as well as sights and sounds artificially generated from measurements. Groupware systems support interaction between operators, engineers, and managers in different plants. With support from the German government, the state of Hessen, and industrial companies, the Laboratory for Systems Engineering and Human-Machine Systems at the University of Kassel is establishing an Experimental Multimedia Process Control Room. The core of this set-up is two high-performance graphics workstations linked to one of several process or vehicle simulators. The multimedia periphery includes video- and teleconferencing equipment and a vibration and sound generation system.


This thesis presents modeling methods for the real-time simulation of the main pollutant components in the exhaust stream of combustion engines. A complete development workflow is described, with each step detailed, from the design of experiments through the construction of a suitable model structure to model validation. These methods are applied to reproduce the dynamic emission profiles of the relevant pollutants of a gasoline engine. Together with a full engine simulation, the derived emission models are used to optimize operating strategies in hybrid vehicles. The first part of the thesis presents a systematic procedure for planning and building complex, dynamic, real-time-capable model structures. It begins with a physically motivated structuring that divides a process model into manageable individual elements. Starting from the simplest possible nominal model core, each of these submodels is then extended step by step, ultimately enabling a robust reproduction even of complex dynamic behavior with sufficient accuracy. Since some submodels are realized as neural networks, a dedicated method for so-called discrete evident interpolation (DEI) was developed, which is applied during training and can ensure plausible, i.e. evident, behavior of experimental models with a minimal number of measurements. To calibrate the individual submodels, statistical experimental designs were created, generated both with classical DoE methods and with an iterative design of experiments (iDoE).
In the second part of the thesis, after identifying the most important influencing parameters, the model structures for reproducing the dynamic emission profiles of selected exhaust components are presented: unburned hydrocarbons (HC), nitrogen monoxide (NO), and carbon monoxide (CO). The simulation models reproduce the pollutant concentrations of a combustion engine in real time during cold start and the subsequent warm-up phase. In addition to the obligatory reproduction of steady-state behavior, the dynamic behavior of the engine in transient operating phases is also represented with sufficient accuracy. Consistent application of the methodology presented in the first part yields high simulation quality and robustness despite the large number of process variables. Embedded in the dynamic model of a complete combustion engine, the pollutant emission models are used to derive an optimal operating strategy for a hybrid vehicle. Model-based methods are particularly well suited to such optimization tasks; using dynamic and cold-start-capable models, and thus models close to reality, a high output quality can be achieved.


In many real-world contexts individuals find themselves in situations where they must decide between behaviour that serves a collective purpose and behaviour that satisfies their private interests while ignoring the collective. In some cases the underlying social dilemma (Dawes, 1980) is solved and we observe collective action (Olson, 1965); in others, social mobilisation is unsuccessful. The central topic of social dilemma research is the identification and understanding of the mechanisms that lead to the observed cooperation and thereby resolve the social dilemma. The purpose of this thesis is to contribute to this research field for the case of public good dilemmas. To this end, existing work relevant to this problem domain is reviewed and a set of mandatory requirements is derived, which guides the theory and method development of the thesis. In particular, the thesis focuses on the dynamic processes of social mobilisation that can foster or inhibit collective action. The basic understanding is that the success or failure of the required process of social mobilisation is determined by the heterogeneous individual preferences of the members of the providing group, the social structure in which the acting individuals are embedded, and the embedding of the individuals in economic, political, biophysical, or other external contexts. To account for these aspects and for the dynamics involved, the methodical approach of the thesis is computer simulation, in particular agent-based modelling and simulation of social systems. Particularly suitable are agent models that ground the simulation of human behaviour in appropriate psychological theories of action. The thesis develops the action theory HAPPenInGS (Heterogeneous Agents Providing Public Goods) and demonstrates its embedding into different agent-based simulations.
The thesis substantiates the particular added value of this methodical approach: starting from a theory of individual behaviour, the emergence of collective patterns of behaviour becomes observable in simulations. In addition, the underlying collective dynamics can be scrutinised and assessed by scenario analysis. The results of such experiments reveal insights into processes of social mobilisation that go beyond classical empirical approaches and, in particular, yield policy recommendations on promising intervention measures.
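As a contrast to the theory-grounded HAPPenInGS agents, even a bare threshold model (in the spirit of Granovetter-style cascade models) shows how heterogeneous preferences make mobilisation either cascade or stall; all numbers below are illustrative:

```python
import random

# Threshold sketch of social mobilisation in a public-goods setting.
# This is not HAPPenInGS, whose agents follow a psychological action theory.
random.seed(1)
N = 50
MULTIPLIER = 1.8  # production factor of the public good

# Heterogeneous preferences: agent i contributes if at least thresholds[i]
# of the group contributed in the previous round.
thresholds = [random.random() for _ in range(N)]

def run(rounds=30, initial_coop=0.3):
    """Iterate the threshold dynamics; return (cooperation share, per-capita good)."""
    coop = [random.random() < initial_coop for _ in range(N)]
    for _ in range(rounds):
        frac = sum(coop) / N
        coop = [thresholds[i] <= frac for i in range(N)]
    frac = sum(coop) / N
    return frac, MULTIPLIER * frac

final_frac, payoff = run()
```

Sweeping the initial cooperation share in such a model is a miniature version of the scenario analysis the thesis applies to its richer agents.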


Modeling and simulation permeate all areas of business, science and engineering. With the increase in the scale and complexity of simulations, large amounts of computational resources are required, and collaborative model development is needed, as multiple parties could be involved in the development process. The Grid provides a platform for coordinated resource sharing and application development and execution. In this paper, we survey existing technologies in modeling and simulation, and we focus on interoperability and composability of simulation components for both simulation development and execution. We also present our recent work on an HLA-based simulation framework on the Grid, and discuss the issues to achieve composability.


This paper analyzes a proposed release control methodology, WIPLOAD Control (WIPLCtrl), using a transfer line case modeled with a Markov process methodology. The performance of WIPLCtrl is compared with that of CONWIP under 13 system configurations in terms of throughput, average inventory level, and average cycle time. As a supplement to the analytical model, a simulation model of the transfer line is used to observe the performance of the release control methodologies with respect to the standard deviation of cycle time. From the analysis, we identify the system configurations in which the advantages of WIPLCtrl can be observed.
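The CONWIP release rule used as the benchmark is simple to sketch in a slotted-time simulation; the line length, card count and processing probability below are illustrative, not one of the paper's 13 configurations:

```python
import random

# Slotted-time sketch of CONWIP release control on a serial transfer line.
random.seed(3)
STATIONS, CARDS, P, STEPS = 3, 6, 0.7, 20000

queues = [0] * STATIONS  # jobs at each station (waiting or in service)
finished = 0
wip_sum = 0
for _ in range(STEPS):
    if sum(queues) < CARDS:  # CONWIP rule: release only when a card is free
        queues[0] += 1
    # Process stations back to front so a job moves at most one station per step.
    for k in reversed(range(STATIONS)):
        if queues[k] and random.random() < P:
            queues[k] -= 1
            if k + 1 < STATIONS:
                queues[k + 1] += 1
            else:
                finished += 1
    wip_sum += sum(queues)

throughput = finished / STEPS          # jobs per time step
avg_wip = wip_sum / STEPS              # average inventory level
avg_cycle_time = avg_wip / throughput  # Little's law estimate
```

Little's law (cycle time = WIP / throughput) then ties together the three performance measures compared in the paper.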


This research project seeks to use an agent-based computational system to measure an organization's brand perception within a heterogeneous population. It is expected to provide information that helps an organization understand the behaviour of its consumers and the associated brand perception. The purpose of the system is to model the perception-reasoning-action process, simulating reasoning as the result of an accumulation of perceptions that lead to the consumer's actions. This result defines the consumer's acceptance or rejection of the company's brand. Information was collected about a specific organization in the field of marketing. After compiling and processing the information obtained from the company, the brand-perception analysis is carried out through simulation. The results of the experiment are delivered to the organization in a report containing marketing-level conclusions and recommendations for improving consumers' brand perception.


The network theory of Johanson and Mattson (1988) explains how small firms, also known as SMEs, use business networks to develop their internationalization processes. Through networks they can overcome their size limitations and gain a certain fluidity and dynamism in their management, in order to take advantage of the benefits of internationalization. By developing and strengthening relationships within the network, the organization can reach an ever stronger competitive position (Jarillo, 1988). According to Forsgren and Johanson (1992), it is important for managers to coordinate the interaction between the different actors of the network, since through these interactions their position within the network improves and the flow of resources increases. The purpose of this work is to analyze, from a cultural perspective, the internationalization model, in terms of network theory, of e-Tech Simulation, an American "Born to be global" SME. This company has minimized its internationalization risk by developing agreements between the different actors. By improving its position within the network, that is, by further strengthening existing ties and creating new relationships, the company has obtained greater benefits from the network and has become even more flexible with its clients. Based on this analysis, a series of recommendations was proposed to improve negotiation processes within the network in a cultural context. The analysis also showed the importance of the manager's entrepreneurial role in internationalization processes, as well as the ability to combine resources obtained from different international markets to meet clients' needs.


Using the Met Office large-eddy model (LEM) we simulate a mixed-phase altocumulus cloud that was observed from Chilbolton in southern England by a 94 GHz Doppler radar, a 905 nm lidar, a dual-wavelength microwave radiometer and four radiosondes. It is important to test and evaluate such simulations against observations, since there are significant differences between the results from different cloud-resolving models for ice clouds. Simulating the Doppler radar and lidar data within the LEM allows us to compare observed and modelled quantities directly, and to explore the relationships between observed and unobserved variables. For general-circulation models, which currently tend to give poor representations of mixed-phase clouds, the case shows the importance of using: (i) separate prognostic ice and liquid water, (ii) a vertical resolution that captures the thin layers of liquid water, and (iii) an accurate representation of the subgrid vertical velocities that allow liquid water to form. It is shown that large-scale ascents and descents are significant for this case, so the horizontally averaged LEM profiles are relaxed towards the observed profiles to account for them. The LEM simulation then gives a reasonable cloud, with an ice-water path approximately two thirds of that observed and with liquid water at the cloud top, as observed. However, the liquid-water cells that form in the updraughts at cloud top in the LEM have liquid-water paths (LWPs) up to half those observed, and there are too few cells, giving a mean LWP five to ten times smaller than observed. In reality, ice nucleation and fallout may deplete ice-nuclei concentrations at the cloud top, allowing more liquid water to form there, but this process is not represented in the model. Decreasing the heterogeneous nucleation rate in the LEM increased the LWP, which supports this hypothesis.
The LEM captures the increase with height of the standard deviation in Doppler velocities (and hence in vertical winds), but the values are 1.5 to 4 times smaller than observed (although the values are larger in an unforced model run, this only increases the modelled LWP by a factor of approximately two). The LEM data show that, for values larger than approximately 12 cm s^-1, the standard deviation in Doppler velocities provides an almost unbiased estimate of the standard deviation in vertical winds, but overestimates it for smaller values. Time-smoothing the observed Doppler velocities and the modelled mass-squared-weighted fallspeeds shows that the observed fallspeeds are approximately two thirds of the modelled values. Decreasing the modelled fallspeeds to those observed increases the modelled IWC, giving an IWP 1.6 times that observed.