988 results for Simulation tools


Relevance: 20.00%

Publisher:

Abstract:

The objective of this Master's thesis was to find an economical injection mould manufacturing process with a short lead time for the Perlos Tools plant in Joensuu. The literature section reviews manufacturing process models, presents the most important mould manufacturing methods, and discusses the current state and future challenges. It also examines the distribution of working time and costs in mould manufacturing, the most important investment calculations, and the justification of investments by means of simulation. In the experimental section, different process models are simulated, the effect of manufacturing methods and machines on the mould manufacturing lead time is investigated, and the effects of investments and production machines on payback times are calculated. The aim of the simulation is to produce analysed information to support decision making through relevant models, appropriate questions and the development of process models. Based on the results, the injection mould manufacturing process was optimised. As a result of the optimisation, the company under study now has an economical injection mould manufacturing process with a short lead time.
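
The abstract refers to comparing investments through their payback times. As a rough illustration of that kind of calculation only (not a formula taken from the thesis), a discounted payback sketch with invented cash-flow figures might look as follows.

```python
def discounted_payback(investment, annual_savings, rate):
    """Return the year in which discounted savings repay the investment,
    or None if it is not repaid within the given horizon."""
    cumulative = 0.0
    for year, saving in enumerate(annual_savings, start=1):
        cumulative += saving / (1.0 + rate) ** year
        if cumulative >= investment:
            return year
    return None

# Illustrative figures only (not from the thesis): a 200 k-euro machine
# investment saving 60 k-euro per year, discounted at 8 %.
print(discounted_payback(200_000, [60_000] * 6, 0.08))
```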

Relevance: 20.00%

Publisher:

Abstract:

Better understanding of stromatolites and microbial mats is an important topic in biogeosciences, as it helps in studying the early forms of life on Earth, provides clues regarding the ecology of microbial communities and their contribution to biomineralization, and lays groundwork for research in exobiology. Modelling, in turn, is a powerful tool used in the natural sciences for the theoretical study of various phenomena. Models are usually built on a system of differential equations, and results are obtained by solving that system. Available software for implementing such models includes mathematical solvers and general-purpose simulation packages. The main objective of this thesis is to develop models and software that help to understand, through simulation, the functioning of stromatolites and microbial mats. The software was developed in C++ from scratch for maximum performance and flexibility, which allows models to be built that are far more specific to the phenomenon under study than general-purpose software permits. First, we studied stromatolite growth and morphology. We built a three-dimensional model based on diffusion-limited aggregation, implemented in two C++ applications: a simulation engine, which can run a batch of simulations and produce result files, and a visualization tool, which allows the results to be analysed in three dimensions. After verifying that the model can indeed reproduce the growth and morphology of several types of stromatolites, we introduced a sedimentation process as an external factor. This led to interesting results and supported the hypothesis that stromatolite morphology may be the result of external factors as much as of internal ones. This matters because stromatolite classification is usually based on morphology, which implicitly assumes that a stromatolite's shape depends on internal factors only (i.e. the microbial mat); our findings contradict this common assumption. Second, we investigated the functioning of microbial mats in more depth. We built a two-dimensional reaction-diffusion model based on discrete simulation, implemented in a C++ application that allows simulations to be configured and run. We then compared the simulation results with real-world data and verified that the model can indeed mimic the behaviour of certain microbial mats. This allowed us to propose and test hypotheses about how these mats function and to better understand aspects such as the dynamics of elements, in particular sulfur and oxygen. In conclusion, this work produced software dedicated to the simulation of microbial mats from both a morphological and a functional point of view, following two different approaches, one holistic and one more analytical. The software is free and distributed under the GPL (General Public License).
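
The growth model described above is based on diffusion-limited aggregation (DLA). As a minimal illustration of the principle only, a two-dimensional lattice DLA sketch growing from a substrate might look as follows; the thesis's actual model is three-dimensional, written in C++ and includes a sedimentation process, none of which is reproduced here.

```python
import random

N = 80                                     # lattice size (illustrative)
grid = [[False] * N for _ in range(N)]
for x in range(N):                         # seed: solid substrate along the bottom row
    grid[N - 1][x] = True

NEIGHBOURS = ((1, 0), (-1, 0), (0, 1), (0, -1))

def touches_aggregate(y, x):
    """True if any 4-neighbour of (y, x) already belongs to the aggregate."""
    return any(0 <= y + dy < N and grid[y + dy][(x + dx) % N]
               for dy, dx in NEIGHBOURS)

def release_walker():
    """Random walker released near the top; it sticks on first contact."""
    y, x = 1, random.randrange(N)
    while True:
        dy, dx = random.choice(NEIGHBOURS)
        y, x = y + dy, (x + dx) % N        # periodic boundary in x
        if y < 1:                          # wandered off the top: re-release
            y, x = 1, random.randrange(N)
        if touches_aggregate(y, x):
            grid[y][x] = True
            return

for _ in range(2000):                      # grow the aggregate
    release_walker()
```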

Relevance: 20.00%

Publisher:

Abstract:

The study of fluid flow in pipes is one of the main topics of interest for engineers in industry. In this thesis, an effort is made to study the boundary layers formed near the pipe wall and how they act as a resistance to heat transfer. A few decades ago, analytical and empirical results had to be derived by hand, as the means available for solving complex fluid flow phenomena were limited. With the growth of computing power, it has become practical to understand and analyse the actual fluid flow in virtually any type of geometry. Several methodologies have been used in the past to analyse the boundary layer equations and to derive expressions for heat transfer. Here, an integral relation approach is used for the analytical solution of the boundary layer equations and is compared with FLUENT simulations for the laminar case. A law-of-the-wall approach is used to derive an empirical correlation between dimensionless numbers and is then compared with FLUENT results for the turbulent case. In this way the analytical, empirical and numerical approaches are compared for the same set of fluid flow equations.
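
For reference, the logarithmic law of the wall used in such turbulent analyses relates the dimensionless velocity and wall distance as shown below; the constants kappa ≈ 0.41 and B ≈ 5.0 are the commonly quoted values, and the exact form and constants used in the thesis are not reproduced here.

```latex
u^{+} = \frac{u}{u_{\tau}}, \qquad
y^{+} = \frac{y\,u_{\tau}}{\nu}, \qquad
u_{\tau} = \sqrt{\tau_{w}/\rho}, \qquad
u^{+} = \frac{1}{\kappa}\,\ln y^{+} + B
```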

Relevance: 20.00%

Publisher:

Abstract:

This Master's thesis deals with probabilistic safety analysis of fire risks at nuclear power plants. The objective was to develop the fire analysis method for the Olkiluoto 1 and 2 units. The thesis presents the main features of fire analysis, two different methods for estimating fire frequencies, and methods for assessing fire spreading. Of the fire frequency estimation methods, the focus is on the Berry method and on the NUREG/CR-6850 fire frequency calculation method. For the assessment of fire spreading, the fundamentals of three different fluid dynamics calculation tools are presented, together with the use of the Probabilistic Fire Simulator (PFS) program for estimating fire spreading probabilities. During the work, fire frequencies were calculated with both estimation methods for rooms of different types. The fire frequencies obtained with the Berry method were mostly lower than those calculated with the NUREG/CR-6850 method. In the fire spreading study, a fire in a relay room of the nuclear power plant was examined. The spreading probabilities calculated with PFS were compared with the qualitative coverage factors used in TVO's fire analysis. The probability of fire spreading between different subsystems was found to depend strongly on the damage temperatures assumed in the analysis. Using the methods studied, a fire analysis method description was developed in the thesis. In this description, the fire risks of the rooms are first screened with the Berry method, which yields an estimated fire frequency for every room of the plant as well as the core damage frequency of the fire initiating event categories. A selection procedure is then carried out, and a refined fire frequency calculation is performed for the rooms that meet the chosen criteria. The refined calculation is based on the realistic ignition sources of the rooms, in accordance with the NUREG/CR-6850 method. For the most critical rooms, numerical simulation is intended to be used for assessing fire spreading.
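
In the NUREG/CR-6850 approach, a room's fire frequency is built up from plant-wide ignition-bin frequencies apportioned to the room by the ignition sources it contains. The sketch below only illustrates that bookkeeping; the bin names, frequencies and counts are invented placeholders, not values from the thesis or from NUREG/CR-6850.

```python
# Illustrative placeholders, not values from the thesis or NUREG/CR-6850.
plant_bin_frequency = {        # plant-wide fire frequency per ignition bin, 1/year
    "electrical_cabinets": 2.0e-2,
    "pumps": 4.0e-3,
    "transients": 6.0e-3,
}
plant_source_count = {"electrical_cabinets": 400, "pumps": 60, "transients": 120}
room_source_count = {"electrical_cabinets": 12, "pumps": 2, "transients": 3}

def room_fire_frequency(bin_freq, plant_counts, room_counts):
    """lambda_room = sum over bins of lambda_bin * (sources in room / sources in plant)."""
    return sum(
        freq * room_counts.get(bin_name, 0) / plant_counts[bin_name]
        for bin_name, freq in bin_freq.items()
    )

print(f"{room_fire_frequency(plant_bin_frequency, plant_source_count, room_source_count):.2e} per year")
```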

Relevance: 20.00%

Publisher:

Abstract:

The use of process simulation software has become more common in mapping paper industry processes, and such software has long been among the process design tools of Pöyry Engineering Oy. The objective of this work was to survey the use of process simulation software in Finnish paper mills and to assess the future prospects of process simulation in forest industry engineering services, with a view to developing the business. The theoretical part discusses, among other things, what process simulation is, why simulation is carried out, and what its benefits and challenges are. It also presents the most common process simulation programs in use, the course of a simulation project, and the requirements for productizing process simulation. In the experimental part, the use of process simulation software in Finnish paper mills was surveyed with a questionnaire sent to all of the most important paper mills in Finland. The questionnaire covered, among other things, which programs are used, what has been simulated, what still needs to be simulated, and how necessary process simulation is considered to be. The results show that all the Finnish paper mills that responded have used process simulation. Most of the simulations have concerned machine lines and stock and water systems. Simulation of energy flows is regarded as the most important topic for the future. The long-term utilisation and maintenance of simulation models needs improvement, and purchasing simulation as a service is the most likely option for the mills. The conclusion is that the mills have a need for process simulation; according to the survey results the climate is favourable and simulation is seen as a necessary tool. Marketing of process simulation should be developed so that, in addition to a stand-alone service product, maintenance of the simulation model continues after the project as a local service, and this marketing should take place at the start of a project or during it. From the wide range of simulation programs, an engineering office should select the ones that suit it best; in special cases the acquisition of other programs can be considered according to the customer's wishes.

Relevance: 20.00%

Publisher:

Abstract:

A change in paradigm is needed in the prevention of toxic effects on the nervous system, moving from its present reliance solely on data from animal testing to a prediction model based mostly on in vitro toxicity testing and in silico modeling. According to the report published by the National Research Council (NRC) of the US National Academies of Science, high-throughput in vitro tests will provide evidence for alterations in "toxicity pathways" as the best possible method of large-scale toxicity prediction. The challenges of implementing this proposal are enormous and provide much room for debate. While many efforts address the technical aspects of implementing the vision, many questions around it also need to be addressed. Is the overall strategy the only one to be pursued? How can we move from the current to the future paradigm? Will we ever be able to reliably model chronic and developmental neurotoxicity in vitro? This paper summarizes four presentations from a symposium held at the International Neurotoxicology Conference in Xi'an, China, in June 2011. A. Li reviewed the current guidelines for neurotoxicity and developmental neurotoxicity testing and discussed the major challenges to realizing the NRC vision for toxicity testing. J. Llorens reviewed the biology of mammalian toxic avoidance in view of present knowledge on the physiology and molecular biology of the chemical senses, taste and smell. This background supports the hypothesis that relating in vivo toxicity to chemical epitope descriptors that mimic the chemical encoding performed by the olfactory system may provide a path to the long-term goal of complete in silico toxicity prediction. S. Ceccatelli reviewed the implementation of rodent and human neural stem cells (NSCs) as models for in vitro toxicity testing that measures parameters such as cell proliferation, differentiation and migration. These appear to be sensitive endpoints that can identify substances with developmental neurotoxic potential. C. Suñol reviewed the use of primary neuronal cultures in testing for neurotoxicity of environmental pollutants, including the study of the effects of persistent exposures and of effects in differentiating cells, which allow recording of effects that can be extrapolated to human developmental neurotoxicity.

Relevance: 20.00%

Publisher:

Abstract:

The purpose of this work is to study power transmission solutions that enable continuously variable speed and torque control. The work is limited to two different planetary gear products. The subjects of the study are the different coupling possibilities of the planetary gear trains of the selected products and their interaction. The technical implementation and functionality of the gearboxes are examined by means of physical tests, simulation and analytical calculation. The Dymola simulation program was used as a tool: a virtual model of the examined product and its operation was built from kinematic, icon-based component connections. The computer-based dynamic simulation model was modified as the study progressed in order to obtain a differentially continuously variable torque converter. The products use planetary gear trains both with and without a sun gear; the first product under study has three planetary gear trains and the second has two. The power source can be a combustion engine or an electric motor. The conversion of gear and speed ratios is governed by an electric motor coupled to the planetary gearbox, whose operation can be controlled separately. The work examines the angular velocities and torques occurring at different moments in time and at different points of the simulation model, and investigates the possibilities of varying the transmission ratio continuously. The aim is to obtain realistic information on the operation of the devices under study. The guiding idea is to use modern methods to investigate the functionality and extreme cases of new innovations.
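
The kinematics of a single planetary stage are governed by the Willis equation, which ties the angular velocities of the sun, ring and carrier to the tooth numbers (a standard relation, not quoted from the thesis):

```latex
\frac{\omega_{\mathrm{sun}} - \omega_{\mathrm{carrier}}}{\omega_{\mathrm{ring}} - \omega_{\mathrm{carrier}}}
  = -\frac{z_{\mathrm{ring}}}{z_{\mathrm{sun}}}
```

Driving one of the three members with a separately controlled electric motor therefore shifts the speed ratio between the other two continuously, which is the kind of behaviour such power-split arrangements exploit.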

Relevance: 20.00%

Publisher:

Abstract:

This Master's thesis deals with the development of press tools used for forming paperboard. The objectives were to make the design and manufacture of the tooling cheaper and faster, and to make the tools functionally more efficient. The work was also to include guidelines for designing and manufacturing the tools in the future. During the work, possible tool structure alternatives, manufacturing materials and their treatment methods were investigated, as well as machining and the possibilities it offers as a manufacturing method. The tool pair was designed to be modular, so that only some of the components need to be manufactured anew for new tools, and at the same time the number of tool parts was reduced significantly. A free-machining tool steel was selected as the manufacturing material, and it was machined in a horizontal machining centre. At the final stage of the work, a cost calculation was made for the tool assembly, broken down by work phase and by component. The tool was installed in a press and subjected to operational testing. Proposals for further development were made on the basis of the experience gained during the work and the trial run.

Relevance: 20.00%

Publisher:

Abstract:

Conservation biology is commonly associated with the protection of small, endangered populations. Nevertheless, large or potentially expanding populations may also need management in order to prevent the negative effects of overpopulation. Because there are both qualitative and quantitative differences between protecting small populations and controlling large ones, distinct methods and models are needed. The aim of this work was to develop predictive models of large population dynamics, as well as software tools to estimate the parameters of these models and to test management scenarios. The Alpine ibex (Capra ibex ibex), which has expanded spectacularly since its reintroduction in Switzerland at the beginning of the 20th century, was used as the example species. The task was accomplished in three steps. First, a local population dynamics model was developed specifically for the ibex: the underlying age- and sex-structured model is based on a Leslie matrix approach with the addition of density dependence, environmental stochasticity and culling. This model was implemented in a management-support software package, named SIM-Ibex, allowing census data maintenance, automated parameter estimation, and the tuning and simulation of culling strategies. Population dynamics is, however, driven not only by demographic factors but also by dispersal and the colonisation of new areas, so habitat suitability and obstacles to dispersal also had to be modelled. A software package named Biomapper was therefore developed. Its central module is based on the Ecological Niche Factor Analysis (ENFA), whose principle is to compute marginality and specialisation factors of the ecological niche from a set of environmental predictors and species presence data. All Biomapper modules are linked to Geographic Information Systems (GIS); they cover all the operations of data import, predictor preparation, ENFA and habitat suitability map computation, result validation and further processing, and one module also allows dispersal barriers and corridors to be mapped. The application domain of ENFA was then explored by means of a simulated (virtual) species distribution and compared with a commonly used habitat suitability method, the Generalised Linear Model (GLM); ENFA proved better suited to spreading or cryptic species. Demographic and landscape information were finally merged into a global model. To cope with landscape realism and the technical constraints of large population modelling, a cellular automaton approach was chosen: the study area is modelled by a lattice of hexagonal cells, each characterised by a few fixed properties (a carrying capacity and six impermeability rates quantifying exchanges between adjacent cells) and one variable, the population density. The latter varies according to local reproduction, survival and dispersal dynamics, modified by density dependence and stochasticity. A software tool named HexaSpace was developed to perform two functions: (1) calibrating the automaton on the basis of local population dynamics models (e.g. computed by SIM-Ibex) and a habitat suitability map (e.g. computed by Biomapper), and (2) running simulations. It allows the spread of an invading species to be studied across a complex landscape made of areas of varying suitability and dispersal barriers. The model was applied to the history of the ibex reintroduction in the Bernese Alps (Switzerland). SIM-Ibex is now used by governmental wildlife managers and inspectors to prepare and verify culling plans. Biomapper has been applied to several species (both plants and animals) around the world. Likewise, although HexaSpace was originally designed for terrestrial animal species, it could easily be extended to model plant propagation or the dispersal of flying animals. As these programs were designed to proceed from raw data to a complex, realistic model, and as they benefit from an intuitive user interface, they lend themselves to many applications in conservation biology. Moreover, theoretical questions in the fields of population and landscape ecology might also be addressed with these approaches.
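
The local dynamics model described above is a Leslie matrix extended with density dependence and environmental stochasticity. A minimal age-structured sketch of such a projection is given below; the matrix entries, the logistic form of density dependence and the noise model are illustrative assumptions, not the parameters or structure of SIM-Ibex.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative 4-age-class Leslie matrix (fecundities on the first row,
# survival probabilities on the sub-diagonal); not the SIM-Ibex parameters.
L = np.array([
    [0.0, 0.3,  0.6, 0.4],
    [0.8, 0.0,  0.0, 0.0],
    [0.0, 0.85, 0.0, 0.0],
    [0.0, 0.0,  0.7, 0.0],
])
K = 500.0                       # carrying capacity (illustrative)

def project(n, years=50):
    """Project the age-class vector n, with logistic density dependence
    applied to fecundity and lognormal environmental noise on survival."""
    history = [n.copy()]
    for _ in range(years):
        M = L.copy()
        M[0, :] *= max(0.0, 1.0 - n.sum() / K)          # density-dependent births
        M[1:, :] *= rng.lognormal(mean=0.0, sigma=0.1)  # environmental stochasticity
        n = np.clip(M @ n, 0.0, None)
        history.append(n.copy())
    return np.array(history)

trajectory = project(np.array([50.0, 30.0, 20.0, 10.0]))
print(trajectory[-1])           # age structure after 50 simulated years
```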

Relevance: 20.00%

Publisher:

Abstract:

Active magnetic bearings represent a new technology with many advantages compared to traditional bearing designs. Active magnetic bearings, however, require retainer bearings in order to prevent damage in the event of a component, power or control loop failure. In a drop-down situation, when the rotor drops from the magnetic bearings onto the retainer bearings, the design parameters of the retainer bearings have a significant influence on the behaviour of the rotor. In this study, the dynamics of an electric motor supported by active magnetic bearings during rotor drop onto the retainer bearings is studied using a multibody simulation approach. Various design parameters of the retainer bearings are studied with a simulation model, and the results are compared with those found in the literature. The retainer bearings are modelled using a detailed ball bearing model, which accounts for damping and stiffness properties, the oil film, and friction between the races and the rolling elements; the model also includes an inertia description of the rolling elements. The model of the magnetic bearing system contains the unbalance of the rotor and the stiffness and damping properties of the support. In addition, a computationally efficient contact model between the rotor and the retainer bearings is proposed, and the work provides information for the design of a physical prototype and its retainer bearings.
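
A computationally efficient rotor-to-retainer-bearing contact is commonly represented as a nonlinear spring and damper acting on the radial penetration once the air gap is closed. The sketch below shows that generic form only; the Hertzian exponent and the stiffness and damping values are illustrative assumptions, not the contact model proposed in the thesis.

```python
import math

def contact_force(x, y, vx, vy, clearance, k=2.0e8, c=2.0e3, exponent=1.5):
    """Radial contact force (Fx, Fy) on the rotor once the air gap is consumed.

    x, y      : rotor centre position relative to the retainer bearing centre [m]
    vx, vy    : rotor centre velocity [m/s]
    clearance : radial air gap between rotor and retainer bearing [m]
    """
    r = math.hypot(x, y)
    delta = r - clearance                 # penetration depth
    if r == 0.0 or delta <= 0.0:
        return 0.0, 0.0                   # no contact yet
    nx, ny = x / r, y / r                 # unit vector from bearing centre to rotor
    v_radial = vx * nx + vy * ny          # penetration velocity
    f = k * delta ** exponent + c * v_radial   # Hertzian-type spring + linear damper
    f = max(f, 0.0)                       # contact cannot pull the rotor inward
    return -f * nx, -f * ny               # force pushes the rotor back toward the centre

print(contact_force(1.1e-4, 0.0, 0.5, 0.0, clearance=1.0e-4))
```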

Relevance: 20.00%

Publisher:

Abstract:

The ultimate goal of any research in the mechanism/kinematics/design area may be called predictive design, i.e. the optimisation of mechanism proportions at the design stage without requiring extensive life and wear testing. This is an ambitious goal and can be realised through the development and refinement of numerical (computational) technology that facilitates the design analysis and optimisation of complex mechanisms, mechanical components and systems. As part of a systematic design methodology, this thesis concentrates on kinematic synthesis (kinematic design and analysis) methods in the mechanism synthesis process. The main task of kinematic design is to find all possible solutions, in the form of structural parameters, that accomplish the desired motion requirements. The main formulations of kinematic design can be broadly divided into exact synthesis and approximate synthesis. The exact synthesis formulation is based on solving n linear or nonlinear equations in n variables, and the solutions are obtained with closed-form classical or modern algebraic methods, or with numerical methods based on polynomial continuation or homotopy. The approximate synthesis formulation is based on minimising the approximation error by direct optimisation. The main drawbacks of exact synthesis are (ia) the limited number of design specifications and (iia) failure in handling design constraints, especially inequality constraints. The main drawbacks of approximate synthesis are (ib) the difficulty of choosing a proper initial linkage and (iib) the difficulty of finding more than one solution. Recent formulations solve the approximate synthesis problem with polynomial continuation, which provides several solutions but cannot handle inequality constraints. Based on practical design needs, a mixed exact-approximate position synthesis with two exact and an unlimited number of approximate positions has also been developed; the solution space is presented as a ground pivot map, but the pole between the exact positions cannot be selected as a ground pivot. In this thesis the exact synthesis problem of planar mechanisms is solved by generating all possible solutions for the optimisation process, including solutions in positive-dimensional solution sets, within inequality constraints on the structural parameters. The literature review first shows that the algebraic and numerical solution methods used in computational kinematics can solve non-parametric algebraic systems of n equations in n variables but cannot handle the singularities associated with positive-dimensional solution sets. Here the problem of positive-dimensional solution sets is solved by adopting the main principles of algebraic geometry for solving parametric algebraic systems of n equations in at least n+1 variables (parametric in the mathematical sense that all parameter values for which the system is solvable are considered, including the degenerate cases). Applying the developed method to the dyadic equations in direct polynomial form for two to three precision points, it is proved algebraically and demonstrated numerically that the ground pivot map is ambiguous and that the singularities associated with positive-dimensional solution sets can be resolved.
The positive-dimensional solution sets associated with the poles might contain physically meaningful solutions in the form of optimal, defect-free mechanisms. Traditionally, the mechanism optimisation of hydraulically driven boom mechanisms has been carried out at an early stage of the design process, which results in optimal component design rather than optimal system-level design. Modern system-level mechanism optimisation demands the integration of kinematic design methods with mechanical system simulation techniques. In this thesis a new kinematic design method for hydraulically driven boom mechanisms is developed and integrated with mechanical system simulation techniques. The developed method is based on the two-precision-point formulation combined with optimisation of substructures (using mathematical programming techniques or optimisation methods based on probability and statistics) with criteria calculated from the system-level response of multi-degree-of-freedom mechanisms. For example, by adopting the mixed exact-approximate position synthesis in direct optimisation (using mathematical programming techniques) with two exact positions and an unlimited number of approximate positions, the drawbacks (ia)-(iib) are eliminated. The design principles of the developed method are based on the design-tree approach to mechanical systems, and the method is, in principle, capable of capturing the interrelationship between kinematic and dynamic synthesis simultaneously when the kinematic design method is integrated with the mechanical system simulation techniques.
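
The dyadic equations mentioned above are commonly written in complex-number standard form as loop-closure conditions, one for each precision position j (a textbook statement of the standard form, not a formulation quoted from the thesis):

```latex
\mathbf{W}\,\bigl(e^{\mathrm{i}\beta_{j}} - 1\bigr) + \mathbf{Z}\,\bigl(e^{\mathrm{i}\alpha_{j}} - 1\bigr) = \boldsymbol{\delta}_{j},
\qquad j = 1,\dots,m
```

Here W and Z are the unknown dyad vectors, beta_j and alpha_j the rotations of the two links, and delta_j the prescribed displacement of the precision point from its first position; prescribing two or three precision points turns these conditions into the kind of polynomial systems whose solution sets are analysed in the thesis.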

Relevance: 20.00%

Publisher:

Abstract:

Technological development brings more and more complex systems to the consumer market, and the time required to bring a new product to market is crucial for a company's competitive edge. Simulation is used as a tool to model these products and their operation before actual live systems are built. The complexity of these systems can easily require large amounts of memory and computing power, and distributed simulation can be used to meet these demands. Distributed simulation has its own problems, however. Diworse, a distributed simulation environment, was used in this study to analyse the factors that affect the time required to simulate a system. Examples of these factors are the simulation algorithm, the communication protocols, the partitioning of the problem, the distribution of the problem, the capabilities of the computing and communications equipment, and the external load. Offices offer vast amounts of unused capacity in the form of idle workstations. Using this computing power for distributed simulation requires the simulation to adapt to a changing load situation: all or part of the simulation work must be removed from a workstation when its owner wishes to use it again. If load balancing is not performed, the simulation suffers from the workstation's reduced performance, which also hampers the owner's work. The operation of load balancing in Diworse is studied and shown to perform better than no load balancing, and different approaches to load balancing are discussed.
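
As a rough illustration of the kind of decision a load-balancing scheme has to make (not the actual Diworse policy), the sketch below moves simulation partitions away from any workstation whose external load exceeds a threshold and onto the least-loaded remaining host.

```python
def rebalance(assignment, load, threshold=0.5):
    """Move partitions off overloaded hosts onto the least-loaded host.

    assignment : dict partition -> host
    load       : dict host -> external load in [0, 1] (e.g. owner activity)
    Returns a new partition-to-host mapping.
    """
    new_assignment = dict(assignment)
    for partition, host in assignment.items():
        if load[host] > threshold:
            target = min(load, key=load.get)   # least-loaded host
            if load[target] <= threshold:
                new_assignment[partition] = target
    return new_assignment

# Illustrative use: host "ws2" becomes busy, so its partition is moved away.
hosts_load = {"ws1": 0.1, "ws2": 0.9, "ws3": 0.2}
print(rebalance({"p1": "ws1", "p2": "ws2"}, hosts_load))
```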

Relevance: 20.00%

Publisher:

Abstract:

Concerning the process control of batch cooling crystallization, the present work focused on the cooling profile and the seeding technique. Secondly, the influence of additives on a batch-wise precipitation process was investigated. Moreover, a Computational Fluid Dynamics (CFD) model for the simulation of controlled batch cooling crystallization was developed. A novel cooling model for controlling the supersaturation level during batch-wise cooling crystallization was introduced. The crystallization kinetics, together with the operating conditions (seed loading, cooling rate and batch time), were taken into account in the model; in particular, supersaturation- and suspension-density-dependent secondary nucleation was included. The interaction between the operating conditions and their influence on the control target, i.e. a constant level of supersaturation, was studied with the aid of a numerical solution of the cooling model. The batch cooling crystallization was further simulated with an ideal mixing model and with the CFD model; the moment transformation of the population balance, together with the mass and heat balances, was solved numerically in the simulation. To clarify the relationship between the operating conditions and the product size, a system chart was developed for the ideal mixing condition, and its use for determining the operating conditions required to meet a target product size was introduced. With CFD simulation, batch crystallization operated according to a specified cooling mode was studied in crystallizers of different geometries and scales. The introduced cooling model and the simulation results were verified experimentally for potassium dihydrogen phosphate (KDP), and the novelty of the proposed control policies was demonstrated for potassium sulfate by comparison with results published in the literature. The study on batch-wise precipitation showed that immiscible additives could promote the agglomeration of a benzoic acid derivative, which improved the filterability of the crystal product.
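
For reference, the moment transformation of the population balance for a well-mixed batch with size-independent growth reduces the crystal size distribution n(L, t) to its moments, giving the ordinary differential equations below; B is the nucleation rate, G the growth rate and r_0 the nuclei size, and the specific kinetic expressions used in the thesis are not reproduced here.

```latex
\mu_{j} = \int_{0}^{\infty} L^{\,j}\, n(L,t)\,\mathrm{d}L, \qquad
\frac{\mathrm{d}\mu_{0}}{\mathrm{d}t} = B, \qquad
\frac{\mathrm{d}\mu_{j}}{\mathrm{d}t} = j\,G\,\mu_{j-1} + B\,r_{0}^{\,j}, \quad j \ge 1
```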