996 results for lattice Boltzmann method
Abstract:
High-resolution ac susceptibility and thermal conductivity measurements on Cu2Te2O5X2 (X=Br,Cl) single crystals are reported. For the Br sample, sample dependence prevents distinguishing between the possibilities of a magnetically ordered and a spin-singlet ground state. In the Cl sample a three-dimensional transition at 18.5 K is accompanied by almost isotropic behavior of the susceptibility and an almost switching-like behavior of the thermal conductivity. The thermal conductivity studies suggest the presence of a very strong spin-lattice coupling characterizing the Cl but not the Br sample. Below the transition the Cl sample is in a complex magnetic state involving antiferromagnetic (AF) order but also elements consistent with the presence of a gap in the excitation spectrum.
Abstract:
Once deposited, a sediment is affected during its burial history by a set of processes, grouped under the term diagenesis, that transform it sometimes only slightly and sometimes enough to make it unrecognisable. These modifications affect the petrophysical properties of sedimentary rocks, either improving or degrading their reservoir capacity. Modelling diagenetic processes in carbonate reservoirs remains a challenge, since neither stochastic nor physicochemical simulations can correctly reproduce the complexity of the features and the reservoir heterogeneity these processes generate. An alternative way to reach this objective is a process-like approach, free of explicit physicochemical reactions, that mimics the movement of the diagenetic fluid(s) while preserving the geological concepts in the modelling process. The method used here is related to a lattice gas automaton: it simplifies the phenomena without sacrificing the result, represents diagenetic effects at a fine scale through the evolution of mineralogical composition and petrophysical properties, and aims at a consistent and realistic 3D model of diagenetic overprints on initial facies at reservoir scale. Its parameters are essentially numerical or mathematical and need to be better understood and constrained from real data gathered in outcrop studies and from the analytical work performed. This method, developed within a research group, is well suited to dolomite reservoirs through the propagation of dolomitising fluids, and it has been applied to two case studies. The first concerns a mid-Cretaceous rudist and granular carbonate platform succession (Urgonian Fm., Barremian-Aptian, Gorges du Nan, Vercors massif, SE France), in which several main diagenetic stages have been identified. The 2D modelling, carried out at the section scale, focused on shallow dolomitisation followed by a dedolomitisation stage, in order to reproduce the complex geometries associated with the diagenetic phenomena and to honour the measured dolomite proportions; since dedolomitisation is pervasive, several hypotheses on the dolomitisation mechanism were formulated and tested, and dolomitisation was simulated under three different flow models. For the second study, data were collected from outcrops on the Venetian platform (Calcaire Gris group, Lias, Mont Compomolon, NE Italy), in which several diagenetic stages have been identified.
The main one is per ascensum dolomitisation along fractures: the diagenetic fluids use the fracture network as a vector and preferentially affect the most micritised lithologies, and the study highlighted the propagation of the phenomena at outcrop scale. In both examples, the evolving effect of the mimetic diagenetic fluid on mineralogical composition can be followed through space and numerical time and helps to explain the heterogeneity of reservoir properties. Keywords: carbonates, dolomitisation, dedolomitisation, process-like modelling, lattice gas automata, random walk, memory effect.
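The abstract above describes mimicking diagenetic fluids with a lattice-gas automaton and random walks. The following is a minimal illustrative sketch of that idea only, not the thesis's actual software: walkers released at the base of a grid mark the cells they visit as altered, with an upward bias standing in for a per ascensum fluid. All grid sizes, walker counts and the bias are assumptions of this sketch.

```python
import random

def dolomitise(grid_w, grid_h, n_walkers, n_steps, seed=0):
    """Toy random-walk automaton: each walker enters at the bottom row and
    performs a biased walk; every visited cell is marked as altered
    (e.g. dolomitised). The upward bias is a hypothetical stand-in for a
    'per ascensum' diagenetic fluid."""
    rng = random.Random(seed)
    altered = set()
    for _ in range(n_walkers):
        x, y = rng.randrange(grid_w), 0                 # enter at the base
        for _ in range(n_steps):
            altered.add((x, y))
            # upward move listed twice -> 2x more likely than each lateral move
            dx, dy = rng.choice([(-1, 0), (1, 0), (0, 1), (0, 1)])
            x = min(max(x + dx, 0), grid_w - 1)
            y = min(max(y + dy, 0), grid_h - 1)
    return altered

cells = dolomitise(20, 30, n_walkers=50, n_steps=80)
print(len(cells))  # number of altered cells
```

A real implementation would additionally carry mineralogical composition and petrophysical properties per cell and make cell permeability (e.g. fractures, micritised lithologies) modulate the walk, as the thesis describes.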
Abstract:
Nowadays the variety of fuels used in power boilers is widening, and new boiler designs and operating models have to be developed. This research and development is done in small pilot plants, where faster analysis of the boiler mass and heat balances is needed so that the right decisions can be identified and made already during a test run. The obstacle to determining the boiler balance during test runs is the long turnaround of the chemical analyses of the collected input and output matter samples. The present work concentrates on finding a way to determine the boiler balance without chemical analyses and on optimising the test rig to get the best possible accuracy for the heat and mass balance of the boiler. The purpose of this work was to create an automatic boiler balance calculation method for the 4 MW CFB/BFB pilot boiler of Kvaerner Pulping Oy located in Messukylä, Tampere. The calculation was implemented on the data management computer of the pilot plant's automation system. It is made in the Microsoft Excel environment, which provides a good basis and functions for handling large databases and calculations without any delicate programming. The automation system of the pilot plant was reconstructed and updated by Metso Automation Oy during 2001, and the new MetsoDNA system has good data management properties, which is necessary for large calculations such as the boiler balance calculation. Two possible methods for calculating the boiler balance during a test run were found: either the fuel flow is determined and used to calculate the boiler's mass balance, or the unburned carbon loss is estimated and the mass balance of the boiler is calculated on the basis of the boiler's heat balance. Both methods have their own weaknesses, so they were implemented in parallel in the calculation and the choice of method was left to the user. The user also needs to define the fuels used and some solid mass flows that are not measured automatically by the automation system.
A sensitivity analysis showed that the most essential values for accurate boiler balance determination are the flue gas oxygen content, the boiler's measured heat output and the lower heating value of the fuel. The theoretical part of this work concentrates on the error management of these measurements and analyses, on measurement accuracy and on boiler balance calculation in theory. The empirical part concentrates on the creation of the balance calculation for the boiler in question and on describing the working environment.
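The two balance routes described in the abstract can be sketched as follows. This is a deliberately simplified illustration, not the thesis's Excel calculation: all function names, units and the single lumped loss fraction are assumptions of this sketch (a real boiler balance tracks many individual loss and flow terms).

```python
def mass_balance_from_fuel(fuel_flow_kg_s, lhv_mj_kg, heat_output_mw):
    """Route 1 (illustrative): the fuel flow is measured, so the fired power
    follows directly, and an efficiency estimate falls out."""
    fired_mw = fuel_flow_kg_s * lhv_mj_kg          # MJ/s = MW
    return fired_mw, heat_output_mw / fired_mw

def fuel_flow_from_heat_balance(heat_output_mw, lhv_mj_kg, est_loss_frac):
    """Route 2 (illustrative): total losses (incl. unburned carbon) are
    estimated, and the fuel flow is back-calculated from the measured
    heat output."""
    fired_mw = heat_output_mw / (1.0 - est_loss_frac)
    return fired_mw / lhv_mj_kg                    # kg/s

print(mass_balance_from_fuel(0.25, 20.0, 4.0))     # → (5.0, 0.8)
print(fuel_flow_from_heat_balance(4.0, 20.0, 0.1))
```

The sketch also shows why the abstract's sensitivity result is plausible: the heat output and the lower heating value enter both routes directly, so their measurement errors propagate straight into the balance.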
Abstract:
Electrical Impedance Tomography (EIT) is an imaging method which enables a volume conductivity map of a subject to be produced from multiple impedance measurements. It has the potential to become a portable non-invasive imaging technique of particular use in imaging brain function. Accurate numerical forward models may be used to improve image reconstruction but, until now, have employed an assumption of isotropic tissue conductivity. This may be expected to introduce inaccuracy, as body tissues, especially those such as white matter and the skull in head imaging, are highly anisotropic. The purpose of this study was, for the first time, to develop a method for incorporating anisotropy in a forward numerical model for EIT of the head and to assess the resulting improvement in image quality in the case of linear reconstruction of one example of the human head. A realistic Finite Element Model (FEM) of an adult human head with segments for the scalp, skull, CSF, and brain was produced from a structural MRI. Anisotropy of the brain was estimated from a diffusion tensor-MRI of the same subject and anisotropy of the skull was approximated from the structural information. A method for incorporation of anisotropy in the forward model and its use in image reconstruction was produced. The improvement in reconstructed image quality was assessed in computer simulation by producing forward data and then performing linear reconstruction using a sensitivity matrix approach. The mean boundary data difference between anisotropic and isotropic forward models for a reference conductivity was 50%. Use of the correct anisotropic FEM in image reconstruction, as opposed to an isotropic one, corrected an error of 24 mm in imaging a 10% conductivity decrease located in the hippocampus, improved localisation for conductivity changes deep in the brain and due to epilepsy by 4-17 mm, and, overall, led to a substantial improvement in image quality.
This suggests that incorporation of anisotropy in numerical models used for image reconstruction is likely to improve EIT image quality.
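The "sensitivity matrix approach" to linear reconstruction mentioned above can be illustrated in a few lines. This is a generic sketch of one-step regularised linear EIT reconstruction, not the study's code: the Tikhonov form and the parameter `lam` are assumptions, and in practice S comes from the (anisotropic) FEM forward model.

```python
import numpy as np

def linear_reconstruct(S, b, lam=1e-3):
    """One-step linear reconstruction (sketch): solve the Tikhonov-regularised
    normal equations (S^T S + lam I) x = S^T b for the conductivity change x,
    where S is the sensitivity (Jacobian) matrix of the forward model and b is
    the boundary-voltage difference data."""
    n = S.shape[1]
    return np.linalg.solve(S.T @ S + lam * np.eye(n), S.T @ b)

# tiny synthetic check: recover a known single-element perturbation
rng = np.random.default_rng(0)
S = rng.standard_normal((40, 10))      # 40 measurements, 10 "voxels"
x_true = np.zeros(10); x_true[3] = 1.0
b = S @ x_true
x_rec = linear_reconstruct(S, b, lam=1e-8)
print(np.argmax(np.abs(x_rec)))        # the perturbed element dominates
```

The study's point is that if S is computed from an isotropic model while the true head is anisotropic, b and S are mismatched and the reconstructed perturbation is mislocalised, which is what the 24 mm correction quantifies.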
Abstract:
This report presents a comparative study of various joining methods used in sheet metal production. It describes the selection of joining methods, comparing the advantages and disadvantages of each method against the others in order to choose the most suitable one. On the basis of the joining processes described in the references, a table was generated containing a set of criteria that helps in evaluating various sheet metal joining processes and in selecting the most suitable process for a particular product. Three products were selected, and their joining methods were studied comprehensively with the help of various parameters. The table is thus the main part of the analysis in this study and can be extended as further results become available. It supports easier understanding and comparison of the various methods, which provides the foundation of this study and analysis. The suitability of a joining method for different types of sheet metal products can be tested with the help of this table. The sections of the table reflect manufacturing requirements, with percentage weightings expressing how important each parameter is in a particular case. The analysis of the methods can be extended or altered by changing the parameters according to the constraints. The use of the table is demonstrated with cases from sheet metal production.
Abstract:
The aim of the study is to anticipate the development of the electronification of business processes by using the scenario method, one of the most widely used methods of futures research. The focus is in particular on future e-business solutions in the forest industry. The study examines the characteristics of the scenario method, the principles of scenario planning, and the suitability of the method for examining changes in technology and in an industry. The theoretical part of the study examines the effect of technological change on the development of industries. It was found that technological change has a strong influence on industry transformation, and that every industry follows a certain development trajectory. Companies must be aware of the speed and direction of technological change and follow the rules of their industry's development. In the forest industry, the radical nature of the changes and the rapid development of ICT pose challenges in the field of electronifying business processes. In the empirical part, three different scenarios of the future of e-business in the forest industry were created. The scenarios are mainly based on the current views of experts on the topic, gathered in a scenario workshop. Qualitative and quantitative elements were combined in constructing the scenarios. The three scenarios show that the future effects of e-business are seen as mainly positive, and that companies must develop actively and flexibly to be able to exploit electronic solutions effectively in their business.
Abstract:
Total Quality Management (TQM) has become one of the most significant concepts in global business, where quality is an important competitive factor. This master's thesis examines the modern concept of total quality management, which raises traditional quality thinking to a new level. Modern quality management thinking has grown to cover all areas of a company's operations. The aim of the work is to improve quality and business performance comprehensively in the TietoEnator Processing and Network Services (Käsittely ja Verkkopalvelut) business area. Before addressing the actual quality management concept, the work first introduces the traditional concept of quality at a general level and briefly discusses the ICT business and the standards related to it. Finally, the study presents prioritised improvement proposals and steps that help the organisation achieve the goals of the total quality management concept.
Abstract:
The main goal of this paper is to propose a convergent finite volume method for a reaction–diffusion system with cross-diffusion. First, we sketch an existence proof for a class of cross-diffusion systems. Then the standard two-point finite volume fluxes are used in combination with a nonlinear positivity-preserving approximation of the cross-diffusion coefficients. Existence and uniqueness of the approximate solution are addressed, and it is also shown that the scheme converges to the corresponding weak solution for the studied model. Furthermore, we provide a stability analysis to study pattern-formation phenomena, and we perform two-dimensional numerical examples which exhibit formation of nonuniform spatial patterns. From the simulations it is also found that experimental rates of convergence are slightly below second order. The convergence proof uses two ingredients of interest for various applications, namely the discrete Sobolev embedding inequalities with general boundary conditions and a space-time $L^1$ compactness argument that mimics the compactness lemma due to Kruzhkov. The proofs of these results are given in the Appendix.
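The "standard two-point finite volume fluxes" the abstract refers to can be illustrated on the simplest case. The sketch below is a 1D, single-species, constant-coefficient explicit scheme only, so it omits the paper's nonlinear positivity-preserving treatment of the cross-diffusion coefficients; grid size, time step and initial data are assumptions.

```python
import numpy as np

def fv_diffusion_step(u, dx, dt, d):
    """One explicit step of a 1D finite volume scheme with the standard
    two-point flux F_{i+1/2} = -d (u_{i+1} - u_i) / dx and zero-flux
    (homogeneous Neumann) boundaries."""
    flux = -d * np.diff(u) / dx                      # interior face fluxes
    flux = np.concatenate(([0.0], flux, [0.0]))      # no-flux boundaries
    return u - dt / dx * np.diff(flux)               # conservative update

u = np.array([0.0, 0.0, 1.0, 0.0, 0.0])              # unit mass in one cell
for _ in range(200):
    u = fv_diffusion_step(u, dx=1.0, dt=0.2, d=1.0)  # dt*d/dx^2 = 0.2 (stable)
print(u.sum())  # mass is conserved by the flux form
```

The conservative flux form is what makes total mass exact to roundoff at every step, and for this linear scheme the explicit update is also positivity-preserving whenever dt*d/dx² ≤ 1/2; the paper's contribution is obtaining an analogous positivity property for the nonlinear cross-diffusion coefficients.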
Abstract:
The study investigated the comparability of press-formed paperboard trays and formed paperboard test samples. The process conditions of the pressing stage were considered to have the greatest effect on multidimensional deformation. Multidimensional deformation was simulated with a new forming device suited to the purpose. The central themes of the literature section are the forming of paperboard and the rheological behaviour of fibre-based material. The literature section also presents one technical application that can be used to predict the formability of fibre material and to measure the deformation that has taken place. The theoretical effect of the process parameters on the fibres is likewise examined in the literature section. In the experimental section, paperboard trays were produced by pressing, and smaller test samples were formed with corresponding process parameters. Conventional one-dimensional deformation measurements were carried out as laboratory determinations of strength properties. Friction, which is an important variable in the pressing process, was also measured under laboratory conditions. The results of this work demonstrate that the newly developed forming method works. The position-force curves are clear and easy to read. A connection was observed between the forming potential of the material and the position-force curve. A very significant observation was also that the plastic-coated paperboard correlated with the position-force curve of the uncoated paperboard. This result indicates that it may be possible to predict the formability of plastic-coated paperboard on the basis of the formability results of the base board. Conventional one-dimensional laboratory measurements cannot provide sufficient information for predicting formability; from this perspective it is important that the multidimensional formability of paperboard can now be studied with the developed forming method.
Abstract:
Knowledge of the pathological diagnosis before deciding the best strategy for treating parasellar lesions is of prime importance, owing to the relatively high morbidity and side-effects of open direct approaches to this region, known to be rich in important vasculo-nervous structures. When imaging is not evocative enough to ascertain an accurate pathological diagnosis, a percutaneous biopsy through the transjugal-transoval route (of Hartel) may be performed to guide the therapeutic decision. The chapter is based on the authors' experience in 50 patients who underwent the procedure over the past ten years. There was no mortality and only little (mostly transient) morbidity. The pathological diagnostic accuracy of the method proved good, with a sensitivity of 0.83 and a specificity of 1. In the chapter the authors first recall the surgical anatomy background from personal laboratory dissections. They then describe the technical procedure, as well as the tissue harvesting method. Finally they define indications together with the decision-making process. Owing to the constrained trajectory of the biopsy needle inserted through the foramen ovale, accessible lesions are only those located in the Meckel trigeminal cave, the posterior sector of the cavernous sinus compartment, and the upper part of the petroclival region. The authors advise performing this percutaneous biopsy method when imaging does not provide sufficient evidence of the pathological nature of the lesion for the therapeutic decision. The goal is to avoid unnecessary open surgery or radiosurgery, as well as inappropriate chemo-/radiotherapy.
Abstract:
Personal results are presented to illustrate the development of immunoscintigraphy for the detection of cancer over the last 12 years, from the early experimental results in nude mice grafted with human colon carcinoma to the most modern form of immunoscintigraphy applied to patients, using I123 labeled Fab fragments from monoclonal anti-CEA antibodies detected by single photon emission computerized tomography (SPECT). The first generation of immunoscintigraphy used I131 labeled, immunoadsorbent purified, polyclonal anti-CEA antibodies and planar scintigraphy, as the detection system. The second generation used I131 labeled monoclonal anti-CEA antibodies and SPECT, while the third generation employed I123 labeled fragments of monoclonal antibodies and SPECT. The improvement in the precision of tumor images with the most recent forms of immunoscintigraphy is obvious. However, we think the usefulness of immunoscintigraphy for routine cancer management has not yet been entirely demonstrated. Further prospective trials are still necessary to determine the precise clinical role of immunoscintigraphy. A case report is presented on a patient with two liver metastases from a sigmoid carcinoma, who received through the hepatic artery a therapeutic dose (100 mCi) of I131 coupled to 40 mg of a mixture of two high affinity anti-CEA monoclonal antibodies. Excellent localisation in the metastases of the I131 labeled antibodies was demonstrated by SPECT and the treatment was well tolerated. The irradiation dose to the tumor, however, was too low at 4300 rads (with 1075 rads to the normal liver and 88 rads to the bone marrow), and no evidence of tumor regression was obtained. Different approaches for increasing the irradiation dose delivered to the tumor by the antibodies are considered.
Abstract:
A recently developed calculation method to determine stoichiometric dissociation constants of weak acids from potentiometric titration data is described. Titration data from three different weak acids in aqueous salt solutions at 25 °C were used as examples of the use of the method. The salt alone determined the ionic strength of the solutions considered in this study, and salt molalities up to 0.5 mol kg⁻¹ were used.
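The basic relation behind extracting a stoichiometric (molality-scale) dissociation constant from one titration point can be sketched as follows. This is a textbook mass-balance illustration, not the paper's method: it assumes a monoprotic acid HA, reads the hydrogen-ion molality directly from pH, and neglects hydroxide and activity corrections (stoichiometric constants absorb the latter into their ionic-strength dependence).

```python
def ka_from_point(ph, acid_molality, base_added_molality):
    """Stoichiometric dissociation constant K_a = m_H * m_A / m_HA from one
    titration point of a weak monoprotic acid HA (sketch):
      m_A  ≈ strong base added + free H+ (charge balance, OH- neglected)
      m_HA ≈ total acid - m_A          (mass balance)"""
    m_h = 10.0 ** (-ph)
    m_a = base_added_molality + m_h
    m_ha = acid_molality - m_a
    return m_h * m_a / m_ha

# acetic-acid-like check: 0.1 mol/kg acid half-neutralised at pH 4.756
print(ka_from_point(4.756, 0.1, 0.05))  # ~1.8e-5
```

The paper's contribution lies in doing this rigorously across a whole titration curve at fixed, salt-determined ionic strength; the sketch only shows why each data point constrains K_a.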
Abstract:
Conservation biology is commonly associated with the protection of small, endangered populations. Nevertheless, large or potentially large populations may also need management, to prevent the negative effects of overpopulation. As there are both qualitative and quantitative differences between protecting small populations and controlling large ones, distinct methods and models are needed. The aim of this work was to develop predictive models of large population dynamics, as well as software tools to estimate the parameters of these models and to test management scenarios. The Alpine ibex (Capra ibex ibex), which has experienced a spectacular expansion since its reintroduction in Switzerland at the beginning of the 20th century, served as the paradigm species. This task was achieved in three steps. First, a local population dynamics model was developed specifically for the ibex: the underlying age- and sex-structured model is based on a Leslie matrix, to which density dependence, environmental stochasticity and regulation culling were added. This model was implemented in a management-support software package named SIM-Ibex, allowing census data maintenance, automated parameter estimation, and the tuning and simulation of culling strategies. However, population dynamics is driven not only by demographic factors but also by dispersal and the colonisation of new areas; habitat suitability and obstacles to dispersal therefore also had to be modelled. A software package named Biomapper was thus developed. Its central module is based on the Ecological Niche Factor Analysis (ENFA), whose principle is to compute marginality and specialisation factors of the ecological niche from a set of environmental predictors and species presence data. All Biomapper modules are linked to Geographic Information Systems (GIS); they cover all operations of data import, predictor preparation, ENFA and habitat suitability map computation, and validation and post-processing of results; one module also allows the mapping of dispersal barriers and corridors. The application domain of ENFA was explored by means of a virtual species distribution. Compared with a commonly used method for building habitat suitability maps, the Generalised Linear Model (GLM), ENFA proved particularly well suited to cryptic or spreading species. Demographic and landscape information were finally merged into a global model. To satisfy both the realism of landscape modelling and the constraints imposed by large populations, a cellular automaton approach was chosen: the study area is modelled by a lattice of hexagonal cells, each characterised by a few fixed properties - a carrying capacity and six impermeability rates quantifying exchanges between adjacent cells - and one variable, the population density. The latter varies according to local reproduction, survival and dispersal dynamics, under the influence of density dependence and stochasticity. A software package named HexaSpace was developed to perform two functions: (1) calibrating the automaton on the basis of population dynamics models (e.g. computed by SIM-Ibex) and a habitat suitability map (e.g. computed by Biomapper); (2) running simulations. It allows the study of the expansion of an invading species across a complex landscape made of areas of varying suitability and dispersal barriers. This model was applied to the history of the ibex reintroduction in the Bernese Alps (Switzerland). SIM-Ibex is now used by wildlife managers and government inspectors to prepare and verify culling plans. Biomapper has been applied to several species (both plants and animals) around the world. Likewise, although HexaSpace was originally designed for terrestrial animal species, it could easily be extended to model plant propagation or the dispersal of flying animals. As these software packages were designed to build a complex, realistic model from raw data, and as they benefit from an intuitive user interface, they lend themselves to many applications in conservation biology. Moreover, theoretical questions in the fields of population and landscape ecology can also be addressed with these approaches.
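The Leslie-matrix core of the local model described in the abstract can be sketched in a few lines. This is a generic illustration, not SIM-Ibex: the logistic damping form, the matrix entries and the carrying capacity are assumptions of this sketch, and the thesis's model additionally includes sex structure and environmental stochasticity.

```python
import numpy as np

def project(n, L, K, cull=None):
    """One time step of a density-dependent Leslie model (sketch).
    n: abundance per age class; L: Leslie matrix (fecundities on the first
    row, survival rates on the subdiagonal); K: carrying capacity.
    The Leslie transition is damped as total abundance approaches K,
    and an optional per-class cull is removed after growth."""
    n_next = L @ n
    damp = max(0.0, 1.0 - n.sum() / K)      # density dependence
    n_next = n + damp * (n_next - n)        # damped transition
    if cull is not None:
        n_next = np.maximum(n_next - cull, 0.0)
    return n_next

# three age classes: juveniles, subadults, adults (hypothetical rates)
L = np.array([[0.0, 0.8, 1.1],
              [0.7, 0.0, 0.0],
              [0.0, 0.9, 0.0]])
n = np.array([50.0, 30.0, 20.0])
for _ in range(25):
    n = project(n, L, K=500.0)
print(n.sum())  # grows from 100 but stays below K
```

A culling strategy is then just a `cull` vector chosen each year, and simulating alternatives amounts to rerunning this loop, which is essentially what a management-support tool automates.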