144 results for Crash Duration Modelling
Abstract:
With six targeted agents approved (sorafenib, sunitinib, temsirolimus, bevacizumab [+interferon], everolimus and pazopanib), many patients with metastatic renal cell carcinoma (mRCC) will receive multiple therapies. However, the optimum sequencing approach has not been defined. A group of European experts reviewed available data and shared their clinical experience to compile an expert agreement on the sequential use of targeted agents in mRCC. To date, there are few prospective studies of sequential therapy. The mammalian target of rapamycin (mTOR) inhibitor everolimus was approved for use in patients who failed treatment with inhibitors of vascular endothelial growth factor (VEGF) and VEGF receptors (VEGFR) based on the results from a Phase III placebo-controlled study; however, until then, the only licensed agents across the spectrum of mRCC were VEGF(R) inhibitors (sorafenib, sunitinib and bevacizumab + interferon), and as such, a large body of evidence has accumulated regarding their use in sequence. Data show that sequential use of VEGF(R) inhibitors may be an effective treatment strategy to achieve prolonged clinical benefit. The optimal place of each targeted agent in the treatment sequence is still unclear, and data from large prospective studies are needed. The Phase III AXIS study of second-line sorafenib vs. axitinib (including post-VEGF(R) inhibitors) has completed, but the data are not yet published; other ongoing studies include the Phase III SWITCH study of sorafenib-sunitinib vs. sunitinib-sorafenib (NCT00732914); the Phase III 404 study of temsirolimus vs. sorafenib post-sunitinib (NCT00474786) and the Phase II RECORD 3 study of sunitinib-everolimus vs. everolimus-sunitinib (NCT00903175). Until additional data are available, consideration of patient response and tolerability to treatment may facilitate current decision-making regarding when to switch and which treatment to switch to in real-life clinical practice.
Abstract:
Sustainable resource use is one of the most important environmental issues of our times. It is closely related to discussions on the 'peaking' of various natural resources serving as energy sources, agricultural nutrients, or metals indispensable in high-technology applications. Although the peaking theory remains controversial, it is commonly recognized that a more sustainable use of resources would alleviate the negative environmental impacts related to resource use. In this thesis, sustainable resource use is analysed from a practical standpoint through several case studies. Four of these case studies relate to resource metabolism in the Canton of Geneva in Switzerland: the aim was to model the evolution of chosen resource stocks and flows over the coming decades. The studied resources were copper (a bulk metal), phosphorus (a vital agricultural nutrient), and wood (a renewable resource). In addition, the case of lithium (a critical metal) was analysed briefly, in a qualitative manner and from an electric mobility perspective. Beyond the Geneva case studies, this thesis includes a case study on the sustainability of space life support systems, i.e. systems whose aim is to provide the crew of a spacecraft with the necessary metabolic consumables over the course of a mission. Sustainability was again analysed from a resource use perspective: the functioning of two different types of life support systems, ARES and BIORAT, was evaluated and compared; these systems represent, respectively, physico-chemical and biological life support systems. Space life support systems could in fact be used as a kind of 'laboratory of sustainability', given that they represent closed and relatively simple systems compared to complex and open terrestrial systems such as the Canton of Geneva. The analysis method used in the Geneva case studies was dynamic material flow analysis: dynamic material flow models were constructed for copper, phosphorus, and wood. Besides a baseline scenario, various alternative scenarios (notably involving increased recycling) were also examined. For the space life support systems, material flow analysis was also employed, but as the data available on the dynamic behaviour of the systems were insufficient, only static simulations could be performed. The results of the Geneva case studies show the following: were resource use to follow population growth, resource consumption would increase by a factor of nearly 1.2 by 2030 and 1.5 by 2080. A complete transition to electric mobility would be expected to increase copper consumption per capita only slightly (+5%), while the lithium demand in cars would increase 350-fold. Phosphorus imports could be decreased by recycling sewage sludge or human urine; however, the health and environmental impacts of these options have yet to be studied. Increasing wood production in the Canton would not significantly decrease the dependence on wood imports, as the Canton's production represents only 5% of total consumption. In the comparison of the space life support systems, BIORAT outperforms ARES in resource use but not in energy use; however, as the systems are dimensioned very differently, it remains questionable whether they can be compared outright.
In conclusion, the use of dynamic material flow analysis can provide useful information for policy makers and strategic decision-making; however, uncertainty in reference data greatly influences the precision of the results. Space life support systems constitute an extreme case of resource-using systems; nevertheless, it is not clear how their example could be of immediate use to terrestrial systems.
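The abstract does not give the model equations; as a rough, hypothetical illustration of the dynamic material flow approach it describes, the following Python sketch projects a single in-use stock with population-driven inflow, a first-order end-of-life outflow, and a recycling scenario. All parameter values are invented for illustration and are not taken from the Geneva data.

```python
# Minimal dynamic material flow sketch (illustrative only; every number
# below is hypothetical, not the thesis's Geneva data).

POP_GROWTH = 0.008      # fractional population growth per year (assumed)
LIFETIME = 30           # mean residence time of the in-use stock, years (assumed)
RECYCLING_RATE = 0.4    # share of end-of-life flow recovered (scenario, assumed)

def simulate(years=70, inflow0=100.0, stock0=2000.0, recycling=0.0):
    """Project an in-use stock with population-driven inflow and a
    first-order end-of-life outflow; recycling displaces primary imports."""
    stock, inflow = stock0, inflow0
    results = []
    for t in range(years):
        outflow = stock / LIFETIME               # end-of-life flow per year
        primary = inflow - recycling * outflow   # imports net of recycled material
        stock += inflow - outflow
        inflow *= 1 + POP_GROWTH                 # demand follows population growth
        results.append((t, stock, primary))
    return results

baseline = simulate()
recycled = simulate(recycling=RECYCLING_RATE)
print("final stock (baseline): %.0f" % baseline[-1][1])
print("primary demand, baseline vs recycling: %.0f vs %.0f"
      % (baseline[-1][2], recycled[-1][2]))
```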
Abstract:
New geochronological data which clarify the timing of syn-orogenic magmatism and regional metamorphism in the Connemara Dalradian are presented. U-Pb zircon data on four intermediate to acid foliated magmatic rocks show important inherited components, but the most concordant fractions demonstrate that major magmatism continued until 465 Ma, whereas the earliest, basic magmatism has been dated previously at 490 Ma; a fine-grained, fabric-cutting granite contains discordant zircons which also appear to be 465 Ma old. Arc magmatism in Connemara therefore spanned a period of at least 25 Ma. Recent U-Pb data on titanite from central Connemara, which gave a peak metamorphic age of 478 Ma, are supplemented by U-Pb data on titanite and monazite from metamorphic veins in the east of Connemara which indicate that low-P, high-T regional metamorphism continued there to 465 Ma, i.e. at least 10 Ma later than in the central region dated previously. New Rb-Sr data on muscovites from coarse-grained segregations in different structural settings range from 475 to 435 Ma; in part this range probably also reflects differences in age from west to east, with three ages close to 455 Ma from the eastern area, which is also the site of the lowest-pressure metamorphism. Thermal modelling indicates that at any one locality the duration of metamorphism was probably as little as 1-2 Ma. The new dates emphasize the complexity in the spatial and temporal distribution of high-level regional metamorphism caused by magmatic activity. The relatively simple overall distribution of mineral-appearance isograds revealed by regional mapping masks the complexity of a prolonged but punctuated metamorphic history related to multiple intrusions, primarily in the southern part of Connemara. The later stages of magmatic activity followed progressive uplift and erosion after the onset of magmatism, and were localized in the eastern part of the region.
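The nature of the thermal modelling is not specified in the abstract; one standard back-of-the-envelope check consistent with a 1-2 Ma duration is the conductive diffusion timescale around an intrusion, roughly L²/κ. The sketch below uses assumed values (a typical crustal diffusivity and an invented length scale), not the paper's model.

```python
# Back-of-the-envelope conductive timescale for intrusion-driven heating
# (illustrative; the width and diffusivity are assumed, not the paper's
# model parameters).

KAPPA = 1e-6          # thermal diffusivity of crustal rock, m^2/s (typical value)
HALF_WIDTH_M = 5e3    # assumed half-width of the heated zone around an intrusion

tau_s = HALF_WIDTH_M ** 2 / KAPPA      # characteristic diffusion time, seconds
tau_ma = tau_s / (3.15e7 * 1e6)        # seconds -> million years

print(f"conductive timescale ~ {tau_ma:.1f} Ma")  # ~0.8 Ma for these values
```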
Abstract:
Summary of the study. Applying positive end-expiratory pressure (PEEP) during the induction phase of general anaesthesia can prevent the formation of pulmonary atelectasis. This could increase the duration of non-hypoxic apnoea by raising the functional residual capacity (FRC). We studied the benefit of applying PEEP during the induction phase of general anaesthesia on the duration of apnoea before peripheral oxygen saturation fell to 90%. Forty ASA I-II patients were randomised into two groups. In the PEEP group (n=20), patients were pre-oxygenated for 5 minutes with an FiO2 of 100% through a CPAP device (6 cmH2O). After induction of anaesthesia, the patients were mechanically ventilated (PEEP 6 cmH2O) for a further 5 minutes. In the ZEEP group (n=20), no positive pressure (neither CPAP nor PEEP) was used. The apnoea time needed to reach a peripheral saturation of 90% was measured. The duration of non-hypoxic apnoea was longer in the PEEP group than in the ZEEP group (599 +/- 135 s vs. 470 +/- 150 s, p=0.007). We conclude that applying positive pressure during the induction phase of general anaesthesia in adults prolongs the duration of non-hypoxic apnoea by more than 2 minutes.
Abstract:
Recent trends (1980-2007) in mortality from road traffic crashes in European countries, and, for comparison, in the USA and Japan, were reviewed. Data came from the World Health Organisation database. Age-standardised rates, at all ages and at 15-24, 25-64, and >=65 years, were computed, and joinpoint regression analyses were performed to identify significant changes in trends. In the European Union as a whole, rates declined from 20.2/100,000 in 1987 to 13.5/100,000 in 2007 in men, and from 6.3 to 3.7/100,000 in women; European Union rates remained lower than US rates but higher than Japanese ones. In 2007, the highest male rates were in Lithuania (36.7/100,000), the Russian Federation (35.2), Ukraine (29.8), and Latvia (28.5), and the lowest in the Netherlands (6.2) and Sweden (6.9); the highest female rates were in the Russian Federation (11.3), Lithuania (9.7), and Belarus, Latvia, and Ukraine (around 8), and the lowest in Switzerland (1.7) and the UK and Nordic countries (around 2). Mortality from motor vehicle crashes declined in northern and western European countries and, though to a lesser extent, in southern European countries too. Mortality trends were also favourable in the Czech Republic and Poland from the mid-1990s onwards, whereas they were still upward in Romania and the Russian Federation. No trend was observed in Hungary and Ukraine. Trends were consistent across the age groups considered. Additional urgent and integrated intervention is therefore required to prevent avoidable deaths from motor vehicle crashes, particularly in selected central and eastern European countries.
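For readers unfamiliar with the method, age-standardised rates of the kind reported here are usually obtained by direct standardisation: a weighted sum of age-specific rates using standard-population weights. A minimal Python sketch with made-up counts (not the WHO database figures):

```python
# Direct age standardisation (illustrative; deaths, person-years and
# weights are made up, not the study's WHO data).

# (deaths, person-years) per age band
age_bands = {
    "15-24": (120, 900_000),
    "25-64": (540, 3_600_000),
    ">=65":  (310, 1_100_000),
}
# standard-population weights; must sum to 1 (assumed values)
std_weights = {"15-24": 0.20, "25-64": 0.55, ">=65": 0.25}

# weighted sum of age-specific rates, expressed per 100,000
asr = sum(std_weights[band] * deaths / pyears * 100_000
          for band, (deaths, pyears) in age_bands.items())
print(f"age-standardised rate: {asr:.1f} per 100,000")
```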
Abstract:
Using numerical simulations of pairs of long polymeric chains confined in microscopic cylinders, we investigate the consequences of double-strand DNA breaks occurring in independent topological domains, such as those constituting bacterial chromosomes. Our simulations show a transition between segregated and mixed states upon linearization of one of the modelled topological domains. Our results explain how chromosomal organization into topological domains can fulfil two opposite requirements: (i) effectively repel loops from one another, thus promoting chromosome separation, and (ii) permit local DNA intermingling when one or more loops are broken and need to be repaired in a process that requires homology search between the broken ends and their homologous sequences in a closely positioned sister chromatid.
Abstract:
Background: In patients with cancer and acute venous thromboembolism (VTE), current consensus guidelines recommend anticoagulation therapy for an indefinite duration or until the cancer is resolved. Methods and results: Among 1,247 patients with acute VTE enrolled in the Swiss Venous Thromboembolism Registry (SWIVTER) from 18 hospitals, 315 (25%) had cancer, of whom 179 (57%) had metastatic disease, 159 (50%) ongoing or recent chemotherapy, and 83 (26%) tumor surgery within 6 months. Patients with cancer were older (66±14 vs. 60±19 years, p<0.001), more often hospitalized at the time of VTE diagnosis (46% vs. 36%, p=0.001), more often immobile for >3 days (25% vs. 16%, p<0.001), and more often had thrombocytopenia (6% vs. 1%, p<0.001) than patients without cancer. The 30-day rate of VTE-related death or recurrent VTE was 9% in cancer patients vs. 4% in patients without cancer (p<0.001), while the rates of bleeding requiring medical attention were 5% in both groups (p=0.57). Cancer patients received indefinite-duration anticoagulation treatment more often than patients without cancer (47% vs. 19%, p<0.001), and low-molecular-weight heparin (LMWH) monotherapy during the initial 3 months was prescribed to 45% vs. 8% (p<0.001), respectively. Among patients with cancer, prior VTE (OR 4.0, 95% CI 2.0-8.0), metastatic disease (OR 3.0, 95% CI 1.7-5.2), outpatient status at the time of VTE diagnosis (OR 3.8, 95% CI 1.9-7.6), and inpatient treatment (OR 4.4, 95% CI 2.1-9.2) were independently associated with the prescription of indefinite-duration anticoagulation treatment. Conclusions: Fewer than half of the cancer patients with acute VTE received a prescription for indefinite-duration anticoagulation treatment. Prior VTE, metastatic cancer, outpatient VTE diagnosis, and VTE requiring hospitalization were associated with increased use of this strategy.
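The "independently associated" odds ratios reported here are characteristic of a multivariable logistic regression. As an illustration of how such estimates are produced, the sketch below fits a logistic model to synthetic data; the variable names and effect sizes are invented, and this is not the SWIVTER analysis.

```python
# Sketch of the kind of multivariable logistic regression behind the
# reported odds ratios (synthetic data; not the SWIVTER dataset).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 315  # cancer patients in the registry
X = rng.integers(0, 2, size=(n, 4))  # prior VTE, metastases, outpatient dx, inpatient tx
logit = -1.0 + X @ np.array([1.4, 1.1, 1.3, 1.5])    # assumed true effects
y = rng.random(n) < 1 / (1 + np.exp(-logit))          # indefinite anticoagulation yes/no

X_design = sm.add_constant(X.astype(float))
fit = sm.Logit(y.astype(float), X_design).fit(disp=0)
odds_ratios = np.exp(fit.params[1:])                  # skip the intercept
ci = np.exp(fit.conf_int()[1:])                       # 95% CIs on the OR scale
for name, or_, (lo, hi) in zip(
        ["prior VTE", "metastatic disease", "outpatient dx", "inpatient tx"],
        odds_ratios, ci):
    print(f"{name}: OR {or_:.1f} (95% CI {lo:.1f}-{hi:.1f})")
```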
Abstract:
One of the most important issues in molecular biology is to understand the regulatory mechanisms that control gene expression. Gene expression is often regulated by proteins called transcription factors, which bind to short (5 to 20 base pairs), degenerate segments of DNA. Experimental efforts towards understanding the sequence specificity of transcription factors are laborious and expensive, but can be substantially accelerated with the use of computational predictions. This thesis describes the use of algorithms and resources for transcription factor binding site analysis in addressing quantitative modelling, where probabilistic models are built to represent the binding properties of a transcription factor and can be used to find new functional binding sites in genomes. Initially, an open-access database (HTPSELEX) was created, holding high-quality binding sequences for two eukaryotic families of transcription factors, namely CTF/NF1 and LEF1/TCF. The binding sequences were elucidated using a recently described experimental procedure called HTP-SELEX, which allows the generation of a large number (>1000) of binding sites using mass sequencing technology. For each HTP-SELEX experiment we also provide accurate primary experimental information about the protein material used, details of the wet-lab protocol, an archive of sequencing trace files, and assembled clone sequences of the binding sequences. The database also offers reasonably large SELEX libraries obtained with conventional low-throughput protocols. The database is available at http://wwwisrec.isb-sib.ch/htpselex/ and ftp://ftp.isrec.isb-sib.ch/pub/databases/htpselex. The Expectation-Maximisation (EM) algorithm is one of the most frequently used methods to estimate probabilistic models representing the sequence specificity of transcription factors. We present computer simulations that estimate the precision of EM-estimated models as a function of data set parameters (such as the length of the initial sequences, the number of initial sequences, and the percentage of non-binding sequences). We observed a remarkable robustness of the EM algorithm with regard to the length of the training sequences and the degree of contamination. The HTPSELEX database and the benchmark results for the EM algorithm formed part of the foundation for the subsequent project, in which a statistical framework based on hidden Markov models was developed to represent the sequence specificity of the transcription factors CTF/NF1 and LEF1/TCF using the HTP-SELEX data. The hidden Markov model framework is capable of both predicting and classifying CTF/NF1 and LEF1/TCF binding sites. A covariance analysis of the binding sites revealed non-independent base preferences at different nucleotide positions, providing insight into the binding mechanism. We then tested the LEF1/TCF model by computing binding scores for a set of LEF1/TCF binding sequences for which relative affinities had been determined experimentally using non-linear regression. The predicted and experimentally determined binding affinities correlated well.
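As a concrete, toy-scale illustration of the kind of EM estimation benchmarked in the thesis (not the thesis's own code), the sketch below fits a position weight matrix to sequences assumed to contain one binding site each at an unknown offset:

```python
# Toy EM for motif discovery (one site per sequence at an unknown offset,
# uniform background). Illustrative only -- not the thesis's implementation.
import numpy as np

BASES = "ACGT"
W = 4  # motif width (assumed)

def em_pwm(seqs, iters=50, seed=0):
    rng = np.random.default_rng(seed)
    pwm = rng.dirichlet(np.ones(4), size=W)   # random initial PWM, shape (W, 4)
    X = [np.array([BASES.index(c) for c in s]) for s in seqs]
    for _ in range(iters):
        counts = np.zeros((W, 4))
        for x in X:
            # E-step: posterior over the site's offset in this sequence
            n_off = len(x) - W + 1
            scores = np.array([np.prod(pwm[np.arange(W), x[j:j + W]])
                               for j in range(n_off)])
            post = scores / scores.sum()
            # M-step contribution: expected base counts at each motif position
            for j, p in enumerate(post):
                counts[np.arange(W), x[j:j + W]] += p
        # re-estimate PWM with a small pseudocount
        pwm = (counts + 0.1) / (counts + 0.1).sum(axis=1, keepdims=True)
    return pwm

seqs = ["ACGTACGTTTT", "GGGACGTAAAA", "TTACGTGGGGG"]  # toy SELEX-like reads
print(em_pwm(seqs).round(2))  # should concentrate on the shared ACGT core
```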
Abstract:
Electrical Impedance Tomography (EIT) is an imaging method which enables a volume conductivity map of a subject to be produced from multiple impedance measurements. It has the potential to become a portable non-invasive imaging technique of particular use in imaging brain function. Accurate numerical forward models may be used to improve image reconstruction but, until now, have employed an assumption of isotropic tissue conductivity. This may be expected to introduce inaccuracy, as body tissues, especially those such as white matter and the skull in head imaging, are highly anisotropic. The purpose of this study was, for the first time, to develop a method for incorporating anisotropy in a numerical forward model for EIT of the head and to assess the resulting improvement in image quality in the case of linear reconstruction for one example of the human head. A realistic Finite Element Model (FEM) of an adult human head with segments for the scalp, skull, CSF, and brain was produced from a structural MRI. Anisotropy of the brain was estimated from a diffusion tensor MRI of the same subject, and anisotropy of the skull was approximated from the structural information. A method for incorporating anisotropy in the forward model and using it in image reconstruction was produced. The improvement in reconstructed image quality was assessed in computer simulation by producing forward data and then performing linear reconstruction using a sensitivity matrix approach. The mean boundary data difference between anisotropic and isotropic forward models for a reference conductivity was 50%. Use of the correct anisotropic FEM in image reconstruction, as opposed to an isotropic one, corrected an error of 24 mm in imaging a 10% conductivity decrease located in the hippocampus, improved localisation for conductivity changes deep in the brain and due to epilepsy by 4-17 mm, and, overall, led to a substantial improvement in image quality. This suggests that incorporation of anisotropy in numerical models used for image reconstruction is likely to improve EIT image quality.
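The sensitivity-matrix reconstruction mentioned here is, in its linearised form, a regularised least-squares problem. The generic sketch below illustrates the idea with a random stand-in for the Jacobian (the study derives it from the anisotropic head FEM); the matrix sizes and regularisation parameter are invented.

```python
# Generic linearised EIT reconstruction sketch: boundary-voltage changes are
# mapped to conductivity changes via a sensitivity (Jacobian) matrix with
# Tikhonov regularisation. The matrix is a random stand-in, not the
# anisotropic head FEM described in the study.
import numpy as np

rng = np.random.default_rng(1)
n_meas, n_elem = 208, 500                     # assumed measurement/element counts
J = rng.normal(size=(n_meas, n_elem))         # sensitivity matrix (from a forward model)
dsigma_true = np.zeros(n_elem)
dsigma_true[240:250] = -0.1                   # a localised 10% conductivity decrease
dv = J @ dsigma_true + 0.01 * rng.normal(size=n_meas)  # noisy boundary data

lam = 1e2                                     # regularisation parameter (hand-tuned)
# Tikhonov-regularised inverse: (J^T J + lam I)^{-1} J^T dv
dsigma_rec = np.linalg.solve(J.T @ J + lam * np.eye(n_elem), J.T @ dv)
print("peak reconstructed change at element", int(np.argmin(dsigma_rec)))
```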
Abstract:
Levodopa (LD) is the most effective and most widely used antiparkinsonian treatment. Its effect comprises a short-duration response (a few hours) and a long-duration response (days to weeks). The persistence of the latter in advanced Parkinson's disease is controversial, and it has never been measured directly because of the risks associated with complete LD withdrawal. Subthalamic nucleus stimulation is a recent neurosurgical treatment for advanced Parkinson's disease that allows complete withdrawal of drug treatment in some patients. We studied 30 patients who underwent such stimulation, evaluating them before surgery without medication, and at 6 months postoperatively without medication and without stimulation. In 19 patients, medication could be stopped completely, whereas it had to be reintroduced in the remaining 11. Over the 6 months following surgery, parkinsonism worsened significantly in the group without LD but not in the group with LD. This difference in evolution is explained by the loss of the long-duration effect of LD in the group in which the drug could be stopped. Compared with the magnitude of the short-duration response, the long-duration response corresponds to roughly 80 percent of the short-duration response, and the two are inversely correlated. Among the cardinal signs of the disease, the long-duration response affects mainly bradykinesia and rigidity, but neither tremor nor the axial component. Comparison of parkinsonism on treatment (stimulation, plus LD where applicable) showed no difference in evolution between the two groups, suggesting that stimulation compensates for both the short- and the long-duration responses. Our work shows that the long-duration response to LD remains significant in parkinsonian patients after more than 15 years of disease progression, and suggests that subthalamic nucleus stimulation compensates for both the short- and long-duration responses.
Abstract Background: The long-duration response to levodopa is supposed to decrease with Parkinson's disease (PD) progression, but direct observation of this response in advanced PD has never been performed. Objective: To study the long-duration response to levodopa in advanced PD patients treated with subthalamic deep-brain stimulation. Design and settings: We studied 30 consecutive PD patients who underwent subthalamic deep-brain stimulation. One group had received no antiparkinsonian treatment since surgery (no-levodopa group), while medical treatment had had to be reinitiated in the other (levodopa group). Main outcome measure: motor Unified Parkinson's Disease Rating Scale (UPDRS). Results: In comparison with the preoperative assessment, evaluation six months postoperatively with stimulation turned off for three hours found a worsening of the motor part of the UPDRS in the no-levodopa group. As this worsening was absent in the levodopa group, it most probably reflected the loss of the long-duration response to levodopa in the no-levodopa group. With stimulation turned on, postoperative motor UPDRS scores in both groups were similar to preoperative on-medication scores, suggesting that subthalamic deep-brain stimulation compensated for both the short- and long-duration responses to levodopa.
Conclusions: Our results suggest that the long-duration response to levodopa remains significant even in advanced PD, and that subthalamic deep-brain stimulation compensates for both the short- and the long-duration responses to levodopa.
Abstract:
BACKGROUND: Differences in how populations living in low-, middle-, or upper-income countries accumulate daily physical activity (PA), i.e. its patterns and intensity, are an important part of addressing the global PA movement. We sought to characterize objective PA in 2,500 participants spanning the epidemiologic transition. The Modeling the Epidemiologic Transition Study (METS) is a longitudinal study in five countries. METS seeks to define the association between physical activity (PA), obesity and CVD risk in populations of African origin: Ghana (GH), South Africa (SA), Seychelles (SEY), Jamaica (JA) and the US (suburban Chicago). METHODS: Baseline measurements of objective PA, SES, anthropometrics and body composition were completed on 2,500 men and women aged 25-45 years. Moderate and vigorous PA (MVPA, min/d) on week and weekend days was explored ecologically, by adiposity status and by engagement in manual labor. RESULTS: Among the men, obesity prevalence reflected the level of economic transition and was lowest in GH (1.7%) and SA (4.8%) and highest in the US (41%). SA (55%) and US (65%) women had the highest levels of obesity, compared to only 16% in GH. More men and women in the developing countries engaged in manual labor, and this was reflected by an almost doubling of measured MVPA among the men in GH (45 min/d) and SA (47 min/d) compared to only 28 min/d in the US. Women in GH (25 min/d), SA (21 min/d), JA (20 min/d) and SEY (20 min/d) accumulated significantly more MVPA than women in the US (14 min/d), yet this difference was not reflected by differences in BMI between SA, JA, SEY and the US. Moderate PA constituted the bulk of the PA, with no study population except SA men accumulating >5 min/d of vigorous PA. Among the women, no site accumulated >2 min/d of vigorous PA. Overweight/obese men were 22% less likely to engage in manual occupations. CONCLUSION: While there is some association between PA and obesity, this relationship is inconsistent across the epidemiologic transition and suggests that PA policy recommendations should be tailored for each environment.
Abstract:
INTRODUCTION: To assess the impact of the duration of untreated psychosis (DUP) on baseline and 18-month follow-up characteristics, controlling for relevant confounders, in an epidemiological first-episode psychosis (FEP) cohort. METHOD: The Early Psychosis Prevention and Intervention Centre (EPPIC) in Australia admitted 786 FEP patients from January 1998 to December 2000. Data were collected from medical files using a standardized questionnaire, and data from 636 patients were analyzed. RESULTS: Median DUP was 8.7 weeks. Longer DUP was associated with worse premorbid functioning (p<0.001), a higher rate of schizophrenia-spectrum disorders (p<0.001), and younger age at onset of psychosis (p=0.004). After controlling for relevant confounders, longer DUP was not associated with other baseline variables but was associated at follow-up with a lower rate of remission of positive symptoms (p<0.001) and of employment/occupation (p<0.001), a higher rate of persistent substance use (p=0.015), and worse illness severity (p<0.001) and global functioning (p<0.001); it explained approximately 5% of the variance in remission of positive symptoms (p<0.001) in the total sample and 3% in schizophrenia-spectrum disorders excluding bipolar I disorder (p=0.002). Outcome was significantly worse when DUP exceeded 1-3 months. CONCLUSION: In a cohort design that avoids the pitfalls of non-epidemiological studies, DUP appears to be a modest independent predictor of medium-term prognosis. These results support the need for assertive early detection strategies.
Abstract:
Conservation biology is commonly associated with the protection of small, endangered populations. Nevertheless, large or potentially large populations may also need human management to prevent the negative effects of overpopulation. As there are both qualitative and quantitative differences between protecting small populations and controlling large ones, distinct methods and models are needed. The aim of this work was to develop theoretical models to predict large population dynamics, as well as computer tools to assess the parameters of these models and to test management scenarios. The Alpine ibex (Capra ibex ibex), which has expanded spectacularly since its reintroduction in Switzerland at the beginning of the 20th century, was used as the paradigm species. This task was achieved in three steps. A local population dynamics model was first developed specifically for the ibex: the underlying age- and sex-structured model is based on a Leslie matrix approach with the addition of density dependence, environmental stochasticity and culling. This model was implemented in a management-support software package, named SIM-Ibex, allowing census data maintenance, automated parameter assessment, and the tuning and simulation of culling strategies. However, population dynamics are driven not only by demographic factors but also by dispersal and colonisation of new areas. Habitat suitability and obstacle modelling therefore also had to be addressed. Thus, a software package named Biomapper was developed. Its central module is based on the Ecological Niche Factor Analysis (ENFA), whose principle is to compute niche marginality and specialisation factors from a set of environmental predictors and species presence data. All Biomapper modules are linked to Geographic Information Systems (GIS); they cover all operations of data importation, predictor preparation, ENFA and habitat suitability map computation, and result validation and further processing; a module also allows mapping of dispersal barriers and corridors. The ENFA application domain was then explored by means of a simulated species distribution. It was compared to a commonly used habitat suitability assessment method, the Generalised Linear Model (GLM), and proved better suited for spreading or cryptic species. Demographic and landscape information was finally merged into a global model. To cope with landscape realism and the technical constraints of large population modelling, a cellular automaton approach was chosen: the study area is modelled by a lattice of hexagonal cells, each characterised by a few fixed properties (a carrying capacity and six impermeability rates quantifying exchanges between adjacent cells) and one variable, population density. The latter varies according to local reproduction/survival and dispersal dynamics, modified by density dependence and stochasticity. A software tool named HexaSpace was developed, which fulfils two functions: (1) calibrating the automaton on the basis of local population dynamics models (e.g., computed by SIM-Ibex) and a habitat suitability map (e.g., computed by Biomapper); (2) running simulations. It allows studying the spread of an invading species across a complex landscape made of variously suitable areas and dispersal barriers. This model was applied to the history of the ibex reintroduction in the Bernese Alps (Switzerland). SIM-Ibex is now used by governmental wildlife managers to prepare and verify culling plans. Biomapper has been applied to several species (both plants and animals) around the world. Likewise, although HexaSpace was originally designed for terrestrial animal species, it could easily be extended to model plant propagation or the dispersal of flying animals. As these programs were designed to build a complex, realistic model from low-level data, and as they benefit from an intuitive user interface, they have many potential applications in conservation biology. Moreover, theoretical questions in the fields of population and landscape ecology might also be addressed by these approaches.
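As a minimal illustration of the age-structured Leslie-matrix approach underlying SIM-Ibex, with density dependence and environmental stochasticity folded in, here is a hypothetical Python sketch; all demographic rates are invented, not the calibrated ibex parameters.

```python
# Minimal age-structured projection of the SIM-Ibex kind: Leslie matrix
# plus density dependence and environmental noise. All demographic rates
# below are hypothetical, not the calibrated ibex parameters.
import numpy as np

rng = np.random.default_rng(42)
fecundity = np.array([0.0, 0.4, 0.8, 0.8])   # offspring per individual per age class (assumed)
survival = np.array([0.6, 0.8, 0.9])         # survival from age i to i+1 (assumed)
K = 500.0                                     # carrying capacity (assumed)

def leslie(fec, surv):
    """Build a Leslie matrix: fecundities on the top row, survival on the sub-diagonal."""
    L = np.zeros((len(fec), len(fec)))
    L[0, :] = fec
    L[np.arange(1, len(fec)), np.arange(len(fec) - 1)] = surv
    return L

n = np.array([100.0, 50.0, 30.0, 20.0])       # initial age distribution
for year in range(50):
    density = min(n.sum() / K, 1.0)
    eps = rng.normal(0.0, 0.1)                # environmental stochasticity
    # density dependence and noise scale fecundity (one simple choice of many)
    L = leslie(fecundity * (1 - density) * np.exp(eps), survival)
    n = L @ n
print(f"population after 50 years: {n.sum():.0f}")
```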