43 results for decision support systems, GIS, interpolation, multiple regression


Relevance:

100.00%

Publisher:

Abstract:

Objectives Medical futility at the end of life is a growing challenge to medicine. The goals of the authors were to elucidate how clinicians define futility, when they perceive life-sustaining treatment (LST) to be futile, how they communicate this situation and why LST is sometimes continued despite being recognised as futile. Methods The authors reviewed ethics case consultation protocols and conducted semi-structured interviews with 18 physicians and 11 nurses from adult intensive and palliative care units at a tertiary hospital in Germany. The transcripts were subjected to qualitative content analysis. Results Futility was identified in the majority of case consultations. Interviewees associated futility with the failure to achieve goals of care that offer a benefit to the patient's quality of life and are proportionate to the risks, harms and costs. Prototypic examples mentioned are situations of irreversible dependence on LST, advanced metastatic malignancies and extensive brain injury. Participants agreed that futility should be assessed by physicians after consultation with the care team. Intensivists favoured an indirect and stepwise disclosure of the prognosis. Palliative care clinicians focused on a candid and empathetic information strategy. The reasons for continuing futile LST are primarily emotional, such as guilt, grief, fear of legal consequences and concerns about the family's reaction. Other obstacles are organisational routines, insufficient legal and palliative knowledge and treatment requests by patients or families. Conclusion Managing futility could be improved by communication training, knowledge transfer, organisational improvements and emotional and ethical support systems. The authors propose an algorithm for end-of-life decision making focusing on goals of treatment.

Abstract:

BACKGROUND: Maintaining therapeutic concentrations of drugs with a narrow therapeutic window is a complex task. Several computer systems have been designed to help doctors determine optimum drug dosage. Significant improvements in health care could be achieved if computer advice improved health outcomes and could be implemented in routine practice in a cost effective fashion. This is an updated version of an earlier Cochrane systematic review, by Walton et al, published in 2001. OBJECTIVES: To assess whether computerised advice on drug dosage has beneficial effects on the process or outcome of health care. SEARCH STRATEGY: We searched the Cochrane Effective Practice and Organisation of Care Group specialized register (June 1996 to December 2006), MEDLINE (1966 to December 2006), EMBASE (1980 to December 2006), hand searched the journal Therapeutic Drug Monitoring (1979 to March 2007) and the Journal of the American Medical Informatics Association (1996 to March 2007) as well as reference lists from primary articles. SELECTION CRITERIA: Randomized controlled trials, controlled trials, controlled before and after studies and interrupted time series analyses of computerized advice on drug dosage were included. The participants were health professionals responsible for patient care. The outcomes were: any objectively measured change in the behaviour of the health care provider (such as changes in the dose of drug used); any change in the health of patients resulting from computerized advice (such as adverse reactions to drugs). DATA COLLECTION AND ANALYSIS: Two reviewers independently extracted data and assessed study quality. MAIN RESULTS: Twenty-six comparisons (23 articles) were included (as compared to fifteen comparisons in the original review) including a wide range of drugs in inpatient and outpatient settings. Interventions usually targeted doctors although some studies attempted to influence prescriptions by pharmacists and nurses. 
Although all studies used reliable outcome measures, their quality was generally low. Computerized advice for drug dosage gave significant benefits by: (1) increasing the initial dose (standardised mean difference 1.12, 95% CI 0.33 to 1.92); (2) increasing serum concentrations (standardised mean difference 1.12, 95% CI 0.43 to 1.82); (3) reducing the time to therapeutic stabilisation (standardised mean difference -0.55, 95% CI -1.03 to -0.08); (4) reducing the risk of toxic drug levels (rate ratio 0.45, 95% CI 0.30 to 0.70); (5) reducing the length of hospital stay (standardised mean difference -0.35, 95% CI -0.52 to -0.17). AUTHORS' CONCLUSIONS: This review suggests that computerized advice for drug dosage has some benefits: it increased the initial dose of drug, increased serum drug concentrations and led to more rapid therapeutic control. It also reduced the risk of toxic drug levels and the length of time spent in the hospital. However, it had no effect on adverse reactions. In addition, there was no evidence to suggest that some decision support technical features (such as its integration into a computer physician order entry system) or aspects of organization of care (such as the setting) could optimise the effect of computerised advice.
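A standardised mean difference of the kind pooled above is straightforward to compute from group summary statistics. A minimal sketch (the input numbers are invented, not drawn from the review; the CI uses a large-sample normal approximation):

```python
import math

def standardized_mean_difference(m1, sd1, n1, m2, sd2, n2):
    """Cohen's d with a pooled SD, plus an approximate 95% CI.
    Inputs are group means, SDs and sizes; values below are illustrative."""
    sd_pooled = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sd_pooled
    # Large-sample variance of d (Hedges & Olkin approximation)
    var_d = (n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2))
    se = math.sqrt(var_d)
    return d, (d - 1.96 * se, d + 1.96 * se)

d, ci = standardized_mean_difference(12.0, 3.0, 40, 9.0, 3.0, 40)
print(round(d, 2), [round(x, 2) for x in ci])
```

A CI that excludes zero, as for the dose and concentration outcomes above, indicates a statistically significant pooled effect.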

Abstract:

BACKGROUND: Whole pelvis intensity modulated radiotherapy (IMRT) is increasingly being used to treat cervical cancer with the aim of reducing side effects. Encouraged by this, some groups have proposed the use of a simultaneous integrated boost (SIB) to target the tumor, either to achieve a higher tumoricidal effect or to replace brachytherapy. Nevertheless, physiological organ movement and rapid tumor regression throughout treatment might substantially reduce any benefit of this approach. PURPOSE: To evaluate clinical target volume - simultaneous integrated boost (CTV-SIB) regression and motion during chemo-radiotherapy (CRT) for cervical cancer, and to monitor treatment progress dosimetrically and volumetrically to ensure treatment goals are met. METHODS AND MATERIALS: Ten patients treated with standard doses of CRT and brachytherapy were retrospectively re-planned using a helical Tomotherapy SIB technique for the hypothetical scenario of this feasibility study. Target and organs at risk (OAR) were contoured on deformably fused planning computed tomography and megavoltage computed tomography images. The CTV-SIB volume regression was determined. The center of mass (CM) was used to evaluate the degree of motion. The Dice similarity coefficient (DSC) was used to assess the spatial overlap of CTV-SIBs between scans. A cumulative dose-volume histogram was used to model the estimated delivered doses. RESULTS: The CTV-SIB relative reduction was between 31% and 70%. The mean maximum CM change was 12.5, 9, and 3 mm in the superior-inferior, antero-posterior, and right-left dimensions, respectively. The CTV-SIB DSC approached 1 in the first week of treatment, indicating almost perfect overlap. The CTV-SIB DSC regressed linearly during therapy, and by the end of treatment was 0.5, indicating 50% discordance. Two patients received less than 95% of the prescribed dose. Much higher doses to the OAR were observed.
A multiple regression analysis showed a significant interaction between CTV-SIB reduction and OAR dose increase. CONCLUSIONS: The CTV-SIB had important regression and motion during CRT, receiving lower therapeutic doses than expected. The OAR had unpredictable shifts and received higher doses. The use of SIB without frequent adaptation of the treatment plan exposes cervical cancer patients to an unpredictable risk of under-dosing the target and/or overdosing adjacent critical structures. In that scenario, brachytherapy continues to be the gold standard approach.
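The Dice similarity coefficient used above has a simple definition, DSC = 2|A ∩ B| / (|A| + |B|), ranging from 0 (disjoint) to 1 (perfect overlap). A minimal sketch on invented voxel sets that mimic target regression between scans:

```python
def dice_coefficient(a, b):
    """Dice similarity coefficient between two voxel sets:
    DSC = 2|A ∩ B| / (|A| + |B|)."""
    a, b = set(a), set(b)
    if not a and not b:
        return 1.0
    return 2 * len(a & b) / (len(a) + len(b))

# Hypothetical voxel indices for a target contoured on two scans
week1 = {(x, y, 0) for x in range(10) for y in range(10)}   # 100 voxels
week5 = {(x, y, 0) for x in range(5) for y in range(10)}    # regressed to 50
print(round(dice_coefficient(week1, week5), 3))
```

Note that pure shrinkage of a contained target already drives the DSC down, which is why regression alone degrades overlap even without motion.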

Abstract:

In Switzerland there is a strong movement at the national policy level towards strengthening patient rights and patient involvement in health care decisions. Yet there is no national programme promoting shared decision making (SDM). The first decision support tools for the counselling process (prenatal diagnosis and screening) have been developed and implemented. Although Swiss doctors acknowledge that shared decision making is important, hierarchical structures and asymmetric physician-patient relationships still prevail. Recent years have seen some promising activities regarding the training of medical students and the development of patient support programmes. Swiss direct democracy and the general habit of consensual decision making and citizen involvement may provide fertile ground for SDM development in the primary care setting.

Abstract:

Extracorporeal life support systems (ECLS) have become common in cardiothoracic surgery, but are still "terra incognita" in other medical fields because perfusion units are normally bound to cardiothoracic centres. The Lifebridge B2T is an ECLS meant to be used as an easy, fast-track extracorporeal cardiac support providing short-term perfusion for the transport of a patient to a specialized centre. With the Lifebridge B2T it is now possible to provide extracorporeal bypass for patients in hospitals without a perfusion unit. The Lifebridge B2T was tested on three calves to analyze the handling, performance and safety of the system. The Lifebridge B2T can safely be used clinically and can provide full extracorporeal support for patients in cardiac or pulmonary failure. Flows up to 3.9 +/- 0.2 l/min were reached, with an inflow pressure of -103 +/- 13 mmHg, using a 21 Fr. BioMedicus (Medtronic, Minneapolis, MN, USA) venous cannula. The "Plug and Play" philosophy, with semi-automatic priming, an integrated checklist, a long battery time of over two hours and an intuitively designed user interface, makes this device very interesting for units with high-risk interventions, such as catheterisation labs. If a system is necessary in an emergency unit, the Lifebridge can provide a high level of safety, even in centres not acquainted with cardiopulmonary bypass.

Abstract:

ABSTRACT: A firm's competitive advantage can arise from internal resources as well as from an interfirm network. This dissertation investigates the competitive advantage of a firm involved in an innovation network by integrating strategic management theory and social network theory. It develops theory and provides empirical evidence illustrating how a networked firm enables network value and appropriates this value in an optimal way according to its strategic purpose. The four inter-related essays in this dissertation provide a framework that sheds light on the extraction of value from an innovation network by managing and designing the network in a proactive manner. The first essay reviews research in social network theory and knowledge transfer management, and identifies the crucial factors of innovation network configuration for a firm's learning performance or innovation output. The findings suggest that network structure, network relationships, and network position all affect a firm's performance. Although the previous literature indicates disagreement about the impact of dense versus sparse structures, as well as strong versus weak ties, case evidence from Chinese software companies reveals that dense and strong connections with partners are positively associated with firms' performance. The second essay is a theoretical essay that illustrates the limitations of social network theory for explaining the source of network value and offers a new theoretical model that applies the resource-based view to network environments. It suggests that network configurations, such as network structure, network relationship and network position, can be considered important network resources. In addition, this essay introduces the concept of network capability, and suggests that four types of network capabilities play an important role in unlocking the potential value of network resources and determining the distribution of network rents between partners.
This essay also highlights the contingent effects of network capability on a firm's innovation output, and explains how the different impacts of network capability depend on a firm's strategic choices. This new theoretical model was pre-tested with a case study of the Chinese software industry, which enhances the internal validity of the theory. The third essay addresses the questions of what impact network capability has on firm innovation performance and what the antecedent factors of network capability are. This essay employs a structural equation modelling methodology on a sample of 211 Chinese high-tech firms. It develops a measurement of network capability and reveals that networked firms deal with cooperation and coordination with partners on different levels according to their levels of network capability. The empirical results also suggest that IT maturity, openness of culture, management system involvement, and experience with network activities are antecedents of network capability. Furthermore, a two-group analysis of the role of international partner(s) shows that when there is a culture and norm gap with foreign partners, a firm must mobilize more resources and effort to improve its performance with respect to its innovation network. The fourth essay addresses the way in which network capabilities influence firm innovation performance. Using hierarchical multiple regression with data from Chinese high-tech firms, the findings suggest that there is a significant partial mediating effect of knowledge transfer on the relationship between network capabilities and innovation performance. The findings also reveal that the impact of network capabilities differs with the environment and the strategic choice the firm has made: exploration or exploitation.
Network constructing capability has a greater positive impact on, and contributes more to, innovation performance than network operating capability does in an exploration network. Network operating capability is more important than network constructing capability for innovative firms in an exploitation network. These findings therefore highlight that a firm can proactively shape its innovation network for better benefits, but when it does so, it should adjust its focus and efforts in accordance with its innovation purposes or strategic orientation.
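The partial mediation reported in the fourth essay has a standard signature: the coefficient on network capability shrinks, but stays positive, once knowledge transfer is added to the regression. A self-contained sketch on synthetic data (the coefficients and the small OLS helper are illustrative, not the dissertation's model):

```python
import random

def ols(X_cols, y):
    """Least squares via the normal equations (Gauss-Jordan elimination).
    X_cols are predictor columns; an intercept is added automatically."""
    n = len(y)
    X = [[1.0] + [col[i] for col in X_cols] for i in range(n)]
    k = len(X[0])
    A = [[sum(X[i][r] * X[i][c] for i in range(n)) for c in range(k)] for r in range(k)]
    b = [sum(X[i][r] * y[i] for i in range(n)) for r in range(k)]
    for p in range(k):
        piv = A[p][p]
        for c in range(p, k):
            A[p][c] /= piv
        b[p] /= piv
        for r in range(k):
            if r != p and A[r][p]:
                f = A[r][p]
                for c in range(p, k):
                    A[r][c] -= f * A[p][c]
                b[r] -= f * b[p]
    return b  # [intercept, beta1, ...]

random.seed(1)
n = 500
capability = [random.random() * 2 - 1 for _ in range(n)]                  # X
transfer = [0.6 * x + (random.random() - 0.5) for x in capability]        # mediator M
innovation = [0.3 * x + 0.4 * m + (random.random() - 0.5)
              for x, m in zip(capability, transfer)]                      # Y

c_total = ols([capability], innovation)[1]             # total effect of X on Y
c_direct = ols([capability, transfer], innovation)[1]  # direct effect, M controlled
print(c_total > c_direct > 0)  # partial mediation: effect shrinks but persists
```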

Abstract:

RATIONALE: An objective and simple prognostic model for patients with pulmonary embolism could be helpful in guiding initial intensity of treatment. OBJECTIVES: To develop a clinical prediction rule that accurately classifies patients with pulmonary embolism into categories of increasing risk of mortality and other adverse medical outcomes. METHODS: We randomly allocated 15,531 inpatient discharges with pulmonary embolism from 186 Pennsylvania hospitals to derivation (67%) and internal validation (33%) samples. We derived our prediction rule using logistic regression with 30-day mortality as the primary outcome, and patient demographic and clinical data routinely available at presentation as potential predictor variables. We externally validated the rule in 221 inpatients with pulmonary embolism from Switzerland and France. MEASUREMENTS: We compared mortality and nonfatal adverse medical outcomes across the derivation and two validation samples. MAIN RESULTS: The prediction rule is based on 11 simple patient characteristics that were independently associated with mortality and stratifies patients with pulmonary embolism into five severity classes, with 30-day mortality rates of 0-1.6% in class I, 1.7-3.5% in class II, 3.2-7.1% in class III, 4.0-11.4% in class IV, and 10.0-24.5% in class V across the derivation and validation samples. Inpatient death and nonfatal complications were ≤1.1% among patients in class I and ≤1.9% among patients in class II. CONCLUSIONS: Our rule accurately classifies patients with pulmonary embolism into classes of increasing risk of mortality and other adverse medical outcomes. Further validation of the rule is important before its implementation as a decision aid to guide the initial management of patients with pulmonary embolism.
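A prediction rule of this form is typically applied at the bedside as an additive point score mapped to class cutoffs. The sketch below shows the mechanics only; the weights and cutoffs are hypothetical placeholders, not the published rule:

```python
def risk_score(age, male, cancer, hr, sbp):
    """Toy additive score over a few of the kinds of bedside predictors
    such a rule uses. All weights are hypothetical."""
    return (age + (10 if male else 0) + (30 if cancer else 0)
            + (20 if hr >= 110 else 0) + (30 if sbp < 100 else 0))

def severity_class(points):
    """Map a risk score to one of five severity classes.
    Cutoffs are illustrative placeholders, not the derived rule."""
    cutoffs = [(65, "I"), (85, "II"), (105, "III"), (125, "IV")]
    for limit, label in cutoffs:
        if points <= limit:
            return label
    return "V"

print(severity_class(risk_score(age=47, male=False, cancer=False, hr=80, sbp=130)))
```

An additive score keeps the rule transparent and computable without software, which is one reason logistic-regression-derived rules are often converted to integer points.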

Abstract:

Orientation: Research that considers the effects of individual characteristics and job characteristics jointly in burnout is necessary, especially when one considers the possibility of curvilinear relationships between job characteristics and burnout. Research purpose: This study examines the contribution of sense of coherence (SOC) and job characteristics to predicting burnout by considering direct and moderating effects. Motivation for this study: Understanding the relationships of individual and job characteristics with burnout is necessary for preventing burnout. It also informs the design of interventions. Research design, approach and method: The participants were 632 working adults (57% female) in South Africa. The measures included the Job Content Questionnaire, the Sense of Coherence Questionnaire and the Maslach Burnout Inventory. The authors analysed the data using hierarchical multiple regression with the enter method. Main findings: Job characteristics and SOC show the expected direct effects on burnout. SOC has a direct negative effect on burnout. Job demands and supervisor social support show nonlinear relationships with burnout. SOC moderates the effect of demands on burnout and has a protective function so that the demands-burnout relationship differs for those with high and low SOC. Practical/managerial implications: The types of effects, the shape of the stressor-strain relationship and the different contributions of individual and job characteristics have implications for designing interventions. Contribution/value add: SOC functions differently when combined with demands, control and support. These different effects suggest that it is not merely the presence or absence of a job characteristic that is important for well-being outcomes but how people respond to its presence or absence.
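The combination of a curvilinear demands term and a demands × SOC interaction can be made concrete with a small regression surface. All coefficients below are invented for illustration; they merely encode the buffering pattern described:

```python
def burnout_hat(demands, soc, b=(2.0, 0.8, -0.5, 0.1, -0.3)):
    """Moderated regression surface with a curvilinear demands term:
    burnout = b0 + b1*D + b2*SOC + b3*D**2 + b4*D*SOC.
    Coefficients are invented, chosen so that a strong SOC buffers
    (flattens) the demands-burnout slope."""
    b0, b1, b2, b3, b4 = b
    return b0 + b1 * demands + b2 * soc + b3 * demands**2 + b4 * demands * soc

def demands_slope(demands, soc, b=(2.0, 0.8, -0.5, 0.1, -0.3)):
    """d(burnout)/d(demands): the quadratic term makes the slope depend on
    the demand level; the interaction term makes it depend on SOC."""
    b0, b1, b2, b3, b4 = b
    return b1 + 2 * b3 * demands + b4 * soc

# The demands-burnout relationship is steeper for low-SOC individuals
print(demands_slope(1.0, soc=-1.0), demands_slope(1.0, soc=1.0))
```

Plotting simple slopes at high and low moderator values, as done here analytically, is the usual way such interaction effects are probed.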

Abstract:

BACKGROUND: Conversion of glucose into lipid (de novo lipogenesis; DNL) is a possible fate of carbohydrate administered during nutritional support. It cannot be detected by conventional methods such as indirect calorimetry if it does not exceed lipid oxidation. OBJECTIVE: The objective was to evaluate the effects of carbohydrate administered as part of continuous enteral nutrition in critically ill patients. DESIGN: This was a prospective, open study including 25 patients nonconsecutively admitted to a medicosurgical intensive care unit. Glucose metabolism and hepatic DNL were measured in the fasting state or after 3 d of continuous isoenergetic enteral feeding providing 28%, 53%, or 75% carbohydrate. RESULTS: DNL increased with increasing carbohydrate intake (mean +/- SEM: 7.5 +/- 1.2% with 28% carbohydrate, 9.2 +/- 1.5% with 53% carbohydrate, and 19.4 +/- 3.8% with 75% carbohydrate) and was nearly zero in a group of patients who had fasted for an average of 28 h (1.0 +/- 0.2%). In multiple regression analysis, DNL was correlated with carbohydrate intake, but not with body weight or plasma insulin concentrations. Endogenous glucose production, assessed with a dual-isotope technique, was not significantly different between the 3 groups of patients (13.7-15.3 micromol * kg(-1) * min(-1)), indicating impaired suppression by carbohydrate feeding. Gluconeogenesis was measured with [(13)C]bicarbonate, and increased as the carbohydrate intake increased (from 2.1 +/- 0.5 micromol * kg(-1) * min(-1) with 28% carbohydrate intake to 3.7 +/- 0.3 micromol * kg(-1) * min(-1) with 75% carbohydrate intake, P < 0.05). CONCLUSION: Carbohydrate feeding fails to suppress endogenous glucose production and gluconeogenesis, but stimulates DNL in critically ill patients.
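The reported dose-response of DNL to carbohydrate intake can be illustrated by fitting a line through the group means quoted above (treating the fasted group as 0% carbohydrate; a simplification, since the published analysis was a multiple regression on individual patients):

```python
def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx)**2 for x in xs))
    return slope, my - slope * mx

# Group-mean DNL (%) against carbohydrate share of energy intake (%),
# taken from the abstract; fasting treated as 0% carbohydrate
carb = [0, 28, 53, 75]
dnl = [1.0, 7.5, 9.2, 19.4]
slope, intercept = linear_fit(carb, dnl)
print(slope > 0)  # DNL rises with carbohydrate intake
```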

Abstract:

In decision making, speed-accuracy trade-offs are well known and often inevitable because accuracy depends on being well informed and gathering information takes time. However, trade-offs between speed and cohesion, that is, the degree to which a group remains together as a single entity as a result of its decision making, have been comparatively neglected. We combine theory and experimentation to show that in decision-making systems, speed-cohesion trade-offs are a natural complement to speed-accuracy trade-offs and are therefore of general importance. We then analyse the decision performance of 32 rock ant, Temnothorax albipennis, colonies in experiments in which accuracy of collective decision making was held constant, but time urgency varied. These experiments reveal for the first time an adaptive speed-cohesion trade-off in collective decision making and how this is achieved. In accord with different time constraints, colonies can decide quickly, at the cost of social unity, or they can decide slowly with much greater cohesion. We discuss the similarity between cohesion and the term precision as used in statistics and engineering. This emphasizes the generality of speed versus cohesion/precision trade-offs in decision making and decision implementation in other fields within animal behaviour, such as sexually selected motor displays and even certain aspects of birdsong. We also suggest that speed versus precision trade-offs may occur when individuals within a group need to synchronize their activity, and in collective navigation, cooperative hunting and certain escape behaviours.

Abstract:

The decision-making process regarding drug dose, regularly used in everyday medical practice, is critical to patients' health and recovery. It is a challenging process, especially for a drug with a narrow therapeutic range, in which a medical doctor decides the quantity (dose amount) and frequency (dose interval) on the basis of a set of available patient features and the doctor's clinical experience (a priori adaptation). Computer support in drug dose administration makes the prescription procedure faster, more accurate, more objective, and less expensive, with a tendency to reduce the number of invasive procedures. This paper presents an advanced integrated Drug Administration Decision Support System (DADSS) to help clinicians and patients with dose computation. Based on a support vector machine (SVM) algorithm, enhanced with the random sample consensus technique, this system is able to predict drug concentration values and computes the ideal dose amount and dose interval for a new patient. With an extension that combines the SVM method and an explicit analytical model, the advanced integrated DADSS system is able to compute drug concentration-to-time curves for a patient under different conditions. A feedback loop is enabled to update the curve with a newly measured concentration value to make it more personalized (a posteriori adaptation).
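The random sample consensus (RANSAC) step mentioned above fits a model to repeated minimal samples and keeps the fit with the most inliers, which makes the prediction robust to outlying concentration measurements. A generic stdlib sketch with a linear model and invented data points (not the DADSS implementation, which wraps an SVM):

```python
import random

def fit_line(pts):
    """Least-squares line through (time, concentration) points."""
    n = len(pts)
    mx = sum(p[0] for p in pts) / n
    my = sum(p[1] for p in pts) / n
    sxx = sum((x - mx)**2 for x, _ in pts)
    sxy = sum((x - mx) * (y - my) for x, y in pts)
    a = sxy / sxx
    return a, my - a * mx

def ransac(points, n_iter=200, tol=1.0, seed=42):
    """Random sample consensus: repeatedly fit to a random minimal
    sample and keep the model with the largest inlier set."""
    rng = random.Random(seed)
    best_inliers = []
    for _ in range(n_iter):
        sample = rng.sample(points, 2)
        if sample[0][0] == sample[1][0]:
            continue  # degenerate sample, cannot fit a line
        a, b = fit_line(sample)
        inliers = [(x, y) for x, y in points if abs(a * x + b - y) < tol]
        if len(inliers) > len(best_inliers):
            best_inliers = inliers
    return fit_line(best_inliers)  # refit on the consensus set

# Hypothetical measurements: a linear trend plus two gross outliers
points = [(t, 2.0 * t + 1.0) for t in range(10)] + [(3, 25.0), (7, -10.0)]
a, b = ransac(points)
print(round(a, 1), round(b, 1))
```

A plain least-squares fit over all twelve points would be dragged toward the outliers; the consensus fit recovers the underlying trend.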

Abstract:

The aim of the present study was to determine the cycle length of spermatogenesis in three species of shrew, Suncus murinus, Sorex coronatus and Sorex minutus, and to assess the relative influence of variation in basal metabolic rate (BMR) and mating system (level of sperm competition) on the observed rate of spermatogenesis, including data from shrew species studied previously (Sorex araneus, Crocidura russula and Neomys fodiens). The dynamics of sperm production were determined by tracing 5-bromodeoxyuridine in the DNA of germ cells. As a continuous scaling of mating systems is not evident, the level of sperm competition was evaluated by the significantly correlated relative testis size (RTS). The cycle durations estimated by linear regression were 14.3 days (RTS 0.3%) in Suncus murinus, 9.0 days (RTS 0.5%) in Sorex coronatus and 8.5 days (RTS 2.8%) in Sorex minutus. In regression and multiple regression analyses including all six studied species of shrew, cycle length was significantly correlated with BMR (r2=0.73) and RTS (r2=0.77). Sperm competition as an ultimate factor evidently leads to a reduction in the duration of spermatogenesis in order to increase sperm production. BMR may act in the same way, independently or as a proximate factor, as revealed by the covariation, but other factors (related to testis size and thus to mating system) may also be involved.
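The regression of cycle length on relative testis size can be reproduced for the three species measured here (the published r² values rest on all six shrew species, so this is only a partial illustration of the direction of the effect):

```python
def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx)**2 for x in xs))
    return slope, my - slope * mx

# RTS (%) and spermatogenic cycle length (days) for the three species
# measured in this study, as reported in the abstract
rts = [0.3, 0.5, 2.8]
cycle = [14.3, 9.0, 8.5]
slope, intercept = linear_fit(rts, cycle)
print(slope < 0)  # higher sperm competition, shorter cycle
```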

Abstract:

Conservation biology is commonly associated with the protection of small, endangered populations. Nevertheless, large or potentially large populations may also need human management to prevent the negative effects of overpopulation. As there are both qualitative and quantitative differences between protecting small populations and controlling large ones, distinct methods and models are needed. The aim of this work was to develop theoretical models to predict the dynamics of large populations, as well as computer tools to assess the parameters of these models and to test management scenarios. The Alpine ibex (Capra ibex ibex) - which has experienced a spectacular increase since its reintroduction in Switzerland at the beginning of the 20th century - was used as a paradigm species.
This task was achieved in three steps. A local population dynamics model was first developed specifically for the ibex: the underlying age- and sex-structured model is based on a Leslie matrix approach with the addition of density dependence, environmental stochasticity and culling. This model was implemented in a management-support software package - named SIM-Ibex - allowing census data maintenance, automated parameter assessment, and the tuning and simulation of culling strategies. However, population dynamics is driven not only by demographic factors, but also by dispersal and colonisation of new areas. Habitat suitability and dispersal obstacles therefore had to be modelled as well. Thus, a software package - named Biomapper - was developed. Its central module is based on the Ecological Niche Factor Analysis (ENFA), whose principle is to compute niche marginality and specialisation factors from a set of environmental predictors and species presence data. All Biomapper modules are linked to Geographic Information Systems (GIS); they cover all operations of data importation, predictor preparation, ENFA and habitat suitability map computation, and results validation and further processing; a module also allows mapping of dispersal barriers and corridors. The ENFA application domain was then explored by means of a simulated species distribution. It was compared to a commonly used habitat suitability method, the Generalised Linear Model (GLM), and proved better suited for spreading or cryptic species. Demographic and landscape information was finally merged into a global model. To cope with landscape realism and the technical constraints of large population modelling, a cellular automaton approach was chosen: the study area is modelled by a lattice of hexagonal cells, each characterised by a few fixed properties - a carrying capacity and six impermeability rates quantifying exchanges between adjacent cells - and one variable, population density.
The latter varies according to local reproduction/survival and dispersal dynamics, modified by density dependence and stochasticity. A software package - named HexaSpace - was developed to perform two functions: (1) calibrating the automaton on the basis of local population dynamics models (e.g., computed by SIM-Ibex) and a habitat suitability map (e.g., computed by Biomapper); (2) running simulations. It allows studying the spread of an invading species across a complex landscape made of variously suitable areas and dispersal barriers. This model was applied to the history of the ibex reintroduction in the Bernese Alps (Switzerland). SIM-Ibex is now used by governmental wildlife managers to prepare and verify culling plans. Biomapper has been applied to several species (both plants and animals) around the world. In the same way, whilst HexaSpace was originally designed for terrestrial animal species, it could easily be extended to model plant propagation or the dispersal of flying animals. As this software was designed to proceed from low-level data to a complex, realistic model, and as it benefits from an intuitive user interface, it lends itself to many applications in conservation biology. Moreover, theoretical questions in the fields of population and landscape ecology might also be addressed by these approaches.
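The demographic core described above - a Leslie matrix with added density dependence and culling - can be sketched in a few lines. This is a deliberately simplified toy with three age classes and invented vital rates, not the SIM-Ibex model:

```python
def leslie_step(n, fec, surv, K, cull=0.0):
    """One projection step of an age-structured (Leslie matrix) model with
    density-dependent survival and a proportional cull.
    n: abundance per age class; fec: per-capita births by class;
    surv: survival rate by class; K: density-dependence scale.
    All rates below are invented for illustration."""
    N = sum(n)
    dd = 1.0 / (1.0 + N / K)                     # density-dependent scaling
    births = sum(f * a for f, a in zip(fec, n))  # top row of the Leslie matrix
    aged = [s * dd * a for s, a in zip(surv, n)]
    new = [births] + aged[:-1]
    new[-1] += aged[-1]                          # adults stay in the last class
    return [(1.0 - cull) * a for a in new]

fec = [0.0, 0.4, 0.8]    # juveniles, subadults, adults (hypothetical)
surv = [0.6, 0.8, 0.9]
pop = [30.0, 20.0, 50.0]
for _ in range(200):
    pop = leslie_step(pop, fec, surv, K=500.0)
print(round(sum(pop)))   # settles near a density-dependent equilibrium
```

Raising the `cull` rate in such a model lowers the equilibrium abundance, which is the lever a regulation strategy tunes.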