122 results for: Model methodology of empirical research in communication
Abstract:
BACKGROUND: The potential effects of ionizing radiation are of particular concern in children. VEO(TM) is a commercial model-based iterative reconstruction technique designed to improve image quality and reduce noise compared with the filtered back-projection (FBP) method. OBJECTIVE: To evaluate the potential of VEO(TM) for diagnostic image quality and dose reduction in pediatric chest CT examinations. MATERIALS AND METHODS: Twenty children (mean age 11.4 years) with cystic fibrosis underwent either a standard CT or a moderately reduced-dose CT plus a minimum-dose CT performed at 100 kVp. Reduced-dose CT examinations consisted of two consecutive acquisitions: one moderately reduced-dose CT with increased noise index (NI = 70) and one minimum-dose CT at CTDIvol 0.14 mGy. Standard CTs were reconstructed using the FBP method, while low-dose CTs were reconstructed using both FBP and VEO. Two senior radiologists independently evaluated diagnostic image quality by scoring anatomical structures on a four-point scale (1 = excellent, 2 = clear, 3 = diminished, 4 = non-diagnostic). Standard deviation (SD) and signal-to-noise ratio (SNR) were also computed. RESULTS: At moderately reduced doses, VEO images had significantly lower SD (P < 0.001) and higher SNR (P < 0.05) than FBP images. Further improvements were obtained at minimum-dose CT. The best diagnostic image quality was obtained with VEO at minimum-dose CT for the small structures (subpleural vessels and lung fissures) (P < 0.001). The potential for dose reduction depended on the diagnostic task because of the modification of image texture produced by this reconstruction. CONCLUSIONS: At minimum-dose CT, VEO enables substantial dose reduction, depending on the clinical indication, and makes visible certain small structures that were not perceptible with FBP.
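As a side note, the SD and SNR reported above are typically measured in a region of interest (ROI) of the reconstructed image. A minimal sketch of that computation follows, with a synthetic image and an arbitrary ROI standing in for details the abstract does not give:

```python
# A minimal sketch of ROI-based noise (SD) and SNR measurement on a CT
# slice, assuming values in Hounsfield units. The synthetic image and the
# ROI position are hypothetical; the abstract does not specify them.
import numpy as np

def roi_stats(image: np.ndarray, rows: slice, cols: slice) -> tuple[float, float]:
    """Return (SD, SNR) for a rectangular ROI of a 2-D slice."""
    roi = image[rows, cols]
    sd = float(roi.std())          # noise estimate
    snr = float(roi.mean()) / sd   # signal-to-noise ratio
    return sd, snr

rng = np.random.default_rng(0)
slice_hu = rng.normal(loc=50.0, scale=12.0, size=(512, 512))  # stand-in slice
sd, snr = roi_stats(slice_hu, slice(200, 260), slice(200, 260))
print(f"SD = {sd:.1f} HU, SNR = {snr:.2f}")
```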
Abstract:
BACKGROUND: Dysregulation of voltage-gated sodium channels contributes to the hyperexcitability that underlies pain persistence. The sodium channel blockers currently used to treat neuropathic pain are poorly tolerated, and bringing new molecules to clinical use is laborious. We here propose rufinamide, a drug already marketed as an anticonvulsant. METHODS: We compared the behavioral effect of rufinamide to amitriptyline using the Spared Nerve Injury neuropathic pain model in mice. Using in vitro patch clamp, we compared the effect of rufinamide with amitriptyline and mexiletine on sodium currents in cells expressing the voltage-gated sodium channel Nav1.7 isoform and in dissociated dorsal root ganglion neurons. RESULTS: In naive mice, amitriptyline (20 mg/kg) increased the withdrawal threshold to mechanical stimulation from 1.3 (0.6-1.9) (median [95% CI]) to 2.3 g (2.2-2.5) and the latency of withdrawal to heat stimulation from 13.1 (10.4-15.5) to 30.0 s (21.8-31.9), whereas rufinamide had no effect. Rufinamide and amitriptyline alleviated injury-induced mechanical allodynia for 4 h (maximal effect: 0.10 ± 0.03 g (mean ± SD) to 1.99 ± 0.26 g for rufinamide and 0.25 ± 0.22 g to 1.92 ± 0.85 g for amitriptyline). All drugs reduced peak current and stabilized the inactivated state of voltage-gated sodium channel Nav1.7, with similar effects in dorsal root ganglion neurons. CONCLUSIONS: At doses alleviating neuropathic pain, amitriptyline altered behavioral responses, possibly reflecting altered basal pain sensitivity, a sedative effect, or both. Side effects and drug tolerance/compliance are major problems with drugs such as amitriptyline. Rufinamide appears to have a better tolerability profile and could be a new alternative to explore for the treatment of neuropathic pain.
Abstract:
OBJECTIVE: To assess the implementation process and economic impact of a new pharmaceutical care service provided since 2002 by pharmacists in Swiss nursing homes. SETTING: 42 nursing homes located in the canton of Fribourg, Switzerland, under the responsibility of 22 pharmacists. METHOD: We developed several facilitators, such as a monitoring system, a coaching program, and a research project, to help pharmacists change their practice and to improve implementation of this new service. We evaluated the implementation rate of the service delivered in nursing homes, and we assessed the economic impact of the service since its start in 2002 using statistical evaluation (Chow test) with a retrospective analysis of the annual drug costs per resident over an 8-year period (1998-2005). MAIN OUTCOME MEASURES: Description of the facilitators and their role in the implementation of the service; the economic impact of the service since its start in 2002. RESULTS: In 2005, after a 4-year implementation period supported by the introduction of facilitators of practice change, all 42 nursing homes (2,214 residents) had implemented the pharmaceutical care service. The annual drug costs per resident decreased by about 16.4% between 2002 and 2005, a change that proved highly significant. The performance of the pharmacists improved continuously under a specific coaching program including an annual expert comparative report, working groups, interdisciplinary continuing-education symposia, and individual feedback. The research project also identified priorities for developing practice guidelines to prevent drug-related problems in nursing homes, especially in relation to the use of psychotropic drugs. CONCLUSION: The pharmaceutical care service was fully and successfully implemented in Fribourg's nursing homes within 4 years. These findings highlight the importance of facilitators designed to assist pharmacists in implementing practice changes. The economic impact was confirmed on a large scale, and priorities for clinical and pharmacoeconomic research were identified in order to continue improving the quality of integrated care for the elderly.
Abstract:
This chapter describes the profile of the health impact assessment (HIA), provides insight into the HIA process, and gives an example of how political decisions may be made on behalf of a concerned population through an HIA approach. [Introduction p. 284]
Abstract:
The draft of the new law on the confidentiality of personal data severely curtails medical and epidemiological research. This could prove detrimental, even dangerous, to public health. The draft therefore has to be amended.
Abstract:
MOTIVATION: The analysis of molecular coevolution provides information on the potential functional and structural implications of positions along DNA sequences, and several methods are available to identify coevolving positions using probabilistic or combinatorial approaches. The specific nucleotide or amino acid profile associated with the coevolution process is, however, not estimated; only known profiles, such as the Watson-Crick constraint, are usually considered a priori in current measures of coevolution. RESULTS: Here, we propose a new probabilistic model, Coev, to identify coevolving positions and their associated profile in DNA sequences while incorporating the underlying phylogenetic relationships. The process of coevolution is modeled by a 16 × 16 instantaneous rate matrix that includes rates of transition as well as a profile of coevolution. We used simulated, empirical and illustrative data to evaluate our model and to compare it with a model of 'independent' evolution using the Akaike Information Criterion (AIC). We show that the Coev model is able to discriminate between coevolving and non-coevolving positions and provides better sensitivity and specificity than other available approaches. We further demonstrate that the identification of the profile of coevolution can shed new light on the process of dependent substitution during lineage evolution.
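To make the modelling idea concrete, here is a minimal sketch of a Coev-style rate matrix over the 16 nucleotide pairs, allowing single-position substitutions only and boosting the rate of substitutions that land on a coevolution profile. The rates, the Watson-Crick-like profile, and the branch length below are illustrative placeholders, not the paper's estimated parameters:

```python
# A sketch of a Coev-style 16x16 instantaneous rate matrix over nucleotide
# pairs: double substitutions are forbidden, and single substitutions that
# land on a profile pair get a boosted rate. Rates, profile and branch
# length are illustrative placeholders, not the paper's estimates.
import itertools
import numpy as np
from scipy.linalg import expm

NUC = "ACGT"
PAIRS = ["".join(p) for p in itertools.product(NUC, repeat=2)]  # 16 states
PROFILE = {"AT", "TA", "GC", "CG"}  # a Watson-Crick-like coevolution profile

def coev_rate_matrix(r_profile: float, r_other: float) -> np.ndarray:
    q = np.zeros((16, 16))
    for i, a in enumerate(PAIRS):
        for j, b in enumerate(PAIRS):
            if i != j and sum(x != y for x, y in zip(a, b)) == 1:
                q[i, j] = r_profile if b in PROFILE else r_other
        q[i, i] = -q[i].sum()  # rows of a rate matrix sum to zero
    return q

def aic(log_likelihood: float, n_params: int) -> float:
    """Akaike Information Criterion; the model with the lower AIC wins."""
    return 2.0 * n_params - 2.0 * log_likelihood

Q = coev_rate_matrix(r_profile=2.0, r_other=0.5)
P = expm(Q * 0.1)  # substitution probabilities along a branch of length 0.1
assert np.allclose(P.sum(axis=1), 1.0)
```

Model selection then proceeds as the abstract describes: fit both the Coev and the independent model, compute each likelihood, and keep the model with the lower AIC.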
Abstract:
Inhibition of tumor angiogenesis suppresses tumor growth and metastatic spreading in many experimental models, suggesting that anti-angiogenic drugs may be used to treat human cancer. During the past decade more than eighty molecules that showed anti-angiogenic activity in preclinical studies were tested in clinical cancer trials, but most of them failed to demonstrate any measurable anti-tumor activity and none has been approved for clinical use. Recent results stemming from trials with anti-VEGF antibodies, used alone or in combination with chemotherapy, suggest that systemic anti-angiogenic therapy may indeed have a measurable impact on cancer progression and patient survival. From the clinical studies it nevertheless became clear that the classical endpoints used in anti-cancer trials do not have sufficient discriminative power to monitor the effects of anti-angiogenic drugs. It is therefore necessary to identify and validate molecular, cellular and functional surrogate markers of angiogenesis to monitor the activity and efficacy of anti-angiogenic drugs in patients. Availability of such markers will be instrumental in re-evaluating the role of tumor angiogenesis in human cancer, identifying new molecular targets and drugs, and improving the planning, monitoring and interpretation of future studies. Future anti-angiogenesis trials integrating biological endpoints and surrogate markers of angiogenesis will require close collaboration between clinical investigators and laboratory-based researchers.
Abstract:
Discussion on improving the power of genome-wide association studies to identify candidate variants and genes generally centers on maximizing sample size; less attention is given to the role of phenotype definition and ascertainment. The authors used genome-wide data from patients infected with human immunodeficiency virus type 1 (HIV-1) to assess whether differences in type of population (622 seroconverters vs. 636 seroprevalent subjects) or in the number of measurements available for defining the phenotype resulted in differences in the effect sizes of associations between single nucleotide polymorphisms and the phenotype, HIV-1 viral load at set point. The effect estimate for the top 100 single nucleotide polymorphisms was 0.092 (95% confidence interval: 0.074, 0.110) log10 viral load (log10 copies of HIV-1 per mL of blood) greater in seroconverters than in seroprevalent subjects. The difference was even larger when the authors focused on chromosome 6 variants (0.153 log10 viral load) or on variants that achieved genome-wide significance (0.232 log10 viral load). The estimates of the genetic effects tended to be slightly larger when more viral load measurements were available, particularly among seroconverters and for variants that achieved genome-wide significance. Differences in phenotype definition and ascertainment may affect the estimated magnitude of genetic effects and should be considered when optimizing power for discovering new associations.
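For scale, differences expressed in log10 viral load convert directly to fold changes. A quick worked check of the figures above (the conversion is standard arithmetic, not taken from the paper):

```python
# Convert the reported log10 viral-load differences into fold changes.
for delta in (0.092, 0.153, 0.232):
    fold = 10 ** delta
    print(f"{delta:.3f} log10 ≈ {fold:.2f}-fold ≈ {100 * (fold - 1):.0f}% higher viral load")
```

So the reported 0.092 log10 difference corresponds to roughly a 24% higher viral load, and the 0.232 log10 difference to roughly 71% higher.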
Abstract:
1. Costs of reproduction lie at the core of basic ecological and evolutionary theories, and their existence is commonly invoked to explain adaptive processes. Despite their importance, empirical evidence for the existence and quantification of costs of reproduction in tree species comes mostly from correlational studies, and more comprehensive approaches are still missing. Manipulative experiments are a preferred approach for studying costs of reproduction, as they allow control over otherwise inherent confounding factors such as size or genetic background. 2. Here, we conducted a manipulative experiment in a Pinus halepensis common garden, removing developing cones from a group of trees and comparing growth and reproduction after treatment with a control group. We also estimated phenotypic and genetic correlations between reproductive and vegetative traits. 3. Manipulated trees grew slightly more than control trees just after treatment, a transient, marginally non-significant difference. By contrast, larger differences were observed in the number of female cones initiated 1 year after treatment, with 70% more cones in the manipulated group. Phenotypic and genetic correlations showed that smaller trees invested a higher proportion of their resources in reproduction than larger trees, which could be interpreted as indirect evidence for costs of reproduction. 4. Synthesis. This research showed a high impact of current reproduction on reproductive potential, even where the effect on vegetative growth was not significant. This has strong implications for how we understand adaptive strategies in forest trees and should encourage further interest in their still poorly known reproductive life-history traits.
Abstract:
Vitamin K antagonists (VKAs) are prescribed worldwide and remain the oral anticoagulants of choice. These drugs are characterized by a narrow therapeutic index and large inter- and intra-individual variability, to which P-glycoprotein (P-gp) could contribute. The aim of this study was to investigate the involvement of P-gp in the transport of acenocoumarol, phenprocoumon and warfarin using an in vitro Caco-2 cell monolayer model. The results were compared with those obtained with rivaroxaban, a new oral anticoagulant known to be a P-gp substrate. The transport of these four drugs was assessed at pH 6.8/7.4 in the presence or absence of the P-gp inhibitor cyclosporine A (10 μM) and the more potent and specific P-gp inhibitor valspodar (5 μM). Analytical quantification was performed by LC/MS. With an efflux ratio of 1.7 and a significant decrease in efflux (Papp B-A) in the presence of P-gp inhibitors at a concentration of 50 μM, acenocoumarol can be considered a weak P-gp substrate. Concerning phenprocoumon, the results suggest that this molecule is a poor P-gp substrate. The P-gp inhibitors did not significantly affect the transport of warfarin. The efflux of rivaroxaban was strongly inhibited by both P-gp inhibitors. In conclusion, none of the three VKAs tested is a strong P-gp substrate; however, acenocoumarol can be considered a weak P-gp substrate and phenprocoumon a poor one.
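For readers unfamiliar with bidirectional Caco-2 assays, the efflux ratio referred to above is the ratio of basolateral-to-apical over apical-to-basolateral apparent permeability (Papp), and inhibition is judged by the drop in Papp(B→A) when a P-gp inhibitor is present. A small sketch with made-up Papp values (the abstract reports only the resulting ratio of 1.7):

```python
# Sketch of the standard bidirectional Caco-2 readouts behind the abstract:
# efflux ratio ER = Papp(B->A) / Papp(A->B), and the relative drop in
# Papp(B->A) when a P-gp inhibitor is added. The Papp values are made-up
# placeholders; the study's raw permeabilities are not given in the abstract.

def efflux_ratio(papp_ab: float, papp_ba: float) -> float:
    """Papp in 1e-6 cm/s; an ER above roughly 2 is a common substrate cut-off."""
    return papp_ba / papp_ab

def inhibition_pct(papp_ba: float, papp_ba_inhibited: float) -> float:
    """Percentage reduction of B->A efflux caused by the inhibitor."""
    return 100.0 * (papp_ba - papp_ba_inhibited) / papp_ba

er = efflux_ratio(papp_ab=10.0, papp_ba=17.0)  # gives ER = 1.7
drop = inhibition_pct(papp_ba=17.0, papp_ba_inhibited=9.0)
print(f"ER = {er:.1f}; B->A efflux reduced by {drop:.0f}% with inhibitor")
```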
Abstract:
Internet governance is a recent issue in global politics, but over the years it has become a major political and economic one, and it has lately featured regularly in the news. Against this background, this research outlines the history of Internet governance from its emergence as a political issue in the 1980s to the end of the World Summit on the Information Society (WSIS) in 2005. Rather than focusing on one or another institution involved in regulating the global network, it analyses the emergence and historical evolution of a space of struggle drawing in a growing number of different actors. This evolution is described through the dialectical relation between elites and non-elites and through the struggle over the definition of Internet governance. The thesis thus asks how the relations among the elites of Internet governance, and between these elites and non-elites, explain the emergence, evolution, and structuration of a relatively autonomous field of world politics centred on Internet governance. Against dominant realist and liberal perspectives, the research draws on a cross-fertilisation of heterodox international political economy and international political sociology, articulated around the concepts of field, elites, and hegemony. The concept of field, as developed by Bourdieu, inspires a growing number of studies of world politics; it allows both a differentiated analysis of globalisation and an account of the emergence of transnational spaces of struggle and domination. Elite sociology allows for a pragmatic, actor-centred analysis of power in the globalisation process; this research draws in particular on Wright Mills's concept of the power elite to study the unification of a priori different elites around shared projects. Finally, the thesis uses the neo-Gramscian concept of hegemony to study both the relative stability of elite power guaranteed by the consensual dimension of domination and the seeds of change contained in any international order.
Through the analysis of documents produced during the period studied, and through the creation of databases of actor networks, this research focuses on the debates that followed the commercialisation of the network in the early 1990s and on the negotiations during the WSIS. The first period led to the creation of the Internet Corporation for Assigned Names and Numbers (ICANN) in 1998. This creation resulted from consensus-building among the dominant discourses of the 1990s and from a coalition of interests within an emerging power elite of Internet governance. However, the institutionalisation of Internet governance around ICANN excluded a number of actors and discourses that have since tried to overturn this order. The WSIS became the framework within which this mode of governance was challenged by excluded states, scholars, NGOs, and international organisations; it therefore constitutes the second historical period studied in this thesis. The confrontation during the WSIS triggered a reconfiguration of the power elite of Internet governance and a redefinition of the boundaries of the field. A new hegemonic project emerged around discursive elements such as multistakeholderism and institutional elements such as the Internet Governance Forum. The relative success of this project has allowed for unprecedented institutional stability since the end of the WSIS and for the acceptance of the elites' discourse by a large number of actors in the field; only recently has this order been challenged by the emerging powers of Internet governance. The thesis seeks to contribute to the scholarly debate on three levels. Theoretically, it fosters a dialogue between international political economy and international political sociology in order to analyse both the structural dynamics of globalisation and the situated practices of actors in a given issue-area, stressing the contribution of the notions of field and power elite and their compatibility with neo-Gramscian analyses of hegemony. Methodologically, this dialogue translates into the use of mixed methods, combining qualitative document analysis with social network analysis of actors and statements. Empirically, the research offers an original perspective on Internet governance by stressing its historical dimension, demonstrating the fragility of the concept of multistakeholder governance, and focusing on power relations and on the links between Internet governance and globalisation.
Abstract:
AIMS: To validate a model for quantifying the prognosis of patients with pulmonary embolism (PE). The model was previously derived from 10 534 US patients. METHODS AND RESULTS: We validated the model in 367 patients prospectively diagnosed with PE at 117 European emergency departments. We used baseline data for the model's 11 prognostic variables to stratify patients into five risk classes (I-V). We compared 90-day mortality within each risk class and the area under the receiver operating characteristic curve between the validation and the original derivation samples. We also assessed the rate of recurrent venous thrombo-embolism and major bleeding within each risk class. Mortality was 0% in Risk Class I, 1.0% in Class II, 3.1% in Class III, 10.4% in Class IV, and 24.4% in Class V and did not differ between the validation and the original derivation samples. The area under the curve was larger in the validation sample (0.87 vs. 0.78, P=0.01). No patients in Classes I and II developed recurrent thrombo-embolism or major bleeding. CONCLUSION: The model accurately stratifies patients with PE into categories of increasing risk of mortality and other relevant complications. Patients in Risk Classes I and II are at low risk of adverse outcomes and are potential candidates for outpatient treatment.
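The abstract does not spell out the scoring rule, but the general pattern of such prognostic models is a weighted point sum mapped onto risk classes. A deliberately simplified sketch, with invented variables, weights and cut-offs standing in for the model's 11 prognostic variables:

```python
# A generic sketch of a point-based prognostic score mapped onto five risk
# classes. The four variables, their weights and the class cut-offs are
# invented placeholders, not the validated model's 11 variables.

def score(age: int, male: bool, cancer: bool, tachycardia: bool) -> int:
    # A real model would sum weighted contributions of all 11 variables.
    return age + 10 * male + 30 * cancer + 20 * tachycardia

def risk_class(points: int) -> str:
    for cls, upper in (("I", 65), ("II", 85), ("III", 105), ("IV", 125)):
        if points <= upper:
            return cls
    return "V"

print(risk_class(score(age=55, male=True, cancer=False, tachycardia=False)))  # I
```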
Abstract:
Triiodothyronine (30 nM) added to serum-free cultures of mechanically dissociated, re-aggregating fetal (15-16 days gestation) rat brain cells greatly increased the enzymatic activity of choline acetyltransferase and acetylcholinesterase throughout the entire culture period (33 days), and markedly accelerated the developmental rise of glutamic acid decarboxylase specific activity. The enhancement of choline acetyltransferase and acetylcholinesterase specific activities in the presence of triiodothyronine was even more pronounced in cultures of telencephalic cells. If triiodothyronine treatment was restricted to the first 17 culture days, the level of choline acetyltransferase specific activity at day 33 was 84% of that in chronically treated cultures and 270% of that in cultures receiving triiodothyronine between days 17 and 33, indicating that relatively undifferentiated cells were more responsive to the hormone. Triiodothyronine had no apparent effect on the incorporation of [3H]thymidine at day 5 or on the total DNA content of cultures, suggesting that cellular differentiation, rather than proliferation, was affected by the hormone. Our findings in vitro are in good agreement with many observations in vivo, suggesting that rotation-mediated aggregating cell cultures of fetal rat brain provide a useful model for studying thyroid hormone action in the developing brain.
Abstract:
Abstract: Better understanding stromatolites and microbial mats is an important topic in biogeosciences: it helps in studying the early forms of life on Earth, provides clues regarding the ecology of microbial communities and the contribution of microorganisms to biomineralization, and even lays some groundwork for research in exobiology. Modelling, on the other hand, is a powerful tool used in the natural sciences for approaching various phenomena theoretically. Models are usually built on a system of differential equations, and results are obtained by solving that system. Available software for implementing models includes mathematical solvers and general-purpose simulation packages. The main objective of this thesis is to develop models and software that help, through simulation, to understand the functioning of stromatolites and microbial mats. The software was developed in C++ from scratch for maximum performance and flexibility, which allows models to be far more specific and appropriate to the phenomena being modelled than general-purpose software permits. First, we studied stromatolite growth and morphology. We built a three-dimensional model based on diffusion-limited aggregation, implemented in two C++ applications: a simulation engine that can run a batch of simulations and produce result files, and a visualization tool that allows results to be analysed in three dimensions. After verifying that the model can indeed reproduce the growth and morphology of several types of stromatolites, we introduced a sedimentation process as an external factor. This led to interesting results and supported the hypothesis that stromatolite morphology may be the result of external factors as much as internal ones. This matters because stromatolite classification is usually based on morphology, which presumes that a stromatolite's shape depends on internal factors only (i.e., the microbial mat); our findings contradict that assumption. Second, we investigated the functioning of microbial mats in more depth. We built a two-dimensional reaction-diffusion model based on discrete simulation, implemented in a C++ application that allows simulations to be configured and run. We could then compare simulation results with real-world data and verify that the model can indeed mimic the behaviour of some microbial mats. We were thus able to propose and test hypotheses about the functioning of certain microbial mats, in particular the dynamics of elements such as oxygen and sulfur. In conclusion, this work produced software dedicated to the simulation of microbial mats from both a morphological and a functional point of view, following two different approaches, one holistic and one more analytical. The software is free and distributed under the GPL (General Public License).
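As an illustration of the first model's core mechanism, here is a minimal two-dimensional diffusion-limited aggregation sketch in Python (the thesis software is a three-dimensional C++ implementation; the grid size, walker launch radius and sticking rule below are simplified placeholders):

```python
# A minimal two-dimensional diffusion-limited aggregation (DLA) sketch of
# the growth mechanism the thesis models in three dimensions. Grid size,
# walker launch radius and sticking rule are simplified placeholders.
import numpy as np

def dla(n_particles: int = 300, size: int = 101, seed: int = 0) -> np.ndarray:
    rng = np.random.default_rng(seed)
    grid = np.zeros((size, size), dtype=bool)
    c = size // 2
    grid[c, c] = True  # seed site of the aggregate
    moves = ((1, 0), (-1, 0), (0, 1), (0, -1))
    for _ in range(n_particles):
        ang = rng.uniform(0.0, 2.0 * np.pi)  # launch each walker on a circle
        r = size // 2 - 2
        x, y = int(c + r * np.cos(ang)), int(c + r * np.sin(ang))
        while True:
            dx, dy = moves[rng.integers(4)]
            x, y = x + dx, y + dy
            if not (0 < x < size - 1 and 0 < y < size - 1):
                break  # walker left the arena; discard it
            if grid[x + 1, y] or grid[x - 1, y] or grid[x, y + 1] or grid[x, y - 1]:
                grid[x, y] = True  # stick next to the aggregate
                break
    return grid

print(dla().sum(), "sites occupied")
```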