95 results for Low-level protocols


Relevance: 80.00%

Abstract:

Background: Long-term treatment of primary HIV-1 infection (PHI) may allow the reconstitution of immune responses lost during the acute viremic phase and a decrease in peripheral reservoirs. This in turn may represent the best setting for the use of therapeutic vaccines aimed at lowering the viral set-point or controlling viral rebound upon ART discontinuation. Methods: We investigated a cohort of 16 patients who started ART at PHI, with a treatment duration of ≥4 years and persistent aviremia (<50 HIV-1 copies/ml). The cohort was characterized in terms of viral subtype, cell-associated RNA, proviral DNA and HLA genotype. Secretion of IFN-γ, IL-2 and TNF-α by CD8 T-cells was analysed by polychromatic flow cytometry using a panel of 192 HIV-1-derived epitopes. Results: The cohort is highly homogeneous in terms of viral subtype: 81% clade B. We identified 44 epitope-specific responses: all patients had detectable responses to >1 epitope, and the mean number of responding epitopes per patient was 3. The mean frequency of cytokine-secreting CD8 T-cells was 0.32%. CD8 T-cells secreting IFN-γ, IL-2 and TNF-α simultaneously accounted for about 40% of the response, and cells secreting at least 2 cytokines for about 80%, consistent with a highly polyfunctional CD8 T-cell profile. There was no difference in polyfunctionality with respect to HLA restriction or to the viral regions and epitopes recognized. Proviral DNA was detectable in all patients but at low levels (mean = 108 copies/million PBMCs), while cell-associated mRNA was not detectable in 19% of patients (mean = 11 copies/million PBMCs when detectable). Conclusion: Patients with sustained virological suppression after initiation of ART at PHI show polyfunctional CD8 T-cell responses and low levels of proviral DNA, with an absence of residual replication in a substantial percentage of patients. The use of therapeutic vaccines in this population may promote low levels of rebound viremia or control of viral replication upon ART cessation.
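As a side illustration of how such polyfunctionality figures are derived, the fractions of the response contributed by cells secreting all three cytokines or at least two of them can be computed directly from Boolean cytokine-gate counts. The sketch below is not from the study: the gate counts and the function and variable names are hypothetical (chosen here so the resulting fractions match the roughly 40% and 80% quoted above).

# Minimal sketch: polyfunctionality fractions from Boolean cytokine gates.
# All counts are hypothetical; they are NOT the study data.

CYTOKINES = ("IFN-g", "IL-2", "TNF-a")

# Hypothetical event counts per Boolean combination of secreted cytokines
boolean_gate_counts = {
    ("IFN-g", "IL-2", "TNF-a"): 400,   # triple producers
    ("IFN-g", "TNF-a"): 250,
    ("IL-2", "TNF-a"): 150,
    ("IFN-g",): 120,
    ("TNF-a",): 80,
}

total_responding = sum(boolean_gate_counts.values())

def fraction_with_at_least(k: int) -> float:
    """Fraction of the total response from cells secreting >= k cytokines."""
    hits = sum(count for combo, count in boolean_gate_counts.items() if len(combo) >= k)
    return hits / total_responding

print(f"Triple producers: {fraction_with_at_least(3):.0%} of the response")
print(f">= 2 cytokines:   {fraction_with_at_least(2):.0%} of the response")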

Relevance: 80.00%

Abstract:

Multisensory interactions have been documented within low-level, even primary, cortices and at early post-stimulus latencies. These effects are in turn linked to behavioral and perceptual modulations. In humans, visual cortex excitability, as measured by transcranial magnetic stimulation (TMS) induced phosphenes, can be reliably enhanced by the co-presentation of sounds. This enhancement occurs at pre-perceptual stages and is selective for different types of complex sounds. However, the source(s) of auditory inputs effectuating these excitability changes in primary visual cortex remain disputed. The present study sought to determine if direct connections between low-level auditory cortices and primary visual cortex are mediating these kinds of effects by varying the pitch and bandwidth of the sounds co-presented with single-pulse TMS over the occipital pole. Our results from 10 healthy young adults indicate that both the central frequency and bandwidth of a sound independently affect the excitability of visual cortex during processing stages as early as 30 msec post-sound onset. Such findings are consistent with direct connections mediating early-latency, low-level multisensory interactions within visual cortices.

Relevance: 80.00%

Abstract:

Homologous recombination provides a major pathway for the repair of DNA double-strand breaks in mammalian cells. Defects in homologous recombination can lead to high levels of chromosomal translocations or deletions, which may promote cell transformation and cancer development. A key component of this process is RAD51. In comparison to RecA, the bacterial homologue, human RAD51 protein exhibits low-level strand-exchange activity in vitro. This activity can, however, be stimulated by the presence of high salt. Here, we have investigated the mechanistic basis for this stimulation. We show that high ionic strength favours the co-aggregation of RAD51-single-stranded DNA (ssDNA) nucleoprotein filaments with naked duplex DNA, to form a complex in which the search for homologous sequences takes place. High ionic strength allows differential binding of RAD51 to ssDNA and double-stranded DNA (dsDNA), such that ssDNA-RAD51 interactions are unaffected, whereas those between RAD51 and dsDNA are destabilized. Most importantly, high salt induces a conformational change in RAD51, leading to the formation of extended nucleoprotein filaments on ssDNA. These extended filaments mimic the active form of the Escherichia coli RecA-ssDNA filament that exhibits efficient strand-exchange activity.

Relevance: 80.00%

Abstract:

A procedure was developed for determining Pu-241 activity in environmental samples. This beta-emitting isotope of plutonium was measured by ultra-low-level liquid scintillation counting, after several separation and purification steps involving a highly selective extraction chromatographic resin (Eichrom-TEVA). Because no certified reference material exists for Pu-241, the method was validated using four IAEA reference sediments with information values for Pu-241. The method was then used to determine Pu-241 activity in alpine soils of Switzerland and France. The Pu-241/Pu-239,240 and Pu-238/Pu-239,240 activity ratios confirmed that the Pu contamination in the tested alpine soils originated mainly from global fallout from the nuclear weapon tests conducted in the 1950s and 1960s. Estimation of the date of the contamination using the Pu-241/Am-241 age-dating method further confirmed this origin; however, this dating method is limited to samples in which Pu-Am fractionation is insignificant. The contribution of the Chernobyl accident, if any, is negligible.
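To illustrate the Pu-241/Am-241 age-dating principle mentioned above: Am-241 grows in from the beta decay of Pu-241, so the Pu-241/Am-241 activity ratio decreases predictably with time since deposition and can be inverted to estimate the age of the contamination. The sketch below is a minimal calculation assuming an initially pure Pu-241 source with no initial Am-241 and negligible Pu-Am fractionation; the half-lives are nominal literature values and the measured ratio used in the example is invented.

import math

# Nominal half-lives in years (approximate literature values)
T_HALF_PU241 = 14.35
T_HALF_AM241 = 432.2

LAM_PU = math.log(2) / T_HALF_PU241   # Pu-241 decay constant (1/y)
LAM_AM = math.log(2) / T_HALF_AM241   # Am-241 decay constant (1/y)

def pu241_am241_activity_ratio(t_years: float) -> float:
    """Pu-241/Am-241 activity ratio t years after deposition of an
    initially pure Pu-241 source (no initial Am-241, no fractionation)."""
    n_pu = math.exp(-LAM_PU * t_years)                       # Pu-241 atoms (relative)
    n_am = LAM_PU / (LAM_AM - LAM_PU) * (
        math.exp(-LAM_PU * t_years) - math.exp(-LAM_AM * t_years)
    )                                                        # in-grown Am-241 atoms
    return (LAM_PU * n_pu) / (LAM_AM * n_am)                 # activity = lambda * N

def estimate_age(measured_ratio: float, t_lo=0.1, t_hi=200.0) -> float:
    """Invert the monotonically decreasing ratio-vs-time curve by bisection."""
    for _ in range(100):
        t_mid = 0.5 * (t_lo + t_hi)
        if pu241_am241_activity_ratio(t_mid) > measured_ratio:
            t_lo = t_mid
        else:
            t_hi = t_mid
    return 0.5 * (t_lo + t_hi)

# Example with an invented measured activity ratio of 3
print(f"Estimated age: {estimate_age(3.0):.1f} years since deposition")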

Relevance: 80.00%

Abstract:

Members of the Sox gene family of transcription factors are defined by the presence of an 80-amino-acid homology domain, the High Mobility Group (HMG) box. Here we report the cloning and initial analysis of murine Sox-13. The 984-amino-acid Sox-13 protein contains a single HMG box, a leucine zipper motif and a glutamine-rich stretch. These characteristics are shared with another member of the Sox gene family, Sox-6. High-level embryonic expression of Sox-13 occurs uniquely in the arterial walls of mice at 13.5 days post coitum (dpc) and later. Low-level expression was observed in the inner ear of 13.5 dpc mice and in a limited number of cells in the thymus of 16.5 dpc mice, from which Sox-13 was originally cloned. At 18.5 dpc, Sox-13 is expressed in the tracheal epithelium below the vocal cord and in the hair follicles. The Sox-13 protein binds to the consensus HMG box motif, AACAAAG, but does not transactivate transcription through a concatemer of this motif. Sox-13, like other members of the Sox family, likely plays an important role in development.
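As a generic aside (not part of the study), locating a fixed consensus motif such as AACAAAG in a DNA sequence amounts to a simple scan of both strands; the short sketch below shows the idea, with an invented test sequence.

# Minimal sketch: scan a DNA sequence for a consensus motif on both strands.
MOTIF = "AACAAAG"
COMP = str.maketrans("ACGT", "TGCA")

def revcomp(seq: str) -> str:
    """Reverse complement of a DNA sequence."""
    return seq.translate(COMP)[::-1]

def find_motif(seq: str, motif: str = MOTIF):
    """Return (position, strand) pairs for motif matches on either strand."""
    hits = []
    rc = revcomp(motif)
    for i in range(len(seq) - len(motif) + 1):
        window = seq[i:i + len(motif)]
        if window == motif:
            hits.append((i, "+"))
        if window == rc:
            hits.append((i, "-"))
    return hits

print(find_motif("GGAACAAAGTTCTTTGTTCC"))  # invented test sequence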

Relevance: 80.00%

Abstract:

Recent multisensory research has emphasized the occurrence of early, low-level interactions in humans. As such, it is proving increasingly necessary to also consider the kinds of information likely extracted from the unisensory signals that are available at the time and location of these interaction effects. This review addresses current evidence regarding how the spatio-temporal brain dynamics of auditory information processing likely curtail the information content of multisensory interactions observable in humans at a given latency and within a given brain region. First, we consider the time course of signal propagation as a limitation on when auditory information (of any kind) can impact the responsiveness of a given brain region. Next, we overview the dual-pathway model for the treatment of auditory spatial and object information, ranging from rudimentary to complex environmental stimuli. These dual pathways are considered an intrinsic feature of auditory information processing; they are not only partially distinct in their associated brain networks but also (and perhaps more importantly) manifest only after several tens of milliseconds of cortical signal processing. This architecture of auditory functioning would thus pose a constraint on when and in which brain regions specific spatial and object information is available for multisensory interactions. We then separately consider evidence regarding the mechanisms and dynamics of spatial and object processing, with a particular emphasis on when discriminations along either dimension are likely performed by specific brain regions. We conclude by discussing open issues and directions for future research.

Relevance: 80.00%

Abstract:

Sensory information can interact to impact perception and behavior. Foods are appreciated according to their appearance, smell, taste and texture. Athletes and dancers combine visual, auditory, and somatosensory information to coordinate their movements. Under laboratory settings, detection and discrimination are likewise facilitated by multisensory signals. Research over the past several decades has shown that the requisite anatomy exists to support interactions between sensory systems in regions canonically designated as exclusively unisensory in their function and, more recently, that neural response interactions occur within these same regions, including even primary cortices and thalamic nuclei, at early post-stimulus latencies. Here, we review evidence concerning direct links between early, low-level neural response interactions and behavioral measures of multisensory integration.

Relevance: 80.00%

Abstract:

PURPOSE OF REVIEW: To review recent findings on the relationships between delirium and cognitive decline in the elderly. RECENT FINDINGS: Current advances in the field include substantial new evidence that delirium increases the risk of dementia in patients without previous cognitive impairment and accelerates cognitive decline in patients with Alzheimer's disease. Findings on the cognitive trajectories and domains affected contribute to a better understanding of the clinical nature of cognitive impairment after delirium. Volume loss and disruption of white matter integrity may represent early MRI markers of long-term cognitive impairment. Neurodegenerative and low-level chronic inflammatory processes predispose to an exaggerated response to incident stimuli that may precipitate both acute brain dysfunction and persisting cerebral damage. SUMMARY: Little is still known about the relationship between delirium and cognitive trajectories in the elderly, or about the underlying pathophysiological mechanisms. The association of neurodegenerative and inflammatory processes appears to play an important role in the pathogenesis and clinical course of cognitive impairment after delirium. The hypothetical role of several other factors remains to be clarified. Further clinical studies are needed to evaluate whether prevention and treatment approaches that have proved useful in reducing delirium incidence and severity may also improve long-term outcomes and prevent cognitive decline.

Relevance: 80.00%

Abstract:

Using one male-inherited, one female-inherited and eight biparentally inherited markers, we investigated the population genetic structure of the Valais shrew (Sorex antinorii) in the Swiss Alps. Bayesian analysis of autosomal microsatellites suggests a clear genetic differentiation between two groups of populations. This geographically based structure is consistent with two separate postglacial recolonization routes of the species into Switzerland from Italian refugia after the last Pleistocene glaciations. The sex-specific markers also confirm genetic structuring between the western and eastern areas, since very few haplotypes of either the Y chromosome or the mtDNA genome are shared between the two regions. Overall, these results suggest that two already well-differentiated genetic lineages colonized the Swiss Alps and came into secondary contact in the Rhône Valley. The low level of admixture between the two lineages is likely explained by the mountainous landscape, with lateral valleys running orthogonal to the main Rhône valley.

Relevance: 80.00%

Abstract:

Cytomegalovirus (CMV) remains a major cause of morbidity in solid organ transplant patients. In order to reduce CMV morbidity, we designed a program of routine virological monitoring that included throat and urine CMV shell vial culture, along with peripheral blood leukocyte (PBL) shell vial quantitative culture, for 12 weeks post-transplantation as well as for 8 weeks after treatment for acute rejection. The program also included preemptive ganciclovir treatment for the patients at highest risk of developing CMV disease, i.e., those with either high-level viremia (>10 infectious units [IU]/10^6 PBL), or low-level viremia (<10 IU/10^6 PBL) together with either D+/R- CMV serostatus or treatment for graft rejection. During 1995-96, 90 solid organ transplant recipients (39 kidney, 28 liver, and 23 heart) were followed up. A total of 60 CMV infection episodes occurred in 45 patients; seventeen episodes were symptomatic. Of 26 episodes managed according to the program, only 4 presented with CMV disease and none of these patients died. No patient treated preemptively for asymptomatic infection developed disease. In contrast, among 21 episodes managed in non-compliance with the program (i.e., the monitoring was not performed or preemptive treatment was not initiated despite a high risk of developing CMV disease), 12 episodes turned into symptomatic infection (P=0.0048 compared to patients treated preemptively), and 2 deaths possibly related to CMV were recorded. This difference could not be explained by an increased proportion of D+/R- patients or an increased incidence of rejection among the episodes managed in non-compliance with the program. Our data identify compliance with guidelines as an important factor in effectively reducing CMV morbidity through preemptive treatment, and suggest that the complexity of the preemptive approach may represent an important obstacle to the successful prevention of CMV morbidity in the regular healthcare setting.
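To make the triage rule explicit, the sketch below encodes the preemptive-treatment criterion described above as a small decision function. It is purely illustrative and not clinical guidance; the field names and structure are our own, with only the 10 IU/10^6 PBL cut-off and the two low-level risk conditions taken from the text.

# Minimal sketch of the preemptive-treatment rule described above.
# Illustrative only; names and structure are assumptions, not the study protocol code.
from dataclasses import dataclass

HIGH_VIREMIA_THRESHOLD = 10  # infectious units per 10^6 PBL

@dataclass
class Episode:
    viremia_iu_per_million_pbl: float  # quantitative PBL shell vial culture result
    d_pos_r_neg: bool                  # donor CMV-seropositive, recipient-seronegative
    treated_for_rejection: bool        # received anti-rejection treatment

def preemptive_ganciclovir_indicated(e: Episode) -> bool:
    """High-level viremia always triggers preemptive treatment; low-level
    viremia does so only with D+/R- serostatus or rejection treatment."""
    if e.viremia_iu_per_million_pbl > HIGH_VIREMIA_THRESHOLD:
        return True
    if e.viremia_iu_per_million_pbl > 0:  # detectable, low-level viremia
        return e.d_pos_r_neg or e.treated_for_rejection
    return False

print(preemptive_ganciclovir_indicated(
    Episode(viremia_iu_per_million_pbl=4, d_pos_r_neg=True, treated_for_rejection=False)
))  # True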

Relevance: 80.00%

Abstract:

Summary: This dissertation explores how stakeholder dialogue influences corporate processes, and speculates about the potential of this phenomenon - particularly with actors such as non-governmental organizations (NGOs) and other representatives of civil society, which have received growing attention against a backdrop of increasing globalisation and which have often been cast in an adversarial light by firms - as a source of learning and a spark for innovation in the firm. The study is set within the context of the introduction of genetically modified organisms (GMOs) in Europe. Its significance lies in the fact that scientific developments and new technologies are being generated at an unprecedented rate, in an era where civil society is becoming more informed, more reflexive, and more active in facilitating or blocking such new developments, which have the potential to trigger widespread changes in economies, attitudes, and lifestyles, and to address global problems like poverty, hunger, climate change, and environmental degradation. In the 1990s, companies using biotechnology to develop and offer novel products began to experience increasing pressure from civil society to disclose information about the risks associated with the use of biotechnology and GMOs in particular. Although no harmful effects for humans or the environment have been factually demonstrated even to date (2008), this technology remains highly contested, and its introduction in Europe catalysed major companies to invest significant financial and human resources in stakeholder dialogue. A relatively new phenomenon at the time, with little theoretical backing, dialogue was seen to reflect a move towards greater engagement with stakeholders, commonly defined as those "individuals or groups with which business interacts who have a 'stake', or vested interest in the firm" (Carroll, 1993:22) and with whom firms are seen to be inextricably embedded (Andriof & Waddock, 2002). Regarding the organisation of this dissertation, Chapter 1 (Introduction) describes the context of the study and elaborates its significance for academics and business practitioners as an empirical work embedded in a sector at the heart of the debate on corporate social responsibility (CSR). Chapter 2 (Literature Review) traces the roots and evolution of CSR, drawing on Stakeholder Theory, Institutional Theory, Resource Dependence Theory, and Organisational Learning to establish what has already been developed in the literature regarding the stakeholder concept, motivations for engagement with stakeholders, the corporate response to external constituencies, and outcomes for the firm in terms of organisational learning and change. I used this review of the literature to guide my inquiry and to develop the key constructs through which I viewed the empirical data that was gathered. In this respect, concepts related to how the firm views itself (as a victim, follower, or leader), how stakeholders are viewed (as a source of pressure and/or threat; as an asset, current and future), corporate responses (in the form of buffering, bridging, or boundary redefinition), and types of organisational learning (single-loop, double-loop, triple-loop) and change (first-order, second-order, third-order) were particularly important in building the key constructs of the conceptual model that emerged from the analysis of the data.
Chapter 3 (Methodology) describes the methodology that was used to conduct the study, affirms the appropriateness of the case study method in addressing the research question, and describes the procedures for collecting and analysing the data. Data collection took place in two phases, extending from August 1999 to October 2000 and from May to December 2001, which functioned as 'snapshots' in time of the three companies under study. The data was systematically analysed and coded using ATLAS/ti, a qualitative data analysis tool, which enabled me to sort, organise, and reduce the data into a manageable form. Chapter 4 (Data Analysis) contains the three cases that were developed (anonymised as Pioneer, Helvetica, and Viking). Each case is presented in its entirety (constituting a 'within-case' analysis), followed by a 'cross-case' analysis, backed up by extensive verbatim evidence. Chapter 5 presents the research findings, outlines the study's limitations, describes managerial implications, and offers suggestions for where further research could elaborate the conceptual model developed through this study, as well as suggestions for additional research in areas where managerial implications were outlined. References and Appendices are included at the end. This dissertation results in the construction and description of a conceptual model, grounded in the empirical data and tied to existing literature, which portrays a set of elements and relationships deemed important for understanding the impact of stakeholder engagement on firms in terms of organisational learning and change. This model suggests that corporate perceptions about the nature of stakeholders influence the perceived value of stakeholder contributions. When stakeholders are primarily viewed as a source of pressure or threat, firms tend to adopt a reactive/defensive posture in an effort to manage stakeholders and protect the firm from sources of outside pressure - behaviour consistent with Resource Dependence Theory, which suggests that firms try to gain control over external threats by focussing on the relevant stakeholders on whom they depend for critical resources, and try to reverse the control potentially exerted by external constituencies by influencing and manipulating these valuable stakeholders. In situations where stakeholders are viewed as a current strategic asset, firms tend to adopt a proactive/offensive posture in an effort to tap stakeholder contributions and connect the organisation to its environment - behaviour consistent with Institutional Theory, which suggests that firms try to ensure their continuing license to operate by internalising external expectations. In instances where stakeholders are viewed as a source of future value, firms tend to adopt an interactive/innovative posture in an effort to reduce or widen the embedded system and bring stakeholders into systems of innovation and feedback - behaviour consistent with the literature on Organisational Learning, which suggests that firms can learn how to optimise their performance as they develop systems and structures that are more adaptable and responsive to change. The conceptual model moreover suggests that the perceived value of stakeholder contributions drives corporate aims for engagement, which can be usefully categorised as dialogue intentions spanning a continuum running from low-level to high-level to very-high-level.
This study suggests that activities aimed at disarming critical stakeholders ('manipulation'), providing guidance and correcting misinformation ('education'), being transparent about corporate activities and policies ('information'), alleviating stakeholder concerns ('placation'), and accessing stakeholder opinion ('consultation') represent low-level dialogue intentions and are experienced by stakeholders as asymmetrical, persuasive, compliance-gaining activities that are not in line with 'true' dialogue. This study also finds evidence that activities aimed at redistributing power ('partnership'), involving stakeholders in internal corporate processes ('participation'), and demonstrating corporate responsibility ('stewardship') reflect high-level dialogue intentions. This study additionally finds evidence that building and sustaining high-quality, trusted relationships which can meaningfully influence organisational policies inclines a firm towards the type of interactive, proactive processes that underpin the development of sustainable corporate strategies. Dialogue intentions are related to the type of corporate response: low-level intentions can lead to buffering strategies; high-level intentions can underpin bridging strategies; very-high-level intentions can incline a firm towards boundary redefinition. The nature of the corporate response (which encapsulates a firm's posture towards stakeholders, demonstrated by the level of dialogue intention and the firm's strategy for dealing with stakeholders) favours the type of learning and change experienced by the organisation. This study indicates that buffering strategies, where the firm attempts to protect itself against external influences and carry out its existing strategy, typically lead to single-loop learning, whereby the firm learns how to perform better within its existing paradigm and, at most, improves the performance of the established system - an outcome associated with first-order change. Bridging responses, where the firm adapts organisational activities to meet external expectations, typically lead a firm to acquire new behavioural capacities characteristic of double-loop learning, whereby insights and understanding are uncovered that are fundamentally different from existing knowledge, and where stakeholders are brought into problem-solving conversations that enable them to influence corporate decision-making to address shortcomings in the system - an outcome associated with second-order change. Boundary redefinition suggests that the firm engages in triple-loop learning, where the firm changes its relations with stakeholders in profound ways, considers problems from a whole-system perspective, examines the deep structures that sustain the system, and produces innovation to address chronic problems and develop new opportunities - an outcome associated with third-order change. This study supports earlier theoretical and empirical studies (e.g. Weick's (1979, 1985) work on self-enactment; Maitlis & Lawrence's (2007), Maitlis' (2005) and Weick et al.'s (2005) work on sensegiving and sensemaking in organisations; Brickson's (2005, 2007) and Scott & Lane's (2000) work on organisational identity orientation), which indicate that corporate self-perception is a key underlying factor driving the dynamics of organisational learning and change.
Such theorising has important implications for managerial practice; namely, that a company which perceives itself as a 'victim' may be highly inclined to view stakeholders as a source of negative influence, and would therefore be potentially unable to benefit from the positive influence of engagement. Such a self-perception can prevent the firm from seeing stakeholders in a more positive, contributing light, which suggests that such firms may not be inclined to embrace external sources of innovation and learning, as they are focussed on protecting the firm against disturbing environmental influences (through buffering) and remain more likely to perform better within an existing paradigm (single-loop learning). By contrast, a company that perceives itself as a 'leader' may be highly inclined to view stakeholders as a source of positive influence. On the downside, such a firm might have difficulty distinguishing when stakeholder contributions are less pertinent, as it is deliberately more open to elements in its operating environment (including stakeholders) as potential sources of learning and change, and is oriented towards creating space for fundamental change (through boundary redefinition), opening issues to entirely new ways of thinking and addressing issues from a whole-system perspective. A significant implication of this study is that potentially only those companies that see themselves as leaders are ultimately able to tap the innovation potential of stakeholder dialogue.

Relevance: 80.00%

Abstract:

Telomerase is an RNA-dependent DNA polymerase that synthesizes telomeric DNA. Its activity is not detectable in most somatic cells but it is reactivated during tumorigenesis. In most cancers, the combination of hTERT hypermethylation and hypomethylation of a short promoter region is permissive for low-level hTERT transcription. Activated and malignant lymphocytes express high telomerase activity, through a mechanism that seems methylation-independent. The aim of this study was to determine which mechanism is involved in the enhanced expression of hTERT in lymphoid cells. Our data confirm that in B cells, some T cell lymphomas and non-neoplastic lymph nodes, the hTERT promoter is unmethylated. Binding sites for the B cell-specific transcription factor PAX5 were identified downstream of the ATG translational start site through EMSA and ChIP experiments. ChIP assays indicated that the transcriptional activation of hTERT by PAX5 does not involve repression of CTCF binding. In a B cell lymphoma cell line, siRNA-induced knockdown of PAX5 expression repressed hTERT transcription. Moreover, ectopic expression of PAX5 in a telomerase-negative normal fibroblast cell line was found to be sufficient to activate hTERT expression. These data show that activation of hTERT in telomerase-positive B cells is due to a methylation-independent mechanism in which PAX5 plays an important role.

Relevance: 80.00%

Abstract:

Abstract: One of the most important issues in molecular biology is to understand the regulatory mechanisms that control gene expression. Gene expression is often regulated by proteins, called transcription factors, which bind to short (5 to 20 base pairs), degenerate segments of DNA. Experimental efforts towards understanding the sequence specificity of transcription factors are laborious and expensive, but can be substantially accelerated by computational predictions. This thesis describes the use of algorithms and resources for transcription factor binding site analysis in addressing quantitative modelling, where probabilistic models are built to represent the binding properties of a transcription factor and can be used to find new functional binding sites in genomes. Initially, an open-access database (HTPSELEX) was created, holding high-quality binding sequences for two eukaryotic families of transcription factors, namely CTF/NF1 and LEF1/TCF. The binding sequences were elucidated using a recently described experimental procedure called HTP-SELEX, which allows the generation of a large number (>1000) of binding sites using mass sequencing technology. For each HTP-SELEX experiment we also provide accurate primary experimental information about the protein material used, details of the wet-lab protocol, an archive of sequencing trace files, and assembled clone sequences of binding sequences. The database also offers reasonably large SELEX libraries obtained with conventional low-throughput protocols. The database is available at http://wwwisrec.isb-sib.ch/htpselex/ and ftp://ftp.isrec.isb-sib.ch/pub/databases/htpselex. The Expectation-Maximisation (EM) algorithm is one of the methods frequently used to estimate probabilistic models representing the sequence specificity of transcription factors. We present computer simulations that estimate the precision of EM-estimated models as a function of data set parameters (such as the length of the initial sequences, the number of initial sequences, and the percentage of non-binding sequences). We observed a remarkable robustness of the EM algorithm with regard to the length of the training sequences and the degree of contamination. The HTPSELEX database and the benchmark results for the EM algorithm formed part of the foundation for the subsequent project, in which a hidden Markov model framework was developed to represent the sequence specificity of the transcription factors CTF/NF1 and LEF1/TCF using the HTP-SELEX data. The hidden Markov model framework is capable of both predicting and classifying CTF/NF1 and LEF1/TCF binding sites. A covariance analysis of the binding sites revealed non-independent base preferences at different nucleotide positions, providing insight into the binding mechanism. We then tested the LEF1/TCF model by computing binding scores for a set of LEF1/TCF binding sequences for which relative affinities had been determined experimentally using non-linear regression. The predicted and experimentally determined binding affinities were well correlated.
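For readers unfamiliar with the kind of probabilistic model discussed here, the simplest instance is a position-specific probability matrix used to score candidate binding sites by their log-odds against a background model; the EM algorithm and the hidden Markov models mentioned above generalise this idea. The sketch below is generic and not the thesis code: the matrix values, background model and test sequence are invented for illustration.

# Minimal sketch: log-odds scoring of candidate sites with a position-specific
# probability matrix (invented values; not the thesis models).
import math

BASES = "ACGT"
BACKGROUND = {b: 0.25 for b in BASES}  # uniform background model

# One column per motif position: P(base | position), for a hypothetical 4-bp motif
PWM = [
    {"A": 0.70, "C": 0.10, "G": 0.10, "T": 0.10},
    {"A": 0.80, "C": 0.05, "G": 0.05, "T": 0.10},
    {"A": 0.10, "C": 0.70, "G": 0.10, "T": 0.10},
    {"A": 0.80, "C": 0.05, "G": 0.10, "T": 0.05},
]

def log_odds_score(site: str) -> float:
    """Sum of per-position log-odds (motif model vs. background)."""
    return sum(
        math.log2(PWM[i][base] / BACKGROUND[base])
        for i, base in enumerate(site)
    )

def scan(sequence: str):
    """Score every window of motif length along the sequence."""
    w = len(PWM)
    return [(i, log_odds_score(sequence[i:i + w]))
            for i in range(len(sequence) - w + 1)]

for pos, score in scan("TTAACATTT"):  # invented test sequence
    print(pos, round(score, 2))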

Relevance: 80.00%

Abstract:

Cell therapy for nucleus pulposus (NP) regeneration is an attractive treatment for early disc degeneration, as shown by studies using autologous NP cells or stem cells. Another potential source of cells is foetal cells. We investigated the feasibility of isolating foetal cells from human foetal spine tissues and assessed their chondrogenic potential in alginate bead cultures. Histology and immunohistochemistry of foetal tissues showed that the structure and matrix composition (aggrecan, type I and type II collagen) of foetal intervertebral disc (IVD) were similar to those of adult IVD. Isolated foetal cells were cultured in monolayer in basic medium supplemented with 10% Fetal Bovine Serum (FBS), and from each foetal tissue donation a cell bank of foetal spine cells at passage 2 was established, composed of around 2000 vials of 5 million cells. Gene expression and immunohistochemistry of foetal spine cells cultured in alginate beads for 28 days showed that the cells were able to produce aggrecan and type II collagen and only very low levels of type I and type X collagen, indicating chondrogenic differentiation. However, variability in matrix synthesis was observed between donors. In conclusion, foetal cells can be isolated from human foetal spine tissues and, since these cells showed chondrogenic potential, they could be a suitable cell source for IVD regeneration.

Relevance: 80.00%

Abstract:

Conservation biology is commonly associated with the protection of small, endangered populations. Nevertheless, large or potentially large populations may also require management to prevent the negative effects of overpopulation. As there are both qualitative and quantitative differences between protecting small populations and controlling large ones, distinct methods and models are needed. The aim of this work was to develop predictive models of large-population dynamics, as well as software tools to estimate the parameters of these models and to test management scenarios. The Alpine ibex (Capra ibex ibex), which has undergone a spectacular expansion since its reintroduction in Switzerland at the beginning of the 20th century, was used as the paradigm species. This task was achieved in three steps. First, a local population dynamics model was developed specifically for the ibex: the underlying age- and sex-structured model is based on a Leslie matrix approach, to which density dependence, environmental stochasticity and regulation culling were added. This model was implemented in a management-support software package, named SIM-Ibex, allowing census data maintenance, automated parameter estimation, and the tuning and simulation of culling strategies. However, population dynamics are driven not only by demographic factors but also by dispersal and the colonisation of new areas; modelling of habitat suitability and of obstacles to dispersal therefore also had to be addressed. A software package, named Biomapper, was thus developed. Its central module is based on the Ecological Niche Factor Analysis (ENFA), whose principle is to compute marginality and specialisation factors of the ecological niche from a set of environmental predictors and species presence data. All Biomapper modules are linked to Geographic Information Systems (GIS); they cover all operations of data importation, predictor preparation, ENFA and habitat suitability map computation, validation and further processing of results; a module also allows the mapping of dispersal barriers and corridors. The application domain of the ENFA was then explored by means of a simulated (virtual) species distribution. Compared with a commonly used method for building habitat suitability maps, the Generalised Linear Model (GLM), the ENFA proved better suited for cryptic or spreading species. Demographic and landscape information was finally merged into a global model. To cope with the requirements of landscape realism and the technical constraints of large-population modelling, a cellular automaton approach was chosen: the study area is modelled by a lattice of hexagonal cells, each characterised by a few fixed properties - a carrying capacity and six impermeability rates quantifying exchanges between adjacent cells - and one variable, the population density. The latter varies according to local reproduction and survival as well as dispersal, under the influence of density dependence and stochasticity. A software tool, named HexaSpace, was developed to achieve two functions: (1) calibrating the automaton on the basis of local population dynamics models (e.g., computed by SIM-Ibex) and a habitat suitability map (e.g., computed by Biomapper); and (2) running simulations. It allows the study of the spread of an invading species across a complex landscape made of areas of varying suitability and containing dispersal barriers. This model was applied to the history of the ibex reintroduction in the Bernese Alps (Switzerland). SIM-Ibex is now used by wildlife managers and governmental inspectors to prepare and verify culling plans. Biomapper has been applied to several species (both plants and animals) around the world. Likewise, although HexaSpace was originally designed for terrestrial animal species, it could easily be extended to model plant propagation or the dispersal of flying animals. As these software tools were designed to proceed from low-level data to a complex, realistic model, and as they offer an intuitive user interface, they lend themselves to many applications in conservation biology. Moreover, these approaches may also be used to address theoretical questions in the fields of population and landscape ecology.
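To make the first modelling step concrete, the sketch below shows the general shape of an age-structured Leslie-matrix projection with density dependence, environmental stochasticity and a culling term. All numbers (age classes, vital rates, carrying capacity, cull rate) are invented for illustration and do not reproduce the SIM-Ibex parameterisation.

# Minimal sketch: Leslie-matrix projection with density dependence,
# environmental stochasticity and culling. All parameters are invented.
import numpy as np

rng = np.random.default_rng(0)

# Three age classes (juveniles, subadults, adults), females only, hypothetical rates
fecundity = np.array([0.0, 0.2, 0.8])   # offspring per female per year
survival = np.array([0.6, 0.85])        # juvenile->subadult, subadult->adult
adult_survival = 0.9                    # adults remaining adults
K = 500.0                               # carrying capacity (total females)

def leslie_matrix():
    L = np.zeros((3, 3))
    L[0, :] = fecundity        # top row: reproduction
    L[1, 0] = survival[0]      # sub-diagonal: survival to the next class
    L[2, 1] = survival[1]
    L[2, 2] = adult_survival
    return L

def project(n, years=50, cull_rate=0.05):
    history = [n.copy()]
    for _ in range(years):
        # Simple density dependence: growth is damped once the total exceeds K
        density_factor = min(1.0, float(np.exp(1.0 - n.sum() / K)))
        # Environmental stochasticity: lognormal noise on the vital rates
        noise = rng.lognormal(mean=0.0, sigma=0.1)
        n = leslie_matrix() @ n * density_factor * noise
        n = n * (1.0 - cull_rate)   # regulation culling, applied uniformly
        history.append(n.copy())
    return np.array(history)

traj = project(np.array([50.0, 30.0, 120.0]))
print(traj[-1].round(1), "females per age class after 50 years")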