967 results for Matrix Approach


Relevance:

60.00%

Publisher:

Abstract:

Background: Xylella fastidiosa, a Gram-negative fastidious bacterium, grows in the xylem of several plants, causing diseases such as citrus variegated chlorosis. As the xylem sap contains low concentrations of amino acids and other compounds, X. fastidiosa needs to cope with nitrogen limitation in its natural habitat. Results: In this work, we performed a whole-genome microarray analysis of the X. fastidiosa nitrogen starvation response. A time course experiment (2, 8 and 12 hours) of cultures grown in defined medium under nitrogen starvation revealed many differentially expressed genes, such as those related to transport, nitrogen assimilation, amino acid biosynthesis, transcriptional regulation, and many genes encoding hypothetical proteins. In addition, a decrease in the expression levels of many genes involved in carbon metabolism and energy generation pathways was also observed. Comparison of gene expression profiles between the wild-type strain and the rpoN null mutant allowed the identification of genes directly or indirectly induced by nitrogen starvation in a σ54-dependent manner. A more complete picture of the σ54 regulon was achieved by combining the transcriptome data with an in silico search for potential σ54-dependent promoters, using a position weight matrix approach. One of these σ54-predicted binding sites, located upstream of the glnA gene (encoding glutamine synthetase), was validated by primer extension assays, confirming that this gene has a σ54-dependent promoter. Conclusions: Together, these results show that nitrogen starvation causes intense changes in the X. fastidiosa transcriptome, and some of these differentially expressed genes belong to the σ54 regulon.
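
As a rough illustration of the position weight matrix (PWM) scan mentioned above: a PWM assigns each base at each motif position a log-odds score, and candidate promoters are windows whose summed score is highest or exceeds a threshold. The matrix values and sequence below are invented placeholders, not the σ54 model used in the study; a minimal sketch in Python:

```python
import numpy as np

# Hypothetical 4 x L log-odds matrix (rows: A, C, G, T; columns: motif positions).
ALPHABET = "ACGT"
pwm = np.array([
    [ 0.5, -1.2,  0.1, -2.0],   # scores for A at each position
    [-0.7,  1.1, -1.5,  0.3],   # C
    [ 1.2, -0.4,  0.9, -0.8],   # G
    [-1.0,  0.2, -0.3,  1.4],   # T
])

def pwm_score(window):
    # Sum the per-position log-odds scores for one candidate window.
    return sum(pwm[ALPHABET.index(base), i] for i, base in enumerate(window))

# Slide over a (made-up) promoter region and report the best-scoring window.
region = "TTGGCACGCAAATTGCT"
L = pwm.shape[1]
best = max(range(len(region) - L + 1),
           key=lambda i: pwm_score(region[i:i + L]))
print(best, region[best:best + L], pwm_score(region[best:best + L]))
```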

Relevance:

60.00%

Publisher:

Abstract:

Electrical Impedance Tomography (EIT) is an imaging method which enables a volume conductivity map of a subject to be produced from multiple impedance measurements. It has the potential to become a portable non-invasive imaging technique of particular use in imaging brain function. Accurate numerical forward models may be used to improve image reconstruction but, until now, have employed an assumption of isotropic tissue conductivity. This may be expected to introduce inaccuracy, as body tissues, especially those such as white matter and the skull in head imaging, are highly anisotropic. The purpose of this study was, for the first time, to develop a method for incorporating anisotropy in a forward numerical model for EIT of the head and to assess the resulting improvement in image quality for linear reconstruction in one example of the human head. A realistic Finite Element Model (FEM) of an adult human head with segments for the scalp, skull, CSF, and brain was produced from a structural MRI. Anisotropy of the brain was estimated from a diffusion tensor MRI of the same subject, and anisotropy of the skull was approximated from the structural information. A method for incorporating anisotropy in the forward model and using it in image reconstruction was developed. The improvement in reconstructed image quality was assessed in computer simulation by producing forward data and then performing linear reconstruction using a sensitivity matrix approach. The mean boundary data difference between anisotropic and isotropic forward models for a reference conductivity was 50%. Use of the correct anisotropic FEM in image reconstruction, as opposed to an isotropic one, corrected an error of 24 mm in imaging a 10% conductivity decrease located in the hippocampus, improved localisation for conductivity changes deep in the brain and due to epilepsy by 4-17 mm, and, overall, led to a substantial improvement in image quality. This suggests that incorporation of anisotropy in numerical models used for image reconstruction is likely to improve EIT image quality.
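
To make the sensitivity matrix approach concrete: in linear EIT reconstruction, boundary voltage changes dv are related to element-wise conductivity changes dσ through a Jacobian J, dv ≈ J dσ, and the image is recovered by regularised inversion. The dimensions, noise level, and regularisation parameter below are invented for illustration, not the study's values; a minimal sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
J = rng.standard_normal((64, 500))   # hypothetical 64 measurements x 500 mesh elements

# Simulate a 10% local conductivity decrease and noisy boundary data.
dsigma_true = np.zeros(500)
dsigma_true[240:260] = -0.1
dv = J @ dsigma_true + 1e-3 * rng.standard_normal(64)

# Zeroth-order Tikhonov-regularised linear reconstruction.
lam = 1e-2                           # regularisation parameter (assumed)
dsigma_rec = np.linalg.solve(J.T @ J + lam * np.eye(500), J.T @ dv)

print(dsigma_rec[240:260].mean())    # recovered amplitude in the perturbed region
```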

Relevance:

60.00%

Publisher:

Abstract:

Conservation biology is commonly associated with the protection of small, endangered populations. Nevertheless, large or potentially large populations may also need human management to prevent the negative effects of overpopulation. As there are both qualitative and quantitative differences between protecting small populations and controlling large ones, distinct methods and models are needed. The aim of this work was to develop theoretical models to predict large-population dynamics, as well as computer tools to estimate the parameters of these models and to test management scenarios. The Alpine ibex (Capra ibex ibex), which has expanded spectacularly since its reintroduction in Switzerland at the beginning of the 20th century, was used as the paradigm species. This task was achieved in three steps. A local population dynamics model was first developed specifically for the ibex: the underlying age- and sex-structured model is based on a Leslie matrix approach with the addition of density-dependence, environmental stochasticity and culling. This model was implemented in a management-support software package, named SIM-Ibex, allowing census data maintenance, automated parameter estimation, and the tuning and simulation of culling strategies. However, population dynamics is driven not only by demographic factors but also by dispersal and the colonisation of new areas. Habitat suitability and obstacle modelling therefore had to be addressed. Thus, a software package named Biomapper was developed. Its central module is based on the Ecological Niche Factor Analysis (ENFA), whose principle is to compute niche marginality and specialisation factors from a set of environmental predictors and species presence data. All Biomapper modules are linked to Geographic Information Systems (GIS); they cover all operations of data importation, predictor preparation, ENFA and habitat suitability map computation, and results validation and further processing; a module also allows the mapping of dispersal barriers and corridors. The ENFA application domain was then explored by means of a simulated species distribution. It was compared to a commonly used habitat suitability method, the Generalised Linear Model (GLM), and proved better suited for spreading or cryptic species. Demographic and landscape information were finally merged into a global model. To cope with landscape realism and the technical constraints of large-population modelling, a cellular automaton approach was chosen: the study area is modelled by a lattice of hexagonal cells, each one characterised by a few fixed properties (a carrying capacity and six impermeability rates quantifying exchanges between adjacent cells) and one variable, population density. The latter varies according to local reproduction/survival and dispersal dynamics, modified by density-dependence and stochasticity. A software tool named HexaSpace was developed, which performs two functions: (1) calibrating the automaton on the basis of local population dynamics models (e.g. computed by SIM-Ibex) and a habitat suitability map (e.g. computed by Biomapper); (2) running simulations. It allows the study of the spread of an invading species across a complex landscape made of areas of varying suitability and dispersal barriers. This model was applied to the history of the ibex reintroduction in the Bernese Alps (Switzerland). SIM-Ibex is now used by governmental wildlife managers to prepare and verify culling plans. Biomapper has been applied to several species (both plants and animals) around the world. In the same way, whilst HexaSpace was originally designed for terrestrial animal species, it could easily be extended to model plant propagation or the dispersal of flying animals. As these software tools were designed to build a complex, realistic model from low-level data, and as they benefit from an intuitive user interface, they lend themselves to many applications in conservation biology. Moreover, theoretical questions in the fields of population and landscape ecology might also be addressed by these approaches.
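
A bare-bones sketch of the Leslie matrix projection underlying such a model, n(t+1) = L n(t), with fecundities on the first row and survival rates on the subdiagonal. The values are illustrative only and omit the density-dependence, stochasticity and culling terms that SIM-Ibex adds:

```python
import numpy as np

# Illustrative 4-age-class Leslie matrix (not SIM-Ibex parameters).
L = np.array([[0.0, 0.4, 0.8, 0.6],   # age-specific fecundities
              [0.7, 0.0, 0.0, 0.0],   # survival age 0 -> 1
              [0.0, 0.9, 0.0, 0.0],   # survival age 1 -> 2
              [0.0, 0.0, 0.8, 0.0]])  # survival age 2 -> 3

n = np.array([100.0, 50.0, 30.0, 20.0])  # initial counts per age class
for year in range(5):
    n = L @ n                            # project one year ahead
print(n)

# The asymptotic growth rate is the dominant eigenvalue of L.
print(np.max(np.abs(np.linalg.eigvals(L))))
```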

Relevance:

60.00%

Publisher:

Abstract:

In the context of a two-dimensional exactly solvable model, the dynamics of quantum black holes is obtained by analytically continuing the description of the regime where no black hole is formed. The resulting spectrum of outgoing radiation departs from the one predicted by the Hawking model in the region where the outgoing modes arise from the horizon with Planck-order frequencies. This occurs early in the evaporation process, and the resulting physical picture is unconventional. The theory predicts that black holes will only radiate out an energy of Planck mass order, stabilizing after a transitory period. The continuation from a regime without black hole formation (accessible in the 1+1 gravity theory considered) is implicit in an S-matrix approach and suggests in this way a possible solution to the problem of information loss.

Relevance:

60.00%

Publisher:

Abstract:

The symmetry energy coefficients, incompressibility, and single-particle and isovector potentials of clusterized dilute nuclear matter are calculated at different temperatures, employing the S-matrix approach to the evaluation of the equation of state. Calculations have been extended to understand the aforesaid properties of homogeneous and clusterized supernova matter in the subnuclear density region. A comparison of the results in the S-matrix and mean-field approaches reveals some subtle differences in the density and temperature region we explore.
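
For reference, the symmetry energy coefficient discussed here is conventionally defined through the standard quadratic expansion of the energy per nucleon in the isospin asymmetry (a textbook definition, not the paper's specific S-matrix expression):

```latex
% Energy per nucleon expanded in the isospin asymmetry delta;
% E_sym measures the cost of making matter neutron-rich at density rho.
\[
  e(\rho,\delta) \simeq e(\rho,0) + E_{\mathrm{sym}}(\rho)\,\delta^{2},
  \qquad \delta = \frac{\rho_{n}-\rho_{p}}{\rho}.
\]
```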

Relevance:

60.00%

Publisher:

Abstract:

In recent years, nonequilibrium systems have attracted great interest. Although the Master Equations are among the most common methods used to describe these systems, the literature about these equations is not straightforward, owing to the mathematical framework used in their derivations. The goals of this work are to present the physical concepts behind the development of the Master Equations and to discuss their basic properties via a matrix approach. It is also shown how the Master Equations can be used to model typical nonequilibrium processes such as multi-well chemical reactions and radiation absorption processes.
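
As a concrete toy instance of the matrix formulation: for a system hopping between two wells, the master equation is dp/dt = W p, with a rate matrix W whose columns sum to zero (probability conservation), and its formal solution is p(t) = exp(Wt) p(0). The rate constants below are invented; a minimal sketch:

```python
import numpy as np
from scipy.linalg import expm

# Jump rates for a hypothetical two-well system: well 0 -> 1 and well 1 -> 0.
k01, k10 = 0.3, 0.1
W = np.array([[-k01,  k10],
              [ k01, -k10]])   # columns sum to zero: total probability conserved

p0 = np.array([1.0, 0.0])      # all probability initially in well 0

for t in (0.0, 1.0, 5.0, 50.0):
    p = expm(W * t) @ p0       # formal solution p(t) = exp(Wt) p(0)
    print(f"t = {t:5.1f}  p = {p}")
# As t grows, p approaches the stationary distribution (k10, k01)/(k01 + k10).
```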

Relevance:

60.00%

Publisher:

Abstract:

In this article a two-dimensional transient boundary element formulation based on the mass matrix approach is discussed. The implicit formulation of the method to deal with elastoplastic analysis is considered, as well as the treatment of viscous damping effects. The time integration processes are based on the Newmark ρ and Houbolt methods, while the domain integrals for mass, elastoplastic and damping effects are carried out by the well-known cell approximation technique. The boundary element algebraic relations are also coupled with finite element frame relations to solve stiffened domains. Some examples illustrating the accuracy and efficiency of the proposed formulation are also presented.
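
For orientation, one step of the classical Newmark time-integration scheme for the semi-discrete system M a + C v + K u = f is shown below (textbook average-acceleration parameters by default; this is the generic scheme, not the paper's BEM-specific implementation):

```python
import numpy as np

def newmark_step(M, C, K, u, v, a, f_next, dt, beta=0.25, gamma=0.5):
    """One Newmark step for M a + C v + K u = f; returns (u, v, a) at t + dt."""
    # Effective stiffness and effective load at the next time level.
    Keff = K + gamma / (beta * dt) * C + M / (beta * dt**2)
    feff = (f_next
            + M @ (u / (beta * dt**2) + v / (beta * dt) + (0.5 / beta - 1.0) * a)
            + C @ (gamma / (beta * dt) * u + (gamma / beta - 1.0) * v
                   + dt * (gamma / (2 * beta) - 1.0) * a))
    u_next = np.linalg.solve(Keff, feff)
    a_next = (u_next - u) / (beta * dt**2) - v / (beta * dt) - (0.5 / beta - 1.0) * a
    v_next = v + dt * ((1.0 - gamma) * a + gamma * a_next)
    return u_next, v_next, a_next

# Tiny 1-DOF usage example: damped oscillator released with unit velocity.
M, C, K = np.eye(1), 0.1 * np.eye(1), 4.0 * np.eye(1)
u, v = np.zeros(1), np.ones(1)
a = np.linalg.solve(M, -C @ v - K @ u)   # consistent initial acceleration
for _ in range(10):
    u, v, a = newmark_step(M, C, K, u, v, a, np.zeros(1), dt=0.05)
print(u, v)
```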

Relevance:

60.00%

Publisher:

Abstract:

Given a heterogeneous relation algebra R, it is well known that the algebra of matrices with coefficients from R is a relation algebra with relational sums that is not necessarily finite. When a relational product exists or the point axiom is given, we can represent the relation algebra by concrete binary relations between sets, which means the algebra may be seen as an algebra of Boolean matrices. However, it is not possible to represent every relation algebra this way. It is well known that the smallest relation algebra that is not representable has only 16 elements; such an algebra cannot be put in Boolean matrix form [15]. In [15, 16] it was shown that every relation algebra R with relational sums and sub-objects is equivalent to an algebra of matrices over a suitable basis. This basis is given by the integral objects of R and is, compared to R, much smaller. The aim of my thesis is to develop a system called ReAlM - Relation Algebra Manipulator - that is capable of visualizing computations in arbitrary relation algebras using the matrix approach.
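
The matrix view is easy to make concrete in the representable, finite case: a relation between finite sets is a Boolean matrix, and relational composition is Boolean matrix multiplication. A minimal sketch of that special case (it does not, of course, capture the 16-element non-representable algebra mentioned above):

```python
import numpy as np

# A relation on finite sets as a Boolean matrix: R[i, j] means i R j.
def compose(R, S):
    # Relational composition (R ; S): exists j with i R j and j S k.
    return (R.astype(int) @ S.astype(int)) > 0

R = np.array([[0, 1, 0],
              [0, 0, 1],
              [0, 0, 0]], dtype=bool)   # successor relation on {0, 1, 2}

print(compose(R, R).astype(int))        # two-step reachability
```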

Relevance:

60.00%

Publisher:

Abstract:

On-chip multiprocessor (OCM) systems are considered the best structures for occupying the space available on current integrated circuits. In our work, we focus on an architectural model, called the isometric on-chip multiprocessor architecture, which makes it possible to evaluate, predict and optimise OCM systems through an efficient organisation of the nodes (processors and memories), and on methodologies for using these architectures efficiently. In the first part of the thesis, we address the topology of the model and propose an architecture that allows efficient and massive use of on-chip memories. Processors and memories are organised according to an isometric approach, which consists in bringing data closer to processes rather than optimising transfers between processors and memories laid out in the conventional way. The architecture is a three-dimensional mesh model. The layout of the units in this model is inspired by the crystal structure of sodium chloride (NaCl), in which each processor can access six memories at once and each memory can communicate with as many processors at once. In the second part of our work, we address a decomposition methodology in which the ideal number of nodes of the model can be determined from a matrix specification of the application to be processed by the proposed model. Since the performance of a model depends on the amount of data flow exchanged between its units, and hence on their number, and since our goal is to guarantee good computational performance for the application at hand, we propose to find the ideal number of processors and memories for the system to be built. We also consider the decomposition of the specification of the model to be built, or of the application to be processed, according to the load balance of the units. We thus propose a decomposition approach based on three points: the transformation of the specification or application into an incidence matrix whose elements are the data flows between processes and data; a new methodology based on the Cell Formation Problem (CFP); and the balancing of process load across processors and of data load across memories. In the third part, still with the aim of designing an efficient, high-performance system, we address the assignment of processors and memories using a two-stage methodology. First, we assign units to the nodes of the system, considered here as an undirected graph; second, we assign values to the edges of this graph. For the assignment, we propose a modelling of the decomposed applications using a matrix approach and the Quadratic Assignment Problem (QAP). For assigning values to the edges, we propose a gradual perturbation approach to search for the best combination of assignment costs while respecting parameters such as temperature, heat dissipation, energy consumption and the area occupied by the chip. The ultimate goal of this work is to offer on-chip multiprocessor system architects a non-traditional methodology and a systematic, effective design-support tool usable from the functional specification phase of the system onwards.
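
To illustrate the QAP step of the assignment methodology: given a flow matrix F between decomposed units and a distance matrix D between nodes of the chip graph, an assignment (a permutation of units onto nodes) is scored by the total flow-weighted distance. The tiny instance below is invented for illustration; a brute-force sketch:

```python
import numpy as np
from itertools import permutations

# Hypothetical toy instance: F[i, j] is the data flow between units i and j,
# D[a, b] is the distance between nodes a and b on the chip graph.
F = np.array([[0, 5, 2],
              [5, 0, 3],
              [2, 3, 0]])
D = np.array([[0, 1, 2],
              [1, 0, 1],
              [2, 1, 0]])

def qap_cost(perm):
    # Cost of placing unit i on node perm[i]: sum of flow * distance.
    return sum(F[i, j] * D[perm[i], perm[j]]
               for i in range(len(perm)) for j in range(len(perm)))

best = min(permutations(range(3)), key=qap_cost)
print(best, qap_cost(best))
```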

Relevance:

60.00%

Publisher:

Abstract:

Biomechanical analysis of human movement using optoelectronic systems and skin markers treats body segments as rigid bodies. However, the movement of soft tissue relative to the bone, namely the muscles and adipose tissue, causes marker displacement. This displacement has two components: an individual component corresponding to the random movement of each marker, and an in-unison component causing the common displacement of skin markers due to the movement of the underlying masses. Although many studies aim to minimise these displacements, simulations have shown that soft-tissue movement reduces joint dynamics. This observation has only been made through simulation, because no method exists that can dissociate soft-tissue kinematics from bone kinematics. The main objective of this thesis is to develop a numerical method capable of distinguishing these two kinematics. The first objective was to evaluate a local optimisation method for estimating the movement of soft tissue relative to the humerus, obtained with an intracortical pin screwed into the bone in three subjects. The results show that local optimisation underestimates marker displacement by 50% and ranks markers differently according to their displacement. The limitation of this method is that it does not account for all the components of soft-tissue movement, in particular the in-unison component. The second objective was to develop a numerical method that considers all the components of soft-tissue movement. Specifically, this method had to provide similar kinematics and a larger estimate of marker displacement than conventional methods, and to dissociate these components. The lower limb is modelled with a 10-degree-of-freedom kinematic chain reconstructed by global optimisation using only the markers placed on the pelvis and the medial face of the tibia. Estimating the kinematics without considering the markers placed on the thigh and calf avoids the influence of their displacement on the reconstruction of the kinematic model. This method, tested on 13 subjects during jumps, yielded up to 2.1 times more marker displacement depending on the method considered, while producing similar kinematics. A vector approach showed that marker displacement is mostly due to the in-unison component. A matrix approach combining local optimisation with the kinematic chain showed that soft tissue moves mainly around the longitudinal axis and along the antero-posterior axis of the bone. The originality of this thesis is the numerical dissociation of bone kinematics from soft-tissue kinematics, and of the components of this movement. The methods developed in this thesis increase knowledge of soft-tissue movement and open the way to studying its effect on joint dynamics.
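
A typical building block of the local optimisation referred to above is the least-squares rigid-body fit of a marker cluster (the Kabsch/SVD solution); residuals after the fit are then attributed to soft-tissue artefact. A minimal sketch of that generic step, not the thesis' specific pipeline:

```python
import numpy as np

def rigid_fit(A, B):
    """Least-squares rigid transform (R, t) mapping marker set A onto B.

    A, B: (n, 3) arrays of corresponding marker positions; residuals
    B - (A @ R.T + t) are the non-rigid (soft-tissue) part of the motion.
    """
    ca, cb = A.mean(axis=0), B.mean(axis=0)
    H = (A - ca).T @ (B - cb)                  # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cb - R @ ca
    return R, t

# Example: recover a pure translation of a 4-marker cluster.
A = np.array([[0.0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]])
R, t = rigid_fit(A, A + np.array([0.0, 0.0, 0.5]))
print(np.round(R, 3), t)
```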

Relevance:

60.00%

Publisher:

Abstract:

Effective medium approximations for the frequency-dependent and complex-valued effective stiffness tensors of cracked/porous rocks with multiple solid constituents are developed on the basis of the T-matrix approach (based on integral equation methods for quasi-static composites), the elastic-viscoelastic correspondence principle, and a unified treatment of the local and global flow mechanisms, which is consistent with the principle of fluid mass conservation. The main advantage of using the T-matrix approach, rather than the first-order approach of Eshelby or the second-order approach of Hudson, is that it produces physically plausible results even when the volume concentrations of inclusions or cavities are no longer small. The new formulae, which operate with an arbitrary homogeneous (anisotropic) reference medium and contain terms of all orders in the volume concentrations of solid particles and communicating cavities, take explicit account of inclusion shape and spatial distribution independently. We show analytically that an expansion of the T-matrix formulae to first order in the volume concentration of cavities (in agreement with the dilute estimate of Eshelby) has the correct dependence on the properties of the saturating fluid, in the sense that it is consistent with the Brown-Korringa relation, when the frequency is sufficiently low. We present numerical results for the (anisotropic) effective viscoelastic properties of a cracked permeable medium with finite storage porosity, indicating that the complete T-matrix formulae (including the higher-order terms) are generally consistent with the Brown-Korringa relation, at least if we assume the spatial distribution of cavities to be the same for all cavity pairs. We have found an efficient way to treat statistical correlations in the shapes and orientations of the communicating cavities, and also obtained a reasonable match between theoretical predictions (based on a dual porosity model for quartz-clay mixtures, involving relatively flat clay-related pores and more rounded quartz-related pores) and laboratory results for the ultrasonic velocity and attenuation spectra of a suite of typical reservoir rocks.
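
For orientation, the first-order (dilute) truncation referred to above has the generic form of an Eshelby-type estimate. Writing C0 for the reference stiffness, v_r for the volume concentration of inclusion family r, and ⟨t_r⟩ for its orientation-averaged t-matrix, the expansion reads schematically (omitting the higher-order interaction terms the paper retains):

```latex
% Schematic first-order (dilute) truncation of the T-matrix expansion.
\[
  \mathbf{C}^{*} \;\approx\; \mathbf{C}_{0}
  + \sum_{r} v_{r}\,\langle \mathbf{t}_{r} \rangle .
\]
```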

Relevance:

60.00%

Publisher:

Abstract:

In a global business economy, firms have a broad range of corporate real estate needs. During the past decade, multiple strategies and tactics have emerged in the corporate real estate community for meeting those needs. We propose here a framework for analysing and prioritising the various types of risk inherent in corporate real estate decisions. From a business strategy perspective, corporate real estate must serve needs beyond the simple one of shelter for the workforce and production process. Certain uses are strategic in that they allow access to externalities, embody the business strategy, or provide entrée to new markets. Other uses may be tactical, in that they arise from business activities of relatively short duration or provide an opportunity to pre-empt competitors. Still other corporate real estate uses can be considered "core" to the existence of the business enterprise. These might be special-use properties or may be generic buildings that have become embodiments of the organisation's culture. We argue that a multi-dimensional matrix approach organised around three broad themes and nine sub-categories allows the decision-maker to organise and evaluate choices with an acceptable degree of rigour and thoroughness. The three broad themes are Use (Core, Cyclical or Casual), Asset Type (Strategic, Specialty or Generic) and Market Environment (ranging from Mature Domestic to Emerging Economy). Proper understanding of each of these groupings brings critical variables to the fore and allows for efficient resource allocation and enhanced risk management.

Relevance:

60.00%

Publisher:

Abstract:

In most bacteria, the ferric uptake regulator (Fur) is a global regulator that controls iron homeostasis and other cellular processes, such as oxidative stress defense. In this work, we apply a combination of bioinformatics and in vitro and in vivo assays to identify the Caulobacter crescentus Fur regulon. A C. crescentus fur deletion mutant showed a slow-growth phenotype and was hypersensitive to H2O2 and organic peroxide. Using a position weight matrix approach, several predicted Fur-binding sites were detected in the genome of C. crescentus, located in the regulatory regions of genes involved not only in iron uptake and usage but also in other functions. Selected Fur-binding sites were validated using electrophoretic mobility shift assays and DNase I footprinting analysis. Gene expression assays revealed that genes involved in iron uptake were repressed by iron-Fur and induced under conditions of iron limitation, whereas genes encoding iron-using proteins were activated by Fur under conditions of iron sufficiency. Furthermore, several genes that are regulated via small RNAs in other bacteria were found to be directly regulated by Fur in C. crescentus. In conclusion, Fur functions as an activator and as a repressor, integrating iron metabolism and the oxidative stress response in C. crescentus.


Relevance:

60.00%

Publisher:

Abstract:

This paper investigates how demographic (socioeconomic) and land-use (physical and environmental) data can be integrated within a decision support framework to formulate and evaluate land-use planning scenarios. A case-study approach is undertaken with land-use planning scenarios for a rapidly growing coastal area in Australia, the Shire of Hervey Bay. The town and surrounding area require careful planning of future urban growth among competing land uses. Three potential urban growth scenarios are put forth to address this issue. Scenario A ('continued growth') is based on existing socioeconomic trends. Scenario B ('maximising rates base') is derived using optimisation modelling of land-valuation data. Scenario C ('sustainable development') is derived using a number of social, economic, and environmental factors and assigning weightings of importance to each factor using a multiple criteria analysis approach. The land-use planning scenarios are presented through the use of maps and tables within a geographical information system, which delineate possible future land-use allocations up to 2021. The planning scenarios are evaluated using a goal-achievement matrix approach. The matrix is constructed with a number of criteria derived from key policy objectives outlined in the regional growth management framework and town planning schemes. The authors of this paper examine the final efficiency scores calculated for each of the three planning scenarios and discuss the advantages and disadvantages of the three land-use modelling approaches used to formulate the final scenarios.
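
To make the goal-achievement matrix evaluation concrete: each scenario is scored against each policy criterion, criteria are weighted, and a scenario's efficiency score is its weighted column sum. The criteria, scores and weights below are invented placeholders, not the study's values; a minimal sketch:

```python
import numpy as np

# Hypothetical goal-achievement matrix: rows are policy criteria,
# columns are planning scenarios A, B, C; entries score achievement 0-10.
scores = np.array([[7, 4, 9],    # e.g. environmental protection
                   [5, 9, 6],    # e.g. rates revenue
                   [6, 5, 8]])   # e.g. infrastructure efficiency
weights = np.array([0.5, 0.3, 0.2])   # illustrative criterion weights

efficiency = weights @ scores         # weighted score per scenario
for name, s in zip("ABC", efficiency):
    print(f"Scenario {name}: {s:.2f}")
```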