923 results for Stabilité (Stability)



The problem of stability analysis for a class of neutral systems with mixed time-varying neutral, discrete, and distributed delays and nonlinear parameter perturbations is addressed. By introducing a novel Lyapunov-Krasovskii functional and combining the descriptor model transformation, the Leibniz-Newton formula, free-weighting matrices, and a suitable change of variables, new sufficient conditions are established for the stability of the considered system; these conditions are neutral-delay-dependent, discrete-delay-range-dependent, and distributed-delay-dependent. The conditions are presented in terms of linear matrix inequalities (LMIs) and can be solved efficiently using convex programming techniques. Two numerical examples illustrate the effectiveness of the proposed method.
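As a much simpler, delay-free illustration of Lyapunov-based stability certification (not the delay-dependent LMI conditions of the paper), one can check asymptotic stability of a linear system x' = Ax by solving the Lyapunov equation AᵀP + PA = -Q and verifying that P is positive definite. A sketch using SciPy, with a hypothetical 2x2 system matrix:

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# Hypothetical stable system matrix (eigenvalues -1 and -2); delay-free toy case.
A = np.array([[-1.0, 0.5],
              [0.0, -2.0]])

# Solve A^T P + P A = -Q with Q = I; a positive definite P certifies
# asymptotic stability of x' = A x.
Q = np.eye(2)
P = solve_continuous_lyapunov(A.T, -Q)

print("P positive definite:", bool(np.all(np.linalg.eigvalsh(P) > 0)))  # True
```

The delay-dependent conditions of the paper generalise this idea: the LMIs play the role of the Lyapunov equation, and feasibility is checked by a semidefinite-programming solver instead of a direct linear solve.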


Why does the EU have an ambiguous and inconsistent democracy promotion (DP) policy towards the Mediterranean countries? This paper argues that the EU's DP is determined by a crucial conflict of interests, conceptualised as a stability-democracy dilemma. The EU has been attempting to promote democracy without risking current stability, in connivance with incumbent autocratic regimes. In view of this dilemma, the four main characteristics of the EU's DP are explored, namely: gradualism, a strong notion of partnership-building, a narrow definition of civil society, and a strong belief in economic liberalisation. A fifth feature, the EU's relations with moderate Islamists, is analysed as the most striking illustration of these contradictions. The paper concludes by arguing that the definition of a clear DP that considers engagement with moderate Islamists would represent a major step towards squaring the EU's stability-democracy circle.


In this paper, we give a new construction of resonant normal forms with a small remainder for near-integrable Hamiltonians at a quasi-periodic frequency. The construction is based on the special case of a periodic frequency, a Diophantine result concerning the approximation of a vector by independent periodic vectors and a technique of composition of periodic averaging. It enables us to deal with non-analytic Hamiltonians, and in this first part we will focus on Gevrey Hamiltonians and derive normal forms with an exponentially small remainder. This extends a result which was known for analytic Hamiltonians, and only in the periodic case for Gevrey Hamiltonians. As applications, we obtain an exponentially large upper bound on the stability time for the evolution of the action variables and an exponentially small upper bound on the splitting of invariant manifolds for hyperbolic tori, generalizing corresponding results for analytic Hamiltonians.


This paper is a sequel to "Normal forms, stability and splitting of invariant manifolds I. Gevrey Hamiltonians", in which we gave a new construction of resonant normal forms with an exponentially small remainder for near-integrable Gevrey Hamiltonians at a quasi-periodic frequency, using a method of periodic approximations. In this second part we focus on finitely differentiable Hamiltonians, and we derive normal forms with a polynomially small remainder. As applications, we obtain a polynomially large upper bound on the stability time for the evolution of the action variables and a polynomially small upper bound on the splitting of invariant manifolds for hyperbolic tori.


Animals can often coordinate their actions to achieve mutually beneficial outcomes. However, this can result in a social dilemma when uncertainty about the behavior of partners creates multiple fitness peaks. Strategies that minimize risk ("risk dominant") instead of maximizing reward ("payoff dominant") are favored in economic models when individuals learn behaviors that increase their payoffs. Specifically, such strategies are shown to be "stochastically stable" (a refinement of evolutionary stability). Here, we extend the notion of stochastic stability to biological models of continuous phenotypes at a mutation-selection-drift balance. This allows us to make a unique prediction for long-term evolution in games with multiple equilibria. We show how genetic relatedness due to limited dispersal and scaled to account for local competition can crucially affect the stochastically-stable outcome of coordination games. We find that positive relatedness (weak local competition) increases the chance the payoff dominant strategy is stochastically stable, even when it is not risk dominant. Conversely, negative relatedness (strong local competition) increases the chance that strategies evolve that are neither payoff nor risk dominant. Extending our results to large multiplayer coordination games we find that negative relatedness can create competition so extreme that the game effectively changes to a hawk-dove game and a stochastically stable polymorphism between the alternative strategies evolves. These results demonstrate the usefulness of stochastic stability in characterizing long-term evolution of continuous phenotypes: the outcomes of multiplayer games can be reduced to the generic equilibria of two-player games and the effect of spatial structure can be analyzed readily.
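The payoff-dominant versus risk-dominant distinction the abstract relies on can be made concrete with the textbook two-player stag hunt. A small sketch of the Harsanyi-Selten comparison (the payoff values are illustrative, not taken from the paper):

```python
# Symmetric 2x2 coordination game (stag hunt):
# payoff[i][j] = row player's payoff when row plays i and column plays j.
# Strategy 0 = "stag" (high joint reward), strategy 1 = "hare" (safe).
payoff = [[4.0, 0.0],
          [3.0, 3.0]]

def payoff_dominant(p):
    # The equilibrium with the higher on-diagonal payoff.
    return 0 if p[0][0] > p[1][1] else 1

def risk_dominant(p):
    # Harsanyi-Selten criterion via products of deviation losses; for a
    # symmetric game it reduces to comparing p[0][0]-p[1][0] with p[1][1]-p[0][1].
    return 0 if (p[0][0] - p[1][0]) > (p[1][1] - p[0][1]) else 1

print(payoff_dominant(payoff))  # 0: stag yields the higher equilibrium payoff
print(risk_dominant(payoff))    # 1: hare is the safer bet against uncertainty
```

In this game the two criteria disagree, which is exactly the tension the paper resolves via stochastic stability under relatedness and local competition.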


Summary: Mining activities produce enormous amounts of waste material known as tailings, composed of fine to medium size particles. These tailings often contain sulfides, whose oxidation can lead to acid and metal contamination of water; they therefore need to be remediated. In this work, a tailings bioremediation approach was investigated in an interdisciplinary study combining geochemistry, mineralogy, and microbiology. The aim was to study the effect of implementing a wetland above oxidizing tailings on the hydrogeology and the biogeochemical element cycles, and to assess the evolution of the system over time. To this end, the biogeochemical processes occurring in a marine shore tailings deposit were investigated. The studied deposit is located at the Bahía de Ite, on the Pacific Ocean in southern Peru, where between 1940 and 1996 tailings were discharged from the two porphyry copper mines Cuajone and Toquepala. After the end of deposition, remediation was initiated in 1997 with the implementation of a wetland above the oxidizing tailings. Around 90% of the tailings deposit (16 km2 in total) was thus remediated, except for the central delta area and some areas close to the shoreline. A multi-stable-isotope study showed that the tailings were saturated with fresh water in spite of the marine setting, owing to the high hydraulic gradient resulting from the wetland implementation. Submarine groundwater discharge (SGD) was the major source of SO4^2-, Cl-, Na+, Fe2+, and Mn2+ input into the tailings at the original shelf-seawater interface.
The geochemical study (aquatic geochemistry, X-ray diffraction (XRD), and sequential extractions from the solid fraction) showed that iron and sulfur oxidation were the main processes in the non-remediated tailings, which exhibited a low-pH oxidation zone at the top with strong accumulation of efflorescent salts at the surface, due to capillary upward transport of heavy metals (Fe, Cu, Zn, Mn, Cd, Co, and Ni) in the arid climate. The study also showed that the implementation of the wetland resulted in very low concentrations of heavy metals in solution (mostly below the detection limit), due to the near-neutral pH and more reducing conditions (100-150 mV). The heavy metals removed from solution precipitated as hydroxides and sulfides or were bound to organic matter. Analysis of the bacterial community composition by terminal restriction fragment length polymorphism (T-RFLP) and by cloning and sequencing of 16S rRNA genes, combined with a detailed statistical analysis, revealed a high correlation between the bacterial distribution and the geochemical variables. Acidophilic autotrophic oxidizing bacteria dominated the oxidizing tailings, whereas neutrophilic and heterotrophic reducing bacteria drove the biogeochemical processes in the remediated tailings below the wetland. In the subsurface of the remediated tailings, iron cycling with both oxidation and reduction processes was identified, made possible by micro-aerophilic niches provided by the plant rhizosphere in this overall reducing environment. The in situ bioremediation experiment showed that the key parameters controlling the effectiveness of the treatment are the water table and the water chemistry. The constructed remediation cells removed metals more efficiently and rapidly when water-saturated conditions were maintained. This study showed that bioremediation by wetland implementation can be an effective and rapid treatment for some sulfidic mine tailings deposits.
However, the water saturation of the tailings has to be managed on a long-term basis in order to guarantee stability.


In vivo dosimetry is a way to verify the radiation dose delivered to the patient by measuring the dose, generally during the first fraction of the treatment. It is the only dose delivery control based on a measurement performed during the treatment. In today's radiotherapy practice, the dose delivered to the patient is planned using 3D dose calculation algorithms and volumetric images representing the patient. Due to the high accuracy and precision required in radiation treatments, national and international organisations such as the ICRU and the AAPM recommend the use of in vivo dosimetry; it is also mandatory in some countries, such as France. Various in vivo dosimetry methods have been developed in recent years, providing point, line, plane, or 3D dose controls. 3D in vivo dosimetry provides the most information about the dose delivered to the patient, compared with 1D and 2D methods. However, to our knowledge, it is generally not yet applied routinely to patient treatments. The aim of this PhD thesis was to determine whether it is possible to reconstruct the 3D delivered dose from transmitted beam measurements in the context of narrow beams. An iterative dose reconstruction method is described and implemented. The iterative algorithm includes a simple 3D dose calculation algorithm based on the convolution/superposition principle. The methodology was applied to narrow beams produced by a conventional 6 MV linac. The transmitted dose was measured using an array of ionisation chambers, so as to simulate the linear nature of a tomotherapy detector. We show that the iterative algorithm converges quickly and reconstructs the dose with good agreement (at least 3% / 3 mm locally), which is within the 5% recommended by the ICRU. Moreover, phantom measurements demonstrate that the proposed method allows the detection of some set-up errors and interfraction geometry modifications.
We also discuss the limitations of 3D dose reconstruction for dose delivery error detection. Stability tests of the built-in onboard tomotherapy MVCT detector were then performed in order to evaluate whether such a detector is suitable for 3D in vivo dosimetry. The detector showed short- and long-term stability comparable to other imaging devices, such as EPIDs, which are also used for in vivo dosimetry. Subsequently, a methodology for dose reconstruction using the tomotherapy MVCT detector is proposed in the context of static irradiations. This manuscript is composed of two articles and a script providing further information related to this work. In the latter, the first chapter introduces the state of the art of in vivo dosimetry and adaptive radiotherapy, and explains why we are interested in performing 3D dose reconstructions. In chapter 2, the dose calculation algorithm implemented for this work is reviewed, with a detailed description of the physical parameters needed for calculating 3D absorbed dose distributions. The tomotherapy MVCT detector used for transit measurements and its characteristics are described in chapter 3. Chapter 4 contains a first article, entitled '3D dose reconstruction for narrow beams using ion chamber array measurements', which describes the dose reconstruction method and presents tests of the methodology on phantoms irradiated with 6 MV narrow photon beams. Chapter 5 contains a second article, 'Stability of the Helical TomoTherapy HiArt II detector for treatment beam irradiations'. A dose reconstruction process specific to the use of the tomotherapy MVCT detector is presented in chapter 6. A discussion and perspectives of the PhD thesis are presented in chapter 7, followed by a conclusion in chapter 8. The tomotherapy treatment device is described in appendix 1, and an overview of 3D conformal and intensity-modulated radiotherapy is presented in appendix 2.
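The thesis's reconstruction algorithm itself is not reproduced here, but the flavour of iterative reconstruction from transmitted measurements can be sketched with a toy 1D forward model: a known symmetric blur kernel stands in for the convolution/superposition dose model, and a Landweber-type fixed-point update corrects the estimate by the back-projected residual. Everything below is a hypothetical illustration, not the method of the thesis:

```python
import numpy as np

# Hypothetical 1D forward model: the "measured transmitted signal" is the
# unknown profile convolved with a known symmetric kernel.
kernel = np.array([0.25, 0.5, 0.25])

def forward(profile):
    return np.convolve(profile, kernel, mode="same")

true_profile = np.zeros(21)
true_profile[8:13] = 1.0              # narrow-beam-like top-hat profile
measured = forward(true_profile)

# Landweber-type iteration: add the back-projected residual at each step.
# It converges here because the (symmetric) model matrix is positive definite.
estimate = np.zeros_like(measured)
for _ in range(500):
    estimate = estimate + forward(measured - forward(estimate))

residual = np.max(np.abs(forward(estimate) - measured))
print(residual)  # small: the iteration reproduces the measurement
```

The real problem is 3D, uses a clinical convolution/superposition model, and must contend with noise and detector response, but the structure (forward model, residual, iterative correction until agreement) is the same.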


The riboregulator RsmY of Pseudomonas fluorescens strain CHA0 is an example of the small regulatory RNAs belonging to the global Rsm/Csr regulatory systems, which control diverse cellular processes such as glycogen accumulation, motility, or formation of extracellular products in various bacteria. By binding multiple molecules of the small regulatory protein RsmA, RsmY relieves the negative effect of RsmA on the translation of several target genes involved in the biocontrol properties of strain CHA0. RsmY and functionally related riboregulators have repeated GGA motifs predicted to be exposed in single-stranded regions, notably in the loops of hairpins. The secondary structure of RsmY was corroborated by in vivo cleavage with lead acetate. RsmY mutants lacking three or five (out of six) of the GGA motifs showed a reduced ability to derepress the expression of target genes in vivo and failed to bind the RsmA protein efficiently in vitro. The absence of GGA motifs in RsmY mutants resulted in reduced abundance of these transcripts and in a shorter half-life (6 min or less, compared with 27 min for wild-type RsmY). These results suggest that both the interaction of RsmY with RsmA and the stability of RsmY strongly depend on the GGA repeats, and that the ability of RsmY to interact with small regulatory proteins such as RsmA may protect this RNA from degradation.


Background: With increasing computer power, simulating the dynamics of complex systems in chemistry and biology is becoming increasingly routine. The modelling of individual reactions in (bio)chemical systems involves a large number of random events that can be simulated by the stochastic simulation algorithm (SSA). The key quantity is the step size, or waiting time, τ, whose value depends inversely on the size of the propensities of the different reaction channels and which needs to be re-evaluated after every firing event. Such a discrete event simulation can be extremely expensive, in particular for stiff systems where τ can be very short due to the fast kinetics of some of the reaction channels. Several alternative methods have been put forward to increase the integration step size. The so-called τ-leap approach takes a larger step by allowing all the reactions to fire, according to a Poisson or binomial distribution, within that step. Although the expected values of the different species in the reactive system are maintained with respect to more precise methods, the variance at steady state can suffer from large errors as τ grows. Results: In this paper we extend Poisson τ-leap methods to a general class of Runge-Kutta (RK) τ-leap methods. We show that with a proper selection of the coefficients, the variance of the extended τ-leap can be well behaved, leading to significantly larger step sizes. Conclusions: The benefit of adapting the extended method to RK frameworks is clear in terms of speed of calculation, as the number of evaluations of the Poisson distribution remains one set per time step, as in the original τ-leap method. The approach paves the way for new multiscale methods to simulate (bio)chemical systems.
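The plain Poisson τ-leap that the paper extends can be illustrated on the smallest possible system, a single decay channel A → ∅ with propensity k·x: instead of simulating each firing individually as in the exact SSA, every step draws the number of firings in [t, t+τ) from a Poisson distribution. A minimal sketch (toy parameters, not the RK extension of the paper), assuming NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)

def tau_leap_decay(x0, k, tau, t_end):
    """Poisson tau-leap for the single-channel decay reaction A -> 0.
    Each step fires Poisson(k * x * tau) events at once, rather than
    drawing an exponential waiting time per event as in the exact SSA."""
    x, t = x0, 0.0
    while t < t_end:
        a = k * x                       # propensity of the decay channel
        n_fires = rng.poisson(a * tau)  # number of firings in this leap
        x = max(x - n_fires, 0)         # update the population, clipped at 0
        t += tau
    return x

# Averaged over many runs, the mean tracks the deterministic decay x0*exp(-k*t).
runs = [tau_leap_decay(1000, 1.0, 0.01, 1.0) for _ in range(200)]
print(np.mean(runs))  # close to 1000 * exp(-1) ≈ 368
```

As the abstract notes, the mean stays accurate as τ grows, but the variance degrades; the paper's RK τ-leap methods are designed to keep the variance well behaved at larger steps.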


In neurons, the regulation of microtubules plays an important role in neurite outgrowth, axonal elongation, and growth cone steering. SCG10 family proteins are the only known neuronal proteins that have a strong destabilizing effect, are highly enriched in growth cones, and are thought to play an important role during axonal elongation. Since MAP1B, a microtubule-stabilizing protein, is found in growth cones as well, it was important to test the effect on microtubules when both proteins are present. We used recombinant proteins in microtubule assembly assays and in transfected COS-7 cells to analyze their combined effects in vitro and in living cells, respectively. Individually, both proteins showed their expected activities in microtubule stabilization and destruction, respectively. In MAP1B/SCG10 double-transfected cells, MAP1B could not protect microtubules from SCG10-induced disassembly in most cells, in particular not in cells containing high levels of SCG10. This suggests that SCG10 is more potent at destabilizing microtubules than MAP1B is at rescuing them. In microtubule assembly assays, MAP1B promoted microtubule formation at a ratio of 1 MAP1B per 70 tubulin dimers, whereas a ratio of 1 SCG10 per two tubulin dimers was needed to destroy microtubules. In addition to its known binding to tubulin dimers, SCG10 also binds to purified microtubules in growth cones of dorsal root ganglion neurons in culture. In conclusion, neuronal microtubules are regulated by the antagonistic effects of MAP1B and SCG10, and a fine tuning of the balance of these proteins may be critical for the regulation of microtubule dynamics in growth cones.


Stability berms are commonly constructed where roadway embankments cross soft or unstable ground. Under certain circumstances, the construction of stability berms causes unfavorable environmental impacts, either directly or indirectly, through effects on wetlands, endangered species habitat, stream channelization, longer culvert lengths, larger right-of-way purchases, and construction access limits. In an ever more restrictive regulatory environment, these impacts are problematic: valuable natural resources are lost to the public, permitting review processes lengthen for the department of transportation and permitting agencies, and all parties expend additional time and money. The purpose of this project was to review existing stability berm alternatives for potential use in environmentally sensitive areas. The project also evaluates how stabilization technologies are made feasible, desirable, and cost-effective for transportation projects, and determines which alternatives afford practical solutions for avoiding and minimizing impacts to environmentally sensitive areas. An online survey of engineers at state departments of transportation was also conducted to assess the frequency of use and cost effectiveness of the various stabilization technologies. Geotechnical engineers who responded to the survey overwhelmingly use geosynthetic reinforcement as a suitable and cost-effective solution for stabilizing embankments and cut slopes. By contrast, chemical stabilization and installation of lime/cement columns are rarely employed as remediation measures by state departments of transportation.


This paper revisits previous work by the authors on the relationship between non-quasi-competitiveness (the increase in price caused by an increase in the number of oligopolists) and stability of the equilibrium in the classical Cournot oligopoly model. Although it has been widely accepted in the literature that the loss of quasi-competitiveness is linked, in the long run as new firms enter the market, to instability of the model, in their previous work the authors put forward a model in which a situation of monopoly changed to duopoly, losing quasi-competitiveness while maintaining the stability of the equilibrium. That model could not, at the time, be extended to an arbitrary number of oligopolists. The present paper exhibits such an extension: an oligopoly model in which the loss of quasi-competitiveness persists with as many firms in the market as one wishes, and in which the successive Cournot equilibrium points are unique and asymptotically stable. In this way, the conjecture that non-quasi-competitiveness and instability are equivalent in the long run is proved false for the first time.
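For contrast with the paper's construction, the textbook linear Cournot model is quasi-competitive: with inverse demand p = a - bQ and identical constant marginal cost c, the symmetric equilibrium output per firm is q* = (a - c)/(b(n + 1)), so the market price p* = (a + nc)/(n + 1) falls monotonically toward c as the number of firms n grows. A quick sketch (parameter values are illustrative):

```python
# Textbook linear Cournot oligopoly: inverse demand p = a - b*Q, n identical
# firms with constant marginal cost c. Symmetric equilibrium per-firm output
# is q* = (a - c) / (b * (n + 1)); the equilibrium price works out to
# p* = (a + n*c) / (n + 1), which is decreasing in n whenever a > c.

def cournot_price(a, b, c, n):
    q_star = (a - c) / (b * (n + 1))   # per-firm equilibrium quantity
    return a - b * n * q_star          # equilibrium market price

prices = [cournot_price(a=10.0, b=1.0, c=2.0, n=n) for n in range(1, 6)]
print(prices)  # strictly decreasing toward marginal cost c = 2
```

The paper's point is precisely that this monotone fall in price is not a general feature of Cournot models: its construction keeps the equilibria stable while the price rises as firms enter.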