33 results for Stable And Unstable Manifolds
at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
In this paper, we develop numerical algorithms with small storage and operation requirements for the computation of invariant tori in Hamiltonian systems (exact symplectic maps and Hamiltonian vector fields). The algorithms are based on the parameterization method and follow closely the proof of the KAM theorem given in [LGJV05] and [FLS07]. They essentially consist of solving, by a Newton method, a functional equation satisfied by the invariant tori. Using some geometric identities, it is possible to perform a Newton step using little storage and few operations. In this paper we focus on the numerical issues of the algorithms (speed, storage, and stability) and refer to the papers mentioned above for the rigorous results. We show how to compute efficiently both maximal invariant tori and whiskered tori, together with the invariant stable and unstable manifolds associated with whiskered tori. Moreover, we present fast algorithms for the iteration of quasi-periodic cocycles and the computation of invariant bundles, which is a preliminary step for the computation of whiskered invariant tori. Since quasi-periodic cocycles appear in other contexts, this section may be of independent interest. The numerical methods presented here allow us to compute, in a unified way, both primary and secondary invariant KAM tori; secondary tori are invariant tori that can be contracted to a periodic orbit. We present some preliminary results which ensure that the methods are indeed implementable and fast. We postpone optimized implementations and results on the breakdown of invariant tori to a future paper.
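The linear step at the heart of such KAM iterations is a cohomological (small-divisor) equation, which becomes diagonal in Fourier space. A minimal illustrative sketch of that solve, in Python with NumPy (this is not the implementation of [LGJV05] or [FLS07], only the standard FFT-based idea):

```python
import numpy as np

def solve_cohomological(eta, omega):
    """Solve phi(theta + omega) - phi(theta) = eta(theta) on the torus
    for a zero-average eta, via Fourier coefficients:
    phi_k = eta_k / (exp(2*pi*i*k*omega) - 1) for k != 0, phi_0 = 0."""
    n = len(eta)
    eta_hat = np.fft.fft(eta)
    k = np.fft.fftfreq(n, d=1.0 / n)        # integer wave numbers
    denom = np.exp(2j * np.pi * k * omega) - 1.0
    phi_hat = np.zeros_like(eta_hat)
    nonzero = k != 0
    phi_hat[nonzero] = eta_hat[nonzero] / denom[nonzero]
    return np.fft.ifft(phi_hat).real

# usage: recover phi(theta) = cos(2*pi*theta) from its coboundary
omega = (np.sqrt(5.0) - 1.0) / 2.0          # golden-mean rotation number
n = 256
theta = np.arange(n) / n
eta = np.cos(2 * np.pi * (theta + omega)) - np.cos(2 * np.pi * theta)
phi = solve_cohomological(eta, omega)       # equals cos(2*pi*theta)
```

Each equation of this form costs O(N log N) operations and O(N) storage for N Fourier modes, which is the source of the efficiency claims above.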
Abstract:
In two previous papers [J. Differential Equations, 228 (2006), pp. 530-579; Discrete Contin. Dyn. Syst. Ser. B, 6 (2006), pp. 1261-1300] we developed fast algorithms for the computation of invariant tori in quasi-periodic systems, together with theorems that assess their accuracy. In this paper, we study the results of implementing these algorithms and their performance in actual implementations. More importantly, we note that, owing to the speed of the algorithms and the theoretical developments about their reliability, we can compute with confidence invariant objects close to the breakdown of their hyperbolicity properties. This allows us to identify a mechanism of loss of hyperbolicity and to measure some of its quantitative regularities. We find that some systems lose hyperbolicity because the stable and unstable bundles approach each other while the Lyapunov multipliers remain away from 1. We find empirically that, close to the breakdown, the distances between the invariant bundles and the Lyapunov multipliers, which are natural measures of hyperbolicity, depend on the parameters as power laws with universal exponents. We also observe that, even though the rigorous justifications in [J. Differential Equations, 228 (2006), pp. 530-579] are developed only for hyperbolic tori, the algorithms also work for elliptic tori in Hamiltonian systems. We can continue these tori and also compute some bifurcations at resonance which may lead to the existence of hyperbolic tori with nonorientable bundles. We compute manifolds tangent to nonorientable bundles.
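A power-law exponent of the kind described above is typically extracted by a linear fit in log-log scale of the hyperbolicity measure against the distance to the critical parameter. A minimal sketch on synthetic data (the critical value, exponent 0.5, and prefactor 2.0 below are hypothetical, chosen only to illustrate the fit, not values from the paper):

```python
import numpy as np

# hypothetical measurements: distance between stable/unstable bundles
# as the perturbation parameter eps approaches a critical value eps_c
eps_c = 1.0
eps = np.linspace(0.5, 0.99, 20)
dist = 2.0 * (eps_c - eps) ** 0.5     # assumed law: dist ~ A*(eps_c - eps)**alpha

# the exponent alpha is the slope of the log-log plot
slope, intercept = np.polyfit(np.log(eps_c - eps), np.log(dist), 1)
alpha, A = slope, np.exp(intercept)   # recovers alpha = 0.5, A = 2.0
```

In practice eps_c is not known in advance; it is usually fitted jointly with alpha or refined until the log-log data become straight.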
Abstract:
Results for elastic electron scattering by nuclei, calculated with charge densities from Skyrme forces and covariant effective Lagrangians that accurately describe nuclear ground states, are compared against experiment for stable isotopes. Dirac partial-wave calculations are performed with an adapted version of the ELSEPA package. Motivated by the fact that electron-scattering studies of exotic nuclei are planned at future facilities within the commissioned GSI and RIKEN upgrades, we survey the theoretical predictions from neutron-deficient to neutron-rich isotopes along the tin and calcium isotopic chains. To this end we use the charge densities of a covariant interaction that describes the low-energy electromagnetic structure of the nucleon within the Lagrangian of the theory. The study is restricted to medium- and heavy-mass nuclei because the charge densities are computed in the mean-field approach. Because the experimental analysis of scattering data commonly involves parameterized charge densities, as a surrogate exercise for the as yet unexplored exotic nuclei we fit our calculated mean-field densities with Helm model distributions. This procedure turns out to be helpful for studying the neutron-number variation of the scattering observables and allows us to identify correlations of potential interest among some of these observables within the isotopic chains.
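The Helm distribution used in such fits has a closed-form form factor, F(q) = 3 j1(qR0)/(qR0) * exp(-(q*sigma)^2/2), with a diffraction radius R0 and surface-smearing width sigma. A minimal sketch of its evaluation (the parameter values below are illustrative placeholders, not fitted values from the paper):

```python
import numpy as np

def helm_form_factor(q, R0, sigma):
    """Helm-model form factor F(q) = 3*j1(q*R0)/(q*R0) * exp(-(q*sigma)**2 / 2),
    with j1 the spherical Bessel function of order 1."""
    x = q * R0
    j1 = np.sin(x) / x**2 - np.cos(x) / x   # spherical Bessel j1
    return 3.0 * j1 / x * np.exp(-0.5 * (q * sigma)**2)

# illustrative parameters (fm): diffraction radius R0 and smearing sigma
F_low_q = helm_form_factor(1e-4, 5.0, 1.0)  # tends to 1 as q -> 0 (normalization)
q_min = 4.493409 / 5.0   # first zero of j1 sets the first diffraction minimum
```

The positions of the diffraction minima and the fall-off of the envelope are the features through which R0 and sigma track the neutron-number variation of the charge density.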
Abstract:
We extend Jackson and Watts's (2002) result on the coincidence of S-stochastically stable and core stable networks from marriage problems to roommate problems. In particular, we show that the existence of a side-optimal core stable network, on which the proof of Jackson and Watts (2002) hinges, is not crucial for their result.
Abstract:
Scarcities of environmental services are no longer merely a remote hypothesis. Consequently, the analysis of their inequalities between nations becomes of paramount importance for the achievement of sustainability, in terms either of international policy or of universalist ethical principles of equity. This paper aims, on the one hand, to revise methodological aspects of inequality measurement for certain environmental data and, on the other, to extend the scarce empirical evidence on the international distribution of the Ecological Footprint (EF) by using a longer EF time series. Most of the techniques currently prominent in the literature are revised and then tested on EF data, with interesting results. We look in depth at Lorenz dominance analyses and consider the underlying properties of different inequality indices. The indices that best fit environmental inequality measurement are CV² and GE(2) because of their neutrality property; however, a trade-off may occur when subgroup decompositions are performed. A weighting-factor decomposition method is proposed in order to isolate weighting-factor changes in inequality growth rates. Finally, the only non-ambiguous way of decomposing inequality by source is the natural decomposition of CV², which additionally allows the interpretation of marginal term contributions. Empirically, this paper contributes to the environmental inequality measurement of the EF: this inequality has been quite stable, and its change over time is due to per capita vector changes rather than population changes. Almost the entirety of the EF inequality is explainable by differences in the means between the World Bank country groups. This finding suggests that international environmental agreements should be pursued on a regional basis so as to achieve greater consensus between the parties involved.
Additionally, source decomposition warns of the dangers of confining CO2 emissions reduction to crop-based energies because of the implications for basic needs satisfaction.
Keywords: ecological footprint; ecological inequality measurement; inequality decomposition.
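The natural decomposition of CV² by source rests on the identity CV² = Var(X)/μ² with X = Σ_j X_j, so that each source j contributes cov(X_j, X)/μ² and the contributions add up exactly. A minimal numerical sketch on hypothetical data (not the paper's EF figures):

```python
import numpy as np

# hypothetical per capita footprint data for four countries, two sources
crop = np.array([1.2, 0.8, 2.0, 1.5])
energy = np.array([0.5, 2.5, 1.0, 3.0])
total = crop + energy

mu = total.mean()
cv2 = total.var() / mu**2              # squared coefficient of variation

# natural decomposition: source j contributes cov(X_j, X) / mu^2
shares = {name: np.cov(x, total, bias=True)[0, 1] / mu**2
          for name, x in [("crop", crop), ("energy", energy)]}
# the source contributions sum exactly to CV^2
```

This additivity (which fails for most other indices) is why the text singles out CV² as the only non-ambiguous source decomposition.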
Abstract:
Drawing on four waves of the European Community Household Panel, this research explores the association between temporary employment and the likelihood of being over-educated. Such an association has been largely ignored by the literature explaining over-education, which is more inclined to attribute such a mismatch to the education system. Selecting three similarly standardised and stratified education systems (France, Italy and Spain), and controlling for many other variables likely to affect over-education, such as gender, age, tenure, job change, firm size or sector, the paper demonstrates that such an association between temporary employment and over-education exists. As a stepping stone towards a more stable and better-adjusted position in the labour market, holding temporary employment may be associated with a higher likelihood of being over-educated; this association is more likely in Italy and France. Yet the opposite sign prevails where permanent employment becomes such a valuable asset that individuals trade human capital for employment security. This is the case in Spain.
Abstract:
We propose a criterion for the validity of semiclassical gravity (SCG) which is based on the stability of the solutions of SCG with respect to quantum metric fluctuations. We pay special attention to the two-point quantum correlation functions for the metric perturbations, which contain both intrinsic and induced fluctuations. These fluctuations can be described by the Einstein-Langevin equation obtained in the framework of stochastic gravity. Specifically, the Einstein-Langevin equation yields stochastic correlation functions for the metric perturbations which agree, to leading order in the large N limit, with the quantum correlation functions of the theory of gravity interacting with N matter fields. The homogeneous solutions of the Einstein-Langevin equation are equivalent to the solutions of the perturbed semiclassical equation, which describe the evolution of the expectation value of the quantum metric perturbations. The information on the intrinsic fluctuations, which are connected to the initial fluctuations of the metric perturbations, can also be retrieved entirely from the homogeneous solutions. However, the induced metric fluctuations proportional to the noise kernel can only be obtained from the Einstein-Langevin equation (the inhomogeneous term). These equations exhibit runaway solutions with exponential instabilities. A detailed discussion about different methods to deal with these instabilities is given. We illustrate our criterion by showing explicitly that flat space is stable and a description based on SCG is a valid approximation in that case.
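Schematically, and following the standard stochastic-gravity formulation (conventions and numerical factors may differ between references), the Einstein-Langevin equation for the metric perturbation $h_{ab}$ reads

\[
G^{(1)}_{ab}[h](x) \;=\; 8\pi G \left( \langle \hat{T}^{(1)}_{ab}[h](x) \rangle + 2\,\xi_{ab}(x) \right),
\qquad
\langle \xi_{ab}(x)\,\xi_{cd}(y) \rangle \;=\; N_{abcd}(x,y),
\]

where $N_{abcd}$ is the noise kernel. Setting $\xi_{ab}=0$ recovers the perturbed semiclassical equation, whose homogeneous solutions carry the intrinsic fluctuations, while the inhomogeneous (stochastic) term generates the induced fluctuations discussed above.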
Abstract:
Triheptanoin-enriched diets have been successfully used in the experimental treatment of various metabolic disorders. The maximal therapeutic effect is achieved in the context of a ketogenic diet in which triheptanoin oil provides 30-40% of the daily caloric intake. However, pre-clinical studies using triheptanoin-rich diets are hindered by the difficulty of administering them to laboratory animals as a solid foodstuff. In the present study, we successfully synthesized triheptanoin to the highest standards of purity from glycerol and heptanoic acid, using sulfonated charcoal as a catalyst. Triheptanoin oil was then formulated as a solid, stable and palatable preparation using a ketogenic base and a combination of four commercially available formulation agents: hydrophilic fumed silica, hydrophobic fumed silica, microcrystalline cellulose, and talc. Diet compliance and safety were tested on C57Bl/6 mice over a 15-week period by comparing overall status and body-weight change. Practical applications: This work provides a complete description of (i) an efficient and cost-effective synthesis of triheptanoin and (ii) its formulation as a solid, stable, and palatable ketogenic diet (triheptanoin-rich; 39% of the caloric intake) for rodents. Triheptanoin-rich diets will be helpful in pre-clinical experiments testing the therapeutic efficacy of triheptanoin in different rodent models of human diseases. In addition, using the same solidification procedure, other oils could be incorporated into rodent ketogenic diets to study their dosage and long-term effects on mammalian health and development. This approach could be extremely valuable, as the ketogenic diet is widely used clinically for epilepsy treatment.
Abstract:
The deuteric alteration processes undergone by the granites of the Ricobayo Batholith were microclinization, chloritization, albitization, muscovitization, tourmalinization and garnetization. These processes must be interpreted in a dynamic context, so that the different reactions that take place are the consequence of successive interaction between rock and fluids. The physicochemical conditions deduced for these fluids are: temperature lower than 600 °C; pressure between 1.5 and 1 kbar; oxygen fugacity between 10⁻²⁵ and 10⁻³⁵ bar; sulphur fugacity lower than 10⁻¹⁵ bar. The fluid composition remained stable: log(a(K+)/a(H+)) varied between 3.8 and 3.2, and log(a(Na+)/a(H+)) between 3.5 and 4.6. The pH of the fluids was higher than 5 during microclinization, muscovitization and tourmalinization, and lower during chloritization and albitization. Cassiterite deposition occurred during episodes in which the pH exceeded 5.
Abstract:
We describe here the construction of a delivery system for the stable and directed insertion of gene constructs into a permissive chromosomal site of the bacterial wilt pathogen Ralstonia solanacearum. The system consists of a collection of suicide vectors, the Ralstonia chromosome (pRC) series, that carry an integration element flanked by transcription terminators and by two sequences homologous to the chromosome of strain GMI1000, where the integration element is inserted through a double recombination event. Unique restriction enzyme sites and a GATEWAY cassette enable cloning of any promoter::gene combination in the integration element. Variants endowed with different selectable antibiotic resistance genes and promoter::gene combinations are described. We show that the system can be readily used in GMI1000 and adapted to other R. solanacearum strains using an accessory plasmid. We prove that the pRC system can be employed to complement a deletion mutation with a single copy of the native gene, and to measure transcription of selected promoters in single copy both in vitro and in planta. Finally, the system has been used to purify and study secreted type III effectors. These novel genetic tools will be particularly useful for the construction of recombinant bacteria that maintain inserted genes or reporter fusions in competitive situations (i.e., during plant infection).