933 results for Set of Weak Stationary Dynamic Actions
Abstract:
Agricultural activities exert different pressures on natural resources. In some areas this has led to soil deterioration that affects the sustainability of agricultural systems. Lists of indicators have been proposed to evaluate soil degradation; however, a robust methodological tool adapted to regional soil and climatic conditions is lacking. In addition, there is demand from farmers and institutions interested in guiding actions to preserve the soil. The objective of this project is to evaluate the physical, chemical and biological degradation of soils in agroecosystems of south-central Córdoba. To that end, we propose to develop a methodological tool consisting of a set of physical, chemical and biological indicators with threshold values, integrated into degradation indices, to assist decision makers and farmers in decisions concerning soil degradation. The study area will be an agricultural region of south-central Córdoba with more than 100 years of agriculture. The methodology begins with the characterization of land use and management systems, their classification, and the production of base maps of uses and managements by means of remote sensing and surveys. Sampling sites will be selected through a semi-directed methodology using a GIS, ensuring at least one sampling point per mapping unit. Reference sites will be chosen as close as possible to a natural condition. The indicators to be evaluated come from lists proposed in previous work by the group, selected on the basis of international criteria and their suitability for regional soils. Core and complementary indicators will be used. To obtain thresholds, we will use values from the literature on the one hand and, on the other, thresholds generated from the statistical distribution of each indicator in reference soils.
To standardize each indicator, a transformation function will be defined. Indicators will then be weighted by means of multivariate statistical analyses and integrated into indices of physical, chemical and biological degradation, and an overall degradation index. The approach will conclude with the development of two decision-support instruments: one at the regional scale, consisting of degradation maps based on environmental, land-use and management-system mapping units, and one at the farm scale, reporting on the soil degradation of a particular plot in comparison with reference soils. Stakeholders will thus have robust tools for decision making on soil degradation at both regional and local scales.
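The standardize-weight-integrate pipeline described above can be sketched in a few lines. This is a minimal illustration only: the indicator names, threshold values and weights below are invented placeholders, and in the actual methodology the weights would come from the multivariate analysis rather than being fixed by hand.

```python
# Hypothetical sketch of the indicator-to-index pipeline. Thresholds,
# weights, and indicator names are illustrative, not the project's choices.

def standardize(value, lower, upper, more_is_better=True):
    """Linear transformation mapping an indicator onto a 0-1 scale,
    clipped at the threshold values."""
    score = (value - lower) / (upper - lower)
    score = max(0.0, min(1.0, score))
    return score if more_is_better else 1.0 - score

def degradation_index(scores, weights):
    """Weighted additive index; in the actual methodology the weights
    would come from a multivariate (e.g. PCA) analysis."""
    total = sum(weights.values())
    return sum(scores[k] * weights[k] / total for k in scores)

# Illustrative indicators for one plot vs. assumed reference thresholds
scores = {
    "organic_carbon": standardize(1.8, lower=1.0, upper=3.0),    # %
    "bulk_density": standardize(1.35, lower=1.1, upper=1.6, more_is_better=False),
    "microbial_biomass": standardize(250, lower=100, upper=600), # mg C/kg
}
weights = {"organic_carbon": 0.45, "bulk_density": 0.30, "microbial_biomass": 0.25}
print(round(degradation_index(scores, weights), 3))   # → 0.405
```

A "more is better" indicator such as organic carbon maps linearly onto 0-1 between its thresholds; for "less is better" indicators such as bulk density, the scale is inverted before weighting.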
Abstract:
Information on the breeding biology of birds is essential for improving avian life-history theory and implementing sound management and conservation actions for these organisms. Comprehensive reviews of this kind of information are lacking for most Neotropical regions, including Rio Grande do Sul, the southernmost Brazilian state. Aiming to update the knowledge on the reproductive status of birds in Rio Grande do Sul, we reviewed breeding records of all potential breeding species recorded in the state using a set of predefined, restrictive criteria for accepting breeding evidence as effective. Data satisfying our criteria were available for 165 species in the literature. We also collected novel breeding information obtained in the state for an additional 126 species, including observations for several species whose reproductive biology is poorly known. Among these are birds previously unknown to breed in Brazil. These new data and the critical review of the previous information resulted in a total of 291 species for which breeding evidence is accepted as effective. This corresponds to 54.7% of the 532 species considered either confirmed or potential breeders in the state. In addition to providing information on nesting dates, clutch size, nest architecture and breeding behavior of south Brazilian birds, our review serves as a benchmark for the adequate assessment of avian breeding records elsewhere. We hope to stimulate observers to rigorously document breeding events, especially for taxa for which basic information is lacking.
Abstract:
Existing empirical evidence suggests that the Uncovered Interest Rate Parity (UIRP) condition may not hold due to an exchange risk premium. For a panel data set of eleven emerging European economies, we decompose this exchange risk premium into an idiosyncratic (country-specific) element and a common factor using a principal components approach. We present evidence of a stationary idiosyncratic component and a nonstationary common factor. This result leads to the conclusion of a nonstationary risk premium for these countries and a violation of the UIRP in the long run, in contrast to previous studies, which often document a stationary premium in developed countries. Furthermore, we report that the variation in the premium is largely attributable to a common factor influenced by economic developments in the United States.
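The common-factor/idiosyncratic decomposition described above can be sketched with a principal-components split of a synthetic panel: a nonstationary common factor (a random walk) loads on every country series, stationary noise plays the idiosyncratic role, and the first principal component recovers the common factor. All dimensions and data here are illustrative, not the paper's.

```python
import numpy as np

# Synthetic panel: nonstationary common factor + stationary idiosyncratic noise
rng = np.random.default_rng(0)
T, N = 200, 11                             # time periods x countries
common = np.cumsum(rng.normal(size=T))     # random walk: nonstationary factor
loadings = rng.uniform(0.5, 1.5, size=N)   # country-specific factor loadings
idiosyncratic = rng.normal(size=(T, N))    # stationary country-specific noise
panel = np.outer(common, loadings) + idiosyncratic

# Principal components via SVD of the demeaned panel
X = panel - panel.mean(axis=0)
U, S, Vt = np.linalg.svd(X, full_matrices=False)
factor = U[:, 0] * S[0]                    # estimated common factor (up to sign/scale)
residual = X - np.outer(factor, Vt[0])     # estimated idiosyncratic components

share = S[0] ** 2 / (S ** 2).sum()
print(f"variance explained by common factor: {share:.2f}")
```

In the paper's setting, stationarity tests would then be applied separately to the estimated common factor and the residuals.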
Abstract:
In this paper, I look at the interaction between social learning and cooperative behavior. I model this using a social dilemma game with publicly observed sequential actions and asymmetric information about payoffs. I find that some informed agents in this model act, individually and without collusion, to conceal the privately optimal action. Because the privately optimal action is socially costly, the behavior of informed agents can lead to a Pareto improvement in a social dilemma. In my model I show that it is possible to get cooperative behavior if information is restricted to a small but non-zero proportion of the population. Moreover, such cooperative behavior occurs in a finite setting where it is public knowledge which agent will act last. The proportion of cooperative agents within the population can be made arbitrarily close to 1 by increasing the finite number of agents playing the game. Finally, I show that, under a broad set of conditions, it is a Pareto improvement on a corner value, in the ex-ante welfare sense, for an interior proportion of the population to be informed.
Abstract:
An active, solvent-free solid sampler was developed for the collection of 1,6-hexamethylene diisocyanate (HDI) aerosol and prepolymers. The sampler was made of a filter impregnated with 1-(2-methoxyphenyl)piperazine contained in a filter holder. Interferences with HDI were observed when a set of cellulose acetate filters and a polystyrene filter holder were used; a glass fiber filter and a polypropylene filter cassette gave better results. The applicability of the sampling and analytical procedure was validated with a test chamber constructed for the dynamic generation of HDI aerosol and prepolymers in commercial two-component spray paints (Desmodur® N75) used in car refinishing. The particle size distribution, temporal stability, and spatial uniformity of the simulated aerosol were established in order to test the sampler. The monitoring of aerosol concentrations was conducted with the solid sampler paired with the reference impinger technique (impinger flasks contained 10 mL of 0.5 mg/mL 1-(2-methoxyphenyl)piperazine in toluene) under a controlled atmosphere in the test chamber. Analyses of derivatized HDI and prepolymers were carried out by high-performance liquid chromatography with ultraviolet detection. The correlation between the solvent-free and the impinger techniques was fairly good (Y = 0.979X − 0.161; R = 0.978) when the tests were conducted in the range of 0.1 to 10 times the threshold limit value (TLV) for HDI monomer and up to 60 μg/m³ (3 U.K. TLVs) for total –N=C=O groups.
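The reported comparison (Y = 0.979X − 0.161; R = 0.978) is an ordinary least-squares fit between paired measurements from the two techniques. A sketch of that computation on synthetic paired data, not the study's measurements:

```python
import numpy as np

# Synthetic paired measurements: the reference impinger (X) vs. the
# solvent-free sampler (Y), with noise added. Values are made up.
rng = np.random.default_rng(1)
impinger = rng.uniform(0.1, 10.0, size=30)                     # reference concentrations
sampler = 0.98 * impinger - 0.15 + rng.normal(0, 0.2, size=30)

# Least-squares line Y = slope*X + intercept and Pearson correlation
slope, intercept = np.polyfit(impinger, sampler, 1)
r = np.corrcoef(impinger, sampler)[0, 1]
print(f"Y = {slope:.3f}X {intercept:+.3f}, R = {r:.3f}")
```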
Abstract:
STAT transcription factors are expressed in many cell types and bind to similar sequences. However, different STAT gene knock-outs show very distinct phenotypes. To determine whether differences between the binding specificities of STAT proteins account for these effects, we compared the sequences bound by STAT1, STAT5A, STAT5B, and STAT6. One sequence set was selected from random oligonucleotides by recombinant STAT1, STAT5A, or STAT6. For another set including many weak binding sites, we quantified the relative affinities to STAT1, STAT5A, STAT5B, and STAT6. We compared the results to the binding sites in natural STAT target genes identified by others. The experiments confirmed the similar specificity of different STAT proteins. Detailed analysis indicated that STAT5A specificity is more similar to that of STAT6 than to that of STAT1, as expected from the evolutionary relationships. The preference of STAT6 for sites in which the half-palindromes (TTC) are separated by four nucleotides (N(4)) was confirmed, but analysis of weak binding sites showed that STAT6 binds fairly well to N(3) sites. As previously reported, STAT1 and STAT5 prefer N(3) sites; however, STAT5A, but not STAT1, weakly binds N(4) sites. None of the STATs bound to half-palindromes. There were no specificity differences between STAT5A and STAT5B.
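The N(3)/N(4) nomenclature above refers to the number of nucleotides separating the TTC half-palindrome from its GAA complement. A small scan over a made-up sequence illustrates how such sites can be counted:

```python
import re

# Count TTC-N(n)-GAA sites for a given spacer length n.
def count_sites(seq, spacer):
    # lookahead so overlapping sites are all counted
    return len(re.findall(rf"(?=TTC[ACGT]{{{spacer}}}GAA)", seq))

# Made-up sequence containing one N(3) site and two N(4) sites
seq = "ACGTTTCGGGGAACGTTCTAGAGAATTCAAAAGAAC"
for n in (3, 4):
    print(f"N({n}) sites:", count_sites(seq, n))
```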
Abstract:
ICEclc is a mobile genetic element found in two copies on the chromosome of the bacterium Pseudomonas knackmussii B13. ICEclc harbors genes encoding metabolic pathways for the degradation of chlorocatechols (CLC) and 2-aminophenol (2AP). At low frequencies, ICEclc excises from the chromosome and closes into a circular DNA molecule, which can transfer to another bacterium via conjugation. Once in the recipient cell, ICEclc can reintegrate into the chromosome by site-specific recombination. This thesis aimed at identifying the regulatory network underlying the decisions for ICEclc horizontal gene transfer (HGT). The first chapter is an introduction to integrative and conjugative elements (ICEs) in general, of which ICEclc is one example. In particular, I emphasize the current knowledge of the regulation and conjugation machineries of the different classes of ICE. In the second chapter, I describe a transcriptional analysis using microarrays and other experiments to understand expression of ICEclc in exponential and stationary phase. By overlaying transcriptomic profiles with Northern hybridizations and RT-PCR data, we established a transcription map for the entire core region of ICEclc, a region assumed to encode the ICE conjugation process. We also demonstrated that transcription of the ICEclc core is maximal in stationary phase, which correlates with expression of reporter genes fused to key ICEclc promoters. In the third chapter, I present a transcriptome analysis of ICEclc in a variety of different host species, in order to explore whether there are species-specific differences. In the fourth chapter, I focus on the role of a curious ICEclc-encoded TetR-type transcriptional repressor. We find that this gene, which we name mfsR, not only controls its own expression but also that of a set of genes for a putative multi-drug efflux pump (mfsABC).
By using a combination of biochemical and molecular biology techniques, I could show that MfsR specifically binds to operator boxes in two ICEclc promoters (PmfsR and PmfsA), inhibiting the transcription of both the mfsR and mfsABC-orf38184 operons. Although we could not detect a clear phenotype of an mfsABC deletion, we discuss the implications of pump gene reorganizations in ICEclc and close relatives. In the fifth chapter, we find that mfsR not only controls its own expression and that of the mfsABC operon, but also indirectly controls ICEclc transfer. Using gene deletions, microarrays, transfer assays and microscopy-based reporter fusions, we demonstrate that mfsR actually controls a small operon of three regulatory genes. The last gene of this mfsR operon, orf17162, encodes a LysR-type activator that, when deleted, strongly impairs ICEclc transfer. Interestingly, deletion of mfsR leads to transfer competence in almost all cells, thereby overruling the bistability process in the wild-type. In the sixth and final chapter, I discuss the relevance of the present thesis and the resulting perspectives for future studies.
Abstract:
The weak selection approximation of population genetics has made possible the analysis of social evolution under a considerable variety of biological scenarios. Despite its extensive usage, the accuracy of weak selection in predicting the emergence of altruism under limited dispersal when selection intensity increases remains unclear. Here, we derive the condition for the spread of an altruistic mutant in the infinite island model of dispersal under a Moran reproductive process and arbitrary strength of selection. The simplicity of the model allows us to compare weak and strong selection regimes analytically. Our results demonstrate that the weak selection approximation is robust to moderate increases in selection intensity and therefore provides a good approximation for understanding the invasion of altruism in spatially structured populations. In particular, we find that the weak selection approximation is excellent even if selection is very strong, when either migration is much stronger than selection or patches are large. Importantly, we emphasize that the weak selection approximation provides the ideal condition for the invasion of altruism, and increasing selection intensity will impede the emergence of altruism. We discuss why this should also hold for more complicated life cycles and for culturally transmitted altruism. Using the weak selection approximation is therefore unlikely to miss any demographic scenario that leads to the evolution of altruism under limited dispersal.
Abstract:
Background: Hirschsprung disease is characterized by the absence of intramural ganglion cells in the enteric plexuses, due to a failure during enteric nervous system formation. Hirschsprung disease has a complex genetic aetiology, and mutations in several genes have been related to the disease. There is a clear predominance of missense/nonsense mutations in these genes, whereas copy number variations (CNVs) have seldom been described, probably due to the limitations of the conventional techniques usually employed for mutational analysis. In this study, we have looked for CNVs in some of the genes related to Hirschsprung disease (EDNRB, GFRA1, NRTN and PHOX2B) using the Multiplex Ligation-dependent Probe Amplification (MLPA) approach. Methods: CNV screening was performed in 208 HSCR patients using a self-designed set of MLPA probes covering the coding regions of those genes. Results: A deletion comprising the first 4 exons of the GFRA1 gene was detected in 2 sporadic HSCR patients, and in silico approaches showed that the critical translation initiation signal in the mutant gene was abolished. In this study, we have been able to validate the reliability of this technique for CNV screening in HSCR. Conclusions: The MLPA-based technique presented here allows CNV analysis of genes involved in HSCR that had not previously been evaluated. Our results indicate that CNVs could be implicated in the pathogenesis of HSCR, although they seem to be an uncommon molecular cause of HSCR.
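A sketch of the dosage logic behind MLPA-based CNV detection: each test probe's peak is normalized against reference probes (outside the tested genes) and compared with the same quantity in control samples; a ratio near 0.5 suggests a heterozygous deletion. The probe names, peak values and the conventional 0.7/1.3 thresholds below are placeholders, not the study's data.

```python
# Hypothetical MLPA dosage analysis; all names and numbers are illustrative.

def dosage_ratio(sample, controls, probe, ref_probes):
    """Test-probe signal normalized to reference probes, relative to the
    mean of the same quantity across control samples."""
    norm = lambda peaks: peaks[probe] / sum(peaks[r] for r in ref_probes)
    ctrl_mean = sum(norm(c) for c in controls) / len(controls)
    return norm(sample) / ctrl_mean

def call(ratio):
    if ratio < 0.7:
        return "deletion"
    if ratio > 1.3:
        return "duplication"
    return "normal"

refs = ["REF_1", "REF_2"]
sample = {"GFRA1_ex1": 500, "REF_1": 1000, "REF_2": 1020}
controls = [{"GFRA1_ex1": 1010, "REF_1": 1000, "REF_2": 1000},
            {"GFRA1_ex1": 990, "REF_1": 995, "REF_2": 1005}]
r = dosage_ratio(sample, controls, "GFRA1_ex1", refs)
print(call(r))   # a ratio near 0.5 indicates a heterozygous deletion
```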
Abstract:
The introduction of engineered nanostructured materials into a rapidly increasing number of industrial and consumer products will result in enhanced exposure to engineered nanoparticles. Workplace exposure has been identified as the most likely source of uncontrolled inhalation of engineered aerosolized nanoparticles, but release of engineered nanoparticles may occur at any stage of the lifecycle of (consumer) products. The dynamic development of nanomaterials with possibly unknown toxicological effects poses a challenge for the assessment of nanoparticle-induced toxicity and safety. In this consensus document from a workshop on in-vitro cell systems for nanoparticle toxicity testing (Workshop on 'In-Vitro Exposure Studies for Toxicity Testing of Engineered Nanoparticles', sponsored by the Association for Aerosol Research (GAeF), 5-6 September 2009, Karlsruhe, Germany), an overview is given of the main issues concerning exposure to airborne nanoparticles, lung physiology, biological mechanisms of (adverse) action, in-vitro cell exposure systems, realistic tissue doses, risk assessment and social aspects of nanotechnology. The workshop participants recognized the large potential of in-vitro cell exposure systems for reliable, high-throughput screening of nanoparticle toxicity. For the investigation of lung toxicity, a strong preference was expressed for air-liquid interface (ALI) cell exposure systems (rather than submerged cell exposure systems) as they more closely resemble in-vivo conditions in the lungs and they allow for unaltered and dosimetrically accurate delivery of aerosolized nanoparticles to the cells. An important aspect, which is frequently overlooked, is the comparison of typically used in-vitro dose levels with realistic in-vivo nanoparticle doses in the lung.
If we consider average ambient urban exposure and occupational exposure at 5 mg/m³ (the maximum level allowed by the Occupational Safety and Health Administration (OSHA)) as the boundaries of human exposure, the corresponding upper-limit range of nanoparticle flux delivered to the lung tissue is 3×10⁻⁵–5×10⁻³ μg/h/cm² of lung tissue and 2–300 particles/h per (epithelial) cell. This range can easily be matched and even exceeded by almost all currently available cell exposure systems. The consensus statement includes a set of recommendations for conducting in-vitro cell exposure studies with pulmonary cell systems and identifies urgent needs for future development. As these issues are crucial for the introduction of safe nanomaterials into the marketplace and the living environment, they deserve more attention and more interaction between biologists and aerosol scientists. The members of the workshop believe that further advances in in-vitro cell exposure studies would be greatly facilitated by a more active role of the aerosol scientists. The technical know-how for developing and running ALI in-vitro exposure systems is available in the aerosol community, and at the same time biologists/toxicologists are required for proper assessment of the biological impact of nanoparticles.
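The order of magnitude of the upper-limit flux quoted above can be reproduced with back-of-envelope arithmetic. The breathing rate, deposition fraction and alveolar surface area used here are assumed textbook-order values, not the workshop's exact parameters:

```python
# Back-of-envelope reconstruction of the upper-bound flux estimate.
# All parameters below the concentration are assumptions for illustration.

concentration = 5_000.0      # ug/m^3  (OSHA limit of 5 mg/m^3)
breathing_rate = 1.0         # m^3/h   (light activity, assumed)
deposition_fraction = 0.3    # fraction of inhaled particles deposited (assumed)
alveolar_area_cm2 = 1.0e6    # ~100 m^2 of alveolar surface (assumed)

flux = concentration * breathing_rate * deposition_fraction / alveolar_area_cm2
print(f"{flux:.1e} ug/h/cm^2")   # 1.5e-03, inside the quoted 3e-5 to 5e-3 range
```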
Abstract:
Gene-on-gene regulations are key components of every living organism. Dynamical abstract models of genetic regulatory networks help explain the genome's evolvability and robustness. These properties can be attributed to the structural topology of the graph formed by genes, as vertices, and regulatory interactions, as edges. Moreover, the actual interactions of each gene are believed to play a key role in the stability of the structure. With advances in biology, efforts have been made to develop update functions for Boolean models that incorporate recent knowledge. We combine real-life gene interaction networks with novel update functions in a Boolean model. We use two sub-networks of biological organisms, the yeast cell cycle and the mouse embryonic stem cell, as topological support for our system. On these structures, we substitute the original random update functions with a novel threshold-based dynamic function in which the promoting and repressing effect of each interaction is considered. We use a third real-life regulatory network, along with its inferred Boolean update functions, to validate the proposed update function. Results of this validation hint at increased biological plausibility of the threshold-based function. To investigate the dynamical behavior of this new model, we visualized the phase transition between order and chaos into the critical regime using Derrida plots. We complement the qualitative nature of Derrida plots with an alternative measure, the criticality distance, which also allows discrimination between regimes in a quantitative way. Simulations on both real-life genetic regulatory networks show that there exists a set of parameters that allows the systems to operate in the critical region. This new model includes experimentally derived biological information and recent discoveries, which makes it potentially useful to guide experimental research.
The update function confers additional realism to the model, while reducing the complexity and solution space, thus making it easier to investigate.
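A threshold-based Boolean update of the kind described above can be sketched as follows: each regulatory edge contributes +1 (activation) or −1 (repression), and a gene switches on when its summed input exceeds a threshold, off when below it, and keeps its state at the threshold. The three-gene network here is a toy example, not one of the paper's networks.

```python
# Minimal threshold-based Boolean update; network and threshold are toys.

def threshold_update(state, edges, threshold=0):
    """edges: dict target -> list of (source, sign) pairs, sign in {+1, -1}."""
    new_state = {}
    for gene, inputs in edges.items():
        total = sum(sign * state[src] for src, sign in inputs)
        if total > threshold:
            new_state[gene] = 1
        elif total < threshold:
            new_state[gene] = 0
        else:
            new_state[gene] = state[gene]   # no net input: keep current state
    return new_state

edges = {
    "A": [("C", +1)],              # C activates A
    "B": [("A", +1), ("C", -1)],   # A activates B, C represses B
    "C": [("B", +1)],              # B activates C
}
state = {"A": 1, "B": 0, "C": 0}
for _ in range(3):                 # synchronous updates
    state = threshold_update(state, edges)
print(state)   # → {'A': 1, 'B': 1, 'C': 1}
```

A Derrida plot would be built by perturbing such states, updating both trajectories one step, and comparing the resulting Hamming distances.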
Abstract:
We construct a dynamic theory of civil conflict hinging on inter-ethnic trust and trade. The model economy is inhabited by two ethnic groups. Inter-ethnic trade requires imperfectly observed bilateral investments, and one group has to form beliefs about the average propensity to trade of the other group. Since conflict disrupts trade, the onset of a conflict signals that the aggressor has a low propensity to trade. Agents observe the history of conflicts and update their beliefs over time, transmitting them to the next generation. The theory bears a set of testable predictions. First, war is a stochastic process whose frequency depends on the state of endogenous beliefs. Second, the probability of future conflicts increases after each conflict episode. Third, "accidental" conflicts that do not reflect economic fundamentals can lead to a permanent breakdown of trust, plunging a society into a vicious cycle of recurrent conflicts (a war trap). The incidence of conflict can be reduced by policies abating cultural barriers, fostering inter-ethnic trade and human capital, and shifting beliefs. Coercive peace policies such as peacekeeping forces or externally imposed regime changes instead have no persistent effects.
Abstract:
Nessie is an Autonomous Underwater Vehicle (AUV) created by a team of students at Heriot-Watt University to compete in the Student Autonomous Underwater Competition, Europe (SAUC-E) in August 2006. The main objective of this project is to obtain the robot's dynamic equations, that is, its dynamic model. With it, the behaviour of the robot is easier to understand, and movement tests can be run on a computer without using the robot itself, which saves time, batteries and money, and protects the robot from water leaking inside it. The objective of the second part of this project is to design a control system for Nessie using the model.
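As a toy illustration of what a dynamic model buys you, a one-degree-of-freedom surge model m·dv/dt = thrust − d·v·|v| can be integrated on a computer to predict the vehicle's steady-state speed without putting it in the water. The mass and drag values below are invented, not Nessie's identified parameters.

```python
# Toy 1-DOF surge simulation: m*dv/dt = thrust - d*v*|v| (Euler integration).
# Parameter values are assumptions for illustration only.

m, d = 35.0, 20.0        # mass [kg], quadratic drag coefficient [kg/m]
thrust = 10.0            # constant thruster force [N]
dt, v = 0.01, 0.0        # time step [s], initial surge speed [m/s]
for _ in range(5000):    # 50 s of simulated time
    v += dt * (thrust - d * v * abs(v)) / m
print(f"steady surge speed ~ {v:.3f} m/s")   # approaches sqrt(thrust/d) ≈ 0.707
```

The same simulated model can then serve as the plant when tuning a controller, which is the point of the project's second part.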
Abstract:
One of the first useful products from the human genome will be a set of predicted genes. Besides its intrinsic scientific interest, the accuracy and completeness of this data set is of considerable importance for human health and medicine. Though progress has been made on computational gene identification in terms of both methods and accuracy evaluation measures, most of the sequence sets in which the programs are tested are short genomic sequences, and there is concern that these accuracy measures may not extrapolate well to larger, more challenging data sets. Given the absence of experimentally verified large genomic data sets, we constructed a semiartificial test set comprising a number of short single-gene genomic sequences with randomly generated intergenic regions. This test set, which should still present an easier problem than real human genomic sequence, mimics the approximately 200 kb long BACs being sequenced. In our experiments with these longer genomic sequences, the accuracy of GENSCAN, one of the most accurate ab initio gene prediction programs, dropped significantly, although its sensitivity remained high. Conversely, the accuracy of similarity-based programs, such as GENEWISE, PROCRUSTES, and BLASTX was not affected significantly by the presence of random intergenic sequence, but depended on the strength of the similarity to the protein homolog. As expected, the accuracy dropped if the models were built using more distant homologs, and we were able to quantitatively estimate this decline. However, the specificities of these techniques are still rather good even when the similarity is weak, which is a desirable characteristic for driving expensive follow-up experiments.
Our experiments suggest that though gene prediction will improve with every new protein that is discovered and through improvements in the current set of tools, we still have a long way to go before we can decipher the precise exonic structure of every gene in the human genome using purely computational methodology.
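The semiartificial construction described above amounts to concatenating known single-gene sequences with randomly generated intergenic spacers. A sketch with toy gene sequences and placeholder length parameters (the ~41% GC content is an assumed rough human-genome figure):

```python
import random

# Sketch of the semiartificial test-set construction; genes and lengths
# are toy placeholders, not the study's sequences.

def random_intergenic(length, gc=0.41, rng=random.Random(42)):
    """Random spacer with an assumed human-like GC content."""
    weights = [(1 - gc) / 2, gc / 2, gc / 2, (1 - gc) / 2]   # A, C, G, T
    return "".join(rng.choices("ACGT", weights=weights, k=length))

def build_test_sequence(gene_seqs, spacer_len):
    """Alternate random spacers with known single-gene sequences."""
    parts = [random_intergenic(spacer_len)]
    for gene in gene_seqs:
        parts.append(gene)
        parts.append(random_intergenic(spacer_len))
    return "".join(parts)

genes = ["ATGGCCAAGTGA", "ATGTTTCCCTAA"]          # toy "genes", 12 nt each
seq = build_test_sequence(genes, spacer_len=50)
print(len(seq))   # → 174: two 12-nt genes plus three 50-nt spacers
```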
Abstract:
In a number of programs for gene structure prediction in higher eukaryotic genomic sequences, exon prediction is decoupled from gene assembly: a large pool of candidate exons is predicted and scored from features located in the query DNA sequence, and candidate genes are assembled from such a pool as sequences of nonoverlapping frame-compatible exons. Genes are scored as a function of the scores of the assembled exons, and the highest scoring candidate gene is assumed to be the most likely gene encoded by the query DNA sequence. Considering additive gene scoring functions, currently available algorithms to determine such a highest scoring candidate gene run in time proportional to the square of the number of predicted exons. Here, we present an algorithm whose running time grows only linearly with the size of the set of predicted exons. Polynomial algorithms rely on the fact that, while scanning the set of predicted exons, the highest scoring gene ending in a given exon can be obtained by appending the exon to the highest scoring among the highest scoring genes ending at each compatible preceding exon. The algorithm here relies on the simple fact that such a highest scoring gene can be stored and updated. This requires scanning the set of predicted exons simultaneously by increasing acceptor and donor position. On the other hand, the algorithm described here does not assume an underlying gene structure model. Indeed, the definition of valid gene structures is externally defined in the so-called Gene Model. The Gene Model simply specifies which gene features are allowed immediately upstream of which other gene features in valid gene structures. This allows for great flexibility in formulating the gene identification problem. In particular, it allows for multiple-gene two-strand predictions and for considering gene features other than coding exons (such as promoter elements) in valid gene structures.
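Stripped of frame compatibility and the Gene Model, the linear-time scan described above can be sketched as follows: exons are processed in order of acceptor (start) position while a second pointer advances over donor (end) positions, maintaining the best score among chains that are already "closed" and may therefore be extended. Everything here is a simplification for illustration, not the paper's implementation; compatibility is reduced to "starts after the previous exon ends".

```python
# Simplified linear-scan gene assembly: exons are (start, end, score),
# a gene is a chain of nonoverlapping exons, gene score is additive.

def best_gene_score(exons):
    by_start = sorted(exons, key=lambda e: e[0])   # acceptor order
    by_end = sorted(exons, key=lambda e: e[1])     # donor order
    best_ending = {}      # exon -> best score of a gene ending in it
    best_prefix = 0.0     # best score among exons already closed
    j = 0
    answer = 0.0
    for start, end, score in by_start:
        # Advance the donor pointer over exons ending before this acceptor:
        # chains ending in them can be extended by the current exon.
        while j < len(by_end) and by_end[j][1] < start:
            best_prefix = max(best_prefix, best_ending[by_end[j]])
            j += 1
        best_ending[(start, end, score)] = best_prefix + score
        answer = max(answer, best_prefix + score)
    return answer

exons = [(0, 100, 3.0), (50, 150, 5.0), (200, 300, 2.0), (320, 400, 4.0)]
print(best_gene_score(exons))   # → 11.0 (chain 5.0 + 2.0 + 4.0)
```

Apart from the two initial sorts, each exon is visited a constant number of times, which is the source of the linear growth the abstract describes.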