916 resultados para complexity in spatiotemporal evolution


Relevância:

100.00% 100.00%

Publicador:

Resumo:

Yeast successfully adapts to an environmental stress by altering physiology and fine-tuning metabolism. This fine-tuning is achieved through regulation of both gene expression and protein activity, and it is shaped by various physiological requirements. Such requirements impose a sustained evolutionary pressure that ultimately selects a specific gene expression profile, generating a suitable adaptive response to each environmental change. Although some of the requirements are stress specific, it is likely that others are common to various situations. We hypothesize that an evolutionary pressure for minimizing biosynthetic costs might have left signatures in the physicochemical properties of proteins whose gene expression is fine-tuned during adaptive responses. To test this hypothesis, we analyze existing yeast transcriptomic data for such responses and investigate how several properties of proteins correlate with changes in gene expression. Our results reveal signatures that are consistent with a selective pressure for economy in protein synthesis during the adaptive response of yeast to various types of stress. These signatures differentiate two groups of adaptive responses with respect to how cells manage expenditure in protein biosynthesis. In one group, significant trends towards downregulation of large proteins and upregulation of small ones are observed; in the other group, we find no such trends. These results are consistent with resource limitation being important in the evolution of the first group of stress responses.
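As a rough illustration of the kind of analysis the abstract describes (correlating a protein property with expression change), one can test whether protein size and regulation are rank-correlated. This is a toy sketch, not the study's actual pipeline; the protein lengths and log2 fold-changes below are made up:

```python
import math

def rank(values):
    """Assign average ranks (ties share the mean rank)."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    rx, ry = rank(x), rank(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = math.sqrt(sum((a - mx) ** 2 for a in rx))
    sy = math.sqrt(sum((b - my) ** 2 for b in ry))
    return cov / (sx * sy)

# Hypothetical data: protein lengths (amino acids) and log2 fold-changes
# under a stress; a negative correlation would mirror the "economy" trend
# (large proteins down, small proteins up) reported for one response group.
lengths = [120, 350, 800, 95, 1500, 410, 60, 980]
log2fc  = [0.8, 0.1, -1.2, 1.5, -2.0, -0.3, 1.1, -0.9]
rho = spearman(lengths, log2fc)   # ≈ -0.95 for this toy data
```

In practice one would compute this over thousands of genes per condition and assess significance, but the direction of the trend is read off the same way.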


Although tumor heterogeneity is widely accepted, the existence of cancer stem cells (CSCs) and their proposed role in tumor maintenance have always been challenged and remain a matter of debate. Recently, a path-breaking chapter was added to this saga when three independent groups reported the in vivo existence of CSCs in brain, skin and intestinal tumors using lineage tracing, thus strengthening the CSC concept, even though certain fundamental caveats are always associated with the lineage-tracing approach. In principle, the CSC hypothesis proposes that, like normal stem cells, CSCs maintain self-renewal and multilineage differentiation properties and are found at the apex of the cellular hierarchy present within tumors. However, these cells differ from their normal counterparts in retaining their malignant potential, altered genomic integrity, distinct epigenetic identity and the expression of specific surface protein profiles. As CSCs are highly resistant to chemotherapeutics, they are thought to be a crucial factor in tumor relapse and superficially appear to be the ultimate therapeutic target. However, even that is not the end of the story: further complication arises from reports of a bidirectional regeneration mechanism for CSCs, one based on their self-renewal capability and another on the recently proposed dynamic equilibrium between CSCs and non-CSCs via their interconversion. This phenomenon has added a new layer of complexity to our understanding of tumor heterogeneity. In spite of its associated controversies, this area has rapidly emerged as a center of attention for researchers and clinicians because of the conceptual framework it provides for devising new therapies.


We present the first density model of Stromboli volcano (Aeolian Islands, Italy) obtained by simultaneously inverting land-based (543) and sea-surface (327) relative gravity data. Modern positioning technology, a 1 x 1 m digital elevation model, and a 15 x 15 m bathymetric model made it possible to obtain a detailed 3-D density model through an iteratively reweighted smoothness-constrained least-squares inversion that explained the land-based gravity data to 0.09 mGal and the sea-surface data to 5 mGal. Our inverse formulation avoids introducing any assumptions about density magnitudes. At 125 m depth below the land surface, the inferred mean density of the island is 2380 kg m⁻³, with corresponding 2.5 and 97.5 percentiles of 2200 and 2530 kg m⁻³. This density range covers the rock densities of new and previously published samples of the Paleostromboli I, Vancori, Neostromboli and San Bartolo lava flows. High-density anomalies in the central and southern parts of the island can be related to two main degassing faults crossing the island (N41 and NM) that are interpreted as preferential regions of dyke intrusion. In addition, two low-density anomalies are found in the northeastern part and in the summit area of the island. These anomalies seem to be geographically related to past paroxysmal explosive phreatomagmatic events that have played important roles in the evolution of Stromboli Island by forming the Scari caldera and the Neostromboli crater, respectively.
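The full 3-D inversion is far beyond an abstract, but the core idea of an iteratively reweighted smoothness-constrained least-squares fit can be sketched in one dimension. This is a toy illustration under simplifying assumptions (identity forward operator, first-difference smoothness operator), not the authors' formulation:

```python
def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        p = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[p] = M[p], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def irls_smooth(d, lam=0.1, iters=10, eps=1e-6):
    """Iteratively reweighted smoothness-constrained least squares (1-D toy).

    Minimises |m - d|^2 + lam * sum_i w_i (m[i+1] - m[i])^2, resetting
    w_i = 1 / (|m[i+1] - m[i]| + eps) each iteration, so genuine sharp
    contrasts (e.g. a density jump) are penalised less than in a plain
    L2 smoothing scheme.
    """
    n = len(d)
    m = d[:]                          # start from the data themselves
    for _ in range(iters):
        w = [1.0 / (abs(m[i + 1] - m[i]) + eps) for i in range(n - 1)]
        # Normal equations: (I + lam * L^T W L) m = d, L = first differences
        A = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
        for i in range(n - 1):
            A[i][i]         += lam * w[i]
            A[i + 1][i + 1] += lam * w[i]
            A[i][i + 1]     -= lam * w[i]
            A[i + 1][i]     -= lam * w[i]
        m = solve(A, d)
    return m

d = [1.0, 1.0, 1.0, 3.0, 3.0]         # synthetic "data" with one sharp contrast
m = irls_smooth(d)                    # smooth within blocks, jump preserved
```

The reweighting is what lets the recovered model keep the sharp boundary that an ordinary smoothness constraint would blur away.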


During infection with human immunodeficiency virus (HIV), immune pressure from cytotoxic T-lymphocytes (CTLs) selects for viral mutants that escape CTL recognition. These escape variants can be transmitted between individuals where, depending upon their cost to viral fitness and the CTL responses made by the recipient, they may revert. The rates of within-host evolution and their consequent impact upon the rate of spread of escape mutants at the population level are uncertain. Here we present a mathematical model of the within-host evolution of escape mutants, the transmission of these variants between hosts and their subsequent reversion in new hosts. The model is an extension of the well-known SI model of disease transmission and includes three further parameters that describe host immunogenetic heterogeneity and the rates of within-host viral evolution. We use the model to explain why some escape mutants appear to have stable prevalence whilst others are spreading through the population. Further, we use it to compare diverse datasets on CTL escape, highlighting where different sources agree or disagree on within-host evolutionary rates. The several dozen CTL epitopes we survey from HIV-1 gag, RT and nef reveal a relatively sedate rate of evolution, with average rates of escape measured in years and reversion in decades. For many epitopes in HIV, occasional rapid within-host evolution is not reflected in fast evolution at the population level.
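The population-level intuition can be captured by a much-reduced sketch: let p be the fraction of infections carrying a given escape variant, f the fraction of hosts mounting the relevant CTL response, and phi and psi hypothetical per-year escape and reversion rates. This toy model is not the authors' full SI extension, but it shows how year-scale escape combined with decade-scale reversion still yields slow change in prevalence:

```python
def escape_prevalence(f, phi, psi, p0=0.0, dt=0.01, t_end=200.0):
    """Euler integration of dp/dt = f*phi*(1 - p) - (1 - f)*psi*p.

    f   : fraction of hosts with the restricting CTL response
    phi : within-host escape rate in responding hosts (per year)
    psi : reversion rate in non-responding hosts (per year)
    """
    p, t = p0, 0.0
    while t < t_end:
        p += dt * (f * phi * (1 - p) - (1 - f) * psi * p)
        t += dt
    return p

# Hypothetical values: escape on a ~2-year scale, reversion on a ~20-year
# scale, 20% of hosts carrying the restricting response.
f, phi, psi = 0.2, 0.5, 0.05
p_eq = f * phi / (f * phi + (1 - f) * psi)   # analytic equilibrium ≈ 0.71
p_sim = escape_prevalence(f, phi, psi)       # converges to the same value
```

The approach to equilibrium happens at rate f*phi + (1-f)*psi, so even with rapid within-host escape in some hosts, the variant's population prevalence shifts only over many years.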


The human CERKL gene is responsible for common and severe forms of retinal dystrophy. Despite intense in vitro studies at the molecular and cellular level and in vivo analyses of the retina in murine knockout models, CERKL function remains unknown. In this study, we aimed to characterize the developmental and functional features of cerkl in Danio rerio within an Evo-Devo framework. We show that cerkl expression increases from early developmental stages until the formation of the retina in the optic cup. Unlike the high multiplicity of CERKL mRNA isoforms found in mammals, the moderate transcriptional complexity in fish facilitates phenotypic studies based on gene silencing. Moreover, and of relevance to pathogenicity, teleost CERKL shares the two main human protein isoforms. Morpholino injection was used to generate a cerkl knockdown zebrafish model. The morphant phenotype comprises abnormal eye development with lamination defects, failure to develop photoreceptor outer segments, increased apoptosis of retinal cells and small eyes. Our data indicate that zebrafish Cerkl does not interfere with proliferation and neural differentiation during early developmental stages but is relevant for the survival and protection of retinal tissue. Overall, we propose that this zebrafish model is a powerful tool to unveil the contribution of CERKL to human retinal degeneration.


The functional method is a new test theory using a new scoring method that assumes complexity in test structure and thus takes into account every correlation between factors and items. The main specificity of the functional method is to model test scores by multiple regression instead of estimating them with simplistic sums of points. To proceed, the functional method requires the creation of a hyperspherical measurement space in which item responses are expressed by their correlations with orthogonal factors. This method has three main qualities. First, measures are expressed in the absolute metric of correlations; therefore, items, scales and persons are expressed in the same measurement space using the same single metric. Second, factors are systematically orthogonal and error-free, which is optimal for predicting other outcomes. Such predictions can be performed to estimate how one would answer other tests, or even to model one's response strategy if it were perfectly coherent. Third, the functional method provides measures of an individual's response validity (i.e., control indices). Here, we propose a standard procedure to identify whether test results are interpretable and to exclude, on the basis of the control indices, invalid results caused by various response biases.


The mutualistic versus antagonistic nature of an interaction is defined by the costs and benefits to each partner, which may vary depending on the environment. Contrasting with this dynamic view, several pollination interactions are considered strictly obligate and mutualistic. Here, we focus on the interaction between Trollius europaeus and Chiastocheta flies, considered a specialized and obligate nursery pollination system: the flies are thought to be the exclusive pollinators of the plant, and their larvae develop only in T. europaeus fruits. In this system, features such as the globe-like flower shape are claimed to have evolved in a coevolutionary context. We examine the specificity of this pollination system and measure traits related to offspring fitness in isolated T. europaeus populations, in some of which Chiastocheta flies have gone extinct. We hypothesized that if this interaction is specific and obligate, the plant should experience a dramatic drop in its relative fitness in the absence of Chiastocheta. Contrary to this hypothesis, T. europaeus populations without flies show a relative fitness similar to that of populations with flies present, contradicting the putatively obligatory nature of this pollination system. This finding agrees with our observation that many other insects also visit T. europaeus flowers and carry pollen among them. We propose that the interaction could have evolved through the maximization of by-product benefits of Chiastocheta visits, through the male flower function, together with selection on floral traits by the most effective pollinator. We argue that this mechanism is also central to the evolution of other nursery pollination systems.


The Swiss financial centre witnessed an important shift during the 1960s: the number of foreign banks, and their importance relative to the domestic banking sector, increased significantly. Faced with this rapid development, Swiss bank representatives and political and monetary authorities reacted strongly. This paper investigates the evolution of the regulatory response of Swiss banking policy actors to the proliferation of foreign financial institutions. In 1969, those reactions led to the adoption of a discriminatory regime setting higher entry barriers for foreign banks than for domestic institutions. After examining possible reasons for the attractiveness of Switzerland to foreign banks, the paper analyses the concerns and fears of the domestic banking sector and its regulators. In this regard, it appears that issues such as competition, the preservation of the international reputation of Swiss banks and anti-inflationary monetary policy were central to the chosen regulatory regime. Moreover, the paper shows that foreign banks were used as scapegoats in the evolution of the Swiss system of banking supervision: they were more tightly regulated, yet the general framework remained very lax.


The phosphatidylinositol 3-kinase (PI3K)/AKT signaling pathway regulates multiple cellular processes. Overactivation of the pathway is frequently present in human malignancies and plays a key role in cancer progression; hence, its inhibition has become a promising approach in cancer therapy. However, the development of resistance mechanisms, such as the abrogation of negative feedback loops or the activation of other proliferative signaling pathways, has considerably limited the anticancer efficacy of PI3K/AKT inhibitors. In addition, emerging evidence indicates that, although AKT is acknowledged as the major downstream effector of PI3K, both PI3K and AKT can operate independently of each other in cancer, revealing another level of complexity in this pathway. Here, we highlight the complex relationship between PI3K and AKT in cancer and discuss the consequences of this relationship for cancer therapy.


This article describes three models that played a key role in the evolution of the International Olympic Committee (IOC) and of all the organizations that contribute to the staging of the Olympic Games and constitute the Olympic System, from its beginnings in 1894 to the present day. This evolution, and the addition of many stakeholders, has increased the complexity of managing the Olympic System over the years, from pure Olympic administration (when the IOC headquarters moved to Lausanne in 1915) to Olympic network governance, which must take into consideration more than 24 types of stakeholders, including governments and intergovernmental organizations.


Throughout the evolution of technology, devices have been interconnected by cables. Cables limit the user's freedom of movement and can pick up interference from one another when the amount of wiring is large. As wireless technology advanced, it was progressively incorporated into electronic equipment, which at the same time kept getting smaller. This imposes the need to use such devices as remote controls without cables, given the drawbacks that cables entail. The present work aims to unify three technologies that may show great affinity in the future. · Devices based on the Android system. Since their beginnings, they have evolved meteorically, becoming ever faster and better. · Wireless systems. Wi-Fi and Bluetooth have become an ever larger part of our lives and are present in practically every device. · Robotics. Virtually every production process incorporates a robot. Robots are needed for many tasks that, although a human could perform them, a robot completes in less time and with less danger. Although the first two technologies go hand in hand (who does not have a phone with Wi-Fi and Bluetooth?), few designs combine these fields with robotics. The final objective of this work is to build an Android application for the remote control of a robot using wireless communication. The developed application allows the user to control the robot at will through a touch/remote-control interface. Thanks to the use of simulators for both languages (RAPID and Android), it was possible to carry out the programming without having to be present at the robot that is the subject of this work. As the work progressed, the amount of data sent to the robot and the complexity of its processing increased, while the aesthetics of the application were also improved.
Finally, the developed application was used with the actual robot, successfully making it perform the movements sent from the programmed tablet.


Computed tomography (CT) is an imaging technique in which interest has grown quickly since it came into use in the 1970s. Today it has become an extensively used modality because of its ability to produce accurate diagnostic images. However, even if a direct benefit to patient healthcare is attributed to CT, the dramatic increase in the number of CT examinations performed has raised concerns about the potential negative effects of ionising radiation on the population. Among those negative effects, one of the major remaining risks is the development of cancers associated with exposure to diagnostic X-ray procedures. To ensure that the benefit-risk ratio remains in the patient's favour, it is necessary to make sure that the delivered dose leads to the proper diagnosis without producing unnecessarily high-quality images. This optimisation scheme is already an important concern for adult patients, but it must become an even greater priority when examinations are performed on children or young adults, in particular in follow-up studies requiring several CT procedures over the patient's life. Indeed, children and young adults are more sensitive to radiation because of their faster metabolism, and harmful consequences are more likely to occur because of a younger patient's longer life expectancy.
The recent introduction of iterative reconstruction algorithms, designed to substantially reduce dose, is certainly a major achievement in CT evolution, but it has also created difficulties in assessing the quality of the images produced with those algorithms. The goal of the present work was to propose a strategy for investigating the potential of iterative reconstructions to reduce dose without compromising the ability to answer the diagnostic question. The major difficulty lies in having a clinically relevant way to estimate image quality. To ensure the choice of pertinent image quality criteria, this work was carried out in close collaboration with radiologists. The work began by characterising image quality in musculoskeletal examinations, focusing in particular on the behaviour of image noise and spatial resolution when iterative image reconstruction was used. The analysis of these physical parameters allowed radiologists to adapt their acquisition and reconstruction protocols while knowing what loss of image quality to expect. This work also dealt with the loss of low-contrast detectability associated with dose reduction, a major concern in abdominal investigations. Knowing that alternatives to classical Fourier-space metrics had to be used to assess image quality, we focused on the use of mathematical model observers, with our experimental parameters determining the type of model to use.
Ideal model observers were applied to characterise image quality when purely physical results about signal detectability were sought, whereas anthropomorphic model observers were used in a more clinical context, when the results had to be compared with those of human observers, taking advantage of their incorporation of elements of the human visual system. This work confirmed that the use of model observers makes it possible to assess image quality with a task-based approach, which in turn establishes a bridge between medical physicists and radiologists. It also demonstrated that statistical iterative reconstructions have the potential to reduce the delivered dose without impairing the quality of the diagnosis. Among the different types of iterative reconstruction, model-based ones offer the greatest potential, since images produced with this modality can still lead to an accurate diagnosis even when acquired at very low dose. Finally, this work has clarified the role of medical physicists in CT imaging: standard metrics remain important for assessing unit compliance with legal requirements, but model observers are the way to go when optimising imaging protocols.
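As a schematic of what an ideal-type model observer computes, the sketch below estimates the detectability index d' of a known low-contrast signal with a non-prewhitening matched filter on synthetic white-noise data (for which this observer is optimal). This is a minimal toy, not the observer models actually used in this work:

```python
import random

def npw_dprime(signal, noise_sigma, n_trials=2000, seed=0):
    """Non-prewhitening matched-filter observer on synthetic 1-D images.

    The template equals the known signal profile; d' is estimated from the
    template responses to signal-present vs signal-absent noisy realisations.
    """
    rng = random.Random(seed)
    present, absent = [], []
    for _ in range(n_trials):
        g1 = [s + rng.gauss(0, noise_sigma) for s in signal]   # signal + noise
        g0 = [rng.gauss(0, noise_sigma) for _ in signal]       # noise only
        present.append(sum(t * g for t, g in zip(signal, g1)))
        absent.append(sum(t * g for t, g in zip(signal, g0)))
    m1 = sum(present) / n_trials
    m0 = sum(absent) / n_trials
    v1 = sum((x - m1) ** 2 for x in present) / (n_trials - 1)
    v0 = sum((x - m0) ** 2 for x in absent) / (n_trials - 1)
    return (m1 - m0) / ((0.5 * (v1 + v0)) ** 0.5)

# A small low-contrast signal profile; doubling the noise level (standing in
# for a lower-dose acquisition) roughly halves d' in white noise.
signal = [0.5] * 16
d_low  = npw_dprime(signal, noise_sigma=1.0)   # ≈ 2.0
d_high = npw_dprime(signal, noise_sigma=2.0)   # ≈ 1.0
```

Anthropomorphic observers elaborate on this scheme, e.g. by filtering the images through visual-system channels before applying the template, which is what allows their output to be compared with human readers.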


The role of behavior in evolution remains controversial, even though some of the relevant ideas are over 100 years old. Changes in behavior are generally believed to enhance evolution by exposing individuals to new selective pressures and by facilitating range expansions. However, this hypothesis lacks firm empirical evidence. Moreover, behavioral changes can also inhibit evolution by hiding heritable variation from natural selection. Taking advantage of the complete phylogeny of extant birds, a new species-level measure of past diversification rate and the best existing measures of brain size (n = 1326 species), I show here that relative brain size is associated, albeit weakly, with diversification rate. Assuming that relative brain size reflects behavioral flexibility, an assumption well supported by evidence, this finding supports the idea that behavior can enhance evolutionary diversification. This view is further supported by the finding that the most important factor influencing diversification rate is ecological generalism, which is believed to require behavioral flexibility. Thus, behavioral changes that expose animals to a variety of environments may have played an important role in the evolution of birds.


"Science as culture" is based on the assumption that science is a valuable component of human culture. We therefore have to build the bridge, in cultural terms, from the scientific community to the common citizen. Teaching science as culture requires the co-construction of knowledge and citizenship. Ways of articulating science/technology with society are invoked, pondering on the ethical ambivalence of such connections. The goals of this reflection are to think about: a) epistemological obstacles that, in favouring the logic of monoculture, oppose the implantation of the science as culture; b) epistemological strategies that point towards a diversity of cultural practices and "constellations" of knowledge leading to the reconfiguration of the being through knowledge; c) imperatives that force us to (re)think the epistemological bases suited to the paradigmatic changes and which translate the dynamics and complexity of the evolution of the frameworks that currently sustain science and school scientific education.


Protein engineering aims to improve the properties of enzymes and affinity reagents by genetic changes. Typical engineered properties are affinity, specificity, stability, expression, and solubility. Because proteins are complex biomolecules, the effects of specific genetic changes are seldom predictable. Consequently, a popular strategy in protein engineering is to create a library of genetic variants of the target molecule and subject the population to a selection process that sorts the variants by the desired property. This technique, called directed evolution, is a central tool for tailoring protein-based products used in a wide range of applications, from laundry detergents to anti-cancer drugs. New methods are continuously needed to generate larger gene repertoires and compatible selection platforms to shorten the development timeline for new biochemicals. In the first study of this thesis, primer extension mutagenesis was revisited to establish higher-quality gene variant libraries in Escherichia coli cells. In the second study, recombination was explored as a method to expand the number of screenable enzyme variants. A selection platform was developed to improve antigen-binding fragment (Fab) display on filamentous phages in the third article and, in the fourth study, novel design concepts were tested with two differentially randomized recombinant antibody libraries. Finally, in the last study, the performance of the same antibody repertoire was compared in phage display selections as a genetic fusion to different phage capsid proteins and in different antibody formats, Fab vs. single-chain variable fragment (ScFv), in order to find the most suitable display platform for the library at hand. As a result of these studies, a novel gene library construction method, termed selective rolling circle amplification (sRCA), was developed.
The method increases the mutagenesis frequency to close to 100% in the final library and the number of transformants more than 100-fold compared with traditional primer extension mutagenesis. In the second study, Cre/loxP recombination was found to be an appropriate tool for resolving the DNA concatemer resulting from error-prone RCA (epRCA) mutagenesis into monomeric circular DNA units for higher-efficiency transformation into E. coli. Library selections against antigens of various sizes in the fourth study demonstrated that diversity placed closer to the antigen-binding site of antibodies supports the generation of antibodies against haptens and peptides, whereas diversity at more peripheral locations is better suited for targeting proteins. The conclusion from a comparison of the display formats was that the truncated capsid protein three (p3Δ) of filamentous phage was superior to the full-length p3 and protein nine (p9) in yielding a high number of uniquely specific clones. Especially for digoxigenin, a difficult hapten target, the antibody repertoire displayed as ScFv-p3Δ yielded the clones with the highest binding affinity. This thesis on the construction, design, and selection of gene variant libraries contributes to the practical know-how of directed evolution and contains useful information to support scientists in the field in their undertakings.