17 results for Search space reduction


Relevance:

80.00%

Publisher:

Abstract:

Biochemical systems are commonly modelled by systems of ordinary differential equations (ODEs). A particular class of such models called S-systems have recently gained popularity in biochemical system modelling. The parameters of an S-system are usually estimated from time-course profiles. However, finding these estimates is a difficult computational problem. Moreover, although several methods have been recently proposed to solve this problem for ideal profiles, relatively little progress has been reported for noisy profiles. We describe a special feature of a Newton-flow optimisation problem associated with S-system parameter estimation. This enables us to significantly reduce the search space, and also lends itself to parameter estimation for noisy data. We illustrate the applicability of our method by applying it to noisy time-course data synthetically produced from previously published 4- and 30-dimensional S-systems. In addition, we propose an extension of our method that allows the detection of network topologies for small S-systems. We introduce a new method for estimating S-system parameters from time-course profiles. We show that the performance of this method compares favorably with competing methods for ideal profiles, and that it also allows the determination of parameters for noisy profiles.
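The power-law form of an S-system makes the setting easy to sketch. Below is a minimal, self-contained Python illustration (not the authors' estimation method): it simulates a toy 2-dimensional S-system with hypothetical rate constants and kinetic orders, producing the kind of time-course profile from which the parameters would then be estimated.

```python
# Minimal S-system sketch (illustrative parameters, not from the paper).
# Each state obeys dX_i/dt = alpha_i * prod_j X_j^g_ij - beta_i * prod_j X_j^h_ij.

def s_system_rhs(x, alpha, beta, g, h):
    """Right-hand side of the S-system ODEs."""
    n = len(x)
    dx = []
    for i in range(n):
        prod_g = 1.0
        prod_h = 1.0
        for j in range(n):
            prod_g *= x[j] ** g[i][j]
            prod_h *= x[j] ** h[i][j]
        dx.append(alpha[i] * prod_g - beta[i] * prod_h)
    return dx

def simulate(x0, alpha, beta, g, h, dt=0.01, steps=1000):
    """Forward-Euler integration producing a time-course profile."""
    traj = [list(x0)]
    x = list(x0)
    for _ in range(steps):
        dx = s_system_rhs(x, alpha, beta, g, h)
        # Clamp to keep concentrations strictly positive.
        x = [max(xi + dt * dxi, 1e-9) for xi, dxi in zip(x, dx)]
        traj.append(list(x))
    return traj

# A toy 2-dimensional S-system (hypothetical kinetic orders and rate constants).
alpha = [2.0, 1.0]
beta = [1.0, 1.0]
g = [[0.0, -0.5], [0.5, 0.0]]
h = [[0.5, 0.0], [0.0, 0.5]]
traj = simulate([1.0, 1.0], alpha, beta, g, h)
```

Parameter estimation then amounts to searching over (alpha, beta, g, h) for values whose simulated trajectory matches an observed (possibly noisy) profile; the search-space reduction described above prunes that parameter space.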

Relevance:

80.00%

Publisher:

Abstract:

Optimizing collective behavior in multiagent systems requires algorithms to find not only appropriate individual behaviors but also a suitable composition of agents within a team. Over the last two decades, evolutionary methods have emerged as a promising approach for the design of agents and their composition into teams. The choice of a crossover operator that facilitates the evolution of optimal team composition is recognized to be crucial, but so far it has never been thoroughly quantified. Here, we highlight the limitations of two different crossover operators that exchange entire agents between teams: restricted agent swapping (RAS), which exchanges only corresponding agents between teams, and free agent swapping (FAS), which allows an arbitrary exchange of agents. Our results show that RAS suffers from premature convergence, whereas FAS entails insufficient convergence. Consequently, in both cases, the exploration and exploitation aspects of the evolutionary algorithm are not well balanced, resulting in the evolution of suboptimal team compositions. To overcome this problem, we propose combining the two methods. Our approach first applies FAS to explore the search space and then RAS to exploit it. This mixed approach is a much more efficient strategy for the evolution of team compositions than either strategy on its own. Our results suggest that such a mixed agent-swapping algorithm should always be preferred whenever the optimal composition of individuals in a multiagent system is unknown.
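The two crossover operators compared above can be sketched in a few lines. The implementations below are illustrative guesses at RAS and FAS on teams represented as plain lists; the swap rate and the agent encoding (a float standing in for a genome) are assumptions, not details from the paper.

```python
import random

def restricted_agent_swap(team_a, team_b, rate=0.5, rng=random):
    """RAS: only agents at corresponding positions may be exchanged."""
    a, b = list(team_a), list(team_b)
    for i in range(len(a)):
        if rng.random() < rate:
            a[i], b[i] = b[i], a[i]
    return a, b

def free_agent_swap(team_a, team_b, rate=0.5, rng=random):
    """FAS: an agent may be exchanged with an arbitrary agent of the other team."""
    a, b = list(team_a), list(team_b)
    for i in range(len(a)):
        if rng.random() < rate:
            j = rng.randrange(len(b))
            a[i], b[j] = b[j], a[i]
    return a, b
```

The mixed strategy proposed in the abstract would call `free_agent_swap` in the early, exploratory generations and switch to `restricted_agent_swap` later for exploitation.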

Relevance:

80.00%

Publisher:

Abstract:

In recent years, protein-ligand docking has become a powerful tool for drug development. Although several approaches suitable for high-throughput screening are available, there is a need for methods able to identify binding modes with high accuracy. This accuracy is essential to reliably compute the binding free energy of the ligand. Such methods are needed when the binding mode of lead compounds is not determined experimentally but is needed for structure-based lead optimization. We present here a new docking software, called EADock, that aims at this goal. It uses a hybrid evolutionary algorithm with two fitness functions, in combination with a sophisticated management of diversity. EADock is interfaced with the CHARMM package for energy calculations and coordinate handling. A validation was carried out on 37 crystallized protein-ligand complexes featuring 11 different proteins. The search space was defined as a sphere of 15 Å around the center of mass of the ligand position in the crystal structure, and contrary to other benchmarks, our algorithm was fed with optimized ligand positions up to 10 Å root mean square deviation (RMSD) from the crystal structure, excluding the latter. This validation illustrates the efficiency of our sampling strategy, as correct binding modes, defined by an RMSD to the crystal structure lower than 2 Å, were identified and ranked first for 68% of the complexes. The success rate increases to 78% when considering the five best-ranked clusters, and 92% when all clusters present in the last generation are taken into account. Most failures could be explained by the presence of crystal contacts in the experimental structure. Finally, the ability of EADock to accurately predict binding modes in a real application was illustrated by the successful docking of the RGD cyclic pentapeptide on the alphaVbeta3 integrin, starting far away from the binding pocket.
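The benchmark's success criterion (an RMSD to the crystal pose below 2 Å) is straightforward to express in code. This sketch assumes the two coordinate sets are already superimposed and atom-ordered; ligand-symmetry handling, which real docking benchmarks need, is omitted.

```python
from math import sqrt

def rmsd(coords_a, coords_b):
    """Root mean square deviation between two equally ordered 3-D coordinate sets."""
    assert len(coords_a) == len(coords_b)
    sq = sum((ax - bx) ** 2 + (ay - by) ** 2 + (az - bz) ** 2
             for (ax, ay, az), (bx, by, bz) in zip(coords_a, coords_b))
    return sqrt(sq / len(coords_a))

def is_correct_binding_mode(pose, crystal, threshold=2.0):
    """Benchmark criterion: RMSD to the crystal pose strictly below 2 Å."""
    return rmsd(pose, crystal) < threshold
```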

Relevance:

80.00%

Publisher:

Abstract:

3 Summary

3.1 English

The pharmaceutical industry has been facing several challenges in recent years, and the optimization of its drug discovery pipeline is believed to be the only viable solution. High-throughput techniques participate actively in this optimization, especially when complemented by computational approaches aiming at rationalizing the enormous amount of information they can produce. In silico techniques, such as virtual screening or rational drug design, are now routinely used to guide drug discovery. Both heavily rely on the prediction of the molecular interaction (docking) occurring between drug-like molecules and a therapeutically relevant target. Several software packages are available to this end, but despite the very promising picture drawn in most benchmarks, they still hold several hidden weaknesses. As pointed out in several recent reviews, the docking problem is far from being solved, and there is now a need for methods able to identify binding modes with high accuracy, which is essential to reliably compute the binding free energy of the ligand. This quantity is directly linked to its affinity and can be related to its biological activity. Accurate docking algorithms are thus critical for both the discovery and the rational optimization of new drugs. In this thesis, a new docking software aiming at this goal is presented, EADock. It uses a hybrid evolutionary algorithm with two fitness functions, in combination with a sophisticated management of diversity. EADock is interfaced with the CHARMM package for energy calculations and coordinate handling. A validation was carried out on 37 crystallized protein-ligand complexes featuring 11 different proteins.
The search space was defined as a sphere of 15 Å around the center of mass of the ligand position in the crystal structure, and contrary to other benchmarks, our algorithm was fed with optimized ligand positions up to 10 Å root mean square deviation (RMSD) from the crystal structure. This validation illustrates the efficiency of our sampling heuristic, as correct binding modes, defined by an RMSD to the crystal structure lower than 2 Å, were identified and ranked first for 68% of the complexes. The success rate increases to 78% when considering the five best-ranked clusters, and 92% when all clusters present in the last generation are taken into account. Most failures in this benchmark could be explained by the presence of crystal contacts in the experimental structure. EADock has been used to understand molecular interactions involved in the regulation of the Na,K-ATPase, and in the activation of the nuclear hormone peroxisome proliferator-activated receptor α (PPARα). It also helped to understand the action of common pollutants (phthalates) on PPARγ, and the impact of biotransformations of the anticancer drug Imatinib (Gleevec®) on its binding mode to the Bcr-Abl tyrosine kinase. Finally, a fragment-based rational drug design approach using EADock was developed, and led to the successful design of new peptidic ligands for the α5β1 integrin, and for the human PPARα. In both cases, the designed peptides presented activities comparable to those of well-established ligands such as the anticancer drug Cilengitide and Wy14,643, respectively.

3.2 French

The recent difficulties of the pharmaceutical industry seem resolvable only through the optimization of its drug development process. This increasingly involves so-called high-throughput techniques, which are particularly effective when coupled with computational tools able to manage the mass of data produced. In silico approaches such as virtual screening or the rational design of new molecules are now routinely used. Both rest on the ability to predict the details of the molecular interaction between a drug-like molecule and a therapeutically relevant target protein. Benchmarks of the software packages addressing this prediction are flattering, but several problems remain. The recent literature tends to question their reliability, asserting an emerging need for more accurate approaches to the binding mode. This accuracy is essential for computing the binding free energy, which is directly linked to the affinity of the candidate compound for the target protein, and indirectly linked to its biological activity. An accurate prediction is of particular importance for the discovery and optimization of new active molecules. This thesis presents a new software package, EADock, built for such accuracy. This hybrid evolutionary algorithm uses two selection pressures, combined with a sophisticated management of diversity. EADock relies on CHARMM for energy calculations and the handling of atomic coordinates. Its validation was carried out on 37 crystallized protein-ligand complexes, including 11 different proteins. The search space was extended to a sphere of 15 Å radius around the center of mass of the crystallized ligand, and contrary to the usual benchmarks, the algorithm started from optimized solutions presenting an RMSD of up to 10 Å from the crystal structure. This validation highlighted the efficiency of our search heuristic, as binding modes presenting an RMSD below 2 Å from the crystal structure were ranked first for 68% of the complexes. When the five best solutions are taken into account, the success rate climbs to 78%, and to 92% when the whole last generation is considered. Most prediction errors are attributable to the presence of crystal contacts. Since then, EADock has been used to understand the molecular mechanisms involved in the regulation of the Na,K-ATPase and in the activation of the peroxisome proliferator-activated receptor α (PPARα). It also allowed the description of the interaction of commonly encountered pollutants with PPARγ, as well as the influence of the metabolization of Imatinib (an anticancer drug) on its binding to the Bcr-Abl kinase. An approach based on predicting the interactions of molecular fragments with a target protein is also proposed. It led to the discovery of new peptidic ligands of PPARα and of the α5β1 integrin. In both cases, the activity of these new peptides is comparable to that of well-established ligands, such as Wy14,643 for the former and Cilengitide (an anticancer drug) for the latter.

Relevance:

80.00%

Publisher:

Abstract:

Combinatorial optimization involves finding an optimal solution in a finite set of options; many everyday-life problems are of this kind. However, the number of options grows exponentially with the size of the problem, such that an exhaustive search for the best solution is practically infeasible beyond a certain problem size. When efficient algorithms are not available, a practical approach to obtain an approximate solution to the problem at hand is to start with an educated guess and gradually refine it until we have a good-enough solution. Roughly speaking, this is how local search heuristics work. These stochastic algorithms navigate the problem search space by iteratively turning the current solution into new candidate solutions, guiding the search towards better solutions. The search performance, therefore, depends on structural aspects of the search space, which in turn depend on the move operator being used to modify solutions. A common way to characterize the search space of a problem is through the study of its fitness landscape, a mathematical object comprising the space of all possible solutions, their value with respect to the optimization objective, and a relationship of neighborhood defined by the move operator. The landscape metaphor is used to explain the search dynamics as a sort of potential function. The concept is indeed similar to that of potential energy surfaces in physical chemistry. Borrowing ideas from that field, we propose to extend to combinatorial landscapes the notion of the inherent network formed by energy minima in energy landscapes. In our case, energy minima are the local optima of the combinatorial problem, and we explore several definitions for the network edges. At first, we perform an exhaustive sampling of local optima basins of attraction, and define weighted transitions between basins by accounting for all the possible ways of crossing the basin frontier via one random move. Then, we reduce the computational burden by only counting the chances of escaping a given basin via random kick moves that start at the local optimum. Finally, we approximate network edges from the search trajectory of simple search heuristics, mining the frequency and inter-arrival time with which the heuristic visits local optima. Through these methodologies, we build a weighted directed graph that provides a synthetic view of the whole landscape, which we can characterize using the tools of complex network science. We argue that this network characterization can advance our understanding of the structural and dynamical properties of hard combinatorial landscapes. We apply our approach to prototypical problems such as the Quadratic Assignment Problem, the NK model of rugged landscapes, and the Permutation Flow-shop Scheduling Problem. We show that some network metrics can differentiate problem classes, correlate with problem non-linearity, and predict problem hardness as measured from the performance of trajectory-based local search heuristics.
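A minimal sketch of the local-optima-network construction described above, on a toy random bitstring landscape rather than a QAP or NK instance: exhaustively identify the local optima under one-bit-flip moves, then estimate weighted escape edges with random kick moves, as in the second methodology. The landscape size, the kick operator, and the sample counts are illustrative assumptions.

```python
import random
from itertools import product

# Toy landscape: random fitness over all length-8 bitstrings (an illustrative
# stand-in for the QAP/NK instances studied in the work).
rng = random.Random(0)
N = 8
fitness = {bits: rng.random() for bits in product((0, 1), repeat=N)}

def neighbours(s):
    """One-bit-flip move operator."""
    return [s[:i] + (1 - s[i],) + s[i + 1:] for i in range(len(s))]

def hill_climb(s):
    """Best-improvement local search to the attracting local optimum."""
    while True:
        best = max(neighbours(s), key=fitness.get)
        if fitness[best] <= fitness[s]:
            return s
        s = best

# Nodes of the network: all local optima of the landscape.
local_optima = {s for s in fitness
                if all(fitness[n] <= fitness[s] for n in neighbours(s))}

def escape_edges(opt, kicks=200, kick_size=2):
    """Weighted out-edges of one local optimum, estimated by random kick
    moves followed by hill-climbing."""
    counts = {}
    for _ in range(kicks):
        s = opt
        for _ in range(kick_size):
            s = rng.choice(neighbours(s))
        target = hill_climb(s)
        counts[target] = counts.get(target, 0) + 1
    return counts
```

Collecting `escape_edges` for every optimum yields the weighted directed graph that the abstract then analyzes with network metrics.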

Relevance:

40.00%

Publisher:

Abstract:

The study reports a set of forty proteinogenic histidine-containing dipeptides as potential carbonyl quenchers. The peptides were chosen to cover the accessible chemical space as exhaustively as possible, and their quenching activities toward 4-hydroxy-2-nonenal (HNE) and pyridoxal were evaluated by HPLC analyses. The peptides were capped at the C-terminus as methyl esters or amides to favor their resistance to proteolysis, and diastereoisomeric pairs were considered to reveal the influence of configuration on quenching. On average, the examined dipeptides are less active than the parent compound carnosine (βAla + His), thus emphasizing the unfavorable effect of the shortening of the βAla residue, as confirmed by the control dipeptide Gly-His. Nevertheless, some peptides show promising activities toward HNE combined with a remarkable selectivity. The results emphasize the beneficial role of aromatic and positively charged residues, while negatively charged and H-bonding side chains show a detrimental effect on quenching. As a trend, ester derivatives are slightly more active than amides, while heterochiral peptides are more active than their homochiral diastereoisomers. Overall, the results reveal that quenching activity strongly depends on conformational effects and vicinal residues (as evidenced by the reported QSAR analysis), offering insightful clues for the design of improved carbonyl quenchers and for rationalizing the specific reactivity of histidine residues within proteins.

Relevance:

40.00%

Publisher:

Abstract:

Breathing-induced bulk motion of the myocardium during data acquisition may cause severe image artifacts in coronary magnetic resonance angiography (MRA). Current motion compensation strategies include breath-holding or free-breathing MR navigator gating and tracking techniques. Navigator-based techniques have been further refined by the applications of sophisticated 2D k-space reordering techniques. A further improvement in image quality and a reduction of relative scanning duration may be expected from a 3D k-space reordering scheme. Therefore, a 3D k-space reordered acquisition scheme including a 3D navigator gated and corrected segmented k-space gradient echo imaging sequence for coronary MRA was implemented. This new zonal motion-adapted acquisition and reordering technique (ZMART) was developed on the basis of a numerical simulation of the Bloch equations. The technique was implemented on a commercial 1.5T MR system, and first phantom and in vivo experiments were performed. Consistent with the results of the theoretical findings, the results obtained in the phantom studies demonstrate a significant reduction of motion artifacts when compared to conventional (non-k-space reordered) gating techniques. Preliminary in vivo findings also compare favorably with the phantom experiments and theoretical considerations. Magn Reson Med 45:645-652, 2001.

Relevance:

30.00%

Publisher:

Abstract:

The purpose of this study was to prospectively compare free-breathing navigator-gated cardiac-triggered three-dimensional steady-state free precession (SSFP) spin-labeling coronary magnetic resonance (MR) angiography performed by using Cartesian k-space sampling with that performed by using radial k-space sampling. A new dedicated placement of the two-dimensional selective labeling pulse and an individually adjusted labeling delay time approved by the institutional review board were used. In 14 volunteers (eight men, six women; mean age, 28.8 years) who gave informed consent, signal-to-noise ratio (SNR), contrast-to-noise ratio (CNR), vessel sharpness, vessel length, and subjective image quality were investigated. Differences between groups were analyzed with nonparametric tests (Wilcoxon, Pearson chi2). Radial imaging, as compared with Cartesian imaging, resulted in a significant reduction in the severity of motion artifacts, as well as an increase in SNR (26.9 vs 12.0, P < .05) in the coronary arteries and CNR (23.1 vs 8.8, P < .05) between the coronary arteries and the myocardium. A tendency toward improved vessel sharpness and vessel length was also found with radial imaging. Radial SSFP imaging is a promising technique for spin-labeling coronary MR angiography.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Malaria is almost invariably ranked as the leading cause of morbidity and mortality in Africa. There is growing evidence of a decline in malaria transmission, morbidity and mortality over the last decades, especially in East Africa. However, there is still doubt whether this decline is reflected in a reduction of the proportion of malaria among fevers. The objective of this systematic review was to estimate the change in the Proportion of Fevers associated with Plasmodium falciparum parasitaemia (PFPf) over the past 20 years in sub-Saharan Africa. METHODS: Search strategy: in December 2009, publications from the National Library of Medicine database were searched using a combination of 16 MeSH terms. Selection criteria: studies 1) conducted in sub-Saharan Africa, 2) with patients presenting with a syndrome of 'presumptive malaria', 3) with numerators (number of parasitologically confirmed cases) and denominators (total number of presumptive malaria cases) available, and 4) with good-quality microscopy. Data collection and analysis: the following variables were extracted: parasite presence/absence, total number of patients, age group, year, season, country and setting, and clinical inclusion criteria. To assess the dynamics of PFPf over time, the median PFPf was compared between studies published in the years ≤2000 and >2000. RESULTS: 39 studies conducted between 1986 and 2007 in 16 different African countries were included in the final analysis. When comparing data up to the year 2000 (24 studies) with those afterwards (15 studies), there was a clear reduction in the median PFPf from 44% (IQR 31-58%; range 7-81%) to 22% (IQR 13-33%; range 2-77%). This dramatic decline is likely to reflect a true change, since stratified analyses including explanatory variables were performed and median PFPfs were always lower after 2000 than before. CONCLUSIONS: There was a considerable reduction in the proportion of malaria among fevers over time in Africa. This decline provides evidence for the policy change from presumptive anti-malarial treatment of all children with fever to laboratory diagnosis and treatment upon result. This should ensure appropriate care of non-malaria fevers and rational use of anti-malarials.
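The headline comparison reduces to medians and interquartile ranges of study-level proportions. The sketch below recomputes such a summary on made-up values (not the 39 extracted studies), chosen so that the pre-2000 sample happens to reproduce the reported 44% median, 31-58% IQR and 7-81% range.

```python
from statistics import quantiles

# Illustrative study-level PFPf proportions (made-up values, not the 39 studies).
pfpf_pre_2000 = [0.44, 0.31, 0.58, 0.07, 0.81, 0.52, 0.39]
pfpf_post_2000 = [0.22, 0.13, 0.33, 0.02, 0.77, 0.18]

def summarize(sample):
    """Median, interquartile range and range, as reported in the review."""
    q1, med, q3 = quantiles(sample, n=4)  # default 'exclusive' method
    return {"median": med, "iqr": (q1, q3), "range": (min(sample), max(sample))}

pre, post = summarize(pfpf_pre_2000), summarize(pfpf_post_2000)
```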

Relevance:

30.00%

Publisher:

Abstract:

Abstract: The main objective of this work is to show how the choice of the temporal dimension and of the spatial structure of the population influences an artificial evolutionary process. In the field of Artificial Evolution we can observe a common trend in synchronously evolving panmictic populations, i.e., populations in which any individual can be recombined with any other individual. Already in the '90s, the works of Spiessens and Manderick, Sarma and De Jong, and Gorges-Schleuter have pointed out that, if a population is structured according to a mono- or bi-dimensional regular lattice, the evolutionary process shows a different dynamic with respect to the panmictic case. In particular, Sarma and De Jong have studied the selection pressure (i.e., the diffusion of a best individual when only the selection operator is active) induced by a regular bi-dimensional structure of the population, proposing a logistic modeling of the selection pressure curves. This model supposes that the diffusion of a best individual in a population follows an exponential law. We show that such a model is inadequate to describe the process, since the growth speed must be quadratic or sub-quadratic in the case of a bi-dimensional regular lattice. New linear and sub-quadratic models are proposed for modeling the selection pressure curves in, respectively, mono- and bi-dimensional regular structures. These models are extended to describe the process when asynchronous evolutions are employed. Different dynamics of the populations imply different search strategies of the resulting algorithm, when the evolutionary process is used to solve optimisation problems. A benchmark of both discrete and continuous test problems is used to study the search characteristics of the different topologies and updates of the populations.
In the last decade, the pioneering studies of Watts and Strogatz have shown that most real networks, both in the biological and sociological worlds as well as in man-made structures, have mathematical properties that set them apart from regular and random structures. In particular, they introduced the concept of small-world graphs, and they showed that this new family of structures has interesting computing capabilities. Populations structured according to these new topologies are proposed, and their evolutionary dynamics are studied and modeled. We also propose asynchronous evolutions for these structures, and the resulting evolutionary behaviors are investigated. Many man-made networks have grown, and are still growing, incrementally, and explanations have been proposed for their actual shape, such as Albert and Barabási's preferential attachment growth rule. However, many actual networks seem to have undergone some kind of Darwinian variation and selection. Thus, how these networks might have come to be selected is an interesting yet unanswered question. In the last part of this work, we show how a simple evolutionary algorithm can enable the emergence of these kinds of structures for two prototypical problems of the automata networks world, the majority classification and synchronisation problems.

Synopsis: The main objective of this work is to show the influence of the choice of the temporal dimension and of the spatial structure of a population on an artificial evolutionary process. In the field of Artificial Evolution one can observe a tendency to evolve panmictic populations synchronously, where each individual can be recombined with any other individual in the population. Already in the 1990s, Spiessens and Manderick, Sarma and De Jong, and Gorges-Schleuter observed that, if a population has a regular mono- or bi-dimensional structure, the evolutionary process shows a dynamic different from that of a panmictic population. In particular, Sarma and De Jong studied the selection pressure (i.e., the diffusion of an optimal individual when only the selection operator is active) induced by a regular bi-dimensional structure of the population, proposing a logistic model of the selection-pressure curves. This model assumes that the diffusion of an optimal individual follows an exponential law. We show that this model is inadequate to describe the phenomenon, given that the growth speed must obey a quadratic or sub-quadratic law in the case of a regular bi-dimensional structure. New linear and sub-quadratic models are proposed for mono- and bi-dimensional structures, respectively. These models are extended to describe asynchronous evolutionary processes. Different population dynamics imply different search strategies of the resulting algorithm when the evolutionary process is used to solve optimization problems. A set of discrete and continuous problems is used to study the search characteristics of the different topologies and population update schemes. In recent years, the studies of Watts and Strogatz have shown that many networks, in the biological and sociological worlds as well as among man-made structures, have mathematical properties that set them apart from both regular and random structures. In particular, they introduced the notion of small-world graphs and showed that this new family of structures possesses interesting dynamical properties. Populations with these new topologies are proposed, and their evolutionary dynamics are studied and modeled. For populations with these structures, asynchronous evolution methods are proposed, and the resulting dynamics are studied. Many man-made networks have formed incrementally, and explanations for their present shape have been proposed, such as Albert and Barabási's preferential attachment. However, many existing networks appear to be the product of a process of Darwinian variation and selection. Thus, how these structures may have been selected is an interesting question that remains unanswered. In the last part of this work, we show how a simple artificial evolutionary process allows this type of topology to emerge for two prototypical problems of the automata networks world, the density classification and synchronization tasks.
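The claim about growth laws can be illustrated with a toy takeover-time experiment: a single copy of the best individual diffusing on a one-dimensional ring when only selection acts. Using deterministic best-of-neighbourhood selection (an assumption; the studies above use probabilistic operators), the best spreads at most one cell per side per generation, so the takeover curve is linear rather than logistic.

```python
def ring_takeover(n=101, steps=60):
    """Fraction of copies of the best individual per synchronous generation
    on a 1-D ring with best-of-neighbourhood selection (radius 1)."""
    pop = [0] * n
    pop[n // 2] = 1                      # a single copy of the best individual
    curve = [sum(pop) / n]
    for _ in range(steps):
        # Each cell copies the best individual in its radius-1 neighbourhood.
        pop = [max(pop[(i - 1) % n], pop[i], pop[(i + 1) % n]) for i in range(n)]
        curve.append(sum(pop) / n)
    return curve

curve = ring_takeover()
```

After t generations the best occupies exactly 1 + 2t cells (until the ring is full), i.e. linear growth; a panmictic population under the same selection intensity would instead exhibit the near-exponential early growth that the logistic model captures.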

Relevance:

30.00%

Publisher:

Abstract:

Drug combinations can improve angiostatic cancer treatment efficacy and enable the reduction of side effects and drug resistance. Combining drugs is non-trivial due to the high number of possibilities. We applied a feedback system control (FSC) technique with a population-based stochastic search algorithm to navigate through the large parametric space of nine angiostatic drugs at four concentrations to identify optimal low-dose drug combinations. This implied an iterative approach of in vitro testing of endothelial cell viability and algorithm-based analysis. The optimal synergistic drug combination, containing erlotinib, BEZ-235 and RAPTA-C, was reached in a small number of iterations. Final drug combinations showed enhanced endothelial cell specificity and synergistically inhibited proliferation (p < 0.001), but not migration of endothelial cells, and forced enhanced numbers of endothelial cells to undergo apoptosis (p < 0.01). Successful translation of this drug combination was achieved in two preclinical in vivo tumor models. Tumor growth was inhibited synergistically and significantly (p < 0.05 and p < 0.01, respectively) using reduced drug doses as compared to optimal single-drug concentrations. At the applied conditions, single-drug monotherapies had no or negligible activity in these models. We suggest that FSC can be used for rapid identification of effective, reduced dose, multi-drug combinations for the treatment of cancer and other diseases.
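The FSC loop alternates in vitro testing with algorithm-based refinement. The sketch below imitates that loop with an elitist population-based stochastic search over 9 drugs at 4 dose levels; the viability function is a made-up stand-in for the cell assay, and keeping a no-drug baseline in the first population is an assumption for illustration.

```python
import random

rng = random.Random(0)
N_DRUGS, LEVELS = 9, 4               # dose levels 0..3; 0 = drug absent

def mock_viability(combo):
    """Hypothetical assay readout (lower is better); a made-up stand-in
    for the endothelial-cell viability measurement."""
    target = (1, 0, 2, 0, 0, 1, 0, 0, 3)   # hypothetical optimal doses
    return sum((c - t) ** 2 for c, t in zip(combo, target))

def fsc_search(generations=30, pop_size=20, elite=5):
    baseline = tuple([0] * N_DRUGS)        # no-drug control kept in the pool
    pop = [baseline] + [tuple(rng.randrange(LEVELS) for _ in range(N_DRUGS))
                        for _ in range(pop_size - 1)]
    for _ in range(generations):
        pop.sort(key=mock_viability)       # "test" every combination in vitro
        parents = pop[:elite]              # keep the best-performing combos
        children = []
        while len(children) < pop_size - len(parents):
            child = list(rng.choice(parents))
            child[rng.randrange(N_DRUGS)] = rng.randrange(LEVELS)  # adjust one dose
            children.append(tuple(child))
        pop = parents + children
    return min(pop, key=mock_viability)

best = fsc_search()
```

In the real workflow each `mock_viability` call would be a plate of cell-viability measurements, which is why reaching a good combination in a small number of iterations matters.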

Relevance:

30.00%

Publisher:

Abstract:

How a stimulus or a task alters the spontaneous dynamics of the brain remains a fundamental open question in neuroscience. One of the most robust hallmarks of task/stimulus-driven brain dynamics is the decrease of variability with respect to the spontaneous level, an effect seen across multiple experimental conditions and in brain signals observed at different spatiotemporal scales. Recently, it was observed that the trial-to-trial variability and temporal variance of functional magnetic resonance imaging (fMRI) signals decrease in the task-driven activity. Here we examined the dynamics of a large-scale model of the human cortex to provide a mechanistic understanding of these observations. The model allows computing the statistics of synaptic activity in the spontaneous condition and in putative tasks determined by external inputs to a given subset of brain regions. We demonstrated that external inputs decrease the variance, increase the covariances, and decrease the autocovariance of synaptic activity as a consequence of single node and large-scale network dynamics. Altogether, these changes in network statistics imply a reduction of entropy, meaning that the spontaneous synaptic activity outlines a larger multidimensional activity space than does the task-driven activity. We tested this model's prediction on fMRI signals from healthy humans acquired during rest and task conditions and found a significant decrease of entropy in the stimulus-driven activity. Altogether, our study proposes a mechanism for increasing the information capacity of brain networks by enlarging the volume of possible activity configurations at rest and reliably settling into a confined stimulus-driven state to allow better transmission of stimulus-related information.
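The entropy argument can be made concrete for Gaussian signals, where differential entropy grows with the logarithm of the variance, so a variance decrease under task-driven input directly implies an entropy reduction (the multivariate case replaces the variance with the log-determinant of the covariance matrix). The variances below are illustrative, not fitted values from the study.

```python
from math import log, pi, e

def gaussian_entropy(variance):
    """Differential entropy (in nats) of a univariate Gaussian."""
    return 0.5 * log(2 * pi * e * variance)

rest_var, task_var = 1.0, 0.6            # illustrative variances
entropy_drop = gaussian_entropy(rest_var) - gaussian_entropy(task_var)
```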

Relevance:

20.00%

Publisher:

Abstract:

Na,K-ATPase is the main active transport system that maintains the large gradients of Na(+) and K(+) across the plasma membrane of animal cells. The crystal structure of a K(+)-occluding conformation of this protein has recently been published, but the movements of its different domains allowing for the cation pumping mechanism are not yet known. The structure of many more conformations is known for the related calcium ATPase SERCA, but the reliability of homology modeling is poor for several domains with low sequence identity, in particular the extracellular loops. To better define the structure of the large fourth extracellular loop between the seventh and eighth transmembrane segments of the alpha subunit, we have studied the formation of a disulfide bond between pairs of cysteine residues introduced by site-directed mutagenesis in the second and the fourth extracellular loops. We found a specific pair of cysteine positions (Y308C and D884C) for which extracellular treatment with an oxidizing agent inhibited the Na,K pump function, which could be rapidly restored by a reducing agent. The formation of the disulfide bond occurred preferentially under the E2-P conformation of Na,K-ATPase, in the absence of extracellular cations. Using the recently published crystal structure and a distance constraint reproducing the existence of the disulfide bond, we performed an extensive conformational space search using simulated annealing and showed that the Tyr(308) and Asp(884) residues can be in close proximity while, simultaneously, the SYGQ motif of the fourth extracellular loop, known to interact with the extracellular domain of the beta subunit, remains exposed to the exterior of the protein and can easily interact with the beta subunit.
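A toy version of the constrained conformational search: simulated annealing on a 4-point chain in two dimensions, with a harmonic restraint standing in for the Y308C-D884C disulfide distance. The geometry, energy terms, and schedule are made up for illustration; the real search operated on molecular coordinates with a proper force field.

```python
import math
import random

rng = random.Random(42)

def energy(points, pair=(0, 3), target=1.0, k=10.0):
    """Toy energy: unit-length springs along the chain, plus a harmonic
    restraint pulling the constrained pair to the target distance."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    chain = sum((dist(points[i], points[i + 1]) - 1.0) ** 2
                for i in range(len(points) - 1))
    restraint = k * (dist(points[pair[0]], points[pair[1]]) - target) ** 2
    return chain + restraint

def anneal(points, steps=20000, t0=1.0):
    """Metropolis-style simulated annealing with a linear cooling schedule."""
    points = [list(p) for p in points]
    best = [list(p) for p in points]
    for step in range(steps):
        t = t0 * (1 - step / steps) + 1e-3
        i = rng.randrange(len(points))
        old = list(points[i])
        e_old = energy(points)
        points[i][0] += rng.gauss(0.0, 0.1)   # perturb one point
        points[i][1] += rng.gauss(0.0, 0.1)
        e_new = energy(points)
        if e_new > e_old and rng.random() >= math.exp((e_old - e_new) / t):
            points[i] = old                    # reject the uphill move
        elif energy(points) < energy(best):
            best = [list(p) for p in points]   # track the best conformation
    return best

# Extended chain: the restrained end points start 3 units apart.
start = [[0.0, 0.0], [1.0, 0.0], [2.0, 0.0], [3.0, 0.0]]
result = anneal(start)
```

The restraint forces the chain to fold so that the two end points approach the target distance, mirroring how the disulfide-distance constraint folds the loop toward the second extracellular loop.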

Relevância:

20.00% 20.00%

Publicador:

Resumo:

The interfaces between the intrapsychic, interactional, and intergenerational domains are a new frontier. As a pilot, we exposed ourselves to a complex but controllable situation as viewed by people whose main interest is in one of the three interfaces; we also fully integrated the subjects in the team, to learn about their subjective perspectives and to provide them with an enriching experience. We started with a brief "triadification" sequence (i.e., moving from a "two plus one" to a "three together" family organization). Considering this sequence as representing at a micro level many larger family transitions, we proceeded with a microanalytic interview, a psychodynamic investigation, and a family interview. As expected, larger patterns of correspondences are emerging. Central questions under debate are: What are the most appropriate units at each level of description and what are their articulations between these levels? What is the status of "triadification"?

Relevância:

20.00% 20.00%

Publicador:

Resumo:

This thesis explores the importance of literary New York City in the urban narratives of Edith Wharton and Anzia Yezierska. It specifically looks at the Empire City of the Progressive Period when the concept of the city was not only a new theme but also very much a typical American one which was as central to the American experience as had been the Western frontier. It could be argued, in fact, that the American city had become the new frontier where modern experiences like urbanization, industrialization, immigration, and also women's emancipation and suffrage, caused all kinds of sensations on the human scale from smoothly lived assimilation and acculturation to deeply felt alienation because of the constantly shifting urban landscape. The developing urban space made possible the emergence of new female literary protagonists like the working girl, the reformer, the prostitute, and the upper class lady dedicating her life to 'conspicuous consumption'. Industrialization opened up city space to female exploration: on the one hand, upper and middle class ladies ventured out of the home because of the many novel urban possibilities, and on the other, lower class and immigrant girls also left their domestic sphere to look for paid jobs outside the home. New York City at the time was not only considered the epicenter of the world at large, it was also a city of great extremes. Everything was constantly in flux: small brownstones made way for ever taller skyscrapers and huge waves of immigrants from Europe pushed native New Yorkers further uptown on the island, adding to the crowdedness and intensity of the urban experience. The city became a polarized urban space with Fifth Avenue representing one end of the spectrum and the Lower East Side the other. Questions of space and the urban home greatly mattered. It has been pointed out that the city setting functions as an ideal means for the display of human nature as well as social processes. 
Narrative representations of urban space, therefore, provide a similar canvas for a protagonist's journey and development. From widely diverging vantage points, both Edith Wharton and Anzia Yezierska thus create a polarized city where domesticity is a primal concern. Looking at all of their New York narratives through close readings of exterior and interior city representations, this thesis shows how urban space greatly affects questions of identity, assimilation, and alienation in literary protagonists who cannot escape the influence of their respective urban settings. Edith Wharton's upper-class "millionaire" heroines are framed and contained by the city interiors of "old" New York, making it impossible for them to truly participate in the urban landscape and develop outside of their 'Gilt Cages'. On the other side are Anzia Yezierska's struggling "immigrant" protagonists who, against all odds, never give up in their urban context of streets, rooftops, and stoops. Their New York City, while always challenging and perpetually changing, at least allows them perspectives of hope for a 'Promised Land' in the making. Central to both urban narrative approaches is the quest for a home as an architectural structure, a spiritual resting place, and a locus for identity formation. But just as the actual city embraces change, urban protagonists must embrace change if they desire to find fulfillment and success. That this turns out to be much easier for Anzia Yezierska's driven immigrants than for Edith Wharton's well-established native New Yorkers is a surprising conclusion to this urban theme.