47 results for Theoretical calculation
Abstract:
Vaccines have been used as a successful tool in medicine to control many major diseases. In spite of this, vaccines exist today for only a handful of all infectious diseases. Therefore, there is a pressing demand for improvements to existing vaccines, with particular reference to higher efficacy and undisputed safety profiles. To this end, as an alternative to available vaccine technologies, there has been a drive to develop vaccine candidate polypeptides by chemical synthesis. In our laboratory, we have recently developed a technology to manufacture long synthetic peptides of up to 130 residues that are correctly folded and biologically active. This paper discusses the advantages of the molecularly defined, long synthetic peptide approach in the context of vaccine design, development and use in human vaccination.
Abstract:
This work compares the structural/dynamics features of the wild-type alpha 1b-adrenergic receptor (AR) with those of the D142A active mutant and the agonist-bound state. The two active receptor forms were compared in their isolated states as well as in their ability to form homodimers and to recognize the G alpha q beta 1 gamma 2 heterotrimer. The analysis of the isolated structures revealed that, although the mutation- and agonist-induced active states of the alpha 1b-AR are different, they share several structural peculiarities, including (a) the release of some constraining interactions found in the wild-type receptor and (b) the opening of a cytosolic crevice formed by the second and third intracellular loops and the cytosolic extensions of helices 5 and 6. Accordingly, their tendencies to form homodimers also show commonalities and differences. In fact, in both active receptor forms, helix 6 plays a crucial role in mediating homodimerization. However, the homodimeric models result from different interhelical assemblies. Along the same lines, in both active receptor forms, the opened cytosolic crevice recognizes similar domains on the G protein. However, the docking solutions are differently populated, and the receptor-G protein preorientation models suggest that the final complexes should be characterized by different interaction patterns.
Abstract:
To further validate the doubly labeled water method for measurement of CO2 production and energy expenditure in humans, we compared it with near-continuous respiratory gas exchange in nine healthy young adult males. Subjects were housed in a respiratory chamber for 4 days. Each received ^2H2^18O at either a low (n = 6) or a moderate (n = 3) isotope dose. Low and moderate doses produced initial ^2H enrichments of 5 and 10 × 10^-3 atom percent excess, respectively, and initial ^18O enrichments of 2 and 2.5 × 10^-2 atom percent excess, respectively. Total body water was calculated from isotope dilution in saliva collected at 4 and 5 h after the dose. CO2 production was calculated by the two-point method using the isotopic enrichments of urines collected just before each subject entered and left the chamber. Isotope enrichments relative to predose samples were measured by isotope ratio mass spectrometry. At the low isotope dose, doubly labeled water overestimated average daily energy expenditure by 8 +/- 9% (SD) (range -7 to 22%). At the moderate dose the difference was reduced to +4 +/- 5% (range 0-9%). The isotope elimination curves for ^2H and ^18O from serial urines collected from one of the subjects showed the expected diurnal variations but were otherwise quite smooth. The overestimate may be due to approximations in the corrections for isotope fractionation and isotope dilution. An alternative approach to the corrections is presented that reduces the overestimate to 1%.
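The two-point method above can be sketched in a few lines. This is a minimal illustration assuming simple mono-exponential isotope elimination between the two urine samples; the function names, the flat `fractionation` placeholder and all numeric values are hypothetical, not the study's actual correction scheme or data.

```python
import math

def elimination_rate(e_initial, e_final, days):
    """Two-point elimination rate constant (1/day), assuming
    mono-exponential decay of isotope enrichment."""
    return math.log(e_initial / e_final) / days

def co2_production(n_water_mol, k_o18, k_h2, fractionation=1.0):
    """Rough CO2 production rate (mol/day).

    18O leaves the body in both water and CO2, while 2H leaves in water
    only, so the difference of the two rate constants isolates CO2; the
    divisor 2 reflects the two oxygen atoms exchanged per CO2 molecule.
    `fractionation` is a stand-in for the isotope-fractionation and
    isotope-dilution corrections discussed in the abstract.
    """
    return fractionation * n_water_mol * (k_o18 - k_h2) / 2.0

# Illustrative numbers only (not from the study):
k_o = elimination_rate(2.0e-2, 1.2e-2, 4.0)   # 18O enrichment decay over 4 days
k_h = elimination_rate(5.0e-3, 3.5e-3, 4.0)   # 2H enrichment decay over 4 days
r_co2 = co2_production(2200.0, k_o, k_h)      # ~2200 mol total body water
```

Because 18O is always eliminated faster than 2H (it has the extra CO2 pathway), the rate difference is positive and the estimate scales directly with total body water, which is why the saliva-based dilution measurement matters.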
Abstract:
Breast cancer is one of the most common cancers, affecting one in eight women during their lives. Survival rates have increased steadily thanks to early diagnosis with mammography screening and more efficient treatment strategies. Post-operative radiation therapy is a standard of care in the management of breast cancer and has been shown to efficiently reduce both the local recurrence rate and breast cancer mortality. Radiation therapy is, however, associated with some late effects for long-term survivors. Radiation-induced secondary cancer is a relatively rare but severe late effect of radiation therapy. Currently, radiotherapy plans are essentially optimized to maximize tumor control and minimize late deterministic effects (tissue reactions), which are mainly associated with high doses (> 1 Gy). With improved cure rates and new radiation therapy technologies, it is also important to evaluate and minimize secondary cancer risks for different treatment techniques. This is a particularly challenging task due to the large uncertainties in the dose-response relationship. In contrast with late deterministic effects, secondary cancers may be associated with much lower doses, and therefore out-of-field doses (also called peripheral doses), which are typically below 1 Gy, need to be determined accurately. Out-of-field doses result from patient scatter and head scatter from the treatment unit. These doses are particularly challenging to compute, and we characterized them by Monte Carlo (MC) calculation. A detailed MC model of the Siemens Primus linear accelerator has been thoroughly validated with measurements. We investigated the accuracy of such a model for retrospective dosimetry in epidemiological studies on secondary cancers. Considering that patients in such large studies could be treated on a variety of machines, we assessed the uncertainty in reconstructed peripheral dose due to the variability of peripheral dose among various linac geometries.
For large open fields (> 10x10 cm2), the uncertainty would be less than 50%, but for small fields and wedged fields the uncertainty in reconstructed dose could rise up to a factor of 10. It was concluded that such a model could be used for conventional treatments using large open fields only. The MC model of the Siemens Primus linac was then used to compare out-of-field doses for different treatment techniques in a female whole-body CT-based phantom. Current techniques such as conformal wedge-based radiotherapy and hybrid IMRT were investigated and compared to older two-dimensional radiotherapy techniques. MC doses were also compared to those of a commercial Treatment Planning System (TPS). While the TPS is routinely used to determine the dose to the contralateral breast and the ipsilateral lung, which are mostly out of the treatment fields, we have shown that these doses may be highly inaccurate depending on the treatment technique investigated. MC shows that hybrid IMRT is dosimetrically similar to three-dimensional wedge-based radiotherapy within the field, but offers substantially reduced doses to out-of-field healthy organs. Finally, many different approaches to risk estimation extracted from the literature were applied to the calculated MC dose distribution. Absolute risks varied substantially, as did the ratio of risk between two treatment techniques, reflecting the large uncertainties involved with current risk models. Despite all these uncertainties, the hybrid IMRT investigated resulted in systematically lower cancer risks than any of the other treatment techniques. More epidemiological studies with accurate dosimetry are required in the future to construct robust risk models. In the meantime, any treatment strategy that reduces out-of-field doses to healthy organs should be investigated. Electron radiotherapy might offer interesting possibilities in this regard.
Abstract:
Recognition by the T-cell receptor (TCR) of immunogenic peptides (p) presented by Class I major histocompatibility complexes (MHC) is the key event in the immune response against virus-infected cells or tumor cells. A study of the 2C TCR/SIYR/H-2K(b) system using a computational alanine scanning and a much faster binding free energy decomposition based on the Molecular Mechanics-Generalized Born Surface Area (MM-GBSA) method is presented. The results show that the TCR-p-MHC binding free energy decomposition using this approach, including entropic terms, provides a detailed and reliable description of the interactions between the molecules at an atomistic level. Comparison of the decomposition results with experimentally determined activity differences for alanine mutants yields a correlation of 0.67 when the entropy is neglected and 0.72 when the entropy is taken into account. Similarly, comparison of experimental activities with variations in binding free energies determined by computational alanine scanning yields correlations of 0.72 and 0.74 when the entropy is neglected or taken into account, respectively. Some key interactions for TCR-p-MHC binding are analyzed, and possible side-chain replacements are proposed in the context of TCR protein engineering. In addition, a comparison of the two theoretical approaches for estimating the role of each side chain in the complexation is given, and a new ad hoc approach to decompose the vibrational entropy term into atomic contributions, the linear decomposition of the vibrational entropy (LDVE), is introduced. The latter allows the rapid calculation of the entropic contribution to binding of side chains of interest. This new method is based on the idea that the most important contributions to the vibrational entropy of a molecule originate from residues that contribute most to the vibrational amplitude of the normal modes.
The LDVE approach is shown to provide results very similar to those of the exact but highly computationally demanding method.
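The correlations quoted above (0.67-0.74) compare per-mutant computed free-energy changes against experimental activity differences. A minimal sketch of that bookkeeping follows; the per-mutant numbers are hypothetical placeholders, not data from the study, and only the Pearson-correlation arithmetic is meant to be illustrative.

```python
def pearson_r(xs, ys):
    """Plain Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Hypothetical per-alanine-mutant values: computed binding free energy
# changes (kcal/mol) vs. experimentally measured activity differences.
ddg_computed = [0.2, 1.5, 0.8, 2.1, 0.1, 1.0]
activity_exp = [0.1, 1.2, 1.0, 1.8, 0.3, 0.7]
r = pearson_r(ddg_computed, activity_exp)
```

In the study, adding the entropic term shifted such a correlation from 0.67 to 0.72 (decomposition) and from 0.72 to 0.74 (alanine scanning), which is the motivation for the cheaper LDVE entropy estimate.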
Abstract:
Genetic counselling provides families with accurate medical information and psychological support. Respect and concern for emotional well-being should be taken into account while discussing genetic aspects and recurrence risks. The importance of autonomy and confidentiality, central to genetic counselling, is reinforced by the new Swiss law (LAGH). In many countries, most genetic counselling is provided by genetic counsellors who have specialised post-graduate training. Genetic counselling plays an increasing role in different medical specialities. In particular, it is essential in the context of prenatal and pre-conceptual care, where couples are confronted with complex information and should have access to appropriate support during the decision-making process.
Abstract:
The quantity of interest for high-energy photon beam therapy recommended by most dosimetric protocols is the absorbed dose to water. Thus, ionization chambers are calibrated in absorbed dose to water, which is the same quantity as that calculated by most treatment planning systems (TPS). However, when measurements are performed in a low-density medium, the presence of the ionization chamber generates a perturbation at the level of the secondary particle range. Therefore, the measured quantity is close to the absorbed dose to a volume of water equivalent to the chamber volume. This quantity is not equivalent to the dose calculated by a TPS, which is the absorbed dose to an infinitesimally small volume of water. This phenomenon can lead to an overestimation of the absorbed dose measured with an ionization chamber of up to 40% in extreme cases. In this paper, we propose a method to calculate correction factors based on Monte Carlo simulations. These correction factors are obtained as the ratio of the absorbed dose to water in a low-density medium D(w,Q,V1)^low, averaged over a scoring volume V1 for a geometry where V1 is filled with the low-density medium, to the absorbed dose to water D(w,Q,V2)^low, averaged over a volume V2 for a geometry where V2 is filled with water. In the Monte Carlo simulations, D(w,Q,V2)^low is obtained by replacing the volume of the ionization chamber by an equivalent volume of water, in accordance with the definition of the absorbed dose to water. The method is validated in two different configurations, which allowed us to study the behavior of this correction factor as a function of depth in phantom, photon beam energy, phantom density and field size.
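The correction factor described above reduces to a simple ratio of two Monte Carlo-scored mean doses. A minimal sketch, with purely hypothetical dose values (the real quantities come from detailed MC scoring of the two geometries):

```python
def correction_factor(d_w_v1_low, d_w_v2_water):
    """Correction factor k: mean absorbed dose to water scored in V1 with
    V1 filled with the low-density medium, divided by the mean dose scored
    in V2 with V2 filled with water (chamber volume replaced by water)."""
    return d_w_v1_low / d_w_v2_water

# Hypothetical scored doses (Gy) for a lung-like low-density medium:
k = correction_factor(0.72, 1.00)

# Multiplying a chamber reading by k corrects the overestimation
# (up to ~40% in the extreme cases mentioned in the abstract):
corrected_reading = 1.00 * k
```

Because the chamber-sized water volume sees more secondary-particle buildup than the surrounding low-density medium, k is below unity in these conditions, and its value depends on depth, beam energy, phantom density and field size as studied in the paper.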
Abstract:
This study proposes a theoretical model describing the electrostatically driven step of alpha 1b-adrenergic receptor (AR)-G protein recognition. The comparative analysis of the structural-dynamics features of functionally different receptor forms, i.e., the wild type (ground state) and its constitutively active mutants D142A and A293E, was instrumental in gaining insight into the receptor-G protein electrostatic and steric complementarity. Rigid-body docking simulations between the different forms of the alpha 1b-AR and the heterotrimeric G alpha q, G alpha s, G alpha i1, and G alpha t suggest that the cytosolic crevice shared by the active receptor forms, including the second and third intracellular loops as well as the cytosolic extensions of helices 5 and 6, represents the receptor surface with docking complementarity to the G protein. On the other hand, the solvent-exposed portions of the G protein that recognize the intracellular loops of the activated receptors are the N-terminal portion of alpha 3, alpha G, the alpha G/alpha 4 loop, alpha 4, the alpha 4/beta 6 loop, alpha 5, and the C-terminus. Docking simulations suggest that the two constitutively active mutants D142A and A293E recognize different G proteins with similar selectivity orders, i.e., G alpha q approximately equal to G alpha s > G alpha i > G alpha t. The theoretical models proposed herein might provide useful suggestions for new experiments aimed at exploring the receptor-G protein interface.
Abstract:
Empirical studies have recently pointed towards a socio-structural category largely overlooked in social inequality research: the dynamic positions of households adjacent to those of the poor, yet not representing the established, more prosperous positions in society. These results suggest that the population in this category fluctuates into and out of poverty more often than it moves into and out of secure prosperity. This category - still lacking theoretical conceptualization - is characterized by both precariousness and a certain degree of prosperity; despite a restricted and uncertain living standard, it holds a range of opportunities for action. We seek analytical elements to conceptualize 'precarious prosperity' for comparative empirical research by subjecting various concepts from social inequality research to critical scrutiny. We then operationally define 'precarious prosperity' to screen for this population in three countries. Based on qualitative interviews with households in precarious prosperity, we present first analyses of perceptions and household strategies that underline the relevance of the concept in different countries.
Abstract:
The specificities of multinational corporations (MNCs) have to date not been a focus area of IS research. Extant literature mostly proposes IS configurations for specific types of MNCs, following a static and prescriptive approach. Our research seeks to explain the dynamics of global IS design. It suggests a new theoretical lens for studying global IS design by applying the structural adjustment paradigm from organizational change theories. Relying on archetype theory, we conduct a longitudinal case study to theorize the dynamics of IS adaptation. We find that global IS design emerges as an organizational adaptation process to balance interpretative schemes (i.e. the organization's values and beliefs) and structural arrangements (i.e. strategic, organizational, and IS configurations). The resulting insights can be used as a basis to further explore alternative global IS designs and movements between them.
Abstract:
During the last decade, argumentation has attracted growing attention as a means to elicit processes (linguistic, logical, dialogical, psychological, etc.) that can sustain or provoke reasoning and learning. Constituting an important dimension of daily life and of professional activities, argumentation plays a special role in democracies and is at the heart of philosophical reasoning and scientific inquiry. Argumentation, as such, requires specific intellectual and social skills. Hence, argumentation will have an increasing importance in education, both because it is a critical competence that has to be learned and because argumentation can be used to foster learning in philosophy, history, science and many other domains. Argumentation and Education addresses these issues by providing both theoretical backgrounds, in psychology, education and the theory of argumentation, and concrete examples of experiments and results in school contexts in a range of domains. It reports on existing innovative practices in educational settings at various levels.