56 results for Riemann Sphere
Abstract:
Neuroblastoma (NB) is the most common extracranial malignant tumor in young children and can arise at any site of the sympathetic nervous system. The disease exhibits a remarkable phenotypic diversity, ranging from spontaneous regression to fatal disease. Poor outcome results from rapidly progressive, metastatic and drug-resistant disease. Recent studies have suggested that solid tumors may arise from a minor population of cancer stem cells (CSCs) bearing stem cell markers and typical properties such as self-renewal ability, asymmetric division and drug resistance. In this model, CSCs possess the exclusive ability to initiate and maintain the tumor, and to produce distant metastases. Tumor cell subpopulations with stem-like phenotypes have indeed been identified in several cancers, including leukemia and breast, brain and colon cancers. The CSC hypothesis still needs to be validated in other cancers, including NB. NB originates from neural crest-derived malignant sympatho-adrenal cells. We have identified rare cells that express markers consistent with neural crest stem cells and their derived lineages within primary NB tissue and cell lines, leading us to postulate the existence of CSCs in NB tumors. In the absence of specific markers to isolate CSCs, we adapted the sphere functional assay to NB tumor cells, based on the ability of stem cells to grow as spheres in non-adherent conditions. By serial passaging of spheres from bone marrow NB metastases, a subset of cells was gradually selected and its specific gene expression profile identified by micro-array time-course analysis. The genes differentially expressed in spheres are enriched in genes implicated in development, including CD133, ABC transporters, and WNT and NOTCH genes, identified in other solid cancers as CSC markers, as well as new markers, all referred to by us as the Neurosphere Expression Profile (NEP).
We confirmed the presence of a cell subpopulation expressing a combination of the NEP markers within a few primary NB samples. The tumorigenic potential of NB spheres was assayed by in vivo tumor growth analyses using orthotopic (adrenal gland) implantation of tumor cells into immune-compromised mice. Tumors derived from the sphere cells were significantly more frequent and were detected earlier compared to whole tumor cells. However, NB cells expressing the neurosphere-associated genes and isolated from the bulk tumors did not recapitulate the CSC-like phenotype in the orthotopic model. In addition, the NB sphere cells lost their higher tumorigenic potential when implanted in a subcutaneous heterotopic in vivo model. These results highlight the complex behavior of CSC functions and led us to consider the stem-like NB cells as a dynamic and heterogeneous cell population influenced by microenvironment signals. Our approach identified for the first time candidate genes that may be associated with NB self-renewal and tumorigenicity, and that would therefore constitute specific functional targets for more effective therapies in aggressive NB.
Abstract:
PURPOSE: The combination of embolic beads with a multitargeted tyrosine kinase inhibitor that inhibits tumor vessel growth is suggested as an alternative and improvement to the current standard doxorubicin-eluting beads for use in transarterial chemoembolization. This study demonstrates the in vitro loading and release kinetics of sunitinib using commercially available embolization microspheres and evaluates the in vitro biologic efficacy on cell cultures and the resulting in vivo pharmacokinetic profiles in an animal model. MATERIALS AND METHODS: DC Bead microspheres, 70-150 µm and 100-300 µm (Biocompatibles Ltd., Farnham, United Kingdom), were loaded by immersion in sunitinib solution. Drug release was measured in saline in a USP-approved flow-through apparatus and quantified by spectrophotometry. Activity after release was confirmed in cell culture. For pharmacokinetic and in vivo toxicity evaluation, New Zealand white rabbits received sunitinib either by intraarterial injection of 100-300 µm beads or per os. Plasma and liver tissue drug concentrations were assessed by liquid chromatography-tandem mass spectrometry. RESULTS: Sunitinib loading on beads was close to complete and homogeneous. A total release of 80% in saline was measured, with similar fast-release profiles for both sphere sizes. After embolization, drug plasma levels remained below the therapeutic threshold (< 50 ng/mL), but high concentrations at 6 hours (14.9 µg/g) and 24 hours (3.4 µg/g) were found in the liver tissue. CONCLUSIONS: DC Bead microspheres of two sizes were efficiently loaded with sunitinib and displayed a fast and almost complete release in saline. High liver drug concentrations and low systemic levels indicated the potential of sunitinib-eluting beads for use in embolization.
Abstract:
Drug-eluting microspheres are used for embolization of hypervascular tumors and allow for local controlled drug release. Although the drug release from the microspheres relies on fast ion-exchange, so far only slow-releasing in vitro dissolution methods have been correlated to in vivo data. Three in vitro release methods are assessed in this study for their potential to predict slow in vivo release of sunitinib from chemoembolization spheres to the plasma, and fast local in vivo release obtained in an earlier study in rabbits. Release in an orbital shaker was slow (t50%=4.5h, 84% release) compared to fast release in USP 4 flow-through implant cells (t50%=1h, 100% release). Sunitinib release in saline from microspheres enclosed in dialysis inserts was prolonged and incomplete (t50%=9 days, 68% release) due to low drug diffusion through the dialysis membrane. The slow-release profile fitted best to low sunitinib plasma AUC following injection of sunitinib-eluting spheres. Although limited by lack of standardization, release in the orbital shaker fitted best to local in vivo sunitinib concentrations. Drug release in USP flow-through implant cells was too fast to correlate with local concentrations, although this method is preferred to discriminate between different sphere types.
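The release profiles above are summarized by t50% and plateau values. As a sketch, assuming first-order release kinetics, F(t) = F_max(1 - exp(-k t)), and reading t50% as the time to release half of the total drug load (both are modeling assumptions; the study reports only the summary figures, and the function names below are illustrative), the rate constant and the half-release time interconvert as follows:

```python
import math

def fraction_released(t, k, f_max):
    """First-order release model: F(t) = f_max * (1 - exp(-k*t))."""
    return f_max * (1.0 - math.exp(-k * t))

def t50(k, f_max):
    """Time at which half of the *total* drug load has been released.
    Requires f_max > 0.5, otherwise 50% release is never reached."""
    if f_max <= 0.5:
        raise ValueError("release plateau below 50%: t50 undefined")
    return -math.log(1.0 - 0.5 / f_max) / k

def k_from_t50(t_half, f_max):
    """Invert t50() to recover a rate constant from a reported t50."""
    return -math.log(1.0 - 0.5 / f_max) / t_half

# Reported orbital-shaker profile: t50% = 4.5 h with an 84% plateau.
k_shaker = k_from_t50(4.5, 0.84)
```

Under these assumptions the recovered k reproduces the reported half-release time by construction; the value of such a model is to interpolate the full release curve between the reported summary points, not to replace the measured profiles.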
Abstract:
Cortical folding (gyrification) is determined during the first months of life, so that adverse events occurring during this period leave traces that remain identifiable at any age. As recently reviewed by Mangin and colleagues [2], several methods exist to quantify different characteristics of gyrification. For instance, sulcal morphometry can be used to measure shape descriptors such as depth, length or indices of inter-hemispheric asymmetry [3]. These geometrical properties have the advantage of being easy to interpret. However, sulcal morphometry relies tightly on the accurate identification of a given set of sulci and hence provides a fragmented description of gyrification. A more fine-grained quantification of gyrification can be achieved with curvature-based measurements, where smoothed absolute mean curvature is typically computed at thousands of points over the cortical surface [4]. Curvature is, however, not straightforward to interpret, as it remains unclear whether there is any direct relationship between curvedness and a biologically meaningful correlate such as cortical volume or surface area. To address the diverse issues raised by the measurement of cortical folding, we previously developed an algorithm that quantifies local gyrification with high spatial resolution and a simple interpretation. Our method is inspired by the Gyrification Index [5], a method originally used in comparative neuroanatomy to evaluate cortical folding differences across species. In our implementation, which we name the local Gyrification Index (lGI [1]), we measure the amount of cortex buried within the sulcal folds as compared with the amount of visible cortex in circular regions of interest. Given that the cortex grows primarily through radial expansion [6], our method was specifically designed to identify early defects of cortical development.
In this article, we detail the computation of the local Gyrification Index, which is now freely distributed as part of the FreeSurfer software (http://surfer.nmr.mgh.harvard.edu/, Martinos Center for Biomedical Imaging, Massachusetts General Hospital). FreeSurfer provides a set of automated tools for reconstructing the brain's cortical surface from structural MRI data. The cortical surface, extracted in the native space of the images with sub-millimeter accuracy, is then further used for the creation of an outer surface, which serves as a basis for the lGI calculation. A circular region of interest is then delineated on the outer surface, and its corresponding region of interest on the cortical surface is identified using a matching algorithm, as described in our validation study [1]. This process is iterated repeatedly with largely overlapping regions of interest, resulting in cortical maps of gyrification for subsequent statistical comparisons (Fig. 1). Of note, another measurement of local gyrification with a similar inspiration was proposed by Toro and colleagues [7], where the folding index at each point is computed as the ratio of the cortical area contained in a sphere divided by the area of a disc with the same radius. The two implementations differ in that the one by Toro et al. is based on Euclidean distances and thus considers discontinuous patches of cortical area, whereas ours uses a strict geodesic algorithm and includes only the continuous patch of cortical area opening at the brain surface in a circular region of interest.
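Both local gyrification measures contrasted above are area ratios. A minimal sketch of the ratio definitions follows (the geodesic ROI-matching algorithm itself, described in the validation study, is not reproduced here, and the function names are illustrative; mesh areas are computed per triangle in the usual way):

```python
import numpy as np

def triangle_areas(vertices, faces):
    """Area of each triangle of a surface mesh given as an (n_vertices, 3)
    coordinate array and an (n_faces, 3) integer index array."""
    a, b, c = (vertices[faces[:, i]] for i in range(3))
    return 0.5 * np.linalg.norm(np.cross(b - a, c - a), axis=1)

def local_gyrification_index(cortical_area_in_roi, outer_area_in_roi):
    """lGI: buried + visible cortical area within the matched ROI, divided
    by the area of the corresponding circular ROI on the outer surface.
    A flat, unfolded patch gives a value near 1; folding increases it."""
    return cortical_area_in_roi / outer_area_in_roi

def toro_folding_index(cortical_area_in_sphere, radius):
    """Toro et al.: cortical area contained in a Euclidean sphere, divided
    by the area of a disc with the same radius (pi * r**2). Unlike the
    geodesic lGI, this may include disconnected patches of cortex."""
    return cortical_area_in_sphere / (np.pi * radius ** 2)
```

The design difference noted in the text shows up directly here: `toro_folding_index` only needs a Euclidean distance cut, whereas `local_gyrification_index` presupposes the geodesic matching step that selects one continuous cortical patch.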
Abstract:
This essay focuses on how Spielberg's film engages with and contributes to the myth of Lincoln as a super-natural figure, a saint more than a hero or great statesman, while anchoring his moral authority in the sentimental rhetoric of the domestic sphere. It is this use of the melodramatic mode, linking the familial space with the national through the trope of the victim-hero, which is the essay's main concern. With Tony Kushner, author of Angels in America, as scriptwriter, it is perhaps not surprising that melodrama is the operative mode in the film. One of the issues that emerge from this analysis is how the film updates melodrama for a contemporary audience in order to minimize what could be perceived as manipulative sentimental devices, observing for most of the film an aesthetic of relative sobriety and realism. In the last hour, and especially the final minutes of the film, melodramatic conventions are deployed in full force and infused with hagiographic iconography to produce a series of emotionally charged moments that create a perfect union of American Civil Religion and classical melodrama. The cornerstone of both cultural paradigms, as deployed in this film, is death: Lincoln's at the hands of an assassin, and the Civil War soldiers', poignantly depicted at key moments of the film. Finally, the essay shows how film melodrama as a genre weaves together the private and the public, the domestic with the national, the familial with the military, and links pathos to politics in a carefully choreographed narrative of sentimentalized mythopoesis.
Abstract:
Inner-city neighborhoods, poor outskirts, and peri-urban spaces with no amenities usually suffer from combined social and environmental inequalities, such as poverty, unemployment, and exposure to noise and industrial hazards. The observed persistence of these inequalities over time points to an underlying trend, namely that access to proper living conditions is fundamentally unequal, thus eliciting the question of how such inequalities arise and how this trend can be reversed so as to build a more equitable city. Providing answers to such questions requires that the causal factors at play within the system of (re)production of urban inequalities be identified. Real-estate markets, "micromotives and macrobehavior", and public policies that bear on space are mostly involved. The latter are central in that they act on all the elements of the system. This thesis therefore focuses on the way public authorities shape the production of contemporary cities, by studying the public project ownership of major urban projects. The study of justice within the urban fabric also implies that the normative frames of reference of public action be questioned: what conception of justice should public action refer to? This thesis examines four perspectives (radical, substantialist, procedural, and integrative), each of which results in different principles of action.
This theoretical part is concluded by a hybrid methodology that draws on the sociology of organizations and public policy analysis, and that suggests the urban project may be understood as a play whose outcome hinges on the actors' acting. This methodology is applied to the empirical analysis of the public project ownership of an ongoing urban project in the Lyon first-ring suburbs: the Carré de Soie. Three main objectives are pursued: descriptive (reconstructing the scenario), analytical (assessing the nature of the play: fairy tale, tragedy or improvisation match), and prescriptive (drawing the moral of the story). The description of the public project ownership shows the successive deployment of four control strategies, whose implications for deadlines, project content (programs, morphologies), and public funding are significant. Building on the analysis, several recommendations can be made to allow the public sphere to control the process and ensure that the urban project produces equity (most notably, anticipation and articulation of planning and real-estate strategy, as well as provision and maintenance of equipment and public spaces, funding of quality housing for a wide range of populations, etc.). More generally, a gap can be highlighted between the territories that are strategic to the development of the agglomeration and the limited resources of the municipalities involved. This deficit calls for strengthening the investment abilities of the intermunicipal structure. By itself, real-estate market logic brings about social polarization and urban inequalities. Building an equitable city requires a strong will on the part of public authorities, a will that must be reflected both in the stated ambition (setting priorities of urban development equitably) and in its implementation (managing urban public projects fairly).
Abstract:
Money has been studied by heterodox economists, sociologists and historians, who have stressed its relationship to collective order; however, it has rarely been analysed from the viewpoint of its relationship to citizenship. We propose a theoretical account of four types of functions (political, symbolic, socioeconomic and psychoaffective) enabling money to operate as a mediation of citizenship. From a perspective that combines the contributions of international political economy and the regulation school, we show that this mediation mobilises not only national sociopolitical mechanisms, but also international mechanisms which feed back on the domestic sphere of states and affect their capacity to define their regime of citizenship. This relationship is analysed in the context of the institutionalisation of the Bretton Woods international monetary system (1944) and the development of financial globalization since the 1970s. While money served to protect the social rights of citizens against external financial pressures after the Second World War, today it contributes to the opening of the domestic sphere of states to transnational capital flows and to the creation of a political and legal order favorable to the rights of investors. This dynamic is driven by the rise of new financial intermediaries (in particular rating agencies and institutional investors) and the simultaneous emergence of a new form of state legitimized by a neoliberal political discourse emphasizing the quest for competitiveness, reduced social protection and individual responsibilization. It results in the privatization of pension systems and the development of financial education policies that encourage citizens to behave as active and responsible "risk takers", ensuring their own economic security by investing their retirement savings on financial markets. However, we emphasize the institutional, cognitive and socioeconomic difficulties that make this transformation of citizenship contradictory and problematic.
Abstract:
PURPOSE: In the radiopharmaceutical therapy approach to the fight against cancer, in particular when it comes to translating laboratory results to the clinical setting, modeling has served as an invaluable tool for guidance and for understanding the processes operating at the cellular level and how these relate to macroscopic observables. Tumor control probability (TCP) is the dosimetric end point quantity of choice which relates to experimental and clinical data: it requires knowledge of individual cellular absorbed doses, since it depends on the assessment of the treatment's ability to kill each and every cell. Macroscopic tumors, seen in both clinical and experimental studies, contain too many cells to be modeled individually in Monte Carlo simulation; yet, in particular for low ratios of decays to cells, a cell-based model that does not smooth away the statistical considerations associated with low activity is a necessity. The authors present here an adaptation of the simple sphere-based model from which cellular-level dosimetry for macroscopic tumors and their end point quantities, such as TCP, may be extrapolated more reliably. METHODS: Ten homogeneous spheres representing tumors of different sizes were constructed in GEANT4. The radionuclide 131I was randomly allowed to decay for each model size and for seven different ratios of number of decays to number of cells, N_r: 1000, 500, 200, 100, 50, 20, and 10 decays per cell. The deposited energy was collected in radial bins and divided by the bin mass to obtain the average bin absorbed dose. To simulate a cellular model, the number of cells present in each bin was calculated and each cell was attributed an absorbed dose equal to the bin average absorbed dose with a randomly determined adjustment based on a Gaussian probability distribution whose width equals the statistical uncertainty consistent with the ratio of decays to cells, i.e., N_r^(-1/2).
From dose volume histograms, the surviving fraction of cells, the equivalent uniform dose (EUD), and the TCP for the different scenarios were calculated. Comparably sized spherical models containing individual spherical cells (15 µm diameter) in hexagonal lattices were constructed, and Monte Carlo simulations were executed for all the same scenarios. The dosimetric quantities were calculated and compared to the adjusted simple sphere model results. The model was then applied to the Bortezomib-induced enzyme-targeted radiotherapy (BETR) strategy of targeting Epstein-Barr virus (EBV)-expressing cancers. RESULTS: The TCP values agreed to within 2% between the adjusted simple sphere and full cellular models. Additionally, models were generated for a nonuniform distribution of activity, and the adjusted spherical and cellular models showed similar agreement. The TCP values from the macroscopic tumor models were consistent with the experimental observations for BETR-treated 1 g EBV-expressing lymphoma tumors in mice. CONCLUSIONS: The adjusted spherical model presented here provides more accurate TCP values than simple spheres, on par with full cellular Monte Carlo simulations, while maintaining the simplicity of the simple sphere model. This model provides a basis for complementing and understanding laboratory and clinical results pertaining to radiopharmaceutical therapy.
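The per-cell dose adjustment described above, and a TCP computed from the resulting doses, can be sketched as follows. This is not the authors' GEANT4 pipeline: the radial bin doses and cell counts are taken as given, the linear cell-kill parameter alpha is a hypothetical placeholder, and TCP is computed in the standard Poisson form (control requires that every cell be killed):

```python
import numpy as np

rng = np.random.default_rng(0)

def per_cell_doses(bin_doses, cells_per_bin, n_r):
    """Assign each cell the average absorbed dose of its radial bin plus a
    Gaussian adjustment whose relative width is n_r**-0.5, the statistical
    uncertainty for n_r decays per cell (the adjusted-sphere idea)."""
    sigma_rel = n_r ** -0.5
    doses = [d_bin * (1.0 + sigma_rel * rng.standard_normal(n_cells))
             for d_bin, n_cells in zip(bin_doses, cells_per_bin)]
    # Absorbed dose cannot be negative; clip the rare low-tail samples.
    return np.clip(np.concatenate(doses), 0.0, None)

def tcp(cell_doses, alpha=0.3):
    """Poisson TCP. alpha (per Gy) is a hypothetical linear cell-kill
    sensitivity, not a value from the study."""
    surviving = np.exp(-alpha * cell_doses)  # expected survivors per cell
    return float(np.exp(-surviving.sum()))
```

The sketch reproduces the key point of the adjusted model: at low N_r the Gaussian width N_r^(-1/2) is large, so the per-cell dose spread (and hence the TCP) departs from what the smooth bin-average doses alone would predict.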
Abstract:
In recent years, protein-ligand docking has become a powerful tool for drug development. Although several approaches suitable for high-throughput screening are available, there is a need for methods able to identify binding modes with high accuracy. This accuracy is essential to reliably compute the binding free energy of the ligand. Such methods are needed when the binding mode of lead compounds is not determined experimentally but is needed for structure-based lead optimization. We present here a new docking software, called EADock, that aims at this goal. It uses a hybrid evolutionary algorithm with two fitness functions, in combination with a sophisticated management of diversity. EADock is interfaced with the CHARMM package for energy calculations and coordinate handling. A validation was carried out on 37 crystallized protein-ligand complexes featuring 11 different proteins. The search space was defined as a sphere of 15 Å around the center of mass of the ligand position in the crystal structure, and, contrary to other benchmarks, our algorithm was fed with optimized ligand positions up to 10 Å root mean square deviation (RMSD) from the crystal structure, excluding the latter. This validation illustrates the efficiency of our sampling strategy, as correct binding modes, defined by an RMSD to the crystal structure lower than 2 Å, were identified and ranked first for 68% of the complexes. The success rate increases to 78% when considering the five best-ranked clusters, and to 92% when all clusters present in the last generation are taken into account. Most failures could be explained by the presence of crystal contacts in the experimental structure. Finally, the ability of EADock to accurately predict binding modes in a real application was illustrated by the successful docking of the RGD cyclic pentapeptide on the alphaVbeta3 integrin, starting far away from the binding pocket.
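The 2 Å success criterion used in this benchmark is a plain RMSD threshold between a predicted pose and the crystal ligand. A minimal sketch, assuming poses are given as coordinate arrays with matching atom order in the shared receptor frame (so no superposition is applied, the usual convention for docking benchmarks; function names are illustrative):

```python
import numpy as np

def rmsd(pose_a, pose_b):
    """Root-mean-square deviation between two ligand poses given as
    (n_atoms, 3) coordinate arrays with identical atom ordering.
    No fitting: docking poses and the crystal ligand already share
    the receptor coordinate frame."""
    diff = np.asarray(pose_a) - np.asarray(pose_b)
    return float(np.sqrt((diff ** 2).sum(axis=1).mean()))

def success_rate(predicted_poses, crystal_pose, cutoff=2.0):
    """Fraction of poses counted as correct binding modes under the
    RMSD cutoff (2 A in the benchmark above)."""
    hits = [rmsd(p, crystal_pose) < cutoff for p in predicted_poses]
    return sum(hits) / len(hits)
```

Note that symmetric ligands would require a symmetry-aware RMSD (trying equivalent atom orderings and keeping the minimum), a refinement omitted here for brevity.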
Abstract:
We present here a nonbiased probabilistic method that allows us to consistently analyze knottedness of linear random walks with up to several hundred noncorrelated steps. The method consists of analyzing the spectrum of knots formed by multiple closures of the same open walk through random points on a sphere enclosing the walk. Knottedness of individual "frozen" configurations of linear chains is therefore defined by a characteristic spectrum of realizable knots. We show that in the great majority of cases this method clearly defines the dominant knot type of a walk, i.e., the strongest component of the spectrum. In such cases, direct end-to-end closure creates a knot that usually coincides with the knot type that dominates the random closure spectrum. Interestingly, in a very small proportion of linear random walks, the knot type is not clearly defined. Such walks can be considered as residing in a border zone of the configuration space of two or more knot types. We also characterize the scaling behavior of linear random knots.
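The random-closure step described above, joining both ends of the open walk to the same uniformly sampled point on a sphere enclosing it, can be sketched as follows. Identifying the knot type of each closed polygon would additionally require computing a knot invariant (such as the Alexander polynomial), which is not shown; repeating the closure many times and tallying knot types yields the spectrum. The unit-step walk construction and the sphere margin factor are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(1)

def random_walk(n_steps):
    """Linear random walk with unit-length, uncorrelated steps,
    returned as an (n_steps + 1, 3) array of vertices from the origin."""
    steps = rng.standard_normal((n_steps, 3))
    steps /= np.linalg.norm(steps, axis=1, keepdims=True)
    return np.vstack([np.zeros(3), np.cumsum(steps, axis=0)])

def random_closure(walk, margin=2.0):
    """Close an open walk through a uniform random point on a sphere
    enclosing it: both ends are joined to the same surface point.
    Returns the closed polygon as a vertex list (last edge implicit,
    from the closure point back to the first vertex)."""
    center = walk.mean(axis=0)
    radius = margin * np.linalg.norm(walk - center, axis=1).max()
    direction = rng.standard_normal(3)          # isotropic direction
    point = center + radius * direction / np.linalg.norm(direction)
    return np.vstack([walk, point])
```

Sampling the closure direction from a spherical Gaussian and normalizing gives the uniform distribution on the sphere, which is what makes the closure spectrum unbiased with respect to closure direction.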
Abstract:
INTRODUCTION: Inhalation injury is an important determinant of outcome in patients with major burns. However, the diagnostic criteria remain imprecise, preventing objective comparisons of published data. The aims were to evaluate the utility of an inhalation score based on mucosal injury, assessing separately the oro-pharyngeal sphere (ENT) and the tracheobronchial tree (TB) in patients admitted to the ICU with suspected inhalation injury. METHODS: Prospective observational study in 100 patients admitted with suspected inhalation injury among 168 consecutive burn admissions to the ICU of a university hospital. Inclusion criterion: endoscopic airway assessment during the first hours. ENT/TB lesions were graded as follows: grade 1, oedema, hyperemia, hypersecretion; grade 2, bullous mucosal detachment, erosion, exudates; grade 3, profound ulcers, necrosis. RESULTS: Of the 100 patients (age 42±17 years, burns 23±19% BSA), 79 presented an ENT inhalation injury ≥ ENT1 (soot present in 24%); 36 had a tracheobronchial extension, 33 with a grade ≥ TB1. Regarding burned vibrissae, 10 patients without them nevertheless suffered ENT injury, while 6 patients with them had no further lesions. Length of mechanical ventilation was strongly associated with the fluid resuscitation volume of the first 24 hours (p<0.0001) and the presence of inhalation injury (p=0.03), while ICU length of stay was correlated with the %BSA. Soot was associated with prolonged mechanical ventilation (p=0.0115). There was no extubation failure. CONCLUSIONS: The developed inhalation score was simple to use, providing a unified language and drawing attention to upper airway involvement. Burned vibrissae and a suspicious history proved to be insufficient diagnostic criteria. Further studies are required to validate the score in a larger population.
Resumo:
3 Summary 3.1 English The pharmaceutical industry has faced several challenges in recent years, and the optimization of its drug discovery pipeline is widely believed to be the only viable solution. High-throughput techniques contribute actively to this optimization, especially when complemented by computational approaches that rationalize the enormous amount of information they can produce. In silico techniques, such as virtual screening or rational drug design, are now routinely used to guide drug discovery. Both rely heavily on the prediction of the molecular interaction (docking) between drug-like molecules and a therapeutically relevant target. Several software packages are available for this purpose, but despite the very promising picture drawn in most benchmarks, they still harbor several hidden weaknesses. As pointed out in several recent reviews, the docking problem is far from solved, and there is now a need for methods able to identify binding modes with high accuracy, which is essential to reliably compute the binding free energy of the ligand. This quantity is directly linked to the ligand's affinity and can be related to its biological activity. Accurate docking algorithms are thus critical for both the discovery and the rational optimization of new drugs. In this thesis, a new docking program aiming at this goal, EADock, is presented. It uses a hybrid evolutionary algorithm with two fitness functions, combined with sophisticated diversity management. EADock is interfaced with the CHARMM package for energy calculations and coordinate handling. A validation was carried out on 37 crystallized protein-ligand complexes featuring 11 different proteins.
The search space was defined as a sphere of 15 Å radius around the center of mass of the ligand position in the crystal structure, and, in contrast to other benchmarks, our algorithm was fed optimized ligand positions up to 10 Å root-mean-square deviation (RMSD) from the crystal structure. This validation illustrates the efficiency of our sampling heuristic: correct binding modes, defined by an RMSD to the crystal structure lower than 2 Å, were identified and ranked first for 68% of the complexes. The success rate increases to 78% when the five best-ranked clusters are considered, and to 92% when all clusters present in the last generation are taken into account. Most failures in this benchmark could be explained by the presence of crystal contacts in the experimental structure. EADock has been used to understand the molecular interactions involved in the regulation of the Na,K-ATPase and in the activation of the nuclear hormone receptor peroxisome proliferator-activated receptor α (PPARα). It also helped elucidate the action of common pollutants (phthalates) on PPARγ, and the impact of biotransformations of the anticancer drug Imatinib (Gleevec®) on its binding mode to the Bcr-Abl tyrosine kinase. Finally, a fragment-based rational drug design approach using EADock was developed, leading to the successful design of new peptidic ligands for the α5β1 integrin and for human PPARα. In both cases, the designed peptides showed activities comparable to those of well-established ligands such as the anticancer drug Cilengitide and Wy14,643, respectively. 3.2 French The recent difficulties of the pharmaceutical industry seem resolvable only through optimization of its drug development process. This increasingly relies on so-called high-throughput techniques, which are particularly effective when coupled with computational tools capable of managing the mass of data they produce.
In silico approaches such as virtual screening or the rational design of new molecules are now in routine use. Both rest on the ability to predict the details of the molecular interaction between a drug-like molecule and a target protein of therapeutic interest. Benchmarks of the software tackling this prediction are flattering, but several problems remain. The recent literature tends to question their reliability, asserting an emerging need for more accurate approaches to the binding mode. This accuracy is essential to the calculation of the binding free energy, which is directly linked to the affinity of the candidate drug for the target protein and indirectly linked to its biological activity. An accurate prediction is therefore of particular importance for the discovery and optimization of new active molecules. This thesis presents a new program, EADock, built around such accuracy. This hybrid evolutionary algorithm uses two selection pressures, combined with sophisticated diversity management. EADock relies on CHARMM for energy calculations and the handling of atomic coordinates. Its validation was carried out on 37 crystallized protein-ligand complexes involving 11 different proteins. The search space was extended to a sphere of 15 Å radius around the center of mass of the crystallized ligand, and, unlike the usual benchmarks, the algorithm started from optimized solutions with an RMSD of up to 10 Å from the crystal structure. This validation demonstrated the efficiency of our search heuristic, since binding modes with an RMSD below 2 Å from the crystal structure were ranked first for 68% of the complexes.
When the five best solutions are taken into account, the success rate climbs to 78%, and to 92% when the entire last generation is considered. Most prediction errors are attributable to the presence of crystal contacts. EADock has since been used to understand the molecular mechanisms involved in the regulation of the Na,K-ATPase and in the activation of the peroxisome proliferator-activated receptor α (PPARα). It also made it possible to describe the interaction of commonly encountered pollutants with PPARγ, as well as the influence of the metabolization of Imatinib (an anticancer drug) on its binding to the Bcr-Abl kinase. An approach based on predicting the interactions of molecular fragments with a target protein is also proposed. It led to the discovery of new peptidic ligands for PPARα and for the α5β1 integrin. In both cases, the activity of these new peptides is comparable to that of well-established ligands, such as Wy14,643 for the former and Cilengitide (an anticancer drug) for the latter.
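The pose-accuracy criterion used throughout this validation (a binding mode counts as correct when its RMSD to the crystal structure is below 2 Å) reduces to a short computation. A minimal sketch, assuming both poses list atoms in the same order and already share the receptor's frame of reference (so no superposition is applied):

```python
import numpy as np

def rmsd(pose_a, pose_b):
    """Root-mean-square deviation between two (n_atoms, 3) coordinate arrays,
    in the coordinates' units (here Å)."""
    diff = np.asarray(pose_a) - np.asarray(pose_b)
    return float(np.sqrt((diff ** 2).sum(axis=1).mean()))

crystal = np.zeros((5, 3))                    # toy 5-atom reference pose
pose = crystal + np.array([1.0, 0.0, 0.0])    # rigid 1 Å shift
rmsd(pose, crystal)  # 1.0 -> would count as a correct binding mode (< 2 Å)
```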