16 results for Macroscopic Fundamental Diagram
at Université de Lausanne, Switzerland
Abstract:
The present study was aimed at examining the role of nitric oxide (NO) in the hypoxic contraction of isolated small pulmonary arteries (SPA) in the rat. Animals were treated with either saline (sham experiments) or Escherichia coli lipopolysaccharide [LPS, to obtain expression of the inducible NO synthase (iNOS) in the lung] and killed 4 h later. SPA (300- to 600-micrometer outer diameter) were mounted as rings in organ chambers for the recording of isometric tension, precontracted with PGF2alpha, and exposed to either severe (bath PO2 8 +/- 3 mmHg) or milder (21 +/- 3 mmHg) hypoxia. In SPA from sham-treated rats, contractions elicited by severe hypoxia were completely suppressed by either endothelium removal or preincubation with an NOS inhibitor [NG-nitro-L-arginine methyl ester (L-NAME), 10^-3 M]. In SPA from LPS-treated rats, contractions elicited by severe hypoxia occurred irrespective of the presence or absence of endothelium and were largely suppressed by L-NAME. The milder hypoxia elicited no increase in vascular tone. These results indicate an essential role of NO in the hypoxic contractions of precontracted rat SPA. The endothelium independence of hypoxic pulmonary vasoconstriction (HPV) in arteries from LPS-treated animals appears related to the extraendothelial expression of iNOS. The severe degree of hypoxia required to elicit any contraction is consistent with a mechanism of reduced NO production caused by a limited availability of O2 as a substrate for NOS.
Abstract:
Much progress has been made over the past decades in the development of in vitro techniques for the assessment of chemically induced effects in embryonic and fetal development. In vitro assays have originally been developed to provide information on the mechanism of action of normal development, and have hence more adequately been used in fundamental research. These assays had to undergo extensive modification to be used in developmental toxicity testing. The present paper focuses on the rat whole embryo culture system, but also reviews modifications that were undertaken for the in vitro chick embryo system and the aggregate cultures of fetal rat brain cells. Today these tests cannot replace the existing in vivo developmental toxicity tests. They can, however, be used to screen chemicals for further development or further testing. In addition, these in vitro tests provide valuable information on the mechanisms of developmental toxicity and help to understand the relevancy of findings for humans. In vitro systems, combined with selected in vivo testing and pharmacokinetic investigations in animals and humans, can thus provide essential information for human risk assessment.
Abstract:
PURPOSE: In the radiopharmaceutical therapy approach to the fight against cancer, in particular when it comes to translating laboratory results to the clinical setting, modeling has served as an invaluable tool for guidance and for understanding the processes operating at the cellular level and how these relate to macroscopic observables. Tumor control probability (TCP) is the dosimetric end point quantity of choice which relates to experimental and clinical data: it requires knowledge of individual cellular absorbed doses since it depends on the assessment of the treatment's ability to kill each and every cell. Macroscopic tumors, seen in both clinical and experimental studies, contain too many cells to be modeled individually in Monte Carlo simulation; yet, in particular for low ratios of decays to cells, a cell-based model that does not smooth away statistical considerations associated with low activity is a necessity. The authors present here an adaptation of the simple sphere-based model from which cellular level dosimetry for macroscopic tumors and their end point quantities, such as TCP, may be extrapolated more reliably. METHODS: Ten homogeneous spheres representing tumors of different sizes were constructed in GEANT4. The radionuclide 131I was randomly allowed to decay for each model size and for seven different ratios of number of decays to number of cells, N(r): 1000, 500, 200, 100, 50, 20, and 10 decays per cell. The deposited energy was collected in radial bins and divided by the bin mass to obtain the average bin absorbed dose. To simulate a cellular model, the number of cells present in each bin was calculated and an absorbed dose attributed to each cell equal to the bin average absorbed dose with a randomly determined adjustment based on a Gaussian probability distribution with a width equal to the statistical uncertainty consistent with the ratio of decays to cells, i.e., equal to N(r)^(-1/2).
From dose volume histograms the surviving fraction of cells, equivalent uniform dose (EUD), and TCP for the different scenarios were calculated. Comparably sized spherical models containing individual spherical cells (15-micrometer diameter) in hexagonal lattices were constructed, and Monte Carlo simulations were executed for all the same previous scenarios. The dosimetric quantities were calculated and compared to the adjusted simple sphere model results. The model was then applied to the Bortezomib-induced enzyme-targeted radiotherapy (BETR) strategy of targeting Epstein-Barr virus (EBV)-expressing cancers. RESULTS: The TCP values were comparable to within 2% between the adjusted simple sphere and full cellular models. Additionally, models were generated for a nonuniform distribution of activity, and the adjusted spherical and cellular models agreed comparably well in this case too. The TCP values predicted for macroscopic tumors were consistent with experimental observations for BETR-treated 1 g EBV-expressing lymphoma tumors in mice. CONCLUSIONS: The adjusted spherical model presented here provides more accurate TCP values than simple spheres, on par with full cellular Monte Carlo simulations while maintaining the simplicity of the simple sphere model. This model provides a basis for complementing and understanding laboratory and clinical results pertaining to radiopharmaceutical therapy.
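The per-cell dose assignment described in this abstract (bin average dose plus a Gaussian adjustment of relative width N(r)^(-1/2)) can be sketched as follows. This is a minimal illustration, not the authors' code: the bin energies, masses, cell counts, and the linear radiosensitivity parameter alpha are invented for demonstration, and the Poisson TCP formula is a standard choice rather than one stated in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)
MEV_TO_J = 1.602e-13  # conversion factor, MeV -> joule

def cell_doses_from_bins(bin_energy_mev, bin_mass_kg, cells_per_bin, decays_per_cell):
    """Assign each cell the average absorbed dose of its radial bin,
    jittered by a Gaussian whose relative width is decays_per_cell**-0.5."""
    mean_dose_gy = bin_energy_mev * MEV_TO_J / bin_mass_kg  # Gy, one value per bin
    sigma_rel = decays_per_cell ** -0.5
    per_cell = [d * (1.0 + sigma_rel * rng.standard_normal(n))
                for d, n in zip(mean_dose_gy, cells_per_bin)]
    return np.concatenate(per_cell)

def tcp_poisson(doses_gy, alpha=0.3):
    """Poisson TCP from per-cell survival probabilities S_i = exp(-alpha * D_i)."""
    expected_survivors = np.exp(-alpha * doses_gy).sum()
    return float(np.exp(-expected_survivors))

# Illustrative scenario: two radial bins, 100 decays per cell (sigma_rel = 0.1).
doses = cell_doses_from_bins(np.array([5e8, 1e9]),     # deposited energy per bin (MeV)
                             np.array([1e-6, 2e-6]),   # bin mass (kg)
                             np.array([100, 200]),     # cells per bin
                             decays_per_cell=100)
tcp = tcp_poisson(doses)
```

A dose-volume histogram, surviving fraction, or EUD could then be derived from the same `doses` array, which is what makes the adjusted sphere comparable to a full cellular simulation.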
Abstract:
With the rising awareness of the importance of the Earth heritage, geomorphosites receive increasing attention from the scientific community. Assessment methods, classification and conservation strategies have been developed to safeguard the geomorphological heritage for present and future generations. On the other hand, Earth heritage offers opportunities to develop educational and recreational programs as well as tourism projects. Various interpretive supports and local development projects have been created in the past few years to promote geoheritage.

Be it for the assessment, conservation or promotion of geomorphosites, maps are valuable from many standpoints. They can provide fundamental data for detailed geomorphosite description, serve as visual communication tools helping to guide the selection process in defining protection priorities, or support Earth heritage promotion and interpretation.

This study reviews the main achievements and the objectives yet to be accomplished in the field of geomorphosite mapping and proposes a general framework for the mapping of geomorphosites that takes into account the different aims and audiences. The main focus is on mapping geomorphosites for non-specialists in the field of Earth heritage promotion (geotourism). In this context, maps are often employed to show itineraries or points of interest. Like a scheme or a diagram, a map can also be used as a method for visualising geoscientific information. This function is particularly important since some processes which contributed to the formation of a geomorphosite or a geomorphological landscape are no longer, or not always clearly, visible in the landscape. In this case, maps become interpretive media that serve popularisation purposes.

Mapping for non-specialists faces the challenging task of ensuring the information transfer between the cartographer and the user. We therefore focus on both the implementation of the map by the cartographer (which information? which visualisation?) and the interpretation of the map by the user (effectiveness of the knowledge transfer). The research is based on empirical studies carried out in the Maderan valley (Canton of Uri) and in classes of the Cantons of Uri and Tessin that aim to gain knowledge about the familiarity and interests of non-specialists for geoheritage as well as about their map reading skills. The final objective is to formulate methodological proposals for geomorphosite mapping for interpretive purposes.
Abstract:
Wave-induced fluid flow at microscopic and mesoscopic scales arguably constitutes the major cause of intrinsic seismic attenuation throughout the exploration seismic and sonic frequency ranges. The quantitative analysis of these phenomena is, however, complicated by the fact that the governing physical processes may be interdependent. The reason for this is that the presence of microscopic heterogeneities, such as micro-cracks or broken grain contacts, causes the stiffness of the so-called modified dry frame to be complex-valued and frequency-dependent, which in turn may affect the viscoelastic behaviour in response to fluid flow at mesoscopic scales. In this work, we propose a simple but effective procedure to estimate the seismic attenuation and velocity dispersion behaviour associated with wave-induced fluid flow due to both microscopic and mesoscopic heterogeneities and discuss the results obtained for a range of pertinent scenarios.
Abstract:
In this paper we study the role of incomplete ex ante contracts for ex post trade. Previous experimental evidence indicates that a contract provides a reference point for entitlements when the terms are negotiated in a competitive market. We show that this finding no longer holds when the terms are determined in a non-competitive way. Our results imply that the presence of a "fundamental transformation" (i.e., the transition from a competitive market to a bilateral relationship) is important for a contract to become a reference point. To the best of our knowledge this behavioral aspect of the fundamental transformation has not been shown before.
Abstract:
In breast cancer, brain metastases are often seen as late complications of recurrent disease and represent a particularly serious condition, since there are limited therapeutic options and patients have an unfavorable prognosis. The frequency of brain metastases in breast cancer is currently on the rise. This might be due to the fact that adjuvant chemotherapeutic and targeted anticancer drugs, while effectively controlling disease progression in the periphery, only poorly cross the blood-brain barrier and do not effectively reach cancer cells disseminated in the brain. It is therefore of fundamental clinical relevance to investigate mechanisms involved in breast cancer metastasis to the brain. To date, experimental models of breast cancer metastasis to the brain described in the literature are based on the direct intracarotid or intracardiac injection of breast cancer cells. We recently established a brain metastasis breast cancer model in immunocompetent mice based on the orthotopic injection of 4T1 murine breast carcinoma cells in the mammary gland of syngeneic BALB/c mice. 4T1-derived tumors recapitulate the main steps of human breast cancer progression, including epithelial-to-mesenchymal transition, local invasion and metastatic spreading to lung and lymph nodes. 4T1 cells were engineered to stably express firefly Luciferase, allowing noninvasive in vivo and ex vivo monitoring of tumor progression and metastatic spreading to target organs. Bioluminescence imaging revealed the appearance of spontaneous lesions in the lung and lymph nodes and, at a much lower frequency, in the brain. Brain metastases were confirmed by macroscopic and microscopic evaluation of the brains at necropsy. We then isolated brain metastatic cells, re-injected them orthotopically in new mice and again isolated lines from brain metastases.
After two rounds of selection we obtained lines metastasizing to the brain with 100% penetrance (named 4T1-BM2 for Brain Metastasis, 2nd generation), compared to lines derived after two rounds of in vivo growth from primary tumors (4T1-T2) or from lung metastases (4T1-LM2). We are currently performing experiments to unravel differences in cell proliferation, adhesion, migration, invasion and survival of the 4T1-BM2 line relative to the 4T1-T2 and 4T1-LM2 lines. Initial results indicate that 4T1-BM2 cells are not more invasive or more proliferative in vitro and do not show a more mesenchymal phenotype. Our syngeneic (BALB/c) model of spontaneous breast carcinoma metastasis to the brain is unique and clinically relevant for unraveling the mechanisms of metastatic breast cancer colonization of the brain. Genes identified in this model represent potentially clinically relevant therapeutic targets for the prevention and the treatment of brain metastases in breast cancer patients.
Abstract:
The development of new drug delivery systems to target the anterior segment of the eye may offer many advantages: to increase the bioavailability of the drug, to allow the penetration of drugs that cannot be formulated as solutions, to obtain constant and sustained drug release, to achieve higher local concentrations without systemic effects, to target more specifically one tissue or cell type, and to reduce the frequency of instillation, thereby improving patient compliance and comfort while reducing the side effects of frequent instillation. Several approaches are being developed, aiming to increase the corneal contact time by modified formulations or reservoir systems, or to increase tissue permeability using iontophoresis. To date, no ocular drug delivery system is ideal for all purposes. To maximize treatment efficacy, careful evaluation of the specific pathological condition, the targeted intraocular tissue and the location of the most severe pathology must be made before selecting the method of delivery most suitable for each individual patient.
Abstract:
We address the general question of the extent to which the hydrodynamic behaviour of microscopic freely fluctuating objects can be reproduced by macroscopic rigid objects. In particular, we compare the sedimentation speeds of knotted DNA molecules undergoing gel electrophoresis to the sedimentation speeds of rigid stereolithographic models of ideal knots in both water and silicone oil. We find that the sedimentation speeds grow roughly linearly with the average crossing number of the ideal knot configurations, and that the correlation is stronger within classes of knots. This is consistent with previous observations with DNA knots in gel electrophoresis.
Abstract:
The increase of publicly available sequencing data has allowed for rapid progress in our understanding of genome composition. As new information becomes available we should constantly be updating and reanalyzing existing and newly acquired data. In this report we focus on transposable elements (TEs) which make up a significant portion of nearly all sequenced genomes. Our ability to accurately identify and classify these sequences is critical to understanding their impact on host genomes. At the same time, as we demonstrate in this report, problems with existing classification schemes have led to significant misunderstandings of the evolution of both TE sequences and their host genomes. In a pioneering publication Finnegan (1989) proposed classifying all TE sequences into two classes based on transposition mechanisms and structural features: the retrotransposons (class I) and the DNA transposons (class II). We have retraced how ideas regarding TE classification and annotation in both prokaryotic and eukaryotic scientific communities have changed over time. This has led us to observe that: (1) a number of TEs have convergent structural features and/or transposition mechanisms that have led to misleading conclusions regarding their classification, (2) the evolution of TEs is similar to that of viruses by having several unrelated origins, (3) there might be at least 8 classes and 12 orders of TEs including 10 novel orders. In an effort to address these classification issues we propose: (1) the outline of a universal TE classification, (2) a set of methods and classification rules that could be used by all scientific communities involved in the study of TEs, and (3) a 5-year schedule for the establishment of an International Committee for Taxonomy of Transposable Elements (ICTTE).