883 results for Human Model


Relevance: 30.00%

Abstract:

EXECUTIVE SUMMARY: Evaluating Information Security posture within an organization is becoming a very complex task. Currently, the evaluation and assessment of Information Security are commonly performed using frameworks, methodologies and standards which often consider the various aspects of security independently. Unfortunately, this is ineffective, because it does not take into consideration the necessity of having a global and systemic multidimensional approach to Information Security evaluation. At the same time, the overall security level is generally considered to be only as strong as its weakest link. This thesis proposes a model aiming to holistically assess all dimensions of security in order to minimize the likelihood that a given threat will exploit the weakest link. A formalized structure taking into account all security elements is presented; this is based on a methodological evaluation framework in which Information Security is evaluated from a global perspective. This dissertation is divided into three parts. Part One: Information Security Evaluation Issues consists of four chapters. Chapter 1 is an introduction to the purpose of this research and the model that will be proposed. In this chapter we raise some questions with respect to "traditional evaluation methods" and identify the principal elements to be addressed. We then introduce the baseline attributes of our model and set out the expected results of evaluations performed according to it. Chapter 2 focuses on the definition of Information Security to be used as a reference point for our evaluation model. The concepts inherent in a holistic, baseline Information Security Program are defined. On this basis, the most common roots of trust in Information Security are identified. Chapter 3 focuses on an analysis of the difference and the relationship between the concepts of Information Risk and Security Management.
Comparing these two concepts allows us to identify the most relevant elements to be included within our evaluation model, while clearly situating these two notions within a defined framework, which is of the utmost importance for the results that will be obtained from the evaluation process. Chapter 4 sets out our evaluation model and the way it addresses issues relating to the evaluation of Information Security. Within this chapter the underlying concepts of assurance and trust are discussed. Based on these two concepts, the structure of the model is developed in order to provide an assurance-related platform built on three evaluation attributes: "assurance structure", "quality issues", and "requirements achievement". Issues relating to each of these evaluation attributes are analysed with reference to sources such as methodologies, standards and published research papers. The operation of the model is then discussed: assurance levels, quality levels and maturity levels are defined in order to perform the evaluation according to the model. Part Two: Implementation of the Information Security Assurance Assessment Model (ISAAM) according to the Information Security Domains consists of four chapters. This is the section where our evaluation model is put into a well-defined context with respect to the four pre-defined Information Security dimensions: the Organizational, Functional, Human, and Legal dimensions. Each dimension is discussed in a separate chapter, and for each the following two-phase evaluation path is followed. The first phase concerns the identification of the elements which will constitute the basis of the evaluation: the key elements within the dimension; the Focus Areas for the dimension, consisting of the security issues identified for it; and the Specific Factors for the dimension, consisting of the security measures or controls addressing those security issues. The second phase concerns the evaluation of each Information Security dimension through: the implementation of the evaluation model, based on the elements identified in the first phase, by identifying the security tasks, processes, procedures, and actions that should have been performed by the organization to reach the desired level of protection; and the proposal of a maturity model for each dimension as a basis for reliance on security. For each dimension we propose a generic maturity model that could be used by any organization to define its own security requirements. Part Three of this dissertation contains the Final Remarks, Supporting Resources and Annexes. With reference to the objectives of our thesis, the Final Remarks briefly analyse whether these objectives were achieved and suggest directions for future related research. The Supporting Resources comprise the bibliographic resources used to elaborate and justify our approach, and the Annexes include the relevant topics identified in the literature to illustrate certain aspects of our approach. Our Information Security evaluation model is based on and integrates different Information Security best practices, standards, methodologies and research expertise, which can be combined in order to define a reliable categorization of Information Security. After the definition of terms and requirements, an evaluation process should be performed in order to obtain evidence that Information Security within the organization in question is adequately managed. We have specifically integrated into our model the most useful elements of these sources of information in order to provide a generic model able to be implemented in all kinds of organizations.
The value added by our evaluation model is that it is easy to implement and operate, and answers concrete needs for a reliable, efficient and dynamic evaluation tool within a coherent evaluation system. On that basis, our model could be implemented internally within organizations, allowing them to better govern their Information Security. RÉSUMÉ (French abstract, translated): General context of the thesis. The evaluation of security in general, and of information security in particular, has become for organizations not only a crucial mission but also an increasingly complex one. At present, this evaluation relies mainly on methodologies, best practices, norms and standards that address the different aspects of information security separately. We argue that this way of evaluating security is inefficient, because it does not take into account the interactions between the different dimensions and components of security, even though it has long been accepted that the overall security level of an organization is always that of the weakest link in the security chain. We have identified the need for a global, integrated, systemic and multidimensional approach to information security evaluation. Indeed, and this is the starting point of our thesis, we show that only a global consideration of security can satisfy the requirements of optimal security and the specific protection needs of an organization. Our thesis therefore proposes a new evaluation paradigm designed to meet the effectiveness and efficiency needs of a given organization. We then propose a model that aims to evaluate all dimensions of security holistically, in order to minimize the probability that a potential threat could exploit vulnerabilities and cause direct or indirect damage. This model is based on a formalized structure that takes into account all the elements of a security system or programme. We thus propose a methodological evaluation framework that considers information security from a global perspective. Structure of the thesis and topics covered. Our document is structured in three parts. The first, entitled "The problem of information security evaluation", comprises four chapters. Chapter 1 introduces the object of the research and the basic concepts of the proposed evaluation model. The traditional way of evaluating security is critically analysed in order to identify the principal, invariant elements to be taken into account in our holistic approach. The building blocks of our evaluation model and its expected operation are then presented, so as to outline the results expected from the model. Chapter 2 focuses on the definition of the notion of information security. It is not a redefinition of the notion of security, but a putting into perspective of the dimensions, criteria and indicators to be used as a reference baseline, in order to determine the object of the evaluation used throughout our work. The concepts inherent in what constitutes the holistic character of security, as well as the constituent elements of a security reference level, are defined accordingly. This makes it possible to identify what we have called the "roots of trust". Chapter 3 presents and analyses the difference and the relationships that exist between the processes of Risk Management and Security Management, in order to identify the constituent elements of the protection framework to be included in our evaluation model. Chapter 4 is devoted to the presentation of our evaluation model, the Information Security Assurance Assessment Model (ISAAM), and the way it meets the evaluation requirements set out earlier. In this chapter the underlying concepts of assurance and trust are analysed. Based on these two concepts, the structure of the evaluation model is developed to obtain a platform offering a certain level of guarantee, relying on three evaluation attributes, namely: "assurance structure", "process quality", and "achievement of requirements and objectives". The issues related to each of these evaluation attributes are analysed on the basis of the state of the art of research and the literature, the different existing methods, and the most common norms and standards in the security field. On this basis, three different evaluation levels are constructed, namely the assurance level, the quality level and the maturity level, which form the basis for evaluating the overall security state of an organization. The second part, "Application of the Information Security Assurance Assessment Model by security domain", also comprises four chapters. The evaluation model already constructed and analysed is here placed in a specific context according to the four predefined security dimensions: the Organizational dimension, the Functional dimension, the Human dimension, and the Legal dimension. Each of these dimensions and its specific evaluation is the subject of a separate chapter. For each dimension, a two-phase evaluation is constructed as follows. The first phase concerns the identification of the elements that constitute the basis of the evaluation: the key elements of the evaluation; the "Focus Areas" for each dimension, which represent the issues found in that dimension; and the "Specific Factors" for each Focus Area, which represent the security and control measures that help to resolve or reduce the impact of the risks. The second phase concerns the evaluation of each of the dimensions presented above. It consists, on the one hand, of the implementation of the general evaluation model for the dimension concerned, building on the elements specified in the first phase and identifying the specific security tasks, processes and procedures that should have been carried out to reach the desired level of protection. On the other hand, the evaluation of each dimension is completed by the proposal of a maturity model specific to that dimension, to be considered as a reference baseline for the overall security level. For each dimension we propose a generic maturity model that can be used by any organization to specify its own security requirements. This constitutes an innovation in the field of evaluation, which we justify for each dimension and whose added value we systematically highlight. The third part of our document concerns the overall validation of our proposal and contains, by way of conclusion, a critical perspective on our work and final remarks. This last part is completed by a bibliography and annexes. Our security evaluation model integrates and builds on numerous sources of expertise, such as best practices, norms, standards, methods and the scientific research expertise of the field. Our constructive proposal addresses a real, as yet unsolved problem faced by all organizations, regardless of size and profile. It would allow them to specify their particular requirements regarding the security level to be met and to instantiate an evaluation process specific to their needs, so that they can ensure that their information security is managed appropriately, thus providing a certain level of confidence in the degree of protection delivered. We have integrated into our model the best of the know-how, experience and expertise currently available internationally, with the aim of providing an evaluation model that is simple, generic, and applicable to a large number of public or private organizations. The added value of our evaluation model lies precisely in the fact that it is sufficiently generic and easy to implement while providing answers to the concrete needs of organizations. Our proposal thus constitutes a reliable, efficient and dynamic evaluation tool deriving from a coherent evaluation approach. Consequently, our evaluation system can be implemented internally by the company itself, without recourse to additional resources, thereby also giving it the possibility of better governing its information security.

Relevance: 30.00%

Abstract:

Exposure to solar ultraviolet (UV) light is the main causative factor for skin cancer. UV exposure depends on environmental and individual factors. Individual exposure data remain scarce and development of alternative assessment methods is greatly needed. We developed a model simulating human exposure to solar UV. The model predicts the dose and distribution of UV exposure received on the basis of ground irradiation and morphological data. Standard 3D computer graphics techniques were adapted to develop a rendering engine that estimates the solar exposure of a virtual manikin depicted as a triangle mesh surface. The amount of solar energy received by each triangle was calculated, taking into account reflected, direct and diffuse radiation, and shading from other body parts. Dosimetric measurements (n = 54) were conducted in field conditions using a foam manikin as a surrogate for an exposed individual. Dosimetric results were compared to the model predictions. The model predicted exposure to solar UV adequately. The symmetric mean absolute percentage error was 13%. Half of the predictions were within 17% of the measurements. This model provides a tool to assess outdoor occupational and recreational UV exposures, without necessitating time-consuming individual dosimetry, with numerous potential uses in skin cancer prevention and research.
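The reported error metric can be made concrete. As a hedged illustration (the abstract does not state the exact formula used), the symmetric mean absolute percentage error (sMAPE) is commonly defined as the absolute prediction error normalised by the mean magnitude of prediction and measurement:

```python
def smape(predicted, measured):
    """Symmetric mean absolute percentage error, in percent.

    One common definition: |p - m| / ((|p| + |m|) / 2), averaged over
    all pairs and scaled by 100. Assumes no pair is (0, 0).
    """
    total = 0.0
    for p, m in zip(predicted, measured):
        total += abs(p - m) / ((abs(p) + abs(m)) / 2)
    return 100.0 * total / len(predicted)
```

Under this definition, a prediction of 110 against a measurement of 100 gives roughly 9.5%, the same order as the 13% reported above.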

Relevance: 30.00%

Abstract:

As part of a European initiative (EuroVacc), we report the design, construction, and immunogenicity of two HIV-1 vaccine candidates based on a clade C virus strain (CN54) representing the current major epidemic in Asia and parts of Africa. Open reading frames encoding an artificial 160-kDa GagPolNef (GPN) polyprotein and the external glycoprotein gp120 were fully RNA- and codon-optimized. A DNA vaccine (DNA-GPN and DNA-gp120, referred to as DNA-C) and a replication-deficient vaccinia virus encoding both reading frames (NYVAC-C) were assessed for immunogenicity in BALB/c mice. The intramuscular administration of both plasmid DNA constructs, followed by two booster DNA immunizations, induced substantial T-cell responses against both antigens as well as Env-specific antibodies. Whereas low doses of NYVAC-C failed to induce specific CTL or antibodies, high doses generated cellular as well as humoral immune responses, but these did not reach the levels seen following DNA vaccination. The most potent immune responses were detectable using prime-boost protocols, regardless of whether DNA-C or NYVAC-C was used as the priming or boosting agent. These preclinical findings reveal the immunogenic response triggered by DNA-C and its enhancement by combining it with NYVAC-C, thus complementing the macaque preclinical and human phase I clinical studies of EuroVacc.

Relevance: 30.00%

Abstract:

Here we describe a method for measuring tonotopic maps and estimating bandwidth for voxels in human primary auditory cortex (PAC) using a modification of the population Receptive Field (pRF) model, developed for retinotopic mapping in visual cortex by Dumoulin and Wandell (2008). The pRF method reliably estimates tonotopic maps in the presence of acoustic scanner noise, and has two advantages over phase-encoding techniques. First, the stimulus design is flexible and need not be a frequency progression, thereby reducing biases due to habituation, expectation, and estimation artifacts, as well as reducing the effects of spatio-temporal BOLD nonlinearities. Second, the pRF method can provide estimates of bandwidth as a function of frequency. We find that bandwidth estimates are narrower for voxels within the PAC than in surrounding auditory responsive regions (non-PAC).
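In outline, a pRF fit of this kind assigns each voxel a tuning centre and bandwidth that best predict its measured time series from the stimulus sequence. The sketch below is a minimal, hypothetical illustration (a Gaussian tuning curve on a log2-frequency axis, fitted by grid search over correlation); it is not the authors' actual implementation, and all names and parameterizations are ours:

```python
import numpy as np

def prf_prediction(stim, centers_hz, center, bw):
    """Predicted response of one voxel to each stimulus block.

    stim: (n_blocks, n_freqs) matrix of presented frequencies (binary).
    centers_hz: frequency of each stimulus channel, in Hz.
    center, bw: hypothetical pRF centre (Hz) and bandwidth (octaves).
    """
    x = np.log2(centers_hz)
    tuning = np.exp(-0.5 * ((x - np.log2(center)) / bw) ** 2)
    return stim @ tuning

def fit_prf(stim, centers_hz, ts, grid_centers, grid_bws):
    """Grid search for the (center, bandwidth) best correlating with ts."""
    best, best_r = None, -np.inf
    for c in grid_centers:
        for b in grid_bws:
            pred = prf_prediction(stim, centers_hz, c, b)
            r = np.corrcoef(pred, ts)[0, 1]
            if r > best_r:
                best, best_r = (c, b), r
    return best
```

Note that nothing here requires the stimulus to be a frequency progression: `stim` can present blocks in any order, which is the flexibility the pRF approach exploits.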

Relevance: 30.00%

Abstract:

Genetic variants influence the risk of developing certain diseases or give rise to differences in drug response. Recent progress in cost-effective, high-throughput genome-wide techniques, such as microarrays measuring Single Nucleotide Polymorphisms (SNPs), has facilitated genotyping of large clinical and population cohorts. Combining the massive genotypic data with measurements of phenotypic traits allows for the determination of genetic differences that explain, at least in part, the phenotypic variations within a population. So far, models combining the most significant variants can only explain a small fraction of the variance, indicating the limitations of current models. In particular, researchers have only begun to address the possibility of interactions between genotypes and the environment. Elucidating the contributions of such interactions is a difficult task because of the large number of genetic as well as possible environmental factors. In this thesis, I worked on several projects within this context. My first and main project was the identification of possible SNP-environment interactions, where the phenotypes were serum lipid levels of patients from the Swiss HIV Cohort Study (SHCS) treated with antiretroviral therapy. Here the genotypes consisted of a limited set of SNPs in candidate genes relevant to lipid transport and metabolism. The environmental variables were the specific combinations of drugs given to each patient over the treatment period. My work explored bioinformatic and statistical approaches to relate patients' lipid responses to these SNPs, these drugs and, importantly, their interactions. The goal of this project was to improve our understanding and to explore the possibility of predicting dyslipidemia, a well-known adverse drug reaction of antiretroviral therapy.
Specifically, I quantified how much of the variance in lipid profiles could be explained by the host genetic variants, the administered drugs and SNP-drug interactions, and assessed the predictive power of these features on lipid responses. Using cross-validation stratified by patients, we could not validate our hypothesis that models selecting a subset of SNP-drug interactions in a principled way have better predictive power than control models using "random" subsets. Nevertheless, all models tested containing SNP and/or drug terms exhibited significant predictive power (as compared to a random predictor) and explained a sizable proportion of the variance in the patient-stratified cross-validation context. Importantly, the model containing stepwise-selected SNP terms showed a higher capacity to predict triglyceride levels than a model containing randomly selected SNPs. Dyslipidemia is a complex trait for which many factors remain to be discovered, and are thus missing from the data, possibly explaining the limitations of our analysis. In particular, the interactions of drugs with SNPs selected from the set of candidate genes likely have small effect sizes, which we were unable to detect in a sample of the present size (<800 patients). In the second part of my thesis, I performed genome-wide association studies within the Cohorte Lausannoise (CoLaus). I have been involved in several international projects to identify SNPs associated with various traits, such as serum calcium, body mass index, two-hour glucose levels, as well as the metabolic syndrome and its components. These phenotypes are all related to major human health issues, such as cardiovascular disease. I applied statistical methods to detect new variants associated with these phenotypes, contributing to the identification of new genetic loci that may lead to new insights into the genetic basis of these traits.
This kind of research will lead to a better understanding of the mechanisms underlying these pathologies, a better evaluation of disease risk, the identification of new therapeutic leads and may ultimately lead to the realization of "personalized" medicine.
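The patient-stratified cross-validation mentioned above means that all measurements from one patient land in the same fold, so a model is never tested on a patient it was trained on (repeated lipid measurements from the same person are not independent). A minimal sketch of such a split; the round-robin assignment of patients to folds is our own simplification:

```python
def patient_stratified_folds(patient_ids, n_folds=5):
    """Assign whole patients to folds so no patient spans train and test.

    patient_ids: one patient label per observation (repeats allowed).
    Returns a list of n_folds lists of observation indices.
    """
    unique = sorted(set(patient_ids))
    # Round-robin: patient i goes to fold i mod n_folds (a simplification;
    # a real split might also balance fold sizes or covariates).
    fold_of_patient = {p: i % n_folds for i, p in enumerate(unique)}
    folds = [[] for _ in range(n_folds)]
    for idx, p in enumerate(patient_ids):
        folds[fold_of_patient[p]].append(idx)
    return folds
```

Each fold then serves once as the test set while the remaining folds train the model, exactly as in ordinary k-fold cross-validation, but with the grouping guarantee above.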

Relevance: 30.00%

Abstract:

Background: Nanoparticle (NP) functionalization has been shown to affect cellular toxicity. To study this, differently functionalized silver (Ag) and gold (Au) NPs were synthesised, characterised and tested using lung epithelial cell systems. Methods: Monodispersed Ag and Au NPs with a size range of 7 to 10 nm were coated with either sodium citrate or chitosan, resulting in surface charges from −50 mV to +70 mV. NP-induced cytotoxicity and oxidative stress were determined using A549 cells, BEAS-2B cells and primary lung epithelial cells (NHBE cells). TEER measurements and immunofluorescence staining of tight junctions were performed to test the growth characteristics of the cells. Cytotoxicity was measured by means of the CellTiter-Blue® and lactate dehydrogenase assays, and cellular and cell-free reactive oxygen species (ROS) production was measured using the DCFH-DA assay. Results: Different growth characteristics were shown in the three cell types used. A549 cells grew into a confluent monolayer, BEAS-2B cells grew into a multilayer, and NHBE cells did not form a confluent layer. A549 cells were least susceptible to NPs, irrespective of NP functionalization. Cytotoxicity in BEAS-2B cells increased when exposed to highly positively charged (+65 to +75 mV) Au NPs. The greatest cytotoxicity was observed in NHBE cells, where both Ag and Au NPs with a charge above +40 mV induced cytotoxicity. ROS production was most prominent in A549 cells, where Au NPs (+65 to +75 mV) induced the highest amount of ROS. In addition, cell-free ROS measurements showed a significant increase in ROS production with an increase in chitosan coating. Conclusions: Chitosan functionalization of NPs, with the resultant high surface charges, plays an important role in NP toxicity. Au NPs, which have been shown to be inert and often non-cytotoxic, can become toxic upon coating with certain charged molecules.
Notably, these effects are dependent on the core material of the particle, the cell type used for testing and the growth characteristics of these cell culture model systems.

Relevance: 30.00%

Abstract:

The study of human motion, which relies on mathematical and computational models in general, and on multibody dynamic biomechanical models in particular, has become the subject of much recent research. A human body model can be applied to different physical exercises, and many important results, such as muscle forces, which are difficult to measure in practical experiments, can be obtained easily. In this work, a human skeletal lower-limb model consisting of three bodies is built using the flexible multibody dynamics simulation approach. The floating frame of reference formulation is used to account for the flexibility of the bones in the lower-limb model. The main reason for considering flexibility in the human bones is to measure the strains in the bone resulting from different physical exercises. It has been observed that bone under strain becomes stronger in order to cope with the exercise; bone strength, in turn, is considered an important factor in reducing bone fractures. The simulation approach and model developed in this work are used to measure the bone strains resulting from a heel-raise exercise (raising on the sole of the foot). The simulation results are compared to results available in the literature, and the comparison shows good agreement. This study sheds light on the importance of using the flexible multibody dynamics simulation approach to build human biomechanical models, which can be used in developing exercises to achieve optimal bone strength.

Relevance: 30.00%

Abstract:

Electrical Impedance Tomography (EIT) is an imaging method which enables a volume conductivity map of a subject to be produced from multiple impedance measurements. It has the potential to become a portable non-invasive imaging technique of particular use in imaging brain function. Accurate numerical forward models may be used to improve image reconstruction but, until now, have employed an assumption of isotropic tissue conductivity. This may be expected to introduce inaccuracy, as body tissues, especially those such as white matter and the skull in head imaging, are highly anisotropic. The purpose of this study was, for the first time, to develop a method for incorporating anisotropy in a forward numerical model for EIT of the head and to assess the resulting improvement in image quality in the case of linear reconstruction of one example of the human head. A realistic Finite Element Model (FEM) of an adult human head with segments for the scalp, skull, CSF, and brain was produced from a structural MRI. Anisotropy of the brain was estimated from a diffusion tensor MRI of the same subject, and anisotropy of the skull was approximated from the structural information. A method for incorporating anisotropy in the forward model, and its use in image reconstruction, was produced. The improvement in reconstructed image quality was assessed in computer simulation by producing forward data and then performing linear reconstruction using a sensitivity matrix approach. The mean boundary data difference between anisotropic and isotropic forward models for a reference conductivity was 50%. Use of the correct anisotropic FEM in image reconstruction, as opposed to an isotropic one, corrected an error of 24 mm in imaging a 10% conductivity decrease located in the hippocampus, improved localisation for conductivity changes deep in the brain and due to epilepsy by 4-17 mm, and, overall, led to a substantial improvement in image quality.
This suggests that incorporation of anisotropy in numerical models used for image reconstruction is likely to improve EIT image quality.
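The sensitivity-matrix linear reconstruction referred to above can be sketched generically. Under a linearised model, a small conductivity change d_sigma perturbs the boundary data by db ≈ S d_sigma, and a common one-step inverse is the Tikhonov-regularised pseudo-inverse. This is a sketch of that generic technique only; the regularisation parameter and its value are our assumptions, not the study's actual choices:

```python
import numpy as np

def linear_reconstruction(S, db, lam=1e-3):
    """One-step linear EIT reconstruction from boundary-data changes.

    S:   sensitivity (Jacobian) matrix, (n_measurements, n_elements),
         rows = boundary measurements, columns = mesh elements.
    db:  measured change in boundary voltages.
    lam: Tikhonov regularisation parameter (assumed value; in practice
         it is tuned to the noise level).
    Returns the estimated conductivity change per element, solving
    (S^T S + lam I) d_sigma = S^T db.
    """
    n = S.shape[1]
    return np.linalg.solve(S.T @ S + lam * np.eye(n), S.T @ db)
```

In this framing, "using the correct anisotropic FEM" means computing S from the anisotropic forward model rather than the isotropic one; the inversion step itself is unchanged.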

Relevance: 30.00%

Abstract:

Neurofilamentous changes in select groups of neurons are associated with the degenerative changes of many human age-related neurodegenerative diseases. To examine the possible effects of aging on the neuronal cytoskeleton containing human proteins, the retinas of transgenic mice expressing the gene for the human middle-sized neurofilament triplet were investigated at 3 or 12 months of age. Transgenic mice developed tangle-like neurofilamentous accumulations in a subset of retinal ganglion cells at 12 months of age. These neurofilamentous accumulations, which also involved endogenous neurofilament proteins, were present in the perikarya and proximal processes of large ganglion cells and were predominantly located in peripheral retina. The presence of the human protein may thus confer vulnerability of the cytoskeleton to age-related alterations in this specific retinal cell type and may serve as a model for similar cellular changes associated with Alzheimer's disease and glaucoma.

Relevance: 30.00%

Abstract:

The saphenous vein is the conduit of choice in bypass graft procedures. Haemodynamic factors play a major role in the development of intimal hyperplasia (IH) and subsequent bypass failure. To evaluate the potential protective effect of external reinforcement against such failure, we developed an ex vivo model for the perfusion of segments of human saphenous veins under arterial shear stress. In veins submitted to pulsatile high pressure (mean pressure of 100 mmHg) for 3 or 7 days, the use of an external macroporous polyester mesh 1) prevented the dilatation of the vessel, 2) decreased the development of IH, 3) reduced the apoptosis of smooth muscle cells and the subsequent fibrosis of the media layer, and 4) prevented the remodelling of the extracellular matrix through the up-regulation of matrix metalloproteinases (MMP-2, MMP-9) and plasminogen activator type I. The data show that, in an experimental ex vivo setting, an external scaffold decreases IH and maintains the integrity of veins exposed to arterial pressure, via an increase in shear stress and a decrease in wall tension, which likely contribute to triggering selective molecular and cellular changes.

Relevance: 30.00%

Abstract:

Potocki-Lupski syndrome (PTLS) is associated with a microduplication of 17p11.2. Clinical features include multiple congenital and neurobehavioral abnormalities and autistic features. We have generated a PTLS mouse model, Dp(11)17/+, that recapitulates some of the physical and neurobehavioral phenotypes present in patients. Here, we investigated the social behavior and gene expression pattern of this mouse model in a pure C57BL/6-Tyr(c-Brd) genetic background. Dp(11)17/+ male mice displayed normal home-cage behavior but increased anxiety and increased dominant behavior in specific tests. A subtle impairment in the preference for a social target versus an inanimate target, and an abnormal preference for social novelty (the preference to explore an unfamiliar mouse versus a familiar one), were also observed. Our results indicate that these animals could provide a valuable model to identify the specific gene(s) that confer abnormal social behaviors and that map within this delimited genomic interval. In a first attempt to identify candidate genes and to elucidate the mechanisms of regulation of these important phenotypes, we directly assessed the relative transcription of genes within and around this genomic interval. In this mouse model, we found that candidate genes include not only most of the duplicated genes but also normal-copy genes that flank the engineered interval; both categories of genes showed altered expression levels in the hippocampus of Dp(11)17/+ mice.


BACKGROUND: The central function of dendritic cells (DC) in inducing and preventing immune responses makes them ideal therapeutic targets for the induction of immunologic tolerance. In a rat in vivo model, we showed that dexamethasone-treated DC (Dex-DC) induced indirect pathway-mediated regulation and that CD4+CD25+ T cells were involved in the observed effects. The aim of the present study was to investigate the mechanisms underlying the acquired immunoregulatory properties of Dex-DC in the rat and human experimental systems. METHODS: After treatment with dexamethasone (Dex), the immunogenicity of Dex-DC was analyzed in T-cell proliferation and two-step hyporesponsiveness induction assays. After carboxyfluorescein diacetate succinimidyl ester labeling, CD4+CD25+ regulatory T-cell expansion was analyzed by flow cytometry, and cytokine secretion was measured by ELISA. RESULTS: In this study, we demonstrate in vitro that rat Dex-DC induced selective expansion of CD4+CD25+ regulatory T cells, which were responsible for alloantigen-specific hyporesponsiveness. The induction of regulatory T-cell division by rat Dex-DC was due to secretion of interleukin (IL)-2 by DC. Similarly, in human studies, monocyte-derived Dex-DC were also poorly immunogenic, were able to induce T-cell anergy in vitro, and expanded a population of T cells with regulatory functions. This was accompanied by a change in the cytokine profile of DC and T cells in favor of IL-10. CONCLUSION: These data suggest that Dex-DC induced tolerance by different mechanisms in the two systems studied. Both rat and human Dex-DC were able to induce and expand regulatory T cells, which occurred in an IL-2-dependent manner in the rat system.


In the cerebral cortex, the activity levels of neuronal populations are continuously fluctuating. When neuronal activity, as measured using functional MRI (fMRI), is temporally coherent across 2 populations, those populations are said to be functionally connected. Functional connectivity has previously been shown to correlate with structural (anatomical) connectivity patterns at an aggregate level. In the present study we investigate, with the aid of computational modeling, whether systems-level properties of functional networks, including their spatial statistics and their persistence across time, can be accounted for by properties of the underlying anatomical network. We measured resting state functional connectivity (using fMRI) and structural connectivity (using diffusion spectrum imaging tractography) in the same individuals at high resolution. Structural connectivity then provided the couplings for a model of macroscopic cortical dynamics. In both model and data, we observed (i) that strong functional connections commonly exist between regions with no direct structural connection, rendering the inference of structural connectivity from functional connectivity impractical; (ii) that indirect connections and interregional distance accounted for some of the variance in functional connectivity that was unexplained by direct structural connectivity; and (iii) that resting-state functional connectivity exhibits variability within and across both scanning sessions and model runs. These empirical and modeling results demonstrate that although resting state functional connectivity is variable and is frequently present between regions without direct structural linkage, its strength, persistence, and spatial statistics are nevertheless constrained by the large-scale anatomical structure of the human cerebral cortex.
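The "temporal coherence" underlying functional connectivity is commonly operationalized as the Pearson correlation between regional time series. A minimal illustrative sketch (not the authors' pipeline; the region count, series length, and shared-drift simulation are arbitrary assumptions):

```python
import numpy as np

def functional_connectivity(ts):
    """ts: (n_regions, n_timepoints) array of regional signals.
    Returns the (n_regions, n_regions) Pearson correlation matrix."""
    return np.corrcoef(ts)

# Simulated resting-state-like signals: a shared slow component plus
# region-specific noise, mimicking temporally coherent fluctuations.
rng = np.random.default_rng(0)
n_regions, n_timepoints = 10, 200          # hypothetical sizes
shared = rng.standard_normal(n_timepoints)
ts = 0.5 * shared + rng.standard_normal((n_regions, n_timepoints))

fc = functional_connectivity(ts)
print(fc.shape)  # (10, 10): symmetric, with ones on the diagonal
```

Two regions would then be called "functionally connected" when their entry in `fc` exceeds some threshold; real analyses add preprocessing (filtering, nuisance regression) omitted here.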


SUMMARY: Telomerase confers an unlimited lifespan and is reactivated in most tumor cells. The catalytic subunit of telomerase, hTERT, is the limiting factor for telomerase activity. From the activators and repressors that bind the hTERT 5' regulatory region to the roles of CpG methylation and histone acetylation, an abundance of regulatory models have been suggested. None of these models can explain the silencing of telomerase in most somatic cells and its reactivation in tumor cells. Moreover, the contradictory observations of the low level of hTERT mRNA in telomerase-positive cells and the high transcriptional activity of the hTERT promoter in transfection experiments remain unresolved. In this study, we demonstrated that the proximal exonic region of the hTERT gene (exons 1 and 2) is involved in the inhibition of its promoter. We identified the protein CTCF as an inhibitor of the hTERT promoter, acting through its binding to the first exon. Methylation of the first-exon region, which is often observed in cancer cells but not in normal cells, represses CTCF binding.
Analysis of hTERT promoter methylation shows that part of the promoter remains unmethylated, which appears sufficient to allow transcription of the hTERT gene. We therefore demonstrated that the particular methylation profile of the hTERT regulatory sequences inhibits the binding of CTCF while still allowing low-level transcription of the gene. Nevertheless, in some tumor cells, the promoter and the proximal exonic region of hTERT are unmethylated. In testicular and ovarian cancer cell lines, CTCF inhibition is counteracted by its paralogue BORIS, which also binds the first exon of hTERT but instead allows activation of the promoter. The study of BORIS gene regulation showed that this factor is exclusively expressed in normal testicular tissue and in the ovaries of young women, as well as, at varying levels, in almost all tumors. Its transcription is driven by two promoters; the proximal promoter is regulated by methylation, and a major alternative transcript lacking exon 6 is detected when this promoter is active. Together, these results lead to a model of hTERT regulation that takes into account the epigenetic profile of the gene and explains the low transcriptional level observed in vivo. BORIS expression in cancers and its implication in hTERT activation might also help explain the epigenetic deregulation and immortalization phenomena that occur during tumorigenesis.


It has been convincingly argued that computer simulation modeling differs from traditional science. If we understand simulation modeling as a new way of doing science, the manner in which scientists learn about the world through models must also be considered differently. This article examines how researchers learn about environmental processes through computer simulation modeling. Proposing a conceptual framework anchored in a performative philosophical approach, we examine two modeling projects undertaken by research teams in England, both aiming to inform flood risk management. One modeling team operated in the research wing of a consultancy firm; the other comprised university scientists taking part in an interdisciplinary project experimenting with public engagement. We found that in the first context the use of standardized software was critical to the process of improvisation: the obstacles that emerged concerned data and were resolved by exploiting affordances for generating, organizing, and combining scientific information in new ways. In the second context, an environmental competency group, the obstacles related to the computer program, and affordances emerged from combining experience-based knowledge with the scientists' skills, enabling a reconfiguration of the mathematical structure of the model and allowing the group to learn about local flooding.