114 results for DETERMINISTIC WALKER
Abstract:
Context: Fibroblast growth factor (FGF) 8 is important for GnRH neuronal development, with human mutations resulting in Kallmann syndrome. Murine data suggest a role for Fgf8 in hypothalamo-pituitary development; however, its role in the etiology of wider hypothalamo-pituitary dysfunction in humans is unknown. Objective: The objective of this study was to screen for FGF8 mutations in patients with septo-optic dysplasia (n = 374) or holoprosencephaly (HPE)/midline clefts (n = 47). Methods: FGF8 was analyzed by PCR and direct sequencing. Ethnically matched controls were then screened for mutated alleles (n = 480-686). Localization of Fgf8/FGF8 expression was analyzed by in situ hybridization in developing murine and human embryos. Finally, Fgf8 hypomorphic mice (Fgf8(loxPNeo/-)) were analyzed for the presence of forebrain and hypothalamo-pituitary defects. Results: A homozygous p.R189H mutation was identified in a female patient of consanguineous parentage with semilobar HPE, diabetes insipidus, and TSH and ACTH insufficiency. Second, a heterozygous p.Q216E mutation was identified in a female patient with an absent corpus callosum, hypoplastic optic nerves, and Moebius syndrome. FGF8 was expressed in the ventral diencephalon and anterior commissural plate but not in Rathke's pouch, strongly suggesting early-onset hypothalamic and corpus callosal defects in these patients. This was consolidated by significantly reduced numbers of vasopressin- and oxytocin-staining neurons in the hypothalamus of Fgf8 hypomorphic mice compared with controls, along with variable hypothalamo-pituitary defects and HPE. Conclusion: We implicate FGF8 in the etiology of recessive HPE and potentially septo-optic dysplasia/Moebius syndrome for the first time to our knowledge. Furthermore, FGF8 is important for the development of the ventral diencephalon, hypothalamus, and pituitary. (J Clin Endocrinol Metab 96: E1709-E1718, 2011)
Abstract:
In order to understand the development of non-genetically encoded actions during an animal's lifespan, it is necessary to analyze the dynamics and evolution of learning rules producing behavior. Owing to the intrinsic stochastic and frequency-dependent nature of learning dynamics, these rules are often studied in evolutionary biology via agent-based computer simulations. In this paper, we show that stochastic approximation theory can help to qualitatively understand learning dynamics and formulate analytical models for the evolution of learning rules. We consider a population of individuals repeatedly interacting during their lifespan, and where the stage game faced by the individuals fluctuates according to an environmental stochastic process. Individuals adjust their behavioral actions according to learning rules belonging to the class of experience-weighted attraction learning mechanisms, which includes standard reinforcement and Bayesian learning as special cases. We use stochastic approximation theory in order to derive differential equations governing action play probabilities, which turn out to have qualitative features of mutator-selection equations. We then perform agent-based simulations to find the conditions where the deterministic approximation is closest to the original stochastic learning process for standard 2-action 2-player fluctuating games, where interaction between learning rules and preference reversal may occur. Finally, we analyze a simplified model for the evolution of learning in a producer-scrounger game, which shows that the exploration rate can interact in a non-intuitive way with other features of co-evolving learning rules. Overall, our analyses illustrate the usefulness of applying stochastic approximation theory in the study of animal learning.
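The experience-weighted attraction (EWA) class referenced above maintains an attraction value per action and maps attractions to play probabilities through a logit choice rule; standard reinforcement learning and belief-based learning fall out as the delta = 0 and delta = 1 special cases. Below is a minimal Python sketch of a standard EWA update of the Camerer-Ho form; the parameter values and the toy stage-game payoffs are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def ewa_update(A, N, payoffs, chosen, phi=0.9, delta=0.5, rho=0.9):
    """One experience-weighted attraction (EWA) update.

    A       : attraction value per action
    N       : experience weight
    payoffs : payoff each action would have earned this round
    chosen  : index of the action actually played
    phi     : decay of past attractions
    delta   : weight on foregone payoffs (delta=0 -> reinforcement learning,
              delta=1 -> belief-based learning)
    rho     : decay of the experience weight
    """
    N_new = rho * N + 1.0
    weight = delta * np.ones_like(payoffs)
    weight[chosen] = 1.0                      # realized payoff gets full weight
    A_new = (phi * N * A + weight * payoffs) / N_new
    return A_new, N_new

def choice_probs(A, lam=2.0):
    """Logit choice rule: higher attractions are played more often."""
    z = np.exp(lam * (A - A.max()))           # subtract max for numerical stability
    return z / z.sum()

# Toy 2-action example: action 0 consistently pays more than action 1.
rng = np.random.default_rng(0)
A, N = np.zeros(2), 1.0
for t in range(200):
    p = choice_probs(A)
    a = rng.choice(2, p=p)
    payoffs = np.array([1.0, 0.2])            # hypothetical stage-game payoffs
    A, N = ewa_update(A, N, payoffs, a)
print(choice_probs(A))                        # probability mass concentrates on action 0
```

Stochastic approximation studies the long-run behavior of exactly this kind of noisy recursion by replacing the random updates with their expected motion, which is what yields the differential equations for action play probabilities.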
Abstract:
Spatial data analysis, mapping, and visualization are of great importance in various fields: environment, pollution, natural hazards and risks, epidemiology, spatial econometrics, etc. A basic task of spatial mapping is to make predictions based on some empirical data (measurements). A number of state-of-the-art methods can be used for the task: deterministic interpolations; methods of geostatistics, i.e. the family of kriging estimators (Deutsch and Journel, 1997); machine learning algorithms such as artificial neural networks (ANNs) of different architectures; hybrid ANN-geostatistics models (Kanevski and Maignan, 2004; Kanevski et al., 1996); etc. All the methods mentioned above can be used for solving the problem of spatial data mapping. Environmental empirical data are always contaminated/corrupted by noise, often of unknown nature. That is one of the reasons why deterministic models can be inconsistent, since they treat the measurements as values of some unknown function that should be interpolated. Kriging estimators treat the measurements as the realization of some spatial random process. To obtain an estimation with kriging, one has to model the spatial structure of the data: the spatial correlation function or (semi-)variogram. This task can be complicated if there is not a sufficient number of measurements, and the variogram is sensitive to outliers and extremes. ANNs are a powerful tool, but they also suffer from a number of drawbacks. ANNs of a special type, multilayer perceptrons, are often used as a detrending tool in hybrid (ANN + geostatistics) models (Kanevski and Maignan, 2004). Therefore, the development and adaptation of a method that is nonlinear and robust to noise in the measurements, can deal with small empirical datasets, and has a solid mathematical background is of great importance. The present paper deals with such a model, based on Statistical Learning Theory (SLT): Support Vector Regression. SLT is a general mathematical framework devoted to the problem of estimating dependencies from empirical data (Hastie et al., 2004; Vapnik, 1998). SLT models for classification, Support Vector Machines, have shown good results on different machine learning tasks. The results of SVM classification of spatial data are also promising (Kanevski et al., 2002). The properties of SVM for regression, Support Vector Regression (SVR), are less studied. First results of the application of SVR to spatial mapping of physical quantities were obtained by the authors for mapping of medium porosity (Kanevski et al., 1999) and for mapping of radioactively contaminated territories (Kanevski and Canu, 2000). The present paper is devoted to further understanding of the properties of the SVR model for spatial data analysis and mapping. A detailed description of SVR theory can be found in (Cristianini and Shawe-Taylor, 2000; Smola, 1996), and basic equations for the nonlinear modeling are given in Section 2. Section 3 discusses the application of SVR to spatial data mapping in a real case study: soil pollution by the Cs-137 radionuclide. Section 4 discusses the properties of the model applied to noisy data or data with outliers.
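To make the regression setting concrete, here is a minimal epsilon-SVR spatial-prediction sketch using scikit-learn; the library choice, the synthetic measurement locations, and the hyperparameter values are illustrative assumptions, since the paper's own case study uses real Cs-137 soil measurements.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Synthetic stand-in for scattered measurements: z = f(x, y) + noise.
rng = np.random.default_rng(42)
XY = rng.uniform(0, 10, size=(200, 2))                   # measurement locations
z = np.sin(XY[:, 0]) * np.cos(XY[:, 1]) + rng.normal(0, 0.1, 200)

# epsilon-SVR with an RBF kernel: C bounds the influence of any single point
# (robustness to outliers), epsilon sets the width of the insensitive tube
# (robustness to measurement noise).
model = make_pipeline(StandardScaler(),
                      SVR(kernel="rbf", C=10.0, epsilon=0.1, gamma="scale"))
model.fit(XY, z)

# Predict on a regular grid to produce the map.
gx, gy = np.meshgrid(np.linspace(0, 10, 50), np.linspace(0, 10, 50))
grid = np.column_stack([gx.ravel(), gy.ravel()])
z_map = model.predict(grid).reshape(gx.shape)
print(z_map.shape)                                       # (50, 50) prediction surface
```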
Abstract:
As modern molecular biology moves towards the analysis of biological systems as opposed to their individual components, the need for appropriate mathematical and computational techniques for understanding the dynamics and structure of such systems is becoming more pressing. For example, the modeling of biochemical systems using ordinary differential equations (ODEs) based on high-throughput, time-dense profiles is becoming more commonplace, necessitating the development of improved techniques to estimate model parameters from such data. Due to the high dimensionality of this estimation problem, straightforward optimization strategies rarely produce correct parameter values, and hence current methods tend to rely on genetic/evolutionary algorithms to perform nonlinear parameter fitting. Here, we describe a completely deterministic approach based on interval analysis. This allows us to examine entire sets of parameters, and thus to exhaust the global search within a finite number of steps. In particular, we show how our method may be applied to a generic class of ODEs used for modeling biochemical systems, called Generalized Mass Action models (GMAs). In addition, we show that for GMAs our method is amenable to a technique in interval arithmetic called constraint propagation, which greatly improves its efficiency. To illustrate the applicability of our method, we apply it to some networks of biochemical reactions appearing in the literature, showing in particular that, in addition to estimating system parameters in the absence of noise, our method may also be used to recover the topology of these networks.
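To illustrate the branch-and-prune idea behind interval-based estimation, here is a deliberately tiny Python sketch; the single power-law rate term, the hand-rolled nonnegative-interval arithmetic, and all numerical values are hypothetical simplifications of the paper's GMA setting (which additionally uses constraint propagation).

```python
# Minimal interval branch-and-prune, assuming noise-free data from a single
# GMA-style power-law term v = a * x**b; we search the (a, b) parameter box.
# Intervals are kept simple by assuming a >= 0, b >= 0, and x > 1.

def power_term(a_lo, a_hi, b_lo, b_hi, x):
    """Interval image of a * x**b for a scalar x > 1 and a nonnegative box."""
    return a_lo * x**b_lo, a_hi * x**b_hi

def consistent(box, data, tol=1e-6):
    """A box survives if, for every data point, the interval image of the
    model could still contain the observed value."""
    a_lo, a_hi, b_lo, b_hi = box
    for x, v in data:
        lo, hi = power_term(a_lo, a_hi, b_lo, b_hi, x)
        if v < lo - tol or v > hi + tol:
            return False                     # pruning: box cannot fit this point
    return True

def bisect(box):
    """Split the box along its widest side."""
    a_lo, a_hi, b_lo, b_hi = box
    if a_hi - a_lo >= b_hi - b_lo:
        m = 0.5 * (a_lo + a_hi)
        return (a_lo, m, b_lo, b_hi), (m, a_hi, b_lo, b_hi)
    m = 0.5 * (b_lo + b_hi)
    return (a_lo, a_hi, b_lo, m), (a_lo, a_hi, m, b_hi)

# Noise-free data generated from the "true" parameters a = 2.0, b = 1.5.
data = [(x, 2.0 * x**1.5) for x in (2.0, 3.0, 5.0)]

boxes, answers = [(0.0, 10.0, 0.0, 3.0)], []
while boxes:
    box = boxes.pop()
    if not consistent(box, data):
        continue                             # exhaustively discards infeasible regions
    if max(box[1] - box[0], box[3] - box[2]) < 1e-3:
        answers.append(box)                  # small enough: accept as an enclosure
    else:
        boxes.extend(bisect(box))

print(answers[0])                            # a tight box around (2.0, 1.5)
```

Because the interval image always encloses the true model output, the true parameters can never be pruned away, which is what makes the global search exhaustive in finitely many steps.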
Abstract:
To understand the biology and evolution of ruminants, the cattle genome was sequenced to about sevenfold coverage. The cattle genome contains a minimum of 22,000 genes, with a core set of 14,345 orthologs shared among seven mammalian species of which 1217 are absent or undetected in noneutherian (marsupial or monotreme) genomes. Cattle-specific evolutionary breakpoint regions in chromosomes have a higher density of segmental duplications, enrichment of repetitive elements, and species-specific variations in genes associated with lactation and immune responsiveness. Genes involved in metabolism are generally highly conserved, although five metabolic genes are deleted or extensively diverged from their human orthologs. The cattle genome sequence thus provides a resource for understanding mammalian evolution and accelerating livestock genetic improvement for milk and meat production.
Abstract:
Initiation of antiretroviral therapy during the earliest stages of HIV-1 infection may limit the seeding of a long-lasting viral reservoir, but long-term effects of early antiretroviral treatment initiation remain unknown. Here, we analyzed immunological and virological characteristics of nine patients who started antiretroviral therapy at primary HIV-1 infection and remained on suppressive treatment for >10 years; patients with similar treatment duration but initiation of suppressive therapy during chronic HIV-1 infection served as controls. We observed that independently of the timing of treatment initiation, HIV-1 DNA in CD4 T cells decayed primarily during the initial 3 to 4 years of treatment. However, in patients who started antiretroviral therapy in early infection, this decay occurred faster and was more pronounced, leading to substantially lower levels of cell-associated HIV-1 DNA after long-term treatment. Despite this smaller size, the viral CD4 T cell reservoir in persons with early treatment initiation consisted more dominantly of long-lasting central memory and T memory stem cells. HIV-1-specific T cell responses remained continuously detectable during antiretroviral therapy, independently of the timing of treatment initiation. Together, these data suggest that early HIV-1 treatment initiation, even when continued for >10 years, is unlikely to lead to viral eradication, but the presence of low viral reservoirs and durable HIV-1 T cell responses may make such patients good candidates for future interventional studies aiming at HIV-1 eradication and cure. IMPORTANCE: Antiretroviral therapy can effectively suppress HIV-1 replication to undetectable levels; however, HIV-1 can persist despite treatment, and viral replication rapidly rebounds when treatment is discontinued. This is mainly due to the presence of latently infected CD4 T cells, which are not susceptible to antiretroviral drugs. Starting treatment in the earliest stages of HIV-1 infection can limit the number of these latently infected cells, raising the possibility that these viral reservoirs are naturally eliminated if suppressive antiretroviral treatment is continued for extremely long periods of time. Here, we analyzed nine patients who started on antiretroviral therapy within the earliest weeks of the disease and continued treatment for more than 10 years. Our data show that early treatment accelerated the decay of infected CD4 T cells and led to very low residual levels of detectable HIV-1 after long-term therapy, levels otherwise detectable only in patients who are able to maintain a spontaneous, drug-free control of HIV-1 replication. Thus, long-term antiretroviral treatment started during early infection cannot eliminate HIV-1, but the reduced reservoirs of HIV-1 infected cells in such patients may increase their chances to respond to clinical interventions aiming at inducing a drug-free remission of HIV-1 infection.
Abstract:
Obesity is heritable and predisposes to many diseases. To understand the genetic basis of obesity better, here we conduct a genome-wide association study and Metabochip meta-analysis of body mass index (BMI), a measure commonly used to define obesity and assess adiposity, in up to 339,224 individuals. This analysis identifies 97 BMI-associated loci (P < 5 × 10(-8)), 56 of which are novel. Five loci demonstrate clear evidence of several independent association signals, and many loci have significant effects on other metabolic phenotypes. The 97 loci account for ∼2.7% of BMI variation, and genome-wide estimates suggest that common variation accounts for >20% of BMI variation. Pathway analyses provide strong support for a role of the central nervous system in obesity susceptibility and implicate new genes and pathways, including those related to synaptic function, glutamate signalling, insulin secretion/action, energy metabolism, lipid biology and adipogenesis.
Abstract:
Expression of tissue-specific homing molecules directs antigen-experienced T cells to particular peripheral tissues. In studies using soluble antigens that focused on skin and gut, antigen-presenting cells (APCs) within regional lymphoid tissues were proposed to be responsible for imprinting homing phenotypes. Whether this occurs in other sites and after physiologic antigen processing and presentation is unknown. We define in vivo imprinting of distinct homing phenotypes on monospecific T cells responding to antigens expressed by tumors in intracerebral, subcutaneous, and intraperitoneal sites with efficient brain-tropism of CD8 T cells crossprimed in the cervical lymph nodes (LNs). Multiple imprinting programs could occur simultaneously in the same LN when tumors were present in more than one site. Thus, the identity of the LN is not paramount in determining the homing phenotype; this critical functional parameter is dictated upstream at the site of antigen capture by crosspresenting APCs.
Abstract:
A variant upstream of human leukocyte antigen C (HLA-C) shows the most significant genome-wide effect on HIV control in European Americans and is also associated with the level of HLA-C expression. We characterized the differential cell surface expression levels of all common HLA-C allotypes and tested directly for effects of HLA-C expression on outcomes of HIV infection in 5243 individuals. Increasing HLA-C expression was associated with protection against multiple outcomes independently of individual HLA allelic effects in both African and European Americans, regardless of their distinct HLA-C frequencies and linkage relationships with HLA-B and HLA-A. Higher HLA-C expression was correlated with increased likelihood of cytotoxic T lymphocyte responses and frequency of viral escape mutation. In contrast, high HLA-C expression had a deleterious effect in Crohn's disease, suggesting a broader influence of HLA expression levels in human disease.
Abstract:
Given the multiplicity of nanoparticles (NPs), there is a requirement to develop screening strategies to evaluate their toxicity. Within the EU-funded FP7 NanoTEST project, a panel of medically relevant NPs has been used to develop alternative testing strategies for NPs used in medical diagnostics. As conventional toxicity tests cannot necessarily be applied directly to NPs in the same manner as for soluble chemicals and drugs, we determined the extent to which NPs interfere with each assay's processes and components. In this study, we fully characterized the panel of NP suspensions used in this project (poly(lactic-co-glycolic acid)-polyethylene oxide [PLGA-PEO], TiO2, SiO2, and uncoated and oleic-acid-coated Fe3O4) and showed that many NP characteristics (composition, size, coatings, and agglomeration) interfere with a range of in vitro cytotoxicity assays (WST-1, MTT, lactate dehydrogenase, neutral red, propidium iodide, ³H-thymidine incorporation, and cell counting), pro-inflammatory response evaluation (ELISA for GM-CSF, IL-6, and IL-8), and oxidative stress detection (monobromobimane, dichlorofluorescein, and NO assays). Interferences were assay specific as well as NP specific. We propose how to integrate and avoid interference with testing systems as a first step of a screening strategy for biomedical NPs.
Abstract:
In spite of recent advances in describing the health outcomes of exposure to nanoparticles (NPs), it still remains unclear how exactly NPs interact with their cellular targets. Size, surface, mass, geometry, and composition may all play a beneficial role as well as causing toxicity. Concerns of scientists, politicians and the public about potential health hazards associated with NPs need to be answered. With the variety of exposure routes available, there is potential for NPs to reach every organ in the body, but we know little about the impact this might have. The main objective of the FP7 NanoTEST project ( www.nanotest-fp7.eu ) was to gain a better understanding of the mechanisms by which NPs employed in nanomedicine interact with cells, tissues and organs, and to address critical issues relating to toxicity testing, especially with respect to alternatives to tests on animals. Here we describe an approach towards alternative testing strategies for hazard and risk assessment of nanomaterials, highlighting the adaptation of standard methods demanded by the special physicochemical features of nanomaterials, together with bioavailability studies. The work has assessed a broad range of toxicity tests, cell models, and NP types and concentrations, taking into account the inherent impact of NP properties and the effects of changes in experimental conditions using well-characterized NPs. The results of the studies have been used to generate recommendations for a suitable and robust testing strategy which can be applied to new medical NPs as they are developed.
Abstract:
"This paper will discuss the major developments in the area of fingerprint" "identification that followed the publication of the National Research Council (NRC, of the US National Academies of Sciences) report in 2009 entitled: Strengthening Forensic Science in the United States: A Path Forward. The report portrayed an image of a field of expertise used for decades without the necessary scientific research-based underpinning. The advances since the report and the needs in selected areas of fingerprinting will be detailed. It includes the measurement of the accuracy, reliability, repeatability and reproducibility of the conclusions offered by fingerprint experts. The paper will also pay attention to the development of statistical models allow- ing assessment of fingerprint comparisons. As a corollary of these developments, the next challenge is to reconcile a traditional practice domi- nated by deterministic conclusions with the probabilistic logic of any statistical model. There is a call for greater candour and fingerprint experts will need to communicate differently on the strengths and limitations of their findings. Their testimony will have to go beyond the blunt assertion" "of the uniqueness of fingerprints or the opinion delivered ispe dixit."