994 results for Software Transactional Memory (STM)


Relevance:

20.00%

Publisher:

Abstract:

To study telomere length dynamics in hematopoietic cells with age, we analyzed the average length of telomere repeat sequences in diverse populations of nucleated blood cells. More than 500 individuals ranging in age from 0 to 90 yr, including 36 pairs of monozygous and dizygotic twins, were analyzed using quantitative fluorescence in situ hybridization and flow cytometry. Granulocytes and naive T cells showed a parallel biphasic decline in telomere length with age that most likely reflected accumulated cell divisions in the common precursors of both cell types: hematopoietic stem cells. Telomere loss was very rapid in the first year, and continued for more than eight decades at a 30-fold lower rate. Memory T cells also showed an initial rapid decline in telomere length with age. However, in contrast to naive T cells, this decline continued for several years, and in older individuals lymphocytes typically had shorter telomeres than did granulocytes. Our findings point to a dramatic decline in stem cell turnover in early childhood and support the notion that cell divisions in hematopoietic stem cells and T cells result in loss of telomeric DNA.
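As a rough illustration of the reported dynamics (rapid telomere loss in the first year of life, followed by loss at a roughly 30-fold lower rate over the next eight decades), the sketch below encodes a biphasic, piecewise-linear model of mean telomere length versus age. The starting length, rates and breakpoint are hypothetical placeholders, not fitted values from the study.

# Illustrative sketch (not the study's analysis pipeline): a piecewise-linear
# model of mean telomere length vs. age with a fast early phase and a slow
# late phase, as described in the abstract. All parameters are hypothetical.
import numpy as np

def biphasic_telomere_length(age_years, L0=11.0, fast_rate=1.0,
                             slow_rate=1.0 / 30.0, breakpoint=1.0):
    """Mean telomere length (arbitrary kb units) at a given age.

    fast_rate applies before `breakpoint` (years); afterwards loss continues
    at a 30-fold lower rate, mirroring the abstract's description.
    """
    age = np.asarray(age_years, dtype=float)
    early = np.minimum(age, breakpoint) * fast_rate
    late = np.maximum(age - breakpoint, 0.0) * slow_rate
    return L0 - early - late

ages = np.array([0, 1, 10, 40, 90])
print(dict(zip(ages.tolist(), biphasic_telomere_length(ages).round(2))))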

Relevance:

20.00%

Publisher:

Abstract:

The most extensively studied Heusler alloys are those based on the Ni-Mn-Ga system. However, to overcome the high cost of gallium and the usually low martensitic transformation temperature, Ga-free alloys have recently been sought, in particular by introducing In, Sn or Sb. In this work, two alloys (Mn50Ni35.5In14.5 and Ni50Mn35In15) were obtained by melt spinning, and we outline their structural and thermal behaviour. The Mn50Ni35.5In14.5 alloy undergoes the martensitic transformation above room temperature, whereas Ni50Mn35In15 shows no such transformation in the temperature range analyzed here.

Relevance:

20.00%

Publisher:

Abstract:

This manual describes how to use the Iowa Bridge Backwater software and documents the methods and equations used in its calculations. The main body describes how to use the software, and the appendices cover technical aspects. The Bridge Backwater software performs five main tasks: Design Discharge Estimation, Stream Rating Curves, Floodway Encroachment, Bridge Backwater, and Bridge Scour. The intent of the program is to provide a simplified method for the analysis of bridge backwater for rural structures located in areas with low flood damage potential. The software is written in Microsoft Visual Basic 6.0 and runs under Windows 95 or newer versions (e.g. Windows 98, NT, 2000, XP and later).
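As a toy illustration of one of the listed tasks, the sketch below evaluates a stream rating curve using the common power-law form Q = C * (h - h0)^b. This is not the procedure defined in the Iowa Bridge Backwater manual, and the coefficients C, h0 and b below are hypothetical.

# Minimal sketch of a stream rating curve: discharge as a power-law function
# of stage above a datum offset. Coefficients are placeholders, not values
# from the manual.
def rating_curve_discharge(stage_m, c=15.0, h0=0.3, b=1.8):
    """Return discharge (m^3/s) for a given stage (m) above the gauge datum."""
    effective_head = max(stage_m - h0, 0.0)
    return c * effective_head ** b

for stage in (0.5, 1.0, 2.0, 3.5):
    print(f"stage {stage:>4.1f} m -> Q = {rating_curve_discharge(stage):7.1f} m^3/s")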

Relevance:

20.00%

Publisher:

Abstract:

Working memory, commonly defined as the ability to hold mental representations online transiently and to manipulate these representations, is known to be a core deficit in schizophrenia. The aim of the present study was to investigate the visuo-spatial component of working memory in schizophrenia and, more precisely, to what extent dynamic visuo-spatial information processing is impaired in schizophrenia patients. For this purpose we used a computerized paradigm in which 29 patients with schizophrenia (DSM-IV, Diagnostic Interview for Genetic Studies) and 29 age- and sex-matched control subjects (DIGS) had to memorize the trajectory of a plane moving across the computer screen and then identify the observed trajectory among nine plots presented together. Each trajectory could be viewed a maximum of three times if needed. The results showed no difference between schizophrenia patients and controls in the number of correct trajectories identified after the first presentation. However, when the mean number of correct trajectories was determined on the basis of three trials, schizophrenia patients performed significantly worse than controls (Mann-Whitney, p = 0.002). These findings suggest that, although schizophrenia patients are able to memorize some dynamic trajectories as well as controls, they do not benefit from repeated presentation of the trajectory. These findings are consistent with the hypothesis that schizophrenia induces an imbalance between local and global information processing: the patients may be able to focus on details of the trajectory, which can allow them to find the right target (bottom-up processes), but may have difficulty referring to previous experience in order to filter incoming information (top-down processes) and enhance their visuo-spatial working memory abilities.
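The group comparison reported above can be illustrated with a short sketch of a Mann-Whitney U test on the number of correctly identified trajectories over three trials; the scores below are synthetic placeholders, not the study's data.

# Sketch of the kind of non-parametric group comparison named in the abstract:
# a Mann-Whitney U test on hypothetical per-subject scores (29 per group).
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(0)
controls = rng.integers(6, 10, size=29)   # hypothetical control scores
patients = rng.integers(4, 8, size=29)    # hypothetical patient scores

u_stat, p_value = mannwhitneyu(patients, controls, alternative="two-sided")
print(f"U = {u_stat:.1f}, p = {p_value:.4f}")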

Relevance:

20.00%

Publisher:

Abstract:

Ajankohtaista (Finnish: "current affairs")

Relevance:

20.00%

Publisher:

Abstract:

The present work assessed the effects of intracerebroventricular injections (2x5 mg/2.5 ml) of recombinant human nerve growth factor (rhNGF) at postnatal days 2 and 3 on the development of spatial learning capacities in rats. The treated rats were trained at the age of 22 days to escape onto an invisible platform at a fixed position in space in a Morris navigation task. For half of the subjects, the training position was also cued, a procedure aimed at facilitating escape and reducing attention to the distant spatial cues. At the age of 2 months all the rats were retrained in the same task. Treatment effects were found in both immature and adult rats. The injection of NGF induced a slight alteration of the immature rats' performance. In contrast, a marked impairment of spatial abilities was shown in the 2-month-old rats. The most consistent effects were a significant increase in escape latency and a decreased bias towards the training platform area during probe trials. The reduction in spatial memory was particularly marked if the subjects had been trained in the cued condition. Taken together, these experiments reveal that an acute pharmacological treatment that leads to transient modifications during early development may induce a behavioural change long after treatment. Thus, the development and maintenance of an accurate spatial representation are tightly related to the development of brain structures that can be altered by precocious NGF administration.

Relevance:

20.00%

Publisher:

Abstract:

Information technologies are nowadays used in every area of business, from management systems (ERPs) to document management and information analysis with Business Intelligence systems, and they can even become a whole new platform for providing companies with new sales channels, as is the case with the Internet. The motivation for this final degree project (TFC) arises from our client's initial need to expand through a new sales channel in order to reach new markets and diversify its customer base. Given the current characteristics of information technologies and the Internet, the two form a perfect combination for defining this TFC, which covers all the aspects needed to arrive at a final product: a real-estate web portal adapted to the requirements demanded by today's Internet users.

Relevance:

20.00%

Publisher:

Abstract:

BACKGROUND: Until recently, neurosurgeons eagerly removed cerebellar lesions without considering the cognitive impairment that might be caused by the resection. In children, transient cerebellar mutism after resection has led to a diminished use of midline approaches and vermis transection, as well as reduced retraction of the cerebellar hemispheres. The role of the cerebellum in higher cognitive functions beyond coordination and motor control has recently attracted significant interest in the scientific community and might change the neurosurgical approach to these lesions. The aim of this study was to investigate the specific effects of cerebellar lesions on memory and to assess a possible lateralisation effect. METHODS: We studied 16 patients diagnosed with a cerebellar lesion between January 1997 and April 2005 at the Centre Hospitalier Universitaire Vaudois (CHUV), Lausanne, Switzerland. Different neuropsychological tests assessing short-term and anterograde memory in verbal and visuo-spatial modalities were performed pre-operatively. RESULTS: Severe memory deficits in at least one modality were identified in the majority (81%) of patients with cerebellar lesions. Only 1 patient (6%) had no memory deficit. In our series, lateralisation of the lesion did not lead to a significant difference in verbal or visuo-spatial memory deficits. FINDINGS: These findings are consistent with reports in the literature concerning memory deficits in isolated cerebellar lesions, which can be explained by anatomical pathways. However, the cross-lateralisation theory could not be demonstrated in our series. The high percentage of patients with a cerebellar lesion who demonstrate memory deficits should lead us to assess memory in all patients with cerebellar lesions.

Relevance:

20.00%

Publisher:

Abstract:

OBJECTIVE: Patients with schizophrenia show deficits in visuospatial working memory and visual pursuit processes. It is currently unclear, however, whether both impairments are related to a common neuropathological origin. The purpose of the present study was therefore to examine the possible relations between the encoding and the discrimination of dynamic visuospatial stimuli in schizophrenia. METHOD: Sixteen outpatients with schizophrenia and 16 control subjects were asked to encode complex disc displacements presented on a screen. After a delay, participants had to identify the previously presented disc trajectory from a choice of six static linear paths, five of which were incorrect. The precision of visual pursuit eye movements during the initial presentation of the dynamic stimulus was assessed, and the fixations and scanning time in defined regions of the six paths presented during the discrimination phase were investigated. RESULTS: In comparison with controls, patients showed poorer task performance, reduced pursuit accuracy during incorrect trials, and less time scanning the correct stimulus or the incorrect paths approximating its global structure. Patients also spent less time scanning the leftmost portion of the correct path even when making a correct choice. The accuracy of visual pursuit and head movements, however, was not correlated with task performance. CONCLUSIONS: The present study provides direct support for the hypothesis that the active integration of visuospatial information within working memory is deficient in schizophrenia. In contrast, a general impairment of the oculomotor mechanisms involved in smooth pursuit did not appear to be directly related to lower visuospatial working memory performance in schizophrenia.
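One common way to quantify the pursuit accuracy mentioned above is pursuit gain, the ratio of eye velocity to target velocity; whether the study used exactly this metric is an assumption. The sketch below computes a median gain from synthetic position traces with an assumed 500 Hz sampling rate.

# Illustrative sketch (not necessarily the study's metric): smooth-pursuit
# accuracy summarized as pursuit gain, i.e. eye speed divided by target speed.
import numpy as np

def pursuit_gain(eye_pos, target_pos, dt):
    """Median ratio of eye speed to target speed over a trial."""
    eye_vel = np.gradient(eye_pos, dt)
    target_vel = np.gradient(target_pos, dt)
    valid = np.abs(target_vel) > 1e-6          # avoid division by ~zero
    return float(np.median(np.abs(eye_vel[valid]) / np.abs(target_vel[valid])))

dt = 0.002                                      # assumed 500 Hz sampling
t = np.arange(0, 2, dt)
target = 10.0 * t                               # target moving at 10 deg/s
eye = 9.0 * t + np.random.default_rng(1).normal(0, 0.001, t.size)  # slight lag
print(f"pursuit gain ~ {pursuit_gain(eye, target, dt):.2f}")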

Relevance:

20.00%

Publisher:

Abstract:

Machine Learning for geospatial data: algorithms, software tools and case studies

The thesis is devoted to the analysis, modeling and visualisation of spatial environmental data using machine learning algorithms. In a broad sense, machine learning can be considered a subfield of artificial intelligence; it mainly concerns the development of techniques and algorithms that allow computers to learn from data. In this thesis, machine learning algorithms are adapted to learn from spatial environmental data and to make spatial predictions.

Why machine learning? In short, most machine learning algorithms are universal, adaptive, nonlinear, robust and efficient modeling tools. They can find solutions to classification, regression and probability density modeling problems in high-dimensional geo-feature spaces, composed of geographical coordinates and additional relevant spatially referenced features ("geo-features"). They are well suited to being implemented as predictive engines in decision support systems for environmental data mining, including pattern recognition, modeling and prediction as well as automatic data mapping. Their efficiency is competitive with geostatistical models in low-dimensional geographical spaces, but they are indispensable in high-dimensional geo-feature spaces.

The most important and popular machine learning algorithms and models of interest for geo- and environmental sciences are presented in detail, from a theoretical description of the concepts to their software implementation. The main algorithms and models considered are the multi-layer perceptron (MLP, a workhorse of machine learning), general regression neural networks (GRNN), probabilistic neural networks (PNN), self-organising (Kohonen) maps (SOM), Gaussian mixture models (GMM), radial basis function networks (RBF) and mixture density networks (MDN). This set of models covers machine learning tasks such as classification, regression and density estimation.

Exploratory data analysis (EDA) is an initial and very important part of any data analysis. In this thesis the concepts of exploratory spatial data analysis (ESDA) are considered using both the traditional geostatistical approach, experimental variography, and machine learning. Experimental variography, which studies the relations between pairs of points, is a basic tool for the geostatistical analysis of anisotropic spatial correlations and helps to detect the presence of spatial patterns, at least as described by two-point statistics. A machine learning approach to ESDA is presented by applying the k-nearest neighbours (k-NN) method, which is simple and has very good interpretation and visualization properties.

An important part of the thesis deals with a current hot topic, the automatic mapping of geospatial data. The general regression neural network is proposed as an efficient model for this task. The performance of the GRNN model is demonstrated on Spatial Interpolation Comparison (SIC) 2004 data, where it significantly outperformed all other approaches, especially under emergency conditions.

The thesis consists of four chapters: theory, applications, software tools, and how-to-do-it examples. An important part of the work is a collection of software tools, Machine Learning Office. These tools were developed during the last 15 years and have been used both in many teaching courses, including international workshops in China, France, Italy, Ireland and Switzerland, and in fundamental and applied research projects. The case studies considered cover a wide spectrum of real-life low- and high-dimensional geo- and environmental problems, such as air, soil and water pollution by radionuclides and heavy metals, the classification of soil types and hydro-geological units, decision-oriented mapping with uncertainties, and natural hazard (landslide, avalanche) assessment and susceptibility mapping. Complementary tools useful for exploratory data analysis and visualisation were also developed, with care taken to provide a user-friendly and easy-to-use interface.
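The GRNN proposed above for automatic mapping is, at its core, a Nadaraya-Watson kernel regression: each prediction is a Gaussian-weighted average of the training values. The minimal sketch below illustrates that idea only; the synthetic coordinates, the target function and the isotropic bandwidth sigma are placeholder assumptions, not the thesis's tuned models or the SIC 2004 data.

# Minimal GRNN-style predictor: Gaussian-kernel weighted average of training
# values at arbitrary query coordinates. Data and bandwidth are illustrative.
import numpy as np

def grnn_predict(train_xy, train_z, query_xy, sigma=1.0):
    """Predict values at query coordinates from scattered training samples."""
    # Squared Euclidean distances between every query point and training point.
    d2 = ((query_xy[:, None, :] - train_xy[None, :, :]) ** 2).sum(axis=-1)
    w = np.exp(-d2 / (2.0 * sigma ** 2))        # Gaussian kernel weights
    return (w @ train_z) / w.sum(axis=1)        # weighted average per query

rng = np.random.default_rng(42)
train_xy = rng.uniform(0, 10, size=(200, 2))    # synthetic sample locations
train_z = np.sin(train_xy[:, 0]) + 0.1 * rng.normal(size=200)
query_xy = np.array([[2.0, 5.0], [7.5, 1.0]])
print(grnn_predict(train_xy, train_z, query_xy, sigma=0.8))

In practice the bandwidth sigma is the model's main free parameter and is typically selected by cross-validation.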

Relevance:

20.00%

Publisher:

Abstract:

Background: The poxvirus vector Modified Vaccinia Virus Ankara (MVA) expressing the HIV-1 Env, Gag, Pol and Nef antigens from clade B (MVA-B) is currently used as an HIV/AIDS vaccine candidate. A general strategy for improving the immunogenicity of poxvirus HIV-1 vaccine candidates is the deletion of known or suggested immunomodulatory vaccinia virus (VACV) genes. Methods: We generated and characterized the innate immune sensing and the immunogenicity profile of a new HIV-1 vaccine candidate, which contains a deletion in a VACV gene. Results: We show that this VACV protein is expressed early during virus infection and localizes to the cytoplasm of infected cells. Deletion of this VACV gene from MVA-B had no effect on virus growth kinetics; therefore this VACV protein is not essential for virus replication. The innate immune signals elicited by the MVA-B deletion mutant in human macrophages and monocyte-derived dendritic cells were characterized. In a DNA prime/MVA boost immunization protocol in mice, flow cytometry analysis revealed that the MVA-B deletion mutant enhanced the magnitude and polyfunctionality of the HIV-1-specific CD4+ and CD8+ T-cell memory immune responses, with most of the HIV-1 responses mediated by the CD8+ T-cell compartment with an effector phenotype. Significantly, while MVA-B preferentially induced Env- and Gag-specific CD8+ T-cell responses, the MVA-B deletion mutant induced more GPN-specific CD8+ T-cell responses. Furthermore, the MVA-B deletion mutant enhanced the levels of antibodies against Env in comparison with MVA-B. Conclusion: These findings reveal that this new VACV protein can be considered an immunomodulator and that deleting its gene in MVA-B confers an immunological benefit by inducing innate immune responses and increasing the magnitude and quality of the T-cell memory immune responses to HIV-1 antigens. Our observations are relevant for the improvement of MVA vectors as HIV-1 vaccines.

Relevance:

20.00%

Publisher:

Abstract:

We will investigate how collaboration networks and free software allow an educational centre to adapt to its environment, and how they can help the centre strengthen vocational training and guarantee the durability of its actions, with the aim that both the knowledge and the collaboration network itself endure for the sake of educational improvement.

Relevance:

20.00%

Publisher:

Abstract:

This work shows, using free technologies and building on open operating systems, how it is possible to maintain a high level of work for a company dedicated to implementing and carrying out developments in free-software technologies. It presents the setup of a development laboratory that allows us to understand the operation and deployment of both GNU/Linux and the software based on it within the company's infrastructure.