926 results for Modular neural systems


Relevance:

30.00%

Publisher:

Abstract:

The availability of stem cells holds great promise for studying early developmental stages and for generating adequate cells for cell-transfer therapies. Although many researchers using stem cells have succeeded in dissecting intrinsic and extrinsic mechanisms and in generating specific cell phenotypes, few of the stem cells or the differentiated cells show the capacity to repair a tissue. Advances in cell and stem cell cultivation in recent years have enabled tremendous progress in the generation of bona fide differentiated cells able to integrate into a tissue after transplantation, opening new perspectives for developmental biology studies and for regenerative medicine. In this review, we focus on the main studies attempting to create in vitro conditions that mimic the natural environment of CNS structures such as the neural tube and its development into different brain regions, including the optic cup. The use of protocols growing cells in 3D organoids is a key strategy to produce cells resembling endogenous ones. An emphasis on the generation of retinal tissue and photoreceptor cells is provided to highlight the promising developments in this field. Other examples are presented and discussed, such as the formation of cortical tissue, gut epithelium, or kidney organoids. The generation of differentiated tissues and well-defined cell phenotypes from embryonic stem (ES) cells or induced pluripotent stem cells (iPSCs) opens several new strategies in the fields of biology and regenerative medicine. In vitro development of a 3D organ or tissue derived from human cells provides a unique tool to study human cell biology and the pathophysiology of an organ or a specific cell population. The perspective of tissue repair is discussed, as well as the necessity of cell banking to accelerate progress in this promising field.


Neural signal processing is a discipline within neuroengineering. This interdisciplinary approach combines principles from machine learning, signal processing theory, and computational neuroscience applied to problems in basic and clinical neuroscience. The ultimate goal of neuroengineering is a technological revolution, where machines would interact in real time with the brain. Machines and brains could interface, enabling normal function in cases of injury or disease, brain monitoring, and/or medical rehabilitation of brain disorders. Much current research in neuroengineering is focused on understanding the coding and processing of information in the sensory and motor systems, quantifying how this processing is altered in the pathological state, and how it can be manipulated through interactions with artificial devices including brain–computer interfaces and neuroprosthetics.


We present an overview of the long-term adaptation of hippocampal neurotransmission to cholinergic and GABAergic deafferentation caused by excitotoxic lesion of the medial septum. Two months after septal microinjection of 2.7 nmol α-amino-3-hydroxy-5-methylisoxazole-4-propionate (AMPA), a 220% increase of GABA(A) receptor labelling in the hippocampal CA3 and the hilus was shown, as were changes in hippocampal neurotransmission characterised by in vivo microdialysis and HPLC. Basal amino acid and purine extracellular levels were studied in control and lesioned rats. The in vivo effects on their release of 100 mM KCl perfusion and of adenosine A1 receptor blockade with 1,3-dipropyl-8-cyclopentylxanthine (DPCPX) were also investigated. In lesioned animals, GABA, glutamate and glutamine basal levels were decreased, and taurine, adenosine and uric acid levels increased. A similar response to KCl infusion occurred in both groups except for GABA and glutamate, whose release decreased in lesioned rats. Only in lesioned rats did DPCPX increase the GABA basal level and KCl-induced glutamate release, and decrease glutamate turnover. Our results show that an excitotoxic septal lesion leads to increased hippocampal GABA(A) receptors and decreased glutamate neurotransmission. In this situation, a co-ordinated response of hippocampal retaliatory systems takes place to control neuronal excitability.


The present study aimed at evaluating the use of an Artificial Neural Network to correlate the values resulting from chemical analyses of coffee samples with the values of their sensory analyses. The coffee samples used were from Coffea arabica L., cultivars Acaiá do Cerrado, Topázio, Acaiá 474-19 and Bourbon, collected in the southern region of the state of Minas Gerais. The chemical analyses were carried out for reducing and non-reducing sugars. The quality of the beverage was evaluated by sensory analysis. The Artificial Neural Network used the values from the chemical analyses as input variables and the values from the sensory analysis as output values. The multiple linear regression of the sensory analysis values on the values from the chemical analyses yielded a determination coefficient of 0.3106, while the Artificial Neural Network achieved an 80.00% success rate in classifying the sensory analysis values.
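The setup described above (chemical measurements in, sensory quality out) can be sketched with a small feed-forward network. The synthetic two-feature data, the single hidden layer of 8 units, and the training settings below are illustrative assumptions, not the authors' architecture or data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the paper's data: two chemical features per sample
# (e.g. reducing / non-reducing sugar levels, hypothetical values) and
# a binary sensory-quality label defined by a separable toy rule.
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer, trained by plain gradient descent on the
# binary cross-entropy loss.
W1 = rng.normal(scale=0.5, size=(2, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)
for _ in range(1000):
    h = sigmoid(X @ W1 + b1)            # hidden activations
    p = sigmoid(h @ W2 + b2).ravel()    # predicted quality score
    d2 = ((p - y) / len(X))[:, None]    # output delta (sigmoid + BCE)
    d1 = (d2 @ W2.T) * h * (1 - h)      # hidden delta, backpropagated
    W2 -= h.T @ d2;  b2 -= d2.sum(0)
    W1 -= X.T @ d1;  b1 -= d1.sum(0)

pred = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2).ravel() > 0.5
accuracy = float((pred == (y > 0.5)).mean())
```

On this separable toy problem the classification accuracy ends up well above the 0.31 determination coefficient reported for the linear baseline, which mirrors the paper's qualitative finding rather than reproducing its numbers.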


Through advances in technology, System-on-Chip design is moving towards integrating tens to hundreds of intellectual property blocks into a single chip. In such a many-core system, on-chip communication becomes a performance bottleneck for high-performance designs. Network-on-Chip (NoC) has emerged as a viable solution for the communication challenges in highly complex chips. The NoC architecture paradigm, based on a modular packet-switched mechanism, can address many of the on-chip communication challenges such as wiring complexity, communication latency, and bandwidth. Furthermore, the combined benefits of 3D IC and NoC schemes make it possible to design a high-performance system in a limited chip area. The major advantages of 3D NoCs are the considerable reductions in average latency and power consumption. Several factors degrade the performance of NoCs. In this thesis, we investigate three main performance-limiting factors: network congestion, faults, and the lack of efficient multicast support, and we address these issues by means of routing algorithms. Congestion of data packets may lead to increased network latency and power consumption, so we propose three different approaches for alleviating congestion in the network. The first approach is based on measuring congestion information in different regions of the network, distributing this information over the network, and utilizing it when making routing decisions. The second approach employs a learning method to dynamically find less congested routes according to the underlying traffic. The third approach is based on a fuzzy-logic technique that makes better routing decisions when traffic information for different routes is available. Faults also degrade performance significantly, since packets must take longer paths to be routed around them, which in turn increases congestion around the faulty regions. We propose four methods to tolerate faults at the link and switch level by using only shortest paths, as long as such a path exists. The characteristic unique to these methods is that they tolerate faults while also maintaining the performance of the NoC. To the best of our knowledge, these algorithms are the first approaches that bypass faults before reaching them while avoiding unnecessary misrouting of packets. Current implementations of multicast communication result in a significant performance loss for unicast traffic, because the routing rules of multicast packets limit the adaptivity of unicast packets. We present an approach in which both unicast and multicast packets can be routed efficiently within the network. While providing more efficient multicast support, the proposed approach does not affect the performance of unicast routing at all. In addition, in order to reduce the overall path length of multicast packets, we present several partitioning methods along with analytical models for measuring their latency. This approach is discussed in the context of 3D mesh networks.
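The first, congestion-information-based approach can be illustrated with a minimal sketch (not taken from the thesis) of adaptive minimal routing in a 2D mesh: among the output directions that lie on a shortest path, the router picks the neighbour with the lowest congestion estimate. The `congestion` dictionary stands in for whatever distributed congestion information the routers exchange.

```python
def route_next_hop(cur, dst, congestion):
    """Congestion-aware minimal routing in a 2D mesh NoC (sketch).

    Among the neighbours that lie on a shortest path to dst, pick
    the one with the lowest congestion estimate.
    """
    x, y = cur
    dx, dy = dst[0] - x, dst[1] - y
    options = []
    if dx:  # one hop east or west keeps the path minimal
        options.append((x + (1 if dx > 0 else -1), y))
    if dy:  # one hop north or south keeps the path minimal
        options.append((x, y + (1 if dy > 0 else -1)))
    if not options:          # already at the destination
        return cur
    return min(options, key=lambda n: congestion.get(n, 0))

# Route a packet from (0, 0) to (2, 2): the east neighbour (1, 0) is
# congested, so the router deflects north to (0, 1) while still
# staying on a shortest path.
hop = route_next_hop((0, 0), (2, 2), {(1, 0): 5, (0, 1): 1})
```

Because only minimal directions are ever considered, the deflection never lengthens the route, which is the same property the thesis's fault-tolerant methods aim to preserve.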


Recently, Small Modular Reactors (SMRs) have attracted increased public discussion. While large nuclear power plant new build projects are facing challenges, the focus of attention is turning to small modular reactors. One particular project challenge arises in the area of nuclear licensing, which plays a significant role in new build projects, affecting their quality as well as their costs and schedules. This dissertation - positioned in the field of nuclear engineering but also with a significant section in the field of systems engineering - examines nuclear licensing processes and their suitability for the characteristics of SMRs. The study investigates the licensing processes in selected countries, as well as in other safety-critical industry fields. Reviewing the licensing processes and their separate licensing steps in terms of SMRs, the study adopts two different analysis theories for review and comparison. The primary data consist of a literature review, semi-structured interviews, and questionnaire responses concerning licensing processes and practices. The result of the study is a recommendation for a new, optimized licensing process for SMRs. The most important SMR-specific feature, in terms of licensing, is the modularity of the design. Here, modularity refers to multi-module SMR designs, which create new challenges in the licensing process. As this study focuses on Finland, the main features of the new licensing process are adapted to the current Finnish licensing process, aiming to achieve the main benefits with minimal modifications to the current process. The application of the new licensing process is developed using Systems Engineering, Requirements Management, and Project Management practices and tools. Nuclear licensing involves a large amount of data and documentation which needs to be managed in a suitable manner throughout the new build project and then during the whole life cycle of the nuclear power plant.
To enable a smooth licensing process and therefore ensure the success of the new build nuclear power plant project, management processes and practices play a significant role. This study contributes to the theoretical understanding of how licensing processes are structured and how they are put into action in practice. The findings clarify the suitability of different licensing processes and their selected licensing steps for SMR licensing. The results combine the most suitable licensing steps into a new licensing process for SMRs. The results are also extended to the concept of licensing management practices and tools.


One of the main problems related to the transport and manipulation of multiphase fluids concerns the existence of characteristic flow patterns and their strong influence on important operation parameters. A good example occurs in gas-liquid chemical reactors, in which maximum efficiency can be achieved by maintaining a finely dispersed bubbly flow that maximizes the total interfacial area. Thus, the ability to automatically detect flow patterns is of crucial importance, especially for the adequate operation of multiphase systems. This work describes the application of a neural model that processes the signals delivered by a direct imaging probe to produce a diagnostic of the corresponding flow pattern. The neural model consists of six independent neural modules, each trained to detect one of the main horizontal flow patterns, and a final winner-take-all layer that resolves cases in which two or more patterns are detected simultaneously. Experimental signals representing different bubbly, intermittent, annular and stratified flow patterns were used to validate the neural model.
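The final winner-take-all stage can be sketched as below. The six pattern names and the activation values are hypothetical placeholders; the paper specifies only that six modules cover the main horizontal flow patterns.

```python
def winner_take_all(activations):
    """Resolve the final diagnosis when several pattern-specific
    modules respond: keep the pattern with the strongest activation."""
    return max(activations, key=activations.get)

# Hypothetical module outputs for one probe signal; both "bubbly"
# and "plug" fire, and the winner-take-all layer settles the conflict.
activations = {"stratified smooth": 0.02, "stratified wavy": 0.05,
               "plug": 0.61, "slug": 0.12, "annular": 0.07,
               "bubbly": 0.88}
diagnosis = winner_take_all(activations)
```

Keeping the detectors independent and deferring the conflict to one final layer is what lets each module be trained against a single pattern, as the abstract describes.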


Augmented reality (AR) technology has applications in fields as diverse as aeronautics, tourism, medicine, and education. This review summarizes the current status of AR and proposes a new application of it in weed science. The basic algorithmic elements for AR implementation are already available for developing applications in the area of weed economic thresholds. These include image-recognition algorithms to identify and quantify weeds by species and software for herbicide selection based on weed density. Likewise, all the hardware necessary for AR implementation in weed science is available at a price affordable for the user. Thus, the authors propose that weed science can take a leading role in integrating AR systems into weed economic threshold software, providing better opportunities for science- and computer-based weed control decisions.
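The economic-threshold decision such software would embed reduces to a simple per-species rule. The function, species names, densities and thresholds below are a hypothetical sketch (densities in plants per square metre), not values from the review.

```python
def spray_recommended(weed_density, economic_threshold):
    """Recommend herbicide treatment only when the observed weed
    density exceeds the species' economic threshold."""
    return weed_density > economic_threshold

# Hypothetical per-species counts from image recognition, paired
# with hypothetical economic thresholds (plants per m^2).
decisions = {
    species: spray_recommended(density, threshold)
    for species, (density, threshold) in {
        "Amaranthus": (14.0, 5.0),   # above threshold -> treat
        "Digitaria": (2.0, 8.0),     # below threshold -> skip
    }.items()
}
```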


In the present study, we modeled a reaching task as a two-link mechanism. The upper arm and forearm motion trajectories during vertical arm movements were estimated from the angular accelerations measured with dual-axis accelerometers. A data set of reaching synergies from able-bodied individuals was used to train a radial basis function artificial neural network with upper arm/forearm tangential angular accelerations. The trained radial basis function artificial neural network predicted forearm motion for the specific movements from new upper arm trajectories with high correlation (mean, 0.9149-0.941). For all other movements, prediction was low (range, 0.0316-0.8302). The results suggest that the proposed algorithm generalizes successfully over similar motions and subjects. Such networks may be used as a high-level controller that predicts forearm kinematics from voluntary movements of the upper arm. This methodology is suitable for restoring the upper limb functions of individuals with motor disabilities of the forearm, but not of the upper arm. The developed control paradigm is applicable to upper-limb orthotic systems employing functional electrical stimulation. The proposed approach is of particular significance for humans with spinal cord injuries in a free-living environment. The measurement system with dual-axis accelerometers developed for this study may further be applied to the evaluation of movement during the course of rehabilitation. For this purpose, training-related changes in synergies apparent from movement kinematics during rehabilitation would characterize the extent and the course of recovery. As such, a simple system using this methodology is of particular importance for stroke patients. The results underscore the important issue of upper-limb coordination.
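A radial basis function network of the general kind used here can be sketched as follows. The exact-interpolation fit (one Gaussian centre per training sample) and the toy one-dimensional mapping from upper-arm to forearm acceleration are illustrative assumptions; the study's actual inputs, centres and training procedure are not specified in the abstract.

```python
import numpy as np

def rbf_fit(X, y, gamma=1.0):
    """Exact-interpolation RBF fit: one Gaussian centre per training
    sample, weights found by solving Phi @ w = y."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.linalg.solve(np.exp(-gamma * d2), y)

def rbf_predict(centers, w, x, gamma=1.0):
    """Network output: weighted sum of Gaussian kernels centred on
    the training samples."""
    phi = np.exp(-gamma * ((centers - x) ** 2).sum(axis=1))
    return float(phi @ w)

# Toy mapping: upper-arm tangential angular acceleration (input) to
# forearm angular acceleration (target); values are hypothetical.
X = np.array([[0.0], [0.5], [1.0], [1.5]])
y = np.array([0.0, 0.4, 0.9, 1.1])
w = rbf_fit(X, y)
```

With one centre per sample the network reproduces the training targets exactly; generalization to new upper-arm trajectories, as in the study, would normally use fewer centres plus regularization.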


The arterial partial pressure of carbon dioxide (PCO2) is virtually constant because of the close match between the metabolic production of this gas and its excretion via breathing. Blood gas homeostasis does not rely solely on changes in lung ventilation, but also to a considerable extent on circulatory adjustments that regulate the transport of CO2 from its sites of production to the lungs. The neural mechanisms that coordinate circulatory and ventilatory changes to achieve blood gas homeostasis are the subject of this review. Emphasis will be placed on the control of sympathetic outflow by central chemoreceptors. High levels of CO2 exert an excitatory effect on sympathetic outflow that is mediated by specialized chemoreceptors such as the neurons located in the retrotrapezoid region. In addition, high CO2 causes an aversive awareness in conscious animals, activating wake-promoting pathways such as the noradrenergic neurons. These neuronal groups, which may also be directly activated by brain acidification, have projections that contribute to the CO2-induced rise in breathing and sympathetic outflow. However, since the level of activity of the retrotrapezoid nucleus is regulated by converging inputs from wake-promoting systems, behavior-specific inputs from higher centers and by chemical drive, the main focus of the present manuscript is to review the contribution of central chemoreceptors to the control of autonomic and respiratory mechanisms.


The mammalian stress response is an integrated physiological and psychological reaction to real or perceived adversity. Glucocorticoids are an important component of this response, acting to redistribute energy resources both to optimize survival in the face of challenge and to restore homeostasis after the immediate challenge has subsided. Release of glucocorticoids is mediated by the hypothalamo-pituitary-adrenal (HPA) axis, driven by a neural signal originating in the paraventricular nucleus (PVN). Stress levels of glucocorticoids bind to glucocorticoid receptors in multiple body compartments, including the brain, and consequently have wide-reaching actions. For this reason, glucocorticoids serve a vital function in negative feedback inhibition of their own secretion. Negative feedback inhibition is mediated by a diverse collection of mechanisms, including fast, non-genomic feedback at the level of the PVN, stress shut-off at the level of the limbic system, and attenuation of ascending excitatory input through destabilization of mRNAs encoding neuropeptide drivers of the HPA axis. In addition, there is evidence that glucocorticoids participate in stress activation via feed-forward mechanisms at the level of the amygdala. Feedback deficits are associated with numerous disease states, underscoring the necessity for adequate control of glucocorticoid homeostasis. Thus, rather than having a single, defined feedback ‘switch’, control of the stress response requires a wide-reaching feedback ‘network’ that coordinates HPA activity to suit the overall needs of multiple body systems.


The main focus of this thesis is to evaluate and compare the Hyperbalilearning algorithm (HBL) to other learning algorithms. In this work, HBL is compared to feed-forward artificial neural networks using back-propagation learning, K-nearest neighbor, and ID3 algorithms. In order to evaluate the similarity of these algorithms, we carried out three experiments using nine benchmark data sets from the UCI machine learning repository. The first experiment compares HBL to the other algorithms when the sample size of the dataset changes. The second experiment compares HBL to the other algorithms when the dimensionality of the data changes. The last experiment compares HBL to the other algorithms according to the level of agreement with the data target values. Our observations generally showed that, taking classification accuracy as the measure, HBL performs as well as most ANN variants. Additionally, we deduced that HBL's classification accuracy outperforms that of ID3 and K-nearest neighbor for the selected data sets.
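The first experiment's design (comparing classifiers as the training-set size changes) can be sketched with off-the-shelf learners. scikit-learn's k-NN and decision tree stand in for the thesis's K-nearest-neighbor and ID3 implementations, and a synthetic data set replaces the UCI sets; the sample sizes and hyperparameters are arbitrary choices for illustration.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

# Synthetic binary classification problem standing in for a UCI set.
X, y = make_classification(n_samples=600, n_features=10,
                           n_informative=5, random_state=0)

results = {}
for n_train in (50, 150, 400):   # varying sample size, as in experiment 1
    Xtr, Xte, ytr, yte = train_test_split(
        X, y, train_size=n_train, random_state=0)
    results[n_train] = {
        "knn": KNeighborsClassifier(n_neighbors=3)
               .fit(Xtr, ytr).score(Xte, yte),
        "tree": DecisionTreeClassifier(random_state=0)
                .fit(Xtr, ytr).score(Xte, yte),
    }
```

Plotting `results` against `n_train` reproduces the shape of the experiment: each learner's held-out accuracy as a function of how much training data it sees.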


Understanding how stem and progenitor cells choose between alternative cell fates is a major challenge in developmental biology. Efforts to tackle this problem have been hampered by the scarcity of markers that can be used to predict cell division outcomes. Here we present a computational method, based on algorithmic information theory, to analyze dynamic features of living cells over time. Using this method, we asked whether rat retinal progenitor cells (RPCs) display characteristic phenotypes before undergoing mitosis that could foretell their fate. We predicted whether RPCs will undergo a self-renewing or terminal division with 99% accuracy, or whether they will produce two photoreceptors or another combination of offspring with 87% accuracy. Our implementation can segment, track and generate predictions for 40 cells simultaneously on a standard computer at 5 min per frame. This method could be used to isolate cell populations with specific developmental potential, enabling previously impossible investigations.


A principal objective of software engineering is to produce complex, large-scale, reliable software within a reasonable time. Object-oriented (OO) technology has provided sound concepts and modelling and programming techniques that have made it possible to develop complex applications in both academia and industry. This experience has, however, revealed weaknesses of the object paradigm (for example, code scattering and the traceability problem). Aspect-oriented (AO) programming offers a simple solution to the limitations of OO programming, such as the problem of crosscutting concerns. These crosscutting concerns manifest themselves as the scattering of the same code across several modules of the system, or the tangling of several pieces of code within a single module. This new way of programming makes it possible to implement each concern independently of the others and then to assemble them according to well-defined rules. AO programming therefore promises better productivity, better code reuse, and better adaptability of code to change. This new approach quickly spread across the whole software development process, with the goal of preserving modularity and traceability, two important properties of high-quality software. However, AO technology presents many challenges. Reasoning about, specifying, and verifying AO programs is difficult, all the more so as these programs evolve over time. Modular reasoning about such programs is therefore required; otherwise they would have to be re-examined in their entirety every time a component is changed or added.
It is, however, well known in the literature that modular reasoning about AO programs is difficult, since the applied aspects often change the behaviour of their base components [47]. The same difficulties arise in the specification and verification phases of the software development process. To the best of our knowledge, modular specification and modular verification are only weakly covered and constitute a very interesting field of research. Likewise, interactions between aspects are a serious problem in the aspect community. To address these problems, we chose to use category theory and algebraic specification techniques. To provide a solution to the problems cited above, we built on the work of Wiels [110] and on other contributions such as those described in the book [25]. We assume that the system under development is already decomposed into aspects and classes. The first contribution of this thesis is the extension of algebraic specification techniques to the notion of aspect. Second, we defined a logic, LA, which is used in the body of specifications to describe the behaviour of these components. The third contribution is the definition of the weaving operator, which corresponds to the interconnection relation between aspect modules and class modules. The fourth contribution concerns the development of a prevention mechanism that makes it possible to prevent undesirable interactions in aspect-oriented systems.
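The role of a weaving operator (interconnecting aspect modules with class modules) can be illustrated outside the categorical setting with a toy Python "weaver" that applies a crosscutting tracing advice to every public method of a class. Everything here is a didactic sketch of the general AO idea, not the thesis's formalism.

```python
import functools

def weave(advice):
    """Toy weaving operator: wrap every public method of a class
    with a crosscutting 'advice' function."""
    def decorate(cls):
        for name, attr in list(vars(cls).items()):
            if callable(attr) and not name.startswith("_"):
                setattr(cls, name, advice(attr))
        return cls
    return decorate

calls = []   # where the crosscutting concern records its trace

def tracing(fn):
    """The aspect: a logging concern that would otherwise be
    scattered across every method body."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        calls.append(fn.__name__)    # the concern, kept in one place
        return fn(*args, **kwargs)
    return wrapper

@weave(tracing)
class Account:
    def deposit(self, amount):
        return amount

balance = Account().deposit(5)
```

The base class never mentions tracing, yet every call is traced; this separation of the concern from the base code is exactly what makes modular reasoning about the woven result the hard problem the thesis addresses.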


Memory is not a unitary process and is often divided into two major categories: declarative memory (for facts) and procedural memory (for habits and motor skills). To endure, a memory trace must undergo consolidation, a process by which it becomes more robust and less susceptible to interference. Sleep is known to play a key role in enabling the consolidation process, particularly for declarative memory. For several years now, however, its role has also been recognized for procedural memory. It is nonetheless interesting to note that not all types of procedural memory require sleep in order to be consolidated. In particular, sleep appears necessary to consolidate motor sequence learning (akin to learning the piano), but not visuomotor adaptation learning (such as learning to ride a bicycle). In parallel, long-term learning of these two types of skills also appears to be subserved by distinct neural circuits: a cortico-striatal and a cortico-cerebellar network, respectively. However, the involvement of these networks in the consolidation process itself remains uncertain. The goal of this thesis is therefore to better understand the role of sleep, while controlling for the simple passage of time, in the consolidation of these two types of learning, using functional magnetic resonance imaging and brain connectivity analyses. Our behavioural results support the idea that only sequence learning requires sleep to trigger the consolidation process. We further suggest that the putamen is strongly associated with this process. In contrast, performance on a visuomotor learning task improves independently of sleep and is, moreover, correlated with greater activation of the cerebellum.
Finally, in exploring the effect of sleep on brain connectivity, our results show that a cortico-striatal system in fact appears more integrated following consolidation: the interaction among the regions of the system is stronger once consolidation has taken place, after a night of sleep. In contrast, the simple passage of time appears to hinder the integration of this cortico-striatal network. In sum, we have broadened knowledge of the role of sleep in procedural memory, notably by showing that not all types of learning require sleep to initiate the consolidation process. We have also shown that this dissociation of the effect of sleep is reflected in the involvement of two distinct brain networks: a cortico-striatal and a cortico-cerebellar network for the consolidation of sequence learning and of visuomotor adaptation, respectively. Finally, we suggest that consolidation during sleep protects and promotes greater cohesion within the cortico-striatal network associated with our task; a phenomenon which, if found with other types of learning, could be considered a new marker of consolidation.