132 results for pacs: neural computing technologies
Abstract:
Early detection of neural-tube defects is possible by determining alpha-fetoprotein (AFP) in maternal serum. 16,685 pregnant women were observed. Three methods for determining the "normal" range are compared. The first, already used in similar studies, makes use of a constant multiple of the median. The other two make use of robust estimates of location and scale. Their comparison shows the value of the robust methods in reducing interlaboratory variability.
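As a toy illustration of the two families of cut-offs compared in the study (the constants `multiple` and `k` below are invented for the example, not the study's actual values), a median-multiple cut-off and a robust location/scale cut-off can be sketched as:

```python
import statistics

def median_multiple_cutoff(values, multiple=2.5):
    """Upper cut-off as a constant multiple of the median,
    as in earlier AFP screening studies."""
    return statistics.median(values) * multiple

def robust_cutoff(values, k=3.0):
    """Upper cut-off from robust estimates of location (median)
    and scale (MAD, rescaled by 1.4826 to be consistent with the
    standard deviation under normality)."""
    med = statistics.median(values)
    mad = statistics.median(abs(v - med) for v in values)
    return med + k * 1.4826 * mad
```

Because the MAD ignores extreme observations, the robust cut-off is far less sensitive to outliers and laboratory-specific tails than the multiple-of-the-median rule, which is the point of the comparison.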
Abstract:
Newborn neurons are generated in the adult hippocampus from a pool of self-renewing stem cells located in the subgranular zone (SGZ) of the dentate gyrus. Their activation, proliferation, and maturation depend on a host of environmental and cellular factors but, until recently, the contribution of local neuronal circuitry to this process was relatively unknown. In their recent publication, Song and colleagues have uncovered a novel circuit-based mechanism by which release of the neurotransmitter, γ-aminobutyric acid (GABA), from parvalbumin-expressing (PV) interneurons, can hold radial glia-like (RGL) stem cells of the adult SGZ in a quiescent state. This tonic GABAergic signal, dependent upon the activation of γ(2) subunit-containing GABA(A) receptors of RGL stem cells, can thus prevent their proliferation and subsequent maturation or return them to quiescence if previously activated. PV interneurons are thus capable of suppressing neurogenesis during periods of high network activity and facilitating neurogenesis when network activity is low.
Abstract:
Background: The coagulation factor thrombin mediates ischemic neuronal death and, at a low concentration, induces tolerance to ischemia. We investigated its mode of activation in ischemic neural tissue using an in vitro approach to distinguish the role of circulating coagulation factors from endogenous cerebral mechanisms. We also studied the signalling pathway downstream of thrombin in ischemia and after thrombin preconditioning. Methods: Rat organotypic hippocampal slice cultures were subjected to 30-minute oxygen (5%) and glucose (1 mmol/L) deprivation (OGD). Results: Selective factor Xa (FXa) inhibition by fondaparinux during and after OGD significantly reduced neuronal death in the CA1 after 48 hours. Thrombin activity was increased in the medium 24 hours after OGD, and this increase was prevented by fondaparinux, suggesting that FXa catalyzes the conversion of prothrombin to thrombin in neural tissue after ischemia in vitro. Treatment with SCH79797, a selective antagonist of the thrombin receptor protease-activated receptor-1 (PAR-1), significantly decreased neuronal cell death, indicating that thrombin signals ischemic damage via PAR-1. The JNK pathway plays an important role in cerebral ischemia, and we observed activation of the JNK substrate c-Jun in our model. Both the FXa inhibitor fondaparinux and the PAR-1 antagonist SCH79797 decreased the level of phospho-c-Jun Ser73. After thrombin preconditioning, c-Jun was activated by phosphorylation in the nuclei of neurons of the CA1. Treatment with a synthetic thrombin receptor agonist resulted in the same c-Jun activation profile and protection against subsequent OGD, indicating that thrombin also signals via PAR-1 and c-Jun in cell protection. Conclusion: These results indicate that FXa activates thrombin in cerebral ischemia, leading via PAR-1 to activation of the JNK pathway and resulting in neuronal death. Thrombin-induced tolerance also involves PAR-1 and JNK, revealing common features in cell death and survival signalling.
Abstract:
Embryonic stem cells (ESCs) offer attractive prospects as a potential source of neurons for cell replacement therapy in human neurodegenerative diseases. In addition, ESC neural differentiation enables in vitro tissue engineering for fundamental research and drug discovery aimed at the nervous system. We have established stable, long-term three-dimensional (3D) culture conditions which can be used to model long-latency and complex neurodegenerative diseases. Mouse ESC-derived neural progenitor cells, generated by MS5 stromal cell induction, result in strictly neural 3D cultures about 120 μm thick, whose cells expressed mature neuronal, astrocyte and myelin markers. Neurons were of the glutamatergic and GABAergic lineages. This nervous tissue was spatially organized in specific layers resembling brain sub-ependymal (SE) nervous tissue, and was maintained in vitro for at least 3.5 months with great stability. Electron microscopy showed the presence of mature synapses and myelinated axons, suggesting functional maturation. Electrophysiological recordings revealed biological signals involving action potential propagation along neuronal fibres and synaptic-like release of neurotransmitters. The rapid development and stabilization of this 3D culture model result in an abundant and long-lasting production that is compatible with multiple and productive investigations for neurodegenerative disease modeling, drug and toxicology screening, and stress and aging research.
Abstract:
Rhythmic activity plays a central role in neural computations and brain functions ranging from homeostasis to attention, as well as in neurological and neuropsychiatric disorders. Despite this pervasiveness, little is known about the mechanisms whereby the frequency and power of oscillatory activity are modulated, and how they reflect the inputs received by neurons. Numerous studies have reported input-dependent fluctuations in peak frequency and power (as well as couplings across these features). However, it remains unresolved what mediates these spectral shifts among neural populations. Extending previous findings regarding stochastic nonlinear systems and experimental observations, we provide analytical insights regarding oscillatory responses of neural populations to stimulation from either endogenous or exogenous origins. Using a deceptively simple yet sparse and randomly connected network of neurons, we show how spiking inputs can reliably modulate the peak frequency and power expressed by synchronous neural populations without any changes in circuitry. Our results reveal that a generic, nonlinear, input-induced mechanism can robustly mediate these spectral fluctuations, and thus provide a framework in which inputs to the neurons bidirectionally regulate both the frequency and power expressed by synchronous populations. Theoretical and computational analysis showed that the ensuing spectral fluctuations reflect the underlying dynamics of the input stimuli driving the neurons. Our results provide insights regarding a generic mechanism supporting spectral transitions observed across cortical networks and spanning multiple frequency bands.
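The idea of an input shifting a population's spectral peak can be illustrated with a deliberately simple toy model: a noisy sinusoid whose frequency depends on the mean input rate, not the paper's spiking network. The names `base_freq` and `gain`, and the linear frequency-rate relation, are invented for the sketch:

```python
import numpy as np

def peak_frequency(input_rate, base_freq=10.0, gain=0.5,
                   duration=10.0, fs=1000.0, seed=0):
    """Simulate a noisy oscillation whose frequency is shifted by
    the mean input rate (a toy stand-in for input-modulated
    network rhythms), then recover the spectral peak via FFT."""
    rng = np.random.default_rng(seed)
    t = np.arange(0, duration, 1.0 / fs)
    f = base_freq + gain * input_rate           # input shifts the rhythm
    x = np.sin(2 * np.pi * f * t) + 0.5 * rng.standard_normal(t.size)
    spec = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(t.size, 1.0 / fs)
    return freqs[np.argmax(spec[1:]) + 1]       # skip the DC bin
```

Varying `input_rate` moves the recovered peak without any change to the "circuit" generating the signal, which is the qualitative phenomenon the abstract describes.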
Abstract:
This study looks at how increased memory utilisation affects throughput and energy consumption in scientific computing, especially in high-energy physics. Our aim is to minimise the energy consumed by a set of jobs without increasing the processing time. Earlier tests indicated that, especially in data analysis, throughput can increase by over 100% and energy consumption decrease by 50% by processing multiple jobs in parallel per CPU core. Since jobs are heterogeneous, it is not possible to find an optimum value for the number of parallel jobs. A better solution is based on memory utilisation, but finding an optimum memory threshold is not straightforward. Therefore, a fuzzy logic-based algorithm was developed that can dynamically adapt the memory threshold based on the overall load. In this way, it is possible to keep memory consumption stable under different workloads while achieving significantly higher throughput and energy efficiency than with traditional fixed-number-of-jobs or fixed-memory-threshold approaches.
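A minimal sketch of such a load-adaptive memory threshold is shown below. The membership bounds and threshold range are hypothetical, and the paper's actual fuzzy rule base is not reproduced; this only illustrates the shape of the idea (relax the threshold at low load, tighten it at high load):

```python
def fuzzy_threshold(load, low=0.6, high=0.9,
                    min_thresh=0.5, max_thresh=0.85):
    """Fuzzy-style adaptation of the memory threshold used to admit
    new parallel jobs: below `low` overall load the threshold is
    fully relaxed, above `high` it is fully tightened, and in
    between it is interpolated by the degree of membership in the
    'high load' fuzzy set."""
    if load <= low:
        membership = 0.0
    elif load >= high:
        membership = 1.0
    else:
        membership = (load - low) / (high - low)
    return max_thresh - membership * (max_thresh - min_thresh)

def admit_job(memory_used, total_memory, load):
    """Admit another job only while memory utilisation stays under
    the dynamically adapted threshold."""
    return memory_used / total_memory < fuzzy_threshold(load)
```

Because the threshold moves with load rather than with a fixed job count, heterogeneous jobs can be packed more densely when memory pressure is low, which is where the throughput and energy gains come from.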
Abstract:
Motivation: Genome-wide association studies have become widely used tools to study effects of genetic variants on complex diseases. While it is of great interest to extend existing analysis methods by considering interaction effects between pairs of loci, the large number of possible tests presents a significant computational challenge. The number of computations is further multiplied in the study of gene expression quantitative trait mapping, in which tests are performed for thousands of gene phenotypes simultaneously. Results: We present FastEpistasis, an efficient parallel solution extending the PLINK epistasis module, designed to test for epistasis effects when analyzing continuous phenotypes. Our results show that the algorithm scales with the number of processors and offers a reduction in computation time when several phenotypes are analyzed simultaneously. FastEpistasis is capable of testing the association of a continuous trait with all single nucleotide polymorphism (SNP) pairs from 500,000 SNPs, totaling 125 billion tests, in a population of 5000 individuals in 29, 4 or 0.5 days using 8, 64 or 512 processors.
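The core computation behind such an epistasis scan can be sketched as a serial brute-force loop over SNP pairs; FastEpistasis itself distributes exactly this pair loop across processors and works with PLINK's file formats, none of which is reproduced in this sketch:

```python
import itertools
import numpy as np

def interaction_effects(genotypes, phenotype):
    """For every SNP pair (i, j), fit the linear model
    y ~ 1 + g_i + g_j + g_i*g_j by least squares and record the
    coefficient of the interaction term. `genotypes` is an
    (individuals x SNPs) matrix of 0/1/2 allele counts."""
    n_snps = genotypes.shape[1]
    results = {}
    for i, j in itertools.combinations(range(n_snps), 2):
        g1, g2 = genotypes[:, i], genotypes[:, j]
        X = np.column_stack([np.ones_like(g1), g1, g2, g1 * g2])
        beta, *_ = np.linalg.lstsq(X, phenotype, rcond=None)
        results[(i, j)] = beta[3]  # interaction coefficient
    return results
```

The pair loop is embarrassingly parallel (each pair's fit is independent), which is why the reported runtimes scale almost linearly from 8 to 512 processors.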
Abstract:
Background: Citrobacter rodentium is a natural mouse pathogen that is genetically closely related to the human enteric pathogens enteropathogenic and enterohemorrhagic E. coli. Among the repertoire of conserved virulence factors that these pathogens deliver via type III secretion, Tir and EspF are responsible for the formation of characteristic actin-rich pedestals and disruption of tight junction integrity, respectively. There is evidence that in vitro these effectors accomplish this, at least in part, by subverting the normal host cellular functions of N-WASP, a critical regulator of branched-chain actin assembly. Although N-WASP has been shown to be involved in pedestal formation in vitro, the requirement of N-WASP-mediated actin pedestals for intestinal colonization by attaching/effacing (A/E) pathogens in vivo is not known. Furthermore, it is not known whether N-WASP is required for EspF-mediated tight junction disruption. Methods: To investigate the role of N-WASP in the gut epithelium, we generated mice with intestine-specific deletion of N-WASP (iNWKO) by mating mice homozygous for a floxed N-WASP allele (N-WASP L2L/L2L) to mice expressing Cre recombinase under the villin promoter. Separately housed groups of WT and iNWKO mice were inoculated with 5×10^8 GFP-expressing C. rodentium by intragastric lavage. Stool was collected 2, 4, 7, and 12 days after infection, and recoverable colony-forming units (CFUs) of C. rodentium were quantified by plating serial dilutions of homogenized stool on MacConkey's agar. GFP+ colonies were counted after 24 hours of incubation at 37°C. The presence of actin pedestals was investigated by electron microscopy (EM), and tight junction morphology was assessed by immunofluorescence staining of occludin, ZO-1 and claudin-2. Results: C. rodentium infection did not result in mortality in WT or iNWKO mice. Compared to controls, iNWKO mice exhibited higher levels of bacterial shedding during the first 4 days of infection (day 4 average: WT 5.2×10^4 CFU/g vs. iNWKO 4.7×10^5 CFU/g, p=0.08), followed by more rapid clearance of C. rodentium (day 7-12 average: WT 2×10^6 CFU/g vs. iNWKO 2.7×10^5, p=0.01). EM and immunofluorescence revealed a complete lack of actin pedestals in iNWKO mice and no mucosa-associated GFP+ C. rodentium by day 7. WT controls exhibited tight junction disruption, reflected by altered distribution of ZO-1, whereas iNWKO mice had no change in the pattern of ZO-1. Conclusion: Intestinal N-WASP is required for actin pedestal formation by C. rodentium in vivo, and ablation of N-WASP is associated with more rapid bacterial clearance and decreased ability of C. rodentium to disrupt intercellular junctions.
Abstract:
We are currently witnessing a distribution of Information and Communication Technologies (ICT) on a global scale. Yet this distribution proceeds at different rhythms within each nation (and even among regions of a given country), creating a "digital" gap in addition to the multiple inequalities already present.
This computing and technological revolution engenders many changes in social relationships and permits numerous applications destined to simplify our lives. Amine Bekkouche takes a closer look at the issue of e-government as an important consequence of ICT, following the example of electronic commerce. First, he presents a synthesis of the main concepts in e-government as well as a panoramic view of the global situation in this domain. Subsequently, he studies e-government from the perspective of emerging countries, in particular through the illustration of a representative developing country. He then offers concrete solutions, which take the education sector as their starting point, to foster "computer literacy" across society and thereby help reduce the digital gap. Thereafter, he broadens these proposals to other domains and formulates recommendations that facilitate their implementation. Finally, he concludes with perspectives that may constitute further research tracks and enable the elaboration of development projects, through the appropriation of ICT, in order to improve the condition of the administered and, more generally, that of the citizen.
Abstract:
Automatic environmental monitoring networks, supported by wireless communication technologies, nowadays provide large and ever-increasing volumes of data. The use of this information in natural hazard research is an important issue. Particularly useful for risk assessment and decision making are spatial maps of hazard-related parameters produced from point observations and available auxiliary information. The purpose of this article is to present and explore appropriate tools to process large amounts of available data and produce predictions at fine spatial scales. These are the algorithms of machine learning, which are aimed at non-parametric, robust modelling of non-linear dependencies from empirical data. The computational efficiency of the data-driven methods allows prediction maps to be produced in real time, which makes them superior to physical models for operational use in risk assessment and mitigation. This situation is encountered particularly in the spatial prediction of climatic variables (topo-climatic mapping). In the complex topographies of mountainous regions, meteorological processes are highly influenced by the relief. The article shows how these relations, possibly regionalized and non-linear, can be modelled from data using information from digital elevation models. The particular illustration of the developed methodology concerns the mapping of temperatures (including situations of Föhn and temperature inversion) given measurements taken from the Swiss meteorological monitoring network. The range of methods used in the study includes data-driven feature selection, support vector algorithms and artificial neural networks.
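As a deliberately simplified stand-in for the article's data-driven models, a linear lapse-rate regression over a digital elevation model illustrates the basic mapping step (the article uses non-linear learners such as support vector machines and neural networks, not this linear fit, and the station values below are invented):

```python
import numpy as np

def fit_lapse_rate(elevations, temperatures):
    """Least-squares fit of a linear elevation-temperature relation
    (the lapse rate) from station measurements. Returns the
    sea-level intercept and the rate in °C per metre."""
    A = np.column_stack([np.ones_like(elevations), elevations])
    (t0, rate), *_ = np.linalg.lstsq(A, temperatures, rcond=None)
    return t0, rate

def predict_temperature(dem, t0, rate):
    """Map the fitted relation onto a DEM grid to produce a
    temperature map at fine spatial scale."""
    return t0 + rate * dem
```

Föhn and temperature-inversion situations are precisely where such a single linear relation breaks down (temperature can increase with altitude locally), which motivates the article's use of non-linear, regionalized models fed with richer DEM-derived features.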
Abstract:
The motivation for this research arose from the abrupt rise and fall of minicomputers, which were initially used both for industrial automation and business applications due to their significantly lower cost than their predecessors, the mainframes. Later, industrial automation developed its own vertically integrated hardware and software to address the application needs of uninterrupted operations, real-time control and resilience to harsh environmental conditions. This led to the creation of an independent industry, namely industrial automation as used in PLC, DCS, SCADA and robot control systems. This industry today employs over 200,000 people in a profitable, slow-clockspeed context, in contrast to the two mainstream computing industries of information technology (IT), focused on business applications, and telecommunications, focused on communications networks and hand-held devices. Already in the 1990s it was foreseen that IT and communications would merge into one information and communication industry (ICT). The fundamental question of the thesis is: could industrial automation leverage a common technology platform with the newly formed ICT industry? Computer systems dominated by complex instruction set computers (CISC) were challenged during the 1990s by higher-performance reduced instruction set computers (RISC). RISC evolved in parallel with the constant advancement of Moore's law. These developments created the high-performance, low-energy-consumption System-on-Chip (SoC) architecture. Unlike with CISC processors, RISC processor architecture is an industry separate from RISC chip manufacturing. It also has several hardware-independent software platforms consisting of an integrated operating system, development environment, user interface and application market, which give customers more choice through hardware-independent, real-time-capable software applications.
An architecture disruption emerged, and the smartphone and tablet markets were formed with new rules and new key players in the ICT industry. Today there are more RISC computer systems running Linux (or other Unix variants) than any other computer system. The astonishing rise of SoC-based technologies and related software platforms in smartphones created, in unit terms, the largest installed base ever seen in the history of computers and is now being further extended by tablets. An underlying additional element of this transition is the increasing role of open source technologies in both software and hardware. This has driven the microprocessor-based personal computer industry, with its few dominating closed operating system platforms, into a steep decline. A significant factor in this process has been the separation of processor architecture from processor chip production, and the merger of operating systems and application development platforms into integrated software platforms with proprietary application markets. Furthermore, pay-per-click marketing has changed the way application development is compensated: freeware, ad-based or licensed - all at a lower price and used by a wider customer base than ever before. Moreover, the concept of a software maintenance contract is very remote in the app world. However, as a slow-clockspeed industry, industrial automation has remained intact during the disruptions based on SoC and related software platforms in the ICT industries. Industrial automation incumbents continue to supply systems based on vertically integrated systems consisting of proprietary software and proprietary, mainly microprocessor-based, hardware.
They enjoy admirable profitability levels on a very narrow customer base due to strong technology-enabled customer lock-in and customers' high risk leverage, as their production is dependent on fault-free operation of the industrial automation systems. When will this balance of power be disrupted? The thesis suggests how industrial automation could join the mainstream ICT industry and create an information, communication and automation (ICAT) industry. Lately the Internet of Things (IoT) and weightless networks, a new standard leveraging frequency channels earlier occupied by TV broadcasting, have gradually started to change the rigid world of machine-to-machine (M2M) interaction. It is foreseeable that enough momentum will be created that the industrial automation market will in due course face an architecture disruption empowered by these new trends. This thesis examines the current state of industrial automation subject to the competition between the incumbents, firstly through research on cost-competitiveness efforts in captive outsourcing of engineering, research and development, and secondly by researching process re-engineering in the case of complex-system global software support. Thirdly, we investigate the industry actors' - namely customers', incumbents' and newcomers' - views on the future direction of industrial automation, and conclude with our assessments of the possible routes industrial automation could take, taking into account the looming rise of the Internet of Things (IoT) and weightless networks. Industrial automation is an industry dominated by a handful of global players, each of them focusing on maintaining their own proprietary solutions.
The rise of de facto standards like the IBM PC, Unix, Linux and SoC, leveraged by IBM, Compaq, Dell, HP, ARM, Apple, Google, Samsung and others, has created new markets of personal computers, smartphones and tablets, and will eventually also impact industrial automation through game-changing commoditization and related control-point and business-model changes. This trend will inevitably continue, but the transition to a commoditized industrial automation will not happen in the near future.
Abstract:
We report the case study of a French-Spanish bilingual dyslexic girl, MP, who exhibited a severe visual attention (VA) span deficit but preserved phonological skills. Behavioural investigation showed a severe reduction of reading speed for both single items (words and pseudo-words) and texts in the two languages. However, performance was more affected in French than in Spanish. MP was administered an intensive VA span intervention programme. Pre-post intervention comparison revealed a positive effect of the intervention on her VA span abilities. The intervention further transferred to reading. It primarily resulted in faster identification of regular and irregular words in French. The effect of the intervention was rather modest in Spanish, which showed only a tendency for faster word reading. Text reading improved in the two languages, with a stronger effect in French, but pseudo-word reading did not improve in either French or Spanish. The overall results suggest that VA span intervention may primarily enhance the fast global reading procedure, with stronger effects in French than in Spanish. MP underwent two fMRI sessions to explore her brain activations before and after VA span training. Prior to the intervention, fMRI assessment showed that the striate and extrastriate visual cortices alone were activated, but none of the regions typically involved in VA span. Post-training fMRI revealed increased activation of the superior and inferior parietal cortices. Comparison of pre- and post-training activations revealed significant activation increase of the superior parietal lobes (BA 7) bilaterally. Thus, we show that a specific VA span intervention not only modulates reading performance but also results in increased brain activity within the superior parietal lobes, known to house VA span abilities.
Furthermore, positive effects of VA span intervention on reading suggest that the ability to process multiple visual elements simultaneously is one cause of successful reading acquisition.