102 results for COMPUTATIONAL NEUROSCIENCE


Relevance: 20.00%

Publisher:

Abstract:

Bioresorbable polymers such as PLA have an important role to play in the development of temporary implantable medical devices, with significant benefits over traditional therapies. However, development of new devices is hindered by the high manufacturing costs associated with difficulties in processing the material. A major problem is the lack of insight into material degradation during processing. In this work, a method of quantifying PLA degradation using IR spectroscopy coupled with computational chemistry and chemometric modeling is examined. It is shown that the method can predict the quantity of degradation products in solid-state samples with reasonably good accuracy, indicating the potential to adapt the method into an on-line sensor for monitoring PLA degradation in real time during processing.
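To illustrate the chemometric side of such an approach, the sketch below fits a partial least squares (PLS) regression model, a standard chemometric technique, mapping IR spectra to degradation product content. The data are synthetic placeholders and the model choice and parameters are assumptions for illustration, not the authors' pipeline.

```python
# Minimal sketch: PLS regression from IR spectra to degradation product content.
# The spectra and reference values below are synthetic stand-ins; a real model
# would be calibrated against measured spectra and reference chemical analyses.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_samples, n_wavenumbers = 60, 400
spectra = rng.normal(size=(n_samples, n_wavenumbers))   # stand-in IR absorbance spectra
degradation = rng.uniform(0.0, 5.0, size=n_samples)     # stand-in degradation product content (wt%)

X_train, X_test, y_train, y_test = train_test_split(spectra, degradation, random_state=0)
pls = PLSRegression(n_components=5)                      # number of latent variables is a guess
pls.fit(X_train, y_train)
print("R^2 on held-out spectra:", pls.score(X_test, y_test))
```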

Relevance: 20.00%

Publisher:

Abstract:

Background: Oncology is a field that profits tremendously from the genomic data generated by high-throughput technologies, including next-generation sequencing. However, in order to exploit, integrate, visualize and interpret such high-dimensional data efficiently, non-trivial computational and statistical analysis methods are required, and these need to be developed in a problem-directed manner.

Discussion: For this reason, computational cancer biology aims to fill this gap. Unfortunately, computational cancer biology is not yet fully recognized as a coequal field in oncology, leading to a delay in its maturation and, as an immediate consequence, an under-exploration of high-throughput data for translational research.

Summary: Here we argue that this imbalance, favoring 'wet lab-based activities', will be naturally rectified over time if the next generation of scientists receives an academic education that provides a fair and competent introduction to computational biology and its manifold capabilities. Furthermore, we discuss a number of local educational provisions that can be implemented at the university level to help facilitate this process of harmonization.

Relevance: 20.00%

Publisher:

Abstract:

Traditional internal combustion engine vehicles are a major contributor to global greenhouse gas emissions and other air pollutants, such as particulate matter and nitrogen oxides. If tailpipe point emissions could be managed centrally without reducing commercial and personal user functionality, then one of the most attractive solutions for achieving a significant reduction of emissions in the transport sector would be the mass deployment of electric vehicles. Though electric vehicle sales are still hindered by battery performance, cost and a few other technological bottlenecks, focused commercialisation and support from government policies are encouraging large-scale electric vehicle adoption. The mass proliferation of plug-in electric vehicles is likely to bring a significant additional electric load onto the grid, creating a highly complex operational problem for power system operators. Electric vehicle batteries also have the ability to act as energy storage points on the distribution system. This dual charging-and-storage impact of many small, largely uncontrollable kW-scale loads (consumers will want maximum flexibility) on a distribution system that was not originally designed for such operation has the potential to be detrimental to grid balancing. If established correctly, intelligent scheduling methods could integrate electric vehicles onto the grid smoothly, helping to avoid cycling of large combustion plants and the use of expensive fossil-fuel peaking plant, to match renewable generation to electric vehicle charging, and to prevent overloading of the distribution system and the resulting loss of power quality. In this paper, state-of-the-art scheduling methods for integrating plug-in electric vehicles are reviewed, examined and categorised according to their computational techniques. Approaches covering analytical scheduling, conventional optimisation methods (e.g. linear and non-linear mixed-integer programming and dynamic programming), game theory, and meta-heuristic algorithms including genetic algorithms and particle swarm optimisation are all comprehensively surveyed, offering a systematic reference for grid scheduling with intelligent electric vehicle integration.
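As a concrete illustration of one class of methods covered by such surveys (conventional optimisation via linear programming), the sketch below schedules a single vehicle's overnight charging as a small linear program. The prices, charger limit and energy requirement are invented for the example and are not taken from the paper.

```python
# Minimal sketch: schedule one EV's charging over 24 hourly slots to minimise
# energy cost, subject to a total-energy requirement and a per-hour charger limit.
# All numbers below are illustrative assumptions.
import numpy as np
from scipy.optimize import linprog

prices = np.array([0.12, 0.11, 0.10, 0.09, 0.09, 0.10, 0.14, 0.18,
                   0.20, 0.19, 0.18, 0.17, 0.17, 0.16, 0.16, 0.17,
                   0.19, 0.22, 0.24, 0.22, 0.18, 0.15, 0.13, 0.12])  # price per kWh
max_charge_kwh = 7.0    # energy the charger can deliver in one hour
required_kwh = 30.0     # energy the battery must receive by morning

# minimise prices . x  subject to  sum(x) = required_kwh,  0 <= x_t <= max_charge_kwh
res = linprog(c=prices,
              A_eq=np.ones((1, 24)), b_eq=[required_kwh],
              bounds=[(0.0, max_charge_kwh)] * 24)
print("hourly charging (kWh):", np.round(res.x, 2))
print("total cost:", round(res.fun, 2))
```

Mixed-integer and dynamic programming formulations extend this idea to fleets, network constraints and vehicle-to-grid storage, which is where the computational burden the review discusses arises.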

Relevance: 20.00%

Publisher:

Abstract:

The energetics of the low-temperature adsorption and decomposition of nitrous oxide, N2O, on flat and stepped platinum surfaces were calculated using density-functional theory (DFT). The results show that the preferred adsorption site for N2O is an atop site, bound upright via the terminal nitrogen. The molecule is only weakly chemisorbed to the platinum surface. The decomposition barriers on flat (111) surfaces and stepped (211) surfaces are similar. While the barrier for N2O dissociation is relatively small, the surface rapidly becomes poisoned by adsorbed oxygen. These findings are supported by experimental results of pulsed N2O decomposition with 5% Pt/SiO2 and bismuth-modified Pt/C catalysts. At low temperature, decomposition occurs but self-poisoning by O(ads) prevents further decomposition. At higher temperatures some desorption of O2 is observed, allowing continued catalytic activity. The study with bismuth-modified Pt/C catalysts showed that, although the activation barriers calculated for both terraces and steps were similar, the actual rate was different for the two surfaces. Steps were found experimentally to be more active than terraces, and this is attributed to differences in the pre-exponential term.
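To make this kind of calculation concrete, the sketch below builds an N2O molecule on an atop site of a Pt(111) slab with ASE and relaxes the geometry. It uses ASE's simple EMT potential only so the script runs without a DFT code; the paper's results come from proper DFT, and the slab size, adsorption height and other parameters here are assumptions.

```python
# Minimal sketch: build N2O adsorbed atop a Pt(111) slab and relax the geometry.
# EMT is used as a rough stand-in calculator so the example is self-contained;
# a real study would attach a DFT calculator (e.g. GPAW or VASP) instead.
from ase.build import fcc111, molecule, add_adsorbate
from ase.calculators.emt import EMT
from ase.optimize import BFGS

slab = fcc111('Pt', size=(3, 3, 4), vacuum=10.0)          # flat Pt(111) slab
n2o = molecule('N2O')                                      # linear N-N-O molecule
add_adsorbate(slab, n2o, height=2.0, position='ontop')     # atop site, assumed height

slab.calc = EMT()
BFGS(slab, logfile=None).run(fmax=0.05)
print("relaxed total energy (eV):", slab.get_potential_energy())
```

An adsorption energy would then follow from separate calculations of the clean slab and the gas-phase molecule, E_ads = E(slab+N2O) - E(slab) - E(N2O).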

Relevance: 20.00%

Publisher:

Abstract:

In this paper, we present a meeting report for the 2nd Summer School in Computational Biology organized by Queen's University Belfast. We describe the organization of the summer school, its underlying concept, and the student feedback we received after its completion.

Relevance: 20.00%

Publisher:

Abstract:

The piezoresistance effect is defined as the change in electrical resistance due to applied mechanical stress. Silicon has a relatively large piezoresistance effect, which has been known since 1954. A four-point bending setup is proposed and designed to analyze the piezoresistance effect in p-type silicon. This setup is used to apply uniform, uniaxial stress along the <110> crystal direction. The main aim of this work is to investigate the piezoresistive characteristics of p-type resistors as a function of doping concentration using COMSOL Multiphysics. Simulation results are compared with experimental data.
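For orientation, the short worked example below evaluates the first-order piezoresistive response of a p-type silicon resistor under uniaxial stress along <110>, using Smith's (1954) room-temperature coefficients for lightly doped material. It is a back-of-the-envelope illustration, not the paper's COMSOL model, and the applied stress value is arbitrary.

```python
# Worked example: fractional resistance change dR/R = pi_l*sigma_l + pi_t*sigma_t
# for p-type Si with current along <110>. Coefficients are Smith's (1954)
# room-temperature values for lightly doped p-type silicon, in 1/Pa.
pi11, pi12, pi44 = 6.6e-11, -1.1e-11, 138.1e-11

# Longitudinal and transverse coefficients for the <110> direction:
pi_l = 0.5 * (pi11 + pi12 + pi44)
pi_t = 0.5 * (pi11 + pi12 - pi44)

sigma = 100e6  # assumed 100 MPa uniaxial stress along <110>
print("longitudinal dR/R:", pi_l * sigma)   # resistor aligned with the stress
print("transverse  dR/R:", pi_t * sigma)    # resistor perpendicular to the stress
```

At higher doping concentrations these coefficients decrease, which is the kind of dependence the simulations in the paper explore.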

Relevance: 20.00%

Publisher:

Abstract:

Background: Large-scale biological jobs on high-performance computing systems require manual intervention if one or more of the computing cores on which they execute fail. This imposes not only the cost of maintaining the job, but also the cost of the time taken to reinstate it and the risk of losing the data and execution the job had accomplished before it failed. Approaches that can proactively detect computing core failures and relocate the affected core's work onto reliable cores can make a significant step towards automating fault tolerance.

Method: This paper describes an experimental investigation into the use of multi-agent approaches for fault tolerance. Two approaches are studied, the first at the job level and the second at the core level. The approaches are investigated for single-core failure scenarios that can occur in the execution of parallel reduction algorithms on computer clusters. A third approach is proposed that incorporates multi-agent technology at both the job and core level. Experiments are pursued in the context of genome searching, a popular computational biology application.

Result: The key conclusion is that the proposed approaches are feasible for automating fault tolerance in high-performance computing systems with minimal human intervention. In a typical experiment in which fault tolerance is studied, centralised and decentralised checkpointing approaches on average add 90% to the actual time for executing the job. In the same experiment, the multi-agent approaches add only 10% to the overall execution time.
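To make the core-level idea concrete, the sketch below is a toy simulation of an agent that monitors core health and migrates a task to a healthy core before work is lost. The class names, health probe and failure probabilities are hypothetical illustrations, not the paper's framework.

```python
# Toy sketch of proactive, agent-style fault tolerance: a per-core agent exposes a
# health check, and the task is migrated to a healthy core when its current core
# looks likely to fail. All names and probabilities here are illustrative.
import random

class CoreAgent:
    def __init__(self, core_id):
        self.core_id = core_id
        self.healthy = True

    def heartbeat(self):
        # Stand-in health probe; a real agent would inspect hardware counters or logs.
        if self.healthy and random.random() < 0.05:
            self.healthy = False
        return self.healthy

def run_with_fault_tolerance(steps, cores):
    current = cores[0]
    for step in range(steps):
        if not current.heartbeat():
            spare = next((c for c in cores if c.healthy), None)
            if spare is None:
                raise RuntimeError("no healthy cores left")
            print(f"step {step}: core {current.core_id} suspect, migrating to core {spare.core_id}")
            current = spare
        # ... execute one checkpointed chunk of the parallel reduction here ...
    print("task completed")

run_with_fault_tolerance(20, [CoreAgent(i) for i in range(4)])
```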

Relevance: 20.00%

Publisher:

Abstract:

Symposium of papers on Computational Thinking