856 results for System analysis - Data processing


Relevance: 100.00%

Abstract:

Deep brain stimulator devices are becoming widely used for their therapeutic benefits in movement disorders such as Parkinson's disease. Prolonging the battery life span of such devices could dramatically reduce the risks and cumulative costs associated with surgical replacement. This paper demonstrates how an artificial neural network can be trained, using frequency analysis of deep brain electrode recordings as a pre-processing step, to detect the onset of tremor in Parkinsonian patients. Implementing this solution in an 'intelligent' neurostimulator device would remove the need for the continuous stimulation currently used and open up the possibility of demand-driven stimulation. Such a methodology could decrease the power consumption of a deep brain pulse generator.
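
As a rough illustration of the pipeline described here, the sketch below pairs FFT band-power features with a small feed-forward network on synthetic recordings; the sampling rate, frequency bands, window length and network size are illustrative assumptions, not values from the paper.

```python
# Illustrative sketch: band-power features from simulated LFP windows
# feeding a small neural-network tremor classifier. Band choices, window
# length and network shape are assumptions, not the paper's values.
import numpy as np
from sklearn.neural_network import MLPClassifier

FS = 512            # sampling rate (Hz), assumed
WIN = FS * 2        # 2-second analysis window

def band_powers(window, fs=FS, bands=((3, 8), (8, 13), (13, 30))):
    """Mean spectral power in each frequency band of one window."""
    spec = np.abs(np.fft.rfft(window)) ** 2
    freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
    return np.array([spec[(freqs >= lo) & (freqs < hi)].mean()
                     for lo, hi in bands])

rng = np.random.default_rng(0)

def synth_window(tremor):
    """Synthetic LFP: noise plus a ~5 Hz component when tremor is present."""
    t = np.arange(WIN) / FS
    x = rng.normal(scale=1.0, size=WIN)
    if tremor:
        x += 2.0 * np.sin(2 * np.pi * 5.0 * t)   # tremor-band oscillation
    return x

X = np.array([band_powers(synth_window(i % 2)) for i in range(400)])
y = np.array([i % 2 for i in range(400)])        # 1 = tremor onset

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0)
clf.fit(X[:300], y[:300])
print("held-out accuracy:", clf.score(X[300:], y[300:]))
```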

Relevance: 100.00%

Abstract:

Medical universities and teaching hospitals in Iraq are facing a shortage of professional staff, because the ongoing violence forces them to flee the country. These professionals are now dispersed abroad, which reduces the chances for staff and students to be physically in one place to continue teaching, and limits the efficiency of consultations in hospitals. A survey was conducted among students and professional staff in Iraq to identify the problems in the learning and clinical systems and to establish how Information and Communication Technology could address them. The survey showed that 86% of the participants use the Internet as a learning resource and 25% for clinical purposes, while fewer than 11% use it for collaboration between different institutions. A web-based collaborative tool is proposed to improve the teaching and clinical systems. The tool helps users collaborate remotely, raising the quality of the learning system, and can also be used for remote medical consultation in hospitals.

Relevance: 100.00%

Abstract:

The intelligent controlling mechanism of a typical mobile robot is usually a computer system. Research is now ongoing, however, in which biological neural networks are cultured and trained to act as the brain of an interactive real-world robot, thereby either completely replacing a computer system or operating in cooperation with one. Studying such neural systems can give a distinct insight into biological neural structures, and such research therefore has immediate medical implications. The principal aims of the present research are to assess the computational and learning capacity of dissociated cultured neuronal networks, with a view to advancing network-level processing in artificial neural networks. This is approached through the creation of an artificial hybrid system (animat) involving closed-loop control of a mobile robot by a dissociated culture of rat neurons. This paper details the components of the overall animat closed-loop system architecture and reports the evaluation of results from preliminary real-life and simulated robot experiments.
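
The closed loop itself can be summarised as a four-stage cycle: record culture activity, decode it into a motor command, sense the environment, and feed that back as stimulation. The sketch below is a deliberately simplified stand-in for that architecture; every interface in it (spike-rate readout, differential-drive mapping, distance-driven stimulation) is hypothetical and not the authors' actual MEA or robot API.

```python
# Minimal sketch of the closed-loop animat cycle described above:
# culture activity drives the robot, and robot sensing drives stimulation.
# All interfaces here are hypothetical stand-ins.
import random

def read_spike_rates():
    """Stand-in for reading firing rates from two electrode regions (Hz)."""
    return random.uniform(0, 50), random.uniform(0, 50)

def drive_robot(left_rate, right_rate):
    """Stand-in motor mapping: differential drive from regional activity."""
    speed = (left_rate + right_rate) / 100.0
    turn = (right_rate - left_rate) / 100.0
    return speed, turn

def sense_obstacle_distance():
    """Stand-in for the robot's ultrasonic range sensor (metres)."""
    return random.uniform(0.05, 1.0)

def stimulate(pulse_rate_hz):
    """Stand-in for delivering a stimulation train to the culture."""
    print(f"stimulating culture at {pulse_rate_hz:.1f} Hz")

for step in range(3):                       # a few iterations of the loop
    left, right = read_spike_rates()        # 1. record culture activity
    speed, turn = drive_robot(left, right)  # 2. decode into motor command
    distance = sense_obstacle_distance()    # 3. robot senses environment
    stimulate(10.0 / max(distance, 0.05))   # 4. closer obstacle -> faster train
```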

Relevance: 100.00%

Abstract:

Classical computer vision methods can only weakly emulate the multi-level parallelism in signal processing and information sharing that takes place in different parts of the primate visual system, which enables it to accomplish many diverse functions of visual perception. One of the main functions of primate vision is to detect and recognise objects in natural scenes despite all the linear and non-linear variations of the objects and their environment. The superior performance of the primate visual system compared with what machine vision systems have achieved to date motivates scientists and researchers to explore this area further, in pursuit of more efficient vision systems inspired by natural models. In this paper, building blocks for an efficient hierarchical object recognition model are proposed. Incorporating attention-based processing would lead to a system that processes visual data non-linearly, focusing only on regions of interest and thereby reducing processing time towards real-time performance. It is further suggested to modify the visual cortex model for recognising objects by adding non-linearities in the ventral path, consistent with earlier discoveries reported in the neurophysiology of vision.
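
To make the attention argument concrete, the sketch below scores image patches with a crude saliency measure (local variance) and runs a placeholder recognition stage only on the most salient region; the saliency measure, patch size and synthetic image are assumptions for illustration, not the model proposed in the paper.

```python
# Hedged sketch of an attention stage: score patches by local variance,
# then process only the most salient region instead of the full frame.
import numpy as np

def saliency_map(img, patch=16):
    """Local-variance saliency over non-overlapping patches."""
    h, w = img.shape[0] // patch, img.shape[1] // patch
    sal = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            sal[i, j] = img[i*patch:(i+1)*patch, j*patch:(j+1)*patch].var()
    return sal

def recognise(region):
    """Placeholder for the hierarchical recognition stage."""
    return f"processed ROI of shape {region.shape}"

rng = np.random.default_rng(1)
img = rng.normal(size=(128, 128))
img[32:64, 64:96] *= 4.0                 # a high-variance "object" region

sal = saliency_map(img)
i, j = np.unravel_index(sal.argmax(), sal.shape)   # most salient patch
roi = img[i*16:(i+1)*16, j*16:(j+1)*16]            # attend only here
print(recognise(roi))
```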

Relevance: 100.00%

Abstract:

This paper discusses the issues that arise in multi-organisational collaborative groups (MOCGs) in the public sector and how a technology-based group support system (GSS) could assist individuals within these groups. MOCGs are commonly used in the public sector to find solutions to multifaceted social problems. Finding solutions to such problems is difficult because their scope lies outside the boundary of a single government agency. The standard approach to solving them is collaborative, involving a diverse range of stakeholders. Collaborative working can be advantageous, but it also introduces its own pressures. Conflicts can arise from the multiple contexts and goals of group members and the organisations they represent. Trust, communication and a shared interface are crucial to making any significant progress. A GSS could support these elements.

Relevance: 100.00%

Abstract:

Compute grids are used widely in many areas of environmental science, but there has been limited uptake of grid computing by the climate modelling community, partly because the characteristics of many climate models make them difficult to use with popular grid middleware systems. In particular, climate models usually produce large volumes of output data, and running them also involves complicated workflows implemented as shell scripts. A new grid middleware system that is well suited to climate modelling applications is presented in this paper. Grid Remote Execution (G-Rex) allows climate models to be deployed as Web services on remote computer systems and then launched and controlled as if they were running on the user's own computer. Output from the model is transferred back to the user while the run is in progress to prevent it from accumulating on the remote system and to allow the user to monitor the model. G-Rex has a REST architectural style, featuring a Java client program that can easily be incorporated into existing scientific workflow scripts. Some technical details of G-Rex are presented, with examples of its use by climate modellers.
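
The interaction pattern this implies, submit a job and then repeatedly pull status and any new output while the run progresses, can be sketched generically as below. The host name, endpoint paths and JSON fields are invented for illustration and are not G-Rex's actual REST interface.

```python
# Generic sketch of the remote-execution pattern the paper describes:
# submit a job to a service, then stream output back while it runs so
# data never accumulates on the remote system. Endpoints are invented.
import time
import requests

BASE = "http://cluster.example.org/grex"          # hypothetical service URL

job = requests.post(f"{BASE}/jobs",
                    json={"model": "nemo", "config": "run.cfg"}).json()
job_id = job["id"]

while True:
    status = requests.get(f"{BASE}/jobs/{job_id}").json()
    # fetch any new output so it never piles up on the remote cluster
    chunk = requests.get(f"{BASE}/jobs/{job_id}/output").content
    if chunk:
        with open("model_output.nc", "ab") as f:
            f.write(chunk)
    if status["state"] in ("finished", "failed"):
        break
    time.sleep(30)                                # poll interval, assumed
```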

Relevance: 100.00%

Abstract:

G-Rex is lightweight Java middleware that allows scientific applications deployed on remote computer systems to be launched and controlled as if they were running on the user's own computer. G-Rex is particularly suited to ocean and climate modelling applications because output from the model is transferred back to the user while the run is in progress, which prevents the accumulation of large amounts of data on the remote cluster. The G-Rex server is a RESTful Web application that runs inside a servlet container on the remote system, and the client component is a Java command line program that can easily be incorporated into existing scientific workflow scripts. The NEMO and POLCOMS ocean models have been deployed as G-Rex services in the NERC Cluster Grid, and G-Rex is the core grid middleware in the GCEP and GCOMS e-science projects.

Relevance: 100.00%

Abstract:

This paper proposes a conceptual model of a context-aware group support system (GSS) to assist local council employees to perform collaborative tasks in conjunction with inter- and intra-organisational stakeholders. Most discussions about e-government focus on the use of ICT to improve the relationship between government and citizen, not on the relationship between government and employees. This paper seeks to expose the unique culture of UK local councils and to show how a GSS could support local government employer and employee needs.

Relevance: 100.00%

Abstract:

Accurate seasonal forecasts rely on the presence of low-frequency, predictable signals in the climate system which have a sufficiently well understood and significant impact on the atmospheric circulation. In the Northern European region, signals associated with seasonal-scale variability, such as ENSO, North Atlantic SST anomalies and the North Atlantic Oscillation, have not yet proven sufficient to enable satisfactorily skilful dynamical seasonal forecasts. The winter-time circulations of the stratosphere and troposphere are highly coupled, so additional seasonal forecasting skill may be gained by including a realistic stratosphere in models. In this study we assess the ability of five seasonal forecasting models to simulate the Northern Hemisphere extra-tropical winter-time stratospheric circulation. Our results show that all of the models have a polar night jet which is too weak and displaced southward compared to re-analysis data. The models also underestimate the number, magnitude and duration of periods of anomalous stratospheric circulation. Despite this poor representation of the general circulation of the stratosphere, the results indicate that there may be a detectable tropospheric response following anomalous circulation events in the stratosphere. However, the models fail to exhibit any predictability in their forecasts. These results highlight some of the deficiencies of current seasonal forecasting models with a poorly resolved stratosphere. Combined with other recent studies showing a tropospheric response to stratospheric variability, they demonstrate a real prospect for improving the skill of seasonal forecasts.
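
A standard diagnostic behind such a comparison is the zonal-mean zonal wind at 10 hPa near 60°N, a common measure of the polar night jet. The sketch below computes that metric and its bias from synthetic stand-ins for model and re-analysis winds; the grid, values and size of the bias are invented for illustration.

```python
# Sketch of a polar-night-jet diagnostic: zonal-mean zonal wind at
# 10 hPa, 60N, for model versus re-analysis. Synthetic (time, lon)
# arrays stand in for winds at that level and latitude.
import numpy as np

rng = np.random.default_rng(2)
days = 90                                     # one winter season
u_reanalysis = 30 + 8 * rng.normal(size=(days, 144))   # m/s, (time, lon)
u_model = 24 + 6 * rng.normal(size=(days, 144))        # weaker jet, assumed

jet_rean = u_reanalysis.mean(axis=1)          # zonal mean for each day
jet_model = u_model.mean(axis=1)

print(f"mean jet, re-analysis: {jet_rean.mean():5.1f} m/s")
print(f"mean jet, model:       {jet_model.mean():5.1f} m/s")
print(f"weak bias:             {jet_model.mean() - jet_rean.mean():5.1f} m/s")
```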

Relevance: 100.00%

Abstract:

The accurate prediction of storms is vital to the oil and gas sector for the management of its operations. An overview of research exploring the prediction of storms by ensemble prediction systems is presented, and its application to the oil and gas sector is discussed. The analysis method used requires larger amounts of data storage and computer processing time than more conventional analysis methods. To overcome these difficulties, eScience techniques have been utilised. These techniques potentially have applications in the oil and gas sector, helping to incorporate environmental data into its information systems.

Relevance: 100.00%

Abstract:

This paper reports on a new satellite sensor, the Geostationary Earth Radiation Budget (GERB) experiment. GERB is designed to make the first measurements of the Earth's radiation budget from geostationary orbit. Measurements of the sunlight reflected from the Earth and of the thermal radiation emitted by the Earth are made at high absolute accuracy every 15 min, with a spatial resolution at the subsatellite point of 44.6 km (north–south) by 39.3 km (east–west). With knowledge of the incoming solar constant, this gives the primary forcing and response components of the top-of-atmosphere radiation. The first GERB instrument is an instrument of opportunity on Meteosat-8, a new spin-stabilized spacecraft platform also carrying the Spinning Enhanced Visible and Infrared Imager (SEVIRI) sensor, which is currently positioned over the equator at 3.5°W. This overview of the project includes a description of the instrument design and its preflight and in-flight calibration. An evaluation of the instrument performance after its first year in orbit, including comparisons with data from the Clouds and the Earth's Radiant Energy System (CERES) satellite sensors and with output from numerical models, is also presented. After a brief summary of the data processing system and data products, some of the scientific studies being undertaken with these early data are described. This marks the beginning of a decade or more of observations from GERB, as subsequent models will fly on each of the four Meteosat Second Generation satellites.
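
The budget closure mentioned here is simple arithmetic: given the incoming solar flux, the two measured components (reflected shortwave and emitted longwave) determine the net top-of-atmosphere flux. The numbers below are typical global-mean values for illustration, not GERB retrievals.

```python
# Worked sketch of the top-of-atmosphere budget: with incoming solar
# flux known, reflected shortwave and emitted longwave close the budget.
S0 = 1361.0                  # solar constant, W m^-2
incoming = S0 / 4.0          # global-mean insolation at TOA
reflected_sw = 100.0         # reflected sunlight, W m^-2 (typical value)
emitted_lw = 239.0           # outgoing thermal radiation, W m^-2 (typical)

albedo = reflected_sw / incoming
net_toa = incoming - reflected_sw - emitted_lw   # >0 means net energy gain
print(f"planetary albedo: {albedo:.2f}")
print(f"net TOA flux:     {net_toa:+.1f} W m^-2")
```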

Relevance: 100.00%

Abstract:

Airborne dust is of concern due to hazards in the localities affected by erosion, transport and deposition, but it is also of global concern due to uncertainties over its role in the radiative forcing of climate. In order to model the environmental impact of dust, we need better knowledge of sources and transport processes. Satellite remote sensing has been instrumental in providing this knowledge, through long time series of observations of atmospheric dust transport. Three remote sensing methodologies have been used, and are reviewed briefly in this paper: firstly, the use of observations from the Total Ozone Mapping Spectrometer (TOMS); secondly, the use of the Infrared Difference Dust Index (IDDI) from Meteosat infrared data; and thirdly, the use of MODIS images from the rapid response system. These data have highlighted the major global sources of dust, most of which are associated with endoreic drainage basins in deserts that held lakes during Quaternary humid climate phases, and have identified the Bodélé Depression in Chad as the dustiest place on Earth.
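
As commonly described, the IDDI exploits the fact that dust depresses the infrared signal over land: a per-pixel clear-sky reference (the maximum over a multi-day window) minus the observed value indexes the dust loading. The sketch below illustrates that construction on synthetic counts; the window length and magnitudes are assumptions for illustration.

```python
# Hedged sketch of the IDDI idea: build a clear-sky reference as the
# per-pixel maximum over a multi-day window, then difference each day
# against it; larger differences indicate heavier dust loading.
import numpy as np

rng = np.random.default_rng(3)
# synthetic IR counts for one pixel over 15 days; day 10 is dusty
ir_counts = 280 + rng.normal(scale=1.0, size=15)
ir_counts[10] -= 12.0                    # dust depresses the IR signal

reference = ir_counts.max()              # clear-sky reference for the window
iddi = reference - ir_counts             # per-day dust index
print("dustiest day:", iddi.argmax(), "IDDI =", round(iddi.max(), 1))
```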

Relevance: 100.00%

Abstract:

This paper describes the user modeling component of EPIAIM, a consultation system for data analysis in epidemiology. The component is aimed at representing knowledge of concepts in the domain, so that their explanations can be adapted to user needs. The first part of the paper describes two studies aimed at analysing user requirements. The first is a questionnaire study which examines the respondents' familiarity with concepts. The second is an analysis of concept descriptions in textbooks and from expert epidemiologists, which examines how discourse strategies are tailored to the level of experience of the expected audience. The second part of the paper describes how the results of these studies have been used to design the user modeling component of EPIAIM. This component works in two steps. In the first step, a few trigger questions activate a stereotype that includes a "body" and an "inference component". The body represents the body of knowledge that a class of users is expected to know, along with the probability that each item is known. In the inference component, the process of learning concepts is represented as a belief network. In the second step, the belief network is used to refine the initial default information in the stereotype's body. This is done by asking a few questions about those concepts for which it is uncertain whether they are known to the user, and propagating this new evidence to revise the whole situation. The system has been implemented on a workstation under UNIX. An example of its operation is presented, and advantages and limitations of the approach are discussed.
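
A minimal sketch of the two-step scheme, with invented concepts, priors and link strengths: the stereotype supplies prior probabilities that each concept is known, the most uncertain concept is chosen as the question, and the answer is propagated to a linked concept through an assumed conditional probability table.

```python
# Minimal sketch of the two-step user model: stereotype priors, then a
# question whose answer revises a linked concept via a conditional table.
# Concepts, priors and link strengths are invented for illustration.
stereotype = {"relative risk": 0.8, "odds ratio": 0.5, "confounding": 0.5}

# P(knows "confounding" | knows / doesn't know "odds ratio"), assumed
P_CONF_GIVEN_OR = {True: 0.8, False: 0.2}

def marginal_conf(p_or):
    """P(knows confounding) implied by the current belief about odds ratio."""
    return (P_CONF_GIVEN_OR[True] * p_or
            + P_CONF_GIVEN_OR[False] * (1.0 - p_or))

# Step 2: ask about the most uncertain concept (probability closest to 0.5)
target = min(stereotype, key=lambda c: abs(stereotype[c] - 0.5))
print("ask about:", target)

stereotype[target] = 1.0                        # user reports knowing it
stereotype["confounding"] = marginal_conf(1.0)  # propagate the evidence
print(stereotype)
```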

Relevance: 100.00%

Abstract:

The Observing System Research and Predictability Experiment (THORPEX) Interactive Grand Global Ensemble (TIGGE) is a World Weather Research Programme project. One of its main objectives is to enhance collaboration on the development of ensemble prediction between operational centers and universities by increasing the availability of ensemble prediction system (EPS) data for research. This study analyzes the prediction of Northern Hemisphere extratropical cyclones by nine different EPSs archived as part of the TIGGE project for the 6-month time period of 1 February 2008–31 July 2008, which included a sample of 774 cyclones. An objective feature-tracking method has been used to identify and track the cyclones along the forecast trajectories. Forecast verification statistics have then been produced [using the European Centre for Medium-Range Weather Forecasts (ECMWF) operational analysis as the truth] for cyclone position, intensity, and propagation speed, showing large differences between the different EPSs. The results show that the ECMWF ensemble mean and control have the highest level of skill for all cyclone properties. The Japan Meteorological Agency (JMA), the National Centers for Environmental Prediction (NCEP), the Met Office (UKMO), and the Canadian Meteorological Centre (CMC) have 1 day less skill for the position of cyclones throughout the forecast range. The relative performance of the different EPSs remains the same for cyclone intensity except for NCEP, which has larger errors than for position. NCEP, the Centro de Previsão de Tempo e Estudos Climáticos (CPTEC), and the Australian Bureau of Meteorology (BoM) all have faster intensity error growth in the earlier part of the forecast. They are also very underdispersive and significantly underpredict intensities, perhaps because the comparatively low spatial resolutions of these EPSs cannot accurately represent the tilted structure essential to cyclone growth and decay. There is very little difference between the levels of skill of the ensemble mean and control for cyclone position, but the ensemble mean provides an advantage over the control for all EPSs except CPTEC in cyclone intensity, and there is an advantage for propagation speed for all EPSs. ECMWF and JMA have an excellent spread–skill relationship for cyclone position. The EPSs are all much more underdispersive for cyclone intensity and propagation speed than for position, with ECMWF and CMC performing best for intensity and CMC performing best for propagation speed. ECMWF is the only EPS to consistently overpredict cyclone intensity, although the bias is small. BoM, NCEP, UKMO, and CPTEC significantly underpredict intensity and, interestingly, all the EPSs underpredict the propagation speed; that is, the cyclones move too slowly on average in all EPSs.
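
The position-error and ensemble-mean comparisons rest on simple track geometry: great-circle separation between forecast and analysed cyclone centres. The sketch below illustrates that arithmetic with invented member positions; the real study computes it along matched tracks over the full cyclone sample.

```python
# Sketch of the track-verification arithmetic: great-circle separation
# between forecast and analysed cyclone centres, for individual members
# and for the ensemble-mean position. Coordinates are invented.
import numpy as np

def great_circle_km(lat1, lon1, lat2, lon2):
    """Haversine distance between two points, in km."""
    lat1, lon1, lat2, lon2 = map(np.radians, (lat1, lon1, lat2, lon2))
    a = (np.sin((lat2 - lat1) / 2) ** 2
         + np.cos(lat1) * np.cos(lat2) * np.sin((lon2 - lon1) / 2) ** 2)
    return 6371.0 * 2 * np.arcsin(np.sqrt(a))

analysis = (55.0, -20.0)                      # analysed cyclone centre
rng = np.random.default_rng(4)
members = np.column_stack([55.0 + rng.normal(0, 1.5, 20),
                           -20.0 + rng.normal(0, 3.0, 20)])

member_err = np.array([great_circle_km(la, lo, *analysis)
                       for la, lo in members])
mean_pos = members.mean(axis=0)               # ensemble-mean position
mean_err = great_circle_km(*mean_pos, *analysis)

print(f"mean member error:   {member_err.mean():6.1f} km")
# for random position errors this is typically below the member mean
print(f"ensemble-mean error: {mean_err:6.1f} km")
```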

Relevance: 100.00%

Abstract:

Recent developments in the fields of veterinary epidemiology and economics are critically reviewed and assessed. The impacts of recent technological developments in diagnosis, genetic characterisation, data processing and statistical analysis are evaluated. It is concluded that the acquisition and availability of data remain the principal constraint on the application of available techniques in veterinary epidemiology and economics, especially at the population level. As more commercial producers use computerised management systems, the availability of data for analysis within herds is improving. However, consistency of recording and diagnosis remains problematic. National livestock databases, recently developed to provide reassurance to consumers about the safety and traceability of livestock products, are potentially valuable sources of data that could lead to much more effective application of veterinary epidemiology and economics. These opportunities will be greatly enhanced if data from different sources, such as movement recording, official animal health programmes, quality assurance schemes, production recording and breed societies, can be integrated. However, in order to realise such integrated databases, it will be necessary to provide absolute control of user access to guarantee data security and confidentiality. The potential applications of integrated livestock databases in analysis, modelling, decision support, and the provision of management information for veterinary services and livestock producers are discussed.