937 results for parallel computer systems
Abstract:
Published in "Information control in manufacturing 1998 : (INCOM'98) : advances in industrial engineering : a proceedings volume from the 9th IFAC Symposium, Nancy-Metz, France, 24-26 June 1998. Vol. 2"
Abstract:
The exponential growth of data traffic is one of the greatest challenges communication systems face today, as they must support ever higher data-processing speeds. In particular, power consumption has become one of the most critical design parameters, creating the need to investigate new architectures and algorithms for digital information processing. Moreover, analysing and evaluating new processing techniques is difficult at the high speeds involved, and software-based simulation is often an inefficient method. In this context, programmable electronics offers a low-cost opportunity not only to evaluate new high-speed design techniques but also to validate their implementation in technological developments. The main objective of this project is the study and development of new architectures and algorithms on programmable electronics for high-speed data processing. The method will be programming FPGA (Field-Programmable Gate Array) devices, which offer a good cost-benefit ratio and great flexibility for integration with other communication devices. CAD (Computer-Aided Design) tools oriented to digital electronic systems will be used for the design, simulation and programming stages. The project will benefit undergraduate and graduate students in degree programmes related to computing and telecommunications, contributing to final-year projects and doctoral theses. The results will be published in national and international conferences and/or journals and disseminated through outreach talks and/or meetings. The project falls within an area of great importance for the Province of Córdoba, namely computing and telecommunications, and promises to generate high-value knowledge that can be transferred to technology companies of the Province of Córdoba through consultancy or product development.
Abstract:
Advances in computer memory technology justify research towards new and different views on computer organization. This paper proposes a novel memory-centric computing architecture that merges memory and processing elements in order to provide better conditions for parallelization and performance. The paper introduces the architectural concepts and then presents the design and implementation of a corresponding assembler and simulator.
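The abstract stays at the level of concepts; as a rough illustration of what a simulator for such an architecture might reduce to, here is a minimal Python sketch of an interpreter whose instructions operate directly on memory cells, with no register file. The three-instruction ISA and all names are invented for illustration and are not taken from the paper.

```python
# Minimal sketch of a simulator for a hypothetical memory-centric ISA.
# The three-operand memory-to-memory format is an assumption made for
# illustration; the paper's actual instruction set is not reproduced here.

def run(program, memory):
    """Execute (op, dst, src1, src2) tuples directly on a flat memory."""
    pc = 0
    while pc < len(program):
        op, dst, a, b = program[pc]
        if op == "ADD":        # mem[dst] <- mem[a] + mem[b]
            memory[dst] = memory[a] + memory[b]
        elif op == "MUL":      # mem[dst] <- mem[a] * mem[b]
            memory[dst] = memory[a] * memory[b]
        elif op == "JNZ":      # jump to instruction b if mem[a] != 0
            if memory[a] != 0:
                pc = b
                continue
        pc += 1
    return memory

mem = [0] * 8
mem[1], mem[2] = 3, 4
print(run([("ADD", 0, 1, 2), ("MUL", 3, 0, 0)], mem))  # [7, 3, 4, 49, ...]
```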
Abstract:
Therapeutic drug monitoring (TDM) aims to optimize treatments by individualizing dosage regimens based on the measurement of blood concentrations. Dosage individualization to maintain concentrations within a target range requires pharmacokinetic and clinical capabilities. Bayesian calculation currently represents the gold-standard TDM approach but requires computation assistance. In recent decades, computer programs have been developed to assist clinicians in this task. The aim of this survey was to assess and compare computer tools designed to support TDM clinical activities. The literature and the Internet were searched to identify software. All programs were tested on personal computers. Each program was scored against a standardized grid covering pharmacokinetic relevance, user friendliness, computing aspects, interfacing and storage. A weighting factor was applied to each criterion of the grid to account for its relative importance. To assess the robustness of the software, six representative clinical vignettes were processed through each of them. Altogether, 12 software tools were identified, tested and ranked, representing a comprehensive review of the available software. The number of drugs handled varies widely (from two to 180), and eight programs offer users the possibility of adding new drug models based on population pharmacokinetic analyses. Bayesian computation to predict dosage adaptation from blood concentrations (a posteriori adjustment) is performed by ten tools, while nine are also able to propose a priori dosage regimens based only on individual patient covariates such as age, sex and bodyweight. Among those applying Bayesian calculation, MM-USC*PACK© uses the non-parametric approach. The top two programs emerging from this benchmark were MwPharm© and TCIWorks. Most other programs evaluated had good potential while being less sophisticated or less user friendly. Programs vary in complexity and might not fit all healthcare settings; each tool must therefore be weighed against the individual needs of hospitals or clinicians. Programs should be easy and fast for routine activities, including for non-experienced users. Computer-assisted TDM is attracting growing interest and should further improve, especially in terms of information-system interfacing, user friendliness, data storage capability and report generation.
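For readers unfamiliar with the a posteriori adjustment these tools automate, the following Python sketch shows the core Bayesian MAP step under simple assumptions: a one-compartment IV bolus model, a log-normal prior on clearance and a log-normal residual error. Every number, and the model itself, is illustrative and not taken from any of the surveyed programs.

```python
import math

# Sketch of the a-posteriori (Bayesian MAP) step: estimate a patient's
# clearance from one measured concentration, then rescale the dose.
# Model, priors and all numbers are illustrative assumptions.

DOSE, V, T = 500.0, 50.0, 12.0     # dose (mg), volume (L), sample time (h)
CL_POP, OMEGA = 5.0, 0.3           # population clearance (L/h), log-SD
SIGMA = 0.2                        # residual log-error
C_OBS = 4.5                        # measured concentration (mg/L)

def predicted(cl):                 # one-compartment IV bolus model
    return (DOSE / V) * math.exp(-cl / V * T)

def neg_log_posterior(cl):         # log-normal prior + log-normal error
    prior = (math.log(cl / CL_POP) / OMEGA) ** 2
    lik = (math.log(C_OBS / predicted(cl)) / SIGMA) ** 2
    return prior + lik

# crude grid search for the MAP estimate of clearance
cl_map = min((CL_POP * 0.02 * i for i in range(10, 150)),
             key=neg_log_posterior)
target = 6.0                       # desired concentration (mg/L)
print(f"MAP clearance {cl_map:.2f} L/h, "
      f"suggested dose {DOSE * target / predicted(cl_map):.0f} mg")
```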
Abstract:
OBJECTIVE: Quality assurance (QA) in clinical trials is essential to ensure that treatment is safely and effectively delivered. As QA requirements have increased in complexity in parallel with the evolution of radiation therapy (RT) delivery, a need to facilitate digital data exchange emerged. Our objective is to present the platform developed for the integration and standardization of QA in RT (QART) activities across all EORTC trials involving RT. METHODS: The following essential requirements were identified: secure and easy access without on-site software installation; integration within the existing EORTC clinical remote data capture system; and the ability both to customize the platform to specific studies and to adapt to future needs. After retrospective testing within several clinical trials, the platform was introduced in phases to participating sites and QART study reviewers. RESULTS: The resulting QA platform, integrating RT analysis software installed at EORTC Headquarters, permits timely, secure, and fully digital central DICOM-RT based data review. Participating sites submit data through a standard secure upload webpage. Supplemental information is submitted in parallel through web-based forms. An internal quality check by the QART office verifies data consistency, formatting, and anonymization. QART reviewers have remote access through a terminal server. Reviewers evaluate submissions for protocol compliance through an online evaluation matrix. Comments are collected by the coordinating centre and institutions are informed of the results. CONCLUSIONS: This web-based central review platform facilitates rapid, extensive, and prospective QART review. It reduces the risk that trial outcomes are compromised by inadequate radiotherapy and facilitates correlation of results with clinical outcomes.
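The internal quality check mentioned above (consistency, formatting, anonymization) is the kind of step that can be pictured concretely; the sketch below, in Python with the pydicom library, is an assumed, simplified stand-in and not the EORTC implementation.

```python
import pydicom  # assumed tooling; not the actual EORTC QART software

REQUIRED = ["Modality", "SOPInstanceUID", "StudyInstanceUID"]
IDENTIFYING = ["PatientName", "PatientBirthDate", "PatientAddress"]

def check_submission(path):
    """Illustrative pre-review check on one submitted DICOM file."""
    ds = pydicom.dcmread(path)
    # formatting/consistency: required attributes must be present
    problems = [f"missing {kw}" for kw in REQUIRED if kw not in ds]
    # anonymization: identifying fields must be blank or a placeholder
    problems += [f"{kw} not anonymized" for kw in IDENTIFYING
                 if kw in ds and str(ds.get(kw)) not in ("", "ANONYMOUS")]
    return problems  # an empty list means ready for central review
```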
Abstract:
The evolution of wireless communication technology in recent years means that mobile devices now deliver more than acceptable performance. Many of the mobility, infrastructure and bandwidth problems that until recently hindered the use of these devices have been overcome. This has fostered the expansion of several platforms, most notably Android, iPhone OS and Windows Mobile/Windows Phone 7. The first two currently contest the leadership of the mobile device and application market, and Android has shown the strongest growth in recent years. In parallel, the steady introduction of e-learning software for Internet-based distance education has reached millions of users in universities, companies and institutions around the world. Moodle stands out among this software: an open-source suite for building online learning sites simply and effectively. This final-year project, framed within the area of Computer Networks, combines these three concepts by developing an application for Android mobile devices that connects to a Moodle server. Communication between them makes intensive use of wireless networks and calls the services the server offers through Web Services, a communication method composed of a set of protocols and software that allows two devices to communicate over the network.
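Although the project's client is an Android application, the call pattern against Moodle's standard Web Services REST endpoint can be shown in a few lines. The Python sketch below assumes a server URL and token; the endpoint path and the `core_course_get_courses` function are Moodle's real REST interface.

```python
import requests  # illustrative client; the project's app was Android-based

MOODLE = "https://moodle.example.edu"   # assumed server URL
TOKEN = "CHANGE_ME"                     # web-service token issued by Moodle

def call(function, **params):
    """Call a Moodle Web Service through its standard REST endpoint."""
    params.update(wstoken=TOKEN, wsfunction=function,
                  moodlewsrestformat="json")
    r = requests.get(f"{MOODLE}/webservice/rest/server.php", params=params)
    r.raise_for_status()
    return r.json()

# e.g. fetch the course list a mobile client would display
# print(call("core_course_get_courses"))
```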
Abstract:
In the parallel map theory, the hippocampus encodes space with 2 mapping systems. The bearing map is constructed primarily in the dentate gyrus from directional cues such as stimulus gradients. The sketch map is constructed within the hippocampus proper from positional cues. The integrated map emerges when data from the bearing and sketch maps are combined. Because the component maps work in parallel, the impairment of one can reveal residual learning by the other. Such parallel function may explain paradoxes of spatial learning, such as learning after partial hippocampal lesions, taxonomic and sex differences in spatial learning, and the function of hippocampal neurogenesis. By integrating evidence from physiology to phylogeny, the parallel map theory offers a unified explanation for hippocampal function.
Abstract:
This paper proposes a parallel architecture for estimating the motion of an underwater robot. It is well known that image processing requires a huge amount of computation, mainly at the low level, where the algorithms deal with large volumes of data. In a motion estimation algorithm, correspondences between two images have to be solved at the low level. In underwater imaging, normalised correlation can be a solution in the presence of non-uniform illumination. Due to its regular processing scheme, a parallel implementation of the correspondence problem is an adequate approach to reduce the computation time. Taking into consideration the complexity of the normalised correlation criterion, a new approach using a parallel organisation of every processor in the architecture is proposed.
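As a concrete reference for the matching step: zero-mean normalised cross-correlation divides out local brightness and contrast, which is why it tolerates the non-uniform illumination typical of underwater scenes. The Python sketch below (patch sizes and the exhaustive search are illustrative, not the paper's parallel design) shows the computation that gets replicated across processors.

```python
import numpy as np

def ncc(a, b):
    """Zero-mean normalised cross-correlation of two equal-size patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom else 0.0

def match(template, window):
    """Exhaustively score the template at every position of the window."""
    th, tw = template.shape
    best, best_xy = -1.0, (0, 0)
    for y in range(window.shape[0] - th + 1):
        for x in range(window.shape[1] - tw + 1):
            s = ncc(template, window[y:y + th, x:x + tw])
            if s > best:
                best, best_xy = s, (y, x)
    return best_xy, best  # displacement candidate and its score

# Each (template, window) pair is independent, which is what makes the
# correspondence search a natural fit for parallel hardware.
```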
Abstract:
Process supervision is the activity of monitoring process operation in order to deduce the conditions needed to maintain normality, including when faults are present. Depending on the number, distribution and heterogeneity of the variables, behaviour situations and sub-processes involved, human operators and engineers cannot easily handle the information, which makes the automation of supervision activities necessary. Nevertheless, the difficulty of dealing with this information complicates the design and development of software applications. We present an approach called "integrated supervision systems": it coordinates multiple supervisors over multiple sub-processes so that their interactions permit supervision of the global process.
Abstract:
Expert supervision systems are software applications specially designed to automate process monitoring. The goal is to reduce the dependency on human operators to assure the correct operation of a process, including faulty situations. Building this kind of application involves a substantial design and development effort to represent and manipulate process data and behaviour at different degrees of abstraction, and to interface with the data acquisition systems connected to the process. This is an open problem that becomes more complex with the number of variables, parameters and relations needed to account for the complexity of the process. Multiple specialised modules, each tuned to solve a simpler task and operating under coordination, provide a solution. A modular architecture based on software agent concepts, taking advantage of the integration of diverse knowledge-based techniques, is proposed for this purpose. Its components (software agents, communication mechanisms and perception/action mechanisms) are built on ICa (Intelligent Control architecture), a software middleware supporting the construction of applications with software agent features.
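A minimal sketch of the coordinated-agents idea may help. The Python classes below are invented for illustration and do not reproduce the ICa middleware, only the pattern of local supervisors whose reports a coordinator merges into a global diagnosis.

```python
# Invented illustration of coordinated supervisor agents: each agent
# supervises one sub-process and a coordinator merges their verdicts.

class SupervisorAgent:
    def __init__(self, name, variables, limit):
        self.name, self.variables, self.limit = name, variables, limit

    def perceive_and_assess(self, readings):
        """Local perception/assessment over this agent's variables."""
        faults = [v for v in self.variables if readings[v] > self.limit]
        return {"agent": self.name, "faults": faults}

class Coordinator:
    def __init__(self, agents):
        self.agents = agents

    def supervise(self, readings):
        reports = [a.perceive_and_assess(readings) for a in self.agents]
        # the global diagnosis emerges from the local reports
        return [r for r in reports if r["faults"]]

plant = Coordinator([SupervisorAgent("boiler", ["T1", "T2"], 100.0),
                     SupervisorAgent("pump", ["P1"], 8.0)])
print(plant.supervise({"T1": 95.0, "T2": 120.0, "P1": 7.5}))
```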
Abstract:
Fault location has been studied deeply for transmission lines due to its importance in power systems. Nowadays the problem of fault location on distribution systems is receiving special attention, mainly because of power quality regulations. In this context, this paper presents application software developed in Matlab™ that automatically calculates the location of a fault in a distribution power system, starting from the voltages and currents measured at the line terminal and from a model of the distribution power system. The application is based on an N-ary tree structure, which suits the highly branched and non-homogeneous nature of distribution systems, and has been developed for single-phase, two-phase, two-phase-to-ground, and three-phase faults. The implemented application is tested using fault data from a real electrical distribution power system.
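The N-ary tree choice is easy to motivate with a toy example: in a branched feeder, a single estimated fault distance can map to several line sections, and a depth-first walk over the tree enumerates them all. The Python sketch below uses invented section data and tolerance, not the paper's models.

```python
# Toy illustration of the N-ary tree idea: a branched feeder stored as a
# tree of line sections; a depth-first walk collects every section whose
# cumulative distance span covers the estimated fault distance.

class Section:
    def __init__(self, name, length_km, children=()):
        self.name, self.length_km = name, length_km
        self.children = list(children)

def candidate_sections(node, fault_km, walked=0.0, tol=0.05):
    """Return sections whose span can contain the estimated distance."""
    end = walked + node.length_km
    hits = [node.name] if walked - tol <= fault_km <= end + tol else []
    for child in node.children:
        hits += candidate_sections(child, fault_km, end, tol)
    return hits

feeder = Section("S0", 1.0, [Section("S1", 0.5, [Section("S3", 0.8)]),
                             Section("S2", 2.0)])
print(candidate_sections(feeder, 1.2))  # -> ['S1', 'S2']: two branches
```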
Abstract:
This paper proposes promoting autonomy in digital ecosystems so that agents are provided with the information needed to improve the ecosystem's behaviour in terms of stability. It argues that, in digital ecosystems, autonomous agents can provide fundamental services and information. The final goal is to run the ecosystem, generate novel conditions and let agents exploit them. A set of evaluation measures must be defined as well. We outline some global indicators, such as heterogeneity and diversity, and establish relationships between agent behaviour and these indicators, in order to fully understand interactions between agents and the dependence and autonomy relations that emerge between them. Individual variations, interaction dependencies and environmental factors are determinants of autonomy that should be considered. The paper concludes with a discussion of situations in which autonomy is a milestone.
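To make one of those global indicators concrete, the sketch below computes Shannon diversity over the agent types present in the ecosystem. The metric choice is an assumption offered for illustration, not a measure fixed by the paper.

```python
import math
from collections import Counter

# Assumed global indicator: Shannon diversity over agent types.
# Higher values mean the ecosystem's population is more heterogeneous.

def shannon_diversity(agent_types):
    counts = Counter(agent_types)
    n = len(agent_types)
    return -sum((c / n) * math.log(c / n) for c in counts.values())

print(shannon_diversity(["producer", "consumer", "consumer", "broker"]))
```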
Abstract:
This paper shows the impact of the atomic capabilities concept for including control-oriented knowledge of linear control systems in the decision-making structure of physical agents. These agents operate in a real environment, managing physical objects (e.g. their physical bodies) in coordinated tasks. The approach is presented using introspective reasoning and control theory, based on the specific tasks of passing a ball and executing the offside manoeuvre between physical agents in the robotic soccer testbed. Experimental results and conclusions are presented, emphasising the advantages of the approach in improving multi-agent performance in cooperative systems.
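Purely as an illustration of what an "atomic capability" wrapping linear control knowledge could look like at the code level (the sketch is invented, not the paper's controllers), here a saturated proportional law moves an agent toward a pass-reception point one control step at a time.

```python
# Invented illustration: a saturated proportional position controller,
# packaged so an agent's reasoning layer can invoke it as one capability.

def step_toward(pos, target, k=0.5, v_max=1.0):
    """One control step of a proportional law, saturated at v_max."""
    ex, ey = target[0] - pos[0], target[1] - pos[1]
    dist = (ex * ex + ey * ey) ** 0.5
    if dist == 0.0:
        return pos
    v = min(k * dist, v_max)            # proportional gain, speed limit
    return (pos[0] + v * ex / dist, pos[1] + v * ey / dist)

pos = (0.0, 0.0)
for _ in range(10):                     # converge on the reception point
    pos = step_toward(pos, (4.0, 3.0))
print(pos)
```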
Abstract:
An objective analysis of image quality parameters was performed for a computed radiography (CR) system using both standard single-side and prototype dual-side read plates. The pre-sampled modulation transfer function (MTF), noise power spectrum (NPS), and detective quantum efficiency (DQE) for the systems were determined at three different beam qualities representative of pediatric chest radiography, at an entrance detector air kerma of 5 microGy. The NPS and DQE measurements were realized under clinically relevant x-ray spectra for pediatric radiology, including x-ray scatter radiations. Compared to the standard single-side read system, the MTF for the dual-side read system is reduced, but this is offset by a significant decrease in image noise, resulting in a marked increase in DQE (+40%) in the low spatial frequency range. Thus, for the same image quality, the new technology permits the CR system to be used at a reduced dose level.
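The three quantities reported here combine through the standard relation DQE(f) = MTF(f)² / (q · NNPS(f)), where NNPS is the noise power spectrum normalised by the squared mean signal and q is the incident photon fluence. The Python sketch below shows that arithmetic on made-up curves; none of the numbers come from the study.

```python
import numpy as np

# Standard relation: DQE(f) = MTF(f)**2 / (q * NNPS(f)), with NNPS the
# normalised noise power spectrum and q the incident photon fluence
# (photons/mm^2) at the given air kerma. All curves and numbers below
# are invented for illustration only.

f = np.linspace(0.05, 2.5, 50)     # spatial frequency (cycles/mm)
mtf = np.exp(-1.2 * f)             # illustrative presampled MTF
nnps = 2.0e-5 * (1 + 0.1 * f)      # illustrative NNPS (mm^2)
q = 1.3e5                          # assumed fluence at 5 microGy

dqe = mtf ** 2 / (q * nnps)
print(f"DQE at {f[0]:.2f} cycles/mm: {dqe[0]:.2f}")
```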
Abstract:
The EVS4CSCL project arises in the context of a Computer-Supported Collaborative Learning (CSCL) environment. Previous UOC projects created a generic CSCL platform (CLPL) to facilitate the development of CSCL applications. A discussion forum (DF) was the first application developed over the framework. This discussion forum differed from other products on the marketplace in its focus on the learning process. The DF covered the specification and elaboration phases of the discussion learning process, but the consensus phase was missing. Consensus in a learning environment is not something to be achieved but something to be tested. Such tests are usually run with Electronic Voting System (EVS) tools, but a consensus test is not an assessment test: we do not evaluate students by their answers but by their discussion activity. Our educational EVS can be used as a discussion catalyst, proposing a discussion about the results of an initial query, or it can be used after a discussion period to show how the discussion changed the students' minds (consensus). It can also be used by the teacher as a quick way to find out where students need reinforcement. That matters in a distance-learning environment, where there is no direct contact between teacher and student and learning gaps are difficult to detect. In an educational environment, assessment is a must, and the EVS provides direct assessment through peer usefulness evaluation and teacher marks on every query created, plus indirect assessment from statistics on user activity.
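A sketch of the consensus test the abstract contrasts with assessment: compare the vote distribution before and after the discussion period. The agreement index used below (the majority option's share of the votes) is an assumed, deliberately simple measure, not the project's actual metric.

```python
from collections import Counter

# Assumed consensus measure: the share of votes held by the majority
# option. A rise between the two polls suggests the discussion period
# moved the group toward consensus.

def agreement(votes):
    return Counter(votes).most_common(1)[0][1] / len(votes)

before = ["A", "B", "B", "C", "A", "C"]   # poll taken at the initial query
after = ["B", "B", "B", "C", "B", "B"]    # poll taken after the discussion
print(f"agreement moved {agreement(before):.2f} -> {agreement(after):.2f}")
```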