876 results for Dynamic Data eXchange
Abstract:
Background Patients' health-related quality of life (HRQoL) has rarely been systematically monitored in general practice. Electronic tools and practice training might facilitate the routine application of HRQoL questionnaires. Thorough piloting of innovative procedures is strongly recommended before conducting large-scale studies. We therefore aimed to assess i) the feasibility and acceptance of HRQoL assessment using tablet computers in general practice, ii) the perceived practical utility of HRQoL results, and iii) possible barriers hindering wider application of this approach. Methods Two HRQoL questionnaires (St. George's Respiratory Questionnaire SGRQ and EORTC QLQ-C30) were presented electronically on portable tablet computers. Wireless network (WLAN) integration into the practice computer systems of 14 German general practices with varying infrastructure allowed automatic data exchange and the generation of a printout or a PDF file. General practitioners (GPs) and practice assistants were trained in a 1-hour course, after which they could invite patients with chronic diseases to fill in the electronic questionnaire during their waiting time. We surveyed patients, practice assistants and GPs regarding their acceptance of this tool in semi-structured telephone interviews. The number of assessments, HRQoL results and interview responses were analysed using quantitative and qualitative methods. Results Over the course of 1 year, 523 patients filled in the electronic questionnaires (1–5 times; 664 total assessments). On average, results showed specific HRQoL impairments, e.g. with respect to fatigue, pain and sleep disturbances. The number of electronic assessments varied substantially between practices. A total of 280 patients, 27 practice assistants and 17 GPs participated in the telephone interviews.
Almost all GPs (16/17 = 94%; 95% CI = 73–99%), most practice assistants (19/27 = 70%; 95% CI = 50–86%) and the majority of patients (240/280 = 86%; 95% CI = 82–91%) indicated that they would welcome the use of electronic HRQoL questionnaires in the future. GPs mentioned availability of local health services (e.g. supportive care, physiotherapy) (mean: 9.4 ± 1.0 SD; scale: 1–10), sufficient extra time (8.9 ± 1.5) and easy interpretation of HRQoL results (8.6 ± 1.6) as the most important prerequisites for their use. They believed HRQoL assessment facilitated both communication and follow-up of patients' conditions. Practice assistants emphasised that this process demonstrated an extra commitment to patient-centred care; patients viewed it as a tool that contributed to the physicians' understanding of their personal condition and circumstances. Conclusion This pilot study indicates that electronic HRQoL assessment is technically feasible in general practices. It can provide clinically significant information, which can be used either in the consultation for routine care or for research purposes. While GPs, practice assistants and patients were generally positive about the electronic procedure, several barriers (e.g. practices' lack of time and routine in HRQoL assessment) need to be overcome to enable broader application of electronic questionnaires in everyday medical practice.
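The abstract does not detail how the electronic questionnaires were scored before printout. As an illustrative sketch only, the linear 0–100 transformation commonly used by HRQoL instruments such as the EORTC QLQ-C30 could be implemented like this (the function name and the assumed 1–4 item coding are not taken from the study):

```python
def scale_score(items, item_range=3, symptom=True):
    """Transform raw Likert item responses (assumed coded 1-4) into a
    0-100 scale score, following the linear transformation commonly
    used by HRQoL instruments such as the EORTC QLQ-C30."""
    raw = sum(items) / len(items)              # mean of the item responses
    if symptom:
        # symptom scale: higher score = more severe impairment
        return (raw - 1) / item_range * 100
    # functional scale: higher score = better functioning
    return (1 - (raw - 1) / item_range) * 100

worst = scale_score([4, 4])  # two worst-case responses on a symptom scale
```

A per-scale summary of such scores is presumably what the GP would see in the generated printout or PDF after the WLAN transfer.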
Abstract:
In recent years, interactive media and tools, such as scientific simulations, simulation environments and dynamic data visualizations, have become established methods in the neural and cognitive sciences. University teachers of these disciplines therefore face the challenge of integrating such media into the neuroscientific curriculum. Simulations and dynamic visualizations in particular offer great opportunities for teachers and learners, since they are both illustrative and explorable. However, simulations also pose instructional problems: they are abstract, demand some computer skills, and require conceptual knowledge of what the simulation is intended to explain. Guided by two central questions, this article provides an overview of possible approaches for neuroscience education and opens perspectives for their curricular integration: (i) how can complex scientific media be transformed for educational use in an efficient and (for students at all levels) comprehensible manner, and (ii) what technical infrastructure can support this transformation? Exemplified by educational simulations for the neurosciences and their application in courses, answers to these questions are proposed a) by introducing a specific educational simulation approach for the neurosciences, b) by introducing an e-learning environment for simulations, and c) by providing examples of curricular integration at different levels, which might help academic teachers to integrate newly created or existing interactive educational resources into their courses.
Abstract:
The sustainable management of natural resources is a key issue for the sustainable development of a poor, mountainous country such as Tajikistan. In order to strengthen its agricultural and infrastructural development efforts and alleviate poverty in rural areas, spatial information and analysis are of crucial importance for improving priority setting and decision-making efficiency. However, poor access to geospatial data and tools, and limited capacity in their use, have greatly constrained the ability of governmental institutions to effectively assess, plan and monitor natural resources management. The Centre for Development and Environment (CDE) has thus been mandated by the World Bank Group to provide adequate technical support to the Community Agriculture and Watershed Management Project (CAWMP). This support consists of a spatial database on soil degradation trends in 4 watersheds, capacity development in and awareness creation about geographic information technology, and a spatial data exchange hub for natural resources management in Tajikistan. CDE's support started in July 2007 and will last until December 2007, with a possible extension in 2008.
Abstract:
Since the early days of logic programming, researchers in the field realized the potential for exploitation of parallelism present in the execution of logic programs. Their high-level nature, the presence of nondeterminism, and their referential transparency, among other characteristics, make logic programs interesting candidates for obtaining speedups through parallel execution. At the same time, the fact that the typical applications of logic programming frequently involve irregular computations, make heavy use of dynamic data structures with logical variables, and involve search and speculation, makes the techniques used in the corresponding parallelizing compilers and run-time systems potentially interesting even outside the field. The objective of this article is to provide a comprehensive survey of the issues arising in parallel execution of logic programming languages along with the most relevant approaches explored to date in the field. Focus is mostly given to the challenges emerging from the parallel execution of Prolog programs. The article describes the major techniques used for shared memory implementation of Or-parallelism, And-parallelism, and combinations of the two. We also explore some related issues, such as memory management, compile-time analysis, and execution visualization.
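As a toy illustration of the Or-parallelism this survey discusses, the alternative clauses of a predicate can be explored concurrently and their solutions gathered. The clause functions and bindings below are invented for illustration only; real systems operate on choice points inside the abstract machine, not on Python callables:

```python
from concurrent.futures import ThreadPoolExecutor

def or_parallel(goal, clauses):
    """Run every alternative clause for `goal` concurrently and gather
    all solutions: a minimal model of Or-parallel execution, where
    distinct branches of the search tree are explored in parallel."""
    with ThreadPoolExecutor() as pool:
        results = pool.map(lambda clause: clause(goal), clauses)
    return [r for r in results if r is not None]

# Toy alternatives for a goal color(X): each clause either succeeds
# with a variable binding or fails (returns None).
clauses = [
    lambda goal: {"X": "red"},
    lambda goal: None,          # this branch fails
    lambda goal: {"X": "blue"},
]

solutions = or_parallel("color(X)", clauses)
```

Because the branches are independent by construction, no synchronisation between them is needed here; the hard problems the survey covers (binding environments, memory management) arise precisely because real branches share ancestor bindings.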
Abstract:
This article proposes a MAS architecture for network diagnosis under uncertainty. Network diagnosis is divided into two inference processes: hypothesis generation and hypothesis confirmation. The first process is distributed among several agents based on an MSBN, while the second is carried out by agents using semantic reasoning. A diagnosis ontology has been defined in order to combine both inference processes. To drive the deliberation process, dynamic data about the influence of observations are gathered during the diagnosis process. In order to achieve quick and reliable diagnoses, this influence is used to choose the best action to perform. The approach has been evaluated in a P2P video streaming scenario. Improvements in computation and diagnosis time are highlighted in the conclusions.
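The abstract does not give the deliberation algorithm itself. Under the assumption that each candidate observation carries a numeric influence score over the current hypotheses, action selection reduces to picking the maximum; the action names and scores below are hypothetical:

```python
def best_action(actions, influence):
    """Choose the diagnostic action (test/observation) whose result is
    expected to influence the current hypotheses the most."""
    return max(actions, key=lambda a: influence.get(a, 0.0))

# Hypothetical influence scores gathered during a P2P-streaming diagnosis.
influence = {"ping_peer": 0.2, "check_buffer": 0.7, "query_tracker": 0.5}
choice = best_action(list(influence), influence)
```

In the architecture described above, these scores would be updated dynamically as observations arrive, so the ranking of actions changes as the diagnosis proceeds.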
Abstract:
Irregular computations pose some of the most interesting and challenging problems in automatic parallelization. Irregularity appears in certain kinds of numerical problems and is pervasive in symbolic applications. Such computations often use dynamic data structures, which make heavy use of pointers. This complicates all the steps of a parallelizing compiler, from independence detection to task partitioning and placement. Starting in the mid 80s there has been significant progress in the development of parallelizing compilers for logic programming (and, more recently, constraint programming), resulting in quite capable parallelizers. The typical applications of these paradigms frequently involve irregular computations and make heavy use of dynamic data structures with pointers, since logical variables represent in practice a well-behaved form of pointers. This arguably makes the techniques used in these compilers potentially interesting. In this paper, we introduce, in a tutorial way, some of the problems faced by parallelizing compilers for logic and constraint programs and provide pointers to some of the significant progress made in the area. In particular, this work has resulted in a series of achievements in the areas of inter-procedural pointer aliasing analysis for independence detection, cost models and cost analysis, cactus-stack memory management, and techniques for managing speculative and irregular computations through task granularity control and dynamic task allocation (such as work-stealing schedulers).
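The task granularity control mentioned above can be illustrated with a divide-and-conquer sketch that only spawns a parallel task when the subproblem is large enough to amortise the spawning overhead. The threshold value here is arbitrary; real compilers derive it from cost analysis:

```python
import threading

THRESHOLD = 16  # subproblems smaller than this run sequentially

def psum(xs, out, idx):
    """Parallel divide-and-conquer sum with granularity control: small
    subproblems are not worth a task of their own."""
    if len(xs) <= THRESHOLD:
        out[idx] = sum(xs)                 # too small: run sequentially
        return
    mid = len(xs) // 2
    part = [0, 0]
    worker = threading.Thread(target=psum, args=(xs[:mid], part, 0))
    worker.start()                         # spawn only sizeable tasks
    psum(xs[mid:], part, 1)                # other half in this thread
    worker.join()
    out[idx] = part[0] + part[1]

result = [0]
psum(list(range(100)), result, 0)
```

The same shape, with tasks pushed to per-worker deques instead of fresh threads, is what a work-stealing scheduler automates.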
Abstract:
The implementation of Internet technologies has led to e-Manufacturing technologies becoming more widely used and to the development of tools for compiling, transforming and synchronizing manufacturing data through the Web. In this context, a potential area for development is the extension of virtual manufacturing to Performance Measurement (PM) processes, a critical area for decision-making and implementing improvement actions in manufacturing. This thesis proposes an Information Architecture to integrate decision support systems in e-Manufacturing. Specifically, the proposed architecture offers a homogeneous PM information exchange model that can be applied through decision support in e-Manufacturing environments. Its application improves the necessary interoperability in decision-making data processing tasks. It comprises three sub-systems: a data model, an object model and a Web framework composed of a PM information platform and a PM-Web services architecture. The data model and the object model are based on developing all the information required to define and acquire the different indicators required by PM processes. The PM information platform uses XML and B2MML technologies to structure a new set of performance measurement exchange message schemas (PM-XML). This platform is complemented by a PM-Web services architecture that uses these schemas to integrate the coding, decoding, translation and assessment processes of the key performance indicators (KPIs). These services perform all the transactions that enable the source data to be transformed into smart data usable in decision-making processes. A practical example of data exchange for measurement processes in the area of equipment maintenance is shown to demonstrate the utility of the architecture.
Abstract:
Irregular computations pose some of the most interesting and challenging problems in automatic parallelization. Irregularity appears in certain kinds of numerical problems and is pervasive in symbolic applications. Such computations often use dynamic data structures which make heavy use of pointers. This complicates all the steps of a parallelizing compiler, from independence detection to task partitioning and placement. In the past decade there has been significant progress in the development of parallelizing compilers for logic programming and, more recently, constraint programming. The typical applications of these paradigms frequently involve irregular computations, which arguably makes the techniques used in these compilers potentially interesting. In this paper we introduce, in a tutorial way, some of the problems faced by parallelizing compilers for logic and constraint programs. These include the need for inter-procedural pointer aliasing analysis for independence detection and having to manage speculative and irregular computations through task granularity control and dynamic task allocation. We also provide pointers to some of the progress made in these areas. In the associated talk we demonstrate representatives of several generations of these parallelizing compilers.
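A core check these compilers perform is (strict) independence detection: two goals may run in And-parallel when they share no variables. A minimal sketch over invented goal records:

```python
def independent(goal_a, goal_b):
    """Strict-independence test: goals that share no (unbound)
    variables cannot interfere with each other, so they may safely be
    executed in And-parallel."""
    return not (goal_a["vars"] & goal_b["vars"])

# Hypothetical goals with their variable sets.
g1 = {"name": "sort(Xs, Ys)", "vars": {"Xs", "Ys"}}
g2 = {"name": "max(Zs, M)",   "vars": {"Zs", "M"}}
g3 = {"name": "print(Ys)",    "vars": {"Ys"}}
# sort/2 and max/2 are independent; sort/2 and print/1 share Ys.
```

The inter-procedural pointer aliasing analysis mentioned above exists precisely to decide this test at compile time, when the variable sets are not directly visible.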
Abstract:
CoLogNetWS is a Web site on Computational Logic systems, environments, and implementation technology. CoLogNetWS simultaneously provides: a simple WWW interface which allows users to access and modify the data stored in its database, and an automatic data exchange between CoLogNetWS and other Web sites, in order to keep their databases up to date. This document constitutes an internals manual, providing information on how the different internal parts of CoLogNetWS are connected.
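The abstract does not describe the exchange protocol itself. Assuming timestamped entries, a last-write-wins merge is one simple way two sites could keep their databases up to date; this is an illustrative sketch, not the actual CoLogNetWS mechanism:

```python
def sync(local, remote):
    """Merge two site databases entry by entry, keeping whichever copy
    carries the newer timestamp (last-write-wins)."""
    merged = {}
    for key in local.keys() | remote.keys():
        a, b = local.get(key), remote.get(key)
        if a is None or (b is not None and b["ts"] > a["ts"]):
            merged[key] = b
        else:
            merged[key] = a
    return merged

# Hypothetical system records held by two mirrored sites.
local = {"ciao": {"ts": 1, "descr": "old entry"}}
remote = {"ciao": {"ts": 2, "descr": "updated entry"},
          "xsb": {"ts": 1, "descr": "new system"}}
db = sync(local, remote)
```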
Abstract:
Rosemount capacitive pressure sensors are widely used in nuclear power plants to measure pressure, flow and level. They are connected to the plant's process lines through sensing lines, and maintenance of the sensor-line system is essential to plant safety. In-situ surveillance of the sensor-line system as a whole is usually performed by measuring the pressure transmitter's response time with noise analysis techniques. However, certain failures, such as the loss of oil from the sensor's internal chamber, cannot be detected through response-time measurement, since the response time barely changes in an incipient phase of the syndrome. An improved dynamic description of the sensor, however, could help detect such failures. Laboratory work has shown that the transfer function of the sensor-line system may have more than one real pole; consequently, the three-pole model (two complex-conjugate poles and one real pole) used so far in other work could be replaced by a four- or even five-pole model. This work proposes in-situ surveillance techniques based on noise analysis and the Dynamic Data System to extend the dynamic description of the sensor and thereby improve the prospects of diagnosing failures such as the sensor oil-loss syndrome in incipient phases. Several plant measurements have been analysed with the proposed methodology and the results obtained are presented.
Abstract:
Learning analytics is the analysis of static and dynamic data extracted from virtual learning environments, in order to understand and optimize the learning process. Generally, this dynamic data is generated by the interactions which take place in the virtual learning environment. At the present time, many implementations for grouping of data have been proposed, but there is no consensus yet on which interactions and groups must be measured and analyzed. There is also no agreement on what is the influence of these interactions, if any, on learning outcomes, academic performance or student success. This study presents three different extant interaction typologies in e-learning and analyzes the relation of their components with students' academic performance. The three different classifications are based on the agents involved in the learning process, the frequency of use and the participation mode, respectively. The main findings from the research are: a) that agent-based classifications offer a better explanation of student academic performance; b) that at least one component in each typology predicts academic performance; and c) that student-teacher and student-student, evaluating students, and active interactions, respectively, have a significant impact on academic performance, while the other interaction types are not significantly related to academic performance.
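Relating an interaction count to academic performance, as the study does for each typology component, can be sketched with a simple least-squares fit; the data points below are invented toy values, not the study's data:

```python
def fit_line(x, y):
    """Closed-form simple linear regression y = a + b*x, e.g. relating
    a count of student-teacher interactions (x) to a grade (y)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    a = my - b * mx
    return a, b

interactions = [1, 2, 3, 4]      # hypothetical interaction counts
grades = [3.0, 5.0, 7.0, 9.0]    # hypothetical (perfectly linear) grades
a, b = fit_line(interactions, grades)
```

A significant positive slope for a component (here b > 0) is the kind of evidence behind finding c) above; the actual study would use proper significance testing rather than a bare fit.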
Abstract:
In this thesis, experimental research on the passive transport of physical quantities is carried out in micro-systems with immediate industrial applications, using innovative methods to improve their performance by optimizing critical design parameters or finding new applications. Part of the results obtained in these experiments has been published in journals ranked in the first quartile of the Journal Citation Reports (JCR). First, the effect of tip clearance in a micro-channel heat sink is analysed. Leaving a gap between the channels and the top cover, so that the channels communicate with each other, produces three-dimensional effects that improve heat extraction by the sink and reduce the pressure drop caused by the fluid passing through the micro-channels, which has a great impact on the required coolant pumping power. The enhancement in dissipated heat obtained by equipping a micro-processor cooling system, based on the widely used fin plate, with a vapour-chamber heat spreader containing a two-phase fluid is also analysed. In parallel, a numerical model has been developed to optimize the dimensions of the modified fin plate, subject to a series of design requirements in which both size and weight play a very restrictive role. In addition, the fluid-dynamic phenomena appearing downstream of a bluff body in a channel flow with a high blockage ratio have been studied. This research experimentally confirms the existence of an intermediate regime, characterized by an oscillating closed recirculation bubble, between the well-differentiated steady closed recirculation bubble regime and the vortex-shedding (Karman-street-like) regime, as a function of the Reynolds number of the incoming flow. A particle image velocimetry (PIV) system was used to acquire, analyse and post-process the fluid-dynamic data. Finally, building on these results, the vortex-induced vibrations (VIV) of a bluff body inside a high-blockage-ratio channel have been studied. The prism moves with simple harmonic motion over an interval of Reynolds numbers, and this movement becomes a vibration around its axial axis beyond a certain Reynolds number. Regarding the fluid, the vortex-shedding regime is reached at lower Reynolds numbers than with the bluff body fixed. Merging both movement regimes and varying the prism-to-fluid mass ratio yields a map of different global states of the system. This is applicable not only as a mixing-enhancement technique but also as an energy-harvesting method.
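Two quantities that organise the flow regimes described above are the Reynolds number of the incoming flow and the vortex-shedding frequency estimated by the Strouhal relation. A minimal sketch; St ≈ 0.2 is a typical value for an unconfined cylinder, and in a high-blockage channel such as the one studied the actual value would differ:

```python
def reynolds(U, D, nu):
    """Reynolds number of the incoming flow: Re = U*D/nu."""
    return U * D / nu

def shedding_frequency(U, D, St=0.2):
    """Vortex-shedding frequency from the Strouhal relation f = St*U/D."""
    return St * U / D

# Hypothetical water-like flow (nu ~ 1e-6 m^2/s) past a 1 cm prism.
Re = reynolds(1.0, 0.01, 1e-6)
f = shedding_frequency(1.0, 0.01)
```

The experiments above map regime transitions against Re; the Strouhal relation is the standard way the measured shedding frequency is non-dimensionalised.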
Abstract:
In this work, a methodology is proposed to find the dynamic poles of a capacitive pressure transmitter, in order to enhance and extend the on-line surveillance of this type of sensor, which is based on response-time measurement, by applying noise analysis techniques and the Dynamic Data System. Several measurements taken from a Pressurized Water Reactor have been analyzed. The methodology proposes an autoregressive fit whose order is determined by the sensor's dynamic poles. However, the analyzed signals could not be properly filtered to remove the plant noise, so the noise was modelled as an additional pair of complex-conjugate poles. With this methodology we have obtained the numerical value of the sensor's second real pole, in spite of its low influence on the sensor's dynamic response. This enables more accurate on-line sensor surveillance, since previous methods considered only one real pole.
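A minimal sketch of pole estimation in the spirit of the autoregressive fit described above: solve the Yule-Walker equations and read the poles off the model's characteristic polynomial. The sketch is limited to AR(2) and works from given autocorrelations; the study fits higher-order models and adds a complex-conjugate pole pair to absorb the plant noise:

```python
import cmath

def ar2_poles(r1, r2):
    """Fit x_t = phi1*x_{t-1} + phi2*x_{t-2} + e_t from the lag-1 and
    lag-2 autocorrelations (Yule-Walker equations) and return the two
    model poles, i.e. the roots of z**2 - phi1*z - phi2 = 0."""
    phi1 = r1 * (1 - r2) / (1 - r1 ** 2)
    phi2 = (r2 - r1 ** 2) / (1 - r1 ** 2)
    disc = cmath.sqrt(phi1 ** 2 + 4 * phi2)
    return (phi1 + disc) / 2, (phi1 - disc) / 2

# Autocorrelations of an AR(2) process whose true poles are 0.5 and 0.3.
p, q = ar2_poles(16 / 23, 187 / 460)
```

Recovering a second real pole, as the method achieves, corresponds here to the smaller root q being resolvable even when it contributes little to the response.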
Abstract:
The implementation of Internet technologies has led to e-Manufacturing technologies becoming more widely used and to the development of tools for compiling, transforming and synchronising manufacturing data through the Web. In this context, a potential area for development is the extension of virtual manufacturing to performance measurement (PM) processes, a critical area for decision-making and implementing improvement actions in manufacturing. This paper proposes a PM information framework to integrate decision support systems in e-Manufacturing. Specifically, the proposed framework offers a homogeneous PM information exchange model that can be applied through decision support in e-Manufacturing environments. Its application improves the necessary interoperability in decision-making data processing tasks. It comprises three sub-systems: a data model, a PM information platform and a PM-Web services architecture. A practical example of data exchange for measurement processes in the area of equipment maintenance is shown to demonstrate the utility of the model.
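The PM-XML schemas themselves are not reproduced in the abstract. As a hedged illustration of the coding/decoding services, a KPI measurement can be round-tripped through an XML message with the standard library; the element and attribute names below are invented, not the B2MML-based PM-XML ones:

```python
import xml.etree.ElementTree as ET

def encode_kpi(name, value, unit):
    """Serialise one KPI measurement as an XML message."""
    msg = ET.Element("PMMessage")
    kpi = ET.SubElement(msg, "KPI", name=name, unit=unit)
    kpi.text = str(value)
    return ET.tostring(msg, encoding="unicode")

def decode_kpi(xml_text):
    """Parse a PM message back into (name, value, unit)."""
    kpi = ET.fromstring(xml_text).find("KPI")
    return kpi.get("name"), float(kpi.text), kpi.get("unit")

msg = encode_kpi("MTBF", 120.5, "h")  # e.g. mean time between failures
```

An equipment-maintenance exchange like the paper's example would send such messages between the source systems and the KPI assessment service.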
Abstract:
This thesis is the result of a project whose objective has been to develop and deploy a dashboard for sentiment analysis of football on Twitter, based on web components and D3.js. To that end, a visualisation server has been developed to present the data obtained from Twitter and analysed with Senpy. This visualisation server has been built with Polymer web components and D3.js. Data mining has been done with a pipeline between Twitter, Senpy and ElasticSearch. Luigi has been used in this process because it helps build complex pipelines of batch jobs; it has analysed all tweets and stored them in ElasticSearch. Next, D3.js has been used to create interactive widgets that make the data easily accessible; these widgets allow users to interact with them and filter the data most interesting to them. Polymer web components have been used to build the dashboard according to Google's Material Design and to show dynamic data in the widgets. As a result, this project allows an extensive analysis of the social network, pointing out the influence of players and teams and the emotions and sentiments that emerge over a period of time.
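The Twitter → Senpy → ElasticSearch pipeline can be sketched end to end with stand-ins for each external service. All data and the keyword rule below are invented; the real project drives this flow with Luigi batch tasks and the Senpy analysis service:

```python
def fetch_tweets():
    """Stand-in for the Twitter ingestion step (hypothetical tweets)."""
    return [{"id": 1, "text": "What a goal!"},
            {"id": 2, "text": "Terrible match."}]

def analyse(tweet):
    """Stand-in for the Senpy sentiment call: a trivial keyword rule
    replaces the real analysis service."""
    positive = any(w in tweet["text"].lower() for w in ("goal", "win"))
    return {**tweet, "sentiment": "positive" if positive else "negative"}

def store(index, doc):
    """Stand-in for indexing the analysed document into ElasticSearch."""
    index[doc["id"]] = doc

index = {}
for tweet in fetch_tweets():
    store(index, analyse(tweet))
# The D3.js widgets would then query the index to render the results.
```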