820 results for sistema distribuito data-grid cloud computing CERN LHC Hazelcast Elasticsearch
Abstract:
This work consists of the design and implementation of a computer system for managing a medical practice, with the following functionality: patient management, appointment scheduling, medication management, and maintenance of medical records.
Abstract:
The project consists of creating a database for a library that manages loans of its resources in an optimal and intelligent way. An Oracle database is used, with SQL and PL/SQL, and the database is designed so that it is not tied to a single interface and includes all the required information, ready for possible future changes.
Abstract:
The objective of this final degree project (TFC) is the design and implementation of a management and intelligence system for a library. This covers the design of the database that will hold all the resources the library counts on, as well as the data of users and administrators and, finally, all loan transactions of the available resources.
Abstract:
This work presents the design of a backup system for a service company that manages data for other companies. It explains the company's needs and the specifications the backup system must meet, and studies the possible software and hardware options as well as the legal aspects.
Abstract:
The final degree project 'HISMED' is based on the analysis, design, research, and implementation of an application in the healthcare field, with the premise of being a system that offers a higher-quality health service and creates an environment in which researchers, laboratories, physicians, and teaching staff share data of crucial relevance to their professions. It offers them a powerful query tool for retrieving clinical data, letting them exploit the information as each case requires.
Abstract:
Objectives: We are interested in the numerical simulation of the anastomotic region comprised between the outflow cannula of the LVAD and the aorta. Segmentation, geometry reconstruction, and grid generation from patient-specific data remain an issue because of the variable quality of DICOM images, in particular CT scans (e.g. metallic noise of the device, non-aortic contrast phase). We propose a general framework to overcome this problem and create grids suitable for numerical simulations.
Methods: Preliminary treatment of the images is performed by reducing the level window and enhancing the contrast of the greyscale image using contrast-limited adaptive histogram equalization. A gradient anisotropic diffusion filter is applied to reduce the noise. Then, watershed segmentation algorithms and mathematical morphology filters allow reconstructing the patient geometry. This is done using the InsightToolKit library (www.itk.org). Finally, the Vascular Modeling ToolKit (www.vmtk.org) and gmsh (www.geuz.org/gmsh) are used to create the meshes for the fluid (blood) and the structure (arterial wall, outflow cannula) and to identify the boundary layers a priori. The method is tested on five different patients with left ventricular assist devices who underwent a CT-scan exam.
Results: The method produced good results in four patients: the anastomosis area is recovered and the generated grids are suitable for numerical simulations. In one patient the method failed to produce a good segmentation because of the small dimension of the aortic arch with respect to the image resolution.
Conclusions: The described framework allows the use of data that could not otherwise be segmented by standard automatic segmentation tools. In particular, the computational grids generated are suitable for simulations that take fluid-structure interactions into account. Finally, the presented method features good reproducibility and fast application.
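As an illustration of the preprocessing and segmentation chain described above, here is a minimal sketch in Python using SimpleITK, a simplified wrapper around the InsightToolKit library the authors cite. All parameter values, file paths, and the selected label are illustrative assumptions, not those of the original study.

```python
# Sketch of the pipeline: histogram equalization -> anisotropic diffusion
# -> watershed -> morphological cleanup. Parameters are illustrative only.
import SimpleITK as sitk

# Load the DICOM series (directory path is a placeholder).
reader = sitk.ImageSeriesReader()
reader.SetFileNames(reader.GetGDCMSeriesFileNames("/path/to/dicom"))
image = sitk.Cast(reader.Execute(), sitk.sitkFloat32)

# 1. Contrast-limited adaptive histogram equalization.
image = sitk.AdaptiveHistogramEqualization(image, alpha=0.3, beta=0.3)

# 2. Gradient anisotropic diffusion to reduce noise while keeping edges.
image = sitk.GradientAnisotropicDiffusion(
    image, timeStep=0.0625, conductanceParameter=2.0, numberOfIterations=10)

# 3. Watershed segmentation on the gradient magnitude image.
grad = sitk.GradientMagnitude(image)
labels = sitk.MorphologicalWatershed(grad, level=1.0, markWatershedLine=False)

# 4. Morphological cleanup of the label assumed to be the aorta
#    (the label id would be chosen by inspection in practice).
aorta = sitk.BinaryMorphologicalClosing(labels == 1, [2, 2, 2])
sitk.WriteImage(aorta, "aorta_mask.nii.gz")
```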
Abstract:
The graphical representation of spatial soil properties in a digital environment is complex because it requires a conversion of data collected in a discrete form onto a continuous surface. The objective of this study was to apply three-dimensional interpolation and visualization techniques to soil texture and fertility properties and establish relationships with pedogenetic factors and processes in a slope area. The GRASS Geographic Information System was used to generate three-dimensional models and ParaView software to visualize soil volumes. Samples of the A, AB, BA, and B horizons were collected in a regular 122-point grid in an area of 13 ha, in Pinhais, PR, in southern Brazil. Geoprocessing and graphic computing techniques were effective in identifying and delimiting soil volumes of distinct ranges of fertility properties confined within the soil matrix. Both the three-dimensional interpolation and the visualization tool facilitated interpretation in a continuous space (volumes) of the cause-effect relationships between soil texture and fertility properties and pedological factors and processes, such as higher clay contents following the drainage lines of the area. The flattest part, with more weathered soils (Oxisols), had the highest pH values and lower Al3+ concentrations. These techniques of data interpolation and visualization have great potential for use in diverse areas of soil science, such as identification of soil volumes occurring side-by-side but exhibiting different physical, chemical, and mineralogical conditions for plant root growth, and monitoring of plumes of organic and inorganic pollutants in soils and sediments, among other applications. The methodological details for interpolation and a three-dimensional view of soil data are presented here.
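For illustration, the following Python sketch interpolates scattered soil samples onto a regular voxel grid and writes a legacy-VTK volume that ParaView can open. Note the swap: it uses simple inverse-distance weighting for brevity, whereas GRASS's volumetric interpolation module v.vol.rst is based on regularized splines with tension; the sample data and grid dimensions are invented.

```python
# Illustrative sketch: scattered samples (x, y, depth, value) -> voxel grid
# -> legacy-VTK file readable by ParaView. IDW stands in for the spline
# interpolation actually provided by GRASS.
import numpy as np

def idw_3d(samples, values, query, power=2.0, eps=1e-12):
    """Inverse-distance-weighted estimate at each query point."""
    d = np.linalg.norm(query[:, None, :] - samples[None, :, :], axis=2)
    w = 1.0 / (d ** power + eps)
    return (w * values).sum(axis=1) / w.sum(axis=1)

# Fake data standing in for the study's 122-point sampling grid.
rng = np.random.default_rng(0)
pts = rng.uniform([0, 0, 0], [360, 360, 1.2], size=(122, 3))  # x, y, depth (m)
clay = rng.uniform(200, 600, size=122)                        # g/kg

# Regular voxel grid covering the sampled volume.
nx, ny, nz = 40, 40, 12
gx, gy, gz = np.meshgrid(np.linspace(0, 360, nx),
                         np.linspace(0, 360, ny),
                         np.linspace(0, 1.2, nz), indexing="ij")
grid = np.column_stack([gx.ravel(), gy.ravel(), gz.ravel()])
vol = idw_3d(pts, clay, grid)

# Minimal legacy-VTK STRUCTURED_POINTS writer.
with open("clay.vtk", "w") as f:
    f.write("# vtk DataFile Version 3.0\nclay\nASCII\n")
    f.write("DATASET STRUCTURED_POINTS\n")
    f.write(f"DIMENSIONS {nx} {ny} {nz}\n")
    f.write("ORIGIN 0 0 0\nSPACING 9.23 9.23 0.109\n")
    f.write(f"POINT_DATA {nx*ny*nz}\nSCALARS clay float 1\nLOOKUP_TABLE default\n")
    # VTK expects x to vary fastest, so reorder before flattening.
    np.savetxt(f, vol.reshape(nx, ny, nz).transpose(2, 1, 0).ravel())
```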
Abstract:
Digital preservation (DP) has become a persistent problem for anyone who wants to keep their digital information, guarantee its condition, and consult it over time. Until now, only large institutions with expert knowledge and specialized tools have been able to face this problem, but digital preservation cannot be addressed by a single institution or nation. Libraries, archives, and other memory institutions share this challenge, as do collectors and creators, who face it individually. The goal of the project is to create the Pyramid application, conceived as a support tool aimed at home users (with no technical or preservation knowledge) for the medium- and long-term preservation of digital collections, texts, and videos. It runs like an antivirus (in the background) and preserves the information without imposing additional load on the computer, so that the user notices no disruption in their daily tasks.
Abstract:
The aim of this research paper is to present a macroscopic study of the feasibility and efficiency of mobile devices in computing least-cost paths (LCP). This kind of artifact must work in off-line mode and must be able to load data for a mountain zone, such as digital terrain models and meteorological data. The research strategy has two steps:
- First, we identify the set of software components to implement inside the IT artifact. This set of components has to be able to do LCP calculations, visualize results, and present a well-adapted human interface. The main goal of this first step is to demonstrate the feasibility of a mobile geographic information system by following the 'Design & Creation' research strategy.
- Second, the goal is to evaluate the reliability and usability of this IT artifact through an 'Experiments' research approach. In this step we want to characterize the behavior of the artifact in terms of fidelity and LCP processing speed. This evaluation will be carried out by external users.
Throughout this paper, we will see that this kind of geographic information system (the IT artifact) has the minimal requirements needed to carry out LCP calculations on mobile devices, although it has several limitations and constraints in terms of usability and reliability. We will point out qualitative and quantitative elements related to the performance of the IT artifact while doing this kind of computation.
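As a minimal illustration of the core computation, here is a least-cost-path sketch in Python: Dijkstra's algorithm over a 2D raster of traversal costs such as might be derived from a digital terrain model. The grid values and cost model (four-neighbour moves, cell-entry costs) are invented for the example and are not the artifact's actual algorithm.

```python
# Dijkstra's algorithm over a cost raster: the classic way to compute a
# least-cost path on gridded terrain data.
import heapq

def least_cost_path(cost, start, goal):
    """Minimum accumulated cost from start to goal over a 2D cost grid,
    moving in the four cardinal directions; the start cell's cost counts."""
    rows, cols = len(cost), len(cost[0])
    dist = {start: cost[start[0]][start[1]]}
    queue = [(dist[start], start)]
    while queue:
        d, (r, c) = heapq.heappop(queue)
        if (r, c) == goal:
            return d
        if d > dist.get((r, c), float("inf")):
            continue  # stale queue entry
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    heapq.heappush(queue, (nd, (nr, nc)))
    return float("inf")

# Example: the cheap route detours around the expensive cell (9).
terrain = [[1, 1, 1],
           [1, 9, 1],
           [1, 1, 1]]
print(least_cost_path(terrain, (0, 0), (2, 2)))  # prints 5
```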
Abstract:
The study examined how scenario analysis can be used in researching new technology. It was found that the suitability of scenario analysis depends most on the level of technological change and the nature of the available information. The scenario method is well suited to studying new technologies, particularly radical innovations. The reason is the great uncertainty, complexity, and shift of the prevailing paradigm associated with them, which make many other futures-research methods unusable in such situations. The empirical part of the work examined the future of grid computing technology by means of scenario analysis. Grids were seen as a potentially disruptive technology which, as a radical innovation, may shift computing from today's product-based purchasing of computing capacity to a service-based model. This would have a major impact on the entire current ICT industry, particularly thanks to on-demand computing. The study examined developments up to 2010. Based on theory and existing knowledge, and drawing on strong expert knowledge, four possible environmental scenarios for grids were formed. The scenarios showed that the commercial success of the technology still faces many challenges. In particular, trust and the creation of added value emerged as the most important factors driving the future of grids.
Abstract:
Laser scanning is becoming an increasingly popular method for measuring 3D objects in industrial design. Laser scanners produce a cloud of 3D points. For CAD software to be able to use such data, however, this point cloud needs to be turned into a vector format. A popular way to do this is to triangulate the assumed surface of the point cloud using alpha shapes. Alpha shapes start from the convex hull of the point cloud and gradually refine it towards the true surface of the object. Often it is nontrivial to decide when to stop this refinement. One criterion for this is to do so when the homology of the object stops changing. This is known as the persistent homology of the object. The goal of this thesis is to develop a way to compute the homology of a given point cloud when processed with alpha shapes, and to infer from it when the persistent homology has been achieved. In practice, the computation of such a characteristic might be applied, for example, to power line tower span analysis.
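As a sketch of this computation, the following Python snippet builds the alpha filtration of a small 3D point cloud and reads off its persistent homology. It assumes the GUDHI library, which the abstract does not name, and the sample data is synthetic.

```python
# Alpha complex = the filtration underlying alpha shapes; long-lived
# homology classes are the "persistent" features of the scanned object.
import gudhi
import numpy as np

# Points sampled near a circle in 3D: one long-lived 1-cycle is expected.
theta = np.linspace(0, 2 * np.pi, 60, endpoint=False)
points = np.column_stack([np.cos(theta), np.sin(theta), np.zeros_like(theta)])
points += np.random.default_rng(1).normal(scale=0.02, size=points.shape)

simplex_tree = gudhi.AlphaComplex(points=points).create_simplex_tree()
diagram = simplex_tree.persistence()

# Short-lived intervals are noise; features that survive a long stretch of
# the refinement are the persistent homology of the object.
for dim, (birth, death) in diagram:
    if death - birth > 0.1:
        print(f"H{dim}: born {birth:.4f}, dies {death:.4f}")
```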
Abstract:
The purpose of the work was to realize a high-speed digital data transfer system for the RPC muon chambers in the CMS experiment at CERN's new LHC accelerator. This large-scale system took many years and many stages of prototyping to develop, and required the participation of tens of people. The system interfaces to the Front-end Boards (FEBs) at the 200,000-channel detector and to the trigger and readout electronics in the control room of the experiment. The distance between these two is about 80 metres, and the speed required of the optical links was pushing the limits of available technology when the project was started. Here, as in many other aspects of the design, it was assumed that the features of readily available commercial components would develop in the course of the design work, just as they did. By choosing a high speed it was possible to multiplex the data from some of the chambers into the same fibres, reducing the number of links needed. Further reduction was achieved by employing zero suppression and data compression, so that a total of only 660 optical links were needed. Another requirement, which conflicted somewhat with choosing the components as late as possible, was that the design needed to be radiation tolerant to an ionizing dose of 100 Gy and to have moderate tolerance to Single Event Effects (SEEs). This required several radiation test campaigns, and eventually led to ASICs being chosen for some of the critical parts. The system was made as reconfigurable as possible. The reconfiguration needs to be done remotely, as the electronics is not accessible except during short and rare service breaks once the accelerator starts running. Therefore reconfigurable logic is used extensively, and the firmware development for the FPGAs constituted a sizable part of the work. Some special techniques were needed there too, to achieve the required radiation tolerance. The system has been demonstrated to work in several laboratory and beam tests, and now we are waiting to see it in action when the LHC starts running in autumn 2008.
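As a toy illustration of the zero suppression mentioned above, the following Python sketch transmits only the hit channels of a readout frame as (channel, value) pairs. The frame layout and occupancy are invented, and the real system implements this in radiation-tolerant hardware, not software.

```python
# Zero suppression: with low detector occupancy, sending only the non-zero
# channels shrinks each frame dramatically.
from typing import List, Tuple

def zero_suppress(frame: List[int]) -> List[Tuple[int, int]]:
    """Keep only the non-zero channels of one readout frame."""
    return [(ch, val) for ch, val in enumerate(frame) if val != 0]

def expand(pairs: List[Tuple[int, int]], n_channels: int) -> List[int]:
    """Rebuild the full frame on the receiving end."""
    frame = [0] * n_channels
    for ch, val in pairs:
        frame[ch] = val
    return frame

raw = [0] * 96
raw[17], raw[42] = 1, 1      # two hit strips in a 96-channel frame
packed = zero_suppress(raw)  # [(17, 1), (42, 1)]
assert expand(packed, 96) == raw
```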