945 results for File processing (Computer science)
Abstract:
The aim of this final-year project (PFC) is to design and implement a cryptographic scheme to guarantee secure access to the data, providing mechanisms to safeguard the confidentiality, authenticity and integrity of the data, as well as non-repudiation of the actions carried out by users.
Abstract:
Throughout this work, a medical records management application is built; to that end, the security requirements it must satisfy were studied beforehand, as fundamental preconditions for putting it into operation.
Abstract:
In this final-year project, a system has been designed, implemented and tested which, using public-key cryptography, guarantees the correct use of the data during the process of consulting and filling in a computerized clinical record accessed remotely.
Abstract:
The project must produce a series of applications so that users and physicians can access the clinical records over a communications network while always maintaining the security requirements.
Abstract:
Analysis of teaching and learning experiences in written communication skills at the Computer Engineering degree level.
Abstract:
Human electrophysiological studies support a model whereby sensitivity to so-called illusory contour stimuli is first seen within the lateral occipital complex. A challenge to this model posits that the lateral occipital complex is a general site for crude region-based segmentation, based on findings of equivalent hemodynamic activations in the lateral occipital complex to illusory contour and so-called salient region stimuli, a stimulus class that lacks the classic bounding contours of illusory contours. Using high-density electrical mapping of visual evoked potentials, we show that early lateral occipital cortex activity is substantially stronger to illusory contour than to salient region stimuli, whereas later lateral occipital complex activity is stronger to salient region than to illusory contour stimuli. Our results suggest that equivalent hemodynamic activity to illusory contour and salient region stimuli probably reflects temporally integrated responses, a result of the poor temporal resolution of hemodynamic imaging. The temporal precision of visual evoked potentials is critical for establishing viable models of completion processes and visual scene analysis. We propose that crude spatial segmentation analyses, which are insensitive to illusory contours, occur first within dorsal visual regions, not the lateral occipital complex, and that initial illusory contour sensitivity is a function of the lateral occipital complex.
Abstract:
The evolution of supervisory systems makes extracting significant information from processes increasingly important, since it simplifies the specific tasks of those systems. It is therefore important to have signal-treatment tools capable of obtaining elaborate information from the process data. In this paper, a tool that obtains qualitative data about the trends and oscillation of signals is presented, together with an application of it. In this case, the tool, implemented in a computer-aided control systems design (CACSD) environment, is used to supply qualitative information to an expert system for fault detection in a laboratory plant.
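The kind of qualitative signal information described above (trend and oscillation labels) can be sketched as follows. The window-based labelling, threshold values and function names are illustrative assumptions, not the actual tool from the paper.

```python
# Qualitative abstraction of a sampled signal: label a window of samples
# as "increasing", "decreasing" or "steady", and quantify oscillation by
# counting direction changes. Thresholds and names are assumptions.

def qualitative_trend(samples, slope_eps=0.01):
    """Classify the overall trend of a window of samples."""
    slope = (samples[-1] - samples[0]) / (len(samples) - 1)
    if slope > slope_eps:
        return "increasing"
    if slope < -slope_eps:
        return "decreasing"
    return "steady"

def oscillation_count(samples):
    """Number of sign changes in the first difference of the signal."""
    diffs = [b - a for a, b in zip(samples, samples[1:])]
    signs = [1 if d > 0 else -1 for d in diffs if d != 0]
    return sum(1 for a, b in zip(signs, signs[1:]) if a != b)

signal = [0.0, 0.2, 0.1, 0.4, 0.3, 0.6, 0.5, 0.8]
print(qualitative_trend(signal))   # -> increasing
print(oscillation_count(signal))   # -> 6
```

A real supervision tool would apply such labels over a sliding window and feed the resulting symbols, rather than raw samples, to the expert system.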
Abstract:
Image registration is an important component of image analysis used to align two or more images. In this paper, we present a new framework for image registration based on compression. The basic idea underlying our approach is the conjecture that two images are correctly registered when we can maximally compress one image given the information in the other. The contribution of this paper is twofold. First, we show that the image registration process can be dealt with from the perspective of a compression problem. Second, we demonstrate that the similarity metric, introduced by Li et al., performs well in image registration. Two different versions of the similarity metric have been used: the Kolmogorov version, computed using standard real-world compressors, and the Shannon version, calculated from an estimation of the entropy rate of the images.
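The Kolmogorov version of the similarity metric mentioned above is commonly approximated by the normalized compression distance, computed with a standard real-world compressor. A minimal sketch using Python's zlib follows; the data and compressor choice are illustrative assumptions, not the paper's actual setup.

```python
import zlib

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance: a practical approximation of the
    (uncomputable) Kolmogorov similarity metric using a real compressor."""
    cx = len(zlib.compress(x, 9))
    cy = len(zlib.compress(y, 9))
    cxy = len(zlib.compress(x + y, 9))
    return (cxy - min(cx, cy)) / max(cx, cy)

a  = b"abcabcabc" * 50
b_ = b"abcabcabc" * 50       # identical content compresses well jointly
c  = bytes(range(256)) * 2   # unrelated content does not
print(ncd(a, b_) < ncd(a, c))  # similar inputs give a smaller distance
```

In registration, one would evaluate such a distance between a reference image and candidate transformations of the other image, keeping the transformation that minimizes it.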
Abstract:
Cobalt-labelled motoneuron dendrites of the frog spinal cord at the level of the second spinal nerve were photographed in the electron microscope from long series of ultrathin sections. Three-dimensional computer reconstructions of 120 dendrite segments were analysed. The samples were taken from two locations: proximal to the cell body and distal, as defined in a transverse plane of the spinal cord. The dendrites showed highly irregular outlines with many 1-2 µm long 'thorns' (on average 8.5 thorns per 100 µm² of dendritic area). Taken together, the reconstructed dendrite segments from the proximal sites had a total length of about 250 µm; those from the distal locations, 180 µm. On all segments together there were 699 synapses. Nine percent of the synapses were on thorns, and many more close to their base on the dendritic shaft. The synapses were classified in four groups. One third of the synapses were asymmetric with spherical vesicles; one half were symmetric with spherical vesicles; and one tenth were symmetric with flattened vesicles. A fourth, small class of asymmetric synapses had dense-core vesicles. The area of the active zones was large for the asymmetric synapses (median value 0.20 µm²) and small for the symmetric ones (median value 0.10 µm²), and the difference was significant. On average, the areas of the active zones of the synapses on thin dendrites were larger than those of synapses on large-calibre dendrites. About every 4 µm² of dendritic area received one contact. There was a significant difference between the areas of the active zones of the synapses at the two locations. Moreover, the number per unit dendritic length was correlated with dendrite calibre. On average, the active zones covered more than 4% of the dendritic area; this value for thin dendrites was about twice as large as that of large-calibre dendrites.
We suggest that the larger active zones and the larger synaptic coverage of the thin dendrites compensate for the longer electrotonic distance of these synapses from the soma.
Abstract:
In recent years the construction sector has experienced exponential growth. This growth has had an impact on many fronts: from the need for more personnel on site, and the setting up of offices to manage the accounting and keep track of the works, to the need for specific software that helps get the work done in the most convenient and agile way possible. The project carried out covers one of these needs: managing the budgets of the different works that builders undertake. It uses the database of the ITEC (Institut de Tecnologia de la Construcció de Catalunya), on which the vast majority of architects rely when designing works, but it also allows the builder to enter their own data. The user of the application can prepare budgets for new construction, renovations and so on, grouping each of them into chapters. These chapters can be understood as the different phases to be carried out, for example: building the foundations, raising the walls or laying the roof. Within the chapters we find the work items ("partides"), each of which is a set of materials, labour hours and machinery needed to carry out one part of the work, such as building a partition wall between rooms. In that case we would have the different materials needed (bricks, mortar), the labourer hours required to raise it, the transport of all the material to the site, and so on. All these parameters (materials, hours, transport...) are called articles and are included within the work items. The application is designed to run in a client/server environment, using Linux openSUSE 10.2 as the server and Windows XP workstations as clients, although other versions of Microsoft's operating systems could also be used.
The development environment used is that of the FDS language, which comes with an integrated file manager, and that is what will be used.
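The budget hierarchy described above (chapters containing work items, work items containing priced articles) can be sketched as a simple data model. All class names, fields and prices here are illustrative assumptions, not the application's actual design.

```python
from dataclasses import dataclass, field

# Illustrative sketch: a budget is made of chapters (construction
# phases), each chapter groups work items ("partides"), and each work
# item prices out articles (materials, labour hours, transport...).

@dataclass
class Article:
    name: str
    quantity: float
    unit_price: float

    def cost(self) -> float:
        return self.quantity * self.unit_price

@dataclass
class WorkItem:  # a "partida"
    name: str
    articles: list = field(default_factory=list)

    def cost(self) -> float:
        return sum(a.cost() for a in self.articles)

@dataclass
class Chapter:
    name: str
    items: list = field(default_factory=list)

    def cost(self) -> float:
        return sum(i.cost() for i in self.items)

partition = WorkItem("partition wall", [
    Article("bricks", 200, 0.25),       # 50.0
    Article("mortar", 5, 8.00),         # 40.0
    Article("labour hours", 6, 25.00),  # 150.0
])
walls = Chapter("walls", [partition])
print(walls.cost())  # -> 240.0
```

Summing costs bottom-up like this is what lets the application total a whole budget per chapter and per work.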
Abstract:
Free software has lately been gaining ever more weight in companies, yet it remains a great unknown to many people. From its creation in the 1980s until now, there has been exponential growth in high-quality free software, offering tools for all kinds of needs: office suites, mail clients, file systems, operating systems... This movement has not gone unnoticed by many users and companies, which have taken advantage of it to cover their needs. As for companies, more and more of them use free software to a greater or lesser extent, whether for its lower acquisition cost, its great reliability, the ease with which it can be adapted, or to avoid any technological lock-in; in short, to have more freedom. When a new company is created, starting from scratch in all its information technology, that is the least costly moment to build the IT architecture on free software, and the moment when the impact on the company, users and clients is smallest. Companies that already have an IT system will need to establish a migration plan, whether total or partial. The purpose of this project is not to say which software is better than another, or which should be installed, but to introduce the world of free software, show part of this software, compare some free software with proprietary software, and offer ideas and a set of solutions for companies, so that a company can draw implementation ideas from some of the IT solutions presented or follow some of the advice proposed. Many companies already use free software today.
Some use only a small part of it in their installations; although companies running 100% on free software are starting to appear, for now I consider that somewhat risky, but in a short time this will become increasingly common.
Abstract:
PURPOSE: To explore whether triaxial accelerometric measurements can be utilized to accurately assess speed and incline of running in free-living conditions. METHODS: Body accelerations during running were recorded at the lower back and at the heel by a portable data logger in 20 human subjects, 10 men and 10 women. After parameterizing body accelerations, two neural networks were designed to recognize each running pattern and calculate speed and incline. Each subject ran 18 times on outdoor roads at various speeds and inclines; 12 runs were used to calibrate the neural networks whereas the 6 other runs were used to validate the model. RESULTS: A small difference between the estimated and the actual values was observed: the square root of the mean square error (RMSE) was 0.12 m x s(-1) for speed and 0.014 radian (rad) (or 1.4% in absolute value) for incline. Multiple regression analysis allowed accurate prediction of speed (RMSE = 0.14 m x s(-1)) but not of incline (RMSE = 0.026 rad or 2.6% slope). CONCLUSION: Triaxial accelerometric measurements allow an accurate estimation of speed of running and incline of terrain (the latter with more uncertainty). This will permit the validation of the energetic results generated on the treadmill as applied to more physiological, unconstrained running conditions.
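The accuracy figures quoted above are root mean square errors. A minimal sketch of that metric follows; the sample speed values are made up for illustration.

```python
import math

def rmse(estimated, actual):
    """Square root of the mean squared error between two series."""
    return math.sqrt(sum((e - a) ** 2 for e, a in zip(estimated, actual))
                     / len(actual))

# Hypothetical running speeds in m/s: model estimates vs. measured values.
estimated = [3.1, 3.5, 2.9, 4.2]
actual    = [3.0, 3.6, 3.0, 4.0]
print(round(rmse(estimated, actual), 3))  # -> 0.132
```

An RMSE of 0.12 m/s, as reported for speed, means the typical estimation error is on the order of a tenth of a metre per second.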
Abstract:
The purpose of this study is to clinically validate a new two-dimensional preoperative planning software for cementless total hip arthroplasty (THA). Manual and two-dimensional computer-assisted planning were compared by an independent observer for each of the 30 patients with osteoarthritis who underwent THA. This study showed that there were no statistical differences between the results of both preoperative plans in terms of stem size and neck length (<1 size) and hip rotation center position (<5 mm). Two-dimensional computer-assisted preoperative planning provided successful results comparable to those using the manual procedure, thereby allowing the surgeon to simulate various stem designs easily.
Abstract:
Statistics has become an indispensable tool in biomedical research. Thanks, in particular, to computer science, the researcher has easy access to elementary "classical" procedures. These are often of a "confirmatory" nature: their aim is to test hypotheses (for example the efficacy of a treatment) prior to experimentation. However, doctors often use them in situations more complex than foreseen, to discover interesting data structures and formulate hypotheses. This inverse process may lead to misuse which increases the number of "statistically proven" results in medical publications. The help of a professional statistician thus becomes necessary. Moreover, good, simple "exploratory" techniques are now available. In addition, medical data contain quite a high percentage of outliers (data that deviate from the majority). With classical methods it is often very difficult (even for a statistician!) to detect them and the reliability of results becomes questionable. New, reliable ("robust") procedures have been the subject of research for the past two decades. Their practical introduction is one of the activities of the Statistics and Data Processing Department of the University of Social and Preventive Medicine, Lausanne.
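The sensitivity of classical estimators to outliers, which motivates the robust procedures mentioned above, can be illustrated in a few lines; the data values are made up.

```python
import statistics

# A single outlier (e.g. a data-entry error) drags the mean far from the
# bulk of the data, while the median, a simple robust estimator, does not
# move at all here.
clean        = [4.75, 5.0, 5.25, 5.0, 5.0]
with_outlier = clean + [50.0]

print(statistics.mean(clean))           # -> 5.0
print(statistics.mean(with_outlier))    # -> 12.5
print(statistics.median(with_outlier))  # -> 5.0
```

Robust procedures generalize this idea beyond location estimates, limiting the influence any single deviant observation can exert on the result.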