909 results for Database management -- Computer programs
Abstract:
This paper describes how artificial intelligence technologies can be integrated into a well-known computer aided control system design (CACSD) framework, Matlab/Simulink, using an object-oriented approach. The aim is to build a framework to aid supervisory systems analysis, design and implementation. The idea is to take advantage of an existing CACSD framework, Matlab/Simulink, so that engineers can first design a control system and then design a straightforward supervisory system for it within the same framework. Thus, expert systems and qualitative reasoning tools are incorporated into this popular CACSD framework to develop a computer aided supervisory system design (CASSD) framework. Object-variables are introduced into Matlab/Simulink for sharing information between tools.
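The abstract describes object-variables only at a high level; the minimal Python sketch below (not the authors' Matlab/Simulink implementation, and with all names invented for illustration) conveys the idea of a shared variable carrying both a numeric signal and a qualitative label that a control block writes and a supervisory rule reads.

from dataclasses import dataclass, field

@dataclass
class ObjectVariable:
    """Hypothetical 'object-variable': a numeric signal enriched with
    qualitative information that control and supervisory tools can share."""
    name: str
    value: float = 0.0
    qualitative_state: str = "normal"   # e.g. "low", "normal", "high"
    history: list = field(default_factory=list)

    def update(self, new_value: float, low: float, high: float) -> None:
        # The controller side writes the raw value; the qualitative label
        # is derived so that rule-based supervision can reason over it.
        self.history.append(self.value)
        self.value = new_value
        if new_value < low:
            self.qualitative_state = "low"
        elif new_value > high:
            self.qualitative_state = "high"
        else:
            self.qualitative_state = "normal"

# Supervisory rule (expert-system style) reading the shared variable:
tank_level = ObjectVariable("tank_level")
tank_level.update(8.7, low=2.0, high=8.0)
if tank_level.qualitative_state == "high":
    print("Supervisor: open relief valve")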
Abstract:
Background: The variety of DNA microarray formats and datasets presently available offers an unprecedented opportunity to perform insightful comparisons of heterogeneous data. Cross-species studies, in particular, have the power of identifying conserved, functionally important molecular processes. Validation of discoveries can now often be performed in readily available public data which frequently requires cross-platform studies. Cross-platform and cross-species analyses require matching probes on different microarray formats. This can be achieved using the information in microarray annotations and additional molecular biology databases, such as orthology databases. Although annotations and other biological information are stored using modern database models (e.g. relational), they are very often distributed and shared as tables in text files, i.e. flat file databases. This common flat database format thus provides a simple and robust solution to flexibly integrate various sources of information and a basis for the combined analysis of heterogeneous gene expression profiles. Results: We provide annotationTools, a Bioconductor-compliant R package to annotate microarray experiments and integrate heterogeneous gene expression profiles using annotation and other molecular biology information available as flat file databases. First, annotationTools contains a specialized set of functions for mining this widely used database format in a systematic manner. It thus offers a straightforward solution for annotating microarray experiments. Second, building on these basic functions and relying on the combination of information from several databases, it provides tools to easily perform cross-species analyses of gene expression data. Here, we present two example applications of annotationTools that are of direct relevance for the analysis of heterogeneous gene expression profiles, namely a cross-platform mapping of probes and a cross-species mapping of orthologous probes using different orthology databases. We also show how to perform an explorative comparison of disease-related transcriptional changes in human patients and in a genetic mouse model. Conclusion: The R package annotationTools provides a simple solution to handle microarray annotation and orthology tables, as well as other flat molecular biology databases. Thereby, it allows easy integration and analysis of heterogeneous microarray experiments across different technological platforms or species.
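annotationTools itself is an R/Bioconductor package; purely as a language-neutral illustration of the flat-file lookup and cross-species chaining the abstract describes, the Python sketch below maps a human probe ID to a mouse probe via tab-delimited annotation and orthology tables (file names, column headers and the probe ID are hypothetical placeholders).

import csv

def load_annotation(path, key_col, value_col):
    """Read a tab-delimited flat-file annotation table into a dictionary
    mapping one column (e.g. probe ID) to another (e.g. gene symbol)."""
    mapping = {}
    with open(path, newline="") as handle:
        for row in csv.DictReader(handle, delimiter="\t"):
            mapping[row[key_col]] = row[value_col]
    return mapping

# Hypothetical file names and column headers, for illustration only.
human = load_annotation("human_array_annot.txt", "Probe Set ID", "Gene Symbol")
mouse = load_annotation("mouse_array_annot.txt", "Gene Symbol", "Probe Set ID")
ortho = load_annotation("human_mouse_orthologs.txt", "human_symbol", "mouse_symbol")

def map_probe_cross_species(human_probe):
    """Cross-species mapping: human probe -> gene symbol -> mouse ortholog -> mouse probe."""
    symbol = human.get(human_probe)
    mouse_symbol = ortho.get(symbol) if symbol else None
    return mouse.get(mouse_symbol) if mouse_symbol else None

print(map_probe_cross_species("201667_at"))  # illustrative probe ID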
Abstract:
Little is known about the relation between genome organization and gene expression in Leishmania. Bioinformatic analysis can be used to predict genes and find homologies with known proteins. A model was proposed in which genes are organized into large clusters and transcribed from only one strand, in the form of large polycistronic primary transcripts. To verify the validity of this model, we studied gene expression at the transcriptional, post-transcriptional and translational levels in a unique locus of 34 kb located on chr27 and represented by cosmid L979. Sequence analysis revealed 115 ORFs on either DNA strand. Using computer programs developed for Leishmania genes, only nine of these ORFs, localized on the same strand, were predicted to code for proteins, some of which show homologies with known proteins. Additionally, one pseudogene was identified. We verified the biological relevance of these predictions. mRNAs from nine predicted genes and proteins from seven were detected. Nuclear run-on analyses confirmed that the top strand is transcribed by RNA polymerase II and suggested that there is no polymerase entry site. Low levels of transcription were detected in regions of the bottom strand, and stable transcripts were identified for four ORFs on this strand not predicted to be protein-coding. In conclusion, the transcriptional organization of the Leishmania genome is complex, raising the possibility that computer predictions may not be comprehensive.
Abstract:
In this final project the high availability options for the PostgreSQL database management system were explored and evaluated. The primary objective of the project was to find a reliable replication system and implement it in a production environment. The secondary objective was to explore different load balancing methods and compare their performance. The potential replication methods were thoroughly examined, and the most promising was implemented in a database system gathering weather information in Lithuania. The different load balancing methods were performance-tested under different load scenarios and the results were analysed. As a result of this project, a functioning PostgreSQL database replication system was built for the Lithuanian Hydrometeorological Service's headquarters, and definite guidelines for future load balancing needs were produced. This study includes the actual implementation of a replication system in a demanding production environment, but only guidelines for building a load balancing system for the same production environment.
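The thesis evaluates specific replication and load-balancing tools; as a minimal sketch of the read/write-splitting idea that underlies such setups (not the thesis's actual configuration), the Python snippet below routes writes to a primary and distributes reads round-robin across replicas, assuming the psycopg2 driver and placeholder host names, credentials and schema.

import itertools
import psycopg2  # PostgreSQL driver

# Placeholder connection settings for a primary and two streaming replicas.
PRIMARY = dict(host="db-primary", dbname="weather", user="app")
REPLICAS = [dict(host="db-replica1", dbname="weather", user="app"),
            dict(host="db-replica2", dbname="weather", user="app")]
replica_cycle = itertools.cycle(REPLICAS)

def execute_write(sql, params=None):
    """All writes go to the primary, which replicates changes to the replicas."""
    with psycopg2.connect(**PRIMARY) as conn, conn.cursor() as cur:
        cur.execute(sql, params)

def execute_read(sql, params=None):
    """Reads are balanced round-robin across the replicas."""
    with psycopg2.connect(**next(replica_cycle)) as conn, conn.cursor() as cur:
        cur.execute(sql, params)
        return cur.fetchall()

execute_write("INSERT INTO observations (station, temp_c) VALUES (%s, %s)",
              ("Vilnius", 3.2))
print(execute_read("SELECT station, temp_c FROM observations LIMIT 5"))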
Abstract:
The manipulation of DNA is routine practice in botanical research and has made a huge impact on plant breeding, biotechnology and biodiversity evaluation. DNA is easy to extract from most plant tissues and can be stored for long periods in DNA banks. Curation methods are well developed for other botanical resources such as herbaria, seed banks and botanic gardens, but procedures for the establishment and maintenance of DNA banks have not been well documented. This paper reviews the curation of DNA banks for the characterisation and utilisation of biodiversity and provides guidelines for DNA bank management. It surveys existing DNA banks and outlines their operation. It includes a review of plant DNA collection, preservation, isolation, storage, database management and exchange procedures. We stress that DNA banks require full integration with existing collections such as botanic gardens, herbaria and seed banks, and information retrieval systems that link such facilities, bioinformatic resources and other DNA banks. They also require efficient and well-regulated sample exchange procedures. Only with appropriate curation will maximum utilisation of DNA collections be achieved.
Abstract:
The purpose of Project ASSIST is to provide computer training to individuals who are blind, visually-impaired or deaf-blind. Our training materials address all levels of users, from beginners to advanced users. We have tutorials, keyboard guides and diagrams, and course packets. These materials can be used by individuals who want to learn popular computer programs on their own and by professional trainers for their organization's computer training program. We also offer instructor-led training through our ASSIST Online distance learning program.
Abstract:
Objective: To investigate the educational technologies developed for promoting cardiovascular health in adults. Method: Integrative review carried out in the PubMed, SciELO and LILACS databases, with 15 articles selected. Results: Over half (60%) of the studies were randomized clinical trials. The educational technologies developed were one-year programs involving three strategies, playful technologies using storytelling, computer programs or smartphone software, and an electronic brochure. These technologies resulted in reductions in blood pressure, weight and waist circumference, fewer hospitalizations and increased years of life. Conclusion: The studies with the greatest impact on the cardiovascular health of adults were those that delivered the technology as a one-year program.
Abstract:
Data mining can be defined as the extraction of previously unknown and potentially useful information from large datasets. The main principle is to devise computer programs that run through databases and automatically seek deterministic patterns. It is applied in many fields, e.g., remote sensing, biometry and speech recognition, but has seldom been applied to forensic case data. The intrinsic difficulty related to the use of such data lies in its heterogeneity, which comes from the many different sources of information. The aim of this study is to highlight potential uses of pattern recognition that would provide relevant results from a criminal intelligence point of view. The role of data mining within a global crime analysis methodology is to detect all types of structures in a dataset. Once filtered and interpreted, those structures can point to previously unseen criminal activities. The interpretation of patterns for intelligence purposes is the final stage of the process. It allows the researcher to validate the whole methodology and to refine each step if necessary. An application to cutting agents found in illicit drug seizures was performed. A combinatorial approach was taken, using the presence and the absence of products. Methods from graph theory were used to extract patterns in data constituted by links between products and the place and date of seizure. A data mining process carried out using graphing techniques is called "graph mining". Patterns were detected that had to be interpreted and compared with preliminary knowledge to establish their relevancy. The illicit drug profiling process is actually an intelligence process that uses preliminary illicit drug classes to classify new samples. Methods proposed in this study could be used a priori to compare structures from preliminary and post-detection patterns. This new knowledge of a repeated structure may provide valuable complementary information to profiling and become a source of intelligence.
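As a minimal illustration of the link-based pattern extraction described above (not the authors' actual graph-mining pipeline), the Python sketch below records links between products and seizures and counts co-occurring cutting agents; the seizure records are invented placeholders.

from collections import Counter
from itertools import combinations

# Hypothetical seizure records: (place, date, cutting agents detected).
seizures = [
    ("Lausanne", "2006-03-02", {"caffeine", "paracetamol"}),
    ("Geneva",   "2006-03-15", {"caffeine", "paracetamol", "lidocaine"}),
    ("Lausanne", "2006-04-01", {"caffeine", "lidocaine"}),
]

# Links between products and seizures (a bipartite structure), plus
# co-occurrence counts between products across seizures.
links = [(place, date, product)
         for place, date, products in seizures
         for product in products]
co_occurrence = Counter()
for _, _, products in seizures:
    for pair in combinations(sorted(products), 2):
        co_occurrence[pair] += 1

# Repeated pairs may point to a shared cutting practice to be interpreted
# against prior knowledge (the intelligence step described in the abstract).
for pair, count in co_occurrence.most_common():
    if count > 1:
        print(pair, count)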
Abstract:
Various test methods exist for measuring heat of cement hydration; however, most current methods require expensive equipment, complex testing procedures, and/or extensive time, and are thus not suitable for field application. The objectives of this research are to identify, develop, and evaluate a standard test procedure for characterization and quality control of pavement concrete mixtures using a calorimetry technique. This research project has three phases. Phase I was designed to identify the user needs, including performance requirements and precision and bias limits, and to synthesize existing test methods for monitoring the heat of hydration, including device types, configurations, test procedures, measurements, advantages, disadvantages, applications, and accuracy. Phase II was designed to conduct experimental work to evaluate the calorimetry equipment recommended from the Phase I study and to develop a standard test procedure for using the equipment and interpreting the test results. Phase II also includes the development of models and computer programs for prediction of concrete pavement performance based on the characteristics of heat evolution curves. Phase III was designed to further develop a much simpler, inexpensive calorimeter for field concrete. In this report, the results from the Phase I study are presented, the plan for the Phase II study is described, and the recommendations for the Phase III study are outlined. Phase I has been completed through three major activities: (1) collecting input and advice from the members of the project Technical Working Group (TWG), (2) conducting a literature survey, and (3) performing trials at the CP Tech Center’s research lab. The research results indicate that in addition to predicting maturity/strength, concrete heat evolution test results can also be used for (1) forecasting concrete setting time, (2) specifying curing period, (3) estimating risk of thermal cracking, (4) assessing pavement sawing/finishing time, (5) characterizing cement features, (6) identifying incompatibility of cementitious materials, (7) verifying concrete mix proportions, and (8) selecting materials and/or mix designs for given environmental conditions. Besides concrete materials and mix proportions, the configuration of the calorimeter device, sample size, mixing procedure, and testing environment (temperature) also have significant influences on the features of the concrete heat evolution process. The research team has found that although various calorimeter tests have been conducted for assorted purposes and the potential uses of calorimeter tests are clear, there is no consensus on how to utilize the heat evolution curves to characterize concrete materials and how to effectively relate the characteristics of heat evolution curves to concrete pavement performance. The goal of the Phase II study is to close these gaps.
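The report relates heat evolution to maturity/strength prediction; as background, a common way to express maturity from a temperature history is the Nurse-Saul index, M = sum of (T - T0) * dt over the curing period. The short Python sketch below computes it for an assumed temperature record; the values and datum temperature are illustrative and not taken from the report.

def nurse_saul_maturity(temps_c, dt_hours, datum_c=-10.0):
    """Nurse-Saul maturity index: sum of (T - T0) * dt over the curing
    history, ignoring intervals where the temperature is below the datum."""
    return sum(max(t - datum_c, 0.0) * dt_hours for t in temps_c)

# Illustrative hourly concrete temperatures (deg C) over the first 12 hours.
temps = [22, 24, 27, 31, 36, 40, 43, 44, 43, 41, 38, 35]
print(f"Maturity after 12 h: {nurse_saul_maturity(temps, dt_hours=1.0):.0f} deg C-hours")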
Abstract:
Using computer vision techniques, this project aims to develop an application capable of segmenting the skin, detecting nevi (moles and other spots) and comparing images of patients at risk of developing melanoma taken at different times. The project aims to offer dermatologists various software tools for research-related purposes. Its main objective is to develop a computer system that gives dermatologists an agile way to manage patient data together with the corresponding images, helps them detect the nevi in those images, and helps them compare examinations (with the detections performed) of the same patient from different periods.
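Purely as an illustration of the colour-based skin segmentation step mentioned above (not the project's own algorithm or parameters), the Python/OpenCV sketch below thresholds an image in the YCrCb colour space using a common heuristic range and then marks dark regions inside the skin mask as nevus candidates; the file names and thresholds are placeholders.

import cv2
import numpy as np

def segment_skin(image_bgr):
    """Return a binary mask of likely skin pixels using fixed YCrCb thresholds."""
    ycrcb = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2YCrCb)
    lower = np.array([0, 133, 77], dtype=np.uint8)    # heuristic bounds on (Y, Cr, Cb)
    upper = np.array([255, 173, 127], dtype=np.uint8)
    mask = cv2.inRange(ycrcb, lower, upper)
    # Remove small speckles from the mask.
    kernel = np.ones((5, 5), np.uint8)
    return cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)

image = cv2.imread("patient_back.jpg")            # hypothetical input image
skin_mask = segment_skin(image)

# Candidate nevi: dark regions inside the skin mask.
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
_, dark = cv2.threshold(gray, 90, 255, cv2.THRESH_BINARY_INV)
nevi_candidates = cv2.bitwise_and(dark, skin_mask)
cv2.imwrite("nevi_candidates.png", nevi_candidates)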
Abstract:
The objective of this final-year project (PFC) is to study the field of detecting objects in videos according to their motion. To do so, an algorithm will be created that can process a video, computing the number of objects in the scene and the position of each of them. The algorithm must be able to find a set of useful regions and, from this set, separate it into different groups, each one representing a moving object. The purpose of the project is the study of object detection in video. We will try to create an algorithm that lets us carry out this study and draw conclusions from it. We intend to build an algorithm, or a set of algorithms, in Matlab that, given any video, can return a set of images, or a video, with the different objects in the scene highlighted. Tests will be run in different situations, from synthetic objects with clearly defined motion to real sequences extracted from several films. Finally, we intend to assess its efficiency. Since the project falls within the robotics and computer vision research line, the main task will be image manipulation. We will therefore use Matlab, since images are nothing more than matrices and this program allows vector and matrix computation in a simple and truly efficient way.
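The project plans to implement its detector in Matlab; as a minimal, language-swapped illustration of motion-based detection, the Python/OpenCV sketch below uses frame differencing and contour grouping to count and highlight moving objects in a video (the file name, thresholds and minimum area are placeholders).

import cv2

cap = cv2.VideoCapture("scene.avi")   # placeholder video file
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Regions that changed between consecutive frames are motion candidates.
    diff = cv2.absdiff(gray, prev_gray)
    _, motion = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    motion = cv2.dilate(motion, None, iterations=2)
    # Group the motion pixels into separate regions, one per moving object.
    contours, _ = cv2.findContours(motion, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    objects = [c for c in contours if cv2.contourArea(c) > 500]
    for c in objects:
        x, y, w, h = cv2.boundingRect(c)
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    print(f"moving objects in frame: {len(objects)}")
    prev_gray = gray

cap.release()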
Study and implementation of a 3D reconstruction method based on SfM and registration of partial 3D views
Abstract:
This project is based on reconstructing a large 3D image from a sequence of 2D images captured by a camera. We focus on studying the mathematical foundations of computer vision as well as different methods used in 3D image reconstruction. To carry out this study, the MatLab development platform is used, since it handles mathematical operations, images and very large matrices with great simplicity, speed and efficiency, which is why it is used in much of the research on this topic. The project goes deeper into this topic by studying and implementing a method that consists of applying Structure from Motion (SfM) to a few consecutive frames taken from a 2D image sequence to create a 3D reconstruction. Once two consecutive 3D reconstructions have been created, and using at least one frame in common between them, a 3D structure registration method, the Iterative Closest Point (ICP), is applied to create a larger 3D reconstruction by joining the different reconstructions obtained from SfM. The method consists of repeating these operations until the end of the frames in order to obtain a 3D reconstruction larger than the small ones produced by SfM alone. Figure 1 shows a diagram of the process that is followed. To evaluate the behaviour of the method, we use a set of synthetic sequences and a set of real sequences obtained from a camera. The final goal of this project is to build a new MatLab toolbox with all the methods for creating large 3D reconstructions, so that this problem can be handled easily and developed further in the future.
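The method described above chains SfM over short frame windows and registers the resulting partial clouds with ICP. The Python sketch below shows only the control flow of that loop; run_sfm and icp_align are hypothetical placeholders standing in for the actual toolbox functions (the original work is a MatLab toolbox).

import numpy as np

def run_sfm(frames):
    """Placeholder: would run Structure from Motion on a short window of
    frames and return an N x 3 point cloud in that window's local frame."""
    return np.random.rand(100, 3)

def icp_align(source, target):
    """Placeholder: would run Iterative Closest Point and return the rigid
    transform (R, t) that maps `source` onto `target`."""
    return np.eye(3), np.zeros(3)

def reconstruct(frames, window=3, overlap=1):
    """Chain partial SfM reconstructions into one larger cloud via ICP,
    keeping at least `overlap` frames in common between consecutive windows."""
    global_cloud = run_sfm(frames[:window])
    start = window - overlap
    while start + window <= len(frames):
        partial = run_sfm(frames[start:start + window])
        R, t = icp_align(partial, global_cloud)   # register via the shared frame(s)
        global_cloud = np.vstack([global_cloud, partial @ R.T + t])
        start += window - overlap
    return global_cloud

frames = [f"frame_{i:03d}.png" for i in range(12)]   # hypothetical frame list
print(reconstruct(frames).shape)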
Abstract:
The main objective of this project is to carry out the study, analysis, design and development of an application that improves and simplifies the work of an employee who works outside the company. The application requirements are analysed, the analysis model and then the design model are produced, and the application is implemented using the tools our company uses or those that best suit it. The type and volume of information it will have to handle is also analysed. During each development stage the corresponding tests are run and, finally, the system is deployed with one employee as a trial in a real environment. Because the sales staff need mobility, PDA (Personal Digital Assistant) devices are used for the application that will let them carry out their tasks. The company's corporate management application, however, resides on a Windows server, and a Windows application will be used to centralise and manage the information from the PDAs. The application will have to be built around the data the salesperson is expected to consult about a client, a job site, ... To improve its functionality, a multi-language option will be provided. Broadly speaking, the PDA application will offer the following functionality: client management, contact management, consulting sales actions, consulting clients' job sites, and consulting the schedule and status of the installers. Other processes the application must support are the configuration and installation of the application on the PDA devices and the loading and transfer of data between the PDA application and the company's corporate application.
Abstract:
The objectives of the project are: to build a command interpreter in VAL3 that receives orders over a TCP/IP connection; to build a Matlab toolbox for sending different orders over a TCP/IP connection; to acquire and process camera images in real time with Matlab and detect the position of artificial objects through colour segmentation; and to design and build a Matlab application that picks up pieces detected with the camera. The scope of the project includes: the study of the VAL3 programming language and the design of the command interpreter; the study of the Matlab libraries for TCP/IP communication, image acquisition, image processing and C programming; the design of the piece-collecting application; and the implementation of a command interpreter in VAL3, the toolbox for controlling the STAUBLI robot in Matlab, and the piece-collecting application using real-time image processing, also in Matlab.
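The project's VAL3 interpreter accepts orders over TCP/IP, but the abstract does not give the command syntax; the Python sketch below therefore only illustrates the client side of such a link, with the host, port and command strings as invented placeholders (the original toolbox was written in Matlab).

import socket

ROBOT_HOST = "192.168.1.50"   # placeholder address of the STAUBLI controller
ROBOT_PORT = 5000             # placeholder port of the VAL3 command interpreter

def send_command(command: str) -> str:
    """Open a TCP connection, send one text command and return the reply."""
    with socket.create_connection((ROBOT_HOST, ROBOT_PORT), timeout=5) as sock:
        sock.sendall((command + "\n").encode("ascii"))
        return sock.recv(1024).decode("ascii").strip()

# Hypothetical command strings: move above a detected piece, then pick it up.
print(send_command("MOVEJ 120.5 -45.0 90.0 0.0 45.0 0.0"))
print(send_command("GRIP CLOSE"))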
Abstract:
The objective of this project is to implement a website and a data management program for the De La Serranía dog kennel/boarding facility (Madrid). Thanks to the development of this application and the website, the client will be able to do away with all the paper previously used to store information about clients, animals and admissions to the facility; control the occupancy of the facility and generate invoices automatically; automatically create family trees for all registered animals; improve the performance of the kennel thanks to the option that tracks each animal's breeding data (mating date, whelping date, outcome of each litter, etc.); and track the company's expenses and earnings.