964 results for Libraries -- Computer programs
Abstract:
Little is known about the relation between genome organization and gene expression in Leishmania. Bioinformatic analysis can be used to predict genes and find homologies with known proteins. A model was proposed in which genes are organized into large clusters and transcribed from only one strand, in the form of large polycistronic primary transcripts. To verify the validity of this model, we studied gene expression at the transcriptional, post-transcriptional and translational levels in a unique 34 kb locus on chr27, represented by cosmid L979. Sequence analysis revealed 115 ORFs on the two DNA strands. Using computer programs developed for Leishmania genes, only nine of these ORFs, all localized on the same strand, were predicted to code for proteins, some of which show homologies with known proteins. Additionally, one pseudogene was identified. We verified the biological relevance of these predictions: mRNAs from nine predicted genes and proteins from seven were detected. Nuclear run-on analyses confirmed that the top strand is transcribed by RNA polymerase II and suggested that there is no polymerase entry site. Low levels of transcription were detected in regions of the bottom strand, and stable transcripts were identified for four ORFs on this strand that were not predicted to be protein-coding. In conclusion, the transcriptional organization of the Leishmania genome is complex, raising the possibility that computer predictions may not be comprehensive.
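The abstract does not disclose the Leishmania-specific predictor it used, but the six-frame ORF scan such tools build on can be sketched in a few lines. The minimum-length cutoff and the ATG-to-stop convention below are illustrative assumptions, not the study's actual criteria.

```python
# Minimal six-frame ORF scan over both DNA strands -- a generic sketch,
# not the Leishmania-specific prediction programs cited in the abstract.

def revcomp(seq):
    comp = {"A": "T", "T": "A", "G": "C", "C": "G"}
    return "".join(comp[b] for b in reversed(seq))

def find_orfs(seq, min_len=30):
    """Return (strand, start, end, frame) for ATG-to-stop ORFs.

    Coordinates for "-" hits are on the reverse-complemented sequence.
    """
    stops = {"TAA", "TAG", "TGA"}
    orfs = []
    for strand, s in (("+", seq), ("-", revcomp(seq))):
        for frame in range(3):
            i = frame
            while i + 3 <= len(s):
                if s[i:i + 3] == "ATG":
                    j = i + 3
                    while j + 3 <= len(s) and s[j:j + 3] not in stops:
                        j += 3
                    if j + 3 <= len(s) and j + 3 - i >= min_len:
                        orfs.append((strand, i, j + 3, frame))
                        i = j  # resume scanning past this ORF
                i += 3
    return orfs
```

Note that a real predictor trained on Leishmania coding statistics would filter these candidates much further, which is exactly why only nine of 115 ORFs survived in the study.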
Abstract:
Objective: To investigate educational technologies developed for promoting cardiovascular health in adults. Method: Integrative review carried out in the PubMed, SciELO and LILACS databases, with 15 articles selected. Results: Over half (60%) of the studies were randomized clinical trials. The educational technologies developed were programs involving three strategies with a duration of one year, playful technologies using storytelling, computer programs or smartphone software, and an electronic brochure. These technologies resulted in reductions in blood pressure, weight and waist circumference, fewer hospitalizations, and increased years of life. Conclusion: The studies with the greatest impact on the cardiovascular health of adults were those that delivered the technology as a program lasting one year.
Abstract:
Data mining can be defined as the extraction of previously unknown and potentially useful information from large datasets. The main principle is to devise computer programs that run through databases and automatically seek deterministic patterns. It has been applied in many fields, e.g. remote sensing, biometry and speech recognition, but seldom to forensic case data. The intrinsic difficulty of using such data lies in its heterogeneity, which comes from the many different sources of information. The aim of this study is to highlight potential uses of pattern recognition that would provide relevant results from a criminal intelligence point of view. The role of data mining within a global crime analysis methodology is to detect all types of structures in a dataset. Once filtered and interpreted, those structures can point to previously unseen criminal activities. The interpretation of patterns for intelligence purposes is the final stage of the process; it allows the researcher to validate the whole methodology and to refine each step if necessary. An application to cutting agents found in illicit drug seizures was performed, using a combinatorial approach based on the presence and absence of products. Methods from graph theory were used to extract patterns from data consisting of links between products and the place and date of seizure. A data mining process carried out with graphing techniques is called "graph mining". Patterns were detected that had to be interpreted and compared with prior knowledge to establish their relevance. The illicit drug profiling process is in fact an intelligence process that uses predefined illicit drug classes to classify new samples. The methods proposed in this study could be used a priori to compare structures from preliminary and post-detection patterns.
This new knowledge of a repeated structure may provide valuable complementary information to profiling and become a source of intelligence.
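The product-link structure described above can be illustrated as a small graph-mining exercise: link cutting agents that co-occur in the same seizure, then extract connected components of the resulting graph as candidate patterns. The records and product labels used with this sketch are placeholders; this is not the study's actual method or data.

```python
# Sketch of the "graph mining" idea: products found together in a seizure
# become linked nodes; connected components are candidate patterns.
# Seizure contents passed in are invented examples, not case data.

from collections import defaultdict
from itertools import combinations

def cooccurrence_graph(seizures):
    """seizures: iterable of sets of products found together."""
    graph = defaultdict(set)
    for products in seizures:
        for a, b in combinations(sorted(products), 2):
            graph[a].add(b)
            graph[b].add(a)
    return graph

def connected_components(graph):
    """Depth-first extraction of the graph's connected components."""
    seen, components = set(), []
    for node in graph:
        if node in seen:
            continue
        stack, comp = [node], set()
        while stack:
            n = stack.pop()
            if n in comp:
                continue
            comp.add(n)
            stack.extend(graph[n] - comp)
        seen |= comp
        components.append(comp)
    return components
```

Products that never co-occur with anything do not enter the graph; a real analysis would also weight links by frequency and by place and date of seizure, as the study describes.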
Abstract:
Various test methods exist for measuring the heat of cement hydration; however, most current methods require expensive equipment, complex testing procedures, and/or extensive time, making them unsuitable for field application. The objectives of this research are to identify, develop, and evaluate a standard test procedure for characterization and quality control of pavement concrete mixtures using a calorimetry technique. This research project has three phases. Phase I was designed to identify user needs, including performance requirements and precision and bias limits, and to synthesize existing test methods for monitoring the heat of hydration, including device types, configurations, test procedures, measurements, advantages, disadvantages, applications, and accuracy. Phase II was designed to conduct experimental work to evaluate the calorimetry equipment recommended by the Phase I study and to develop a standard test procedure for using the equipment and interpreting the test results. Phase II also includes the development of models and computer programs for predicting concrete pavement performance based on the characteristics of heat evolution curves. Phase III was designed to pursue the development of a much simpler, inexpensive calorimeter for field concrete. In this report, the results of the Phase I study are presented, the plan for the Phase II study is described, and the recommendations for the Phase III study are outlined. Phase I was completed through three major activities: (1) collecting input and advice from the members of the project Technical Working Group (TWG), (2) conducting a literature survey, and (3) performing trials at the CP Tech Center's research lab.
The research results indicate that, in addition to predicting maturity/strength, concrete heat evolution test results can also be used for (1) forecasting concrete setting time, (2) specifying the curing period, (3) estimating the risk of thermal cracking, (4) assessing pavement sawing/finishing time, (5) characterizing cement features, (6) identifying incompatibility of cementitious materials, (7) verifying concrete mix proportions, and (8) selecting materials and/or mix designs for given environmental conditions. Besides the concrete materials and mix proportions, the configuration of the calorimeter device, sample size, mixing procedure, and testing environment (temperature) also significantly influence the features of the concrete heat evolution process. The research team found that although various calorimeter tests have been conducted for assorted purposes and the potential uses of calorimeter tests are clear, there is no consensus on how to use heat evolution curves to characterize concrete materials or how to effectively relate the characteristics of heat evolution curves to concrete pavement performance. The goal of the Phase II study is to close these gaps.
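As one concrete illustration of relating a temperature history to maturity/strength (use (1) in the list above), the widely used Nurse-Saul maturity index of ASTM C1074 integrates temperature above a datum over time. This is a generic sketch of that standard index, not the models developed in Phase II of this project.

```python
# Nurse-Saul maturity index (ASTM C1074): M = sum of (T - T0) * dt over
# the temperature history, with temperatures below the datum ignored.
# A generic illustration, not this project's prediction models.

def nurse_saul_maturity(temps_c, dt_hours, datum_c=0.0):
    """Maturity in degC-hours from equally spaced temperature readings."""
    return sum(max(t - datum_c, 0.0) * dt_hours for t in temps_c)
```

For example, hourly readings of 20, 30 and 25 degC with a 0 degC datum give a maturity of 75 degC-hours; strength is then estimated from a maturity-strength calibration curve for the specific mixture.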
Abstract:
Using computer vision techniques, this project aims to develop an application capable of segmenting skin, detecting nevi (moles and other spots), and comparing images, taken at different times, of patients at risk of developing melanoma. The project aims to offer dermatologists various software tools for research purposes. Its main objective is to develop a computer system that gives dermatologists an agile way to manage patient data together with the corresponding images, helps them detect nevi in those images, and helps them compare explorations (with the detections performed) of the same patient from different periods.
Abstract:
The objective of this final-year project is to study motion-based object detection in videos. To do so, an algorithm will be created that can process a video, computing the number of objects in the scene and the position of each one. The algorithm must be able to find a set of useful regions and, from it, separate them into groups, each representing one moving object. The purpose of this project is the study of object detection in video. We will try to create an algorithm that allows us to carry out this study and draw conclusions from it. We intend to write an algorithm, or a set of algorithms, in Matlab that, given any video, can return a set of images, or a video, with the different objects in the scene highlighted. Tests will be run in different situations, from synthetic objects with clearly defined motion to real sequences taken from various films. Finally, we intend to evaluate the algorithm's efficiency. Since the project falls within the robotics and computer vision research line, the main task will be image manipulation. We will therefore use Matlab: images are nothing more than matrices, and this program supports vector and matrix computation in a simple and truly efficient way.
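Motion-based detection of this kind usually starts from the observation that moving objects change pixel intensities between consecutive frames. A minimal frame-differencing sketch follows, in Python rather than the project's Matlab, with grayscale frames represented as lists of rows and an arbitrary threshold; grouping the marked pixels into per-object regions would be a further step.

```python
# Frame differencing: mark pixels whose intensity changes more than a
# threshold between two consecutive grayscale frames. A plain-Python
# illustration of the idea, not the project's Matlab implementation.

def motion_mask(prev, curr, threshold=20):
    """Return a 0/1 mask the same shape as the input frames."""
    return [[1 if abs(c - p) > threshold else 0
             for p, c in zip(prow, crow)]
            for prow, crow in zip(prev, curr)]
```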
Study and implementation of a 3D reconstruction method based on SfM and registration of partial 3D views
Abstract:
This project is based on reconstructing a large 3D image from a sequence of 2D images captured with a camera. We focus on the mathematical foundations of computer vision as well as on different methods used for 3D image reconstruction. The MatLab development platform is used for this study, since it handles mathematical operations, images and large matrices simply, quickly and efficiently, which is why it is used in much of the research on this topic. The project goes deeper into the topic described above by studying and implementing a method that applies Structure from Motion (SfM) to a few consecutive frames taken from a 2D image sequence to create a 3D reconstruction. Once two consecutive 3D reconstructions have been created, sharing at least one frame, a 3D structure registration method, Iterative Closest Point (ICP), is applied to create a larger 3D reconstruction by joining the different reconstructions obtained from SfM. The method repeats these operations until the end of the frames, producing a 3D reconstruction larger than the small ones obtained through SfM. Figure 1 shows a diagram of the process. To evaluate the behavior of the method, we use a set of synthetic sequences and a set of real sequences captured with a camera. The final goal of this project is to build a new MatLab toolbox with all the methods for creating large 3D reconstructions, so that this problem can be handled easily and developed further in the future.
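The registration step performed here with ICP can be illustrated in miniature. Full 3D ICP computes the rotation with an SVD; the 2D analogue below uses the closed-form Procrustes angle instead, and shows one match-then-align iteration (nearest-neighbour correspondences, then a rigid transform). It is a sketch of the structure of the algorithm only, not the project's implementation.

```python
import math

# One iteration of a 2D ICP-style alignment: match each source point to
# its nearest target point, then fit a closed-form rigid transform.
# Real ICP repeats this until convergence; 3D needs an SVD instead of
# the atan2-based 2D Procrustes rotation used here.

def icp_step(src, dst):
    # 1. nearest-neighbour correspondences
    pairs = [(p, min(dst, key=lambda q: (q[0] - p[0]) ** 2 + (q[1] - p[1]) ** 2))
             for p in src]
    n = len(pairs)
    # 2. centroids of matched source and target points
    cx_s = sum(p[0] for p, _ in pairs) / n
    cy_s = sum(p[1] for p, _ in pairs) / n
    cx_d = sum(q[0] for _, q in pairs) / n
    cy_d = sum(q[1] for _, q in pairs) / n
    # 3. closed-form 2D rotation (Procrustes angle)
    num = den = 0.0
    for (px, py), (qx, qy) in pairs:
        ax, ay = px - cx_s, py - cy_s
        bx, by = qx - cx_d, qy - cy_d
        num += ax * by - ay * bx
        den += ax * bx + ay * by
    theta = math.atan2(num, den)
    c, s = math.cos(theta), math.sin(theta)
    # 4. translation mapping the rotated source centroid onto the target's
    tx = cx_d - (c * cx_s - s * cy_s)
    ty = cy_d - (s * cx_s + c * cy_s)

    def apply(p):
        return (c * p[0] - s * p[1] + tx, s * p[0] + c * p[1] + ty)

    return [apply(p) for p in src], theta
```

With correct correspondences a single step already recovers a pure translation exactly; in general the loop must iterate because the nearest-neighbour matches improve as the clouds move closer.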
Abstract:
The objectives of the project are: to implement a command interpreter in VAL3 that receives orders over a TCP/IP connection; to implement a Matlab toolbox for sending different commands over a TCP/IP connection; to acquire and process camera images in real time with Matlab and detect the position of artificial objects using color segmentation; and to design and implement a Matlab application that collects pieces detected with the camera. The scope of the project includes: the study of the VAL3 programming language and the design of the command interpreter; the study of the Matlab libraries for TCP/IP communication, image acquisition, image processing, and C programming; the design of the piece-collecting application; and the implementation of a command interpreter in VAL3, the toolbox for controlling the STAUBLI robot from Matlab, and the piece-collecting application using real-time image processing, also in Matlab.
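The command-interpreter objective boils down to sending short text orders over TCP/IP and reading back an acknowledgement. A minimal sketch of that pattern follows, in Python rather than VAL3/Matlab; the command string and the echoing stand-in server are invented for illustration, since the real interpreter runs in VAL3 on the robot controller.

```python
import socket
import threading

# Text-command-over-TCP sketch. serve_once plays the role of the robot's
# command interpreter (here it just acknowledges); send_command plays the
# role of the Matlab toolbox's send routine. Command syntax is invented.

def serve_once(listening_sock):
    """Accept one connection, read one order, acknowledge it."""
    conn, _ = listening_sock.accept()
    with conn:
        data = conn.recv(1024)
        conn.sendall(b"OK " + data)

def send_command(host, port, command):
    """Send a textual order and return the server's reply."""
    with socket.create_connection((host, port)) as s:
        s.sendall(command.encode())
        return s.recv(1024).decode()
```

A loopback demonstration: bind a listening socket to port 0 (letting the OS pick a free port), run `serve_once` in a thread, then call `send_command("127.0.0.1", port, "MOVEJ 10 20 30")`, which returns the acknowledgement string. A production version would add a delimiter or length prefix so multi-part orders survive TCP's stream semantics.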
Abstract:
A highlight of the past ten years' accomplishments of the State Library of Iowa.
Abstract:
Signal search analysis is a general method to discover and characterize sequence motifs that are positionally correlated with a functional site (e.g. a transcription or translation start site). The method has played an instrumental role in the analysis of eukaryotic promoter elements. The signal search analysis server provides access to four different computer programs as well as to a large number of precompiled functional site collections. The programs offered allow: (i) the identification of non-random sequence regions under evolutionary constraint; (ii) the detection of consensus sequence-based motifs that are over- or under-represented at a particular distance from a functional site; (iii) the analysis of the positional distribution of a consensus sequence- or weight matrix-based sequence motif around a functional site; and (iv) the optimization of a weight matrix description of a locally over-represented sequence motif. These programs can be accessed at: http://www.isrec.isb-sib.ch/ssa/.
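Program (iii), the positional distribution of a motif around a functional site, can be illustrated in a few lines: given sequences aligned on the functional site, count motif start positions by their offset from the site. This is a generic sketch of the kind of analysis the server performs, not its actual implementation, and the exact-match consensus here stands in for the server's consensus/weight-matrix scoring.

```python
# Positional distribution of a consensus motif around aligned functional
# sites: for sequences aligned so the site is at index `site`, tally how
# often the motif starts at each offset. Illustration only.

def positional_counts(sequences, motif, site):
    counts = {}
    for seq in sequences:
        for i in range(len(seq) - len(motif) + 1):
            if seq[i:i + len(motif)] == motif:
                offset = i - site
                counts[offset] = counts.get(offset, 0) + 1
    return counts
```

A peak in the resulting histogram at a fixed negative offset is the signature of an upstream element such as the TATA box relative to a transcription start site.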
Abstract:
A modification of the Bond-Scand program is presented, intended for use on computers with limited memory. With this modification, the memory requirement drops from 100 K to 81 K, so the program can be run on the IBM 360/30 available at the Faculty of Sciences of the University of Barcelona.
Abstract:
Local conditions in the past often limited opportunities for scholarly exchange. Now those limits are gone, replaced by a global workplace, and it is important to react to these changes: every academic department must adopt new methods and rethink its processes. Another driver of change is the intense national and international debate about open access to scholarly knowledge. The Open Access Initiative shows that the communication process is changing. This change also matters for service departments within research institutions: libraries, computer centers and related units have to ask themselves how to react appropriately to the new conditions. Which services must be changed or redeveloped, and in what quality and quantity should they be offered? This article focuses on changes in the scholarly publication process, describing both technological changes and the changes needed in people's attitudes.
Abstract:
Electronic distance measuring instruments (EDMI) are used by surveyors for routine length measurements. The constant and scale factors of an instrument tend to change with usage, transportation, and aging of the crystals. Calibration baselines are established so that surveyors can check their instruments and determine any changes in the constant and scale factors. The National Geodetic Survey (NGS) has developed guidelines for establishing these baselines. In 1981 an EDMI baseline was established at ISU according to NGS guidelines, and in October 1982 the NGS measured the distances between the monuments. Computer programs for reducing observed distances were developed, along with a mathematical model and computer programs for determining the constant and scale factors. A method was developed to detect any movement of the monuments, and periodic measurements of the baseline were made; no significant movement of the monuments was detected.
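At its simplest, the reduction to a constant and a scale factor is a straight-line fit of observed against known baseline distances, with the model obs = c + k * known. The least-squares sketch below illustrates that model; the project's actual adjustment programs are not reproduced here.

```python
# Least-squares estimate of an EDMI's additive constant and scale factor
# from observed vs. known baseline distances (model: obs = c + k * d).
# A generic illustration of the reduction, not the project's programs.

def fit_constant_scale(known, observed):
    """Return (constant, scale) from a simple linear regression."""
    n = len(known)
    mx = sum(known) / n
    my = sum(observed) / n
    sxx = sum((x - mx) ** 2 for x in known)
    sxy = sum((x - mx) * (y - my) for x, y in zip(known, observed))
    scale = sxy / sxx
    constant = my - scale * mx
    return constant, scale
```

In practice an instrument in good adjustment has a scale factor very close to 1 and a constant of a few millimetres; significant drift in either value signals that the EDMI needs servicing.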
Abstract:
Several accidents, some involving fatalities, have occurred on U.S. Highway 30 near the Archer Daniels Midland Company (ADM) Corn Sweeteners plant in Cedar Rapids, Iowa. A contributing factor to many of these accidents has been the large amounts of water (vapor and liquid) emitted from multiple sources at ADM's facility located along the south side of the highway. Weather and road closure data acquired from IDOT have been used to develop a database of meteorological conditions preceding and accompanying closure of Highway 30 in Cedar Rapids. An expert system and a FORTRAN program were developed as aids in decision making with regard to closure of Highway 30 near the plant. The computer programs were used for testing, evaluation, and final deployment. Reports indicate the decision tools have been successfully implemented and were judged to be helpful in forecasting road closures and in reducing costs and personnel time in monitoring the roadway.
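The expert system's role can be illustrated by a toy rule set mapping weather inputs to a coarse closure-risk level. Every threshold and variable name below is invented for illustration; the deployed system's actual rules are not given in this abstract.

```python
# Toy rule-based advisory in the spirit of the expert system described
# above: plant vapor drifting over the road plus freezing temperatures
# raises the risk level. All thresholds and names are invented.

def closure_advisory(temp_c, wind_toward_road, humidity_pct):
    """Return a coarse fog/icing risk level for the highway segment."""
    if temp_c <= 0 and wind_toward_road and humidity_pct >= 90:
        return "high"      # plume likely to drift over the road and freeze
    if wind_toward_road and humidity_pct >= 90:
        return "moderate"  # visibility reduction from drifting vapor
    return "low"
```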
Abstract:
Provides instructions for using the computer programs developed under the research project "The Economics of Reducing the County Road System: Three Case Studies in Iowa". The programs operate on an IBM personal computer with 300K of memory; a fixed disk with at least 3 megabytes of storage is required, and the computer must be equipped with DOS version 3.0. The programs are written in Fortran. The user's manual describes all data requirements, including network preparation, trip information, and costs for maintenance, reconstruction, etc. Program operation instructions are presented, along with sample solution output and a listing of the computer programs.