1000 results for Software for library automation
Abstract:
Background: The TID ratio indirectly reflects myocardial ischemia and is correlated with cardiac prognosis. We aimed to compare the influence of three different software packages on the assessment of TID using Rb-82 cardiac PET/CT. Methods: In total, data from 30 patients were used, selected for normal myocardial perfusion (SSS<3 and SRS<3) and stress myocardial blood flow (>2 mL/min/g) assessed by Rb-82 cardiac PET/CT. After reconstruction using 2D OSEM (2 iterations, 28 subsets) and 3-D filtering (Butterworth, order=10, ωc=0.5), data were processed automatically and then manually, defining identical basal and apical limits on both stress and rest images. TID ratios were determined with the Myometrix®, ECToolbox®, and QGS® software packages. Comparisons used ANOVA, Student t-tests, and the Lin concordance test (ρc). Results: All 90 processings were successfully performed. TID ratios were not statistically different between software packages when data were processed automatically (P=0.2) or manually (P=0.17). There was a slight but significant relative overestimation of TID with automatic processing in comparison to manual processing using ECToolbox® (1.07 ± 0.13 vs 1.0 ± 0.13, P=0.001) and Myometrix® (1.07 ± 0.15 vs 1.01 ± 0.11, P=0.003), but not using QGS® (1.02 ± 0.12 vs 1.05 ± 0.11, P=0.16). The best concordance was achieved between ECToolbox® and Myometrix® manual processing (ρc=0.67). Conclusion: Whether in automatic or manual mode, TID estimation was not significantly influenced by software type. Using Myometrix® or ECToolbox®, TID was significantly different between automatic and manual processing, but not using QGS®. The software package should be accounted for when defining TID normal reference limits, as well as in multicenter studies. QGS® seemed to be the most operator-independent software package, while ECToolbox® and Myometrix® produced the closest results.
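The concordance measure cited above (ρc) is Lin's concordance correlation coefficient. As a point of reference, here is a minimal sketch of how it can be computed from paired measurements; the TID values in the example are made up for illustration and are not the study's data.

```python
import numpy as np

def lin_ccc(x, y):
    """Lin's concordance correlation coefficient between paired measurements."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()            # population (biased) variances
    cov = ((x - mx) * (y - my)).mean()   # population covariance
    return 2 * cov / (vx + vy + (mx - my) ** 2)

# Hypothetical paired TID ratios from two software packages (illustrative only)
tid_a = [1.05, 0.98, 1.10, 1.02, 0.95, 1.08]
tid_b = [1.03, 1.00, 1.12, 1.01, 0.97, 1.05]
print(f"rho_c = {lin_ccc(tid_a, tid_b):.3f}")
```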
Abstract:
In the search for high efficiency in root studies, computational systems have been developed to analyze digital images. ImageJ and Safira are public-domain systems that may be used for image analysis of washed roots. However, differences are expected between root properties measured with ImageJ and with Safira. This study compared values of root length and surface area obtained with these public-domain systems against values obtained by a reference method. Root samples were collected in a banana plantation in an area of a shallower Typic Carbonatic Haplic Cambisol (CXk) and an area of a deeper Typic Haplic Ta Eutrophic Cambisol (CXve), at six depths with five replications. Root images were digitized, and the ImageJ and Safira systems were used to determine root length and surface area. The line-intersect method modified by Tennant was used as the reference; values of root length and surface area measured with the different systems were analyzed by Pearson's correlation coefficient and compared by confidence intervals and t-tests. Both ImageJ and Safira showed positive correlation coefficients with the reference method for root length and surface area data in CXk and CXve. The correlation coefficient ranged from 0.54 to 0.80, with the lowest value observed for ImageJ in the measurement of the surface area of roots sampled in CXve. The 95% confidence interval revealed that root length measurements with Safira did not differ from those with the reference method in CXk (-77.3 to 244.0 mm). Regarding surface area measurements, Safira did not differ from the reference method for samples collected in CXk (-530.6 to 565.8 mm²) or in CXve (-4231 to 612.1 mm²). However, measurements with ImageJ differed from those obtained by the reference method, underestimating length and surface area in samples collected in CXk and CXve. Both ImageJ and Safira allow the identification of increases or decreases in root length and surface area. However, the Safira results for root length and surface area are closer to those obtained with the reference method.
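The statistical comparison described above (Pearson correlation against the reference method, plus a confidence interval and t-test on the differences) can be sketched as follows. The measurements are placeholder numbers, not the study's data.

```python
import numpy as np
from scipy import stats

# Hypothetical root-length measurements (mm) from one system and the reference method
system = np.array([120.0, 98.5, 143.2, 110.7, 87.9, 131.4])
reference = np.array([118.3, 101.2, 140.8, 115.0, 90.1, 128.6])

r, _ = stats.pearsonr(system, reference)     # agreement in trend
t, p = stats.ttest_rel(system, reference)    # paired t-test on the differences
diff = system - reference
ci = stats.t.interval(0.95, len(diff) - 1,
                      loc=diff.mean(),
                      scale=stats.sem(diff))  # 95% CI of the mean difference

print(f"Pearson r = {r:.2f}, paired t-test p = {p:.3f}, 95% CI of difference = {ci}")
```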
Abstract:
The aim of this study was to determine the effect of using video analysis software on the interrater reliability of visual assessments of gait videos in children with cerebral palsy. Two clinicians viewed the same random selection of 20 sagittal and frontal video recordings of 12 children with cerebral palsy routinely acquired during outpatient rehabilitation clinics. Both observers rated these videos in a random sequence for each lower limb using the Observational Gait Scale, once with standard video software and once with video analysis software (Dartfish®), which can perform angle and timing measurements. The video analysis software improved interrater agreement, measured by weighted Cohen's kappa, for the total score (κ 0.778→0.809) and all of the items that required angle and/or timing measurements (knee position at mid-stance κ 0.344→0.591; hindfoot position at mid-stance κ 0.160→0.346; foot contact at mid-stance κ 0.700→0.854; timing of heel rise κ 0.769→0.835). The use of video analysis software is an efficient approach to improving the reliability of visual video assessments.
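Weighted Cohen's kappa, the agreement statistic used above, is available in scikit-learn. Below is a minimal sketch with made-up ratings; the scores are not taken from the study.

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical Observational Gait Scale item scores from two raters (illustrative only)
rater_1 = [2, 3, 1, 0, 2, 3, 1, 2, 0, 3]
rater_2 = [2, 2, 1, 0, 3, 3, 1, 2, 1, 3]

# Linear weights penalize disagreements in proportion to their distance on the scale
kappa_w = cohen_kappa_score(rater_1, rater_2, weights="linear")
print(f"weighted kappa = {kappa_w:.3f}")
```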
Abstract:
Introduction: Therapeutic drug monitoring (TDM) aims at optimizing treatment by individualizing the dosage regimen based on measurement of blood concentrations. Maintaining concentrations within a target range requires pharmacokinetic and clinical capabilities. Bayesian calculation represents the gold standard in the TDM approach but requires computing assistance. In recent decades, computer programs have been developed to assist clinicians in this task. The aim of this benchmarking was to assess and compare computer tools designed to support TDM clinical activities. Method: A literature and Internet search was performed to identify software. All programs were tested on a common personal computer. Each program was scored against a standardized grid covering pharmacokinetic relevance, user-friendliness, computing aspects, interfacing, and storage. A weighting factor was applied to each criterion of the grid to reflect its relative importance. To assess the robustness of the software, six representative clinical vignettes were also processed through all of them. Results: 12 software tools were identified, tested, and ranked, representing a comprehensive review of the available software's characteristics. The number of drugs handled varies widely, and 8 programs offer the user the ability to add their own drug models. 10 computer programs are able to compute Bayesian dosage adaptation based on a blood concentration (a posteriori adjustment), while 9 are also able to suggest an a priori dosage regimen (prior to any blood concentration measurement) based on individual patient covariates, such as age, gender, and weight. Among those applying Bayesian analysis, one uses the non-parametric approach. The top 2 software tools emerging from this benchmark are MwPharm and TCIWorks. The other programs evaluated also have good potential but are less sophisticated (e.g., in terms of storage or report generation) or less user-friendly. Conclusion: Whereas 2 integrated programs are at the top of the ranked list, such complex tools would possibly not fit all institutions, and each software tool must be regarded with respect to the individual needs of hospitals or clinicians. Interest in computing tools to support therapeutic monitoring is still growing. Although developers have put effort into them in recent years, there is still room for improvement, especially in terms of institutional information system interfacing, user-friendliness, capacity of data storage, and report generation.
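The standardized grid with weighting factors described above amounts to a weighted sum of criterion scores. The sketch below illustrates the idea with hypothetical weights and scores; the actual grid and scale used in the benchmark are not reproduced here.

```python
# Hypothetical criteria, weights, and scores for one software tool (illustrative only)
weights = {
    "pharmacokinetic relevance": 0.35,
    "user-friendliness": 0.25,
    "computing aspects": 0.20,
    "interfacing": 0.10,
    "storage": 0.10,
}
scores = {  # each criterion scored 0-10 by the evaluator (assumed scale)
    "pharmacokinetic relevance": 8,
    "user-friendliness": 6,
    "computing aspects": 7,
    "interfacing": 4,
    "storage": 5,
}

weighted_total = sum(weights[c] * scores[c] for c in weights)
print(f"weighted score = {weighted_total:.2f} / 10")
```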
Abstract:
Objectives: Therapeutic drug monitoring (TDM) aims at optimizing treatment by individualizing the dosage regimen based on blood concentration measurements. Maintaining concentrations within a target range requires pharmacokinetic (PK) and clinical capabilities. Bayesian calculation represents the gold standard in the TDM approach but requires computing assistance. The aim of this benchmarking was to assess and compare computer tools designed to support TDM clinical activities. Methods: The literature and the Internet were searched to identify software. Each program was scored against a standardized grid covering pharmacokinetic relevance, user-friendliness, computing aspects, interfacing, and storage. A weighting factor was applied to each criterion of the grid to reflect its relative importance. To assess the robustness of the software, six representative clinical vignettes were also processed through all of them. Results: 12 software tools were identified, tested, and ranked, representing a comprehensive review of the available software characteristics. The number of drugs handled varies from 2 to more than 180, and integration of different population types is available for some programs. Nevertheless, 8 programs offer the ability to add new drug models based on population PK data. 10 computer tools incorporate Bayesian computation to predict the dosage regimen (individual parameters are calculated based on population PK models). All of them are able to compute Bayesian a posteriori dosage adaptation based on a blood concentration, while 9 are also able to suggest an a priori dosage regimen based only on individual patient covariates. Among those applying Bayesian analysis, MM-USC*PACK uses a non-parametric approach. The top 2 programs emerging from this benchmark are MwPharm and TCIWorks. The other programs evaluated also have good potential but are less sophisticated or less user-friendly. Conclusions: Whereas 2 software packages are ranked at the top of the list, such complex tools would possibly not fit all institutions, and each program must be regarded with respect to the individual needs of hospitals or clinicians. Programs should be easy and fast for routine activities, including for non-experienced users. Although interest in TDM tools is growing and effort has been put into them in recent years, there is still room for improvement, especially in terms of institutional information system interfacing, user-friendliness, capacity of data storage, and automated report generation.
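The Bayesian a posteriori adjustment mentioned above fits individual PK parameters by combining a population prior with an observed concentration. Below is a minimal MAP-estimation sketch for a one-compartment IV-bolus model with log-normal priors; all parameter values, the dose, and the observation are assumptions for illustration, not taken from any of the evaluated programs.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical population priors (log-normal): clearance CL (L/h) and volume V (L)
CL_POP, V_POP = 5.0, 40.0          # typical population values (assumed)
OMEGA_CL, OMEGA_V = 0.3, 0.2       # between-subject SD on the log scale (assumed)
SIGMA = 0.5                        # residual SD of measured concentration, mg/L (assumed)

dose = 500.0                       # mg, IV bolus (assumed)
t_obs, c_obs = 8.0, 6.2            # one measured concentration (mg/L) at 8 h (assumed)

def predict(cl, v, t):
    """One-compartment IV-bolus concentration at time t."""
    return dose / v * np.exp(-cl / v * t)

def neg_log_posterior(log_params):
    log_cl, log_v = log_params
    cl, v = np.exp(log_cl), np.exp(log_v)
    # Prior: penalize deviation from the population typical values
    prior = ((log_cl - np.log(CL_POP)) / OMEGA_CL) ** 2 + \
            ((log_v - np.log(V_POP)) / OMEGA_V) ** 2
    # Likelihood: penalize misfit to the observed concentration
    likelihood = ((c_obs - predict(cl, v, t_obs)) / SIGMA) ** 2
    return 0.5 * (prior + likelihood)

res = minimize(neg_log_posterior, x0=[np.log(CL_POP), np.log(V_POP)])
cl_map, v_map = np.exp(res.x)
print(f"MAP estimates: CL = {cl_map:.2f} L/h, V = {v_map:.1f} L")
```

The MAP estimates of CL and V can then be used to simulate candidate dosage regimens and pick one that keeps predicted concentrations within the target range.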
Abstract:
A personal selection of 100 international and local digitization projects currently in operation. The choices were made according to the interest of the works included and, in other cases, according to the services, tools, and applications offered. Most of the resources aim to offer digital copies of books, but other document types are also included: periodicals, theses, grey literature, photographs, engravings, sound recordings, and so on. Projects run by national and university libraries, the most representative commercial projects, and collaborative collections are well represented. One category worth highlighting is that of platforms that seek to recover and disseminate a culture or a language. The list preferentially includes digital library projects containing public-domain works that can be downloaded and printed, although paid materials are also included. The selection is ordered alphabetically.
Abstract:
This study seeks to provide an overview of the bibliographic holdings, in terms of journals specializing in ecology, of some of the main Brazilian universities. The survey was based on a basic topic that has received great emphasis in the international literature over recent years: secondary production. All available search tools were used, whether conventional (archives, card catalogs, microfilms) or recently introduced, such as CD-ROM databases. Of the ten university libraries selected, only the Universidade de São Paulo (USP) offers a collection that satisfactorily covers most of the selected references. A second group of universities, composed of the Universidade de Campinas (UNICAMP), the Universidade Estadual Paulista (UNESP), and the Universidade Federal do Rio de Janeiro (UFRJ), showed coverage percentages ranging from 40 to 60%. Most of the other university libraries, however, remained at a level comparable to that of the Universidade Federal de Minas Gerais (UFMG) or the Universidade de Brasília (UNB), that is, with coverage percentages below 40%. It is also noteworthy that many of the selected universities, even those with graduate programs in ecology, showed, taken as a whole, data that unequivocally demonstrate the poverty of their holdings in the field of ecology.
Abstract:
The new information technologies are creating "libraries without walls for books without pages." Better known as virtual libraries, these new forms and media are redefining current paradigms of information, communication, and the very scope of work of professionals in the field. Interdisciplinarity and interactivity become the new watchwords. As we advance into the so-called Information Age, this transition raises the need to rethink the ethical, legal, aesthetic, cultural, professional, and other models established by the printed medium. Taking place both off- and online, the so-called Information Revolution relies on a wide range of applications and equipment to become operational. Collection versus access, local versus remote users, hierarchical or hypertextual indexing, print and distribute or distribute and print, navigate the ocean of information or drown in it? This article discusses these questions from a diachronic and interdisciplinary perspective, contributing to the debate and reflection raised by the convergence of media toward the digital medium.
Abstract:
This article summarizes the basic guidelines of a policy for implementing virtual libraries in Brazil, which should be connected to the Internet in order to make information sources available in accordance with national and international standards and modern technologies. One priority would be to connect Brazilian libraries to the Internet and to improve the training of information professionals, in order to update knowledge in this area, modernize the mechanisms for publishing and disseminating information, and preserve the national memory.
Abstract:
The virtual library in Brazil is an alternative for accessing information through the Internet at the National Library and at specialized, university, public, and school libraries. A preliminary survey conducted by IBICT shows the distribution of virtual libraries by state and their presence on the Internet. The main products and services available are access to link directories, digitized works, catalogs, and institutional information. Interconnecting the various libraries minimizes effort and avoids duplication of work.
Abstract:
The Office of Special Investigations at the Iowa Department of Transportation (DOT) collects falling weight deflectometer (FWD) data on a regular basis to evaluate pavement structural conditions. The primary objective of this study was to develop a fully automated software system for rapid processing of the FWD data, along with a user manual. The software system automatically reads the raw FWD data collected by the JILS-20 type FWD machine that the Iowa DOT owns, then processes and analyzes the collected data with the rapid prediction algorithms developed during the phase I study. This system smoothly integrates the FWD data analysis algorithms with the computer program used to collect the pavement deflection data. It can be used to assess pavement condition, estimate remaining pavement life, and eventually help assess pavement rehabilitation strategies by the Iowa DOT pavement management team. This report describes the developed software in detail and can also be used as a user manual for conducting simulation studies and detailed analyses.
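As a rough illustration of the kind of processing such a system automates, the sketch below computes two common deflection-basin indices from a single FWD drop. The sensor offsets, deflection values, and choice of indices are generic assumptions, not the algorithms of the Iowa DOT system described in the report.

```python
# Hypothetical FWD drop: deflections (microns) keyed by sensor offset from the
# load plate (mm); all values are illustrative only
deflections = {0: 350.0, 300: 280.0, 600: 190.0, 900: 120.0}

# Surface Curvature Index: difference between the center and 300 mm deflections,
# often used as a rough indicator of upper-layer (surface/base) condition
sci = deflections[0] - deflections[300]

# Base Damage Index: difference between the 300 mm and 600 mm deflections
bdi = deflections[300] - deflections[600]

print(f"SCI = {sci:.0f} microns, BDI = {bdi:.0f} microns")
```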
Abstract:
Software agents in the era of global networks are a vital tool for overcoming the phenomenon known as "information overload." The degree of maturity this technology has reached means that concrete applications can now be seen running in organizations as well as on the home user's desktop. The aim of this work is to present a literature review of software agent technology, focusing on the models that make it possible to manage information overload.