1000 results for SOFTWARE MEASUREMENT
Abstract:
The aim of this study was to determine the effect of using video analysis software on the interrater reliability of visual assessments of gait videos in children with cerebral palsy. Two clinicians viewed the same random selection of 20 sagittal and frontal video recordings of 12 children with cerebral palsy routinely acquired during outpatient rehabilitation clinics. Both observers rated these videos in a random sequence for each lower limb using the Observational Gait Scale, once with standard video software and once with video analysis software (Dartfish®), which can perform angle and timing measurements. The video analysis software improved interrater agreement, measured by weighted Cohen's kappas, for the total score (κ 0.778→0.809) and for all items that required angle and/or timing measurements (knee position mid-stance κ 0.344→0.591; hindfoot position mid-stance κ 0.160→0.346; foot contact mid-stance κ 0.700→0.854; timing of heel rise κ 0.769→0.835). The use of video analysis software is an efficient approach to improving the reliability of visual video assessments.
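A minimal sketch of the agreement statistic reported above, weighted Cohen's kappa between two raters' ordinal scores (here via scikit-learn); the rating vectors are invented for illustration and are not the study's data.

```python
# Weighted Cohen's kappa between two raters' ordinal item scores.
from sklearn.metrics import cohen_kappa_score

rater_a = [2, 3, 1, 2, 0, 3, 2, 1, 2, 3]   # hypothetical ordinal item scores
rater_b = [2, 2, 1, 2, 1, 3, 2, 1, 3, 3]

# Linear weighting penalizes disagreements in proportion to their distance on the scale
kappa = cohen_kappa_score(rater_a, rater_b, weights="linear")
print(f"linearly weighted kappa = {kappa:.3f}")
```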
Abstract:
Introduction: Therapeutic drug monitoring (TDM) aims at optimizing treatment by individualizing the dosage regimen based on measured blood concentrations. Maintaining concentrations within a target range requires pharmacokinetic and clinical expertise. Bayesian calculation represents the gold-standard TDM approach but requires computing assistance; over the last decades, computer programs have been developed to assist clinicians in this task. The aim of this benchmarking was to assess and compare computer tools designed to support TDM clinical activities. Method: A literature and Internet search was performed to identify software. All programs were tested on a common personal computer. Each program was scored against a standardized grid covering pharmacokinetic relevance, user-friendliness, computing aspects, interfacing, and storage. A weighting factor was applied to each criterion of the grid to reflect its relative importance. To assess the robustness of the software, six representative clinical vignettes were also processed through all of them. Results: 12 software tools were identified, tested, and ranked, providing a comprehensive review of the available software's characteristics. The number of drugs handled varies widely, and 8 programs allow users to add their own drug models. 10 programs can compute Bayesian dosage adaptation based on a blood concentration (a posteriori adjustment), while 9 can also suggest an a priori dosage regimen (prior to any blood concentration measurement) based on individual patient covariates such as age, gender, and weight. Among those applying Bayesian analysis, one uses a non-parametric approach. The top 2 programs emerging from this benchmark are MwPharm and TCIWorks. The other programs evaluated also have good potential but are less sophisticated (e.g., in terms of storage or report generation) or less user-friendly. Conclusion: Whereas 2 integrated programs are at the top of the ranked list, such complex tools may not fit all institutions, and each software tool must be considered with respect to the individual needs of hospitals or clinicians. Interest in computing tools to support therapeutic monitoring is still growing. Although developers have put effort into them over the last years, there is still room for improvement, especially in terms of institutional information system interfacing, user-friendliness, data storage capacity, and report generation.
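A minimal sketch of how such a weighted scoring grid can be computed; the criterion weights and the per-program scores below are invented placeholders, not the benchmark's actual grid or results.

```python
# Weighted scoring grid: each criterion score is multiplied by its weight and summed per program.
weights = {"pharmacokinetic relevance": 0.35, "user-friendliness": 0.25,
           "computing aspects": 0.20, "interfacing": 0.10, "storage": 0.10}

scores = {  # hypothetical 0-4 rating per criterion for two fictitious programs
    "ProgramA": {"pharmacokinetic relevance": 4, "user-friendliness": 3,
                 "computing aspects": 3, "interfacing": 2, "storage": 3},
    "ProgramB": {"pharmacokinetic relevance": 3, "user-friendliness": 4,
                 "computing aspects": 2, "interfacing": 3, "storage": 2},
}

for program, s in scores.items():
    total = sum(weights[c] * s[c] for c in weights)
    print(f"{program}: weighted score = {total:.2f}")
```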
Abstract:
Objectives: Therapeutic drug monitoring (TDM) aims at optimizing treatment by individualizing the dosage regimen based on blood concentration measurements. Maintaining concentrations within a target range requires pharmacokinetic (PK) and clinical expertise. Bayesian calculation represents the gold-standard TDM approach but requires computing assistance. The aim of this benchmarking was to assess and compare computer tools designed to support TDM clinical activities. Methods: Literature and Internet searches were performed to identify software. Each program was scored against a standardized grid covering pharmacokinetic relevance, user-friendliness, computing aspects, interfacing, and storage. A weighting factor was applied to each criterion of the grid to reflect its relative importance. To assess the robustness of the software, six representative clinical vignettes were also processed through all of them. Results: 12 software tools were identified, tested, and ranked, providing a comprehensive review of the available software's characteristics. The number of drugs handled varies from 2 to more than 180, and some programs support different population types. In addition, 8 programs allow new drug models to be added based on population PK data. 10 tools incorporate Bayesian computation to predict the dosage regimen (individual parameters are calculated from population PK models). All of them can compute Bayesian a posteriori dosage adaptation based on a blood concentration, while 9 can also suggest an a priori dosage regimen based only on individual patient covariates. Among those applying Bayesian analysis, MM-USC*PACK uses a non-parametric approach. The top 2 programs emerging from this benchmark are MwPharm and TCIWorks. The other programs evaluated also have good potential but are less sophisticated or less user-friendly. Conclusions: Whereas 2 software packages are ranked at the top of the list, such complex tools may not fit all institutions, and each program must be considered with respect to the individual needs of hospitals or clinicians. Programs should be easy and fast to use for routine activities, including by non-experienced users. Although interest in TDM tools is growing and effort has been put into them in recent years, there is still room for improvement, especially in terms of institutional information system interfacing, user-friendliness, data storage capability, and automated report generation.
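A minimal sketch of the Bayesian a posteriori step such tools automate: maximum a posteriori (MAP) estimation of an individual's PK parameters from population priors and one measured concentration, followed by a dose suggestion. The one-compartment IV bolus model, priors, observation, and target below are illustrative assumptions, not values or methods from any benchmarked program.

```python
# MAP-based dose individualization for a hypothetical one-compartment IV bolus drug.
import numpy as np
from scipy.optimize import minimize

# Hypothetical population priors (log-normal): clearance CL (L/h) and volume V (L)
POP = {"CL": (5.0, 0.3), "V": (40.0, 0.2)}   # (typical value, SD of log)
SIGMA = 0.5                                   # assumed residual error SD (mg/L)
DOSE, TOBS, COBS = 500.0, 12.0, 6.2           # dose (mg), sampling time (h), measured level (mg/L)

def conc(cl, v, dose, t):
    """Predicted concentration for a one-compartment IV bolus model."""
    ke = cl / v
    return dose / v * np.exp(-ke * t)

def neg_log_posterior(log_params):
    cl, v = np.exp(log_params)
    pred = conc(cl, v, DOSE, TOBS)
    # Likelihood (residual) term plus log-normal prior penalties
    nll = 0.5 * ((COBS - pred) / SIGMA) ** 2
    for x, (tv, om) in zip(log_params, [POP["CL"], POP["V"]]):
        nll += 0.5 * ((x - np.log(tv)) / om) ** 2
    return nll

# MAP estimate of the individual's parameters given the single observation
fit = minimize(neg_log_posterior, x0=np.log([POP["CL"][0], POP["V"][0]]))
cl_i, v_i = np.exp(fit.x)

# Suggest a dose that reaches an assumed target concentration 12 h after the next bolus
TARGET = 8.0  # mg/L, illustrative
new_dose = TARGET * v_i / np.exp(-(cl_i / v_i) * 12.0)
print(f"Individual CL={cl_i:.2f} L/h, V={v_i:.1f} L, suggested dose={new_dose:.0f} mg")
```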
Abstract:
The Office of Special Investigations at the Iowa Department of Transportation (DOT) collects falling weight deflectometer (FWD) data on a regular basis to evaluate pavement structural conditions. The primary objective of this study was to develop a fully automated software system for rapid processing of the FWD data, along with a user manual. The software system automatically reads the raw data collected by the JILS-20 type FWD machine that Iowa DOT owns, then processes and analyzes the collected data with the rapid prediction algorithms developed during the Phase I study. The system smoothly integrates the FWD data analysis algorithms with the computer program used to collect the pavement deflection data. It can be used to assess pavement condition, estimate remaining pavement life, and ultimately help the Iowa DOT pavement management team assess pavement rehabilitation strategies. This report describes the developed software in detail and can also serve as a user manual for conducting simulation studies and detailed analyses.
Abstract:
Software agents in the era of global networks are a vital tool for overcoming the phenomenon known as "information overload". The maturity this technology has reached means that concrete applications can now be seen running in organizations as well as on the home user's desktop. The aim of this work is to present a literature review of software agent technology, focused on models that help manage information overload.
Abstract:
Presents a method for evaluating and selecting library automation software. It consists of assigning criteria and applying statistical calculations to a checklist designed for the selection and evaluation of this type of software. The method is intended to support decision-making when choosing the software best suited to each institution's needs. The work was motivated by a request from the Instituto Brasileiro de Informação em Ciência e Tecnologia (IBICT) to automate its library.
Abstract:
This paper examines the statistical analysis of social reciprocity at the group, dyadic, and individual levels. Since testing statistical hypotheses regarding social reciprocity can also be of interest, a statistical procedure based on Monte Carlo sampling has been developed and implemented in R to allow social researchers to describe groups and make statistical decisions.
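A minimal sketch of the general Monte Carlo (randomization) testing idea applied to a directed interaction matrix, shown here in Python; the reciprocity index and the row-wise permutation null are illustrative choices, not the authors' exact R procedure.

```python
# Monte Carlo test for dyadic reciprocity in a directed interaction matrix.
import numpy as np

rng = np.random.default_rng(0)
X = rng.poisson(2.0, size=(6, 6)).astype(float)   # hypothetical group of 6 individuals
np.fill_diagonal(X, 0)

def reciprocity(m):
    """Correlation between m[i, j] and m[j, i] over all i != j dyads."""
    iu = np.triu_indices_from(m, k=1)
    return np.corrcoef(m[iu], m.T[iu])[0, 1]

obs = reciprocity(X)

# Null distribution: shuffle, within each actor's row, who its behaviour is directed to
n_sim, null = 5000, []
for _ in range(n_sim):
    perm = X.copy()
    for i in range(perm.shape[0]):
        idx = [j for j in range(perm.shape[1]) if j != i]
        perm[i, idx] = rng.permutation(perm[i, idx])
    null.append(reciprocity(perm))

p = (1 + sum(v >= obs for v in null)) / (n_sim + 1)
print(f"observed reciprocity = {obs:.3f}, one-sided p = {p:.3f}")
```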
Abstract:
The book presents the state of the art in machine learning algorithms (artificial neural networks of different architectures, support vector machines, etc.) as applied to the classification and mapping of spatially distributed environmental data. Basic geostatistical algorithms are presented as well. New trends in machine learning and their application to spatial data are discussed, and real case studies based on environmental and pollution data are carried out. The book provides a CD-ROM with the Machine Learning Office software, including sample data sets, that will allow both students and researchers to put the concepts into practice rapidly.
Abstract:
The height of the vertical jump is an indicator of lower-body strength and power. The technological tools available to measure the vertical jump are black boxes that are not open to third-party verification or adaptation. We propose the creation of a measurement system called Chronojump-Boscosystem, consisting of open hardware and free software. Methods: A microcontroller was created and validated using a square wave generator and an oscilloscope. Two types of contact platforms were developed using different materials. These platforms were validated by measuring, with a strain gauge, the minimum pressure required for activation at different points, and by comparing the on/off times of our platforms with those of the Ergojump-Boscosystem platform in a sample of 8 subjects performing submaximal jumps with one foot on each platform. Agile methodologies were used to develop and validate the software. Results: All the tools fall under free software / open hardware guidelines and are, in that sense, free. The microcontroller's margin of error is 0.1%. The validity of the fiberglass platform is 0.95 (ICC). The management software contains nearly 113,000 lines of code and is available in 7 languages.
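Contact platforms of this kind derive jump height from the measured flight time via projectile motion, h = g·t²/8. A minimal sketch of that calculation (the example flight time is illustrative):

```python
# Jump height from flight time measured by a contact platform.
G = 9.81  # gravitational acceleration, m/s^2

def jump_height(flight_time_s: float) -> float:
    """Return vertical jump height in metres from platform flight time (h = g * t^2 / 8)."""
    return G * flight_time_s ** 2 / 8.0

print(f"{jump_height(0.50) * 100:.1f} cm")  # 0.50 s of flight ≈ 30.7 cm
```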
Abstract:
Translator training involves procedures and tools that allow students to become familiar with professional contexts. Specialized free software includes professional-quality tools and procedures that are accessible to academic institutions and to distance students working from home. Real projects that use free software and collaborative translation (crowdsourcing) are indispensable resources in translator training.
Abstract:
Driven by the need to differentiate themselves and face competition, companies have committed to developing operations that deliver value to the customer, which is why many of them have seen lean tools as an opportunity to improve their operations. This improvement involves reducing money, people, large equipment, inventory, and space, with two goals: eliminating waste and reducing variability. To achieve a company's strategic objectives, those objectives must be aligned with middle-management plans and, in turn, with the work carried out by employees, ensuring that every person is aligned in the same direction at the same time. This is the philosophy of strategic planning. One objective of this project is therefore to develop a tool that makes it easier to set out the company's objectives and communicate them to every level of the organization. Starting from those objectives, and taking the need to reduce inventory in the supply chain as a reference, a study of the production of a wind-turbine control component will be carried out in order to level production and reduce its finished-goods inventory. The specific targets in this part are to reduce inventory by 28%, level production by reducing variability from 31% to 24%, keep a maximum stock of 24 units while guaranteeing supply under variable demand, increase inventory turnover by 10%, and establish an action plan to cut lead time by 40-50%. All of this will be made possible by mapping the current- and future-state value streams to eliminate waste and create continuous flow, and by calculating a supermarket that keeps stock at an optimal level.
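A minimal sketch of a supermarket (kanban) sizing calculation of the kind mentioned above, using the standard kanban formula; the demand, replenishment lead time, safety factor, and container size figures are illustrative assumptions, not the project's actual data.

```python
# Sizing a finished-goods supermarket: kanbans = demand * lead time * (1 + safety) / container size.
import math

avg_daily_demand = 4          # units/day (assumed)
replenishment_lead_time = 3   # days to replenish the supermarket (assumed)
safety_factor = 0.15          # buffer for demand variability (assumed)
container_size = 1            # units per kanban/container (assumed)

kanbans = math.ceil(avg_daily_demand * replenishment_lead_time
                    * (1 + safety_factor) / container_size)
max_stock = kanbans * container_size
print(f"Kanbans: {kanbans}, maximum supermarket stock: {max_stock} units")
```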