942 results for software, translation, validation tool, VMNET, Wikipedia, XML
Abstract:
The Department of Electronics and Telecommunications at the University of Vic has designed a set of trainer boards for educational purposes. For students to be able to use these boards as a study tool, an inexpensive and convenient programming system is needed. Most programmer devices, in this case, do not meet these requirements. The goal of this project is to design a programming system that uses serial communication and requires no specific hardware or software. In this way we obtain a standalone board and a free programmer that is quick to set up and simple to use. The programming system is divided into three blocks. First, a program we call the "programmer" is responsible for transferring program code from the computer to the microcontroller on the trainer board. Second, a program called the "bootloader", residing on the microcontroller, receives this program code and stores it at the corresponding program-memory addresses. As a third block, a communication protocol and an error-control mechanism are implemented to ensure correct communication between the "programmer" and the "bootloader". The objectives of this project have been met and, in the tests performed, the programming system worked correctly.
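As a minimal sketch of what the host-side "programmer" could look like, here is a Python example using the pyserial package. The frame format (0x7E start byte, 16-bit address, 16-byte payload, XOR checksum, ACK/NAK handshake) is invented for illustration; the project's actual protocol is not specified in the abstract.

```python
# Sketch of a host-side "programmer" that streams program code to a
# bootloader over a serial link. The frame layout and ACK/NAK codes
# below are hypothetical, not the project's actual protocol.
import serial  # pyserial

ACK, NAK = 0x06, 0x15

def make_frame(address: int, payload: bytes) -> bytes:
    body = address.to_bytes(2, "big") + bytes([len(payload)]) + payload
    checksum = 0
    for b in body:
        checksum ^= b                      # simple XOR error check
    return bytes([0x7E]) + body + bytes([checksum])

def program(port: str, firmware: bytes, base_addr: int = 0x0000) -> None:
    with serial.Serial(port, baudrate=9600, timeout=2) as link:
        for offset in range(0, len(firmware), 16):
            chunk = firmware[offset:offset + 16]
            frame = make_frame(base_addr + offset, chunk)
            for attempt in range(3):       # resend on NAK or timeout
                link.write(frame)
                reply = link.read(1)
                if reply and reply[0] == ACK:
                    break
            else:
                raise IOError(f"no ACK for block at {base_addr + offset:#06x}")

# program("/dev/ttyUSB0", open("app.bin", "rb").read())
```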
Abstract:
The proposal to work on this final project came after several discussions held with Dr. Elzbieta Malinowski Gadja, who in 2008 published the book entitled Advanced Data Warehouse Design: From Conventional to Spatial and Temporal Applications (Data-Centric Systems and Applications). The project was carried out under the technical supervision of Dr. Malinowski, and the direct beneficiary was the University of Costa Rica (UCR), where Dr. Malinowski is a professor at the Department of Computer Science and Informatics. The purpose of this project was twofold: first, to translate chapter III of said book with the intention of generating educational material for the use of the UCR and, second, to venture into the field of technical translation related to data warehouses. For the first component, the goal was to generate a final product that would eventually serve as an educational tool for the post-graduate courses of the UCR. For the second component, this project allowed me to acquire new skills and put into practice techniques that have helped me not only to perform better in my current job as an Assistant Translator at the Inter-American Development Bank (IDB), but also to use them in similar projects. The process was lengthy and required thorough research and constant communication with the author. The research focused on the search for terms and definitions to prepare the glossary, which was the basis for starting the translation project. The translation process itself was carried out in phases, so that comments and corrections by the author could be taken into account in subsequent stages. Later, based on the glossary and the translated text, the illustrations that had been created in the Visio software were translated. In addition to the technical revision by the author, professor Carme Mangiron was in charge of revising the non-technical text. The result was a high-quality document that is currently used as reference and study material by the Department of Computer Science and Informatics of the University of Costa Rica.
Abstract:
Extensible Markup Language (XML) is a generic computing language that provides an outstanding case study of the commodification of service standards. The development of this language in the late 1990s marked a shift in computer science, as its extensibility allows any kind of data to be stored and shared. Many office software suites rely on it. The chapter highlights how the largest multinational firms pay special attention to gaining a recognised international standard for such a major technological innovation. It argues that standardisation processes affect market structures and can lead to market capture. By examining how the strategic use of standardisation arenas can generate profits, it shows that Microsoft succeeded in making its own technical solution a recognised ISO standard in 2008, even though the same arena had already adopted, two years earlier, the open-source standard set by IBM and Sun Microsystems. Yet XML standardisation also helped to establish a distinct model of information technology services at the expense of Microsoft's monopoly on proprietary software.
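To make the extensibility point concrete: because XML lets authors define their own tags, one generic parser can read arbitrary, self-describing data. A minimal sketch using Python's standard xml.etree.ElementTree (the element vocabulary is invented; the years echo the two ISO adoptions mentioned above):

```python
# Any domain can define its own vocabulary; a generic XML parser reads it all.
import xml.etree.ElementTree as ET

doc = """<catalog>
  <standard body="ISO" year="2006">OpenDocument</standard>
  <standard body="ISO" year="2008">Office Open XML</standard>
</catalog>"""

root = ET.fromstring(doc)
for std in root.findall("standard"):
    print(std.get("body"), std.get("year"), std.text)
```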
Abstract:
A recurring task in the analysis of mass genome annotation data from high-throughput technologies is the identification of peaks or clusters in a noisy signal profile. Examples of such applications are the definition of promoters on the basis of transcription start site profiles, the mapping of transcription factor binding sites based on ChIP-chip data and the identification of quantitative trait loci (QTL) from whole genome SNP profiles. Input to such an analysis is a set of genome coordinates associated with counts or intensities. The output consists of a discrete number of peaks with respective volumes, extensions and center positions. For this purpose, we have developed a flexible one-dimensional clustering tool, called MADAP, which we make available as a web server and as a standalone program. A set of parameters enables the user to customize the procedure to a specific problem. The web server, which returns results in textual and graphical form, is useful for small- to medium-scale applications, as well as for evaluation and parameter tuning in view of large-scale applications, which require a local installation. The program, written in C++, can be freely downloaded from ftp://ftp.epd.unil.ch/pub/software/unix/madap. The MADAP web server can be accessed at http://www.isrec.isb-sib.ch/madap/.
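As a rough illustration of the kind of one-dimensional clustering described (this is not MADAP's own algorithm, whose details are not given in the abstract): group coordinates with counts into clusters separated by gaps, then report each peak's volume, extension and count-weighted center.

```python
# Toy 1-D peak clustering: positions with counts are merged into clusters
# whenever neighbouring positions lie within `max_gap`; each cluster is
# summarized by its volume (total count), extension (span) and
# count-weighted center. Illustrative only, not the MADAP procedure.
def cluster_1d(positions, counts, max_gap=50):
    data = sorted(zip(positions, counts))
    clusters, current = [], [data[0]]
    for pos, cnt in data[1:]:
        if pos - current[-1][0] <= max_gap:
            current.append((pos, cnt))
        else:
            clusters.append(current)
            current = [(pos, cnt)]
    clusters.append(current)

    peaks = []
    for cl in clusters:
        volume = sum(c for _, c in cl)
        center = sum(p * c for p, c in cl) / volume
        extension = cl[-1][0] - cl[0][0]
        peaks.append({"center": center, "volume": volume, "extension": extension})
    return peaks

# peaks = cluster_1d([100, 120, 130, 900, 910], [5, 3, 8, 2, 4])
```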
Abstract:
Guilbert ER, Morin D, Guilbert AC, Gagnon H, Robitaille J, Richardson M. International Journal of Nursing Practice 2011; 17: 315-321. Task-shifting in the delivery of hormonal contraceptive methods: validation of a questionnaire and preliminary results. To alleviate the problem of access to effective contraceptive methods in Quebec, Canada, and to legitimize nurses' practice in family planning, a collaborative agreement was developed that allows nurses, in conjunction with pharmacists, to provide hormonal contraceptives to healthy women of reproductive age for a 6-month period. Training in hormonal contraception was offered to the targeted nurses before they could begin this practice. A questionnaire, based on Rogers's theory of the diffusion of innovations, was developed and validated specifically to evaluate this phenomenon. Preliminary results show that the translation of training into practice might be suboptimal. The validated questionnaire can now be used to fully understand the set of factors influencing this new practice.
Abstract:
BACKGROUND: A 70-gene signature was previously shown to have prognostic value in patients with node-negative breast cancer. Our goal was to validate the signature in an independent group of patients. METHODS: Patients (n = 307, with 137 events after a median follow-up of 13.6 years) from five European centers were divided into high- and low-risk groups based on the gene signature classification and on clinical risk classifications. Patients were assigned to the gene signature low-risk group if their 5-year distant metastasis-free survival probability as estimated by the gene signature was greater than 90%. Patients were assigned to the clinicopathologic low-risk group if their 10-year survival probability, as estimated by Adjuvant! software, was greater than 88% (for estrogen receptor [ER]-positive patients) or 92% (for ER-negative patients). Hazard ratios (HRs) were estimated to compare time to distant metastases, disease-free survival, and overall survival in high- versus low-risk groups. RESULTS: The 70-gene signature outperformed the clinicopathologic risk assessment in predicting all endpoints. For time to distant metastases, the gene signature yielded HR = 2.32 (95% confidence interval [CI] = 1.35 to 4.00) without adjustment for clinical risk and hazard ratios ranging from 2.13 to 2.15 after adjustment for various estimates of clinical risk; clinicopathologic risk using Adjuvant! software yielded an unadjusted HR = 1.68 (95% CI = 0.92 to 3.07). For overall survival, the gene signature yielded an unadjusted HR = 2.79 (95% CI = 1.60 to 4.87) and adjusted hazard ratios ranging from 2.63 to 2.89; clinicopathologic risk yielded an unadjusted HR = 1.67 (95% CI = 0.93 to 2.98). For patients in the gene signature high-risk group, 10-year overall survival was 0.69 for patients in both the low- and high-clinical risk groups; for patients in the gene signature low-risk group, the 10-year survival rates were 0.88 and 0.89, respectively. CONCLUSIONS: The 70-gene signature adds independent prognostic information to clinicopathologic risk assessment for patients with early breast cancer.
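The hazard ratios above come from survival models; as a hedged sketch of how such an unadjusted HR could be computed with the open-source lifelines package (column names and data are invented, and this is not the authors' code):

```python
# Cox proportional-hazards fit with a single binary covariate
# (1 = gene-signature high risk, 0 = low risk); exp(coef) is the
# unadjusted hazard ratio with its 95% CI. Toy data for illustration.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "time_to_dm": [2.1, 13.6, 5.4, 9.9, 1.2, 12.0],  # years
    "dm_event":   [1, 0, 1, 1, 0, 0],                 # 1 = distant metastasis
    "high_risk":  [1, 0, 1, 0, 1, 0],                 # gene-signature group
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time_to_dm", event_col="dm_event")
cph.print_summary()   # exp(coef) for `high_risk` is the hazard ratio
```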
Abstract:
OBJECTIVE: To evaluate an automated seizure detection (ASD) algorithm in EEGs with periodic and other challenging patterns. METHODS: Selected EEGs recorded in patients over 1 year old were classified into four groups: A. Periodic lateralized epileptiform discharges (PLEDs) with intermixed electrical seizures. B. PLEDs without seizures. C. Electrical seizures and no PLEDs. D. No PLEDs or seizures. Recordings were analyzed by the Persyst P12 software and compared to the raw EEG as interpreted by two experienced neurophysiologists; positive percent agreement (PPA) and false-positive rates per hour (FPR) were calculated. RESULTS: We assessed 98 recordings (Group A = 21 patients; B = 29; C = 17; D = 31). Total duration was 82.7 h (median: 1 h), containing 268 seizures. The software detected 204 (76.1%) of the seizures; all ictal events were captured in 29/38 (76.3%) patients, and in only 3 (7.7%) were no seizures detected. Median PPA was 100% (range 0-100; interquartile range 50-100), and the median FPR was 0/h (range 0-75.8; interquartile range 0-4.5); however, lower performance was seen in the groups containing periodic discharges. CONCLUSION: This analysis provides data regarding the yield of the ASD in a particularly difficult subset of EEG recordings, showing that periodic discharges may bias the results. SIGNIFICANCE: Ongoing refinements in this technique might enhance its utility and lead to more extensive application.
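For concreteness, PPA and FPR per hour can be computed from matched onset times as in the generic sketch below (invented matching tolerance; not the Persyst software's internals):

```python
# PPA = detected reference seizures / all reference seizures (per recording);
# FPR = detections matching no reference seizure, per hour of EEG.
def ppa_and_fpr(reference, detections, hours, tolerance=30.0):
    """reference/detections: seizure onset times in seconds."""
    detected = sum(
        any(abs(d - r) <= tolerance for d in detections) for r in reference
    )
    ppa = 100.0 * detected / len(reference) if reference else None
    false_pos = sum(
        not any(abs(d - r) <= tolerance for r in reference) for d in detections
    )
    return ppa, false_pos / hours

# ppa, fpr = ppa_and_fpr([120.0, 3400.0], [118.5, 900.0, 3390.0], hours=1.0)
```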
Abstract:
Introduction: Therapeutic drug monitoring (TDM) aims at optimizing treatment by individualizing the dosage regimen based on measurements of blood concentrations. Maintaining concentrations within a target range requires pharmacokinetic and clinical capabilities. Bayesian calculation represents a gold standard in the TDM approach but requires computing assistance. In recent decades, computer programs have been developed to assist clinicians in this task. The aim of this benchmarking study was to assess and compare computer tools designed to support TDM clinical activities. Method: A literature and Internet search was performed to identify software. All programs were tested on a common personal computer. Each program was scored against a standardized grid covering pharmacokinetic relevance, user-friendliness, computing aspects, interfacing, and storage. A weighting factor was applied to each criterion of the grid to reflect its relative importance. To assess the robustness of the software, six representative clinical vignettes were also processed through each of them. Results: 12 software tools were identified, tested and ranked, yielding a comprehensive review of the available tools' characteristics. The number of drugs handled varies widely, and 8 programs allow the user to add their own drug models. 10 programs are able to compute Bayesian dosage adaptation based on a blood concentration (a posteriori adjustment), while 9 can also suggest an a priori dosage regimen (prior to any blood concentration measurement) based on individual patient covariates such as age, gender and weight. Among those applying Bayesian analysis, one uses the non-parametric approach. The top 2 programs emerging from this benchmark are MwPharm and TCIWorks. The other programs evaluated also have good potential but are less sophisticated (e.g., in terms of storage or report generation) or less user-friendly. Conclusion: Whereas 2 integrated programs are at the top of the ranked list, such complex tools may not fit all institutions, and each software tool must be weighed against the individual needs of hospitals and clinicians. Interest in computing tools to support therapeutic drug monitoring is still growing. Although developers have put effort into them in recent years, there is still room for improvement, especially in terms of institutional information system interfacing, user-friendliness, data storage capacity and report generation.
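As a sketch of the Bayesian (maximum a posteriori) a posteriori adjustment such tools perform, assuming a one-compartment steady-state infusion model with invented population values (this is not any benchmarked program's implementation):

```python
# MAP Bayesian estimate of an individual's clearance for a
# one-compartment IV model at steady state (Css = R0 / CL).
# Population parameters and residual error are invented.
import numpy as np
from scipy.optimize import minimize_scalar

POP_CL, OMEGA = 5.0, 0.3   # population clearance (L/h), inter-individual log-SD
SIGMA = 0.2                 # residual error (log-SD)

def map_clearance(dose_rate, observed_conc):
    """dose_rate in mg/h; observed steady-state concentration in mg/L."""
    def neg_log_posterior(log_cl):
        pred = dose_rate / np.exp(log_cl)           # predicted Css
        prior = ((log_cl - np.log(POP_CL)) / OMEGA) ** 2
        lik = ((np.log(observed_conc) - np.log(pred)) / SIGMA) ** 2
        return prior + lik                          # MAP objective
    res = minimize_scalar(neg_log_posterior, bounds=(-3, 5), method="bounded")
    return np.exp(res.x)

cl_i = map_clearance(dose_rate=50.0, observed_conc=12.0)
new_rate = 8.0 * cl_i       # dose rate to hit an 8 mg/L target Css
```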
Abstract:
The objective of this research is to determine whether the nationally calibrated performance models used in the Mechanistic-Empirical Pavement Design Guide (MEPDG) provide a reasonable prediction of actual field performance, and if the desired accuracy or correspondence exists between predicted and monitored performance for Iowa conditions. A comprehensive literature review was conducted to identify the MEPDG input parameters and the MEPDG verification/calibration process. Sensitivities of MEPDG input parameters to predictions were studied using different versions of the MEPDG software. Based on literature review and sensitivity analysis, a detailed verification procedure was developed. A total of sixteen different types of pavement sections across Iowa, not used for national calibration in NCHRP 1-47A, were selected. A database of MEPDG inputs and the actual pavement performance measures for the selected pavement sites was prepared for verification. The accuracy of the MEPDG performance models for Iowa conditions was statistically evaluated. The verification testing showed promising results in terms of MEPDG's performance prediction accuracy for Iowa conditions. Recalibrating the MEPDG performance models for Iowa conditions is recommended to improve the accuracy of predictions.
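A generic sketch of how predicted versus monitored distress can be compared statistically (the numbers are invented; the study's actual evaluation procedure is described in the report itself):

```python
# Paired comparison of MEPDG-predicted vs. field-measured distress:
# bias, standard error of estimate (SEE), and a paired t-test.
import numpy as np
from scipy import stats

predicted = np.array([0.12, 0.30, 0.25, 0.41, 0.18])  # e.g. rut depth (in)
measured  = np.array([0.10, 0.35, 0.22, 0.48, 0.20])

bias = np.mean(predicted - measured)
see = np.sqrt(np.sum((predicted - measured) ** 2) / (len(measured) - 1))
t_stat, p_value = stats.ttest_rel(predicted, measured)
print(f"bias={bias:.3f}, SEE={see:.3f}, p={p_value:.3f}")
```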
Abstract:
This work is divided into three volumes: Volume I: Strain-Based Damage Detection; Volume II: Acceleration-Based Damage Detection; Volume III: Wireless Bridge Monitoring Hardware.

Volume I: In this work, a previously developed structural health monitoring (SHM) system was advanced toward a ready-for-implementation system. Improvements were made with respect to automated data reduction/analysis, data acquisition hardware, sensor types, and communication network architecture. The statistical damage-detection tools, control-chart-based damage-detection methodologies, were further investigated and advanced. For the validation of the damage-detection approaches, strain data were obtained from a sacrificial specimen, with simulated damage, attached to the previously utilized US 30 Bridge over the South Skunk River (in Ames, Iowa). To provide an enhanced ability to detect changes in the behavior of the structural system, various control chart rules were evaluated. False indications and true indications were studied to compare the damage-detection ability of each methodology and each control chart rule. An autonomous software program called Bridge Engineering Center Assessment Software (BECAS) was developed to control all aspects of the damage-detection process. BECAS requires no user intervention after initial configuration and training.

Volume II: In this work, a previously developed structural health monitoring (SHM) system was advanced toward a ready-for-implementation system. Improvements were made with respect to automated data reduction/analysis, data acquisition hardware, sensor types, and communication network architecture. The objective of this part of the project was to validate and integrate a vibration-based damage-detection algorithm with the strain-based methodology formulated by the Iowa State University Bridge Engineering Center. This report volume (Volume II) presents the use of vibration-based damage-detection approaches as local methods to quantify damage at critical areas of structures. Acceleration data were collected and analyzed to evaluate the relationships between sensors and with changes in environmental conditions. A sacrificial specimen was investigated to verify the damage-detection capabilities, and this volume presents a transmissibility concept and damage-detection algorithm that show potential to sense local changes in the dynamic stiffness between points across a joint of a real structure. The validation and integration of the vibration-based and strain-based damage-detection methodologies will add significant value to Iowa's current and future bridge maintenance, planning, and management.

Volume III: In this work, a previously developed structural health monitoring (SHM) system was advanced toward a ready-for-implementation system. Improvements were made with respect to automated data reduction/analysis, data acquisition hardware, sensor types, and communication network architecture. This report volume (Volume III) summarizes the energy harvesting techniques and prototype development for a bridge monitoring system that uses wireless sensors. The wireless sensor nodes are used to collect strain measurements at critical locations on a bridge. The bridge monitoring hardware system consists of a base station and multiple self-powered wireless sensor nodes. The base station is responsible for synchronizing data sampling across all nodes and for data aggregation. Each wireless sensor node includes a sensing element, a processing and wireless communication module, and an energy harvesting module. The hardware prototype for a wireless bridge monitoring system was developed and tested on the US 30 Bridge over the South Skunk River in Ames, Iowa. The functions and performance of the developed system, including strain data, energy harvesting capacity, and wireless transmission quality, were studied and are covered in this volume.
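As a minimal sketch of the control-chart idea behind the strain-based damage detection (a generic Shewhart X-bar chart on windowed feature means, not the BECAS implementation):

```python
# Shewhart-style X-bar chart: control limits are set from windowed means
# of a healthy-state training period; monitoring-period window means that
# fall outside mean +/- 3 sigma are flagged as potential damage indications.
import numpy as np

def xbar_limits(training, window=100):
    means = [np.mean(training[i:i + window])
             for i in range(0, len(training) - window + 1, window)]
    mu, s = np.mean(means), np.std(means, ddof=1)
    return mu - 3 * s, mu + 3 * s          # (LCL, UCL)

def out_of_control(monitoring, lcl, ucl, window=100):
    return [(i, np.mean(monitoring[i:i + window]))
            for i in range(0, len(monitoring) - window + 1, window)
            if not lcl <= np.mean(monitoring[i:i + window]) <= ucl]

# rng = np.random.default_rng(0)
# lcl, ucl = xbar_limits(rng.normal(0, 1, 5000))
# flags = out_of_control(rng.normal(0.5, 1, 2000), lcl, ucl)
```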
Abstract:
In recent years, protein-ligand docking has become a powerful tool for drug development. Although several approaches suitable for high-throughput screening are available, there is a need for methods able to identify binding modes with high accuracy. This accuracy is essential to reliably compute the binding free energy of the ligand. Such methods are needed when the binding mode of lead compounds is not determined experimentally but is needed for structure-based lead optimization. We present here a new docking software, called EADock, that aims at this goal. It uses a hybrid evolutionary algorithm with two fitness functions, in combination with sophisticated diversity management. EADock is interfaced with the CHARMM package for energy calculations and coordinate handling. A validation was carried out on 37 crystallized protein-ligand complexes featuring 11 different proteins. The search space was defined as a sphere of 15 Å around the center of mass of the ligand position in the crystal structure and, in contrast to other benchmarks, our algorithm was fed with optimized ligand positions up to 10 Å root mean square deviation (RMSD) from the crystal structure, excluding the latter. This validation illustrates the efficiency of our sampling strategy: correct binding modes, defined by an RMSD to the crystal structure lower than 2 Å, were identified and ranked first for 68% of the complexes. The success rate increases to 78% when considering the five best-ranked clusters, and to 92% when all clusters present in the last generation are taken into account. Most failures could be explained by the presence of crystal contacts in the experimental structure. Finally, the ability of EADock to accurately predict binding modes in a real application was illustrated by the successful docking of the RGD cyclic pentapeptide on the alphaVbeta3 integrin, starting far away from the binding pocket.
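The 2 Å success criterion rests on a plain coordinate RMSD. A minimal sketch, assuming pre-aligned coordinates with matching atom order (a simplification of a full docking evaluation):

```python
# RMSD between a docked pose and the crystal pose, assuming both
# coordinate arrays share the same frame and atom ordering.
import numpy as np

def rmsd(pose_a: np.ndarray, pose_b: np.ndarray) -> float:
    """pose_a, pose_b: (n_atoms, 3) arrays of Cartesian coordinates (Angstrom)."""
    return float(np.sqrt(np.mean(np.sum((pose_a - pose_b) ** 2, axis=1))))

def success_rate(predicted_poses, crystal_pose, cutoff=2.0):
    """Fraction of poses within `cutoff` Angstrom of the crystal structure."""
    hits = [rmsd(p, crystal_pose) <= cutoff for p in predicted_poses]
    return sum(hits) / len(hits)
```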
Abstract:
Alcoholism is a chronic disease, and the evaluation of its burden usually focuses on long-term co-morbidity and mortality. Clinical trials evaluating new interventions for alcohol-dependent patients rarely last more than 12 to 24 months. OBJECTIVES: To develop a questionnaire capable of capturing principal resource use yet sensitive enough to show the short-term economic benefit of drugs developed to reduce consumption in alcohol-dependent patients. METHODS: A comprehensive Medline literature search was conducted using the keywords alcohol-related disorders, economics, and cost of illness. In addition, expert panel discussions provided further data. RESULTS: Two key cost drivers, hospitalisations and sick leave, were identified by the literature review. Expert findings related to the costs of social consequences were incorporated. These three important resources were included in the questionnaire in addition to standard medical resource use items. Finally, the following items were included: consultation visits, hospitalisations, sick leave and working situation, living situation, social environment, accidents, arrests and domestic violence. The recall period is 3 months. DISCUSSION: A great deal of information is collected in this questionnaire in order to capture all relevant resources. Tests to validate the questionnaire in a real-life setting (face validity, concurrent validity, and test-retest) will be conducted in a cohort of alcohol-dependent patients at Lausanne University Hospital (Switzerland). Items not sensitive enough to capture short-term costs and consequences will be removed. Translation into other major languages and adaptation to different settings after cultural validation are planned. CONCLUSIONS: Publication of this tool should facilitate additional knowledge about resource utilisation at the patient level and enable evaluation of the short-term economic impact of pharmacological and non-pharmacological interventions.
Abstract:
We present the first steps in the validation of an observational tool for father-mother-infant interactions: the FAAS (Family Alliance Assessment Scales). Family-level variables are acknowledged as unique contributors to the understanding of the socio-affective development of the child, yet producing reliable assessments of family-level interactions poses a methodological challenge. There is, therefore, a clear need for a validated and clinically relevant tool. This validation study was carried out on three samples: a non-referred sample of families taking part in a study on the transition to parenthood (normative sample; n = 30), one referred for medically assisted procreation (infertility sample; n = 30) and one referred for a psychiatric condition in one parent (clinical sample; n = 15). Results show that the FAAS scales have (1) good inter-rater reliability and (2) good validity, as assessed through known-group validity by comparing the three samples and through concurrent validity by checking family interactions against parents' self-reported marital satisfaction.
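Inter-rater reliability of categorical ratings such as observational scales is commonly quantified with Cohen's kappa. A generic sketch using scikit-learn (the category labels and ratings are invented; the abstract does not specify which reliability statistic the study used):

```python
# Cohen's kappa: chance-corrected agreement between two raters scoring
# the same interactions on a categorical scale.
from sklearn.metrics import cohen_kappa_score

rater_1 = ["functional", "moderate", "disordered", "functional", "moderate"]
rater_2 = ["functional", "moderate", "moderate",   "functional", "moderate"]

kappa = cohen_kappa_score(rater_1, rater_2)
print(f"Cohen's kappa = {kappa:.2f}")   # 1.0 = perfect, 0 = chance level
```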
Abstract:
Epigenetic silencing of the DNA repair protein O(6)-methylguanine-DNA methyltransferase (MGMT) by promoter methylation predicts successful alkylating agent therapy, such as with temozolomide, in glioblastoma patients. Stratified therapy assignment of patients in prospective clinical trials according to tumor MGMT status requires a standardized diagnostic test, suitable for high-throughput analysis of small amounts of formalin-fixed, paraffin-embedded tumor tissue. A direct, real-time methylation-specific PCR (MSP) assay was developed to determine methylation status of the MGMT gene promoter. Assay specificity was obtained by selective amplification of methylated DNA sequences of sodium bisulfite-modified DNA. The copy number of the methylated MGMT promoter, normalized to the beta-actin gene, provides a quantitative test result. We analyzed 134 clinical glioma samples, comparing the new test with the previously validated nested gel-based MSP assay, which yields a binary readout. A cut-off value for the MGMT methylation status was suggested by fitting a bimodal normal mixture model to the real-time results, supporting the hypothesis that there are two distinct populations within the test samples. Comparison of the tests showed high concordance of the results (82/91 [90%]; Cohen's kappa = 0.80; 95% confidence interval, 0.82-0.95). The direct, real-time MSP assay was highly reproducible (Pearson correlation 0.996) and showed valid test results for 93% (125/134) of samples compared with 75% (94/125) for the nested, gel-based MSP assay. This high-throughput test provides an important pharmacogenomic tool for individualized management of alkylating agent chemotherapy.
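The cut-off derivation can be sketched generically: fit a two-component normal mixture to the log-transformed normalized copy numbers and place the threshold between the component means where posterior membership flips (illustrative only, with simulated data; not the authors' fitting code):

```python
# Fit a bimodal normal mixture to log-ratios (methylated MGMT copies /
# beta-actin copies) and take as cut-off the point between the two
# component means where posterior membership is 50/50. Simulated data.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
log_ratio = np.concatenate([rng.normal(-4, 0.6, 70),    # unmethylated-like
                            rng.normal(-1, 0.5, 60)])   # methylated-like

gm = GaussianMixture(n_components=2, random_state=0).fit(log_ratio.reshape(-1, 1))
grid = np.linspace(log_ratio.min(), log_ratio.max(), 1000).reshape(-1, 1)
post = gm.predict_proba(grid)
lo_mean, hi_mean = sorted(gm.means_.ravel())
inside = (grid.ravel() > lo_mean) & (grid.ravel() < hi_mean)
cutoff = grid.ravel()[inside][np.argmin(np.abs(post[inside, 0] - 0.5))]
print(f"suggested cut-off (log ratio): {cutoff:.2f}")
```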
Abstract:
Driven by the need to differentiate themselves and face competition, companies have committed to developing operations that deliver value to the customer, which is why many of them have seen in lean tools an opportunity to improve their operations. This improvement involves reducing money, people, large equipment, inventory and space, with two objectives: eliminating waste and reducing variability. To achieve the company's strategic objectives, it is essential that they be aligned with middle management's plans and, in turn, with the work performed by employees, ensuring that every person is aligned in the same direction at the same time. This is the philosophy of strategic planning. Hence, one objective of this project is to develop a tool that facilitates setting out the company's objectives and communicating them to all levels of the organization. Starting from these objectives, and taking as a reference the need to reduce inventories in the supply chain, a study of the production of a wind-turbine control component is carried out in order to level production and reduce its finished-goods inventory. The specific objectives in this part are to reduce inventory by 28%, level production by reducing variability from 31% to 24%, maintain a maximum stock of 24 units while guaranteeing supply under variable demand, increase inventory turnover by 10%, and establish an action plan to reduce lead time by 40-50%. All of this is made possible by drawing the current- and future-state value stream maps to eliminate waste and create continuous flow, and by sizing a supermarket that keeps stock at an optimal level.
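A standard way to size the finished-goods supermarket mentioned at the end is from average demand over the replenishment lead time plus a safety buffer for demand variability. A sketch with invented figures (not the project's actual data):

```python
# Supermarket (kanban) sizing: stock must cover average demand over the
# replenishment lead time plus a safety buffer for demand variability.
# All figures are invented for illustration.
import math

def supermarket_size(daily_demand, demand_sd, lead_time_days,
                     safety_factor=1.65):   # ~95% service level
    cycle_stock = daily_demand * lead_time_days
    safety_stock = safety_factor * demand_sd * math.sqrt(lead_time_days)
    return math.ceil(cycle_stock + safety_stock)

# e.g. 4 units/day on average, SD of 2 units/day, 4-day replenishment:
units = supermarket_size(daily_demand=4, demand_sd=2, lead_time_days=4)
print(units)   # 23 -> same order of magnitude as the 24-unit cap above
```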