995 results for software validation
Abstract:
Continuous quality measurement as part of the software process has become common among software companies in recent years. The ISO 9001:2000 quality standard requires companies to measure and monitor the quality of their products and processes. Selecting quality metrics is a challenging task. Companies often believe they are measuring quality when they are in fact measuring various properties of their software, such as size or complexity. This master's thesis develops a benchmarking-based quality measurement process for the software validation process, intended for assessing, measuring, and monitoring the quality of software products. The quality metrics are selected according to predefined criteria, and target values are set for them based on the results of a benchmarking analysis. In addition to the quality measurement process, the thesis gives a recommendation on deploying and using the process as part of the company's operations, which enables continuous monitoring and improvement in the future.
Abstract:
Purpose: The aim of this study was to compare 2 different methods of assessing implants at different inclinations (90 degrees and 65 degrees): with a profilometer and with AutoCAD software. Materials and Methods: Impressions (n = 5) of a metal matrix containing 2 implants, 1 at 90 degrees to the surface and 1 at 65 degrees to the surface, were obtained with square impression copings joined together with dental floss splinting covered with autopolymerizing acrylic resin, an open custom tray, and vinyl polysiloxane impression material. The angles (in degrees) of the implant analogs were measured by the same blinded operator with a profilometer and through analysis of digitized images with AutoCAD software. For each implant analog, 3 readings were performed with each method. The results were subjected to a nonparametric Kruskal-Wallis test, with P <= .05 considered significant. Results: For implants perpendicular to the horizontal surface of the specimen (90 degrees), there were no significant differences between the mean measurements obtained with the profilometer (90.04 degrees) and AutoCAD (89.95 degrees; P = .9142). In the analyses of the implants angled at 65 degrees in relation to the horizontal surface of the specimen, significant differences were observed (P = .0472) between the mean readings with the profilometer (65.73 degrees) and AutoCAD (66.25 degrees). Conclusions: The degree of accuracy of implant angulation recording varies among the available techniques and may depend on the angle of the implant. Further investigation is needed to determine the best test conditions and the best measuring technique for determining the angle of an implant in vitro.
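As a rough illustration of the statistical method named above, the Python sketch below runs a nonparametric Kruskal-Wallis test on two hypothetical sets of repeated angle readings; the study's raw data are not reproduced here, so the numbers are placeholders only.

    # Kruskal-Wallis comparison of two measurement methods (hypothetical data).
    from scipy.stats import kruskal

    profilometer = [90.02, 90.05, 90.06]  # three readings, degrees
    autocad = [89.93, 89.95, 89.97]       # three readings, degrees

    stat, p = kruskal(profilometer, autocad)
    print(f"H = {stat:.3f}, p = {p:.4f}")  # p <= .05 would indicate a significant difference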
Abstract:
Work carried out at the company ULMA Embedded Solutions.
Abstract:
PoliEstudio 1.0 is a freely licensed computational tool for working with polynomial expressions in one variable, created by a team that the authors of this article are part of. This article documents the qualitative validation performed on the software. Its main objective was to give the Costa Rican educational system a validated educational software package that can partially solve the problems that exist today in the mathematics education of secondary students, particularly in topics related to polynomial expressions in one variable, and specifically for students in eighth grade.
Abstract:
Introduction: To guarantee the success of a virtual library, it is essential that all users can access all the library's resources regardless of their location. Achieving this goal in the Andalusian Public Health System has been a particularly difficult task, because the system is made up of 10 research centers and 95,000 health-care professionals. Aims: Since the BV-SSPA started three years ago, one of its major aims has been to provide remote access to all its resources in this complex scenario, as well as to facilitate access to the virtual library for both professionals and citizens. IP-based access was guaranteed because health-care professionals could access everything from their workplaces thanks to the intranet, but access was restricted when they were elsewhere. The BV-SSPA solved this problem by installing a federated authentication and authorization system called PAPI together with a PAPI rewriting proxy. After three years the BV-SSPA has met a new challenge: adapting its federated access system to Metalib and SFX; specifically, the access management module PDS had to be connected with the existing PAPI system. This new challenge came along with the introduction of a new metasearcher and link resolver. Material and Methods: Initially there were three independent systems: • a Metalib and SFX PDS module, • a federated authentication and authorization system, PAPI, and • a PAPI rewriting proxy. The chosen solution was to reuse the existing software. To achieve this goal, a PHP connector between these applications was developed and several modules in the PDS configuration were modified. In addition, simplified access to Metalib was provided by using Xerxes and integrating it into a Drupal website. Results: Thanks to this connector, the BV-SSPA was able to give all its users remote access to its new metasearcher without changing the way they used to authenticate and without making them remember a new username and password. Furthermore, thanks to Xerxes, it is possible to use Metalib from a simple interface without having to leave the BV-SSPA website for its native interface.
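The abstract does not detail the connector's internals, so the following Python sketch is only a generic illustration of the kind of bridging such a connector performs: confirming a local session, then emitting a signed, time-limited assertion that a downstream proxy can trust. All names (make_assertion, SHARED_SECRET) are hypothetical; this is not the actual PAPI or PDS API, and the real connector was written in PHP.

    import hashlib, hmac, time

    SHARED_SECRET = b"secret-shared-with-the-proxy"  # placeholder value

    def make_assertion(user_id: str, ttl_seconds: int = 300) -> str:
        # Build a "user:expiry:signature" token for the downstream proxy.
        expiry = str(int(time.time()) + ttl_seconds)
        payload = f"{user_id}:{expiry}".encode()
        sig = hmac.new(SHARED_SECRET, payload, hashlib.sha256).hexdigest()
        return f"{user_id}:{expiry}:{sig}"

    def verify_assertion(token: str) -> bool:
        # Check the signature first, then the expiry time.
        user_id, expiry, sig = token.rsplit(":", 2)
        payload = f"{user_id}:{expiry}".encode()
        expected = hmac.new(SHARED_SECRET, payload, hashlib.sha256).hexdigest()
        return hmac.compare_digest(sig, expected) and time.time() < int(expiry)

    print(verify_assertion(make_assertion("librarian01")))  # True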
Abstract:
The variability observed in drug exposure has a direct impact on the overall response to a drug. The largest part of the variability between dose and drug response resides in the pharmacokinetic phase, i.e., in the dose-concentration relationship. Among the possibilities offered to clinicians, Therapeutic Drug Monitoring (TDM; monitoring of drug concentration measurements) is one of the useful tools to guide pharmacotherapy. TDM aims at optimizing treatments by individualizing dosage regimens based on blood drug concentration measurements. Bayesian calculations, relying on the population pharmacokinetic approach, currently represent the gold-standard TDM strategy. However, they require expertise and computational assistance, which limits their wide implementation in routine patient care. The overall objective of this thesis was to implement robust tools to provide Bayesian TDM to clinicians in modern routine patient care. To that end, the aims were (i) to elaborate an efficient and ergonomic computer tool for Bayesian TDM, EzeCHieL; (ii) to provide algorithms for Bayesian forecasting of drug concentrations and for software validation, relying on population pharmacokinetics; and (iii) to address some relevant issues encountered in clinical practice, with a focus on neonates and drug adherence. First, the current state of the existing software was reviewed, which allowed specifications to be established for the development of EzeCHieL. Then, in close collaboration with software engineers, a fully integrated software tool, EzeCHieL, was elaborated. EzeCHieL provides population-based predictions, Bayesian forecasting, and an easy-to-use interface. It makes it possible to assess how expected an observed concentration is in a patient compared with the whole population (via percentiles), to assess the suitability of the predicted concentration relative to the targeted concentration, and to provide dosing adjustment. It thus allows both a priori and a posteriori Bayesian individualization of drug dosing. Implementing Bayesian methods requires characterizing drug disposition and quantifying variability through the population approach. Population pharmacokinetic analyses were performed and Bayesian estimators were provided for candidate drugs in populations of interest: anti-infective drugs administered to neonates (gentamicin and imipenem). The developed models were implemented in EzeCHieL and also served as a validation tool, with EzeCHieL's concentration predictions compared against predictions from the reference software (NONMEM®). The models used need to be adequate and reliable; for instance, extrapolation from adults or children to neonates is not possible. Therefore, this work proposes models for neonates based on the concept of developmental pharmacokinetics. Patients' adherence is also an important concern for the development of drug models and for a successful outcome of pharmacotherapy. A last study assesses the impact of routine measurement of patient adherence on model definition and TDM interpretation. In conclusion, our results offer solutions to assist clinicians in interpreting blood drug concentrations and to improve the appropriateness of drug dosing in routine clinical practice.
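Since the abstract centers on Bayesian (MAP) forecasting from a population pharmacokinetic prior, the Python sketch below shows the general technique for a one-compartment IV bolus model. It is not the EzeCHieL implementation; the dose, sampling times, observed concentrations, and population values are all illustrative.

    import numpy as np
    from scipy.optimize import minimize

    dose = 100.0                      # mg, IV bolus
    t_obs = np.array([2.0, 8.0])      # h, sampling times
    c_obs = np.array([6.5, 2.1])      # mg/L, measured concentrations

    # Hypothetical log-normal population priors: (typical value, SD of log).
    pop = {"CL": (5.0, 0.30), "V": (20.0, 0.25)}
    sigma = 0.15                      # residual error, SD of log-concentration

    def conc(cl, v, t):
        # One-compartment model after an IV bolus.
        return (dose / v) * np.exp(-(cl / v) * t)

    def neg_log_posterior(log_params):
        cl, v = np.exp(log_params)
        resid = (np.log(c_obs) - np.log(conc(cl, v, t_obs))) / sigma
        prior = [(lp - np.log(tv)) / om
                 for lp, (tv, om) in zip(log_params, pop.values())]
        return 0.5 * (np.sum(resid ** 2) + np.sum(np.square(prior)))

    fit = minimize(neg_log_posterior, np.log([pop["CL"][0], pop["V"][0]]))
    cl_map, v_map = np.exp(fit.x)
    print(f"MAP estimates: CL = {cl_map:.2f} L/h, V = {v_map:.1f} L")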
Abstract:
Dynamic analysis is an increasingly important means of supporting software validation and maintenance. To date, developers of dynamic analyses have used low-level instrumentation and debug interfaces to realize their analyses. Many dynamic analyses, however, share multiple common high-level requirements, e.g., capture of program data state as well as events, and efficient and accurate event capture in the presence of threading. We present SOFYA, an infrastructure designed to provide high-level, efficient, concurrency-aware support for building analyses that reason about rich observations of program data and events. It provides a layered, modular architecture, which has been successfully used to rapidly develop and evaluate a variety of demanding dynamic program analyses. In this paper, we describe the SOFYA framework and the challenges it addresses, and survey several such analyses.
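To make the idea of event-based dynamic analysis concrete, here is a minimal Python sketch that captures call and return events together with program data state, using Python's built-in tracing hook. It is only an analogy for the kind of observations such infrastructures expose, not SOFYA's actual API.

    import sys

    def tracer(frame, event, arg):
        # Observe events along with data state (argument values).
        if event == "call":
            print(f"call {frame.f_code.co_name}({frame.f_locals})")
        elif event == "return":
            print(f"return {frame.f_code.co_name} -> {arg!r}")
        return tracer

    def gcd(a, b):
        return a if b == 0 else gcd(b, a % b)

    sys.settrace(tracer)
    gcd(48, 18)
    sys.settrace(None)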
Abstract:
Nearly all biologic tissues exhibit viscoelastic behavior, characterized by hysteresis in the response of the material to load or strain. This information can be used to extrapolate the life expectancy of vascular implant materials, including native tissues and synthetic materials. The behavior is also exhibited by many engineering materials, such as the polymers PTFE, polyamide, and polyethylene. While procedures have been developed for evaluating the engineering polymers, the techniques for biologic tissues are not as mature. There are multiple reasons for this; a major one is a cultural divide between the medical and engineering communities, a void that biomedical engineers are beginning to fill. A digitally controlled drivetrain designed to evaluate both the elastic and the viscoelastic characteristics of biologic tissues has been developed. The initial impetus for the development of this device was to evaluate the potential of human umbilical tissue to serve as a vascular graft material. Consequently, the load frame is configured for membrane-type specimens with rectangular dimensions of no more than 25 mm per side, and the drivetrain is designed to impose an axial load of up to 40 N on the specimen. The drivetrain can assess the viscoelastic response of specimens in four different test modes: stress relaxation, creep, harmonically induced oscillations, and controlled strain-rate tests. The fluorocarbon PTFE has mechanical properties commensurate with vascular tissue; in fact, it has been used for vascular grafts in patients who have been victims of various traumas. Hardware and software validation of the device was accomplished by testing PTFE and comparing the results to properties published by both researchers and manufacturers.
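As a small illustration of how one of the named test modes reduces to viscoelastic parameters, the Python sketch below fits a single-Maxwell-element stress-relaxation curve to synthetic data. The numbers are made up; they are not the study's PTFE or tissue measurements.

    import numpy as np
    from scipy.optimize import curve_fit

    def relaxation(t, s_inf, s0, tau):
        # Stress relaxation under constant strain: decay from s0 toward s_inf.
        return s_inf + (s0 - s_inf) * np.exp(-t / tau)

    t = np.linspace(0, 300, 60)                        # s
    true = relaxation(t, 2.0, 5.0, 45.0)               # MPa
    noisy = true + np.random.default_rng(0).normal(0, 0.05, t.size)

    (s_inf, s0, tau), _ = curve_fit(relaxation, t, noisy, p0=[1.0, 4.0, 30.0])
    print(f"s_inf = {s_inf:.2f} MPa, s0 = {s0:.2f} MPa, tau = {tau:.1f} s")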
Abstract:
PURPOSE: To compare the direct and indirect radiographic methods for assessing the gray levels of biomaterials, employing the Digora for Windows and Adobe Photoshop CS2 systems. METHODS: Specimens of biomaterials were made following the manufacturers' instructions and placed on phosphor storage plates (PSP) and on radiographic film for subsequent gray level assessment using the direct and indirect radiographic methods, respectively. The radiographic density of each biomaterial was analyzed using Adobe Photoshop CS2 and Digora for Windows software. RESULTS: The distribution of gray levels found using the direct and indirect methods suggests that higher exposure times are correlated with lower reproducibility rates between groups. CONCLUSION: The indirect method is a feasible alternative to the direct method for assessing the radiographic gray levels of biomaterials, insofar as significant reproducibility was observed between groups for exposure times of 0.2 to 0.5 seconds.
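The core measurement in the indirect method is an average gray level over a region of interest in a digitized radiograph. The Python sketch below shows that step, with Pillow and NumPy standing in for Adobe Photoshop CS2; the file name and region bounds are hypothetical.

    import numpy as np
    from PIL import Image

    # Load the digitized film as 8-bit grayscale (file name is a placeholder).
    img = np.asarray(Image.open("digitized_radiograph.png").convert("L"))

    roi = img[120:180, 200:260]   # region over one biomaterial specimen (assumed bounds)
    print(f"mean gray level: {roi.mean():.1f} (0 = black, 255 = white)")
    print(f"standard deviation: {roi.std():.1f}")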
Abstract:
Today, there is little knowledge of the attitude state of decommissioned intact objects in Earth orbit. Observational means have advanced in the past years, but are still limited with respect to accurate estimates of motion vector orientations and magnitudes. Especially for the preparation of Active Debris Removal (ADR) missions as planned by ESA's Clean Space initiative, or for contingency scenarios for ESA spacecraft like ENVISAT, such knowledge is needed. ESA's "Debris Attitude Motion Measurements and Modelling" project (ESA Contract No. 40000112447), led by the Astronomical Institute of the University of Bern (AIUB), addresses this problem. The goal of the project is to achieve a good understanding of the attitude evolution and of the considerable internal and external effects that occur. To characterize the attitude state of selected targets in LEO and GTO, multiple observation methods are combined. Optical observations are carried out by AIUB, Satellite Laser Ranging (SLR) is performed by the Space Research Institute of the Austrian Academy of Sciences (IWF), and radar measurements and signal level determination are provided by the Fraunhofer Institute for High Frequency Physics and Radar Techniques (FHR). The In-Orbit Tumbling Analysis tool (ιOTA) is a prototype software package currently in development by Hyperschall Technologie Göttingen GmbH (HTG) within the framework of the project. ιOTA will be a highly modular software tool to perform short-term (days), medium-term (months) and long-term (years) propagation of the orbit and attitude motion (six degrees of freedom) of spacecraft in Earth orbit. The simulation takes into account all relevant acting forces and torques, including aerodynamic drag, solar radiation pressure, gravitational influences of the Earth, Sun and Moon, eddy current damping, impulse and momentum transfer from space debris or micrometeoroid impact, as well as the optional definition of particular spacecraft-specific influences like tank sloshing, reaction wheel behaviour, magnetic torquer activity and thruster firing. The purpose of ιOTA is to provide high-accuracy short-term simulations to support observers and potential ADR missions, as well as medium- and long-term simulations to study the significance of the particular internal and external influences on the attitude, especially damping factors and momentum transfer. The simulation will also enable the investigation of the altitude dependency of the particular external influences. ιOTA's post-processing modules will generate synthetic measurements for observers and for software validation. The validation of the software will be done by cross-calibration with observations and measurements acquired by the project partners.
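As a pointer to what six-degree-of-freedom attitude propagation involves, the Python sketch below integrates torque-free rigid-body rotation (Euler's equations plus quaternion kinematics). It is a toy reduction of the problem, not ιOTA; the inertia values, initial rates, and the simple explicit-Euler integrator are all illustrative.

    import numpy as np

    I = np.array([1200.0, 900.0, 400.0])   # principal moments of inertia, kg m^2
    w = np.array([0.02, 0.001, 0.005])     # body rates, rad/s
    q = np.array([1.0, 0.0, 0.0, 0.0])     # attitude quaternion, scalar first
    dt = 0.1                               # s

    for _ in range(6000):                  # propagate 10 minutes
        # Euler's equations for the body rates (no external torque).
        w_dot = np.array([
            (I[1] - I[2]) * w[1] * w[2] / I[0],
            (I[2] - I[0]) * w[2] * w[0] / I[1],
            (I[0] - I[1]) * w[0] * w[1] / I[2],
        ])
        w = w + w_dot * dt
        # Quaternion kinematics: q_dot = 0.5 * q * (0, w).
        qw, qx, qy, qz = q
        q_dot = 0.5 * np.array([
            -qx * w[0] - qy * w[1] - qz * w[2],
             qw * w[0] + qy * w[2] - qz * w[1],
             qw * w[1] - qx * w[2] + qz * w[0],
             qw * w[2] + qx * w[1] - qy * w[0],
        ])
        q = q + q_dot * dt
        q = q / np.linalg.norm(q)          # keep it a unit quaternion

    print("final body rates (rad/s):", w)
    print("final attitude quaternion:", q)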
Abstract:
Different aspects of the design, implementation and validation of a specific real-time system, the UPMSat-2 satellite, are described in this Final Degree Project report. The UPMSat-2 project, carried out by a team of lecturers, students, and staff of the Universidad Politécnica de Madrid (UPM), is aimed at developing an experimental microsatellite that can be used as an in-orbit technology demonstrator for several research groups at UPM. The Real-Time Systems and Telematic Services Architecture Group (STRAST) at UPM, of which the student is a member, is responsible for designing and building all the on-board and ground-segment software for the satellite. Within these assignments, three main tasks have been carried out and are described in this document: the design and implementation of three different device drivers, the design of an algorithm to manage the non-volatile memory, and the configuration and testing of a software validation facility to test the UPMSat-2 Attitude Determination and Control System (ADCS) subsystem. Detailed information on these tasks and their technological basis is presented in the rest of the document.
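Of the three tasks, the non-volatile memory management algorithm lends itself to a brief sketch. The Python code below illustrates one common generic approach on small satellites: round-robin wear levelling across erase sectors, with a sequence number to recover the latest record after a reset. It is an assumption-laden illustration, not the actual UPMSat-2 algorithm, which the abstract does not describe.

    class NvmLog:
        def __init__(self, n_sectors):
            # Each sector holds (sequence_number, payload); None means erased.
            self.sectors = [None] * n_sectors
            self.seq = 0

        def write(self, payload):
            # Rotate through sectors so erase cycles spread evenly.
            self.sectors[self.seq % len(self.sectors)] = (self.seq, payload)
            self.seq += 1

        def latest(self):
            # After a reset, the record with the highest sequence number wins.
            valid = [s for s in self.sectors if s is not None]
            return max(valid)[1] if valid else None

    log = NvmLog(n_sectors=4)
    for i in range(6):
        log.write(f"telemetry snapshot {i}".encode())
    print(log.latest())  # b'telemetry snapshot 5'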
Abstract:
We discuss experiences gained by porting a Software Validation Facility (SVF) and a satellite Central Software (CSW) to a platform with support for Time and Space Partitioning (TSP). The SVF and CSW are part of the EagleEye reference mission of the European Space Agency (ESA). As a reference mission, EagleEye is a perfect candidate for evaluating practical aspects of developing satellite CSW for and on TSP platforms. The specific TSP platform we used consists of a simulated LEON3 CPU controlled by the XtratuM separation micro-kernel. On top of this, we run five separate partitions. Each partition runs its own real-time operating system or Ada run-time kernel, which in turn runs the application software of the CSW. We describe issues related to partitioning; inter-partition communication; scheduling; I/O; and fault detection, isolation, and recovery (FDIR).
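A defining property of TSP platforms such as the one described is fixed cyclic scheduling: each partition owns a fixed slot in a repeating major frame, so a fault in one partition cannot steal time from another. The Python sketch below merely traces such a schedule; the five partition names and slot durations are invented, not EagleEye's actual configuration.

    import itertools

    # (partition, slot duration in ms) for one major frame -- illustrative values.
    major_frame = [
        ("io", 20), ("gnc", 40), ("payload", 30), ("fdir", 10), ("spare", 25),
    ]

    t = 0
    for name, budget in itertools.islice(itertools.cycle(major_frame), 10):
        # A real separation kernel would context-switch into the partition here.
        print(f"t+{t:4d} ms: {name} runs for {budget} ms")
        t += budget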
Abstract:
The traditional business models and the traditionally successful development methods that were distinctive of the industrial era do not satisfy the needs of modern IT companies. Due to the rapid nature of IT markets, the uncertainty of a new innovation's success and the overwhelming competition with established companies, startups need to make quick decisions and eliminate wasted resources more effectively than ever before. There is a need for an empirical basis on which to build business models, as well as to evaluate presumptions regarding value and profit. Less than ten years ago, the Lean software development principles and practices became widely known in academic circles. Those practices help startup entrepreneurs to validate their learning, test their assumptions and become ever more dynamic and flexible. What is special about today's software startups is that they are increasingly individual. Quantitative research studies are available regarding the details of Lean startups. Broad research with hundreds of companies presented in a few charts is informative, but a detailed study of fewer examples gives insight into the way software entrepreneurs see the Lean startup philosophy and how they describe it in their own words. This thesis focuses on Lean software startups' early phases, namely Customer Discovery (discovering a valuable solution to a real problem) and Customer Validation (being in a good market with a product which satisfies that market). The thesis first offers a compact insight into the Lean software startup concept for a reader who is not previously familiar with the term. The Lean startup philosophy is then put to a real-life test, based on interviews with four Finnish Lean software startup entrepreneurs. The interviews reveal 1) whether the Lean startup philosophy is actually valuable to them, 2) how the theory can be practically implemented in real life and 3) whether theoretical Lean startup knowledge compensates for a lack of entrepreneurship experience. The reader becomes familiar with the key elements and tools of Lean startups, as well as their mutual connections. The thesis explains why Lean startups waste less time and money than many other startups. The thesis, especially in its research sections, aims at providing data and analysis simultaneously.
Abstract:
OBJECTIVES: To determine the accuracy of automated vessel-segmentation software for vessel-diameter measurements based on three-dimensional contrast-enhanced magnetic resonance angiography (3D-MRA). METHOD: In 10 patients with high-grade carotid stenosis, automated measurements of both carotid arteries were obtained with 3D-MRA by two independent investigators and compared with manual measurements obtained by digital subtraction angiography (DSA), 2D maximum-intensity projection (2D-MIP) based on MRA, and duplex ultrasonography (US). In 42 patients undergoing carotid endarterectomy (CEA), intraoperative measurements (IOP) were compared with postoperative 3D-MRA and US. RESULTS: Mean interoperator variability was 8% for measurements by DSA and 11% by 2D-MIP, but there was no interoperator variability with the automated 3D-MRA analysis. Good correlations with DSA (the standard of reference) were found for manual 2D-MIP (rP = 0.6) and automated 3D-MRA (rP = 0.8). Excellent correlations were found between IOP and 3D-MRA (rP = 0.93) and between IOP and US (rP = 0.83). CONCLUSION: Automated 3D-MRA-based vessel segmentation and quantification result in accurate measurements of extracerebral-vessel dimensions.
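For reference, the two kinds of summary statistics quoted above are straightforward to compute in Python. The vessel-diameter values below are invented placeholders, not the study's data, and one common definition of interoperator variability (mean relative difference between readers) is assumed.

    import numpy as np
    from scipy.stats import pearsonr

    dsa = np.array([4.1, 5.3, 3.2, 6.0, 4.8])   # mm, reference method
    mra = np.array([4.0, 5.5, 3.1, 6.2, 4.9])   # mm, automated 3D-MRA

    r, p = pearsonr(dsa, mra)
    print(f"Pearson rP = {r:.2f} (p = {p:.4f})")

    # Interoperator variability as the mean relative difference between readers.
    reader1 = np.array([4.2, 5.4, 3.3, 5.9, 4.7])
    reader2 = np.array([4.0, 5.1, 3.0, 6.3, 5.0])
    var = np.mean(np.abs(reader1 - reader2) / ((reader1 + reader2) / 2)) * 100
    print(f"interoperator variability: {var:.1f} %")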