958 results for Software certification
Abstract:
With the steady growth of Teleradiology, the need has arisen to create more and better software to sustain that growth. This work addresses software certification and CE marking, since all Medical Devices (MD) must be duly certified before entering the European market. To carry out the CE marking and certification, the standards and regulations applicable to MD marking at the European level, as well as in the United States of America, will be studied. Personal data security will also be examined in order to ensure that the device complies with the legislation in force. The purpose of this study is the certification of proprietary software from efficientia sysPACS, a comprehensive online service that provides integrated management of the storage and distribution of medical images for diagnostic support.
Abstract:
Includes bibliographical references
Abstract:
This paper presents a proposal for a reference model for software development aimed at small companies. Despite the importance of small software companies in Latin America, the lack of standards of their own, capable of meeting their specific needs, has created serious difficulties in improving their processes and in achieving quality certification. As a contribution to a better understanding of the subject, a reference model is therefore proposed and, as a means of validating the proposal, a report of its application in a small Brazilian company committed to certification under the MPS.BR quality model is presented.
Abstract:
A semi-automatic capillary gas chromatographic method with classical flame ionization detection, which satisfies the required performance conditions and gave acceptable results within the framework of an interlaboratory certification programme for PAHs in sewage sludge, is described. An interesting feature of the procedure is that it incorporates automatic operations such as sample fractionation by semi-preparative HPLC, fraction collection triggered by signal-level recognition, and evaporation under nitrogen flow. Multiple injections into the GC capillary column are performed in the on-column mode via an autosampler with a temperature-programmable injector. Automatic data acquisition and chromatogram treatment are handled by computer software. This partially automatic procedure releases personnel from tedious and time-consuming tasks, and its robustness was validated through the certification of a reference material for PAHs in sewage sludge, demonstrating its reliable performance.
Abstract:
Quality management has become a strategic issue for organisations and is very valuable for producing quality software. However, quality management systems (QMS) are not easy to implement and maintain. The authors' experience shows the benefits of developing a QMS by first formalising it using semantic web ontologies and then putting them into practice through a semantic wiki. The QMS ontology that has been developed captures the core concepts of a traditional QMS and combines them with concepts coming from the MPIu+a development process model, which is geared towards obtaining usable and accessible software products. The ontology semantics is then put directly into play by a semantics-aware tool, Semantic MediaWiki. The resulting QMS tool has been in use for two years by the GRIHO research group, where it has managed almost 50 software development projects while taking quality management issues into account. It has also been externally audited by a quality certification organisation. Its users are very satisfied with their daily work with the tool, which manages all the documents created during project development and also allows them to collaborate, thanks to the wiki features.
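As a rough, hedged illustration of the approach described above (formalising QMS concepts as a semantic web ontology before putting them into play in a semantics-aware tool), the Python sketch below declares a few hypothetical QMS classes and one project instance with rdflib; the vocabulary is invented and is not the published GRIHO QMS ontology.

```python
# Minimal sketch: a few hypothetical QMS concepts encoded as an RDF/RDFS ontology.
# Class and property names are invented for illustration; they are not the
# vocabulary of the published QMS ontology.
from rdflib import Graph, Namespace, Literal, RDF, RDFS

QMS = Namespace("http://example.org/qms#")  # hypothetical namespace

g = Graph()
g.bind("qms", QMS)

# Core concepts of a traditional QMS, modelled as RDFS classes.
for cls in ("QualityProcedure", "Project", "Deliverable", "AuditRecord"):
    g.add((QMS[cls], RDF.type, RDFS.Class))

# A property linking projects to the quality procedures they must follow.
g.add((QMS.followsProcedure, RDF.type, RDF.Property))
g.add((QMS.followsProcedure, RDFS.domain, QMS.Project))
g.add((QMS.followsProcedure, RDFS.range, QMS.QualityProcedure))

# One example project instance, as it might later be edited in a semantic wiki.
g.add((QMS.ProjectX, RDF.type, QMS.Project))
g.add((QMS.ProjectX, RDFS.label, Literal("Example usability project")))
g.add((QMS.ProjectX, QMS.followsProcedure, QMS.DocumentReviewProcedure))

print(g.serialize(format="turtle"))  # rdflib >= 6 returns a str
```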
Abstract:
This export plan, projected over a three-year horizon, will serve ITAC IT APPLICATIONS CONSULTING S.A. to direct its activities in the international market for the years 2009, 2010 and 2011. The priority of the first two years will be the internal improvement of the company, through strategies in different areas such as human capital, intellectual capital, cultural capital, economic growth, commercial strategy in the international arena, and the building of financial capital for income generation. In order to participate in international markets, demonstrate its export potential and meet sales growth expectations independent of those obtained in the local market, the company intends to start in 2009 in the Peruvian market, with exports of USD 36,000 corresponding to 30 units, increasing to USD 72,000 and 60 units in 2010 and to USD 108,000 and 90 units in 2011. The service to be exported is "SecureFile", for which success factors were defined, namely the competitive advantages of the product itself, listed below: 1) a very competitive market price, 2) automation of the information exchange process, 3) standards-based software, 4) it runs on any operating system. Consultancies were also carried out in which all areas of the company were assessed, yielding several findings. The organisational structure is well defined, but owing to the company's growth and the need to bring in new staff, the functions within the organisational chart are not clear and depend entirely on general management; management should therefore give the commercial departments a better structure, creating new positions in line with the internationalisation process. Personnel policies are handled informally, with valid criteria for promoting employees (merit, seniority, etc.), monthly technology updates, recognition of and participation by staff in the company, excellent personal relationships that allow performance evaluations in line with goals, a wide range of motivation initiatives, and social responsibility aimed at underprivileged children; nevertheless, a human resources area should be created and the frequency of training defined. Revenue comes from the provision of IT services, with an increase of 256% over the previous three years, reaching COP 2,032,784,683 in 2007. The level of indebtedness has also been rising, owing to the need for installed capacity, staff hiring, compliance with market requirements and the need to build a good credit standing with financial institutions. The company has the financial muscle to back its immediate obligations, with $4.42 for every $1 committed in 2007, even though that was the year with the highest level of indebtedness, with current liabilities of COP 127,715,281.37. The four partners obtained returns on investment before taxes of 164.67% (2006) and 132.97% (2007). This year, more than 95% of the company's financial and accounting information is handled in systematised form. The financial area is not the company's weakest, but there is no finance department with a single person in charge; an area separate from administration should therefore be set up, with a financial advisor available full time.
In the particular case of the export project, production costs centre on SecureFile version 3.0, which involves no marginal costs, since the software can be replicated as many times as required without affecting costs in any proportion. The company does not use a formal method to calculate its operating and program development costs, but it has developed a cost evaluation system in Excel spreadsheets that, in an organised way, yields costing suited to its specific needs. For the selection of the target, alternate and contingent countries, a selection matrix of six countries was built, based on governmental requirements regarding information security over the internet, on the perception of business people, on competition and on other economic factors; the result was Peru, Costa Rica and Mexico.
Abstract:
A great challenge of Component-Based Development is the creation of mechanisms that make it easier to find reusable assets that fulfill the requirements of a particular system under development. In this sense, some component repositories have been proposed in order to meet such a need. However, repositories need to represent the asset characteristics that consumers can take into account when choosing the most adequate assets for their needs. In this context, the literature presents some models proposed to describe asset characteristics such as identification, classification, non-functional requirements, usage and deployment information, and component interfaces. Nevertheless, the set of characteristics represented by those models is insufficient to describe information used before, during and after asset acquisition. This information refers to negotiation, certification, change history, the adopted development process, events, exceptions and so on. In order to overcome this gap, this work proposes an XML-based model to represent several characteristics, of different asset types, that may be employed in component-based development. Besides representing metadata used by consumers, useful for asset discovery, acquisition and usage, this model, called X-ARM, also focuses on supporting the activities of asset developers. Since the proposed model represents an expressive amount of information, this work also presents a tool called X-Packager, developed with the goal of helping describe assets with X-ARM.
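Purely to illustrate what an XML-based asset description of this kind might look like, the following Python sketch emits a small descriptor using the standard xml.etree module; all element and attribute names are hypothetical and are not taken from the actual X-ARM specification.

```python
# Illustrative sketch only: a tiny XML asset descriptor in the spirit of X-ARM.
# Element and attribute names are hypothetical, not the real X-ARM schema.
import xml.etree.ElementTree as ET

asset = ET.Element("asset", id="br.example.logging-component", version="1.2.0")

# Identification and classification metadata used when searching a repository.
ident = ET.SubElement(asset, "identification")
ET.SubElement(ident, "name").text = "LoggingComponent"
ET.SubElement(ident, "classification").text = "infrastructure/logging"

# Information consulted before and during acquisition: certification and negotiation.
ET.SubElement(asset, "certification", authority="ExampleCertifier", level="2")
ET.SubElement(asset, "negotiation", license="commercial", evaluationPeriodDays="30")

# Usage and deployment information, plus a change-history entry.
ET.SubElement(asset, "interface", name="ILogger", kind="provided")
history = ET.SubElement(asset, "changeHistory")
ET.SubElement(history, "change", version="1.2.0").text = "Added asynchronous logging."

ET.indent(asset)  # pretty-print (Python 3.9+)
print(ET.tostring(asset, encoding="unicode"))
```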
Abstract:
The research analyzed the adoption process of the "Leadership in Energy and Environmental Design" (LEED) green certification among the hotel-sector establishments that had already adopted it. To that end, a bibliographical review was conducted, secondary data were gathered from journals, institutional websites and documentaries, and primary data were collected through semi-structured interviews carried out with people responsible for the certified hotels and for the entity responsible for the certification in Brazil (Green Building Council Brazil). There were 21 interviewees: 2 from GBC Brazil and 19 from lodging facilities (31% of those certified). For data analysis, the content analysis technique was used with the aid of the ATLAS.ti software. The results made it possible to identify the chronology of the certification processes and the profile of the hotel categories that adopt the LEED program. Beyond that, the interviews enabled a discussion of the initial motivations for seeking the certification, as well as of the advantages and obstacles perceived regarding its adoption.
Abstract:
To what extent is “software engineering” really “engineering” as this term is commonly understood? A hallmark of the products of the traditional engineering disciplines is trustworthiness based on dependability. But in his keynote presentation at ICSE 2006, Barry Boehm pointed out that the dependency of individuals, systems, and peoples on software is becoming increasingly critical, yet dependability is generally not the top priority for software-intensive system producers. Continuing in an uncharacteristically pessimistic vein, Professor Boehm said that this situation will likely continue until a major software-induced system catastrophe, similar in impact to the 9/11 World Trade Center catastrophe, stimulates action toward establishing accountability for software dependability. He predicts that it is highly likely that such a software-induced catastrophe will occur between now and 2025. It is widely understood that software, i.e., computer programs, is intrinsically different from traditionally engineered products, but in one respect they are identical: the extent to which the well-being of individuals, organizations, and society in general increasingly depends on software. As wardens of the future through our mentoring of the next generation of software developers, we believe that it is our responsibility to at least address Professor Boehm’s predicted catastrophe. Traditional engineering has addressed, and continually addresses, its social responsibility through the evolution of the education, practice, and professional certification/licensing of professional engineers. To be included in the fraternity of professional engineers, software engineering must do the same. To get a rough idea of where software engineering currently stands on some of these issues, we conducted two surveys. Our main survey was sent to software engineering academics in the U.S., Canada, and Australia. Among other items, it sought detailed information on their software engineering programs. Our auxiliary survey was sent to U.S. engineering institutions to get some idea of how software engineering programs compare with those in the established engineering disciplines of Civil, Electrical, and Mechanical Engineering. Summaries of our findings can be found in the last two sections of our paper.
Abstract:
Proof-carrying code is a general methodology for certifying that the execution of untrusted mobile code is safe, according to a predefined safety policy. The basic idea is that the code supplier attaches a certificate (or proof) to the mobile code which the consumer then checks in order to ensure that the code is indeed safe. The potential benefit is that the consumer's task is reduced from the level of proving to the level of checking, a much simpler task. Recently, the abstract interpretation techniques developed in logic programming have been proposed as a basis for proof-carrying code [1]. To this end, the certificate is generated from an abstract interpretation-based proof of safety. Intuitively, the verification condition is extracted from a set of assertions guaranteeing safety and the answer table generated during the analysis. Given this information, it is relatively simple and fast to verify that the code does meet this proof and so its execution is safe. This extended abstract reports on experiments which illustrate several issues involved in abstract interpretation-based code certification. First, we describe the implementation of our system in the context of CiaoPP: the preprocessor of the Ciao multi-paradigm (constraint) logic programming system. Then, by means of some experiments, we show how code certification is aided in the implementation of the framework. Finally, we discuss the application of our method within the area of pervasive systems, which may lack the necessary computing resources to verify safety on their own. We herein illustrate the relevance of the information inferred by existing cost analysis to control resource usage in this context. Moreover, since the (rather complex) analysis phase is replaced by a simpler, efficient checking process on the code consumer side, we believe that our abstract interpretation-based approach to proof-carrying code becomes practically applicable to this kind of system.
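As a hedged, much-simplified illustration of the supplier/consumer split just described, the Python sketch below has the supplier run an iterative analysis to a fixpoint (the answer table, shipped as the certificate) and the consumer perform a single checking pass that verifies the table is indeed a (post-)fixpoint and entails a safety assertion. The two-value abstract domain and the program representation are invented for illustration and bear no relation to the actual CiaoPP machinery.

```python
# Minimal sketch of abstract interpretation-based proof-carrying code.
# The "program" maps each procedure to its callees and a transfer function over
# a toy abstract domain {"safe", "unknown"}; everything here is a simplification.

DOMAIN_ORDER = {"safe": 0, "unknown": 1}          # safe is below unknown

def leq(a, b):
    return DOMAIN_ORDER[a] <= DOMAIN_ORDER[b]

def join(a, b):
    return a if DOMAIN_ORDER[a] >= DOMAIN_ORDER[b] else b

# Supplier side: run the (expensive) analysis to a fixpoint -> the certificate.
def analyse(program, entry_value):
    table = {p: "safe" for p in program}
    changed = True
    while changed:                                 # iterate until fixpoint
        changed = False
        for proc, (callees, transfer) in program.items():
            inputs = [table[c] for c in callees] or [entry_value]
            new = transfer(join(*inputs) if len(inputs) > 1 else inputs[0])
            if new != table[proc]:
                table[proc], changed = new, True
    return table                                   # shipped as the certificate

# Consumer side: a single, cheap pass that re-checks the certificate.
def check(program, certificate, entry_value, safety_assertion):
    for proc, (callees, transfer) in program.items():
        inputs = [certificate[c] for c in callees] or [entry_value]
        recomputed = transfer(join(*inputs) if len(inputs) > 1 else inputs[0])
        if not leq(recomputed, certificate[proc]):   # not a (post-)fixpoint
            return False
        if not safety_assertion(proc, certificate[proc]):
            return False
    return True

# Toy program: main calls helper; both keep values safe.
program = {
    "helper": ([], lambda v: v),
    "main": (["helper"], lambda v: v),
}
cert = analyse(program, "safe")
print(check(program, cert, "safe", lambda proc, v: v == "safe"))   # True
```

The point of the split is visible in the shape of the two functions: analyse iterates until stabilisation, whereas check makes a single pass over the certificate.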
Abstract:
Resource Analysis (a.k.a. Cost Analysis) tries to approximate the cost of executing programs as functions of their input data sizes, without actually having to execute the programs. While a powerful resource analysis framework for object-oriented programs existed before this thesis, advanced aspects to improve the efficiency, the accuracy and the reliability of the analysis results still need to be further investigated. This thesis tackles this need from the following four different perspectives. (1) Shared mutable data structures are the bane of formal reasoning and static analysis. Analyses which keep track of heap-allocated data are referred to as heap-sensitive. Recent work proposes locality conditions for soundly tracking field accesses by means of ghost non-heap-allocated variables. In this thesis we present two extensions to this approach: the first extension is to consider array accesses (in addition to object fields), while the second extension focuses on handling cases for which the locality conditions cannot be proven unconditionally, by finding aliasing preconditions under which tracking such heap locations is feasible. (2) The aim of incremental analysis is, given a program, its analysis results and a series of changes to the program, to obtain the new analysis results as efficiently as possible and, ideally, without having to (re-)analyze fragments of code that are not affected by the changes. During software development, programs are permanently modified, but most analyzers still read and analyze the entire program at once in a non-incremental way. This thesis presents an incremental resource usage analysis which, after a change in the program is made, is able to reconstruct the upper bounds of all affected methods in an incremental way. To this purpose, we propose (i) a multi-domain incremental fixed-point algorithm which can be used by all global analyses required to infer the cost, and (ii) a novel form of cost summaries that allows us to incrementally reconstruct only those components of cost functions affected by the change. (3) Resource guarantees that are automatically inferred by static analysis tools are generally not considered completely trustworthy, unless the tool implementation or the results are formally verified. Performing full-blown verification of such tools is a daunting task, since they are large and complex. In this thesis we focus on the development of a formal framework for the verification of the resource guarantees obtained by the analyzers, instead of verifying the tools. We have implemented this idea using COSTA, a state-of-the-art cost analyzer for Java programs, and KeY, a state-of-the-art verification tool for Java source code. COSTA is able to derive upper bounds for Java programs, while KeY proves the validity of these bounds and provides a certificate. The main contribution of our work is to show that the proposed cooperation between the tools can be used to automatically produce verified resource guarantees. (4) Distribution and concurrency are today mainstream. Concurrent objects form a well-established model for distributed concurrent systems. In this model, objects are the concurrency units that communicate via asynchronous method calls. Distribution suggests that the analysis must infer the cost of the diverse distributed components separately. In this thesis we propose a novel object-sensitive cost analysis which, by using the results gathered by a points-to analysis, can keep the cost of the diverse distributed components separate.
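Purely as a hedged sketch of the incremental idea in point (2) above, the following Python fragment keeps one cost summary per method, uses a reverse call graph to find the methods a change can affect, and recomputes only those. The toy cost model (each method costs 1 plus the cost of its callees, acyclic call graph assumed) and all names are invented for illustration; the actual analysis works with recurrence relations over multiple abstract domains.

```python
# Simplified sketch of incremental cost recomputation over a call graph.
# The "cost" of a method is 1 plus the cost of its callees (acyclic graph
# assumed); real cost analysis uses recurrence relations and several domains.

def affected_methods(call_graph, changed):
    """All methods that transitively call a changed method, plus the changes themselves."""
    callers = {m: set() for m in call_graph}
    for m, callees in call_graph.items():
        for c in callees:
            callers[c].add(m)
    affected, frontier = set(changed), list(changed)
    while frontier:
        for caller in callers[frontier.pop()]:
            if caller not in affected:
                affected.add(caller)
                frontier.append(caller)
    return affected

def recompute(call_graph, summaries, affected):
    """Recompute cost summaries only for affected methods, reusing the rest."""
    def cost(m):
        if m not in affected and m in summaries:
            return summaries[m]                  # reuse the previous summary
        summaries[m] = 1 + sum(cost(c) for c in call_graph[m])
        return summaries[m]
    for m in affected:
        cost(m)
    return summaries

call_graph = {"main": ["parse", "solve"], "parse": [], "solve": ["parse"]}
summaries = recompute(call_graph, {}, set(call_graph))      # initial analysis
# A change in "parse" forces "parse", "solve" and "main" to be redone here, but
# in a larger graph unrelated methods would keep their previous summaries.
summaries = recompute(call_graph, summaries, affected_methods(call_graph, {"parse"}))
print(summaries)
```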
Abstract:
The Safety Certification of Software-Intensive Systems with Reusable Components project, in short SafeCer (www.safecer.eu), is targeting increased efficiency and reduced time-to-market through composable safety certification of safety-relevant embedded systems. The industrial domains targeted are automotive and construction equipment, avionics, and rail. The companies involved include Volvo Technology, Thales, TTTech, and Intecs, among others. SafeCer includes more than 30 partners in six different countries and has a budget of €25.7 million. A primary objective is to provide support for system safety arguments based on the arguments and properties of system components, as well as to provide support for the generation of corresponding evidence in a similarly compositional way. By supporting efficient reuse of certification and stronger links between certification and development, component reuse will be facilitated, and by supporting reuse across domains, the number of components available for reuse will increase dramatically. The resulting efficiency and reduced time to market will, together with increased quality and reduced risk, increase competitiveness and pave the way for a cross-domain market for software components qualified for certification.
Abstract:
This article aimed to compare the accuracy of the linear measurement tools of different commercial software packages. Eight fully edentulous dry mandibles were selected for this study. The incisor, canine, premolar, first molar and second molar regions were selected. Cone beam computed tomography (CBCT) images were obtained with the i-CAT Next Generation. Linear bone measurements were performed by one observer on the cross-sectional images using three different software packages: XoranCat®, OnDemand3D® and KDIS3D®, all able to assess DICOM images. In addition, 25% of the sample was reevaluated for the purpose of reproducibility. The mandibles were sectioned to obtain the gold standard for each region. Intraclass correlation coefficients (ICC) were calculated to examine the agreement between the two periods of evaluation; one-way analysis of variance with the post-hoc Dunnett test was used to compare each of the software-derived measurements with the gold standard. The ICC values were excellent for all software packages. The smallest differences between the software-derived measurements and the gold standard were obtained with OnDemand3D and KDIS3D (-0.11 and -0.14 mm, respectively), and the largest with XoranCat (+0.25 mm). However, there was no statistically significant difference between the measurements obtained with the different software packages and the gold standard (p > 0.05). In conclusion, linear bone measurements were not influenced by the software package used to reconstruct the image from CBCT DICOM data.
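As a hedged sketch of the statistical comparison described in this abstract (one-way ANOVA with a post-hoc Dunnett test of each software package against the gold standard), the following Python code uses SciPy; all measurement values are fabricated placeholders, and scipy.stats.dunnett requires SciPy 1.11 or later.

```python
# Sketch of the statistical comparison described in the abstract: one-way ANOVA
# followed by Dunnett's test of each software package against the gold standard.
# All measurement values below are fabricated placeholders, not study data.
import numpy as np
from scipy import stats

gold_standard = np.array([12.1, 10.4, 15.3, 9.8, 11.2, 13.0, 14.1, 10.9])
xorancat   = gold_standard + 0.25 + np.random.default_rng(0).normal(0, 0.1, 8)
ondemand3d = gold_standard - 0.11 + np.random.default_rng(1).normal(0, 0.1, 8)
kdis3d     = gold_standard - 0.14 + np.random.default_rng(2).normal(0, 0.1, 8)

# One-way ANOVA across all groups.
f_stat, p_anova = stats.f_oneway(gold_standard, xorancat, ondemand3d, kdis3d)
print(f"ANOVA: F = {f_stat:.3f}, p = {p_anova:.3f}")

# Dunnett's test: each software package compared with the gold standard (control).
result = stats.dunnett(xorancat, ondemand3d, kdis3d, control=gold_standard)
for name, p in zip(["XoranCat", "OnDemand3D", "KDIS3D"], result.pvalue):
    print(f"{name} vs gold standard: p = {p:.3f}")
```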
Abstract:
This paper presents SMarty, a variability management approach for UML-based software product lines (PL). SMarty is supported by a UML profile, the SMartyProfile, and a process for managing variabilities, the SMartyProcess. SMartyProfile aims at representing variabilities, variation points, and variants in UML models by applying a set of stereotypes. SMartyProcess consists of a set of activities that is systematically executed to trace, identify, and control variabilities in a PL based on SMarty. It also identifies variability implementation mechanisms and analyzes specific product configurations. In addition, a more comprehensive application of SMarty is presented using SEI's Arcade Game Maker PL. An evaluation of SMarty and related work are discussed.
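As a loose, hedged illustration of the kind of information such variability stereotypes capture (variation points offering variants that a product configuration resolves), here is a small Python sketch; the variation points and variant names are invented and do not come from SMarty or the Arcade Game Maker PL.

```python
# Illustrative sketch of variability resolution in a product line: variation
# points offer variants, and a product configuration picks among them.
# Names are invented; this is not the SMarty profile or the Arcade Game Maker PL.
from dataclasses import dataclass, field

@dataclass
class VariationPoint:
    name: str
    variants: list[str]
    mandatory: bool = True          # must be resolved in every product

@dataclass
class ProductConfiguration:
    name: str
    choices: dict[str, str] = field(default_factory=dict)

    def resolve(self, vp: VariationPoint, variant: str) -> None:
        if variant not in vp.variants:
            raise ValueError(f"{variant!r} is not a variant of {vp.name}")
        self.choices[vp.name] = variant

    def is_complete(self, variation_points: list[VariationPoint]) -> bool:
        return all(vp.name in self.choices for vp in variation_points if vp.mandatory)

# Two hypothetical variation points and one product configuration.
movement = VariationPoint("MovementControl", ["keyboard", "mouse", "gamepad"])
saving = VariationPoint("SaveGame", ["local", "cloud"], mandatory=False)

arcade_basic = ProductConfiguration("BasicEdition")
arcade_basic.resolve(movement, "keyboard")
print(arcade_basic.is_complete([movement, saving]))   # True: only MovementControl is mandatory
```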