871 results for Requirements specifications


Relevance:

100.00%

Publisher:

Abstract:

Developing a desirable framework for handling inconsistencies in software requirements specifications is a challenging problem. It has been widely recognized that the relative priority of requirements can help developers make the trade-off decisions needed to resolve conflicts. However, in most distributed development settings, such as viewpoints-based approaches, different stakeholders may assign different levels of priority to the same shared requirements statement from their own perspectives. This disagreement among the local priority levels assigned to the same shared requirements statement often puts developers in a dilemma during inconsistency handling. The main contribution of this paper is a prioritized merging-based framework for handling inconsistency in distributed software requirements specifications. Given a set of distributed, inconsistent requirements collections with local prioritizations, we first construct a requirements specification with a prioritization from an overall perspective. We provide two approaches to constructing a requirements specification with a global prioritization: a merging-based construction and a priority vector-based construction. Following this, we derive proposals for handling inconsistencies from the globally prioritized requirements specification in terms of prioritized merging. Moreover, from the overall perspective, these proposals may be viewed as the most appropriate ways to modify the given inconsistent requirements specification in the sense of the ordering relation over all consistent subsets of the specification. Finally, we consider applying negotiation-based techniques to viewpoints so as to identify an acceptable common proposal among these proposals.
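For illustration, the sketch below shows one way a priority vector-based global prioritization and a prioritized proposal could be computed; the aggregation by averaging, the conflict set, and all names are assumptions for the example, not the constructions defined in the paper.

# A minimal, illustrative sketch assuming each stakeholder assigns an integer
# priority (1 = highest) to shared requirement statements.
from statistics import mean

# Hypothetical local prioritizations from three viewpoints.
local_priorities = {
    "viewpoint_A": {"R1": 1, "R2": 3, "R3": 2},
    "viewpoint_B": {"R1": 2, "R2": 1, "R3": 2},
    "viewpoint_C": {"R1": 1, "R2": 2, "R3": 3},
}

def global_prioritization(local):
    """Aggregate local priority vectors into a single global ranking."""
    requirements = {r for prios in local.values() for r in prios}
    merged = {r: mean(prios[r] for prios in local.values() if r in prios)
              for r in requirements}
    # Lower aggregated value = higher global priority.
    return sorted(merged, key=merged.get)

def propose_resolution(ordered_reqs, conflicts):
    """Keep requirements in global priority order, dropping any that conflict
    with an already accepted (higher-priority) requirement."""
    accepted = []
    for r in ordered_reqs:
        if all((r, a) not in conflicts and (a, r) not in conflicts for a in accepted):
            accepted.append(r)
    return accepted

# Hypothetical conflict: R2 and R3 cannot both be satisfied.
print(propose_resolution(global_prioritization(local_priorities), {("R2", "R3")}))  # -> ['R1', 'R2']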

Relevance:

70.00%

Publisher:

Abstract:

Requirements-aware systems address the need to reason about uncertainty at runtime to support adaptation decisions, by representing quality of service (QoS) requirements for service-based systems (SBS) with precise values in a run-time queryable specification model. However, current approaches do not support updating the specification to reflect changes in the service market, such as newly available services or improved QoS of existing ones. Thus, even if the specification models reflect requirements that were acceptable at design time, they may become obsolete and miss opportunities for system improvement through self-adaptation. This article proposes distinguishing "abstract" and "concrete" specification models: the former consists of linguistic variables (e.g. "fast") agreed upon at design time, and the latter consists of precise numeric values (e.g. "2ms") that are dynamically calculated at run-time, thus incorporating up-to-date QoS information. If and when freshly calculated concrete specifications are no longer satisfied by the current service configuration, an adaptation is triggered. The approach was validated using four simulated SBS that use services from a previously published, real-world dataset; in all cases, the system was able to detect unsatisfied requirements at run-time and trigger suitable adaptations. Ongoing work focuses on policies for determining when specifications should be recalculated. This approach will allow engineers to build SBS that are protected against market-caused obsolescence of their requirements specifications. © 2012 IEEE.
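As a rough illustration of the abstract/concrete split, the following sketch recomputes a concrete threshold for a linguistic variable from fresh market observations and flags the need for adaptation; the quantile-based mapping and all names are assumptions, not the authors' calculation.

import statistics

def concretize(abstract_level, observed_response_times_ms):
    """Translate a linguistic variable into a numeric threshold using
    the current distribution of response times in the service market."""
    quartiles = statistics.quantiles(sorted(observed_response_times_ms), n=4)
    mapping = {"fast": quartiles[0], "medium": quartiles[1], "slow": quartiles[2]}
    return mapping[abstract_level]

def needs_adaptation(current_service_latency_ms, abstract_level, market_sample):
    threshold = concretize(abstract_level, market_sample)
    return current_service_latency_ms > threshold

# Example: the market improved, so "fast" now means a tighter bound.
market = [2.0, 2.5, 3.0, 4.0, 5.0, 8.0, 12.0, 20.0]
print(needs_adaptation(current_service_latency_ms=4.5,
                       abstract_level="fast",
                       market_sample=market))  # True -> trigger adaptation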

Relevance:

60.00%

Publisher:

Abstract:

Distributed systems are one of the most vital components of the economy. The most prominent example is probably the internet, a constituent element of our knowledge society. During the recent years, the number of novel network types has steadily increased. Amongst others, sensor networks, distributed systems composed of tiny computational devices with scarce resources, have emerged. The further development and heterogeneous connection of such systems imposes new requirements on the software development process. Mobile and wireless networks, for instance, have to organize themselves autonomously and must be able to react to changes in the environment and to failing nodes alike. Researching new approaches for the design of distributed algorithms may lead to methods with which these requirements can be met efficiently. In this thesis, one such method is developed, tested, and discussed in respect of its practical utility. Our new design approach for distributed algorithms is based on Genetic Programming, a member of the family of evolutionary algorithms. Evolutionary algorithms are metaheuristic optimization methods which copy principles from natural evolution. They use a population of solution candidates which they try to refine step by step in order to attain optimal values for predefined objective functions. The synthesis of an algorithm with our approach starts with an analysis step in which the wanted global behavior of the distributed system is specified. From this specification, objective functions are derived which steer a Genetic Programming process where the solution candidates are distributed programs. The objective functions rate how close these programs approximate the goal behavior in multiple randomized network simulations. The evolutionary process step by step selects the most promising solution candidates and modifies and combines them with mutation and crossover operators. This way, a description of the global behavior of a distributed system is translated automatically to programs which, if executed locally on the nodes of the system, exhibit this behavior. In our work, we test six different ways for representing distributed programs, comprising adaptations and extensions of well-known Genetic Programming methods (SGP, eSGP, and LGP), one bio-inspired approach (Fraglets), and two new program representations called Rule-based Genetic Programming (RBGP, eRBGP) designed by us. We breed programs in these representations for three well-known example problems in distributed systems: election algorithms, the distributed mutual exclusion at a critical section, and the distributed computation of the greatest common divisor of a set of numbers. Synthesizing distributed programs the evolutionary way does not necessarily lead to the envisaged results. In a detailed analysis, we discuss the problematic features which make this form of Genetic Programming particularly hard. The two Rule-based Genetic Programming approaches have been developed especially in order to mitigate these difficulties. In our experiments, at least one of them (eRBGP) turned out to be a very efficient approach and in most cases, was superior to the other representations.
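The sketch below is a deliberately tiny stand-in for this process: Genetic Programming over expression trees evolves a local update rule that, applied on every node during random gossip, should spread the network-wide maximum value (a toy proxy for an election algorithm). The primitives, fitness function, and parameters are illustrative assumptions rather than the program representations studied in the thesis.

import random

PRIMS = ["max", "min"]          # inner nodes of the expression tree
TERMS = ["own", "recv"]         # leaves: the node's value / the received value

def random_tree(depth=2):
    if depth == 0 or random.random() < 0.3:
        return random.choice(TERMS)
    return (random.choice(PRIMS), random_tree(depth - 1), random_tree(depth - 1))

def evaluate(tree, own, recv):
    """Execute a candidate rule locally on one node."""
    if tree == "own":
        return own
    if tree == "recv":
        return recv
    op, left, right = tree
    l, r = evaluate(left, own, recv), evaluate(right, own, recv)
    return max(l, r) if op == "max" else min(l, r)

def fitness(tree, simulations=5, nodes=8, rounds=30):
    """Average fraction of nodes that end up holding the global maximum."""
    score = 0.0
    for _ in range(simulations):                      # randomized network simulations
        values = [random.randint(0, 100) for _ in range(nodes)]
        target = max(values)
        for _ in range(rounds):                       # random pairwise gossip
            a, b = random.sample(range(nodes), 2)
            values[a] = evaluate(tree, values[a], values[b])
        score += sum(v == target for v in values) / nodes
    return score / simulations

def mutate(tree, rate=0.3):
    if isinstance(tree, str) or random.random() < rate:
        return random_tree()
    op, left, right = tree
    return (op, mutate(left, rate), mutate(right, rate))

def crossover(t1, t2):
    """Crude crossover: graft t2 in as the right subtree of t1."""
    return t2 if isinstance(t1, str) else (t1[0], t1[1], t2)

population = [random_tree() for _ in range(20)]
for _ in range(15):
    population.sort(key=fitness, reverse=True)        # select the most promising candidates
    parents = population[:10]
    population = parents + [mutate(crossover(random.choice(parents), random.choice(parents)))
                            for _ in range(10)]
best = max(population, key=fitness)
print("best rule:", best, "fitness:", round(fitness(best), 2))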

Relevance:

60.00%

Publisher:

Abstract:

Problems related to the scattering and tangling phenomenon, such as difficulties in system maintenance, occur increasingly often. One way to address this problem is to identify crosscutting concerns. To maximize its benefits, this identification must be performed from the early stages of the development process, but several studies report that in most cases it is not, leaving system development more prone to errors and to later refactoring. This situation directly affects the quality and cost of the system. PL-AOVgraph is a goal-oriented requirements modeling language that supports the representation of relationships among requirements and provides separation of crosscutting concerns through the representation of crosscutting relationships. This work therefore presents a semi-automatic method for identifying crosscutting concerns in requirements specifications written in PL-AOVgraph. An adjacency matrix is used to identify the contribution relationships among the elements. Crosscutting concerns are identified through a fan-out analysis of the contribution relationships recorded in the adjacency matrix. Once the concerns are identified, the corresponding crosscutting relationships are created. The method is implemented as a new module of the ReqSys-MDD tool.
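A minimal sketch of the fan-out idea follows, assuming a simple data model rather than the actual PL-AOVgraph/ReqSys-MDD implementation: contribution links are recorded in an adjacency matrix and sources whose fan-out exceeds a threshold are flagged as candidate crosscutting concerns.

contributions = [            # hypothetical (source, target) contribution links
    ("logging", "checkout"), ("logging", "search"), ("logging", "login"),
    ("payment", "checkout"), ("search", "catalog"),
]

elements = sorted({e for pair in contributions for e in pair})
index = {name: i for i, name in enumerate(elements)}

# Adjacency matrix: matrix[i][j] == 1 if element i contributes to element j.
matrix = [[0] * len(elements) for _ in elements]
for source, target in contributions:
    matrix[index[source]][index[target]] = 1

FAN_OUT_THRESHOLD = 2        # assumed cut-off for "crosscutting"
for name in elements:
    fan_out = sum(matrix[index[name]])
    if fan_out > FAN_OUT_THRESHOLD:
        print(f"{name}: fan-out {fan_out} -> candidate crosscutting concern")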

Relevance:

60.00%

Publisher:

Abstract:

The software industry has become increasingly concerned with the appropriate application of the activities that compose requirements engineering as a way to improve the quality of its products. To support these activities, several computational tools are available on the market, although some activities still lack adequate support. In this context, this paper proposes adding a requirements specification module to a tool called Requirements Elicitation Support Tool. This module allows requirements to be specified in accordance with the IEEE 830 standard, thus contributing to the documentation of the requirements established for a software system, while also supporting the learning of concepts related to requirements specification, which improves the skills of the tool's users. © 2012 IEEE.
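As an illustration of what such a module might manage, the sketch below structures a document along the IEEE 830 recommended SRS outline; the section names follow the standard, while the classes and fields are assumptions for the example.

from dataclasses import dataclass, field

@dataclass
class Requirement:
    identifier: str          # e.g. "FR-01"
    description: str
    priority: str = "medium"

@dataclass
class SRSDocument:
    title: str
    # Section 1 of the IEEE 830 outline: Introduction and its subsections.
    introduction: dict = field(default_factory=lambda: {
        "Purpose": "", "Scope": "", "Definitions": "", "References": "", "Overview": ""})
    overall_description: str = ""                              # Section 2
    specific_requirements: list[Requirement] = field(default_factory=list)  # Section 3

    def add_requirement(self, identifier, description, priority="medium"):
        self.specific_requirements.append(Requirement(identifier, description, priority))

srs = SRSDocument(title="Library System SRS")
srs.introduction["Purpose"] = "Describe the functional requirements of the library system."
srs.add_requirement("FR-01", "The system shall allow users to search the catalogue.", "high")
print(len(srs.specific_requirements), "requirement(s) recorded")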

Relevance:

60.00%

Publisher:

Abstract:

This final-year project, carried out by the telecommunications engineer Pedro M. Matamala Lucas, is the final development phase of a larger project: the SAVID forensic video software. The purpose of the project as a whole is to create a software tool capable of analysing video files coded and compressed with the DV (Digital Video) system. The aim of the analysis is to indicate whether the magnetic tape shows signs of having been edited after its original recording, and to show the user other relevant data such as the technical specifications of the video and audio signals. The user, a forensic video analyst, is thus given information that helps assess the originality of the content of the medium under analysis. The specific objective of this final phase is the creation of the software's user interface, which reports both the binary code of the significant sectors and its interpretation after analysis. It also allows the user to report the results and to navigate through the sectors of the code that were modified as a side effect of editing the original magnetic tape. Another important objective of the project was the investigation of software development methodologies and techniques for their subsequent application, seeking greater efficiency in time management and higher software quality in order to guarantee the product's evolution and maintainability in the future. Emphasis was placed on agile methodologies, which have gained relevance in the information technology sector over recent decades, replacing classical methodologies such as waterfall development. Their flexibility throughout the software life cycle yields better results when the specifications are not fully defined, adapting to the conditions of the project. Summarising the technical specifications: the software was developed in C++, an object-oriented programming language, using Microsoft Foundation Classes (MFC) for the implementation. It is an MFC dialog-based project, created, compiled and published with the Microsoft Visual Studio 2010 integrated development environment. The architecture follows the classic three-layer pattern: user interface, business layer and data access layer. The project had to be configured with CLR (Common Language Runtime) support in order to implement the reporting functionality. The application is accompanied by the project report and its annexes: the Detailed Functional Requirements Specification, the User Interface Specification, the Technical Design and the User Guide.

Relevance:

60.00%

Publisher:

Abstract:

In compliance with Directive 2002/49/EC as regards roads, strategic noise maps (SNM) of the major roads carrying more than six million vehicles per year (2006 data) were produced in Spain in 2007. The A-3 road, where it passes the Campus Sur of the U.P.M., falls into this category. According to the statement of technical requirements, these strategic noise maps are divided into phase A (basic study) and phase B (detailed study). The project entitled "Implementation of the Strategic Noise Map of the State Road Network (A-3, Campus Sur U.P.M. area)" presents the public-information results of phase A of the strategic noise map for the A-3 in the Community of Madrid. Taking those results into account, and given that the Campus Sur was not selected for phase B (the detailed study) of that map, the present project produces the phase B strategic map of the U.P.M. Campus Sur for the major road axis (A-3), in compliance with Directive 2002/49/EC, following the statement of technical requirements for the first round of deliveries in 2007 and using Geographic Information Systems (GIS). Based on the results obtained, it is assessed whether the area should in fact have been included in that phase of the study. After establishing the differences between a strategic noise map and a noise map, the project analyses the technical criteria that an acoustic consultant must choose when producing a strategic noise map and how these choices can affect the final result: all of them are valid, yet they create a lack of homogeneity between strategic noise maps produced by different authors. There was little prior experience in Spain with the methodology proposed by the Environmental Noise Directive; the actions being taken after the 2007 delivery to address this problem and to harmonise results across the whole network for the new deliveries, due every five years, are also described.

Relevance:

60.00%

Publisher:

Abstract:

Due to dynamic variability, identifying the specific conditions under which non-functional requirements (NFRs) are satisfied may only be possible at runtime. It is therefore necessary to consider the dynamic treatment of relevant information during requirements specification. The associated data can be gathered by monitoring the execution of the application and its underlying environment, to support reasoning about how the current application configuration is fulfilling the established requirements. This paper presents a dynamic decision-making infrastructure that supports both the representation and the monitoring of NFRs, and that reasons about the degree of satisfaction of NFRs at runtime. The infrastructure is composed of: (i) an extended feature model aligned with a domain-specific language for representing the NFRs to be monitored at runtime; (ii) a monitoring infrastructure to continuously assess NFRs at runtime; and (iii) a flexible decision-making process to select the best available configuration based on the satisfaction degree of the NFRs. The evaluation of the approach showed that it is able to choose application configurations that fit user NFRs well based on runtime information. The evaluation also revealed that the proposed infrastructure provides consistent indicators of the best application configurations that fit user NFRs. Finally, a benefit of our approach is that it allows the level of satisfaction with respect to the NFR specification to be quantified.
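The decision step can be pictured with the following sketch, which computes a satisfaction degree per monitored NFR for each available configuration and selects the highest weighted score; the thresholds, weights, and names are assumptions, not the paper's infrastructure.

nfr_targets = {                      # hypothetical monitored NFRs and thresholds
    "response_time_ms": {"target": 200.0, "weight": 0.6, "lower_is_better": True},
    "availability":     {"target": 0.99,  "weight": 0.4, "lower_is_better": False},
}

configurations = {                   # runtime measurements per available configuration
    "config_A": {"response_time_ms": 150.0, "availability": 0.995},
    "config_B": {"response_time_ms": 320.0, "availability": 0.999},
}

def satisfaction(measured, target, lower_is_better):
    """Degree in [0, 1]: 1 when the target is met, decreasing as it is missed."""
    ratio = target / measured if lower_is_better else measured / target
    return min(1.0, max(0.0, ratio))

def best_configuration(configs, targets):
    def score(name):
        return sum(spec["weight"] * satisfaction(configs[name][nfr],
                                                 spec["target"],
                                                 spec["lower_is_better"])
                   for nfr, spec in targets.items())
    return max(configs, key=score)

print(best_configuration(configurations, nfr_targets))  # -> config_A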

Relevance:

40.00%

Publisher:

Abstract:

The first part of this three-part review on the relevance of laboratory testing of composites and adhesives deals with approval requirements for composite materials. We compare the in vivo and in vitro literature data and discuss the relevance of in vitro analyses. The standardized ISO protocols are presented, with a focus on the evaluation of physical parameters. These tests all have a standardized protocol that describes the entire test set-up. The tests analyse flexural strength, depth of cure, susceptibility to ambient light, color stability, water sorption and solubility, and radiopacity. Some tests have a clinical correlation. A high flexural strength, for instance, decreases the risk of fractures of the marginal ridge in posterior restorations and incisal edge build-ups of restored anterior teeth. Other tests do not have a clinical correlation or the threshold values are too low, which results in an approval of materials that show inferior clinical properties (e.g., radiopacity). It is advantageous to know the test set-ups and the ideal threshold values to correctly interpret the material data. Overall, however, laboratory assessment alone cannot ensure the clinical success of a product.
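For context on the flexural strength test mentioned above, the value reported by a three-point bending set-up follows the classic beam formula; the specimen dimensions and load below are illustrative, not the ISO-mandated ones.

def flexural_strength_mpa(load_n, span_mm, width_mm, height_mm):
    # Classic three-point bending formula: sigma = 3 F L / (2 b h^2), in N/mm^2 (MPa).
    return 3 * load_n * span_mm / (2 * width_mm * height_mm ** 2)

# Example: a 35 N failure load on a 2 mm x 2 mm bar over a 20 mm span (illustrative values).
print(round(flexural_strength_mpa(35, 20, 2, 2), 1), "MPa")  # -> 131.2 MPa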

Relevance:

40.00%

Publisher:

Abstract:

Mode of access: Internet.

Relevance:

30.00%

Publisher:

Abstract:

Increasingly, national and international governments have a strong mandate to develop national e-health systems to enable the delivery of much-needed healthcare services. Research is therefore needed into appropriate security and reliance structures for the development of health information systems that must comply with governmental and similar obligations. Protecting e-health information security is critical to the successful implementation of any e-health initiative. To address this, this paper proposes a security architecture for index-based e-health environments, following the broad outline of Australia's National E-health Strategy and the National E-health Transition Authority (NEHTA)'s Connectivity Architecture. The proposal could, however, be equally applied to any distributed, index-based health information system that references disparate health information systems. The practicality of the proposed security architecture is supported by an experimental demonstration. The successful completion of this prototype demonstrates the comprehensibility of the proposed architecture, and the clarity and feasibility of the system specifications, in enabling ready development of such a system. The test vehicle has also indicated a number of parameters that need to be considered in any national index-based e-health system design with reasonable levels of system security. The paper also identifies the need to evaluate the levels of education, training, and expertise required to create such a system.
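As a generic illustration of the index-based idea (an assumption for this summary, not NEHTA's actual Connectivity Architecture), the sketch below resolves a patient identifier to references held in disparate systems and applies an access-control check before releasing them.

record_index = {   # hypothetical national index: patient id -> document references
    "IHI-1234": [
        {"system": "hospital_A", "document_id": "discharge-42"},
        {"system": "gp_clinic_B", "document_id": "summary-7"},
    ],
}

access_policy = {  # hypothetical role-based policy
    "treating_clinician": {"hospital_A", "gp_clinic_B"},
    "researcher": set(),
}

def lookup(patient_id, requester_role):
    """Return only the references the requester is authorised to see."""
    allowed_systems = access_policy.get(requester_role, set())
    return [ref for ref in record_index.get(patient_id, [])
            if ref["system"] in allowed_systems]

print(lookup("IHI-1234", "treating_clinician"))  # both references
print(lookup("IHI-1234", "researcher"))          # [] -> access denied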

Relevance:

30.00%

Publisher:

Abstract:

With the increasing popularity and adoption of building information modeling (BIM), the amount of digital information available about a building is overwhelming. Enormous challenges remain, however, in identifying meaningful and required information within a complex BIM model to support a particular construction management (CM) task. Detailed specifications of the information required by different construction domains, together with expressive and easy-to-use BIM reasoning mechanisms, are seen as an important means of addressing these challenges. This paper analyzes some of the characteristics and requirements of component-specific construction knowledge in relation to current work practice and BIM-based applications. It is argued that domain ontologies and information extraction approaches, such as queries, could bring much-needed support for knowledge sharing and for the integration of information between design, construction and facility management.
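A small sketch of query-based extraction follows, using an assumed flattened data model rather than an actual IFC/BIM API: a construction-management task pulls only the component-specific information it needs from the larger model.

bim_model = [   # hypothetical flattened view of model objects
    {"id": "W-01", "type": "Wall",   "material": "concrete", "fire_rating_min": 120, "level": "L1"},
    {"id": "W-02", "type": "Wall",   "material": "brick",    "fire_rating_min": 60,  "level": "L2"},
    {"id": "C-01", "type": "Column", "material": "steel",    "fire_rating_min": 90,  "level": "L1"},
]

def query(model, **criteria):
    """Return the objects whose properties match all given criteria."""
    return [obj for obj in model
            if all(obj.get(key) == value for key, value in criteria.items())]

# Task-specific query: all level-1 concrete walls, e.g. for a fire-protection check.
for wall in query(bim_model, type="Wall", level="L1", material="concrete"):
    print(wall["id"], wall["fire_rating_min"], "min fire rating")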

Relevance:

30.00%

Publisher:

Abstract:

The world of construction is changing, and so too are stakeholders' expectations regarding strategies for adapting existing resources (people, equipment and finances), processes and tools to the evolving needs of the industry. Building Information Modelling (BIM) is a data-rich, digital approach to representing the building information required for design and construction. BIM tools play a crucial role and are instrumental to current industry efforts to harness the power of a single information repository for improved project delivery and maintenance. Yet building specifications, which document information on material quality and workmanship requirements, remain distinctly separate from the model information typically represented in BIM models. BIM adoption for building design, construction and maintenance is an industry-wide strategy aimed at addressing such concerns about information fragmentation. However, to effectively reduce the inefficiencies caused by fragmentation, BIM models require the crucial building information contained in specifications. This paper profiles some specification tools that have been used in industry as a means of bridging the BIM–specifications divide. We analyse the distinction between current attempts at integrating BIM and specifications and our approach, which embeds rich specification information within objects in a product library as a method for improving the quality of information contained in BIM objects at various levels of model development.
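The sketch below illustrates, under assumed names, the idea of embedding specification clauses in product-library objects so that they travel with the instantiated BIM object rather than living in a separate document.

product_library = {
    "DoorSet-FD30": {
        "manufacturer": "ExampleCo",            # hypothetical product data
        "specification": {
            "material_quality": "solid core, FD30 fire-rated",
            "workmanship": "install to manufacturer's instructions; max 3 mm gap",
        },
    },
}

def instantiate_bim_object(product_code, instance_id, location):
    """Create a model object that carries its specification clauses with it."""
    product = product_library[product_code]
    return {"id": instance_id, "product": product_code, "location": location,
            "embedded_specification": product["specification"]}

door = instantiate_bim_object("DoorSet-FD30", "D-101", "Level 1 / Corridor A")
print(door["embedded_specification"]["workmanship"])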