14 results for: Software metrics; Software estimation; Embedded software; Function point; Lines of code
in Aston University Research Archive
Abstract:
Most parametric software cost estimation models used today evolved in the late 1970s and early 1980s. At that time, the dominant software development techniques in use were the early 'structured methods'. Since then, several new systems development paradigms and methods have emerged, one being Jackson Systems Development (JSD). Because current cost estimating methods do not take account of these developments, they lack universality and cannot provide adequate estimates of effort and hence cost. In order to address these shortcomings, two new estimation methods have been developed for JSD projects. One of these methods, JSD-FPA, is a top-down estimating method based on the existing Mk II function point method. The other, JSD-COCOMO, is a sizing technique which sizes a project, in terms of lines of code, from its process structure diagrams and thus provides an input to the traditional COCOMO method.

The JSD-FPA method allows JSD projects in both the real-time and scientific application areas to be costed, as well as the commercial information systems applications to which FPA is usually applied. The method is based upon a three-dimensional view of a system specification, as opposed to the largely data-oriented view traditionally used by FPA. It uses counts of various attributes of a JSD specification to develop a metric which provides an indication of the size of the system to be developed. This size metric is then transformed into an estimate of effort by calculating past project productivity and using this figure to predict the effort, and hence cost, of a future project. The effort estimates produced were validated by comparing them against the effort figures for six actual projects.

The JSD-COCOMO method uses counts of the levels in a process structure chart as the input to an empirically derived model which transforms them into an estimate of delivered source code instructions.
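As a rough illustration of the size-to-effort step this abstract describes, the sketch below combines attribute counts into a weighted size metric, calibrates productivity from past projects, and predicts effort. The attribute names, weights, and project figures are invented for illustration; they are not the thesis's actual JSD-FPA counting rules.

```python
# Hypothetical sketch of productivity-based effort estimation: combine
# specification attribute counts into a size metric, calibrate productivity
# on past projects, then predict effort. All data below are illustrative.

def jsd_fpa_size(counts, weights):
    """Combine counts of JSD specification attributes into a size metric."""
    return sum(weights[name] * n for name, n in counts.items())

def calibrate_productivity(past_projects):
    """Average productivity (size units per person-hour) over past projects."""
    return sum(p["size"] / p["effort_hours"] for p in past_projects) / len(past_projects)

def estimate_effort(size, productivity):
    """Transform a size metric into an effort estimate."""
    return size / productivity

# Illustrative data: three attribute kinds with made-up weights.
counts = {"entities": 12, "actions": 40, "process_connections": 9}
weights = {"entities": 4.0, "actions": 1.5, "process_connections": 2.5}
past = [{"size": 300.0, "effort_hours": 1200.0},
        {"size": 180.0, "effort_hours": 800.0}]

size = jsd_fpa_size(counts, weights)
print(f"effort ~ {estimate_effort(size, calibrate_productivity(past)):.0f} person-hours")
```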
Abstract:
The research described here concerns the development of metrics and models to support the development of hybrid (conventional/knowledge-based) integrated systems. The thesis argues from the position that, although it is well known that estimating the cost, duration and quality of information systems is a difficult task, it is far from clear what sorts of tools and techniques would adequately support a project manager in estimating these properties. A literature review shows that metrics (measurements) and estimating tools have been developed for conventional systems since the 1960s, while there has been very little research on metrics for knowledge-based systems (KBSs). Furthermore, although there are a number of theoretical problems with many of the 'classic' metrics developed for conventional systems, it also appears that the tools which such metrics can be used to develop are not widely used by project managers. A survey of large UK companies confirmed this continuing state of affairs. Before any useful tools could be developed, therefore, it was important to find out why project managers were not already using these tools. By characterising those companies that use software cost estimating (SCE) tools against those which could but do not, it was possible to recognise the involvement of the client/customer in the process of estimation. Pursuing this point, a model of the early estimating and planning stages (the EEPS model) was developed to test exactly where estimating takes place. The EEPS model suggests that estimating could take place either before a fully developed plan has been produced, or while this plan is being produced. If it were the former, then SCE tools would be particularly useful, since there is very little other data available from which to produce an estimate. A second survey, however, indicated that project managers see estimating as essentially the latter, at which point project management tools are available to support the process. It would seem, therefore, that SCE tools are not being used because project management tools are being used instead. The issue is not with the method of developing an estimating model or tool, but with the way in which "an estimate" is intimately tied to an understanding of what tasks are being planned. Current SCE tools are perceived by project managers as targeting the wrong point of estimation. A model (called TABATHA) is then presented which describes how an estimating tool based on an analysis of tasks would fit into the planning stage.

The issue of whether metrics can be usefully developed for hybrid systems (which also contain KBS components) is tested by extending a number of 'classic' program size and structure metrics to a KBS language, Prolog. Measurements of lines of code, Halstead's operators/operands, McCabe's cyclomatic complexity, Henry and Kafura's data-flow fan-in/out, and post-release reported errors were taken for a set of 80 commercially developed LPA Prolog programs. By redefining the metric counts for Prolog, it was found that estimates of program size and error-proneness comparable to the best conventional studies are possible. This suggests that metrics can be usefully applied to KBS languages such as Prolog, and thus that the development of metrics and models to support the development of hybrid information systems is both feasible and useful.
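The following sketch illustrates, in simplified form, what taking size and structure counts from Prolog source might look like. The counting rules here (non-comment lines, clause-terminating full stops, ';' disjunctions as decision points) are crude stand-ins for the thesis's redefined metric counts, not a reconstruction of them.

```python
# Simplified size/structure counts for Prolog source text. The rules are
# illustrative assumptions, not the thesis's actual redefinitions.

def prolog_metrics(source: str) -> dict:
    lines = [l for l in source.splitlines()
             if l.strip() and not l.strip().startswith("%")]
    loc = len(lines)                                # non-blank, non-comment lines
    clauses = sum(l.rstrip().endswith(".") for l in lines)  # crude clause count
    disjunctions = source.count(";")                # or-branches as decision points
    # McCabe-style figure: one path per clause plus one per disjunction.
    cyclomatic = clauses + disjunctions
    return {"loc": loc, "clauses": clauses, "cyclomatic": cyclomatic}

example = """
% naive list length
len([], 0).
len([_|T], N) :- len(T, M), N is M + 1.
"""
print(prolog_metrics(example))  # {'loc': 2, 'clauses': 2, 'cyclomatic': 2}
```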
Abstract:
Purpose - The main objective of the paper is to develop a risk management framework for software development projects from the developers' perspective.
Design/methodology/approach - This study uses a combined qualitative and quantitative technique, with the active involvement of stakeholders, to identify, analyze and respond to risks. The entire methodology is explained using a case study of a software development project in a public sector organization in Barbados.
Findings - An analytical approach to managing risk in software development ensures effective delivery of projects to clients.
Research limitations/implications - The proposed risk management framework has been applied to a single case.
Practical implications - Software development projects are characterized by technical complexity, market and financial uncertainties, and the availability of competent manpower. Successful project accomplishment therefore depends on addressing those issues throughout the project phases, and effective risk management ensures the success of projects.
Originality/value - There are several studies on managing risks in software development and information technology (IT) projects. Most identify and prioritize risks through empirical research in order to suggest mitigating measures. Although they are important to clients for future projects, these studies fail to provide any framework for risk management from the software developers' perspective. The few studies that do introduce a framework for risk management in software development mostly present it from the clients' perspective, and very little effort has been made to integrate it with the software development cycle. As software developers absorb a considerable amount of risk, an integrated framework for managing risks in software development from the developers' perspective is needed. © Emerald Group Publishing Limited.
Abstract:
The objective of this research is to design and build a groupware system which will allow members of a distributed group more flexibility in performing software inspection. Software inspection, which is part of non-execution-based testing in software development, is a group activity. The system aims to improve the acceptability of groupware and to improve software quality by providing a software inspection tool that is flexible and adaptable. It provides a flexible structure for software inspection meetings and extends the structure of the inspection meeting itself, allowing meetings to use all four quadrants of the space-time matrix: face-to-face, distributed synchronous, distributed asynchronous, and same place/different time. This opens up new working possibilities. The flexibility and adaptability of the system allow work to switch rapidly between synchronous and asynchronous interaction. A model for a flexible groupware system was developed, based on a review of the literature and on questionnaires. A prototype based on the model was built using Java and WWW technology. To test the effectiveness of the system, an evaluation was conducted; questionnaires were used to gather responses from the users. The evaluation ascertained that the model developed is flexible and adaptable to the different working modes, and that the system is capable of supporting several different models of the software inspection process.
Abstract:
There has been little research in health and safety management concerning the application of information technology to the field. This thesis attempts to stimulate interest in this area by analysing the value of proprietary health and safety software to proactive health and safety management. The thesis is based upon detailed software evaluations of seven pieces of proprietary health and safety software. It features a discussion of the development of information technology and health and safety management, a review of the key issues identified during the software evaluations, an analysis of the commercial market for this type of software, and a consideration of the broader issues which surround the use of this software. It also includes practical guidance for the evaluation, selection, implementation and maintenance of all health and safety management software, including a comprehensive software evaluation chart. The implications of the research are considered for proprietary health and safety software, for the application of information technology to health and safety management, and for future research.
Abstract:
We have attempted to bring together two areas which are challenging for both IS research and practice: forms of coordination and management of knowledge in the context of global, virtual software development projects. We developed a more comprehensive, knowledge-based model of how coordination can be achieved, and illustrated the heuristic and explanatory power of the model when applied to global software projects experiencing different degrees of success. We first reviewed the literature on coordination and determined what is known about coordination of knowledge in global software projects. From this we developed a new, distinctive knowledge-based model of coordination, which was then employed to analyze two case studies of global software projects, at SAP and Baan, to illustrate the utility of the model.
Abstract:
Higher education institutions are increasingly using social software tools to support teaching and learning. Despite the fact that social software is often used in a social context, these applications can significantly contribute to the educational experience of a student. However, as the social software domain comprises a considerable diversity of tools, the respective tools can be expected to differ in the way they contribute to teaching and learning. In this review of the educational use of social software, we systematically analyze and compare the diverse social software tools and identify their contributions to teaching and learning. By integrating established learning theory with the extant literature on the individual social software applications, we seek to contribute to a theoretical foundation for social software use and for the choice of tools. Case vignettes from several UK higher education institutions are used to illustrate the different applications of social software tools in teaching and learning.
Abstract:
The focus of our work is the verification of tight functional properties of numerical programs, such as showing that a floating-point implementation of Riemann integration computes a close approximation of the exact integral. Programmers and engineers writing such programs will benefit from verification tools that support an expressive specification language and that are highly automated. Our work provides a new method for the verification of numerical software, supporting a substantially more expressive specification language than other publicly available automated tools. The additional expressivity is provided by two constructs. First, specifications can feature inclusions between interval arithmetic expressions. Second, the integral operator from classical analysis can be used in specifications, with integration bounds given by arbitrary expressions over real variables. To support our claim of expressivity, we outline the verification of four example programs, including the integration example mentioned earlier. A key component of our method is an algorithm for proving numerical theorems, based on automatic polynomial approximation of non-linear real and real-interval functions defined by expressions. The PolyPaver tool is our implementation of the algorithm; its source code is publicly available. In this paper we report on experiments using PolyPaver which indicate that the additional expressivity does not come at a performance cost when compared with other publicly available state-of-the-art provers. We also include a scalability study that explores the limits of PolyPaver in proving tight functional specifications of progressively larger randomly generated programs. © 2014 Springer International Publishing Switzerland.
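The toy code below (not PolyPaver) illustrates the two specification constructs the abstract names: inclusions between interval expressions, and an integral operator, here enclosed by interval-valued Riemann sums. The Interval class and the slice count are illustrative assumptions.

```python
# A toy model of interval-inclusion specifications and integral enclosures.
from dataclasses import dataclass

@dataclass(frozen=True)
class Interval:
    lo: float
    hi: float
    def __mul__(self, other):
        ps = [a * b for a in (self.lo, self.hi) for b in (other.lo, other.hi)]
        return Interval(min(ps), max(ps))
    def includes(self, other) -> bool:   # other is contained in self
        return self.lo <= other.lo and other.hi <= self.hi

def integral_enclosure(f, a, b, n=1000):
    """Enclose the integral of f over [a, b] by summing interval images of
    slices; valid when f maps an interval to an enclosure of its range,
    as interval-arithmetic expressions do."""
    h = (b - a) / n
    lo = hi = 0.0
    for i in range(n):
        y = f(Interval(a + i * h, a + (i + 1) * h))
        lo += y.lo * h
        hi += y.hi * h
    return Interval(lo, hi)

# Integral of x*x over [0, 1] is 1/3; check the enclosure and an inclusion spec.
enc = integral_enclosure(lambda x: x * x, 0.0, 1.0)
print(enc.lo <= 1 / 3 <= enc.hi)            # True
print(Interval(0.32, 0.35).includes(enc))   # True: enc lies inside [0.32, 0.35]
```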
Abstract:
As a discipline, supply chain management (SCM) has traditionally been concerned primarily with the procurement, processing, movement and sale of physical goods. However, an important class of products has emerged – digital products – which cannot be described as physical, as they do not obey commonly understood physical laws. They possess neither mass nor volume, and they require no energy in their manufacture or distribution. With the Internet, they can be distributed at speeds unimaginable in the physical world, and every copy produced is a 100% perfect duplicate of the original version. Furthermore, the ease with which digital products can be replicated has few analogues in the physical world. This paper assesses the effect of non-physicality on one such product – software – in relation to the practice of SCM. It explores the challenges that arise when managing the software supply chain and how practitioners are addressing these challenges. Using a two-pronged exploratory approach that combines a review of the literature on software management with direct interviews of software distribution practitioners, a number of key challenges associated with software supply chains are uncovered, along with responses to these challenges. This paper proposes a new model for software supply chains that takes into account the non-physicality of the product being delivered. Central to this model are the replacement of physical flows with flows of intellectual property, the growing importance of innovation over duplication, and the increased centrality of the customer in the entire process. Hybrid physical/digital supply chains are discussed and a framework for practitioners concerned with software supply chains is presented.
Abstract:
This chapter provides information on the use of the Performance Improvement Management Software (PIM-DEA). This advanced DEA software enables users to make the best possible analysis of their data, using the latest theoretical developments in Data Envelopment Analysis (DEA). PIM-DEA gives full capacity to assess efficiency and productivity, set targets, identify benchmarks, and much more, allowing users to truly manage the performance of organizational units. PIM-DEA is easy to use and powerful, with an extensive range of the most up-to-date DEA models, and it can handle large sets of data.
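As an illustration of the kind of analysis such DEA software automates, the sketch below solves the standard input-oriented CCR efficiency model with scipy. The unit data are made up, and the code is a minimal sketch, not part of PIM-DEA.

```python
# Input-oriented CCR DEA model: minimise theta subject to a composite peer
# (weights lambda) using no more than theta * inputs of unit o while
# producing at least unit o's outputs.
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Efficiency of unit o. X: inputs (m x n), Y: outputs (s x n), n units."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.zeros(n + 1); c[0] = 1.0                  # objective: minimise theta
    A = np.zeros((m + s, n + 1)); b = np.zeros(m + s)
    A[:m, 0] = -X[:, o]; A[:m, 1:] = X               # sum(lambda*x) <= theta*x_o
    A[m:, 1:] = -Y;      b[m:] = -Y[:, o]            # sum(lambda*y) >= y_o
    res = linprog(c, A_ub=A, b_ub=b,
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.x[0]

# Three units, one input, one output (illustrative numbers).
X = np.array([[2.0, 4.0, 8.0]])
Y = np.array([[1.0, 2.0, 3.0]])
for o in range(3):
    print(f"unit {o}: efficiency = {ccr_efficiency(X, Y, o):.3f}")
# Units 0 and 1 lie on the efficient frontier (1.000); unit 2 scores 0.750.
```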
Abstract:
Software architecture plays an essential role in the high-level description of a system design, where structure and communication are emphasized. Despite its importance in the software engineering process, the lack of formal description and automated verification hinders the development of good software architecture models. In this paper, we present an approach to support the rigorous design and verification of software architecture models using semantic web technology. We view software architecture models as ontology representations, where their structures and communication constraints are captured by the Web Ontology Language (OWL) and the Semantic Web Rule Language (SWRL). Specific configurations of the design are represented as concrete instances of the ontology, to which their structures and dynamic behaviors must conform. Furthermore, ontology reasoning tools can be applied to perform various kinds of automated verification on the design to ensure correctness, such as consistency checking, style recognition, and behavioral inference.
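As a plain-Python stand-in (not OWL/SWRL) for the consistency checking described above, the sketch below tests a concrete configuration against a structural rule of a hypothetical pipe-and-filter style; the component names and the rule itself are illustrative assumptions.

```python
# A concrete configuration (instance) checked against a style constraint,
# mimicking what an ontology reasoner does for OWL/SWRL architecture models.
components = {"reader": "filter", "parser": "filter", "writer": "filter"}
connectors = [("reader", "parser"), ("parser", "writer")]

def consistent_pipe_and_filter(components, connectors):
    """Every connector must join two declared filter components, and no
    component may feed itself (no trivial cycles)."""
    for src, dst in connectors:
        if components.get(src) != "filter" or components.get(dst) != "filter":
            return False
        if src == dst:
            return False
    return True

print(consistent_pipe_and_filter(components, connectors))              # True
print(consistent_pipe_and_filter(components, [("writer", "writer")]))  # False
```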
Abstract:
We present the experimental lines of research developed in recent years by the Nonlinear Dynamics and Fiber Optics Group (NDFO), recently created at the Optics Institute "Daza de Valdés" of the CSIC (IO-CSIC). © Sociedad Española de Óptica.