30 results for software asset creation
Abstract:
Participation in European Union research projects now requires the setting up of a project website. This paper discusses the creation of the "Matrix" to facilitate the information visualisation of a project: experiments, data, results, and so on, i.e. information far beyond the promotional details of the website. The paper describes the theory of such an endeavour before proceeding to discuss the practical realities for this case study project. Finally, we consider the lessons that can be learnt from this real-world application.
Abstract:
This paper examines different ways of measuring similarity between software design models for the purpose of software reuse. Current approaches to this problem are discussed and a set of suitable similarity metrics is proposed and evaluated. Work on the optimisation of weights to increase the competence of a CBR system is presented. A graph matching algorithm and associated metrics capturing the structural similarity between UML class diagrams are presented and demonstrated through an example case.
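As an illustration of the kind of weighted, feature-based similarity measure such work evaluates, the sketch below aggregates per-feature similarities under tunable weights, the quantity a weight-optimisation step would adjust. The feature names, weights and values are invented for the example and are not taken from the paper.

```python
# Weighted nearest-neighbour style similarity over class-level features.
# Feature names and weights are illustrative only.

def local_sim(a, b):
    """Similarity of two primitive feature values, in [0, 1]."""
    if isinstance(a, str):
        return 1.0 if a == b else 0.0
    span = max(abs(a), abs(b), 1e-9)
    return 1.0 - abs(a - b) / span

def weighted_similarity(case, query, weights):
    """Global similarity: sum(w_i * sim_i) / sum(w_i)."""
    total = sum(w * local_sim(case[f], query[f]) for f, w in weights.items())
    return total / sum(weights.values())

weights = {"n_attributes": 1.0, "n_operations": 2.0, "stereotype": 0.5}
case  = {"n_attributes": 4, "n_operations": 7, "stereotype": "entity"}
query = {"n_attributes": 5, "n_operations": 6, "stereotype": "entity"}
print(round(weighted_similarity(case, query, weights), 3))   # ~0.861
```

A weight-optimisation step of the kind the paper reports would search over the `weights` vector to maximise retrieval competence on a case base.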
Abstract:
This paper describes the use of a blackboard architecture for building a hybrid case-based reasoning (CBR) system. The Smartfire fire field modelling package has been built using this architecture and includes a CBR component, which allows qualitative spatial reasoning knowledge from domain experts to be integrated into the system. The system can be used for the automatic set-up of fire field models, enabling fire safety practitioners who are not experts in modelling techniques to use a fire modelling tool. The paper discusses the integrating powers of the architecture, which is based on a common knowledge representation comprising a metric diagram and place vocabulary, and on mechanisms for adaptation and conflict resolution built on the blackboard.
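To make the architectural idea concrete, here is a minimal blackboard sketch: independent knowledge sources read and write a shared representation under a simple control loop. All names are invented, and the logic is far simpler than Smartfire's; real conflict resolution and adaptation mechanisms are much richer.

```python
# Toy blackboard: a shared data store plus pluggable knowledge sources.

class Blackboard:
    def __init__(self):
        self.data = {}                      # shared representation

class KnowledgeSource:
    def can_contribute(self, bb): ...
    def contribute(self, bb): ...

class GeometryKS(KnowledgeSource):
    """Hypothetical source: derives a mesh from room geometry."""
    def can_contribute(self, bb):
        return "room" in bb.data and "mesh" not in bb.data
    def contribute(self, bb):
        w, d = bb.data["room"]
        bb.data["mesh"] = (int(w * 10), int(d * 10))   # crude cell sizing

class Controller:
    """Control loop: fire any applicable source until quiescence."""
    def run(self, bb, sources):
        progressing = True
        while progressing:
            progressing = False
            for ks in sources:
                if ks.can_contribute(bb):
                    ks.contribute(bb)
                    progressing = True

bb = Blackboard()
bb.data["room"] = (4.0, 6.0)               # metres
Controller().run(bb, [GeometryKS()])
print(bb.data["mesh"])                     # -> (40, 60)
```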
Abstract:
Software metrics are a key tool in software quality management. In this paper, we propose to use support vector machines for regression applied to software metrics to predict software quality. In experiments we compare this method with other regression techniques such as Multivariate Linear Regression, Conjunctive Rule and Locally Weighted Regression. Results on the benchmark MIS dataset, using mean absolute error and correlation coefficient as regression performance measures, indicate that support vector machine regression is a promising technique for software quality prediction. In addition, our investigation of PCA-based metrics extraction shows that using only the first few Principal Components (PCs) we can still obtain relatively good performance.
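The experimental pipeline described maps naturally onto standard tooling; the sketch below uses scikit-learn, with synthetic data standing in for the MIS dataset (which is not reproduced here) and arbitrary hyperparameters.

```python
# SVR on software metrics with an optional PCA step; synthetic stand-in data.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 11))                                # stand-in metric vectors
y = 2 * X[:, 0] + X[:, 1] + rng.normal(scale=0.3, size=200)   # quality proxy

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = make_pipeline(StandardScaler(), PCA(n_components=3), SVR(C=10.0))
model.fit(X_tr, y_tr)
pred = model.predict(X_te)
print("MAE: ", mean_absolute_error(y_te, pred))
print("corr:", np.corrcoef(y_te, pred)[0, 1])
```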
Abstract:
[This abstract is based on the authors' abstract.] Three new standards to be applied when adopting commercial off-the-shelf (COTS) software solutions are discussed. The first standard is for a COTS software life cycle, the second for a software solution user requirements life cycle, and the third is a checklist to help in completing the requirements. The standards are based on recent major COTS software solution implementations.
Abstract:
The parallelization of real-world compute-intensive Fortran application codes is generally not a trivial task. If the time to complete the parallelization is to be significantly reduced, then an environment is needed that will assist the programmer in the various tasks of code parallelization. In this paper the authors present a code parallelization environment in which a number of tools addressing the main tasks, such as code parallelization, debugging and optimization, are available. The ParaWise and CAPO parallelization tools are discussed, which enable the near-automatic parallelization of real-world scientific application codes for shared and distributed memory-based parallel systems. As user involvement in the parallelization process can introduce errors, a relative debugging tool (P2d2) is also available and can be used to perform nearly automatic relative debugging of a program that has been parallelized using the tools. A high-quality interprocedural dependence analysis, as well as user-tool interaction, is also highlighted; both are vital to the generation of efficient parallel code and to the optimization of the backtracking and speculation process used in relative debugging. Results from parallelized benchmark and real-world application codes are presented and show the benefits of using the environment.
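The core idea of relative debugging can be illustrated independently of the actual tools: capture values at matching program points in the serial and parallel runs and report the first divergence. This toy sketch shows only the comparison such a tool automates; it is not how P2d2 works internally.

```python
# Toy relative debugging: locate the first checkpoint where two runs diverge.

def first_divergence(serial_trace, parallel_trace, tol=1e-9):
    """Each trace is a list of (label, value) checkpoints in execution order."""
    for (lbl_s, v_s), (lbl_p, v_p) in zip(serial_trace, parallel_trace):
        assert lbl_s == lbl_p, "traces must share checkpoint labels"
        if abs(v_s - v_p) > tol:
            return lbl_s, v_s, v_p
    return None

serial   = [("loop1:sum", 10.0), ("loop2:norm", 4.2), ("final:res", 0.071)]
parallel = [("loop1:sum", 10.0), ("loop2:norm", 4.6), ("final:res", 0.093)]
print(first_divergence(serial, parallel))    # -> ('loop2:norm', 4.2, 4.6)
```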
Abstract:
In this chapter we look at JOSTLE, the multilevel graph-partitioning software package, and highlight some of the key research issues that it addresses. We first outline the core algorithms and place the package in the context of the multilevel refinement paradigm. We then look at issues relating to its use as a tool for parallel processing and, in particular, partitioning in parallel. Since its first release in 1995, JOSTLE has been used for many mesh-based parallel scientific computing applications, so we also outline some enhancements such as multiphase mesh-partitioning, heterogeneous mapping and partitioning to optimise subdomain shape.
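The multilevel refinement paradigm itself is compact enough to sketch: coarsen by vertex matching, partition the coarsest graph, then project back level by level with local refinement. This is a didactic skeleton, not JOSTLE's algorithm; edge weights and balance constraints are omitted.

```python
# Multilevel bisection V-cycle: coarsen -> partition -> project -> refine.
import random

def coarsen(adj):
    """Greedy vertex matching; returns coarse adjacency and a fine->coarse map."""
    cmap, nxt = {}, 0
    for u in adj:
        if u in cmap:
            continue
        partner = next((v for v in adj[u] if v not in cmap), None)
        cmap[u] = nxt
        if partner is not None:
            cmap[partner] = nxt
        nxt += 1
    coarse = {i: set() for i in range(nxt)}
    for u in adj:
        for v in adj[u]:
            if cmap[u] != cmap[v]:
                coarse[cmap[u]].add(cmap[v])
    return coarse, cmap

def refine(adj, part):
    """Greedy pass: flip a vertex to the other part if that lowers the edge cut."""
    for u in adj:
        gain = sum(1 if part[v] != part[u] else -1 for v in adj[u])
        if gain > 0:
            part[u] = 1 - part[u]
    return part

def partition(adj, depth=3):
    if depth == 0 or len(adj) <= 4:
        nodes = list(adj)
        random.shuffle(nodes)
        return {u: int(i >= len(nodes) / 2) for i, u in enumerate(nodes)}
    coarse, cmap = coarsen(adj)
    cpart = partition(coarse, depth - 1)
    return refine(adj, {u: cpart[cmap[u]] for u in adj})   # project, then refine

adj = {u: set() for u in range(8)}                         # 2x4 grid graph
for a, b in [(0,1),(1,2),(2,3),(4,5),(5,6),(6,7),(0,4),(1,5),(2,6),(3,7)]:
    adj[a].add(b); adj[b].add(a)
print(partition(adj))
```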
Abstract:
This paper examines different ways of measuring similarity between software design models for Case-Based Reasoning (CBR) to facilitate reuse of software design and code. The paper considers structural and behavioural aspects of similarity between software design models. Similarity metrics for comparing static class structures are defined and discussed. A graph representation of UML class diagrams and corresponding similarity measures for UML class diagrams are defined. A full-search graph matching algorithm for measuring structural similarity between diagrams, based on the identification of the Maximum Common Sub-graph (MCS), is presented. Finally, a simple evaluation of the approach is presented and discussed.
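For intuition, an MCS-based similarity can be sketched as follows: exhaustively grow edge-preserving node correspondences, keep the largest, and normalise its size by the larger graph. The search is exponential, so this toy only suits very small graphs; the paper's full-search algorithm is considerably more engineered.

```python
# Maximum common induced subgraph by backtracking over node correspondences.

def mcs_size(adj1, adj2):
    best = 0
    def extend(mapping, candidates):
        nonlocal best
        best = max(best, len(mapping))
        for u in candidates:
            for v in adj2:
                if v in mapping.values():
                    continue                       # keep the mapping injective
                # every mapped pair must agree on (non-)adjacency with (u, v)
                if all((w in adj1[u]) == (x in adj2[v]) for w, x in mapping.items()):
                    mapping[u] = v
                    extend(mapping, [c for c in candidates if c > u])
                    del mapping[u]
    extend({}, sorted(adj1))
    return best

def mcs_similarity(adj1, adj2):
    return mcs_size(adj1, adj2) / max(len(adj1), len(adj2))

g1 = {0: {1, 2}, 1: {0, 2}, 2: {0, 1}}   # triangle of associated classes
g2 = {0: {1}, 1: {0, 2}, 2: {1}}         # chain of associated classes
print(mcs_similarity(g1, g2))            # 2 of 3 nodes map -> ~0.667
```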
Abstract:
This paper describes a methodology for embedding dynamic behaviour into software components. The implications and system architecture requirements to support this adaptivity are discussed. This work is part of a European Commission funded and industry supported project to produce a reconfigurable middleware for use in automotive systems. Such systems must be trustworthy in the face of illegal internal behaviour and of activity with external origins, for example from additional devices. Policy-based computing is used here as an example of embedded logic. A key contribution of this work is the way in which static and dynamic aspects of the system are interfaced, such that the behaviour can be changed very flexibly (even during run-time), without modification, recompilation or redeployment of the embedded application code. An implementation of these concepts is presented, focussing on achieving trust in the use of dynamic behaviour.
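The static/dynamic interfacing can be illustrated with a component whose decision logic lives in a swappable policy object: the component code below never changes, while its behaviour is replaced at run time. Names are illustrative, not the project's API.

```python
# Component with hot-swappable policy logic and a trusted static fallback.

class Policy:
    def decide(self, context: dict) -> str:
        raise NotImplementedError

class DefaultPolicy(Policy):
    def decide(self, context):
        return "allow"                              # safe static behaviour

class SpeedLimitPolicy(Policy):
    def decide(self, context):
        return "deny" if context.get("speed_kmh", 0) > 120 else "allow"

class Component:
    """Static application code: decisions are delegated, never recompiled."""
    def __init__(self, policy: Policy):
        self._policy = policy
    def swap_policy(self, policy: Policy):          # run-time reconfiguration
        self._policy = policy
    def handle(self, context):
        try:
            return self._policy.decide(context)
        except Exception:
            return DefaultPolicy().decide(context)  # fall back, stay trustworthy

c = Component(DefaultPolicy())
print(c.handle({"speed_kmh": 140}))   # 'allow'  (default behaviour)
c.swap_policy(SpeedLimitPolicy())
print(c.handle({"speed_kmh": 140}))   # 'deny'   (new behaviour, same code)
```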
Abstract:
This paper describes a highly flexible component architecture, primarily designed for automotive control systems, that supports distributed dynamically-configurable context-aware behaviour. The architecture enforces a separation of design-time and run-time concerns, enabling almost all decisions concerning run-time composition and adaptation to be deferred beyond deployment. Dynamic context management contributes to flexibility. The architecture is extensible, and can embed potentially many different self-management decision technologies simultaneously. The mechanism that implements the run-time configuration has been designed to be very robust, automatically and silently handling problems arising from the evaluation of self-management logic and ensuring that in the worst case the dynamic aspects of the system collapse down to static behaviour in totally predictable ways.
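The fail-safe property claimed here, dynamic aspects collapsing predictably to static behaviour, can be sketched as a guarded resolver: any fault in the self-management logic silently yields the fixed static binding. This illustrates the principle only, not the architecture's mechanism; all names are invented.

```python
# Run-time selection guarded so that failures collapse to a static binding.

def static_binding(ctx):
    return "conservative_controller"               # fixed, always valid

def adaptive_binding(ctx):
    # hypothetical context-aware selection; may raise on unexpected context
    return "cruise_controller" if ctx["mode"] == "highway" else "city_controller"

def resolve(ctx, dynamic=adaptive_binding, fallback=static_binding):
    """Worst case: dynamic behaviour collapses to static, predictably."""
    try:
        choice = dynamic(ctx)
        if not isinstance(choice, str):
            raise ValueError("invalid selection")
        return choice
    except Exception:
        return fallback(ctx)

print(resolve({"mode": "highway"}))   # 'cruise_controller'
print(resolve({}))                    # fault in logic -> 'conservative_controller'
```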
Abstract:
[Author's description] Bringing together new research on punishment and control in the 19th and 20th centuries, this collection begins by examining the development of the modern prison, gender, social control and punishment, and psychiatry and the criminal justice system. Further, it explores penal policy, prison practice, and discourses on offenders, providing case studies of the 'respectable' criminal, the female inebriate and the juvenile offender. The final part examines the experiences of confinement, discipline and resistance, through prisoner memoirs, prison riots and resistance and identity in residential institutions.
Abstract:
The role intra-organizational knowledge exchanges play in innovation processes has been widely acknowledged in the organizational literature. This paper contributes to the understanding of which specific configurations knowledge networks assume during different phases of radical and incremental innovation processes. The case study we selected is a FLOSS (Free/Libre Open Source Software) community consisting of 233 developers committed to the development of a web browser application since November 2002. By harvesting the mailing list, official blog and code repository of a FLOSS community, we investigate the patterns of knowledge exchange and individual contributions of its developers. We measure structural cohesion and compare global and local network properties at different points in time. Preliminary results show that phases of radical and incremental innovation are associated with specific configurations of the knowledge network as a whole as well as with different network positions of the core developers of the software.
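The per-window measurements this methodology relies on are easy to reproduce with standard tooling; the sketch below assumes the networkx library, and the reply pairs are invented rather than drawn from the studied community.

```python
# Compare global and local properties of a knowledge network per time window.
import networkx as nx

def snapshot_metrics(replies):
    """replies: iterable of (sender, receiver) message pairs in one window."""
    g = nx.Graph()
    g.add_edges_from(replies)
    return {
        "density": nx.density(g),
        "avg_clustering": nx.average_clustering(g),
        "max_degree": max(dict(g.degree()).values()),   # core-developer position
    }

window_a = [("ann", "bob"), ("bob", "carl"), ("ann", "carl")]                  # cohesive triad
window_b = [("ann", "bob"), ("ann", "carl"), ("ann", "dee"), ("ann", "eve")]   # hub-centred star
print("window A:", snapshot_metrics(window_a))
print("window B:", snapshot_metrics(window_b))
```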
Abstract:
In global marketing and international management, the fields of Branding and Culture are well discussed as separate disciplines within both academia and industry. However, there appears to be limited supporting literature examining brands and culture as a collective discipline. In addition, environmental factors such as ethnicity, nationality and religion are also seen to play a significant role. This in itself adds to the challenges encountered by those looking to critically apply learning and frameworks to any information gathered. In the first instance, this paper tries to bring aspects together from Branding and Culture and, in doing so, aims to find linkages between the two. The main purpose of this paper is to distil current brand thinking and explore what impact cross-cultural, cross-national, and ethnic interactions have on a brand’s creation. The position of the authors is that without further understanding in this field, a brand will experience what has been termed by them as the ‘Pinocchio Effect’: Pinocchio was a puppet who longed to become a real human being but sadly encountered difficulties. The conclusion presented is that the critical long-term success of a brand lies in three areas: how it is created; the subsequent associated perceptions; and, more specifically, the reality of the relationships that it enjoys. Collectively these processes necessitate an appraisal of connecting strategic management procedures and thinking. Finally, this paper looks into proposing future methods for brand evaluation and strategic management. The aim is to stimulate further thinking in a field which transcends national, ethnic and cultural boundaries, in the interests of developing new insight and providing a platform for marketers to develop more effective communications.
Abstract:
The main sources of financing for small and medium-sized enterprises (SMEs) are equity (internally generated cash), trade credit paid on time, long- and short-term bank credits, delayed payment on trade credit and other debt. The marginal costs of each financing instrument are driven by asymmetric information (the cost of gathering and analysing information) and the transaction costs associated with non-payment (the costs of collecting and selling collateral). According to the Pecking Order Theory, firms will choose the cheapest source in terms of cost. In the case of the static trade-off theory, firms choose finance so that the marginal costs across financing sources are all equal; thus an additional Euro of financing is obtained from all the sources, whereas under the Pecking Order Theory the source is determined by how far down the Pecking Order the firm is presently located. In this paper, we argue that both of these theories miss the point that the marginal costs are dependent on the use of the funds, and that the asset side of the balance sheet primarily determines the financing source for an additional Euro. An empirical analysis on a unique dataset of Portuguese SMEs confirms that the composition of the asset side of the balance sheet has an impact on the type of financing used, and both the Pecking Order Theory and the traditional static trade-off theory are rejected.
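The contrast the abstract draws in words can be stated compactly; the notation below is ours, a hedged formalisation rather than the paper's own model.

```latex
% A firm raising F across sources i, each with cost function C_i:
\[
\min_{x_1,\dots,x_n}\ \sum_i C_i(x_i)
\quad \text{s.t.} \quad \sum_i x_i = F,
\qquad
C_i'(x_i^\ast) = \lambda \ \text{ for all } i \text{ with } x_i^\ast > 0.
\]
% The first-order condition is the static trade-off's equal-marginal-cost rule;
% the pecking order instead exhausts sources in increasing order of cost. The
% paper's argument amounts to C_i' = C_i'(x_i; A): marginal cost also depends
% on the assets A being financed, so the optimal mix shifts with the asset side.
```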
Abstract:
Deliberating on Enterprise Resource Planning (ERP) software sourcing and provision, this paper contrasts the corporate environment with the small business environment. The paper is about Enterprise Resource Planning client (ERPc) expectations and Enterprise Resource Planning vendor (ERPv) value propositions as a mutually compatible process for achieving acceptable standards of ERP software performance. It is suggested that a less-than-equitable vendor-client relationship would not contribute to the implementation of the optimum solution. Adapting selected theoretical concepts and models, the researchers analyse the ERPv-ERPc relationship. This analysis is designed to discover whether the provision of the very large ERP vendors who market systems such as SAP, and the provision of the smaller ERP vendors (in this instance Eshbel Technologies Ltd, who market an ERP software solution called Priority), when framed as a value proposition (Walters, D. (2002) Operations Strategy. Hampshire, UK: Palgrave), are at all comparable or distinctive.