37 results for Software Process Improvement
Abstract:
This paper reports a study of 14 software companies, examining how they initiate and pre-plan software projects. The aim was to obtain an indication of the range of planning activities carried out. The study, using a convenience sample, was carried out through structured interviews, with questions about early software project planning activities. The study offers evidence that an iterative and incremental development process presents extra difficulties in the case of fixed-contract projects. The authors also found evidence that feasibility studies were common, but generally informal in nature. Documentation of the planning process, especially for project scoping, was variable. For incremental and iterative development projects, an upfront decision on software architecture was shown to be preferred over allowing the architecture to just ‘emerge’. There is also evidence that risk management is recognised but often performed incompletely. Finally, appropriate future research arising from the study is described.
Abstract:
The decision of the U.S. Supreme Court in 1991 in Feist Publications, Inc. v. Rural Tel. Service Co. affirmed originality as a constitutional requirement for copyright. Originality has a specific sense and is constituted by a minimal degree of creativity and independent creation. The not original is the more developed concept within the decision. It includes the absence of a minimal degree of creativity as a major constituent. Different levels of absence of creativity also are distinguished, from the extreme absence of creativity to insufficient creativity. There is a gestalt effect of analogy between the delineation of the not original and the concept of computability. More specific correlations can be found within the extreme absence of creativity. "[S]o mechanical" in the decision can be correlated with an automatic mechanical procedure, and with clauses that have a historical resonance with understandings of computability as what would naturally be regarded as computable. The routine within the extreme absence of creativity can be regarded as the product of a computational process. The concern of this article is with rigorously establishing an understanding of the extreme absence of creativity, primarily through the correlations with aspects of computability. The understanding established is consistent with the other elements of the not original. It is also revealed as testable under real-world conditions. The possibilities for understanding insufficient creativity, a minimal degree of creativity, and originality, from the understanding developed of the extreme absence of creativity, are indicated.
Abstract:
Developing a desirable framework for handling inconsistencies in software requirements specifications is a challenging problem. It has been widely recognized that the relative priority of requirements can help developers to make necessary trade-off decisions for resolving conflicts. However, in most distributed development, such as viewpoints-based approaches, different stakeholders may assign different levels of priority to the same shared requirements statement from their own perspectives. The disagreement in the local levels of priority assigned to the same shared requirements statement often puts developers into a dilemma during the inconsistency handling process. The main contribution of this paper is to present a prioritized merging-based framework for handling inconsistency in distributed software requirements specifications. Given a set of distributed inconsistent requirements collections with local prioritizations, we first construct a requirements specification with a prioritization from an overall perspective. We provide two approaches to constructing a requirements specification with a global prioritization: a merging-based construction and a priority vector-based construction. Following this, we derive proposals for handling inconsistencies from the globally prioritized requirements specification in terms of prioritized merging. Moreover, from the overall perspective, these proposals may be viewed as the most appropriate for modifying the given inconsistent requirements specification in the sense of the ordering relation over all the consistent subsets of the requirements specification. Finally, we consider applying negotiation-based techniques to viewpoints so as to identify an acceptable common proposal from these proposals.
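To make the prioritized-merging idea concrete, the following is a minimal sketch, assuming a simple averaging scheme as a stand-in for the paper's merging-based and priority vector-based constructions (which are not reproduced here): local priorities from each viewpoint are combined into a global ordering, and requirements are then accepted in that order whenever they remain consistent. The `is_consistent` test is a placeholder.

```python
# Hypothetical sketch of prioritized merging over viewpoints; the
# averaging scheme and consistency test are illustrative assumptions,
# not the constructions defined in the paper.

def negation(r):
    return r[4:-1] if r.startswith("not(") else f"not({r})"

def is_consistent(accepted, candidate):
    # Placeholder: a real system would run a consistency check
    # (e.g. a SAT or theorem-prover call) over formalized requirements.
    return negation(candidate) not in accepted

def global_priorities(local_views):
    """local_views: dicts mapping requirement -> local priority (1 = highest).
    Returns all requirements sorted by mean local priority."""
    reqs = set().union(*[set(v) for v in local_views])
    def mean_priority(r):
        scores = [v[r] for v in local_views if r in v]
        return sum(scores) / len(scores)
    return sorted(reqs, key=mean_priority)

def prioritized_merge(local_views):
    merged = []
    for r in global_priorities(local_views):
        if is_consistent(merged, r):
            merged.append(r)
    return merged

v1 = {"fast_login": 1, "audit_log": 2, "not(audit_log)": 3}
v2 = {"fast_login": 2, "audit_log": 1}
print(prioritized_merge([v1, v2]))
# -> ['fast_login', 'audit_log'] (tie order may vary); 'not(audit_log)'
#    is rejected because the higher-ranked 'audit_log' is already accepted.
```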
Abstract:
This case study examines how the lean ideas behind the Toyota production system can be applied to software project management. It is a detailed investigation of the performance of a nine-person software development team employed by BBC Worldwide, based in London. The data, collected in 2009, involved direct observations of the development team, the kanban boards, the daily stand-up meetings, semistructured interviews with a wide variety of staff, and statistical analysis. The evidence shows that over the 12-month period, lead time to deliver software improved by 37%, consistency of delivery rose by 47%, and defects reported by customers fell by 24%. The significance of this work is in showing that the use of lean methods, including visual management, team-based problem solving, smaller batch sizes, and statistical process control, can improve software development. It also summarizes key differences between agile and lean approaches to software development. The conclusion is that the performance of the software development team was improved by adopting a lean approach. The faster delivery, with a focus on creating the highest value to the customer, also reduced both technical and market risks. A drawback is that the approach may not fit well with existing corporate standards.
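As a concrete example of the statistical process control mentioned above, here is a minimal XmR (individuals and moving range) chart over per-item lead times, the kind of chart a team might use to track delivery consistency. This sketch is not taken from the study and the lead-time figures are invented; 2.66 is the standard XmR control-limit factor.

```python
# XmR control chart: flag lead times outside mean +/- 2.66 * mean
# moving range as potential special-cause variation.

def xmr_limits(values):
    mean = sum(values) / len(values)
    moving_ranges = [abs(a - b) for a, b in zip(values, values[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    return mean, mean - 2.66 * mr_bar, mean + 2.66 * mr_bar

lead_times = [12, 12, 11, 12, 12, 25, 12, 11]   # days per work item (invented)
mean, lcl, ucl = xmr_limits(lead_times)
print(f"mean={mean:.1f}, LCL={lcl:.1f}, UCL={ucl:.1f}")
for t in lead_times:
    flag = "  <-- investigate: outside control limits" if not lcl <= t <= ucl else ""
    print(f"{t:5.1f}{flag}")
```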
Abstract:
For any proposed software project, once the software requirements specification has been established, requirements changes may result not only in a modification of the requirements specification but also in a series of modifications of all existing artifacts produced during development. It is therefore necessary to provide effective and flexible management of requirements changes. In this paper, we present an approach to managing requirements changes based on Booth’s negotiation-style framework for belief revision. Informally, we consider the current requirements specification as a belief set about the system-to-be. The requirements change request is viewed as new information about the same system-to-be. The process of executing the requirements change is then a process of revising beliefs about the system-to-be. We design a family of belief negotiation models appropriate for different processes of requirements revision, including the setting in which the change request is fully accepted, the setting in which the current requirements specification is fully preserved, and the setting in which the current specification and the change request reach a compromise. In particular, the prioritization of requirements plays an important role in reaching an agreement in each belief negotiation model designed in this paper.
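A toy illustration, not Booth's actual framework, of two of the revision settings listed above: one where the change request is fully accepted and conflicting prior requirements are retracted, and one where a compromise is reached by letting relative priorities decide. The string encoding of requirements and the conflict test are assumptions made for the example.

```python
# Hypothetical sketch of priority-driven requirements revision.

def conflicts(a, b):
    # Placeholder conflict test: 'not(X)' conflicts with 'X'.
    strip = lambda r: r[4:-1] if r.startswith("not(") else r
    return strip(a) == strip(b) and a != b

def revise(spec, change):
    """Fully-accepted setting: spec is a list of (requirement, priority),
    1 = highest; conflicting existing requirements are retracted."""
    kept = [(r, p) for r, p in spec if not conflicts(r, change)]
    return kept + [(change, 1)]

def compromise(spec, change, change_priority):
    """Compromise setting: a conflicting existing requirement survives
    only if it outranks the change request (smaller number = higher)."""
    kept, accepted = [], True
    for r, p in spec:
        if conflicts(r, change) and p < change_priority:
            accepted = False            # existing requirement wins
        if not conflicts(r, change) or p < change_priority:
            kept.append((r, p))
    return kept + ([(change, change_priority)] if accepted else [])

current = [("encrypt_backups", 1), ("not(offline_mode)", 2)]
print(revise(current, "offline_mode"))
# fully accepted: not(offline_mode) is retracted
print(compromise(current, "offline_mode", change_priority=3))
# compromise: the higher-priority not(offline_mode) survives, change rejected
```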
Abstract:
This paper describes the application of an improved nonlinear principal component analysis (PCA) to the detection of faults in polymer extrusion processes. Since the processes are complex in nature and nonlinear relationships exist between the recorded variables, an improved nonlinear PCA, which incorporates radial basis function (RBF) networks and principal curves, is proposed. This algorithm comprises two stages. The first stage involves the use of the serial principal curve to obtain the nonlinear scores and approximated data. The second stage is to construct two RBF networks using a fast recursive algorithm to solve the topology problem in traditional nonlinear PCA. The benefits of this improvement are demonstrated in the practical application to a polymer extrusion process.
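To illustrate the flavour of the two-stage scheme, here is a much-simplified sketch: linear PCA stands in for the principal-curve stage (an assumption made for brevity), and a single RBF network with a least-squares readout maps scores back to the data so that faults can be flagged by reconstruction error (SPE). None of the constants reflect the paper's algorithm or data.

```python
# Simplified stand-in for nonlinear-PCA fault detection: score
# extraction, RBF reconstruction, and an SPE control limit.
import numpy as np

rng = np.random.default_rng(0)

def rbf_features(x, centres, width):
    d = np.linalg.norm(x[:, None, :] - centres[None, :, :], axis=2)
    return np.exp(-(d / width) ** 2)

# Training data: one dominant nonlinear direction plus noise.
t = rng.uniform(-2, 2, 200)
X = np.c_[t, 0.5 * t ** 2] + 0.05 * rng.standard_normal((200, 2))
X = X - X.mean(axis=0)

# Stage 1 (stand-in): first principal component scores via SVD.
_, _, Vt = np.linalg.svd(X, full_matrices=False)
scores = X @ Vt[0]

# Stage 2: RBF network from scores back to data (linear readout).
centres = np.linspace(scores.min(), scores.max(), 10)[:, None]
Phi = rbf_features(scores[:, None], centres, width=0.5)
W, *_ = np.linalg.lstsq(Phi, X, rcond=None)

def spe(x_new):
    s = x_new @ Vt[0]
    recon = rbf_features(s[:, None], centres, width=0.5) @ W
    return np.sum((x_new - recon) ** 2, axis=1)

threshold = np.percentile(spe(X), 99)   # ad hoc control limit
fault = np.array([[3.0, -4.0]])         # point far from the learned curve
print(spe(fault) > threshold)           # -> [ True]
```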
Abstract:
Sphere Decoding (SD) is a highly effective detection technique for Multiple-Input Multiple-Output (MIMO) wireless communications receivers, offering quasi-optimal accuracy with relatively low computational complexity as compared to the ideal ML detector. Despite this, the computational demands of even low-complexity SD variants, such as Fixed Complexity SD (FSD), remain such that implementation on modern software-defined network equipment is a highly challenging process, and indeed real-time solutions for MIMO systems such as 4×4 16-QAM 802.11n are unreported. This paper overcomes this barrier. By exploiting large-scale networks of fine-grained software-programmable processors on Field Programmable Gate Array (FPGA), a series of unique SD implementations are presented, culminating in the only single-chip, real-time quasi-optimal SD for 4×4 16-QAM 802.11n MIMO. Furthermore, it demonstrates that the high-performance software-defined architectures which enable these implementations exhibit cost comparable to dedicated circuit architectures.
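To show the shape of the algorithm in software terms, the following is a minimal floating-point FSD sketch, nothing like the fixed-point, FPGA-mapped designs the paper develops: full enumeration of the 16-QAM alphabet on the last detected layer, with simple decision-feedback slicing on the remaining three. The FSD channel-ordering step is omitted.

```python
# Minimal fixed-complexity sphere decoder sketch for a 4x4 16-QAM
# system; an illustration of the detection principle only.
import numpy as np

pam = np.array([-3, -1, 1, 3]) / np.sqrt(10)
constellation = np.array([a + 1j * b for a in pam for b in pam])

def slice_sym(v):
    return constellation[np.argmin(np.abs(constellation - v))]

def fsd_detect(H, y):
    Q, R = np.linalg.qr(H)
    z = Q.conj().T @ y
    n = H.shape[1]
    best, best_dist = None, np.inf
    for s_top in constellation:           # full expansion, top layer
        x = np.zeros(n, dtype=complex)
        x[n - 1] = s_top
        for i in range(n - 2, -1, -1):    # slicing on lower layers
            u = (z[i] - R[i, i + 1:] @ x[i + 1:]) / R[i, i]
            x[i] = slice_sym(u)
        dist = np.linalg.norm(z - R @ x) ** 2
        if dist < best_dist:
            best, best_dist = x.copy(), dist
    return best

rng = np.random.default_rng(1)
H = (rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))) / np.sqrt(2)
x_true = rng.choice(constellation, 4)
y = H @ x_true + 0.01 * (rng.standard_normal(4) + 1j * rng.standard_normal(4))
print(np.allclose(fsd_detect(H, y), x_true))   # usually True at high SNR
```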
Abstract:
Therapeutic strategies aimed to reverse the pathogenic process, replace diseased tissue, and restore visual function represent the final frontier in treatment of chronic ocular disease. The goal in this approach is improvement, not stabilization or slowing the decline of the disease. Lines of research that can lead to identification of new treatments that could reverse the disease course are reviewed.
Abstract:
In essence, optimal software engineering means creating the right product, through the right process, to the overall satisfaction of everyone involved. Adopting the agile approach to software development appears to have helped many companies make substantial progress towards that goal. The purpose of this paper is to clarify that contribution from comparative survey information gathered in 2010 and 2012. The surveys were undertaken in software development companies across Northern Ireland. The paper describes the design of the surveys and discusses optimality in relation to the results obtained. Both surveys aimed to achieve comprehensive coverage of a single region rather than rely on a voluntary sample. The main outcome from the work is a collection of insights into the nature and advantages of agile development, suggesting how further progress towards optimality might be achieved.
Abstract:
Porous poly(L-lactic acid) (PLA) scaffolds of 85 per cent and 90 per cent porosity are prepared using a polymer sintering and porogen leaching method. Different weight fractions of 10 per cent, 30 per cent, and 50 per cent of hydroxyapatite (HA) are added to the PLA to control the acidity and degradation rate. The three-dimensional (3D) morphology and surface porosity are examined using micro-computed tomography (micro-CT), optical microscopy, and scanning electron microscopy (SEM). Results indicate that the surface porosity does not change on the addition of HA. The micro-CT examinations show a slight decrease in the pore size and an increase in the wall thickness, accompanied by reduced anisotropy, for the scaffolds containing HA. Scanning electron micrographs show detectable interconnected pores for the scaffold with pure PLA. Addition of the HA results in agglomeration of the HA particles and reduced leaching of the porogen. Compression tests of the scaffold identify three stages in the stress-strain curve. The addition of HA results in a reduction in the modulus of the scaffold in the first stage of elastic bending of the wall, but this is reversed in the second and third stages of collapse of the wall and densification. In the scaffolds with 85 per cent porosity, the addition of a high percentage of HA could result in a 70 per cent decrease in stiffness in the first stage, a 200 per cent increase in stiffness in the second stage, and a 20 per cent increase in stiffness in the third stage. The results of these tests are compared with the Gibson cellular material model proposed for predicting the behaviour of cellular materials under compression. The pH and molecular weight changes are tracked for the scaffolds over a period of 35 days. The addition of HA keeps the pH in the alkaline region, which results in a higher rate of degradation early in the observation period, followed by a reduced rate of degradation later in the process. The final molecular weight is higher for the scaffolds with HA than for scaffolds of pure PLA. The manufactured scaffolds offer acceptable properties in terms of pore size range, interconnectivity of the pores, and porosity for a non-load-bearing bone graft substitute; however, improvement to the mixing of the PLA and HA phases is required to achieve better integrity of the composite scaffolds. © 2008 IMechE.
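For reference, the general open-cell form of the Gibson cellular (Gibson-Ashby) model invoked above relates the foam's relative stiffness to its relative density; the quadratic expression below is the standard textbook form, not a result from this study:

```latex
\frac{E^{*}}{E_{s}} \approx C \left( \frac{\rho^{*}}{\rho_{s}} \right)^{2},
\qquad C \approx 1
```

where E* and ρ* are the modulus and density of the porous scaffold and E_s and ρ_s those of the solid wall material; a scaffold of 85 per cent porosity thus has ρ*/ρ_s = 0.15.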
Abstract:
The effects of the addition of reinforcing carbon nanotubes (CNTs) into a hydrogenated nitrile-butadiene rubber (HNBR) matrix on the mechanical, dynamic viscoelastic, and permeability properties were studied in this investigation. Different techniques of incorporating nanotubes in HNBR were investigated in this research. The techniques considered were more suitable for industrial preparation of rubber composites. The nanotubes were modified with different surfactants and dispersion agents to improve the compatibility and adhesion of nanotubes on the HNBR matrix. The effects of the surface modification of the nanotubes on various properties were examined in detail. The amount of CNTs was varied from 2.5 to 10 phr in different formulations prepared to identify the optimum CNT levels. A detailed analysis was made to investigate the morphological structure and mechanical behavior at room temperature. The viscoelastic behavior of the nanotube-filled elastomer was studied by dynamic mechanical thermal analysis (DMTA). Morphological analysis indicated a very good dispersion of the CNTs for a low nanotube loading of 3.5 phr. A significant improvement in the mechanical properties was observed with the addition of nanotubes. DMTA studies revealed an increase in the storage modulus and a reduction in the glass-transition temperature after the incorporation of the nanotubes. Further, the HNBR/CNT nanocomposites were subjected to permeability studies. The studies showed a significant reduction in the permeability of nitrogen gas. Copyright © 2011 Wiley Periodicals, Inc.
Abstract:
Drilling of Ti6Al4V is investigated experimentally and numerically. A 3D finite element model is developed based on a Lagrangian approach using the commercial finite element software ABAQUS/Explicit. The complex 3D drill geometry is included in the model. The drilling process simulations are performed for combinations of three cutting speeds and four feed rates. The effects of the cutting parameters on the induced thrust force and torque are predicted by the developed model. For validation purposes, experimental trials were performed under conditions similar to the simulations. The forces and torques measured during the experiments are compared to the results of the finite element analysis, and the agreement between the experimental and FE values of force and torque is very good. Moreover, the surface roughness of the holes was measured for mapping of the machining process. Copyright © 2013 Inderscience Enterprises Ltd.
Abstract:
Since the first launch of the new engineering contract (NEC) in 1993, early warning of problems has been widely recognized as an important approach to proactive management during a construction or engineering project. Is early warning really effective for the improvement of problem solving and project performance? This is a research question that still lacks a good answer. For this reason, an empirical investigation was made in the United Kingdom (U.K.) to answer the question. This study adopts a combination of literature review, expert interviews, and a questionnaire survey. Nearly 100 questionnaire responses were collected from the U.K. construction industry, based on which the use of early warning under different forms of contract is compared in this paper. Problem solving and project performance are further compared between projects using early warning and projects not using early warning. The comparison provides clear evidence for the significant effect of early warning on problem solving and project performance in terms of time, cost, and quality. Subsequently, an input-process-output model is developed in this paper to explore the relationship among early warning, problem solving, and project performance. All of this helps construction researchers and practitioners to better understand the role of early warning in ensuring project success.
Abstract:
Software Product-Line Engineering has emerged in recent years as an important strategy for maximising reuse within the context of a family of related products. In current approaches to software product-lines, there is general agreement that the definition of a reference architecture for the product-line is an important step in the software engineering process. In this paper we introduce ADLARS, a new form of Architecture Description Language that places emphasis on the capture of architectural relationships. ADLARS is designed for use within a product-line engineering process. The language supports both the definition of architectural structure and of important architectural relationships. In particular, it supports capture of the relationships between product features, component and task architectures, interfaces, and parameter requirements.
Abstract:
The 2-D Discrete Cosine Transform (DCT) is widely used as the core of digital image and video compression. In this paper, we present a novel DCT architecture that allows aggressive voltage scaling by exploiting the fact that not all intermediate computations are equally important in a DCT system for obtaining "good" image quality with a Peak Signal to Noise Ratio (PSNR) > 30 dB. This observation has led us to propose a DCT architecture in which the signal paths that are less contributive to PSNR improvement are designed to be longer than the paths that are more contributive to PSNR improvement. It should also be noted that robustness with respect to parameter variations and low-power operation typically impose contradictory requirements on architecture design. However, the proposed architecture lends itself to aggressive voltage scaling for low-power dissipation even under process parameter variations. Under a scaled supply voltage and/or variations in process parameters, any possible delay errors would appear only in the long paths that are less contributive towards PSNR improvement, providing a large improvement in power dissipation with small PSNR degradation. Results show that even under large process variation and supply voltage scaling (0.8 V), there is only a gradual degradation of image quality, with considerable power savings (62.8%) for the proposed architecture when compared to existing implementations in 70 nm process technology.
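The premise that some coefficient paths matter far less to PSNR can be checked in a few lines. This sketch, an illustration of the principle only and not of the proposed circuit, builds an orthonormal 8×8 2-D DCT in numpy and zeroes first a low-frequency and then a high-frequency group of coefficients of a smooth test block (zeroing stands in for an uncorrected delay error, an assumption for illustration), comparing the resulting PSNR.

```python
# Low-frequency DCT coefficients dominate PSNR for smooth content;
# errors confined to high-frequency paths cost little image quality.
import numpy as np

N = 8
n = np.arange(N)
C = np.sqrt(2.0 / N) * np.cos(np.pi * (2 * n[None, :] + 1) * n[:, None] / (2 * N))
C[0, :] = 1.0 / np.sqrt(N)          # orthonormal DCT-II matrix

dct2 = lambda b: C @ b @ C.T
idct2 = lambda c: C.T @ c @ C

def psnr(ref, test, peak=255.0):
    return 10 * np.log10(peak ** 2 / np.mean((ref - test) ** 2))

u = np.linspace(0.0, 1.0, N)        # smooth synthetic 8x8 block
block = 255 * np.exp(-((u[:, None] - 0.3) ** 2 + (u[None, :] - 0.4) ** 2) / 0.1)
coeffs = dct2(block)

for rows, cols, name in [(slice(0, 2), slice(0, 2), "low-frequency "),
                         (slice(6, 8), slice(6, 8), "high-frequency")]:
    corrupted = coeffs.copy()
    corrupted[rows, cols] = 0.0     # simulated uncorrected path error
    print(f"{name} error: PSNR = {psnr(block, idct2(corrupted)):6.1f} dB")
```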