78 results for Requirements specifications
Abstract:
As a class of defects in software requirements specifications, inconsistency has been widely studied in both requirements engineering and software engineering. It has been increasingly recognized that maintaining consistency alone often results in other types of non-canonical requirements, including incompleteness of a requirements specification, vague requirements statements, and redundant requirements statements. It is therefore desirable for inconsistency handling to take the related non-canonical requirements into account in requirements engineering. To address this issue, we propose an intuitive generalization of logical techniques for handling inconsistency to techniques suitable for managing non-canonical requirements, which address incompleteness and redundancy in addition to inconsistency. We first argue that measuring non-canonical requirements plays a crucial role in handling them effectively. We then present a measure-driven logic framework for managing non-canonical requirements. The framework consists of five main parts: identifying non-canonical requirements, measuring them, generating candidate proposals for handling them, choosing commonly acceptable proposals, and revising the requirements according to the chosen proposals. This generalization can be considered an attempt to handle non-canonical requirements along with logic-based inconsistency handling in requirements engineering.
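To make the five-part workflow concrete, here is a minimal, runnable Python sketch over toy propositional requirements. All function names and the defect measures are invented for illustration; the paper's actual framework is a logic-based formalism and is not reproduced here.

```python
# Toy sketch of the five-part, measure-driven workflow described in the
# abstract. Requirements are modelled as propositional literals; the
# identification checks and measures are deliberately simplistic.

def negate(lit):
    return lit[1:] if lit.startswith("~") else "~" + lit

def identify(reqs):
    """Part 1: flag inconsistent pairs and redundant duplicates (toy checks)."""
    issues = [("inconsistent", r) for r in reqs if negate(r) in reqs]
    seen = set()
    for r in reqs:
        if r in seen:
            issues.append(("redundant", r))
        seen.add(r)
    return issues

def measure(issues, reqs):
    """Part 2: a toy measure, the fraction of statements per defect kind."""
    return {k: sum(1 for kind, _ in issues if kind == k) / max(len(reqs), 1)
            for k in ("inconsistent", "redundant")}

def propose(issues):
    """Part 3: candidate proposal, drop one statement per flagged issue."""
    return [("drop", r) for _, r in issues]

def choose(proposals, scores):
    """Part 4: accept proposals only when some defect measure is nonzero."""
    return proposals if max(scores.values(), default=0) > 0 else []

def revise(reqs, accepted):
    """Part 5: revise the specification according to the chosen proposals."""
    dropped = {r for _, r in accepted}
    out = []
    for r in reqs:
        if r in dropped:
            dropped.discard(r)   # drop one occurrence only
        else:
            out.append(r)
    return out

reqs = ["p", "~p", "q", "q"]
issues = identify(reqs)
scores = measure(issues, reqs)
accepted = choose(propose(issues), scores)
print(revise(reqs, accepted))   # ['q']: the contradiction and duplicate are gone
```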
Towards an understanding of the causes and effects of software requirements change: two case studies
Abstract:
Changes to software requirements not only pose a risk to the successful delivery of software applications but also provide opportunities for improved usability and value. Increased understanding of the causes and consequences of change can support requirements management and also make progress towards the goal of change anticipation. This paper presents the results of two case studies that address objectives arising from that ultimate goal. The first case study evaluated the potential of a change source taxonomy containing the elements ‘market’, ‘organisation’, ‘vision’, ‘specification’, and ‘solution’ to provide a meaningful basis for change classification and measurement. The second case study investigated whether the requirements attributes of novelty, complexity, and dependency correlated with requirements volatility. While insufficient data in the first case study precluded an investigation of changes arising from the ‘market’ source, for the remaining change sources the results indicate significant differences in cost, value to the customer, and management considerations. Findings show that higher-cost and higher-value changes arose more often from ‘organisation’ and ‘vision’ sources; these changes also generally involved the co-operation of more stakeholder groups and were considered less controllable than changes arising from the ‘specification’ or ‘solution’ sources. Results from the second case study indicate that only ‘requirements dependency’ is consistently correlated with volatility and that changes coming from each change source affect different groups of requirements. We conclude that the taxonomy can provide a meaningful means of change classification, but that a single requirement attribute is insufficient for change prediction. A theoretical causal account of requirements change is drawn from the implications of the combined results of the two case studies.
Abstract:
Increasingly, infrastructure providers are supplying the cloud marketplace with storage and on-demand compute resources to host cloud applications. From an application user's point of view, it is desirable to identify the most appropriate set of available resources on which to execute an application. Resource choice can be complex and may involve comparing available hardware specifications, operating systems, value-added services, such as network configuration or data replication, and operating costs, such as hosting cost and data throughput. Providers' cost models often change, and new commodity cost models, such as spot pricing, have been introduced to offer significant savings. In this paper, a software abstraction layer is used to discover infrastructure resources for a particular application, across multiple providers, by using a two-phase constraints-based approach. In the first phase, a set of possible infrastructure resources is identified for a given application. In the second phase, a heuristic is used to select the most appropriate resources from the initial set. For some applications a cost-based heuristic is most appropriate; for others a performance-based heuristic may be used. A financial services application and a high-performance computing application are used to illustrate the execution of the proposed resource discovery mechanism. The experimental results show that the proposed model can dynamically select an appropriate set of resources that match the application's requirements.
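As a rough illustration of the two-phase approach, the following Python sketch filters a hypothetical resource catalogue with hard constraints (phase one) and then ranks the feasible set with either a cost-based or a performance-based heuristic (phase two). The resource fields, values, and heuristics are invented stand-ins, not the paper's actual model.

```python
# Phase 1: constraint filtering; Phase 2: heuristic selection.
# Catalogue entries and requirements are hypothetical examples.

resources = [
    {"provider": "A", "cores": 8,  "mem_gb": 32, "cost_per_hr": 0.40},
    {"provider": "B", "cores": 16, "mem_gb": 64, "cost_per_hr": 0.90},
    {"provider": "C", "cores": 4,  "mem_gb": 16, "cost_per_hr": 0.15},
]

def phase_one(resources, constraints):
    """Keep only resources that satisfy every hard constraint."""
    return [r for r in resources if all(c(r) for c in constraints)]

def phase_two(candidates, heuristic):
    """Rank the feasible set with an application-specific heuristic."""
    return min(candidates, key=heuristic) if candidates else None

# Application requirements expressed as constraint predicates.
constraints = [lambda r: r["cores"] >= 8, lambda r: r["mem_gb"] >= 32]
feasible = phase_one(resources, constraints)

# A cost-based heuristic (e.g. the financial services application) ...
cheapest = phase_two(feasible, lambda r: r["cost_per_hr"])
# ... versus a performance-based one (e.g. the high-performance application).
fastest = phase_two(feasible, lambda r: -r["cores"])

print(cheapest["provider"], fastest["provider"])  # A B
```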
Abstract:
Free fatty acid receptor 2 (FFA2; GPR43) is a G protein-coupled seven-transmembrane receptor for short-chain fatty acids (SCFAs) that is implicated in inflammatory and metabolic disorders. The SCFA propionate has close to optimal ligand efficiency for FFA2 and can hence be considered highly potent given its size. Propionate, however, does not discriminate between FFA2 and the closely related receptor FFA3 (GPR41). To identify FFA2-selective ligands and understand the molecular basis for FFA2 selectivity, a targeted library of small carboxylic acids was examined using holistic, label-free dynamic mass redistribution technology for primary screening and the receptor-proximal G protein [³⁵S]guanosine 5'-(3-O-thio)triphosphate activation, inositol phosphate, and cAMP accumulation assays for hit confirmation. Structure-activity relationship analysis allowed formulation of a general rule to predict selectivity for small carboxylic acids at the orthosteric binding site: ligands with substituted sp³-hybridized α-carbons preferentially activate FFA3, whereas ligands with sp²- or sp-hybridized α-carbons prefer FFA2. The orthosteric binding mode was verified by site-directed mutagenesis: replacement of orthosteric site arginine residues by alanine in FFA2 prevented ligand binding, and molecular modeling predicted the detailed mode of binding. Based on this, selective mutation of three residues to their non-conserved counterparts in FFA3 was sufficient to transfer FFA3 selectivity to FFA2. Thus, selective activation of FFA2 via the orthosteric site is achievable with rather small ligands, a finding with significant implications for the rational design of therapeutic compounds selectively targeting the SCFA receptors.
Abstract:
The use of dataflow digital signal processing system modelling and synthesis techniques has been a fruitful research theme for many years and has yielded many powerful rapid system synthesis and optimisation capabilities. However, recent years have seen the spectrum of languages and techniques splinter in an application-specific manner, resulting in an ad hoc design process which is increasingly dependent on the particular application under development. This poses a major problem for automated toolflows attempting to provide rapid system synthesis for a wide range of applications. By analysing a number of dataflow FPGA implementation case studies, this paper shows that despite this, common traits may be found in current techniques, which fall largely into three classes. Further, it exposes limitations pertaining to their ability to adapt algorithm models to implementations for different operating environments and target platforms.
Abstract:
Key stakeholders in the UK charity sector have, in recent years, advocated greater accountability for charity performance. Part of that debate has focussed on the use of conversion ratios as indicators of efficiency, with importance to stakeholders being contrasted with charities’ apparent reluctance to report such measures. Whilst, before 2005, conversion ratios could have been computed from financial statements, changes in the UK charity SORP have radically altered the ability of users to do this. This article explores the impact on the visibility of such information through an analysis of the financial statements of large UK charities before and after the 2005 changes. Overall, the findings suggest that, despite the stated intention of increasing transparency in respect of charity costs, the application of the changes has resulted in charities ‘managing’ the numbers and limiting their disclosures, possibly to the detriment of external stakeholders.
Abstract:
BACKGROUND:
Glaucoma is a leading cause of blindness. Early detection is advocated, but there is insufficient evidence from randomized controlled trials (RCTs) to inform health policy on population screening. In particular, there is no agreed screening intervention. For a screening programme, agreement is required on the screening tests to be used, either individually or in combination, the person to deliver the test, and the location where testing should take place. This study aimed to use ophthalmologists (experienced glaucoma subspecialists), optometrists, ophthalmic nurses and patients to develop a reduced set of potential screening tests and testing arrangements that could then be explored in depth in a further study of their feasibility for evaluation in a glaucoma screening RCT.
METHODS:
A two-round Delphi survey involving 38 participants was conducted. Materials were developed from a prior evidence synthesis. In round one, after some initial priming questions in four domains, specialists were asked to nominate three screening interventions, an intervention being a combination of the four domains: target population (age and higher-risk groups), site, screening test, and test operator (provider). More than 250 screening interventions were identified. In round two, responses were condensed into 72 interventions, and each was rated by participants on a 0-10 scale in terms of feasibility.
RESULTS:
Using a cut-off median feasibility rating of ≥5.5 as evidence of agreement on intervention feasibility (illustrated in the sketch after this abstract), six interventions were identified from round two. These were initiating screening at age 50, with a combination of two or three screening tests (varying combinations of tonometry and measures of visual function and optic nerve damage), organized in a community setting with an ophthalmic-trained technical assistant delivering the tests. An alternative intervention was a 'glaucoma risk score' ascertained by questionnaire. The advisory panel recommended further exploration of the feasibility of screening higher-risk populations and detailed specification of the screening tests.
CONCLUSIONS:
With systematic use of expert opinions, a shortlist of potential screening interventions was identified. Views of users, service providers and cost-effectiveness modeling are now required to identify a feasible intervention to evaluate in a future glaucoma screening trial.
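For illustration, the round-two shortlisting rule reported above (retain an intervention when its median feasibility rating on the 0-10 scale is at least 5.5) can be sketched in a few lines of Python. The intervention names and ratings below are invented; only the cut-off rule comes from the abstract.

```python
# Toy illustration of the Delphi round-two shortlisting rule:
# keep an intervention when the median of its feasibility ratings is >= 5.5.
from statistics import median

ratings = {
    "community testing, age 50+, technician-delivered": [7, 8, 6, 9, 5],
    "questionnaire-based glaucoma risk score":           [6, 6, 7, 5, 8],
    "hospital-based single-test screening":              [3, 4, 5, 2, 6],
}

shortlist = [name for name, scores in ratings.items()
             if median(scores) >= 5.5]
print(shortlist)   # the first two interventions pass; the third (median 4) does not
```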