25 results for requirements development


Relevance:

40.00%

Publisher:

Abstract:

Due to copyright restrictions, only available for consultation at Aston University Library and Information Services with prior arrangement.

Relevance:

30.00%

Publisher:

Abstract:

Purpose - Despite the increasing sophistication of new product development (NPD) research, the reliance on traditional approaches to studying NPD has left several areas in need of further research. The authors propose addressing some of these gaps, especially the limited focus on consumer brands, the evaluation criteria used across different project-review points in the NPD process, and the distinction between "kills", "successes", and "failures". Moreover, they propose investigating how screening criteria change across project-review points, using real-time NPD projects. Design/methodology/approach - A postal survey generated 172 usable questionnaires from a sample of European, North American, Far Eastern and Australian consumer packaged-goods firms, providing data on 314 new product projects covering different development and post-commercialization review points. Findings - The results confirm that acceptance-rejection criteria vary through the NPD process. However, financial criteria dominate across all the project-review points. Initial screening is coarse, focusing predominantly on financial criteria. Fit with organizational, product, brand, promotional, and market requirements dominates at the detailed-screen and pre-development evaluation points. At pre-launch, decision-makers focus on product, brand, and promotional criteria. Commercial fit, production synergies, and reliability of the firm's market intelligence are significant discriminators in the post-launch review. Moreover, the importance of marketing and channel issues makes the criteria for screening brands different from those of industrial markets. Originality/value - Although largely descriptive and based on a relatively small sample of consumer-goods firms, the study offers new insights into NPD project evaluation behavior. 
Future, larger-scale investigations covering a broader spectrum of consumer product sectors are needed to validate our results and to explain the reasons behind managers' decisions. © Emerald Group Publishing Limited.

Relevance:

30.00%

Publisher:

Abstract:

Hard real-time systems are a class of computer control systems that must react to demands of their environment by providing `correct' and timely responses. Since these systems are increasingly being used in systems with safety implications, it is crucial that they are designed and developed to operate in a correct manner. This thesis is concerned with developing formal techniques that allow the specification, verification and design of hard real-time systems. Formal techniques for hard real-time systems must be capable of capturing the system's functional and performance requirements, and previous work has proposed a number of techniques which range from the mathematically intensive to those with some mathematical content. This thesis develops formal techniques that contain both an informal and a formal component because it is considered that the informality provides ease of understanding and the formality allows precise specification and verification. Specifically, the combination of Petri nets and temporal logic is considered for the specification and verification of hard real-time systems. Approaches that combine Petri nets and temporal logic by allowing a consistent translation between each formalism are examined. Previously, such techniques have been applied to the formal analysis of concurrent systems. This thesis adapts these techniques for use in the modelling, design and formal analysis of hard real-time systems. The techniques are applied to the problem of specifying a controller for a high-speed manufacturing system. It is shown that they can be used to prove liveness and safety properties, including qualitative aspects of system performance. The problem of verifying quantitative real-time properties is addressed by developing a further technique which combines the formalisms of timed Petri nets and real-time temporal logic. A unifying feature of these techniques is the common temporal description of the Petri net. 
A common problem with Petri-net-based techniques is the complexity associated with generating the reachability graph. This thesis addresses this problem by using concurrency sets to generate a partial reachability graph pertaining to a particular state. These sets also allow each state to be checked for the presence of inconsistencies and hazards. The problem of designing a controller for the high-speed manufacturing system is also considered. The approach adopted involves the use of a model-based controller: this type of controller uses the Petri net models developed, thus preserving the properties already proven of the controller. It also contains a model of the physical system which is synchronised to the real application to provide timely responses. The various ways of forming the synchronisation between these processes are considered and the resulting nets are analysed using concurrency sets.
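The reachability-graph construction that the concurrency-set technique prunes can be sketched as a breadth-first exploration of markings. The following is a minimal Python sketch, not the thesis's notation: the marking representation, the `(pre, post)` arc-weight encoding, and all function names are illustrative assumptions.

```python
from collections import deque

def enabled(marking, pre):
    # A transition is enabled when every input place holds enough tokens.
    return all(marking[p] >= n for p, n in pre.items())

def fire(marking, pre, post):
    # Firing consumes tokens from input places and produces tokens on outputs.
    m = dict(marking)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return m

def reachability_graph(initial, transitions):
    # Breadth-first exploration of every marking reachable from `initial`.
    # `transitions` maps a transition name to a (pre, post) pair of
    # arc-weight dicts. Returns the graph as (marking, transition, marking)
    # edges; for large nets this set explodes, which is exactly the
    # complexity problem the thesis's partial (concurrency-set) graphs avoid.
    seen = {tuple(sorted(initial.items()))}
    graph, queue = [], deque([initial])
    while queue:
        m = queue.popleft()
        for name, (pre, post) in transitions.items():
            if enabled(m, pre):
                m2 = fire(m, pre, post)
                graph.append((m, name, m2))
                key = tuple(sorted(m2.items()))
                if key not in seen:
                    seen.add(key)
                    queue.append(m2)
    return graph

# A two-place net: transition t moves a token from p1 to p2.
edges = reachability_graph({"p1": 1, "p2": 0}, {"t": ({"p1": 1}, {"p2": 1})})
```

Liveness and safety questions then reduce to queries over this edge set, which is why bounding its size matters so much in practice.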

Relevance:

30.00%

Publisher:

Abstract:

A major application of computers has been to control physical processes, in which the computer is embedded within some large physical process and is required to control concurrent physical processes. The main difficulty with these systems is their event-driven characteristics, which complicate their modelling and analysis. Although a number of researchers in the process system community have approached the problems of modelling and analysis of such systems, there is still a lack of standardised software development formalisms for system (controller) development, particularly at the early stages of the system design cycle. This research forms part of a larger research programme concerned with the development of real-time process-control systems in which software is used to control concurrent physical processes. The general objective of the research in this thesis is to investigate the use of formal techniques in the analysis of such systems at the early stages of their development, with a particular bias towards an application to high-speed machinery. Specifically, the research aims to generate a standardised software development formalism for real-time process-control systems, particularly for software controller synthesis. In this research, a graphical modelling formalism called Sequential Function Chart (SFC), a variant of Grafcet, is examined. SFC, which is defined in the international standard IEC1131 as a graphical description language, has been used widely in industry and has achieved an acceptable level of maturity and acceptance. A comparative study between SFC and Petri nets is presented in this thesis. To overcome identified inaccuracies in the SFC, a formal definition of the firing rules for SFC is given. To provide a framework in which SFC models can be analysed formally, an extended time-related Petri net model for SFC is proposed and the transformation method is defined. 
The SFC notation lacks a systematic way of synthesising system models from real-world systems. Thus a standardised approach to the development of real-time process-control systems is required such that the system (software) functional requirements can be identified, captured, and analysed. A rule-based approach and a method called the system behaviour driven method (SBDM) are proposed as a development formalism for real-time process-control systems.

Relevance:

30.00%

Publisher:

Abstract:

The combination of dimethyl dioctadecyl ammonium bromide (DDA) and the synthetic cord factor trehalose dibehenate (TDB) with Ag85B-ESAT-6 (H1 fusion protein) has been found to promote strong protective immune responses against Mycobacterium tuberculosis. The development of a vaccine formulation that is able to meet the requirements of sterility and stability, and to generate a vaccine product with an acceptable composition, shelf-life and safety profile, may necessitate selected alterations in vaccine formulation. This study describes the implementation of a sterilisation protocol and the use of selected lyoprotective agents in order to fulfil these requirements. Concomitantly, any alterations in physico-chemical characteristics and in parameters of immunogenicity have been closely examined for this promising DDA liposome-based tuberculosis vaccine. The study addresses the extensive guidelines on parameters for non-clinical assessment, suitable for liposomal vaccines and other vaccine delivery systems, issued by the World Health Organisation (WHO) and the European Medicines Agency (EMEA). Physical and chemical stability was observed following alteration in formulations to include novel cryoprotectants and radiation sterilisation. Immunogenicity was maintained following these alterations and was even improved by modification with lysine as the cryoprotective agent for sterilised formulations. Taken together, these results outline the successful alteration of a liposomal vaccine, representing improved formulations by rational modification, whilst maintaining biological activity.

Relevance:

30.00%

Publisher:

Abstract:

The traditional waterfall software life cycle model has several weaknesses. One problem is that a working version of a system is unavailable until a late stage in the development; any omissions and mistakes in the specification undetected until that stage can be costly to maintain. The operational approach which emphasises the construction of executable specifications can help to remedy this problem. An operational specification may be exercised to generate the behaviours of the specified system, thereby serving as a prototype to facilitate early validation of the system's functional requirements. Recent ideas have centred on using an existing operational method such as JSD in the specification phase of object-oriented development. An explicit transformation phase following specification is necessary in this approach because differences in abstractions between the two domains need to be bridged. This research explores an alternative approach of developing an operational specification method specifically for object-oriented development. By incorporating object-oriented concepts in operational specifications, the specifications have the advantage of directly facilitating implementation in an object-oriented language without requiring further significant transformations. In addition, object-oriented concepts can help the developer manage the complexity of the problem domain specification, whilst providing the user with a specification that closely reflects the real world and so the specification and its execution can be readily understood and validated. A graphical notation has been developed for the specification method which can capture the dynamic properties of an object-oriented system. A tool has also been implemented comprising an editor to facilitate the input of specifications, and an interpreter which can execute the specifications and graphically animate the behaviours of the specified systems.

Relevance:

30.00%

Publisher:

Abstract:

This thesis first considers the calibration and signal processing requirements of a neuromagnetometer for the measurement of human visual function. Gradiometer calibration using straight wire grids is examined and optimal grid configurations determined, given realistic constructional tolerances. Simulations show that for gradiometer balance of 1:10⁴ and wire spacing error of 0.25mm the achievable calibration accuracy of gain is 0.3%, of position is 0.3mm and of orientation is 0.6°. Practical results with a 19-channel 2nd-order gradiometer-based system exceed this performance. The real-time application of adaptive reference noise cancellation filtering to running-average evoked response data is examined. In the steady state, the filter can be assumed to be driven by a non-stationary step input arising at epoch boundaries. Based on empirical measures of this driving step an optimal progression for the filter time constant is proposed which improves upon fixed time constant filter performance. The incorporation of the time-derivatives of the reference channels was found to improve the performance of the adaptive filtering algorithm by 15-20% for unaveraged data, falling to 5% with averaging. The thesis concludes with a neuromagnetic investigation of evoked cortical responses to chromatic and luminance grating stimuli. The global magnetic field power of evoked responses to the onset of sinusoidal gratings was shown to have distinct chromatic and luminance sensitive components. Analysis of the results, using a single equivalent current dipole model, shows that these components arise from activity within two distinct cortical locations. Co-registration of the resulting current source localisations with MRI shows a chromatically responsive area lying along the midline within the calcarine fissure, possibly extending onto the lingual and cuneal gyri. It is postulated that this area is the human homologue of the primate cortical area V4.
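At the core of adaptive reference noise cancellation is the least-mean-squares (LMS) weight update. The following is a minimal single-reference Python sketch under stated assumptions: the step size `mu`, tap count `taps`, and function name are illustrative, and the thesis's variant goes further by also feeding in the time-derivatives of the reference channels and by varying the filter time constant across epochs.

```python
def lms_cancel(primary, reference, mu=0.01, taps=4):
    # Least-mean-squares adaptive noise canceller: estimate the noise
    # component of the primary channel from the reference channel and
    # subtract the estimate, leaving the cleaned signal.
    w = [0.0] * taps                       # adaptive filter weights
    cleaned = []
    for n in range(len(primary)):
        # Most recent `taps` reference samples, zero-padded at the start.
        x = [reference[n - k] if n - k >= 0 else 0.0 for k in range(taps)]
        y = sum(wi * xi for wi, xi in zip(w, x))   # current noise estimate
        e = primary[n] - y                         # error = cleaned output
        w = [wi + 2.0 * mu * e * xi for wi, xi in zip(w, x)]  # LMS update
        cleaned.append(e)
    return cleaned

# When the primary channel is pure reference noise, the canceller
# learns to remove almost all of it.
residual = lms_cancel(primary=[1.0] * 200, reference=[1.0] * 200)
```

Adding derivative reference inputs, as the thesis does, amounts to widening `x` with differenced reference samples so the filter can also track phase-shifted noise components.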

Relevance:

30.00%

Publisher:

Abstract:

This study is concerned with labour productivity in traditional house building in Scotland. Productivity is a measure of the effective use of resources and provides vital benefits that can be combined in a number of ways. The introduction gives the background to two Scottish house building sites (Blantyre and Greenfield) that were surveyed by the Building Research Establishment (BRE) activity sampling method to provide the data for the study. The study had two main objectives: (1) summary data analysis in average manhours per house between all the houses on the site, and (2) detailed data analysis in average manhours for each house block on the site. The introduction also provides a literature review related to the objectives. The method is outlined in Chapter 2, the sites are discussed in Chapter 3, and Chapter 4 covers the method application on each site and a method development made in the study. The summary data analysis (Chapter 5) compares Blantyre and Greenfield, and two previous BRE surveys in England. The main detailed data analysis consisted of three forms (Chapters 6, 7 and 8), each applied to a set of operations. The three forms of analysis were variations in average manhours per house for each house block on the site compared with: (1) block construction order, (2) average number of separate visits per house made by operatives to each block to complete an operation, and (3) average number of different operatives per house employed on an operation in each block. Three miscellaneous items of detailed data analysis are discussed in Chapter 9. The conclusions to the whole study state that considerable variations in manhours for repeated operations were discovered, that the numbers of visits by operatives to complete operations were large, and that the numbers of different operatives employed in some operations were a factor related to productivity. 
A critique of the activity sampling method suggests that the data produced is reliable in summary form and can give a good context for more detailed data collection. For future work, this could take the form of selected operations, within the context of an activity sampling survey, that would be intensively surveyed by other methods.

Relevance:

30.00%

Publisher:

Abstract:

A comprehensive survey of industrial sites and heat recovery products revealed gaps between equipment that was required and that which was available. Two heat recovery products were developed to fill those gaps: a gas-to-gas modular heat recovery unit, and a gas-to-liquid exhaust gas heat exchanger. The former provided an entire heat recovery system in one unit. It was specifically designed to overcome the problems associated with existing component systems: large design commitment, extensive installation and incompatibility between parts. The unit was intended to recover heat from multiple waste gas sources and, in particular, from baking ovens. A survey of the baking industry defined typical waste gas temperatures and flow rates, around which the unit was designed. The second unit was designed to recover heat from the exhaust gases of small diesel engines. The developed unit differed from existing designs by having a negligible effect on engine performance. In marketing terms these products are conceptual opposites. The first, a 'product-push' product generated from site and product surveys, required marketing following design. The second, a 'market-pull' product, resulted from a specific user need; this had a captive market and did not require marketing. Here marketing was replaced by commercial aspects including the protection of ideas, contracting, tendering and insurance requirements. These two product development routes are compared and contrasted. As a general conclusion this work suggests that it can be beneficial for small companies (as was the sponsor of this project) to undertake projects of the market-pull type. Generally they have a higher probability of success and are less capital intensive than their product-push counterparts. Development revealed shortcomings in three other fields: British Standards governing heat exchangers; financial assessment of energy saving schemes; and the degree-day procedure for calculating energy savings. 
Methods are proposed to overcome these shortcomings.

Relevance:

30.00%

Publisher:

Abstract:

The present scarcity of operational knowledge-based systems (KBS) has been attributed, in part, to an inadequate consideration shown to user interface design during development. From a human factors perspective the problem has stemmed from an overall lack of user-centred design principles. Consequently the integration of human factors principles and techniques is seen as a necessary and important precursor to ensuring the implementation of KBS which are useful to, and usable by, the end-users for whom they are intended. Focussing upon KBS work taking place within commercial and industrial environments, this research set out to assess both the extent to which human factors support was presently being utilised within development, and the future path for human factors integration. The assessment consisted of interviews conducted with a number of commercial and industrial organisations involved in KBS development, and a set of three detailed case studies of individual KBS projects. Two of the studies were carried out within a collaborative Alvey project, involving the Interdisciplinary Higher Degrees Scheme (IHD) at the University of Aston in Birmingham, BIS Applied Systems Ltd (BIS), and the British Steel Corporation. This project, which had provided the initial basis and funding for the research, was concerned with the application of KBS to the design of commercial data processing (DP) systems. The third study stemmed from involvement on a KBS project being carried out by the Technology Division of the Trustee Savings Bank Group plc. The preliminary research highlighted poor human factors integration. In particular, there was a lack of early consideration of end-user requirements definition and user-centred evaluation. Instead concentration was given to the construction of the knowledge base and prototype evaluation with the expert(s). 
In response to this identified problem, a set of methods was developed that was aimed at encouraging developers to consider user interface requirements early on in a project. These methods were then applied in the two further projects, and their uptake within the overall development process was monitored. Experience from the two studies demonstrated that early consideration of user interface requirements was both feasible, and instructive for guiding future development work. In particular, it was shown that a user interface prototype could be used as a basis for capturing requirements at the functional (task) level, and at the interface dialogue level. Extrapolating from this experience, a KBS life-cycle model is proposed which incorporates user interface design (and within that, user evaluation) as a largely parallel, rather than subsequent, activity to knowledge base construction. Further to this, there is a discussion of several key elements which can be seen as inhibiting the integration of human factors within KBS development. These elements stem from characteristics of present KBS development practice; from constraints within the commercial and industrial development environments; and from the state of existing human factors support.

Relevance:

30.00%

Publisher:

Abstract:

The starting point of this research was the belief that manufacturing and similar industries need help with the concept of e-business, especially in assessing the relevance of possible e-business initiatives. The research hypothesis was that it should be possible to produce a systematic model that defines, at a useful level of detail, the probable e-business requirements of an organisation based on objective criteria with an accuracy of 85%-90%. This thesis describes the development and validation of such a model. A preliminary model was developed from a variety of sources, including a survey of current and planned e-business activity and representative examples of e-business material produced by e-business solution providers. The model was subject to a process of testing and refinement based on recursive case studies, with controls over the improving accuracy and stability of the model. Useful conclusions were also possible as to the relevance of e-business functions to the case study participants themselves. Techniques were evolved to synthesise the e-business requirements of an organisation and present them at a management-summary level of detail. The results of applying these techniques to all the case studies used in this research were discussed. The conclusion of the research was that the case study methodology employed was successful. A model was achieved suitable for practical application in a manufacturing organisation requiring help with a requirements definition process.

Relevance:

30.00%

Publisher:

Abstract:

The need for improvement in the development of research careers and researchers’ training in transferable skills was highlighted in two particular recommendations (numbers 4.2 and 5.3) in the 2002 report ‘SET for success: the report of Sir Gareth Roberts’ Review - the supply of people with science, technology, engineering and mathematics skills’ (Roberts, 2002). As a consequence of that review, Research Councils UK (RCUK) have invested about £120 million, usually referred to as ‘Roberts’ Money’, in research organisations to address this concern in all research disciplines. The last ‘Roberts’ Money’ payment will be for the period up to March 2011; it was therefore proposed to assess the progress made with taking forward these specific recommendations. An independent panel was formed by RCUK to undertake this review in 2010. The terms of reference for the panel are in Annex A. In summary, the panel was asked to review progress made and to advise RCUK and the higher education (HE) sector about future requirements for the development and training of researchers. In the course of their review, the panel considered a wide range of existing reports, interviewed key stakeholders in the HE sector and elsewhere, as well as drawing on their own knowledge and expertise. This report presents the findings of the panel’s review.

Relevance:

30.00%

Publisher:

Abstract:

Requirements are sensitive to the context in which the system-to-be must operate. Where such context is well-understood and is static or evolves slowly, existing RE techniques can be made to work well. Increasingly, however, development projects are being challenged to build systems to operate in contexts that are volatile over short periods in ways that are imperfectly understood. Such systems need to be able to adapt to new environmental contexts dynamically, but the contextual uncertainty that demands this self-adaptive ability makes it hard to formulate, validate and manage their requirements. Different contexts may demand different requirements trade-offs. Unanticipated contexts may even lead to entirely new requirements. To help counter this uncertainty, we argue that requirements for self-adaptive systems should be run-time entities that can be reasoned over in order to understand the extent to which they are being satisfied and to support adaptation decisions that can take advantage of the systems' self-adaptive machinery. We take our inspiration from the fact that explicit, abstract representations of software architectures used to be considered design-time-only entities but computational reflection showed that architectural concerns could be represented at run-time too, helping systems to dynamically reconfigure themselves according to changing context. We propose to use analogous mechanisms to achieve requirements reflection. In this paper we discuss the ideas that support requirements reflection as a means to articulate some of the outstanding research challenges.
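The notion of requirements as run-time entities that the system can reason over might be sketched as follows. This is a hypothetical Python illustration, not an API from the paper: the `RuntimeRequirement` class, the context keys, and the threshold values are all invented for the example.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class RuntimeRequirement:
    # A requirement reified as a run-time entity: it carries a satisfaction
    # predicate over the currently sensed context, so the running system
    # can query the extent to which it is being met.
    name: str
    satisfied_by: Callable[[Dict[str, float]], bool]

    def check(self, context: Dict[str, float]) -> bool:
        return self.satisfied_by(context)

def adaptation_needed(requirements: List[RuntimeRequirement],
                      context: Dict[str, float]) -> List[str]:
    # The self-adaptive machinery reflects on its own requirements:
    # any requirement unsatisfied in the current context is a candidate
    # trigger for dynamic reconfiguration.
    return [r.name for r in requirements if not r.check(context)]

reqs = [RuntimeRequirement("response_time", lambda c: c["latency_ms"] <= 200),
        RuntimeRequirement("battery_life", lambda c: c["battery_pct"] >= 10)]
violated = adaptation_needed(reqs, {"latency_ms": 350, "battery_pct": 40})
# violated == ["response_time"]
```

The analogy to architectural reflection is direct: just as a reflective system holds a causally connected model of its own architecture, here it holds queryable models of its own requirements.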

Relevance:

30.00%

Publisher:

Abstract:

As a subset of the Internet of Things (IoT), the Web of Things (WoT) shares many characteristics with wireless sensor and actuator networks (WSANs) and ubiquitous computing systems (Ubicomp). Yet to a far greater degree than the IoT, WSANs or Ubicomp, the WoT will integrate physical and information objects, necessitating a means to model and reason about a range of context types that have hitherto received little or no attention from the RE community. RE practice is only now developing the means to support WSANs and Ubicomp system development, including faltering first steps in the representation of context. We argue that these techniques will need to be developed further, with a particular focus on rich context types, if RE is to support WoT application development. © 2012 Springer-Verlag.

Relevance:

30.00%

Publisher:

Abstract:

Dynamically adaptive systems (DASs) are intended to monitor the execution environment and then dynamically adapt their behavior in response to changing environmental conditions. The uncertainty of the execution environment is a major motivation for dynamic adaptation; it is impossible to know at development time all of the possible combinations of environmental conditions that will be encountered. To date, the work performed in requirements engineering for a DAS includes requirements monitoring and reasoning about the correctness of adaptations, where the DAS requirements are assumed to exist. This paper introduces a goal-based modeling approach to develop the requirements for a DAS, while explicitly factoring uncertainty into the process and resulting requirements. We introduce a variation of threat modeling to identify sources of uncertainty and demonstrate how the RELAX specification language can be used to specify more flexible requirements within a goal model to handle the uncertainty. © 2009 Springer Berlin Heidelberg.
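The flavour of a RELAX-style relaxation can be conveyed by replacing a crisp pass/fail requirement with a satisfaction degree. The sketch below is a deliberate simplification in Python: RELAX's actual semantics are given in a fuzzy temporal logic, and the operator name, `tolerance` parameter, and linear degradation are assumptions made for illustration only.

```python
def as_close_as_possible_to(target: float, tolerance: float):
    # Sketch of a RELAX-like operator: instead of a crisp pass/fail check,
    # a relaxed requirement yields a satisfaction degree in [0, 1] that
    # degrades linearly as the observed value drifts from the target,
    # giving the DAS room to adapt under environmental uncertainty.
    def degree(observed: float) -> float:
        return max(0.0, 1.0 - abs(observed - target) / tolerance)
    return degree

# "The sampling interval SHALL be AS CLOSE AS POSSIBLE TO 60 seconds."
sat = as_close_as_possible_to(target=60.0, tolerance=30.0)
sat(60.0)   # 1.0  (fully satisfied)
sat(75.0)   # 0.5  (partially satisfied)
sat(120.0)  # 0.0  (violated)
```

Within a goal model, degrees like these let non-critical goals bend under uncertain environmental conditions while invariant goals keep their crisp semantics.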