43 results for Software design process
Abstract:
When object databases arrived on the scene some ten years ago, they provided database capabilities for previously neglected, complex applications, such as CAD, but were burdened with one inherent teething problem: poor performance. Physical database design is one tool that can provide performance improvements, and it is the general area of concern for this thesis. Clustering is one fruitful design technique which can provide improvements in performance. However, clustering in object databases has not been explored in depth and so has not been truly exploited. Further, clustering, although a physical concern, can be determined from the logical model. The object model is richer than previous models, notably the relational model, and so it is anticipated that the opportunities with respect to clustering are greater. This thesis provides a thorough analysis of object clustering strategies with a view to highlighting any links between the object logical and physical models and improving performance. This is achieved by considering all possible types of object logical model construct and the implementation of those constructs in terms of theoretical clustering strategies to produce actual clustering arrangements. This analysis results in a greater understanding of object clustering strategies, aiding designers in the development process and providing some valuable rules of thumb to support the design process.
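The kind of structural clustering the thesis analyses can be illustrated with a deliberately simple sketch. The strategy, page size and object graph below are hypothetical illustrations, not the thesis's actual algorithms: objects reached along reference links from a root are placed on the same page until the page fills.

```python
def cluster_by_traversal(objects, refs, page_size):
    """Greedy depth-first clustering: each object and the objects it
    references are placed on the same page until the page fills."""
    pages, current, seen = [], [], set()

    def visit(obj):
        nonlocal current
        if obj in seen:
            return
        seen.add(obj)
        if len(current) >= page_size:
            pages.append(current)
            current = []
        current.append(obj)
        for child in refs.get(obj, ()):
            visit(child)

    for obj in objects:
        visit(obj)
    if current:
        pages.append(current)
    return pages

# Hypothetical object graph: 'a' references 'b', 'c' references 'd'.
pages = cluster_by_traversal(['a', 'b', 'c', 'd'], {'a': ['b'], 'c': ['d']}, 2)
```

Because placement follows the reference structure, related objects end up co-located, which is the performance intuition behind deriving physical clustering from the logical model.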
Abstract:
This thesis presents a new approach to designing large organizational databases. The approach emphasizes the need for a holistic view of the design process. The development of the proposed approach was based on a comprehensive examination of the issues of relevance to the design and utilization of databases. Such issues include conceptual modelling, organization theory, and semantic theory. The conceptual modelling approach presented in this thesis is developed over three design stages, or model perspectives. In the semantic perspective, concept definitions were developed based on established semantic principles. Such definitions rely on meaning, provided by intension and extension, to determine intrinsic conceptual definitions. A tool, called meaning-based classification (MBC), is devised to classify concepts based on meaning. Concept classes are then integrated using concept definitions and a set of semantic relations which rely on concept content and form. In the application perspective, relationships are semantically defined according to the application environment. Relationship definitions include explicit relationship properties and constraints. The organization perspective introduces a new set of relations specifically developed to maintain conformity of conceptual abstractions with the nature of information abstractions implied by user requirements throughout the organization. Such relations are based on the stratification of work hierarchies, defined elsewhere in the thesis. Finally, an example of an application of the proposed approach is presented to illustrate the applicability and practicality of the modelling approach.
Abstract:
Background: We introduced a series of computer-supported workshops in our undergraduate statistics courses, in the hope that they would help students to gain a deeper understanding of statistical concepts. This raised questions about the appropriate design of the Virtual Learning Environment (VLE) in which such an approach had to be implemented. Therefore, we investigated two competing software design models for VLEs. In the first system, all learning features were a function of the classical VLE. The second system was designed from the perspective that learning features should be a function of the course's core content (statistical analyses), which required us to develop a specific-purpose Statistical Learning Environment (SLE) based on Reproducible Computing and newly developed Peer Review (PR) technology. Objectives: The main research question is whether the second VLE design improved learning efficiency as compared to the standard type of VLE design that is commonly used in education. As a secondary objective we provide empirical evidence about the usefulness of PR as a constructivist learning activity which supports non-rote learning. Finally, this paper illustrates that it is possible to introduce a constructivist learning approach in large student populations, based on adequately designed educational technology, without subsuming educational content to technological convenience. Methods: Both VLE systems were tested within a two-year quasi-experiment based on a Reliable Nonequivalent Group Design. This approach allowed us to draw valid conclusions about the treatment effect of the changed VLE design, even though the systems were implemented in successive years. The methodological aspects of the experiment's internal validity are explained extensively.
Results: The effect of the design change is shown to have substantially increased the efficiency of constructivist, computer-assisted learning activities for all cohorts of the student population under investigation. The findings demonstrate that a content-based design outperforms the traditional VLE-based design. © 2011 Wessa et al.
Abstract:
While mobile devices offer many innovative possibilities to help increase the standard of living for individuals with disabilities and other special needs, the process of developing assistive technology, such that it will be effective across a group of individuals with a particular disability, can be extremely challenging. This chapter discusses key issues and trends related to designing and evaluating mobile assistive technology for individuals with disabilities. Following an overview of general design process issues, we argue (based on current research trends) that individuals with disabilities and domain experts be involved throughout the development process. While this, in itself, presents its own set of challenges, many strategies have successfully been used to overcome the difficulties and maximize the contributions of users and experts alike. Guidelines based on these strategies are discussed and are illustrated with real examples from one of our active research projects.
Abstract:
Our paper presents the work of the Cuneiform Digital Forensic Project (CDFP), an interdisciplinary project at The University of Birmingham concerned with the development of a multimedia database to support scholarly research into cuneiform: wedge-shaped writing impressed onto clay tablets, and the earliest true form of writing. We describe the evolutionary design process and the dynamic research and development cycles associated with the database. Unlike traditional publications, the electronic publication of resources offers the possibility of almost continuous revision, with the integration and support of new media and interfaces. However, if on-line resources are to win the favor and confidence of their respective communities, there must be a clear distinction between published, maintained resources and developmental content. Published material should, ideally, be supported via standard web-browser interfaces with fully integrated tools, so that users receive a reliable, homogeneous and intuitive flow of information and media relevant to their needs. We discuss the inherent dynamics of the design and publication of our on-line resource, starting with the basic design and maintenance aspects of the electronic database, which includes photographic instances of cuneiform signs, and show how the continuous review process identifies areas for further research and development, for example the “sign processor” graphical search tool and three-dimensional content, the results of which then feed back into the maintained resource.
Abstract:
Bed expansion occurs during the operation of gas-fluidized beds and is influenced by particle properties, gas properties and distributor characteristics. It has a significant bearing on heat and mass transfer phenomena within the bed. A method of predicting bed expansion behavior from other fluidizing parameters would be a useful tool in the design process, dispensing with the need for small-scale trials. This study builds on previous work on fluidized beds with vertical inserts to produce a correlation linking a modified particle terminal velocity, minimum fluidizing velocity and distributor characteristics with bed voidage, where P is the pitch between holes in the perforated distributor plate. © 2007 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Abstract:
A negative input-resistance compensator is designed to stabilize a power electronic brushless dc motor drive with constant power-load characteristics. The strategy is to feed a portion of the changes in the dc-link voltage into the current control loop to modify the system input impedance in the midfrequency range and thereby to damp the input filter. The design process of the compensator and the selection of parameters are described. The impact of the compensator is examined on the motor-controller performance, and finally, the effectiveness of the controller is verified by simulation and experimental testing.
Abstract:
Optical differentiators constitute a basic device for analog all-optical signal processing [1]. Fiber grating approaches, both fiber Bragg grating (FBG) and long-period grating (LPG), constitute an attractive solution because of their low cost, low insertion losses, and full compatibility with fiber optic systems. A first-order differentiator LPG approach was proposed and demonstrated in [2], but FBGs may be preferred in applications with a bandwidth of up to a few nm because of the extreme sensitivity of LPGs to environmental fluctuations [3]. Several FBG approaches have been proposed in [3-6], requiring one or more additional optical elements to create a first-order differentiator. A very simple, single-optical-element FBG approach was proposed in [7] for first-order differentiation, applying the well-known logarithmic Hilbert transform relation between the amplitude and phase of an FBG in transmission [8]. Using this relationship in the design process, it was theoretically and numerically demonstrated that a single FBG in transmission can be designed to simultaneously approximate the amplitude and phase of a first-order differentiator spectral response, without the need for any additional elements. © 2013 IEEE.
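The logarithmic Hilbert transform relation invoked in [8] (for a minimum-phase response, the phase is the negative Hilbert transform of the log-magnitude) can be checked numerically. The sketch below uses a generic minimum-phase toy filter, not an actual FBG transmission response:

```python
import numpy as np
from scipy.signal import hilbert

def phase_from_magnitude(mag):
    """Recover the phase of a minimum-phase response from its magnitude
    via the logarithmic Hilbert transform: phase = -H{ln|T|}."""
    # hilbert() returns the analytic signal x + j*H{x}, computed over one
    # period via the FFT, which matches the periodic transform needed here.
    return -np.imag(hilbert(np.log(mag)))

# Toy minimum-phase response: T(w) = 1 - 0.5 e^{-jw} (zero inside the unit circle).
w = 2 * np.pi * np.arange(4096) / 4096
T = 1 - 0.5 * np.exp(-1j * w)
phase_est = phase_from_magnitude(np.abs(T))
```

For this filter the recovered phase matches `np.angle(T)` to numerical precision; for an FBG design the same relation constrains which amplitude/phase pairs a single grating in transmission can realize.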
Abstract:
Decision-making in product quality is an indispensable stage in product development, in order to reduce product development risk. Based on the identification of the deficiencies of quality function deployment (QFD) and failure modes and effects analysis (FMEA), a novel decision-making method is presented that draws upon a knowledge network of failure scenarios. An ontological expression of failure scenarios is presented together with a framework of a failure knowledge network (FKN). According to the roles of quality characteristics (QCs) in failure processing, QCs are set into three categories, namely perceptible QCs, restrictive QCs, and controllable QCs, which present the monitoring targets, control targets and improvement targets, respectively, for quality management. A mathematical model and algorithms based on the analytic network process (ANP) are introduced for calculating the priority of QCs with respect to different development scenarios. A case study is provided according to the proposed decision-making procedure based on the FKN. This methodology is applied in the propeller design process to solve the problem of prioritising QCs. This paper provides a practical approach for decision-making in product quality. Copyright © 2011 Inderscience Enterprises Ltd.
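The ANP step at the core of such a model is conventionally a limit-supermatrix calculation. The 3x3 column-stochastic supermatrix over three QCs below is invented for illustration and is not taken from the paper's propeller case study:

```python
import numpy as np

def anp_priorities(supermatrix, tol=1e-9, max_iter=100):
    """Limit priorities of a column-stochastic ANP supermatrix,
    obtained by repeated squaring until the matrix converges."""
    W = np.asarray(supermatrix, dtype=float)
    for _ in range(max_iter):
        W2 = W @ W
        W2 /= W2.sum(axis=0, keepdims=True)  # correct round-off drift
        if np.abs(W2 - W).max() < tol:
            break
        W = W2
    # At convergence every column equals the same priority vector.
    return W2[:, 0]

# Hypothetical supermatrix over three QCs (perceptible, restrictive, controllable).
W = [[0.2, 0.5, 0.3],
     [0.5, 0.3, 0.3],
     [0.3, 0.2, 0.4]]
priorities = anp_priorities(W)
```

The resulting vector sums to one and is the stationary vector of the supermatrix, giving a ranking of the QCs for a given development scenario.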
Abstract:
This opinion piece argues for the necessity of student-staff partnerships that go beyond the common rhetoric of ‘student engagement’, achieving a richer student and staff dialogue which results in more meaningful change in policy and practice. In particular, attention is drawn to the need for such partnerships when determining technology applications that are often missed out from, or treated in isolation from, the curriculum design process. This piece cites, as an example, a student-led taught day on the Post Graduate Diploma in Learning and Teaching at Aston University in July 2015. There was clear evidence that the staff participants designed their assessments with student partners in mind. It is therefore proposed that a partnership relationship offers an effective means of moving forward from common practices where technology simply replicates, or supplements, traditional activities.
Abstract:
Based on an unprecedented need to stimulate creative capacities towards entrepreneurship among university students and young researchers, this paper introduces and analyses a smart learning ecosystem for encouraging teaching and learning of creative thinking as a distinct skill to be taught and learnt in universities. The paper introduces a mashed-up authoring architecture for designing lesson plans and games with visual learning mechanics for creativity learning. The design process is facilitated by creativity pathways discerned across components. Participatory learning, networking and capacity building are key aspects of the architecture, extending the learning experience and context from the classroom to outdoor settings (co-authoring of creative pathways by students, teachers and real-world entrepreneurs) and personal spaces. We anticipate that the smart learning ecosystem will be empirically evaluated and validated in future iterations to explore the benefits of using games for enhancing creative mindsets, unlocking the imagination that lies within, so that creativity can be practiced and transferred to multiple academic tribes and territories.
Abstract:
Requirements for systems to continue to operate satisfactorily in the presence of faults have led to the development of techniques for the construction of fault-tolerant software. This thesis addresses the problem of error detection and recovery in distributed systems which consist of a set of communicating sequential processes. A method is presented for the 'a priori' design of conversations for this class of distributed system. Petri nets are used to represent the state and to solve state reachability problems for concurrent systems. The dynamic behaviour of the system can be characterised by a state-change table derived from the state reachability tree. Systematic conversation generation is possible by defining a closed boundary on any branch of the state-change table. Relating the state-change table to process attributes ensures that all necessary processes are included in the conversation. The method also ensures properly nested conversations. An implementation of the conversation scheme using the concurrent language occam is proposed. The structure of the conversation is defined using the special features of occam. The proposed implementation gives a structure which is independent of the application and of the number of processes involved. Finally, the integrity of inter-process communications is investigated. The basic communication primitives used in message-passing systems are seen to have deficiencies when applied to systems with safety implications. Using a Petri net model, a boundary for a time-out mechanism is proposed which will increase the integrity of a system which involves inter-process communications.
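The reachability-tree step that underlies the state-change table can be sketched as a breadth-first enumeration of markings. The two-process token-passing net below is a hypothetical example, not one of the thesis's case studies:

```python
from collections import deque

def reachable_markings(pre, post, m0):
    """Breadth-first enumeration of the reachability set of a (bounded)
    Petri net; `edges` plays the role of the state-change table."""
    seen = {m0}
    frontier = deque([m0])
    edges = []  # entries: (marking, transition index, successor marking)
    while frontier:
        m = frontier.popleft()
        for t, (p_in, p_out) in enumerate(zip(pre, post)):
            if all(m[i] >= p_in[i] for i in range(len(m))):  # is t enabled?
                m2 = tuple(m[i] - p_in[i] + p_out[i] for i in range(len(m)))
                edges.append((m, t, m2))
                if m2 not in seen:
                    seen.add(m2)
                    frontier.append(m2)
    return seen, edges

# Two places, two transitions: t0 moves the token p0 -> p1, t1 moves it back.
pre  = [(1, 0), (0, 1)]
post = [(0, 1), (1, 0)]
markings, state_changes = reachable_markings(pre, post, (1, 0))
```

A conversation boundary would then be drawn over a closed region of `state_changes`; the enumeration terminates only for bounded nets, which is why the thesis works from the reachability tree rather than the raw net.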