961 results for design rules
Abstract:
This paper describes a methodology, 'decision rules for analysing manufacturing activities', designed as a practical system of enquiry linking strategic analysis to the design of production systems. The paper traces the development of this system, originally an industry-specific design methodology, into DRAMA II, a model that serves as an analytical tool for studying the decision processes and implementation of production systems.
Abstract:
The design and implementation of databases involve, firstly, the formulation of a conceptual data model by systematic analysis of the structure and information requirements of the organisation for which the system is being designed; secondly, the logical mapping of this conceptual model onto the data structure of the target database management system (DBMS); and thirdly, the physical mapping of this structured model onto the storage structures of the target DBMS. The accuracy of both the logical and physical mappings determines the performance of the resulting system. This thesis describes research which develops software tools to facilitate the implementation of databases. A conceptual model describing the information structure of a hospital is derived using the Entity-Relationship (E-R) approach, and this model forms the basis for the logical mapping. Rules are derived for automatically mapping the conceptual model onto relational and CODASYL types of data structures. Further algorithms are developed for partly automating the implementation of these models on the INGRES, MIMER and VAX-11 DBMS.
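The abstract does not reproduce the mapping rules themselves, but the best-known rule of this kind, entity-to-relation with key migration for 1:N relationships, can be sketched as follows. The classes, the hospital entities and the generated SQL are illustrative assumptions of ours, not the thesis's actual tool.

from dataclasses import dataclass

@dataclass
class Entity:
    name: str
    key: str
    attributes: list  # non-key attribute names

@dataclass
class OneToMany:
    one: Entity   # e.g. Ward
    many: Entity  # e.g. Patient

def map_to_relational(entities, relationships):
    """Derive relational table definitions from a simple E-R model."""
    tables = {e.name: [f"{e.key} PRIMARY KEY"] + e.attributes for e in entities}
    for r in relationships:
        # Mapping rule: migrate the key of the 'one' side into the
        # 'many' side as a foreign key.
        tables[r.many.name].append(f"{r.one.key} REFERENCES {r.one.name}")
    return [f"CREATE TABLE {name} ({', '.join(cols)});" for name, cols in tables.items()]

ward = Entity("Ward", "ward_no", ["ward_name"])
patient = Entity("Patient", "patient_id", ["surname", "admitted_on"])
print("\n".join(map_to_relational([ward, patient], [OneToMany(ward, patient)])))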
Abstract:
This dissertation studies the process of operations systems design within the context of the manufacturing organization. Using the DRAMA (Design Routine for Adopting Modular Assembly) model as developed by a team from the IDOM Research Unit at Aston University as a starting point, the research employed empirically based fieldwork and a survey to investigate the process of production systems design and implementation within four UK manufacturing industries: electronics assembly, electrical engineering, mechanical engineering and carpet manufacturing. The intention was to validate the basic DRAMA model as a framework for research enquiry within the electronics industry, where the initial IDOM work was conducted, and then to test its generic applicability, further developing the model where appropriate, within the other industries selected. The thesis contains a review of production systems design theory and practice prior to presenting thirteen industrial case studies of production systems design from the four industry sectors. The results and analysis of the postal survey into production systems design are then presented. The strategic decisions of manufacturing and their relationship to production systems design, and the detailed process of production systems design and operation are then discussed. These analyses are used to develop the generic model of production systems design entitled DRAMA II (Decision Rules for Analysing Manufacturing Activities). The model contains three main constituent parts: the basic DRAMA model, the extended DRAMA II model showing the imperatives and relationships within the design process, and a benchmark generic approach for the design and analysis of each component in the design process. DRAMA II is primarily intended for use by researchers as an analytical framework of enquiry, but is also seen as having application for manufacturing practitioners.
Abstract:
An investigation was carried out into the different approaches used by Expert Systems researchers to solve problems in the domain of Mechanical Design. The techniques used in conventional formal logic programming were compared with those used when applying Expert Systems concepts. A literature survey of design processes was also conducted with a view to adopting a suitable model of the design process. A model, comprising a variation on two established ones, was developed and applied to a problem within what are described as class 3 design tasks. The research explored the application of these concepts to Mechanical Engineering design problems and their implementation on a microcomputer using an Expert System building tool. It was necessary to explore the use of Expert Systems in this manner so as to bridge the gap between their use as a control structure and their use for detailed analytical design. The former application is well researched; this thesis discusses the latter. Some Expert System building tools available to the author at the beginning of his work were evaluated specifically for their suitability for Mechanical Engineering design problems. Microsynics was found to be the most suitable on which to implement a design problem because of its simple but powerful semantic-net knowledge representation structure and its ability to use other types of representation scheme. Two major implementations were carried out: the first was a design program for a helical compression spring, and the second a gear-pair system design. Two concepts were proposed in the thesis for the modelling and implementation of design systems involving many equations. The proposed method enables equation manipulation and analysis using a combination of frames, semantic nets and production rules. The use of semantic nets for purposes other than psychology and natural-language interpretation is quite new and represents one of the author's major contributions to knowledge. The development of a purpose-built shell program for this type of design problem was recommended as an extension of the research; Microsynics may usefully serve as a platform for this development.
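As a minimal illustration of the frames-plus-production-rules idea, the sketch below stores spring attributes in a frame (here a plain dictionary) and lets a rule fire when its preconditions hold. The forward-chaining loop and the rule set are assumptions for illustration, not Microsynics code, though the spring-rate formula k = G*d^4 / (8*D^3*n) is the standard one for helical compression springs.

# Frame: attribute slots for the spring design (illustrative values).
spring = {"G": 79.3e9, "d": 2e-3, "D": 20e-3, "n": 8}  # shear modulus, wire dia., mean coil dia., active coils

def rule_spring_rate(frame):
    # IF G, d, D and n are known AND k is unknown
    # THEN assert k = G * d^4 / (8 * D^3 * n)
    if all(s in frame for s in ("G", "d", "D", "n")) and "k" not in frame:
        frame["k"] = frame["G"] * frame["d"] ** 4 / (8 * frame["D"] ** 3 * frame["n"])
        return True
    return False

rules = [rule_spring_rate]

# Forward-chaining control loop: keep firing rules until none applies.
while any(rule(spring) for rule in rules):
    pass

print(f"spring rate k = {spring['k']:.0f} N/m")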
Abstract:
The work reported in this thesis is concerned with improving and expanding the assistance given to the designer by the computer in the design of cold-formed sections. The main contributions have been in four areas, which have consequently led to a fifth: the development of a methodology to optimise designs. This methodology can be considered an 'Expert Design System' for cold-formed sections. A different method of determining the section properties of profiles was introduced, using the properties of line and circular elements. Graphics were introduced to show the outline of the profile on screen. The analysis of beam loading has been expanded to loading conditions in which the number of supports, point loads and uniformly distributed loads can be specified by the designer. The profile can then be checked for suitability for the specified type of loading. Artificial Intelligence concepts have been introduced to give the designer decision support from the computer, in combination with the computer-aided design facilities; the more complex decision support was provided through production rules. All the support was based on the British Standards. A method has been introduced by which the appropriate use of stiffeners can be determined, and the stiffeners consequently designed by the designer. Finally, a methodology was developed by which the designer is given assistance from the computer without being constrained by it: it offers advice on possible ways of improving the design, but allows the designer to reject that advice and analyse the profile accordingly. The methodology enables optimisation to be achieved by the designer designing a variety of profiles for a particular loading and determining which one is best suited.
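The line-element idealisation mentioned above has a compact closed form for straight segments; the sketch below sums area, first moment and second moment element by element. The channel geometry is an illustrative assumption, and circular corner elements are omitted for brevity.

def section_properties(elements, t):
    """elements: list of ((x1, y1), (x2, y2)) mid-line segments in metres;
    t: uniform thickness. Returns area, centroid height, Ix about centroid."""
    A = Qx = Ix = 0.0
    for (x1, y1), (x2, y2) in elements:
        L = ((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
        A += t * L                                          # element area
        Qx += t * L * (y1 + y2) / 2.0                       # first moment about the x-axis
        Ix += t * L * (y1 * y1 + y1 * y2 + y2 * y2) / 3.0   # exact for a straight line element
    y_bar = Qx / A                                          # centroid height
    return A, y_bar, Ix - A * y_bar ** 2                    # shift Ix to the centroid

# Plain channel: two 50 mm flanges and a 100 mm web, t = 2 mm (illustrative).
channel = [((0.05, 0.0), (0.0, 0.0)),
           ((0.0, 0.0), (0.0, 0.1)),
           ((0.0, 0.1), (0.05, 0.1))]
print(section_properties(channel, 0.002))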
Abstract:
Agent-based technology is playing an increasingly important role in today's economy. Usually a multi-agent system is needed to model an economic system such as a market, in which heterogeneous trading agents interact with each other autonomously. Two questions often need to be answered regarding such systems: 1) how to design an interaction mechanism that facilitates efficient resource allocation among usually self-interested trading agents; and 2) how to design an effective strategy for an agent to maximise its economic returns under a specific market mechanism. For automated market systems, the auction is the most popular mechanism for solving resource allocation problems among participants. However, auctions come in hundreds of different formats, of which some are better than others in terms not only of allocative efficiency but also of other properties, e.g. whether they generate high revenue for the auctioneer or induce stable bidder behaviour. In addition, different strategies yield very different performance under the same auction rules. Against this background, we investigate auction mechanism and strategy design for agent-based economics. The international Trading Agent Competition (TAC) Ad Auction (AA) competition provides a very useful platform for developing and testing agent strategies in the Generalised Second Price (GSP) auction. AstonTAC, the runner-up in TAC AA 2009, is a successful advertiser agent designed for GSP-based keyword auctions. In particular, AstonTAC generates adaptive bid prices according to the Market-based Value Per Click and selects the set of keyword queries with the highest expected profit to bid on, maximising its expected profit under the limit of conversion capacity. Through evaluation experiments, we show that AstonTAC performs well and stably not only in the competition but also across a broad range of environments. The TAC CAT tournament provides an environment for investigating the optimal design of mechanisms for double auction markets. AstonCAT-Plus is the post-tournament version of the specialist developed for CAT 2010. In our experiments, AstonCAT-Plus not only outperforms most specialist agents designed by other institutions but also achieves high allocative efficiency, transaction success rates and average trader profits. Moreover, we reveal some insights into CAT: 1) successful markets should maintain a stable and high market share of intra-marginal traders; 2) a specialist's performance depends on the distribution of trading strategies. However, typical double auction models assume that trading agents have a fixed trading direction, either buy or sell; with this limitation they cannot directly reflect the fact that traders in financial markets (the most popular application of the double auction) decide their trading directions dynamically. To address this issue, we introduce the Bi-directional Double Auction (BDA) market, which is populated by two-way traders. Experiments are conducted under both dynamic and static settings of the continuous BDA market. We find that the allocative efficiency of a continuous BDA market comes mainly from the rational selection of trading directions. Furthermore, we introduce a high-performance Kernel trading strategy for the BDA market, which uses a kernel probability density estimator built on historical transaction data to decide optimal order prices. The Kernel strategy outperforms some popular intelligent double auction trading strategies, including ZIP, GD and RE, in the continuous BDA market, making the highest profit in static games and obtaining the best wealth in dynamic games.
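The abstract gives only the outline of the Kernel strategy; the following sketch conveys the core idea under stated assumptions: fit a Gaussian kernel density estimate over historical transaction prices and quote near the estimated mode, with Silverman's rule for the bandwidth. The synthetic price history and the mode-quoting objective are illustrative, not the strategy's published definition.

import numpy as np

def kde(prices, grid, bandwidth):
    """Gaussian kernel density estimate of transaction prices on a grid."""
    z = (grid[:, None] - prices[None, :]) / bandwidth
    return np.exp(-0.5 * z ** 2).sum(axis=1) / (len(prices) * bandwidth * np.sqrt(2 * np.pi))

rng = np.random.default_rng(0)
history = rng.normal(100.0, 3.0, size=500)          # stand-in transaction history
grid = np.linspace(history.min(), history.max(), 200)
bw = 1.06 * history.std() * len(history) ** -0.2    # Silverman's rule of thumb

density = kde(history, grid, bw)
order_price = grid[np.argmax(density)]              # quote at the estimated mode
print(f"order price ~ {order_price:.2f}")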
Abstract:
In the developed world we are surrounded by man-made objects, but most people give little thought to the complex processes needed for their design. The design of hand knitting is complex because much of the domain knowledge is tacit. The objective of this thesis is to devise a methodology to help designers work within design constraints whilst facilitating creativity. A hybrid solution combining computer-aided design (CAD) and case-based reasoning (CBR) is proposed. The CAD system creates designs using domain-specific rules, and these designs are employed for the initial seeding of the case base and the management of constraints. CBR reuses the designer's previous experience. The key aspects of the CBR system are measuring the similarity of cases and adapting past solutions to the current problem. Similarity is measured by asking the user to rank the importance of features; the ranks are then used to calculate weights for an algorithm which compares the specifications of designs. A novel adaptation operator called rule difference replay (RDR) is created. When the specification for a new design is presented, the CAD program uses it to construct a design constituting an approximate solution. The most similar design from the case base is then retrieved, and RDR replays the changes previously made to the retrieved design on the new solution. A measure of solution similarity that can validate subjective success scores is created. Specification similarity can be used as a guide to whether to invoke CBR in a hybrid CAD-CBR system: if the new design is sufficiently similar to a previous design, then CBR is invoked; otherwise CAD is used. The application of RDR to knitwear design has demonstrated the flexibility to overcome deficiencies in rules that try to automate creativity, and it has the potential to be applied to other domains such as interior design.
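As a minimal sketch of the rank-to-weight similarity measure described above: the user's importance ranks are converted to weights and designs are compared feature by feature over normalised values. The inverse-rank weighting, the feature names and the distance form are assumptions; the thesis's exact scheme is not given in the abstract.

def weights_from_ranks(ranks):
    """ranks: {feature: rank}, rank 1 = most important; returns normalised weights."""
    raw = {f: 1.0 / r for f, r in ranks.items()}    # inverse-rank weighting (assumed)
    total = sum(raw.values())
    return {f: w / total for f, w in raw.items()}

def similarity(spec_a, spec_b, weights):
    """Weighted similarity in [0, 1] over features normalised to [0, 1]."""
    return sum(w * (1.0 - abs(spec_a[f] - spec_b[f])) for f, w in weights.items())

ranks = {"gauge": 1, "stitch_density": 2, "panel_width": 3}  # user's importance ranking
w = weights_from_ranks(ranks)
new_design = {"gauge": 0.8, "stitch_density": 0.5, "panel_width": 0.3}
past_case = {"gauge": 0.7, "stitch_density": 0.6, "panel_width": 0.9}
print(f"similarity = {similarity(new_design, past_case, w):.3f}")  # high enough -> invoke CBR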
Abstract:
In this work we explore, numerically and experimentally, the dependence of the broadened spectra on the choice of fibers, and we analyze a series of basic rules to be taken into account when using nonlinear broadening to reduce the gain ripple of broadband Raman amplifiers.
Abstract:
In the field of Transition P system implementation, it has been established that it is very important to determine in advance how long the application of evolution rules in membranes will take. Moreover, having time estimates for rule application in membranes makes it possible to take important decisions related to hardware/software architecture design. The work presented here introduces an algorithm for applying active evolution rules in Transition P systems which is based on active rule elimination. The algorithm meets the requirements of being non-deterministic and massively parallel and, more importantly, it is time-delimited because its running time depends only on the number of membrane evolution rules.
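The paper's exact elimination procedure is not reproduced in the abstract; the sketch below realises the stated properties under assumptions of ours. A shuffled rule order and a random application count give non-determinism, and two bounded passes, the second of which tops each rule up to its remaining maximum applicability, guarantee that no rule remains applicable, so the work is delimited by the number of rules.

import random
from collections import Counter

def max_applicability(region, lhs):
    """Largest k such that the rule's left-hand side fits k times in the region."""
    return min(region[obj] // n for obj, n in lhs.items())

def apply_active_rules(region, rules, rng=random):
    """region: Counter of objects; rules: list of (lhs, rhs) Counter pairs."""
    order = list(rules)
    rng.shuffle(order)                      # non-deterministic rule order
    produced = Counter()                    # products only appear in the next configuration
    for top_up in (False, True):            # two bounded passes over the rules
        for lhs, rhs in order:
            m = max_applicability(region, lhs)
            k = m if top_up else rng.randint(0, m)   # pass 2 exhausts each rule
            for obj, n in lhs.items():
                region[obj] -= n * k        # consume inputs immediately
            for obj, n in rhs.items():
                produced[obj] += n * k
    region.update(produced)
    return region

region = Counter({"a": 5, "b": 3})
rules = [(Counter({"a": 2}), Counter({"b": 1})),
         (Counter({"a": 1, "b": 1}), Counter({"c": 1}))]
print(apply_active_rules(region, rules))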
Abstract:
P systems, or Membrane Computing, are a type of distributed, massively parallel and non-deterministic system based on biological membranes. They are inspired by the way cells process chemical compounds, energy and information. These systems perform a computation through transitions between consecutive configurations. As is well known in membrane computing, a configuration consists of an m-tuple of the multisets present, at a given moment, in the m regions of the system existing at that moment. Transitions between two configurations are performed by applying the evolution rules present in each region of the system in a non-deterministic, maximally parallel manner. This work is part of an extensive line of investigation whose final objective is to implement a hardware system that evolves as a Transition P system does. To achieve this objective, the generic system has been divided into several stages, each addressing specific concerns. In this paper, the stage developed is the part of the system in charge of applying the active rules. Different algorithms exist for counting the number of times each active rule is applied. Here an algorithm with improved characteristics is presented: the number of iterations needed to reach the final values is smaller than when each rule is applied step by step. Hence the whole process requires fewer steps and therefore finishes in a shorter time.
Abstract:
Transition P systems are computational models based on the basic features of biological membranes and the observation of biochemical processes. In these models, a membrane contains multisets of objects, which evolve according to given evolution rules. In the field of Transition P system implementation, the need has been identified to determine how long the application of active evolution rules in membranes will take. In addition, having time estimates for rule application makes it possible to take important decisions related to hardware/software architecture design. In this paper we propose a new evolution rule application algorithm oriented towards the implementation of Transition P systems. The algorithm is sequential and has linear complexity in the number of evolution rules. Moreover, it obtains smaller execution times than the preceding algorithms. The algorithm is therefore very appropriate for the implementation of Transition P systems on sequential devices.
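A sequential, linear-time pass of this kind can be sketched as follows (our illustration, not the paper's concrete algorithm): because applying rules only consumes objects within a step, applying each rule its maximum number of times in a single pass already leaves no rule applicable, and the cost is one pass over the rule list.

from collections import Counter

def apply_rules_sequential(region, rules):
    """One pass: apply each rule its maximum number of times.
    region: Counter of objects; rules: list of (lhs, rhs) Counter pairs."""
    produced = Counter()
    for lhs, rhs in rules:                                   # O(number of rules)
        k = min(region[obj] // n for obj, n in lhs.items())  # max applicability
        for obj, n in lhs.items():
            region[obj] -= n * k                             # consumed within this step
        for obj, n in rhs.items():
            produced[obj] += n * k                           # released at the next transition
    region.update(produced)
    return region

region = Counter({"a": 7, "b": 2})
rules = [(Counter({"a": 3}), Counter({"c": 2})),
         (Counter({"a": 1, "b": 1}), Counter({"d": 1}))]
print(apply_rules_sequential(region, rules))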
Abstract:
Workflows are sets of activities that implement and realise business goals. Modern business goals add extra requirements to workflow systems and their management. Workflows may cross many organisations and utilise services on a variety of devices and/or supported by different platforms. Current workflows are therefore inherently context-aware. Each context is governed and constrained by its own policies and rules, to prevent unauthorised participants from executing sensitive tasks and to prevent tasks from accessing unauthorised services and/or data. We present a sound, multi-layered design language for the design and analysis of secure, context-aware workflow systems.
Abstract:
Boyko Bl. Banchev - We present the rationale for, and a description of, a programming language in compositional style for experimental and educational purposes. By 'compositional' we mean a functional style of programming in which a computation is a hierarchy of compositions and applications of functions. One of the language's data types is that of geometric figures, which can be obtained through simple rules of mutual arrangement and thus also form hierarchical compositions. The language is strongly influenced by GeomLab, but differs from it significantly in a number of respects. The article examines the main features of the language; its detailed description and its figure-construction capabilities will be presented in an accompanying publication.
Abstract:
The span of control is the most discussed single concept in classical and modern management theory. In specifying conditions for organizational effectiveness, the span of control has generally been regarded as a critical factor. Existing research has focused mainly on qualitative methods of analysing this concept, for example heuristic rules based on experience and/or intuition. This research takes a quantitative approach and formulates the problem as a binary integer model, which is used as a tool to study the organizational design issue. The model considers a range of requirements affecting the management and supervision of a given set of jobs in a company. The decision variables include the allocation of jobs to workers, considering the complexity and compatibility of each job with respect to workers, and management requirements for planning, execution, training, and control activities in a hierarchical organization. The objective of the model is minimal operations cost, which is the sum of the supervision costs at each level of the hierarchy and the costs of the workers assigned to jobs. The model is intended for application in make-to-order industries as a design tool. It could also be applied to make-to-stock companies as an evaluation tool, to assess the optimality of their current organizational structure. Extensive experiments were conducted to validate the model, study its behavior, and evaluate the impact of changing parameters on practical problems. This research proposes a meta-heuristic approach to solving large-size problems, based on the concept of greedy algorithms and the Meta-RaPS algorithm. The proposed heuristic was evaluated on two measures of performance: solution quality and computational speed. Quality is assessed by comparing the objective function value obtained to that of the optimal solution; computational efficiency is assessed by comparing the computer time used by the proposed heuristic to the time taken by a commercial software system. Test results show that the proposed heuristic generates good solutions in a time-efficient manner.
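The abstract does not state the model itself, but its general shape can be sketched in the following hedged form, where x_{ij} assigns job j to worker i and s_l counts supervisors at hierarchy level l; all notation and constraints here are illustrative assumptions, not the dissertation's formulation.

\begin{align*}
\min \quad & \sum_{\ell} c^{\mathrm{sup}}_{\ell}\, s_{\ell} + \sum_{i}\sum_{j} c_{ij}\, x_{ij} && \text{supervision cost + assignment cost}\\
\text{s.t.} \quad & \sum_{i} x_{ij} = 1 \quad \forall j && \text{each job assigned once}\\
& x_{ij} = 0 \ \text{ if job } j \text{ is incompatible with worker } i\\
& \sum_{j} w_{j}\, x_{ij} \le W_i \quad \forall i && \text{complexity/workload capacity}\\
& s_{\ell} \ge u_{\ell-1} / K \quad \forall \ell && \text{span of control } K \text{ over the } u_{\ell-1} \text{ units below}\\
& x_{ij} \in \{0,1\}, \quad s_{\ell} \in \mathbb{Z}_{\ge 0}
\end{align*}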
Abstract:
In this research, I analyze the effects of candidate nomination rules and campaign financing rules on elite recruitment into the national legislatures of Germany and the United States. The dissertation is both theory-driven and exploratory. While the effects of electoral rules are frequently studied in political science, the emphasis is usually on electoral rules that operate post-election; my focus, in contrast, is on electoral rules that have an effect prior to the election. Furthermore, my dissertation is comparative by design. The research question is twofold: do electoral rules have an effect on elite recruitment, and does it matter? To answer these questions, I create a large-N original data set in which I code the behavior and the recruitment paths and patterns of members of the American House of Representatives and the German Bundestag. I also include interviews with members of these two national legislatures. Both the statistical analyses and the interviews provide affirmative evidence for my working hypothesis that differences in electoral rules lead to different types of elite recruitment. To that end, I use the active-politician concept, through which I dichotomously distinguish the economic behavior of politicians. Thanks to the exploratory nature of my research, I also discover the phenomenon of the differential valence of local and state political office for entrance into national office in comparative perspective. By statistically identifying this hitherto unknown paradox, as well as evidencing the effects of electoral rules, I show that besides ideology and culture, institutional rules are key in shaping the ruling elite. The way institutional rules are set up, in particular electoral rules, affects not only how the electorate votes and how seats are distributed, but also what type of people end up in elected office.