961 results for Design rules
Abstract:
A flexible and simple Bayesian decision-theoretic design for dose-finding trials is proposed in this paper. To reduce the computational burden, we adopt a working model with conjugate priors, which is flexible enough to fit all monotonic dose-toxicity curves and produces analytic posterior distributions. We also discuss how to use a proper utility function to reflect the interest of the trial. Patients are allocated based not only on the utility function but also on the chosen dose selection rule. The most popular dose selection rule is the one-step-look-ahead (OSLA), which selects the best-so-far dose. A more complicated rule, such as the two-step-look-ahead, is theoretically more efficient than the OSLA only when the required distributional assumptions are met, which is, however, often not the case in practice. We carried out extensive simulation studies to evaluate these two dose selection rules and found that OSLA was often more efficient than the two-step-look-ahead under the proposed Bayesian structure. Moreover, our simulation results show that the proposed Bayesian method outperforms several popular Bayesian methods and that the negative impact of prior misspecification can be managed in the design stage.
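For illustration, a minimal sketch of the kind of machinery this abstract describes, assuming a Beta-Binomial working model (conjugate, so the posterior stays analytic) and a hypothetical utility that penalizes the posterior expected squared distance of the toxicity rate from a target; the OSLA rule then simply allocates the next cohort to the best-so-far dose. The dose count, prior, and utility form are illustrative assumptions, not the paper's.

```python
import numpy as np

# Hypothetical Beta(1, 1) priors on the toxicity probability of 5 doses.
alpha = np.ones(5)
beta = np.ones(5)

def update(dose, n_toxic, n_treated):
    """Conjugate Beta-Binomial update: the posterior remains a Beta."""
    alpha[dose] += n_toxic
    beta[dose] += n_treated - n_toxic

def utility(target=0.3):
    """Illustrative utility: negative posterior expectation of the squared
    distance between the toxicity rate p and the target, per dose.
    E[(p - t)^2] = (E[p] - t)^2 + Var(p)."""
    mean = alpha / (alpha + beta)
    var = alpha * beta / ((alpha + beta) ** 2 * (alpha + beta + 1))
    return -((mean - target) ** 2 + var)

def one_step_look_ahead():
    """OSLA dose selection: pick the current best-scoring dose."""
    return int(np.argmax(utility()))

update(dose=1, n_toxic=1, n_treated=3)  # e.g. 1 toxicity in a cohort of 3
print(one_step_look_ahead())
```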
Abstract:
This paper deals with the system-oriented analysis, design, modeling, and implementation of an active-clamp high-frequency (HF) link three-phase converter. The main advantage of the topology is the reduced size, weight, and cost of the isolation transformer. However, violation of basic power-conversion rules due to the presence of leakage inductance in the HF transformer causes overvoltage stresses across the cycloconverter devices, which makes a snubber circuit necessary in such topologies. Conventional RCD snubbers are dissipative in nature and hence inefficient. The efficiency of the system is greatly improved by using a regenerative snubber, or active clamp circuit, which consists of an active switching device with an anti-parallel diode and one capacitor to absorb the energy stored in the leakage inductance of the isolation transformer and to regenerate it without affecting circuit performance. The turn-on instant and conduction duration of the active device are selected so that its commutation requirements remain simple. The time-domain expressions for the circuit dynamics, the design criteria for the snubber capacitor under two conflicting constraints (the overvoltage stress across the devices and the resonating current duration), simulation results based on a generalized circuit model, and experimental results from a laboratory prototype are presented.
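As a hedged illustration of the snubber-capacitor trade-off mentioned above: a generic sizing relation (not necessarily the paper's exact criterion) balances the energy trapped in the transformer leakage inductance at commutation against the allowed voltage rise on the clamp capacitor,

\[
\tfrac{1}{2} L_{lk} I_{pk}^{2} \le \tfrac{1}{2} C_{cl} \left( V_{max}^{2} - V_{cl}^{2} \right)
\quad\Longrightarrow\quad
C_{cl} \ge \frac{L_{lk} I_{pk}^{2}}{V_{max}^{2} - V_{cl}^{2}},
\]

where \(L_{lk}\) is the leakage inductance, \(I_{pk}\) the commutated current, \(V_{cl}\) the nominal clamp voltage, and \(V_{max}\) the permitted device stress. A larger \(C_{cl}\) reduces the overvoltage but lengthens the resonant current duration, which is exactly the pair of conflicting constraints the abstract cites.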
Abstract:
Multi-packet reception (MPR) promises significant throughput gains in wireless local area networks (WLANs) by allowing nodes to transmit even in the presence of ongoing transmissions in the medium. However, the medium access control (MAC) layer must now be redesigned to facilitate, rather than discourage, these overlapping transmissions. We investigate asynchronous MPR MAC protocols, which accomplish this by controlling node behavior based on the number of ongoing transmissions in the channel. The protocols use the backoff timer mechanism of the distributed coordination function, which makes them practically appealing. We first highlight a unique problem of acknowledgment delays that arises in asynchronous MPR, and investigate a solution that modifies the medium access rules to reduce these delays and increase system throughput in the single-receiver scenario. We develop a general renewal-theoretic fixed-point analysis that leads to expressions for the saturation throughput, packet dropping probability, and average head-of-line packet delay. We also model and analyze the practical scenario in which nodes may incorrectly estimate the number of ongoing transmissions.
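A toy sketch of the fixed-point style of saturation analysis the abstract refers to, in the spirit of Bianchi-type models: the per-slot transmission probability tau follows from the conditional failure probability p through the backoff rules, and p follows from tau through the channel; iterating yields the operating point. The MPR channel model below (failure only when at least k others overlap) and all parameter values are illustrative assumptions, not the paper's analysis.

```python
from math import comb

def tau_of_p(p, w0=16, m=5):
    """Transmission probability under binary exponential backoff with
    initial window w0 and m doubling stages (standard Bianchi form)."""
    num = 2 * (1 - 2 * p)
    den = (1 - 2 * p) * (w0 + 1) + p * w0 * (1 - (2 * p) ** m)
    return num / den

def p_of_tau(tau, n=20, k=3):
    """Illustrative MPR channel: a transmission fails only if at least k
    of the other n-1 nodes transmit in the same slot."""
    others = n - 1
    success = sum(comb(others, j) * tau ** j * (1 - tau) ** (others - j)
                  for j in range(k))
    return 1 - success

p = 0.1
for _ in range(500):                      # damped fixed-point iteration
    p = 0.5 * p + 0.5 * p_of_tau(tau_of_p(p))
print(f"fixed point: p = {p:.4f}, tau = {tau_of_p(p):.4f}")
```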
Abstract:
The goal of the work reported in this paper is to use automated, combinatorial synthesis to generate alternative solutions to be used as stimuli by designers for ideation. FuncSION, a computational synthesis tool that can automatically synthesize solution concepts for mechanical devices by combining building blocks from a library, is used for this purpose. The objectives of FuncSION are to help generate a variety of functional requirements for a given problem and a variety of concepts to fulfill these functions. A distinctive feature of FuncSION is its focus on automated generation of spatial configurations, an aspect rarely addressed by other computational synthesis programs. This paper provides an overview of FuncSION in terms of representation of design problems, representation of building blocks, and rules with which building blocks are combined to generate concepts at three levels of abstraction: topological, spatial, and physical. The paper then provides a detailed account of evaluating FuncSION for its effectiveness in providing stimuli for enhanced ideation.
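A minimal sketch of topological-level combinatorial synthesis under assumed interfaces: each building block declares an input and an output behaviour, and the combination rule chains blocks whose interfaces match until the required input-to-output transformation is achieved. The block names and the rule are illustrative stand-ins, not FuncSION's actual library or rules.

```python
# Hypothetical building-block library: (name, input behaviour, output behaviour).
LIBRARY = [
    ("gear_pair",    "rotation",    "rotation"),
    ("rack_pinion",  "rotation",    "translation"),
    ("lever",        "translation", "translation"),
    ("cam_follower", "rotation",    "translation"),
]

def synthesize(inp, out, depth=3, chain=()):
    """Enumerate chains of up to `depth` blocks transforming inp into out.
    Combination rule: a block may extend the chain only if its input
    behaviour matches the chain's current output behaviour."""
    current = chain[-1][2] if chain else inp
    if chain and current == out:
        yield [name for name, _, _ in chain]
    if depth == 0:
        return
    for block in LIBRARY:
        if block[1] == current:
            yield from synthesize(inp, out, depth - 1, chain + (block,))

for solution in synthesize("rotation", "translation"):
    print(" -> ".join(solution))
```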
Abstract:
In noncooperative cost sharing games, individually strategic agents choose resources based on how the welfare (cost or revenue) generated at each resource, which depends on the set of agents choosing it, is distributed. The focus is on finding distribution rules that lead to stable allocations, formalized by the concept of Nash equilibrium; well-known examples include the Shapley value (budget-balanced) and marginal contribution (not budget-balanced) rules.
Recent work that seeks to characterize the space of all such rules shows that the only budget-balanced distribution rules guaranteeing equilibrium existence in all welfare sharing games are generalized weighted Shapley values (GWSVs), by exhibiting a specific 'worst-case' welfare function that requires GWSV rules to be used. Our work provides an exact characterization of the space of distribution rules (not necessarily budget-balanced) for any fixed local welfare functions, for a general class of scalable and separable games with well-known applications, e.g., facility location, routing, network formation, and coverage games.
We show that all games conditioned on any fixed local welfare functions possess an equilibrium if and only if the distribution rules are equivalent to GWSV rules on some 'ground' welfare functions. Therefore, it is neither the existence of some worst-case welfare function nor the restriction to budget-balance that limits the design to GWSVs. Moreover, in order to guarantee equilibrium existence, it is necessary to work within the class of potential games, since GWSVs result in (weighted) potential games.
We also provide an alternative characterization: all games conditioned on any fixed local welfare functions possess an equilibrium if and only if the distribution rules are equivalent to generalized weighted marginal contribution (GWMC) rules on some 'ground' welfare functions. This result stems from a deeper fundamental connection between Shapley values and marginal contributions that our proofs expose: they are equivalent given a transformation connecting their ground welfare functions. (This connection leads to novel closed-form expressions for the GWSV potential function.) Since GWMCs are more tractable than GWSVs, a designer can trade off budget-balance against computational tractability in deciding which rule to implement.
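For concreteness, a small sketch of the plain (unweighted) Shapley value distribution rule at a single resource, the budget-balanced baseline that GWSVs generalize: each agent receives its average marginal contribution to the local welfare over all orderings of the agents using that resource. The welfare function here is a hypothetical example.

```python
from itertools import permutations

def shapley(agents, welfare):
    """Shapley value at one resource: average marginal contribution over
    all arrival orders. Budget-balanced: shares sum to welfare(all agents)."""
    shares = {a: 0.0 for a in agents}
    orders = list(permutations(agents))
    for order in orders:
        coalition = set()
        for a in order:
            before = welfare(coalition)
            coalition = coalition | {a}
            shares[a] += welfare(coalition) - before
    return {a: v / len(orders) for a, v in shares.items()}

# Hypothetical anonymous local welfare with diminishing returns.
w = lambda s: {0: 0.0, 1: 6.0, 2: 10.0, 3: 12.0}[len(s)]
print(shapley(["a", "b", "c"], w))  # symmetric agents -> 4.0 each, summing to 12
```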
Abstract:
This thesis presents a novel class of algorithms for the solution of scattering and eigenvalue problems on general two-dimensional domains under a variety of boundary conditions, including non-smooth domains and certain "Zaremba" boundary conditions, for which Dirichlet and Neumann conditions are specified on different portions of the domain boundary. The theoretical basis of the methods for Zaremba problems on smooth domains is detailed information, put forth for the first time in this thesis, about the singularity structure of solutions of the Laplace operator under boundary conditions of Zaremba type. The new methods, which are based on Green functions and integral equations, incorporate a number of algorithmic innovations, including a fast and robust eigenvalue-search algorithm, use of the Fourier Continuation method for regularization of all smooth-domain Zaremba singularities, and newly derived quadrature rules which give rise to high-order convergence even around singular points of the Zaremba problem. The resulting algorithms enjoy high-order convergence and can tackle a variety of elliptic problems under general boundary conditions, including, for example, eigenvalue problems, scattering problems, and, in particular, eigenfunction expansions for time-domain problems in non-separable physical domains with mixed boundary conditions.
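For reference, the Zaremba (mixed) boundary-value problem treated in the thesis has the standard form of a Laplace problem with Dirichlet data on one portion of the boundary and Neumann data on the complementary portion,

\[
\Delta u = 0 \ \text{in } \Omega, \qquad u = f \ \text{on } \Gamma_D, \qquad \frac{\partial u}{\partial n} = g \ \text{on } \Gamma_N, \qquad \partial\Omega = \Gamma_D \cup \Gamma_N,
\]

and its solutions generically develop singularities at the Dirichlet-Neumann junction points, which is what the regularization and quadrature rules described above are designed to handle.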
Abstract:
These three papers describe an approach to the synthesis of solutions to a class of mechanical design problems; these involve transmission and transformation of mechanical forces and motion, and can be described by a set of inputs and outputs. The approach involves (1) identifying a set of primary functional elements and rules of combining them, and (2) developing appropriate representations and reasoning procedures for synthesising solution concepts using these elements and their combination rules; these synthesis procedures can produce an exhaustive set of solution concepts, in terms of their topological as well as spatial configurations, to a given design problem. This paper (Part III) describes a constraint propagation procedure which, using a knowledge base of spatial information about a set of primary functional elements, can produce possible spatial configurations of solution concepts generated in Part II.
Abstract:
Electrical circuit designers seldom create really new topologies or use old ones in a novel way. Most designs are known combinations of common configurations tailored for the particular problem at hand. In this thesis I show that much of the behavior of a designer engaged in such ordinary design can be modelled by a clearly defined computational mechanism executing a set of stylized rules. Each of my rules embodies a particular piece of the designer's knowledge. A circuit is represented as a hierarchy of abstract objects, each of which is composed of other objects. The leaves of this tree represent the physical devices from which physical circuits are fabricated. By analogy with context-free languages, a class of circuits is generated by a phrase-structure grammar in which each rule describes how one type of abstract object can be expanded into a combination of more concrete parts. Circuits are designed by first postulating an abstract object which meets the particular design requirements. This object is then expanded into a concrete circuit by successive refinement using the rules of my grammar. There are in general many rules which can be used to expand a given abstract component, so analysis must be done at each level of the expansion to constrain the search to a reasonable set. Thus the rules of my circuit grammar provide constraints which allow the approximate qualitative analysis of partially instantiated circuits. Later, more careful analysis in terms of more concrete components may lead to the rejection of a line of expansion which at first looked promising. I provide special failure rules to direct the repair in this case.
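A toy sketch of the phrase-structure idea, with entirely hypothetical rule content: abstract circuit objects expand into combinations of more concrete objects via grammar rules, and the leaves are physical devices. A real system would, as the abstract says, analyze each partial expansion and prune; this sketch simply takes the first applicable rule.

```python
# Hypothetical circuit grammar: each abstract object maps to a list of
# alternative expansions; names absent from the grammar are terminal
# physical devices (the leaves of the design tree).
GRAMMAR = {
    "amplifier":         [["input_stage", "gain_stage", "output_stage"]],
    "input_stage":       [["differential_pair"], ["common_emitter"]],
    "gain_stage":        [["common_emitter"], ["cascode"]],
    "output_stage":      [["emitter_follower"]],
    "differential_pair": [["npn", "npn", "resistor"]],
    "common_emitter":    [["npn", "resistor"]],
    "cascode":           [["npn", "npn"]],
    "emitter_follower":  [["npn", "resistor"]],
}

def expand(obj):
    """Successive refinement of an abstract object into physical devices."""
    if obj not in GRAMMAR:      # terminal: a physical device
        return [obj]
    rule = GRAMMAR[obj][0]      # a real system would choose among alternatives
    return [leaf for part in rule for leaf in expand(part)]

print(expand("amplifier"))
```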
Abstract:
Transport protocols are an integral part of the inter-process communication (IPC) service used by application processes to communicate over the network infrastructure. With almost 30 years of research on transport, one would have hoped that we had a good handle on the problem. Unfortunately, that is not true. As the Internet continues to grow, new network technologies and new applications continue to emerge, putting transport protocols in a never-ending flux as they are continuously adapted to these new environments. In this work, we propose a clean-slate transport architecture that renders all possible transport solutions as simply combinations of policies instantiated on a single common structure. We identify a minimal set of mechanisms that, once instantiated with the appropriate policies, allows any transport solution to be realized. Given our proposed architecture, we contend that there are no more transport protocols to design, only policies to specify. We implement our transport architecture in a declarative language, Network Datalog (NDlog), making the specification of different transport policies easy, compact, reusable, dynamically configurable, and potentially verifiable. In NDlog, transport state is represented as database relations, state is updated and queried using database operations, and transport policies are specified as declarative rules. We identify limitations of NDlog that could potentially threaten the correctness of our specification, and we propose several language extensions to NDlog that would significantly improve the programmability of transport policies.
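A rough illustration, in Python rather than NDlog, of the architectural claim that transport state lives in relations and a policy is a rule over them: the relation names, tuple layouts, and the retransmission policy below are invented for illustration and do not reproduce NDlog syntax or the paper's mechanisms.

```python
# Transport state as relations (sets of tuples); names are hypothetical.
sent  = {("flow1", 1, 0.0), ("flow1", 2, 0.1)}  # (flow, seq, send_time)
acked = {("flow1", 1)}                           # (flow, seq)

def retransmit_rule(now, timeout=0.5):
    """Declarative-style policy: derive retransmit(flow, seq) for every
    sent packet that is unacknowledged and older than the timeout."""
    return {(f, s) for (f, s, t) in sent
            if (f, s) not in acked and now - t > timeout}

print(retransmit_rule(now=1.0))  # -> {('flow1', 2)}
```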
Abstract:
Today most IC and board designs are undertaken using two-dimensional graphics tools and rule checks. System-in-Package is driving three-dimensional design concepts, and this is posing a number of challenges for electronic design automation (EDA) software vendors. System-in-Package requires three-dimensional EDA tools and design collaboration systems with appropriate manufacturing and assembly rules for these expanding technologies. Simulation and analysis tools today focus on one aspect of the design requirement, for example thermal, electrical, or mechanical. System-in-Package requires analysis and simulation tools that can easily capture complex three-dimensional structures and provide fast, integrated solutions to issues such as thermal management, reliability, and electromagnetic interference. This paper discusses some of the challenges faced by the design and analysis community in providing appropriate tools to engineers for System-in-Package design.
Abstract:
To predict where a catalytic reaction should occur is a fundamental scientific issue. Technologically it is also important, because it can facilitate catalyst design. However, to date the understanding of this issue is rather limited. In this work, two types of reactions, CH4 → CH3 + H and CO → C + O, on two transition metal surfaces were chosen as model systems aiming to address in general where a catalytic reaction should occur. The dissociations CH4 → CH3 + H and CO → C + O and their reverse reactions on flat, stepped, and kinked Rh and Pd surfaces were studied in detail. We find the following. First, for the CH4 → CH3 + H reaction, the dissociation barrier is reduced by approximately 0.3 eV on steps and kinks as compared to that on flat surfaces. On the other hand, there is essentially no difference in barrier for the association reaction of CH3 + H between the flat surfaces and the defects. Second, for the CO → C + O reaction, the dissociation barrier decreases dramatically (by more than 0.8 eV on Rh and Pd) on steps and kinks as compared to that on flat surfaces. In contrast to the CH3 + H reaction, the C + O association reaction also preferentially occurs on steps and kinks. We also present a detailed analysis of the reaction barriers in which each barrier is decomposed quantitatively into a local electronic effect and a geometrical effect. Our DFT calculations show that surface defects such as steps and kinks can largely facilitate bond breaking, while whether surface defects promote bond formation depends on the individual reaction as well as the particular metal. The physical origin of these trends is identified and discussed. On the basis of our results, we arrive at some simple rules with respect to where a reaction should occur: (i) defects such as steps are always favored for dissociation reactions as compared to flat surfaces; and (ii) the reaction site of association reactions is largely related to the magnitude of the bonding-competition effect, which is determined by the reactant and the metal valency. Reactions with high-valency reactants are more likely to occur on defects (they are more structure-sensitive) than reactions with low-valency reactants. Moreover, reactions on late transition metals are more likely to proceed on defects than those on early transition metals.
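Schematically, and with notation chosen here for illustration only, the barrier decomposition the abstract describes can be written as

\[
\Delta E_a = E_a^{\mathrm{defect}} - E_a^{\mathrm{flat}} = \Delta E_{\mathrm{elec}} + \Delta E_{\mathrm{geom}},
\]

so that, for example, the roughly 0.3 eV barrier reduction reported for CH4 dissociation on steps and kinks is apportioned quantitatively between the local electronic effect and the geometrical effect; how the two terms are separated is detailed in the paper.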
Abstract:
Strasheela provides a means for the composer to create a symbolic score by formally describing it in a rule-based way. The environment defines a rich music representation for complex polyphonic scores. Strasheela enables the user to define expressive compositional rules and then to apply them to the score. Compositional rules can restrict many aspects of the music, including the rhythmic, melodic, and harmonic structure, by constraining the parameters (e.g. duration or pitch) of musical events according to some numerical or logical relation. Strasheela combines this expressivity with efficient search strategies.
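To make the idea concrete, a small sketch in Python (not Strasheela's own rule syntax) of a compositional rule as a constraint over event parameters: a hypothetical melodic rule restricting consecutive pitches to steps of at most a major third, applied by brute-force filtering where a constraint solver would propagate and prune.

```python
from itertools import product

def max_step_rule(pitches, limit=4):
    """Hypothetical rule: consecutive melodic intervals of at most
    `limit` semitones (4 = a major third)."""
    return all(abs(b - a) <= limit for a, b in zip(pitches, pitches[1:]))

DOMAIN = [60, 62, 64, 65, 67]   # a C-major fragment as MIDI pitches
melodies = [m for m in product(DOMAIN, repeat=4) if max_step_rule(m)]
print(len(melodies), melodies[0])
```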
Abstract:
This article offers an examination of the interplay between politics, ethics, theory and methodology as they impact upon social research, through a critical analysis of the ethnographic study conducted by Peter Foster. It will be argued that his highly contentious claim to have found no manifestations of racism (either direct or indirect) throughout his study of an inner-city, multi-ethnic comprehensive school was, in the last analysis, both misleading and inaccurate. It will be contended that such claims were based upon a research design and methodology which were ultimately determined by his own political orientation and the ethical and theoretical positions which he developed as a consequence.
Abstract:
Fibre-Reinforced Plastics (FRPs) have been used in civil aerospace vehicles for decades. The current state of the art in airframe design and manufacture results in approximately half the airframe mass being attributable to FRP materials. The continual increase in the use of FRP materials over metallic alloys is attributable to their superior specific strength and stiffness, fatigue performance, and corrosion resistance. However, the full potential of these materials has yet to be exploited, as analysis methods that predict physical failure with equal accuracy and robustness are not yet available. The result is a conservative approach to design, but one that can bring benefit via increased inspection intervals and reduced cost over the vehicle life. The challenge is that the methods used in practice are based on empirical tests; the real relationships and drivers are difficult to see in this complex process, so the trade-off decision is challenging and uncertain. The aim of this feasibility study was to scope a viable process which could help develop rules and relationships, based on the fundamental mechanics of composite materials and the economics of production and operation, that would enhance understanding of the role and impact of design allowables across the life of a composite structure.
Abstract:
Recent decades have been characterized by a growing indefiniteness of design which, reflecting an ever more complex everyday life, can no longer be regarded as an activity closed within the project and finished in the product. Its universe has grown and dispersed into smaller markets, extending its reach and importance and turning it into a quantitative, inflationary term. The growing attention it attracts has dramatized its relationship with consumption; the noise on the shelves and the superficiality of a generalized project perspective focused on new media, validating a narcissistic approach that dispenses with dialogue with industry, have made design synonymous with 'special'. This work seeks to confirm and reflect upon the existence of a universe of concrete products and approaches that consciously share this understanding of the current context: first, by identifying alternative paths and reactions emerging from practice and discourse; second, by examining whether such a reaction can be considered a movement. Starting from a literature review and a reflection on a universe of everyday products, it seeks, third, to interpret and organize the various forms and strategies by which the notion of 'special' design is rejected. From the search for approaches opposed to 'special' emerges a basic idea of 'normal', normal without being banal, expressed in the Super Normal exhibition. Extending this to a wider universe of products, reactions, and authors yields the idea of 'normal' as a broader concept, composed of smaller notions that are more specific, more concrete, and easier to identify in objects. Organizing and interpreting these approaches results in a triangle of concepts that build the notion of 'normal': evolution, pursued through memory and knowledge; invisibility, through the integration and dilution of product and project; and freedom, as a means of adapting to the current context. The resulting book reflects a defined approach of an era, not the whole panorama. By systematizing the concept of 'normal' through its segmentation into smaller ideas, more concrete and identifiable in products, it seeks to provide an organized survey of reactions to the currently dominant context, to the idea of 'special'; rather than rules, it aims to offer a document for reflection to be revisited in the act of designing, whether as a whole or in parts.