863 results for "set based design"
Abstract:
In deterministic optimization, the uncertainties of the structural system (i.e., dimensions, model, material, loads, etc.) are not explicitly taken into account. Hence, the resulting optimal solutions may lead to reduced reliability levels. The objective of reliability-based design optimization (RBDO) is to optimize structures while guaranteeing that a minimum level of reliability, chosen a priori by the designer, is maintained. Since reliability analysis using the First Order Reliability Method (FORM) is an optimization procedure itself, RBDO (in its classical version) is a double-loop strategy: the reliability analysis (inner loop) and the structural optimization (outer loop). The coupling of these two loops leads to very high computational costs. To reduce the computational burden of RBDO based on FORM, several authors have proposed decoupling the structural optimization and the reliability analysis. These procedures may be divided into two groups: (i) serial single-loop methods and (ii) unilevel methods. The basic idea of serial single-loop methods is to decouple the two loops and solve them sequentially until some convergence criterion is achieved. Unilevel methods, on the other hand, employ different strategies to obtain a single optimization loop that solves the RBDO problem. This paper presents a review of such RBDO strategies. A comparison of the performance (computational cost) of the main strategies is presented for several variants of two benchmark problems from the literature and for a structure modeled using the finite element method.
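As an illustration of the double-loop structure described above, the following minimal Python sketch nests a FORM analysis (inner loop, via Hasofer-Lind-Rackwitz-Fiessler iterations) inside a design optimization (outer loop). The two-variable linear limit state, the target reliability index, and all numbers are invented assumptions, not one of the paper's benchmark problems.

```python
# A minimal sketch of classical double-loop RBDO: the outer design
# optimization has a constraint that calls an inner FORM analysis.
# Limit state, bounds, and target below are illustrative assumptions.
import numpy as np
from scipy.optimize import minimize

BETA_TARGET = 3.0  # required reliability index, chosen a priori

def form_beta(d, g, grad_g, max_iter=50, tol=1e-8):
    """Inner loop: HL-RF iteration returning the reliability index for
    the limit state g(u, d) = 0 in standard normal space u."""
    u = np.zeros(2)
    for _ in range(max_iter):
        gv, dg = g(u, d), grad_g(u, d)
        # HL-RF update: project onto the linearized limit state
        u_new = (dg @ u - gv) * dg / (dg @ dg)
        if np.linalg.norm(u_new - u) < tol:
            u = u_new
            break
        u = u_new
    return np.linalg.norm(u)

# Invented linear limit state: g = 3*d0 + 2*d1 - d0*u0 - u1
def g(u, d):
    return 3.0 * d[0] + 2.0 * d[1] - u[0] * d[0] - u[1]

def grad_g(u, d):
    return np.array([-d[0], -1.0])

cost = lambda d: d[0] + d[1]                       # outer objective
cons = {"type": "ineq",
        "fun": lambda d: form_beta(d, g, grad_g) - BETA_TARGET}
res = minimize(cost, x0=[1.0, 1.0], constraints=[cons],
               bounds=[(0.1, 5.0)] * 2)
print(res.x, form_beta(res.x, g, grad_g))
```

The serial single-loop and unilevel methods reviewed in the paper aim to avoid exactly this nesting, in which every evaluation of the outer constraint triggers a full inner FORM iteration.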
Abstract:
The main goal of this thesis is to facilitate the development of industrial automated systems by applying formal methods to ensure system reliability. A new formulation of the distributed diagnosability problem is presented in terms of Discrete Event Systems theory and the automata framework; it is then used to enforce the desired property of the system, rather than just verifying it. This approach tackles the state explosion problem with modeling patterns and new algorithms aimed at verifying the diagnosability property in the context of the distributed diagnosability problem. The concepts are validated with a newly developed software tool.
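One standard check for diagnosability in the automata framework is the twin-plant construction, sketched below on an invented three-state toy automaton; the thesis's distributed formulation, modeling patterns, and tool are not reproduced here.

```python
# A minimal sketch of the twin-plant diagnosability test: pair the
# plant with itself, synchronize on observable events, let unobservable
# events (including the fault 'f') move each copy independently, and
# report non-diagnosability if a cycle exists among "ambiguous" pairs
# (one copy has seen the fault, the other has not). Toy automaton only.
from itertools import product

# transitions: state -> list of (event, next_state); events in UNOBS
# are unobservable, 'f' is the fault event (illustrative assumptions).
TRANS = {
    0: [("f", 1), ("u", 2)],
    1: [("a", 1)],   # faulty branch loops on observable 'a'
    2: [("a", 2)],   # normal branch loops on 'a' too -> ambiguous cycle
}
UNOBS = {"u", "f"}

def twin_plant(trans):
    """Build the twin-plant graph over pairs ((s1, f1), (s2, f2))."""
    start = ((0, False), (0, False))
    graph, stack, seen = {}, [start], {start}
    while stack:
        (s1, f1), (s2, f2) = node = stack.pop()
        succs = []
        for e, t in trans.get(s1, []):       # copy 1 moves alone
            if e in UNOBS:
                succs.append(((t, f1 or e == "f"), (s2, f2)))
        for e, t in trans.get(s2, []):       # copy 2 moves alone
            if e in UNOBS:
                succs.append(((s1, f1), (t, f2 or e == "f")))
        for (e1, t1), (e2, t2) in product(trans.get(s1, []),
                                          trans.get(s2, [])):
            if e1 == e2 and e1 not in UNOBS:  # synchronized observable move
                succs.append(((t1, f1), (t2, f2)))
        graph[node] = succs
        for n in succs:
            if n not in seen:
                seen.add(n)
                stack.append(n)
    return graph

def has_ambiguous_cycle(graph):
    """DFS cycle detection restricted to ambiguous pairs (f1 != f2)."""
    amb = {n for n in graph if n[0][1] != n[1][1]}
    color = {}
    def dfs(n):
        color[n] = 1
        for m in graph.get(n, []):
            if m in amb and (color.get(m) == 1
                             or (m not in color and dfs(m))):
                return True
        color[n] = 2
        return False
    return any(n not in color and dfs(n) for n in amb)

# The toy system is NOT diagnosable: after 'f' or 'u' the observer
# sees 'a' forever and can never tell the two branches apart.
print("diagnosable:", not has_ambiguous_cycle(twin_plant(TRANS)))
```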
Abstract:
In this paper we present a new population-based method for the design of bone fixation plates. Standard pre-contoured plates are designed based on the mean shape of a certain population. We propose a computational process to design implants that reduces the amount of intra-operative shaping required, thus reducing the mechanical stresses applied to the plate. A bending and torsion model was used to measure and minimize the necessary intra-operative deformation. The method was applied and validated on a population of 200 femurs that was further augmented with a statistical shape model. The results showed a substantial reduction in the bending and torsion needed to shape the new design to fit any bone in the population when compared to the standard mean-based plates.
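A toy illustration of the population-based objective, under strong simplifying assumptions: bones are reduced to one-parameter planar curves and the paper's bending/torsion model to a curvature-change proxy, so this is only a sketch of the idea, not the method itself.

```python
# A minimal sketch: choose a plate curve minimizing the total shaping
# "effort" over a synthetic population, and compare it with the plate
# built from the population mean shape. All data are invented.
import numpy as np

rng = np.random.default_rng(0)
n_pop, n_pts = 200, 60
s = np.linspace(0.0, 1.0, n_pts)
# synthetic "bone" centerlines: arcs y = k * s^2 with skewed curvatures
curvatures = rng.lognormal(mean=0.0, sigma=0.5, size=n_pop)
bones = curvatures[:, None] * s[None, :] ** 2

def shaping_effort(plate, bone):
    """Proxy for intra-operative bending: magnitude of the curvature
    change needed to deform the plate onto the bone."""
    d2 = np.diff(plate - bone, 2)
    return np.sqrt(np.sum(d2 ** 2))

mean_plate = bones.mean(axis=0)
# search the same parametric family for the plate minimizing total effort
ks = np.linspace(0.2, 3.0, 281)
totals = [sum(shaping_effort(k * s ** 2, b) for b in bones) for k in ks]
best_k = ks[int(np.argmin(totals))]

print("mean-shape plate, total effort:",
      round(sum(shaping_effort(mean_plate, b) for b in bones), 2))
print("optimized plate,  total effort:", round(min(totals), 2))
```

With the skewed population used here, the effort-minimizing plate differs from the mean-shape plate, which is the qualitative point of the paper's comparison.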
Abstract:
In the past few decades, integrated circuits have become a major part of everyday life. Every circuit that is created needs to be tested for faults so that faulty circuits are not sent to end users. The creation of these tests is time-consuming, costly, and difficult to perform on larger circuits. This research presents a novel method for fault detection and test pattern reduction in integrated circuitry under test. By leveraging the FPGA's reconfigurability and parallel processing capabilities, a speed-up in fault detection can be achieved over previous computer simulation techniques. This work presents the following contributions to the field of stuck-at fault detection. First, we present a new method for inserting faults into a circuit netlist: given any circuit netlist, our tool can insert multiplexers at the correct internal nodes to aid in fault emulation on reconfigurable hardware. Second, we present a parallel method of fault emulation: the benefit of the FPGA is not only its ability to implement any circuit but also its ability to process data in parallel, and this research utilizes that to create a more efficient emulation method implementing numerous copies of the same circuit in the FPGA. Third, we present a new method to organize the most efficient faults: most methods for determining the minimum number of inputs that cover the most faults require sophisticated software programs that use heuristics, whereas by utilizing hardware this research is able to process data faster and use a simpler method for efficiently minimizing inputs.
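The mux-insertion idea can be sketched in plain software (the actual contribution runs many circuit copies in parallel on the FPGA): each internal net is conceptually routed through a 2:1 multiplexer whose select line forces a stuck-at value. The netlist format and gate set below are illustrative assumptions.

```python
# A minimal sketch of mux-based stuck-at fault insertion and detection,
# as a software stand-in for the FPGA flow. Toy netlist only.
from itertools import product

# net -> (op, inputs); primary inputs are 'a', 'b', 'c'
NETLIST = {
    "n1": ("AND", ["a", "b"]),
    "n2": ("OR",  ["n1", "c"]),
    "out": ("NOT", ["n2"]),
}
OPS = {"AND": lambda x: all(x), "OR": lambda x: any(x),
       "NOT": lambda x: not x[0]}

def evaluate(netlist, inputs, fault=None):
    """Evaluate the circuit; fault=(net, stuck_value) emulates the mux
    selecting the forced value instead of the computed one."""
    values = dict(inputs)
    for net, (op, ins) in netlist.items():   # assumes topological order
        v = OPS[op]([values[i] for i in ins])
        if fault and fault[0] == net:
            v = fault[1]                     # mux selects the stuck value
        values[net] = v
    return values["out"]

# fault "emulation": find test vectors whose good/faulty outputs differ
faults = [(n, v) for n in NETLIST for v in (0, 1)]
for vec in product([0, 1], repeat=3):
    inputs = dict(zip("abc", vec))
    good = evaluate(NETLIST, inputs)
    detected = [f for f in faults if evaluate(NETLIST, inputs, f) != good]
    print(vec, "detects", len(detected), "of", len(faults), "faults")
```

Counting which vectors detect the most faults, as in the final loop, is the software analogue of the input-minimization step described above.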
Abstract:
The Social Web offers increasingly simple ways to publish and disseminate personal or opinionated information, which can rapidly exert a disastrous influence on the online reputation of organizations. Based on social Web data, this study describes the building of an ontology based on fuzzy sets. After recurring harvesting of folksonomies by Web agents, the aggregated tags are purified, linked, and transformed into a so-called fuzzy grassroots ontology by means of a fuzzy clustering algorithm. This self-updating ontology is used for online reputation analysis, a crucial task of reputation management, with the goal of following the online conversation going on around an organization in order to discover and monitor its reputation. In addition, an application of the Fuzzy Online Reputation Analysis (FORA) framework, lessons learned, and potential extensions are discussed in this article.
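The clustering step can be illustrated with a plain fuzzy c-means implementation. The tags and their two-dimensional embedding below are invented, and the FORA pipeline's purification and linking stages are omitted; the point is only that each tag receives a graded membership in every cluster.

```python
# A minimal fuzzy c-means sketch: membership degrees let one tag belong
# to several ontology concepts at once. Tag features are assumptions.
import numpy as np

rng = np.random.default_rng(1)

def fuzzy_c_means(X, c, m=2.0, n_iter=100):
    """Plain fuzzy c-means: returns cluster centers and the membership
    matrix U (tags x clusters), with each row summing to 1."""
    n = len(X)
    U = rng.dirichlet(np.ones(c), size=n)          # random memberships
    for _ in range(n_iter):
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None], axis=2) + 1e-9
        U = 1.0 / (d ** (2 / (m - 1)))             # standard FCM update
        U /= U.sum(axis=1, keepdims=True)
    return centers, U

tags = ["price", "cost", "cheap", "ceo", "boss", "scandal"]
# invented 2-D embedding of tag co-occurrence
X = np.array([[0.9, 0.1], [0.85, 0.2], [0.8, 0.15],
              [0.1, 0.9], [0.15, 0.85], [0.5, 0.6]])
centers, U = fuzzy_c_means(X, c=2)
for tag, memb in zip(tags, U):
    print(f"{tag:8s} memberships: {np.round(memb, 2)}")
```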
Abstract:
Bridges with the deck supported on either sliding or elastomeric bearings are very common in regions of moderate seismicity. Their main seismic vulnerabilities are related to the pounding of the deck against the abutments or between the different deck elements. A simplified model of the longitudinal behavior of these bridges allows the reaction forces developed during pounding to be characterized using the Pacific Earthquake Engineering Research Center framework formula. To ensure the general applicability of the results, a large number of system parameter combinations is considered. The heart of the formula is the identification of suitable intermediate variables. First, the pseudo-acceleration spectral value at the fundamental period of the system, Sa(Ts), is used as an intensity measure (IM). This IM results in a very large unexplained variability of the engineering demand parameter. A portion of this variability is shown to be related to the relative content of high-frequency energy in the input motion. Two vector-valued IMs that include a second parameter accounting for this energy content are then considered, and for both a suitable form of the conditional intensity dependence of the response is obtained. The question of which one to choose is also analyzed. Finally, additional issues related to the IM are studied: its applicability to pulse-type records, the validity of scaling records, and the sufficiency of the IM.
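The benefit of a vector-valued IM can be sketched with a synthetic regression: fit ln(EDP) on ln(Sa(Ts)) alone, then add a second regressor standing in for the high-frequency energy content, and compare residual dispersions. The variable `hf`, the coefficients, and all data below are invented assumptions, not the paper's records.

```python
# A minimal sketch of scalar vs. vector-valued IM: adding a second
# parameter for high-frequency energy reduces the unexplained
# variability of the demand. Synthetic data only.
import numpy as np

rng = np.random.default_rng(2)
n = 400
ln_sa = rng.normal(0.0, 0.6, n)       # ln Sa(Ts), synthetic
hf = rng.normal(0.0, 0.4, n)          # high-frequency energy proxy
# synthetic pounding demand: depends on both, plus record noise
ln_edp = 1.2 * ln_sa + 0.8 * hf + rng.normal(0, 0.25, n)

def fit_dispersion(X, y):
    """OLS fit; returns the standard deviation of the residuals."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    return np.std(y - X1 @ beta)

print("scalar IM dispersion:",
      round(fit_dispersion(ln_sa[:, None], ln_edp), 3))
print("vector IM dispersion:",
      round(fit_dispersion(np.column_stack([ln_sa, hf]), ln_edp), 3))
```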
Abstract:
The telomerase enzyme is a potential therapeutic target in many human cancers. A series of potent inhibitors has been designed by computer modeling, which exploit the unique structural features of quadruplex DNA. These 3,6,9-trisubstituted acridine inhibitors are predicted to interact selectively with the human DNA quadruplex structure, as a means of specifically inhibiting the action of human telomerase in extending the length of single-stranded telomeric DNA. The anilino substituent at the 9-position of the acridine chromophore is predicted to lie in a third groove of the quadruplex. Calculated relative binding energies predict enhanced selectivity compared with the earlier 3,6-disubstituted compounds, as a result of this substituent. The ranking order of energies is in accord with equilibrium binding constants for the quadruplex measured by surface plasmon resonance techniques, which also show reduced duplex binding compared with the disubstituted compounds. The 3,6,9-trisubstituted acridines have potent in vitro inhibitory activity against human telomerase, with EC50 values of up to 60 nM.
Abstract:
The homeodomain is a 60-amino acid module which mediates critical protein-DNA and protein-protein interactions for a large family of regulatory proteins. We have used structure-based design to analyze the ability of the Oct-1 homeodomain to nucleate an enhancer complex. The Oct-1 protein regulates herpes simplex virus (HSV) gene expression by participating in the formation of a multiprotein complex (C1 complex) which regulates alpha (immediate early) genes. We recently described the design of ZFHD1, a chimeric transcription factor containing zinc fingers 1 and 2 of Zif268, a four-residue linker, and the Oct-1 homeodomain. In the presence of alpha-transinduction factor and C1 factor, ZFHD1 efficiently nucleates formation of the C1 complex in vitro and specifically activates gene expression in vivo. The sequence specificity of ZFHD1 recruits C1 complex formation to an enhancer element which is not efficiently recognized by Oct-1. ZFHD1 function depends on the recognition of the Oct-1 homeodomain surface. These results prove that the Oct-1 homeodomain mediates all the protein-protein interactions that are required to efficiently recruit alpha-transinduction factor and C1 factor into a C1 complex. The structure-based design of transcription factors should provide valuable tools for dissecting the interactions of DNA-bound domains in other regulatory circuits.
Abstract:
A class of potent nonpeptidic inhibitors of human immunodeficiency virus protease has been designed by using the three-dimensional structure of the enzyme as a guide. By employing iterative protein cocrystal structure analysis, design, and synthesis, the binding affinity of the lead compound was incrementally improved by over four orders of magnitude. An inversion in inhibitor binding mode was observed crystallographically, providing information critical for subsequent design and highlighting the utility of structural feedback in inhibitor optimization. These inhibitors are selective for the viral protease enzyme, possess good antiviral activity, and are orally bioavailable in three species.
Abstract:
This paper presents a way to describe design patterns rigorously based on role concepts. Rigorous pattern descriptions are a key requirement if patterns are to be used as rules for model evolution, for example in the MDA context. We formalize the role concepts commonly used in defining design patterns as a role metamodel using Object-Z. Given this role metamodel, individual design patterns are specified generically as formal pattern role models, also in Object-Z. We also formalize the properties that must be captured in a class model when a design pattern is deployed. These properties are defined generically in terms of role bindings from a pattern role model to a class model. Our work provides a precise but abstract approach to pattern definition, and it also provides a precise basis for checking the validity of pattern usage in designs.
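A rough executable analogue of the role-binding check, in Python rather than Object-Z: a pattern role model declares roles and required relations between them, a binding maps roles to classes, and validity means every required relation is realized between the bound classes. The Observer-style example and all names are invented for illustration.

```python
# A minimal sketch of checking role bindings from a pattern role model
# to a class model. Pattern, classes, and relations are assumptions.

# pattern role model: roles plus required (source_role, target_role)
OBSERVER_ROLES = {"Subject", "Observer"}
OBSERVER_RELATIONS = {("Subject", "Observer")}  # Subject must know Observers

# class model: class -> set of classes it references
CLASS_MODEL = {
    "NewsFeed": {"EmailAlert", "Logger"},
    "EmailAlert": set(),
    "Logger": set(),
}

def binding_is_valid(roles, relations, binding, class_model):
    """A binding role->class is valid if it covers every role and each
    required role relation is realized by a class-level reference."""
    if set(binding) != roles:
        return False
    return all(binding[tgt] in class_model.get(binding[src], set())
               for src, tgt in relations)

print(binding_is_valid(OBSERVER_ROLES, OBSERVER_RELATIONS,
                       {"Subject": "NewsFeed", "Observer": "EmailAlert"},
                       CLASS_MODEL))   # True: reference exists
print(binding_is_valid(OBSERVER_ROLES, OBSERVER_RELATIONS,
                       {"Subject": "EmailAlert", "Observer": "NewsFeed"},
                       CLASS_MODEL))   # False: EmailAlert references nothing
```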
Abstract:
A method and a corresponding tool are described which assist design recovery and program understanding by recognising instances of design patterns semi-automatically. The approach taken is specifically designed to overcome the scalability problems caused by the many design and implementation variants of design pattern instances. Our approach is based on a new recognition algorithm which works incrementally, rather than trying to analyse a possibly large software system in one pass without any human intervention. The new algorithm exploits domain and context knowledge given by a reverse engineer, together with a special underlying data structure, namely a particular form of annotated abstract syntax graph. A comparative and quantitative evaluation of applying the approach to the Java AWT and JGL libraries is also given.
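The incremental, hint-driven flavour of such an algorithm can be sketched as follows. The graph, the relations, and the Composite-pattern core used here are toy assumptions, not the paper's annotated abstract syntax graph or its variant handling.

```python
# A minimal sketch of hint-pruned pattern recognition over a graph of
# class relations: candidates are generated lazily, and an engineer's
# hint narrows the search instead of analysing everything in one pass.

# annotated graph: (source, relation, target) edges between classes
EDGES = {
    ("Shape", "inherits", "Component"),
    ("Group", "inherits", "Component"),
    ("Group", "aggregates", "Component"),
    ("Button", "inherits", "Widget"),
}

def composite_candidates(edges, hint=None):
    """Yield (composite, component) pairs matching the Composite core:
    the composite both inherits from and aggregates the component. A
    hint restricts the component class, mimicking engineer input."""
    inherits = {(s, t) for s, r, t in edges if r == "inherits"}
    aggregates = {(s, t) for s, r, t in edges if r == "aggregates"}
    for comp, base in inherits:
        if hint and base != hint:
            continue                    # prune with domain knowledge
        if (comp, base) in aggregates:
            yield comp, base

print(list(composite_candidates(EDGES)))                    # full search
print(list(composite_candidates(EDGES, hint="Component")))  # pruned search
```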
Abstract:
In the developed world we are surrounded by man-made objects, but most people give little thought to the complex processes needed for their design. The design of hand knitting is complex because much of the domain knowledge is tacit. The objective of this thesis is to devise a methodology to help designers work within design constraints whilst facilitating creativity. A hybrid solution combining computer-aided design (CAD) and case-based reasoning (CBR) is proposed. The CAD system creates designs using domain-specific rules, and these designs are employed for the initial seeding of the case base and the management of constraints. CBR reuses the designer's previous experience. The key aspects of the CBR system are measuring the similarity of cases and adapting past solutions to the current problem. Similarity is measured by asking the user to rank the importance of features; the ranks are then used to calculate weights for an algorithm which compares the specifications of designs. A novel adaptation operator called rule difference replay (RDR) is created. When the specification for a new design is presented, the CAD program uses it to construct a design constituting an approximate solution. The most similar design from the case base is then retrieved, and RDR replays the changes previously made to the retrieved design on the new solution. A measure of solution similarity that can validate subjective success scores is created. In a hybrid CAD-CBR system, specification similarity can be used as a guide to whether to invoke CBR: if the new specification is sufficiently similar to a previous design, then CBR is invoked; otherwise CAD is used. The application of RDR to knitwear design has demonstrated the flexibility to overcome deficiencies in rules that try to automate creativity, and it has the potential to be applied to other domains such as interior design.
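The retrieval step (rank-based weighting plus weighted specification similarity) might look like the following sketch. The features, the inverse-rank weighting scheme, and the case base are invented for illustration, and the RDR adaptation operator itself is not shown.

```python
# A minimal sketch of CBR retrieval: user-supplied importance ranks are
# turned into weights, and a weighted similarity over design
# specifications selects the case to adapt. All names are assumptions.

def rank_to_weights(ranks):
    """Convert importance ranks (1 = most important) to normalized
    weights via inverse-rank weighting (an assumed scheme)."""
    inv = {f: 1.0 / r for f, r in ranks.items()}
    total = sum(inv.values())
    return {f: w / total for f, w in inv.items()}

def similarity(spec_a, spec_b, weights):
    """Weighted similarity of two specifications whose numeric
    features are scaled to [0, 1]."""
    return sum(w * (1.0 - abs(spec_a[f] - spec_b[f]))
               for f, w in weights.items())

case_base = {
    "jumper_01": {"tension": 0.6, "sleeve": 0.8, "neck": 0.2},
    "jumper_02": {"tension": 0.3, "sleeve": 0.4, "neck": 0.9},
}
query = {"tension": 0.55, "sleeve": 0.75, "neck": 0.5}
weights = rank_to_weights({"tension": 1, "sleeve": 2, "neck": 3})
best = max(case_base, key=lambda c: similarity(case_base[c], query, weights))
print("retrieve:", best)   # most similar past design, to be adapted by RDR
```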
Abstract:
The software-based design of "virtual" technical systems is considered as a facility for simulation experiments for educational purposes. These virtual systems are usable for analysing the functioning of medical intrascopy systems. Such virtual educational technical systems help to guarantee high-quality technical training of bioengineers.