992 results for Design Automation
Resumo:
The constant increase in the complexity of digital systems demands the automation of the corresponding synthesis process. This paper presents a computational environment designed to produce both software and hardware implementations of a system. The code-generation tool is named ACG8051. For hardware synthesis, a larger environment was produced, consisting of four programs: PIPE2TAB, AGPS, TABELA, and TAB2VHDL. ACG8051 and PIPE2TAB take place/transition net descriptions from PIPE as inputs. ACG8051 generates assembly code for the 8051 micro-controller. PIPE2TAB produces a tabular version of a Mealy-type finite state machine of the system; its output is fed into AGPS, which performs state allocation. The resulting digital system is then input to TABELA, which minimizes the control functions and outputs of the digital system. Finally, the output generated by TABELA is fed to TAB2VHDL, which produces a VHDL description of the system at the register transfer level. Thus, we present a set of tools that takes a high-level description of a digital system, represented by a place/transition net, and produces as output both assembly code that can be run immediately on an 8051 micro-controller and a VHDL description that can be used to implement the hardware parts directly on an FPGA or as an ASIC.
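The tabular Mealy-machine description such a flow emits can be pictured as a table mapping (state, input) pairs to (next-state, output) pairs, with the output depending on both the current state and the input. The Python sketch below is illustrative only; the state names and the sequence-detector behavior are invented, not taken from the tools described above.

```python
# Hedged sketch of a tabular Mealy-type finite state machine: rows map
# (state, input) to (next_state, output). States/inputs are invented.
mealy_table = {
    ("S0", 0): ("S0", 0),
    ("S0", 1): ("S1", 0),
    ("S1", 0): ("S0", 0),
    ("S1", 1): ("S1", 1),  # Mealy: output depends on state AND input
}

def run(table, start, inputs):
    """Step the machine through an input sequence, collecting outputs."""
    state, outputs = start, []
    for x in inputs:
        state, y = table[(state, x)]
        outputs.append(y)
    return outputs

# This particular table emits 1 whenever two consecutive 1s are seen.
print(run(mealy_table, "S0", [1, 1, 0, 1]))  # [0, 1, 0, 0]
```

A state-allocation step (as AGPS performs) would then assign binary codes to S0 and S1 before the table is minimized and emitted as VHDL.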
Resumo:
Petri net (PN) modeling is one of the most widely used formal methods in the field of automation applications, together with programmable logic controllers (PLCs). The creation of a PN modeling methodology compatible with the IEC61131 standard is therefore a necessity for automation specialists. Different works dealing with this subject have been carried out; they are presented in the first part of this paper [Frey (2000a, 2000b); Peng and Zhou (IEEE Trans Syst Man Cybern, Part C Appl Rev 34(4):523-531, 2004); Uzam and Jones (Int J Adv Manuf Technol 14(10):716-728, 1998)], but they do not present a methodology completely compatible with this standard. At the same time, they neither maintain the simplicity required for such applications nor use all-graphical and all-mathematical ordinary Petri net (OPN) tools to facilitate model verification and validation. The proposal presented here fulfills these requirements. Educational applications at the USP and UEA (Brazil) and the UO (Cuba), as well as industrial applications in Brazil and Cuba, have already been carried out with good results.
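The ordinary Petri net (OPN) semantics underlying such models reduces to a simple firing rule: since every arc has weight one, a transition is enabled when each of its input places holds at least one token, and firing removes one token from each input place and adds one to each output place. A minimal Python sketch, with invented place and transition names:

```python
# Minimal sketch of an ordinary place/transition net with a dict marking.
# Names ("idle", "busy", "start", "stop") are illustrative, not from the paper.
class PetriNet:
    def __init__(self, pre, post):
        # pre[t] / post[t]: sets of input / output places of transition t
        self.pre = pre
        self.post = post

    def enabled(self, marking, t):
        # Ordinary net: arc weight 1, so one token per input place suffices.
        return all(marking.get(p, 0) >= 1 for p in self.pre[t])

    def fire(self, marking, t):
        if not self.enabled(marking, t):
            raise ValueError(f"transition {t} not enabled")
        m = dict(marking)
        for p in self.pre[t]:
            m[p] -= 1
        for p in self.post[t]:
            m[p] = m.get(p, 0) + 1
        return m

# Example: a two-place net modeling a start/stop cycle.
net = PetriNet(pre={"start": {"idle"}, "stop": {"busy"}},
               post={"start": {"busy"}, "stop": {"idle"}})
m0 = {"idle": 1, "busy": 0}
m1 = net.fire(m0, "start")
print(m1)  # {'idle': 0, 'busy': 1}
```

An IEC61131-oriented methodology would map such transitions to PLC program steps, but that translation is the subject of the paper itself and is not sketched here.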
Resumo:
We describe a compositional framework, together with its supporting toolset, for hardware/software co-design. Our framework integrates a formal approach within a traditional design flow. The formal approach is based on Interval Temporal Logic and its executable subset, Tempura. Refinement is the key element in our framework, because it derives both the software and hardware parts of the implementation from a single formal specification of the system, while preserving all properties of the system specification. During refinement, simulation is used to choose the appropriate refinement rules, which are applied automatically in the HOL system. The framework is illustrated with two case studies. The work presented is part of a UK collaborative research project between the Software Technology Research Laboratory at De Montfort University and the Oxford University Computing Laboratory.
Resumo:
The work described in this thesis aims to support the distributed design of integrated systems, considering specifically the need for collaborative interaction among designers. Particular emphasis was given to issues only marginally considered in previous approaches, such as the abstraction of the distribution of design automation resources over the network, the possibility of both synchronous and asynchronous interaction among designers, and support for extensible design data models. Such issues demand a rather complex software infrastructure, as possible solutions must encompass a wide range of software modules: from user interfaces to middleware to databases. To build such a structure, several engineering techniques were employed and some original solutions were devised. The core of the proposed solution is based on the joint application of two homonymic technologies: CAD Frameworks and object-oriented frameworks. The former concept was coined in the late 1980s within the electronic design automation community and comprises a layered software environment that supports CAD tool developers, CAD administrators/integrators, and designers. The latter, developed during the last decade by the software engineering community, is a software architecture model for building extensible and reusable object-oriented software subsystems. In this work, we propose an object-oriented framework that includes extensible sets of design data primitives and design tool building blocks. This object-oriented framework is included within a CAD Framework, where it plays important roles in typical CAD Framework services such as design data representation and management, versioning, user interfaces, design management, and tool integration.
The implemented CAD Framework, named Cave2, followed the classical layered architecture presented by Barnes, Harrison, Newton and Spickelmier, but the foundations provided by the object-oriented framework allowed a series of improvements not available in previous approaches:
- Object-oriented frameworks are extensible by design, so the same holds for the implemented sets of design data primitives and design tool building blocks. Both the design representation model and the software modules dealing with it can be upgraded or adapted to a particular design methodology, and such extensions and adaptations still inherit the architectural and functional aspects implemented in the object-oriented framework foundation.
- The design semantics and the design visualization are both part of the object-oriented framework, but in clearly separated models. This allows different visualization strategies for a given design data set, giving collaborating parties the flexibility to choose individual visualization settings.
- The control of consistency between semantics and visualization, a particularly important issue in a design environment with multiple views of a single design, is also included in the foundations of the object-oriented framework. The mechanism is generic enough to be used by further extensions of the design data model, as it is based on inversion of control between view and semantics: the view receives user input and propagates the event to the semantic model, which evaluates whether a state change is possible and, if so, triggers the change of state of both semantics and view. Our approach took advantage of this inversion of control and included a layer between semantics and view to account for multi-view consistency.
- To optimize the consistency control mechanism between views and semantics, we propose an event-based approach that captures each discrete interaction of a designer with his/her design views. The information about each interaction is encapsulated in an event object, which may be propagated to the design semantics, and thus to other possible views, according to the consistency policy in use. Furthermore, the use of event pools allows late synchronization between view and semantics when a network connection between them is unavailable.
- The use of proxy objects significantly raised the abstraction of the integration of design automation resources, as both remote and local tools and services are accessed through method calls on a local object. Connecting to remote tools and services through a look-up protocol also completely abstracts the network location of such resources, allowing resources to be added and removed at runtime.
- The implemented CAD Framework is entirely based on Java technology, relying on the Java Virtual Machine as the layer that grants independence between the CAD Framework and the operating system.
All these improvements contributed to a higher abstraction of the distribution of design automation resources and introduced a new paradigm for remote interaction between designers. The resulting CAD Framework supports fine-grained, event-based collaboration, so every single design update performed by a designer can be propagated to the rest of the design team regardless of their location in the distributed environment.
This can increase group awareness and allow a richer transfer of experience among designers, significantly improving the collaboration potential when compared to previously proposed file-based or record-based approaches. Three case studies were conducted to validate the proposed approach, each focusing on a subset of the contributions of this thesis. The first uses the proxy-based resource distribution architecture to implement a prototyping platform using reconfigurable hardware modules. The second extends the foundations of the implemented object-oriented framework to support interface-based design; these extensions, comprising design representation primitives and tool blocks, are used to implement a design entry tool named IBlaDe, which allows the collaborative creation of functional and structural models of integrated systems. The third case study concerns the integration of multimedia metadata into the design data model, a possibility explored in the frame of an online educational and training platform.
Resumo:
This work studies aspects of design methodology in light of new information technologies, Artificial Intelligence (expert systems) and CAD, considering the real possibilities of automating the conception process in Design. The article proposes a methodology for building an intelligent system capable of assisting the designer in design tasks. The footwear industry was used as a case study for applying the methodology, in which the real possibilities of automation are verified.
Resumo:
In this work, a unified algorithm-architecture-circuit co-design environment for complex FPGA system development is presented. The main objective is to find an efficient methodology for designing a configurable, optimized FPGA system with as little verification effort as possible, so as to shorten the development period. A proposed high-performance FFT/iFFT processor for a Multiband Orthogonal Frequency Division Multiplexing Ultra Wideband (MB-OFDM UWB) system is given as an example to demonstrate the methodology. This design methodology was tested and found suitable for almost all types of complex FPGA system design and verification.
Resumo:
Combinatorial optimization is a complex engineering subject. Although formulations often depend on the nature of problems, which differ in their setup, design, constraints, and implications, establishing a unifying framework is essential. This dissertation investigates the unique features of three important optimization problems that span from small-scale design automation to large-scale power system planning: (1) feeder remote terminal unit (FRTU) planning strategy considering the cybersecurity of the secondary distribution network in the electrical distribution grid, (2) physical-level synthesis for microfluidic lab-on-a-chip, and (3) discrete gate sizing in very-large-scale integration (VLSI) circuits. First, an optimization technique based on cross entropy is proposed to handle FRTU deployment in the primary network while considering the cybersecurity of the secondary distribution network. Constrained by a monetary budget on the number of deployed FRTUs, the proposed algorithm identifies pivotal locations of a distribution feeder in which to install the FRTUs over different time horizons. Then, multi-scale optimization techniques are proposed for digital microfluidic lab-on-a-chip physical-level synthesis. The proposed techniques handle variation-aware lab-on-a-chip placement and routing co-design while satisfying all constraints and considering contamination and defects. Last, the first fully polynomial time approximation scheme (FPTAS) is proposed for the delay-driven discrete gate sizing problem, exploring the theoretical view since existing works are heuristics with no performance guarantee. The intellectual contribution of the proposed methods establishes a novel paradigm bridging the gaps between professional communities.
Resumo:
Among several process variability sources, valve friction and inadequate controller tuning are supposed to be two of the most prevalent. Friction quantification methods can be applied to the development of model-based compensators or to diagnose valves that need repair, whereas accurate process models can be used in controller retuning. This paper extends existing methods that jointly estimate the friction and process parameters, adopting a nonlinear structure to represent the process model. The developed estimation algorithm is tested with three different data sources: a simulated first-order-plus-dead-time process, a hybrid setup (composed of a real valve and a simulated pH neutralization process), and three industrial datasets corresponding to real control loops. The results demonstrate that the friction is accurately quantified and that good process models are estimated in several situations. Furthermore, when a nonlinear process model is considered, the proposed extension presents significant advantages: (i) greater accuracy in friction quantification and (ii) reasonable estimates of the nonlinear steady-state characteristics of the process. (C) 2010 Elsevier Ltd. All rights reserved.
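The simulated first-order-plus-dead-time process used as one of the data sources can be sketched with a simple Euler discretization of y' = (−y + K·u(t − θ))/τ. The gain, time constant, and dead time below are illustrative values, not the paper's:

```python
# Hedged sketch: Euler simulation of a first-order-plus-dead-time (FOPDT)
# process. K, tau, theta are illustrative, not taken from the paper.
def simulate_fopdt(u, K=2.0, tau=5.0, theta=3, dt=1.0, y0=0.0):
    """Simulate y' = (-y + K*u(t - theta)) / tau, dead time in samples."""
    y, out = y0, []
    for k in range(len(u)):
        u_delayed = u[k - theta] if k >= theta else 0.0  # dead time
        y += dt * (-y + K * u_delayed) / tau             # Euler step
        out.append(y)
    return out

# Step response: output is flat during the dead time, then rises toward
# the steady-state value K * u = 2.0.
y = simulate_fopdt([1.0] * 50)
print(round(y[-1], 2))  # 2.0
```

A friction-estimation study would place a stiction model between the controller output and this process input; that nonlinearity is deliberately omitted from this sketch.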
Resumo:
Liver steatosis is a common disease usually associated with social and genetic factors. Early detection and quantification are important, since it can evolve into cirrhosis. In this paper, a new computer-aided diagnosis (CAD) system for steatosis classification, on a local and global basis, is presented. A Bayes factor is computed from objective ultrasound textural features extracted from the liver parenchyma. The goal is to develop a CAD screening tool to help in steatosis detection. Results showed an accuracy of 93.33%, with a sensitivity of 94.59% and specificity of 92.11%, using the Bayes classifier. The proposed CAD system provides a suitable graphical display for steatosis classification.
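The reported figures follow from the standard confusion-matrix definitions of accuracy, sensitivity, and specificity. The counts in the sketch below are hypothetical, chosen only so the percentages match those quoted; the abstract does not give the actual dataset sizes.

```python
# Hedged sketch: confusion-matrix metrics for a binary classifier.
# The counts are illustrative, picked to reproduce the quoted percentages.
def binary_metrics(tp, tn, fp, fn):
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    sensitivity = tp / (tp + fn)  # true-positive rate (disease detected)
    specificity = tn / (tn + fp)  # true-negative rate (normal recognized)
    return accuracy, sensitivity, specificity

acc, sens, spec = binary_metrics(tp=35, tn=35, fp=3, fn=2)
print(f"{acc:.2%} {sens:.2%} {spec:.2%}")  # 93.33% 94.59% 92.11%
```

The Bayes-factor classification step itself (comparing likelihoods of the textural features under the normal and steatosis models) is the paper's contribution and is not reproduced here.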
Resumo:
Liver steatosis is a common disease usually associated with social and genetic factors. Early detection and quantification are important, since it can evolve into cirrhosis. Steatosis is usually a diffuse liver disease, since the liver is globally affected. However, steatosis can also be focal, affecting only some foci that are difficult to discriminate. In both cases, steatosis is detected by laboratory analysis and visual inspection of ultrasound images of the hepatic parenchyma. Liver biopsy is the most accurate diagnostic method, but its invasive nature suggests the use of other, non-invasive methods, while visual inspection of ultrasound images is subjective and prone to error. In this paper, a new Computer Aided Diagnosis (CAD) system for steatosis classification and analysis is presented, in which the Bayes factor, obtained from objective intensity and textural features extracted from US images of the liver, is computed on a local or global basis. The main goal is to provide the physician with an application that makes the diagnosis and quantification of steatosis faster and more accurate, namely in a screening approach. The results showed an overall accuracy of 93.54%, with a sensitivity of 95.83% and 85.71% for the normal and steatosis classes, respectively. The proposed CAD system proved suitable as a graphical display for steatosis classification, and a comparison with some of the most recent works in the literature is also presented.
Resumo:
In real-time systems, there are two distinct trends for scheduling task sets on unicore systems: non-preemptive and preemptive scheduling. Non-preemptive scheduling is obviously not subject to any preemption delay, but its schedulability may be quite poor, whereas fully preemptive scheduling is subject to preemption delay but benefits from higher flexibility in scheduling decisions. The time delay caused by task preemptions is a major source of pessimism in the analysis of the task Worst-Case Execution Time (WCET) in real-time systems. Preemptive scheduling policies that include non-preemptive regions are a hybrid between the non-preemptive and fully preemptive scheduling paradigms, combining the benefits of both worlds. In this paper, we exploit the connection between the progression of a task through its operations and the knowledge of the preemption delays as a function of that progression. The pessimism in the preemption delay estimation is thus reduced compared to state-of-the-art methods, owing to the increased information available to the analysis.
Resumo:
Task scheduling is one of the key mechanisms to ensure timeliness in embedded real-time systems. Such systems often need to execute not only application tasks but also some urgent routines (e.g. error-detection actions, consistency checkers, interrupt handlers) with minimum latency. Although fixed-priority schedulers such as Rate-Monotonic (RM) are in line with this need, they usually make only a low processor utilization available to the system. Moreover, this availability usually decreases with the number of considered tasks. If dynamic-priority schedulers such as Earliest Deadline First (EDF) are applied instead, high system utilization can be guaranteed, but the minimum latency for executing urgent routines may not be ensured. In this paper we describe a scheduling model in which urgent routines are executed at the highest priority level while all other system tasks are scheduled by EDF. We show that the guaranteed processor utilization for the assumed scheduling model is at least as high as the one provided by RM for two tasks, namely 2(√2 − 1). Seven polynomial-time tests for checking the system timeliness are derived and proved correct. The proposed tests are compared against each other and against an exact but exponential-running-time test.
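The quoted two-task figure is the classical Liu & Layland rate-monotonic utilization bound n(2^(1/n) − 1) evaluated at n = 2, which gives 2(√2 − 1) ≈ 0.828. A quick numerical check:

```python
# The Liu & Layland utilization bound that RM guarantees for n tasks;
# for n = 2 it equals 2*(sqrt(2) - 1) ~ 0.828, the figure quoted above.
from math import sqrt

def rm_bound(n: int) -> float:
    return n * (2 ** (1 / n) - 1)

print(round(rm_bound(2), 3))  # 0.828
# Sanity check: the closed form for n = 2 matches 2*(sqrt(2) - 1).
assert abs(rm_bound(2) - 2 * (sqrt(2) - 1)) < 1e-12
```

As n grows, rm_bound(n) decreases toward ln 2 ≈ 0.693, which is the "availability decreases with the number of tasks" behavior the abstract mentions; EDF, by contrast, guarantees schedulability up to a utilization of 1 on a single core.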
Resumo:
Quality is a key factor in the automotive industry. All suppliers of components for the automotive industry are subject to systematic qualifications and audits aimed at improving processes and verifying their traceability. When processes rest essentially on intensive manual labor, it becomes much harder to reach the coveted zero-defects goal, and quality assurance can be compromised, requiring more refined control procedures. However, if the processes are properly defined and capital-intensive means are chosen over labor-intensive ones, quality assurance can become a reality and quality control operations can be strongly minimized. This work was based on the need to strongly reduce, or even eliminate, the occurrence of assembly defects in a system known as "remachado". After careful analysis of the installed process, already partially automated but still heavily dependent on manual labor, a piece of equipment was designed to reproduce the same effect while accommodating some possible defects originating from the suppliers of the components inserted into this assembly, located upstream in the product's supply chain. The equipment resulting from this work lowered the cycle time, accommodated the dimensional variability detected in the components that make up the assembly, and drastically reduced the number of non-conformities.
Resumo:
Garment manufacturing is an activity that has been carried out in Portugal for several decades, and there are clothing brands of Portuguese origin with worldwide recognition. To achieve quality it is necessary to innovate and automate certain processes, in order to increase productivity and reduce errors arising from labor-intensive tasks. At the Portuguese company Henrique Camões, which has vast experience with textile manufacturing equipment, the idea arose of designing a prototype of an automated machine for manufacturing collars and cuffs, in order to verify its functional viability. This work was thus based on the need to design a machine capable of sewing and cutting collars and cuffs to be applied to garments. Initially, a survey of existing equipment for similar purposes was carried out. It was then necessary to conceive a machine capable of meeting the customer's expectations and requirements. After the initial sketches, in which the types of mechanisms and the operating principles of the different systems were defined as a function of the intended movements and actions, together with the machine's structure, these systems were optimized so as to obtain a functional machine as the final result. The pneumatic diagram and the Grafcet describing the machine's operation were also designed. As supporting project material, the list of components and manufacturing processes is presented, as well as detailed drawings of all components of the structure. The final result is a set of ideas and solutions applicable to equipment of this type; indeed, the proposed solution is a viable possibility for an automated machine for sewing and cutting collars and cuffs.
Resumo:
This master's thesis presents the design of a tongue-and-groove jointing machine for insulation ducts, capable of jointing ducts whose length and diameter vary. The starting point for the design was the ergonomics and occupational-safety problems of the jointing machines currently in use. The thesis reviews different ways of implementing the machine and develops two conceptual solutions into working assemblies. The better of the two assemblies is selected, with justification, for further development and is refined into a complete machine. The work covers the mechanical design of the machine; automation design is excluded from its scope. The work follows the phases of a systematic machine design process. The end result is a machine that meets the performance requirements set for it: the designed machine is safe and fulfills the ergonomic requirements. In addition, vibration analyses were performed on the machine frame, on the basis of which the relationships between natural and excitation frequencies, as well as the effects of stiffening the frame, were analyzed.