924 results for Asynchronous logic circuits


Relevance:

20.00%

Publisher:

Abstract:

The name of Claude Elwood Shannon is not entirely unfamiliar to researchers in Communication. Nevertheless, part of his importance for the history of communication in the twentieth century is little known. His master's thesis and the article derived from it (A Symbolic Analysis of Relay and Switching Circuits) were essential for the computer to become a communication machine and, consequently, to penetrate our society in the way it does today. This article reviews Shannon's first major work and makes explicit its role in the current context of communication.

Relevance:

20.00%

Publisher:

Abstract:

The recent advances in CMOS technology have allowed for the fabrication of transistors with submicron dimensions, making possible the integration of tens of millions of devices in a single chip that can be used to build very complex electronic systems. Such an increase in design complexity has created a need for more efficient verification tools that incorporate more appropriate physical and computational models. Timing verification aims at determining whether the timing constraints imposed on the design can be satisfied. It can be performed by circuit simulation or by timing analysis. Although simulation tends to furnish the most accurate estimates, it has the drawback of being stimulus-dependent. Hence, in order to ensure that the critical situation is taken into account, one would have to exercise all possible input patterns, which is obviously infeasible given the high complexity of current designs. To circumvent this problem, designers must rely on timing analysis. Timing analysis is an input-independent verification approach that models each combinational block of a circuit as a directed acyclic graph, which is used to estimate the critical delay. The first timing analysis tools used only the circuit topology to estimate circuit delay and are thus referred to as topological timing analyzers. However, this method may yield overly pessimistic delay estimates, since the longest paths in the graph may not be able to propagate a transition, that is, they may be false. Functional timing analysis, in turn, considers not only circuit topology but also the temporal and functional relations between circuit elements. Functional timing analysis tools may differ in three aspects: the set of sensitization conditions required to declare a path sensitizable (the so-called path sensitization criterion), the number of paths handled simultaneously, and the method used to determine whether the sensitization conditions are satisfiable. Currently, the two most efficient approaches test the sensitizability of entire sets of paths at a time: one is based on automatic test pattern generation (ATPG) techniques and the other translates the timing analysis problem into a satisfiability (SAT) problem. Although timing analysis has been studied exhaustively over the last fifteen years, some specific topics have not yet received the required attention. One such topic is the applicability of functional timing analysis to circuits containing complex gates, which is the central concern of this thesis. In addition, and as a necessary step to set the scene, a detailed and systematic study of functional timing analysis is also presented.
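
As a minimal sketch of the topological (input-independent) analysis described above, the following Python fragment computes the critical delay of a combinational block modeled as a directed acyclic graph by a longest-path traversal; the gate names and delays are hypothetical, and no false-path filtering is performed, which is exactly the pessimism that functional timing analysis removes.

    from collections import defaultdict

    def topological_critical_delay(gate_delays, edges):
        """Longest-path (topological) delay estimate of a combinational block.

        gate_delays: dict mapping gate name -> propagation delay (arbitrary units)
        edges: list of (driver, sink) pairs forming a DAG
        Returns the worst-case arrival time, ignoring false paths.
        """
        fanin, fanout, indegree = defaultdict(list), defaultdict(list), defaultdict(int)
        for drv, snk in edges:
            fanin[snk].append(drv)
            fanout[drv].append(snk)
            indegree[snk] += 1

        # Kahn-style topological ordering
        ready = [g for g in gate_delays if indegree[g] == 0]
        order = []
        while ready:
            g = ready.pop()
            order.append(g)
            for s in fanout[g]:
                indegree[s] -= 1
                if indegree[s] == 0:
                    ready.append(s)

        # Arrival time of a gate = its own delay plus the latest fan-in arrival
        arrival = {}
        for g in order:
            arrival[g] = gate_delays[g] + max((arrival[d] for d in fanin[g]), default=0.0)
        return max(arrival.values())

    # Hypothetical netlist: two input gates feeding an output gate
    print(topological_critical_delay(
        gate_delays={"inv1": 1.0, "nand2": 1.5, "nor2": 2.0},
        edges=[("inv1", "nor2"), ("nand2", "nor2")]))   # -> 3.5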

Relevance:

20.00%

Publisher:

Abstract:

Mixed-signal and analog design on a pre-diffused array is a challenging task, given that the digital array is a linear matrix arrangement of minimum-length transistors. To surmount this drawback, a specific discipline for designing analog circuits over such an array is required. An important novel technique proposed is the use of TAT (Trapezoidal Associations of Transistors) composite transistors on the semi-custom Sea-Of-Transistors (SOT) array. The TAT arrangement is extensively analyzed and its advantages demonstrated, with simulation and measurement comparisons against equivalent single transistors. Basic analog cells were also designed, in full-custom and TAT versions, in 1.0 µm and 0.5 µm digital CMOS technologies. Most of the circuits were prototyped both in full-custom form and as TAT-based designs on pre-diffused SOT arrays. An innovative demonstration of the TAT technique is given with the design and implementation of a mixed-signal system, namely a fully differential 2nd-order Sigma-Delta analog-to-digital (A/D) modulator, fabricated in both full-custom and SOT array methodologies in a 0.5 µm CMOS technology from the MOSIS foundry. Three test chips were designed and fabricated in 0.5 µm. Two of them contain the full-custom and SOT array versions of the 2nd-order Sigma-Delta A/D modulator. The third IC contains transistor structures (TAT and single) and analog cells placed side by side, building blocks (comparator and folded-cascode OTA) of the Sigma-Delta modulator.

Relevance:

20.00%

Publisher:

Abstract:

With the ever-increasing demand for high-complexity consumer electronic products, market pressure calls for faster product development at lower cost. SoC-based design can provide the required design flexibility and speed by allowing the use of IP cores. However, testing costs in the SoC environment can reach a substantial share of the total production cost. Analog testing costs may dominate the total test cost, as testing analog circuits usually requires functional verification of the circuit and special testing procedures. For the RF analog circuits commonly used in wireless applications, testing is further complicated by the high frequencies involved. In summary, reducing analog test cost is of major importance in the electronics industry today. BIST techniques for analog circuits, though potentially able to solve the analog test cost problem, have some limitations. Some techniques are circuit-dependent, requiring reconfiguration of the circuit under test, and are generally not usable in RF circuits. In the SoC environment, processing and memory resources are available and could be used for testing. However, the overhead of adding extra A/D and D/A converters may be too costly for most systems, and analog routing of signals may not be feasible and may introduce signal distortion. In this work a simple, low-cost digitizer is used instead of an ADC in order to enable analog test strategies to be implemented in an SoC environment. Thanks to the low analog area overhead of the converter, multiple analog test points can be observed and specific analog test strategies can be enabled. As the digitizer is always connected to the analog test point, it is not necessary to include muxes and switches that would degrade the signal path. For RF analog circuits this is especially useful, as the circuit impedance is fixed and the influence of the digitizer can be accounted for in the design phase. Thanks to the simplicity of the converter, it is able to reach higher frequencies and enables the implementation of low-cost RF test strategies. The digitizer has been applied successfully to the testing of both low-frequency and RF analog circuits. Also, since testing is based on frequency-domain characteristics, nonlinear characteristics such as intermodulation products can be evaluated as well. Specifically, practical results were obtained for prototyped baseband filters and a 100 MHz mixer. The application of the converter to noise figure evaluation was also addressed, and experimental results for low-frequency amplifiers using conventional op-amps were obtained. The proposed method is able to enhance the testability of current mixed-signal designs, being suitable for the SoC environment used in many industrial products today.
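
As a rough, self-contained illustration of the frequency-domain evaluation mentioned above (not the proposed digitizer itself), the Python sketch below drives a hypothetical weakly nonlinear device model with a two-tone stimulus and reads the third-order intermodulation products off an FFT of the output; all frequencies and coefficients are made up for the example.

    import numpy as np

    fs = 1.0e6                      # sample rate (Hz), hypothetical
    n = 4096
    t = np.arange(n) / fs
    f1, f2 = 100e3, 110e3           # two-tone test frequencies (Hz)

    x = 0.5 * (np.sin(2 * np.pi * f1 * t) + np.sin(2 * np.pi * f2 * t))
    # Hypothetical weakly nonlinear DUT model: y = x + a3 * x**3
    y = x + 0.05 * x ** 3

    spectrum = np.abs(np.fft.rfft(y * np.hanning(n))) / n
    freqs = np.fft.rfftfreq(n, 1 / fs)

    def tone_level(f):
        """Magnitude of the FFT bin closest to frequency f."""
        return spectrum[np.argmin(np.abs(freqs - f))]

    # Third-order intermodulation products fall at 2*f1 - f2 and 2*f2 - f1
    for f in (f1, f2, 2 * f1 - f2, 2 * f2 - f1):
        level_db = 20 * np.log10(tone_level(f) + 1e-12)
        print(f"{f / 1e3:7.1f} kHz: {level_db:6.1f} dB (relative)")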

Relevance:

20.00%

Publisher:

Abstract:

A new paradigm is remodeling the World: evolutionary innovations on all fronts, new information technologies, huge mobility of capital, the use of risky financial tools, the globalization of production, new emerging powers, and the impact of consumer concerns on governmental policies. These phenomena are shaping the World and forcing the advent of a new World Order in the multilateral monetary, financial, and trading system. The effects of this new paradigm are also transforming global governance. The political and economic orders established after the World War, centered on the multilateral model of the UN, the IMF, the World Bank, and the GATT and led by the developed countries, are facing significant challenges. The rise of China and of emerging countries shifted the old model to a polycentric World, where the governance of these organizations is challenged by emerging countries demanding a bigger role on the decision boards of these international bodies. As a consequence, multilateralism is being confronted by polycentrism. Negotiations for a more representative voting process and the pressure for new rules to cope with the new demands are paralyzing important decisions. This scenario seriously affects not only the monetary and financial systems but also the multilateral trading system. International trade faces significant challenges: a serious deadlock in concluding the last round of multilateral negotiations at the WTO; the fragmentation of trade rules through the multiplication of preferential and mega agreements; the arrival of a new model of global production and trade led by global value chains, which is threatening the old trade order; and the imposition of new sets of regulations, by private bodies controlled by transnationals to support global value chains and by non-governmental organizations to reflect the concerns of consumers in the North, based on their precautionary attitude toward the sustainability of products made around the World. The lack of any multilateral order in this new regulation is creating a cacophony of rules and fostering a new regulatory war of the Global North against the Global South. The objective of this paper is to explore how these challenges are affecting the Trading System and how it can evolve to manage these new trends.

Relevance:

20.00%

Publisher:

Abstract:

Investors were wrong to believe in change for the better; Brazil is stuck for at least two years

Relevance:

20.00%

Publisher:

Abstract:

Fuzzy logic admits infinitely many intermediate truth values between false and true. Based on this principle, this work develops a fuzzy rule-based system that indicates the body mass index of ruminant animals, with the goal of determining the best moment for slaughter. The fuzzy system takes the variables mass and height as inputs and produces as output a new body mass index, called the Fuzzy Body Mass Index (Fuzzy BMI), which can serve as a system for detecting the slaughter moment of cattle, comparing the animals with one another through the linguistic variables Very Low, Low, Medium, High, and Very High. To demonstrate and apply the system, 147 Nelore cows were analyzed, determining the Fuzzy BMI value for each animal and indicating the body mass condition of the whole herd. The validation of the system was based on a statistical analysis using a Pearson correlation coefficient of 0.923, which represents a high positive correlation and indicates that the proposed method is adequate. The method thus makes it possible to evaluate the herd by comparing each animal with its peers in the group, providing the cattle rancher with a quantitative decision-making tool. It can also be concluded that this work established a computational method based on fuzzy logic capable of imitating part of human reasoning and of interpreting the body mass index of any bovine breed in any region of the country.
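
A minimal Python sketch of the kind of fuzzy rule-based inference described above, with hypothetical membership functions and rules that are not the ones used in the work; it only illustrates triangular memberships, min/max rule aggregation, and centroid defuzzification.

    import numpy as np

    def tri(x, a, b, c):
        """Triangular membership function with feet a, c and peak b."""
        return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

    def fuzzy_bmi(mass_kg, height_m):
        """Toy Mamdani-style inference: two inputs, one 0-10 'fuzzy BMI' output."""
        # Hypothetical input membership degrees
        mass_low, mass_high = tri(mass_kg, 250, 350, 450), tri(mass_kg, 400, 550, 700)
        h_short, h_tall = tri(height_m, 1.1, 1.3, 1.5), tri(height_m, 1.4, 1.6, 1.8)

        out = np.linspace(0, 10, 201)                 # output universe
        low_set, high_set = tri(out, 0, 2, 5), tri(out, 5, 8, 10)

        # Two illustrative rules (min for AND, clip for implication, max to aggregate)
        agg = np.maximum(np.fmin(min(mass_low, h_tall), low_set),
                         np.fmin(min(mass_high, h_short), high_set))
        if agg.sum() == 0:
            return None                               # no rule fired
        return float((out * agg).sum() / agg.sum())   # centroid defuzzification

    print(fuzzy_bmi(mass_kg=520, height_m=1.35))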

Relevance:

20.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior

Relevance:

20.00%

Publisher:

Abstract:

Frequency selective surfaces (FSS) are structures consisting of periodic arrays of conducting elements, called patches, which are usually very thin and printed on dielectric layers, or of apertures perforated in very thin metallic screens, for applications in the microwave and millimeter-wave bands. These structures are often used in aircraft, missiles, satellites, radomes, reflector antennas, high-gain antennas, and microwave ovens, for example. Their main purpose is to filter frequency bands, which may be transmitted or rejected depending on the specific application. In turn, modern communication systems such as GSM (Global System for Mobile Communications), RFID (Radio Frequency Identification), Bluetooth, Wi-Fi, and WiMAX, whose services are in high demand by society, have required the development of antennas whose main features are low cost, low profile, and reduced dimensions and weight. In this context, the microstrip antenna is an excellent choice for today's communication systems because, in addition to intrinsically meeting the requirements mentioned, planar structures are easy to manufacture and to integrate with other components in microwave circuits. Consequently, the analysis and synthesis of these devices, mainly due to the wide range of possible shapes, sizes, and operating frequencies of their elements, has been carried out with full-wave models such as the finite element method, the method of moments, and the finite-difference time-domain method. However, although accurate, these methods demand a great computational effort. In this context, computational intelligence (CI) has been used successfully in the design and optimization of planar microwave structures as a very appropriate auxiliary tool, given the complexity of the geometry of the antennas and FSS considered. Computational intelligence is inspired by natural phenomena such as learning, perception, and decision, using techniques such as artificial neural networks, fuzzy logic, fractal geometry, and evolutionary computation. This work studies the application of computational intelligence, using meta-heuristics such as genetic algorithms and swarm intelligence, to the optimization of antennas and frequency selective surfaces. Genetic algorithms are computational search methods based on genetics and on the theory of natural selection proposed by Darwin, used to solve complex problems, e.g., problems whose search space grows with the size of the problem. Particle swarm optimization has characteristics of collective intelligence and is applied to optimization problems in many research areas. The main objective of this work is the use of computational intelligence in the analysis and synthesis of antennas and FSS. The structures considered are a ring-type planar microstrip monopole and a cross-dipole FSS. Optimization algorithms were developed and results were obtained for optimized geometries of the antennas and FSS considered. To validate the results, several prototypes were designed, built, and measured. The measured results showed excellent agreement with the simulated ones. Moreover, the results obtained in this study were compared to those simulated with commercial software, and excellent agreement was also observed. Specifically, the efficiency of the CI techniques used was evidenced by simulated and measured results, aiming at optimizing the bandwidth of an antenna for wideband or UWB (Ultra Wideband) operation using a genetic algorithm, and at optimizing the bandwidth, by specifying the length of the air gap between two frequency selective surfaces, using a particle swarm optimization algorithm.
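
As an illustration of the particle swarm meta-heuristic mentioned above, the Python sketch below minimizes a toy surrogate cost over a box-constrained design vector; a real antenna or FSS flow would replace the surrogate with a call to an electromagnetic solver, and all parameter values and variable names here are assumptions.

    import numpy as np

    rng = np.random.default_rng(0)

    def pso(cost, bounds, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
        """Minimal particle swarm optimizer over box-constrained real variables."""
        lo, hi = np.asarray(bounds, dtype=float).T
        x = rng.uniform(lo, hi, (n_particles, lo.size))   # positions
        v = np.zeros_like(x)                              # velocities
        pbest, pbest_cost = x.copy(), np.array([cost(p) for p in x])
        g = pbest[pbest_cost.argmin()].copy()             # global best

        for _ in range(iters):
            r1, r2 = rng.random(x.shape), rng.random(x.shape)
            v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
            x = np.clip(x + v, lo, hi)
            costs = np.array([cost(p) for p in x])
            improved = costs < pbest_cost
            pbest[improved], pbest_cost[improved] = x[improved], costs[improved]
            g = pbest[pbest_cost.argmin()].copy()
        return g, pbest_cost.min()

    # Hypothetical surrogate cost: distance of a design vector (e.g. patch length,
    # patch width, air gap) from a fictitious optimum.
    target = np.array([12.0, 8.0, 1.5])
    best, best_cost = pso(lambda p: float(np.sum((p - target) ** 2)),
                          bounds=[(5, 20), (4, 15), (0.5, 5)])
    print(best, best_cost)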

Relevance:

20.00%

Publisher:

Abstract:

This work presents simulation results of an identification platform compatible with the INPE Brazilian Data Collection System, modeled with SystemC-AMS. SystemC-AMS is a library of C++ classes dedicated to the simulation of heterogeneous systems, offering powerful resources to describe models in the digital, analog, and RF domains, as well as in the mechanical and optical ones. The designed model was divided into four parts. The first block takes into account the satellite's orbit, necessary to correctly model the propagation channel, including Doppler effect, attenuation, and thermal noise. The identification block detects the presence of the satellite; it is composed of a low-noise amplifier, a band-pass filter, a power detector, and a logic comparator. The controller block is responsible for enabling the RF transmitter when the presence of the satellite is detected; it was modeled as a Petri net, due to the asynchronous nature of the system. The fourth block is the RF transmitter unit, which performs the modulation of the information in BPSK ±60°. This block is composed of an oscillator, a mixer, an adder, and an amplifier. The whole system was simulated simultaneously. The results are being used to specify system components and to elaborate testbenches for design verification.
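
A back-of-the-envelope Python sketch of the Doppler contribution handled by the propagation-channel block, assuming a simple line-of-sight narrowband approximation; the 400 MHz carrier and the radial velocities are illustrative values, not the system specification.

    C = 299_792_458.0          # speed of light (m/s)

    def doppler_shift(f_carrier_hz, radial_velocity_ms):
        """Approximate Doppler shift for a narrowband carrier.

        radial_velocity_ms > 0 means the satellite is moving away from the
        ground platform (range increasing), giving a negative shift.
        """
        return -f_carrier_hz * radial_velocity_ms / C

    # Hypothetical UHF carrier and a LEO-like sweep of radial velocities
    f0 = 400e6
    for v in (-7000.0, -3000.0, 0.0, 3000.0, 7000.0):   # m/s
        print(f"v_r = {v:7.0f} m/s -> shift = {doppler_shift(f0, v) / 1e3:+7.2f} kHz")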

Relevance:

20.00%

Publisher:

Abstract:

This paper describes a novel approach for mapping lightning processes using fuzzy logic. The estimation process is carried out by a fuzzy system based on Sugeno's architecture. Simulation results confirm that the proposed approach can be used efficiently in this type of problem.
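
A compact Python sketch of zero-order Sugeno (TSK) inference, the architecture named above, with made-up membership functions and rule constants that are unrelated to the lightning data.

    def trapmf(x, a, b, c, d):
        """Trapezoidal membership function with feet a, d and plateau [b, c]."""
        if x <= a or x >= d:
            return 0.0
        if b <= x <= c:
            return 1.0
        return (x - a) / (b - a) if x < b else (d - x) / (d - c)

    def sugeno(x):
        """Zero-order Sugeno: output = weighted average of per-rule constants."""
        rules = [
            (trapmf(x, 0, 0, 2, 4), 10.0),    # "low input    -> output 10"
            (trapmf(x, 3, 5, 7, 9), 50.0),    # "medium input -> output 50"
            (trapmf(x, 8, 10, 12, 12), 90.0)  # "high input   -> output 90"
        ]
        num = sum(w * z for w, z in rules)
        den = sum(w for w, _ in rules)
        return num / den if den else None

    print(sugeno(4.5))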

Relevance:

20.00%

Publisher:

Abstract:

From the geotechnical standpoint, it is interesting to analyse the soil texture in regions with rough terrain because of its relation to the infiltration and runoff processes and, consequently, its effect on erosion processes. The purpose of this paper is to present a methodology that provides the spatialization of soil texture by using fuzzy logic and geostatistics. The results were correlated with maps drawn specifically for the study area. Knowledge of the spatialization of soil properties, such as texture, can be an important tool for land-use planning in order to reduce potential soil losses during rainy seasons.

Relevance:

20.00%

Publisher:

Abstract:

The optimized allocation of protective devices at strategic points of the circuit improves the quality of the energy supply and the system reliability indices. This paper presents a nonlinear integer programming (NLIP) model with binary variables to deal with the problem of protective device allocation in the main feeder and all branches of an overhead distribution circuit, in order to improve the reliability indices and to provide customers with a service of high quality and reliability. The constraints considered in the problem take into account technical and economic limitations, such as coordination of serial protective devices, available equipment, the importance of the feeder, and the circuit topology. The use of genetic algorithms (GAs) is proposed to solve this problem, with a binary representation indicating whether (1) or not (0) protective devices (reclosers, sectionalizers, and fuses) are allocated at predefined points of the circuit. Results are presented for a real circuit (134 buses) with the possibility of protective device allocation at 29 points. The ability of the algorithm to find good solutions while significantly improving the reliability indicators is also shown.
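
A minimal binary-coded genetic algorithm over a toy allocation problem in Python, echoing the 0/1 encoding described above; the benefit values, device budget, and penalty are stand-ins rather than the paper's reliability model and constraints.

    import random

    random.seed(1)

    N_POINTS = 29            # candidate allocation points (as in the abstract)
    MAX_DEVICES = 6          # hypothetical budget of protective devices
    # Hypothetical "benefit" of placing a device at each candidate point
    BENEFIT = [random.uniform(0.5, 3.0) for _ in range(N_POINTS)]

    def fitness(chrom):
        """Toy fitness: total benefit, heavily penalized if the budget is exceeded."""
        used = sum(chrom)
        gain = sum(b for bit, b in zip(chrom, BENEFIT) if bit)
        return gain - 10.0 * max(0, used - MAX_DEVICES)

    def crossover(a, b):
        cut = random.randrange(1, N_POINTS)        # one-point crossover
        return a[:cut] + b[cut:]

    def mutate(chrom, rate=0.02):
        return [bit ^ (random.random() < rate) for bit in chrom]

    pop = [[random.randint(0, 1) for _ in range(N_POINTS)] for _ in range(40)]
    for _ in range(200):
        pop.sort(key=fitness, reverse=True)
        elite = pop[:10]                           # truncation selection
        pop = elite + [mutate(crossover(random.choice(elite), random.choice(elite)))
                       for _ in range(30)]

    best = max(pop, key=fitness)
    print(best, round(fitness(best), 2))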