947 results for IEEE 1149.4


Relevance: 100.00%

Abstract:

Debugging electronic circuits is traditionally done with bench equipment directly connected to the circuit under debug. In the digital domain, the difficulties associated with direct physical access to circuit nodes led to the inclusion of resources supporting that activity, first at the printed circuit board level and then at the integrated circuit level. The experience acquired with those solutions led to the emergence of dedicated infrastructures for debugging cores at the system-on-chip level. However, all these developments had little impact on the analog and mixed-signal domain, where debugging still depends, to a large extent, on direct physical access to circuit nodes. As a consequence, when analog and mixed-signal circuits are integrated as cores inside a system-on-chip, the difficulties associated with debugging increase, which causes time-to-market and prototype verification costs to increase as well. The present work considers the IEEE 1149.4 infrastructure as a means to support the debugging of mixed-signal circuits, namely to access circuit nodes and to support an embedded debug mechanism named the mixed-signal condition detector, which is necessary for watchpoint/breakpoint and real-time analysis operations. One of the main advantages of the proposed solution is the seamless migration to the system-on-chip level, as access is done through electronic means, which eases debugging operations at different hierarchical levels.
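
The abstract only names the mixed-signal condition detector; as a rough illustration of the idea (the class name, thresholds and window-comparator behaviour below are assumptions, not the work's actual design), a watchpoint condition on a digitized analog node might be modelled as follows.

# Illustrative sketch only: a window-comparator style condition detector, as one
# might attach to a node reached through IEEE 1149.4 analog boundary modules.
# Names and thresholds are hypothetical.

class MixedSignalConditionDetector:
    def __init__(self, low, high):
        # Programmable window thresholds (e.g. loaded through a test data register).
        self.low = low
        self.high = high
        self.triggered = False

    def sample(self, voltage):
        """Evaluate one digitized sample of the monitored analog node."""
        if not (self.low <= voltage <= self.high):
            self.triggered = True  # condition met: raise a watch-/breakpoint event
        return self.triggered


# Example: flag any excursion outside a 1.1 V - 1.3 V window.
detector = MixedSignalConditionDetector(low=1.1, high=1.3)
for v in (1.15, 1.22, 1.38):
    if detector.sample(v):
        print(f"breakpoint condition at {v:.2f} V")
        break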

Relevance: 90.00%

Abstract:

The increasing complexity of VLSI circuits and the reduced accessibility of modern packaging and mounting technologies restrict the usefulness of conventional in-circuit debugging tools, such as in-circuit emulators for microprocessors and microcontrollers. However, this same trend enables the development of more complex products, which in turn require more powerful debugging tools. These conflicting demands could be met if the standard scan test infrastructures now common in most complex components were able to match the debugging requirements of design verification and prototype validation. This paper analyses the main debug requirements in the design of microprocessor-based applications and the feasibility of their implementation using the mandatory, optional and additional operating modes of the standard IEEE 1149.1 test infrastructure.
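
The operating modes referred to above are all reached through the standard's test access port (TAP) controller, whose sixteen states are selected by the TMS signal on each TCK rising edge. The transition table below follows IEEE 1149.1 itself; modelling it in Python is only an illustration of how a debug tool can track the controller state.

# IEEE 1149.1 TAP controller: 16 states, next state selected by TMS on each
# rising edge of TCK. The transitions follow the standard; the Python model is
# only an illustration.
TAP_TRANSITIONS = {
    # state: (next if TMS=0, next if TMS=1)
    "Test-Logic-Reset": ("Run-Test/Idle", "Test-Logic-Reset"),
    "Run-Test/Idle":    ("Run-Test/Idle", "Select-DR-Scan"),
    "Select-DR-Scan":   ("Capture-DR", "Select-IR-Scan"),
    "Capture-DR":       ("Shift-DR", "Exit1-DR"),
    "Shift-DR":         ("Shift-DR", "Exit1-DR"),
    "Exit1-DR":         ("Pause-DR", "Update-DR"),
    "Pause-DR":         ("Pause-DR", "Exit2-DR"),
    "Exit2-DR":         ("Shift-DR", "Update-DR"),
    "Update-DR":        ("Run-Test/Idle", "Select-DR-Scan"),
    "Select-IR-Scan":   ("Capture-IR", "Test-Logic-Reset"),
    "Capture-IR":       ("Shift-IR", "Exit1-IR"),
    "Shift-IR":         ("Shift-IR", "Exit1-IR"),
    "Exit1-IR":         ("Pause-IR", "Update-IR"),
    "Pause-IR":         ("Pause-IR", "Exit2-IR"),
    "Exit2-IR":         ("Shift-IR", "Update-IR"),
    "Update-IR":        ("Run-Test/Idle", "Select-DR-Scan"),
}

def step(state, tms):
    """Advance the TAP controller by one TCK cycle."""
    return TAP_TRANSITIONS[state][tms]

# Five TCK cycles with TMS held high return the TAP to Test-Logic-Reset
# from any state.
state = "Shift-DR"
for _ in range(5):
    state = step(state, 1)
print(state)  # Test-Logic-Reset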

Relevance: 90.00%

Abstract:

Tests on printed circuit boards and integrated circuits are widely used in industry, reducing the design time and cost of a project. Functional and connectivity tests for this type of circuit soon became a concern for manufacturers, leading to research into solutions that would be reliable, fast, cheap and universal. Initially, test schemes were based on a set of needles (bed-of-nails) connected to the inputs and outputs of the integrated circuits on the board, to which signals were applied in order to verify whether the circuit met the specifications and could be assembled on the production line. With the evolution of designs, circuit miniaturization, improvements in production processes and materials, and the increase in the number of circuits, it became necessary to search for another solution. Boundary-Scan Testing was thus developed; it operates at the boundary of integrated circuits and allows the connectivity of the input and output ports of a circuit to be tested. The Boundary-Scan Testing method was standardized by the IEEE in 1990 as the IEEE 1149.1 Standard, and since then a large number of manufacturers have adopted it in their products. The main objective of this master thesis is the design of Boundary-Scan Testing in a CMOS image sensor: analyzing the standard's requirements and the process used in prototype production, developing the design and layout of the Boundary-Scan logic, and analyzing the results obtained after production.

Chapter 1 briefly presents the evolution of testing procedures used in industry, developments and applications of image sensors, and the motivation for using the Boundary-Scan Testing architecture. Chapter 2 explores the fundamentals of Boundary-Scan Testing and image sensors, starting with the Boundary-Scan architecture defined in the Standard, whose functional blocks are analyzed; this understanding is necessary to implement the design on an image sensor. It also explains the architecture of image sensors currently in use, focusing on sensors with a large number of inputs and outputs. Chapter 3 describes the Boundary-Scan design that was implemented, analyzing the design and functions of the prototype, the software used, and the designs and simulations of the implemented Boundary-Scan functional blocks. Chapter 4 presents the layout process based on the design developed in Chapter 3, describing the software used for this purpose, the planning of the layout location (floorplan) and its dimensions, the layout of the individual blocks, the layout rule checks, the comparison with the final design and, finally, the simulation. Chapter 5 describes how functional tests were performed to verify the design's compliance with the specifications of the IEEE 1149.1 Standard; these tests focused on applying signals to the input and output ports of the produced prototype. Chapter 6 presents the conclusions drawn throughout the execution of the work.
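
The functional blocks mentioned above include boundary-scan cells chained between each functional pin and the core logic, serialized between TDI and TDO. The sketch below is a simplified illustration of that chain's capture/shift/update behaviour, not the thesis design.

# Simplified boundary-scan chain model (illustration only, not the thesis design).
# Each cell has a capture/shift stage and an update latch; the cells form a
# serial path from TDI to TDO.

class BoundaryScanChain:
    def __init__(self, length):
        self.shift_regs = [0] * length        # capture/shift stage
        self.update_latches = [0] * length    # drives the pins in EXTEST

    def capture(self, pin_values):
        """CAPTURE-DR: load the values seen at the pins into the shift stage."""
        self.shift_regs = list(pin_values)

    def shift(self, tdi_bit):
        """SHIFT-DR: shift one bit in from TDI, return the bit pushed out on TDO."""
        tdo_bit = self.shift_regs[-1]
        self.shift_regs = [tdi_bit] + self.shift_regs[:-1]
        return tdo_bit

    def update(self):
        """UPDATE-DR: transfer the shifted pattern to the pin-driving latches."""
        self.update_latches = list(self.shift_regs)


chain = BoundaryScanChain(4)
chain.capture([1, 0, 1, 1])                          # sample the pins
captured = [chain.shift(b) for b in (0, 1, 1, 0)]    # read out while loading a new pattern
chain.update()                                       # drive the new pattern (EXTEST)
print(captured, chain.update_latches)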

Relevance: 90.00%

Abstract:

The unprecedented and relentless growth in the electronics industry is feeding the demand for integrated circuits (ICs) with increasing functionality and performance at minimum cost and power consumption. As predicted by Moore's law, ICs are being aggressively scaled to meet this demand. While the continuous scaling of process technology is reducing gate delays, the performance of ICs is being increasingly dominated by interconnect delays. In an effort to improve submicrometer interconnect performance, to increase packing density, and to reduce chip area and power consumption, the semiconductor industry is focusing on three-dimensional (3D) integration. However, volume production and commercial exploitation of 3D integration are not feasible yet due to significant technical hurdles.

At the present time, interposer-based 2.5D integration is emerging as a precursor to stacked 3D integration. All the dies and the interposer in a 2.5D IC must be adequately tested for product qualification. However, since the structure of 2.5D ICs differs from that of traditional 2D ICs, new challenges have emerged: (1) pre-bond interposer testing, (2) lack of test access, (3) limited ability for at-speed testing, (4) high-density I/O ports and interconnects, (5) a reduced number of test pins, and (6) high power consumption. This research targets the above challenges, and effective solutions have been developed to test both the dies and the interposer.

The dissertation first introduces the basic concepts of 3D ICs and 2.5D ICs. Prior work on testing of 2.5D ICs is studied. An efficient method is presented to locate defects in a passive interposer before stacking. The proposed test architecture uses e-fuses that can be programmed to connect or disconnect functional paths inside the interposer. The concept of a die footprint is utilized for interconnect testing, and the overall assembly and test flow is described. Moreover, the concept of weighted critical area is defined and utilized to reduce test time. In order to fully determine the location of each e-fuse and the order of functional interconnects in a test path, we also present a test-path design algorithm. The proposed algorithm can generate all test paths for interconnect testing.

In order to test for opens, shorts, and interconnect delay defects in the interposer, a test architecture is proposed that is fully compatible with the IEEE 1149.1 standard and relies on an enhancement of the standard test access port (TAP) controller. To reduce test cost, a test-path design and scheduling technique is also presented that minimizes a composite cost function based on test time and the design-for-test (DfT) overhead in terms of additional through silicon vias (TSVs) and micro-bumps needed for test access. The locations of the dies on the interposer are taken into consideration in order to determine the order of dies in a test path.
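
As a toy illustration of the kind of composite cost function described above (the weights, the distance-based time model and the per-die overhead term are assumptions, not the dissertation's formulation), a candidate ordering of dies in a test path could be scored as follows.

# Hypothetical composite test cost: weighted sum of test time and DfT overhead
# (extra TSVs / micro-bumps needed for test access). Illustration only.

def composite_cost(path, die_positions, alpha=1.0, beta=0.5):
    """path: ordered list of die names; die_positions: name -> (x, y) on the interposer."""
    # Assume test time grows with the length of the scan path across the interposer.
    test_time = sum(
        abs(die_positions[a][0] - die_positions[b][0])
        + abs(die_positions[a][1] - die_positions[b][1])
        for a, b in zip(path, path[1:])
    )
    # Assume one extra TSV/micro-bump pair per die entered on the path.
    dft_overhead = len(path)
    return alpha * test_time + beta * dft_overhead


positions = {"logic": (0, 0), "dram": (4, 0), "serdes": (4, 3)}
for order in (["logic", "dram", "serdes"], ["logic", "serdes", "dram"]):
    print(order, composite_cost(order, positions))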

To address the scenario of high density of I/O ports and interconnects, an efficient built-in self-test (BIST) technique is presented that targets the dies and the interposer interconnects. The proposed BIST architecture can be enabled by the standard TAP controller in the IEEE 1149.1 standard. The area overhead introduced by this BIST architecture is negligible; it includes two simple BIST controllers, a linear-feedback-shift-register (LFSR), a multiple-input-signature-register (MISR), and some extensions to the boundary-scan cells in the dies on the interposer. With these extensions, all boundary-scan cells can be used for self-configuration and self-diagnosis during interconnect testing. To reduce the overall test cost, a test scheduling and optimization technique under power constraints is described.
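
The LFSR and MISR named above are standard structures: the former generates pseudo-random test patterns, the latter compacts the captured responses into a signature. The sketch below shows both; the 4-bit width and feedback taps are chosen for the example only and are not those of the proposed BIST architecture.

# Illustrative 4-bit LFSR (pattern generator) and MISR (signature compactor).

def lfsr_patterns(seed=0b1001, taps=(3, 0), width=4, count=8):
    """Yield `count` pseudo-random patterns from a Fibonacci LFSR."""
    state = seed
    for _ in range(count):
        yield state
        feedback = 0
        for t in taps:
            feedback ^= (state >> t) & 1
        state = ((state << 1) | feedback) & ((1 << width) - 1)

def misr_signature(responses, width=4, taps=(3, 0)):
    """Compact a stream of test responses into a single signature."""
    sig = 0
    for r in responses:
        feedback = 0
        for t in taps:
            feedback ^= (sig >> t) & 1
        sig = (((sig << 1) | feedback) ^ r) & ((1 << width) - 1)
    return sig


patterns = list(lfsr_patterns())
# In interconnect BIST, the responses would be what the boundary-scan cells
# capture at the far end of each interposer interconnect.
golden = misr_signature(patterns)  # fault-free signature
faulty = misr_signature(p ^ 0b0010 if i == 3 else p for i, p in enumerate(patterns))
print(hex(golden), hex(faulty), golden != faulty)  # a single-bit error changes the signature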

In order to accomplish testing with a small number of test pins, the dissertation presents two efficient ExTest scheduling strategies that implement interconnect testing between tiles inside a system-on-chip (SoC) die on the interposer while satisfying the practical constraint that the number of required test pins cannot exceed the number of pins available at the chip level. The tiles in the SoC are divided into groups based on the manner in which they are interconnected. In order to minimize the test time, two optimization solutions are introduced: the first minimizes the number of input test pins, and the second minimizes the number of output test pins. In addition, two subgroup configuration methods are proposed to generate subgroups inside each test group.
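
Purely as an illustration of the pin constraint (the first-fit heuristic below is hypothetical and is not the dissertation's scheduling strategy), tiles can be packed into test groups so that no group requires more test pins than the chip provides.

# Hypothetical grouping sketch: pack tiles into ExTest groups so that the test
# pins needed by any group never exceed the pins available at the chip level.

def group_tiles(tile_pin_counts, available_pins):
    """tile_pin_counts: dict tile -> test pins it needs; returns (tiles, pins) groups."""
    groups = []  # each group: [list_of_tiles, pins_used]
    for tile, pins in sorted(tile_pin_counts.items(), key=lambda kv: -kv[1]):
        if pins > available_pins:
            raise ValueError(f"{tile} alone exceeds the chip-level pin budget")
        for group in groups:
            if group[1] + pins <= available_pins:
                group[0].append(tile)
                group[1] += pins
                break
        else:
            groups.append([[tile], pins])
    return [(tiles, pins) for tiles, pins in groups]


tiles = {"tile0": 6, "tile1": 4, "tile2": 5, "tile3": 3}
print(group_tiles(tiles, available_pins=10))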

Finally, the dissertation presents a programmable method for shift-clock stagger assignment to reduce power supply noise during SoC die testing in 2.5D ICs. An SoC die in a 2.5D IC is typically composed of several blocks, and two neighboring blocks that share the same power rails should not toggle at the same time during shift. Therefore, the proposed programmable method does not assign the same stagger value to neighboring blocks. The positions of all blocks are first analyzed, and the shared boundary length between blocks is then calculated. Based on the position relationships between the blocks, a mathematical model is presented to derive optimal results for small- to medium-sized problems. For larger designs, a heuristic algorithm is proposed and evaluated.
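
As a rough illustration of the neighbour constraint (a simple greedy assignment, not the dissertation's mathematical model or heuristic), the sketch below assigns stagger values so that blocks sharing a power-rail boundary never receive the same value.

# Greedy shift-clock stagger assignment (illustration only): blocks that share a
# power-rail boundary must not receive the same stagger value.

def assign_staggers(blocks, shared_boundaries, num_staggers):
    """blocks: block names; shared_boundaries: set of frozensets of neighbouring
    block pairs; num_staggers: number of available stagger slots."""
    assignment = {}
    for block in blocks:
        used = {assignment[n] for pair in shared_boundaries if block in pair
                for n in pair if n != block and n in assignment}
        for stagger in range(num_staggers):
            if stagger not in used:
                assignment[block] = stagger
                break
        else:
            raise ValueError(f"no stagger slot left for {block}")
    return assignment


neighbours = {frozenset(p) for p in [("cpu", "gpu"), ("gpu", "dsp"), ("cpu", "sram")]}
print(assign_staggers(["cpu", "gpu", "dsp", "sram"], neighbours, num_staggers=2))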

In summary, the dissertation targets important design and optimization problems related to testing of interposer-based 2.5D ICs. The proposed research has led to theoretical insights, experimental results, and a set of test and design-for-test methods that make testing effective and feasible from a cost perspective.

Relevance: 80.00%

Abstract:

This discussion has outlined a theoretical and pragmatic framework to demonstrate that future research involving the analysis of human performance in surgical robotics should encourage the use of phenomenology to enhance the knowledge base of this area of study. Merging experiential (first-person) and experimental (third-person) methods may help improve research designs and analyses in the investigation of robotics in surgical performance. By relying solely on third-person techniques, the current methodology and interpretation used to analyze human performance in surgical robotics is limited. Recent advances in cognitive science and psychology have also recognized this limitation and have begun to shift toward neurophenomenology. Finally, the discussion of recent robotics research presented here demonstrates the potential that phenomenology holds for augmenting the methodological and analysis techniques currently used by researchers of human performance in surgical robotics.

Relevance: 80.00%

Abstract:

A simple error detecting and correcting procedure is described for nonbinary symbol words; here, the error position is located using the Hamming method and the correct symbol is substituted using a modulo-check procedure.
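
The abstract only sketches the procedure, so the code below is a minimal illustration of one such scheme rather than the paper's exact method. The assumed details: symbols are integers modulo q, check symbols sit at power-of-two positions, a single erroneous symbol is located Hamming-style from the positions of the nonzero check sums, and the correct symbol is substituted using the common modulo residue.

# Minimal illustration (assumed details, not necessarily the paper's exact scheme).

Q = 10  # symbol alphabet: decimal digits

def encode(data, n_checks=3):
    """Place data symbols in non-power-of-two positions 1..n and fill each
    check position 2^i so that the i-th check sum is 0 mod Q."""
    n = len(data) + n_checks
    word = [0] * (n + 1)            # index 0 unused, positions 1..n
    it = iter(data)
    for pos in range(1, n + 1):
        if pos & (pos - 1):         # not a power of two -> data position
            word[pos] = next(it) % Q
    for i in range(n_checks):
        s = sum(word[pos] for pos in range(1, n + 1) if pos >> i & 1)
        word[1 << i] = (-s) % Q     # check symbol cancels the sum
    return word[1:]

def decode(received):
    """Locate a single symbol error via the nonzero check sums and substitute
    the correct symbol using the common modulo residue."""
    word = [0] + list(received)
    n = len(received)
    n_checks = n.bit_length()
    syndromes = [sum(word[pos] for pos in range(1, n + 1) if pos >> i & 1) % Q
                 for i in range(n_checks)]
    err_pos = sum(1 << i for i, s in enumerate(syndromes) if s)
    if err_pos:
        magnitude = next(s for s in syndromes if s)  # every nonzero sum equals the error
        word[err_pos] = (word[err_pos] - magnitude) % Q
    return word[1:]


codeword = encode([7, 3, 9, 1])          # 4 data symbols, 3 check symbols
corrupted = list(codeword)
corrupted[4] = (corrupted[4] + 5) % Q    # inject a single-symbol error at position 5
assert decode(corrupted) == codeword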

Relevance: 80.00%

Abstract:

The concept of feature selection in a nonparametric unsupervised learning environment is practically undeveloped because no true measure of the effectiveness of a feature exists in such an environment. The lack of a feature selection phase preceding the clustering process seriously affects the reliability of such learning. New concepts such as significant features, the level of significance of features, and the immediate neighborhood are introduced, which implicitly meet the need for feature selection in the context of clustering techniques.

Relevance: 80.00%

Abstract:

The rapid increase in renewable energy generation from wind has increased concerns about the impacts that wind arrays have on the marine environment and what these impacts mean for society. One method for identifying the impacts of offshore wind farms (OWFs) on human welfare is through the assessment and valuation of ecosystem services. Using an ecosystem services approach, this paper reviews the impacts of OWFs on the ecosystem services delivered by marine environments. During the construction phase, supporting services are changed, such as reduced energy capture and altered nutrient cycling, due to the introduction of hard substrate and the reduction of soft-sediment habitat at turbine bases. This may lead to changes in all other ecosystem services, both negative and positive. Quantifying these changes, however, is a challenge, partly due to data limitations and a lack of clear understanding of the impacts of OWFs on marine ecosystems. Scientific effort needs to quantitatively explore the impacts of OWFs on ecosystem functionality and to gather data that enables the assessment of changes to ecosystem services. Data needed to better quantify and value the impacts of OWFs on ecosystem services are suggested. The development of methods that integrate socioeconomic valuation of ecosystem services into the evaluation of renewable energy devices complements efforts to assess environmental impacts and should enable a holistic assessment of the impact of renewable energy production and greenhouse gas mitigation technologies on the U.K. carbon footprint.

Relevance: 80.00%

Abstract:

Objective: To characterize patients with grade II to VI penetrating cardiac wounds, to describe the characteristics of the trauma, the surgical treatment and the clinical course, and to identify factors associated with the outcome. Methods: An association study was designed on 308 patients admitted to surgery with a diagnosis of penetrating cardiac wound between January 1999 and October 2009. Sixty-eight cases were excluded. The analyzed series included 240 patients with cardiac wounds. Demographic, clinical, surgical and outcome variables were analyzed, tabulated in EXCEL® and analyzed in SPSS 20®. Results: The mean age was 27.8 years; patients were mainly men (96%), with 93% of injuries caused by stab wounds and 7% by gunshot wounds. Hemodynamic status on admission (according to Ivatury) was normal in 44%, profound shock in 34%, agonal in 18% and fatal in 3%. Cardiac tamponade was present in 67% (n=161). The grades of cardiac injury according to the OIS-AAST classification were: grade II 33%, grade III 13%, grade IV 29%, grade V 22% and grade VI 3%. The pericardial window was the confirmatory diagnostic method in 63% of cases, and the surgical approach incisions were sternotomy in 63% and anterolateral thoracotomy in 35%. Mortality was 15% (n=36). The differences in mortality by hemodynamic status at the start of surgery, mechanism of injury and wound grade were statistically significant (p<0.001). Conclusions: Hemodynamic status and gunshot wounds are factors associated with mortality. The subxiphoid pericardial window favors the preference for, and the good results of, sternotomy as the surgical approach.

Relevance: 80.00%

Abstract:

Expanding the offer of higher education is a basic need of developed and emerging countries, and it requires growing, ongoing investments. Offering higher education through Internet-based Distance Learning is one of the most efficient ways to massify this offer, as it allows wide coverage at lower cost. In this scenario, we highlight Moodle, an open and low-cost environment for Distance Learning. Its use can be amplified through the adoption of an emerging Information and Communication Technology (ICT), Cloud Computing, which allows the virtualization of Moodle sites, cutting costs, simplifying management and increasing service capacity. This article presents a public, open and free tool for the automatic conversion of Moodle sites so that they can be hosted on Azure, Microsoft's Cloud Computing environment.

Relevance: 80.00%

Abstract:

Graduate program in Restorative Dentistry - ICT

Relevance: 80.00%

Abstract:

Technological developments in biomedical microsystems are opening up new opportunities to improve healthcare procedures. Swallowable diagnostic capsules are an example of this. In this paper, a diagnostic capsule technology is described based on direct-access sensing of the gastrointestinal (GI) fluids throughout the GI tract. The objective of this paper is two-fold: i) to develop a packaging method for a direct-access sensor, and ii) to develop an encapsulation method to protect the system electronics. The integrity of the interconnection after sensor packaging and encapsulation is correlated with its reliability and is therefore important. The zero-level packaging of the sensor was achieved using a so-called Flip Chip Over Hole (FCOH) method. This allowed the fluidic sensing media to interface with the sensor, while the rest of the chip, including the electrical connections, can be insulated effectively. Initial tests using Anisotropic Conductive Adhesive (ACA) interconnects for the FCOH demonstrated good electrical connections and functionality of the sensor chip. A preliminary encapsulation trial of the flip-chipped sensor on a flexible test substrate was also carried out and showed that silicone encapsulation of the system is a viable option.