899 results for Integrated circuits Very large scale integration Design and construction.
Abstract:
The supply voltage decrease and power consumption increase of modern ICs have made the requirements for low voltage fluctuation caused by packaging and on-chip parasitic impedances more difficult to achieve. Most research in the area assumes that all nodes of the chip are fed at the same voltage, so that the main cause of disturbance or fluctuation is the parasitic impedance of the packaging. In this paper, an approach to analyzing the effect of high and fast current demands on the on-chip power supply network is presented. First, an approach to modeling the entire network by considering a homogeneous conductive foil is presented. The modification of the timing parameters of flip-flops caused by spatial voltage drops across the IC surface is also investigated.
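A minimal sketch of the general idea (not the authors' model): the homogeneous conductive foil can be discretized as a uniform resistive grid and solved for static IR drop under a localized current demand. The grid size, segment resistance, supply voltage, and sink current below are all illustrative assumptions.

```python
import numpy as np

# Assumed parameters: a 10x10 grid of power-rail nodes, uniform segment
# resistance approximating a homogeneous conductive foil.
N = 10          # grid is N x N nodes
R = 0.05        # ohms per grid segment (assumed)
VDD = 1.0       # supply voltage (assumed)

def idx(r, c):
    return r * N + c

# Build the nodal conductance (Laplacian) matrix of the grid.
G = np.zeros((N * N, N * N))
for r in range(N):
    for c in range(N):
        for dr, dc in ((0, 1), (1, 0)):
            r2, c2 = r + dr, c + dc
            if r2 < N and c2 < N:
                a, b = idx(r, c), idx(r2, c2)
                g = 1.0 / R
                G[a, a] += g
                G[b, b] += g
                G[a, b] -= g
                G[b, a] -= g

# Current demand: a localized 20 mA sink near the grid centre (assumed).
i = np.zeros(N * N)
i[idx(N // 2, N // 2)] = -0.020

# Tie the corner node to VDD (an ideal package pin here) by replacing
# its nodal equation with v[0] = VDD.
G[0, :] = 0.0
G[0, 0] = 1.0
i[0] = VDD

v = np.linalg.solve(G, i)
drop = VDD - v.min()
print(f"worst-case static IR drop: {drop * 1000:.2f} mV")
```

A real analysis would add the package impedance and time-varying current waveforms; this only shows the spatial (DC) part of the voltage-drop picture.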
Abstract:
Process variations are a major bottleneck for the manufacturability and yield of digital CMOS integrated circuits. That is why regular techniques with different degrees of regularity are emerging as possible solutions. Our proposal is a new regular layout design technique called Via-Configurable Transistors Array (VCTA) that pushes circuit layout regularity for devices and interconnects to the limit in order to maximize the benefits of regularity. VCTA is predicted to perform worse than Standard Cell designs for a given technology node, but it will allow a future technology to be used at an earlier time. Our objective is to optimize VCTA so that it is comparable to Standard Cell design in an older technology. Simulations of delay and energy consumption for the first, unoptimized version of our VCTA applied to a Full Adder circuit in the 90 nm technology node are presented, together with extrapolations for Carry-Ripple Adders from 4 bits to 64 bits.
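The extrapolation from a single Full Adder to wider Carry-Ripple Adders rests on a simple structural fact: the critical path is the carry chain, so worst-case delay grows roughly linearly with word width. A first-order sketch (the per-stage delay is an illustrative assumption, not the paper's measured value):

```python
# Hypothetical first-order extrapolation: ripple-carry adder delay grows
# linearly with bit width, since the carry chain is the critical path.
def rca_delay(bits, fa_delay_ps):
    """Worst-case delay of an n-bit ripple-carry adder, assuming the
    critical path is the carry propagating through every full adder."""
    return bits * fa_delay_ps

FA_DELAY = 35.0   # ps per full-adder carry stage (illustrative, 90 nm-like)
for bits in (4, 8, 16, 32, 64):
    print(f"{bits:2d}-bit RCA: ~{rca_delay(bits, FA_DELAY):.0f} ps")
```

The same linear scaling argument applies per-stage to energy, which is why a single measured Full Adder characterization is enough to extrapolate the family.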
Abstract:
The purpose of this investigation was to develop new techniques to generate segmental assessments of body composition based on Segmental Bioelectrical Impedance Analysis (SBIA). An equally important consideration was the design, simulation, development, and software and hardware integration of the SBIA system. This integration was carried out with a Very Large Scale Integration (VLSI) Field Programmable Gate Array (FPGA) microcontroller that analyzed the measurements obtained from segments of the body and provided full-body and segmental Fat Free Mass (FFM) and Fat Mass (FM) percentages. Issues related to estimating body composition in persons with spinal cord injury (SCI) were also addressed and investigated. The investigation demonstrated that the SBIA methodology provided accurate segmental body composition measurements. Disabled individuals are expected to benefit from these SBIA evaluations, as they are non-invasive methods suitable for paralyzed individuals. The SBIA VLSI system may replace bulky, non-flexible electronic modules attached to human bodies.
Abstract:
Internet Protocol Television (IPTV) is a system in which a digital television service is delivered using the Internet Protocol over a network infrastructure. There is considerable confusion and concern about IPTV, since two different technologies have to be merged to provide end customers with something better than conventional television. In this research, the functional architecture of the IPTV system was investigated. A Very Large Scale Integration based system for a streaming server controller was designed, and different ways of hosting a web server that can be used to send control signals to the streaming server controller were studied. The web server accepts inputs from the keyboard and FPGA board switches and, depending on the preset configuration, opens a selected web page and also sends the control signals to the streaming server controller. It was observed that the applications run faster on the PowerPC, since it is embedded into the FPGA. The commercial market and global deployment of IPTV are also discussed.
Abstract:
This paper presents a probabilistic approach to modeling the problem of power supply voltage fluctuations. Error probability calculations are shown for some 90-nm technology digital circuits. The analysis gives the timing violation error probability as a new design quality factor, in contrast to conventional techniques that assume the full perfection of the circuit. The evaluation of the error bound can be useful for new design paradigms in which retry and self-recovering techniques are applied to the design of high-performance processors. The method described here makes it possible to evaluate the performance of these techniques by calculating the expected error probability in terms of power supply distribution quality.
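A minimal sketch of how a timing-violation probability of this kind can be computed (not the paper's actual model): treat the supply voltage seen by the critical path as Gaussian, map voltage to path delay with a simple first-order model, and report the probability that the delay exceeds the clock period. Every constant below is an assumption for illustration.

```python
import math

# Assumed parameters (illustrative only).
VDD, SIGMA = 1.0, 0.03        # mean supply and fluctuation std dev (V)
VTH = 0.35                    # threshold voltage (V)
D0 = 0.9e-9                   # path delay at nominal supply (s)
TCLK = 1.0e-9                 # clock period (s)

def path_delay(v):
    # First-order model: delay grows as the supply droops toward VTH.
    return D0 * (VDD - VTH) / (v - VTH)

# Critical supply voltage at which the delay just equals the clock period.
v_crit = VTH + D0 * (VDD - VTH) / TCLK

def gauss_cdf(x, mu, sigma):
    # Standard Gaussian CDF via the error function.
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

# Per-cycle timing-violation probability: supply below the critical level.
p_err = gauss_cdf(v_crit, VDD, SIGMA)
print(f"v_crit = {v_crit:.3f} V, P(timing error) = {p_err:.3e}")
```

The useful property of such a formulation is exactly the one the abstract names: the error probability becomes a quantitative design factor that can be traded against power supply distribution quality (here, SIGMA).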
Abstract:
To tackle the challenges of circuit-level and system-level VLSI and embedded system design, this dissertation proposes several novel algorithms to explore efficient solutions. At the circuit level, a new reliability-driven minimum-cost Steiner routing and layer assignment scheme is proposed, along with the first transceiver insertion algorithmic framework for optical interconnect. At the system level, a reliability-driven task scheduling scheme for multiprocessor real-time embedded systems is proposed, which optimizes system energy consumption under stochastic fault occurrences. Embedded system design is also widely used in the smart home area for improving health, wellbeing, and quality of life. The proposed scheduling scheme for multiprocessor embedded systems is therefore extended to handle energy consumption scheduling for smart homes. The extended scheme can schedule household appliance operation to minimize a customer's monetary expense based on a time-varying pricing model.
Abstract:
Voltage fluctuations caused by parasitic impedances in the power supply rails are a major concern in modern ICs. These fluctuations spread to the various nodes of the internal sections, causing two effects: a degradation of performance, mainly impacting gate delays, and a noisy contamination of the quiescent levels of the logic that drives the node. Both effects are presented together in this paper, showing that both are a cause of errors in modern and future digital circuits. The paper groups both error mechanisms and shows how the global error rate is related to the voltage deviation and the clock period of the digital system.
Abstract:
The European Union continues to exert a large influence on the direction of member states' energy policy. The 2020 targets for renewable energy integration have had a significant impact on the operation of current power systems, forcing a rapid change from fossil fuel dominated systems to those with high levels of renewable power. Additionally, the overarching aim of an internal energy market throughout Europe has placed, and will continue to place, importance on multi-jurisdictional co-operation regarding energy supply. Combining these renewable energy and multi-jurisdictional supply goals results in a complicated multi-vector energy system, where understanding the interactions between fossil fuels, renewable energy, interconnection and economic power system operation is increasingly important. This paper provides a novel and systematic methodology to fully understand the changing dynamics of interconnected energy systems from a gas and power perspective. A fully realistic unit commitment and economic dispatch model of the 2030 power systems in Great Britain and Ireland, combined with a representative gas transmission energy flow model, is developed. The importance of multi-jurisdictional integrated energy system operation in one of the most strategically important renewable energy regions is demonstrated.
Abstract:
High-throughput technologies are now used to generate more than one type of data from the same biological samples. To properly integrate such data, we propose using co-modules, which describe coherent patterns across paired data sets, and conceive several modular methods for their identification. We first test these methods using in silico data, demonstrating that the integrative scheme of our Ping-Pong Algorithm uncovers drug-gene associations more accurately when considering noisy or complex data. Second, we provide an extensive comparative study using the gene-expression and drug-response data from the NCI-60 cell lines. Using information from the DrugBank and Connectivity Map databases, we show that the Ping-Pong Algorithm predicts drug-gene associations significantly better than other methods. Co-modules provide insights into possible mechanisms of action for a wide range of drugs and suggest new targets for therapy.
Abstract:
This is an observational study of the large-scale moisture transport over South America, with some analyses of its relation to subtropical rainfall. The concept of aerial rivers is proposed as a framework: an analogy between the main pathways of moisture flow in the atmosphere and surface rivers. Unlike surface rivers, aerial rivers gain (lose) water through evaporation (precipitation). The magnitude of the vertically integrated moisture transport is the discharge, and precipitable water is analogous to the mass of the liquid column: multiplied by an equivalent speed, it gives the discharge. The trade wind flow into Amazonia, and the north/northwesterly flow to the subtropics east of the Andes, are aerial rivers. Aerial lakes are the sections of a moisture pathway where the flow slows down and broadens because of diffluence, and becomes deeper, with higher precipitable water. This is the case over Amazonia, downstream of the trade wind confluence. In the dry season, moisture from the aerial lake is transported northeastward, but a weaker flow over southern Amazonia heads southward toward the subtropics. Southern Amazonia appears as a source of moisture for this flow. Aerial river discharge to the subtropics is comparable to that of the Amazon River. Variations in the amount of moisture coming from Amazonia have an important effect on the variability of this discharge. Correlations between the flow from Amazonia and subtropical rainfall are not strong. However, in some of the dry-season months observed, a strong increase (decrease) in the flow occurred together with an important increase (decrease) in subtropical rainfall.
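The discharge analogy above reduces to a one-line calculation: discharge = precipitable water (as an equivalent liquid depth) x equivalent transport speed x flow width. A back-of-the-envelope illustration with assumed, order-of-magnitude values (not the study's measured figures):

```python
# All three inputs are illustrative assumptions, not the paper's data.
PRECIPITABLE_WATER = 0.05   # m of liquid water in the column (~50 mm)
EQUIV_SPEED = 5.0           # m/s, effective moisture transport speed
FLOW_WIDTH = 1.0e6          # m, width of the aerial river (~1000 km)

# discharge = depth x speed x width, in m^3/s
discharge = PRECIPITABLE_WATER * EQUIV_SPEED * FLOW_WIDTH
print(f"aerial-river discharge ~ {discharge:,.0f} m^3/s")
```

With these assumed values the result lands in the 1e5 m^3/s range, which is why the comparison with the Amazon River's discharge (also of order 1e5 m^3/s) is plausible.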
Abstract:
This study is concerned with quality and productivity aspects of traditional house building. The research focuses on these issues by concentrating on the services and finishing stages of the building process. These are work stages which have not been fully investigated in previous productivity related studies. The primary objective of the research is to promote an integrated design and construction led approach to traditional house building based on an original concept of 'development cycles'. This process involves the following: site monitoring; the analysis of work operations; implementing design and construction changes founded on unique information collected during site monitoring; and subsequent re-monitoring to measure and assess the effect of change. A volume house building firm has been involved in this applied research and has allowed access to its sites for production monitoring purposes. The firm also assisted in design detailing for a small group of 'experimental' production houses where various design and construction changes were implemented. Results from the collaborative research have shown certain quality and productivity improvements to be possible using this approach, albeit on a limited scale at this early experimental stage. The improvements have been possible because an improved activity sampling technique, developed for and employed by the study, has been able to describe why many quality and productivity related problems occur during site building work. Experience derived from the research has shown the following attributes to be important: positive attitudes towards innovation; effective communication; careful planning and organisation; and good coordination and control at site level. These are all essential aspects of quality led management and determine to a large extent the overall success of this approach.
Future work recommendations must include a more widespread use of innovative practices so that further design and construction modifications can be made. By doing this, productivity can be improved, cost savings made and better quality afforded.
Abstract:
This paper describes the development and testing of a robotic capsule for search and rescue operations at sea. This capsule is able to operate autonomously or remotely controlled, is transported and deployed by a larger USV into a determined disaster area and is used to carry a life raft and inflate it close to survivors in large-scale maritime disasters. The ultimate goal of this development is to endow search and rescue teams with tools that extend their operational capability in scenarios with adverse atmospheric or maritime conditions.
Abstract:
This paper considers the potential contribution of secondary quantitative analyses of large scale surveys to the investigation of 'other' childhoods. Exploring other childhoods involves investigating the experience of young people who are unequally positioned in relation to multiple, embodied, identity locations, such as (dis)ability, 'class', gender, sexuality, ethnicity and race. Despite some possible advantages of utilising extensive databases, the paper outlines a number of methodological problems with existing surveys which tend to reinforce adultist and broader hierarchical social relations. It is contended that scholars of children's geographies could overcome some of these problematic aspects of secondary data sources by endeavouring to transform the research relations of large scale surveys. Such endeavours would present new theoretical, ethical and methodological complexities, which are briefly considered.
Abstract:
Advances in hardware and software technology enable us to collect, store and distribute large quantities of data on a very large scale. Automatically discovering and extracting hidden knowledge in the form of patterns from these large data volumes is known as data mining. Data mining technology is not only a part of business intelligence, but is also used in many other application areas such as research, marketing and financial analytics. For example, medical scientists can use patterns extracted from historic patient data to determine whether a new patient is likely to respond positively to a particular treatment; marketing analysts can use patterns extracted from customer data for future advertising campaigns; and finance experts are interested in patterns that forecast the development of certain stock market shares for investment recommendations. However, extracting knowledge in the form of patterns from massive data volumes imposes a number of computational challenges in terms of processing time, memory, bandwidth and power consumption. These challenges have led to the development of parallel and distributed data analysis approaches and the utilisation of Grid and Cloud computing. This chapter gives an overview of parallel and distributed computing approaches and how they can be used to scale up data mining to large datasets.
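A minimal sketch of the data-partitioned parallelism the chapter surveys (an illustration of the general approach, not a specific system from the text): each worker counts co-occurring item pairs, a basic frequent-pattern statistic, in its own partition of the transactions, and the partial counts are merged. All item names are made up.

```python
from collections import Counter
from itertools import combinations
from multiprocessing import Pool

def count_pairs(transactions):
    """Count co-occurring item pairs in one data partition."""
    counts = Counter()
    for t in transactions:
        counts.update(combinations(sorted(set(t)), 2))
    return counts

def parallel_pair_counts(transactions, n_workers=2):
    """Split the data, count pairs per partition in parallel, merge."""
    chunks = [transactions[i::n_workers] for i in range(n_workers)]
    with Pool(n_workers) as pool:
        partials = pool.map(count_pairs, chunks)
    total = Counter()
    for partial in partials:
        total.update(partial)
    return total

if __name__ == "__main__":
    data = [["milk", "bread"], ["milk", "eggs", "bread"], ["bread", "eggs"]]
    print(parallel_pair_counts(data).most_common(2))
```

The key property is that the per-partition counts are additive, so the merge step is cheap; this is the same decomposition that lets such algorithms scale out over Grid or Cloud nodes.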
Abstract:
The circulation at the Eastern Brazilian Shelf (EBS), near 13 degrees S, is discussed in terms of the currents and hydrography, associating large-scale circulation, transient and local processes to establish a regional picture of the EBS circulation. The results show that the circulation within the continental shelf and slope region is strongly affected by the seasonal changes in the wind field and meso/large-scale circulation. Transient processes associated with the passage of Cold Front systems or meso-scale activity, together with the presence of a local canyon, add more complexity to the system. During the austral spring and summer seasons, the prevailing upwelling-favorable winds blowing from E-NE were responsible for driving southwestward shelf currents. The interaction with the Western Boundary Current (the Brazil Current), especially during summer, was significant, and a considerable vertical shear in the velocity field was observed at the outer shelf. The passage of a Cold Front system during the springtime caused a complete reversal of the mean flow and contributed to the deepening of the Mixed Layer Depth (MLD). In addition, the presence of Salvador Canyon, subject to an upwelling-favorable boundary current, enhanced the upwelling system when compared with the upwelling observed at the adjacent shelf. During the austral autumn and winter seasons, the prevailing downwelling-favorable winds blowing from the SE acted to completely reverse the shelf circulation, resulting in a northeastward flow. The passage of a strong Cold Front during the autumn season contributed not only to the strengthening of the flow but also to the deepening of the MLD. The presence of the Salvador Canyon, when subject to a downwelling-favorable boundary current, caused an intensification of the downwelling process. Interestingly, the alongshore velocity at the shelf region adjacent to the head of the canyon was less affected when compared with the upwelling situation.