935 results for Transceiver architectures
Abstract:
Dissertation presented at the Faculdade de Ciências e Tecnologias da Universidade Nova de Lisboa to obtain the Master's degree in Informatics Engineering
Abstract:
The evolution of the electrical grid into a smart grid, allowing user production, storage and exchange of energy, remote control of appliances, and in general optimizations over how energy is managed and consumed, is also an evolution into a complex Information and Communication Technology (ICT) system. With the goal of promoting an integrated and interoperable smart grid, a number of organizations all over the world started uncoordinated standardization activities, which caused the emergence of a large number of incompatible architectures and standards. There are now new standardization activities whose goal is to organize existing standards and produce best practices for choosing the right approach(es) to be employed in specific smart grid designs. This paper follows the lead of the NIST and ETSI/CEN/CENELEC approaches in trying to provide a taxonomy of existing solutions; our contribution reviews and relates the current ICT state of the art, with the objective of forecasting future trends based on the orientation of current efforts and on the relationships between them. The resulting taxonomy provides guidelines for further studies of the architectures, and highlights how the standards in the last mile of the smart grid are converging to common solutions to improve ICT infrastructure interoperability.
Abstract:
As of today, AUTOSAR is the de facto standard in the automotive industry, providing a common software architecture and development process for automotive applications. While this standard was originally written for single-core Electronic Control Units (ECUs), new guidelines and recommendations have recently been added to provide support for multicore architectures. This update came as a response to the steady increase in the number and complexity of the software functions embedded in modern vehicles, which calls for the computing power of multicore execution environments. In this paper, we enumerate and analyze the design options and the challenges of porting AUTOSAR-based automotive applications onto multicore platforms. In particular, we investigate those options when considering the emerging many-core architectures that provide a more scalable environment than traditional multicore systems. Such platforms can enable massively parallel execution, and their design is better suited to partitioning and isolating the software components.
Abstract:
Euromicro Conference on Digital System Design (DSD 2015), Funchal, Portugal.
Abstract:
Dissertation presented to obtain the Ph.D degree in Chemistry
Abstract:
Distributed real-time systems such as automotive applications are becoming larger and more complex, thus requiring the use of more powerful hardware and software architectures. Furthermore, these distributed applications commonly have stringent real-time constraints, which implies that such applications would gain in flexibility if they were parallelized and distributed over the system. In this paper, we consider the problem of allocating fixed-priority fork-join Parallel/Distributed real-time tasks onto distributed multi-core nodes connected through a Flexible Time-Triggered Switched Ethernet network. We analyze the system requirements and present a set of formulations based on a constraint programming approach. Constraint programming allows us to express the relations between variables in the form of constraints. In contrast to approaches based on heuristics, our approach is guaranteed to find a feasible solution if one exists. Furthermore, approaches based on constraint programming have been shown to obtain solutions for this type of formulation in reasonable time.
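A formulation in this spirit can be prototyped with an off-the-shelf constraint programming solver. The sketch below is a deliberately reduced illustration, not the paper's formulation: it assumes Google's OR-tools CP-SAT solver (the abstract names no solver), hypothetical task utilizations, and a simple per-core utilization bound standing in for the real schedulability and Switched Ethernet constraints.

# Minimal constraint-programming sketch: allocate tasks to cores.
# Assumptions: OR-tools CP-SAT, hypothetical utilizations, and a utilization
# bound as a stand-in for the paper's real-time and network constraints.
from ortools.sat.python import cp_model

tasks = {"t1": 30, "t2": 50, "t3": 20, "t4": 40}   # utilization (%), hypothetical
cores = ["c0", "c1"]
CAPACITY = 100                                      # per-core utilization bound

model = cp_model.CpModel()
# x[t, c] == 1 iff task t is allocated to core c.
x = {(t, c): model.NewBoolVar(f"{t}_on_{c}") for t in tasks for c in cores}

for t in tasks:                                     # each task runs on exactly one core
    model.Add(sum(x[t, c] for c in cores) == 1)
for c in cores:                                     # keep each core within its capacity
    model.Add(sum(tasks[t] * x[t, c] for t in tasks) <= CAPACITY)

solver = cp_model.CpSolver()
if solver.Solve(model) in (cp_model.OPTIMAL, cp_model.FEASIBLE):
    for t in tasks:
        print(t, "->", next(c for c in cores if solver.Value(x[t, c])))
else:
    print("no feasible allocation")

As in the paper's approach, the solver either returns an allocation satisfying every constraint or proves that none exists, which heuristic approaches cannot guarantee.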
Abstract:
3rd Workshop on High-performance and Real-time Embedded Systems (HIRES 2015), 21 January 2015, Amsterdam, Netherlands.
Abstract:
Presented at INForum - Simpósio de Informática (INFORUM 2015), 7-8 September 2015, Portugal.
Abstract:
HERITAGE 2008 - World Heritage and Sustainable Development. Barcelos: Green Lines Institute for Sustainable Development, Vol. 2, p. 571-579
Abstract:
When representing the requirements for an intended software solution during the development process, a logical architecture is a model that provides an organized vision of how functionalities behave regardless of the technologies to be implemented. If the logical architecture represents an ambient assisted living (AAL) ecosystem, such representation is a complex task due to the existence of multiple interrelated domains, which, most of the time, results in incomplete and incoherent user requirements. In this chapter, we present the results obtained when applying process-level modeling techniques to the derivation of the logical architecture for a real industrial AAL project. We adopt a V-Model–based approach that expresses the AAL requirements in a process-level perspective, instead of the traditional product-level view. Additionally, we ensure compliance of the derived logical architecture with the National Institute of Standards and Technology (NIST) reference architecture as nonfunctional requirements to support the implementation of the AAL architecture in cloud contexts.
Abstract:
Architectural (bad) smells are design decisions found in software architectures that degrade the ability of systems to evolve. This paper presents an approach to verifying that a software architecture is smell-free using the Archery architectural description language. The language provides a core for modelling software architectures and an extension for specifying constraints. The approach consists of precisely specifying architectural smells as constraints, and then verifying that software architectures do not satisfy any of them. The constraint language is based on a propositional modal logic with recursion that includes a converse operator for relations among architectural concepts, graded modalities for describing the cardinality of such relations, and nominals referencing architectural elements. Four architectural smells illustrate the approach.
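As a toy illustration of the verification pattern, phrase a smell as a constraint and check that the architecture does not satisfy it, the Python sketch below encodes one classic smell, a dependency cycle between components, over a hypothetical component graph. It is not Archery and does not use the modal constraint language described in the paper; all names and dependencies are invented.

# Hypothetical architecture: components and their "uses" dependencies.
architecture = {
    "ui": ["business"],
    "business": ["data", "ui"],   # the back-edge to "ui" introduces a cycle
    "data": [],
}

def has_dependency_cycle(arch):
    """Smell predicate: True if the 'uses' relation contains a cycle."""
    visiting, done = set(), set()

    def visit(node):
        if node in done:
            return False
        if node in visiting:
            return True                      # back edge found: a cycle exists
        visiting.add(node)
        cyclic = any(visit(dep) for dep in arch.get(node, ()))
        visiting.discard(node)
        done.add(node)
        return cyclic

    return any(visit(component) for component in arch)

# The architecture is smell-free (for this smell) iff the constraint is not satisfied.
print("smell-free:", not has_dependency_cycle(architecture))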
Abstract:
Report for the scientific sojourn carried out at the Université Catholique de Louvain, Belgium, from March until June 2007. In the first part, the impact of important geometrical parameters such as source and drain thickness, fin spacing, spacer width, etc. on the parasitic fringing capacitance component of multiple-gate field-effect transistors (MuGFETs) is analyzed in depth using finite element simulations. Several architectures, such as single-gate, FinFET (double-gate) and triple-gate represented by Pi-gate MOSFETs, are simulated and compared in terms of channel and fringing capacitances for the same occupied die area. Simulations highlight the great impact of diminishing the spacing between fins for MuGFETs and the trade-off between the reduction of parasitic source and drain resistances and the increase of fringing capacitances when Selective Epitaxial Growth (SEG) technology is introduced. The impact of these technological solutions on the transistor cut-off frequencies is also discussed. The second part deals with the study of the effect of volume inversion (VI) on the capacitances of undoped Double-Gate (DG) MOSFETs. For that purpose, we present simulation results for the capacitances of undoped DG MOSFETs using an explicit and analytical compact model. It demonstrates that the transition from the volume inversion regime to dual-gate behaviour is well simulated. The model shows an accurate dependence on the silicon layer thickness, consistent with two-dimensional numerical simulations, for both thin and thick silicon films. Whereas the current drive and transconductance are enhanced in the volume inversion regime, our results show that intrinsic capacitances present higher values as well, which may limit the high-speed (delay time) behaviour of DG MOSFETs under the volume inversion regime.
Abstract:
The simultaneous use of multiple transmit and receive antennas can unleash very large capacity increases in rich multipath environments. Although such capacities can be approached by layered multi-antenna architectures with per-antenna rate control, the need for short-term feedback arises as a potential impediment, in particular as the number of antennas, and thus the number of rates to be controlled, increases. What we show, however, is that the need for short-term feedback in fact vanishes as the number of antennas and/or the diversity order increases. Specifically, the rate supported by each transmit antenna becomes deterministic and a sole function of the signal-to-noise ratio, the ratio of transmit and receive antennas, and the decoding order, all of which are either fixed or slowly varying. More generally, we illustrate, through this specific derivation, the relevance of some established random CDMA results to the single-user multi-antenna problem.
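The hardening claim can be observed numerically. The sketch below assumes an i.i.d. Rayleigh channel, equal per-antenna power, and MMSE reception with successive cancellation; these are assumptions of convenience, not a restatement of the paper's model. As the array grows, the rate of the first decoded layer concentrates around its mean, i.e. it becomes effectively deterministic.

# Sketch: per-layer MMSE-SIC rates concentrate as the array grows.
# Assumptions: i.i.d. Rayleigh channel, equal per-antenna power, fixed decoding order.
import numpy as np

def mmse_sic_layer_rates(H, snr):
    """Per-layer rates (bits/s/Hz) under MMSE with successive cancellation."""
    nr, nt = H.shape
    p = snr / nt                                       # equal power per transmit antenna
    rates = []
    for k in range(nt):
        Hk = H[:, k + 1:]                              # interference from undecoded layers
        R = np.eye(nr) + p * Hk @ Hk.conj().T
        sinr = p * np.real(H[:, k].conj() @ np.linalg.solve(R, H[:, k]))
        rates.append(np.log2(1 + sinr))
    return np.array(rates)

rng = np.random.default_rng(0)
snr = 10.0                                             # total SNR, linear scale (10 dB)
for n in (2, 8, 32):                                   # grow the array, keep nt/nr = 1
    first_layer = [
        mmse_sic_layer_rates(
            (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))) / np.sqrt(2),
            snr)[0]
        for _ in range(200)]
    print(n, "antennas: mean", np.mean(first_layer), "std", np.std(first_layer))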
Per-antenna rate and power control for MIMO layered architectures in the low- and high-power regimes
Abstract:
In a MIMO layered architecture, several codewords are transmitted from a multiplicity of antennas. Although the spectral efficiency is maximized if the rates of these codewords are separately controlled, the feedback rate within the link adaptation loop is reduced if they are constrained to be identical. This poses a direct tradeoff between performance and feedback overhead. This paper provides analytical expressions that quantify the difference in spectral efficiency between both approaches for arbitrary numbers of antennas. Specifically, the characterization takes place in the realm of the low- and high-power regimes via expansions that are shown to have a wide range of validity. In addition, the possibility of adjusting the transmit power of each codeword individually is considered as an alternative to the separate control of their rates. Power allocation, however, turns out to be inferior to rate control within the context of this problem.
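A rough numerical comparison of the two strategies is sketched below, using the same MMSE-SIC layer rates as in the previous sketch. It models the identical-rate scheme as being limited by the weakest layer; this is a simplification for illustration, not the paper's analytical low- and high-power characterization.

# Sketch: separate per-layer rate control vs. identical rates across layers.
# Same assumptions as above; the identical-rate model (weakest layer times the
# number of layers) is a simplification, not the paper's formulation.
import numpy as np

def mmse_sic_layer_rates(H, snr):
    nr, nt = H.shape
    p = snr / nt
    rates = []
    for k in range(nt):
        Hk = H[:, k + 1:]
        R = np.eye(nr) + p * Hk @ Hk.conj().T
        sinr = p * np.real(H[:, k].conj() @ np.linalg.solve(R, H[:, k]))
        rates.append(np.log2(1 + sinr))
    return np.array(rates)

rng = np.random.default_rng(1)
nt = nr = 4
for snr_db in (0, 10, 20):                             # from the low- to the high-power regime
    snr = 10 ** (snr_db / 10)
    H = (rng.standard_normal((nr, nt)) + 1j * rng.standard_normal((nr, nt))) / np.sqrt(2)
    r = mmse_sic_layer_rates(H, snr)
    print(f"{snr_db:2d} dB  separate rates: {r.sum():.2f}  identical rates: {nt * r.min():.2f}")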
Abstract:
A lot of research in cognition and decision making suffers from a lack of formalism. The quantum probability program could help to improve this situation, but we wonder whether it would provide even more added value if its presumed focus on outcome models were complemented by process models that are, ideally, informed by ecological analyses and integrated into cognitive architectures.