866 results for Core Sets
Abstract:
This work presents a Core Inflation measure for the Brazilian economy, obtained from the disaggregated data of the IPC-DI/FGV. The first part of the work reviews the concept, justifies its use in monetary policy, and surveys the various methodologies for its computation. The Core was computed by the Trimmed Means method, with a special treatment for selected sectors subject to adjustment costs. The indicator chosen, after being submitted to several tests, was the one with a 20% trim and with the point price changes of the selected sectors distributed over 12 months.
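To make the trimmed-means computation concrete, here is a minimal Python sketch of a weighted trimmed mean over one month's cross-section of item-level price changes; the function name, the interface and the symmetric 20% trim are illustrative, and the smoothing of the selected adjustment-cost sectors (distributing their point changes over 12 months) would be applied to the input series beforehand.

```python
import numpy as np

def weighted_trimmed_mean(changes, weights, trim=0.20):
    """Weighted trimmed mean of one month's item-level price changes.

    changes -- percentage changes of the individual items
    weights -- expenditure weights of the items (summing to 1)
    trim    -- total weight fraction removed (trim/2 from each tail)
    """
    order = np.argsort(changes)
    ch, w = np.asarray(changes, float)[order], np.asarray(weights, float)[order]
    cum = np.cumsum(w)                       # upper edge of each item's weight band
    lo, hi = trim / 2.0, 1.0 - trim / 2.0
    keep = (cum > lo) & (cum - w < hi)       # items overlapping the central window
    # clip boundary items so exactly (1 - trim) of the weight mass remains
    w_kept = np.minimum(cum[keep], hi) - np.maximum((cum - w)[keep], lo)
    return float(np.sum(ch[keep] * w_kept) / np.sum(w_kept))
```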
Abstract:
Electronic applications are currently developed under the reuse-based paradigm. This design methodology presents several advantages for the reduction of design complexity, but brings new challenges for the test of the final circuit. The access to embedded cores, the integration of several test methods, and the optimization of several cost factors are just a few of the problems that must be tackled during test planning. Within this context, this thesis proposes two test planning approaches that aim at reducing the test costs of a core-based system by means of hardware reuse and of the integration of test planning into the design flow. The first approach considers systems whose cores are connected directly or through a functional bus. The test planning method consists of a comprehensive model that includes the definition of a multi-mode access mechanism inside the chip and a search algorithm for the exploration of the design space. The access mechanism model considers the reuse of functional connections as well as partial test buses, core transparency, and other bypass modes. The test schedule is defined in conjunction with the access mechanism, so that good trade-offs among the costs of pins, area, and test time can be sought. Furthermore, system power constraints are also considered. This expansion of concerns makes an efficient, yet fine-grained, search of the huge design space of a reuse-based environment possible. Experimental results clearly show the variety of trade-offs that can be explored using the proposed model and its effectiveness in optimizing the system test plan. Networks-on-chip are likely to become the main communication platform of systems-on-chip. Thus, the second approach presented in this work proposes the reuse of the on-chip network for the test of the cores embedded in systems that use this communication platform. A power-aware test scheduling algorithm that exploits the network characteristics to minimize the system test time is presented. The reuse strategy is evaluated over a number of system configurations, such as different positions of the cores in the network, power consumption constraints, and the number of interfaces with the tester. Experimental results show that the parallelization capability of the network can be exploited to reduce the system test time, while area and pin overhead are strongly minimized. In this manuscript, the main problems of the test of core-based systems are first identified and the current solutions are discussed. The problems tackled by this thesis are then listed and the test planning approaches are detailed. Both test planning techniques are validated on the recently released ITC'02 SoC Test Benchmarks and are further compared to other test planning methods from the literature. This comparison confirms the efficiency of the proposed methods.
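As a rough illustration of power-constrained test scheduling, here is a simplified greedy sketch in Python; it is not the thesis's search algorithm or its access-mechanism model, and all names and parameters are hypothetical.

```python
def greedy_power_aware_schedule(cores, power_budget):
    """Schedule core tests so that concurrent peak power never exceeds the budget.

    cores -- list of (name, test_time, peak_power); each test alone is assumed
             to fit within power_budget, otherwise no schedule exists.
    Returns {name: start_time}.
    """
    pending = sorted(cores, key=lambda c: -c[1])   # longest tests first
    running = []                                   # (end_time, power) of active tests
    schedule, now = {}, 0.0
    while pending:
        used = sum(p for _, p in running)
        for i, (name, dur, power) in enumerate(pending):
            if used + power <= power_budget:       # fits: start it now
                schedule[name] = now
                running.append((now + dur, power))
                pending.pop(i)
                break
        else:                                      # nothing fits: wait for a finish
            now = min(end for end, _ in running)
            running = [(e, p) for e, p in running if e > now]
    return schedule
```

For example, `greedy_power_aware_schedule([("c1", 100, 2.0), ("c2", 80, 4.0), ("c3", 50, 1.5)], power_budget=5.0)` runs compatible tests concurrently and delays the rest.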
Abstract:
This thesis maps the network of intertextual relations in Half a Life (2001) and its sequel Magic Seeds (2004), the most recent novels by V. S. Naipaul, winner of the 2001 Nobel Prize in Literature, as a contribution to the study of the author's work. The notion of intertextuality pervades literary studies, and the term has been widely employed since it was coined by Julia Kristeva in the 1960s. Since then, the most varied, and often divergent, theories of intertextuality have shared the idea that a text only acquires its full meaning in interaction with other texts. The methodological approach proposed here is based on Gérard Genette's theory of transtextuality. This choice entails the study of the intertexts, paratexts, metatexts, architexts and hypertexts that constitute the interface between the two novels and other writings. The protagonist's name, "William Somerset Chandran", is the thread that guides the study of the various transtextual relations in the two novels. Starting from the protagonist's first name, William, this study places the novels within the tradition of the Bildungsroman and argues that they establish an architextual parody of the genre insofar as they subvert its core, namely the formation of the protagonist's character. The protagonist's middle name, Somerset, points to the fictionalization of the writer Somerset Maugham within the narrative, which at the same time demystifies the Western view of Hinduism popularized by Maugham in The Razor's Edge. The protagonist's surname, Chandran, leads to the study of the set of references to Naipaul's Indian origin and its role in the author's production. This name refers to Narayan's novel The Bachelor of Arts, whose protagonist is also named Chandran. Narayan is a leading writer in Anglo-Indian literature and a recurring reference in Naipaul's work. The themes of migration and culture shock presented in the two novels have been a constant presence in Naipaul's work. This research maps the relation of continuity between the two novels in question and Naipaul's work as a whole, highlighting the role of the narrative's geographical setting, marked by the protagonist's journey across three continents. The theory of transtextuality is an operational tool for the research, which examines the density of the geographical, historical and literary references in Half a Life and Magic Seeds, aiming to contribute elements to the study of Naipaul's literary production, insofar as these recent novels condense and revisit the author's world view.
Abstract:
We prove non-emptiness of the alpha-core for balanced games with non-ordered preferences, extending and generalizing in several respects the results of Scarf (1971), Border (1984), Florenzano (1989), Yannelis (1991) and Kajii (1992). In particular, we answer an open question in Kajii (1992) regarding the applicability of the non-emptiness results to models with infinite-dimensional strategy spaces. We provide two models, with Knightian and voting preferences, to which the results of Scarf (1971) and Kajii (1992) cannot be applied but our non-emptiness result does.
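For reference, the balancedness condition in Scarf's sense, which results of this kind assume, can be stated as follows (a standard textbook definition, not a quotation from the paper):

```latex
% A family \mathcal{B} of coalitions S \subseteq N is balanced if there exist
% weights \lambda_S \ge 0, S \in \mathcal{B}, such that for every player i:
\sum_{S \in \mathcal{B} \,:\, i \in S} \lambda_S \;=\; 1 .
```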
Abstract:
The work described in this thesis aims to support the distributed design of integrated systems and considers specifically the need for collaborative interaction among designers. Particular emphasis was given to issues that were only marginally considered in previous approaches, such as the abstraction of the distribution of design automation resources over the network, the possibility of both synchronous and asynchronous interaction among designers, and the support for extensible design data models. Such issues demand a rather complex software infrastructure, as possible solutions must encompass a wide range of software modules: from user interfaces to middleware to databases. To build such a structure, several engineering techniques were employed and some original solutions were devised. The core of the proposed solution is based on the joint application of two homonymous technologies: CAD Frameworks and object-oriented frameworks. The former concept was coined in the late 1980s within the electronic design automation community and comprehends a layered software environment that aims to support CAD tool developers, CAD administrators/integrators, and designers. The latter, developed during the last decade by the software engineering community, is a software architecture model for building extensible and reusable object-oriented software subsystems. In this work, we propose an object-oriented framework that includes extensible sets of design data primitives and design tool building blocks. This object-oriented framework is included within a CAD Framework, where it plays important roles in typical CAD Framework services such as design data representation and management, versioning, user interfaces, design management, and tool integration. The implemented CAD Framework, named Cave2, follows the classical layered architecture presented by Barnes, Harrison, Newton and Spickelmier, but the possibilities granted by the object-oriented framework foundations allowed a series of improvements that were not available in previous approaches:
- Object-oriented frameworks are extensible by design, so the same holds for the implemented sets of design data primitives and design tool building blocks. This means that both the design representation model and the software modules dealing with it can be upgraded or adapted to a particular design methodology, and that such extensions and adaptations still inherit the architectural and functional aspects implemented in the object-oriented framework foundation.
- The design semantics and the design visualization are both part of the object-oriented framework, but in clearly separated models. This allows for different visualization strategies for a given design data set, which gives collaborating parties the flexibility to choose individual visualization settings.
- The control of consistency between semantics and visualization, a particularly important issue in a design environment with multiple views of a single design, is also included in the foundations of the object-oriented framework. This mechanism is generic enough to be used by further extensions of the design data model, as it is based on the inversion of control between view and semantics: the view receives the user input and propagates the event to the semantic model, which evaluates whether a state change is possible; if so, it triggers the change of state of both semantics and view. Our approach took advantage of this inversion of control and included a layer between semantics and view to take into account the possibility of multi-view consistency.
- To optimize the consistency control mechanism between views and semantics, we propose an event-based approach that captures each discrete interaction of a designer with his/her respective design views. The information about each interaction is encapsulated in an event object, which may be propagated to the design semantics, and thus to other possible views, according to the consistency policy in use. Furthermore, the use of event pools allows for a late synchronization between view and semantics in case of unavailability of a network connection between them.
- The use of proxy objects significantly raised the abstraction of the integration of design automation resources, as both remote and local tools and services are accessed through method calls on a local object. The connection to remote tools and services using a look-up protocol also completely abstracts the network location of such resources, allowing for resource addition and removal at runtime.
- The implemented CAD Framework is completely based on Java technology, so it relies on the Java Virtual Machine as the layer that grants independence between the CAD Framework and the operating system.
All these improvements contributed to a higher abstraction of the distribution of design automation resources and also introduced a new paradigm for remote interaction between designers. The resulting CAD Framework is able to support fine-grained collaboration based on events, so every single design update performed by a designer can be propagated to the rest of the design team regardless of their location in the distributed environment. This can increase group awareness and allow a richer transfer of experiences among designers, significantly improving the collaboration potential when compared to previously proposed file-based or record-based approaches. Three case studies were conducted to validate the proposed approach, each one focusing on a subset of the contributions of this thesis. The first uses the proxy-based resource distribution architecture to implement a prototyping platform using reconfigurable hardware modules. The second extends the foundations of the implemented object-oriented framework to support interface-based design; these extensions, design representation primitives and tool blocks, are used to implement a design entry tool named IBlaDe, which allows the collaborative creation of functional and structural models of integrated systems. The third case study regards the possibility of integrating multimedia metadata into the design data model; this possibility is explored in the frame of an online educational and training platform.
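A minimal Python sketch of the described inversion of control between view and semantics, with events propagated to every attached view; the class and method names are illustrative and do not reproduce the Cave2 API.

```python
class SemanticModel:
    """Owns the design state; views never mutate it directly."""

    def __init__(self):
        self._views, self._state = [], {}

    def attach(self, view):
        self._views.append(view)

    def handle(self, event):
        # Inversion of control: the view forwards the user event here and
        # only the semantic model decides whether the state may change.
        if self._is_legal(event):
            self._state[event["target"]] = event["value"]
            for view in self._views:          # multi-view consistency
                view.refresh(event)

    def _is_legal(self, event):
        return True                           # placeholder consistency rule


class View:
    def __init__(self, model):
        self.model = model
        model.attach(self)

    def on_user_input(self, target, value):
        # The view changes nothing itself; it only emits an event object.
        self.model.handle({"target": target, "value": value})

    def refresh(self, event):
        print(f"redraw {event['target']} -> {event['value']}")
```

With `model = SemanticModel()` and two views `a, b = View(model), View(model)`, a call to `a.on_user_input("net1", "routed")` updates the state once and refreshes both views.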
Abstract:
In recent years, many central banks have adopted inflation targeting policies, starting an intense debate about which measure of inflation to adopt. The literature on core inflation has tried to develop indicators of inflation that respond only to "significant" changes in inflation. This paper defines a measure of core inflation as the common trend of prices in a multivariate dynamic model that has, by construction, three properties: it filters idiosyncratic noise, it filters transitory macroeconomic noise, and it leads the future level of headline inflation. We also show that the popular trimmed mean estimator of core inflation can be regarded as a proxy for the ideal GLS estimator for heteroskedastic data. We employ an asymmetric trimmed mean estimator to take into account the possible skewness of the distribution, and we obtain an unconditional measure of core inflation.
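The asymmetric variant differs from a symmetric trim only in removing different weight fractions from each tail of the cross-sectional distribution; a minimal Python sketch follows (the trim fractions below are placeholders, not the paper's estimates).

```python
import numpy as np

def asymmetric_trimmed_mean(changes, weights, trim_low=0.15, trim_high=0.25):
    """Weighted trimmed mean removing trim_low of the weight mass from the
    lower tail and trim_high from the upper tail (handles skewness)."""
    order = np.argsort(changes)
    ch, w = np.asarray(changes, float)[order], np.asarray(weights, float)[order]
    cum = np.cumsum(w)
    lo, hi = trim_low, 1.0 - trim_high
    keep = (cum > lo) & (cum - w < hi)
    w_kept = np.minimum(cum[keep], hi) - np.maximum((cum - w)[keep], lo)
    return float(np.sum(ch[keep] * w_kept) / np.sum(w_kept))
```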
Abstract:
The World Cup has become the most streamed live sporting event in the US, as Americans tune in to this year's tournament on their smartphones, tablets and computers in record numbers.
Abstract:
We give a thorough account of the various equivalent notions of "sheaf" on a locale, namely the separated and complete presheaves, the local homeomorphisms, and the local sets, and we provide a new approach based on quantale modules whereby we see that sheaves can be identified with certain Hilbert modules in the sense of Paseka. This formulation provides us with an interesting category that has immediate meaningful relations to those of sheaves, local homeomorphisms and local sets. The concept of B-set (local set over the locale B) presented in [3] is seen as a symmetric idempotent matrix with entries in B, and a map of B-sets as defined in [8] is shown to be also a matrix satisfying some conditions. This gives us useful tools that permit the algebraic manipulation of B-sets. The main result is to show that the existing notions of "sheaf" on a locale B are also equivalent to a new concept, which we call a Hilbert module with a Hilbert base. These modules are precisely the projective modules, since they are the image of a free module under an idempotent endomorphism. In the first chapter, we recall some well-known results about partially ordered sets and lattices. In chapter two, we introduce the category of sup-lattices and the category of locales, Loc. We describe the adjunction between this category and the category Top of topological spaces, whose restriction to spatial locales gives us a duality between this category and the category of sober spaces. We finish this chapter with the definitions of module over a quantale and of Hilbert module. Chapter three is concerned with the various equivalent notions, namely sheaves of sets, local homeomorphisms and local sets (projection matrices with entries in a locale). We finish by giving a direct algebraic proof that each local set is isomorphic to a complete local set, whose rows correspond to the singletons. In chapter four, we define B-locales and study open maps and local homeomorphisms. The main new result is in the fifth chapter, where we define Hilbert modules and Hilbert modules with a Hilbert base, and show that this latter concept is equivalent to the previous notions of sheaf over a locale.
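The matrix view of B-sets can be checked mechanically. Below is a small self-contained Python verification that a symmetric matrix with entries in a locale is idempotent under the composition that joins pairwise meets; the locale is taken, purely for illustration, to be the powerset of a two-point set (meet = intersection, join = union).

```python
# Locale B = powerset of {0, 1}; matrix entries are frozensets (opens of B).
def compose(A, B):
    """Matrix composition over the locale: (A o B)[i][k] = join_j (A[i][j] meet B[j][k])."""
    return [[frozenset().union(*(A[i][j] & B[j][k] for j in range(len(B))))
             for k in range(len(B[0]))] for i in range(len(A))]

a, b = frozenset({0}), frozenset({1})
E = [[a,     a & b],
     [a & b, b    ]]                          # a & b is the bottom element here

assert E == [list(row) for row in zip(*E)]    # E is symmetric
assert compose(E, E) == E                     # E is idempotent: a B-set
```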
Abstract:
The present work has as its objective the development of ceramic pigments based on iron and cobalt oxides through the polymeric precursor method, as well as the study of their characteristics and properties using physical, chemical, morphological and optical characterization methods. In this work, iron nitrate and cobalt citrate were used as precursors and nanometric silica as a matrix. The synthesis was based on the dissolution of citric acid as a complexing agent, the addition of the metal oxides as chromophore ions, and polymerization with ethylene glycol. The powder obtained underwent pre-firing, deagglomeration and thermal treatment at different calcination temperatures (700 °C, 800 °C, 900 °C, 1000 °C and 1100 °C). Thermogravimetric (TG) and Differential Thermal Analysis (DTA) were performed in order to evaluate the thermal decomposition of the samples. The samples were further characterized by techniques such as BET, which classified the samples calcined at 700 °C, 800 °C and 900 °C as microporous materials and those annealed at 1000 °C and 1100 °C as non-porous; X-ray diffraction (XRD), which identified the formation of two crystalline phases, cobalt ferrite (CoFe2O4) and cristobalite (SiO2); and Scanning Electron Microscopy (SEM), which revealed the formation of agglomerates of slightly rounded particles. Colorimetric analysis showed that the samples calcined at 700 °C, 800 °C and 900 °C exhibited a brown color, while those calcined at 1000 °C and 1100 °C were violet.
Abstract:
Two regions common to all UsnRNP core polypeptides have been described: Sm motif 1 and Sm motif 2. Rabbits were immunized with a 22-amino-acid peptide containing one segment of Sm motif 1 (YRGTLVSTDNYFNLQLNEAEEF, corresponding to residues 11-32) from the yeast F protein. After immunization, the rabbit sera contained antibodies that not only reacted specifically with the peptide from the yeast F protein but also cross-reacted with Sm polypeptides from mammals; that is, with purified human U1 snRNPs. The results suggest that the peptide used and the human Sm polypeptides contain a common feature recognized by the polyclonal antibodies. A large collection of human systemic lupus erythematosus sera was assayed using the yeast peptide as an antigen source. Seventy per cent of systemic lupus erythematosus sera contain an antibody specificity that cross-reacts with the yeast peptide.
Abstract:
Image restoration attempts to enhance images corrupted by noise and blurring effects. Iterative approaches can better control the restoration algorithm in order to find a compromise between restoring fine details in smoothed regions and not amplifying the noise. Techniques based on Projections Onto Convex Sets (POCS) have been extensively used in the context of image restoration, projecting the solution onto hyperspaces until some convergence criterion is reached. An enhanced image is expected at the end of an unknown number of projections. The number of convex sets and their combinations allow the design of several image restoration algorithms based on POCS. Here, we address two convex sets: Row-Action Projections (RAP) and Limited Amplitude (LA). Although RAP and LA have already been used in the image restoration domain, the former has a relaxation parameter (λ) that strongly depends on the characteristics of the image to be restored; that is, wrong values of λ can lead to poor restoration results. In this paper, we propose a hybrid Particle Swarm Optimization (PSO)-POCS image restoration algorithm, in which the λ value is obtained by PSO and then used to restore images by the POCS approach. Results showed that the proposed PSO-based restoration algorithm outperformed the widely used Wiener and Richardson-Lucy image restoration algorithms.
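A compact sketch of the described hybrid, assuming a linear degradation model H x = g: RAP is the relaxed Kaczmarz row projection, LA clamps the amplitudes, and a tiny PSO searches the relaxation parameter λ. The fitness function, parameter ranges and constants below are illustrative, not the paper's exact setup.

```python
import numpy as np

def rap_la_restore(H, g, lam, n_sweeps=20, amp=(0.0, 255.0)):
    """POCS restoration: Row-Action Projections (relaxed Kaczmarz) + Limited Amplitude."""
    x = np.zeros(H.shape[1])
    for _ in range(n_sweeps):
        for i in range(H.shape[0]):                 # project onto each row's hyperplane
            h = H[i]
            x = x + lam * (g[i] - h @ x) / (h @ h) * h
        x = np.clip(x, *amp)                        # LA convex set
    return x

def pso_lambda(H, g, n_particles=8, n_iter=30, seed=0):
    """Tiny PSO over the relaxation parameter lambda in (0, 2)."""
    rng = np.random.default_rng(seed)
    cost = lambda lam: np.linalg.norm(H @ rap_la_restore(H, g, lam) - g)
    pos = rng.uniform(0.05, 1.95, n_particles)      # particle positions = lambda values
    vel = np.zeros(n_particles)
    pbest, pcost = pos.copy(), np.array([cost(l) for l in pos])
    gbest = pbest[np.argmin(pcost)]
    for _ in range(n_iter):
        r1, r2 = rng.random(n_particles), rng.random(n_particles)
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, 0.05, 1.95)
        c = np.array([cost(l) for l in pos])
        improved = c < pcost
        pbest[improved], pcost[improved] = pos[improved], c[improved]
        gbest = pbest[np.argmin(pcost)]
    return gbest
```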
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
Prolapse-free basis sets suitable for four-component relativistic quantum chemical calculations are presented for the superheavy elements up to (118)Uuo ((104)Rf, (105)Db, (106)Sg, (107)Bh, (108)Hs, (109)Mt, (110)Ds, (111)Rg, (112)Uub, (113)Uut, (114)Uuq, (115)Uup, (116)Uuh, (117)Uus, (118)Uuo) and for (103)Lr. These basis sets were optimized by minimizing the absolute value of the difference between the Dirac-Fock-Roothaan total energy and the corresponding numerical value to a milli-Hartree order of magnitude, resulting in a good balance between cost and accuracy. Parameters for generating exponents and new numerical data for some superheavy elements are also presented.
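The abstract mentions parameters for generating exponents; one common scheme for such parametrized sets is the even-tempered series ζ_k = α β^k. This is an assumption made here for illustration, not necessarily the scheme the authors used.

```python
def even_tempered_exponents(alpha, beta, n):
    """Geometric (even-tempered) series of Gaussian exponents; alpha and beta
    would be the parameters tuned against the numerical Dirac-Fock reference."""
    return [alpha * beta**k for k in range(n)]
```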
Abstract:
The ability of neural networks to realize complex nonlinear functions makes them attractive for system identification. This paper describes a novel method that uses artificial neural networks to solve robust parameter estimation problems for nonlinear models with unknown-but-bounded errors and uncertainties. More specifically, a modified Hopfield network is developed and its internal parameters are computed using the valid-subspace technique. These parameters guarantee the convergence of the network to the equilibrium points. A solution to the robust estimation problem with unknown-but-bounded error corresponds to an equilibrium point of the network. Simulation results are presented as an illustration of the proposed approach.
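As a highly simplified illustration of how a continuous Hopfield-style network settles into an equilibrium that solves an estimation problem, here is a gradient-flow sketch for a linear-in-parameters model; it does not reproduce the paper's valid-subspace construction or its unknown-but-bounded error bounds.

```python
import numpy as np

def hopfield_estimate(A, b, step=1e-3, n_steps=50000):
    """Discretized dynamics dv/dt = -dE/dv for the energy E(v) = 0.5*||A v - b||^2.
    The equilibrium point is the least-squares parameter estimate.
    step must be below 2 / lambda_max(A.T @ A) for the iteration to converge."""
    v = np.zeros(A.shape[1])
    for _ in range(n_steps):
        v -= step * (A.T @ (A @ v - b))   # follow the negative energy gradient
    return v
```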