929 results for integrated design
Abstract:
Dissertation presented at the Faculdade de Ciências e Tecnologia of the Universidade Nova de Lisboa to obtain the degree of Master in Electrical and Computer Engineering
Abstract:
Recent integrated circuit technologies have opened the possibility of designing parallel architectures with hundreds of cores on a single chip. The design space of these parallel architectures is huge, with many architectural options. Exploring the design space becomes even harder if, beyond performance and area, we also consider metrics such as performance efficiency and area efficiency, where the designer seeks the best sustainable performance and the best performance per unit of chip area. In this paper we present an algorithm-oriented approach to designing a many-core architecture. Instead of exploring the design space of the many-core architecture based on experimental execution results for a particular benchmark of algorithms, our approach is to analyse the algorithms formally, considering the main architectural aspects, and to determine how each architectural aspect relates to the performance of the architecture when running an algorithm or set of algorithms. The architectural aspects considered include the number of cores, the local memory available in each core, the communication bandwidth between the many-core architecture and the external memory, and the memory hierarchy. To exemplify the approach, we carried out a theoretical analysis of a dense matrix multiplication algorithm and derived an equation relating the number of execution cycles to the architectural parameters. Based on this equation, a many-core architecture was designed. The results obtained indicate that a 100 mm² integrated circuit design of the proposed architecture, using a 65 nm technology, is able to achieve 464 GFLOPs (double-precision floating point) for a memory bandwidth of 16 GB/s, corresponding to a performance efficiency of 71%. With a 45 nm technology, a 100 mm² chip attains 833 GFLOPs, which corresponds to 84% of peak performance. These figures are better than those obtained by previous many-core architectures, except for the area efficiency, which is limited by the lower memory bandwidth considered. The results achieved are also better than those of previous state-of-the-art many-core architectures designed specifically to achieve high performance for matrix multiplication.
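The abstract does not reproduce the paper's cycle equation, so the sketch below is a hedged, generic reconstruction of the kind of model it describes, built from the standard communication bound for blocked matrix multiplication. The symbols n (matrix dimension), p (number of cores), M (local memory per core, in words), and B (external-memory bandwidth, in words per cycle) are assumed notation, not the paper's own.

```latex
% Hypothetical cycle model for a blocked n x n matrix multiplication on
% p cores, assuming one floating-point operation per core per cycle and
% that computation overlaps external-memory transfers; with three b x b
% blocks resident in a local memory of M words, b ~ sqrt(M/3).
C(n, p, M, B) \approx \max\left( \frac{2n^{3}}{p},\; \frac{2n^{3}}{B\sqrt{M/3}} \right)
```

In such a model the design is compute-bound, and hence close to peak efficiency, whenever p ≤ B·√(M/3); this is the kind of trade-off among core count, local memory, and external bandwidth that the abstract describes.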
Abstract:
Dissertation presented at the Faculdade de Ciências e Tecnologia of the Universidade Nova de Lisboa to obtain the degree of Master in Electrical and Computer Engineering
Abstract:
In this paper, a module for homograph disambiguation in Portuguese Text-to-Speech (TTS) is proposed. The module combines a part-of-speech (POS) parser, used to disambiguate homographs that belong to different parts of speech, with a semantic analyzer, used to disambiguate homographs that belong to the same part of speech. The proposed algorithms are meant to solve a significant part of homograph ambiguity in European Portuguese (EP) (106 homograph pairs so far). The system is ready to be integrated into a Letter-to-Sound (LTS) converter. The algorithms were trained and tested on different corpora, and the experimental results yielded an accuracy rate of 97.8%. The methodology is also valid for Brazilian Portuguese (BP), since 95 of the homograph pairs are exactly the same as in EP. A comparison with a probabilistic approach was also carried out and the results are discussed.
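As an illustration of the two-stage scheme (POS first, semantics only for same-POS pairs), here is a minimal hedged sketch; the pair inventory, tag names, pronunciation strings, and keyword sets are hypothetical placeholders, not the paper's actual resources or lexicon.

```python
"""Sketch of the two-stage homograph disambiguation pipeline described
above; all linguistic resources below are illustrative stand-ins."""

# Stage 1: pairs whose readings differ in part-of-speech; the tag
# predicted by the POS parser selects the pronunciation directly
# (e.g., EP "jogo": noun 'game' with closed /o/, verb 'I play' with open /O/).
POS_PAIRS = {
    "jogo": {"NOUN": "Zogu", "VERB": "ZOgu"},  # illustrative SAMPA-like forms
}

# Stage 2: pairs sharing a part-of-speech; a semantic analyzer is needed
# (here, naively, keyword overlap with the surrounding context words).
SEMANTIC_PAIRS = {
    "sede": {
        "sEd@": {"empresa", "clube", "edifício"},  # 'headquarters'
        "sed@": {"água", "beber", "calor"},        # 'thirst'
    },
}

def disambiguate(word, pos_tag, context):
    """Return a pronunciation for `word` given its POS tag and context."""
    if word in POS_PAIRS:
        return POS_PAIRS[word].get(pos_tag)
    if word in SEMANTIC_PAIRS:
        senses = SEMANTIC_PAIRS[word]
        # Pick the sense whose keyword set overlaps the context the most.
        return max(senses, key=lambda s: len(senses[s] & set(context)))
    return None

print(disambiguate("sede", "NOUN", ["a", "sede", "da", "empresa"]))  # sEd@
```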
Abstract:
Dissertation presented to obtain the degree of Master in Electrical and Computer Engineering, at the Universidade Nova de Lisboa, Faculdade de Ciências e Tecnologia
Abstract:
Companies are increasingly dependent on distributed web-based software systems to support their businesses, which increases the need to maintain and extend those systems with up-to-date features. The development process to introduce new features therefore needs to be swift and agile, and the supporting software evolution process needs to be safe, fast, and efficient. However, this is usually a difficult and challenging task for developers, owing to the lack of support offered by programming environments, frameworks, and database management systems. Changes needed at the level of the code, the database model, and the actual data contained in the database must be planned, developed, and executed together in a synchronized way. Even under a careful development discipline, the impact of changing an application's data model is hard to predict: over an application's lifetime, changes and updates are designed and tested using data that is usually far from the real production data. Coding DDL and DML SQL scripts to update the database schema and data is therefore the usual (and hard) approach taken by developers. Such a manual approach is error-prone and disconnected from the real data in production, because developers may not know the exact impact of their changes. This work aims to improve the maintenance process in the context of the Agile Platform by OutSystems. Our goal is to design and implement new data-model evolution features that ensure safe support for change and a sound migration process. Our solution includes impact analysis mechanisms targeting both the data model and the data itself, providing developers with a safe, simple, and guided evolution process.
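To make the kind of impact analysis concrete, a minimal hedged sketch follows; the rule table and helper function are hypothetical illustrations and not part of the Agile Platform / OutSystems API.

```python
"""Illustrative sketch of data-model impact analysis: classify a column
change before any DDL/DML is emitted, so the developer sees the effect
on production data up front. Rules and names are hypothetical."""

# Type changes the tool could recognise as narrowing (potentially lossy).
NARROWING = {("TEXT", "VARCHAR(50)"), ("BIGINT", "INT")}

def analyze_change(old_type, new_type, row_count):
    """Classify a column-type change as safe, lossy, or blocked."""
    if old_type == new_type:
        return "safe: no migration needed"
    if (old_type, new_type) in NARROWING:
        return f"lossy: values in up to {row_count} rows may be truncated"
    return "blocked: no automatic conversion; manual DML script required"

# Example: shrinking a text column is flagged before the schema changes.
print(analyze_change("TEXT", "VARCHAR(50)", row_count=120_000))
```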
Abstract:
The present paper was prepared for the course “Project III”, under the supervision of Prof. António Moniz, and reports on the author's speaking notes at the Winter School on Technology Assessment, 6-7 December 2010, part of the Doctoral Programme on Technology Assessment at FCT-UNL.
Abstract:
This work project examines the strategic issues that There App, a mobile application, faces regarding the opportunity to expand from its current state as a product into a multi-sided platform. A market analysis is first performed to identify the ideal customer groups to be integrated into the platform. Strategic design issues are then discussed regarding how to best match the app's value proposition with the identified market opportunity. Suggestions on how the company should organize its resources and operational processes to best deliver on its value proposition complete the work.
Abstract:
In recent years, the photophysical properties of complexes of semiconductor quantum dots (QDs) with organic dyes have attracted increasing interest. The development of different assemblies based on QDs and organic dyes broadens the range of QD applications, which include imaging, biological sensing, and electronic devices [1]. Some studies demonstrate energy transfer between QDs and organic dyes in such assemblies [2]. For electronic device purposes, however, a polymeric matrix is required to enhance QD photostability; to attach the QDs to the polymer surface, the polymer must be chemically modified to induce electronic charges and stabilize the QDs within it. The present work investigates the design of assemblies based on polymer-coated QDs and an integrated acceptor organic dye. Poly(methyl methacrylate) (PMMA) and polycarbonate (PC) were used as polymeric matrices, with Nile red as the acceptor. Additionally, a PMMA matrix modified with 2-mercaptoethylamine is used to improve the attachment between the donor (QDs) and the acceptor (Nile red), as well as to induce a covalent bond between the modified PMMA and the QDs. An enhancement of the energy transfer efficiency with the modified PMMA is expected, and the resulting assembly can be applied to energy harvesting.
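The QD-to-dye energy transfer referred to here is usually modelled as Förster resonance energy transfer (FRET); the standard efficiency expression, not stated in the abstract, for donor-acceptor distance r and the Förster radius R_0 of the QD/Nile red pair is:

```latex
% Standard Förster (FRET) transfer efficiency: shortening the
% donor-acceptor distance r below R_0 raises E steeply (r^6 dependence).
E = \frac{R_0^{6}}{R_0^{6} + r^{6}}
```

This makes explicit why covalently anchoring the acceptor closer to the QD surface, as the 2-mercaptoethylamine-modified PMMA is intended to do, should enhance the measured transfer efficiency.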
Abstract:
Integrated master's dissertation in Civil Engineering
Abstract:
The efficient utilization of lignocellulosic biomass and the reduction of production cost are mandatory to attain a cost-effective lignocellulose-to-ethanol process. The selection of a suitable pretreatment that allows an effective fractionation of the biomass, and the use of the pretreated material at high solid loadings in simultaneous saccharification and fermentation (SSF) processes, are considered promising strategies for that purpose. Eucalyptus globulus wood was fractionated by an organosolv process at 200 °C for 69 min using 56% glycerol-water. 99% of the cellulose remained in the pretreated biomass and 65% of the lignin was solubilized. The precipitated lignin was characterized for chemical composition and thermal behavior, showing features similar to commercial lignin. In order to produce lignocellulosic ethanol at high gravity, a full factorial design was carried out to assess the effects of the liquid-to-solid ratio (LSR, 3-9 g/g) and the enzyme-to-solid ratio (8-16 FPU/g) on the SSF of the delignified Eucalyptus. A high ethanol concentration (94 g/L), corresponding to 77% conversion, was successfully produced from the pretreated biomass at 16 FPU/g and LSR = 3 g/g using an industrial, thermotolerant Saccharomyces cerevisiae strain. Integrating a suitable pretreatment, which allows whole-biomass valorization, with intensified saccharification-fermentation stages was shown to be a feasible strategy for the co-production of high ethanol titers, oligosaccharides, and lignin, paving the way for a cost-effective Eucalyptus biorefinery.
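For context, the conversion figure in SSF studies of this kind is conventionally computed against the stoichiometric maximum of 0.51 g ethanol per g glucose; a hedged sketch of that standard definition follows (the abstract does not state the paper's exact formula; f is the glucan mass fraction of the pretreated solids and 1.111 the glucan-to-glucose hydration factor):

```latex
% Conventional SSF ethanol conversion: ethanol produced over the
% theoretical maximum from the glucan initially present in the solids.
\text{Conversion (\%)} =
  \frac{[\mathrm{EtOH}]}{0.51 \times 1.111 \times f \times [\mathrm{solids}]}
  \times 100
```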
Abstract:
This is a study of a state-of-the-art implementation of a new computer integrated testing (CIT) facility within a company that designs and manufactures transport refrigeration systems. The aim was to use state-of-the-art hardware, software, and planning procedures in the design and implementation of three CIT systems. Typical CIT system components include data acquisition (DAQ) equipment, application and analysis software, communication devices, computer-based instrumentation, and computer technology. It is shown that the introduction of computer technology into the area of testing can have a major effect on issues such as efficiency, flexibility, data accuracy, test quality, and data integrity. The findings reaffirm how computer integration continues to benefit organisations; with recent advances in computer technology, communication methods, and software capabilities, less expensive and more sophisticated test solutions are now possible, allowing more organisations to benefit from the many advantages associated with CIT. Examples of computer-integrated test set-ups and their associated benefits are discussed.
Abstract:
Emissions distribution is a central variable in the design of future international agreements to tackle global warming. This paper analyses the future path of emissions distribution, and its determinants, under different scenarios. Our analysis draws on tools typically applied in the income distribution literature, which have recently been applied to the analysis of CO2 emissions distribution; the methodological novelty is that our study is driven by simulations run with a popular regionalised optimal growth climate change model over the 1995-2105 period. We find that the architecture of environmental policies, the implementation of flexible mechanisms, and income concentration are key determinants of emissions distribution over time. In particular, we find a robust positive relationship between income and emissions inequality measures.
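The income-distribution tools referred to are typically Lorenz-curve and Gini-type indices applied to per-capita emissions; below is a minimal sketch with made-up regional values (illustrative only, not output of the climate model).

```python
"""Minimal sketch of an inequality index of the kind used in the
emissions-distribution literature: the Gini coefficient over per-capita
emissions. Sample values are illustrative, not model output."""

def gini(values):
    """Gini coefficient via the sorted mean-difference formula."""
    xs = sorted(values)
    n, total = len(xs), sum(xs)
    # Equivalent to sum(|x_i - x_j|) / (2 * n^2 * mean), using sorted order.
    return sum((2 * i - n + 1) * x for i, x in enumerate(xs)) / (n * total)

# Hypothetical per-capita CO2 emissions (tonnes/year) for five regions.
emissions = [0.9, 1.7, 4.2, 8.8, 16.1]
print(f"Gini = {gini(emissions):.3f}")
```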
Abstract:
Integrated control measures against Culex quinquefasciatus have been implemented in a pilot urban area in Recife, Brazil. About 3,000 breeding sites found within the operational area were responsible for the very high mosquito densities recorded during the pretrial period. Physical control measures were applied to cesspits before starting a series of 37 treatments of the other sites with Bacillus sphaericus strain 2362 over 27 months. In spite of the difficulties posed by environmental conditions, very significant reductions in the preimaginal population of C. quinquefasciatus were achieved and, as a consequence, low adult mosquito densities were maintained for a relatively long period of time. The entomological and environmental data gathered in this pilot project can contribute to the design of an integrated mosquito control programme for the city of Recife.
Abstract:
This case study introduces our ongoing work to enhance the virtual classroom in order to provide faculty and students with an environment that is open to their needs, compliant with learning standards (and therefore compatible with other e-learning environments), and based on open source software. The result is a modular, sustainable, and interoperable learning environment that can be adapted to different teaching and learning situations by incorporating the LMS's integrated tools as well as wikis, blogs, forums, and Moodle activities, among others.