953 results for Development time
Abstract:
Determination of the energy range is an important precondition of the focus calibration using alignment procedure (FOCAL) test. A new method to determine the energy range of FOCAL off-line is presented in this paper. Independent of the lithographic tool, the method is time-saving and effective. The influences of several process factors, e.g. resist thickness, post-exposure bake (PEB) temperature, PEB time and development time, on the energy range of FOCAL are analyzed.
Abstract:
This paper investigates several approaches to bootstrapping a new spoken language understanding (SLU) component in a target language given a large dataset of semantically-annotated utterances in some other source language. The aim is to reduce the cost associated with porting a spoken dialogue system from one language to another by minimising the amount of data required in the target language. Since word-level semantic annotations are costly, Semantic Tuple Classifiers (STCs) are used in conjunction with statistical machine translation models, both of which are trained from unaligned data, to further reduce development time. The paper presents experiments in which a French SLU component in the tourist information domain is bootstrapped from English data. Results show that training STCs on automatically translated data produced the best performance for predicting the utterance's dialogue act type; however, individual slot/value pairs are best predicted by training STCs on the source language and using them to decode translated utterances. © 2010 ISCA.
Abstract:
Why do firms acquire external technologies? Previous research indicates that there are a wide variety of motivations. These include the need to acquire valuable knowledge-based resources, to improve strategic flexibility, to experiment, to overcome organisational inertia, to mitigate risk and uncertainty, to reduce costs and development time in new product development, and the perception that the firm has the absorptive capacity to integrate acquisitions. In this paper we provide an in-depth literature review of the motivations for the acquisition of external technologies by firms. We find that these motivations can be broadly classed into four categories: (1) the development of technological capabilities, (2) the development of strategic options, (3) efficiency improvements, and (4) responses to the competitive environment. In light of this categorisation, we comment on how these different motivations connect to the wider issues of technology acquisition. © 2010 IEEE.
Abstract:
Engineering changes (ECs) are raised throughout the lifecycle of engineering products. A single change to one component produces knock-on effects on others, necessitating additional changes. This change propagation significantly affects development time and cost and determines the product's success. Predicting and managing such ECs is thus essential to companies. Some prediction tools model change propagation with algorithms, a subgroup of which are numerical. Current numerical change propagation algorithms either do not account for the exclusion of cyclic propagation paths or are based on exhaustive searching methods. This paper presents a new matrix-calculation-based algorithm which can be applied directly to a numerical product model to analyze change propagation and support change prediction. The algorithm applies matrix multiplications to mutations of a given design structure matrix, accounting for the exclusion of self-dependences and cyclic propagation paths, and delivers the same results as the exhaustive search-based Trail Counting algorithm. Despite its factorial time complexity, the algorithm proves advantageous because of its straightforward matrix-based calculations, which avoid exhaustive searching. The algorithm can thereby be implemented in established numerical programs such as Microsoft Excel, promising wider application of the tools within and across companies along with better familiarity, usability, practicality, security, and robustness. © 1988-2012 IEEE.
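The exhaustive search baseline the abstract mentions (the Trail Counting algorithm) can be illustrated by enumerating cycle-free propagation paths in a small design structure matrix. This is a minimal sketch under assumed conventions, not the paper's implementation; the DSM orientation and all names are ours:

```python
# Hypothetical sketch: count cycle-free change-propagation paths between
# two components of a design structure matrix (DSM). Assumed convention:
# dsm[i][j] == 1 means a change to component i propagates to component j.

def count_paths(dsm, start, end, visited=None):
    """Count simple (cycle-free) paths from `start` to `end` by
    depth-first search, never revisiting a component."""
    if visited is None:
        visited = {start}
    if start == end:
        return 1
    total = 0
    for nxt, linked in enumerate(dsm[start]):
        if linked and nxt not in visited:
            total += count_paths(dsm, nxt, end, visited | {nxt})
    return total

# 3-component example: 0 -> 1, 0 -> 2, 1 -> 2.
dsm = [
    [0, 1, 1],
    [0, 0, 1],
    [0, 0, 0],
]
# Paths from 0 to 2: the direct link and the route via component 1,
# so count_paths(dsm, 0, 2) == 2.
```

The matrix-based algorithm in the paper reproduces these counts without explicit search, which is what makes a spreadsheet implementation feasible.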
Abstract:
Photonic crystal devices with feature sizes of a few hundred nanometers are often fabricated by electron beam lithography. The proximity effect, stitching error and resist profiles have a significant influence on the pattern quality, and therefore determine the optical properties of the devices. In this paper, detailed analyses and simple solutions to these problems are presented. The proximity effect is corrected by the introduction of a compensating dose. The influence of the stitching error is alleviated by replacing the original access waveguides with taper-added waveguides, and the taper parameters are also discussed to obtain the optimal choice. It is demonstrated experimentally that patterns exposed with different doses have almost the same edge profiles in the resist for the same development time, and that optimized etching conditions can remarkably improve the wall angle of the holes in the substrate. © 2006 Elsevier B.V. All rights reserved.
Abstract:
With the rapid growth of the Internet and digital communications, the volume of sensitive electronic transactions being transferred and stored over and on insecure media has increased dramatically in recent years. The growing demand for cryptographic systems to secure this data, across a multitude of platforms, ranging from large servers to small mobile devices and smart cards, has necessitated research into low-cost, flexible and secure solutions. As constraints on architectures such as area, speed and power become key factors in choosing a cryptosystem, methods for speeding up the development and evaluation process are necessary. This thesis investigates flexible hardware architectures for the main components of a cryptographic system. Dedicated hardware accelerators can provide significant performance improvements when compared to implementations on general purpose processors. Each of the proposed designs is analysed in terms of speed, area, power, energy and efficiency. Field Programmable Gate Arrays (FPGAs) are chosen as the development platform due to their fast development time and reconfigurable nature. Firstly, a reconfigurable architecture for performing elliptic curve point scalar multiplication on an FPGA is presented. Elliptic curve cryptography is one such method of securing data, offering security levels similar to traditional systems, such as RSA, but with smaller key sizes, translating into lower memory and bandwidth requirements. The architecture is implemented using different underlying algorithms and coordinates for dedicated Double-and-Add algorithms, twisted Edwards algorithms and SPA-secure algorithms, and its power consumption and energy are measured on an FPGA. Hardware implementation results for these new algorithms are compared against their software counterparts, and the best choices for minimum area-time and area-energy circuits are then identified and examined for larger key and field sizes.
Secondly, implementation methods for another component of a cryptographic system, namely hash functions, developed in the recently concluded SHA-3 hash competition are presented. Various designs from the three rounds of the NIST-run competition are implemented on FPGA along with an interface to allow fair comparison of the different hash functions when operating in a standardised and constrained environment. Different methods of implementation for the designs and their subsequent performance are examined in terms of throughput, area and energy costs using various constraint metrics. Comparing many different implementation methods and algorithms is nontrivial. Another aim of this thesis is the development of generic interfaces used both to reduce implementation and test time and to enable fair baseline comparisons of different algorithms when operating in a standardised and constrained environment. Finally, a hardware-software co-design cryptographic architecture is presented. This architecture is capable of supporting multiple types of cryptographic algorithms and is described through an application for performing public key cryptography, namely the Elliptic Curve Digital Signature Algorithm (ECDSA). This architecture makes use of the elliptic curve architecture and the hash functions described previously. These components, along with a random number generator, provide hardware acceleration for a MicroBlaze-based cryptographic system. The trade-off between performance and flexibility is discussed using dedicated software, and hardware-software co-design implementations of the elliptic curve point scalar multiplication block. Results are then presented in terms of the overall cryptographic system.
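The Double-and-Add scalar multiplication that the thesis accelerates in hardware can be sketched in software over a toy curve. The curve parameters below (y² = x³ + 2x + 2 mod 17, generator (5, 1)) are a standard textbook example, not the thesis's field or key sizes:

```python
# Illustrative sketch of left-to-right Double-and-Add point scalar
# multiplication over the toy curve y^2 = x^3 + 2x + 2 (mod 17).
# Real deployments use much larger prime fields; this only shows the flow.

P = 17  # field prime
A = 2   # curve coefficient a

def ec_add(p1, p2):
    """Add two affine points on the curve; None is the point at infinity."""
    if p1 is None:
        return p2
    if p2 is None:
        return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P == 0:
        return None  # p1 == -p2, result is the point at infinity
    if p1 == p2:
        lam = (3 * x1 * x1 + A) * pow(2 * y1, -1, P) % P  # tangent slope
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, P) % P          # chord slope
    x3 = (lam * lam - x1 - x2) % P
    return (x3, (lam * (x1 - x3) - y1) % P)

def scalar_mult(k, point):
    """Compute k * point by scanning k's bits from most significant:
    double every step, add when the bit is 1."""
    result = None
    for bit in bin(k)[2:]:
        result = ec_add(result, result)      # double
        if bit == "1":
            result = ec_add(result, point)   # add
    return result
```

Note that this plain Double-and-Add leaks the key through its bit-dependent add step; the SPA-secure algorithms mentioned in the abstract exist precisely to remove that data-dependent behaviour.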
Abstract:
Body size and development time are important life history traits because they are often highly correlated with fitness. Although the developmental mechanisms that control growth have been well studied, the mechanisms that control how a species-characteristic body size is achieved remain poorly understood. In insects, adult body size is determined by the number of larval molts, the size increment at each molt, and the mechanism that determines during which instar larval growth will stop. Adult insects do not grow, so the size at which a larva stops growing determines adult body size. Here we develop a quantitative understanding of the kinetics of growth throughout larval life of Manduca sexta, under different conditions of nutrition and temperature, and for genetic strains with different adult body sizes. We show that the generally accepted view that the size increment at each molt is constant (Dyar's Rule) is systematically violated: there is actually a progressive increase in the size increment from instar to instar that is independent of temperature. In addition, the mass-specific growth rate declines throughout the growth phase in a temperature-dependent manner. We show that growth within an instar follows a truncated Gompertz trajectory. The critical weight, which determines when in an instar a molt will occur, and the threshold size, which determines which instar is the last, differ between genetic strains with different adult body sizes. Under nutrient and temperature stress Manduca has a variable number of larval instars, and we show that this is because more molts, at smaller increments, are taken before the threshold size is reached. We test whether these new insights into the kinetics of growth and size determination are sufficient to explain body size and development time through a mathematical model that incorporates our quantitative findings.
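The truncated Gompertz trajectory described above can be sketched numerically. The parameter values here are purely illustrative, not the paper's fitted estimates:

```python
import math

def gompertz_mass(t, m0, m_inf, k):
    """Gompertz growth: mass rises from the initial mass m0 toward the
    asymptote m_inf with rate constant k, so the mass-specific growth
    rate declines as the larva grows (as the abstract reports)."""
    return m_inf * (m0 / m_inf) ** math.exp(-k * t)

# Illustrative instar: start at 1 g, asymptote 12 g, k = 0.5 per day.
# The trajectory is "truncated" because the larva commits to molting
# once it passes its critical weight, before reaching the asymptote.
critical_weight = 8.0  # hypothetical value, grams
t = 0.0
while gompertz_mass(t, 1.0, 12.0, 0.5) < critical_weight:
    t += 0.01  # days; t now approximates the time the critical weight is hit
```

Under this picture, strain differences in critical weight and threshold size shift where the trajectory is cut off, and hence final body size, without changing the growth law itself.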
Abstract:
For sensitive optoelectronic components, traditional soldering techniques cannot be used because of the components' inherent sensitivity to thermal stresses. One such component is the optoelectronic butterfly package, which houses a laser diode chip aligned to a fibre-optic cable. Even sub-micron misalignment of the fibre optic and laser diode chip can significantly reduce the performance of the device. The high cost of each unit requires that the number of components damaged by the laser soldering process is kept to a minimum. Mathematical modelling is undertaken to better understand the laser soldering process and to optimize operational parameters such as solder paste volume, copper pad dimensions, laser solder times for each joint, laser intensity and absorption coefficient. Validation of the model against experimental data will be completed and will lead to an optimization of the assembly process through an iterative modelling cycle. This will ultimately reduce costs, improve the process development time and increase consistency in the laser soldering process.
Abstract:
In this paper, we present a methodology for automatically implementing a complete Digital Signal Processing (DSP) system onto a heterogeneous network including Field Programmable Gate Arrays (FPGAs). The methodology aims to allow design refinement and real-time verification at the system level. The DSP application is constructed in the form of a Data Flow Graph (DFG), which provides the entry point to the methodology. The netlist for the parts that are mapped onto the FPGA(s), together with the corresponding software and hardware Application Protocol Interface (API), is also generated. Using a set of case studies, we demonstrate that design and development time can be significantly reduced using the methodology developed.
Abstract:
Risk management in software engineering has become a recognized project management practice, but it seems that not all companies apply it systematically. At the same time, agile methods have become popular, partly because proponents claim that agile methods implicitly reduce risk through, for example, more frequent and earlier feedback, shorter periods of development time and easier prediction of cost. There is therefore a need to investigate how risk management can be applied in iterative and evolutionary software development processes. This paper investigates the gathering of empirical data on risk management from the project environment and presents a novel approach to managing risk in agile projects. Our approach is based on a prototype tool, the Agile Risk Tool (ART). This tool reduces human effort in risk management by using software agents to identify, assess and monitor risk, based on input and data collected from the project environment and on a set of designated rules. As validation, groups of student project data were used to provide evidence of the efficacy of this approach. We demonstrate the approach and the feasibility of using a lightweight risk management tool to alert, assess and monitor risk with reduced human effort.
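The paper's designated rules are not reproduced in the abstract, but the general shape of an agent-checked rule can be sketched. Everything below, including the rule itself and its threshold, is a hypothetical illustration rather than ART's actual logic:

```python
# Hypothetical sketch of one rule a risk-monitoring agent might apply to
# project-environment data. The rule, names and threshold are ours, not ART's.

def velocity_drop_rule(sprint_velocities, threshold=0.7):
    """Flag a schedule risk when the latest sprint velocity falls below
    `threshold` times the average of the preceding sprints."""
    if len(sprint_velocities) < 2:
        return False  # not enough history to judge
    *history, latest = sprint_velocities
    baseline = sum(history) / len(history)
    return latest < threshold * baseline

# e.g. steady velocities 30, 32, 31, then a drop to 18 -> risk alert raised
```

An agent evaluating such rules on each data update is what lets the tool alert with reduced human effort.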
Abstract:
Low-velocity impact damage can drastically reduce the residual mechanical properties of a composite structure even when there is barely visible impact damage. The ability to computationally predict the extent of damage and the compression after impact (CAI) strength of a composite structure can potentially lead to the exploration of a larger design space without incurring significant development time and cost penalties. A three-dimensional damage model, to predict both low-velocity impact damage and CAI strength of composite laminates, has been developed and implemented as a user material subroutine in the commercial finite element package ABAQUS/Explicit. The virtual tests were executed in two steps, one to capture the impact damage and the other to predict the CAI strength. The observed intra-laminar damage features, delamination damage area and residual strength are discussed. It is shown that the predicted results for impact damage and CAI strength correlated well with experimental testing.
Design of a Virtual Reality Framework for Maintainability and Assemblability Tests of Complex Systems
Abstract:
This paper presents a unique environment whose features are able to satisfy requirements for both virtual maintenance and virtual manufacturing through the conception of an original virtual reality (VR) architecture. Virtual Reality for the Maintainability and Assemblability Tests (VR_MATE) encompasses VR hardware and software and a simulation manager which allows customisation of the architecture itself as well as interfacing with a wide range of devices employed in the simulations. Two case studies are presented to illustrate VR_MATE's unique ability to allow for both maintainability tests and assembly analysis, of an aircraft carriage and a railway coach cooling system respectively. The key impact of this research is the demonstration of the potential of VR techniques in industry and their multiple applications, despite the subjective character of the simulation. VR_MATE is presented as a framework to support the strategic and operative objectives of companies to reduce product development time and costs whilst maintaining product quality, for applications which would be too expensive to simulate and evaluate in the real world.
Abstract:
Natural aquatic systems are frequently subject to toxicant inputs, whether through leaching from agricultural fields or discharges from industrial facilities. Assessing the potential impact of these contaminants on aquatic systems is very important, because they can have serious consequences for the ecological balance of ecosystems. The effects of sub-lethal levels of these toxicants on aquatic populations are, in many cases, detected only after several generations, depending on the species and the contaminant. Animal behaviour is regarded as the first line of defence against environmental stimuli and can reflect physiological changes in the organism, making it an excellent indicator of environmental change. The development of early-warning systems that integrate behavioural parameters may help predict possible changes at the level of natural populations more quickly than standard ecotoxicological tests used for the same purpose. Knowledge about the possible implications of behavioural changes in benthic organisms and in field populations exposed to toxicants is still scarce. With this in mind, this study set out to investigate how the behaviour of Chironomus riparius, recorded with a real-time biomonitor, and other parameters such as growth, adult emergence, bioaccumulation and biomarkers, are affected by exposure to imidacloprid and mercury, which were selected as contaminants. The results showed that exposure to sub-lethal concentrations of imidacloprid affects the growth and behaviour of the chironomids and that these organisms can recover from a short exposure to the insecticide. The ventilation behaviour of C. riparius proved to be a more sensitive parameter than locomotion and than the biochemical responses when the larvae were exposed to imidacloprid. C. riparius larvae exposed to sub-lethal concentrations of mercury showed a tendency towards decreased behavioural activity in tests with increasing concentrations of the toxicant; larval growth was also impaired, and adult emergence rates and development time were delayed. These organisms can rapidly bioaccumulate mercury under non-feeding conditions and show slow depuration of this metal. These effects may ultimately lead to repercussions at the population and community levels. Reductions in behavioural activity, even at low concentrations, may decrease the amount of time spent foraging, producing effects at the morpho-physiological level and thus severely affecting the performance of chironomids in the environment. The use of these behavioural factors as a relevant sub-lethal ecotoxicological endpoint will increase the versatility of testing, providing a measurable, quantitative behavioural response at the organism level through a non-destructive assessment, and thereby ensuring that this approach can be used in future ecotoxicological tests.
Abstract:
Master's thesis in Evolutionary and Developmental Biology, presented to the Universidade de Lisboa through the Faculdade de Ciências, 2016