909 results for Computer arithmetic and logic units.


Relevance:

100.00%

Publisher:

Abstract:

With the size of transistors approaching the sub-nanometer scale and Si-based photonics pinned at the micrometer scale by the diffraction limit of light, we cannot easily integrate the high transfer speeds of this comparatively bulky technology with the increasingly smaller architecture of state-of-the-art processors. However, we find that we can bridge the gap between these two technologies by directly coupling electrons to photons through the use of dispersive metals in optics. Doing so gives us access to the surface electromagnetic wave excitations that arise at a metal/dielectric interface, which both confine and enhance light in subwavelength dimensions, two promising characteristics for the development of integrated chip technology. This platform is known as plasmonics, and it allows us to design a broad range of complex metal/dielectric systems, all having different nanophotonic responses, but all originating from our ability to engineer the systems' surface plasmon resonances and interactions. In this thesis, we demonstrate how plasmonics can be used to develop coupled metal-dielectric systems that function as tunable plasmonic hole-array color filters for CMOS image sensing, visible metamaterials composed of coupled negative-index plasmonic coaxial waveguides, and programmable plasmonic waveguide networks that serve as color routers and logic devices at telecommunication wavelengths.

Relevance:

100.00%

Publisher:

Abstract:

Collector-type experiments have been conducted to investigate two different aspects of sputtering induced by keV ions. The first study looked for possible ejection mechanisms related to the primary charge state of the projectile. Targets of CsI and LiNbO_3 were bombarded with 48 keV Ar^(q+), and a Au target was bombarded with 60 keV Ar^(q+), for q = 4, 8, and 11. The collectors were analyzed using heavy-ion Rutherford backscattering spectroscopy to determine the differential angular sputtering yields; these and the corresponding total yields were examined for variations as a function of projectile charge state. For the Au target, no significant changes were seen, but for the insulating targets slight (~10%) enhancements were observed in the total yields as the projectile charge state was increased from 4+ to 11+.

In the second investigation, artificial ^(92)Mo/^(100)Mo targets were bombarded with 5 and 10 keV beams of Ar^+ and Xe^+ to study the isotopic fractionation of sputtered neutrals as a function of emission angle and projectile fluence. Secondary ion mass spectrometry was used to measure the isotope ratio on the collectors; material ejected in near-normal directions at low bombarding fluences (~10^(15) ions cm^(-2)) was found to be enriched in the light isotope by as much as ~70‰ compared to steady state. Similar results were found for secondary Mo ions sputtered by 14.5 keV O^-. For low-fluence 5 keV Xe^+ bombardment, the light-isotope enrichment at oblique angles was ~20‰ less than the corresponding enrichment in the normal direction. No angular dependence could be resolved for 5 keV Ar^+ projectiles at the lowest fluence. This fractionation decreased to steady-state values after bombarding fluences of a few times 10^(16) ions cm^(-2), with the angular dependence becoming more pronounced. The fractionation and total sputtering yield were found to be strongly correlated, indicating that the above effects may have been related to the presence of a modified target surface layer. The observed effects are consistent with other secondary ion measurements and multiple-interaction computer simulations, and are considerably larger than predicted by existing analytic theory.
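The quoted enrichments use the standard per-mil delta notation; a minimal sketch with illustrative (not measured) isotope ratios:

```python
# Hypothetical numbers for illustration only; the measured ratios are not given
# in the abstract above.

def delta_permil(ratio_sample, ratio_reference):
    """Enrichment of the sample isotope ratio relative to the reference,
    expressed in permil (parts per thousand)."""
    return (ratio_sample / ratio_reference - 1.0) * 1000.0

# Assumed (illustrative) 92Mo/100Mo ratios:
ratio_low_fluence = 1.07    # normal-direction deposit at ~1e15 ions/cm^2
ratio_steady_state = 1.00   # after a few 1e16 ions/cm^2

print(delta_permil(ratio_low_fluence, ratio_steady_state))  # ~70 permil
```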

Relevance:

100.00%

Publisher:

Abstract:

A new 2-D quality-guided phase-unwrapping algorithm, based on the placement of the branch cuts, is presented. Its framework consists of branch cut placing guided by an original quality map and reliability ordering performed on a final quality map. To improve the noise immunity of the new algorithm, a new quality map, which is used as the original quality map to guide the placement of the branch cuts, is proposed. After a complete description of the algorithm and the quality map, several wrapped images are used to examine the effectiveness of the algorithm. Computer simulation and experimental results make it clear that the proposed algorithm works effectively even when a wrapped phase map contains error sources, such as phase discontinuities, noise, and undersampling. © 2005 Society of Photo-Optical Instrumentation Engineers.
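The branch-cut placement itself is specific to the paper, but the "quality-guided" part of such algorithms can be illustrated with a generic flood-fill unwrapper: pixels are integrated in order of decreasing quality, each new pixel unwrapped against an already-unwrapped neighbor. A minimal pure-Python sketch (the function names and simple wrap convention are ours, not the paper's):

```python
import heapq
import math

def wrap(d):
    """Wrap a phase difference into [-pi, pi)."""
    return (d + math.pi) % (2 * math.pi) - math.pi

def quality_guided_unwrap(wrapped, quality):
    """Generic quality-guided flood-fill unwrapping on a 2-D grid
    (lists of lists); higher-quality pixels are expanded first."""
    rows, cols = len(wrapped), len(wrapped[0])
    out = [[None] * cols for _ in range(rows)]
    # seed: the highest-quality pixel keeps its wrapped value
    r0, c0 = max(((r, c) for r in range(rows) for c in range(cols)),
                 key=lambda rc: quality[rc[0]][rc[1]])
    out[r0][c0] = wrapped[r0][c0]
    heap = [(-quality[r0][c0], r0, c0)]
    while heap:
        _, r, c = heapq.heappop(heap)
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and out[nr][nc] is None:
                # add the wrapped difference to the neighbor's unwrapped value
                out[nr][nc] = out[r][c] + wrap(wrapped[nr][nc] - wrapped[r][c])
                heapq.heappush(heap, (-quality[nr][nc], nr, nc))
    return out
```

For a smooth phase field this reconstructs the true phase up to the seed's offset; the branch-cut and reliability-ordering machinery of the paper addresses the noisy cases where such a greedy fill fails.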

Relevance:

100.00%

Publisher:

Abstract:

A fast and reliable phase unwrapping (PhU) algorithm, based on a local quality-guided fitting plane, is presented. Its framework depends on the basic plane-approximation assumption for the phase values of local pixels and on the phase derivative variance (PDV) quality map. Compared with other popular existing unwrapping algorithms, the proposed algorithm demonstrated improved robustness and immunity to strong noise and high phase variations, provided that the plane assumption for the local phase is reasonably satisfied. Its effectiveness is demonstrated by computer-simulated and experimental results.
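The phase derivative variance (PDV) quality map mentioned above is, in the usual formulation, the windowed variance of the wrapped phase gradients; a sketch under that assumption (the paper's exact normalization may differ):

```python
import math

def wrap(d):
    """Wrap a phase difference into [-pi, pi)."""
    return (d + math.pi) % (2 * math.pi) - math.pi

def pdv_map(psi, k=3):
    """Phase-derivative-variance map of a wrapped phase image psi
    (list of lists, at least 2x2).  Larger values mean POORER quality:
    PDV is an inverse quality measure."""
    rows, cols = len(psi), len(psi[0])
    # wrapped horizontal and vertical phase differences
    dx = [[wrap(psi[r][c + 1] - psi[r][c]) for c in range(cols - 1)]
          for r in range(rows)]
    dy = [[wrap(psi[r + 1][c] - psi[r][c]) for c in range(cols)]
          for r in range(rows - 1)]
    h = k // 2
    out = [[0.0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            for grad, gr, gc in ((dx, rows, cols - 1), (dy, rows - 1, cols)):
                win = [grad[i][j]
                       for i in range(max(0, r - h), min(gr, r + h + 1))
                       for j in range(max(0, c - h), min(gc, c + h + 1))]
                m = sum(win) / len(win)
                # root of the summed squared deviations in the k x k window
                out[r][c] += math.sqrt(sum((g - m) ** 2 for g in win)) / (k * k)
    return out
```

On a smooth phase ramp the map is essentially zero everywhere; noise, discontinuities, and undersampling raise it locally, which is what the plane-fitting step uses to order pixels.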

Relevance:

100.00%

Publisher:

Abstract:

STEEL, the Caltech-created nonlinear large-displacement analysis software, is currently used by a large number of researchers at Caltech. However, due to its complexity and lack of visualization tools (such as pre- and post-processing capabilities), rapid creation and analysis of models with this software was difficult. SteelConverter was created as a means to facilitate model creation through the use of the industry-standard finite element solver ETABS. This software allows users to create models in ETABS and intelligently convert model information such as geometry, loading, releases, and fixity into a format that STEEL understands. Models that would take several days to create and verify now take several hours or less. The productivity of the researcher, as well as the level of confidence in the model being analyzed, is greatly increased.

It has always been a major goal of Caltech to spread the knowledge created here to other universities. However, due to the complexity of STEEL it was difficult for researchers or engineers from other universities to conduct analyses. While SteelConverter did help researchers at Caltech improve their research, sending SteelConverter and its documentation to other universities was less than ideal. Issues of version control, individual computer requirements, and the difficulty of releasing updates made a more centralized solution preferred. This is where the idea for Caltech VirtualShaker was born. Through the creation of a centralized website where users could log in, submit, analyze, and process models in the cloud, all of the major concerns associated with the utilization of SteelConverter were eliminated. Caltech VirtualShaker allows users to create profiles where defaults associated with their most commonly run models are saved, and allows them to submit multiple jobs to an online virtual server to be analyzed and post-processed. The creation of this website not only allowed for more rapid distribution of this tool, but also created a means for engineers and researchers with no access to powerful computer clusters to run computationally intensive analyses without the excessive cost of building and maintaining a computer cluster.

In order to increase confidence in the use of STEEL as an analysis system, as well as to verify the conversion tools, a series of comparisons was made between STEEL and ETABS. Six models of increasing complexity, ranging from a cantilever column to a twenty-story moment frame, were analyzed to determine the ability of STEEL to accurately calculate basic model properties, such as elastic stiffness and damping through a free-vibration analysis, as well as more complex structural properties, such as overall structural capacity through a pushover analysis. These analyses showed very strong agreement between the two programs on every aspect of each analysis. However, they also showed the ability of the STEEL analysis algorithm to converge at significantly larger drifts than ETABS when using the more computationally expensive and structurally realistic fiber hinges. Following the ETABS analysis, it was decided to repeat the comparisons in a program more capable of highly nonlinear analysis, called Perform. These analyses again showed very strong agreement between the two programs in every aspect of each analysis through instability. However, due to some limitations in Perform, free-vibration analyses for the three-story one-bay chevron-brace frame, the two-bay chevron-brace frame, and the twenty-story moment frame could not be conducted. With the current trend toward ultimate-capacity analysis, the ability to use fiber-based models allows engineers to gain a better understanding of a building's behavior under these extreme load scenarios.
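The abstract does not say how damping was extracted from the free-vibration runs; one standard approach, sketched here purely as an assumption, is the logarithmic decrement of successive displacement peaks:

```python
import math

def damping_from_free_vibration(peaks):
    """Estimate the viscous damping ratio from successive same-sign
    free-vibration peak amplitudes via the logarithmic decrement."""
    n = len(peaks) - 1                       # number of cycles spanned
    delta = math.log(peaks[0] / peaks[-1]) / n
    # exact relation between log decrement and damping ratio
    return delta / math.sqrt(4 * math.pi ** 2 + delta ** 2)

# Illustrative decay trace for a 2%-damped oscillator (not STEEL output):
zeta = 0.02
ratio = math.exp(-2 * math.pi * zeta / math.sqrt(1 - zeta ** 2))
peaks = [10.0 * ratio ** i for i in range(5)]
print(damping_from_free_vibration(peaks))  # recovers ~0.02
```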

Following this, a final study was done on Hall's U20 structure [1], in which the structure was analyzed in all three programs and the results compared. The pushover curves from each program were compared and the differences caused by variations in software implementation explained. From this, conclusions can be drawn on the effectiveness of each analysis tool when attempting to analyze structures through the point of geometric instability. The analyses show that while ETABS was capable of accurately determining the elastic stiffness of the model, the analysis tool failed to converge after the onset of inelastic behavior. However, for the small number of time steps over which the ETABS analysis was converging, its results exactly matched those of STEEL, leading to the conclusion that ETABS is not an appropriate analysis package for analyzing a structure through the point of collapse when using fiber elements throughout the model. The analyses also showed that while Perform was capable of calculating the response of the structure accurately, restrictions in the material model resulted in a pushover curve that did not exactly match that of STEEL, particularly post-collapse. However, such problems could be alleviated by choosing a simpler material model.

Relevance:

100.00%

Publisher:

Abstract:

This thesis is an investigation into the nature of data analysis and computer software systems which support this activity.

The first chapter develops the notion of data analysis as an experimental science which has two major components: data-gathering and theory-building. The basic role of language in determining the meaningfulness of theory is stressed, and the informativeness of a language and data base pair is studied. The static and dynamic aspects of data analysis are then considered from this conceptual vantage point. The second chapter surveys the available types of computer systems which may be useful for data analysis. Particular attention is paid to the questions raised in the first chapter about the language restrictions imposed by the computer system and its dynamic properties.

The third chapter discusses the REL data analysis system, which was designed to satisfy the needs of the data analyzer in an operational relational data system. The major limitation on the use of such systems is the amount of access to data stored on a relatively slow secondary memory. This problem of the paging of data is investigated and two classes of data structure representations are found, each of which has desirable paging characteristics for certain types of queries. One representation is used by most of the generalized data base management systems in existence today, but the other is clearly preferred in the data analysis environment, as conceptualized in Chapter I.
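The abstract does not name the two representations, but the paging argument can be illustrated by contrasting a generic record-oriented layout (typical of general-purpose data base management systems) with an attribute-oriented one for a whole-attribute scan; all sizes below are invented:

```python
# Illustrative only: these layouts and sizes stand in for the thesis's two
# representation classes, which are not spelled out in the abstract above.

PAGE_SIZE = 4096        # bytes per page of secondary memory (assumed)
FIELD_SIZE = 8          # bytes per field value (assumed)
N_RECORDS = 1_000_000
N_FIELDS = 20

def pages_record_oriented():
    """Pages fetched to scan ONE attribute when whole records are stored
    contiguously: every page of the file must be read."""
    records_per_page = PAGE_SIZE // (FIELD_SIZE * N_FIELDS)
    return -(-N_RECORDS // records_per_page)        # ceiling division

def pages_attribute_oriented():
    """Pages fetched to scan one attribute when each attribute is stored
    contiguously: only that attribute's pages are read."""
    values_per_page = PAGE_SIZE // FIELD_SIZE
    return -(-N_RECORDS // values_per_page)

print(pages_record_oriented(), pages_attribute_oriented())
```

With these numbers the attribute-oriented layout reads roughly N_FIELDS times fewer pages for the quantification-style queries a data analyzer issues, which is the flavor of paging advantage the chapter argues for.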

This data representation has strong implications for a fundamental process of data analysis -- the quantification of variables. Since quantification is one of the few means of summarizing and abstracting, data analysis systems are under strong pressure to facilitate the process. Two implementations of quantification are studied: one analogous to the form of the lower predicate calculus and another more closely attuned to the data representation. A comparison of these indicates that the use of the "label class" method yields orders-of-magnitude improvement over the lower-predicate-calculus technique.

Relevance:

100.00%

Publisher:

Abstract:

Deep neural networks have recently gained popularity for improving state-of-the-art machine learning algorithms in diverse areas such as speech recognition, computer vision and bioinformatics. Convolutional networks especially have shown prowess in visual recognition tasks such as object recognition and detection, on which this work focuses. Modern award-winning architectures have systematically surpassed previous attempts at tackling computer vision problems and keep winning most current competitions. After a brief study of deep learning architectures and readily available frameworks and libraries, the LeNet handwritten-digit recognition network case study is developed, and lastly a deep learning network for playing simple video games is reviewed.
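For orientation, the classic LeNet-5 layer shapes and parameter counts can be tallied (assuming the standard 32x32-input, dense-connection variant; the thesis's exact configuration is not given here):

```python
def conv_params(k, c_in, c_out):
    """Weights plus biases of a k x k convolution, all input maps connected."""
    return k * k * c_in * c_out + c_out

def fc_params(n_in, n_out):
    """Weights plus biases of a fully connected layer."""
    return n_in * n_out + n_out

# Classic LeNet-5 for a 32x32 grayscale digit (ignoring the original paper's
# sparse C3 connection table, so C3 is counted fully connected):
layers = [
    ("C1 conv 5x5, 6 maps   -> 28x28x6",  conv_params(5, 1, 6)),
    ("S2 pool 2x2           -> 14x14x6",  0),
    ("C3 conv 5x5, 16 maps  -> 10x10x16", conv_params(5, 6, 16)),
    ("S4 pool 2x2           -> 5x5x16",   0),
    ("F5 fully connected 400 -> 120",     fc_params(400, 120)),
    ("F6 fully connected 120 -> 84",      fc_params(120, 84)),
    ("Output 84 -> 10",                   fc_params(84, 10)),
]
total = sum(p for _, p in layers)
for name, p in layers:
    print(f"{name:36s} {p:6d} params")
print("total trainable parameters:", total)  # 61706 for this variant
```

The exercise makes concrete why LeNet is a convenient case study: the whole network is small enough to train quickly while still exercising the conv/pool/fully-connected pattern of the modern architectures the thesis surveys.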

Relevance:

100.00%

Publisher:

Abstract:

Poster presented at: 11th International Symposium on Applied Bioinorganic Chemistry, 2-5 December 2011, Barcelona, Spain (ISABC 2011).

Relevance:

100.00%

Publisher:

Abstract:

This work describes a methodology for evaluating the primary education system of the State of Rio de Janeiro, using fuzzy set theory as the basis of the inference process that generates the Educational System Evaluation Indicator (IASE). The database used to build the IASE indicator was extracted from data of the Instituto Nacional de Estudos e Pesquisas Educacionais Anísio Teixeira (INEP). The results are then presented in a Geographic Information System (GIS), making it possible to understand the correlation between the alphanumeric and spatial values of the information generated by the fuzzy system, so as to support decision-making for government action in the sector.
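The abstract does not give the IASE membership functions or rule base, so the following Mamdani-style sketch is purely illustrative of how fuzzy inference can turn raw measures into a single indicator:

```python
# Illustrative only: the inputs, fuzzy sets, and rules below are invented;
# the actual IASE construction is defined in the work itself.

def tri(x, a, b, c):
    """Triangular membership function with support (a, c) and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def iase(pass_rate, test_score):
    """Tiny Mamdani-style inference: two inputs in [0, 1] -> one indicator in [0, 1]."""
    low_p, high_p = tri(pass_rate, -1, 0, 1), tri(pass_rate, 0, 1, 2)
    low_s, high_s = tri(test_score, -1, 0, 1), tri(test_score, 0, 1, 2)
    # rules: both high -> good; both low -> poor; mixed -> fair
    good = min(high_p, high_s)
    poor = min(low_p, low_s)
    fair = max(min(high_p, low_s), min(low_p, high_s))
    # defuzzify as a weighted average of output-set centers (0.0, 0.5, 1.0)
    w = poor + fair + good
    return (0.5 * fair + 1.0 * good) / w if w else 0.0

print(iase(0.9, 0.8))
```

The GIS step of the methodology would then map each municipality's indicator value onto its geometry, which is where the alphanumeric/spatial correlation described above becomes visible.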
