896 results for pacs: information technology application
Abstract:
The scalability of CMOS technology has driven computation into a diverse range of applications across the power consumption, performance and size spectra. Communication is a necessary adjunct to computation, and whether this is to push data from node to node in a high-performance computing cluster or from the receiver of a wireless link to a neural stimulator in a biomedical implant, interconnect can take up a significant portion of the overall system power budget. Although a single interconnect methodology cannot address such a broad range of systems efficiently, there are a number of key design concepts that enable good interconnect design in the age of highly-scaled CMOS: an emphasis on highly-digital approaches to solving ‘analog’ problems, hardware sharing between links as well as between different functions (such as equalization and synchronization) in the same link, and adaptive hardware that changes its operating parameters to mitigate not only variation in the fabrication of the link, but also link conditions that change over time. These concepts are demonstrated through the use of two design examples, at the extremes of the power and performance spectra.
A novel all-digital clock and data recovery technique for high-performance, high-density interconnect has been developed. Two independently adjustable clock phases are generated from a delay line calibrated to 2 UI (unit intervals). One clock phase is placed in the middle of the eye to recover the data, while the other is swept across the delay line. The samples produced by the two clocks are compared to generate eye information, which is used to determine the best phase for data recovery. The functions of the two clocks are swapped after the data phase is updated; this ping-pong action allows an infinite delay range without the use of a PLL or DLL. The scheme's generalized sampling and retiming architecture is used in a sharing technique that saves power and area in high-density interconnect. The eye information generated is also useful for tuning an adaptive equalizer, circumventing the need for dedicated adaptation hardware.
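A minimal behavioral sketch of this ping-pong eye-scan loop is given below. It is illustrative only: the delay-line resolution, the toy jitter model, and names such as `sample_at` and `ping_pong_cdr` are assumptions made for the example, not details taken from the thesis.

```python
import random

TAPS_PER_UI = 32                 # assumed delay-line resolution (taps per unit interval)
LINE_TAPS = 2 * TAPS_PER_UI      # delay line calibrated to 2 UI

def sample_at(tap, bit, jitter_taps=2):
    """Toy channel model: samples taken within a few taps of a bit edge may flip."""
    dist = min(tap % TAPS_PER_UI, TAPS_PER_UI - (tap % TAPS_PER_UI))
    if dist <= jitter_taps and random.random() < 0.5:
        return 1 - bit
    return bit

def widest_open_center(agree):
    """Return the tap at the centre of the longest run of taps with no disagreements."""
    best_len, best_center, run_start = 0, 0, None
    for t, ok in enumerate(list(agree) + [False]):   # sentinel closes a trailing run
        if ok and run_start is None:
            run_start = t
        elif not ok and run_start is not None:
            if t - run_start > best_len:
                best_len, best_center = t - run_start, (run_start + t - 1) // 2
            run_start = None
    return best_center

def ping_pong_cdr(bits, data_tap=TAPS_PER_UI // 2):
    recovered, scan_tap = [], 0
    agree = [True] * LINE_TAPS
    for bit in bits:
        data_sample = sample_at(data_tap, bit)       # data clock samples mid-eye
        recovered.append(data_sample)
        agree[scan_tap] &= (sample_at(scan_tap, bit) == data_sample)
        scan_tap = (scan_tap + 1) % LINE_TAPS
        if scan_tap == 0:                            # one full sweep of the 2 UI line done
            # Move the data phase to the centre of the widest open region of the eye.
            # In hardware the two clocks then swap roles; behaviourally we simply
            # restart the sweep with the new data phase.
            data_tap = widest_open_center(agree)
            agree = [True] * LINE_TAPS
    return recovered

if __name__ == "__main__":
    tx = [random.randint(0, 1) for _ in range(10_000)]
    rx = ping_pong_cdr(tx)
    print("bit errors:", sum(a != b for a, b in zip(tx, rx)))
```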
At the other end of the performance/power spectrum, a capacitive proximity interconnect has been developed to support 3D integration of biomedical implants. In order to integrate more functionality while staying within size limits, implant electronics can be embedded onto a foldable parylene (‘origami’) substrate. Many of the ICs in an origami implant will be placed face-to-face with each other, so wireless proximity interconnect can be used to increase communication density while decreasing implant size, as well as to facilitate a modular approach to implant design, where pre-fabricated parylene-and-IC modules are assembled on demand to make custom implants. Such an interconnect needs to be able to sense and adapt to changes in alignment. The proposed array uses a time-to-digital converter (TDC)-like structure to realize both communication and alignment sensing within the same set of plates, increasing communication density and eliminating the need to infer link quality from a separate alignment block. In order to distinguish the communication plates from the nearby ground plane, a stimulus is applied to the transmitter plate, which is rectified at the receiver to bias a delay generation block. This delay is in turn converted into a digital word using a TDC, providing alignment information.
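As a rough sketch of the alignment-sensing chain described above (plate coupling, rectified stimulus, bias-dependent delay, TDC code), consider the toy model below. Every constant and the coupling-to-delay relationship are invented for illustration; none of them come from the actual design.

```python
def alignment_code(overlap_fraction,
                   c_full=10e-15,     # assumed plate capacitance when fully aligned (F)
                   i_per_farad=2e9,   # assumed rectified bias current per farad of coupling (A/F)
                   c_load=5e-15,      # assumed load of the delay-generation block (F)
                   v_swing=0.5,       # assumed delay-cell voltage swing (V)
                   tdc_lsb=5e-12,     # assumed TDC resolution (s)
                   tdc_bits=6):
    """Toy model: plate overlap -> coupling -> rectified bias -> delay -> TDC code."""
    c_couple = overlap_fraction * c_full           # misalignment reduces the coupling
    i_bias = i_per_farad * c_couple                # rectified stimulus sets the bias current
    delay = c_load * v_swing / max(i_bias, 1e-12)  # weaker bias -> slower delay cell
    return min(int(delay / tdc_lsb), 2 ** tdc_bits - 1)   # larger code = worse alignment

for overlap in (1.0, 0.75, 0.5, 0.25):
    print(f"overlap {overlap:4.2f} -> TDC code {alignment_code(overlap)}")
```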
Abstract:
Matter and light are built from their elementary constituents: tiny atoms and photons. The ability to control and utilize matter-light interactions down to the level of single atoms and photons at the nanoscale opens up exciting studies at the frontiers of science, with applications in medicine, energy, and information technology. Among these, an intriguing frontier is the development of quantum networks where N >> 1 single-atom nodes are coherently linked by single photons, forming a collective quantum entity potentially capable of performing quantum computations and simulations. Here, a promising approach is to use optical cavities within the setting of cavity quantum electrodynamics (QED). However, since its first realization by Kimble et al. in 1992, proof-of-principle experiments have involved just one or two conventional cavities. To move toward N >> 1 nodes, in this thesis we investigate a platform born from the marriage of cavity QED and nanophotonics, where single atoms ~100 nm from the surfaces of lithographically fabricated dielectric photonic devices can strongly interact with single photons, on a chip. In particular, we experimentally investigate three main types of devices: microtoroidal optical cavities, optical nanofibers, and nanophotonic-crystal-based structures. With a microtoroidal cavity, we realized a robust and efficient photon router where single photons are extracted from an incident coherent state of light and redirected to a separate output with high efficiency. We achieved strong single atom-photon coupling with atoms located ~100 nm from the surface of a microtoroid, which revealed important aspects of the atom dynamics and QED of these systems, including atom-surface interaction effects. We present a method to achieve state-insensitive atom trapping near optical nanofibers, which is critical in nanophotonic systems where electromagnetic fields are tightly confined. We developed a system that fabricates high-quality nanofibers with high controllability, with which we experimentally demonstrate a state-insensitive atom trap. We present initial investigations of nanophotonic-crystal-based structures as a platform for strong atom-photon interactions. The experimental advances and theoretical investigations carried out in this thesis provide a framework for, and open the door to, strong single atom-photon interactions using nanophotonics for chip-integrated quantum networks.
Abstract:
Government procurement of a new good or service is a process that usually includes basic research, development, and production. Empirical evidence indicates that investments in research and development (R&D) before production are significant in many defense procurements. Thus, optimal procurement policy should be not only to select the most efficient producer, but also to induce contractors to design the best product and develop the best technology. The current economic theory of optimal procurement and contracting, which has emphasized production but ignored R&D, is difficult to apply to many cases of procurement.
In this thesis, I provide basic models of both R&D and production in the procurement process, where a number of firms invest in private R&D and compete for a government contract. R&D is modeled as a stochastic cost-reduction process. The government is considered both as a profit maximizer and as a procurement cost minimizer. In comparison with the literature, the following results derived from my models are significant. First, R&D matters in procurement contracting: when offering the optimal contract, the government is better off if it correctly takes costly private R&D investment into account. Second, competition matters: the optimal contract and the total equilibrium R&D expenditures vary with the number of firms. The government usually does not prefer unlimited competition among firms; instead, it prefers free entry of firms. Third, under an R&D technology with constant marginal returns to scale, it is socially optimal to have only one firm conduct all of the R&D and production. Fourth, in an independent-private-values environment with risk-neutral firms, an informed government should select one of four standard auction procedures with an appropriate announced reserve price, acting as if it had no private information.
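For readers unfamiliar with this modeling tradition, the following is a generic, hedged sketch of what a stochastic cost-reduction formulation typically looks like; the notation is invented here and is not the thesis's own.

```latex
% Generic illustration (not the thesis's notation).
% Firm $i$ invests $x_i$ in R\&D and then draws its production cost
% $c_i \sim F(\cdot \mid x_i)$, where more R\&D makes low costs more likely
% (first-order stochastic dominance):
\[
  x' > x \;\Longrightarrow\; F(c \mid x') \,\ge\, F(c \mid x)
  \quad \text{for all } c .
\]
% Under a fixed-price contract $p$ awarded to the lowest-cost firm, firm $i$'s
% expected payoff from investing $x_i$ is
\[
  \pi_i(x_i) \;=\; \Pr(\mathrm{win} \mid x_i)\;
  \mathbb{E}\!\left[\, p - c_i \;\middle|\; \mathrm{win},\, x_i \right] \;-\; x_i ,
\]
% and the procurer designs the award rule and reserve price to trade off the
% expected production cost against the firms' total R\&D expenditure.
```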
Abstract:
This research has two main objectives. The first is to evaluate, in textbooks recommended by teachers in the state of Piauí (Teresina), the approach those books take to digital genres, given the wide dissemination of information and communication technology in recent decades. To that end, a careful survey of the textbooks for grades 6 through 9 was carried out, checking the incidence of digital and non-digital discursive genres in the different books analysed. We sought to understand the criteria by which they are presented, as well as the treatment given to each genre, comparing the approach proposed for the so-called digital genres with that given to the other genres that predominate in the different books. In the four collections analysed, a more formalized approach was found for the non-digital genres, not only in terms of structure but also in terms of the linguistic resources that compose them. In contrast, the coverage of digital genres was very limited. The results lead us to conclude that digital genres appear infrequently in the textbooks most widely adopted in the state and, when they are presented, the approach focuses only on their structure, in a metalinguistic way, with no concern for the social use of the resources employed in digital genres. This finding led to the second objective: to propose a didactic sequence centred on digital genres, addressing skills and competencies 1 and 9 of the ENEM (Exame Nacional do Ensino Médio). These competencies and skills aim to identify the different languages conveyed in the media; to know and master expressive resources as characteristic features of communication and information systems; and to recognize the social function of language, as well as the impact of technological tools on personal and professional life. These skills and competencies rest on two fundamental pillars: the social function of the genre and the application of its use. The proposal is based on a socio-historical conception of language, considering the importance of making school-age citizens aware of the social importance of technological resources, as well as the need to broaden their use in a multisemiotic society.
Abstract:
Management Control Systems, as well as the information they make available, are becoming increasingly relevant in the banking sector. This has been driven by the need to improve management practices and processes, justified by the continuous development of activities in this market. This study set out to verify the impact of the internationalization process and of convergence to international accounting standards on the management control systems of multiple-service banks operating in Brazil. To this end, an exploratory study was carried out using the case study method, with interviews and questionnaires, to examine how the theoretical framework applies to the actual use of Management Control Systems in these banks, seeking to highlight possible changes resulting from both processes. The results of the case analysis show characteristics in common with the literature, and uncommon ones in other respects. The impact of internationalization on the banks' control systems proved to be tied to the information technology area, whereas for convergence to international accounting standards the impact is directed much more at disclosure under the international standards than at the control systems themselves.
Abstract:
This work presents and discusses a model for forecasting the demand for physicians to care for inpatients under SUS (the Brazilian Unified Health System), with a case study for the State of Rio de Janeiro. The model is based on data from the SUS Hospital Information System (SIH/SUS) and on the expected changes in population size and composition according to IBGE. It describes the trajectory and motivation that led to the construction of the model, starting from the idea of making greater use of the enormous potential of Brazilian databases for planning and managing human resources for health (RHS). It also comments on information technology concepts relevant to a better understanding of the databases, including the use of standards. The results of applying the model for the period from 2002 to 2022 for the State of Rio de Janeiro are presented and discussed. Finally, it suggests further research aimed at improving integration between the databases studied, discusses the construction and use of indicators, and proposes a path for evolving decision support in the RHS area.
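The overall shape of such a projection model can be sketched in a few lines: utilization rates by age group (of the kind derived from SIH/SUS admissions) are applied to projected population (of the kind published by IBGE) and converted into physician full-time equivalents. All numbers and parameter names below are invented for illustration; they are not the thesis's data or results.

```python
# Illustrative demand-projection sketch; every figure here is made up.
admissions_per_1000 = {"0-14": 45.0, "15-59": 60.0, "60+": 180.0}   # assumed utilization rates
projected_population = {                                             # assumed IBGE-style projections
    2002: {"0-14": 3.9e6, "15-59": 9.6e6, "60+": 1.4e6},
    2022: {"0-14": 3.2e6, "15-59": 10.4e6, "60+": 2.6e6},
}
physician_hours_per_admission = 3.0      # assumed average physician time per admission
annual_hours_per_physician = 1600.0      # assumed full-time annual workload

def physicians_needed(year):
    """Apply age-specific admission rates to the projected population for a year."""
    admissions = sum(projected_population[year][g] / 1000 * admissions_per_1000[g]
                     for g in admissions_per_1000)
    return admissions * physician_hours_per_admission / annual_hours_per_physician

for year in sorted(projected_population):
    print(year, round(physicians_needed(year)))
```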
Abstract:
The Western Pacific Fishery Information Network (WPACFIN) is an intergovernmental agency cooperative program sponsored by the National Marine Fisheries Service (NMFS) to help participating island fisheries agencies carry out data collection, analysis, reporting programs, and data management activities, both to better support fisheries management under the Magnuson Fishery Conservation and Management Act and to help meet local fisheries information and management needs. WPACFIN is the central source of information for Federal fisheries management of most fisheries in American Samoa, Guam, and the Northern Mariana Islands, and it plays an important role in acquiring fisheries data in Hawaii. This paper describes the development and status of this fishery information system.
Abstract:
The basic purpose of fishing is to catch or harvest as much fish as possible and deliver it to the consumer as a wholesome, acceptable food, though fishery development programs are based on diverse objectives which include, besides the production of a valuable food, the creation of employment opportunities, the socio-economic uplift of the fishing community, and the earning of foreign exchange through export. Both the production and the utilization of fish depend intrinsically on the efficient application of technology.
Abstract:
We present a new software framework for the implementation of applications that use stencil computations on block-structured grids to solve partial differential equations. A key feature of the framework is the extensive use of automatic source code generation which is used to achieve high performance on a range of leading multi-core processors. Results are presented for a simple model stencil running on Intel and AMD CPUs as well as the NVIDIA GT200 GPU. The generality of the framework is demonstrated through the implementation of a complete application consisting of many different stencil computations, taken from the field of computational fluid dynamics. © 2010 IEEE.
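Since the abstract does not spell out the framework's interface, the following toy sketch only illustrates the general idea of stencil code generation: a stencil is declared as a table of offsets and coefficients, and a naive C kernel is emitted from it. The names and the emitted loop nest are assumptions for this example, not the framework's actual API or output.

```python
# Toy illustration of stencil code generation (not the framework's real API):
# a stencil is described as offsets -> coefficients, and a naive C kernel is emitted.

STENCIL_5PT = {(0, 0): -4.0, (1, 0): 1.0, (-1, 0): 1.0, (0, 1): 1.0, (0, -1): 1.0}

def generate_c_kernel(name, stencil):
    terms = " + ".join(f"{c} * in[(i+({di}))*ny + (j+({dj}))]"
                       for (di, dj), c in sorted(stencil.items()))
    return (
        f"void {name}(const double *in, double *out, int nx, int ny) {{\n"
        f"  for (int i = 1; i < nx - 1; ++i)\n"
        f"    for (int j = 1; j < ny - 1; ++j)\n"
        f"      out[i*ny + j] = {terms};\n"
        f"}}\n"
    )

print(generate_c_kernel("laplacian_5pt", STENCIL_5PT))
```

A real framework of this kind would specialize the generated loop nest per target (vectorized CPU code, CUDA kernels for the GPU), which is where the reported performance portability comes from.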
Abstract:
The Internet of Things (IOT) concept and enabling technologies such as RFID offer the prospect of linking the real world of physical objects with the virtual world of information technology to improve visibility and traceability information within supply chains and across the entire lifecycles of products, as well as enabling more intuitive interactions and greater automation possibilities. There is a huge potential for savings through process optimization and profit generation within the IOT, but the sharing of financial benefits across companies remains an unsolved issue. Existing approaches towards sharing of costs and benefits have failed to scale so far. The integration of payment solutions into the IOT architecture could solve this problem. We have reviewed different possible levels of integration. Multiple payment solutions have been researched. Finally we have developed a model that meets the requirements of the IOT in relation to openness and scalability. It supports both hardware-centric and software-centric approaches to integration of payment solutions with the IOT. Different requirements concerning payment solutions within the IOT have been defined and considered in the proposed model. Possible solution providers include telcos, e-payment service providers and new players such as banks and standardization bodies. The proposed model of integrating the Internet of Things with payment solutions will lower the barrier to invoicing for the more granular visibility information generated using the IOT. Thus, it has the potential to enable recovery of the necessary investments in IOT infrastructure and accelerate adoption of the IOT, especially for projects that are only viable when multiple benefits throughout the supply chain need to be accumulated in order to achieve a Return on Investment (ROI). In a long-term perspective, it may enable IT-departments to become profit centres instead of cost centres. © 2010 - IOS Press and the authors. All rights reserved.
Abstract:
Several studies have highlighted the importance of information and information quality in organisations, and information is thus regarded as a key determinant of success and organisational performance. At the same time, numerous studies, frameworks and case studies examine the impact of information technology and systems on business value. Recently, several maturity models for information management capabilities have been proposed in the literature, which claim that higher maturity results in higher organisational performance. Although these studies provide valuable information about the underlying relations, most are limited in specifying the relationship in more detail. Furthermore, most prominent approaches do not, or at least not explicitly, consider information as an important influencing factor for organisational performance. In this paper, we review selected contributions and introduce a model that shows how IS/IT resources and capabilities could be interlinked with IS/IT utilization, organisational performance and business value. Complementing other models and frameworks, we explicitly consider information from a management maturity, quality and risk perspective. Moreover, the paper discusses how each part of the model can be assessed in order to validate the model in future studies.
Abstract:
Information visualization can accelerate perception, provide insight and control, and harness the flood of valuable data to gain a competitive advantage in making business decisions. Although such a statement seems obvious, the literature lacks practical evidence of the benefit of information visualization. The main contribution of this paper is to illustrate how, for a major European apparel retailer, the visualization of performance information plays a critical role in improving business decisions and in extracting insights from Radio Frequency Identification (RFID)-based performance measures. Based on a literature review, we identify three fundamental managerial functions of information visualization, namely as a communication medium, a knowledge management means, and a decision-support instrument. Then, based on real industrial case evidence, we show how information visualization supports business decision-making. Several examples are provided to evidence the benefit of information visualization through its three identified managerial functions. We find that, depending on the way performance information is shaped, communicated, and made interactive, it not only helps decision making but also offers a means of knowledge creation, as well as an appropriate communication channel. © 2014 World Scientific Publishing Company.
Abstract:
The present status and future prospects of functional information materials, mainly focusing on semiconductor microstructural materials, are first introduced in this paper. Then, how to enhance the academic level and innovation capability of research and development of functional information materials in China is briefly discussed. Finally, the main problems concerning the study of materials science and technology are analyzed, and possible measures for promoting its development are proposed.