956 results for Data Structure Evaluation
Abstract:
Master’s Thesis in Computer Engineering
Abstract:
Digital oscilloscopes are used in many fields of knowledge and have become indispensable instruments in electronic engineering. Thanks to the advent of Field Programmable Gate Arrays (FPGAs), reconfigurable measuring instruments, given their advantages, i.e., high performance, low cost, and high flexibility, are increasingly an alternative to the instruments traditionally used in laboratories. With the goal of standardizing the access to and control of this type of instrument, this thesis describes the design and implementation of a reconfigurable digital oscilloscope based on the IEEE 1451.0 standard. Defined according to an architecture based on this standard, the oscilloscope's characteristics are described in a data structure called the Transducer Electronic Data Sheet (TEDS), and it is controlled using a set of standardized commands. The oscilloscope implements a set of basic features and functions, all verified experimentally. These include a bandwidth of 575 kHz, a measuring range of 0.4 V to 2.9 V, and the ability to set a range of horizontal scales, the trigger level and slope, and the coupling mode with the circuit under analysis. Architecturally, the oscilloscope consists of a module specified in the Verilog hardware description language (HDL) and an interface developed in the Java® programming language. The module is embedded in an FPGA and defines all of the oscilloscope's processing; the interface allows its control and the display of the measured signal. The project used an Analog/Digital (A/D) converter with a maximum sampling frequency of 1.5 MHz and 14-bit resolution which, due to its limitations, required the implementation of a multi-stage interpolation system with digital filters.
Abstract:
The aim of this work is to analyze productive efficiency and the effects of consolidation on banking costs, based on the Portuguese banking industry. The multiproduct nature of the banking firm suggests the need to adopt multiproduct forms of the cost function (Fourier type). We introduce homogeneity and structure variables that allow single-product functional forms (Cobb-Douglas) to be applied to banking. The sample comprises 22 banks operating in Portugal between 1995 and 2001, on an unconsolidated basis with panel data. Inefficiency was studied using the stochastic frontier approach (SFA) for both specifications. In the analysis of consolidation, binary variables were introduced to capture the effects during the four years following a merger. In both the SFA and the consolidation analyses, the results obtained are sensitive to the functional specification adopted. In conclusion, the banking consolidation process seems to be justified by the possibility of reducing X-inefficiency.
Abstract:
Thesis submitted in the fulfillment of the requirements for the Degree of Master in Biomedical Engineering
Abstract:
Dissertation submitted to obtain the degree of Master in Biomedical Engineering
Abstract:
This paper introduces the metaphorism pattern of relational specification and addresses how specifications following this pattern can be refined into recursive programs. Metaphorisms express input-output relationships that preserve relevant information while some intended optimization takes place. Text processing, sorting, representation changers, etc., are examples of metaphorisms. The kind of metaphorism refinement proposed in this paper is a strategy known as change of virtual data structure. The paper gives sufficient conditions for such implementations to be calculated using relation algebra and illustrates the strategy with the derivation of quicksort as an example.
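The quicksort derivation mentioned in this abstract targets a recursive program of roughly the following shape. This Python sketch is illustrative only; it is not the paper's calculational derivation, and the "virtual data structure" (a binary search tree) never appears explicitly in the code:

```python
def quicksort(xs):
    """Recursive quicksort over a list: the pivot partitions the input as
    a binary search tree would, but the tree stays virtual -- only the
    flattened sublists are ever materialized."""
    if not xs:
        return []
    pivot, rest = xs[0], xs[1:]
    smaller = [x for x in rest if x <= pivot]  # left "subtree", flattened
    larger = [x for x in rest if x > pivot]    # right "subtree", flattened
    return quicksort(smaller) + [pivot] + quicksort(larger)
```

The output is a permutation of the input that is sorted, which is exactly the metaphorism shape: relevant information (the multiset of elements) is preserved while an optimization (ordering) is imposed.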
Abstract:
OBJECTIVE - To evaluate the cardiac abnormalities and their evolution during the course of the acquired immunodeficiency syndrome, as well as to correlate clinical and pathological data. METHODS - Twenty-one patients, admitted to the hospital with the diagnosis of acquired immunodeficiency syndrome, were prospectively studied and followed until their death. Age ranged from 19 to 42 years (17 males). ECG and echocardiogram were obtained every six months. After death, macro- and microscopic examinations were also performed. RESULTS - The most frequent causes of referral to the hospital were: diarrhea or repeated pneumonias, tuberculosis, toxoplasmosis, or Kaposi sarcoma. The most frequent findings were acute or chronic pericarditis (42%) and dilated cardiomyopathy (19%). Four patients died of cardiac problems: infective endocarditis, pericarditis with pericardial effusion, bacterial myocarditis, and infection by Toxoplasma gondii. CONCLUSION - Severe cardiac abnormalities were the cause of death in some patients. In the majority of the patients, a good correlation existed between clinical and anatomical-pathological data. Cardiac evaluation was important to detect early manifestations and treat them accordingly, even in asymptomatic patients.
Abstract:
BACKGROUND: Classically, clinical trials are based on the placebo-control design. Our aim was to analyze the placebo effect in Huntington's disease. METHODS: Placebo data were obtained from an international, longitudinal, placebo-controlled trial for Huntington's disease (European Huntington's Disease Initiative Study Group). One hundred and eighty patients were evaluated using the Unified Huntington's Disease Rating Scale over 36 months. A placebo effect was defined as an improvement of at least 50% over baseline scores in the Unified Huntington's Disease Rating Scale, and was considered clinically relevant when at least 10% of the population met it. RESULTS: Only behavior showed a significant placebo effect, and the proportion of patients with a placebo effect ranged from 16% (first visit) to 41% (last visit). Nondepressed patients with better functional status were most likely to be placebo-responders over time. CONCLUSIONS: In Huntington's disease, behavior seems to be more vulnerable to placebo than overall motor function, cognition, and function.
Abstract:
The control of the vector-borne transmission of Chagas disease in Brazil was organized as a national program in 1975, when two large entomological and sero-epidemiological surveys were conducted in the country in order to identify areas at highest risk of transmission and to guide interventions regarding the chemical treatment of domestic vectors of the disease. The authors present the baseline data gathered through these studies and compare them with more recent data. The evaluation performed shows that transmission by Triatoma infestans is virtually interrupted and that transmission by other native triatomine species from different regions of the country is probably very low. The need to maintain permanent entomological surveillance actions in order to prevent recurrent transmission is emphasized.
Abstract:
This project developed a web interface for computing routes in the city of Barcelona. The computed routes are on foot, from any origin point to a tourist point of interest in the city as the destination. To this end, street data for Barcelona was extracted from OpenStreetMap and inserted into a PostgreSQL/PostGIS database, together with a vector layer of tourist points of interest created with the desktop GIS qGIS. The routes were computed over the database using the pgRouting extension, and the web interface for selecting the origin and destination points, displaying the maps, and showing the resulting routes was developed using the OpenLayers library.
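The route computation that pgRouting performs over the street graph is, at its core, a shortest-path search such as Dijkstra's algorithm. A minimal self-contained sketch in Python (with a made-up toy graph; this is an illustration of the algorithm, not the actual pgRouting API):

```python
import heapq

def dijkstra(graph, origin, dest):
    """Shortest path by cumulative edge cost.
    graph: adjacency dict, node -> list of (neighbor, edge_cost)."""
    pq = [(0.0, origin, [origin])]  # (cost so far, current node, path)
    seen = set()
    while pq:
        cost, node, path = heapq.heappop(pq)
        if node == dest:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, w in graph.get(node, []):
            if nxt not in seen:
                heapq.heappush(pq, (cost + w, nxt, path + [nxt]))
    return float("inf"), []  # destination unreachable
```

In the project, the graph comes from the OpenStreetMap street segments stored in PostGIS, and pgRouting runs the equivalent search inside the database.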
Abstract:
The multiscale finite volume (MsFV) method has been developed to efficiently solve large heterogeneous problems (elliptic or parabolic); it is usually employed for pressure equations and delivers conservative flux fields to be used in transport problems. The method essentially relies on the hypothesis that the (fine-scale) problem can be reasonably described by a set of local solutions coupled by a conservative global (coarse-scale) problem. In most cases, the boundary conditions assigned for the local problems are satisfactory and the approximate conservative fluxes provided by the method are accurate. In numerically challenging cases, however, a more accurate localization is required to obtain a good approximation of the fine-scale solution. In this paper we develop a procedure to iteratively improve the boundary conditions of the local problems. The algorithm relies on the data structure of the MsFV method and employs a Krylov-subspace projection method to obtain an unconditionally stable scheme and accelerate convergence. Two variants are considered: in the first, only the MsFV operator is used; in the second, the MsFV operator is combined in a two-step method with an operator derived from the problem solved to construct the conservative flux field. The resulting iterative MsFV algorithms allow arbitrary reduction of the solution error without compromising the construction of a conservative flux field, which is guaranteed at any iteration. Since it converges to the exact solution, the method can be regarded as a linear solver. In this context, the schemes proposed here can be viewed as preconditioned versions of the Generalized Minimal Residual method (GMRES), with the peculiar characteristic that the residual on the coarse grid is zero at every iteration (thus conservative fluxes can be obtained).
Abstract:
We present building blocks for algorithms for the efficient reduction of square factors, i.e., direct repetitions in strings. The basic problem is this: given a string, compute all strings that can be obtained by reducing factors of the form zz to z. Two types of algorithms are treated: an offline algorithm is one that can compute a data structure on the given string in advance, before the actual search for squares begins; in contrast, online algorithms receive all input only at the time when a request is made. For offline algorithms we treat the following problem: let u and w be two strings such that w is obtained from u by reducing a square factor zz to only z. If we are further given the suffix table of u, how can we derive the suffix table of w without computing it from scratch? As the suffix table plays a key role in online algorithms for the detection of squares in a string, this derivation can make the iterated reduction of squares more efficient. On the other hand, we also show how a suffix array, used for the offline detection of squares, can be adapted to the new string resulting from the deletion of a square. Because the deletion is a very local change, this adaptation is more efficient than the computation of the new suffix array from scratch.
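The basic reduction step described in this abstract (replacing one factor zz by z) can be sketched naively as follows. This is a quadratic-time illustration of the problem statement, not the suffix-table-based algorithms the paper develops:

```python
def reduce_one_square(s):
    """Find the shortest square factor zz at the leftmost position in s
    and replace it by z. Returns the reduced string, or s unchanged if
    no square factor exists."""
    n = len(s)
    for i in range(n):                             # start position of zz
        for length in range(1, (n - i) // 2 + 1):  # candidate length of z
            if s[i:i + length] == s[i + length:i + 2 * length]:
                return s[:i + length] + s[i + 2 * length:]
    return s
```

Iterating this function until a fixed point enumerates one reduction sequence; computing *all* reachable strings, as the paper requires, additionally needs to branch over every square occurrence rather than just the leftmost one.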
The impotence of price controls: failed attempts to constrain pharmaceutical expenditures in Greece.
Abstract:
BACKGROUND: While the prices of pharmaceuticals are relatively low in Greece, expenditure on them is growing more rapidly than almost anywhere else in the European Union. OBJECTIVE: To describe and explain the rise in drug expenditures through decomposition of the increase into the contribution of changes in prices, in volumes and a product-mix effect. METHODS: The decomposition of the growth in pharmaceutical expenditures in Greece over the period 1991-2006 was conducted using data from the largest social insurance fund (IKA) that covers more than 50% of the population. RESULTS: Real drug spending increased by 285%, despite a 58% decrease in the relative price of pharmaceuticals. The increase in expenditure is mainly attributable to a switch to more innovative, but more expensive, pharmaceuticals, indicated by a product-mix residual of 493% in the decomposition. A rising volume of drugs also plays a role, and this is due to an increase in the number of prescriptions issued per doctor visit, rather than an increase in the number of visits or the population size. CONCLUSIONS: Rising pharmaceutical expenditures are strongly determined by physicians' prescribing behaviour, which is not subject to any monitoring and for which there are no incentives to be cost conscious.
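The price/volume/product-mix decomposition described in this abstract can be illustrated with a toy two-drug example. The numbers and the specific index choices below (a Laspeyres-type price index, a total-units volume index, and a residual mix effect) are hypothetical illustrations, not the paper's IKA data or its exact method:

```python
def decompose_growth(base, current):
    """Split expenditure growth into price, volume, and product-mix factors.

    base / current: dicts mapping drug name -> (unit price, quantity).
    Returns (expenditure_ratio, price_index, volume_index, mix_residual),
    with expenditure_ratio == price_index * volume_index * mix_residual.
    """
    exp0 = sum(p * q for p, q in base.values())
    exp1 = sum(p * q for p, q in current.values())
    exp_ratio = exp1 / exp0
    # Laspeyres-type price index: current prices weighted by base quantities.
    price_index = sum(current[d][0] * base[d][1] for d in base) / exp0
    # Volume index: growth in the total number of units, ignoring the mix.
    volume_index = (sum(q for _, q in current.values())
                    / sum(q for _, q in base.values()))
    # Growth not explained by prices or volume is the product-mix effect,
    # e.g. a shift toward newer, more expensive drugs.
    mix_residual = exp_ratio / (price_index * volume_index)
    return exp_ratio, price_index, volume_index, mix_residual
```

With falling relative prices but a strong shift of quantities toward an expensive drug, the mix residual dominates the expenditure growth, which is the pattern the study reports.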
Abstract:
OBJECTIVE To analyze the usability of the Computerized Nursing Process (CNP), based on ICNP® 1.0, in Intensive Care Units, in accordance with the criteria established by the standards of the International Organization for Standardization and the Brazilian Association of Technical Standards for systems. METHOD This is a before-and-after semi-experimental quantitative study, with a sample of 34 participants (nurses, professors, and systems programmers), carried out in three Intensive Care Units. RESULTS The evaluated criteria (use, content, and interface) showed that the CNP meets usability criteria, as it integrates a logical data structure, clinical assessment, diagnostics, and nursing interventions. CONCLUSION The CNP is a source of information and knowledge that provides nurses with new ways of learning in intensive care, since it offers complete, comprehensive, and detailed content, supported by current and relevant data and scientific research information, for nursing practice.
Abstract:
In this paper, an extension of the multi-scale finite-volume (MSFV) method is devised that allows flow and transport to be simulated in reservoirs with complex well configurations. The new framework fits nicely into the data structure of the original MSFV method and has the important property that large patches covering the whole well are not required. For each well, an additional degree of freedom is introduced. While the treatment of pressure-constrained wells is trivial (the well-bore reference pressure is explicitly specified), additional equations have to be solved to obtain the unknown well-bore pressure of rate-constrained wells. Numerical simulations of test cases with multiple complex wells demonstrate the ability of the new algorithm to capture the interference between the various wells and the reservoir accurately. (c) 2008 Elsevier Inc. All rights reserved.